The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to strengthen the relationship between climatologists and statisticians. Atmospheric scientists have analyzed global change data for many years, and much of this analysis relies heavily on statistics and statistical inference. Some specific climatological analyses are presented, and their dependence on statistics is documented before each analysis is undertaken. The first problem involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which data obtained from the model are used to show the relationship graphically has not been undertaken; it is under this motivation that the author presents this problem. A second problem, concerning the standard errors in estimating global temperatures, is purely statistical in nature, although very little material exists on sampling from such a frame. This problem has not only climatological and statistical ramifications but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the Earth. To simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist to enter the arena of data visualization effectively.
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
An accurate knowledge of the solar energy reaching the ground is necessary for sizing solar installations and optimizing their performance. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first apply a set of check procedures to test the quality of hourly GHI measurements, then eliminate the erroneous values, which are generally due to measurement errors or the cosine effect. The statistical analysis shows that the annual mean daily GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
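The hourly-to-daily aggregation behind a figure like 5 kWh/m²/day can be sketched as follows. This is a minimal illustration, not the paper's quality-control procedure; the function name, the use of the solar constant as a plausibility ceiling, and the rejection rule are all assumptions for the example.

```python
import numpy as np

def daily_ghi_kwh(hourly_w_m2, max_plausible=1367.0):
    """Convert one day of hourly GHI measurements (W/m^2) to kWh/m^2/day,
    discarding physically implausible values as a crude quality check."""
    h = np.asarray(hourly_w_m2, dtype=float)
    # Reject negative readings and values above the solar constant.
    h = np.where((h < 0) | (h > max_plausible), np.nan, h)
    # Each hourly value of X W/m^2 contributes X Wh/m^2; sum and convert to kWh.
    return np.nansum(h) / 1000.0
```

For example, a day with twelve daylight hours at 500 W/m² yields 6 kWh/m²/day; averaging such daily totals over a year gives the annual mean daily GHI reported above.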
The GEOS Ozone Data Assimilation System: Specification of Error Statistics
NASA Technical Reports Server (NTRS)
Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.
2000-01-01
A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme, and in particular of the forecast and observation error covariance models, is given. A new global anisotropic horizontal forecast error correlation model accounts for the varying distribution of observations with latitude; correlations are largest in the zonal direction in the tropics, where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation, and the error covariance models are validated using chi-squared statistics. The analyzed ozone fields for winter 1992 are validated against independent observations from ozonesondes and the Halogen Occultation Experiment (HALOE). Mean HALOE and analysis fields agree to better than 10% between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%, and the global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
Optimizing human activity patterns using global sensitivity analysis.
Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M
2014-12-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
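The SampEn statistic that this work tunes has a standard definition: for a series of length N, count pairs of length-m templates that match within tolerance r (Chebyshev distance), do the same for length m+1, and take the negative log of their ratio. A minimal sketch of that definition follows; this is the textbook O(N²) formulation, not the DASim implementation, and the default m and r are conventional choices, not the paper's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) = -ln(A / B), where B counts template
    pairs of length m within tolerance r (Chebyshev distance) and A counts
    the same for length m + 1. Lower values mean a more regular series."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count_matches(length):
        # Use the first n - m templates for both lengths so A and B are
        # drawn from comparable numbers of templates (standard convention).
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b)
```

A perfectly periodic schedule such as `[1, 2, 1, 2, ...]` yields SampEn of 0 (every length-2 match extends to a length-3 match), which is the "regularity" end of the scale the tuning process adjusts toward or away from.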
Optimizing human activity patterns using global sensitivity analysis
Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.
2014-01-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080
Optimizing human activity patterns using global sensitivity analysis
Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...
2013-12-10
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
A global goodness-of-fit statistic for Cox regression models.
Parzen, M; Lipsitz, S R
1999-06-01
In this paper, a global goodness-of-fit test statistic for a Cox regression model, which has an approximate chi-squared distribution when the model has been correctly specified, is proposed. Our goodness-of-fit statistic is global and has power to detect whether interactions or higher-order powers of covariates in the model are needed. The proposed statistic is similar to the Hosmer and Lemeshow (1980, Communications in Statistics A10, 1043-1069) goodness-of-fit statistic for binary data as well as Schoenfeld's (1980, Biometrika 67, 145-153) statistic for the Cox model. The methods are illustrated using data from a Mayo Clinic trial in primary biliary cirrhosis of the liver (Fleming and Harrington, 1991, Counting Processes and Survival Analysis), in which the outcome is the time until liver transplantation or death. There are 17 possible covariates. Two Cox proportional hazards models are fit to the data, and the proposed goodness-of-fit statistic is applied to the fitted models.
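The Hosmer-Lemeshow idea that this statistic generalizes can be sketched for survival data: group subjects by fitted risk score, compare observed event counts to model-expected counts (in a Cox model, each subject's estimated cumulative hazard), and sum the standardized squared differences. The sketch below illustrates that spirit only; the function name, the use of cumulative hazards as expected counts, and the chi-squared reference with G-1 degrees of freedom are assumptions for the example, not the paper's exact statistic or its distribution theory.

```python
import numpy as np
from scipy.stats import chi2

def cox_gof_chi2(event, expected, risk_score, n_groups=10):
    """Hosmer-Lemeshow-style global goodness-of-fit check for a fitted Cox
    model. `event` is a 0/1 indicator, `expected` each subject's estimated
    cumulative hazard; per-group sums of `expected` are the model-expected
    event counts under a correctly specified model."""
    order = np.argsort(risk_score)
    groups = np.array_split(order, n_groups)
    obs = np.array([event[g].sum() for g in groups])
    exp = np.array([expected[g].sum() for g in groups])
    stat = np.sum((obs - exp) ** 2 / exp)
    # Approximate reference distribution: chi-squared with n_groups - 1 df.
    return stat, chi2.sf(stat, df=n_groups - 1)
```

When the model is well calibrated, observed and expected counts agree in every risk group and the statistic is near zero; large values flag missing interactions or higher-order covariate terms.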
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
Identifiability of PBPK Models with Applications to ...
Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
Mapping the global health employment market: an analysis of global health jobs.
Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S
2018-02-27
The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.
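The association tests described above (technical specialty area versus language or experience requirements) are standard chi-square tests of independence on a contingency table. A minimal sketch with `scipy.stats.chi2_contingency` follows; the counts below are invented for illustration and are not the study's data, and the row categories merely echo the focus areas named in the abstract.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = technical focus areas, columns =
# vacancies that do / do not require a non-English language. Illustrative
# counts only, not the paper's 1007-vacancy data set.
table = np.array([
    [30, 70],   # program or project management
    [25, 45],   # HIV/AIDS
    [10, 60],   # quantitative analysis
])
stat, p, dof, expected = chi2_contingency(table)
print(f"chi2={stat:.2f}, dof={dof}, p={p:.4f}")
```

A small p-value would indicate that the language requirement is not distributed independently across specialty areas, which is the kind of association the study quantifies.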
Methods and apparatuses for information analysis on shared and distributed computing systems
Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA
2011-02-22
Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.
Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira
2016-01-01
Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions about how summary statistics, such as an average, are computed remain unanswered. This study investigated the sampling properties of the visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We present three models of ideal observers for extracting the summary statistics: a global sampling model without sampling noise, a global sampling model with sampling noise, and a limited sampling model. We compared the performance of an ideal observer under each model with that of human observers using statistical efficiency analysis. The results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when sets contain more than four items.
Revisiting the climate impacts of cool roofs around the globe using an Earth system model
NASA Astrophysics Data System (ADS)
Zhang, Jiachen; Zhang, Kai; Liu, Junfeng; Ban-Weiss, George
2016-08-01
Solar reflective ‘cool roofs’ absorb less sunlight than traditional dark roofs, reducing solar heat gain, and decreasing the amount of heat transferred to the atmosphere. Widespread adoption of cool roofs could therefore reduce temperatures in urban areas, partially mitigating the urban heat island effect, and contributing to reversing the local impacts of global climate change. The impacts of cool roofs on global climate remain debated by past research and are uncertain. Using a sophisticated Earth system model, the impacts of cool roofs on climate are investigated at urban, continental, and global scales. We find that global adoption of cool roofs in urban areas reduces urban heat islands everywhere, with an annual- and global-mean decrease from 1.6 to 1.2 K. Decreases are statistically significant, except for some areas in Africa and Mexico where urban fraction is low, and some high-latitude areas during wintertime. Analysis of the surface and top-of-atmosphere (TOA) energy budget in urban regions at the continental scale shows cool roofs causing increases in solar radiation leaving the Earth-atmosphere system in most regions around the globe, though the presence of aerosols and clouds is found to partially offset increases in upward radiation. Aerosols dampen cool roof-induced increases in upward solar radiation, ranging from 4% in the United States to 18% in more polluted China. Adoption of cool roofs also causes statistically significant reductions in surface air temperatures in urbanized regions of China (-0.11 ± 0.10 K) and the United States (-0.14 ± 0.12 K); India and Europe show statistically insignificant changes. Though past research has disagreed on whether widespread adoption of cool roofs would cool or warm global climate, these studies have lacked analysis of the statistical significance of global temperature changes.
The research presented here indicates that adoption of cool roofs around the globe would lead to statistically insignificant reductions in global mean air temperature (-0.0021 ± 0.026 K). Thus, we suggest that while cool roofs are an effective tool for reducing building energy use in hot climates, urban heat islands, and regional air temperatures, their influence on global climate is likely negligible.
Revisiting the Climate Impacts of Cool Roofs around the Globe Using an Earth System Model
NASA Astrophysics Data System (ADS)
Zhang, J.; Ban-Weiss, G. A.; Zhang, K.; Liu, J.
2016-12-01
Solar reflective "cool roofs" absorb less sunlight than traditional dark roofs, reducing solar heat gain, and decreasing the amount of heat transferred to the atmosphere. Widespread adoption of cool roofs could therefore reduce temperatures in urban areas, partially mitigating the urban heat island effect, and contributing to reversing the local impacts of global climate change. The impacts of cool roofs on global climate remain debated by past research and are uncertain. Using a sophisticated Earth system model, the impacts of cool roofs on climate are investigated at urban, continental, and global scales. We find that global adoption of cool roofs in urban areas reduces urban heat islands everywhere, with an annual- and global-mean decrease from 1.6 to 1.2 K. Decreases are statistically significant, except for some areas in Africa and Mexico where urban fraction is low, and some high-latitude areas during wintertime. Analysis of the surface and top-of-atmosphere (TOA) energy budget in urban regions at the continental scale shows cool roofs causing increases in solar radiation leaving the Earth-atmosphere system in most regions around the globe, though the presence of aerosols and clouds is found to partially offset increases in upward radiation. Aerosols dampen cool roof-induced increases in upward solar radiation, ranging from 4% in the United States to 18% in more polluted China. Adoption of cool roofs also causes statistically significant reductions in surface air temperatures in urbanized regions of China (-0.11±0.10 K) and the United States (-0.14±0.12 K); India and Europe show statistically insignificant changes. Though past research has disagreed on whether widespread adoption of cool roofs would cool or warm global climate, these studies have lacked analysis of the statistical significance of global temperature changes.
The research presented here indicates that adoption of cool roofs around the globe would lead to statistically insignificant reductions in global mean air temperature (-0.0021 ± 0.026 K). Thus, we suggest that while cool roofs are an effective tool for reducing building energy use in hot climates, urban heat islands, and regional air temperatures, their influence on global climate is likely negligible.
NASA Astrophysics Data System (ADS)
Wang, Audrey; Price, David T.
2007-03-01
A simple integrated algorithm was developed to relate global climatology to the distributions of tree plant functional types (PFTs). Multivariate cluster analysis was performed to analyze the statistical homogeneity of the climate space occupied by individual tree PFTs. Forested regions identified from the satellite-based GLC2000 classification were separated into tropical, temperate, and boreal sub-PFTs for use in the Canadian Terrestrial Ecosystem Model (CTEM). Global data sets of monthly minimum temperature, growing degree days, an index of climatic moisture, and estimated PFT cover fractions were then used as variables in the cluster analysis. The statistical results for individual PFT clusters were found to be consistent with other global-scale classifications of dominant vegetation. Beyond improving the quantification of climatic limitations on PFT distributions, the results also demonstrated overlap of PFT cluster boundaries reflecting vegetation transitions, for example between tropical and temperate biomes. The resulting global database should provide a better basis for simulating the interaction of climate change and terrestrial ecosystem dynamics using global vegetation models.
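Clustering climate space by variables such as minimum temperature, growing degree days, and a moisture index can be illustrated with a minimal k-means loop. This is a generic sketch, not the paper's multivariate cluster method; the synthetic regime means, the variable stand-ins, and the function itself are assumptions for the example.

```python
import numpy as np

def kmeans(data, k, n_iter=50, seed=0):
    """Minimal k-means for partitioning points in a climate space
    (an illustrative sketch, not the paper's cluster analysis)."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center (Euclidean distance).
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned points.
        centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers
```

In practice the climate variables would be standardized first, since growing degree days and moisture indices differ in scale by orders of magnitude; the overlap of cluster boundaries noted above then appears as points nearly equidistant from two centers.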
LaRiccia, Patrick J; Farrar, John T; Sammel, Mary D; Gallo, Joseph J
2008-07-01
To determine the efficacy of the food supplement OPC Factor to increase energy levels in healthy adults aged 45 to 65. Randomized, placebo-controlled, triple-blind crossover study. Twenty-five (25) healthy adults recruited from the University of Pennsylvania Health System. OPC Factor™ (AlivenLabs, Lebanon, TN), a food supplement that contains oligomeric proanthocyanidins from grape seeds and pine bark along with other nutrient supplements including vitamins and minerals, was in the form of an effervescent powder. The placebo was similar in appearance and taste. Five outcome measurements were performed: (1) Energy subscale scores of the Activation-Deactivation Adjective Check List (AD ACL); (2) One (1) global question of percent energy change (Global Energy Percent Change); (3) One (1) global question of energy change measured on a Likert scale (Global Energy Scale Change); (4) One (1) global question of percent overall status change (Global Overall Status Percent Change); and (5) One (1) global question of overall status change measured on a Likert scale (Global Overall Status Scale Change). There were no carryover/period effects in the groups randomized to Placebo/Active Product sequence versus Active Product/Placebo sequence. Examination of the AD ACL Energy subscale scores for the Active Product versus Placebo comparison revealed no significant difference in the intention-to-treat (IT) analysis and the treatment received (TR) analysis. However, Global Energy Percent Change (p = 0.06) and Global Energy Scale Change (p = 0.09) both closely approached conventional levels of statistical significance for the active product in the IT analysis. Global Energy Percent Change (p = 0.05) and Global Energy Scale Change (p = 0.04) reached statistical significance in the TR analysis. A cumulative percent responders analysis graph indicated greater response rates for the active product. OPC Factor may increase energy levels in healthy adults aged 45-65 years.
A larger study is recommended. ClinicalTrials.gov identifier: NCT03318019.
Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min
2016-01-05
To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.
An Analysis of San Diego's Housing Market Using a Geographically Weighted Regression Approach
NASA Astrophysics Data System (ADS)
Grant, Christina P.
San Diego County real estate transaction data were evaluated with a set of linear models calibrated by ordinary least squares and geographically weighted regression (GWR). The goal of the analysis was to determine whether the spatial effects assumed to be in the data are best studied globally with no spatial terms, globally with a fixed-effects submarket variable, or locally with GWR. A total of 18,050 single-family residential sales that closed in the six months between April 2014 and September 2014 were used in the analysis. Diagnostic statistics including AICc, R², and global Moran's I, together with visual inspection of diagnostic plots and maps, indicate superior model performance by GWR as compared to both global regressions.
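The local regressions that make up a GWR can be sketched as weighted least squares at a target location, with weights that decay with distance under a kernel. The sketch below uses a Gaussian kernel with a fixed bandwidth, which is one standard choice; the function name and the bandwidth handling are assumptions for the example, not the thesis's calibrated specification.

```python
import numpy as np

def gwr_point(X, y, coords, target, bandwidth):
    """Fit one local regression of a GWR: weighted least squares at
    `target`, with Gaussian kernel weights decaying with distance from
    each observation's coordinates (a generic GWR sketch)."""
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    Xd = np.column_stack([np.ones(len(y)), X])   # add intercept column
    W = np.diag(w)
    # beta = (X'WX)^-1 X'Wy, the weighted least-squares solution.
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
```

Repeating this fit at every sale location yields spatially varying coefficients; comparing their AICc against the global OLS fit is the kind of diagnostic contrast reported above.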
Linear retrieval and global measurements of wind speed from the Seasat SMMR
NASA Technical Reports Server (NTRS)
Pandey, P. C.
1983-01-01
Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, based on the coefficient-of-determination selection criterion, to a statistical database of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor; this relation was used in generating the statistical database. Global maps of WS were produced for one- and three-month periods.
BrightStat.com: free statistics online.
Stricker, Daniel
2008-10-01
Powerful software for statistical analysis is expensive. Here I present BrightStat, statistical software that runs on the Internet and is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a stepwise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in statistical software, and by VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education, and BrightStat is an alternative to commercial products.
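The kind of sample run described above can be cross-checked against yet another free option, `scipy`. The sketch below runs a Friedman test on invented repeated measures (three conditions on the same six subjects); the data are illustrative, not BrightStat's sample data.

```python
from scipy.stats import friedmanchisquare

# Three conditions measured on the same 6 subjects (illustrative data).
cond_a = [7.0, 9.1, 8.2, 6.5, 7.8, 8.8]
cond_b = [5.5, 7.2, 6.9, 5.0, 6.1, 7.4]
cond_c = [6.1, 7.9, 7.5, 5.8, 6.6, 8.0]

# Friedman's test ranks the conditions within each subject and tests
# whether the mean ranks differ.
stat, p = friedmanchisquare(cond_a, cond_b, cond_c)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
```

Here condition A outranks C, which outranks B, for every subject, so the statistic reaches its maximum for n = 6, k = 3 and the test is significant; agreement of such outputs across BrightStat, SPSS, and other tools is exactly the comparison the paper reports.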
Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter
2016-05-01
Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang
2009-01-01
To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment, ROS was applied to a cadmium residual data set from global food contaminant monitoring; the mean residual was estimated using SAS programming and compared with the results from substitution methods. The results show that the ROS method clearly outperforms substitution methods, being robust and convenient for posterior analysis. Regression on order statistics is worth adopting, but more effort should be devoted to the details of its application.
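The ROS idea summarized above can be sketched compactly. The following is a minimal illustration, not the authors' SAS implementation: it assumes lognormal data with a single detection limit, regresses log-concentration on normal quantiles of the plotting positions for the detected values, and imputes the censored ranks from the fitted line. The helper name `ros_mean` and the Blom plotting positions are illustrative choices.

```python
import math
from statistics import NormalDist, mean

def ros_mean(detects, n_censored, detection_limit):
    """Regression on order statistics (ROS), minimal sketch.

    Assumes lognormal data and a single detection limit; the
    n_censored nondetects occupy the lowest ranks."""
    n = len(detects) + n_censored
    nd = NormalDist()
    # Blom plotting positions for all n ranks -> normal quantiles.
    z = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    # Regress log(concentration) on z using the detected values only.
    zs, ys = z[n_censored:], [math.log(v) for v in sorted(detects)]
    zbar, ybar = mean(zs), mean(ys)
    slope = (sum((a - zbar) * (b - ybar) for a, b in zip(zs, ys))
             / sum((a - zbar) ** 2 for a in zs))
    intercept = ybar - slope * zbar
    # Impute the censored ranks from the fitted line, capped at the limit.
    imputed = [min(math.exp(intercept + slope * zi), detection_limit)
               for zi in z[:n_censored]]
    return mean(imputed + sorted(detects))
```

Unlike substitution (e.g. replacing every nondetect with half the detection limit), the imputed values follow the distribution implied by the detected data, which is why ROS is more robust for subsequent analysis.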
NASA Technical Reports Server (NTRS)
Christensen, E. J.; Haines, B. J.; Mccoll, K. C.; Nerem, R. S.
1994-01-01
We have compared Global Positioning System (GPS)-based dynamic and reduced-dynamic TOPEX/Poseidon orbits over three 10-day repeat cycles of the ground-track. The results suggest that the prelaunch joint gravity model (JGM-1) introduces geographically correlated errors (GCEs) which have a strong meridional dependence. The global distribution and magnitude of these GCEs are consistent with a prelaunch covariance analysis, with estimated and predicted global rms error statistics of 2.3 and 2.4 cm rms, respectively. Repeating the analysis with the post-launch joint gravity model (JGM-2) suggests that a portion of the meridional dependence observed in JGM-1 still remains, with global rms error of 1.2 cm.
Do climate extreme events foster violent civil conflicts? A coincidence analysis
NASA Astrophysics Data System (ADS)
Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.
2014-05-01
Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating-point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling.
During the 6th Session of the UN-GGIM in August 2016 the role of DGGS in the context of the GSGF was formally acknowledged. This paper proposes to highlight the synergies and role of DGGS in the Global Statistical Geospatial Framework and to show examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
Improving Incremental Balance in the GSI 3DVAR Analysis System
NASA Technical Reports Server (NTRS)
Errico, Ronald M.; Yang, Runhua; Kleist, Daryl T.; Parrish, David F.; Derber, John C.; Treadon, Russ
2008-01-01
The Gridpoint Statistical Interpolation (GSI) analysis system is a unified global/regional 3DVAR analysis code that has been under development for several years at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center. It has recently been implemented into operations at NCEP in both the global and North American data assimilation systems (GDAS and NDAS). An important aspect of this development has been improving the balance of the analysis produced by GSI. The improved balance between variables has been achieved through the inclusion of a Tangent Linear Normal Mode Constraint (TLNMC). The TLNMC method has proven to be very robust and effective. The TLNMC as part of the global GSI system has resulted in substantial improvement in data assimilation both at NCEP and at the NASA Global Modeling and Assimilation Office (GMAO).
The Global Oscillation Network Group site survey. 1: Data collection and analysis methods
NASA Technical Reports Server (NTRS)
Hill, Frank; Fischer, George; Grier, Jennifer; Leibacher, John W.; Jones, Harrison B.; Jones, Patricia P.; Kupke, Renate; Stebbins, Robin T.
1994-01-01
The Global Oscillation Network Group (GONG) Project is planning to place a set of instruments around the world to observe solar oscillations as continuously as possible for at least three years. The Project has now chosen the sites that will comprise the network. This paper describes the methods of data collection and analysis that were used to make this decision. Solar irradiance data were collected with a one-minute cadence at fifteen sites around the world and analyzed to produce statistics of cloud cover, atmospheric extinction, and transparency power spectra at the individual sites. Nearly 200 reasonable six-site networks were assembled from the individual stations, and a set of statistical measures of the performance of the networks was analyzed using a principal component analysis. An accompanying paper presents the results of the survey.
Effect of local and global geomagnetic activity on human cardiovascular homeostasis.
Dimitrova, Svetla; Stoilova, Irina; Yanev, Toni; Cholakov, Ilia
2004-02-01
The authors investigated the effects of local and planetary geomagnetic activity on human physiology. They collected data in Sofia, Bulgaria, from a group of 86 volunteers during the periods of the autumnal and vernal equinoxes. They applied a four-factor multiple analysis of variance with the factors local/planetary geomagnetic activity, day of measurement, gender, and medication use, followed by a post hoc analysis to establish the statistical significance of the differences between the average values of the measured physiological parameters at the separate factor levels. In addition, the authors performed correlation analysis between the physiological parameters examined and the geophysical factors. The results revealed that geomagnetic changes had a statistically significant influence on arterial blood pressure. This reaction appeared both with weak local geomagnetic changes and when major and severe global geomagnetic storms took place.
Global Document Delivery, User Studies, and Service Evaluation: The Gateway Experience
ERIC Educational Resources Information Center
Miller, Rush; Xu, Hong; Zou, Xiuying
2008-01-01
This study examines user and service data from 2002-2006 at the East Asian Gateway Service for Chinese and Korean Academic Journal Publications (Gateway Service), the University of Pittsburgh. Descriptive statistical analysis reveals that the Gateway Service has been consistently playing the leading role in global document delivery service as well…
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is a kind of high-spatial-resolution (2 meters GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata in the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, thresholding, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence to determine the cloud statistics. For post-processing analysis, a box-counting fractal method is implemented. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
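Of the thresholding methods compared in this abstract, Otsu's is the most compact to state: pick the grey level that maximises the between-class variance of the image histogram. A minimal sketch of that criterion (an illustrative implementation, not the authors' code, and omitting the entropy-based variants):

```python
def otsu_threshold(hist):
    """Otsu's method: return the grey level t that maximises the
    between-class variance, splitting pixels into {<= t} and {> t}.
    hist[i] is the pixel count at grey level i."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = 0.0          # weight (pixel count) of the lower class
    sum0 = 0.0        # grey-level sum of the lower class
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w0 += h
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * h
        m0 = sum0 / w0                  # lower-class mean
        m1 = (sum_all - sum0) / w1      # upper-class mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a strongly bimodal histogram (e.g. cloudy versus non-cloudy brightness values) the maximiser falls between the two modes, which is why it suits cloud masking.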
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1993-01-01
This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.
Local conformity induced global oscillation
NASA Astrophysics Data System (ADS)
Li, Dong; Li, Wei; Hu, Gang; Zheng, Zhigang
2009-04-01
The game ‘rock-paper-scissors’, with the effect of the psychology of conformity taken into account, is investigated. The interaction between any two agents is global, but the conformity strategy is local to individuals. Statistically, each strategy appears with uniform probability. Dynamical analysis of this model indicates that the equilibrium state may lose its stability at a threshold and be replaced by a globally oscillating state. The global oscillation is induced by the local conformity, which originates from the synchronization of individual strategies.
Zhu, Z.; Waller, E.
2003-01-01
Many countries periodically produce national reports on the status and changes of forest resources, using statistical surveys and spatial mapping of remotely sensed data. At the global level, the Food and Agriculture Organization (FAO) of the United Nations has conducted a Forest Resources Assessment (FRA) program every 10 yr since 1980, producing statistics and analysis that give a global synopsis of forest resources in the world. For the year 2000 of the FRA program (FRA2000), a global forest cover map was produced to provide spatial context to the extensive survey. The forest cover map, produced at the U.S. Geological Survey (USGS) EROS Data Center (EDC), has five classes: closed forest, open or fragmented forest, other wooded land, other land cover, and water. The first two forested classes at the global scale were delineated using combinations of temporal compositing, modified mixture analysis, geographic stratification, and other classification techniques. The remaining three FAO classes were derived primarily from the USGS global land cover characteristics database (Loveland et al. 1999). Validated on the basis of existing reference data sets, the map is estimated to be 77% accurate for the first four classes (no reference data were available for water), and 86% accurate for the forest and nonforest classification. The final map will be published as an insert to the FAO FRA2000 report.
Ahlborn, W; Tuz, H J; Uberla, K
1990-03-01
In cohort studies the Mantel-Haenszel estimator OR_MH is computed from sample data and used as a point estimator of relative risk. Test-based confidence intervals are estimated with the help of the asymptotically chi-squared distributed MH statistic. The Mantel-extension chi-squared is used as a test statistic for a dose-response relationship. Both test statistics, the Mantel-Haenszel chi as well as the Mantel-extension chi, assume homogeneity of risk across strata, which is rarely present. An extended nonparametric statistic proposed by Terpstra, based on the Mann-Whitney statistic, likewise assumes homogeneity of risk across strata. We have earlier defined four risk measures RR_kj (k = 1, 2, ..., 4) in the population and considered their estimates and the corresponding asymptotic distributions. In order to overcome the homogeneity assumption we use the delta method to obtain test-based confidence intervals. Because the four risk measures RR_kj are presented as functions of four weights g_ik, we accordingly give the asymptotic variances of these risk estimators, also as functions of the weights g_ik, in closed form. Approximations to these variances are given. For testing a dose-response relationship we propose a new class of chi-squared(1)-distributed global measures G_k and the corresponding global chi-squared test. In contrast to the Mantel-extension chi, homogeneity of risk across strata need not be assumed. These global test statistics are of the Wald type for composite hypotheses. (Abstract truncated at 250 words.)
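For reference, the Mantel-Haenszel point estimator discussed above has a simple closed form over the strata. A minimal sketch (the tuple layout is an illustrative convention, not the authors' notation):

```python
def mantel_haenszel_or(strata):
    """Pooled Mantel-Haenszel odds ratio OR_MH over 2x2 strata.

    Each stratum is a tuple (a, b, c, d):
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    OR_MH = sum_i(a_i * d_i / n_i) / sum_i(b_i * c_i / n_i)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den
```

With a single stratum this reduces to the ordinary odds ratio ad/bc. The homogeneity caveat raised in the abstract concerns pooling strata whose true odds ratios differ: the pooled number is then hard to interpret, which motivates the authors' weight-based alternatives.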
NASA Astrophysics Data System (ADS)
Thomas, J. N.; Huard, J.; Masci, F.
2017-02-01
There are many reports on the occurrence of anomalous changes in the ionosphere prior to large earthquakes. However, whether or not these changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between ionospheric disturbances and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M ≥ 6.0 earthquakes globally for 2000-2014. We use TEC data from the global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in GIM-TEC within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM-TEC). Our analysis has not found any statistically significant changes in GIM-TEC prior to earthquakes. Thus, we have found no evidence that would suggest that monitoring changes in GIM-TEC might be useful for predicting earthquakes.
NASA Astrophysics Data System (ADS)
Yazid, N. M.; Din, A. H. M.; Omar, K. M.; Som, Z. A. M.; Omar, A. H.; Yahaya, N. A. Z.; Tugi, A.
2016-09-01
Global geopotential models (GGMs) are vital for computing global geoid undulations. Based on ellipsoidal heights from Global Navigation Satellite System (GNSS) observations, accurate orthometric heights can be calculated by adding precise geoid undulation information. GGMs also incorporate data from satellite gravity missions such as GRACE, GOCE and CHAMP, which helps to enhance the global geoid undulation data. A statistical assessment has been made between geoid undulations derived from four GGMs and the airborne gravity data provided by the Department of Survey and Mapping Malaysia (DSMM). The goal of this study is to select the GGM that best matches, statistically, the geoid undulations of the airborne gravity data collected under the Marine Geodetic Infrastructures in Malaysian Waters (MAGIC) project over marine areas in Sabah. The correlation coefficients and RMS values between the geoid undulations of each GGM and the airborne gravity data were computed. The correlation coefficient between EGM2008 and the airborne gravity data is 1, while the RMS value is 0.1499; the RMS value of EGM2008 is the lowest among the models considered. The statistical analysis thus clearly indicates that EGM2008 is the best fit for marine geoid undulations throughout the South China Sea.
Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.
Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven
2016-02-06
The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. In order to improve accessibility, we created Web-TCGA, a web-based, freely accessible online tool (which can also be run in a private instance) for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analysis by providing interactive tables and views. As a supplement to other available tools, such as cBioPortal (Sci Signal 6:pl1, 2013; Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service that does not require any installation or configuration for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
Independent Component Analysis (ICA) is a recently developed technique for component extraction. This new method requires statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has recently been used for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, ICA performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion, unlike other rotation techniques; only the global generalization of decorrelation to statistical independence is used. This rotation of the PCA solution appears able to resolve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
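The claim that ICA rotates the whitened PCA solution rests on a simple fact: once the data are whitened, every further rotation leaves the components exactly decorrelated, so second-order statistics cannot distinguish among the rotations and ICA must resort to higher-order ones. A small two-dimensional sketch of that fact (synthetic data; the mixing weights and the rotation angle are arbitrary illustrative choices, not taken from the paper):

```python
import math, random

def cov2(xs, ys):
    """Variances and covariance of two equally long samples."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cxx = sum((x - mx) ** 2 for x in xs) / len(xs)
    cyy = sum((y - my) ** 2 for y in ys) / len(ys)
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cxx, cyy, cxy

random.seed(0)
s1 = [random.uniform(-1, 1) for _ in range(5000)]  # independent sources
s2 = [random.uniform(-1, 1) for _ in range(5000)]
x = [a + 0.5 * b for a, b in zip(s1, s2)]          # linear mixtures
y = [0.5 * a + b for a, b in zip(s1, s2)]

# PCA/whitening step: rotate onto principal axes, scale to unit variance.
cxx, cyy, cxy = cov2(x, y)
t = 0.5 * math.atan2(2 * cxy, cxx - cyy)           # principal-axis angle
c, s = math.cos(t), math.sin(t)
l1 = c * c * cxx + s * s * cyy + 2 * s * c * cxy   # principal variances
l2 = s * s * cxx + c * c * cyy - 2 * s * c * cxy
u = [(c * a + s * b) / math.sqrt(l1) for a, b in zip(x, y)]
v = [(-s * a + c * b) / math.sqrt(l2) for a, b in zip(x, y)]

# Any further rotation keeps the empirical covariance at the identity;
# ICA searches this rotation family using higher-order statistics.
phi = 0.7                                          # arbitrary angle
cp, sp = math.cos(phi), math.sin(phi)
u2 = [cp * a + sp * b for a, b in zip(u, v)]
v2 = [-sp * a + cp * b for a, b in zip(u, v)]
```

Checking `cov2(u2, v2)` confirms unit variances and zero covariance for any `phi`, which is exactly why decorrelation alone cannot pin down the ICA solution.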
Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao
2011-08-01
The structure of the international flow of acupuncture knowledge is explored in this article so as to promote the globalization of acupuncture technology innovation. Statistical methods were adopted to reveal the geographical distribution of acupuncture patents in the U.S.A., the factors influencing the cumulative advantage of acupuncture techniques, and the innovation value of acupuncture patent applications. Social network analysis was also utilized to establish a global innovation network of acupuncture technology. The results show that the cumulative strength of acupuncture technology correlates with the patent retention period, and that the innovative value of an acupuncture invention correlates with the frequency of patent citation. The U.S.A. and Canada occupy central positions in the global acupuncture information and technology delivery system.
Climate science: Breaks in trends
NASA Astrophysics Data System (ADS)
Pretis, Felix; Allen, Myles
2013-12-01
Global temperature rise since industrialization has not been uniform. A statistical analysis suggests that past changes in the rate of warming can be directly attributed to human influences, from economic downturns to the regulations of the Montreal Protocol.
A global estimate of the Earth's magnetic crustal thickness
NASA Astrophysics Data System (ADS)
Vervelidou, Foteini; Thébault, Erwan
2014-05-01
The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm. Therefore the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the spherical harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in spherical harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) to recent lithospheric field models derived from CHAMP and airborne measurements, and we finally developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness and a power-law exponent that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 spherical harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate the long wavelengths of the magnetic crustal thickness on a global scale, which are not accessible directly from the magnetic measurements because of the masking core field.
We then discuss these results and compare them to those obtained by conducting a similar spectral analysis, this time in Cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to global crustal thickness maps derived from additional geophysical data (Purucker et al., 2002).
[Visual field progression in glaucoma: cluster analysis].
Bresson-Dumont, H; Hatton, J; Foucher, J; Fonteneau, M
2012-11-01
Visual field progression analysis is one of the key points in glaucoma monitoring, but distinguishing true progression from random fluctuation is sometimes difficult. There are several different algorithms but no real consensus for detecting visual field progression. Trend analysis of global indices (MD, sLV) may miss localized deficits or be affected by media opacities. Conversely, point-by-point analysis makes progression difficult to differentiate from physiological variability, particularly when the sensitivity of a point is already low. The goal of our study was to analyse visual field progression with the EyeSuite™ Octopus Perimetry Clusters algorithm in patients with no significant changes in global indices or worsening on pointwise linear regression analysis. We analyzed the visual fields of 162 eyes (100 patients: 58 women, 42 men, average age 66.8 ± 10.91 years) with ocular hypertension or glaucoma. For inclusion, at least six reliable visual fields per eye were required, and the trend analysis (EyeSuite™ Perimetry) of visual field global indices (MD and sLV) could show no significant progression. Analysis of changes in cluster mode was then performed. In a second step, eyes with statistically significant worsening of at least one of their clusters were analyzed point-by-point with the Octopus Field Analysis (OFA). Fifty-four eyes (33.33%) had significant worsening in some clusters while their global indices remained stable over time. This group had more advanced glaucoma than the stable group (MD 6.41 dB vs. 2.87 dB); 64.82% (35/54) of the eyes in which clusters progressed, however, had no statistically significant change on trend analysis by pointwise linear regression. Most software algorithms for analyzing visual field progression are essentially trend analyses of global indices or point-by-point linear regression. This study shows the potential role of cluster trend analysis.
However, for best results, it is preferable to compare the analyses of several tests in combination with morphologic exam. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Revised Perturbation Statistics for the Global Scale Atmospheric Model
NASA Technical Reports Server (NTRS)
Justus, C. G.; Woodrum, A.
1975-01-01
Magnitudes and scales of atmospheric perturbations about the monthly mean for the thermodynamic variables and wind components are presented by month at various latitudes. These perturbation statistics are a revision of the random perturbation data required for the global scale atmospheric model program and are from meteorological rocket network statistical summaries in the 22 to 65 km height range and NASA grenade and pitot tube data summaries in the region up to 90 km. The observed perturbations in the thermodynamic variables were adjusted to make them consistent with constraints required by the perfect gas law and the hydrostatic equation. Vertical scales were evaluated by Buell's depth of pressure system equation and from vertical structure function analysis. Tables of magnitudes and vertical scales are presented for each month at latitude 10, 30, 50, 70, and 90 degrees.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollias, Pavlos
This is a multi-institutional, collaborative project using a three-tier modeling approach to bridge field observations and global cloud-permitting models, with emphasis on the structural evolution of cloud populations through various large-scale environments. Our contribution was data analysis for the generation of high-value cloud and precipitation products and the derivation of cloud statistics for model validation. We contributed in two areas of data analysis: the development of a synergistic cloud and precipitation classification that identifies different cloud types (e.g. shallow cumulus, cirrus) and precipitation types (shallow, deep, convective, stratiform) using profiling ARM observations, and the development of a quantitative precipitation rate retrieval algorithm using profiling ARM observations. Similar efforts have been made in the past for precipitation (weather radars), but not for the millimeter-wavelength (cloud) radar deployed at the ARM sites.
NASA Astrophysics Data System (ADS)
Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine
2014-01-01
This paper describes the statistical analysis of data parameters recorded during battery ageing in electric vehicle use. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during the experiments. This study enables us to identify the main ageing factors. Detailed statistical dependency explorations then identify the factors responsible for battery ageing phenomena, and predictive battery ageing models are built from this approach. The results demonstrate and quantify a relationship between the measured variables and global observations of battery ageing, and also allow accurate battery ageing diagnosis through the predictive models.
Statistical polarization in greenhouse gas emissions: Theory and evidence.
Remuzgo, Lorena; Trueba, Carmen
2017-11-01
The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in the statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed studying the statistical polarization concept from a multivariate viewpoint. In this paper, we apply this approach to study the evolution of such polarization in the global distribution of the main GHGs. The empirical analysis has been carried out for the time period 1990-2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
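The interregional/intraregional split used here follows the standard between-group/within-group decomposition of generalised entropy measures. A univariate sketch for the Theil index GE(1) (the multivariate construction of Maasoumi and of Gigliarano and Mosler is more involved; this only illustrates the additive decomposition, with illustrative function names):

```python
import math

def theil_components(groups):
    """Between- and within-group components of the Theil index GE(1).

    groups: list of lists of positive values (e.g. per-country emissions),
    one inner list per region. Returns (between, within); their sum is
    the overall Theil index."""
    all_vals = [v for g in groups for v in g]
    n, mu = len(all_vals), sum(all_vals) / len(all_vals)
    between, within = 0.0, 0.0
    for g in groups:
        ng, mg = len(g), sum(g) / len(g)
        sg = (ng * mg) / (n * mu)          # group share of the total
        between += sg * math.log(mg / mu)
        tg = sum((v / mg) * math.log(v / mg) for v in g) / ng  # group Theil
        within += sg * tg
    return between, within
```

Polarization-style analyses watch how the between-group component behaves relative to the within-group one as the grouping changes; two identical, internally uniform groups give (0, 0).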
Portraits of self-organization in fish schools interacting with robots
NASA Astrophysics Data System (ADS)
Aureli, M.; Fiorilli, F.; Porfiri, M.
2012-05-01
In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality, and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Given projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
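The core quantity of event coincidence analysis — the fraction of events in one series that are preceded, within a tolerance window, by an event in another — can be sketched as follows. This is a simplified illustration, not the authors' implementation, and the significance test against a Poisson null is omitted.

```python
def coincidence_rate(a, b, delta_t, tau=0.0):
    """Fraction of events in b that have at least one event in a
    occurring within the window [t_b - tau - delta_t, t_b - tau].

    a, b: lists of event times; delta_t: coincidence window width;
    tau: time lag. Returns the 'precursor' coincidence rate in [0, 1].
    """
    if not b:
        return 0.0
    hits = 0
    for tb in b:
        lo, hi = tb - tau - delta_t, tb - tau
        if any(lo <= ta <= hi for ta in a):
            hits += 1
    return hits / len(b)
```

In the flood/epidemic application, `a` would hold flood dates and `b` outbreak dates for one country; significance would then be assessed by comparing the observed rate against surrogate series with the same inter-event statistics.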
Global map of lithosphere thermal thickness on a 1 deg x 1 deg grid - digitally available
NASA Astrophysics Data System (ADS)
Artemieva, Irina
2014-05-01
This presentation reports a 1 deg × 1 deg global thermal model for the continental lithosphere (TC1). The model is digitally available from the author's web-site: www.lithosphere.info. Geotherms for continental terranes of different ages (early Archean to present) are constrained by reliable borehole heat flow measurements (Artemieva and Mooney, 2001), checked against the original publications for data quality, and corrected for paleo-temperature effects where needed. These data are supplemented by cratonic geotherms based on xenolith data. Since heat flow measurements cover no more than half of the continents, the remaining areas (ca. 60% of the continents) are filled using statistical estimates derived from the thermal model constrained by borehole data. Continental geotherms are statistically analyzed as a function of age and are used to estimate lithospheric temperatures in continental regions with no or low-quality heat flow data. This analysis requires knowledge of lithosphere age globally. A compilation of tectono-thermal ages of lithospheric terranes on a 1 deg × 1 deg grid forms the basis for the statistical analysis. It shows that, statistically, lithospheric thermal thickness z (in km) depends on tectono-thermal age t (in Ma) as z = 0.04t + 93.6. This relationship formed the basis for the global thermal model of the continental lithosphere (TC1). Statistical analysis of continental geotherms also reveals that this relationship holds for the Archean cratons in general, but not in detail. In particular, thick (more than 250 km) lithosphere is restricted solely to young Archean terranes (3.0-2.6 Ga), while in old Archean cratons (3.6-3.0 Ga) lithospheric roots do not extend deeper than 200-220 km. The TC1 model is presented as a set of maps, which show significant thermal heterogeneity within the continental upper mantle. The strongest lateral temperature variations (as large as 800 deg C) are typical of the shallow mantle (depth less than 100 km).
A map of the depth to a 600 deg C isotherm in continental upper mantle is presented as a proxy to the elastic thickness of the cratonic lithosphere, in which flexural rigidity is dominated by olivine rheology of the mantle. The TC1 model of the lithosphere thickness is used to calculate the growth and preservation rates of the lithosphere since the Archean.
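The quoted age-thickness fit, z = 0.04t + 93.6, can be applied directly; a trivial sketch (the function name is ours):

```python
def thermal_thickness_km(age_ma):
    """Lithospheric thermal thickness (km) from tectono-thermal age (Ma),
    using the statistical fit z = 0.04 t + 93.6 quoted in the abstract."""
    return 0.04 * age_ma + 93.6
```

For a 3.0 Ga terrane this gives 213.6 km, consistent with the abstract's observation that old Archean roots do not extend much beyond 200-220 km.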
Participatory Sensing Marine Debris: Current Trends and Future Opportunities
NASA Astrophysics Data System (ADS)
Jambeck, J.; Johnsen, K.
2016-02-01
The monitoring of litter and debris is challenging at the global scale because of spatial and temporal variability, disconnected local organizations, and the use of pen and paper for documentation. The Marine Debris Tracker mobile app and citizen science program allows for the collection of globally standardized data at a scale, speed, and efficiency not previously possible. The app itself also serves as an outreach and education tool, creating an engaged participatory sensing instrument. This instrument is characterized by several aspects, including range and frequency, accuracy and precision, accessibility, measurement dimensions, participant performance, and statistical analysis. Open data and transparency are also central to Marine Debris Tracker. A web portal provides the data that users have logged, allowing immediate feedback to users and additional education opportunities. Engagement of users through a top-tracker competition and social media keeps participants interested in the Marine Debris Tracker community. Over half a million items have been tracked globally, and maps provide both global and local distributions of the data. The Marine Debris Tracker community and dataset continue to grow daily. We will present current usage and engagement, participatory sensing data distributions, and choropleth maps of areas of active tracking, and discuss future technologies and platforms to expand data collection and conduct statistical analysis.
NASA Astrophysics Data System (ADS)
Guadagnini, A.; Riva, M.; Dell'Oca, A.
2017-12-01
We propose to ground the sensitivity of uncertain environmental model parameters on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on the features driving the shape of the pdf of the model output. Our GSA approach can be coupled with the construction of a reduced-complexity model that approximates the full model response at reduced computational cost. We demonstrate our approach through a variety of test cases, including a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error introduced in our sensitivity metrics by replacing the original system model with the selected surrogate model. Our results suggest that one might need to construct a surrogate model of increasing accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted at model calibration, design of experiments, uncertainty quantification and risk assessment.
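A crude Monte Carlo sketch of the idea — measuring how much the conditional mean and conditional variance of the output shift as each parameter varies — is shown below. This is an illustration of moment-based sensitivity under the simplifying assumption of i.i.d. uniform parameters on [0, 1], not the authors' exact indices; all names are ours.

```python
import random
import statistics

def moment_sensitivity(model, n=20000, bins=20, seed=0):
    """For each model parameter, estimate the normalized mean absolute
    shift of the conditional mean and conditional variance of the output
    when that parameter is constrained to sub-intervals (bins) of its
    range. Larger values = the parameter matters more to that moment."""
    rng = random.Random(seed)
    k = model.__code__.co_argcount
    xs = [[rng.random() for _ in range(k)] for _ in range(n)]
    ys = [model(*x) for x in xs]
    mu, var = statistics.fmean(ys), statistics.pvariance(ys)
    out = []
    for i in range(k):
        buckets = [[] for _ in range(bins)]
        for x, y in zip(xs, ys):
            buckets[min(int(x[i] * bins), bins - 1)].append(y)
        s_mu = statistics.fmean(abs(statistics.fmean(b) - mu)
                                for b in buckets) / abs(mu)
        s_var = statistics.fmean(abs(statistics.pvariance(b) - var)
                                 for b in buckets) / var
        out.append((s_mu, s_var))
    return out
```

On a toy model like `lambda a, b: a + 0.1 * b`, the indices correctly rank `a` as the dominant parameter for both the mean and the variance of the output.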
Ratajczak, Karina; Płomiński, Janusz
2015-01-01
The most common fracture of the distal end of the radius is Colles' fracture. Treatment modalities available for use in hand rehabilitation after injury include massage. The aim of this study was to evaluate the effect of isometric massage on the recovery of hand function in patients with Colles' fractures. For this purpose, the strength of the finger flexors was assessed as an objective criterion for the evaluation of hand function. The study involved 40 patients, randomly divided into Group A of 20 patients and Group B of 20 patients. All patients received physical therapy and exercised individually with a physiotherapist. Isometric massage was additionally used in Group A. Global grip strength was assessed using a pneumatic force meter on the first and last day of therapy. Statistical analysis was performed using STATISTICA, with statistical significance defined as a P value of less than 0.05. In both groups, global grip strength increased significantly after the therapy, with no statistically significant difference between the groups. Men and women in both groups improved grip strength equally. A statistically significant difference was demonstrated between younger and older patients, with younger patients achieving greater gains in global grip strength in both groups. The incorporation of isometric massage in the rehabilitation plan of patients after a distal radial fracture did not significantly contribute to faster recovery of hand function or improve their quality of life.
NASA Astrophysics Data System (ADS)
Thomas, J. N.; Huard, J.; Masci, F.
2015-12-01
There are many published reports of anomalous changes in the ionosphere prior to large earthquakes. However, whether or not these ionospheric changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between the ionosphere and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M≥6.0 earthquakes globally for a multiyear period. We use TEC data from a global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in TEC within ±30 days of the earthquake time and within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM). Our preliminary analysis, using global TEC and earthquake data for 2002-2010, has not found any statistically significant changes in TEC prior to earthquakes. Thus, we have found no evidence suggesting that TEC changes are useful for earthquake prediction. Our results are discussed in the context of prior statistical and case studies: they agree with Dautermann et al. (2007), who found no relationship between TEC changes and earthquakes in the San Andreas fault region, but disagree with Le et al. (2011), who found an increased rate of TEC anomalies within a few days before global earthquakes of M≥6.0.
Load balancing for massively-parallel soft-real-time systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hailperin, M.
1988-09-01
Global load balancing, if practical, would allow the effective use of massively-parallel ensemble architectures for large soft-real-time problems. The challenge is to replace quick global communication, which is impractical in a massively-parallel system, with statistical techniques. In this vein, the author proposes a novel approach to decentralized load balancing based on statistical time-series analysis. Each site estimates the system-wide average load using information about past loads of individual sites and attempts to match that average. This estimation process is practical because the soft-real-time systems of interest naturally exhibit loads that are periodic in a statistical sense, akin to seasonality in econometrics. It is shown how this load-characterization technique can be the foundation for a load-balancing system in an architecture employing cut-through routing and an efficient multicast protocol.
Long-term sea level trends: Natural or anthropogenic?
NASA Astrophysics Data System (ADS)
Becker, M.; Karpytchev, M.; Lennartz-Sassinek, S.
2014-08-01
Detection and attribution of human influence on sea level rise are important topics that have not yet been explored in depth. We question whether the sea level changes (SLC) over the past century were natural in origin. SLC exhibit power-law long-term correlations. By estimating the Hurst exponent through Detrended Fluctuation Analysis and by applying the statistics of Lennartz and Bunde, we search for the lower bounds of statistically significant external sea level trends in the longest tidal records worldwide. We provide statistical evidence that the observed SLC, at global and regional scales, is beyond its natural internal variability. The minimum anthropogenic sea level trend (MASLT) contributes more than 50% of the observed sea level rise in New York, Baltimore, San Diego, Marseille, and Mumbai. The MASLT is about 1 mm/yr in global sea level reconstructions, which is more than half of the total observed sea level trend during the twentieth century.
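The Hurst-exponent estimation via Detrended Fluctuation Analysis mentioned above can be sketched as follows: integrate the mean-removed series, linearly detrend it in non-overlapping windows, and fit the log-log scaling of the residual fluctuation. This is a plain DFA-1 sketch with illustrative scale choices, not the authors' code.

```python
import math

def dfa_exponent(x, scales=(8, 16, 32, 64)):
    """DFA-1 scaling exponent of a series (≈ Hurst exponent for a
    stationary series; white noise gives ≈ 0.5, persistence gives > 0.5)."""
    # integrate the mean-removed series (the 'profile')
    mu = sum(x) / len(x)
    prof, s = [], 0.0
    for v in x:
        s += v - mu
        prof.append(s)
    pts = []
    for n in scales:
        f2, cnt = 0.0, 0
        for start in range(0, len(prof) - n + 1, n):
            seg = prof[start:start + n]
            t = list(range(n))
            tb, yb = (n - 1) / 2, sum(seg) / n
            # least-squares linear detrend within the window
            beta = sum((ti - tb) * (yi - yb) for ti, yi in zip(t, seg)) / \
                   sum((ti - tb) ** 2 for ti in t)
            alpha = yb - beta * tb
            f2 += sum((yi - (alpha + beta * ti)) ** 2
                      for ti, yi in zip(t, seg))
            cnt += n
        pts.append((math.log(n), 0.5 * math.log(f2 / cnt)))
    # slope of log F(s) vs log s is the scaling exponent
    lx = [p[0] for p in pts]
    ly = [p[1] for p in pts]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    return sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / \
           sum((a - mx) ** 2 for a in lx)
```

Long-term-correlated sea level records yield exponents well above 0.5, which is what makes distinguishing external trends from internal variability statistically delicate.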
Maintaining Atmospheric Mass and Water Balance Within Reanalysis
NASA Technical Reports Server (NTRS)
Takacs, Lawrence L.; Suarez, Max; Todling, Ricardo
2015-01-01
This report describes the modifications implemented into the Goddard Earth Observing System Version-5 (GEOS-5) Atmospheric Data Assimilation System (ADAS) to maintain global conservation of dry atmospheric mass as well as to preserve the model balance of globally integrated precipitation and surface evaporation during reanalysis. Section 1 begins with a review of these global quantities from four current reanalysis efforts. Section 2 introduces the modifications necessary to preserve these constraints within the atmospheric general circulation model (AGCM), the Gridpoint Statistical Interpolation (GSI) analysis procedure, and the Incremental Analysis Update (IAU) algorithm. Section 3 presents experiments quantifying the impact of the new procedure. Section 4 shows preliminary results from its use within the GMAO MERRA-2 Reanalysis project. Section 5 concludes with a summary.
Is globalization healthy: a statistical indicator analysis of the impacts of globalization on health
Martens, Pim; Akin, Su-Mia; Maud, Huynen; Mohsin, Raza
2010-09-17
It is clear that globalization is something more than a purely economic phenomenon manifesting itself on a global scale. Among the visible manifestations of globalization are the greater international movement of goods and services, financial capital, information and people. In addition, there are technological developments, more transboundary cultural exchanges, facilitated by the freer trade of more differentiated products as well as by tourism and immigration, changes in the political landscape and ecological consequences. In this paper, we link the Maastricht Globalization Index with health indicators to analyse if more globalized countries are doing better in terms of infant mortality rate, under-five mortality rate, and adult mortality rate. The results indicate a positive association between a high level of globalization and low mortality rates. In view of the arguments that globalization provides winners and losers, and might be seen as a disequalizing process, we should perhaps be careful in interpreting the observed positive association as simple evidence that globalization is mostly good for our health. It is our hope that a further analysis of health impacts of globalization may help in adjusting and optimising the process of globalization on every level in the direction of a sustainable and healthy development for all.
Smooth quantile normalization.
Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada
2018-04-01
Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
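For contrast with qsmooth, standard (full) quantile normalization — the method qsmooth generalizes — can be sketched as below. This simplified version ignores ties and is not the qsmooth algorithm itself, which shrinks each sample toward a group-level reference rather than a single global one.

```python
def quantile_normalize(samples):
    """Full quantile normalization: force every sample to share the same
    distribution by replacing each value with the mean of the values
    holding the same rank across samples.

    samples: list of equal-length lists (one list per sample)."""
    n = len(samples[0])
    # per-sample sort order (indices sorted by value)
    order = [sorted(range(n), key=s.__getitem__) for s in samples]
    # reference distribution: rank-wise mean across samples
    ref = [sum(s[o[r]] for s, o in zip(samples, order)) / len(samples)
           for r in range(n)]
    out = [[0.0] * n for _ in samples]
    for s_idx, o in enumerate(order):
        for r, i in enumerate(o):
            out[s_idx][i] = ref[r]
    return out
```

After this transform every sample has an identical sorted distribution — exactly the assumption qsmooth relaxes when biological groups genuinely differ.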
Global, Local, and Graphical Person-Fit Analysis Using Person-Response Functions
ERIC Educational Resources Information Center
Emons, Wilco H. M.; Sijtsma, Klaas; Meijer, Rob R.
2005-01-01
Person-fit statistics test whether the likelihood of a respondent's complete vector of item scores on a test is low given the hypothesized item response theory model. This binary information may be insufficient for diagnosing the cause of a misfitting item-score vector. The authors propose a comprehensive methodology for person-fit analysis in the…
Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman
2010-01-01
Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiachen; Zhang, Kai; Liu, Junfeng
Solar reflective “cool roofs” absorb less sunlight than traditional dark roofs, reducing solar heat gain and decreasing the amount of heat transferred to the atmosphere. Widespread adoption of cool roofs could therefore reduce temperatures in urban areas, partially mitigating the urban heat island effect and helping to offset the local impacts of global climate change. The impacts of cool roofs on global climate remain debated in past research and are uncertain. Using a sophisticated Earth system model, we investigate the impacts of cool roofs on climate at urban, continental, and global scales. We find that global adoption of cool roofs in urban areas reduces urban heat islands everywhere, with an annual- and global-mean decrease from 1.6 to 1.2 K. Decreases are statistically significant, except for some areas in Africa and Mexico where the urban fraction is low, and some high-latitude areas during wintertime. Analysis of the surface and top-of-atmosphere (TOA) energy budgets in urban regions at the continental scale shows cool roofs causing increases in solar radiation leaving the Earth-atmosphere system in most regions around the globe, though aerosols and clouds are found to partially offset these increases in upward radiation. Aerosols dampen cool-roof-induced increases in upward solar radiation by 4% in the United States and up to 18% in more polluted China. Adoption of cool roofs also causes statistically significant reductions in surface air temperatures in urbanized regions of China (0.11±0.10 K) and the United States (0.14±0.12 K); India and Europe show statistically insignificant changes. The research presented here indicates that adoption of cool roofs around the globe would lead to statistically insignificant reductions in global mean air temperature (0.0021±0.026 K). This counters past research suggesting that cool roofs can reduce, or even increase, global mean temperatures. Thus, we suggest that while cool roofs are an effective tool for reducing building energy use in hot climates, urban heat islands, and regional air temperatures, their influence on global climate is likely negligible.
Iterative Monte Carlo analysis of spin-dependent parton distributions
Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; ...
2016-04-05
We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. The study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.
Understanding spatial organizations of chromosomes via statistical analysis of Hi-C data
Hu, Ming; Deng, Ke; Qin, Zhaohui; Liu, Jun S.
2015-01-01
Understanding how chromosomes fold provides insights into transcription regulation and, hence, the functional state of the cell. Using next-generation sequencing technology, the recently developed Hi-C approach enables a global view of spatial chromatin organization in the nucleus, which substantially expands our knowledge about genome organization and function. However, due to multiple layers of bias, noise and uncertainty buried in the protocol of Hi-C experiments, analyzing and interpreting Hi-C data poses great challenges and requires novel statistical methods to be developed. This article provides an overview of recent Hi-C studies and their impacts on biomedical research, describes major challenges in the statistical analysis of Hi-C data, and discusses some perspectives for future research. PMID:26124977
NASA Astrophysics Data System (ADS)
Holt, C. R.; Szunyogh, I.; Gyarmati, G.; Hoffman, R. N.; Leidner, M.
2011-12-01
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited-area analysis/forecast system. This is the first time, to our knowledge, that such a system has been used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP operational GFS analyses from 2004. These benchmark analyses were both obtained with the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The GFS operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. The errors are calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks.
The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecasts performed better in predicting position beyond 48 h.
Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro
2006-02-14
The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features dramatically affect the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large-scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
Agriculture, population growth, and statistical analysis of the radiocarbon record.
Zahid, H Jabran; Robinson, Erick; Kelly, Robert L
2016-01-26
The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
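The reported 0.04% annual rate compounds to a surprisingly large factor over the Holocene; a one-line check (illustrative only):

```python
def population_ratio(rate_per_year, years):
    """Total growth factor implied by a constant annual growth rate."""
    return (1.0 + rate_per_year) ** years

# 0.04 %/yr sustained over ~10,000 years implies roughly a 55-fold
# increase, since (1.0004)**10000 ≈ exp(0.0004 * 10000) ≈ 54.6
```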
Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra; ...
2017-11-20
The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.
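The space-filling Latin hypercube sampling used to explore the 46-dimensional parameter space can be sketched on the unit hypercube as follows; this is a generic textbook implementation, not the authors' code.

```python
import random

def latin_hypercube(n_samples, n_params, rng=None):
    """Latin hypercube sample on the unit hypercube: each parameter's
    range is split into n_samples equal strata, and each stratum is
    used exactly once per parameter, in a random permutation. This
    guarantees marginal coverage with far fewer runs than a full grid."""
    rng = rng or random.Random()
    cols = []
    for _ in range(n_params):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # one uniformly placed point inside each stratum
        cols.append([(s + rng.random()) / n_samples for s in strata])
    return [[cols[j][i] for j in range(n_params)] for i in range(n_samples)]
```

Each row is one model run's parameter vector (here on [0, 1]; in practice each column would be rescaled to that parameter's physical range before running VIC or its emulator).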
Multi objective climate change impact assessment using multi downscaled climate scenarios
NASA Astrophysics Data System (ADS)
Rana, Arun; Moradkhani, Hamid
2016-04-01
Projections from Global Climate Models (GCMs) are often downscaled to provide climatic parameters at the regional scale. In the present study, we analyze changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000, using a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis uses two different statistically downscaled climate projections, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Analysis is performed on spatial, temporal and frequency-based parameters in the future period at a scale of 1/16th degree for the entire CRB region. Results indicate varied spatial patterns of change across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer precipitation, and vice versa for temperature. Frequency analysis provided insights into possible explanations for the changes in precipitation.
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
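The decomposition of data variance into empirical orthogonal functions described above is conventionally computed via an SVD of the anomaly matrix. A minimal sketch on a synthetic field; the gridded "ozone" data here are invented for illustration and do not reproduce the mission dataset:

```python
import numpy as np

# Hypothetical gridded field: rows = time steps, cols = grid points,
# built as one dominant spatial pattern modulated in time, plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 120)
pattern = np.sin(np.linspace(0, np.pi, 50))
data = np.outer(np.sin(t), pattern) + 0.1 * rng.standard_normal((120, 50))

anom = data - data.mean(axis=0)        # remove the time mean at each point
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
var_frac = S**2 / np.sum(S**2)         # variance explained by each EOF
# Vt[0] is the leading spatial EOF; U[:, 0] * S[0] its principal component.
print(round(var_frac[0], 2))
```

Because the synthetic field contains a single dominant mode, the leading EOF captures most of the variance.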
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Robust and accurate estimates of rainfall frequencies are difficult to make from short, arid-climate rainfall records; here, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. The analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to sea and elevation. Regional statistical algorithms found the sites passed discordancy tests of coefficient of variation, skewness and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated in the regional analysis method 500 times, and goodness-of-fit tests then identified the best candidate distribution as the generalized extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate location, shape, and scale parameters. In the global analysis, the distribution was a priori prescribed as GEV Type II, the shape parameter was a priori set to 0.15, and a time interval term was constructed so that one set of parameters applied to all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE had a parabolic-shaped trend with time interval. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time interval trend was found for the global method. The global method's relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
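Fitting a GEV distribution to annual-maximum rainfall and reading off a return-period quantile can be sketched as follows. Note the hedges: scipy fits by maximum likelihood rather than the L-moments used in the regional analysis, the location/scale values are invented, and scipy's shape convention is c = -k, so a heavy-tailed Type II distribution has c < 0.

```python
import numpy as np
from scipy.stats import genextreme

# Simulated annual-maximum rainfall depths (mm) from a GEV with a
# prescribed Type II shape (c = -0.15 in scipy's sign convention).
rng = np.random.default_rng(1)
sample = genextreme.rvs(c=-0.15, loc=30.0, scale=8.0,
                        size=500, random_state=rng)

# Maximum-likelihood fit (the paper's regional method used L-moments).
c, loc, scale = genextreme.fit(sample)
rp100 = genextreme.ppf(1 - 1 / 100, c, loc, scale)  # 100-yr return depth
print(round(loc, 1), round(scale, 1))
```

With 500 simulated maxima, the fitted location and scale land close to the prescribed 30 and 8.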
Multi-region statistical shape model for cochlear implantation
NASA Astrophysics Data System (ADS)
Romera, Jordi; Kjer, H. Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel A.
2016-03-01
Statistical shape models are commonly used to analyze the variability between similar anatomical structures, and their use is established as a tool for analysis and segmentation of medical images. However, a global model is not enough to capture the variability of complex structures and achieve the best results. The complexity of a proper global model increases even more when the amount of data available is limited to a small number of datasets. Typically, the anatomical variability between structures is associated with the variability of their physiological regions. In this paper, a complete pipeline is proposed for building a multi-region statistical shape model to study the entire variability from locally identified physiological regions of the inner ear. The proposed model, which is based on an extension of the Point Distribution Model (PDM), is built from a training set of 17 high-resolution images (24.5 μm voxels) of the inner ear. The model is evaluated according to its generalization ability and specificity. The results are compared with those of a global model built directly using the standard PDM approach. The evaluation results suggest that better accuracy can be achieved using a regional modeling of the inner ear.
Zhu, Yun; Fan, Ruzong; Xiong, Momiao
2017-01-01
Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information for achieving a deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple-phenotype association analysis paradigm lacks breadth (the number of phenotypes and genetic variants jointly analyzed at the same time) and depth (the hierarchical structure of phenotypes and genotypes). A key issue for high-dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high-dimensional genotype and phenotype data. To explore correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers in advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, quadratically regularized functional CCA (QRFCCA), for association analysis, which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has much higher power than the ten competing statistics while retaining appropriate type I error rates. To further evaluate performance, QRFCCA and ten other statistics are applied to the whole-genome sequencing dataset from the TwinsUK study. Using QRFCCA, we identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274
SDGs and Geospatial Frameworks: Data Integration in the United States
NASA Astrophysics Data System (ADS)
Trainor, T.
2016-12-01
Responding to the need to monitor a nation's progress towards meeting the Sustainable Development Goals (SDG) outlined in the 2030 U.N. Agenda requires the integration of earth observations with statistical information. The urban agenda proposed in SDG 11 challenges the global community to find a geospatial approach to monitor and measure inclusive, safe, resilient, and sustainable cities and communities. Target 11.7 identifies public safety, accessibility to green and public spaces, and the most vulnerable populations (i.e., women and children, older persons, and persons with disabilities) as the most important priorities of this goal. A challenge for both national statistical organizations and earth observation agencies in addressing SDG 11 is the requirement for detailed statistics at a sufficient spatial resolution to provide the basis for meaningful analysis of the urban population and city environments. Using an example for the city of Pittsburgh, this presentation proposes data and methods to illustrate how earth science and statistical data can be integrated to respond to Target 11.7. Finally, a preliminary series of data initiatives are proposed for extending this method to other global cities.
Data resource profile: United Nations Children's Fund (UNICEF).
Murray, Colleen; Newby, Holly
2012-12-01
The United Nations Children's Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children's rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF's wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF's flagship publications, inter-agency reports, including the Secretary General's Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Ghent, D.; Rayner, N. A.
2017-12-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-2018, https://www.eustaceproject.eu) we have developed an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. This includes developing new "Big Data" analysis methods as the data volumes involved are considerable. We will present recent progress along this road in the EUSTACE project, i.e.: • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability
NASA Astrophysics Data System (ADS)
Singh, U. K.; Singh, G. P.; Singh, Vikas
2015-04-01
The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating the seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hindcast data (3 months in advance) generated from APCC, which provides regional climate information product services based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for a period of 21 years (1983-2003), and simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found a large variation in spatial ASMR simulated with the uncoupled models compared to the coupled models (like the Predictive Ocean Atmosphere Model for Australia, the National Centers for Environmental Prediction and the Japan Meteorological Agency). The simulated ASMR in the coupled models was closer to the Climate Prediction Center Merged Analysis of Precipitation (CMAP) than in the uncoupled models, although the amount of ASMR was underestimated in both. The analysis also found a high spread in simulated ASMR among the ensemble members, suggesting that model performance is highly dependent on initial conditions. The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are strongly associated with ASMR compared to the uncoupled models, suggesting that air-sea interaction is well captured in the coupled models. The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performs better than individual models, and that analyzing the Indian and East Asian land masses separately is more informative than treating Asian monsoon rainfall as a whole.
The results of various statistical measures, such as the skill of the multi-model ensemble, the large spread among the ensemble members of individual models, the strong teleconnection (correlation) with SST, the coefficient of variation, inter-annual variability, and Taylor diagram analysis, suggest that improving coupled models rather than uncoupled models is the way forward for developing a better dynamical seasonal forecast system.
A statistical-based scheduling algorithm in automated data path synthesis
NASA Technical Reports Server (NTRS)
Jeon, Byung Wook; Lursinsap, Chidchanok
1992-01-01
In this paper, we propose a new heuristic scheduling algorithm based on statistical analysis of the cumulative frequency distribution of operations among control steps. It tends to escape from local minima and can therefore reach a globally optimal solution. The presented algorithm considers real-world constraints such as chained operations, multicycle operations, and pipelined data paths. Experimental results show that it gives optimal solutions, even though it is greedy in nature.
P-MartCancer: Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.
Anantha M. Prasad; Louis R. Iverson; Andy Liaw
2006-01-01
We evaluated four statistical models - Regression Tree Analysis (RTA), Bagging Trees (BT), Random Forests (RF), and Multivariate Adaptive Regression Splines (MARS) - for predictive vegetation mapping under current and future climate scenarios according to the Canadian Climate Centre global circulation model.
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.
Xiao, Hao; Sun, Tianyang; Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs) characterized by the so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena for 21st-century trade. GVCs can also play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013), in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs, as well as the interdependency of countries in these GVCs, which is generally invisible from traditional trade statistics.
Novice nurses' level of global interdependence identity: a quantitative research study.
Kozlowski-Gibson, Maria
2015-01-01
Often, therapeutic relationships are cross-cultural in nature, which places both nurses and patients at risk for stress, depression, and anxiety. The purpose of this investigation was to describe novice nurses' level of global interdependence identity, as manifested by worldminded attitudes, and to identify the strongest predictors of worldminded attitudes. Prospective descriptive study with multiple regression. The setting was the various nursing units of a large hospital in the greater Cleveland, OH, area. The participants were novice nurses up to two years after graduation from nursing school and employed as hospital clinicians. Descriptive statistics (means and standard deviations of the scores) were used to delineate the participants' development. The study relied on a survey instrument, the Scale to Measure Worldminded Attitudes developed by Sampson and Smith (1957). The numerical data were scored and organized in a Microsoft Excel spreadsheet. The Statistical Package for the Social Sciences (SPSS) version 21 was used to assist with analysis. The models created through regression were assessed using the model summary and analysis of variance (ANOVA). The nurses' mean level of global interdependence identity was slightly above the neutral point between extreme national-mindedness and full development of global interdependence identity. The best predictors of worldminded attitudes were immigration, patriotism, and war conceptualized under a global frame of reference. Novice nurses did not demonstrate an optimal developmental status of global interdependence identity to safeguard cross-cultural encounters with patients. The recommendation is the inclusion of immigration, patriotism, and war in the nursing curriculum and co-curriculum to promote student development and an improvement in patient experience. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.
2018-02-01
In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series concern geopotential height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics, and in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, the estimation of the correlation dimension, and the Hurst exponent from rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long-range dependence for all time series considered. Noticeable differences in the q-triplet estimates were also found among time series from distinct spatial or temporal regions. Moreover, phase-space reconstruction revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the Hurst exponent estimates indicated multifractality, non-Gaussianity and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of Earth's climate.
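The rescaled range (R/S) estimate of the Hurst exponent mentioned above can be sketched as follows. The window sizes and the white-noise test series are illustrative choices, and small-sample R/S is known to be biased slightly above the theoretical H = 0.5 for uncorrelated noise.

```python
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    regress log(R/S) on log(window length); the slope approximates H."""
    rs = []
    for w in windows:
        vals = []
        for i in range(0, len(x) - w + 1, w):
            c = x[i:i + w]
            z = np.cumsum(c - c.mean())   # cumulative deviation profile
            r = z.max() - z.min()         # range of the profile
            s = c.std()                   # standard deviation of the chunk
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(windows), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)   # uncorrelated noise: H should be ~0.5
h = hurst_rs(white)
print(round(h, 2))
```

Persistent (long-range dependent) series would push the estimate well above 0.5, which is how R/S analysis signals the persistence reported in the abstract.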
Evaluation of variability in high-resolution protein structures by global distance scoring.
Anzai, Risa; Asami, Yoshiki; Inoue, Waka; Ueno, Hina; Yamada, Koya; Okada, Tetsuji
2018-01-01
Systematic analysis of the statistical and dynamical properties of proteins is critical to understanding cellular events. Extraction of biologically relevant information from a set of high-resolution structures is important because it can provide mechanistic details behind the functional properties of protein families, enabling rational comparison between families. Most current structural comparisons are pairwise-based, which hampers global analysis of the increasing contents of the Protein Data Bank. Additionally, pairing of protein structures introduces uncertainty with respect to reproducibility because it frequently requires additional settings for superimposition. This study introduces intramolecular distance scoring for the global analysis of proteins, for each of which at least several high-resolution structures are available. As a pilot study, we tested 300 human proteins and showed that the method can be used comprehensively to overview advances in each protein and protein family at the atomic level. This method, together with the interpretation of the model calculations, provides new criteria for understanding specific structural variation in a protein, enabling global comparison of the variability in proteins from different species.
A simple, physically-based method for evaluating the economic costs of geo-engineering schemes
NASA Astrophysics Data System (ADS)
Garrett, T. J.
2009-04-01
The consumption of primary energy (e.g., coal, oil, uranium) by the global economy is done in expectation of a return on investment. For geo-engineering schemes, however, the relationship between the required primary energy consumption and the economic return is, at first glance, quite different. The energy costs of a given scheme represent a removal of economically productive available energy to do work in the normal global economy. What are the economic implications of the energy consumption associated with geo-engineering techniques? I will present a simple thermodynamic argument that, in general, real (inflation-adjusted) economic value has a fixed relationship to the rate of global primary energy consumption. This hypothesis will be shown to be supported by 36 years of available energy statistics and two millennia of statistics for global economic production. What is found from this analysis is that the value in any given inflation-adjusted 1990 dollar is sustained by a constant 9.7 +/- 0.3 milliwatts of global primary energy consumption. Thus, insofar as geo-engineering is concerned, any scheme that requires some nominal fraction of continuous global primary energy output necessitates a corresponding inflationary loss of real global economic value. For example, if 1% of global energy output is required, at today's consumption rate of 15 TW this corresponds to an inflationary loss of 15 trillion 1990 dollars of real value. The loss will be less, however, if the geo-engineering scheme also enables a demonstrable enhancement to global economic production capacity through climate modification.
Dependency of high coastal water level and river discharge at the global scale
NASA Astrophysics Data System (ADS)
Ward, P.; Couasnon, A.; Haigh, I. D.; Muis, S.; Veldkamp, T.; Winsemius, H.; Wahl, T.
2017-12-01
It is widely recognized that floods cause huge socioeconomic impacts. From 1980-2013, global flood losses exceeded $1 trillion, with 220,000 fatalities. These impacts are felt particularly hard in low-lying, densely populated deltas and estuaries, whose location at the coast-land interface makes them naturally prone to flooding. When river and coastal floods coincide, their impacts in these deltas and estuaries are often worse than when they occur in isolation. Such floods are examples of so-called 'compound events'. In this contribution, we present the first global-scale analysis of the statistical dependency of high coastal water levels (and the storm surge component alone) and river discharge. We show that there is statistical dependency between these components at more than half of the stations examined. We also show time-lags in the highest correlation between peak discharges and coastal water levels. Finally, we assess the probability of the simultaneous occurrence of design discharge and design coastal water levels, assuming both independence and statistical dependence. For those stations where we identified statistical dependency, the probability is between 1 and 5 times greater when the dependence structure is accounted for. This information is essential for understanding the likelihood of compound flood events occurring at locations around the world, as well as for accurate flood risk assessments and effective flood risk management. The research was carried out by analysing the statistical dependency between observed coastal water levels (and the storm surge component) from GESLA-2 and river discharge using gauged data from GRDC stations all around the world. The dependence structure was examined using copula functions.
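The core comparison above, joint exceedance probability with and without accounting for dependence, can be illustrated with synthetic surge and discharge series sharing a common storm driver. The series, thresholds and rank-correlation test here are invented for illustration; the study itself uses GESLA-2 and GRDC observations with copula functions.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical paired series: coastal surge and river discharge driven
# partly by the same storms, hence positively dependent.
rng = np.random.default_rng(7)
storm = rng.gamma(2.0, 1.0, 1000)
surge = storm + 0.5 * rng.standard_normal(1000)
discharge = storm + 0.5 * rng.standard_normal(1000)

tau, p = kendalltau(surge, discharge)   # rank-based dependence measure

# Probability that both exceed their 90th percentiles, compared with
# the 0.1 * 0.1 = 0.01 expected under independence.
joint = np.mean((surge > np.quantile(surge, 0.9)) &
                (discharge > np.quantile(discharge, 0.9)))
print(round(tau, 2), round(joint / 0.01, 1))
```

The joint exceedance rate comes out several times the independent rate, mirroring the 1-5x amplification reported for dependent stations.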
Global building inventory for earthquake loss estimation and risk management
Jaiswal, Kishor; Wald, David; Porter, Keith
2010-01-01
We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.
Apipattanavis, S.; McCabe, G.J.; Rajagopalan, B.; Gangopadhyay, S.
2009-01-01
Dominant modes of individual and joint variability in global sea surface temperatures (SST) and global Palmer drought severity index (PDSI) values for the twentieth century are identified through a multivariate frequency domain singular value decomposition. This analysis indicates that a secular trend and variability related to the El Niño–Southern Oscillation (ENSO) are the dominant modes of variance shared among the global datasets. For the SST data the secular trend corresponds to a positive trend in Indian Ocean and South Atlantic SSTs, and a negative trend in North Pacific and North Atlantic SSTs. The ENSO reconstruction shows a strong signal in the tropical Pacific, North Pacific, and Indian Ocean regions. For the PDSI data, the secular trend reconstruction shows high amplitudes over central Africa including the Sahel, whereas the regions with strong ENSO amplitudes in PDSI are the southwestern and northwestern United States, South Africa, northeastern Brazil, central Africa, the Indian subcontinent, and Australia. An additional significant frequency, multidecadal variability, is identified for the Northern Hemisphere. This multidecadal frequency appears to be related to the Atlantic multidecadal oscillation (AMO). The multidecadal frequency is statistically significant in the Northern Hemisphere SST data, but is statistically nonsignificant in the PDSI data.
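Identifying dominant modes of joint variability between two fields can be sketched with a time-domain maximum covariance analysis, i.e. an SVD of the cross-covariance matrix. The study's method operates in the frequency domain, so this synthetic example, with an invented ENSO-like shared mode, is only a simplified analogue.

```python
import numpy as np

# Two synthetic gridded fields (SST-like and PDSI-like) sharing one
# oscillatory mode, plus independent noise.
rng = np.random.default_rng(5)
t = np.arange(240)
enso = np.sin(2 * np.pi * t / 48)  # shared mode (period 48 "months")
sst = np.outer(enso, rng.standard_normal(30)) \
    + 0.3 * rng.standard_normal((240, 30))
pdsi = np.outer(enso, rng.standard_normal(20)) \
    + 0.3 * rng.standard_normal((240, 20))

sst = sst - sst.mean(axis=0)
pdsi = pdsi - pdsi.mean(axis=0)
C = sst.T @ pdsi / len(t)           # cross-covariance matrix
U, S, Vt = np.linalg.svd(C, full_matrices=False)
frac = S**2 / np.sum(S**2)          # squared-covariance fraction per mode
print(round(frac[0], 2))            # leading shared mode dominates
```

The leading singular vectors give the paired spatial patterns of the shared mode, analogous to the trend and ENSO reconstructions described in the abstract.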
ERIC Educational Resources Information Center
Montgomery, Catherine
2016-01-01
Transnational partnerships between universities can illustrate the changing political, social, and cultural terrain of global higher education. Drawing on secondary data analysis of government educational statistics, university web pages, and a comprehensive literature review, this article focuses on transnational partnerships with particular…
Are secular correlations between sunspots, geomagnetic activity, and global temperature significant?
Love, J.J.; Mursula, K.; Tsai, V.C.; Perkins, D.M.
2011-01-01
Recent studies have led to speculation that solar-terrestrial interaction, measured by sunspot number and geomagnetic activity, has played an important role in global temperature change over the past century or so. We treat this possibility as an hypothesis for testing. We examine the statistical significance of cross-correlations between sunspot number, geomagnetic activity, and global surface temperature for the years 1868-2008, solar cycles 11-23. The data contain substantial autocorrelation and nonstationarity, properties that are incompatible with standard measures of cross-correlational significance, but which can be largely removed by averaging over solar cycles and first-difference detrending. Treated data show an expected statistically significant correlation between sunspot number and geomagnetic activity, Pearson p < 10^-4, but correlations between global temperature and sunspot number (geomagnetic activity) are not significant, p = 0.9954 (p = 0.8171). In other words, straightforward analysis does not support widely-cited suggestions that these data record a prominent role for solar-terrestrial interaction in global climate change. With respect to the sunspot-number, geomagnetic-activity, and global-temperature data, three alternative hypotheses remain difficult to reject: (1) the role of solar-terrestrial interaction in recent climate change is contained wholly in long-term trends and not in any shorter-term secular variation, or, (2) an anthropogenic signal is hiding correlation between solar-terrestrial variables and global temperature, or, (3) the null hypothesis, recent climate change has not been influenced by solar-terrestrial interaction. © 2011 by the American Geophysical Union.
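The effect of first-difference detrending on spurious correlation between trending series can be demonstrated on synthetic data. The series below are invented; the paper applies the same idea to sunspot, geomagnetic-activity and temperature records.

```python
import numpy as np
from scipy.stats import pearsonr

# Two series sharing a common trend but no year-to-year connection.
rng = np.random.default_rng(3)
trend = np.linspace(0, 3, 140)
a = trend + 0.3 * rng.standard_normal(140)
b = trend + 0.3 * rng.standard_normal(140)

r_raw, p_raw = pearsonr(a, b)                       # inflated by the trend
r_diff, p_diff = pearsonr(np.diff(a), np.diff(b))   # after differencing
print(round(r_raw, 2), round(r_diff, 2))
```

The raw correlation is large purely because both series trend upward; after first differencing it collapses toward zero, which is why the treated sunspot-temperature correlations in the abstract lose their apparent significance.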
Statistical Analysis of TEC Anomalies Prior to M6.0+ Earthquakes During 2003-2014
NASA Astrophysics Data System (ADS)
Zhu, Fuying; Su, Fanfan; Lin, Jian
2018-04-01
There are many studies of anomalous variations in ionospheric total electron content (TEC) prior to large earthquakes. However, whether the morphological characteristics of TEC anomalies differ between daytime and nighttime has rarely been studied. In the present paper, based on TEC data from the global ionosphere map (GIM), we carry out a statistical survey of the spatial-temporal distribution of TEC anomalies before 1339 global M6.0+ earthquakes during 2003-2014. After excluding intervals of geomagnetic disturbance, the temporal and spatial distributions of ionospheric TEC anomalies prior to the earthquakes in the daytime and at night are investigated and compared. Apart from the nighttime occurrence rates of pre-earthquake ionospheric anomalies (PEIAs) being higher than the daytime rates, our analysis found no statistically significant daytime-nighttime difference in the spatial-temporal distribution of PEIAs. Moreover, the nighttime occurrence rates of both positive and negative pre-earthquake TEC anomalies tend to increase slightly with earthquake magnitude. We therefore suggest that monitoring nighttime ionospheric TEC changes might help reveal the relation between ionospheric disturbances and seismic activity.
Global alliances effect in coalition forming
NASA Astrophysics Data System (ADS)
Vinogradova, Galina; Galam, Serge
2014-11-01
Coalition forming is investigated among countries, which are coupled by short-range interactions, under the influence of externally set opposing global alliances. The model extends a recent Natural Model of coalition forming inspired by Statistical Physics, in which instabilities are a consequence of decentralized maximization of the actors' individual benefits. In contrast to physics, where spins can only evaluate the immediate cost/benefit of a flip of orientation, countries have a long horizon of rationality: the ability to envision a path to a better configuration even at the cost of passing through intermediate losing states. The stabilizing effect is produced through polarization by the global alliances, around either a single global interest factor or several simultaneous ones. The model provides a versatile theoretical tool for the analysis of real cases and the design of novel strategies; such analysis is provided for several real cases, including the Eurozone. The results shed new light on the complex phenomenon of planned stabilization in coalition forming.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Rayner, Nick
2017-04-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project: 1. providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; 2. identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; 3. estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; 4. using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Ghent, D.; Rayner, N. A.
2016-12-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project, i.e.: • providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
Data Resource Profile: United Nations Children’s Fund (UNICEF)
Murray, Colleen; Newby, Holly
2012-01-01
The United Nations Children’s Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children’s rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF’s wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF’s flagship publications, inter-agency reports, including the Secretary General’s Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses. PMID:23211414
Modeling urbanization patterns at a global scale with generative adversarial networks
NASA Astrophysics Data System (ADS)
Albert, A. T.; Strano, E.; Gonzalez, M.
2017-12-01
Current demographic projections show that, in the next 30 years, global population growth will mostly take place in developing countries. Coupled with a decrease in density, such population growth could potentially double the land occupied by settlements by 2050. The lack of reliable and globally consistent socio-demographic data, coupled with the limited predictive performance of traditional spatially explicit urban models, calls for better predictive methods calibrated on a globally consistent dataset. Thus, richer models of the spatial interplay between urban built-up land, population distribution and energy use are central to the discussion around the expansion and development of cities, and their impact on the environment in the context of a changing climate. In this talk we discuss methods for, and present an analysis of, urban form, defined as the spatial distribution of macroeconomic quantities that characterize a city, using modern machine learning methods and best-available remote-sensing data for the world's largest 25,000 cities. We first show that these cities may be described by a small set of patterns in radial building density, nighttime luminosity, and population density, which highlight, to first order, differences in development and land use across the world. We observe significant, spatially dependent variance around these typical patterns, which would be difficult to model using traditional statistical methods. We take a first step in addressing this challenge by developing CityGAN, a conditional generative adversarial network model for simulating realistic urban forms. To guide learning and measure the quality of the simulated synthetic cities, we develop a specialized loss function for GAN optimization that incorporates standard spatial statistics used by urban analysis experts.
Our framework is a stark departure from both the standard physics-based approaches in the literature (that view urban forms as fractals with a scale-free behavior), and the traditional statistical learning approaches (whereby values of individual pixels are modeled as functions of locally-defined, hand-engineered features). This is a first-of-its-kind analysis of urban forms using data at a planetary scale.
Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Min, J. B.; Xue, D.; Shi, Y.
2013-01-01
A micromechanics analysis modeling method was developed to analyze damage progression and fatigue failure of fabric-reinforced composite structures, especially brittle ceramic matrix composites. A repeating unit cell concept of fabric-reinforced composites was used to represent the global composite structure, with the thermal and mechanical properties of the repeating unit cell taken to be the same as those of the global composite structure. Three-phase micromechanics, shear-lag, and continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict progressive damage and the fatigue life of the composite structures. Global structural failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The methodology is demonstrated against experimental tests performed on carbon fiber reinforced silicon carbide matrix plain weave composite specimens.
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as chemical transport models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota, a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating hourly ozone concentrations. The computational domains comprised 120x120x32, 121x121x32 and 121x121x32 grid points with horizontal resolutions of 27, 9 and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP's Final Analysis (FNL) at 1° x 1° (~111 km x 111 km) resolution, and boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric (RETRO) chemical composition and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques are used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulated errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can serve as a reference for future air quality modeling exercises over Bogota and other Colombian cities.
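The regression-based downscaling step can be illustrated with a minimal sketch (the variable names and synthetic data below are illustrative assumptions, not the study's configuration): coarse-model ozone and auxiliary meteorological predictors at a station are regressed onto the station's observations, and the fitted relation corrects the raw model output.

```python
import numpy as np

rng = np.random.default_rng(42)
n_hours = 500

# Hypothetical hourly predictors at one station: raw model ozone, temperature, wind.
model_o3 = 40 + 10 * rng.standard_normal(n_hours)
temp = 15 + 5 * rng.standard_normal(n_hours)
wind = 2 + rng.standard_normal(n_hours)

# "Observed" ozone: a linear function of the predictors plus noise,
# standing in for monitoring-station measurements.
obs_o3 = 0.8 * model_o3 + 0.5 * temp - 1.0 * wind + 5 + 2 * rng.standard_normal(n_hours)

# Multiple linear regression (ordinary least squares with an intercept column).
X = np.column_stack([model_o3, temp, wind, np.ones(n_hours)])
coef, *_ = np.linalg.lstsq(X, obs_o3, rcond=None)
downscaled = X @ coef

rmse_raw = np.sqrt(np.mean((obs_o3 - model_o3) ** 2))
rmse_down = np.sqrt(np.mean((obs_o3 - downscaled) ** 2))
# The regression-corrected series tracks the observations more closely
# than the raw coarse-model output.
```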
Bermejo Alegría, Rosa M; Hidalgo Montesinos, M Dolores; Parra Hidalgo, Pedro; Más Castillo, Adelia; Gomis Cebrián, Rafael
2011-04-01
The aim of this study was to analyze the psychometric properties of two scales that assess the perceived quality and patient satisfaction with outpatient surgery in the Health Service of Murcia. These scales assess the degree of Professional Competence (PC) and Personnel Treatment (PT). The scales were administered to a sample of 2017 users of outpatient surgery in the Health Service of Murcia during the years 2008 and 2009. Exploratory factor analysis indicates a unidimensional structure for each scale. Internal consistency was adequate: .68 for PC and .75 for PT. The correlation between the PC scale and patients' global satisfaction was positive and statistically significant. The correlation between the PT scale and patients' global satisfaction was also statistically significant. The scales have shown their utility to detect areas of improvement and to plan intervention strategies.
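The internal-consistency figures quoted above (.68 and .75) are Cronbach's alpha values. A minimal computation on synthetic item responses (the data and item count are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n_resp, n_items = 500, 6

# Correlated item responses: a shared satisfaction factor plus item-level noise.
factor = rng.standard_normal(n_resp)
items = factor[:, None] + rng.standard_normal((n_resp, n_items))

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(items)
```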
Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M
2016-01-01
Symptom clusters in advanced cancer can influence patient outcomes, but there is large heterogeneity in the methods used to identify them. We investigated the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies, for all patients and across five primary cancer sites, and examined which clusters predict functional status, a global assessment of health, and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters; fatigue-pain was a stronger predictor of overall health; and the cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life, and the biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
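A toy version of the hierarchical-cluster-analysis approach, using a correlation-based distance on invented symptom scores (the symptom names echo the clusters above, but the data are synthetic, not the study's):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
n_patients = 300
latent = rng.standard_normal((n_patients, 2))  # two latent symptom drivers

# Two invented symptom pairs, each loading on one latent driver.
symptoms = {
    "tense":   latent[:, 0] + 0.3 * rng.standard_normal(n_patients),
    "worry":   latent[:, 0] + 0.3 * rng.standard_normal(n_patients),
    "fatigue": latent[:, 1] + 0.3 * rng.standard_normal(n_patients),
    "pain":    latent[:, 1] + 0.3 * rng.standard_normal(n_patients),
}
names = list(symptoms)
data = np.column_stack([symptoms[n] for n in names])

# Distance between symptoms: 1 - Pearson correlation, in condensed form.
dist = 1 - np.corrcoef(data.T)
condensed = dist[np.triu_indices(len(names), k=1)]

# Average-linkage hierarchical clustering, cut into two clusters.
labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
clusters = {n: int(l) for n, l in zip(names, labels)}
```

Different linkage rules and similarity measures, as compared in the study, simply swap the `method` argument and the distance definition.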
ECG Identification System Using Neural Network with Global and Local Features
ERIC Educational Resources Information Center
Tseng, Kuo-Kun; Lee, Dachao; Chen, Charles
2016-01-01
This paper proposes a human identification system based on extracted electrocardiogram (ECG) signals. Two hierarchical classification structures, based on a global shape feature and a local statistical feature, are used to process the ECG signals. The global shape feature represents the outline information of the ECG signal, and the local statistical feature extracts the…
Kamal, Ghulam Mustafa; Wang, Xiaohua; Bin Yuan; Wang, Jie; Sun, Peng; Zhang, Xu; Liu, Maili
2016-09-01
Soy sauce, a well-known seasoning throughout the world and especially in Asia, is available on the global market in a wide range of types, depending on its purpose and processing method. Its composition varies with the fermentation process and with the addition of additives, preservatives and flavor enhancers. Comprehensive (1)H NMR based studies of the metabonomic variations among soy sauce types on the global market have been limited by the complexity of the mixture. In the present study, (13)C NMR spectroscopy coupled with multivariate statistical data analysis, namely principal component analysis (PCA) and orthogonal partial least squares-discriminant analysis (OPLS-DA), was applied to investigate metabonomic variations among different types of soy sauce: super light, super dark, red cooking and mushroom soy sauce. The main additives in soy sauce, such as glutamate, sucrose and glucose, were easily distinguished and quantified using (13)C NMR spectroscopy, whereas they are difficult to assign and quantify from (1)H NMR spectra because of serious signal overlap. The significantly higher concentration of sucrose in dark, red cooking and mushroom-flavored soy sauce can be linked directly to the addition of caramel, and the significantly higher level of glutamate in super light soy sauce, compared with super dark and mushroom-flavored sauces, may come from the addition of monosodium glutamate. The study highlights the potential of (13)C NMR based metabonomics coupled with multivariate statistical data analysis for differentiating types of soy sauce by additive levels, raw materials and fermentation procedures. Copyright © 2016 Elsevier B.V. All rights reserved.
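The PCA step can be sketched with invented compositions (the concentrations below are illustrative assumptions, not measured NMR intensities): two sauce types differing mainly in glutamate and sucrose separate cleanly along the first principal component.

```python
import numpy as np

rng = np.random.default_rng(5)
n_per_type = 20

# Columns: glutamate, sucrose, glucose (arbitrary units, invented).
light = np.column_stack([
    rng.normal(8, 0.5, n_per_type),   # high glutamate (e.g. added MSG)
    rng.normal(2, 0.5, n_per_type),
    rng.normal(3, 0.5, n_per_type),
])
dark = np.column_stack([
    rng.normal(4, 0.5, n_per_type),
    rng.normal(9, 0.5, n_per_type),   # high sucrose (e.g. added caramel)
    rng.normal(3, 0.5, n_per_type),
])
data = np.vstack([light, dark])
centered = data - data.mean(axis=0)

# PCA via SVD; scores of every sample on the first principal component.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = centered @ vt[0]

# The two types occupy well-separated regions of PC1.
sep = abs(pc1[:n_per_type].mean() - pc1[n_per_type:].mean())
```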
Decoding the spatial signatures of multi-scale climate variability - a climate network perspective
NASA Astrophysics Data System (ADS)
Donner, R. V.; Jajcay, N.; Wiedermann, M.; Ekhtiari, N.; Palus, M.
2017-12-01
In recent years, the application of complex networks as a versatile tool for analyzing complex spatio-temporal data has gained increasing interest. Establishing this approach as a new paradigm in climatology has already provided valuable insights into key spatio-temporal climate variability patterns across scales, including novel perspectives on the dynamics of the El Niño Southern Oscillation and the emergence of extreme precipitation patterns in monsoonal regions. In this work, we report first attempts to employ network analysis for disentangling multi-scale climate variability. Specifically, we introduce the concept of scale-specific climate networks: a sequence of networks representing the statistical association structure between variations at distinct time scales. For this purpose, we consider global surface air temperature reanalysis data and subject the time series at each grid point to a complex-valued continuous wavelet transform. From this time-scale decomposition, we obtain three types of signals per grid point and scale (amplitude, phase and reconstructed signal), whose statistical similarity is then represented by three complex networks associated with each scale. We provide a detailed analysis of the resulting connectivity patterns, which reflect the spatial organization of climate variability at each chosen time scale. Global network characteristics like transitivity or network entropy are shown to provide a new view on the (globally averaged) relevance of different time scales in climate dynamics. Beyond expected trends originating from the increasing smoothness of fluctuations at longer scales, network-based statistics reveal different degrees of fragmentation of spatial co-variability patterns at different scales, and zonal shifts among the key players of climate variability from tropically to extra-tropically dominated patterns when moving from inter-annual to decadal scales and beyond.
The obtained results demonstrate the potential usefulness of systematically exploiting scale-specific climate networks, whose general patterns are in line with existing climatological knowledge, but provide vast opportunities for further quantifications at local, regional and global scales that are yet to be explored.
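The construction of a climate network at a single scale can be sketched as follows (grid size, threshold, and data are placeholder assumptions, not the study's settings): correlate the per-grid-point series, threshold the absolute correlations into an adjacency matrix, and compute a global characteristic such as transitivity.

```python
import numpy as np

rng = np.random.default_rng(7)
n_grid, n_time = 30, 200

# Synthetic field: one coherently co-varying "region" of 15 grid points,
# plus 15 points of independent noise.
signal = rng.standard_normal(n_time)
series = np.empty((n_grid, n_time))
series[:15] = signal + 0.5 * rng.standard_normal((15, n_time))
series[15:] = rng.standard_normal((15, n_time))

# Network construction: link grid points whose |correlation| exceeds a threshold.
corr = np.corrcoef(series)
adj = (np.abs(corr) > 0.5).astype(int)
np.fill_diagonal(adj, 0)

# Transitivity = 3 * triangles / connected triples, from powers of the
# adjacency matrix: trace(A^3) counts 6x triangles, and the off-diagonal
# sum of A^2 counts 2x connected triples.
a2 = adj @ adj
triangles = np.trace(a2 @ adj)
triples = a2.sum() - np.trace(a2)
transitivity = triangles / triples
```

In the scale-specific setting described above, this construction would simply be repeated once per wavelet scale and per signal type.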
NASA Astrophysics Data System (ADS)
Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard
2018-07-01
This work details the analysis of wafer level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
A global building inventory for earthquake loss estimation and risk management
Jaiswal, K.; Wald, D.; Porter, K.
2010-01-01
We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent
2010-01-01
This presentation further develops the orthogrid vehicle panel work. Hybrid Module capabilities were employed to assess both low/mid-frequency and high-frequency models in the VA One simulation environment, and the response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket-mode response, but at considerable analysis expense in time and resources; (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the periodic subsystem formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.
Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines
NASA Astrophysics Data System (ADS)
Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.
2016-12-01
Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
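The surrogate-plus-sensitivity workflow can be sketched in miniature. For brevity this sketch replaces Bayesian adaptive splines with an ordinary least-squares polynomial fit, and the "simulator" and its inputs are invented; it illustrates only the general pattern of fitting a cheap approximation and reusing it for global sensitivity analysis.

```python
import numpy as np

rng = np.random.default_rng(11)

def expensive_simulator(x1, x2):
    """Stand-in for a costly dispersion code: x1 matters much more than x2."""
    return np.sin(3 * x1) + 0.1 * x2

def features(x1, x2):
    """Polynomial basis used by the surrogate."""
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1**2, x2**2, x1 * x2, x1**3, x2**3])

# Train the surrogate on a modest design of simulator runs.
x1, x2 = rng.uniform(-1, 1, (2, 400))
y = expensive_simulator(x1, x2)
coef, *_ = np.linalg.lstsq(features(x1, x2), y, rcond=None)

def surrogate(x1, x2):
    return features(x1, x2) @ coef

# Crude one-at-a-time global sensitivity: output variance when sweeping one
# input with the other held fixed at zero.
grid = np.linspace(-1, 1, 200)
s1 = np.var(surrogate(grid, np.zeros_like(grid)))
s2 = np.var(surrogate(np.zeros_like(grid), grid))
```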
P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.
P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses, driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/), and many statistical functions can be used directly from an R package available on GitHub (https://github.com/pmartR).
USDA-ARS?s Scientific Manuscript database
Porcine reproductive and respiratory syndrome (PRRS) is the most economically significant viral disease facing the global swine industry. Viremia profiles of PRRS virus challenged pigs reflect the severity and progression of the infection within the host and provide crucial information for subsequen...
Factor Scores, Structure and Communality Coefficients: A Primer
ERIC Educational Resources Information Center
Odum, Mary
2011-01-01
The purpose of this paper is to present an easy-to-understand primer on three important concepts of factor analysis: factor scores, structure coefficients, and communality coefficients. Given that statistical analyses are part of a global general linear model (GLM) and utilize weights as an integral part of analyses (Thompson, 2006;…
Comment on "Habitat split and the global decline of amphibians".
Cannatella, David C
2008-05-16
Becker et al. (Reports, 14 December 2007, p. 1775) reported that forest amphibians with terrestrial development are less susceptible to the effects of habitat degradation than those with aquatic larvae. However, analysis with more appropriate statistical methods suggests there is no evidence for a difference between aquatic-reproducing and terrestrial-reproducing species.
NASA Astrophysics Data System (ADS)
Schmith, Torben; Thejll, Peter; Johansen, Søren
2016-04-01
We analyse the statistical relationship between changes in global temperature, global steric sea level and radiative forcing in order to reveal causal relationships. There are, however, potential pitfalls due to the trending nature of the time series. We therefore apply cointegration analysis, a statistical method originating in econometrics that correctly handles series with trends and other long-range dependencies. We find a relationship between steric sea level and temperature, in which temperature causally depends on the steric sea level; this can be understood as a consequence of the large heat capacity of the ocean. This result is obtained both when analyzing observed data and data from a CMIP5 historical model run. We further find that, in the data from the historical run, the steric sea level is in turn driven by the external forcing. Finally, we demonstrate that combining these two results can lead to a novel estimate of radiative forcing back in time based on observations.
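The pitfall that cointegration methods guard against can be illustrated with synthetic series (purely invented, not the observed records): regressing one random walk on an independent one leaves wandering, nonstationary residuals, whereas a genuinely cointegrated pair, sharing a common stochastic trend, leaves stationary residuals.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Independent random walks: any fitted relation between them is spurious.
walk_a = np.cumsum(rng.standard_normal(n))
walk_b = np.cumsum(rng.standard_normal(n))

# Cointegrated pair: "sea" shares the stochastic trend driving "temp".
common_trend = np.cumsum(rng.standard_normal(n))
temp = common_trend + rng.standard_normal(n)
sea = 0.5 * common_trend + rng.standard_normal(n)

def residual_lag1_autocorr(y, x):
    """Regress y on x and report the lag-1 autocorrelation of the residuals;
    values near 1 indicate nonstationary residuals, i.e. a spurious fit."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return np.corrcoef(resid[:-1], resid[1:])[0, 1]

rho_spurious = residual_lag1_autocorr(walk_a, walk_b)
rho_coint = residual_lag1_autocorr(temp, sea)
```

Formal cointegration tests (e.g. the Engle-Granger procedure) replace this eyeball diagnostic with a unit-root test on the residuals.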
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing
Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs) characterized by the so-called “outsourcing”, “fragmentation production”, and “trade in tasks” has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics. PMID:28081201
Dymova, Natalya; Hanumara, R. Choudary; Enander, Richard T.; Gagnon, Ronald N.
2009-01-01
Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies. PMID:19696393
Dymova, Natalya; Hanumara, R Choudary; Enander, Richard T; Gagnon, Ronald N
2009-10-01
Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies.
2005-11-01
more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and... analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one... traffic—for example, time series of flux at given nodes and mean path length. Outputs the time series from any node queried. Calculates
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
Generalized Functional Linear Models for Gene-based Case-Control Association Studies
Mills, James L.; Carter, Tonia C.; Lobach, Iryna; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Weeks, Daniel E.; Xiong, Momiao
2014-01-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene are disease-related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease data sets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. PMID:25203683
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathaye, Jayant A.
2000-04-01
Integrated assessment (IA) modeling of climate policy is increasingly global in nature, with models incorporating regional disaggregation. The existing empirical basis for IA modeling, however, largely arises from research on industrialized economies. Given the growing importance of developing countries in determining long-term global energy and carbon emissions trends, filling this gap with improved statistical information on developing countries' energy and carbon-emissions characteristics is an important priority for enhancing IA modeling. Earlier research at LBNL on this topic has focused on assembling and analyzing statistical data on productivity trends and technological change in the energy-intensive manufacturing sectors of five developing countries: India, Brazil, Mexico, Indonesia, and South Korea. The proposed work will extend this analysis to the agriculture and electric power sectors in India, South Korea, and two other developing countries. It will also examine the impact of alternative model specifications on estimates of productivity growth and technological change for each of the three sectors, and estimate the contribution of various capital inputs (imported vs. indigenous, rigid vs. malleable) to productivity growth and technological change. The project has already produced a data resource on the manufacturing sector which is being shared with IA modelers. This will be extended to the agriculture and electric power sectors, which will also be made accessible to IA modeling groups seeking to enhance the empirical descriptions of developing-country characteristics. The project will entail basic statistical and econometric analysis of productivity and energy trends in these developing-country sectors, with parameter estimates also made available to modeling groups.
The parameter estimates will be developed using alternative model specifications that could be directly utilized by the existing IAMs for the manufacturing, agriculture, and electric power sectors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Zeli; Leung, L. Ruby; Li, Hongyi
Although sediment yield (SY) from water erosion is ubiquitous and its environmental consequences are well recognized, its impacts on the global carbon cycle remain largely uncertain. This knowledge gap is partly due to the lack of soil erosion modeling in Earth System Models (ESMs), which are important tools used to understand the global carbon cycle and explore its changes. This study analyzed sediment and particulate organic carbon yield (CY) data from 1081 and 38 small catchments (0.1-200 km²), respectively, in different environments across the globe. Using multiple statistical analysis techniques, we explored environmental factors and hydrological processes important for SY and CY modeling in ESMs. Our results show clear correlations of high SY with traditional agriculture, seismicity, and heavy storms, as well as strong correlations between SY and annual peak runoff. These findings highlight the potential limitation of SY models that represent only interrill and rill erosion, because shallow overland flow and rill flow have limited transport capacity, due to their hydraulic geometry, to produce high SY. Further, our results suggest that SY modeling in ESMs should be implemented at the event scale to capture the catastrophic mass transport during episodic events. Several environmental factors, such as seismicity and land management, that are often not considered in current catchment-scale SY models can be important in controlling global SY. Our analyses show that SY is likely the primary control on CY in small catchments, and a statistically significant empirical relationship is established to calculate SY and CY jointly in ESMs.
How Close Do We Live to Water? A Global Analysis of Population Distance to Freshwater Bodies
Kummu, Matti; de Moel, Hans; Ward, Philip J.; Varis, Olli
2011-01-01
Traditionally, people have inhabited places with ready access to fresh water. Today, over 50% of the global population lives in urban areas, and water can be directed via tens of kilometres of pipelines. Still, however, a large part of the world's population is directly dependent on access to natural freshwater sources. So how are inhabited places related to the location of freshwater bodies today? We present a high-resolution global analysis of how close present-day populations live to surface freshwater. We aim to increase the understanding of the relationship between inhabited places, distance to surface freshwater bodies, and climatic characteristics in different climate zones and administrative regions. Our results show that over 50% of the world's population lives closer than 3 km to a surface freshwater body, and only 10% of the population lives further than 10 km away. There are, however, remarkable differences between administrative regions and climatic zones. Populations in Australia, Asia, and Europe live closest to water. Although populations in arid zones live furthest away from freshwater bodies in absolute terms, relatively speaking they live closest to water considering the limited number of freshwater bodies in those areas. Population distributions in arid zones show statistically significant relationships with a combination of climatic factors and distance to water, whilst in other zones there is no statistically significant relationship with distance to water. Global studies on development and climate adaptation can benefit from an improved understanding of these relationships between human populations and the distance to fresh water. PMID:21687675
NASA Astrophysics Data System (ADS)
Tumewu, Widya Anjelia; Wulan, Ana Ratna; Sanjaya, Yayan
2017-05-01
The purpose of this study was to compare the effectiveness of Project-based Learning (PjBL) and Discovery Learning (DL) with respect to students' metacognitive strategies on the global warming concept. A quasi-experimental Matching-Only Pretest-Posttest Control Group Design was used in this study. The subjects were seventh-grade students in two classes at a junior high school in Bandung City, West Java, in the 2015/2016 academic year. The study was conducted on two experimental classes: project-based learning was applied in experimental class I and discovery learning in experimental class II. Data were collected through a questionnaire on students' metacognitive strategies. The statistical analysis showed statistically significant differences in students' metacognitive strategies between project-based learning and discovery learning.
Scaling of global input-output networks
NASA Astrophysics Data System (ADS)
Liang, Sai; Qi, Zhengling; Qu, Shen; Zhu, Ji; Chiu, Anthony S. F.; Jia, Xiaoping; Xu, Ming
2016-06-01
Examining scaling patterns of networks can help understand how structural features relate to the behavior of the networks. Input-output networks consist of industries as nodes and inter-industrial exchanges of products as links. Previous studies consider limited measures for node strengths and link weights, and also ignore the impact of dataset choice. We consider a comprehensive set of indicators in this study that are important in economic analysis, and also examine the impact of dataset choice, by studying input-output networks in individual countries and the entire world. Results show that Burr, Log-Logistic, Log-normal, and Weibull distributions can better describe scaling patterns of global input-output networks. We also find that dataset choice has limited impacts on the observed scaling patterns. Our findings can help examine the quality of economic statistics, estimate missing data in economic statistics, and identify key nodes and links in input-output networks to support economic policymaking.
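The distribution comparison described above can be sketched with `scipy.stats`: fit each candidate family by maximum likelihood and compare via AIC. The data below are synthetic stand-ins for node strengths (the study's actual input-output tables are not reproduced here), and the fixed-location fit and AIC criterion are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for node strengths; drawn log-normal so at least
# one candidate family should describe the sample well.
strengths = rng.lognormal(mean=2.0, sigma=0.8, size=500)

candidates = {
    "lognorm": stats.lognorm,
    "weibull_min": stats.weibull_min,
    "fisk": stats.fisk,        # log-logistic in SciPy's naming
    "burr12": stats.burr12,    # Burr (Type XII)
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(strengths, floc=0)       # fix location at 0
    ll = np.sum(dist.logpdf(strengths, *params))
    k = len(params)
    results[name] = 2 * k - 2 * ll             # AIC: lower is better

best = min(results, key=results.get)
print("AIC by family:", results)
print("best-fitting family:", best)
```

In practice one would repeat this per indicator (node strength, link weight) and per dataset to check whether the ranking of families is stable.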
NASA Astrophysics Data System (ADS)
Capozzi, Francesco; Lisi, Eligio; Marrone, Antonio
2016-04-01
Within the standard 3ν oscillation framework, we illustrate the status of currently unknown oscillation parameters: the θ23 octant, the mass hierarchy (normal or inverted), and the possible CP-violating phase δ, as derived by a (preliminary) global analysis of oscillation data available in 2015. We then discuss some challenges that will be faced by future, high-statistics analyses of spectral data, starting with one-dimensional energy spectra in reactor experiments, and concluding with two-dimensional energy-angle spectra in large-volume atmospheric experiments. It is shown that systematic uncertainties in the spectral shapes can noticeably affect the prospective sensitivities to unknown oscillation parameters, in particular to the mass hierarchy.
Global sensitivity analysis of multiscale properties of porous materials
NASA Astrophysics Data System (ADS)
Um, Kimoon; Zhang, Xuan; Katsoulakis, Markos; Plechac, Petr; Tartakovsky, Daniel M.
2018-02-01
Ubiquitous uncertainty about pore geometry inevitably undermines the veracity of pore- and multi-scale simulations of transport phenomena in porous media. It raises two fundamental issues: sensitivity of effective material properties to pore-scale parameters and statistical parameterization of Darcy-scale models that accounts for pore-scale uncertainty. Homogenization-based maps of pore-scale parameters onto their Darcy-scale counterparts facilitate both sensitivity analysis (SA) and uncertainty quantification. We treat uncertain geometric characteristics of a hierarchical porous medium as random variables to conduct global SA and to derive probabilistic descriptors of effective diffusion coefficients and effective sorption rate. Our analysis is formulated in terms of solute transport diffusing through a fluid-filled pore space, while sorbing to the solid matrix. Yet it is sufficiently general to be applied to other multiscale porous media phenomena that are amenable to homogenization.
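The global SA described above is variance-based; a minimal pick-freeze estimator of first-order Sobol indices can be sketched in pure NumPy. The model function, input distributions, and sample size below are illustrative assumptions standing in for the homogenized pore-to-Darcy map, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(11)

def f(x):
    # Cheap stand-in for the map from uncertain pore-scale parameters
    # to an effective property (illustrative linear model).
    return x[:, 0] + 2.0 * x[:, 1]

N, d = 100_000, 2
A = rng.normal(size=(N, d))   # two independent sample matrices
B = rng.normal(size=(N, d))

fA = f(A)
var = fA.var()
S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]       # re-sample only the i-th input
    # First-order Sobol estimator (Saltelli 2010 form)
    S[i] = np.mean(f(B) * (f(ABi) - fA)) / var

print("first-order Sobol indices:", S)
```

For this linear toy model with unit-variance inputs the analytic indices are 0.2 and 0.8, which the Monte Carlo estimates should approach.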
NASA Astrophysics Data System (ADS)
Hennig, R. J.; Friedrich, J.; Malaguzzi Valeri, L.; McCormick, C.; Lebling, K.; Kressig, A.
2016-12-01
The Power Watch project will offer open data on the global electricity sector, starting with power plants and their impacts on climate and water systems; it will also offer visualizations and decision-making tools. Power Watch will create the first comprehensive, open database of power plants globally by compiling data from national governments, public and private utilities, transmission grid operators, and other data providers to create a core dataset covering over 80% of global installed capacity for electrical generation. Power plant data will at a minimum include latitude and longitude, capacity, fuel type, emissions, water usage, ownership, and annual generation. By making comprehensive data publicly available, this project will support decision making and analysis by actors across the economy and in the research community. The Power Watch research effort focuses on creating a global standard for power plant information, gathering and standardizing data from multiple sources, matching information from multiple sources at the plant level, testing cross-validation approaches (regional statistics, crowdsourcing, satellite data, and others), and developing estimation methodologies for generation, emissions, and water usage. When not available from official reports, emissions, annual generation, and water usage will be estimated. Water use estimates of power plants will be based on capacity, fuel type, and satellite imagery to identify cooling types. This analysis is being piloted in several states in India and will then be scaled up to a global level. Other planned applications of the Power Watch data include improving understanding of energy access, air pollution, emissions estimation, stranded-asset analysis, life cycle analysis, tracking of proposed plants, and curtailment analysis.
The Effects of Global Warming on Temperature and Precipitation Trends in Northeast America
NASA Astrophysics Data System (ADS)
Francis, F.
2013-12-01
The objective of this paper is to discuss the analysis of temperature and precipitation (rainfall) data and how the results relate to the theory of global warming in Northeast America. The topic was chosen because it shows the trends in temperature and precipitation and their relation to global warming. Data were collected from the Global Historical Climatology Network (GHCN), covering the years 1973 to 2012. We calculated yearly and monthly regressions to estimate the relationships among the variables found in the individual sources. Using specially designed software, analysis, and manual calculations, we visualize these trends in precipitation and temperature and ask whether they are consistent with the theory of global warming. From the calculated trend slopes we interpreted the changes in minimum and maximum temperature and in precipitation. Precipitation increased by 9.5% over the past forty years, and maximum temperature increased by 1.9%, while a greater increase of 3.3% was calculated for minimum temperature. The trends in precipitation and in maximum and minimum temperature are statistically significant at the 95% level.
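A trend analysis of this kind reduces to an ordinary least-squares regression of the variable against year, with the slope's p-value giving the significance test. A minimal sketch with `scipy.stats.linregress`, using synthetic temperatures in place of the GHCN series (values illustrative only):

```python
import numpy as np
from scipy import stats

# Synthetic annual mean temperatures for 1973-2012 (40 years),
# standing in for the GHCN station series used in the study.
years = np.arange(1973, 2013)
rng = np.random.default_rng(42)
temps = 12.0 + 0.02 * (years - 1973) + rng.normal(0.0, 0.15, years.size)

fit = stats.linregress(years, temps)
print(f"trend slope = {fit.slope:.4f} deg/yr, p = {fit.pvalue:.3g}")

# Statistically significant at the 95% level if p < 0.05
significant = fit.pvalue < 0.05
print("significant at 95% level:", significant)
```

The same fit, applied month by month, yields the monthly regressions mentioned in the abstract.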
NASA Astrophysics Data System (ADS)
Estrada, Francisco; Perron, Pierre; Martínez-López, Benjamín
2013-12-01
The warming of the climate system is unequivocal as evidenced by an increase in global temperatures by 0.8°C over the past century. However, the attribution of the observed warming to human activities remains less clear, particularly because of the apparent slow-down in warming since the late 1990s. Here we analyse radiative forcing and temperature time series with state-of-the-art statistical methods to address this question without climate model simulations. We show that long-term trends in total radiative forcing and temperatures have largely been determined by atmospheric greenhouse gas concentrations, and modulated by other radiative factors. We identify a pronounced increase in the growth rates of both temperatures and radiative forcing around 1960, which marks the onset of sustained global warming. Our analyses also reveal a contribution of human interventions to two periods when global warming slowed down. Our statistical analysis suggests that the reduction in the emissions of ozone-depleting substances under the Montreal Protocol, as well as a reduction in methane emissions, contributed to the lower rate of warming since the 1990s. Furthermore, we identify a contribution from the two world wars and the Great Depression to the documented cooling in the mid-twentieth century, through lower carbon dioxide emissions. We conclude that reductions in greenhouse gas emissions are effective in slowing the rate of warming in the short term.
Probabilistic assessment of sea level during the last interglacial stage.
Kopp, Robert E; Simons, Frederik J; Mitrovica, Jerry X; Maloof, Adam C; Oppenheimer, Michael
2009-12-17
With polar temperatures approximately 3-5 °C warmer than today, the last interglacial stage (approximately 125 kyr ago) serves as a partial analogue for 1-2 °C global warming scenarios. Geological records from several sites indicate that local sea levels during the last interglacial were higher than today, but because local sea levels differ from global sea level, accurately reconstructing past global sea level requires an integrated analysis of globally distributed data sets. Here we present an extensive compilation of local sea level indicators and a statistical approach for estimating global sea level, local sea levels, ice sheet volumes and their associated uncertainties. We find a 95% probability that global sea level peaked at least 6.6 m higher than today during the last interglacial; it is likely (67% probability) to have exceeded 8.0 m but is unlikely (33% probability) to have exceeded 9.4 m. When global sea level was close to its current level (≥ -10 m), the millennial average rate of global sea level rise is very likely to have exceeded 5.6 m kyr⁻¹ but is unlikely to have exceeded 9.2 m kyr⁻¹. Our analysis extends previous last interglacial sea level studies by integrating literature observations within a probabilistic framework that accounts for the physics of sea level change. The results highlight the long-term vulnerability of ice sheets to even relatively low levels of sustained global warming.
Raymond L. Czaplewski
1989-01-01
It is difficult to design systems for national and global resource inventory and analysis that efficiently satisfy changing, and increasingly complex objectives. It is proposed that individual inventory, monitoring, modeling, and remote sensing systems be specialized to achieve portions of the objectives. These separate systems can be statistically linked to accomplish...
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johannesson, G
2010-03-17
Future climate change has emerged as a national and global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed at both the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4°C in region R if global CO₂ emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. Answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emission of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate? (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in simulating future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS).
Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but the topic has been extensively studied; the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references therein. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we will explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
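A minimal sketch of the regression flavour of statistical downscaling described above: fit an empirical transfer function between a synoptic (GCM-scale) predictor and a local observed variable over a training period, then push an ensemble of future GCM projections through it so the across-model spread propagates to the local scale. All data and coefficients below are synthetic placeholders, not the report's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Training period: observed local temperature vs. coarse GCM predictor ---
n_train = 50
gcm_hist = rng.normal(15.0, 1.0, n_train)                  # synoptic predictor
local_obs = 0.8 * gcm_hist + 3.0 + rng.normal(0, 0.3, n_train)

# Least-squares fit of the empirical transfer function
A = np.vstack([gcm_hist, np.ones(n_train)]).T
coef, intercept = np.linalg.lstsq(A, local_obs, rcond=None)[0]

# --- Apply the fitted relation to an ensemble of future GCM projections ---
# Each row is one GCM's projected predictor series (synthetic placeholders).
ensemble = 17.0 + rng.normal(0.0, 0.5, size=(5, 30))
local_proj = coef * ensemble + intercept

# The across-model spread at the local scale reflects GCM uncertainty.
print("ensemble-mean local projection:", local_proj.mean())
print("across-model spread (std of model means):", local_proj.mean(axis=1).std())
```

Treating each GCM as one row of the ensemble is the simplest way to let the method's output carry the projection uncertainty the report aims to quantify.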
Statistical ecology comes of age.
Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-12-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
NASA Astrophysics Data System (ADS)
Kantar, Ersin; Keskin, Mustafa; Deviren, Bayram
2012-04-01
We have analyzed the network topology of 50 important Turkish companies for the period 2006-2010 using hierarchical methods (the minimal spanning tree (MST) and hierarchical tree (HT)). We investigated the statistical reliability of links between companies in the MST by using the bootstrap technique. We also used the average linkage cluster analysis (ALCA) technique to observe the cluster structures more clearly. The MST and HT are known as useful tools to perceive and detect global structure, taxonomy, and hierarchy in financial data. We obtained four clusters of companies according to their proximity. We also observed that the Banks and Holdings cluster always forms at the centre of the MSTs for the periods 2006-2007, 2008, and 2009-2010. The clusters match nicely with the companies' common production activities or their strong interrelationships. The influence of the automobile sector increased after the global financial crisis due to the temporary incentives provided by the Turkish government. We find that Turkish companies were not strongly affected by the global financial crisis.
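MST construction of this kind typically uses Mantegna's correlation distance, d_ij = sqrt(2(1 - rho_ij)), computed from return correlations. A minimal sketch with SciPy, on synthetic returns standing in for the 50 companies' data (the block structure below is an illustrative assumption to mimic sector clusters):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(7)

# Synthetic daily returns for 6 hypothetical companies; two correlated
# blocks mimic sector clusters (real input would be stock returns).
n_days, n = 500, 6
common_a = rng.normal(size=n_days)
common_b = rng.normal(size=n_days)
returns = np.empty((n_days, n))
for i in range(n):
    base = common_a if i < 3 else common_b
    returns[:, i] = base + 0.8 * rng.normal(size=n_days)

# Mantegna's correlation distance: d_ij = sqrt(2 * (1 - rho_ij))
rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))

mst = minimum_spanning_tree(dist)          # sparse matrix with n-1 edges
edges = np.transpose(mst.nonzero())
print("MST edges:", edges.tolist())
```

Bootstrapping link reliability, as in the study, would amount to resampling the rows of `returns` and counting how often each MST edge recurs.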
Introduction of statistical information in a syntactic analyzer for document image recognition
NASA Astrophysics Data System (ADS)
Maroneze, André O.; Coüasnon, Bertrand; Lemaitre, Aurélie
2011-01-01
This paper presents an improvement to document layout analysis systems, offering a possible solution to Sayre's paradox (which states that an element "must be recognized before it can be segmented; and it must be segmented before it can be recognized"). This improvement, based on stochastic parsing, allows statistical information obtained from recognizers to be integrated during syntactic layout analysis. We present how this fusion of numeric and symbolic information in a feedback loop can be applied to syntactic methods to improve document description expressiveness. To limit combinatorial explosion during the exploration of solutions, we devised an operator that allows optional activation of the stochastic parsing mechanism. Our evaluation on 1250 handwritten business letters shows that this method improves global recognition scores.
Perneczky, R; Drzezga, A; Diehl-Schmid, J; Schmid, G; Wohlschläger, A; Kars, S; Grimmer, T; Wagenpfeil, S; Monsch, A; Kurz, A
2006-09-01
Functional imaging studies report that higher education is associated with more severe pathology in patients with Alzheimer's disease when controlling for disease severity. Schooling therefore seems to provide brain reserve against neurodegeneration. Our objective was to provide further evidence for brain reserve in a large sample, using a sensitive technique for the indirect assessment of brain abnormality (18F-fluoro-deoxy-glucose positron emission tomography (FDG-PET)), a comprehensive measure of global cognitive impairment to control for disease severity (the total score of the Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological Battery), and an approach unbiased by predefined regions of interest for the statistical analysis (statistical parametric mapping (SPM)). 93 patients with mild Alzheimer's disease and 16 healthy controls underwent 18F-FDG-PET imaging of the brain. A linear regression analysis with education as the independent variable and glucose utilisation as the dependent variable, adjusted for global cognitive status and demographic variables, was conducted in SPM2. The regression analysis showed a marked inverse association between years of schooling and glucose metabolism in the posterior temporo-occipital association cortex and the precuneus of the left hemisphere. In line with previous reports, the findings suggest that education is associated with brain reserve and that people with higher education can cope with brain damage for a longer time.
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
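The emulator-plus-history-matching idea above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a one-parameter toy function stands in for the expensive simulator, a simple Gaussian-process emulator supplies fast predictions with uncertainty, and the implausibility measure I(x) = |z − E[f(x)]| / √(Var_emulator + Var_obs) rules out parameter values with I ≥ 3. All names and numbers are illustrative assumptions.

```python
import numpy as np

def model(x):
    # Toy stand-in for an expensive systems-biology simulator (hypothetical).
    return np.sin(3 * x) + 0.5 * x

# Small training design for the emulator.
X = np.linspace(0, 2, 8)
y = model(X)

def gp_emulator(Xs, X, y, length=0.3, nugget=1e-8):
    """Gaussian-process emulator: posterior mean and variance at points Xs."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(X, X) + nugget * np.eye(len(X))
    Ks = k(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 0.0)

# "Observed" data z with measurement variance; one history-matching wave.
z = model(np.array([1.3]))[0]
var_obs = 0.01 ** 2
cand = np.linspace(0, 2, 401)               # candidate parameter values
mean, var = gp_emulator(cand, X, y)
impl = np.abs(z - mean) / np.sqrt(var + var_obs)
not_ruled_out = cand[impl < 3.0]            # 3-sigma implausibility cutoff
```

In a real application the surviving ("not ruled out yet") region would be re-sampled, the simulator re-run there, and the emulator refitted in the next wave.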
Alternative accounting in maternal and infant global health.
Adams, Vincanne; Craig, Sienna R; Samen, Arlene
2015-03-18
Efforts to augment accountability through the use of metrics, and especially randomised controlled trials or other statistical methods, place an increased burden on small nongovernmental organisations (NGOs) doing global health work. In this paper, we explore how one small NGO works to generate forms of accountability and evidence that may not conform to new metrics trends but nevertheless deserve attention and scrutiny for being effective, practical and reliable in the area of maternal and infant health. Through an analysis of one NGO and, in particular, its organisational and ethical principles for creating a network of safety for maternal and child health, we argue that alternative forms of (ac)counting like these might provide useful evidence of another kind of successful global health work.
NASA Astrophysics Data System (ADS)
Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.
2017-12-01
The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements and new applications that are not currently possible or likely even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT generated streamflow statistics to equivalent continuous daily discharge time series statistics intended to support hydrologic applications using low-flow and annual flow duration statistics.
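As a rough illustration of the comparison described above, the sketch below subsamples a synthetic daily discharge series at a fixed revisit interval (a simplified stand-in for the actual SWOT orbit sampling, which is irregular) and compares flow-duration statistics from the two series. All values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 3-year daily discharge series (hypothetical log-normal flows, m^3/s).
days = np.arange(3 * 365)
seasonal = 1.0 + 0.6 * np.sin(2 * np.pi * days / 365.0)
q_daily = seasonal * rng.lognormal(mean=3.0, sigma=0.5, size=days.size)

# SWOT-like sparse sampling, emulated here as one observation every 8 days
# (an assumption for illustration, not the real orbit revisit pattern).
q_swot = q_daily[::8]

def flow_duration(q, pct):
    """Flow exceeded pct% of the time (a flow-duration-curve quantile)."""
    return np.percentile(q, 100 - pct)

# Compare low-flow (Q90) and median (Q50) statistics from the two series.
q90_daily, q90_swot = flow_duration(q_daily, 90), flow_duration(q_swot, 90)
q50_daily, q50_swot = flow_duration(q_daily, 50), flow_duration(q_swot, 50)
rel_err_q90 = abs(q90_swot - q90_daily) / q90_daily
```

The relationship between sparse-sample and daily statistics, estimated over many reaches, is the kind of transform the abstract describes.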
NASA Astrophysics Data System (ADS)
Martinez, B. S.; Ye, H.; Levy, R. C.; Fetzer, E. J.; Remer, L.
2017-12-01
Atmospheric aerosols remain a source of high uncertainty in Earth's changing atmospheric energy budget. Continued exploration and analysis are necessary to better understand in which ways, and to what degree, aerosols contribute to climate feedbacks and global climate change. With the advent of global satellite retrievals, along with the specific aerosol optical depth (AOD) Dark Target and Deep Blue algorithms, aerosols can now be better measured and analyzed. Aerosol effects on climate depend primarily on altitude, the reflectance albedo of the underlying surface, and the presence of clouds and the dynamics thereof. As currently known, the majority of aerosol distribution and mixing occurs in the lower troposphere, from the surface upwards to around 2 km. Additionally, water vapor, as a primary greenhouse gas, is significant to climate feedbacks and Earth's radiation budget. Feedbacks are generally reported from the top of the atmosphere (TOA); therefore, little is known of the relationship between water vapor and aerosols, specifically in regions of the globe known for aerosol loading such as anthropogenic biomass burning in South America and naturally occurring dust blowing off the deserts of the African and Arabian peninsulas. Statistical regression and time-series analysis are used to identify statistically significant trends of increase and decrease in both regional precipitable water (PW) and AOD over the 13-year period 2003-2015. Regions with statistically significant positive or negative trends in AOD and PW are analyzed to determine correlations, or the lack thereof. This initial examination helps to better understand how aerosols contribute to the radiation budget and to assess climate change.
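A minimal sketch of the kind of trend-significance test described above, applied to a synthetic monthly AOD series for a single region (the values and trend size are illustrative assumptions, not satellite retrievals):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic 13-year monthly AOD series (hypothetical): a weak upward trend
# plus retrieval noise, mimicking a 2003-2015 regional record.
t = np.arange(13 * 12) / 12.0                 # time in years
aod = 0.25 + 0.004 * t + rng.normal(0, 0.02, t.size)

# Ordinary least-squares trend with a t-test on the slope.
res = stats.linregress(t, aod)
significant = res.pvalue < 0.05               # 95% confidence level
```

The same test run on both the AOD and PW series of a region, followed by a correlation of the two residual series, gives the correlation-versus-trend comparison the abstract outlines.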
Early-type galaxies in the Antlia cluster: catalogue and isophotal analysis
NASA Astrophysics Data System (ADS)
Calderón, Juan P.; Bassino, Lilia P.; Cellone, Sergio A.; Gómez, Matías
2018-06-01
We present a statistical isophotal analysis of 138 early-type galaxies in the Antlia cluster, located at a distance of ≈35 Mpc. The observational material consists of CCD images of four 36 × 36 arcmin² fields obtained with the MOSAIC II camera at the Blanco 4-m telescope at Cerro Tololo Interamerican Observatory. Our present work supersedes previous Antlia studies in the sense that the covered area is four times larger, the limiting magnitude is M_B ≈ -9.6 mag, and the surface photometry parameters of each galaxy are derived from Sérsic model fits extrapolated to infinity. In a previous companion study we focused on the scaling relations obtained by means of surface photometry, and now we present the data on which the previous paper is based, the parameters of the isophotal fits, as well as an isophotal analysis. For each galaxy, we derive isophotal shape parameters along the semimajor axis and search for correlations within different radial bins. Through extensive statistical tests, we also analyse the behaviour of these values against photometric and global parameters of the galaxies themselves. While some galaxies do display radial gradients in their ellipticity (ɛ) and/or their Fourier coefficients, differences in mean values between adjacent regions are not statistically significant. Regarding Fourier coefficients, dwarf galaxies usually display gradients between all adjacent regions, while non-dwarfs tend to show this behaviour just between the two outermost regions. Globally, there is no obvious correlation between Fourier coefficients and luminosity for the whole magnitude range (-12 ≳ M_V ≳ -22); however, dwarfs display much higher dispersions at all radii.
Quality of Life and Nutritional Status Among Cancer Patients on Chemotherapy
Vergara, Nunilon; Montoya, Jose Enrique; Luna, Herdee Gloriane; Amparo, Jose Roberto; Cristal-Luna, Gloria
2013-01-01
Objectives Malnutrition is prevalent among cancer patients and may be correlated with altered quality of life. The objective of this study is to determine whether quality of life among cancer patients on chemotherapy at the National Kidney and Transplant Institute-Cancer Unit differs from that of patients with normal nutrition based on the Subjective Global Assessment scale. Methods A cross-sectional study was conducted among cancer patients admitted for chemotherapy at the National Kidney and Transplant Institute-Cancer Unit from January to May 2011. Demographic profile, performance status by the Eastern Cooperative Oncology Group performance scale, nutritional status assessment by Subjective Global Assessment, and quality of life assessment by the European Organization for Research and Treatment of Cancer QoL-30 core module were obtained. Descriptive statistics and ANOVA were performed for analysis of quality of life parameters and nutritional status. Results A total of 97 subjects were included in this study: 66 subjects (68.04%) were female and 31 (31.96%) were male. Mean age was 54.55 ± 11.14 years, while mean performance status by the Eastern Cooperative Oncology Group classification was 0.88 ± 0.83 with a range of 0-3. According to the Subjective Global Assessment, 58 patients were classified SGA-A (adequate nutrition) and 39 patients (40.21%) were considered malnourished. Among these 39 patients, 32 were classified SGA-B (moderately malnourished) and 7 were classified SGA-C (severely malnourished). Mean global quality of life was 68.73 ± 19.05. Results from the ANOVA test revealed that patients differed statistically across the Subjective Global Assessment groups according to global quality of life (p<0.001), physical (p<0.001), role (p<0.001), emotional (p<0.001), and cognitive functioning (p<0.001); fatigue (p<0.001), nausea and vomiting (p<0.001), pain (p<0.001), insomnia (p<0.001), and appetite loss (p<0.001).
Conclusion Global quality of life and its parameters: physical state, role, emotional state, cognitive functioning, cancer fatigue, nausea and vomiting, pain, insomnia, and loss of appetite were statistically different across all Subjective Global Assessment groups. Moreover, there was no difference between financial difficulties, social functioning, constipation and diarrhea among the Subjective Global Assessment groups. PMID:23904921
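The one-way ANOVA across SGA groups reported above can be illustrated as follows, using hypothetical quality-of-life scores with the study's group sizes (58/32/7); the group means and spreads are invented for demonstration, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical global QoL scores (0-100 scale) for the three SGA groups; the
# group sizes mirror the study (58 well-nourished, 32 moderately, 7 severely
# malnourished), but the score distributions are invented.
sga_a = rng.normal(75, 15, 58)
sga_b = rng.normal(62, 15, 32)
sga_c = rng.normal(50, 15, 7)

# One-way ANOVA: does mean QoL differ across the nutrition groups?
f_stat, p_value = stats.f_oneway(sga_a, sga_b, sga_c)
```

A p-value below 0.05, as in the study's p<0.001 results, indicates that at least one group mean differs from the others; post-hoc pairwise tests would identify which.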
Quality of life and nutritional status among cancer patients on chemotherapy.
Vergara, Nunilon; Montoya, Jose Enrique; Luna, Herdee Gloriane; Amparo, Jose Roberto; Cristal-Luna, Gloria
2013-07-01
Malnutrition is prevalent among cancer patients and may be correlated with altered quality of life. The objective of this study is to determine whether quality of life among cancer patients on chemotherapy at the National Kidney and Transplant Institute-Cancer Unit differs from that of patients with normal nutrition based on the Subjective Global Assessment scale. A cross-sectional study was conducted among cancer patients admitted for chemotherapy at the National Kidney and Transplant Institute-Cancer Unit from January to May 2011. Demographic profile, performance status by the Eastern Cooperative Oncology Group performance scale, nutritional status assessment by Subjective Global Assessment, and quality of life assessment by the European Organization for Research and Treatment of Cancer QoL-30 core module were obtained. Descriptive statistics and ANOVA were performed for analysis of quality of life parameters and nutritional status. A total of 97 subjects were included in this study: 66 subjects (68.04%) were female and 31 (31.96%) were male. Mean age was 54.55 ± 11.14 years, while mean performance status by the Eastern Cooperative Oncology Group classification was 0.88 ± 0.83 with a range of 0-3. According to the Subjective Global Assessment, 58 patients were classified SGA-A (adequate nutrition) and 39 patients (40.21%) were considered malnourished. Among these 39 patients, 32 were classified SGA-B (moderately malnourished) and 7 were classified SGA-C (severely malnourished). Mean global quality of life was 68.73 ± 19.05. Results from the ANOVA test revealed that patients differed statistically across the Subjective Global Assessment groups according to global quality of life (p<0.001), physical (p<0.001), role (p<0.001), emotional (p<0.001), and cognitive functioning (p<0.001); fatigue (p<0.001), nausea and vomiting (p<0.001), pain (p<0.001), insomnia (p<0.001), and appetite loss (p<0.001).
Global quality of life and its parameters (physical state, role, emotional state, cognitive functioning, cancer fatigue, nausea and vomiting, pain, insomnia, and loss of appetite) were statistically different across all Subjective Global Assessment groups. Moreover, there was no difference in financial difficulties, social functioning, constipation and diarrhea among the Subjective Global Assessment groups.
Global universe anisotropy probed by the alignment of structures in the cosmic microwave background.
Wiaux, Y; Vielva, P; Martínez-González, E; Vandergheynst, P
2006-04-21
We question the global universe isotropy by probing the alignment of local structures in the cosmic microwave background (CMB) radiation. The original method proposed relies on a steerable wavelet decomposition of the CMB signal on the sphere. The analysis of the first-year Wilkinson Microwave Anisotropy Probe data identifies a mean preferred plane with a normal direction close to the CMB dipole axis, and a mean preferred direction in this plane, very close to the ecliptic poles axis. Previous statistical anisotropy results are thereby synthesized, but further analyses are still required to establish their origin.
Global sensitivity analysis in stochastic simulators of uncertain reaction networks.
Navarro Jimenez, M; Le Maître, O P; Knio, O M
2016-12-28
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.
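A minimal sketch of a Sobol first-order variance decomposition using the pick-freeze sampling estimator. A toy three-input function stands in for the stochastic simulator, with the third input playing the role of an inherent-noise variable; this illustrates the general variance-decomposition technique, not the authors' random-time-change algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(k1, k2, u):
    # Toy stand-in: output depends on two "kinetic parameters" k1, k2 and one
    # variable u representing an inherent stochastic reaction channel.
    return k1 + 0.5 * k2 ** 2 + 0.3 * u

n = 200_000
A = rng.uniform(0, 1, (n, 3))   # columns: k1, k2, u
B = rng.uniform(0, 1, (n, 3))   # independent resampling matrix

yA = simulator(*A.T)
var_y = yA.var()

def first_order_index(i):
    """Pick-freeze Sobol estimator: keep column i from A, resample the rest from B."""
    C = B.copy()
    C[:, i] = A[:, i]
    yC = simulator(*C.T)
    return (np.mean(yA * yC) - yA.mean() * yC.mean()) / var_y

S = [first_order_index(i) for i in range(3)]   # one index per input source
```

For this additive toy function the three first-order indices sum to roughly one; interaction terms would show up as a shortfall in that sum.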
Global sensitivity analysis in stochastic simulators of uncertain reaction networks
Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.
2016-12-23
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. Here, a sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.
Global sensitivity analysis in stochastic simulators of uncertain reaction networks
NASA Astrophysics Data System (ADS)
Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.
2016-12-01
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.
Recurrent jellyfish blooms are a consequence of global oscillations.
Condon, Robert H; Duarte, Carlos M; Pitt, Kylie A; Robinson, Kelly L; Lucas, Cathy H; Sutherland, Kelly R; Mianzan, Hermes W; Bogeberg, Molly; Purcell, Jennifer E; Decker, Mary Beth; Uye, Shin-ichi; Madin, Laurence P; Brodeur, Richard D; Haddock, Steven H D; Malej, Alenka; Parry, Gregory D; Eriksen, Elena; Quiñones, Javier; Acha, Marcelo; Harvey, Michel; Arthur, James M; Graham, William M
2013-01-15
A perceived recent increase in global jellyfish abundance has been portrayed as a symptom of degraded oceans. This perception is based primarily on a few case studies and anecdotal evidence, but a formal analysis of global temporal trends in jellyfish populations has been missing. Here, we analyze all available long-term datasets on changes in jellyfish abundance across multiple coastal stations, using linear and logistic mixed models and effect-size analysis to show that there is no robust evidence for a global increase in jellyfish. Although there has been a small linear increase in jellyfish since the 1970s, this trend was unsubstantiated by effect-size analysis that showed no difference in the proportion of increasing vs. decreasing jellyfish populations over all time periods examined. Rather, the strongest nonrandom trend indicated jellyfish populations undergo larger, worldwide oscillations with an approximate 20-y periodicity, including a rising phase during the 1990s that contributed to the perception of a global increase in jellyfish abundance. Sustained monitoring is required over the next decade to elucidate with statistical confidence whether the weak increasing linear trend in jellyfish after 1970 is an actual shift in the baseline or part of an oscillation. Irrespective of the nature of increase, given the potential damage posed by jellyfish blooms to fisheries, tourism, and other human industries, our findings foretell recurrent phases of rise and fall in jellyfish populations that society should be prepared to face.
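The linear-trend-versus-oscillation distinction drawn above can be illustrated with least-squares fits of both models to a synthetic abundance index containing a 20-year cycle. All numbers below are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic jellyfish-abundance index, 1970-2010: a ~20-year oscillation with a
# weak linear drift plus noise (invented values, for illustration only).
years = np.arange(1970, 2011)
t = (years - years[0]).astype(float)
index = 0.01 * t + np.sin(2 * np.pi * t / 20.0) + rng.normal(0, 0.3, t.size)

def r_squared(design, y):
    """R^2 of an ordinary least-squares fit with the given design matrix."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    return 1 - resid.var() / y.var()

ones = np.ones_like(t)
linear = np.column_stack([ones, t])                       # intercept + trend
oscill = np.column_stack([ones, t,                        # trend + 20-y cycle
                          np.sin(2 * np.pi * t / 20.0),
                          np.cos(2 * np.pi * t / 20.0)])

r2_linear, r2_oscill = r_squared(linear, index), r_squared(oscill, index)
```

When a strong oscillation is present, the trend-only model captures little of the variance while the trend-plus-cycle model fits well, which is the pattern the effect-size analysis above points to.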
NASA Astrophysics Data System (ADS)
Mendez, F. J.; Rueda, A.; Barnard, P.; Mori, N.; Nakajo, S.; Espejo, A.; del Jesus, M.; Diez Sierra, J.; Cofino, A. S.; Camus, P.
2016-02-01
Hurricanes hitting California have a very low occurrence probability due to typically cool ocean temperatures and westward tracks. However, the damage associated with these improbable events would be dramatic in Southern California, and understanding the oceanographic and atmospheric drivers is of paramount importance for coastal risk management for present and future climates. A statistical analysis of the historical events is very difficult due to the limited resolution of the atmospheric and oceanographic forcing data available. In this work, we propose a combination of: (a) statistical downscaling methods (Espejo et al., 2015); and (b) a synthetic stochastic tropical cyclone (TC) model (Nakajo et al., 2014). To build the statistical downscaling model, Y=f(X), we apply a combination of principal component analysis and the k-means classification algorithm to find representative patterns from a potential TC index derived from large-scale SST fields in the Eastern Central Pacific (predictor X) and the associated tropical cyclone occurrence (predictand Y). SST data come from NOAA Extended Reconstructed SST V3b, providing information from 1854 to 2013 on a 2.0 degree x 2.0 degree global grid. As data for the historical occurrence and paths of tropical cyclones are scarce, we apply a stochastic TC model which is based on a Monte Carlo simulation of the joint distribution of track, minimum sea level pressure and translation speed of the historical events in the Eastern Central Pacific Ocean. Results will show the ability of the approach to explain seasonal-to-interannual variability of the predictor X, which is clearly related to the El Niño Southern Oscillation. References: Espejo, A., Méndez, F.J., Diez, J., Medina, R., Al-Yahyai, S. (2015) Seasonal probabilistic forecasting of tropical cyclone activity in the North Indian Ocean, Journal of Flood Risk Management, DOI: 10.1111/jfr3.12197. Nakajo, S., N. Mori, T. Yasuda, and H. Mase (2014) Global Stochastic Tropical Cyclone Model Based on Principal Component Analysis and Cluster Analysis, Journal of Applied Meteorology and Climatology, DOI: 10.1175/JAMC-D-13-08.1.
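The predictor construction described above (principal component analysis followed by k-means on the leading components) can be sketched as follows, with a synthetic SST-anomaly matrix standing in for the ERSST V3b fields; the grid size, number of modes and number of clusters are assumptions for illustration:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(5)

# Synthetic monthly SST anomaly fields (hypothetical): 240 months x 100 grid
# cells, built from two spatial modes (an ENSO-like dipole and a basin-wide mode).
months, cells = 240, 100
mode1 = np.sign(np.arange(cells) - cells / 2).astype(float)   # dipole pattern
mode2 = np.ones(cells)                                        # uniform pattern
amp = rng.normal(size=(months, 2))
sst = (amp[:, [0]] * mode1 + 0.5 * amp[:, [1]] * mode2
       + 0.1 * rng.normal(size=(months, cells)))

# PCA via SVD of the centered anomaly matrix.
sst_c = sst - sst.mean(axis=0)
U, s, Vt = np.linalg.svd(sst_c, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)       # fraction of variance per mode
pcs = U[:, :2] * s[:2]                    # leading principal components

# k-means on the leading PCs yields representative large-scale SST patterns,
# which can then be linked to observed TC occurrence (the predictand Y).
centroids, labels = kmeans2(pcs, k=4, minit="++", seed=5)
```

Each month is assigned to one weather-type centroid; TC occurrence statistics conditioned on these types form the downscaling model Y=f(X).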
NASA Technical Reports Server (NTRS)
Mach, Douglas M.; Blakeslee, R. J.; Bateman, M. J.; Bailey, J. C.
2011-01-01
We have combined analyses of over 1000 high altitude aircraft observations of electrified clouds with diurnal lightning statistics from the Lightning Imaging Sensor (LIS) and Optical Transient Detector (OTD) to produce an estimate of the diurnal variation in the global electric circuit. Using basic assumptions about the mean storm currents as a function of flash rate and location, and the global electric circuit, our estimate of the current in the global electric circuit matches the Carnegie curve diurnal variation to within 4% for all but two short periods of time. The agreement with the Carnegie curve was obtained without any tuning or adjustment of the satellite or aircraft data. Mean contributions to the global electric circuit from land and ocean thunderstorms are 1.1 kA (land) and 0.7 kA (ocean). Contributions to the global electric circuit from electrified shower clouds (ESCs) are 0.22 kA for ocean storms and 0.04 kA for land storms. Using our analysis, the mean total conduction current for the global electric circuit is 2.0 kA.
Nour-Eldein, Hebatallah; Abdelsalam, Shimaa A.; Nasr, Gamila M.; Abdelwahed, Hassan A.
2013-01-01
Background: The close, sustained contact of family physicians with their patients and local community makes preventive care an integral part of their routine work. Most cardiovascular diseases (CVD) can be prevented by addressing their risk factors. There are several guidelines that recommend different CV risk assessment tools to support CV prevention strategies. Aim: This study aimed to assess the awareness and attitude of family physicians towards global CV risk assessment and their use of its tools, with the aim of improving CV prevention services. Methods: The current study is a cross-sectional descriptive analytic study. Sixty-five family physicians were asked to respond to a validated anonymous questionnaire to collect data about their characteristics, awareness, attitude, current use, barriers, and recommendations regarding global CV risk assessment. Statistical Package for the Social Sciences (SPSS) version 18 was used for data entry and analysis. Results: Awareness of guidelines for global CV risk assessment was relatively higher for the American guidelines (30.8%) than for those recommended by the World Health Organization (WHO) for Egypt (20.2%). 50.8% of participants had a favorable attitude. There was a statistically significant relationship between attitude scores and physician characteristics: age (P = 0.003), qualification (P = 0.001) and number of patients seen per week (P = 0.009). Routine use of global CV risk assessment tools was reported by only 23% of family physicians. Conclusion: Attitude scores were relatively higher than actual use of global CV risk assessment tools in practice. The most frequent barriers were related to lack of resources and shortage of training/skills, and the suggestions raised centred on training. PMID:26664843
Nour-Eldein, Hebatallah; Abdelsalam, Shimaa A; Nasr, Gamila M; Abdelwahed, Hassan A
2013-01-01
The close, sustained contact of family physicians with their patients and local community makes preventive care an integral part of their routine work. Most cardiovascular diseases (CVD) can be prevented by addressing their risk factors. There are several guidelines that recommend different CV risk assessment tools to support CV prevention strategies. This study aimed to assess the awareness and attitude of family physicians towards global CV risk assessment and their use of its tools, with the aim of improving CV prevention services. The current study is a cross-sectional descriptive analytic study. Sixty-five family physicians were asked to respond to a validated anonymous questionnaire to collect data about their characteristics, awareness, attitude, current use, barriers, and recommendations regarding global CV risk assessment. Statistical Package for the Social Sciences (SPSS) version 18 was used for data entry and analysis. Awareness of guidelines for global CV risk assessment was relatively higher for the American guidelines (30.8%) than for those recommended by the World Health Organization (WHO) for Egypt (20.2%). 50.8% of participants had a favorable attitude. There was a statistically significant relationship between attitude scores and physician characteristics: age (P = 0.003), qualification (P = 0.001) and number of patients seen per week (P = 0.009). Routine use of global CV risk assessment tools was reported by only 23% of family physicians. Attitude scores were relatively higher than actual use of global CV risk assessment tools in practice. The most frequent barriers were related to lack of resources and shortage of training/skills, and the suggestions raised centred on training.
Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies
Liu, Zhonghua; Lin, Xihong
2017-01-01
We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
Multiple phenotype association tests using summary statistics in genome-wide association studies.
Liu, Zhonghua; Lin, Xihong
2018-03-01
We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
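One common way to combine correlated per-phenotype summary statistics, consistent with the setting described above though not necessarily the authors' exact mixed-model test, is an omnibus chi-square statistic z'R⁻¹z, where R is the between-phenotype correlation matrix estimated from genome-wide null summary statistics. A minimal sketch with an assumed R:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Assumed between-phenotype correlation matrix for 3 phenotypes; in practice
# R is estimated from the z-scores of many null variants across the genome.
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

# Simulate null z-scores with the same correlation structure.
L = np.linalg.cholesky(R)
z_null = L @ rng.normal(size=3)

def joint_test(z, R):
    """Omnibus chi-square test of association across correlated phenotypes."""
    t = z @ np.linalg.solve(R, z)          # z' R^{-1} z ~ chi2_K under the null
    return stats.chi2.sf(t, df=len(z))

p_null = joint_test(z_null, R)
p_signal = joint_test(np.array([4.0, 3.5, 2.0]), R)   # strongly associated variant
```

Because everything is computed from z-scores and R, no individual-level genotype or phenotype data are needed, which is the practical appeal the abstract highlights.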
Exploring and Analyzing Climate Variations Online by Using MERRA-2 data at GES DISC
NASA Astrophysics Data System (ADS)
Shen, S.; Ostrenga, D.; Vollmer, B.; Kempler, S.
2016-12-01
NASA Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) (http://giovanni.sci.gsfc.nasa.gov/giovanni/) is a web-based data visualization and analysis system developed by the Goddard Earth Sciences Data and Information Services Center (GES DISC). Current data analysis functions include Lat-Lon map, time series, scatter plot, correlation map, difference, cross-section, vertical profile, and animation, etc. The system enables basic statistical analysis and comparisons of multiple variables. This web-based tool facilitates data discovery, exploration and analysis of large amounts of global and regional remote sensing and model data sets from a number of NASA data centers. Recently, long-term global assimilated atmospheric, land, and ocean data have been integrated into the system, enabling quick exploration and analysis of climate data without downloading and preprocessing the data. Example data include climate reanalysis from the NASA Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), which provides data from 1980 to the present; land data from the NASA Global Land Data Assimilation System (GLDAS), which assimilates data from 1948 to 2012; as well as ocean biological data from the NASA Ocean Biogeochemical Model (NOBM), which assimilates data from 1998 to 2012. This presentation, using surface air temperature, precipitation, ozone, and aerosol, etc. from MERRA-2, demonstrates climate variation analysis with Giovanni at selected regions.
Exploring and Analyzing Climate Variations Online by Using NASA MERRA-2 Data at GES DISC
NASA Technical Reports Server (NTRS)
Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Kempler, Steven J.
2016-01-01
NASA Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) (http://giovanni.sci.gsfc.nasa.gov/giovanni/) is a web-based data visualization and analysis system developed by the Goddard Earth Sciences Data and Information Services Center (GES DISC). Current data analysis functions include Lat-Lon maps, time series, scatter plots, correlation maps, differences, cross-sections, vertical profiles, and animations. The system enables basic statistical analysis and comparison of multiple variables. This web-based tool facilitates data discovery, exploration, and analysis of a large number of global and regional remote sensing and model data sets from a number of NASA data centers. Long-term global assimilated atmospheric, land, and ocean data have been integrated into the system, enabling quick exploration and analysis of climate data without downloading, preprocessing, and learning data formats. Example data include climate reanalysis data from the NASA Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), which provides data from 1980 to the present; land data from the NASA Global Land Data Assimilation System (GLDAS), which assimilates data from 1948 to 2012; and ocean biological data from the NASA Ocean Biogeochemical Model (NOBM), which provides data from 1998 to 2012. This presentation uses surface air temperature, precipitation, ozone, and aerosol data from MERRA-2 to demonstrate climate variation analysis with Giovanni for selected regions.
NASA Astrophysics Data System (ADS)
Shah, Munawar; Jin, Shuanggen
2015-12-01
Pre-earthquake ionospheric anomalies remain challenging to detect and understand, particularly across different earthquake magnitudes, focal depths, and fault types. In this paper, the seismo-ionospheric disturbances (SID) related to 1492 global earthquakes with Mw ≥ 5.0 from 1998 to 2014 are investigated using the total electron content (TEC) of GPS global ionosphere maps (GIM). Statistical analysis of the 10 days of TEC data preceding global Mw ≥ 5.0 earthquakes shows significant enhancement 5 days before earthquakes of Mw ≥ 6.0 at a 95% confidence level. Earthquakes with a focal depth of less than 60 km and Mw ≥ 6.0 are presumably the source of the deviations in the ionospheric TEC, because earthquake breeding zones concentrate gigantic quantities of energy at shallower focal depths. Increased anomalous TEC is recorded in the cumulative percentages beyond Mw = 5.5, and the cumulative percentages of seismo-ionospheric disturbance rise sharply prior to Mw ≥ 6.0 earthquakes. Seismo-ionospheric disturbances related to strike-slip and thrust earthquakes are noticeable for Mw 6.0-7.0 events. The relative values reveal high ratios (up to 2) for positive anomalies and low ratios (down to -0.5) for negative anomalies within 5 days prior to global earthquakes. The anomalous patterns in TEC related to earthquakes are possibly due to the coupling of large amounts of energy from the breeding zones of earthquakes of higher magnitude and shallower focal depth.
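A common screening rule in this literature flags TEC values that fall outside running bounds built from the preceding days, typically the median plus or minus a multiple of the interquartile range. The sketch below is a generic variant of that idea on synthetic data, with illustrative window length and multiplier; it is not necessarily the authors' exact procedure.

```python
import numpy as np

def tec_anomaly_flags(tec, window=15, k=1.5):
    """Flag TEC values outside median +/- k*IQR of the preceding
    `window` days (a common seismo-ionospheric screening rule;
    the window and k here are illustrative choices)."""
    tec = np.asarray(tec, float)
    lower = np.full(tec.shape, np.nan)
    upper = np.full(tec.shape, np.nan)
    flags = np.zeros(tec.shape, bool)
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        q1, med, q3 = np.percentile(past, [25, 50, 75])
        iqr = q3 - q1
        lower[i], upper[i] = med - k * iqr, med + k * iqr
        flags[i] = tec[i] > upper[i] or tec[i] < lower[i]
    return flags, lower, upper

rng = np.random.default_rng(1)
tec = 20 + rng.normal(0, 1, 60)  # synthetic quiet-time TEC (TECU)
tec[50] += 10.0                  # injected positive anomaly
flags, lo, hi = tec_anomaly_flags(tec)
```

The injected enhancement at day 50 is flagged because it exceeds the upper bound computed from the 15 preceding quiet days.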
How does living with HIV impact on women's mental health? Voices from a global survey.
Orza, Luisa; Bewley, Susan; Logie, Carmen H; Crone, Elizabeth Tyler; Moroz, Svetlana; Strachan, Sophie; Vazquez, Marijo; Welbourn, Alice
2015-01-01
Women living with HIV experience a disproportionate burden of mental health issues. To date, global guidelines contain insufficient guidance on mental health support, particularly regarding perinatal care. The aim of this article is to describe the extent and impact of mental health issues as experienced by women living with HIV on their sexual and reproductive health and human rights (SRH&HR). A global, mixed-methods, user-led and designed survey on SRH&HR of women living with HIV was conducted using snowball sampling, containing an optional section exploring mental health issues. Statistical quantitative data analysis included descriptive statistics, correlation and multiple linear regression analysis for the mental health responses. Thematic analysis of open free-text responses was performed for qualitative data. A total of 832 respondents from 94 countries participated in the online survey with 489 responses to the optional mental health section. Of the respondents, 82% reported depression symptoms and 78% rejection. One-fifth reported mental health issues before HIV diagnosis. Respondents reported experiencing a 3.5-fold higher number of mental health issues after diagnosis (8.71 vs 2.48, t[488]=23.00, p<0.001). Nearly half (n=224; 45.8%) had multiple socially disadvantaged identities (SDIs). The number of SDIs was positively correlated with experiencing mental health issues (p<0.05). Women described how mental health issues affected their ability to enjoy their right to sexual and reproductive health and to access services. These included depression, rejection and social exclusion, sleep problems, intersectional stigma, challenges with sexual and intimate relationships, substance use and sexual risk, reproductive health barriers and human rights (HR) violations. Respondents recommended that policymakers and clinicians provide psychological support and counselling, funding for peer support and interventions to challenge gender-based violence and to promote HR. 
Interventions addressing intersecting stigmas and the particular impacts of diagnosis during pregnancy are required to ensure women's SRH&HR. Global policy guidelines regarding women living with HIV must incorporate mental health considerations.
Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...
2015-07-01
In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
ERIC Educational Resources Information Center
Watanabe, Satoshi; Murasawa, Masataka; Abe, Yasumi
2013-01-01
The increasingly competitive and globalizing environment of today's higher education market has compelled many colleges and universities around the world to revamp their academic programs and organizational structures by responsively addressing various contemporary issues raised by internal as well as external stakeholders. It is no exception that…
An analysis of tropical hardwood product importation and consumption in the United States
Paul M. Smith; Michael P. Haas; William G. Luppold
1995-01-01
The consumption of forest products emanating from tropical rainforests is an issue that is receiving increasing attention in the United States. This attention stems from concerns over the sustainability of tropical ecosystems. However, trade statistics show the United States imported only 4.0 percent of all tropical timber products traded globally in 1989. In addition...
ERIC Educational Resources Information Center
Terasawa, Takunori
2018-01-01
This paper aims to establish that globalised social and linguistic changes have a more complicated impact on local behaviours and attitudes than is believed. Based on statistical analysis of nationally representative surveys in Japan, the paper presents evidence against the following two propositions: (1) globalisation increases local demand for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my; Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
Vallabhajosyula, Saraschandra; Rayes, Hamza A; Sakhuja, Ankit; Murad, Mohammad Hassan; Geske, Jeffrey B; Jentzer, Jacob C
2018-01-01
The data on speckle-tracking echocardiography (STE) in patients with sepsis are limited. This systematic review covered studies published from 1975 to 2016 in adults and children evaluating cardiovascular dysfunction in sepsis, severe sepsis, and septic shock utilizing STE for systolic global longitudinal strain (GLS). The primary outcome was short- or long-term mortality. Given the significant methodological and statistical differences between published studies, combining the data using meta-analysis methods was not appropriate. A total of 120 studies were identified, with 5 studies (561 patients) included in the final analysis. All were prospective observational studies using the 2001 criteria for defining sepsis. Three studies demonstrated worse systolic GLS to be associated with higher mortality, whereas 2 did not show a statistically significant association. Various cutoffs between -10% and -17% were used to define abnormal GLS across studies. This systematic review revealed that STE may predict mortality in patients with sepsis; however, the strength of evidence is low due to heterogeneity in study populations, GLS technologies, cutoffs, and timing of STE. Further dedicated studies are needed to understand the optimal application of STE in patients with sepsis.
NASA Astrophysics Data System (ADS)
Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail
2014-12-01
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
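The combination step described in this record, collapsing univariate score statistics for several survival endpoints into one global statistic, can be sketched generically. Assuming a score vector U with estimated covariance V, the global statistic U'V⁻¹U is approximately chi-square with as many degrees of freedom as there are endpoints; the numbers below are illustrative, not the hospital data.

```python
import math
import numpy as np

def global_score_stat(U, V):
    """Global score statistic T = U' V^{-1} U for a vector of univariate
    score statistics U with covariance V; under H0, T is approximately
    chi-square with len(U) degrees of freedom."""
    U = np.asarray(U, float)
    return float(U @ np.linalg.solve(V, U))

def chi2_sf_4df(x):
    """Survival function of the chi-square distribution with 4 d.o.f.
    (closed form available for even degrees of freedom)."""
    return math.exp(-x / 2.0) * (1.0 + x / 2.0)

# Illustrative numbers (not the study's data): four correlated
# survival endpoints (three recurrence times and time to death).
U = np.array([1.8, 2.1, 1.2, 0.9])
V = np.array([[1.0, 0.6, 0.5, 0.4],
              [0.6, 1.0, 0.6, 0.5],
              [0.5, 0.6, 1.0, 0.6],
              [0.4, 0.5, 0.6, 1.0]])
T = global_score_stat(U, V)
p = chi2_sf_4df(T)
```

With an identity covariance the statistic reduces to a simple sum of squared scores, which is a useful sanity check.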
On nonstationarity and antipersistency in global temperature series
NASA Astrophysics Data System (ADS)
Kärner, O.
2002-10-01
Statistical analysis is carried out for satellite-based global daily tropospheric and stratospheric temperature anomaly and solar irradiance data sets. The behavior of the series appears to be nonstationary with stationary daily increments. Estimating the long-range dependence between the increments reveals a remarkable difference between the two temperature series. The global average tropospheric temperature anomaly behaves similarly to the solar irradiance anomaly: their daily increments show antipersistency for scales longer than 2 months. This property points to a cumulative negative feedback in the Earth climate system governing tropospheric variability during the last 22 years. The result emphasizes a dominant role of solar irradiance variability in variations of the tropospheric temperature and gives no support to the theory of anthropogenic climate change. The global average stratospheric temperature anomaly behaves like a one-dimensional random walk at least up to 11 years, allowing a good representation of the monthly series by means of autoregressive integrated moving average (ARIMA) models.
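Antipersistency of increments can be quantified through the Hurst exponent; one standard estimator among several is the aggregated-variance method, sketched below on synthetic series. H < 0.5 indicates antipersistent increments and H ≈ 0.5 uncorrelated ones; this is a generic illustration of the concept, not the author's analysis.

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance Hurst estimate for an increment series x:
    the variance of block means scales as m^(2H-2), so the slope of a
    log-log fit gives H. H < 0.5 -> antipersistent increments,
    H ~ 0.5 -> uncorrelated increments."""
    x = np.asarray(x, float)
    logs_m, logs_v = [], []
    for m in block_sizes:
        n = len(x) // m
        means = x[:n * m].reshape(n, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(means.var()))
    slope, _ = np.polyfit(logs_m, logs_v, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(2)
white = rng.normal(size=20000)          # uncorrelated increments, H ~ 0.5
anti = np.diff(rng.normal(size=20001))  # differenced noise, strongly antipersistent
H_white = hurst_aggvar(white)
H_anti = hurst_aggvar(anti)
```

For white noise the block-mean variance falls as 1/m (slope -1, H = 0.5); for the differenced series it falls as 1/m² (slope -2, H near 0), the signature of antipersistency.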
Annuar, Bin Rapaee; Liew, Chee Khoon; Chin, Sze Piaw; Ong, Tiong Kiam; Seyfarth, M Tobias; Chan, Wei Ling; Fong, Yean Yip; Ang, Choon Kiat; Lin, Naing; Liew, Houng Bang; Sim, Kui Hian
2008-01-01
To compare the assessment of global and regional left ventricular (LV) function using 64-slice multislice computed tomography (MSCT), 2D echocardiography (2DE) and cardiac magnetic resonance (CMR). Thirty-two consecutive patients (mean age, 56.5+/-9.7 years) referred for evaluation of the coronary arteries using 64-slice MSCT also underwent 2DE and CMR within 48 h. The global LV function parameters, which include left ventricular ejection fraction (LVEF), left ventricular end-diastolic volume (LVdV) and left ventricular end-systolic volume (LVsV), were determined using the three modalities. Regional wall motion (RWM) was assessed visually in all three modalities. CMR served as the gold standard for the comparisons of 64-slice MSCT with CMR and of 2DE with CMR. Statistical analysis included Pearson correlation coefficients, Bland-Altman plots and kappa statistics. The 64-slice MSCT agreed well with CMR for the assessment of LVEF (r=0.92; p<0.0001), LVdV (r=0.98; p<0.0001) and LVsV (r=0.98; p<0.0001). In comparison with 64-slice MSCT, 2DE showed moderate correlation with CMR for the assessment of LVEF (r=0.84; p<0.0001), LVdV (r=0.83; p<0.0001) and LVsV (r=0.80; p<0.0001). However, in the RWM analysis, 2DE showed better accuracy than 64-slice MSCT (94.3% versus 82.4%) and closer agreement with CMR (kappa=0.89 versus 0.63). 64-Slice MSCT correlates strongly with CMR for global LV function; however, for regional LV function, 2DE showed better agreement with CMR than 64-slice MSCT.
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means of batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on a modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), which closely integrates discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of the observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected; it is derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Global Warming Estimation from MSU
NASA Technical Reports Server (NTRS)
Prabhakara, C.; Iacovazzi, Robert, Jr.
1999-01-01
In this study, we have developed time series of global temperature for 1980-97 based on the Microwave Sounding Unit (MSU) Ch 2 (53.74 GHz) observations taken from polar-orbiting NOAA operational satellites. In order to create these time series, systematic errors (approx. 0.1 K) in the Ch 2 data arising from inter-satellite differences are removed objectively. On the other hand, smaller systematic errors (approx. 0.03 K) in the data due to orbital drift of each satellite cannot be removed objectively. Such errors are expected to remain in the time series and leave an uncertainty in the inferred global temperature trend. With the help of a statistical method, the error in the MSU-inferred global temperature trend resulting from orbital drifts and residual inter-satellite differences of all satellites is estimated to be 0.06 K/decade. Incorporating this error, our analysis shows that the global temperature increased at a rate of 0.13 +/- 0.06 K/decade during 1980-97.
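The trend construction here, a least-squares fit whose statistical uncertainty is combined with an independently estimated systematic error, can be sketched on synthetic data. The 0.06 K/decade systematic error is taken from the abstract; the noise level and fit uncertainty below are illustrative assumptions.

```python
import numpy as np

def decadal_trend(years, temps):
    """Least-squares linear trend, converted from K/yr to K/decade."""
    slope, _ = np.polyfit(years, temps, 1)
    return slope * 10.0

rng = np.random.default_rng(3)
years = np.arange(1980, 1998) + 0.5          # annual mid-year points
true_trend = 0.013                           # K/yr, i.e. 0.13 K/decade
temps = true_trend * (years - years[0]) + rng.normal(0, 0.05, years.size)

trend = decadal_trend(years, temps)
sys_err = 0.06    # K/decade: drift + inter-satellite residuals (from the abstract)
stat_err = 0.02   # K/decade: illustrative fit uncertainty
total_err = np.hypot(stat_err, sys_err)      # independent errors add in quadrature
```

The quadrature sum shows why the quoted uncertainty is dominated by the systematic term when the statistical fit error is small.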
Just the right age: well-clustered exposure ages from a global glacial 10Be compilation
NASA Astrophysics Data System (ADS)
Heyman, Jakob; Margold, Martin
2017-04-01
Cosmogenic exposure dating has been used extensively for defining glacial chronologies, both in ice sheet and alpine settings, and the global set of published ages today reaches well beyond 10,000 samples. Over the last few years, a number of important developments have improved the measurements (with well-defined AMS standards) and exposure age calculations (with updated data and methods for calculating production rates), in the best cases enabling high-precision dating of past glacial events. A remaining problem, however, is that a large portion of all dated samples have been affected by prior and/or incomplete exposure, yielding erroneous exposure ages under the standard assumptions. One way to address this issue is to use only exposure ages that can be confidently considered unaffected by prior/incomplete exposure, such as groups of samples with statistically identical ages. Here we use objective statistical criteria to identify groups of well-clustered exposure ages from the global glacial "expage" 10Be compilation. Out of ~1700 groups with at least 3 individual samples, ~30% are well-clustered, increasing to ~45% if allowing outlier rejection of at most 1/3 of the samples (still requiring a minimum of 3 well-clustered ages). The dataset of well-clustered ages is heavily dominated by ages <30 ka, showing that well-defined cosmogenic chronologies primarily exist for the last glaciation. We observe a large-scale global synchronicity in the timing of the last deglaciation from ~20 to 10 ka. There is also a general correlation between the timing of deglaciation and latitude (or the size of the individual ice mass), with earlier deglaciation at lower latitudes and later deglaciation towards the poles. Grouping the data into regions and comparing with available paleoclimate data, we can start to untangle regional differences in the last deglaciation and the climate events controlling the ice mass loss.
The extensive dataset and the statistical analysis enables an unprecedented global view on the last deglaciation.
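A minimal version of a "well-clustered" criterion, scatter about the error-weighted mean that is consistent with the stated measurement uncertainties, can be written as a reduced chi-square test. The threshold and sample values below are illustrative assumptions; the expage compilation's actual criteria may differ.

```python
import numpy as np

def is_well_clustered(ages, sigmas, p=4.0):
    """Reduced chi-square test for a group of exposure ages (ka) with
    1-sigma uncertainties: the group counts as 'well-clustered' if the
    scatter about the error-weighted mean is consistent with the
    measurement errors. The threshold p is an illustrative choice."""
    ages, sigmas = np.asarray(ages, float), np.asarray(sigmas, float)
    w = 1.0 / sigmas**2
    wmean = np.sum(w * ages) / np.sum(w)
    chi2_red = np.sum(w * (ages - wmean)**2) / (len(ages) - 1)
    return chi2_red < p, wmean, chi2_red

# Consistent group vs. a group with a prior-exposure outlier.
ok, wmean, c = is_well_clustered([15.2, 15.8, 15.5], [0.5, 0.6, 0.5])
bad, _, c_bad = is_well_clustered([15.2, 22.0, 15.5], [0.5, 0.6, 0.5])
```

The outlier-rejection step mentioned in the abstract would correspond to dropping the most discordant age and re-testing, subject to the minimum group size of 3.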
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal and technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples are shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
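The PCA perspective described here can be sketched with a mean-centered SVD on a synthetic samples-by-spots abundance matrix; the two treatment groups, replicate counts, and spot numbers below are hypothetical.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA via SVD of the mean-centered data matrix
    (rows = samples, columns = protein-spot abundances)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = S**2 / np.sum(S**2)       # fraction of variance per PC
    return scores, explained[:n_components]

rng = np.random.default_rng(4)
# Two hypothetical treatment groups of 5 biological replicates each,
# 200 spots; group 2 is shifted along a common abundance direction.
direction = rng.normal(size=200)
X = rng.normal(0, 1, (10, 200))
X[5:] += 2.0 * direction
scores, explained = pca_scores(X)
```

In a plot of the first two score columns, the two groups separate along PC1 while a fouled sample or outlier replicate would sit away from its group's cluster.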
The Global Signature of Ocean Wave Spectra
NASA Astrophysics Data System (ADS)
Portilla-Yandún, Jesús
2018-01-01
A global atlas of ocean wave spectra is developed and presented. The development is based on a new technique for deriving wave spectral statistics, which is applied to the extensive ERA-Interim database from the European Centre for Medium-Range Weather Forecasts. The spectral statistics are based on the idea of long-term wave systems, which are unique and distinct at every geographical point. The identification of those wave systems allows their separation from the overall spectrum using the partition technique. Their further characterization is made using standard integrated parameters, which turn out to be much more meaningful when applied to the individual components than to the total spectrum. The parameters developed include the density distribution of spectral partitions, which is the main descriptor; the identified wave systems; the individual distributions of the characteristic frequencies, directions, wave height, and wave age; the seasonal variability of wind and waves; return periods derived from extreme value analysis; and crossing-sea probabilities. This information is made available in web format for public use at http://www.modemat.epn.edu.ec/#/nereo. It is found that wave spectral statistics offers the possibility to synthesize data while providing a direct and comprehensive view of the local and regional wave conditions.
2013-01-01
Background: Metabolic alteration is one of the hallmarks of carcinogenesis. We aimed to identify certain metabolic biomarkers for the early detection of pancreatic cancer (PC) using the transgenic PTEN-null mouse model. Pancreas-specific deletion of PTEN in mouse caused progressive premalignant lesions such as highly proliferative ductal metaplasia. We imaged the mitochondrial redox state of the pancreases of the transgenic mice approximately eight months old using the redox scanner, i.e., the nicotinamide adenine dinucleotide/oxidized flavoproteins (NADH/Fp) fluorescence imager at low temperature. Two different approaches, the global averaging of the redox indices without considering tissue heterogeneity along tissue depth and the univariate analysis of multi-section data using tissue depth as a covariate, were adopted for the statistical analysis of the multi-section imaging data. The standard deviations of the redox indices and the histogram analysis with Gaussian fit were used to determine the tissue heterogeneity. Results: All methods show consistently that the PTEN deficient pancreases (Pdx1-Cre;PTENlox/lox) were significantly more heterogeneous in their mitochondrial redox state compared to the controls (PTENlox/lox). Statistical analysis taking into account the variations of the redox state with tissue depth further shows that PTEN deletion significantly shifted the pancreatic tissue to an overall more oxidized state. Oxidization of the PTEN-null group was not seen when the imaging data were analyzed by global averaging without considering the variation of the redox indices along tissue depth, indicating the importance of taking tissue heterogeneity into account for the statistical analysis of the multi-section imaging data. 
Conclusions: This study reveals a possible link between the mitochondrial redox state alteration of the pancreas and its malignant transformation and may be further developed for establishing potential metabolic biomarkers for the early diagnosis of pancreatic cancer. PMID:24252270
NASA Astrophysics Data System (ADS)
Rakesh, V.; Kantharao, B.
2017-03-01
Data assimilation is considered one of the most effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulation. Our results show that the experiment using regional BES outperformed the one using global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.
NASA Technical Reports Server (NTRS)
Lien, Guo-Yuan; Kalnay, Eugenia; Miyoshi, Takemasa; Huffman, George J.
2016-01-01
Assimilation of satellite precipitation data into numerical models presents several difficulties, with two of the most important being the non-Gaussian error distributions associated with precipitation, and large model and observation errors. As a result, improving the model forecast beyond a few hours by assimilating precipitation has been found to be difficult. To identify the challenges and propose practical solutions for assimilation of precipitation, statistics are calculated for global precipitation in a low-resolution NCEP Global Forecast System (GFS) model and the TRMM Multisatellite Precipitation Analysis (TMPA). The samples are constructed using the same model with the same forecast period, observation variables, and resolution as in the follow-on GFS/TMPA precipitation assimilation experiments presented in the companion paper. The statistical results indicate that the T62 and T126 GFS models generally have a positive bias in precipitation compared to the TMPA observations, and that the simulation of marine stratocumulus precipitation is not realistic in the T62 GFS model. It is necessary to apply to precipitation either the commonly used logarithm transformation or the newly proposed Gaussian transformation to obtain a better relationship between the model and observational precipitation. When the Gaussian transformations are separately applied to the model and observational precipitation, they serve as a bias correction that corrects amplitude-dependent biases. In addition, using a spatially and/or temporally averaged precipitation variable, such as the 6-h accumulated precipitation, should be advantageous for precipitation assimilation.
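The Gaussian transformation mentioned here can be illustrated by empirical Gaussian anamorphosis: map each amount to the standard normal quantile of its plotting position. This sketch handles only positive, tie-free amounts on synthetic data; the actual transformation in these papers additionally treats the many exact zeros in precipitation fields.

```python
import numpy as np
from statistics import NormalDist

def gaussian_transform(values):
    """Empirical Gaussian transformation (Gaussian anamorphosis):
    map each value to the standard normal quantile of its plotting
    position (rank - 0.5) / n. A minimal sketch of the idea; zeros
    and ties in real precipitation need special handling."""
    values = np.asarray(values, float)
    ranks = values.argsort().argsort() + 1      # ranks 1..n
    probs = (ranks - 0.5) / len(values)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in probs])

rng = np.random.default_rng(5)
rain = rng.gamma(0.3, 5.0, size=2000)  # skewed, precipitation-like amounts
z = gaussian_transform(rain)
```

The transformed variable is approximately standard normal while preserving the ordering of the raw amounts, which is exactly the property that makes Gaussian-based assimilation machinery applicable.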
Ménard, Richard; Deshaies-Jacques, Martin; Gasset, Nicolas
2016-09-01
An objective analysis is one of the main components of data assimilation. By combining observations with the output of a predictive model, we combine the best features of each source of information: the complete spatial and temporal coverage provided by models, with a close representation of the truth provided by observations. The process of combining observations with a model output is called an analysis. Producing an analysis requires knowledge of the observation and model errors, as well as their spatial correlations. This paper is devoted to the development of methods for estimating these error variances and the characteristic length-scale of the model error correlation for operational use in the Canadian objective analysis system. We first argue in favor of using compact-support correlation functions, and then introduce three estimation methods: the Hollingsworth-Lönnberg (HL) method in local and global form, the maximum likelihood method (ML), and the [Formula: see text] diagnostic method. We perform one-dimensional (1D) simulation studies where the error variance and true correlation length are known, and estimate both error variances and the correlation length where both are non-uniform. We show that a local version of the HL method can accurately capture the error variances and correlation length at each observation site, provided that the spatial variability is not too strong. However, the operational objective analysis requires only a single, globally valid correlation length. We examine whether any statistic of the local HL correlation lengths could be a useful estimate, or whether other global estimation methods, such as the global HL, ML, or [Formula: see text] methods, should be used. We found in both the 1D simulations and using real data that the ML method is able to capture physically significant aspects of the correlation length, while most other estimates give unphysical and larger length-scale values. 
This paper describes a proposed improvement of the objective analysis of surface pollutants at Environment and Climate Change Canada (formerly known as Environment Canada). Objective analyses are essentially surface maps of air pollutants that are obtained by combining observations with an air quality model output, and are thought to provide a complete and more accurate representation of the air quality. The highlight of this study is an analysis of methods to estimate the model (or background) error correlation length-scale. The error statistics are an important and critical component to the analysis scheme.
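The Hollingsworth-Lönnberg idea can be sketched in its simplest form: innovation (observation-minus-background) covariances binned by pair separation are fit at nonzero distances and extrapolated to zero; the gap between the raw innovation variance and the fitted intercept estimates the observation-error variance. The exponential correlation model and all numbers below are illustrative, not the operational Canadian configuration.

```python
import numpy as np

def hl_fit(dists, covs, var0):
    """Fit c(d) = sigma_b^2 * exp(-d / L) to binned innovation
    covariances at d > 0 via a log-linear least-squares fit.
    The observation-error variance is the total innovation
    variance minus the d -> 0 intercept (background variance)."""
    dists, covs = np.asarray(dists, float), np.asarray(covs, float)
    slope, intercept = np.polyfit(dists, np.log(covs), 1)
    sigma_b2 = np.exp(intercept)   # background-error variance
    L = -1.0 / slope               # correlation length-scale
    sigma_o2 = var0 - sigma_b2     # observation-error variance
    return sigma_b2, sigma_o2, L

# Synthetic truth: background error variance 2.0, observation error
# variance 1.0, correlation length 150 km.
d = np.array([50.0, 100.0, 200.0, 300.0, 400.0])   # bin centers, km
c = 2.0 * np.exp(-d / 150.0)                       # binned covariances
sb2, so2, L = hl_fit(d, c, var0=3.0)
```

Because observation errors are assumed spatially uncorrelated, they contribute only at zero separation, which is what allows the intercept extrapolation to split the innovation variance into its two parts.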
Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines
NASA Astrophysics Data System (ADS)
Laidre, K. L.; Regehr, E. V.; Akcakaya, H. R.; Amstrup, S. C.; Atwood, T.; Lunn, N.; Obbard, M.; Stern, H. L., III; Thiemann, G.; Wiig, O.
2016-12-01
Loss of Arctic sea ice due to climate change is the most serious threat to polar bears (Ursus maritimus) throughout their circumpolar range. We performed a data-based sensitivity analysis with respect to this threat by evaluating the potential response of the global polar bear population to projected sea-ice conditions. We conducted 1) an assessment of generation length for polar bears; 2) development of a standardized sea-ice metric representing important habitat characteristics for the species; and 3) population projections over three generations, using computer simulation and statistical models representing alternative relationships between sea ice and polar bear abundance. Using three separate approaches, the median percent change in mean global population size for polar bears between 2015 and 2050 ranged from -4% (95% CI = -62%, 50%) to -43% (95% CI = -76%, -20%). The results highlight the potential for large reductions in the global population if sea-ice loss continues. They also highlight the large amount of uncertainty in statistical projections of polar bear abundance and the sensitivity of projections to plausible alternative assumptions. The median probability of a reduction in the mean global population size of polar bears greater than 30% over three generations was approximately 0.71 (range 0.20-0.95). The median probability of a reduction greater than 50% was approximately 0.07 (range 0-0.35), and the probability of a reduction greater than 80% was negligible.
The past and future of food stocks
NASA Astrophysics Data System (ADS)
Laio, Francesco; Ridolfi, Luca; D'Odorico, Paolo
2016-03-01
Human societies rely on food reserves and the importation of agricultural goods as means to cope with crop failures and associated food shortages. While food trade has been the subject of intensive investigations in recent years, food reserves remain poorly quantified. It is unclear how food stocks are changing and whether they are declining. In this study we use food stock records for 92 products to reconstruct 50 years of aggregated food reserves, expressed in caloric equivalent (kcal), at the regional and global scales. A detailed statistical analysis demonstrates that the overall regional and global per-capita food stocks are stationary, challenging a widespread impression that food reserves are shrinking. We develop a statistically sound stochastic representation of stock dynamics and take the stock-halving probability as a measure of the natural variability of the process. We find that there is a 20% probability that the global per-capita stocks will be halved by 2050. There are, however, some strong regional differences: Western Europe and the region encompassing North Africa and the Middle East have smaller halving probabilities and smaller per-capita stocks, while North America and Oceania have greater halving probabilities and greater per-capita stocks than the global average. Africa exhibits low per-capita stocks and a relatively high probability of stock halving by 2050, which reflects a state of higher food insecurity on this continent.
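The halving-probability idea can be sketched with a toy mean-reverting (stationary) stock process; the mean-reversion rate and volatility below are purely illustrative, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal sketch: per-capita stocks as a stationary, mean-reverting process,
# with the halving probability estimated by Monte Carlo first passage.
years = 2050 - 2015
n_paths = 20_000
theta, sigma = 0.3, 0.12        # assumed mean-reversion rate and volatility

s = np.ones(n_paths)            # stocks normalized to today's level
halved = np.zeros(n_paths, dtype=bool)
for _ in range(years):
    s += theta * (1.0 - s) + sigma * rng.standard_normal(n_paths)
    halved |= s < 0.5           # record whether each path has ever halved

p_halving = halved.mean()
print(f"estimated probability of halving by 2050: {p_halving:.2f}")
```

For a stationary process the halving probability measures natural variability rather than a trend, which is exactly the interpretation the abstract gives it.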
Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?
NASA Astrophysics Data System (ADS)
Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.
2018-02-01
The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_{λ } ≥ 7.0 and M_{λ } ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_{σ } ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_{λ } ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_{λ } ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_{λ } ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. 
We then generated 1.5 × 10^4 synthetic catalogs with 197 M_{λ } ≥ 7.0 earthquakes in each catalog and found the statistical range of β values. The observed value of β = 0.83 for the CMT catalog corresponds to a p value of 0.004, leading us to conclude that the interevent natural times in the CMT catalog are not random. For the time series analysis, we calculated the autocorrelation function for the sequence of natural time intervals between large global earthquakes and again compared with data from 1.5 × 10^4 synthetic catalogs of random data. In this case, the spread of autocorrelation values was much larger, so we concluded that this approach is insensitive to deviations from random behavior.
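The Weibull-shape diagnostic used above is easy to reproduce: for a random (Poissonian) catalog the natural-time interevent counts are exponentially distributed, so a fitted Weibull shape β should be near 1. A minimal sketch with synthetic counts (the scale of 120 small events between large ones is invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# For a random catalog, natural-time interevent counts are exponential,
# so the maximum-likelihood Weibull shape (beta) should come out near 1.
counts = rng.exponential(scale=120.0, size=10_000)   # synthetic interevent counts

beta, loc, scale = stats.weibull_min.fit(counts, floc=0)  # fix location at 0
print(f"fitted Weibull shape beta = {beta:.2f}")
```

A β clearly below 1, as found for the real CMT catalog, indicates an excess of short interevent counts, i.e. temporal clustering in natural time.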
Statistical functions and relevant correlation coefficients of clearness index
NASA Astrophysics Data System (ADS)
Pavanello, Diego; Zaaiman, Willem; Colli, Alessandra; Heiser, John; Smith, Scott
2015-08-01
This article presents a statistical analysis of the sky conditions, for the years 2010 to 2012, at three different locations: the Joint Research Centre site in Ispra (Italy, European Solar Test Installation - ESTI laboratories), the site of the National Renewable Energy Laboratory in Golden (Colorado, USA) and the site of Brookhaven National Laboratory in Upton (New York, USA). The key parameter is the clearness index kT, a dimensionless expression of the global irradiance impinging upon a horizontal surface at a given instant of time. In the first part, the sky conditions are characterized using daily averages, giving a general overview of the three sites. In the second part, the analysis is performed using data sets with a short-term resolution of 1 sample per minute, demonstrating remarkable properties of the statistical distributions of the clearness index, reinforced by a proof using fuzzy logic methods. Subsequently, some time-dependent correlations between different meteorological variables are presented in terms of Pearson and Spearman correlation coefficients, and a new coefficient is introduced.
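The clearness index itself is the ratio of measured GHI to the extraterrestrial irradiance on the same horizontal surface. A minimal sketch, assuming a solar constant of 1367 W/m² and a simple cosine eccentricity correction (the paper's exact conventions may differ):

```python
import math

def clearness_index(ghi, zenith_deg, day_of_year, solar_constant=1367.0):
    """kT = measured GHI / extraterrestrial horizontal irradiance."""
    # Eccentricity correction for the Earth-Sun distance (simple cosine form)
    e0 = 1.0 + 0.033 * math.cos(2.0 * math.pi * day_of_year / 365.0)
    cos_z = math.cos(math.radians(zenith_deg))
    if cos_z <= 0:
        return float("nan")             # sun below the horizon
    g_extra = solar_constant * e0 * cos_z
    return ghi / g_extra

# Example: mid-latitude summer noon, moderately clear sky (illustrative values)
kt = clearness_index(ghi=750.0, zenith_deg=30.0, day_of_year=172)
print(f"kT = {kt:.2f}")
```

Values near 0 indicate overcast skies and values around 0.7-0.8 indicate clear skies, which is what makes kT a convenient single parameter for classifying sky conditions.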
Structural texture similarity metrics for image analysis and retrieval.
Zujovic, Jana; Pappas, Thrasyvoulos N; Neuhoff, David L
2013-07-01
We develop new metrics for texture similarity that account for human visual perception and the stochastic nature of textures. The metrics rely entirely on local image statistics and allow substantial point-by-point deviations between textures that according to human judgment are essentially identical. The proposed metrics extend the ideas of structural similarity and are guided by research in texture analysis-synthesis. They are implemented using a steerable filter decomposition and incorporate a concise set of subband statistics, computed globally or in sliding windows. We conduct systematic tests to investigate metric performance in the context of "known-item search," the retrieval of textures that are "identical" to the query texture. This eliminates the need for cumbersome subjective tests, thus enabling comparisons with human performance on a large database. Our experimental results indicate that the proposed metrics outperform peak signal-to-noise ratio (PSNR), the structural similarity metric (SSIM) and its variations, as well as state-of-the-art texture classification metrics, using standard statistical measures.
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to the prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
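The core idea, tolerances that scale with the local variability of neighboring ("buddy") observations, can be sketched as follows; the tolerance factor and variance floor are illustrative, not the values of the operational scheme:

```python
import numpy as np

def buddy_check(obs, neighbors, base_tol=3.0):
    """Flag an observation as suspect if it departs from its buddies by more
    than a tolerance that adapts to the local variability of surrounding data."""
    buddies = np.asarray(neighbors, dtype=float)
    center = buddies.mean()
    spread = buddies.std(ddof=1)
    tol = base_tol * max(spread, 0.1)   # floor avoids over-rejecting in calm fields
    return abs(obs - center) > tol

calm = [10.1, 10.0, 9.9, 10.2, 10.0]    # quiet field: small local variability
stormy = [10.0, 14.0, 7.0, 12.5, 8.5]   # active field: large local variability

print(buddy_check(15.0, calm))          # same value is an outlier here
print(buddy_check(15.0, stormy))        # ...but plausible amid high variability
```

This is why an adaptive check can retain extreme but legitimate observations during a storm: the local spread widens the tolerance exactly where the atmosphere is most variable.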
Kang, Xiao-guang; Ma, Qing-Bin
2005-01-01
Within the global urban system, the statistical relationship between the urban eco-environment (UE) and urban competitiveness (UC) is examined. The data show a statistically inverted-U relationship between UE and UC. Introducing an eco-environmental factor into the classification of industries yields six industrial types based on two indexes, namely industries' eco-environmental demand and pressure. The statistical results show that, under this new industrial classification, there is a strong relationship between changes in industrial structure and the evolution of UE. The driving mechanism of the evolution of the urban eco-environment, involving human demand and the global division of labor, is analyzed. The conclusion is that the development strategies, industrial policies, and environmental policies of cities must fit their ranks within the global urban system. In the era of globalization, the rationality of environmental policies cannot be assessed by their strictness alone; such policies enhance a city's competitiveness when they fit the city's capability to attract and control particular sections of an industry's value chain. Only environmental policies of this kind are likely to enhance UC.
Heat balance statistics derived from four-dimensional assimilations with a global circulation model
NASA Technical Reports Server (NTRS)
Schubert, S. D.; Herman, G. F.
1981-01-01
The reported investigation was conducted to develop a reliable procedure for obtaining the diabatic and vertical terms required for atmospheric heat balance studies. The method developed employs a four-dimensional assimilation mode in connection with the general circulation model of NASA's Goddard Laboratory for Atmospheric Sciences. The initial analysis was conducted with data obtained in connection with the 1976 Data Systems Test. On the basis of the results of the investigation, it appears possible to use the model's observationally constrained diagnostics to provide estimates of the global distribution of virtually all of the quantities which are needed to compute the atmosphere's heat and energy balance.
Wu, Wei; Sun, Le; Zhang, Zhe; Guo, Yingying; Liu, Shuying
2015-03-25
An ultra-high-performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry (UHPLC-Q-TOF-MS) method was developed for the detection and structural analysis of ginsenosides in white ginseng and related processed products (red ginseng). Original neutral, malonyl, and chemically transformed ginsenosides were identified in white and red ginseng samples. The aglycone types of ginsenosides were determined by MS/MS as PPD (m/z 459), PPT (m/z 475), C-24, -25 hydrated-PPD or PPT (m/z 477 or m/z 493), and Δ20(21)-or Δ20(22)-dehydrated-PPD or PPT (m/z 441 or m/z 457). Following the structural determination, UHPLC-Q-TOF-MS-based chemical profiling coupled with multivariate statistical analysis was applied for global analysis of white and processed ginseng samples. The chemical markers present between the processed products red ginseng and white ginseng could be assigned. Process-mediated chemical changes were recognized as the hydrolysis of ginsenosides with large molecular weight, chemical transformations of ginsenosides, changes in malonyl-ginsenosides, and generation of 20-(R)-ginsenoside enantiomers. The relative contents of compounds classified as PPD, PPT, malonyl, and transformed ginsenosides were calculated based on peak areas in ginseng before and after processing. This study provides the possibility of monitoring multiple components for the quality control and global evaluation of ginseng products during processing. Copyright © 2014 Elsevier B.V. All rights reserved.
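The multivariate step can be illustrated with a minimal PCA on synthetic log peak areas, where processing-induced shifts in a few marker compounds separate white from red samples along the first component (the peak matrix and the "five shifted markers" are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# Sketch of the multivariate step: PCA on log peak areas should separate
# white from red (processed) ginseng if processing shifts marker compounds.
# Rows = samples, columns = ginsenoside peak areas (all synthetic).
n_peaks = 30
white = rng.lognormal(0.0, 0.2, size=(10, n_peaks))
red = rng.lognormal(0.0, 0.2, size=(10, n_peaks))
red[:, :5] *= 3.0        # pretend processing raises five transformed ginsenosides

X = np.log(np.vstack([white, red]))
Xc = X - X.mean(axis=0)                      # mean-center before PCA
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ vt[0]                             # scores on the first component

print(pc1[:10].mean(), pc1[10:].mean())      # groups fall on opposite sides of 0
```

Peaks with large loadings on that separating component are the candidate chemical markers distinguishing the processed product.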
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011.
New models, ranging from global asset-exposure to global flood-hazard models, were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow-onset hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
Recurrent jellyfish blooms are a consequence of global oscillations
Condon, Robert H.; Duarte, Carlos M.; Pitt, Kylie A.; Robinson, Kelly L.; Lucas, Cathy H.; Sutherland, Kelly R.; Mianzan, Hermes W.; Bogeberg, Molly; Purcell, Jennifer E.; Decker, Mary Beth; Uye, Shin-ichi; Madin, Laurence P.; Brodeur, Richard D.; Haddock, Steven H. D.; Malej, Alenka; Parry, Gregory D.; Eriksen, Elena; Quiñones, Javier; Acha, Marcelo; Harvey, Michel; Arthur, James M.; Graham, William M.
2013-01-01
A perceived recent increase in global jellyfish abundance has been portrayed as a symptom of degraded oceans. This perception is based primarily on a few case studies and anecdotal evidence, but a formal analysis of global temporal trends in jellyfish populations has been missing. Here, we analyze all available long-term datasets on changes in jellyfish abundance across multiple coastal stations, using linear and logistic mixed models and effect-size analysis to show that there is no robust evidence for a global increase in jellyfish. Although there has been a small linear increase in jellyfish since the 1970s, this trend was unsubstantiated by effect-size analysis that showed no difference in the proportion of increasing vs. decreasing jellyfish populations over all time periods examined. Rather, the strongest nonrandom trend indicated jellyfish populations undergo larger, worldwide oscillations with an approximate 20-y periodicity, including a rising phase during the 1990s that contributed to the perception of a global increase in jellyfish abundance. Sustained monitoring is required over the next decade to elucidate with statistical confidence whether the weak increasing linear trend in jellyfish after 1970 is an actual shift in the baseline or part of an oscillation. Irrespective of the nature of increase, given the potential damage posed by jellyfish blooms to fisheries, tourism, and other human industries, our findings foretell recurrent phases of rise and fall in jellyfish populations that society should be prepared to face. PMID:23277544
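Detecting such an oscillation is essentially a spectral-analysis problem. A minimal sketch on a synthetic annual index with a built-in 20-year cycle (illustrative only; the trend slope, noise level, and time span are invented, and the study's own methods were mixed models and effect-size analysis):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual "jellyfish index": weak linear trend plus a 20-year
# oscillation and noise (80 years so the 20-y cycle sits on an exact FFT bin).
years = np.arange(1932, 2012)
t = years - years[0]
series = 0.005 * t + np.sin(2 * np.pi * t / 20.0) + 0.3 * rng.standard_normal(t.size)

# Periodogram of the detrended series: find the dominant period
detrended = series - np.polyval(np.polyfit(t, series, 1), t)
freqs = np.fft.rfftfreq(t.size, d=1.0)        # cycles per year
power = np.abs(np.fft.rfft(detrended)) ** 2
dominant_period = 1.0 / freqs[1:][np.argmax(power[1:])]  # skip the zero frequency
print(f"dominant period ~ {dominant_period:.1f} years")
```

This also illustrates the abstract's caution: with only a few decades of data, a rising phase of a multidecadal oscillation is easily mistaken for a monotonic trend unless the record spans several full cycles.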
ERIC Educational Resources Information Center
Rossion, Bruno; Hanseeuw, Bernard; Dricot, Laurence
2012-01-01
A number of human brain areas showing a larger response to faces than to objects from different categories, or to scrambled faces, have been identified in neuroimaging studies. Depending on the statistical criteria used, the set of areas can be overextended or minimized, both at the local (size of areas) and global (number of areas) levels. Here…
NASA Astrophysics Data System (ADS)
Han, H. J.; Kang, J. H.
2016-12-01
Since July 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi-real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing), a part of the KIAPS data assimilation system, has been performing well in this semi-real-time forecast system. In this study, because KPOP can now treat scatterometer wind data, we analyze the effect of scatterometer winds (ASCAT-A/B) on the KIAPS semi-real-time forecast system. The O-B (observation-minus-background) global distribution and statistics of the scatterometer winds show two things: the differences between the background field and the observations are not excessive, and KPOP processes the scatterometer wind data well. The resulting changes in the analysis increment appear most clearly in the lower atmosphere. The scatterometer wind data also cover wide ocean areas where observations would otherwise be scarce. The contribution of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields, and through the vertical statistics of O-A (observation-minus-analysis). From these results, we conclude that scatterometer wind data have a positive effect on the lower-level performance of the KIAPS semi-real-time forecast system. Long-term results on the effect of scatterometer wind data will be analyzed in future work.
Multi-criteria evaluation of CMIP5 GCMs for climate change impact analysis
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Rana, Arun; Moradkhani, Hamid; Sharma, Ashish
2017-04-01
Climate change is expected to have severe impacts on the global hydrological cycle along with the food-water-energy nexus. Many climate models are currently used to predict important climatic variables. Though there have been advances in the field, many problems remain to be resolved related to reliability, uncertainty, and computing needs, among others. In the present work, we have analyzed the performance of 20 different global climate models (GCMs) from the Climate Model Intercomparison Project Phase 5 (CMIP5) dataset over the Columbia River Basin (CRB) in the Pacific Northwest USA. We demonstrate a statistical multicriteria approach, using univariate and multivariate techniques, for selecting suitable GCMs to be used for climate change impact analysis in the region. Univariate methods include the mean, standard deviation, coefficient of variation, relative change (variability), Mann-Kendall test, and Kolmogorov-Smirnov test (KS-test); the multivariate methods used were principal component analysis (PCA), singular value decomposition (SVD), canonical correlation analysis (CCA), and cluster analysis. The analysis is performed on raw GCM data, i.e., before bias correction, for the precipitation and temperature variables of all 20 models, to capture the reliability and nature of each model at the regional scale. The analysis is based on spatially averaged datasets of GCMs and observations for the period 1970 to 2000. Each GCM is ranked based on performance evaluated against gridded observational data on various temporal scales (daily, monthly, and seasonal). Results provide insight into each of the methods and the various statistical properties they address when ranking GCMs. Further, the raw GCM simulations were also evaluated against different sets of gridded observational data in the area.
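One of the univariate criteria, the KS test, can be used to rank models directly by distributional mismatch against observations. A minimal sketch with synthetic precipitation distributions (the model names and gamma parameters are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Rank "GCMs" by how well their raw precipitation distributions match
# observations, using the two-sample KS statistic (smaller = closer match).
obs = rng.gamma(shape=2.0, scale=3.0, size=2000)   # pseudo-observations
gcms = {
    "model_a": rng.gamma(2.0, 3.0, 2000),          # well matched
    "model_b": rng.gamma(2.0, 4.0, 2000),          # wet bias (scale too large)
    "model_c": rng.gamma(1.2, 3.0, 2000),          # wrong distribution shape
}

scores = {name: stats.ks_2samp(sim, obs).statistic for name, sim in gcms.items()}
ranking = sorted(scores, key=scores.get)           # best (smallest KS) first
print(ranking)
```

In the multicriteria approach of the abstract, such per-criterion rankings would then be combined with the other univariate and multivariate scores rather than used alone.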
Development of an Independent Global Land Cover Validation Dataset
NASA Astrophysics Data System (ADS)
Sulla-Menashe, D. J.; Olofsson, P.; Woodcock, C. E.; Holden, C.; Metcalfe, M.; Friedl, M. A.; Stehman, S. V.; Herold, M.; Giri, C.
2012-12-01
Accurate information related to the distribution and dynamics of global land cover is critical for a large number of global change science questions. A growing number of land cover products have been produced at regional to global scales, but the uncertainty in these products and the relative strengths and weaknesses among available products are poorly characterized. To address this limitation we are compiling a database of high spatial resolution imagery to support international land cover validation studies. Validation sites were selected based on a probability sample, and may therefore be used to estimate statistically defensible accuracy statistics and associated standard errors. Validation site locations were identified using a stratified random design based on 21 strata derived from an intersection of Köppen climate classes and a population density layer. In this way, the two major sources of global variation in land cover (climate and human activity) are explicitly included in the stratification scheme. At each site we are acquiring high spatial resolution (< 1-m) satellite imagery for 5-km x 5-km blocks. The response design uses an object-oriented hierarchical legend that is compatible with the UN FAO Land Cover Classification System. Using this response design, we are classifying each site using a semi-automated algorithm that blends image segmentation with a supervised RandomForest classification algorithm. In the long run, the validation site database is designed to support international efforts to validate land cover products. To illustrate, we use the site database to validate the MODIS Collection 4 Land Cover product, providing a prototype for validating the VIIRS Surface Type Intermediate Product scheduled to start operational production early in 2013.
As part of our analysis we evaluate sources of error in coarse resolution products including semantic issues related to the class definitions, mixed pixels, and poor spectral separation between classes.
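Because the sites form a stratified probability sample, design-based accuracy estimates follow from standard stratified estimators. A minimal sketch (the stratum area shares, sample counts, and correct counts below are illustrative; the real design uses 21 climate/population strata):

```python
import numpy as np

# Design-based overall accuracy from a stratified random sample:
# weight each stratum's sample accuracy by its share of total map area.
area_share = np.array([0.5, 0.3, 0.2])   # W_h: stratum area proportions
n = np.array([100, 100, 100])            # reference samples drawn per stratum
correct = np.array([90, 80, 60])         # correctly classified samples

p_h = correct / n                        # per-stratum accuracy
accuracy = np.sum(area_share * p_h)      # stratified overall accuracy

# Standard error of the stratified estimator (simple random sampling in strata)
var = np.sum(area_share**2 * p_h * (1 - p_h) / (n - 1))
print(f"overall accuracy = {accuracy:.3f} +/- {np.sqrt(var):.3f}")
```

This is what makes the estimates "statistically defensible": both the accuracy and its standard error follow directly from the sampling design rather than from an ad hoc collection of test sites.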
Cloud encounter statistics in the 28.5-43.5 KFT altitude region from four years of GASP observations
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.
1983-01-01
The results of an analysis of cloud encounter measurements taken at aircraft flight altitudes as part of the Global Atmospheric Sampling Program are summarized. The results can be used in estimating the probability of cloud encounter and in assessing the economic feasibility of laminar flow control aircraft along particular routes. The data presented clearly show the tropical circulation and its seasonal migration; characteristics of the mid-latitude regime, such as the large-scale traveling cyclones in the winter and increased convective activity in the summer, can be isolated in the data. The cloud encounter statistics are shown to be consistent with the mid-latitude cyclone model. A model for TIC (time-in-clouds), a cloud encounter statistic, is presented for several common airline routes.
NASA Astrophysics Data System (ADS)
Sheehan, J. J.
2016-12-01
We report here a first-of-its-kind analysis of the potential for intensification of global grazing systems. Intensification is calculated using the statistical yield gap methodology developed previously by others (Mueller et al 2012 and Licker et al 2010) for global crop systems. Yield gaps are estimated by binning global pasture land area into 100 equal area sized bins of similar climate (defined by ranges of rainfall and growing degree days). Within each bin, grid cells of pastureland are ranked from lowest to highest productivity. The global intensification potential is defined as the sum of global production across all bins at a given percentile ranking (e.g. performance at the 90th percentile) divided by the total current global production. The previous yield gap studies focused on crop systems because productivity data on these systems is readily available. Nevertheless, global crop land represents only one-third of total global agricultural land, while pasture systems account for the remaining two-thirds. Thus, it is critical to conduct the same kind of analysis on what is the largest human use of land on the planet—pasture systems. In 2013, Herrero et al announced the completion of a geospatial data set that augmented the animal census data with data and modeling about production systems and overall food productivity (Herrero et al, PNAS 2013). With this data set, it is now possible to apply yield gap analysis to global pasture systems. We used the Herrero et al data set to evaluate yield gaps for meat and milk production from pasture based systems for cattle, sheep and goats. The figure included with this abstract shows the intensification potential for kcal per hectare per year of meat and milk from global cattle, sheep and goats as a function of increasing levels of performance. 
Performance is measured as the productivity achieved at a given ranked percentile within each bin. We find that if all pasture land were raised to its 90th percentile of performance, global output of meat and milk could increase 2.8-fold. This is much higher than the potential reported previously for major grain crops such as corn and wheat. Our results suggest that efforts to address the poor performance of pasture systems around the world could substantially improve the outlook for meeting future food demand.
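The yield-gap calculation itself reduces to binning, a within-bin percentile, and a production ratio. A minimal sketch with synthetic pasture cells (10 climate bins instead of 100, invented productivity values):

```python
import numpy as np

rng = np.random.default_rng(11)

# Sketch of the yield-gap calculation: bin pasture grid cells by climate,
# then ask how much production would rise if every cell in a bin produced
# at that bin's 90th-percentile productivity (all values synthetic).
n_cells = 5000
climate_bin = rng.integers(0, 10, n_cells)     # 10 equal-area climate bins
area = np.ones(n_cells)                        # equal-area grid cells
productivity = rng.lognormal(mean=0.0, sigma=0.8, size=n_cells)  # kcal/ha/yr

current = np.sum(area * productivity)
potential = 0.0
for b in range(10):
    in_bin = climate_bin == b
    ceiling = np.percentile(productivity[in_bin], 90)  # attainable level here
    potential += np.sum(area[in_bin]) * ceiling

ratio = potential / current
print(f"intensification potential: {ratio:.1f}x current production")
```

Binning by climate first is the key step: it compares each cell only against peers with similar rainfall and growing degree days, so the gap reflects management rather than climate.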
Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos
2017-01-01
Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data, and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which may facilitate an application of physics principles in the study of the human brain and cognition.
NASA Astrophysics Data System (ADS)
Rambla, Xavier
2006-05-01
The present study analyzes educational targeting in Argentina, Brazil and Chile from a sociological point of view. It shows that a `logic of induction' has become the vehicle for anti-poverty education strategies meant to help targeted groups improve on their own. The analysis explores the influence of the global educational agenda, the empirical connection between the logic of induction and the mechanism of emulation, and the territorial aspects of educational inequalities. Emulation plays a main role inasmuch as the logic of induction leads targeted groups to compare their adverse situation with more privileged groups, which actually legitimizes inequalities. A brief statistical summary completes the study, showing that educational inequality has remained unchanged as far as urban-rural ratios (in Brazil and Chile) and regional disparities (in all three countries) are concerned.
Quality and Consistency of the NASA Ocean Color Data Record
NASA Technical Reports Server (NTRS)
Franz, Bryan A.
2012-01-01
The NASA Ocean Biology Processing Group (OBPG) recently reprocessed the multimission ocean color time-series from SeaWiFS, MODIS-Aqua, and MODIS-Terra using common algorithms and improved instrument calibration knowledge. Here we present an analysis of the quality and consistency of the resulting ocean color retrievals, including spectral water-leaving reflectance, chlorophyll a concentration, and diffuse attenuation. Statistical analysis of satellite retrievals relative to in situ measurements will be presented for each sensor, as well as an assessment of consistency in the global time-series for the overlapping periods of the missions. Results will show that the satellite retrievals are in good agreement with in situ measurements, and that the sensor ocean color data records are highly consistent over the common mission lifespan for the global deep oceans, but with degraded agreement in higher productivity, higher complexity coastal regions.
Charidimou, Andreas; Farid, Karim; Tsai, Hsin-Hsi; Tsai, Li-Kai; Yen, Rouh-Fang; Baron, Jean-Claude
2018-04-01
We performed a meta-analysis to synthesise current evidence on amyloid-positron emission tomography (PET) burden and presumed preferential occipital distribution in sporadic cerebral amyloid angiopathy (CAA). In a PubMed systematic search, we identified case-control studies with extractable data on global and occipital-to-global amyloid-PET uptake in symptomatic patients with CAA (per Boston criteria) versus control groups (healthy participants or patients with non-CAA deep intracerebral haemorrhage) and patients with Alzheimer's disease. To circumvent PET studies' methodological variation, we generated and used 'fold change', that is, ratio of mean amyloid uptake (global and occipital-to-global) of CAA relative to comparison groups. Amyloid-PET uptake biomarker performance was then quantified by random-effects meta-analysis on the ratios of the means. A ratio >1 indicates that amyloid-PET uptake (global or occipital/global) is higher in CAA than comparison groups, and a ratio <1 indicates the reverse. Seven studies, including 106 patients with CAA (>90% with probable CAA) and 138 controls (96 healthy elderly, 42 deep intracerebral haemorrhage controls) and 72 patients with Alzheimer's disease, were included. Global amyloid-PET ratio between patients with CAA and controls was above 1, with an average effect size of 1.18 (95% CI 1.08 to 1.28; p<0.0001). Occipital-to-global amyloid-PET uptake ratio did not differ between patients with CAA versus patients with deep intracerebral haemorrhage or healthy controls. By contrast, occipital-to-global amyloid-PET uptake ratio was above 1 in patients with CAA versus those with Alzheimer's disease, with an average ratio of 1.10 (95% CI 1.03 to 1.19; p=0.009) and high statistical heterogeneity. Our analysis provides exploratory actionable data on the overall effect sizes and strength of amyloid-PET burden and distribution in patients with CAA, useful for future larger studies. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
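The ratio-of-means pooling can be sketched with a standard DerSimonian-Laird random-effects model on per-study log ratios; the study-level effects and variances below are invented for illustration and are not the meta-analysis's extracted data:

```python
import numpy as np

# Sketch of random-effects pooling (DerSimonian-Laird) of per-study
# log ratios of mean amyloid-PET uptake (yi) with variances (vi).
yi = np.array([0.05, 0.30, 0.10, 0.28, 0.08, 0.22, 0.16])   # log(ratio) per study
vi = np.array([0.004, 0.006, 0.005, 0.010, 0.003, 0.008, 0.006])

w = 1.0 / vi                                   # fixed-effect weights
q = np.sum(w * (yi - np.sum(w * yi) / np.sum(w)) ** 2)      # Cochran's Q
df = len(yi) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                  # between-study variance

w_star = 1.0 / (vi + tau2)                     # random-effects weights
mu = np.sum(w_star * yi) / np.sum(w_star)      # pooled log ratio
se = np.sqrt(1.0 / np.sum(w_star))
ratio = np.exp(mu)
ci = np.exp([mu - 1.96 * se, mu + 1.96 * se])
print(f"pooled ratio = {ratio:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```

Working on the log scale and back-transforming keeps the pooled estimate a genuine ratio of means, which is what allows comparison across PET studies with different tracers and quantification conventions.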
Seasonality of Kawasaki Disease: A Global Perspective
Burns, Jane C.; Herzog, Lauren; Fabri, Olivia; Tremoulet, Adriana H.; Rodó, Xavier; Uehara, Ritei; Burgner, David; Bainto, Emelia; Pierce, David; Tyree, Mary; Cayan, Daniel
2013-01-01
Background: Understanding global seasonal patterns of Kawasaki disease (KD) may provide insight into the etiology of this vasculitis that is now the most common cause of acquired heart disease in children in developed countries worldwide. Methods: Data from 1970-2012 from 25 countries distributed over the globe were analyzed for seasonality. The number of KD cases from each location was normalized to minimize the influence of greater numbers from certain locations. The presence of seasonal variation of KD at the individual locations was evaluated using three different tests: time series modeling, spectral analysis, and a Monte Carlo technique. Results: A defined seasonal structure emerged demonstrating broad coherence in fluctuations in KD cases across the Northern Hemisphere extra-tropical latitudes. In the extra-tropical latitudes of the Northern Hemisphere, KD case numbers were highest in January through March and approximately 40% higher than in the months of lowest case numbers from August through October. Datasets were much sparser in the tropics and the Southern Hemisphere extra-tropics and statistical significance of the seasonality tests was weak, but suggested a maximum in May through June, with approximately 30% higher number of cases than in the least active months of February, March and October. The seasonal pattern in the Northern Hemisphere extra-tropics was consistent across the first and second halves of the sample period. Conclusion: Using the first global KD time series, analysis of sites located in the Northern Hemisphere extra-tropics revealed statistically significant and consistent seasonal fluctuations in KD case numbers with high numbers in winter and low numbers in late summer and fall. Neither the tropics nor the Southern Hemisphere extra-tropics registered a statistically significant aggregate seasonal cycle. 
These data suggest a seasonal exposure to a KD agent that operates over large geographic regions and is concentrated during winter months in the Northern Hemisphere extra-tropics. PMID:24058585
Seasonality of Kawasaki disease: a global perspective.
Burns, Jane C; Herzog, Lauren; Fabri, Olivia; Tremoulet, Adriana H; Rodó, Xavier; Uehara, Ritei; Burgner, David; Bainto, Emelia; Pierce, David; Tyree, Mary; Cayan, Daniel
2013-01-01
Understanding global seasonal patterns of Kawasaki disease (KD) may provide insight into the etiology of this vasculitis that is now the most common cause of acquired heart disease in children in developed countries worldwide. Data from 1970-2012 from 25 countries distributed over the globe were analyzed for seasonality. The number of KD cases from each location was normalized to minimize the influence of greater numbers from certain locations. The presence of seasonal variation of KD at the individual locations was evaluated using three different tests: time series modeling, spectral analysis, and a Monte Carlo technique. A defined seasonal structure emerged demonstrating broad coherence in fluctuations in KD cases across the Northern Hemisphere extra-tropical latitudes. In the extra-tropical latitudes of the Northern Hemisphere, KD case numbers were highest in January through March and approximately 40% higher than in the months of lowest case numbers from August through October. Datasets were much sparser in the tropics and the Southern Hemisphere extra-tropics and statistical significance of the seasonality tests was weak, but suggested a maximum in May through June, with approximately 30% higher number of cases than in the least active months of February, March and October. The seasonal pattern in the Northern Hemisphere extra-tropics was consistent across the first and second halves of the sample period. Using the first global KD time series, analysis of sites located in the Northern Hemisphere extra-tropics revealed statistically significant and consistent seasonal fluctuations in KD case numbers with high numbers in winter and low numbers in late summer and fall. Neither the tropics nor the Southern Hemisphere extra-tropics registered a statistically significant aggregate seasonal cycle. 
These data suggest a seasonal exposure to a KD agent that operates over large geographic regions and is concentrated during winter months in the Northern Hemisphere extra-tropics.
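The Monte Carlo technique named above can be sketched as a simple permutation test on monthly counts. The counts, the seasonality statistic, and the permutation budget below are hypothetical stand-ins, not the study's actual normalized data or tests:

```python
import random

# Hypothetical monthly KD case counts over 5 years (winter-peaked).
counts = [42, 40, 38, 30, 28, 25, 24, 23, 26, 27, 33, 39] * 5

def seasonal_amplitude(series):
    """Range of the 12 monthly means: a simple seasonality statistic."""
    monthly = [series[m::12] for m in range(12)]
    means = [sum(m) / len(m) for m in monthly]
    return max(means) - min(means)

def monte_carlo_p(series, n_perm=2000, seed=0):
    """Permutation p-value: how often does shuffling the data away from
    its calendar order produce an amplitude as large as the observed one?"""
    rng = random.Random(seed)
    observed = seasonal_amplitude(series)
    hits = sum(1 for _ in range(n_perm)
               if seasonal_amplitude(rng.sample(series, len(series))) >= observed)
    return (hits + 1) / (n_perm + 1)

p = monte_carlo_p(counts)  # small p: the seasonal cycle is unlikely by chance
```

A small p-value indicates that a seasonal cycle of the observed size is unlikely to arise from randomly ordered counts.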
Multi-scale Quantitative Precipitation Forecasting Using ...
Global sea surface temperature (SST) anomalies can affect terrestrial precipitation via ocean-atmosphere interactions known as climate teleconnections. The non-stationary and non-linear character of the ocean-atmosphere system makes teleconnection signals difficult to detect at a local scale, and relying on linear correlation analysis alone can introduce large uncertainties. This paper explores the relationship between global SST and terrestrial precipitation with respect to long-term non-stationary teleconnection signals during 1981-2010 over three regions in North America and one in Central America. Empirical mode decomposition and wavelet analysis are applied in sequence to extract the intrinsic trend and the dominant oscillation of the SST and precipitation time series. After identifying possible associations between the dominant oscillation of seasonal precipitation and global SST through lagged correlation analysis, the statistically significant SST regions are extracted based on the correlation coefficient. With these characterized associations, the individual contributions of these SST forcing regions to the related precipitation responses are further quantified through nonlinear modeling with the aid of an extreme learning machine. Results indicate that the non-leading SST regions also contribute a salient portion of terrestrial precipitation variability compared to some known leading SST regions. In some cases, these
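The lagged correlation step can be sketched minimally as follows; the SST-index and precipitation series are hypothetical, and the paper's EMD/wavelet preprocessing is omitted:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(sst, precip, max_lag):
    """Correlate SST leading precipitation by 0..max_lag time steps."""
    return {lag: pearson(sst[:len(sst) - lag], precip[lag:])
            for lag in range(max_lag + 1)}

# Hypothetical series: precipitation echoes the SST index 3 steps later.
sst = [0.1, 0.5, 0.9, 0.4, -0.2, -0.6, -0.3, 0.2, 0.7, 1.0, 0.6, 0.0]
precip = [0.0, 0.0, 0.0] + sst[:-3]
corr = lagged_correlation(sst, precip, 5)
best_lag = max(corr, key=corr.get)
```

In a real analysis the lag with the strongest correlation identifies how far the ocean forcing leads the precipitation response.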
Global Tuberculosis Report 2016
WHO has published a global TB ...
Blacklock, Kristin; Verkhivker, Gennady M.
2014-01-01
A fundamental role of the Hsp90 chaperone in regulating functional activity of diverse protein clients is essential for the integrity of signaling networks. In this work we have combined biophysical simulations of the Hsp90 crystal structures with the protein structure network analysis to characterize the statistical ensemble of allosteric interaction networks and communication pathways in the Hsp90 chaperones. We have found that principal structurally stable communities could be preserved during dynamic changes in the conformational ensemble. The dominant contribution of the inter-domain rigidity to the interaction networks has emerged as a common factor responsible for the thermodynamic stability of the active chaperone form during the ATPase cycle. Structural stability analysis using force constant profiling of the inter-residue fluctuation distances has identified a network of conserved structurally rigid residues that could serve as global mediating sites of allosteric communication. Mapping of the conformational landscape with the network centrality parameters has demonstrated that stable communities and mediating residues may act concertedly with the shifts in the conformational equilibrium and could describe the majority of functionally significant chaperone residues. The network analysis has revealed a relationship between structural stability, global centrality and functional significance of hotspot residues involved in chaperone regulation. We have found that allosteric interactions in the Hsp90 chaperone may be mediated by modules of structurally stable residues that display high betweenness in the global interaction network. The results of this study have suggested that allosteric interactions in the Hsp90 chaperone may operate via a mechanism that combines rapid and efficient communication by a single optimal pathway of structurally rigid residues and more robust signal transmission using an ensemble of suboptimal multiple communication routes. 
This may be a universal requirement encoded in protein structures to balance the inherent tension between resilience and efficiency of the residue interaction networks. PMID:24922508
Orwat, Melanie Iris; Kempny, Aleksander; Bauer, Ulrike; Gatzoulis, Michael A; Baumgartner, Helmut; Diller, Gerhard-Paul
2015-09-15
The determinants of adult congenital heart disease (ACHD) research output are only partially understood. The heterogeneity of ACHD naturally calls for collaborative work; however, limited information exists on the impact of collaboration on academic performance. We aimed to examine the global topology of ACHD research, the distribution of research collaboration and its association with cumulative research output. Based on publications presenting original research between 2005 and 2011, a network analysis was performed quantifying centrality measures and key players in the field of ACHD. In addition, network maps were produced to illustrate the global distribution and interconnected nature of ACHD research. The proportion of collaborative research was 35.6% overall, with wide variation between countries (7.1 to 62.8%). The degree of research collaboration, as well as measures of network centrality (betweenness and degree centrality), were statistically associated with cumulative research output independently of national wealth and available workforce. The global ACHD research network was found to be scale-free, with a small number of central hubs and a relatively large number of peripheral nodes. In addition, we could identify potentially influential hubs based on cluster analysis and measures of centrality/key player analysis. Using network analysis methods, the current study illustrates the complex and global structure of ACHD research. It suggests that collaboration between research institutions is associated with higher academic output. As a consequence, national and international collaboration in ACHD research should be encouraged and the creation of an adequate supporting infrastructure should be further promoted. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Quasi-decadal Oscillation in the CMIP5 and CMIP3 Climate Model Simulations: California Case
NASA Astrophysics Data System (ADS)
Wang, J.; Yin, H.; Reyes, E.; Chung, F. I.
2014-12-01
The three ongoing drought years in California recall two other long historical drought periods: 1987-1992 and 1928-1934. This kind of interannual variability corresponds to the dominant 7-15 yr quasi-decadal oscillation in precipitation and streamflow in California. When using global climate model projections to assess the climate change impact on water resources planning in California, it is natural to ask whether global climate models are able to reproduce observed interannual variability such as the 7-15 yr quasi-decadal oscillation. Spectral analysis of tree-ring-reconstructed precipitation and the historical precipitation record confirms the existence of a 7-15 yr quasi-decadal oscillation in California. However, when spectral analysis was applied to all the CMIP5 and CMIP3 global climate model historical simulations using a wavelet analysis approach, it was found that only two models in CMIP3, CGCM 2.3.2a of MRI and NCAR PCM1.0, and only two models in CMIP5, MIROC5 and CESM1-WACCM, have statistically significant 7-15 yr quasi-decadal oscillations in California. More interestingly, the existence of the 7-15 yr quasi-decadal oscillation in the global climate model simulations is also sensitive to initial conditions: a 12-13 yr quasi-decadal oscillation occurs in one ensemble run of CGCM 2.3.2a of MRI but does not exist in the other four ensemble runs.
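The spectral check described here can be illustrated with a plain discrete-Fourier periodogram on a synthetic 120-year series containing a built-in 12-year cycle; this is a stand-in for the wavelet analysis actually used, and the data are hypothetical:

```python
import cmath
import math

def periodogram(x):
    """Squared DFT amplitudes for harmonics k = 1..n//2 of a series."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]          # remove the mean first
    power = {}
    for k in range(1, n // 2 + 1):
        c = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power[k] = abs(c) ** 2
    return power

# Hypothetical 120-year annual anomaly series with a pure 12-yr cycle.
n = 120
series = [math.sin(2 * math.pi * t / 12) for t in range(n)]
power = periodogram(series)
peak_k = max(power, key=power.get)
period = n / peak_k                    # dominant period in years
```

The dominant harmonic converts directly to a period; a quasi-decadal oscillation would show up as a power peak in the 7-15 yr band.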
Global impacts of the 1980s regime shift.
Reid, Philip C; Hari, Renata E; Beaugrand, Grégory; Livingstone, David M; Marty, Christoph; Straile, Dietmar; Barichivich, Jonathan; Goberville, Eric; Adrian, Rita; Aono, Yasuyuki; Brown, Ross; Foster, James; Groisman, Pavel; Hélaouët, Pierre; Hsu, Huang-Hsiung; Kirby, Richard; Knight, Jeff; Kraberg, Alexandra; Li, Jianping; Lo, Tzu-Ting; Myneni, Ranga B; North, Ryan P; Pounds, J Alan; Sparks, Tim; Stübi, René; Tian, Yongjun; Wiltshire, Karen H; Xiao, Dong; Zhu, Zaichun
2016-02-01
Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming, unforeseen multiplier effects may occur. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
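The change-point idea behind this kind of regime-shift detection can be sketched as a scan for the split point that maximizes the two-sample t statistic between the segments before and after it. The annual-anomaly series below is hypothetical, with a deliberate mean shift at index 10:

```python
import math

def t_statistic(a, b):
    """Two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / (sp * math.sqrt(1 / na + 1 / nb))

def best_changepoint(series, min_seg=5):
    """Index of the split maximizing |t| between the two segments."""
    return max(range(min_seg, len(series) - min_seg),
               key=lambda i: abs(t_statistic(series[:i], series[i:])))

# Hypothetical annual temperature anomalies with a shift after year 10.
series = [0.0, 0.1, -0.1, 0.05, -0.05, 0.0, 0.1, -0.1, 0.0, 0.05,
          0.5, 0.6, 0.45, 0.55, 0.5, 0.65, 0.4, 0.55, 0.6, 0.5]
shift_index = best_changepoint(series)
```

Sequential t-test methods such as STARS add running thresholds and significance screening on top of this basic comparison.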
Environmental assessment of Al-Hammar Marsh, Southern Iraq.
Al-Gburi, Hind Fadhil Abdullah; Al-Tawash, Balsam Salim; Al-Lafta, Hadi Salim
2017-02-01
(a) To determine the spatial distributions and levels of major and minor elements, as well as heavy metals, in water, sediment, and biota (plant and fish) in Al-Hammar Marsh, southern Iraq, and ultimately to supply more comprehensive information for policy-makers to manage contaminant inputs into the marsh so that their concentrations do not reach toxic levels. (b) To characterize the seasonal changes in the marsh surface water quality. (c) To address the potential environmental risk of these elements by comparison with historical levels and global quality guidelines (i.e., World Health Organization (WHO) standard limits). (d) To define the sources of these elements (i.e., natural and/or anthropogenic) using combined multivariate statistical techniques such as Principal Component Analysis (PCA) and Agglomerative Hierarchical Cluster Analysis (AHCA) along with pollution analysis (i.e., enrichment factor analysis). Water, sediment, plant, and fish samples were collected from the marsh, analyzed for major and minor ions as well as heavy metals, and then compared to historical levels and global quality guidelines (WHO guidelines). Multivariate statistical techniques such as PCA and AHCA were then used to determine element sourcing. Water analyses revealed unacceptable values for almost all physico-chemical and biological properties according to WHO standard limits for drinking water. Almost all major ions and heavy metal concentrations in water showed a distinct decreasing trend at the marsh outlet station compared to other stations. In general, major and minor ions, as well as heavy metals, exhibit higher concentrations in winter than in summer. Sediment analyses using multivariate statistical techniques revealed that Mg, Fe, S, P, V, Zn, As, Se, Mo, Co, Ni, Cu, Sr, Br, Cd, Ca, N, Mn, Cr, and Pb were derived from anthropogenic sources, while Al, Si, Ti, K, and Zr were primarily derived from natural sources. 
Enrichment factor analysis gave results compatible with the multivariate statistical findings. Analysis of heavy metals in plant samples revealed no pollution in plants in Al-Hammar Marsh. However, the concentrations of heavy metals in fish samples showed that all samples were contaminated by Pb, Mn, and Ni. The decrease in Tigris and Euphrates discharges during the past decades due to drought conditions and upstream damming, as well as the increasing stress of wastewater effluents from anthropogenic activities, led to degradation of the downstream Al-Hammar Marsh water quality in terms of physical, chemical, and biological properties, which were found to consistently exceed the historical and global quality objectives. However, the decreasing trend of element concentrations at the marsh outlet station compared to other stations indicates that the marsh plays an important role as a natural filtration and bioremediation system. Higher element concentrations in winter were due to runoff from the washing of the surrounding Sabkha during flooding by winter rainstorms. Finally, the high concentrations of heavy metals in fish samples can be attributed to bioaccumulation and biomagnification processes.
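The enrichment factor used here is a simple ratio of ratios: a metal's concentration normalized by a conservative reference element (often Al) in the sample, divided by the same ratio in a background such as average crust. The concentrations below are hypothetical illustrations, not the marsh data:

```python
def enrichment_factor(sample_metal, sample_ref, background_metal, background_ref):
    """EF = (M/ref)_sample / (M/ref)_background."""
    return (sample_metal / sample_ref) / (background_metal / background_ref)

# Hypothetical sediment concentrations (mg/kg); Al as the reference element.
ef_pb = enrichment_factor(sample_metal=60.0, sample_ref=50_000.0,
                          background_metal=20.0, background_ref=80_000.0)

# EF values well above ~2 are commonly read as anthropogenic enrichment,
# while EF near 1 suggests a natural (crustal) origin.
anthropogenic = ef_pb > 2.0
```

Normalizing by a reference element corrects for grain-size and dilution effects before comparing against the background.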
Studies of oceanic tectonics based on GEOS-3 satellite altimetry
NASA Technical Reports Server (NTRS)
Poehls, K. A.; Kaula, W. M.; Schubert, G.; Sandwell, D.
1979-01-01
Using statistical analysis, geoidal admittance (the relationship between the ocean geoid and seafloor topography) obtained from GEOS-3 altimetry was compared to various model admittances. Analysis of several altimetry tracks in the Pacific Ocean demonstrated a low coherence between altimetry and seafloor topography except where the track crosses active or recent tectonic features. However, global statistical studies using the much larger data base of all available gravimetry showed a positive correlation of oceanic gravity with topography. The oceanic lithosphere was modeled by simultaneously inverting surface wave dispersion, topography, and gravity data. Efforts to incorporate geoid data into the inversion showed that the base of the subchannel can be better resolved with geoid rather than gravity data. Thermomechanical models of seafloor spreading taking into account differing plate velocities, heat source distributions, and rock rheologies were discussed.
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the Python language. It allows for constructing functional networks (also known as climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn makes it possible to study the complex dynamics of geoscientific systems, as recorded by time series, by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined drawing on several examples from climatology.
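The basic climate-network construction that such toolboxes automate can be sketched by thresholding pairwise correlations between grid-point time series. This is a pure-Python illustration with hypothetical data, not the pyunicorn API:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def climate_network(series_by_node, threshold):
    """Link two nodes when |corr| of their time series exceeds threshold."""
    nodes = list(series_by_node)
    edges = set()
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            if abs(pearson(series_by_node[a], series_by_node[b])) > threshold:
                edges.add((a, b))
    return edges

# Hypothetical grid-point series: A and B co-vary; C is unrelated noise.
data = {
    "A": [1, 2, 3, 4, 5, 6, 5, 4, 3, 2],
    "B": [2, 4, 6, 8, 10, 12, 10, 8, 6, 4],
    "C": [5, 1, 4, 2, 5, 1, 3, 2, 4, 1],
}
edges = climate_network(data, threshold=0.9)  # only the A-B link survives
```

Network measures such as degree or betweenness are then computed on the resulting edge set.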
Global Statistical Learning in a Visual Search Task
ERIC Educational Resources Information Center
Jones, John L.; Kaschak, Michael P.
2012-01-01
Locating a target in a visual search task is facilitated when the target location is repeated on successive trials. Global statistical properties also influence visual search, but have often been confounded with local regularities (i.e., target location repetition). In two experiments, target locations were not repeated for four successive trials,…
ERIC Educational Resources Information Center
Komatsu, Hikaru; Rappleye, Jeremy
2017-01-01
Several recent, highly influential comparative studies have made strong statistical claims that improvements on global learning assessments such as PISA will lead to higher GDP growth rates. These claims have provided the primary source of legitimation for policy reforms championed by leading international organisations, most notably the World…
Lee, Young-Beom; Lee, Jeonghyeon; Tak, Sungho; Lee, Kangjoo; Na, Duk L; Seo, Sang Won; Jeong, Yong; Ye, Jong Chul
2016-01-15
Recent studies of functional connectivity MR imaging have revealed that the default-mode network activity is disrupted in diseases such as Alzheimer's disease (AD). However, there is not yet a consensus on the preferred method for resting-state analysis. Because the brain is reported to have complex interconnected networks according to graph theoretical analysis, the independence assumption, as in the popular independent component analysis (ICA) approach, often does not hold. Here, rather than relying on the independence assumption, we present a new statistical parameter mapping (SPM)-type analysis method based on a sparse graph model where temporal dynamics at each voxel position are described as a sparse combination of global brain dynamics. In particular, a new concept of a spatially adaptive design matrix has been proposed to represent local connectivity that shares the same temporal dynamics. If we further assume that local network structures within a group are similar, the estimation problem of global and local dynamics can be solved using sparse dictionary learning for the concatenated temporal data across subjects. Moreover, under the homoscedasticity variance assumption across subjects and groups that is often used in SPM analysis, the aforementioned individual and group analyses using sparse dictionary learning can be accurately modeled by a mixed-effect model, which also facilitates a standard SPM-type group-level inference using summary statistics. Using an extensive resting fMRI data set obtained from normal, mild cognitive impairment (MCI), and Alzheimer's disease patient groups, we demonstrated that the changes in the default mode network extracted by the proposed method are more closely correlated with the progression of Alzheimer's disease. Copyright © 2015 Elsevier Inc. All rights reserved.
Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, Ronald M.
2015-01-01
The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Global Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wave number errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
Enzensberger, Christian; Achterberg, Friederike; Graupner, Oliver; Wolter, Aline; Herrmann, Johannes; Axt-Fliedner, Roland
2017-06-01
Frame rates (FR) used for strain analysis assessed by speckle tracking in fetal echocardiography vary considerably. The aim of this study was to investigate the influence of the FR on strain analysis in 2D speckle tracking. Fetal echocardiography was performed prospectively on a Toshiba Aplio 500 system and a Toshiba Artida system, respectively. Based on an apical or basal four-chamber view of the fetal heart, cine loops were stored with a FR of 30 fps (Aplio 500) and 60 fps (Artida/Aplio 500). For both groups (30 fps and 60 fps), global and segmental longitudinal peak systolic strain (LPSS) values of both the left (LV) and right (RV) ventricles were assessed by 2D wall-motion tracking. A total of 101 fetuses, distributed across three study groups, were included. The mean gestational age was 25.2±5.0 weeks. Mean global LPSS values for the RV were -16.07% in the 30 fps group and -16.47% in the 60 fps group; mean global LPSS values for the LV were -17.54% and -17.06%, respectively. Comparing global and segmental LPSS values of both the RV and LV did not show any statistically significant differences between the two groups. Myocardial 2D strain analysis by wall-motion tracking was feasible at both 30 and 60 fps. The obtained global and segmental LPSS values of both ventricles were relatively independent of the acquisition rate. © 2017, Wiley Periodicals, Inc.
Global aesthetic surgery statistics: a closer look.
Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas
2017-08-01
Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics from the published ISAPS 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When the size of the underlying national populations is considered, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon varies greatly between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis other countries surpass them in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
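The per-capita normalization described here is straightforward; the counts and populations below are hypothetical, chosen to show how a ranking by absolute numbers and a ranking per 100,000 people can disagree:

```python
def procedures_per_100k(procedures, population):
    """Surgical procedures per 100,000 inhabitants."""
    return procedures * 100_000 / population

# Hypothetical data: the larger country leads in absolute counts,
# but the smaller one leads per capita.
countries = {
    "CountryA": {"procedures": 1_500_000, "population": 320_000_000},
    "CountryB": {"procedures": 400_000, "population": 50_000_000},
}

rates = {name: procedures_per_100k(d["procedures"], d["population"])
         for name, d in countries.items()}
per_capita_ranking = sorted(rates, key=rates.get, reverse=True)
```

Dividing the per-country totals by the number of surgeons follows the same pattern with a different denominator.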
Ruiliang Pu; Zhanqing Li; Peng Gong; Ivan Csiszar; Robert Fraser; Wei-Min Hao; Shobha Kondragunta; Fuzhong Weng
2007-01-01
Fires in boreal and temperate forests play a significant role in the global carbon cycle. While forest fires in North America (NA) have been surveyed extensively by U.S. and Canadian forest services, most fire records are limited to seasonal statistics without information on temporal evolution and spatial expansion. Such dynamic information is crucial for modeling fire...
New axion and hidden photon constraints from a solar data global fit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vinyoles, N.; Serenelli, A.; Isern, J.
2015-10-01
We present a new statistical analysis that combines helioseismology (sound speed, surface helium and convective radius) and solar neutrino observations (the {sup 8}B and {sup 7}Be fluxes) to place upper limits on the properties of non-standard weakly interacting particles. Our analysis includes theoretical and observational errors, accounts for tensions between input parameters of solar models and can be easily extended to include other observational constraints. We present two applications to test the method: the well-studied case of axions and axion-like particles, and the more novel case of low-mass hidden photons. For axions we obtain an upper limit at 3σ for the axion-photon coupling constant of g{sub aγ} < 4.1 · 10{sup −10} GeV{sup −1}. For hidden photons we obtain the most restrictive upper limit available across a wide range of masses for the product of the kinetic mixing and mass, χ m < 1.8 ⋅ 10{sup −12} eV at 3σ. Both cases improve the previous solar constraints based on the Standard Solar Models, showing the power of using a global statistical approach.
Web-GIS-based SARS epidemic situation visualization
NASA Astrophysics Data System (ADS)
Lu, Xiaolin
2004-03-01
To support research, statistical analysis, and dissemination of SARS epidemic information according to its spatial position, this paper proposes a unified global visualization platform for the SARS epidemic situation based on Web-GIS and scientific visualization technology. To set up the platform, the architecture of a Web-GIS-based interoperable information system is adopted, enabling the public to report SARS virus information to health care centers visually using web visualization technology. A GIS Java applet is used to visualize the relationship between spatial graphical data and virus distribution, and other web-based graphics such as curves, bars, maps and multi-dimensional figures are used to visualize the relationship of SARS virus tendency with time, patient numbers or locations. The platform is designed to display SARS information in real time, simulate the real epidemic situation visually, and offer analysis tools for health departments and policy-making government departments to support decision-making in preventing the spread of the SARS epidemic virus. It could be used to analyze the virus situation through a visualized graphical interface, isolate the areas around virus sources, and help control the outbreak within the shortest time. It could be applied to SARS-prevention systems for information broadcasting, data management, statistical analysis, and decision support.
Performance Comparison of Big Data Analytics With NEXUS and Giovanni
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Huang, T.; Lynnes, C.
2016-12-01
NEXUS is an emerging data-intensive analysis framework developed with a new approach for handling science data that enables large-scale data analysis. It is available as open source. We compare the performance of NEXUS and Giovanni for three statistics algorithms applied to NASA datasets. Giovanni is a statistics web service at NASA Distributed Active Archive Centers (DAACs). NEXUS is a cloud-computing environment developed at JPL and built on Apache Solr, Cassandra, and Spark. We compute a global time-averaged map, a correlation map, and an area-averaged time series. The first two algorithms average over time to produce a value for each pixel in a 2-D map; the third averages spatially to produce a single value for each time step. This talk reports benchmark comparison findings that indicate a 15x speedup with NEXUS over Giovanni when computing the area-averaged time series of daily precipitation rate for the Tropical Rainfall Measuring Mission (TRMM, 0.25-degree spatial resolution) over the Continental United States for 14 years (2000-2014), with 64-way parallelism and 545 tiles per granule. 16-way parallelism with 16 tiles per granule worked best with NEXUS for computing an 18-year (1998-2015) TRMM daily precipitation global time-averaged map (2.5x speedup) and an 18-year global map of correlation between TRMM daily precipitation and TRMM real-time daily precipitation (7x speedup). These and other benchmark results will be presented along with key lessons learned in applying the NEXUS tiling approach to big data analytics in the cloud.
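The two averaging operations being benchmarked can be sketched on a toy data cube; the values are hypothetical, and a real implementation would weight cells by area and distribute the work across tiles:

```python
def time_averaged_map(cube):
    """Average a [time][lat][lon] cube over time -> one 2-D map."""
    nt = len(cube)
    nlat, nlon = len(cube[0]), len(cube[0][0])
    return [[sum(cube[t][i][j] for t in range(nt)) / nt
             for j in range(nlon)] for i in range(nlat)]

def area_averaged_series(cube):
    """Average each time step over all grid cells -> one time series.
    (A production version would weight each cell by cos(latitude).)"""
    ncell = len(cube[0]) * len(cube[0][0])
    return [sum(sum(row) for row in grid) / ncell for grid in cube]

# Hypothetical 3-step, 2x2-cell precipitation cube (mm/day).
cube = [
    [[1.0, 2.0], [3.0, 4.0]],
    [[2.0, 3.0], [4.0, 5.0]],
    [[3.0, 4.0], [5.0, 6.0]],
]
tmap = time_averaged_map(cube)
series = area_averaged_series(cube)
```

The first operation collapses the time axis (one value per pixel), while the second collapses the spatial axes (one value per time step); the tiling approach parallelizes both by partitioning the cube.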
NASA Astrophysics Data System (ADS)
Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.
2014-12-01
Rivers are among the main sources of freshwater worldwide, so analysis and maintenance of this water resource is globally a matter of major concern. This paper deals with the assessment of the surface water quality of the Ichamati river using multivariate statistical techniques. Samples were collected at eight distinct surface water quality observation stations, and statistical techniques were applied to the physico-chemical parameters and the depth of siltation. Cluster analysis is used to determine the relations between surface water quality and siltation depth of the river Ichamati, and multiple regression and mathematical equation modeling are used to characterize its surface water quality on the basis of physico-chemical parameters. Surface water quality downstream was found to differ from that upstream. Analysis of the water quality parameters of the Ichamati river clearly indicates a high pollution load on the river water, attributable to agricultural discharge, tidal effects, and soil erosion. The results further reveal that water quality degrades as the depth of siltation increases.
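The cluster analysis step described above can be sketched with standard tools. This is an illustrative example on synthetic station data (the parameter values are hypothetical placeholders, not the Ichamati measurements): stations with similar physico-chemical profiles group together, separating upstream- from downstream-like conditions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# hypothetical standardized (z-scored) physico-chemical parameters
# for 8 stations; columns could be e.g. pH, conductivity, BOD, siltation depth
rng = np.random.default_rng(1)
upstream = rng.normal(0.0, 0.3, size=(4, 4))     # low-pollution profile
downstream = rng.normal(2.0, 0.3, size=(4, 4))   # high-pollution profile
X = np.vstack([upstream, downstream])

Z = linkage(X, method="ward")                    # agglomerative (Ward) clustering
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram at 2 clusters
# stations 0-3 and 4-7 fall into two separate clusters
```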
NASA Astrophysics Data System (ADS)
Kawzenuk, B.; Sellars, S. L.; Nguyen, P.; Ralph, F. M.; Sorooshian, S.
2017-12-01
The CONNected objECT (CONNECT) algorithm is applied to Integrated Water Vapor Transport (IVT) data from NASA's Modern-Era Retrospective Analysis for Research and Applications, Version 2 reanalysis product for the period 1980 to 2016 to study water vapor transport globally. The algorithm generates life-cycle records, as statistical objects, of the time and space location of evolving strong vapor transport events. Global statistics are presented and used to investigate how climate variability affects the events' location and frequency. Results show distinct water vapor object frequency, with seasonal peaks during Northern Hemisphere and Southern Hemisphere winter. Moreover, a positive linear trend in the annual number of objects is reported, increasing by 3.58 objects year-over-year (+/- 1.39 at 95% confidence). We identify five distinct regions where these events typically exist (the southeastern United States, eastern China, the South Pacific south of 25°S, eastern South America, and off the southern tip of South Africa) and regions where they rarely exist (the eastern South Pacific Ocean and the central southern Atlantic Ocean between 5°N and 25°S). Finally, event frequency and geographical location are also shown to be related to the Arctic Oscillation, the Pacific North American Pattern, and the Quasi-Biennial Oscillation.
Global, local and focused geographic clustering for case-control data with residential histories
Jacquez, Geoffrey M; Kaufmann, Andy; Meliker, Jaymie; Goovaerts, Pierre; AvRuskin, Gillian; Nriagu, Jerome
2005-01-01
Background: This paper introduces a new approach for evaluating clustering in case-control data that accounts for residential histories. Although many statistics have been proposed for assessing local, focused, and global clustering in health outcomes, few, if any, exist for evaluating clusters when individuals are mobile. Methods: Local, global, and focused tests for residential histories are developed based on sets of matrices of nearest-neighbor relationships that reflect the changing topology of cases and controls. Exposure traces are defined that account for the latency between exposure and disease manifestation and that use exposure windows whose duration may vary. Several of the methods so derived are applied to evaluate clustering of residential histories in a case-control study of bladder cancer in southeastern Michigan. These data are still being collected, and the analysis is conducted for demonstration purposes only. Results: Statistically significant clustering of residential histories of cases was found but is likely due to delayed reporting of cases by one of the hospitals participating in the study. Conclusion: Data with residential histories are preferable when causative exposures and disease latencies occur on a long enough time span that human mobility matters. To analyze such data, methods are needed that take residential histories into account. PMID:15784151
Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall
NASA Astrophysics Data System (ADS)
Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik
2016-02-01
Rainfall is one of the most variable climatic elements, and extreme rainfall in particular has many negative impacts, so methods are needed to minimize the damage it may cause. Global circulation models (GCMs) are currently the best tools for forecasting global climate change, including extreme rainfall. Statistical downscaling (SD) is a technique for developing the relationship between GCM output, as global-scale independent variables, and rainfall, as a local-scale response variable. Working directly with GCM output against observations is difficult because it is high dimensional and its variables are multicollinear. Common methods for handling this problem are principal component analysis (PCA) and partial least squares regression; a newer alternative is the lasso, which has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression, in turn, can target extreme rainfall at both the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results show that extreme rainfall (extreme wet in January, February, and December) in Indramayu is predicted well by the model at the 90th quantile.
Spatial interpolation of solar global radiation
NASA Astrophysics Data System (ADS)
Lussana, C.; Uboldi, F.; Antoniazzi, C.
2010-09-01
Solar global radiation is defined as the radiant flux incident onto an area element of the terrestrial surface. Direct knowledge of it plays a crucial role in many applications, from agrometeorology to environmental meteorology. The ARPA Lombardia meteorological network includes about one hundred pyranometers, mostly distributed in the southern part of the Alps and in the centre of the Po Plain. A statistical interpolation method based on an implementation of Optimal Interpolation is applied to the hourly averages of the solar global radiation observations measured by the ARPA Lombardia network. The background field is obtained using SMARTS (the Simple Model of the Atmospheric Radiative Transfer of Sunshine; Gueymard, 2001). The model is initialised assuming clear-sky conditions and takes into account the solar position and orography-related effects (shade and reflection). The interpolation of pyranometric observations introduces information about cloud presence and influence into the analysis fields. A particular effort is devoted to preventing observations affected by large errors of various kinds (representativity errors, systematic errors, gross errors) from entering the analysis procedure. The inclusion of direct cloud information from satellite observations is also planned.
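The Optimal Interpolation analysis step combines a background field with observations via the standard gain-matrix update, x_a = x_b + K(y − Hx_b) with K = BHᵀ(HBHᵀ + R)⁻¹. The following is a minimal 1-D sketch with toy numbers (not the ARPA Lombardia configuration): a clear-sky background is pulled down near two cloudy pyranometer observations, and correlated background errors spread the correction to neighbouring grid points.

```python
import numpy as np

def optimal_interpolation(xb, y, H, B, R):
    """Statistical (optimal) interpolation analysis step.

    xb : background field (n,)       e.g. clear-sky model radiation
    y  : observations (m,)           e.g. pyranometer hourly averages
    H  : observation operator (m, n)
    B  : background error covariance (n, n)
    R  : observation error covariance (m, m)
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return xb + K @ (y - H @ xb)

# 1-D toy: 5 grid points, observations at grid points 1 and 3
xb = np.full(5, 800.0)                              # W/m^2, clear-sky background
y = np.array([500.0, 520.0])                        # cloudy observations
H = np.zeros((2, 5)); H[0, 1] = H[1, 3] = 1.0
d = np.abs(np.subtract.outer(np.arange(5), np.arange(5)))
B = 100.0**2 * np.exp(-d / 2.0)                     # correlated background errors
R = 20.0**2 * np.eye(2)                             # uncorrelated obs errors
xa = optimal_interpolation(xb, y, H, B, R)
```

Because the observation error (20 W/m²) is much smaller than the background error (100 W/m²), the analysis closely follows the observations at the observed points.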
Rhodes, Lindsay A; Huisingh, Carrie E; Quinn, Adam E; McGwin, Gerald; LaRussa, Frank; Box, Daniel; Owsley, Cynthia; Girkin, Christopher A
2017-02-01
To examine if racial differences in Bruch's membrane opening minimum rim width (BMO-MRW) in spectral-domain optical coherence tomography (SDOCT) exist, specifically between people of African descent (AD) and European descent (ED) in normal ocular health. Cross-sectional study. Patients presenting for a comprehensive eye examination at retail-based primary eye clinics were enrolled based on ≥1 of the following at-risk criteria for glaucoma: AD aged ≥40 years, ED aged ≥50 years, diabetes, family history of glaucoma, and/or pre-existing diagnosis of glaucoma. Participants with normal optic nerves on examination received SDOCT of the optic nerve head (24 radial scans). Global and regional (temporal, superotemporal, inferotemporal, nasal, superonasal, and inferonasal) BMO-MRW were measured and compared by race using generalized estimating equations. Models were adjusted for age, sex, and BMO area. SDOCT scans from 269 eyes (148 participants) were included in the analysis. Mean global BMO-MRW declined as age increased. After adjusting for age, sex, and BMO area, there was not a statistically significant difference in mean global BMO-MRW by race (P = .60). Regionally, the mean BMO-MRW was lower in the crude model among AD eyes in the temporal, superotemporal, and nasal regions and higher in the inferotemporal, superonasal, and inferonasal regions. However, in the adjusted model, these differences were not statistically significant. BMO-MRW was not statistically different between those of AD and ED. Race-specific normative data may not be necessary for the deployment of BMO-MRW in AD patients. Copyright © 2016 Elsevier Inc. All rights reserved.
Rhodes, Lindsay A.; Huisingh, Carrie E.; Quinn, Adam E.; McGwin, Gerald; LaRussa, Frank; Box, Daniel; Owsley, Cynthia; Girkin, Christopher A.
2016-01-01
Purpose: To examine if racial differences in Bruch's membrane opening-minimum rim width (BMO-MRW) in spectral domain optical coherence tomography (SDOCT) exist, specifically between people of African descent (AD) and European descent (ED) in normal ocular health. Design: Cross-sectional study. Methods: Patients presenting for a comprehensive eye exam at retail-based primary eye clinics were enrolled based on ≥1 of the following at-risk criteria for glaucoma: AD aged ≥40 years, ED aged ≥50 years, diabetes, family history of glaucoma, and/or preexisting diagnosis of glaucoma. Participants with normal optic nerves on exam received SDOCT of the optic nerve head (24 radial scans). Global and regional (temporal, superotemporal, inferotemporal, nasal, superonasal, and inferonasal) BMO-MRW were measured and compared by race using generalized estimating equations. Models were adjusted for age, gender, and BMO area. Results: SDOCT scans from 269 eyes (148 participants) were included in the analysis. Mean global BMO-MRW declined as age increased. After adjusting for age, gender, and BMO area, there was not a statistically significant difference in mean global BMO-MRW by race (P = 0.60). Regionally, the mean BMO-MRW was lower in the crude model among AD eyes in the temporal, superotemporal, and nasal regions and higher in the inferotemporal, superonasal, and inferonasal regions. However, in the adjusted model, these differences were not statistically significant. Conclusions: BMO-MRW was not statistically different between those of AD and ED. Race-specific normative data may not be necessary for the deployment of BMO-MRW in AD patients. PMID:27825982
NASA Astrophysics Data System (ADS)
Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.
2015-08-01
We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. Compared to OPSv5.6, the new method achieves: (1) a significant reduction of random errors (standard deviations) of optimized bending angles, down to about half their size or more; (2) a reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall, the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
What Fraction of Global Fire Activity Can Be Forecast Using Sea Surface Temperatures?
NASA Astrophysics Data System (ADS)
Chen, Y.; Randerson, J. T.; Morton, D. C.; Andela, N.; Giglio, L.
2015-12-01
Variations in sea surface temperatures (SSTs) can influence climate dynamics in local and remote land areas, and thus the fire-climate interactions that govern burned area. SST information has recently been used in statistical models to create seasonal outlooks of fire season severity in South America and as the initial condition for dynamical model predictions of fire activity in Indonesia. However, the degree to which large-scale ocean-atmosphere interactions influence burned area in other continental regions has not been systematically explored. Here we quantify how much global burned area can be predicted using SSTs in 14 different ocean regions as statistical predictors. We first examined lagged correlations between GFED4s burned area and each of the 14 ocean climate indices (OCIs) individually; the maximum correlations across OCIs were used to construct a global map of fire predictability. About half of global burned area can be forecast this way 3 months before the peak burning month (with a Pearson's r of 0.5 or higher), with the highest predictability in Central America and Equatorial Asia. Several hotspots of predictability were identified using k-means cluster analysis, and within these regions we tested whether using two OCIs from different oceans improves the forecast. Our forecast models are based on near-real-time SST data and may therefore support the development of new seasonal outlooks for fire activity that aid the sustainable management of these fire-prone ecosystems.
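The lagged-correlation screening described above can be sketched as follows; the series here are synthetic stand-ins, not GFED4s burned area or a real ocean climate index. The index leads burned area by three months by construction, and scanning the lags recovers that.

```python
import numpy as np

def lagged_correlation(oci, burned, lag):
    """Pearson r with the ocean climate index leading burned area by `lag` steps."""
    n = len(oci) - lag
    return float(np.corrcoef(oci[:n], burned[lag:])[0, 1])

# synthetic monthly series: burned area responds to the index 3 months later
rng = np.random.default_rng(7)
oci = rng.normal(size=240)                          # a Niño-like index (toy)
burned = np.empty(240)
burned[:3] = rng.normal(size=3)
burned[3:] = 0.8 * oci[:-3] + 0.3 * rng.normal(size=237)

r_by_lag = [lagged_correlation(oci, burned, lag) for lag in range(7)]
best = int(np.argmax(np.abs(r_by_lag)))             # strongest predictive lag
```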
The effect of solar radio bursts on the GNSS radio occultation signals
NASA Astrophysics Data System (ADS)
Yue, Xinan; Schreiner, William S.; Kuo, Ying-Hwa; Zhao, Biqiang; Wan, Weixing; Ren, Zhipeng; Liu, Libo; Wei, Yong; Lei, Jiuhou; Solomon, Stan; Rocken, Christian
2013-09-01
A solar radio burst (SRB) is radio wave emission following a solar flare, covering a broad frequency range and originating in the Sun's atmosphere. During an SRB, radio waves at specific frequencies can interfere with Global Navigation Satellite System (GNSS) signals and thereby disturb the received signals. In this study, low Earth orbit (LEO)-based high-resolution GNSS radio occultation (RO) signals from multiple satellites (COSMIC, CHAMP, GRACE, SAC-C, Metop-A, and TerraSAR-X), processed at the University Corporation for Atmospheric Research (UCAR), were used for the first time to evaluate the effect of SRBs on the RO technique. Radio flux observed by the Radio Solar Telescope Network (RSTN) was used to indicate SRB occurrence. An extreme case on 6 December 2006 and statistics for April 2006 to September 2012 were studied. During SRB occurrence, the LEO RO signals show frequent loss of lock (LOL), simultaneous decreases in L1 and L2 signal-to-noise ratio (SNR) globally during daytime, small-scale SNR perturbations, and a decreased successful retrieval percentage (SRP) for both ionospheric and atmospheric occultations. A potential harmonic band interference was identified. Both the reduced data volume and the reduced data quality will affect weather prediction, climate study, and space weather monitoring that use RO data during SRB times. Statistically, the SRP of ionospheric and atmospheric occultation retrievals decreases by ~4% and ~13%, respectively, while the L1 and L2 SNR decrease by ~5.7% and ~11.7%, respectively. A threshold of ~1807 SFU at 1415 MHz, above which an observable GNSS SNR decrease results, was derived from our statistical analysis.
NASA Technical Reports Server (NTRS)
1978-01-01
Research activities related to global weather, ocean/air interactions, and climate are reported. The global weather research is aimed at improving the assimilation of satellite-derived data in weather forecast models, developing analysis/forecast models that can more fully utilize satellite data, and developing new measures of forecast skill to properly assess the impact of satellite data on weather forecasting. The oceanographic research goal is to understand and model the processes that determine the general circulation of the oceans, focusing on those that affect sea surface temperature and oceanic heat storage, the oceanographic variables with the greatest influence on climate. The climate research objective is to support the development and effective utilization of space-acquired data in climate forecast models, to conduct sensitivity studies to determine the effect of lower boundary conditions on climate, and to carry out predictability studies to determine which global climate features can be modeled either deterministically or statistically.
Improving the Global Precipitation Record: GPCP Version 2.1
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Bolvin, David t.; Gu, Guojun
2009-01-01
The GPCP has developed Version 2.1 of its long-term (1979-present) global Satellite-Gauge (SG) data sets to take advantage of the improved GPCC gauge analysis, which is one key input. In addition, the OPI estimates used in the pre-SSM/I era have been rescaled to 20 years of the SSM/I-era SG. The monthly, pentad, and daily GPCP products have been entirely reprocessed, continuing to enforce consistency of the submonthly estimates with the monthly. Version 2.1 is close to Version 2, with the global ocean, land, and total values about 0%, 6%, and 2% higher, respectively. The revised long-term global precipitation rate is 2.68 mm/d. The corresponding tropical (25°N-25°S) increases are 0%, 7%, and 3%. Long-term linear changes in the data tend to be smaller in Version 2.1, but the statistics are sensitive to the threshold for land/ocean separation and to use of the pre-SSM/I part of the record.
NASA Astrophysics Data System (ADS)
Sellars, S. L.; Kawzenuk, B.; Nguyen, P.; Ralph, F. M.; Sorooshian, S.
2017-12-01
The CONNected objECT (CONNECT) algorithm is applied to global Integrated Water Vapor Transport data from NASA's Modern-Era Retrospective Analysis for Research and Applications, Version 2 reanalysis product for the period 1980 to 2016. The algorithm generates life-cycle records, in time and space, of evolving strong vapor transport events. We show five regions, located in the midlatitudes, where events typically exist (off the coast of the southeastern United States, eastern China, eastern South America, off the southern tip of South Africa, and in the southeastern Pacific Ocean). Global statistics show distinct genesis and termination regions and global seasonal peak frequency during Northern Hemisphere late fall/winter and Southern Hemisphere winter. In addition, event frequency and geographical location are shown to be modulated by the Arctic Oscillation, the Pacific North American Pattern, and the Quasi-Biennial Oscillation. Moreover, a positive linear trend in the annual number of objects is reported, increasing by 3.58 objects year-over-year.
A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration
NASA Technical Reports Server (NTRS)
Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang
2014-01-01
Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed that lets the public, using smartphones or the Internet, report new flood events; it is also intended to engage citizen scientists so that they become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by state-of-the-art cloud computing and crowdsourcing technology. CyberFlood presents an opportunity to modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.
Geo-located Twitter as proxy for global mobility patterns.
Hawelka, Bartosz; Sitko, Izabela; Beinat, Euro; Sobolevsky, Stanislav; Kazakopoulos, Pavlos; Ratti, Carlo
2014-05-27
The pervasive presence of location-sharing services has given researchers unprecedented access to direct records of human activity in space and time. This article analyses geo-located Twitter messages in order to uncover global patterns of human mobility. Based on a dataset of almost a billion tweets recorded in 2012, we estimate the volume of international travelers by country of residence. Mobility profiles of different nations were examined based on such characteristics as mobility rate, radius of gyration, diversity of destinations, and inflow-outflow balance. Temporal patterns disclose universally valid seasons of increased international mobility and the particular character of the international travels of different nations. Our analysis of the community structure of the Twitter mobility network reveals spatially cohesive regions that follow the regional division of the world. We validate our results using global tourism statistics and mobility models provided by other authors and argue that Twitter is exceptionally useful for understanding and quantifying global mobility patterns.
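Among the mobility characteristics listed, the radius of gyration is the RMS distance of a user's recorded positions from their centroid. A minimal sketch with haversine distances follows (synthetic points, not the Twitter dataset; the lat/lon centroid is a common simplification for small spatial extents).

```python
import numpy as np

def radius_of_gyration(lat, lon):
    """RMS great-circle distance (km) of a user's points from their centroid."""
    R = 6371.0  # mean Earth radius, km
    clat, clon = np.mean(lat), np.mean(lon)
    lat1, lon1 = np.radians(lat), np.radians(lon)
    lat2, lon2 = np.radians(clat), np.radians(clon)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = np.sin(dlat / 2)**2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2)**2
    d = 2 * R * np.arcsin(np.sqrt(a))   # haversine distance to centroid
    return float(np.sqrt(np.mean(d**2)))

# toy user: tweets scattered around Salzburg (47.8 N, 13.0 E)
rng = np.random.default_rng(3)
lat = 47.8 + 0.1 * rng.normal(size=100)
lon = 13.0 + 0.1 * rng.normal(size=100)
rg = radius_of_gyration(lat, lon)   # on the order of 10 km for this local user
```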
A heuristic evaluation of long-term global sea level acceleration
NASA Astrophysics Data System (ADS)
Spada, Giorgio; Olivieri, Marco; Galassi, Gaia
2015-05-01
Given their scientific and social implications, global mean sea level rise (GMSLR), its possible causes, and its future trend have long been a challenge. For the twentieth century, reconstructions generally indicate a rate of GMSLR in the range of 1.5 to 2.0 mm yr-1. However, the existence of nonlinear trends is still debated, and current estimates of the secular acceleration carry ample uncertainties. Here we use the various GMSLR estimates published in scholarly journals since the 1940s for a heuristic assessment of global sea level acceleration. The approach, an alternative to sea level reconstructions, is based on simple statistical methods and exploits the principles of meta-analysis. Our results point to a global sea level acceleration of 0.54 ± 0.27 mm/yr/century (1σ) between 1898 and 1975. This supports independent estimates and suggests that a sea level acceleration since the early 1900s is more likely than currently believed.
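In such an assessment the acceleration is twice the quadratic coefficient of a polynomial fit of sea level against time. The sketch below illustrates this on synthetic data (not the published GMSLR estimates used in the paper): a series is generated with a known acceleration of 0.54 mm/yr/century and the fit recovers it.

```python
import numpy as np

# synthetic annual GMSL (mm): rate 1.7 mm/yr, acceleration 0.54 mm/yr/century
years = np.arange(1898, 1976)
t = years - years[0]
acc_true = 0.54 / 100.0                  # mm/yr per year
rng = np.random.default_rng(5)
gmsl = 0.5 * acc_true * t**2 + 1.7 * t + rng.normal(0.0, 2.0, size=t.size)

# least-squares quadratic fit; acceleration = 2 * quadratic coefficient
c2, c1, c0 = np.polyfit(t, gmsl, deg=2)  # highest power first
acc_est = 2.0 * c2 * 100.0               # back to mm/yr/century
```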
NASA Astrophysics Data System (ADS)
Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.
2017-05-01
The high-performance cloud-computing platform Google Earth Engine (GEE) was developed for global-scale analysis of Earth observation data. In this work, the geometric accuracy of the two most widely used nearly global free DSMs (SRTM and ASTER) is evaluated over the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino-Alto Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphologies, land covers, and slopes. The assessment is performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. DSM accuracy is evaluated by computing standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges for SRTM from 2-3 m in the first slope class to about 45 m in the last one, whereas for ASTER it ranges from 5-6 m to 30 m. In general, the analysis shows better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE for performing DSM assessment on a global scale.
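The NMAD statistic used alongside the standard deviation is a robust spread estimate, NMAD = 1.4826 · median(|Δh − median(Δh)|). The sketch below (synthetic elevation differences, not the GEE assessment data) shows why it is preferred when gross DSM outliers are present: a few large blunders inflate the standard deviation but barely move the NMAD.

```python
import numpy as np

def nmad(dh):
    """Normalized Median Absolute Deviation of elevation differences (m)."""
    return float(1.4826 * np.median(np.abs(dh - np.median(dh))))

# elevation differences DSM - reference: 3 m spread plus a few gross outliers
rng = np.random.default_rng(11)
dh = rng.normal(0.0, 3.0, size=10_000)
dh[:20] += 200.0            # e.g. cloud artifacts or voids in one DSM

sd = float(np.std(dh))      # inflated well above 3 m by the outliers
robust = nmad(dh)           # stays near the true 3 m spread
```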
Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity
NASA Astrophysics Data System (ADS)
Codano, C.; Alonzo, M. L.; Vilardo, G.
The clustering structure of Vesuvian earthquakes is investigated by means of statistical tools: the inter-event time distribution, the running mean, and multifractal analysis. The first cannot clearly discriminate a Poissonian process from a clustered one, owing to the difficulty of distinguishing an exponential distribution from a power law. The running mean test reveals the clustering of the earthquakes, but loses information about the structure of the distribution at global scales. The multifractal approach can reveal the clustering at small scales, while the global behaviour remains Poissonian. The clustering of the events is then interpreted in terms of diffusive processes of the stress in the Earth's crust.
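A simple quantitative handle on the inter-event time analysis described above is the coefficient of variation (CV) of inter-event times, which is ≈1 for a Poissonian (memoryless) process and substantially greater than 1 for a clustered one. The sketch below uses synthetic catalogs, not the Vesuvian data.

```python
import numpy as np

def interevent_cv(event_times):
    """Coefficient of variation of inter-event times:
    ~1 for a Poisson process, >1 when events cluster in bursts."""
    dt = np.diff(np.sort(event_times))
    return float(np.std(dt) / np.mean(dt))

rng = np.random.default_rng(2)
# Poissonian catalog: exponential inter-event times
poisson = np.cumsum(rng.exponential(1.0, size=2000))
# clustered catalog: bursts of aftershock-like activity around main events
mains = np.cumsum(rng.exponential(50.0, size=40))
clustered = np.concatenate([m + rng.exponential(0.5, size=50) for m in mains])

cv_p = interevent_cv(poisson)     # close to 1
cv_c = interevent_cv(clustered)   # well above 1
```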
NASA Astrophysics Data System (ADS)
Jin, Yang; Ciwei, Gao; Jing, Zhang; Min, Sun; Jie, Yu
2017-05-01
The selection and evaluation of priority domains for Global Energy Internet standard development will help to overcome the limits of national investment, so that priority can be given to standardizing the technical areas of highest urgency and feasibility. In this paper, a Delphi survey process based on technology foresight is put forward, an evaluation index system for priority domains is established, and the index calculation method is determined. Statistical methods are then used to evaluate the alternative domains. Finally, the top four priority domains are determined: Interconnected Network Planning and Simulation Analysis, Interconnected Network Safety Control and Protection, Intelligent Power Transmission and Transformation, and Internet of Things.
Statistics of the Work done in a Quantum Quench
NASA Astrophysics Data System (ADS)
Silva, Alessandro
2009-03-01
The quantum quench, i.e. a rapid change in time of a control parameter of a quantum system, is the simplest paradigm of a non-equilibrium process, completely analogous to a standard thermodynamic transformation. The dynamics following a quantum quench is particularly interesting in strongly correlated quantum systems, most prominently when the quench is performed across a quantum critical point. In this talk I will present a way to characterize the physics of quantum quenches by looking at the statistics of a basic thermodynamic variable: the work done on the system by changing its parameters [1]. I will first elucidate the relation between the probability distribution of the work, quantum Jarzynski equalities, and the Loschmidt echo, a quantity that usually emerges in the context of dephasing. Using this connection, I will then characterize the statistics of the work done on a quantum Ising chain by quenching the transverse field locally or globally. I will show that for global quenches the presence of a quantum critical point results in singularities of the moments of the distribution, while for local quenches starting at criticality the probability distribution itself displays an interesting edge singularity. The results of a similar analysis for other systems will be discussed. [1] A. Silva, Phys. Rev. Lett. 101, 120603 (2008).
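The work statistics sketched in this abstract admit a compact standard formulation (consistent with, though not quoted from, Ref. [1]). For a sudden quench from Hamiltonian $H_i$, with ground state $|\psi_0\rangle$ of energy $E_0$, to $H_f$, with eigenstates $|n\rangle$ of energies $E_n$:

```latex
P(W) = \sum_n \left|\langle n | \psi_0 \rangle\right|^2
       \,\delta\!\left(W - (E_n - E_0)\right),
\qquad
G(t) = \int \mathrm{d}W\, e^{iWt} P(W)
     = \langle \psi_0 | \, e^{iH_f t}\, e^{-iH_i t} \, | \psi_0 \rangle .
```

The characteristic function $G(t)$ is the amplitude whose squared modulus $|G(t)|^2$ is the Loschmidt echo, which is the connection between work statistics and dephasing referred to in the talk.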
Aquarius Instrument Science Calibration During the Risk Reduction Phase
NASA Technical Reports Server (NTRS)
Ruf, Christopher S.
2004-01-01
This final report presents the results of work performed under NASA Grant NAG512726 during the period 15 January 2003 through 30 June 2004. An analysis was performed of a possible vicarious calibration method for use by Aquarius to monitor and stabilize the absolute and relative calibration of its microwave radiometer. Stationary statistical properties of the brightness temperature (T(sub B)) measured by a low Earth orbiting radiometer operating at 1.4135 GHz are considered as a means of validating its absolute calibration. The global minimum, maximum, and average T(sub B) are considered, together with a vicarious cold reference method that detects the presence of a sharp lower bound on naturally occurring values for T(sub B). Of particular interest is the reliability with which these statistics can be extracted from a realistic distribution of T(sub B) measurements that would be observed by a typical sensor. Simulations of measurements are performed that include the effects of instrument noise and variable environmental factors such as the global water vapor and ocean surface temperature, salinity and wind distributions. Global minima can vary widely due to instrument noise and are not a reliable calibration reference. Global maxima are strongly influenced by several environmental factors as well as instrument noise and are even less stationary. Global averages are largely insensitive to instrument noise and, in most cases, to environmental conditions as well. The global average T(sub B) varies at only the 0.1 K RMS level except in cases of anomalously high winds, when it can increase considerably more. The vicarious cold reference is similarly insensitive to instrument effects and most environmental factors. It is not significantly affected by high wind conditions. The stability of the vicarious reference is, however, found to be somewhat sensitive (at the several tenths of Kelvins level) to variations in the background cold space brightness, T(sub c). 
The global average is much less sensitive to this parameter and so using two approaches together can be mutually beneficial.
Feature-Based Statistical Analysis of Combustion Simulation Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, J; Krishnamoorthy, V; Liu, S
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e., nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g., temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g., temperature, as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
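As a toy illustration of the kind of per-feature statistics and global diagnostics such a framework exposes, the sketch below uses simple connected-component labeling at one fixed threshold rather than the authors' augmented merge trees; all data are synthetic.

```python
import numpy as np
from scipy import ndimage

# Hypothetical sketch: per-feature statistics of a thresholded scalar field.
# The paper's merge trees support arbitrary thresholds; here we fix one
# threshold and use connected-component labeling as a stand-in.
rng = np.random.default_rng(0)
temperature = ndimage.gaussian_filter(rng.random((64, 64)), sigma=3)

mask = temperature > temperature.mean()        # one fixed feature threshold
labels, n_features = ndimage.label(mask)       # connected components = features

# First two statistical moments per feature, as in the augmented trees.
means = ndimage.mean(temperature, labels, index=range(1, n_features + 1))
variances = ndimage.variance(temperature, labels, index=range(1, n_features + 1))

# A global diagnostic: empirical CDF of per-feature mean temperature.
cdf_x = np.sort(means)
cdf_y = np.arange(1, n_features + 1) / n_features
```

The per-feature moments play the role of the meta-data attributes; the CDF is one of the global diagnostics named in the abstract.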
Multidimensional Scaling Analysis of the Dynamics of a Country Economy
Mata, Maria Eugénia
2013-01-01
This paper analyzes the Portuguese short-run business cycles over the last 150 years and presents multidimensional scaling (MDS) maps for visualizing the results. The analytical and numerical assessment of this long-run perspective reveals periods with close connections between the macroeconomic variables related to government accounts equilibrium, balance of payments equilibrium, and economic growth. The MDS method is adopted for a quantitative statistical analysis. In this way, clusters of similar historical periods emerge in the MDS maps, with the similarities and dissimilarities identifying periods of prosperity and crisis, growth and stagnation. Such features are major aspects of collective national achievement, to which can be associated the impact of international problems such as the World Wars, the Great Depression, or the current global financial crisis, as well as national events in the context of broad political blueprints for Portuguese society in the rising globalization process. PMID:24294132
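The MDS step can be sketched with classical (metric) multidimensional scaling implemented directly from the double-centered distance matrix; the indicator matrix below is synthetic stand-in data, not the Portuguese series.

```python
import numpy as np

# Hypothetical sketch of classical (metric) MDS, the kind of embedding used
# to map historical periods into 2-D so similar years cluster together.
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 3))          # 150 years x 3 macro indicators (synthetic)

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
n = D2.shape[0]
J = np.eye(n) - np.ones((n, n)) / n                   # centering matrix
B = -0.5 * J @ D2 @ J                                 # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1][:2]                    # top-2 dimensions
coords = vecs[:, order] * np.sqrt(vals[order])        # 2-D MDS map, one point per year
```

Clusters in `coords` would then be read off as periods of similar macroeconomic behavior.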
α -induced reactions on 115In: Cross section measurements and statistical model analysis
NASA Astrophysics Data System (ADS)
Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.
2018-05-01
Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist, and large deviations, sometimes reaching even an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between Ec.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between Ec.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, further improvements of the α-nucleus potential are still required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.
On the diffuse fraction of daily and monthly global radiation for the island of Cyprus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacovides, C.P.; Hadjioannou, L.; Pashiardis, S.
1996-06-01
Six years of hourly global and diffuse irradiation measurements on a horizontal surface performed at Athalassa, Cyprus, are used to establish a relationship between the daily diffuse fraction and the daily clearness index. Two types of correlations, yearly and seasonal, have been developed. These correlations, of first and third order in the clearness index, are compared to the various correlations established by Collares-Pereira and Rabl (1979), Newland (1989), Erbs et al. (1982), Rao et al. (1984), Page (1961), Liu and Jordan (1960) and Lalas et al. (1987). The comparison has been performed in terms of the widely used statistical indicators, the mean bias error (MBE) and root mean square error (RMSE); an additional statistical indicator, the t-statistic, combining the earlier indicators, is introduced. The results indicate that the proposed yearly correlation matches the earlier correlations quite closely, and all correlations examined yield results that are statistically significant. For large clearness index values, K_t > 0.60, most of the earlier correlations exhibit a slight tendency to systematically overestimate the diffuse fraction. This marginal disagreement between the earlier correlations and the proposed model is probably affected significantly by the clear sky conditions that prevail over Cyprus most of the time, as well as by the atmospheric humidity content. It is clear that the standard correlations examined in this analysis appear to be location-independent models for diffuse irradiation predictions, at least for the Cyprus case. 13 refs., 5 figs., 4 tabs.
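The three indicators can be sketched as follows. The measured and predicted diffuse fractions are illustrative values, and the combined t-statistic is written in its common form built from MBE and RMSE.

```python
import numpy as np

# Sketch of the error metrics used to compare diffuse-fraction correlations.
# Sample values are illustrative, not the Athalassa data.
measured = np.array([0.30, 0.45, 0.52, 0.61, 0.70])
predicted = np.array([0.28, 0.47, 0.50, 0.65, 0.68])

n = measured.size
diff = predicted - measured
mbe = diff.mean()                      # mean bias error
rmse = np.sqrt((diff ** 2).mean())     # root mean square error

# Combined indicator: small t means the model bias is not significant.
t_stat = np.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
```

The t-statistic ties the two error measures to a significance test: compared against a critical t value at n - 1 degrees of freedom, it tells whether the systematic bias (MBE) is distinguishable from the scatter (RMSE).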
Analysis of the Einstein sample of early-type galaxies
NASA Technical Reports Server (NTRS)
Eskridge, Paul B.; Fabbiano, Giuseppina
1993-01-01
The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4)) from a variety of rank correlation tests. Regressions with several algorithms yield consistent results.
Liu, Jianhua; Jiang, Hongbo; Zhang, Hao; Guo, Chun; Wang, Lei; Yang, Jing; Nie, Shaofa
2017-06-27
In the summer of 2014, an influenza A(H3N2) outbreak occurred in Yichang city, Hubei province, China. A retrospective study was conducted to collect and interpret hospital and epidemiological data on it using social network analysis and global sensitivity and uncertainty analyses. Results for degree (χ2=17.6619, P<0.0001) and betweenness (χ2=21.4186, P<0.0001) centrality suggested that the selection of sampling objects differed between traditional epidemiological methods and newer statistical approaches. Clique and network diagrams demonstrated that the outbreak actually consisted of two independent transmission networks. Sensitivity analysis showed that the contact coefficient (k) was the most important factor in the dynamic model. Using uncertainty analysis, we were able to better understand the properties and variations over space and time of the outbreak. We concluded that the use of newer approaches was significantly more efficient for managing and controlling infectious disease outbreaks, as well as saving time and public health resources, and could be widely applied to similar local outbreaks.
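A minimal sketch of the centrality computations on a hypothetical contact network follows; the edge list is invented, not the Yichang data.

```python
import networkx as nx

# Hypothetical sketch of the centrality step: contacts between cases form a
# graph, and degree/betweenness centrality flag the cases most likely to
# drive transmission. The edge list is illustrative only.
contacts = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
            ("D", "E"), ("E", "F"), ("E", "G")]
G = nx.Graph(contacts)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Cases ranked by betweenness: bridges between sub-networks score highest,
# which is how two independent transmission chains can be told apart.
ranked = sorted(betweenness, key=betweenness.get, reverse=True)
```

Here the bridge cases D and E, which connect the two clusters, come out on top of the betweenness ranking even though C has the highest degree.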
Moment-based metrics for global sensitivity analysis of hydrological systems
NASA Astrophysics Data System (ADS)
Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto
2017-12-01
We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
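A minimal sketch of the moment-based idea follows, using a toy model and simple conditional binning rather than the authors' exact metrics or the gPCE surrogate; all names and numbers are illustrative.

```python
import numpy as np

# Sketch of moment-based GSA (not the authors' exact indices): for each
# uncertain parameter, measure how much conditioning on it shifts a given
# statistical moment of the output, here the mean and the variance.
rng = np.random.default_rng(2)
n = 50_000
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = x1 + x2 ** 2                 # toy model in place of a hydrological code

def moment_sensitivity(x, y, stat, bins=20):
    """Mean absolute shift of stat(y) when conditioning on binned x,
    normalized by the unconditional value (assumed nonzero)."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond = np.array([stat(y[idx == b]) for b in range(bins)])
    return np.mean(np.abs(cond - stat(y))) / abs(stat(y))

s_mean_x1 = moment_sensitivity(x1, y, np.mean)
s_mean_x2 = moment_sensitivity(x2, y, np.mean)
s_var_x1 = moment_sensitivity(x1, y, np.var)   # x1 drives the variance here
s_var_x2 = moment_sensitivity(x2, y, np.var)
```

This shows the key point of the abstract: a parameter can be influential for one moment of the output pdf while being much less so for another.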
Evaluation of the Impact of AIRS Radiance and Profile Data Assimilation in Partly Cloudy Regions
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Srikishen, Jayanthi; Jedlovec, Gary
2013-01-01
Improvements to global and regional numerical weather prediction have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but the impact on regional forecasts has been much smaller than for global forecasts. Retrieved profiles from AIRS contain much of the information that is contained in the radiances and may be able to reveal reasons for this reduced impact. Assimilating AIRS retrieved profiles in an analysis configuration identical to that used for the radiances, tracking the quantity and quality of the assimilated data in each technique, and examining analysis increments and forecast impact from each data type can yield clues as to the reasons for the reduced impact. By doing this with regional-scale models, individual synoptic features (and the impact of AIRS on these features) can be more easily tracked. This project examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing operational techniques used for AIRS radiances and research techniques used for AIRS retrieved profiles. Parallel versions of a configuration of the Weather Research and Forecasting (WRF) model with Gridpoint Statistical Interpolation (GSI) are run to examine the impact of AIRS radiances and retrieved profiles. Statistical evaluation of a long-term series of forecast runs will be compared, along with preliminary results of in-depth investigations for select cases comparing the analysis increments in partly cloudy regions and short-term forecast impacts.
Application of spatial technology in malaria research & control: some new insights.
Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P
2009-08-01
Geographical Information System (GIS) has emerged as the core of spatial technology, integrating a wide range of datasets available from different sources, including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade 1998-2007 has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules, such as spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation models (DEM), buffer zones and geo-statistical analysis, have been investigated in detail and illustrated with examples from the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial datasets. The desired future development of GIS lies in the utilization of geo-statistical tools which, combined with high-quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
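The core idea can be sketched as follows: a block-wise variation statistic (here the standard deviation) preserves a variation signal that plain block averaging loses. All data are synthetic, and this is a sketch of the concept, not the authors' implementation.

```python
import numpy as np

# Synthetic signal whose variation grows from left to right, while its
# mean stays near zero everywhere.
rng = np.random.default_rng(3)
data = rng.normal(size=4096) * np.linspace(0, 1, 4096)

block = 64
blocks = data.reshape(-1, block)        # 64 blocks of 64 samples each

reduced_mean = blocks.mean(axis=1)      # usual reduced-resolution version
reduced_std = blocks.std(axis=1)        # variation-preserving companion statistic
```

The mean-only reduction hovers near zero across all blocks and hides the trend, while `reduced_std` clearly tracks the left-to-right growth in variation, which is the trait the paper aims to preserve.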
Global network centrality of university rankings
NASA Astrophysics Data System (ADS)
Guo, Weisi; Del Vecchio, Marco; Pogrebna, Ganna
2017-10-01
Universities and higher education institutions form an integral part of national infrastructure and prestige. As academic research benefits increasingly from international exchange and cooperation, many universities have increased investment in improving and enabling their global connectivity. Yet the relationship between university performance and global physical connectedness has not been explored in detail. We conduct, to our knowledge, the first large-scale data-driven analysis into whether there is a correlation between university relative ranking performance and its global connectivity via the air transport network. The results show that local access to global hubs (as measured by air transport network betweenness) strongly and positively correlates with ranking growth (statistical significance in different models ranges between the 5% and 1% levels). We also found that the local airport's aggregate flight paths (degree) and capacity (weighted degree) have no effect on university ranking, further showing that global connectivity distance is more important than the capacity of flight connections. We also examined the effect of local city economic development as a confounding variable, and no effect was observed, suggesting that access to global transportation hubs outweighs economic performance as a determinant of university ranking. The impact of this research is that we have determined the importance of the centrality of global connectivity and, hence, established initial evidence for further exploring potential connections between university ranking and regional investment policies on improving global connectivity.
Tankeu, Aurel T; Bigna, Jean Joël; Nansseu, Jobert Richie; Endomba, Francky Teddy A; Wafeu, Guy Sadeu; Kaze, Arnaud D; Noubiap, Jean Jacques
2017-06-09
Diabetes mellitus (DM) is an important risk factor for active tuberculosis (TB) and also adversely affects TB treatment outcomes. The escalating global DM epidemic is fuelling the burden of TB and should therefore be a major target in the strategy for ending TB. This review aims to estimate the global prevalence of DM in patients with TB. This systematic review will include cross-sectional, case-control or cohort studies of populations including patients diagnosed with TB that have reported the prevalence of DM using one of the four standard recommendations for screening and diagnosis. This protocol is written in accordance with recommendations from the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols 2015 statement. Relevant abstracts published in English or French from inception to 31 December 2016 will be searched in PubMed, Excerpta Medica Database and online journals. Two investigators will independently screen and select studies, extract data and assess the risk of bias in each study. The study-specific estimates will be pooled through a random-effects meta-analysis model to obtain an overall summary estimate of the prevalence of diabetes across the studies. Clinical heterogeneity will be assessed, and we will pool only studies judged to be clinically homogeneous; statistical heterogeneity will be evaluated by the χ² test on Cochran's Q statistic. Funnel-plot analysis and Egger's test will be used to investigate publication bias. Results will be presented by continent or geographic region. This study is based on published data; ethical approval is therefore not required. This systematic review and meta-analysis is expected to inform healthcare providers as well as the general population on the co-occurrence of DM and TB. The final report will be published as an original article in a peer-reviewed journal, and will also be presented at conferences and submitted to relevant health authorities. We also plan to update the review every 5 years. PROSPERO International Prospective Register of Systematic Reviews (CRD42016049901).
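The planned pooling and heterogeneity steps can be sketched with a DerSimonian-Laird random-effects model; the study prevalences and sizes below are invented for illustration.

```python
import numpy as np

# Sketch of random-effects pooling of study-level prevalences with
# Cochran's Q as the heterogeneity statistic (illustrative data only).
p = np.array([0.12, 0.18, 0.09, 0.25, 0.15])   # study-level DM prevalence
n = np.array([200, 150, 300, 120, 250])        # study sizes

var = p * (1 - p) / n                # within-study variance of a proportion
w = 1 / var                          # fixed-effect (inverse-variance) weights
p_fe = np.sum(w * p) / np.sum(w)     # fixed-effect pooled estimate

Q = np.sum(w * (p - p_fe) ** 2)      # Cochran's Q heterogeneity statistic
df = p.size - 1
# DerSimonian-Laird between-study variance estimate, floored at zero.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1 / (var + tau2)              # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)   # random-effects pooled prevalence
```

Comparing Q against a χ² distribution with df degrees of freedom gives the heterogeneity test the protocol describes; a large tau2 widens the study weights toward equality.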
NASA Astrophysics Data System (ADS)
Anthony, R. E.; Aster, R. C.; Rowe, C. A.
2016-12-01
The Earth's seismic noise spectrum features two globally ubiquitous peaks near 8 and 16 s periods (the secondary and primary bands) that arise when storm-generated ocean gravity waves are converted to seismic energy, predominantly into Rayleigh waves. Because of its regionally integrative nature, microseism intensity, together with other seismographic data from long-running sites, can provide a useful proxy for wave state. Expanding an earlier study of global microseism trends (Aster et al., 2010), we analyze digitally archived, up-to-date (through late 2016) multi-decadal seismic data from stations of global seismographic networks to characterize the spatiotemporal evolution of wave climate over the past >20 years. The IRIS Noise Toolkit (Bahavar et al., 2013) is used to produce ground motion power spectral density (PSD) estimates in 3-hour overlapping time series segments. The result of this effort is a longer-duration and more broadly geographically distributed PSD database than attained in previous studies, particularly for the primary microseism band. Integrating power within the primary and secondary microseism bands enables regional characterization of spatially integrated trends in wave states and storm event statistics at varying thresholds. The results of these analyses are then interpreted within the context of recognized modes of atmospheric variability, including the particularly strong 2015-2016 El Niño. We note a number of statistically significant increasing trends in both raw microseism power and storm activity occurring at multiple stations in the Northwest Atlantic and Southeast Pacific, consistent with generally increased wave heights and storminess in these regions. Such trends in wave activity have the potential to significantly influence coastal environments, particularly under rising global sea levels.
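Band-integrated power of the kind described can be sketched as a simple integral of a PSD estimate over the band's frequency range; the PSD below is synthetic, not Noise Toolkit output, and the band limits are illustrative.

```python
import numpy as np

# Sketch of microseism band integration: total power in a band is the
# integral of the PSD over that frequency range (synthetic PSD here).
freq = np.linspace(0.01, 1.0, 500)                  # Hz
psd = 1e-16 * np.exp(-((1 / freq - 8) / 3) ** 2)    # synthetic peak near 8 s period

# Secondary microseism band, roughly 5-10 s period -> 0.1-0.2 Hz.
band = (freq >= 0.1) & (freq <= 0.2)

# Trapezoidal integration of the PSD over the band (done by hand so the
# sketch does not depend on any particular NumPy/SciPy version).
f_b, p_b = freq[band], psd[band]
band_power = float(np.sum((p_b[1:] + p_b[:-1]) / 2 * np.diff(f_b)))
```

Repeating this for each 3-hour PSD segment yields the band-power time series from which trends and storm-event statistics can be extracted.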
Morning-evening differences in global and regional oceanic precipitation as observed by the SSM/I
NASA Technical Reports Server (NTRS)
Petty, Grant W.; Katsaros, Kristina B.
1992-01-01
For the present preliminary analysis of oceanic rainfall statistics, global oceanic SSM/I data were simply scanned for pixels which exhibited a 37 GHz polarization difference (vertically polarized brightness temperature minus horizontally polarized brightness temperature) of less than 15 K. Such a low polarization difference over the open ocean is a completely unambiguous indication of moderate to intense precipitation. Co-located brightness temperatures from all seven channels of the SSM/I were saved for each pixel so identified. Bad scans and geographically mislocated blocks of data were objectively identified and removed from the resulting database. We collected global oceanic rainfall data for two time periods, each one month in length. The first period (20 July-19 August 1987) coincides with the peak of the Northern Hemisphere summer. The second period (13 January-12 February 1988) coincides with the Northern Hemisphere winter.
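The screening rule reduces to a simple mask; the brightness temperatures below are invented for illustration.

```python
import numpy as np

# Sketch of the screening rule described above: flag ocean pixels whose
# 37 GHz polarization difference (T37V - T37H) falls below 15 K.
# Brightness temperatures are made-up values in kelvin.
t37v = np.array([210.0, 245.0, 260.0, 255.0, 230.0])  # vertically polarized
t37h = np.array([150.0, 238.0, 250.0, 210.0, 222.0])  # horizontally polarized

pol_diff = t37v - t37h
rain_flag = pol_diff < 15.0        # moderate-to-intense precipitation pixels

rain_pixels = np.flatnonzero(rain_flag)
```

In an actual processing chain, the co-located seven-channel brightness temperatures would be saved for each flagged pixel, as the abstract describes.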
NASA Technical Reports Server (NTRS)
Nemani, Ramakrishna R.
2016-01-01
Photosynthesis and light use efficiency (LUE) are major factors in the evolution of the continental carbon cycle due to their contribution to gross primary production (GPP). However, while the drivers of photosynthesis and LUE on a plant or canopy scale can often be identified, significant uncertainties exist when modeling these on a global scale. This is due to sparse observations in regions such as the tropics and the lack of a direct global observation dataset. Although others have attempted to address this issue using correlations (Beer, 2010) or by calculating GPP from vegetation indices (Running, 2004), in this study we take a new approach. We combine the statistical methods of Granger frequency causality and partial Granger frequency causality with remote sensing data products (including sun-induced fluorescence used as a proxy for GPP) to determine the main environmental drivers of GPP across the globe.
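A minimal sketch of a time-domain Granger test, a simpler relative of the frequency-domain variants named above, can be written with two least-squares fits on synthetic series: does adding the lagged driver reduce the residual variance of an autoregressive model for the response?

```python
import numpy as np

# Sketch of a Granger causality F-test done by hand. The driver series x
# leads y by one step, so x should Granger-cause y (all data synthetic).
rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

Y = y[1:]
restricted = np.column_stack([np.ones(n - 1), y[:-1]])     # y history only
full = np.column_stack([np.ones(n - 1), y[:-1], x[:-1]])   # plus x history

rss_r = np.sum((Y - restricted @ np.linalg.lstsq(restricted, Y, rcond=None)[0]) ** 2)
rss_f = np.sum((Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]) ** 2)

# F-statistic for the single added lag of x; large F => x Granger-causes y.
f_stat = (rss_r - rss_f) / (rss_f / (n - 1 - 3))
```

The frequency-domain versions used in the study decompose this predictive improvement by frequency rather than summarizing it in a single F-statistic.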
Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.
2016-01-01
Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934
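The two contamination indices can be sketched directly from a sequential-extraction table. In the common formulation, a metal's ICF is the sum of its non-residual (mobile) fractions divided by its residual fraction, and the GCF is the sum of the ICFs; the fraction values below are invented.

```python
import numpy as np

# Sketch of ICF/GCF computation from hypothetical sequential-extraction data.
metals = ["Cd", "Cr", "Cu", "Ni", "Pb"]
# Columns: exchangeable, carbonate, Fe-Mn oxide, organic, residual fractions.
fractions = np.array([
    [0.10, 0.05, 0.08, 0.04, 0.30],   # Cd
    [0.05, 0.04, 0.10, 0.06, 0.60],   # Cr
    [0.20, 0.10, 0.15, 0.10, 0.40],   # Cu
    [0.04, 0.03, 0.05, 0.03, 0.55],   # Ni
    [0.08, 0.06, 0.07, 0.05, 0.50],   # Pb
])

icf = fractions[:, :4].sum(axis=1) / fractions[:, 4]   # per-metal ICF
gcf = icf.sum()                                        # global contamination factor
```

A high ICF flags a metal whose burden sits mostly in mobile, bioavailable fractions, which is the risk signal the abstract reports for individual metals.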
Trends in total column ozone measurements
NASA Technical Reports Server (NTRS)
Rowland, F. S.; Angell, J.; Attmannspacher, W.; Bloomfield, P.; Bojkov, R. D.; Harris, N.; Komhyr, W.; Mcfarland, M.; Mcpeters, R.; Stolarski, R. S.
1989-01-01
It is important to ensure the best available data are used in any determination of possible trends in total ozone in order to have the most accurate estimates of any trends and the associated uncertainties. Accordingly, the existing total ozone records were examined in considerable detail. Once the best data set has been produced, the statistical analysis must examine the data for any effects that might indicate changes in the behavior of global total ozone. The changes at any individual measuring station could be local in nature, and herein, particular attention was paid to the seasonal and latitudinal variations of total ozone, because two dimensional photochemical models indicate that any changes in total ozone would be most pronounced at high latitudes during the winter months. The conclusions derived from this detailed examination of available total ozone can be split into two categories, one concerning the quality and the other the statistical analysis of the total ozone record.
Statistical Analysis of Wireless Networks: Predicting Performance in Multiple Environments
2006-06-01
The demonstration planned for May 2006 is an air, ground, and water-based scenario, occurring just north of Chiang Mai, Thailand. The scenario ... will be fused, displayed, and distributed in real-time to local (Chiang Mai), theater (Bangkok), and global (Alameda, California) command and control ... COASTS 2006 TOPOLOGY: The 2006 version of the COASTS project occurred just north of Chiang Mai, Thailand, at the Mae Ngat Dam. COTS systems were ...
NASA Astrophysics Data System (ADS)
Cromwell, G.; Johnson, C. L.; Tauxe, L.; Constable, C.; Jarboe, N.
2015-12-01
Previous paleosecular variation (PSV) and time-averaged field (TAF) models draw on compilations of paleodirectional data that lack equatorial and high-latitude sites and use latitudinal virtual geomagnetic pole (VGP) cutoffs designed to remove transitional field directions. We present a new selected global dataset (PSV10) of paleodirectional data spanning the last 10 Ma. We include all results calculated with modern laboratory methods, regardless of site VGP colatitude, that meet statistically derived selection criteria. We exclude studies that target transitional field states or identify significant tectonic effects, and correct for any bias from serial correlation by averaging directions from sequential lava flows. PSV10 has an improved global distribution compared with previous compilations, comprising 1519 sites from 71 studies. VGP dispersion in PSV10 varies with latitude, exhibiting substantially higher values in the southern hemisphere than at corresponding northern latitudes. Inclination anomaly estimates at many latitudes are within error of an expected GAD field, but significant negative anomalies are found at equatorial and mid-northern latitudes. The current PSV models Model G and TK03 do not fit the observed PSV or TAF latitudinal behavior in PSV10, or in subsets of normal and reverse polarity data, particularly for southern hemisphere sites. Attempts to fit these observations with simple modifications to TK03 showed slight statistical improvements, but the misfits still exceed acceptable errors. The root-mean-square misfit of TK03 (and subsequent iterations) is substantially lower for the normal polarity subset of PSV10 than for the reverse polarity data. Two-thirds of the data in PSV10 are of normal polarity, most of which are from the last 5 Ma, so we develop a new TAF model using this subset of data. We use the resulting TAF model to explore whether new statistical PSV models can better describe our new global compilation.
Wehner, Michael F.; Bala, G.; Duffy, Phillip; ...
2010-01-01
We present a set of high-resolution global atmospheric general circulation model (AGCM) simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST) is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium- and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments, as well as a focus on the Northern Hemisphere tropical storm seasons.
Analyses of global sea surface temperature 1856-1991
NASA Astrophysics Data System (ADS)
Kaplan, Alexey; Cane, Mark A.; Kushnir, Yochanan; Clement, Amy C.; Blumenthal, M. Benno; Rajagopalan, Balaji
1998-08-01
Global analyses of monthly sea surface temperature (SST) anomalies from 1856 to 1991 are produced using three statistically based methods: optimal smoothing (OS), the Kalman filter (KF) and optimal interpolation (OI). Each of these is accompanied by estimates of the error covariance of the analyzed fields. The spatial covariance function these methods require is estimated from the available data; the time-marching model is a first-order autoregressive model again estimated from data. The data input for the analyses are monthly anomalies from the United Kingdom Meteorological Office historical sea surface temperature data set (MOHSST5) [Parker et al., 1994] of the Global Ocean Surface Temperature Atlas (GOSTA) [Bottomley et al., 1990]. These analyses are compared with each other, with GOSTA, and with an analysis generated by projection (P) onto a set of empirical orthogonal functions (as in Smith et al. [1996]). In theory, the quality of the analyses should rank in the order OS, KF, OI, P, and GOSTA. It is found that the first four give comparable results in the data-rich periods (1951-1991), but at times when data are sparse the first three differ significantly from P and GOSTA. At these times the latter two often have extreme and fluctuating values, prima facie evidence of error. The statistical schemes are also verified against data not used in any of the analyses (proxy records derived from corals and air temperature records from coastal and island stations). We also present evidence that the analysis error estimates are indeed indicative of the quality of the products. At most times the OS and KF products are close to the OI product, but at times of especially poor coverage their use of information from other times is advantageous. The methods appear to reconstruct the major features of the global SST field from very sparse data.
Comparison with other indications of the El Niño-Southern Oscillation cycle show that the analyses provide usable information on interannual variability as far back as the 1860s.
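The Kalman filter analysis above combines a first-order autoregressive time-marching model with explicit error covariances. A minimal scalar sketch can illustrate the mechanism; this is not the paper's reduced-space implementation, and all parameter values here are illustrative only:

```python
def kalman_1d(obs, r, q, phi=0.9, x0=0.0, p0=1.0):
    """Scalar Kalman filter for an AR(1) state x_t = phi * x_{t-1} + w_t.

    obs -- observations in time order (None marks a missing month)
    r   -- observation error variance
    q   -- process (model) error variance
    Returns the filtered state estimates and their error variances.
    """
    x, p = x0, p0
    xs, ps = [], []
    for y in obs:
        # Forecast step: advance state and inflate uncertainty via the AR(1) model.
        x, p = phi * x, phi * phi * p + q
        if y is not None:
            # Update step: blend forecast and observation by their error variances.
            k = p / (p + r)
            x = x + k * (y - x)
            p = (1.0 - k) * p
        xs.append(x)
        ps.append(p)
    return xs, ps
```

Note how the returned error variances grow through data gaps and shrink when observations arrive, mirroring the paper's point that analysis error estimates track data coverage.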
Cloud Statistics for NASA Climate Change Studies
NASA Technical Reports Server (NTRS)
Wylie, Donald P.
1999-01-01
The Principal Investigator participated in two field experiments and developed a global data set on cirrus cloud frequency and optical depth to aid the development of numerical models of climate. Four papers were published under this grant. The accomplishments are summarized: (1) In SUCCESS (SUbsonic aircraft: Contrail & Cloud Effects Special Study) the Principal Investigator aided weather forecasters in the start of the field program. A paper also was published on the clouds studied in SUCCESS and the use of the satellite stereographic technique to distinguish cloud forms and heights of clouds. (2) In SHEBA (Surface Heat Budget in the Arctic) FIRE/ACE (Arctic Cloud Experiment) the Principal Investigator provided daily weather and cloud forecasts for four research aircraft crews: NASA's ER-2, UCAR's C-130, the University of Washington's Convair 580, and the Canadian Atmospheric Environment Service's Convair 580. Approximately 105 forecasts were written. The Principal Investigator also made daily weather summaries with calculations of air trajectories for 54 flight days in the experiment. The trajectories show where the air sampled during the flights came from and will be used in future publications to discuss the origin and history of the air and clouds sampled by the aircraft. A paper discussing how well the FIRE/ACE data represent normal climatic conditions in the arctic is being prepared. (3) The Principal Investigator's web page became the source of information for weather forecasting by the scientists on the SHEBA ship. (4) Global cirrus frequency and optical depth: a continuing analysis of global cloud cover and frequency distribution is being made from the NOAA polar-orbiting weather satellites. This analysis is sensitive to cirrus clouds because of the radiative channels used.
During this grant three papers were published which describe cloud frequencies and their optical properties and compare the Wisconsin HIRS Cloud Analysis to other global cloud data such as the International Satellite Cloud Climatology Project (ISCCP) and the Stratospheric Aerosol and Gas Experiment (SAGE). A summary of eight years of HIRS data will be published in late 1998. Important findings from this study are: 1) cirrus clouds cover most of the earth; 2) they are found about 40% of the time globally; 3) in the tropics cirrus cloud frequencies are even higher, from 80-100%; 4) there is slight evidence that cirrus cloud cover is increasing in the northern hemisphere at about 0.5% per year; and 5) cirrus clouds have an average infrared transmittance of about 40% of the terrestrial radiation. (5) Global Cloud Frequency Statistics published on the Principal Investigator's web page have been used in the planning of the future CRYSTAL experiment and have been used for refinements of a global numerical model operated at Colorado State University.
NASA Astrophysics Data System (ADS)
Chavez, Roberto; Lozano, Sergio; Correia, Pedro; Sanz-Rodrigo, Javier; Probst, Oliver
2013-04-01
With the purpose of efficiently and reliably generating long-term wind resource maps for the wind energy industry, the application and verification of a statistical methodology for the climate downscaling of wind fields at surface level is presented in this work. The procedure is based on the combination of the Monte Carlo and Principal Component Analysis (PCA) statistical methods. First, the Monte Carlo method is used to create a large number of daily-based annual time series, so-called climate representative years, by stratified sampling of a 33-year-long time series corresponding to the available period of the NCAR/NCEP global reanalysis data set (R-2). Second, the representative years are evaluated and the best set is chosen according to its capability to recreate the sea level pressure (SLP) temporal and spatial fields of the R-2 data set. The measure of this correspondence is based on the Euclidean distance between the Empirical Orthogonal Function (EOF) spaces generated by the PCA decomposition of the SLP fields from the long-term and representative-year data sets. The methodology was verified by comparing the selected 365-day period against a 9-year period of wind fields generated by dynamically downscaling the Global Forecast System data with the mesoscale model SKIRON for the Iberian Peninsula. These results showed that, compared to the traditional method of dynamically downscaling a random 365-day period, the error in the average wind velocity from the PCA representative year was reduced by almost 30%. Moreover, the mean absolute errors (MAE) in the monthly and daily wind profiles were also reduced by almost 25% across all SKIRON grid points. The methodology yielded maximum errors in the mean wind speed of 0.8 m/s and maximum MAE in the monthly curves of 0.7 m/s.
Besides the bulk numbers, this work shows the spatial distribution of the errors across the Iberian domain and additional wind statistics such as the velocity and directional frequency. Additional repetitions were performed to prove the reliability and robustness of this kind of statistical-dynamical downscaling method.
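The EOF-space distance used above to score candidate representative years can be sketched as follows. This is a simplified illustration assuming a `(time, space)` SLP anomaly array; the function names and the sign-flip handling are ours, not necessarily the authors' exact formulation:

```python
import numpy as np

def leading_eofs(field, k=2):
    """Leading EOFs (spatial patterns) of a (time, space) anomaly field via SVD."""
    anom = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[:k]  # rows are unit-norm spatial patterns

def eof_distance(full_field, candidate_field, k=2):
    """Euclidean distance between the leading-EOF sets of two SLP fields.

    EOF signs are arbitrary, so each pattern pair is compared up to a sign flip.
    A Monte Carlo candidate year with the smallest distance to the full record
    best reproduces its dominant spatial variability.
    """
    a, b = leading_eofs(full_field, k), leading_eofs(candidate_field, k)
    d = 0.0
    for ai, bi in zip(a, b):
        d += min(np.linalg.norm(ai - bi), np.linalg.norm(ai + bi)) ** 2
    return np.sqrt(d)
```

In the methodology above, one would evaluate `eof_distance` for every stratified-sample candidate year against the 33-year R-2 record and retain the minimizer.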
NASA Astrophysics Data System (ADS)
Lopez, S. R.; Hogue, T. S.
2011-12-01
Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typical order of 50,000 km2) and fail to capture climate variability at the ground level due to localized surface influences (e.g., topography, marine layer, land cover). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3 and MRI-CGCM2 3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA includes obtaining the optimal predictand sets for all models within each spatial domain (county) as governed by daily and monthly overall statistics. Results show all models maintain mean annual and monthly behavior within each county and daily statistics are improved. The level of improvement depends strongly on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. The utilization of the entire historical period also leads to better statistical representation of observed daily precipitation.
The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and assess future water resources in the Southern California region.
It's all relative: ranking the diversity of aquatic bacterial communities.
Shaw, Allison K; Halpern, Aaron L; Beeson, Karen; Tran, Bao; Venter, J Craig; Martiny, Jennifer B H
2008-09-01
The study of microbial diversity patterns is hampered by the enormous diversity of microbial communities and the lack of resources to sample them exhaustively. For many questions about richness and evenness, however, one only needs to know the relative order of diversity among samples rather than total diversity. We used 16S libraries from the Global Ocean Survey to investigate the ability of 10 diversity statistics (including rarefaction, non-parametric, parametric, curve extrapolation and diversity indices) to assess the relative diversity of six aquatic bacterial communities. Overall, we found that the statistics yielded remarkably similar rankings of the samples for a given sequence similarity cut-off. This correspondence, despite the different underlying assumptions of the statistics, suggests that diversity statistics are a useful tool for ranking samples of microbial diversity. In addition, sequence similarity cut-off influenced the diversity ranking of the samples, demonstrating that diversity statistics can also be used to detect differences in phylogenetic structure among microbial communities. Finally, a subsampling analysis suggests that further sequencing from these particular clone libraries would not have substantially changed the richness rankings of the samples.
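The ranking idea above can be illustrated with one of the simpler diversity statistics mentioned, the Shannon index, alongside plain richness. This is a toy sketch with hypothetical OTU count data, not the study's 16S pipeline:

```python
import math

def shannon(counts):
    """Shannon diversity H' from a list of per-OTU read counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def rank_samples(samples, stat):
    """Rank sample names from most to least diverse under a given statistic.

    samples -- dict mapping sample name to its list of OTU counts
    stat    -- any diversity statistic taking a count list (e.g. shannon, len)
    """
    return sorted(samples, key=lambda name: stat(samples[name]), reverse=True)
```

As the abstract reports for the real libraries, very different statistics (here, an evenness-sensitive index and raw richness) can still agree on the *relative order* of samples even when their absolute values differ.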
GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science
NASA Astrophysics Data System (ADS)
Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.
2018-03-01
We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.
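The contrast above between a single best-fit model and the full statistical inference can be illustrated with likelihood-weighted averaging over an ensemble of forward models. This is a generic sketch of the idea, not the authors' actual Bayesian scheme, and the misfit-to-likelihood mapping here is illustrative:

```python
import math

def bayesian_model_average(predictions, misfits):
    """Posterior mean and spread of a predicted quantity over forward models.

    predictions -- the quantity of interest from each forward model
    misfits     -- a data misfit per model; weight ~ exp(-misfit) (illustrative)
    Returns the weighted mean and weighted standard deviation, so the spread
    reflects how sharply the data discriminate among the models.
    """
    weights = [math.exp(-m) for m in misfits]
    total = sum(weights)
    weights = [w / total for w in weights]
    mean = sum(w * p for w, p in zip(weights, predictions))
    var = sum(w * (p - mean) ** 2 for w, p in zip(weights, predictions))
    return mean, var ** 0.5
```

When many models fit the data nearly equally well, the weighted spread stays large even though the single best-fit model would report none, which is the abstract's point about best-fit models under-representing GIA uncertainty.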
Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno
2016-01-22
Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, diagnosing such an imbalance is essential so that the statistical analysis can be adjusted if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small as compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
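The diagnostic above rests on the c-statistic (area under the ROC curve) of the propensity-score model: a value near 0.5 means treated and control arms are indistinguishable from their baseline covariates, while larger values signal imbalance. A minimal pairwise-comparison sketch, with hypothetical PS values:

```python
def c_statistic(scores_treated, scores_control):
    """C-statistic (AUC) of a propensity-score model.

    The probability that a randomly chosen treated unit has a higher
    predicted PS than a randomly chosen control unit; ties count 1/2.
    """
    wins = 0.0
    for t in scores_treated:
        for c in scores_control:
            if t > c:
                wins += 1.0
            elif t == c:
                wins += 0.5
    return wins / (len(scores_treated) * len(scores_control))
```

With perfectly overlapping PS distributions the statistic is 0.5 (good balance); with separated distributions it approaches 1.0, flagging the need for covariate adjustment.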
A global compilation of coral sea-level benchmarks: Implications and new challenges
NASA Astrophysics Data System (ADS)
Medina-Elizalde, Martín
2013-01-01
I present a quality-controlled compilation of sea-level data from U-Th dated corals, encompassing 30 studies of 13 locations around the world. The compilation contains relative sea level (RSL) data from each location based on both conventional and open-system U-Th ages. I have applied a commonly used age quality control criterion based on the initial 234U/238U activity ratios of corals in order to select reliable ages and to reconstruct sea level histories for the last 150,000 yr. This analysis reveals scatter of RSL estimates among coeval coral benchmarks both within individual locations and between locations, particularly during Marine Isotope Stage (MIS) 5a and the glacial inception following the last interglacial. The character of data scatter during these time intervals implies that uncertainties still exist regarding tectonics, glacio-isostasy, U-series dating, and/or coral position. To elucidate robust underlying patterns, with confidence limits, I performed a Monte Carlo-style statistical analysis of the compiled coral data considering appropriate age and sea-level uncertainties. By its nature, such an analysis has the tendency to smooth/obscure millennial-scale (and finer) details that may be important in individual datasets, and favour the major underlying patterns that are supported by all datasets. This statistical analysis is thus useful for illustrating major trends that are statistically robust ('what we know'), trends that are suggested but as yet supported by only a few data ('what we might know, subject to addition of more supporting data and improved corrections'), and which patterns/data are clear outliers ('unlikely to be realistic given the rest of the global data and possibly needing further adjustments').
Prior to the last glacial maximum and with the possible exception of the 130-120 ka period, available coral data generally have insufficient temporal resolution and unexplained scatter, which hinders identification of a well-defined pattern with usefully narrow confidence limits. This analysis thus provides a framework that objectively identifies critical targets for new data collection, improved corrections, and integration of coral data with independent, stratigraphically continuous methods of sea-level reconstruction.
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
Detrended Cross Correlation Analysis: a new way to figure out the underlying cause of global warming
NASA Astrophysics Data System (ADS)
Hazra, S.; Bera, S. K.
2016-12-01
Analysing non-stationary time series is a challenging task in earth science, seismology, solar physics, climate, biology, finance, etc. In most cases, external noise such as oscillations, high-frequency noise, and low-frequency noise at different scales leads to erroneous results. Many statistical methods have been proposed to find the correlation between two non-stationary time series. N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), reported a strong relationship between solar flare intermittency (SFI) and global temperature anomalies (GTA) using diffusion entropy analysis. It has recently been shown that detrended cross-correlation analysis (DCCA) is a better technique for removing the effects of any unwanted signal as well as local and periodic trends. DCCA is thus more suitable for finding the correlation between two non-stationary time series, and with this technique the correlation coefficient at different scales can be estimated. Motivated by this, we have applied a new DCCA technique to find the relationship between SFI and GTA. We have also applied this technique to find the relationships between GTA and carbon dioxide density, and between GTA and methane density, in the Earth's atmosphere. In future work we will try to find the relationships between GTA and aerosols, water vapour density, and ozone depletion in the Earth's atmosphere. This analysis will help us better understand the reasons behind global warming.
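A minimal sketch of the DCCA cross-correlation coefficient at a single scale follows; this is a standard textbook formulation of the method, not necessarily the authors' exact variant:

```python
import numpy as np

def dcca_coefficient(x, y, scale):
    """Detrended cross-correlation coefficient rho_DCCA at one scale.

    Integrate both series, split the profiles into non-overlapping windows
    of length `scale`, remove a linear trend in each window, then correlate
    the residual fluctuations across windows.
    """
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    n = (len(X) // scale) * scale
    f2x = f2y = f2xy = 0.0
    t = np.arange(scale)
    for start in range(0, n, scale):
        xs, ys = X[start:start + scale], Y[start:start + scale]
        # Residuals after removing the local linear trend in this window.
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx * rx)
        f2y += np.mean(ry * ry)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)
```

Perfectly correlated series give +1 and perfectly anticorrelated series give -1 at every scale; computing the coefficient across a range of scales yields the scale-dependent correlation the abstract describes.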
NASA Astrophysics Data System (ADS)
LIU, J.; Bi, Y.; Duan, S.; Lu, D.
2017-12-01
It is well known that cloud characteristics, such as top and base heights, the layering structure of microphysical parameters, spatial coverage, and temporal duration, are very important factors influencing both the radiation budget and its vertical partitioning, as well as the hydrological cycle through precipitation. Cloud structure, its statistical distribution, and its typical values also vary geographically and seasonally. Ka-band radar is a powerful tool for obtaining the above parameters around the world; one example is the ARM cloud radar in Oklahoma, US. Since 2006, CloudSat, part of NASA's A-Train satellite constellation, has continuously observed cloud structure with global coverage, but it passes over a given local site at the same local time only twice a day. Using the IAP Ka-band Doppler radar, which has been operating continuously since early 2013 on the roof of the IAP building in Beijing, we obtained the statistical characteristics of clouds, including cloud layering, cloud top and base heights, and the thickness of each cloud layer and its distribution, analyzed for monthly, seasonal, and diurnal variation; a statistical analysis of cloud reflectivity profiles was also made. The analysis covers both non-precipitating and precipitating clouds. Some preliminary comparisons of the results with CloudSat/CALIPSO products for the same period and area are also made.
A Spatio-Temporal Approach for Global Validation and Analysis of MODIS Aerosol Products
NASA Technical Reports Server (NTRS)
Ichoku, Charles; Chu, D. Allen; Mattoo, Shana; Kaufman, Yoram J.; Remer, Lorraine A.; Tanre, Didier; Slutsker, Ilya; Holben, Brent N.; Lau, William K. M. (Technical Monitor)
2001-01-01
With the launch of the MODIS sensor on the Terra spacecraft, new data sets of the global distribution and properties of aerosol are being retrieved, and need to be validated and analyzed. A system has been put in place to generate spatial statistics (mean, standard deviation, direction and rate of spatial variation, and spatial correlation coefficient) of the MODIS aerosol parameters over more than 100 validation sites spread around the globe. Corresponding statistics are also computed from temporal subsets of AERONET-derived aerosol data. The means and standard deviations of identical parameters from MODIS and AERONET are compared. Although their means compare favorably, their standard deviations reveal some influence of surface effects on the MODIS aerosol retrievals over land, especially at low aerosol loading. The direction and rate of spatial variation from MODIS are used to study the spatial distribution of aerosols at various locations either individually or comparatively. This paper introduces the methodology for generating and analyzing the data sets used by the two MODIS aerosol validation papers in this issue.
NASA Astrophysics Data System (ADS)
Wechsung, Frank; Wechsung, Maximilian
2016-11-01
The STatistical Analogue Resampling Scheme (STARS) statistical approach was recently used to project changes of climate variables in Germany corresponding to a supposed degree of warming. We show by theoretical and empirical analysis that STARS simply transforms interannual gradients between warmer and cooler seasons into climate trends. According to STARS projections, summers in Germany will inevitably become dryer and winters wetter under global warming. Due to the dominance of negative interannual correlations between precipitation and temperature during the year, STARS has a tendency to generate a net annual decrease in precipitation under mean German conditions. Furthermore, according to STARS, the annual level of global radiation would increase in Germany. STARS can still be used, e.g., for generating scenarios in vulnerability and uncertainty studies. However, it is not suitable as a climate downscaling tool for assessing risks arising from a changing climate at spatial scales finer than that of a general circulation model (GCM).
van Tilburg, C W J; Stronks, D L; Groeneweg, J G; Huygen, F J P M
2017-03-01
Investigate the effect of percutaneous radiofrequency compared to a sham procedure, applied to the ramus communicans for treatment of lumbar disc pain. Randomized sham-controlled, double-blind, crossover, multicenter clinical trial. Multidisciplinary pain centres of two general hospitals. Sixty patients aged 18 or more with a medical history and physical examination suggestive of lumbar disc pain and a reduction of two or more on a numerical rating scale (0-10) after a diagnostic ramus communicans test block. Treatment group: percutaneous radiofrequency treatment applied to the ramus communicans; sham: the same procedure without radiofrequency treatment. Primary outcome measure: pain reduction. Secondary outcome measure: Global Perceived Effect. No statistically significant difference in pain level over time was found between the groups, nor within the groups; however, the factor period yielded a statistically significant result. In the crossover group, 11 out of 16 patients experienced a reduction in NRS of 2 or more at 1 month (no significant deviation from chance). No statistically significant difference in satisfaction over time between the groups was found. The independent factors group and period also showed no statistically significant effects. The same applies to recovery: no statistically significant effects were found. The null hypothesis of no difference in pain reduction and in Global Perceived Effect between the treatment and sham group cannot be rejected. Post hoc analysis revealed that none of the investigated parameters contributed to the prediction of a significant pain reduction. Interrupting signalling through the ramus communicans may interfere with the transition of painful information from the discs to the central nervous system. Methodological differences exist in studies evaluating the efficacy of radiofrequency treatment for lumbar disc pain.
A randomized, sham-controlled, double-blind, multicenter clinical trial on the effect of radiofrequency at the ramus communicans for lumbar disc pain was conducted. The null hypothesis of no difference in pain reduction and in Global Perceived Effect between the treatment and sham group cannot be rejected. © 2016 The Authors. European Journal of Pain published by John Wiley & Sons Ltd on behalf of European Pain Federation - EFIC®.
NASA Technical Reports Server (NTRS)
Colarco, P. R.; Kahn, R. A.; Remer, L. A.; Levy, R. C.
2014-01-01
We use the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite aerosol optical thickness (AOT) product to assess the impact of reduced swath width on global and regional AOT statistics and trends. Along-track and across-track sampling strategies are employed, in which the full MODIS data set is sub-sampled with various narrow-swath (approximately 400-800 km) and single pixel width (approximately 10 km) configurations. Although view-angle artifacts in the MODIS AOT retrieval confound direct comparisons between averages derived from different sub-samples, careful analysis shows that with many portions of the Earth essentially unobserved, spatial sampling introduces uncertainty in the derived seasonal-regional mean AOT. These AOT spatial sampling artifacts comprise up to 60% of the full-swath AOT value under moderate aerosol loading, and can be as large as 0.1 in some regions under high aerosol loading. Compared to full-swath observations, narrower swath and single pixel width sampling exhibits a reduced ability to detect AOT trends with statistical significance. On the other hand, estimates of the global, annual mean AOT do not vary significantly from the full-swath values as spatial sampling is reduced. Aggregation of the MODIS data at coarse grid scales (10 deg) shows consistency in the aerosol trends across sampling strategies, with increased statistical confidence, but quantitative errors in the derived trends are found even for the full-swath data when compared to high spatial resolution (0.5 deg) aggregations. Using results of a model-derived aerosol reanalysis, we find consistency in our conclusions about a seasonal-regional spatial sampling artifact in AOT. Furthermore, the model shows that reduced spatial sampling can amount to uncertainty in computed shortwave top-of-atmosphere aerosol radiative forcing of 2-3 W m^-2. These artifacts are lower bounds, as possibly other unconsidered sampling strategies would perform less well.
These results suggest that future aerosol satellite missions having significantly less than full-swath viewing are unlikely to sample the true AOT distribution well enough to obtain the statistics needed to reduce uncertainty in aerosol direct forcing of climate.
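The sampling-artifact idea above can be illustrated by comparing the mean over a narrow band of grid columns against the full-field mean. This is a toy sketch; real swath geometry and orbital sampling are far more complex than a fixed column band:

```python
import numpy as np

def swath_subsample_bias(field, swath_cols):
    """Relative difference between a sub-sampled swath mean and the full mean.

    field      -- 2-D gridded values (rows = along-track, cols = across-track)
    swath_cols -- column indices retained by the narrow-swath sampling
    A nonzero result means the narrow swath misrepresents the true field mean.
    """
    full = field.mean()
    sub = field[:, swath_cols].mean()
    return (sub - full) / full
```

A spatially uniform field shows no bias under any sub-sampling, while any across-track gradient (as with regional aerosol plumes) makes a narrow swath over- or under-estimate the regional mean, which is the artifact quantified in the abstract.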
Thinking Globally, Acting Locally: Using the Local Environment to Explore Global Issues.
ERIC Educational Resources Information Center
Simmons, Deborah
1994-01-01
Asserts that water pollution is a global problem and presents statistics indicating how much of the world's water is threatened. Presents three elementary school classroom activities on water quality and local water resources. Includes a figure describing the work of the Global Rivers Environmental Education Network. (CFR)
Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods
NASA Technical Reports Server (NTRS)
Liu, Bing; Asseng, Senthold; Muller, Christoph; Ewart, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.;
2016-01-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Similar estimates of temperature impacts on global wheat yield by three independent methods
NASA Astrophysics Data System (ADS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan
2016-12-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Global Statistics of Bolides in the Terrestrial Atmosphere
NASA Astrophysics Data System (ADS)
Chernogor, L. F.; Shevelyov, M. B.
2017-06-01
Purpose: Evaluation and analysis of the distribution of the number of meteoroid (mini-asteroid) falls as a function of glow energy, velocity, the region of maximum glow altitude, and geographic coordinates. Design/methodology/approach: The satellite database on the glow of 693 mini asteroids, which were decelerated in the terrestrial atmosphere, has been used for evaluating basic meteoroid statistics. Findings: A rapid decrease in the number of asteroids with increasing glow energy is confirmed. The average speed of the celestial bodies is about 17.9 km/s. The altitude of maximum glow most often lies in the 30-40 km range. The distribution of the number of meteoroids entering the terrestrial atmosphere is approximately uniform in longitude and latitude (after excluding the geometric component of the latitudinal dependence). Conclusions: Using a sufficiently large database of measurements, the meteoroid (mini-asteroid) statistics have been evaluated.
Statistical analysis of modeling error in structural dynamic systems
NASA Technical Reports Server (NTRS)
Hasselman, T. K.; Chrostowski, J. D.
1990-01-01
The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement and is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons are made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or transient response of other structures belonging to the same generic category.
NASA Technical Reports Server (NTRS)
Bonavito, N. L.; Gordon, C. L.; Inguva, R.; Serafino, G. N.; Barnes, R. A.
1994-01-01
NASA's Mission to Planet Earth (MTPE) will address important interdisciplinary and environmental issues such as global warming, ozone depletion, deforestation, acid rain, and the like with its long-term satellite observations of the Earth and with its comprehensive Data and Information System. Extensive sets of satellite observations supporting MTPE will be provided by the Earth Observing System (EOS), while more specific process-related observations will be provided by smaller Earth Probes. MTPE will use data from ground and airborne scientific investigations to supplement and validate the global observations obtained from satellite imagery, while the EOS satellites will support interdisciplinary research and model development. This is important for understanding the processes that control the global environment and for improving the prediction of events. In this paper we illustrate the potential of powerful artificial intelligence (AI) techniques when used in the analysis of the formidable problems that exist in the NASA Earth Science programs and of those to be encountered in the future MTPE and EOS programs. These techniques, based on the logical and probabilistic reasoning aspects of plausible inference, strongly emphasize the synergistic relation between data and information. As such, they are ideally suited for the analysis of the massive data streams to be provided by both MTPE and EOS. To demonstrate this, we address both the satellite imagery and model enhancement issues for the problem of ozone profile retrieval through a method based on plausible scientific inference. Since in the retrieval problem the atmospheric ozone profile that is consistent with a given set of measured radiances may not be unique, an optimum statistical method is used to estimate a 'best' profile solution from the radiances and from additional a priori information.
Supaporn, Pansuwan; Yeom, Sung Ho
2018-04-30
This study investigated the biological conversion of crude glycerol, generated as a by-product of a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a model of the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective, with its maximum value, were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH4)2SO4, 2.0 mL/L trace elements, pH 7.5, and 11.2 h of cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved a high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h), as well as a high 1,3-PD concentration (23.6 g/L).
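The response-surface style of optimization described above can be sketched in miniature: fit a second-order polynomial to a measured response and take the stationary point of the fit as the predicted optimum. A single-factor Python sketch with entirely invented data points (not the study's measurements):

```python
import numpy as np

# Invented toy data: 1,3-PD concentration (g/L) vs initial glycerol (g/L)
glycerol = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
pd_conc = np.array([12.0, 19.5, 23.5, 21.0, 14.0])

# Second-order (quadratic) response model, as in response-surface methods;
# its stationary point is the predicted optimum factor level
c2, c1, c0 = np.polyfit(glycerol, pd_conc, 2)
optimum = -c1 / (2.0 * c2)
print(round(optimum, 1))  # predicted optimum glycerol level for the toy data
```

In the actual study this fit is multivariate (five factors, four responses), but the stationary-point logic is the same.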
A hierarchical fuzzy rule-based approach to aphasia diagnosis.
Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid
2007-10-01
Aphasia diagnosis is a particularly challenging medical diagnostic task due to linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, a large number of imprecise measurements, and the natural diversity and subjectivity both in test subjects and in the opinions of the experts who diagnose the disease. To address this diagnostic process efficiently, a hierarchical fuzzy rule-based structure is proposed here that incorporates the effects of different features of aphasia, identified by statistical analysis, into its construction. This approach can be efficient for the diagnosis of aphasia, and possibly other medical diagnostic applications, due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The statistical parameters measured from the training set are then used to define the membership functions and the fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagation feed-forward neural network for the diagnosis of four aphasia types: Anomic, Broca, Global and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also delivering a significant improvement in accuracy.
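A toy illustration of the construction step described above, turning training-set statistics into membership functions, assuming Gaussian memberships built from per-class means and standard deviations. This is a single-feature, single-layer stand-in for the paper's two-layer, multi-feature system; all numbers are invented:

```python
import numpy as np

# Invented per-class statistics (mean, std) of one speech-test feature;
# the paper derives its memberships from training-set statistics like these
stats = {
    "Anomic":   (70.0, 8.0),
    "Broca":    (45.0, 10.0),
    "Global":   (20.0, 7.0),
    "Wernicke": (55.0, 9.0),
}

def membership(x, mean, std):
    """Gaussian membership function centered on the class mean."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2)

def classify(x):
    """Pick the class whose membership fires most strongly."""
    return max(stats, key=lambda c: membership(x, *stats[c]))

print(classify(18.0))  # near the invented 'Global' class mean
```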
Global health business: the production and performativity of statistics in Sierra Leone and Germany.
Erikson, Susan L
2012-01-01
The global push for health statistics and electronic digital health information systems is about more than tracking health incidence and prevalence. It is also experienced on the ground as means to develop and maintain particular norms of health business, knowledge, and decision- and profit-making that are not innocent. Statistics make possible audit and accountability logics that undergird the management of health at a distance and that are increasingly necessary to the business of health. Health statistics are inextricable from their social milieus, yet as business artifacts they operate as if they are freely formed, objectively originated, and accurate. This article explicates health statistics as cultural forms and shows how they have been produced and performed in two very different countries: Sierra Leone and Germany. In both familiar and surprising ways, this article shows how statistics and their pursuit organize and discipline human behavior, constitute subject positions, and reify existing relations of power.
Blanco-Guillot, Francles; Castañeda-Cediel, M Lucía; Cruz-Hervert, Pablo; Ferreyra-Reyes, Leticia; Delgado-Sánchez, Guadalupe; Ferreira-Guerrero, Elizabeth; Montero-Campos, Rogelio; Bobadilla-Del-Valle, Miriam; Martínez-Gamboa, Rosa Areli; Torres-González, Pedro; Téllez-Vazquez, Norma; Canizales-Quintero, Sergio; Yanes-Lane, Mercedes; Mongua-Rodríguez, Norma; Ponce-de-León, Alfredo; Sifuentes-Osornio, José; García-García, Lourdes
2018-01-01
Genotyping and georeferencing in tuberculosis (TB) have been used to characterize the distribution of the disease and the occurrence of transmission within specific groups and communities. The objective of this study was to test the hypothesis that diabetes mellitus (DM) and pulmonary TB may occur in spatial and molecular aggregations. Retrospective cohort study of patients with pulmonary TB. The study area included 12 municipalities in the Sanitary Jurisdiction of Orizaba, Veracruz, México. Patients with acid-fast bacilli in sputum smears and/or Mycobacterium tuberculosis in sputum cultures were recruited from 1995 to 2010. Clinical (standardized questionnaire, physical examination, chest X-ray, blood glucose test and HIV test), microbiological, epidemiological, and molecular evaluations were carried out. Patients were considered "genotype-clustered" if two or more isolates from different patients were identified within 12 months of each other and had six or more IS6110 bands in an identical pattern, or < 6 bands with identical IS6110 RFLP patterns and a spoligotype with the same spacer oligonucleotides. Residential and health-care-center addresses were georeferenced using a Jeep handheld GPS unit, and the coordinates were transferred from the GPS files to ArcGIS using ArcMap 9.3. We evaluated global spatial aggregation of patients in IS6110-RFLP/spoligotype clusters using global Moran's I. Since the global distribution was not random, we evaluated "hotspots" using the Getis-Ord Gi* statistic. Using bivariate and multivariate analysis we analyzed the sociodemographic, behavioral, clinical and bacteriological conditions associated with "hotspots". We used STATA® v13.1 for all statistical analyses. From 1995 to 2010, 1,370 patients >20 years were diagnosed with pulmonary TB; 33% had DM. 
The proportion of isolates genotyped was 80.7% (n = 1105), of which 31% (n = 342) were grouped in 91 genotype clusters of 2 to 23 patients each; 65.9% of clusters were small (2 members), involving 35.08% of patients. Twenty-three percent (22.7%) of cases were classified as recent transmission. Moran's I indicated that the distribution of patients in IS6110-RFLP/spoligotype clusters was not random (Moran's I = 0.035468, Z value = 7.0, p = 0.00). Local spatial analysis showed statistically significant spatial aggregation of patients in IS6110-RFLP/spoligotype clusters, identifying "hotspots" and "coldspots". The Gi* statistic showed that the hotspot for spatial clustering was located in the Camerino Z. Mendoza municipality; 14.6% (50/342) of patients in genotype clusters were located in a hotspot, and of these, 60% (30/50) lived with DM. Using logistic regression, the statistically significant variables associated with hotspots were DM [adjusted odds ratio (aOR) 7.04, 95% confidence interval (CI) 3.03-16.38] and attending the health center in Camerino Z. Mendoza (aOR 18.04, 95% CI 7.35-44.28). The combination of molecular and epidemiological information with geospatial data allowed us to identify the concurrence of molecular clustering and spatial aggregation among patients with DM and TB. This information may be highly useful for TB control programs.
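Global Moran's I, the statistic used above to test for spatial aggregation, can be computed directly from a value vector and a spatial weight matrix. A minimal sketch with a toy contiguity matrix (the study used georeferenced patient data in ArcGIS, not this toy setup):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a 1-D array of observations and an n x n
    spatial weight matrix (diagonal assumed zero)."""
    n = len(values)
    z = values - values.mean()
    num = n * (weights * np.outer(z, z)).sum()
    den = weights.sum() * (z ** 2).sum()
    return num / den

# Toy example: 5 locations on a line, binary contiguity weights
vals = np.array([1.0, 2.0, 2.5, 4.0, 5.0])
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0

print(morans_i(vals, W))  # positive: similar values cluster in space
```

A positive I (values near each other are similar) is then tested against its null distribution for significance, which is what the reported Z value expresses.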
Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis
NASA Astrophysics Data System (ADS)
Klose, C. D.
2012-12-01
The causality between large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M=8 is discussed, and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production and fluid injections/extractions. The presented global earthquake catalog has recently been published in the Journal of Seismology and is publicly available at www.cdklose.com. The data show evidence that geomechanical relationships exist with statistical significance between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular in the source parameter estimates of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (
Increasing power-law range in avalanche amplitude and energy distributions
NASA Astrophysics Data System (ADS)
Navas-Portella, Víctor; Serra, Isabel; Corral, Álvaro; Vives, Eduard
2018-02-01
Power-law-type probability density functions spanning several orders of magnitude are found for different avalanche properties. We propose a methodology to overcome empirical constraints that limit the range of truncated power-law distributions. By considering catalogs of events that cover different observation windows, the maximum likelihood estimation of a global power-law exponent is computed. This methodology is applied to amplitude and energy distributions of acoustic emission avalanches in failure-under-compression experiments of a nanoporous silica glass, finding in some cases global exponents in an unprecedented broad range: 4.5 decades for amplitudes and 9.5 decades for energies. In the latter case, however, strict statistical analysis suggests experimental limitations might alter the power-law behavior.
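For the untruncated case, the maximum likelihood estimate of a continuous power-law exponent has a closed form, alpha-hat = 1 + n / sum(ln(x_i / x_min)); the truncated fits combined across observation windows in the paper require numerical maximization instead. A sketch of the closed-form estimator on synthetic data:

```python
import numpy as np

def powerlaw_mle_alpha(x, xmin):
    """MLE exponent for a continuous power law p(x) ~ x^(-alpha), x >= xmin.
    This is the untruncated closed form; the truncated, multi-window fit
    used in the paper needs numerical likelihood maximization."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

# Check on synthetic data drawn from alpha = 2.5 via inverse-CDF sampling
rng = np.random.default_rng(0)
u = rng.random(100_000)
samples = (1.0 - u) ** (-1.0 / (2.5 - 1.0))  # xmin = 1
alpha_hat = powerlaw_mle_alpha(samples, xmin=1.0)
print(alpha_hat)  # close to 2.5
```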
Global limits and interference patterns in dark matter direct detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catena, Riccardo; Gondolo, Paolo
2015-08-13
We compare the general effective theory of one-body dark matter nucleon interactions to current direct detection experiments in a global multidimensional statistical analysis. We derive exclusion limits on the 28 isoscalar and isovector coupling constants of the theory, and show that current data place interesting constraints on dark matter-nucleon interaction operators usually neglected in this context. We characterize the interference patterns that can arise in dark matter direct detection from pairs of dark matter-nucleon interaction operators, or from isoscalar and isovector components of the same operator. We find that commonly neglected destructive interference effects weaken standard direct detection exclusion limits by up to one order of magnitude in the coupling constants.
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary -omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics-oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims both to simplify analysis for investigators new to metabolomics and to provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer's workflow is specifically tailored to the unique characteristics and idiosyncrasies of post-processed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy-to-understand, statistically significant and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study, in which samples were collected from mice exposed to gamma radiation, was analyzed. 
MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
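The per-ion significance testing that tools like MetaboLyzer automate can be sketched as a nonparametric test per ion followed by a Benjamini-Hochberg FDR correction; the log2 fold changes would feed a volcano plot. All data below are invented toy values, not the study's urine metabolomics set:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
# Toy data: 30 ions x (8 control, 8 treated) abundances;
# the first 5 ions get a real 5x shift
control = rng.lognormal(0.0, 0.5, size=(30, 8))
treated = rng.lognormal(0.0, 0.5, size=(30, 8))
treated[:5] *= 5.0

# One two-sided Mann-Whitney U test per ion
pvals = np.array([mannwhitneyu(control[i], treated[i]).pvalue
                  for i in range(30)])
log2fc = np.log2(treated.mean(axis=1) / control.mean(axis=1))  # volcano x-axis

# Benjamini-Hochberg FDR at 5%: largest k with p_(k) <= 0.05*k/m
order = np.argsort(pvals)
thresh = 0.05 * np.arange(1, 31) / 30
passed = pvals[order] <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant = order[:k]
print(sorted(int(i) for i in significant))
```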
Further developments in cloud statistics for computer simulations
NASA Technical Reports Server (NTRS)
Chang, D. T.; Willand, J. H.
1972-01-01
This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.
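The multi-step Monte Carlo scheme suggested above amounts to sampling a cloud type from tabulated frequencies and then drawing in-cloud parameters from a per-type distribution. A sketch with invented stand-in statistics (the study's actual tables come from WBAN surface observations):

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented stand-in statistics for one station/season: cloud-type
# frequencies and per-type mean liquid water content (g/m^3)
cloud_types = ["clear", "cumulus", "stratus", "cirrus"]
type_prob = [0.35, 0.25, 0.25, 0.15]
mean_lwc = {"clear": 0.0, "cumulus": 0.45, "stratus": 0.25, "cirrus": 0.02}

def simulate_profiles(n):
    """Two-step Monte Carlo: draw a cloud type from the tabulated
    frequencies, then an LWC value (exponential about the type mean,
    an assumed in-cloud model, not the study's)."""
    types = rng.choice(cloud_types, size=n, p=type_prob)
    lwc = np.array([rng.exponential(mean_lwc[t]) if mean_lwc[t] > 0 else 0.0
                    for t in types])
    return types, lwc

types, lwc = simulate_profiles(10_000)
print(np.mean(types == "clear"))  # close to the 0.35 input frequency
```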
NASA Astrophysics Data System (ADS)
Dong, Ning; Wright, Ian; Prentice, Iain Colin
2017-04-01
Natural abundance of the stable isotope 15N is an under-utilized resource for research on the global terrestrial nitrogen cycle. Mass balance considerations suggest that if reactive N inputs have a roughly constant isotopic signature, soil δ15N should be mainly determined by the fraction of N losses by leaching - which barely discriminates against 15N - versus gaseous N losses, which discriminate strongly against 15N. We defined simple process-oriented functions of runoff (frunoff) and soil temperature (ftemp) and investigated the dependencies of soil and foliage δ15N (from global compilations of both types of measurement) on their ratio. Both plant and soil δ15N were found to systematically increase with ftemp/frunoff. Consistent with previous analyses, foliage δ15N was offset (more negative) with respect to soil δ15N, with significant differences in this offset between (from largest to smallest offset) ericoid, ectomycorrhizal, arbuscular mycorrhizal and non-mycorrhizal associated plants. δ15N values tend to be large and positive in the driest environments and to decline as frunoff increases, while also being lower in cold environments and increasing as ftemp increases. The fitted statistical model was used to estimate the gaseous fraction of total N losses from ecosystems (fgas) on a global grid basis. In common with earlier results, the largest values of fgas are predicted in the tropics and semi-arid subtropics. This analysis provides an indirectly estimated global mapping of fgas, which could be used as an improved benchmark for terrestrial nitrogen cycle models.
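The fitted statistical model described above can be caricatured as a regression of soil δ15N on the ftemp/frunoff ratio, followed by inversion of a two-end-member mixing assumption to estimate fgas. All coefficients, noise levels and end members below are invented for illustration, not values from the paper:

```python
import numpy as np

# Synthetic stand-in data: d15N rises with log(f_temp/f_runoff)
rng = np.random.default_rng(3)
ratio = rng.uniform(0.2, 5.0, 200)
d15n = 1.5 * np.log(ratio) + 3.0 + rng.normal(0.0, 1.0, 200)

# Fit the statistical model (here a simple log-linear regression)
slope, intercept = np.polyfit(np.log(ratio), d15n, 1)

def f_gas(r, eps_gas=15.0):
    """Gaseous fraction of N losses under an assumed two-end-member mixing:
    leaching leaves ~0 permil enrichment, gaseous loss ~ +15 permil."""
    pred = slope * np.log(r) + intercept
    return np.clip(pred / eps_gas, 0.0, 1.0)

print(round(slope, 2))  # recovers roughly the 1.5 used to build the data
```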
An effective drift correction for dynamical downscaling of decadal global climate predictions
NASA Astrophysics Data System (ADS)
Paeth, Heiko; Li, Jingmin; Pollinger, Felix; Müller, Wolfgang A.; Pohlmann, Holger; Feldmann, Hendrik; Panitz, Hans-Jürgen
2018-04-01
Initialized decadal climate predictions with coupled climate models are often marked by substantial climate drifts that emanate from a mismatch between the climatology of the coupled model system and the data set used for initialization. While such drifts may be easily removed from the prediction system when analyzing individual variables, a major problem prevails for multivariate issues and, especially, when the output of the global prediction system shall be used for dynamical downscaling. In this study, we present a statistical approach to remove climate drifts in a multivariate context and demonstrate the effect of this drift correction on regional climate model simulations over the Euro-Atlantic sector. The statistical approach is based on an empirical orthogonal function (EOF) analysis adapted to a very large data matrix. The climate drift emerges as a dramatic cooling trend in North Atlantic sea surface temperatures (SSTs) and is captured by the leading EOF of the multivariate output from the global prediction system, accounting for 7.7% of total variability. The SST cooling pattern also imposes drifts in various atmospheric variables and levels. The removal of the first EOF effectuates the drift correction while retaining other components of intra-annual, inter-annual and decadal variability. In the regional climate model, the multivariate drift correction of the input data removes the cooling trends in most western European land regions and systematically reduces the discrepancy between the output of the regional climate model and observational data. In contrast, removing the drift only in the SST field from the global model has hardly any positive effect on the regional climate model.
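The EOF-based correction described above can be sketched with a plain SVD: stack the multivariate anomaly fields into a (time x space·variables) matrix, identify the leading mode, and subtract it. A toy sketch (the real application used a very large data matrix and an EOF analysis adapted to it):

```python
import numpy as np

def remove_leading_eof(X):
    """X: (time, space*variables) anomaly matrix. Removes the leading
    EOF mode (standing in for the drift pattern) and returns the
    corrected matrix plus the variance fraction that mode explained."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    frac = s[0] ** 2 / np.sum(s ** 2)
    X_corrected = X - np.outer(U[:, 0] * s[0], Vt[0])
    return X_corrected, frac

# Toy field: a common linear "drift" plus noise, 50 time steps x 200 points
rng = np.random.default_rng(1)
drift = np.outer(np.linspace(-1, 1, 50), rng.normal(size=200))
noise = 0.1 * rng.normal(size=(50, 200))
Xc, frac = remove_leading_eof(drift + noise)
print(frac)  # the drift mode dominates the toy field's variance
```

Removing only the leading mode leaves the remaining modes, i.e. the intra-annual to decadal variability, untouched, which is the point of the multivariate approach.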
Dynamic water allocation policies improve the global efficiency of storage systems
NASA Astrophysics Data System (ADS)
Niayifar, Amin; Perona, Paolo
2017-06-01
Water impoundment by dams strongly affects a river's natural flow regime, its attributes and the related ecosystem biodiversity. Fostering the sustainability of water uses, e.g. in hydropower systems, thus implies searching for innovative operational policies able to generate Dynamic Environmental Flows (DEF) that mimic natural flow variability. The objective of this study is to propose a Direct Policy Search (DPS) framework based on defining dynamic flow-release rules to improve the global efficiency of storage systems. The water allocation policies proposed for dammed systems extend the flow redistribution rules previously developed for small hydropower plants by Razurel et al. (2016). The mathematical form of the Fermi-Dirac statistical distribution, applied to the lake equation for the water stored in the dam, is used to formulate non-proportional redistribution rules that partition the flow between energy production and environmental use. While energy production is computed from technical data, the riverine ecological benefits associated with DEF are computed by integrating the Weighted Usable Area (WUA) for fishes with Richter's hydrological indicators. Then, multiobjective evolutionary algorithms (MOEAs) are applied to build the ecological-versus-economic efficiency plot and locate its Pareto frontier. This study benchmarks two MOEAs (NSGA-II and Borg MOEA) and compares their efficiency in terms of the quality of the Pareto frontier and computational cost. A detailed analysis of dam characteristics is performed to examine their impact on the global system efficiency and the choice of the best redistribution rule. Finally, it is found that non-proportional flow releases can statistically improve the global efficiency, particularly the ecological efficiency, of the hydropower system when compared to constant minimal flows.
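A minimal sketch of a Fermi-Dirac-shaped (logistic) redistribution rule of the kind described above: the fraction of inflow released to the river follows a smooth step in the inflow rather than a fixed proportion. The shape parameters q0 and theta here are illustrative assumptions, not values from the study:

```python
import numpy as np

def env_fraction(q, q0=5.0, theta=1.0):
    """Fermi-Dirac-shaped fraction of inflow q released to the environment:
    near 1 at low flows (protecting the river) and falling at high flows
    (surplus goes to the turbines). q0 (half-allocation inflow) and theta
    (steepness) are illustrative, not calibrated values."""
    return 1.0 / (1.0 + np.exp((q - q0) / theta))

q = np.array([1.0, 5.0, 20.0])          # inflows (arbitrary units)
release = q * env_fraction(q)           # environmental flow
to_turbine = q - release                # flow available for production
print(env_fraction(q))  # roughly [0.98, 0.5, 0.0]: fraction falls with inflow
```

Because the fraction varies with inflow, the released environmental flow inherits the natural variability of the inflow, which is what distinguishes these rules from constant minimal flows.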
Statistical Distribution Analysis of Lineated Bands on Europa
NASA Astrophysics Data System (ADS)
Chen, T.; Phillips, C. B.; Pappalardo, R. T.
2016-12-01
Europa's surface is covered with intriguing linear and disrupted features, including lineated bands that range in scale and size. Previous studies have shown the possibility of an icy shell at the surface that may be concealing a liquid ocean with the potential to harbor life (Pappalardo et al., 1999). Utilizing the high-resolution imaging data from the Galileo spacecraft, we examined bands through a morphometric and morphologic approach. Greeley et al. (2000) and Prockter et al. (2002) defined bands as wide, hummocky to lineated features that have a distinctive surface texture and albedo compared to their surrounding terrain. We took morphometric measurements of lineated bands to find correlations among properties such as size, location, and orientation, and to shed light on formation models. We will present our measurements of over 100 bands on Europa that were mapped on the USGS Europa Global Mosaic Base Map (2002). We also conducted a statistical analysis to understand the global distribution of lineated bands, and whether the widths of the bands differ by location. Our preliminary statistical evaluation of the distribution, combined with the morphometric measurements, supports a uniform ice-shell thickness for Europa rather than one that varies geographically. References: Greeley, Ronald, et al. "Geologic mapping of Europa." Journal of Geophysical Research: Planets 105.E9 (2000): 22559-22578; Pappalardo, R. T., et al. "Does Europa have a subsurface ocean? Evaluation of the geological evidence." Journal of Geophysical Research: Planets 104.E10 (1999): 24015-24055; Prockter, Louise M., et al. "Morphology of Europan bands at high resolution: A mid-ocean ridge-type rift mechanism." Journal of Geophysical Research: Planets 107.E5 (2002); U.S. Geological Survey, 2002, Controlled photomosaic map of Europa, Je 15M CMN: U.S. Geological Survey Geologic Investigations Series I-2757, available at http://pubs.usgs.gov/imap/i2757/
Tan, Meng-Shan; Yu, Jin-Tai; Tan, Chen-Chen; Wang, Hui-Fu; Meng, Xiang-Fei; Wang, Chong; Jiang, Teng; Zhu, Xi-Chen; Tan, Lan
2015-01-01
Research into Ginkgo biloba has been ongoing for many years, but the benefits and adverse effects of Ginkgo biloba extract EGb761 for cognitive impairment and dementia remain controversial. Our aim was to assess new evidence on the clinical and adverse effects of standardized Ginkgo biloba extract EGb761 for cognitive impairment and dementia. MEDLINE, EMBASE, Cochrane, and other relevant databases were searched in March 2014 for eligible randomized controlled trials of Ginkgo biloba EGb761 therapy in patients with cognitive impairment and dementia. Nine trials met our inclusion criteria. Trials were of 22-26 weeks' duration and included 2,561 patients in total. In the meta-analysis, the weighted mean differences in change scores for cognition were in favor of EGb761 compared to placebo (-2.86, 95%CI -3.18; -2.54); the standardized mean differences in change scores for activities of daily living (ADLs) were also in favor of EGb761 compared to placebo (-0.36, 95%CI -0.44; -0.28); and the Peto OR showed a statistically significant difference from placebo on the Clinicians' Global Impression of Change (CGIC) scale (1.88, 95%CI 1.54; 2.29). All these benefits are mainly associated with EGb761 at a dose of 240 mg/day. In the subgroup analysis of patients with neuropsychiatric symptoms, 240 mg/day EGb761 improved cognitive function, ADLs, CGIC, and also neuropsychiatric symptoms, with statistically greater effects than in the whole group. For the Alzheimer's disease subgroup, the main outcomes were almost the same as for the whole group, with no statistically significant superiority. Finally, safety data revealed no important safety concerns with EGb761. EGb761 at 240 mg/day is able to stabilize or slow decline in cognition, function, behavior, and global change at 22-26 weeks in cognitive impairment and dementia, especially for patients with neuropsychiatric symptoms.
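The pooled change-score differences reported above are the kind of estimate a fixed-effect inverse-variance meta-analysis produces. A minimal sketch of that pooling (the per-trial numbers below are hypothetical illustrations, not the EGb761 trial data):

```python
import math

def pooled_mean_difference(effects, ses):
    """Fixed-effect inverse-variance pooling of per-trial mean differences.
    Returns the pooled estimate and its 95% confidence interval."""
    weights = [1.0 / se ** 2 for se in ses]
    total = sum(weights)
    est = sum(w * e for w, e in zip(weights, effects)) / total
    se_pooled = math.sqrt(1.0 / total)
    return est, (est - 1.96 * se_pooled, est + 1.96 * se_pooled)

# hypothetical per-trial cognition change-score differences and standard errors
est, ci = pooled_mean_difference([-2.5, -3.1, -2.9], [0.4, 0.5, 0.3])
```

Trials with smaller standard errors (larger samples) receive proportionally more weight, which is why a single large trial can dominate a pooled estimate.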
LINGO1 and risk for essential tremor: results of a meta-analysis of rs9652490 and rs11856808.
Jiménez-Jiménez, Félix Javier; García-Martín, Elena; Lorenzo-Betancor, Oswaldo; Pastor, Pau; Alonso-Navarro, Hortensia; Agúndez, José A G
2012-06-15
Recently, a genome-wide association study revealed a significant statistical association between the LINGO1 rs9652490 and rs11856808 polymorphisms and the risk of developing essential tremor (ET) in Icelandic people. Because the results of further association studies were controversial, we conducted a meta-analysis including all the studies published on the risk of ET related to these polymorphisms. The meta-analysis included 11 association studies of LINGO1 rs9652490 (3972 ET patients, 20,714 controls) and 7 association studies of LINGO1 rs11856808 (2076 ET patients, 18,792 controls) and risk for ET, and was carried out using the software Meta-DiSc 1.1.1 (http://www.hrc.es/investigacion/metadisc.html; Unit of Clinical Statistics, Hospital Ramón y Cajal, Madrid, Spain). Heterogeneity between studies in terms of degree of association was tested using the Q-statistic. Global diagnostic odds ratios (ORs) and 95% confidence intervals (CIs) for rs9652490 and rs11856808 in the total series were, respectively, 1.17 (1.00-1.36) (p=0.069) and 1.20 (1.05-1.36) (p=0.016). After excluding data on Icelandic people from the discovery series (which was responsible for a high degree of heterogeneity for the rs9652490 polymorphism), the ORs and CIs were 1.10 (0.97-1.26) (p=0.063) and 1.12 (0.99-1.27) (p=0.034). Global ORs and 95% CIs for rs9652490 and rs11856808 in familial ET patients were, respectively, 1.27 (1.03-1.57) (p=0.014) and 1.21 (1.10-1.44) (p=0.031). The results of the meta-analysis suggest a relationship between the LINGO1 rs11856808 polymorphism and the risk for ET and for familial ET, while the rs9652490 polymorphism was related only to the risk for familial ET. Copyright © 2012 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, R.M.; Harding, J.M.; Pollak, K.D.
1992-02-01
Global-scale analyses of ocean thermal structure produced operationally at the U.S. Navy's Fleet Numerical Oceanography Center are verified, along with an ocean thermal climatology, against unassimilated bathythermograph (bathy), satellite multichannel sea surface temperature (MCSST), and ship sea surface temperature (SST) data. Verification statistics are calculated from the three types of data for February-April of 1988 and February-April of 1990 in nine verification areas covering most of the open ocean in the Northern Hemisphere. The analyzed thermal fields were produced by version 1.0 of the Optimum Thermal Interpolation System (OTIS 1.0) in 1988, but by an upgraded version of this model, referred to as OTIS 1.1, in 1990. OTIS 1.1 employs exactly the same analysis methodology as OTIS 1.0. The principal difference is that OTIS 1.1 has twice the spatial resolution of OTIS 1.0 and consequently uses smaller spatial decorrelation scales and noise-to-signal ratios. As a result, OTIS 1.1 is able to represent more horizontal detail in the ocean thermal fields than its predecessor. Verification statistics for the SST fields derived from bathy and MCSST data are consistent with each other, showing similar trends and error levels. These data indicate that the analyzed SST fields are more accurate in 1990 than in 1988, and generally more accurate than climatology for both years. Verification statistics for the SST fields derived from ship data are inconsistent with those derived from the bathy and MCSST data, and show much higher error levels indicative of observational noise.
Additive scales in degenerative disease--calculation of effect sizes and clinical judgment.
Riepe, Matthias W; Wilkinson, David; Förstl, Hans; Brieden, Andreas
2011-12-16
The therapeutic efficacy of an intervention is often assessed in clinical trials by scales measuring multiple diverse activities that are added to produce a cumulative global score. Medical communities and health care systems subsequently use these data to calculate pooled effect sizes to compare treatments. This is done because major doubt has been cast on the clinical relevance of statistically significant findings that rely on p values, which carry the risk of reporting chance findings. Hence, pooling the results of clinical studies into a meta-analysis with a statistical calculus has been assumed to be a more definitive way of deciding on efficacy. We simulate therapeutic effects as measured with additive scales in patient cohorts with different disease severities, and assess the limitations of effect size calculations for additive scales, which we prove mathematically. We demonstrate that the major problem, which cannot be overcome by current numerical methods, is the complex nature and neurobiological foundation of clinical psychiatric endpoints in particular and additive scales in general. This is particularly relevant for endpoints used in dementia research. 'Cognition' is composed of functions such as memory, attention, orientation and many more. These individual functions decline in varied and non-linear ways. Here we demonstrate that with progressive diseases, cumulative values from multidimensional scales are subject to distortion by the limitations of the additive scale. The non-linearity of the decline of function impedes the calculation of effect sizes based on cumulative values from these multidimensional scales. Statistical analysis needs to be guided by the boundaries of the biological condition. Alternatively, we suggest a different approach that avoids the error imposed by over-analysis of cumulative global scores from additive scales.
NASA Astrophysics Data System (ADS)
Antón, M.; Román, R.; Sanchez-Lorenzo, A.; Calbó, J.; Vaquero, J. M.
2017-07-01
This study focuses on the analysis of daily global solar radiation (GSR) reconstructed from sunshine duration measurements at Madrid (Spain) from 1887 to 1950. Additionally, cloud cover information recorded simultaneously by human observers for the study period was also analyzed and used to select cloud-free days. First, the day-to-day variability of the reconstructed GSR data was evaluated, revealing a strong relationship between GSR and cloudiness. The second step was to analyze the long-term evolution of the GSR data, which exhibited two clear trends of opposite sign: a marked negative trend of -36 kJ/m² per year for the 1887-1915 period and a moderate positive trend of +13 kJ/m² per year for the 1916-1950 period, both statistically significant at the 95% confidence level. Therefore, there is evidence of "early dimming" and "early brightening" periods in the reconstructed GSR data for all-sky conditions in Madrid from the late 19th to the mid-20th centuries. Unlike the long-term evolution of the GSR data, cloud cover showed non-statistically-significant trends for the two analyzed sub-periods, 1887-1915 and 1916-1950. Finally, GSR trends were analyzed exclusively under cloud-free conditions in summer, by means of the clearness index for those days with all cloud cover observations equal to zero oktas. The long-term evolution of the clearness index was in accordance with the "early dimming" and "early brightening" periods, showing smaller but still statistically significant trends. This result indicates that aerosol load variability could have had a non-negligible influence on the long-term evolution of GSR even as far back as the late 19th century.
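Trend figures like the -36 kJ/m² per year above come from fitting a least-squares slope to an annual series. A minimal sketch on a synthetic series (the data and magnitude are illustrative, not the Madrid record):

```python
def ols_slope(years, values):
    """Ordinary least-squares trend (units per year) of an annual series."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return sxy / sxx

# synthetic annual GSR-like series declining by exactly 2 units per year
years = list(range(1887, 1916))
values = [10000.0 - 2.0 * (y - 1887) for y in years]
slope = ols_slope(years, values)
```

In practice the slope's standard error and a t-test (or a non-parametric Mann-Kendall test) would be used to judge the 95% significance the abstract reports.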
Long term, non-anthropogenic groundwater storage changes simulated by a global land surface model
NASA Astrophysics Data System (ADS)
Li, B.; Rodell, M.; Sheffield, J.; Wood, E. F.
2017-12-01
Groundwater is crucial for meeting agricultural, industrial and municipal water needs, especially in arid, semi-arid and drought-impacted regions. Yet groundwater response to climate variability is not well understood, due to the lack of systematic and continuous in situ measurements. In this study, we investigate global non-anthropogenic groundwater storage variations with a land surface model driven by a 67-year (1948-2014) meteorological forcing data set. Model estimates were evaluated against in situ groundwater data from the central and northeastern U.S. and terrestrial water storage derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, and were found to be reasonable. Empirical orthogonal function (EOF) analysis was employed to examine modes of variability of groundwater storage and their relationship with atmospheric drivers such as precipitation and evapotranspiration. The results show that the leading mode in global groundwater storage reflects the influence of the El Niño Southern Oscillation (ENSO). Consistent with the EOF analysis, global total groundwater storage reflected the low-frequency variability of ENSO and decreased significantly over 1948-2014, while global ET and precipitation did not exhibit statistically significant trends. This study suggests that while precipitation and ET are the primary drivers of climate-related groundwater variability, changes in forcing fields other than precipitation and temperature are also important because of their influence on ET. We discuss the need to improve model physics and to continuously validate model estimates and forcing data for future studies.
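EOF analysis, used above to isolate the ENSO-related mode, is equivalent to an SVD of the space-time anomaly matrix. A minimal sketch on a synthetic storage field with one dominant oscillation (grid size, period and noise level are arbitrary choices, not the study's configuration):

```python
import numpy as np

def eof_modes(field, n_modes=2):
    """EOF analysis via SVD of a (time x space) anomaly matrix.
    Returns spatial patterns, principal components, and explained-variance
    fractions for the leading modes."""
    anomalies = field - field.mean(axis=0)  # remove the temporal mean per cell
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    return vt[:n_modes], u[:, :n_modes] * s[:n_modes], explained[:n_modes]

# synthetic field: one coherent oscillation over 50 grid cells plus weak noise
t = np.arange(120)
spatial = np.sin(np.linspace(0.0, np.pi, 50))
field = np.outer(np.sin(2.0 * np.pi * t / 48.0), spatial)
field = field + 0.05 * np.random.default_rng(0).standard_normal(field.shape)
patterns, pcs, frac = eof_modes(field)
```

The leading principal component would then be correlated against a climate index (e.g., an ENSO index) to attribute the mode, as the abstract describes.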
Real-life assessment of the validity of patient global impression of change in fibromyalgia.
Rampakakis, Emmanouil; Ste-Marie, Peter A; Sampalis, John S; Karellis, Angeliki; Shir, Yoram; Fitzcharles, Mary-Ann
2015-01-01
Patient Global Rating of Change (GRC) scales are commonly used in routine clinical care given their ease of use, availability and short completion time. This analysis aimed to assess the validity of the Patient Global Impression of Change (PGIC), a GRC scale commonly used in fibromyalgia, in a Canadian real-life setting. 167 fibromyalgia patients with available PGIC data were recruited in 2005-2013 from a Canadian tertiary-care multidisciplinary clinic. In addition to PGIC, disease severity was assessed with: pain visual analogue scale (VAS); Patient Global Assessment (PGA); Fibromyalgia Impact Questionnaire (FIQ); Health Assessment Questionnaire (HAQ); McGill Pain Questionnaire; and a body map. Multivariate linear regression assessed the relationship of PGIC with improvement in disease parameters while adjusting for follow-up duration and baseline parameter levels. Spearman's rank coefficient assessed parameter correlation. Higher PGIC scores were significantly (p<0.001) associated with greater improvement in pain, PGA, FIQ, HAQ and the body map. A statistically significant moderate positive correlation was observed between PGIC and FIQ improvement (r=0.423; p<0.001); correlation with all remaining disease severity measures was weak. Regression analysis confirmed a significant (p<0.001) positive association between improvement in all disease severity measures and PGIC. Baseline disease severity and follow-up duration were identified as significant independent predictors of PGIC rating. Although only a weak correlation was identified between PGIC and improvement in standard fibromyalgia outcomes, in the absence of objective outcomes PGIC remains a clinically relevant tool to assess the perceived impact of disease management. However, our analysis suggests that outcome measure data should not be considered in isolation but within the global clinical context.
Axial mass in quasielastic antineutrino-nucleon scattering accompanied by strange-hyperon production
NASA Astrophysics Data System (ADS)
Kuzmin, K. S.; Naumov, V. A.
2009-09-01
Reactions of quasielastic Λ-, Σ⁻-, and Σ⁰-hyperon production in antineutrino-nucleon interactions are studied. An axial-mass (M_A) value that agrees with a fit to all accelerator data on exclusive and inclusive νN and ν̄N reactions was extracted from a global statistical analysis of experimental data on differential and total cross sections for ΔY = 0 and 1 quasielastic reactions of neutrino and antineutrino scattering on various nuclear targets.
Geography of end-Cretaceous marine bivalve extinctions
NASA Technical Reports Server (NTRS)
Raup, David M.; Jablonski, David
1993-01-01
Analysis of the end-Cretaceous mass extinction, based on 3514 occurrences of 340 genera of marine bivalves (Mollusca), suggests that extinction intensities were uniformly global; no latitudinal gradients or other geographic patterns are detected. Elevated extinction intensities in some tropical areas are entirely a result of the distribution of one extinct group of highly specialized bivalves, the rudists. When rudists are omitted, intensities at those localities are statistically indistinguishable from those of both the rudist-free tropics and extratropical localities.
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have attracted increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i).
(iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA on simulated and real cases e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
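Step (i) above, forming a complex series whose imaginary part is the Hilbert transform of the observations, can be sketched with the standard FFT-based construction (a generic illustration, not the CICA implementation itself):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based Hilbert transform: returns the complex (analytic) series
    whose real part is the input and whose imaginary part is its
    quadrature component."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0  # double positive frequencies, zero negative ones
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

# a pure oscillation should yield a (nearly) constant unit envelope
t = np.linspace(0.0, 1.0, 256, endpoint=False)
x = np.cos(2.0 * np.pi * 8.0 * t)
z = analytic_signal(x)
envelope = np.abs(z)
```

The magnitude and angle of the resulting complex modes are what carry the amplitude and phase propagation mentioned in (iii).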
Amino acid pair- and triplet-wise groupings in the interior of α-helical segments in proteins.
de Sousa, Miguel M; Munteanu, Cristian R; Pazos, Alejandro; Fonseca, Nuno A; Camacho, Rui; Magalhães, A L
2011-02-21
A statistical approach has been applied to analyse primary structure patterns at inner positions of α-helices in proteins. A systematic survey was carried out on a recent sample of non-redundant proteins selected from the Protein Data Bank, whose α-helix structures were analysed for amino acid pairing patterns. Only residues more than three positions apart from both termini of the α-helix were considered inner. Amino acid pairings (i, i+k), k = 1, 2, 3, 4, 5, were analysed and the corresponding 20×20 matrices of relative global propensities were constructed. An analysis of (i, i+4, i+8) and (i, i+3, i+4) triplet patterns was also performed. These analyses yielded information on a series of amino acid patterns (pairings and triplets) showing either high or low preference for α-helical motifs and suggested a novel approach to protein alphabet reduction. In addition, it has been shown that the individual amino acid propensities are not enough to define the statistical distribution of these patterns. Global pair propensities also depend on the type of pattern, its composition and its orientation in the protein sequence. The data presented should prove useful for obtaining and refining predictive rules which can further the development and fine-tuning of protein structure prediction algorithms and tools. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Wenjing; Xu, Liang
2017-07-01
Based on Center for Orbit Determination in Europe (CODE) global ionospheric map (GIM) data, a statistical analysis of local total electron content (TEC) anomalies before 121 shallow (D ≤ 100 km), strong (Mw ≥ 7.0) earthquakes has been made using the sliding-median differential calculation method combined with a new image-processing approach. The results show that significant local TEC anomalies could be observed 0-6 days before 80 earthquakes, about 66.1% of the total. Positive anomalies occur more often than negative ones. In 26 cases, both positive and negative anomalies are observed before the shock. The pre-earthquake TEC anomalies show local-time recurrence for 38 earthquakes, occurring around the same local time on different days. The local-time distribution of the pre-earthquake TEC anomalies is mainly concentrated between 19 and 06 LT, roughly from sunset to sunrise. Most of the pre-earthquake TEC anomalies are not located directly above the epicenter but are shifted to the south. The pre-earthquake TEC anomalies could be extracted near the magnetic conjugate point of the epicenter for 40 events, i.e., 50% of the 80 cases with significant local TEC anomalies. In general, the signs of the anomalies around the epicenter and its conjugate point are the same, but the anomaly magnitudes and durations are not.
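A sliding-median detector of the general kind described above flags days whose TEC departs from the median of a trailing window by more than a chosen multiple of the interquartile range. A minimal sketch (window length, threshold factor and the synthetic series are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    """Flag values deviating from the sliding median of the preceding
    `window` days by more than k times the interquartile range."""
    flags = np.zeros(len(tec), dtype=bool)
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        median = np.median(past)
        iqr = np.percentile(past, 75) - np.percentile(past, 25)
        flags[i] = abs(tec[i] - median) > k * iqr
    return flags

# quiet synthetic background with one injected positive enhancement
rng = np.random.default_rng(1)
series = 20.0 + rng.normal(0.0, 0.5, 60)
series[45] += 6.0  # hypothetical pre-earthquake anomaly
flags = tec_anomalies(series)
```

A quantile-based envelope is preferred over mean ± standard deviation here because TEC backgrounds are skewed by geomagnetic activity.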
On Learning Cluster Coefficient of Private Networks
Wang, Yue; Wu, Xintao; Zhu, Jun; Xiang, Yang
2013-01-01
Enabling accurate analysis of social network data while preserving differential privacy has been challenging, since graph features such as the clustering coefficient or modularity often have high sensitivity, unlike traditional aggregate functions (e.g., count and sum) on tabular data. In this paper, we treat a graph statistic as a function f and develop a divide-and-conquer approach to enforce differential privacy. The basic procedure of this approach is to first decompose the target computation f into several less complex unit computations f1, …, fm connected by basic mathematical operations (e.g., addition, subtraction, multiplication, division), then perturb the output of each fi with Laplace noise derived from its own sensitivity value and the distributed privacy threshold εi, and finally combine those perturbed fi as the perturbed output of computation f. We examine how various operations affect the accuracy of complex computations. When unit computations have large global sensitivity values, we enforce differential privacy by calibrating noise based on the smooth sensitivity, rather than the global sensitivity. By doing this, we achieve the strict differential privacy guarantee with smaller-magnitude noise. We illustrate our approach using the clustering coefficient, which is a popular statistic in social network analysis. Empirical evaluations on five real social networks and various synthetic graphs generated from three random graph models show that the developed divide-and-conquer approach outperforms the direct approach. PMID:24429843
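The per-unit perturbation step above is the standard Laplace mechanism: each fi is released with noise of scale sensitivity/εi. A minimal sketch (the "count with sensitivity 3" is a hypothetical unit computation, not one of the paper's decompositions):

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release `value` perturbed with Laplace noise of scale
    sensitivity / epsilon, the standard calibration for
    epsilon-differential privacy."""
    return value + rng.laplace(0.0, sensitivity / epsilon)

# hypothetical unit computation: a count with global sensitivity 3
rng = np.random.default_rng(0)
true_count = 120.0
released = [laplace_mechanism(true_count, 3.0, 1.0, rng) for _ in range(2000)]
avg = sum(released) / len(released)
```

The noise is unbiased, so repeated releases average back toward the true value; the paper's contribution is splitting the privacy budget εi across unit computations and, for high-sensitivity units, replacing global sensitivity with smooth sensitivity to shrink the noise scale.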
The Spatial Scaling of Global Rainfall Extremes
NASA Astrophysics Data System (ADS)
Devineni, N.; Xi, C.; Lall, U.; Rahill-Marier, B.
2013-12-01
Floods associated with severe storms are a significant source of risk for property, life and supply chains. These property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. High-duration floods are typically induced by persistent rainfall (of up to 30 days' duration), as seen recently in Thailand, Pakistan, the Ohio and Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global historical data fields, and also from climate models that project future conditions. A clear understanding of the space-time rainfall patterns for events or for a season will help in assessing the spatial distribution of areas likely to have high or low inundation potential for each type of rainfall forcing. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances. We also investigate the connection of persistent rainfall events at different latitudinal bands to large-scale climate phenomena such as ENSO. Finally, we present the scaling phenomena of contiguous flooded areas resulting from the large-scale organization of long-duration rainfall events. This can be used for spatially distributed flood risk assessment conditional on a particular rainfall scenario. Statistical models for spatio-temporal loss simulation, including model uncertainty, can be developed to support regional and portfolio analysis.
[Alterations of brain network efficiency in patients with post-concussion syndrome].
Peng, Nan; Qian, Ruobing; Fu, Xianming; Li, Shunli; Kang, Zhiqiang; Lin, Bin; Ji, Xuebing; Wei, Xiangpin; Niu, Chaoshi; Wang, Yehan
2015-07-07
To investigate the alterations of brain network efficiency in patients with post-concussion syndrome. A total of 23 patients from Anhui Provincial Hospital who had sustained a concussion 3 months earlier were enrolled between June 2013 and March 2014, together with 23 volunteers matched for sex, age and education as healthy controls. Selective attention in both groups was compared using the Stroop Word-Color Test. Resting-state functional magnetic resonance imaging (fMRI) data were collected from both groups and processed with the Network Construction module of the GRETNA software to obtain brain-network matrices. Network analysis was used to obtain global and nodal efficiency, and independent t-tests were used for statistical analysis of the global and nodal efficiency values. The difference in global efficiency between the two groups was not statistically significant at any threshold value. Compared with healthy controls, nodal efficiencies in patients with post-concussion syndrome differed significantly in the following brain regions: left orbital middle frontal gyrus, left posterior cingulate, left lingual gyrus, left thalamus, left superior temporal gyrus, right anterior cingulate, right posterior cingulate, and right supramarginal gyrus. Compared with healthy controls, there are no significant changes in global efficiency in patients with post-concussion syndrome, and the brain function deficits in these patients may be caused by changes in the nodal efficiency of their brain networks.
Drought Persistence Errors in Global Climate Models
NASA Astrophysics Data System (ADS)
Moon, H.; Gudmundsson, L.; Seneviratne, S. I.
2018-04-01
The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at the global scale. Subsequently, we analyzed to what degree (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis-of-variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
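The persistence metric defined above, the probability of a dry month following a dry month, can be estimated directly by counting transitions in an anomaly series. A minimal sketch on a toy series (illustrative values, not the study's data):

```python
def dry_to_dry_probability(anomalies):
    """Drought-persistence estimate P(dry at t+1 | dry at t), with 'dry'
    defined as a negative precipitation anomaly."""
    dry = [a < 0 for a in anomalies]
    followers = [b for a, b in zip(dry[:-1], dry[1:]) if a]
    return sum(followers) / len(followers) if followers else float("nan")

# toy monthly anomaly series containing one persistent dry spell
series = [0.3, -0.1, -0.4, -0.2, 0.5, 0.1, -0.3, 0.2, -0.2, -0.1]
p = dry_to_dry_probability(series)
```

The abstract's point about the annual scale follows directly from this estimator: with few annual time steps, the count of dry-to-dry transitions is small, so the sampling error of p dominates the error budget.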
Long-term variability of global statistical properties of epileptic brain networks
NASA Astrophysics Data System (ADS)
Kuhnert, Marie-Therese; Elger, Christian E.; Lehnertz, Klaus
2010-12-01
We investigate the influence of various pathophysiologic and physiologic processes on global statistical properties of epileptic brain networks. We construct binary functional networks from long-term, multichannel electroencephalographic data recorded from 13 epilepsy patients, and the average shortest path length and the clustering coefficient serve as global statistical network characteristics. For time-resolved estimates of these characteristics we observe large fluctuations over time, however, with some periodic temporal structure. These fluctuations can—to a large extent—be attributed to daily rhythms while relevant aspects of the epileptic process contribute only marginally. Particularly, we could not observe clear cut changes in network states that can be regarded as predictive of an impending seizure. Our findings are of particular relevance for studies aiming at an improved understanding of the epileptic process with graph-theoretical approaches.
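The two global network characteristics used above, average shortest path length and clustering coefficient, can be computed from an adjacency structure with elementary graph routines. A minimal sketch on a tiny undirected graph (the graph itself is a hypothetical example, not an EEG-derived network):

```python
from collections import deque
from itertools import combinations

def average_clustering(adj):
    """Mean local clustering coefficient of an undirected graph
    given as {node: set(neighbours)}."""
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def average_shortest_path(adj):
    """Mean shortest-path length over all node pairs (assumes a connected
    graph); breadth-first search from every node."""
    total, pairs = 0, 0
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for node, d in dist.items() if node != source)
        pairs += len(dist) - 1
    return total / pairs

# a triangle with one pendant node
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
c = average_clustering(adj)
l = average_shortest_path(adj)
```

In the study's setting these two numbers would be recomputed for each time window of the EEG-derived binary network, giving the time-resolved estimates whose fluctuations are analyzed.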
Appraising the Corporate Sustainability Reports - Text Mining and Multi-Discriminatory Analysis
NASA Astrophysics Data System (ADS)
Modapothala, J. R.; Issac, B.; Jayamani, E.
The voluntary disclosure of sustainability reports by companies attracts wider stakeholder groups. Diversity in these reports poses a challenge to the users of the information and to regulators. This study appraises corporate sustainability reports against the GRI (Global Reporting Initiative) guidelines (the most widely accepted and used) across all industrial sectors. Text mining is adopted to carry out the initial analysis with a large sample of 2650 reports. Statistical analyses were performed for further investigation. The results indicate that the disclosures made by companies differ across industrial sectors. Multivariate Discriminant Analysis (MDA) shows that the environmental variable is a significantly greater contributing factor in explaining the sustainability reports.
Protein mass spectra data analysis for clinical biomarker discovery: a global review.
Roy, Pascal; Truntzer, Caroline; Maucort-Boulch, Delphine; Jouve, Thomas; Molinari, Nicolas
2011-03-01
The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years there has been growing interest in using high-throughput technologies for the detection of such biomarkers. In particular, mass spectrometry appears to be an exciting tool with great potential. However, to extract any benefit from the massive potential of clinical proteomic studies, appropriate methods, improvements and validation are required. To better explain the key statistical points involved in such studies, this review presents the main steps of protein mass spectra data analysis, from the pre-processing of the data to the identification and validation of biomarkers.
NASA Astrophysics Data System (ADS)
Lu, Q.-B.
2013-07-01
This study is focused on the effects of cosmic rays (solar activity) and halogen-containing molecules (mainly chlorofluorocarbons — CFCs) on atmospheric ozone depletion and global climate change. Brief reviews are first given on the cosmic-ray-driven electron-induced-reaction (CRE) theory for O3 depletion and the warming theory of halogenated molecules for climate change. Then natural and anthropogenic contributions to these phenomena are examined in detail and separated well through in-depth statistical analyses of comprehensive measured datasets of quantities, including cosmic rays (CRs), total solar irradiance, sunspot number, halogenated gases (CFCs, CCl4 and HCFCs), CO2, total O3, lower stratospheric temperatures and global surface temperatures. For O3 depletion, it is shown that an analytical equation derived from the CRE theory reproduces well 11-year cyclic variations of both polar O3 loss and stratospheric cooling, and new statistical analyses of the CRE equation with observed data of total O3 and stratospheric temperature give high linear correlation coefficients ≥ 0.92. After the removal of the CR effect, a pronounced recovery by 20 25 % of the Antarctic O3 hole is found, while no recovery of O3 loss in mid-latitudes has been observed. These results show both the correctness and dominance of the CRE mechanism and the success of the Montreal Protocol. For global climate change, in-depth analyses of the observed data clearly show that the solar effect and human-made halogenated gases played the dominant role in Earth's climate change prior to and after 1970, respectively. Remarkably, a statistical analysis gives a nearly zero correlation coefficient (R = -0.05) between corrected global surface temperature data by removing the solar effect and CO2 concentration during 1850-1970. 
In striking contrast, a nearly perfect linear correlation with coefficients as high as 0.96-0.97 is found between corrected or uncorrected global surface temperature and the total amount of stratospheric halogenated gases during 1970-2012. Furthermore, a new theoretical calculation of the greenhouse effect of halogenated gases shows that they (mainly CFCs) could alone have resulted in a global surface temperature rise of 0.6°C in 1970-2002. These results provide solid evidence that recent global warming was indeed caused by the greenhouse effect of anthropogenic halogenated gases. Thus, a slow reversal of global temperature to the 1950 value is predicted for the coming 5-7 decades. It is also expected that the global sea level will continue to rise in the coming 1-2 decades until the effect of the global temperature recovery dominates over that of the polar O3 hole recovery; after that, both will drop concurrently. All the observed, analytical and theoretical results presented lead to a convincing conclusion that both the CRE mechanism and the CFC-warming mechanism not only provide new fundamental understandings of the O3 hole and global climate change but also have superior predictive capabilities, compared with the conventional models.
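The correlation analysis described in this abstract — removing a regressed solar component from a temperature record and then computing a linear correlation with a candidate driver — can be sketched as follows. This is a minimal illustration on synthetic series, with hypothetical function names; it is not the study's code.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson linear correlation coefficient between two series."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def remove_component(y, driver):
    """Subtract the least-squares fit of `driver` (e.g. a solar proxy)
    from `y`, returning the 'corrected' residual series."""
    y = np.asarray(y, float)
    driver = np.asarray(driver, float)
    A = np.column_stack([np.ones_like(driver), driver])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return y - A @ coef
```

On a synthetic temperature series built from a trend plus an oscillating "solar" term, the corrected series correlates with the trend driver far more strongly than the raw series does.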
A global approach to estimate irrigated areas - a comparison between different data and statistics
NASA Astrophysics Data System (ADS)
Meier, Jonas; Zabel, Florian; Mauser, Wolfram
2018-02-01
Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security. Spatial information on irrigation is highly important for policy and decision makers, who are facing the transition towards more efficient, sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data - both at a spatial resolution of 30 arcsec - incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia, particularly in China and India. The additional areas are mainly identified within already known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to the size and distribution of irrigated areas, caused by spatial resolution, the considered time period, and the input data and assumptions made.
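The incremental decision-tree idea — flagging additional irrigated cells where vegetation is vigorous although rainfed suitability is low — can be caricatured with a single rule. The thresholds and inputs below are hypothetical, not those of the study:

```python
def classify_irrigated(ndvi_series, rainfed_suitability,
                       ndvi_peak_threshold=0.6, suitability_threshold=0.3):
    """Flag a grid cell as irrigated when its multi-temporal NDVI peak
    is high even though the cell is poorly suited to rainfed
    agriculture. Thresholds are illustrative placeholders."""
    return max(ndvi_series) >= ndvi_peak_threshold and \
           rainfed_suitability <= suitability_threshold
```

A real classifier would chain many such tests over the 1999-2012 time series; this sketch only shows the core contrast between vegetation signal and climatic suitability.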
NASA Astrophysics Data System (ADS)
Rolinski, S.; Müller, C.; Lotze-Campen, H.; Bondeau, A.
2010-12-01
More than a quarter of the Earth's land surface is covered by grassland, which also constitutes the major part (~ 70 %) of the agricultural area. Most of this area is used for livestock production at different degrees of intensity. The dynamic global vegetation model LPJmL (Sitch et al., Global Change Biology, 2003; Bondeau et al., Global Change Biology, 2007) is one of the few process-based models that simulate biomass production on managed grasslands at the global scale. The implementation of managed grasslands and its evaluation have received little attention so far, as reference data on grassland productivity are scarce and the definition of grassland extent and usage is highly uncertain. However, grassland productivity concerns large areas and strongly influences global estimates of carbon and water budgets, and should thus be improved. Plants are implemented in LPJmL in an aggregated form as plant functional types, assuming that processes concerning carbon and water fluxes are quite similar between species of the same type. Therefore, the parameterization of a functional type is possible with parameters in a physiologically meaningful range of values. The actual choice of the parameter values from the possible and reasonable phase space should satisfy the condition of the best fit between model results and measured data. In order to improve the parameterization of managed grass we follow a combined procedure using model output and measured data of carbon and water fluxes. By comparing carbon and water fluxes simultaneously, we expect well-balanced refinements and avoid over-tuning of the model in only one direction. The comparison of annual biomass from grassland with data from the Food and Agriculture Organization of the United Nations (FAO) per country provides an overview of the order of magnitude and allows the identification of deviations.
The comparison of daily net primary productivity, soil respiration and water fluxes at specific sites (FluxNet data) provides information on boundary conditions such as water and light availability or temperature sensitivity. Based on the given limitation factors, a number of sensitive parameters are chosen, e.g. for the phenological development, biomass allocation, and different management regimes. These are subjected to a sensitivity analysis and Bayesian parameter evaluation using the R package FME (Soetaert & Petzoldt, Journal of Statistical Software, 2010). Given the extremely different climatic conditions at the FluxNet grass sites, the premises for the global sensitivity analysis are very promising.
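A first step of such a parameter study, local finite-difference sensitivity of a model output to each parameter, can be sketched in Python. The grassland model here is a toy stand-in, not LPJmL, and the parameter names are invented:

```python
import math

def grass_npp(params, water, light):
    """Toy stand-in for a managed-grassland productivity model (not
    LPJmL): NPP limited multiplicatively by water and light."""
    alpha, beta = params
    return alpha * water * (1.0 - math.exp(-beta * light))

def local_sensitivity(model, params, args, eps=1e-6):
    """Finite-difference sensitivity of the model output to each
    parameter - the kind of screening a package like FME performs
    before a full Bayesian parameter evaluation."""
    base = model(params, *args)
    sens = []
    for i in range(len(params)):
        p = list(params)
        p[i] += eps
        sens.append((model(p, *args) - base) / eps)
    return sens
```

For the toy model the sensitivities have closed forms (d/dalpha = water·(1-exp(-beta·light)), d/dbeta = alpha·water·light·exp(-beta·light)), which the finite differences reproduce.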
ERIC Educational Resources Information Center
Van Gundy, Karen; Morton, Beth A.; Liu, Hope Q.; Kline, Jennifer
2006-01-01
To explore the effects of web-based instruction (WBI) on math anxiety, the sense of mastery, and global self-esteem, we use quasi-experimental data from undergraduate statistics students in classes assigned to three study conditions, each with varied access to, and incentive for, the use of online technologies. Results suggest that when statistics…
Yu, Yanbao; Leng, Taohua; Yun, Dong; Liu, Na; Yao, Jun; Dai, Ying; Yang, Pengyuan; Chen, Xian
2013-01-01
Emerging evidence indicates that blood platelets function in multiple biological processes, including immune response, bone metastasis and liver regeneration, in addition to their known roles in hemostasis and thrombosis. Global elucidation of the platelet proteome will provide the molecular basis of these platelet functions. Here, we set up a high-throughput platform for maximum exploration of the rat/human platelet proteome using integrated proteomics technologies, and then applied it to identify the largest number of proteins expressed in both rat and human platelets. After stringent statistical filtration, a total of 837 unique proteins matched with at least two unique peptides were precisely identified, making this the first comprehensive protein database so far for rat platelets. Meanwhile, quantitative analyses of the thrombin-stimulated platelets offered great insights into the biological functions of platelet proteins and thereby confirmed our global profiling data. A comparative proteomic analysis between rat and human platelets was also conducted, which revealed not only a significant similarity, but also an across-species evolutionary link in which the orthologous proteins represent a 'core proteome', and this 'evolutionary proteome' is actually a relatively static proteome. PMID:20443191
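The "at least two unique peptides" filter mentioned above is a simple set operation. A hedged sketch, with an assumed mapping from protein IDs to identified peptide sequences:

```python
def filter_two_peptide(protein_hits):
    """Keep only proteins identified by at least two unique peptides,
    the stringent filter used to build a high-confidence protein list.
    `protein_hits` maps protein ID -> iterable of peptide sequences
    (the record format here is an assumption for illustration)."""
    return {pid for pid, peptides in protein_hits.items()
            if len(set(peptides)) >= 2}
```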
Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models
Blanc, Élodie
2017-01-26
This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate spatial patterns of yields, crop levels, and changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.
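The core emulation step — fitting a per-crop, per-model statistical relationship between yields and weather — can be sketched as an ordinary least-squares quadratic response surface. This is an illustrative stand-in for the study's richer functional form (which also treats extreme events and soil type):

```python
import numpy as np

def design(temp, precip, co2):
    """Quadratic-in-weather design matrix:
    1, T, T^2, P, P^2, CO2 (a simplified functional form)."""
    return np.column_stack([np.ones_like(temp), temp, temp**2,
                            precip, precip**2, co2])

def fit_emulator(temp, precip, co2, yields):
    """Least-squares fit of yield ~ f(T, P, CO2) for one crop and
    one gridded crop model."""
    coef, *_ = np.linalg.lstsq(design(temp, precip, co2), yields,
                               rcond=None)
    return coef

def predict(coef, temp, precip, co2):
    """Emulated yields for new weather inputs."""
    return design(temp, precip, co2) @ coef
```

Once fitted, the emulator replaces expensive crop-model runs with a single matrix multiplication, which is what makes large uncertainty ensembles computationally cheap.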
Cross ranking of cities and regions: population versus income
NASA Astrophysics Data System (ADS)
Cerqueti, Roy; Ausloos, Marcel
2015-07-01
This paper explores the relationship between the inner economic structure of communities and their population distribution through a rank-rank analysis of official data, along statistical physics ideas, within two techniques. The data are taken from Italian cities. The analysis is performed both at a global (national) and at a more local (regional) level in order to distinguish 'macro' and 'micro' aspects. First, the rank-size rule is found not to be a standard power law, as in many other studies, but a doubly decreasing power law. Next, the Kendall τ and the Spearman ρ rank correlation coefficients, which measure pair concordance and the correlation between fluctuations in two rankings, respectively (as a correlation function does in thermodynamics), are calculated to find any rank correlation between demography and wealth. Results show not only global disparities for the whole (country) set, but also (regional) disparities, when comparing the number of cities in regions, the number of inhabitants in cities and in regions, as well as when comparing the aggregated tax income of cities and of regions. Different outliers are pointed out and justified. Interestingly, two classes of cities and two classes of regions are found in the country. 'Common sense' social, political, and economic considerations sustain the findings. More importantly, the methods show that they allow communities to be distinguished very clearly when specific criteria are numerically sound. A specific model for the findings is presented, i.e. for the doubly decreasing power law and the two-phase system, based on statistical theory, e.g. urn filling. The model ideas can be expected to hold when similar rank relationship features are observed in other fields. It is emphasized that this analysis makes more sense than one through a Pearson Π value-value correlation analysis.
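The two rank-correlation coefficients used in the paper have compact closed forms for tie-free rankings. A sketch using the textbook estimators (inputs are assumed to already be ranks; this is not the authors' code):

```python
from itertools import combinations

def kendall_tau(a, b):
    """Kendall tau: (concordant - discordant) pairs over total pairs,
    the no-ties version measuring pair concordance."""
    n = len(a)
    conc = disc = 0
    for i, j in combinations(range(n), 2):
        s = (a[i] - a[j]) * (b[i] - b[j])
        if s > 0:
            conc += 1
        elif s < 0:
            disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

def spearman_rho(a, b):
    """Spearman rho via the classic rank-difference formula (no ties),
    measuring the correlation between fluctuations in two rankings."""
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return 1.0 - 6.0 * d2 / (n * (n ** 2 - 1))
```

Applied to a population ranking and an income ranking of the same communities, values near +1 indicate concordant rankings and values near -1 indicate reversed ones.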
Bond strength of universal adhesives: A systematic review and meta-analysis.
Rosa, Wellington Luiz de Oliveira da; Piva, Evandro; Silva, Adriana Fernandes da
2015-07-01
A systematic review was conducted to determine whether the etch-and-rinse or self-etching mode is the best protocol for dentin and enamel adhesion by universal adhesives. This report followed the PRISMA Statement. A total of 10 articles were included in the meta-analysis. Two reviewers performed a literature search up to October 2014 in eight databases: PubMed, Web of Science, Scopus, BBO, SciELO, LILACS, IBECS and The Cochrane Library. In vitro studies evaluating the bond strength of universal adhesives to dentin and/or enamel by the etch-and-rinse and self-etch strategies were eligible for selection. Statistical analyses were conducted using RevMan 5.1 (The Cochrane Collaboration, Copenhagen, Denmark). A global comparison was performed with random-effects models at a significance level of p<0.05. The analysis of dentin micro-tensile bond strength showed no statistically significant difference between the etch-and-rinse and self-etch strategies for mild universal adhesives (p≥0.05). However, for the ultra-mild All-Bond Universal adhesive, the etch-and-rinse strategy differed significantly from the self-etch mode in terms of dentin micro-tensile bond strength, as well as in the global analysis of enamel micro-tensile and micro-shear bond strength (p≤0.05). The enamel bond strength of universal adhesives is improved with prior phosphoric acid etching. However, this effect was not evident for dentin with the use of mild universal adhesives with the etch-and-rinse strategy. Selective enamel etching prior to the application of a mild universal adhesive is an advisable strategy for optimizing bonding. Copyright © 2015 Elsevier Ltd. All rights reserved.
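The random-effects pooling behind such a meta-analysis is commonly the DerSimonian-Laird estimator (assumed here to match RevMan's random-effects model; stated as an illustration, not as the authors' exact computation):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.
    effects: per-study effect sizes; variances: their within-study
    variances. Returns (pooled effect, standard error, tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2
```

With homogeneous studies tau^2 collapses to zero and the estimate reduces to the fixed-effect (inverse-variance) pooled mean.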
Sources of Error and the Statistical Formulation of MS:mb Seismic Event Screening Analysis
NASA Astrophysics Data System (ADS)
Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.
2014-03-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted MS) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger MS magnitude than explosions. This article proposes a hypothesis test (screening analysis) using MS and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 announced nuclear weapon test of the Democratic People's Republic of Korea fails to reject the null hypothesis H0: explosion characteristics.
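The proposed test's key feature — inflating the standard error of an MS-versus-mb statistic with a model-inadequacy term — can be sketched as follows. The simplified difference form, the sigma values, and the critical value are illustrative placeholders, not IDC parameters:

```python
import math

def screening_statistic(ms, mb, sigma_model, sigma_ms=0.1, sigma_mb=0.1):
    """Studentized MS - mb statistic whose standard error includes a
    physical-correction-model inadequacy term (sigma_model), in the
    spirit of the article. All numeric values are hypothetical."""
    se = math.sqrt(sigma_ms**2 + sigma_mb**2 + sigma_model**2)
    return (ms - mb) / se

def screens_out(ms, mb, sigma_model, threshold=1.645):
    """Event 'screens out' (rejects H0: explosion characteristics)
    when the statistic exceeds a one-sided critical value."""
    return screening_statistic(ms, mb, sigma_model) > threshold
```

The point of the formulation is visible in the statistic itself: a larger model-inadequacy sigma shrinks the standardized statistic, making it harder to reject H0 on marginal evidence.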
LORETA imaging of P300 in schizophrenia with individual MRI and 128-channel EEG.
Pae, Ji Soo; Kwon, Jun Soo; Youn, Tak; Park, Hae-Jeong; Kim, Myung Sun; Lee, Boreom; Park, Kwang Suk
2003-11-01
We investigated the characteristics of P300 generators in schizophrenia by using voxel-based statistical parametric mapping of current density images. P300 generators, produced by a rare target tone of 1500 Hz (15%) against a frequent nontarget tone of 1000 Hz (85%), were measured in 20 right-handed schizophrenic patients and 21 controls. Low-resolution electromagnetic tomography (LORETA), using a realistic head model of the boundary element method based on individual MRI, was applied to the 128-channel EEG. Three-dimensional current density images were reconstructed from the LORETA intensity maps covering the whole cortical gray matter. Spatial normalization and intensity normalization of the smoothed current density images were used to reduce anatomical variance and subject-specific global activity, and statistical parametric mapping (SPM) was applied for the statistical analysis. We found that the sources of P300 were consistently localized in the left superior parietal area in normal subjects, while those of schizophrenic patients were diversely distributed. Upon statistical comparison, schizophrenic patients, with globally reduced current densities, showed a significant P300 current density reduction in the left medial temporal area and in the left inferior parietal area, while both the left prefrontal and right orbitofrontal areas were relatively activated. The left parietotemporal area was found to correlate negatively with Positive and Negative Syndrome Scale total scores of the schizophrenic patients. In conclusion, the reduced and increased areas of current density in schizophrenic patients suggest that the medial temporal and frontal areas contribute to the pathophysiology of schizophrenia, particularly through an abnormality of frontotemporal circuitry.
Global effects of local food-production crises: a virtual water perspective
Tamea, Stefania; Laio, Francesco; Ridolfi, Luca
2016-01-01
By importing food and agricultural goods, countries cope with the heterogeneous global water distribution and often rely on water resources available abroad. The virtual displacement of the water used to produce such goods (known as virtual water) connects together, in a global water system, all countries participating in the international trade network. Local food-production crises, having social, economic or environmental origin, propagate in this network, modifying the virtual water trade and perturbing local and global food availability, quantified in terms of virtual water. We analyze here the possible effects of local crises by developing a new propagation model, parsimonious but grounded on data-based and statistically-verified assumptions, whose effectiveness is proved on the Argentinean crisis in 2008–09. The model serves as the basis to propose indicators of crisis impact and country vulnerability to external food-production crises, which highlight that countries with the largest water resources have the highest impact on the international trade, and that not only water-scarce but also wealthy and globalized countries are among the most vulnerable to external crises. The temporal analysis reveals that global average vulnerability has increased over time and that stronger effects of crises are now found in countries with low food (and water) availability. PMID:26804492
Hormonal therapy is associated with better self-esteem, mood, and quality of life in transsexuals.
Gorin-Lazard, Audrey; Baumstarck, Karine; Boyer, Laurent; Maquigneau, Aurélie; Penochet, Jean-Claude; Pringuey, Dominique; Albarel, Frédérique; Morange, Isabelle; Bonierbale, Mireille; Lançon, Christophe; Auquier, Pascal
2013-11-01
Few studies have assessed the role of cross-sex hormones on psychological outcomes during the period of hormonal therapy preceding sex reassignment surgery in transsexuals. The objective of this study was to assess the relationship between hormonal therapy, self-esteem, depression, quality of life (QoL), and global functioning. This study incorporated a cross-sectional design. The inclusion criteria were diagnosis of gender identity disorder (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) and inclusion in a standardized sex reassignment procedure. The outcome measures were self-esteem (Social Self-Esteem Inventory), mood (Beck Depression Inventory), QoL (Subjective Quality of Life Analysis), and global functioning (Global Assessment of Functioning). Sixty-seven consecutive individuals agreed to participate. Seventy-three percent received hormonal therapy. Hormonal therapy was an independent factor in greater self-esteem, less severe depression symptoms, and greater "psychological-like" dimensions of QoL. These findings should provide pertinent information for health care providers who consider this period as a crucial part of the global sex reassignment procedure.
NASA Astrophysics Data System (ADS)
Wakazuki, Yasutaka; Hara, Masayuki; Fujita, Mikiko; Ma, Xieyao; Kimura, Fujio
2013-04-01
Regional-scale climate change projections play an important role in assessing the influences of global warming and include statistical downscaling (SD) and dynamical downscaling (DD) approaches. In this study, a DD method is developed based on the pseudo-global-warming (PGW) method of Kimura and Kitoh (2007). In general, DD drives a regional climate model (RCM) with lateral boundary data. In the PGW method, the climatological mean differences estimated by GCMs are added to the objective analysis data (ANAL), and these data are used as the lateral boundary conditions for future climate simulations; ANAL itself is used as the lateral boundary conditions for the present climate simulation. One merit of the PGW method is that the influence of GCM biases on RCM simulations is reduced. However, the PGW method does not treat climate changes in relative humidity, year-to-year variation, or short-term disturbances. The newly developed downscaling method is named the incremental dynamical downscaling and analysis system (InDDAS). InDDAS treats climate changes in relative humidity and year-to-year variations. On the other hand, the uncertainties of climate change projections estimated by many GCMs are large and not negligible, so stochastic regional-scale climate change projections are expected for assessments of the influences of global warming. Many RCM runs must be performed to produce such stochastic information, but the computational cost is huge because the grid size of RCM runs should be small enough to resolve heavy rainfall phenomena; the number of runs needed to produce stochastic information must therefore be reduced. In InDDAS, the climatological differences added to ANAL become statistically pre-analyzed information: the climatological differences of many GCMs are divided into the mean climatological difference (MD) and departures from MD.
The departures are analyzed by principal component analysis, and positive and negative perturbations (the positive and negative standard deviations multiplied by the departure patterns, i.e. eigenvectors) for multiple modes are added to MD. Consequently, the most likely future state is calculated with the climatological difference MD, while, for example, future states in which the temperature increase is large or small are calculated with MD plus the positive or negative perturbation of the first mode.
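The MD-plus-perturbation construction can be sketched with an SVD-based principal component analysis of the inter-GCM departures. This is a minimal illustration, not the InDDAS implementation:

```python
import numpy as np

def perturbation_scenarios(gcm_diffs, mode=0):
    """Given per-GCM climatological difference fields (rows = GCMs,
    columns = grid points), return the ensemble mean difference (MD)
    and MD plus/minus one standard deviation along a principal
    component of the departures."""
    X = np.asarray(gcm_diffs, float)
    md = X.mean(axis=0)                      # mean climatological difference
    dep = X - md                             # departures from MD
    # principal components via SVD of the departure matrix
    _, s, vt = np.linalg.svd(dep, full_matrices=False)
    sd = s[mode] / np.sqrt(len(X))           # std dev along that mode
    return md, md + sd * vt[mode], md - sd * vt[mode]
```

Driving the RCM with MD alone gives the most likely scenario, while MD plus the positive or negative first-mode perturbation gives the large- and small-warming scenarios (note the eigenvector sign is arbitrary, so "plus" and "minus" may swap roles).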
The GEOS-5 Data Assimilation System - Documentation of Versions 5.0.1, 5.1.0, and 5.2.0
NASA Technical Reports Server (NTRS)
Suarez, Max J.; Rienecker, M. M.; Todling, R.; Bacmeister, J.; Takacs, L.; Liu, H. C.; Gu, W.; Sienkiewicz, M.; Koster, R. D.; Gelaro, R.;
2008-01-01
This report documents the GEOS-5 global atmospheric model and data assimilation system (DAS), including the versions 5.0.1, 5.1.0, and 5.2.0, which have been implemented in products distributed for use by various NASA instrument team algorithms and ultimately for the Modern-Era Retrospective analysis for Research and Applications (MERRA). The DAS is the integration of the GEOS-5 atmospheric model with the Gridpoint Statistical Interpolation (GSI) Analysis, a joint analysis system developed by the NOAA/National Centers for Environmental Prediction and the NASA/Global Modeling and Assimilation Office. The primary performance drivers for the GEOS DAS are temperature and moisture fields suitable for the EOS instrument teams, wind fields for the transport studies of the stratospheric and tropospheric chemistry communities, and climate-quality analyses to support studies of the hydrological cycle through MERRA. The GEOS-5 atmospheric model has been approved for open source release and is available from: http://opensource.gsfc.nasa.gov/projects/GEOS-5/GEOS-5.php.
Monitoring the Earth System Grid Federation through the ESGF Dashboard
NASA Astrophysics Data System (ADS)
Fiore, S.; Bell, G. M.; Drach, B.; Williams, D.; Aloisio, G.
2012-12-01
The Climate Model Intercomparison Project, phase 5 (CMIP5) is a global effort coordinated by the World Climate Research Programme (WCRP) involving tens of modeling groups spanning 19 countries. It is expected that the CMIP5 distributed data archive will total upwards of 3.5 petabytes, stored across several ESGF Nodes on four continents (North America, Europe, Asia, and Australia). The Earth System Grid Federation (ESGF) provides the IT infrastructure to support CMIP5. In this regard, the monitoring of the distributed ESGF infrastructure is a crucial task, carried out by the ESGF Dashboard. The ESGF Dashboard is a software component of the ESGF stack, responsible for collecting key information about the status of the federation in terms of: 1) Network topology (peer-group composition), 2) Node type (host/services mapping), 3) Registered users (including their Identity Providers), 4) System metrics (e.g., round-trip time, service availability, CPU, memory, disk, processes, etc.), 5) Download metrics (both at the Node and federation level). The last class of information is very important since it provides strong insight into the CMIP5 experiment: the data usage statistics. In this regard, CMCC and LLNL have developed a data analytics management system for the analysis of both node-level and federation-level data usage statistics. It provides data usage statistics aggregated by project, model, experiment, variable, realm, peer node, time, ensemble, dataset name (including version), etc. The back-end of the system is able to infer the data usage information of the entire federation by carrying out: at the node level, an 18-step reconciliation process on the peer node databases (i.e., the node manager and publisher DBs), which provides a 15-dimension data warehouse with local statistics; and at the global level, an aggregation process that federates the data usage statistics into a 16-dimension data warehouse with federation-level data usage statistics.
The front-end of the Dashboard system exploits a web desktop approach, which joins the pervasivity of a web application with the flexibility of a desktop one.
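The federation-level roll-up of node statistics amounts to aggregating download records along chosen dimensions. A hedged sketch with an invented record format (the actual data warehouses have 15-16 dimensions):

```python
from collections import defaultdict

def aggregate_downloads(records, dims):
    """Roll node-level download records up along the requested
    dimensions (e.g. project, model, experiment), as a federation-level
    aggregation step would. Record keys here are illustrative."""
    totals = defaultdict(lambda: {"downloads": 0, "bytes": 0})
    for rec in records:
        key = tuple(rec[d] for d in dims)
        totals[key]["downloads"] += rec["downloads"]
        totals[key]["bytes"] += rec["bytes"]
    return dict(totals)
```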
Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review
NASA Astrophysics Data System (ADS)
Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca
2014-05-01
Landslide susceptibility assessment, the subject of this systematic review, is aimed at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 landslides that occur globally each year are fatal, with around 4600 people killed per year. Past studies have brought out the increasing cost of landslide damage, which can primarily be attributed to human occupation and increased human activities in vulnerable environments. To evaluate and reduce landslide risk, many scientists have made an effort to map landslide susceptibility efficiently using different statistical methods. In this paper, we carry out a critical and systematic landslide susceptibility literature review, in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study geographic region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, (ix) validation methods. We then pulled out broad trends from our review of landslide susceptibility, particularly regarding the statistical methods. We found that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural networks, discriminant analysis and weight of evidence. Although most of the studies we reviewed assessed model skill, very few assessed model uncertainty. In terms of geographic extent, the largest number of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, the Philippines, Nepal, Indonesia, Guatemala, and Pakistan, where far fewer landslide susceptibility studies are available in the peer-reviewed literature.
This raises some concern that existing studies do not always cover all the regions globally that currently experience landslides and landslide fatalities.
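Logistic regression, the most common method in the reviewed studies, models the probability of slope failure in a grid cell as a function of thematic variables. The following is a minimal, stdlib-only sketch of that idea; the two predictors (normalized slope and rainfall), the synthetic grid cells, and the generating coefficients are all invented for illustration, not taken from any reviewed study:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by batch gradient descent.
    X: list of feature rows, y: 0/1 labels (1 = landslide cell)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * n_feat
        gb = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi)))
            err = p - yi
            for j in range(n_feat):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

# Synthetic grid cells: (normalized slope, normalized annual rainfall)
random.seed(0)
X, y = [], []
for _ in range(200):
    slope, rain = random.random(), random.random()
    # Hypothetical rule: steep, wet cells fail more often
    p_fail = sigmoid(6 * slope + 4 * rain - 6)
    X.append([slope, rain])
    y.append(1 if random.random() < p_fail else 0)

w, b = fit_logistic(X, y)
# Susceptibility of a steep, wet cell
susceptibility = sigmoid(b + w[0] * 0.9 + w[1] * 0.8)
```

A susceptibility map is produced by evaluating the fitted model on every cell of the study area; in practice the reviewed studies use many more thematic variables (lithology, land cover, distance to roads, etc.).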
Smith, Ashlee L.; Sun, Mai; Bhargava, Rohit; Stewart, Nicolas A.; Flint, Melanie S.; Bigbee, William L.; Krivak, Thomas C.; Strange, Mary A.; Cooper, Kristine L.; Zorn, Kristin K.
2013-01-01
Objective: The biology of high grade serous ovarian carcinoma (HGSOC) is poorly understood. Little has been reported on intratumoral homogeneity or heterogeneity of primary HGSOC tumors and their metastases. We evaluated the global protein expression profiles of paired primary and metastatic HGSOC from formalin-fixed, paraffin-embedded (FFPE) tissue samples. Methods: After IRB approval, six patients with advanced HGSOC were identified with tumor in both ovaries at initial surgery. Laser capture microdissection (LCM) was used to extract tumor for protein digestion. Peptides were extracted and analyzed by reversed-phase liquid chromatography coupled to a linear ion trap mass spectrometer. Tandem mass spectra were searched against the UniProt human protein database. Differences in protein abundance between samples were assessed and analyzed by Ingenuity Pathway Analysis software. Immunohistochemistry (IHC) for select proteins from the original and an additional validation set of five patients was performed. Results: Unsupervised clustering of the abundance profiles placed the paired specimens adjacent to each other. IHC H-score analysis of the validation set revealed a strong correlation between paired samples for all proteins. For the similarly expressed proteins, the estimated correlation coefficients in two of three experimental samples and all validation samples were statistically significant (p < 0.05). The estimated correlation coefficients in the experimental sample proteins classified as differentially expressed were not statistically significant. Conclusion: A global proteomic screen of primary HGSOC tumors and their metastatic lesions identifies tumoral homogeneity and heterogeneity and provides preliminary insight into these protein profiles and the cellular pathways they constitute. PMID:28250404
NASA Astrophysics Data System (ADS)
Hopkins, J.; Balch, W. M.; Henson, S.; Poulton, A. J.; Drapeau, D.; Bowler, B.; Lubelczyk, L.
2016-02-01
Coccolithophores, the single celled phytoplankton that produce an outer covering of calcium carbonate coccoliths, are considered to be the greatest contributors to the global oceanic particulate inorganic carbon (PIC) pool. The reflective coccoliths scatter light back out from the ocean surface, enabling PIC concentration to be quantitatively estimated from ocean color satellites. Here we use datasets of AQUA MODIS PIC concentration from 2003-2014 (using the recently-revised PIC algorithm), as well as statistics on coccolithophore vertical distribution derived from cruises throughout the world ocean, to estimate the average global (surface and integrated) PIC standing stock and its associated inter-annual variability. In addition, we divide the global ocean into Longhurst biogeochemical provinces, update the PIC biomass statistics and identify those regions that have the greatest inter-annual variability and thus may exert the greatest influence on global PIC standing stock and the alkalinity pump.
NASA Astrophysics Data System (ADS)
van den Dool, G.
2017-11-01
This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model, in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS) and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, but are subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3 arc-seconds). The GIS is constructed around three themes: topography, fuel availability and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on the catchment area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, and biovolumes. As input for the climatological sub-model, reanalysed daily averaged weather-related data are used, accumulated to a global weekly time window (to account for the uncertainty within the climatological model); this forms the temporal component of the model. The final product is a weekly wildfire risk score (from 0 to 1) representing the average wildfire risk in an area. To compute the potential wildfire risk, the sub-models are combined using a Multi-Criteria Approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.
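The combination of fuzzy memberships into a bounded risk score can be sketched as follows. The membership breakpoints, the input scales and the sub-model weights below are invented for illustration; the actual model's membership functions and multi-criteria weighting are details of the cited study:

```python
def linear_membership(x, lo, hi):
    """Map a raw value to a fuzzy membership in [0, 1]."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def wildfire_risk(slope_deg, fuel_load, dryness_index, weights=(0.2, 0.4, 0.4)):
    """Hypothetical weighted multi-criteria combination of the three
    sub-model scores (topography, fuel availability, climate)."""
    topo = linear_membership(slope_deg, 0, 30)         # steeper spreads faster
    fuel = linear_membership(fuel_load, 0, 50)         # assumed t/ha scale
    clim = linear_membership(dryness_index, 0.2, 0.9)  # weekly-averaged index
    return weights[0] * topo + weights[1] * fuel + weights[2] * clim

risk = wildfire_risk(slope_deg=15, fuel_load=40, dryness_index=0.8)
```

Because each membership and each weight lies in [0, 1] and the weights sum to one, the combined score is guaranteed to stay in the 0-1 range reported for the final product.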
Zhang, Honghua; Xia, Mingying; Qi, Lijie; Dong, Lei; Song, Shuang; Ma, Teng; Yang, Shuping; Jin, Li; Li, Liming; Li, Shilin
2016-05-01
Estimating the allele frequencies and forensic statistical parameters of commonly used short tandem repeat (STR) loci of the Uyghur population, the fifth largest ethnic group in China, provides a more precise reference database for forensic investigation. The 6-dye GlobalFiler™ Express PCR Amplification kit incorporates 21 autosomal STRs, which have been shown to provide reliable DNA typing results and enhance the power of discrimination. Here we analyzed the GlobalFiler STR loci in 1962 unrelated individuals from the Chinese Uyghur population of Xinjiang, China. No significant deviations from Hardy-Weinberg equilibrium or linkage disequilibrium were detected within or between the GlobalFiler STR loci. SE33 showed the greatest power of discrimination in the Uyghur population, whereas TPOX showed the lowest. The combined power of discrimination was 99.999999999999999999999998746%. No significant differences were observed between this Uyghur population and two other Uyghur populations at any of the tested STRs, nor with the Dai and Mongolian populations. Significant differences were observed only between the Uyghur and other Chinese populations at TH01, the Central-South Asian population at D13S317, and the East Asian population at TH01 and VWA. The phylogenetic analysis showed that the Uyghur population is genetically close to the Chinese populations, as well as to the East Asian and Central-South Asian populations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
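The combined power of discrimination quoted above is a standard forensic quantity: each locus has a match probability (the sum of squared genotype frequencies, computed under Hardy-Weinberg equilibrium from the allele frequencies), and the combined power of discrimination is one minus the product of the per-locus match probabilities. A sketch with hypothetical allele frequencies (not the Uyghur data):

```python
from itertools import combinations

def match_probability(allele_freqs):
    """Locus match probability under Hardy-Weinberg equilibrium:
    sum of squared genotype frequencies."""
    geno = [p * p for p in allele_freqs]                           # homozygotes
    geno += [2 * p * q for p, q in combinations(allele_freqs, 2)]  # heterozygotes
    return sum(g * g for g in geno)

def combined_power_of_discrimination(loci):
    """1 minus the product of per-locus match probabilities."""
    pm = 1.0
    for freqs in loci:
        pm *= match_probability(freqs)
    return 1.0 - pm

# Hypothetical allele frequencies at three illustrative loci (each sums to 1)
loci = [
    [0.30, 0.25, 0.20, 0.15, 0.10],
    [0.40, 0.35, 0.25],
    [0.22, 0.20, 0.18, 0.15, 0.15, 0.10],
]
cpd = combined_power_of_discrimination(loci)
```

With 21 loci, each contributing a match probability well below one, the product becomes vanishingly small, which is why the reported combined value carries so many nines.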
NASA Astrophysics Data System (ADS)
Regmi, G. R.; Lama, R. P.; Puri, G.; Huettmann, F.
2016-12-01
Asia holds some of the last wilderness resources in the world. It is widely praised for those resources, which are in global appreciation and demand. With open borders, many of them virtually uncontrollable in Asia, and globalization in full swing, precious local resources become available to a global audience with little constraint. Nepal and its unique biodiversity present one such case, though hard data remain elusive. Here we present a first telecoupling analysis based on poaching and crime statistics reported in the public national daily print newspapers (Kantipur and Gorkhapatra) in Nepal. This review highlights a few high-profile species (timber: Sal Shorea robusta, Sissoo Dalbergia sissoo, Pine Pinus species; aromatic and medicinal plants: Red Sandal Wood Santalum album, Orchid species, Paris Paris polyphylla, Jatamashi Nardostachys grandiflora, Kutki Picrorhyza scrophulariiflora; and wildlife: Royal Bengal Tiger Panthera tigris tigris, Rhino Rhinoceros unicornis, Pangolin Manis species, Common Leopard Panthera pardus, Red Panda Ailurus fulgens, Snow Leopard Panthera uncia) in Nepal, traded out directly and illegally to India and China. We provide a wider perspective regarding sending, receiving and spill-over agents. Arguably, the western world as the spill-over agent set up a globalization framework that allows virtually any item to be shipped across borders, e.g. on foot, by car or by plane. It further allows a demand to be created and satisfied by the receiver (= nations in wider Asia), within a system that circumvents the legal structure in the sending location (= Nepal and its biodiversity hotspots and wilderness). We extend the traditional telecoupling analysis with a flow analysis of money, remittance payments and banking networks.
This research describes for the first time such a system which is by now essentially found worldwide, how it operates, what devastating impacts it leaves behind on landscapes, and how to resolve it for betterment.
Levy, Robert; Khokhlov, Alexander; Kopenkin, Sergey; Bart, Boris; Ermolova, Tatiana; Kantemirova, Raiasa; Mazurov, Vadim; Bell, Marjorie; Caldron, Paul; Pillai, Lakshmi; Burnett, Bruce
2010-12-01
Twice-daily flavocoxid, a cyclooxygenase and 5-lipoxygenase inhibitor of botanical origin with potent antioxidant activity, was evaluated for 12 weeks in a randomized, double-blind, active-comparator study against naproxen in 220 subjects with moderate to severe osteoarthritis (OA) of the knee. As previously reported, both groups noted a significant reduction in the signs and symptoms of OA, with no detectable differences in efficacy between the groups when the entire intent-to-treat population was considered. This post-hoc analysis compares the efficacy of flavocoxid to naproxen in different subsets of patients, specifically those defined by age, gender, and disease severity as reported at baseline for individual response parameters. In the original randomized, double-blind study, 220 subjects were assigned to receive either flavocoxid (500 mg twice daily) or naproxen (500 mg twice daily) for 12 weeks. In this subgroup analysis, primary outcome measures, including the Western Ontario and McMaster Universities OA index and subscales and timed walk, and secondary efficacy variables, including investigator global assessment of disease, global response to treatment, subject visual analog scale for discomfort, overall disease activity, and index joint tenderness and mobility, were evaluated for differing trends between the study groups. Subset analyses revealed some statistically significant differences and some notable trends in favor of the flavocoxid group. These trends became stronger the longer the subjects continued on therapy. They were specifically noted in older subjects (>60 years), males, and subjects with milder disease, particularly those with lower subject global assessment of disease activity, lower investigator global assessment of disease, and faster walking times at baseline.
Initial analysis of the entire intent-to-treat population revealed that flavocoxid was as effective as naproxen in managing the signs and symptoms of OA of the knee. Detailed analyses of subject subsets demonstrated distinct trends in favor of flavocoxid for specific groups of subjects.
Super-delta: a new differential gene expression analysis procedure with robust data normalization.
Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing
2017-12-21
Normalization is an important data preparation step in gene expression analyses, designed to remove various sources of systematic noise. Sample variance is greatly reduced after normalization, so the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which inevitably introduces some bias. This bias typically inflates the type I error rate and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived from asymptotic theory for hypothesis testing and suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods in simulation studies: global, median-IQR, quantile, and cyclic loess normalization. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, its performance is close to that of an oracle test applied to datasets without technical noise. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is substantial overlap among the DEGs identified by all of them, super-delta identified comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that the pathways identified by super-delta have better connections to breast cancer than those of the other methods.
As a new pipeline, super-delta provides new insights into differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Application to real and simulated datasets demonstrates its strong performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and more general between-group comparison problems.
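The bias that super-delta is designed to avoid can be seen in a stdlib-only toy example of naive global normalization (centering each array on its own mean) followed by per-gene Welch t-tests. The expression matrix, the technical shift, and the single DEG below are invented; note this illustrates the problem with plain global normalization, not the super-delta procedure itself:

```python
import statistics

def global_normalize(sample):
    """Global normalization: center each array on its own mean,
    borrowing information across all genes on the array."""
    m = statistics.fmean(sample)
    return [x - m for x in sample]

def welch_t(x, y):
    """Welch's two-sample t statistic."""
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (statistics.fmean(x) - statistics.fmean(y)) / (
        (vx / len(x) + vy / len(y)) ** 0.5)

# Toy matrix: rows = arrays, columns = 4 genes; groups A vs B.
# Gene 0 is truly differentially expressed (5 vs ~9); arrays in group B
# also carry a uniform technical shift that normalization should remove.
group_a = [[5.0, 0.9, 1.1, 1.0], [5.4, 1.1, 0.9, 1.0], [4.8, 1.0, 1.0, 1.1]]
group_b = [[9.1, 3.0, 2.9, 3.1], [8.8, 2.9, 3.1, 3.0], [9.3, 3.1, 3.0, 2.9]]

norm_a = [global_normalize(s) for s in group_a]
norm_b = [global_normalize(s) for s in group_b]

# Per-gene t statistics after normalization: the DEG inflates group B's
# array means, so the three null genes are dragged apart between groups.
t_stats = [welch_t([s[g] for s in norm_a], [s[g] for s in norm_b])
           for g in range(4)]
```

In this toy the null genes end up with nonzero t statistics purely because the DEG biased the array means used for centering, which is exactly the inflated type I error the abstract describes.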
Evolutions of fluctuation modes and inner structures of global stock markets
NASA Astrophysics Data System (ADS)
Yan, Yan; Wang, Lei; Liu, Maoxin; Chen, Xiaosong
2016-09-01
Using empirical data comprising 42 major global stock indices over the period 1996-2014, this paper systematically studies the evolution of fluctuation modes and inner structures of global stock markets. The data are large in scale in both time and space. A covariance matrix-based principal fluctuation mode analysis (PFMA) is used to explore the properties of the global stock markets; previous studies have overlooked the point that the covariance matrix is more suitable than the correlation matrix as the basis of PFMA. We find that the principal fluctuation modes of global stock markets point in the same direction, and that the global stock markets divide into three clusters, which are closely related to the countries' locations, with the exceptions of China, Russia and the Czech Republic. A time-stable correlation network constructing method is proposed to address the high level of statistical uncertainty when the estimation periods are very short, and a complex dynamic network (CDN) is constructed to investigate the evolution of inner structures. The results show when the clusters emerge and how long they persist. When the 2008 financial crisis broke out, the indices formed a single cluster; after the crisis, only the European cluster remained. These findings complement previous studies and can help investors and regulators understand the global stock markets.
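The core computation behind a covariance-based fluctuation mode analysis is the leading eigenvector of the return covariance matrix: when markets co-move, its components share one sign, which is the "same direction" finding above. A stdlib-only sketch with synthetic return series driven by a single common factor (the real analysis uses the 42 indices):

```python
import random

def covariance_matrix(series):
    """Sample covariance matrix of a list of time series (rows = indices)."""
    n, t = len(series), len(series[0])
    means = [sum(s) / t for s in series]
    return [[sum((series[i][k] - means[i]) * (series[j][k] - means[j])
                 for k in range(t)) / (t - 1)
             for j in range(n)] for i in range(n)]

def leading_mode(cov, iters=500):
    """Leading eigenvector (principal fluctuation mode) by power iteration."""
    n = len(cov)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic daily returns for four indices sharing one common global factor
random.seed(1)
factor = [random.gauss(0, 1) for _ in range(500)]
series = [[f + random.gauss(0, 0.3) for f in factor] for _ in range(4)]

mode = leading_mode(covariance_matrix(series))
```

Because all four synthetic markets load positively on the common factor, every component of the leading mode comes out with the same sign, mirroring the paper's finding for real indices.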
Machine Learning Predictions of a Multiresolution Climate Model Ensemble
NASA Astrophysics Data System (ADS)
Anderson, Gemma J.; Lucas, Donald D.
2018-05-01
Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
Partial Support of Meeting of the Board on Mathematical Sciences and Their Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weidman, Scott
2014-08-31
During the performance period, BMSA released the following major reports: Transforming Combustion Research through Cyberinfrastructure (2011); Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification (2012); Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century (2012); Aging and the Macroeconomy: Long-Term Implications of an Older Population (2012); The Mathematical Sciences in 2025 (2013); Frontiers in Massive Data Analysis (2013); and Developing a 21st Century Global Library for Mathematics Research (2014).
A study of correlations between crude oil spot and futures markets: A rolling sample test
NASA Astrophysics Data System (ADS)
Liu, Li; Wan, Jieqiu
2011-10-01
In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between the spot and futures contracts with larger maturity. Our results from the rolling sample test also show the apparent effects of exogenous events. Additionally, we discuss the relevance of the obtained evidence.
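Detrended cross-correlation analysis builds integrated profiles of the two series, removes a local linear trend in each box, and averages the covariance of the residuals. A stdlib-only sketch with synthetic spot and futures return series (the box size and the series are illustrative); `rho` here is the DCCA cross-correlation coefficient, a later normalization of the fluctuation function, not part of the original 2008 method:

```python
import random

def profile(series):
    """Cumulative sum of mean-removed values (the 'profile' in DCCA)."""
    mean = sum(series) / len(series)
    total, prof = 0.0, []
    for x in series:
        total += x - mean
        prof.append(total)
    return prof

def detrended_covariance(px, py, n):
    """DCCA fluctuation function F^2(n): average covariance of the
    linearly detrended profiles over overlapping boxes of n + 1 points."""
    def residuals(seg):
        t = range(len(seg))
        tm = sum(t) / len(seg)
        sm = sum(seg) / len(seg)
        denom = sum((ti - tm) ** 2 for ti in t)
        slope = sum((ti - tm) * (s - sm) for ti, s in zip(t, seg)) / denom
        return [s - sm - slope * (ti - tm) for ti, s in zip(t, seg)]

    total = 0.0
    n_boxes = len(px) - n
    for start in range(n_boxes):
        rx = residuals(px[start:start + n + 1])
        ry = residuals(py[start:start + n + 1])
        total += sum(a * b for a, b in zip(rx, ry)) / (n + 1)
    return total / n_boxes

# Synthetic spot/futures returns sharing a common component
random.seed(2)
common = [random.gauss(0, 1) for _ in range(400)]
spot = [c + random.gauss(0, 0.5) for c in common]
futures = [c + random.gauss(0, 0.5) for c in common]

px, py = profile(spot), profile(futures)
f2_xy = detrended_covariance(px, py, 10)
# Normalized DCCA coefficient, bounded by [-1, 1]
rho = f2_xy / (detrended_covariance(px, px, 10)
               * detrended_covariance(py, py, 10)) ** 0.5
```

Repeating the computation for a range of box sizes n and examining the scaling of F(n) is what yields the persistence exponents discussed in the article.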
Role of reservoirs in sustained seismicity of Koyna-Warna region—a statistical analysis
NASA Astrophysics Data System (ADS)
Yadav, Amrita; Gahalaut, Kalpna; Purnachandra Rao, N.
2018-03-01
The Koyna-Warna region in western India is a globally recognized site of reservoir-triggered seismicity near the Koyna and Warna reservoirs. Several M > 5 earthquakes have been reported in the region over the last five decades, including the M6.3 Koyna earthquake, considered the largest triggered earthquake worldwide. In the present study, a detailed statistical analysis of long-period earthquake catalogues (1968-2004 from MERI and 2005-2012 from CSIR-NGRI) is carried out to determine the spatio-temporal influence of the impoundment of the Koyna and Warna reservoirs on the seismicity of the region. Based on the earthquake clusters, we divided the region into three different zones and performed power spectrum and singular spectrum analysis (SSA) on them. The earthquake zone near the Warna reservoir for 1983-1995, the zone near the Koyna reservoir for 1996-2004, and the zone near the Warna reservoir for 2005-2012 were found to be influenced by the annual water level variations in the reservoirs, confirming the continuous role of both reservoirs in the seismicity of the Koyna-Warna region.
Sampling Analysis of Aerosol Retrievals by Single-track Spaceborne Instrument for Climate Research
NASA Astrophysics Data System (ADS)
Geogdzhayev, I. V.; Cairns, B.; Alexandrov, M. D.; Mishchenko, M. I.
2012-12-01
We examine to what extent the reduced sampling of along-track instruments such as the Cloud-Aerosol LIdar with Orthogonal Polarisation (CALIOP) and the Aerosol Polarimetry Sensor (APS) affects the statistical accuracy of a satellite climatology of retrieved aerosol optical thickness (AOT) by sub-sampling the retrievals from a wide-swath imaging instrument (the MODerate resolution Imaging Spectroradiometer, MODIS). Owing to its global coverage, longevity, and extensive characterization against ground-based data, the MODIS level-2 aerosol product is an instructive testbed for assessing sampling effects on climatic means derived from along-track instrument data. The advantage of using daily pixel-level aerosol retrievals from MODIS is that limitations caused by the presence of clouds are implicit in the sample, so that their seasonal and regional variations are captured coherently. However, imager data can exhibit cross-track variability of monthly global mean AOTs caused by a scattering-angle dependence. We found that single along-track values can deviate from the imager mean by 15% over land and by more than 20% over ocean. This makes it difficult to separate natural variability from viewing-geometry artifacts, complicating direct comparisons of an along-track sub-sample with the full imager data. To work around this problem, we introduce "flipped-track" sampling which, by design, is statistically equivalent to along-track sampling while closely approximating the imager in terms of angular artifacts. We show that the flipped-track variability of global monthly mean AOT is much smaller than the cross-track one for the 7-year period considered. Over the ocean the flipped-track standard error is 85% less than the cross-track one (absolute values 0.0012 versus 0.0079), and over land it is about one third of the cross-track value (0.0054 versus 0.0188) on average.
This allows us to attribute the difference between the two errors to the viewing-geometry artifacts and obtain an upper limit on AOT errors caused by along-track sampling. Our results show that using along-track subsets of MODIS aerosol data directly to analyze the sampling adequacy of single-track instruments can lead to false conclusions owing to the apparent enhancement of natural aerosol variability by the track-to-track artifacts. The analysis based on the statistics of the flipped-track means yields better estimates because it allows for better separation of the viewing-geometry artifacts and true natural variability. Published assessments estimate that a global AOT change of 0.01 would yield a climatically important flux change of 0.25 W/m². Since the standard error estimates that we have obtained are comfortably below 0.01, we conclude that along-track instruments flown on a sun-synchronous orbiting platform have sufficient spatial sampling for estimating aerosol effects on climate. Since AOT is believed to be the most variable characteristic of tropospheric aerosols, our results imply that pixel-wide along-track coverage also provides adequate statistical representation of the global distribution of aerosol microphysical parameters.
Similar negative impacts of temperature on global wheat yield estimated by three independent methods
USDA-ARS?s Scientific Manuscript database
The potential impact of global temperature change on global wheat production has recently been assessed with different methods, scaling and aggregation approaches. Here we show that grid-based simulations, point-based simulations, and statistical regressions produce similar estimates of temperature ...
NASA Technical Reports Server (NTRS)
Morris, Kenneth R.; Schwaller, Mathew
2010-01-01
The Validation Network (VN) prototype for the Global Precipitation Measurement (GPM) Mission compares data from the Tropical Rainfall Measuring Mission (TRMM) satellite Precipitation Radar (PR) to similar measurements from U.S. and international operational weather radars. This prototype is a major component of the GPM Ground Validation System (GVS). The VN provides a means for the precipitation measurement community to identify and resolve significant discrepancies between the ground radar (GR) observations and similar satellite observations. The VN prototype is based on research results and computer code described by Anagnostou et al. (2001), Bolen and Chandrasekar (2000), and Liao et al. (2001), and has previously been described by Morris et al. (2007). Morris and Schwaller (2009) describe the PR-GR volume-matching algorithm used to create the VN match-up data set used for the comparisons. This paper describes software tools that have been developed for visualization and statistical analysis of the original and volume-matched PR and GR data.
Changle Zhang; Tao Chai; Na Gao; Ma, Heather T
2017-07-01
Effective measurement of the cognitive impairment caused by Alzheimer's disease (AD) provides a chance for early medical intervention to delay disease onset. Diffusion tensor imaging (DTI) provides a non-intrusive examination of cranial nerve diseases that can help us observe the microstructure of neuron fibers. The cognitive control network (CCN) consists of the brain regions most closely related to human self-control. In this study, the hub-and-spoke model, widely used in transportation and sociology, was employed to analyze the relationship between the CCN and the other regions under its control, and a cognitive control related network (CCRN) was built by applying this model. Local and global graph-theoretical parameters were calculated and subjected to statistical analysis. Significant differences were found at both the local and the global scale, which may reflect impairment of cognitive control ability. This result may provide a potential biomarker for the loss of connectivity caused by Alzheimer's disease.
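The local and global graph-theoretical parameters mentioned above can be computed directly from an adjacency structure. A stdlib sketch on an invented hub-and-spoke toy network (not real CCRN data), showing one local metric (clustering coefficient) and one global metric (characteristic path length):

```python
from collections import deque

def clustering(adj, node):
    """Local metric: fraction of a node's neighbour pairs that are
    themselves connected (local clustering coefficient)."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i, u in enumerate(nbrs)
                for v in nbrs[i + 1:] if v in adj[u])
    return 2.0 * links / (k * (k - 1))

def avg_path_length(adj):
    """Global metric: mean shortest-path length over all node pairs (BFS)."""
    nodes = list(adj)
    total, pairs = 0, 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in nodes:
            if t != s:
                total += dist[t]
                pairs += 1
    return total / pairs

# Toy hub-and-spoke network: hub 0 with four spokes, one spoke pair linked
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}

hub_clust = clustering(adj, 0)
apl = avg_path_length(adj)
```

In a study such as the one above, these metrics would be computed per subject from thresholded DTI connectivity matrices and then compared between AD and control groups.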
Seismo-ionospheric anomalies in DEMETER observations during the Wenchuan M7.9 earthquake
NASA Astrophysics Data System (ADS)
Huang, C. C.; Liu, J. Y. G.
2014-12-01
This paper examines pre-earthquake ionospheric anomalies (PEIAs) observed by the French satellite DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions) during the 12 May 2008 M7.9 Wenchuan earthquake. Both daytime and nighttime electron density (Ne), electron temperature (Te), ion density (Ni) and ion temperature (Ti) are investigated. A statistical analysis based on the box-and-whisker method is used to determine whether the four DEMETER datasets 1-6 days before and after the earthquake differ significantly. The analysis is applied to the epicenter and to three reference areas along the same magnetic latitude in order to discriminate earthquake-related anomalies from global effects. Results show that the nighttime Ne and Ni over the epicenter decrease significantly 1-6 days before the earthquake. The ionospheric total electron content (TEC) of the global ionosphere map (GIM) over the epicenter is further inspected to find the local time most sensitive for detecting the PEIAs of the M7.9 Wenchuan earthquake.
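The box-and-whisker criterion flags a value as anomalous when it falls outside the quartile-based whiskers of a reference distribution. A stdlib sketch with invented electron-density values (the units, the reference window and the 1.5-IQR whisker factor are illustrative assumptions, not taken from the paper):

```python
import statistics

def box_whisker_bounds(values, k=1.5):
    """Quartiles and whisker bounds; points outside the whiskers
    (k * IQR beyond the quartiles) are flagged as anomalous."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

# Hypothetical nighttime electron densities (arbitrary units) on reference
# days, plus one pre-earthquake day showing a marked decrease
reference_ne = [10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0, 10.3, 9.7, 10.6]
pre_quake_ne = 6.1

lower, upper = box_whisker_bounds(reference_ne)
is_anomaly = pre_quake_ne < lower or pre_quake_ne > upper
```

Applying the same test over reference areas at the same magnetic latitude, as the paper does, helps rule out the possibility that the flagged decrease is a global ionospheric effect rather than an earthquake-related one.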
Mercury from wildfires: Global emission inventories and sensitivity to 2000-2050 global change
NASA Astrophysics Data System (ADS)
Kumar, Aditya; Wu, Shiliang; Huang, Yaoxian; Liao, Hong; Kaplan, Jed O.
2018-01-01
We estimate the global Hg wildfire emissions for the 2000s and the potential impacts from the 2000-2050 changes in climate, land use and land cover and Hg anthropogenic emissions by combining statistical analysis with global data on vegetation type and coverage as well as fire activities. Global Hg wildfire emissions are estimated to be 612 Mg year-1. Africa is the dominant source region (43.8% of global emissions), followed by Eurasia (31%) and South America (16.6%). We find significant perturbations to wildfire emissions of Hg in the context of global change, driven by the projected changes in climate, land use and land cover and Hg anthropogenic emissions. 2000-2050 climate change could increase Hg emissions by 14% globally and regionally by 18% for South America, 14% for Africa and 13% for Eurasia. Projected changes in land use by 2050 could decrease the global Hg emissions from wildfires by 13% mainly driven by a decline in African emissions due to significant agricultural land expansion. Future land cover changes could lead to significant increases in Hg emissions over some regions (+32% North America, +14% Africa, +13% Eurasia). Potential enrichment of terrestrial ecosystems in 2050 in response to changes in Hg anthropogenic emissions could increase Hg wildfire emissions globally (+28%) and regionally (+19% North America, +20% South America, +24% Africa, +41% Eurasia). Our results indicate that the future evolution of climate, land use and land cover and Hg anthropogenic emissions are all important factors affecting Hg wildfire emissions in the coming decades.
Studies in the use of cloud type statistics in mission simulation
NASA Technical Reports Server (NTRS)
Fowler, M. G.; Willand, J. H.; Chang, D. T.; Cogan, J. L.
1974-01-01
A study to further improve NASA's global cloud statistics for mission simulation is reported. Regional homogeneity in cloud types was examined; most of the original region boundaries defined for cloud cover amount in previous studies were supported by the statistics on cloud types and the number of cloud layers. Conditionality in cloud statistics was also examined, with special emphasis on temporal and spatial dependencies and cloud type interdependence. Temporal conditionality was found up to 12 hours, and spatial conditionality up to 200 miles; the diurnal cycle in convective cloudiness was clearly evident. As expected, the joint occurrence of different cloud types reflected the dynamic processes which form the clouds. Other phases of the study improved the cloud type statistics for several regions and proposed a mission simulation scheme combining the 4-dimensional atmospheric model, sponsored by MSFC, with the global cloud model.
BRICS countries and the global movement for universal health coverage.
Tediosi, Fabrizio; Finch, Aureliano; Procacci, Christina; Marten, Robert; Missoni, Eduardo
2016-07-01
This article explores BRICS' engagement in the global movement for Universal Health Coverage (UHC) and the implications for global health governance. It is based on primary data collected from 43 key informant interviews, complemented by a review of BRICS' global commitments supporting UHC. Interviews were conducted using a semi-structured questionnaire that included both closed- and open-ended questions. Question development was informed by insights from the literature on UHC, Cox's framework for action, and Kingdon's multiple-stream theory of policy formation. The closed questions were analysed with simple descriptive statistics and the open-ended questions using a grounded theory approach. The analysis demonstrates that most BRICS countries implicitly supported the global movement for UHC, and that they share an active engagement in promoting UHC. However, only Brazil, China and, to some extent, South Africa were recognized as proactively pushing UHC onto the global agenda. In addition, despite some concerted actions, BRICS countries seem to act more as individual countries rather than as an allied group. These findings suggest that BRICS are unlikely to be a unified political block that will transform global health governance. Yet the documented involvement of BRICS in the global movement supporting UHC, and their focus on domestic challenges, shows that BRICS individually are increasingly influential players in global health. Thus, while BRICS countries should probably not be portrayed as the centre of a future political community that will transform global health governance, their individual involvement in global health, and their documented concerted actions, may give greater voice to low- and middle-income countries, supporting the emergence of multiple centres of power in global health. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Reconstruction of Arctic surface temperature in past 100 years using DINEOF
NASA Astrophysics Data System (ADS)
Zhang, Qiyi; Huang, Jianbin; Luo, Yong
2015-04-01
Global annual mean surface temperature has not risen appreciably since 1998, a phenomenon described in recent years as the global warming hiatus. However, measuring temperature variability in the Arctic is difficult because most observed gridded datasets have large gaps in their coverage of the region. Since the Arctic has experienced rapid temperature change in recent years, a phenomenon called polar amplification, and Arctic temperatures have risen faster than the global mean, the unobserved temperatures in the central Arctic introduce a cold bias into both global and Arctic temperature estimates compared with model simulations and reanalysis datasets. Moreover, datasets with complete Arctic coverage but short temporal extent cannot capture long-term Arctic temperature variability. The Data Interpolating Empirical Orthogonal Functions (DINEOF) method was applied to fill the Arctic coverage gap of NASA's Goddard Institute for Space Studies Surface Temperature Analysis (GISTEMP, 250 km smoothing) product, using the IABP dataset, which covers the entire Arctic region between 1979 and 1998, and to reconstruct Arctic temperature over 1900-2012. This method provides a temperature reconstruction for the central Arctic and a precise estimate of both global and Arctic temperature variability over a long temporal scale. The results have been verified against additional independent station records in the Arctic using statistical measures such as variance and standard deviation. The reconstruction shows a significant warming trend in the Arctic over the past 30 years: the temperature trend in the Arctic since 1997 is 0.76°C per decade, compared with 0.48°C and 0.67°C per decade from the 250 km and 1200 km smoothed versions of GISTEMP, and the global temperature trend is twice as large after applying DINEOF. These discrepancies underline the importance of fully accounting for temperature variability in the Arctic, because coverage gaps there cause an apparent cold bias in temperature estimates. The reconstructed global surface temperature also indicates that global warming in recent years is not as slow as previously thought.
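The EOF-based gap filling at the heart of DINEOF can be illustrated with a minimal iterative truncated-SVD imputation. This is a generic sketch of the reconstruction principle under the assumption that the field is approximately low-rank; the function name, rank, and tolerance below are illustrative choices, not the actual GISTEMP/IABP processing.

```python
import numpy as np

def dineof_fill(field, rank=2, n_iter=100, tol=1e-10):
    """Fill NaN gaps in a space-time matrix by iterating a truncated-SVD
    reconstruction, in the spirit of DINEOF: guess the gaps, compute the
    leading EOFs, replace the gaps with the reconstruction, repeat."""
    mask = np.isnan(field)
    if not mask.any():
        return field.copy()
    filled = np.where(mask, 0.0, field)   # initial guess: zero anomaly in gaps
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :rank] * s[:rank]) @ vt[:rank]
        change = np.max(np.abs(recon[mask] - filled[mask]))
        filled[mask] = recon[mask]        # update only the missing entries
        if change < tol:
            break
    return filled
```

Observed entries are never altered; only the gaps are re-estimated, which is what allows a sparse but long station record to constrain the reconstruction.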
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, by itself, to make LFC impractical.
[Study on ecological suitability regionalization of Eucommia ulmoides in Guizhou].
Kang, Chuan-Zhi; Wang, Qing-Qing; Zhou, Tao; Jiang, Wei-Ke; Xiao, Cheng-Hong; Xie, Yu
2014-05-01
To study the ecological suitability regionalization of Eucommia ulmoides in Guizhou, in order to select artificial planting bases and high-quality industrial raw-material purchase areas for the herb. Based on an investigation of 14 Eucommia ulmoides producing areas, pinoresinol diglucoside content and ecological factors were obtained. Spatial analysis methods were used to carry out the ecological suitability regionalization. In addition, combining the pinoresinol diglucoside content, the correlations between the major active component and environmental factors were examined by statistical analysis. The most suitable planting area for Eucommia ulmoides was the northwest of Guizhou. The distribution of Eucommia ulmoides was mainly affected by soil type, soil pH, and monthly precipitation. The spatial structure of the major active component in Eucommia ulmoides was randomly distributed in global space, but had a single aggregation point with a high positive correlation in local space. The major active component of Eucommia ulmoides had no correlation with altitude, longitude or latitude. Using spatial and statistical analysis methods based on environmental factors and pinoresinol diglucoside content, the ecological suitability regionalization of Eucommia ulmoides can provide a reference for the selection of suitable planting areas and artificial planting bases and for directing production layout.
System analysis to estimate subsurface flow: from global level to the State of Minnesota
NASA Astrophysics Data System (ADS)
Shmagin, Boris A.; Kanivetsky, Roman
2002-06-01
Stream runoff data, globally and in the state of Minnesota, were used to estimate subsurface water flow. This system approach is based, in principle, on the unity of groundwater and surface water systems, and stands in stark contrast to the traditional deterministic approach based on modeling. Following the methodology of system analysis, two levels of study were used to estimate subsurface flow. First, global stream runoff data were assessed to estimate the temporal-spatial variability of surface water runoff. Factor analysis was used to study the temporal-spatial variability of global runoff for the period from 1918 to 1967. Results of this analysis demonstrate that the variability of global runoff can be represented by seven major components (factor scores), grouping the 18 continental slopes of the Earth into seven distinct, independent groups; the variance explained, 76%, supports this representation. The global stream runoff for this period is stationary, and is most closely connected with the stream flow of Asia to the Pacific Ocean and with the stream runoff of North America towards the Arctic and Pacific Oceans. The second level examines the distribution of river runoff (annual and for February) across the various landscapes and hydrogeological conditions of the State of Minnesota (218,000 km2). Annual and minimal monthly stream runoff rates for 115 gauging stations with a 47-year period of observation (1935-1981) were used to characterize the spatio-temporal distribution of stream runoff in Minnesota. Results of this analysis demonstrate that the annual stream runoff rate decreases from 6.3 to 3.95 and then to 2.09 l s-1 km-2 (the differences are significant by Student's criterion). These values correspond, respectively, to Minnesota's ecological provinces: mixed forest, broadleaf forest, and prairie.
The distribution of the minimal monthly stream runoff rate (February runoff) is controlled by the hydrogeological systems in Minnesota. The difference between the two hydrogeological regions, the Precambrian crystalline basement and the Paleozoic artesian basin, at 0.83 and 2.09 l s-1 km-2, is statistically significant. Within these regions, the monthly minimal runoff (0.5 and 1.68, and 0.87 and 3.11 l s-1 km-2 for February, respectively) is also distinctly different for the delineated subregions, depending on whether or not Quaternary cover is present. The spatio-temporal structure that emerges could thus be used to generate river runoff and subsurface flow maps at any scale, from the global level to local detail. Such an analysis was carried out in Minnesota with detailed mapping of the subsurface flow for the Twin Cities Metropolitan area.
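The factor-analytic step above, identifying how few components reproduce a target share (here 76%) of the runoff variance, can be sketched with a plain PCA via SVD. The function name and synthetic layout (rows = years, columns = continental slopes or stations) are illustrative assumptions; the paper's actual factor rotation and grouping procedure is not reproduced.

```python
import numpy as np

def n_components_for_variance(series, target=0.76):
    """PCA of standardized series (rows = years, columns = stations or
    continental slopes): return how many leading components are needed to
    reach the target fraction of explained variance, plus the cumulative
    explained-variance curve."""
    z = (series - series.mean(axis=0)) / series.std(axis=0)
    _, s, _ = np.linalg.svd(z, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    cum = np.cumsum(explained)
    k = int(np.searchsorted(cum, target) + 1)
    return k, cum
```

A few shared climatic signals driving many series will show up as a small `k` relative to the number of columns, which is the sense in which seven components can summarize 18 continental slopes.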
Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T.; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio
2014-01-01
Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden–Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003–2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts. PMID:24801254
Duan, Fei; Ward, C A
2009-07-07
In steady-state water droplet evaporation experiments in which heating was applied at the throat of a stainless steel conical funnel, the interfacial liquid temperature was found to increase parabolically from the center line to the rim of the funnel, with the global vapor-phase pressure at around 600 Pa. An energy conservation analysis at the interface indicates that the energy required for evaporation is supplied by thermal conduction to the interface from the liquid and vapor phases, thermocapillary convection at the interface, and viscous dissipation, both globally and locally. The local evaporation flux increases from the center line to the periphery as a result of the multiple modes of energy transport at the interface. The local vapor-phase pressure predicted from statistical rate theory (SRT) is also found to increase monotonically from the center line toward the interface edge. However, the average of the local vapor-phase pressures agrees with the measured global vapor-phase pressure within the measurement error.
2013-01-01
Background The availability of gene expression data corresponding to pig immune response challenges provides compelling material for understanding the host immune system. Meta-analysis offers the opportunity to confirm and expand our knowledge by combining and studying at one time a vast set of independent studies, creating large datasets with increased statistical power. In this study, we performed two meta-analyses of porcine transcriptomic data: i) one scrutinizing the global immune response to different challenges, and ii) one determining the specific response to Porcine Reproductive and Respiratory Syndrome Virus (PRRSV) infection. To gain in-depth knowledge of the pig response to PRRSV infection, we used an original approach, comparing and eliminating the genes common to both meta-analyses in order to identify genes and pathways specifically involved in the PRRSV immune response. The software Pointillist was used to cope with the highly disparate data, circumventing the biases generated by the specific responses linked to single studies. Next, we used the Ingenuity Pathways Analysis (IPA) software to survey the canonical pathways, biological functions and transcription factors found to be significantly involved in the pig immune response. We used 779 chips corresponding to 29 datasets for the pig global immune response and 279 chips obtained from 6 datasets for the pig response to PRRSV infection. Results The pig global immune response analysis showed interconnected canonical pathways involved in the regulation of translation and mitochondrial energy metabolism. The biological functions revealed in this meta-analysis were centred around translation regulation, including protein synthesis, RNA post-transcriptional gene expression, and cellular growth and proliferation. Furthermore, oxidative phosphorylation and mitochondrial dysfunction, associated with stress signalling, were highly regulated.
Transcription factors such as MYCN, MYC and NFE2L2 were found in this analysis to be potentially involved in the regulation of the immune response. The host-specific response to PRRSV infection engendered the activation of well-defined canonical pathways in response to pathogen challenge, such as TREM1, toll-like receptor and hyper-cytokinemia/hyper-chemokinemia signalling. Furthermore, this analysis brought forth the central role of the crosstalk between innate and adaptive immune responses and the regulation of the anti-inflammatory response. The most significant transcription factor potentially involved in this analysis was HMGB1, which is required for the innate recognition of viral nucleic acids. Other transcription factors, such as the interferon regulatory factors IRF1, IRF3, IRF5 and IRF8, were also involved in the pig-specific response to PRRSV infection. Conclusions This work reveals key genes, canonical pathways and biological functions involved in the pig global immune response to diverse challenges, including PRRSV infection. The powerful statistical approach led us to consolidate previous findings as well as to gain new insights into the pig immune response, either to common stimuli or specifically to PRRSV infection. PMID:23552196
FADTTS: functional analysis of diffusion tensor tract statistics.
Zhu, Hongtu; Kong, Linglong; Li, Runze; Styner, Martin; Gerig, Guido; Lin, Weili; Gilmore, John H
2011-06-01
The aim of this paper is to present a functional analysis of diffusion tensor tract statistics (FADTTS) pipeline for delineating the association between multiple diffusion properties along major white matter fiber bundles and a set of covariates of interest, such as age, diagnostic status and gender, as well as the structure of the variability of these white matter tract properties in various diffusion tensor imaging studies. FADTTS integrates five statistical tools: (i) a multivariate varying coefficient model, allowing varying coefficient functions in terms of arc length to characterize the varying associations between fiber bundle diffusion properties and a set of covariates, (ii) a weighted least squares estimation of the varying coefficient functions, (iii) a functional principal component analysis to delineate the structure of the variability in fiber bundle diffusion properties, (iv) a global test statistic to test hypotheses of interest, and (v) a simultaneous confidence band to quantify the uncertainty in the estimated coefficient functions. Simulated data are used to evaluate the finite sample performance of FADTTS. We apply FADTTS to investigate the development of white matter diffusivities along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. FADTTS can be used to facilitate understanding of normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. The advantages of FADTTS compared with other existing approaches are that it is capable of modeling the structured inter-subject variability, testing the joint effects, and constructing simultaneous confidence bands. When only a single diffusion measure is considered, FADTTS reduces to a functional analysis method for that measure. Copyright © 2011 Elsevier Inc. All rights reserved.
Effect of Creatine Monohydrate on Clinical Progression in Patients With Parkinson Disease
2015-01-01
IMPORTANCE There are no treatments available to slow or prevent the progression of Parkinson disease, despite its global prevalence and significant health care burden. The National Institute of Neurological Disorders and Stroke Exploratory Trials in Parkinson Disease program was established to promote discovery of potential therapies. OBJECTIVE To determine whether creatine monohydrate was more effective than placebo in slowing long-term clinical decline in participants with Parkinson disease. DESIGN, SETTING, AND PATIENTS The Long-term Study 1, a multicenter, double-blind, parallel-group, placebo-controlled, 1:1 randomized efficacy trial. Participants were recruited from 45 investigative sites in the United States and Canada and included 1741 men and women with early (within 5 years of diagnosis) and treated (receiving dopaminergic therapy) Parkinson disease. Participants were enrolled from March 2007 to May 2010 and followed up until September 2013. INTERVENTIONS Participants were randomized to placebo or creatine (10 g/d) monohydrate for a minimum of 5 years (maximum follow-up, 8 years). MAIN OUTCOMES AND MEASURES The primary outcome measure was a difference in clinical decline from baseline to 5-year follow-up, compared between the 2 treatment groups using a global statistical test. Clinical status was defined by 5 outcome measures: Modified Rankin Scale, Symbol Digit Modalities Test, PDQ-39 Summary Index, Schwab and England Activities of Daily Living scale, and ambulatory capacity. All outcomes were coded such that higher scores indicated worse outcomes and were analyzed by a global statistical test. Higher summed ranks (range, 5–4775) indicate worse outcomes. RESULTS The trial was terminated early for futility based on results of a planned interim analysis of participants enrolled at least 5 years prior to the date of the analysis (n = 955). The median follow-up time was 4 years. 
Of the 955 participants, the mean of the summed ranks for placebo was 2360 (95% CI, 2249–2470) and for creatine was 2414 (95% CI, 2304–2524). The global statistical test yielded t(1865.8) = −0.75 (2-sided P = .45). There were no detectable differences (P < .01 to partially adjust for multiple comparisons) in adverse and serious adverse events by body system. CONCLUSIONS AND RELEVANCE Among patients with early and treated Parkinson disease, treatment with creatine monohydrate for at least 5 years, compared with placebo, did not improve clinical outcomes. These findings do not support the use of creatine monohydrate in patients with Parkinson disease. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00449865 PMID:25668262
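A global statistical test of the kind described above, which ranks every outcome so that higher is worse, sums the ranks for each participant, and then compares the groups' summed ranks, can be sketched in an O'Brien-style form. This is a generic illustration under an assumed data shape (participants × outcomes), not the trial's actual analysis code.

```python
import numpy as np
from scipy import stats

def global_rank_test(group_a, group_b):
    """O'Brien-type global statistical test: rank each outcome over the
    pooled sample (higher score = worse outcome), sum the ranks for each
    participant, and compare the groups' summed ranks with Welch's t-test."""
    n_a = group_a.shape[0]
    pooled = np.vstack([group_a, group_b])
    ranks = stats.rankdata(pooled, axis=0)   # rank each outcome column separately
    summed = ranks.sum(axis=1)               # one summed rank per participant
    return stats.ttest_ind(summed[:n_a], summed[n_a:], equal_var=False)
```

Because the test operates on summed ranks rather than raw scales, the five clinical outcomes can be combined without a common unit, which is why the trial reports summed-rank means and a t statistic with fractional degrees of freedom.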
Kieburtz, Karl; Tilley, Barbara C; Elm, Jordan J; Babcock, Debra; Hauser, Robert; Ross, G Webster; Augustine, Alicia H; Augustine, Erika U; Aminoff, Michael J; Bodis-Wollner, Ivan G; Boyd, James; Cambi, Franca; Chou, Kelvin; Christine, Chadwick W; Cines, Michelle; Dahodwala, Nabila; Derwent, Lorelei; Dewey, Richard B; Hawthorne, Katherine; Houghton, David J; Kamp, Cornelia; Leehey, Maureen; Lew, Mark F; Liang, Grace S Lin; Luo, Sheng T; Mari, Zoltan; Morgan, John C; Parashos, Sotirios; Pérez, Adriana; Petrovitch, Helen; Rajan, Suja; Reichwein, Sue; Roth, Jessie Tatsuno; Schneider, Jay S; Shannon, Kathleen M; Simon, David K; Simuni, Tanya; Singer, Carlos; Sudarsky, Lewis; Tanner, Caroline M; Umeh, Chizoba C; Williams, Karen; Wills, Anne-Marie
2015-02-10
Samanta, Brajogopal; Bhadury, Punyasloke
2016-01-01
Marine chromophytes are a taxonomically diverse group of algae contributing approximately half of total oceanic primary production. To understand the global patterns of functional diversity of chromophytic phytoplankton, robust bioinformatics and statistical analyses, including deep phylogeny based on 2476 form ID rbcL gene sequences representing seven ecologically significant oceanographic ecoregions, were undertaken. In addition, 12 form ID rbcL clone libraries were generated and analyzed (148 sequences) from the Sundarbans Biosphere Reserve, representing the world's largest mangrove ecosystem, as part of this study. Global phylogenetic analyses recovered 11 major clades of chromophytic phytoplankton in varying proportions, with several novel rbcL sequences in each of the seven targeted ecoregions. Based on beta-diversity analysis, the majority of OTUs were found to be exclusive to each ecoregion, whereas some were shared by two or more ecoregions. The present phylogenetic and bioinformatics analyses provide strong statistical support for the hypothesis that different oceanographic regimes harbor distinct and coherent groups of chromophytic phytoplankton. It is also shown as part of this study that varying natural selection pressure on the form ID rbcL gene under different environmental conditions could lead to functional differences and affect the overall fitness of chromophytic phytoplankton populations. PMID:26861415
NASA Astrophysics Data System (ADS)
Rougier, Jonty; Cashman, Kathy; Sparks, Stephen
2016-04-01
We have analysed the Large Magnitude Explosive Volcanic Eruptions database (LaMEVE) for volcanoes that classify as stratovolcanoes. A non-parametric statistical approach is used to assess the global recording rate for large (M4+) eruptions. The approach imposes minimal structure on the shape of the recording rate through time. We find that the recording rates decline rapidly going backwards in time: prior to 1600 they are below 50%, and prior to 1100 they are below 20%. Even in the recent past, e.g. the 1800s, they are likely to be appreciably less than 100%. The assessment for very large (M5+) eruptions is more uncertain, owing to the scarcity of events. Having taken under-recording into account, the large-eruption rates of stratovolcanoes are modelled exchangeably, in order to derive an informative prior distribution as an input into a subsequent volcano-by-volcano hazard assessment. The statistical model implies that volcano-by-volcano predictions can be grouped by the number of recorded large eruptions. Further, it is possible to combine all volcanoes into a single global large-eruption prediction, with an M4+ rate computed from the LaMEVE database of 0.57/yr.
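The effect of under-recording on a rate estimate can be illustrated with the simplest parametric case: if each era's record is a Poisson process thinned by a known recording probability, the maximum-likelihood rate divides the recorded counts by probability-weighted time. The era durations and probabilities in the usage below are hypothetical values chosen only to echo the reported recording-rate thresholds; the paper's non-parametric estimate is more general.

```python
def corrected_rate(counts, durations, recording_probs):
    """MLE of an underlying Poisson eruption rate when each era's record
    is thinned by a known recording probability: recorded counts are
    Poisson with mean rate * p_i * T_i, so the MLE divides the total
    recorded count by the probability-weighted observation time."""
    effective_years = sum(p * t for p, t in zip(recording_probs, durations))
    return sum(counts) / effective_years
```

Dividing by raw calendar time instead of effective time would systematically underestimate the rate whenever any era has a recording probability below one, which is the bias the under-recording analysis corrects.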
How to explain variations in sea cliff erosion rate?
NASA Astrophysics Data System (ADS)
Prémaillon, Melody; Regard, Vincent; Dewez, Thomas
2017-04-01
Every rocky coast in the world is eroding, each at a different rate (cliff retreat rate). Erosion is caused by a complex interaction of multiple sea and weather factors. While numerous local studies exist and explain erosion processes at specific sites, global studies are lacking. We have begun to compile many of these local studies and analyse their results from a global point of view, in order to quantify the various parameters influencing erosion rates. In other words: is erosion greater in energetic seas? Do chalk cliffs erode faster in rainy environments? To this end, we built a database based on the literature and on national erosion databases. It now contains 80 publications, representing 2500 studied cliffs and more than 3500 erosion rate estimates. A statistical analysis was conducted on this database. To a first approximation, cliff lithology is the only clear signal explaining variation in erosion rate: hard lithologies erode at 1 cm/y or less, whereas unconsolidated lithologies commonly erode faster than 10 cm/y. No clear statistical relation was found between erosion rate and external parameters such as sea energy (swell, tide) or weather conditions, even for cliffs of similar lithology.
Cross-cultural variation of memory colors of familiar objects.
Smet, Kevin A G; Lin, Yandan; Nagy, Balázs V; Németh, Zoltan; Duque-Chica, Gloria L; Quintero, Jesús M; Chen, Hung-Shing; Luo, Ronnier M; Safi, Mahdi; Hanselaer, Peter
2014-12-29
The effect of cross-regional or cross-cultural differences on color appearance ratings and memory colors of familiar objects was investigated in seven different countries/regions: Belgium, Hungary, Brazil, Colombia, Taiwan, China and Iran. In each region, the familiar objects were presented on a calibrated monitor in over 100 different colors to a test panel of observers who were asked to rate the similarity of the presented object color to what they thought the object looks like in reality (its memory color). For each object and region, the mean observer ratings were modeled by a bivariate Gaussian function. A statistical analysis showed significant (p < 0.001) differences between the region average observers and the global average observer obtained by pooling the data from all regions. However, the effect size of geographical region or culture was found to be small. In fact, the differences between the region average observers and the global average observer were found to be of the same magnitude as, or smaller than, the typical within-region inter-observer variability. Thus, although statistically significant differences in color appearance ratings and memory colors between regions were found, the regional impact is not likely to be of practical importance.
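A bivariate Gaussian memory-colour model of the kind described can be sketched with a moment-based fit: weight each presented chromaticity by its similarity rating and take the weighted mean and covariance. The function names, the two-dimensional chromaticity coordinates, and the moment-based estimator are illustrative assumptions, not necessarily the study's exact fitting procedure.

```python
import numpy as np

def fit_memory_color(chroma, ratings):
    """Moment-based fit of a bivariate Gaussian memory-colour model:
    weight each presented chromaticity (N x 2 array) by its similarity
    rating and return the weighted mean and covariance."""
    w = np.asarray(ratings, float)
    w = w / w.sum()
    mu = (w[:, None] * chroma).sum(axis=0)
    d = chroma - mu
    cov = (w[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0)
    return mu, cov

def similarity(x, mu, cov):
    """Model rating: Gaussian of the Mahalanobis distance to the memory colour."""
    d = x - mu
    return float(np.exp(-0.5 * d @ np.linalg.solve(cov, d)))
```

The fitted mean locates the memory colour itself, while the covariance captures how sharply similarity ratings fall off in each chromatic direction, which is what makes region-to-region comparison of the fitted parameters possible.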
NASA Astrophysics Data System (ADS)
Johnson, A. C.; Yeakley, A.
2009-12-01
Timberline forest advance associated with global climate change is occurring worldwide and is often associated with microsites. Microsites, controlled by topography, substrates, and plant cover, are localized regions dictating temperature, moisture, and solar radiation. These abiotic factors are integral to seedling survival. From a compilation of worldwide information on seedling regeneration on microsites at timberline, including our ongoing research in the Pacific Northwest, we classified the available literature into four microsite categories, related microsite category to annual precipitation, and used analysis of variance to detect statistical differences in microsite type and associated precipitation. We found statistical differences (p = 0.022), indicating the usefulness of understanding microsite/precipitation associations for detecting worldwide trends in timberline expansion. For example, wetter timberlines with downed wood had regeneration associated with nurse logs, whereas on windy, drier landscapes, regeneration was typically associated either with the leeward sides of tree clumps or with microsites protected from frost by overstory canopy. In our study of timberline expansion in the Pacific Northwest, we expect that such knowledge of the microsite types associated with forest expansion will yield a better understanding of the mechanisms and rates of timberline forest advance during global warming.
Trends and associated uncertainty in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, A. N.; Moyer, E. J.; Stein, M.
2016-12-01
Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
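The paper's central point, using radiative forcing rather than time as the regression covariate, can be illustrated with an ordinary least-squares fit. A minimal sketch with invented anomaly and forcing values (the actual model additionally treats natural variability with a parametric time-series component):

```python
def ols_fit(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# hypothetical radiative forcing (W/m^2) and temperature anomaly (K)
forcing = [0.5, 1.0, 1.5, 2.0, 2.5]
anomaly = [0.1, 0.35, 0.55, 0.8, 1.05]
sens, intercept = ols_fit(forcing, anomaly)  # sens has units K per W/m^2
```

Regressing on forcing yields a physically interpretable sensitivity (K per W/m²) rather than a bare trend per year.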
Research on Visual Analysis Methods of Terrorism Events
NASA Astrophysics Data System (ADS)
Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing
2016-06-01
As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the availability of the methods.
Progress in Turbulence Detection via GNSS Occultation Data
NASA Technical Reports Server (NTRS)
Cornman, L. B.; Goodrich, R. K.; Axelrad, P.; Barlow, E.
2012-01-01
The increased availability of radio occultation (RO) data offers the ability to detect and study turbulence in the Earth's atmosphere. An analysis of how RO data can be used to determine the strength and location of turbulent regions is presented. This includes the derivation of a model for the power spectrum of the log-amplitude and phase fluctuations of the permittivity (or index of refraction) field. The bulk of the paper is then concerned with the estimation of the model parameters. Parameter estimators are introduced and some of their statistical properties are studied. These estimators are then applied to simulated log-amplitude RO signals. This includes the analysis of global statistics derived from a large number of realizations, as well as case studies that illustrate various specific aspects of the problem. Improvements to the basic estimation methods are discussed, and their beneficial properties are illustrated. The estimation techniques are then applied to real occultation data. Only two cases are presented, but they illustrate some of the salient features inherent in real data.
Primary Student-Teachers' Conceptual Understanding of the Greenhouse Effect: A mixed method study
NASA Astrophysics Data System (ADS)
Ratinen, Ilkka Johannes
2013-04-01
The greenhouse effect is a reasonably complex scientific phenomenon which can be used as a model to examine students' conceptual understanding in science. Primary student-teachers' understanding of global environmental problems, such as climate change and ozone depletion, indicates that they have many misconceptions. The present mixed method study examines Finnish primary student-teachers' understanding of the greenhouse effect based on the results obtained via open-ended and closed-form questionnaires. The open-ended questionnaire considers primary student-teachers' spontaneous ideas about the greenhouse effect depicted by concept maps. The present study also uses statistical analysis to reveal respondents' conceptualization of the greenhouse effect. The concept maps and statistical analysis reveal that the primary student-teachers' factual knowledge and their conceptual understanding of the greenhouse effect are incomplete and even misleading. In the light of the results of the present study, proposals for modifying the instruction of climate change in science, especially in geography, are presented.
Higher order statistical moment application for solar PV potential analysis
NASA Astrophysics Data System (ADS)
Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan
2016-10-01
Solar photovoltaic energy is a potential alternative to fossil fuels, which are depleting and pose a global warming problem. However, this renewable resource is variable and intermittent, so knowledge of the energy potential of a site is essential before building a solar photovoltaic power generation system. Here, a higher order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm change dynamically, the Pearson system, in which the probability distribution is selected by matching theoretical moments to the empirical moments of the data, is well suited for this purpose. Taking advantage of the Pearson system implementation in MATLAB, software has been developed to assist in data processing, distribution fitting, and potential analysis for future projection of available AC power and solar irradiance.
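The Pearson system chooses a distribution family by matching its theoretical moments to the empirical mean, variance, skewness, and kurtosis of the data (in MATLAB, e.g., via `pearsrnd`). A minimal sketch of the sample moments that drive the match, on hypothetical AC-power readings:

```python
def first_four_moments(x):
    """Sample mean, variance, skewness, and kurtosis (population definitions)."""
    n = len(x)
    mean = sum(v for v in x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    sd = var ** 0.5
    skew = sum((v - mean) ** 3 for v in x) / n / sd ** 3
    kurt = sum((v - mean) ** 4 for v in x) / n / var ** 2
    return mean, var, skew, kurt

# hypothetical AC power output samples (kW) from a PV farm
power = [0.0, 120.5, 310.2, 450.0, 480.3, 410.9, 250.1, 60.4]
moments = first_four_moments(power)
```

The (skewness², kurtosis) pair locates the sample in the Pearson plane and selects the distribution type to fit.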
World Hunger: Facts. Facts for Action #1.
ERIC Educational Resources Information Center
Phillips, Jim
Designed for global education at the high school level, this document presents statistics on malnutrition, infant mortality, and illiteracy in developing nations. The statistics are compared with private and government expenditures of wealthy nations. Examples of the statistical information for developing nations are: more than 500 million people…
Color constancy in natural scenes explained by global image statistics
Foster, David H.; Amano, Kinjiro; Nascimento, Sérgio M. C.
2007-01-01
To what extent do observers' judgments of surface color with natural scenes depend on global image statistics? To address this question, a psychophysical experiment was performed in which images of natural scenes under two successive daylights were presented on a computer-controlled high-resolution color monitor. Observers reported whether there was a change in reflectance of a test surface in the scene. The scenes were obtained with a hyperspectral imaging system and included variously trees, shrubs, grasses, ferns, flowers, rocks, and buildings. Discrimination performance, quantified on a scale of 0 to 1 with a color-constancy index, varied from 0.69 to 0.97 over 21 scenes and two illuminant changes, from a correlated color temperature of 25,000 K to 6700 K and from 4000 K to 6700 K. The best account of these effects was provided by receptor-based rather than colorimetric properties of the images. Thus, in a linear regression, 43% of the variance in constancy index was explained by the log of the mean relative deviation in spatial cone-excitation ratios evaluated globally across the two images of a scene. A further 20% was explained by including the mean chroma of the first image and its difference from that of the second image and a further 7% by the mean difference in hue. Together, all four global color properties accounted for 70% of the variance and provided a good fit to the effects of scene and of illuminant change on color constancy, and, additionally, of changing test-surface position. By contrast, a spatial-frequency analysis of the images showed that the gradient of the luminance amplitude spectrum accounted for only 5% of the variance. PMID:16961965
The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis
NASA Astrophysics Data System (ADS)
Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali
2018-04-01
The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.
Exploratory analysis of environmental interactions in central California
De Cola, Lee; Falcone, Neil L.
1996-01-01
As part of its global change research program, the United States Geological Survey (USGS) has produced raster data that describe the land cover of the United States using a consistent format. The data consist of elevations, satellite measurements, computed vegetation indices, land cover classes, and ancillary political, topographic and hydrographic information. This open-file report uses some of these data to explore the environment of a (256-km)² region of central California. We present various visualizations of the data, multiscale correlations between topography and vegetation, a path analysis of more complex statistical interactions, and a map that portrays the influence of agriculture on the region's vegetation. An appendix contains C and Mathematica code used to generate the graphics and some of the analysis.
Local sensitivity analysis for inverse problems solved by singular value decomposition
Hill, M.C.; Nolan, B.T.
2010-01-01
Local sensitivity analysis provides computationally frugal ways to evaluate models commonly used for resource management, risk assessment, and so on. This includes diagnosing inverse model convergence problems caused by parameter insensitivity and(or) parameter interdependence (correlation), understanding what aspects of the model and data contribute to measures of uncertainty, and identifying new data likely to reduce model uncertainty. Here, we consider sensitivity statistics relevant to models in which the process model parameters are transformed using singular value decomposition (SVD) to create SVD parameters for model calibration. The statistics considered include the PEST identifiability statistic, and combined use of the process-model parameter statistics composite scaled sensitivities and parameter correlation coefficients (CSS and PCC). The statistics are complementary in that the identifiability statistic integrates the effects of parameter sensitivity and interdependence, while CSS and PCC provide individual measures of sensitivity and interdependence. PCC quantifies correlations between pairs or larger sets of parameters; when a set of parameters is intercorrelated, the absolute value of PCC is close to 1.00 for all pairs in the set. The number of singular vectors to include in the calculation of the identifiability statistic is somewhat subjective and influences the statistic. To demonstrate the statistics, we use the USDA's Root Zone Water Quality Model to simulate nitrogen fate and transport in the unsaturated zone of the Merced River Basin, CA. There are 16 log-transformed process-model parameters, including water content at field capacity (WFC) and bulk density (BD) for each of five soil layers. Calibration data consisted of 1,670 observations comprising soil moisture, soil water tension, aqueous nitrate and bromide concentrations, soil nitrate concentration, and organic matter content.
All 16 of the SVD parameters could be estimated by regression based on the range of singular values. Identifiability statistic results varied based on the number of SVD parameters included. Identifiability statistics calculated for four SVD parameters indicate the same three most important process-model parameters as CSS/PCC (WFC1, WFC2, and BD2), but the order differed. Additionally, the identifiability statistic showed that BD1 was almost as dominant as WFC1. The CSS/PCC analysis showed that this results from its high correlation with WFC1 (-0.94), and not its individual sensitivity. Such distinctions, combined with analysis of how high correlations and(or) sensitivities result from the constructed model, can produce important insights into, for example, the use of sensitivity analysis to design monitoring networks. In conclusion, the statistics considered identified similar important parameters. They differ because (1) CSS/PCC can be more awkward because sensitivity and interdependence are considered separately and (2) identifiability requires consideration of how many SVD parameters to include. A continuing challenge is to understand how these computationally efficient methods compare with computationally demanding global methods like Markov-Chain Monte Carlo given common nonlinear processes and the often even more nonlinear models.
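Both statistics discussed above can be computed from a model's sensitivity (Jacobian) matrix. A minimal two-parameter sketch with unit weights (illustrative only; PEST and UCODE use weighted sensitivities of the log-transformed parameters):

```python
def composite_scaled_sensitivity(jac, params):
    """CSS per parameter: root-mean-square of parameter-scaled sensitivities."""
    n = len(jac)
    return [(sum((row[j] * params[j]) ** 2 for row in jac) / n) ** 0.5
            for j in range(len(params))]

def parameter_correlation(jac):
    """PCC for a 2-parameter problem, from the inverse normal matrix J'J."""
    a = sum(r[0] * r[0] for r in jac)
    b = sum(r[0] * r[1] for r in jac)
    d = sum(r[1] * r[1] for r in jac)
    # covariance is proportional to inv(J'J) = [[d, -b], [-b, a]] / det,
    # so the correlation of the two parameters is -b / sqrt(a * d)
    return -b / (a * d) ** 0.5
```

A PCC near ±1.00, as for BD1 and WFC1 above, signals that the two parameters cannot be estimated independently from the available observations.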
New advances in the statistical parton distributions approach
NASA Astrophysics Data System (ADS)
Soffer, Jacques; Bourrely, Claude
2016-03-01
The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Many serious challenging issues remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results. Presented by J. Soffer at POETIC 2015
Gannon, J.L.
2012-01-01
Statistics on geomagnetic storms with minima below -50 nanotesla are compiled using a 25-year span of the 1-minute resolution disturbance index, U.S. Geological Survey Dst. At the 50th percentile level, the sudden commencement magnitude, main phase minimum, and time between the two are 35 nanotesla, -100 nanotesla, and 12 hours, respectively. The cumulative distribution functions for each of these features are presented. The correlation between sudden commencement magnitude and main phase magnitude is shown to be low. Small, medium, and large storm templates at the 33rd, 50th, and 90th percentiles are presented and compared to real examples. In addition, the relative occurrence of rates of change in Dst is presented.
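The percentile-level storm templates above rest on empirical cumulative distribution functions of each storm feature. A minimal sketch of an interpolated empirical percentile (the main-phase minima below are hypothetical, in nanotesla):

```python
def percentile(data, p):
    """p-th percentile by linear interpolation between order statistics."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

# hypothetical main-phase minima (nT) for a set of storms
minima = [-55, -62, -78, -90, -100, -120, -150, -210, -95, -70]
median_minimum = percentile(minima, 50)
```

Evaluating the same function at p = 33, 50, and 90 for each storm feature yields small, medium, and large storm templates.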
2011-01-01
Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, and they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can easily be extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against meta-analyses that use single nucleotide polymorphisms suggest that studies reporting meta-analysis of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
Statistical variability and confidence intervals for planar dose QA pass rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, Daniel W.; Nelms, Benjamin E.; Attwood, Kristopher
Purpose: The most common metric for comparing measured to calculated dose, such as for pretreatment quality assurance of intensity-modulated photon fields, is a pass rate (%) generated using percent difference (%Diff), distance-to-agreement (DTA), or some combination of the two (e.g., gamma evaluation). For many dosimeters, the grid of analyzed points corresponds to an array with a low areal density of point detectors. In these cases, the pass rates for any given comparison criteria are not absolute but exhibit statistical variability that is a function, in part, of the detector sampling geometry. In this work, the authors analyze the statistics of various methods commonly used to calculate pass rates and propose methods for establishing confidence intervals for pass rates obtained with low-density arrays. Methods: Dose planes were acquired for 25 prostate and 79 head and neck intensity-modulated fields via diode array and electronic portal imaging device (EPID), and matching calculated dose planes were created via a commercial treatment planning system. Pass rates for each dose plane pair (both centered to the beam central axis) were calculated with several common comparison methods: %Diff/DTA composite analysis and gamma evaluation, using absolute dose comparison with both local and global normalization. Specialized software was designed to selectively sample the measured EPID response (very high data density) down to discrete points to simulate low-density measurements. The software was used to realign the simulated detector grid at many simulated positions with respect to the beam central axis, thereby altering the low-density sampled grid. Simulations were repeated with 100 positional iterations using a 1 detector/cm² uniform grid, a 2 detector/cm² uniform grid, and similar random detector grids.
For each simulation, %/DTA composite pass rates were calculated with various %Diff/DTA criteria and for both local and global %Diff normalization techniques. Results: For the prostate and head/neck cases studied, the pass rates obtained with gamma analysis of high density dose planes were 2%-5% higher than respective %/DTA composite analysis on average (ranging as high as 11%), depending on tolerances and normalization. Meanwhile, the pass rates obtained via local normalization were 2%-12% lower than with global maximum normalization on average (ranging as high as 27%), depending on tolerances and calculation method. Repositioning of simulated low-density sampled grids leads to a distribution of possible pass rates for each measured/calculated dose plane pair. These distributions can be predicted using a binomial distribution in order to establish confidence intervals that depend largely on the sampling density and the observed pass rate (i.e., the degree of difference between measured and calculated dose). These results can be extended to apply to 3D arrays of detectors, as well. Conclusions: Dose plane QA analysis can be greatly affected by choice of calculation metric and user-defined parameters, and so all pass rates should be reported with a complete description of calculation method. Pass rates for low-density arrays are subject to statistical uncertainty (vs. the high-density pass rate), but these sampling errors can be modeled using statistical confidence intervals derived from the sampled pass rate and detector density. Thus, pass rates for low-density array measurements should be accompanied by a confidence interval indicating the uncertainty of each pass rate.
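Such a confidence interval follows directly from treating each sampled detector point as a Bernoulli trial. A sketch using the Wilson score interval, one common binomial interval (the paper's exact construction may differ):

```python
import math

def wilson_interval(passed, n, z=1.96):
    """Approximate 95% confidence interval for a binomial pass rate."""
    p = passed / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# e.g. 90 of 100 sampled detector points pass the gamma criterion
low, high = wilson_interval(90, 100)
```

The interval narrows as detector density (and hence n) grows, which is why the high-density EPID pass rates serve as the reference in the simulations above.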
Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis
NASA Astrophysics Data System (ADS)
Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.
2016-08-01
This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The Paleomagnetism.org application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector approach foldtest, two reversal tests including a Monte Carlo simulation on mean directions, and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in coordinates of major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases.
This exported file contains all data and can later be imported to the application by other researchers. The accessibility and simplicity through which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
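The standard Fisher statistics offered in the statistics portal follow from summing the unit vectors of the individual directions. A minimal sketch of the Fisher mean direction, precision parameter k, and α95 (the input directions are illustrative, in degrees):

```python
import math

def fisher_mean(directions):
    """Fisher mean of (declination, inclination) pairs in degrees."""
    x = y = z = 0.0
    for dec, inc in directions:
        d, i = math.radians(dec), math.radians(inc)
        x += math.cos(i) * math.cos(d)
        y += math.cos(i) * math.sin(d)
        z += math.sin(i)
    n = len(directions)
    r = math.sqrt(x * x + y * y + z * z)       # resultant vector length
    mean_dec = math.degrees(math.atan2(y, x)) % 360
    mean_inc = math.degrees(math.asin(z / r))
    k = (n - 1) / (n - r)                      # Fisher precision parameter
    a95 = math.degrees(math.acos(1 - (n - r) / r * (20 ** (1 / (n - 1)) - 1)))
    return mean_dec, mean_inc, k, a95
```

Tightly clustered directions give r close to n, hence large k and a small α95 confidence cone.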
NASA Astrophysics Data System (ADS)
Clotet, Xavier; Santucci, Stéphane; Ortín, Jordi
2016-01-01
We report the results of an experimental investigation of the spatiotemporal dynamics of stable imbibition fronts in a disordered medium, in the regime of capillary disorder, for a wide range of experimental conditions. We have used silicone oils of various viscosities μ and nearly identical oil-air surface tension, and forced them to slowly invade a model open fracture at very different flow rates v. In this second part of the study we have carried out a scale-dependent statistical analysis of the front dynamics. We have specifically analyzed the influence of μ and v on the statistical properties of the velocity Vℓ, the spatial average of the local front velocities over a window of lateral size ℓ. We have varied ℓ from the local scale defined by our spatial resolution up to the lateral system size L. Even though the imposed flow rate is constant, the signals Vℓ(t) present very strong fluctuations which evolve systematically with the parameters μ, v, and ℓ. We have verified that the non-Gaussian fluctuations of the global velocity Vℓ(t) are very well described by a generalized Gumbel statistics. The asymmetric shape and the exponential tail of those distributions are controlled by the number of effective degrees of freedom of the imbibition fronts, given by Neff = ℓ/ℓc (the ratio of the lateral size of the measuring window ℓ to the correlation length ℓc ~ 1/√(μv)). The large correlated excursions of Vℓ(t) correspond to global avalanches, which reflect extra displacements of the imbibition fronts. We show that global avalanches are power-law distributed, both in sizes and durations, with robustly defined exponents, independent of μ, v, and ℓ. Nevertheless, the exponential upper cutoffs of the distributions evolve systematically with those parameters. We have found, moreover, that maximum sizes ξS and maximum durations ξT of global avalanches are not controlled by the same mechanism. While ξS is also determined by ℓ/ℓc, like the amplitude of the fluctuations of Vℓ(t), ξT and the temporal correlations of Vℓ(t) evolve much more strongly with the imposed flow rate v than with the fluid viscosity μ.
The GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP)
NASA Astrophysics Data System (ADS)
Donlon, C.; Ghrsst-Pp Science Team
2003-04-01
This paper summarises the Development and Implementation Plan of the GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP). The aim of the GHRSST-PP is to coordinate a new generation of global, multi-sensor, high-resolution (better than 10 km and 12 hours) SST products for the benefit of the operational and scientific community and for those with a potential interest in the products of GODAE. The GHRSST-PP project will deliver a demonstration system that integrates data from existing international satellite and in situ data sources using state-of-the-art communications and analysis tools. Primary GHRSST-PP products will be generated by fusing infrared and microwave satellite data obtained from sensors in near-polar, geostationary and low earth orbits, constrained by in situ observations. Surface skin SST, sub-surface SST and SST at depth will be produced as both merged and analysed data products. Merged data products have a common grid, with all input data retaining their error statistics, whereas analysed data products use all data to derive a best-estimate field with a single set of error statistics. Merged SST fields will not be interpolated, thereby preserving the integrity of the source data as much as possible. Products will first be produced and validated using in situ observations for regional areas by regional data assembly centres (RDAC) and sent to a global data analysis centre (GDAC) for integration with other data to provide global coverage. GDAC and RDAC will be connected together with other data using a virtual dynamic distributed database (DDD). The GDAC will merge and analyse RDAC data together with other data (from the GTS and space agencies) to provide global coverage every 12 hours in real time. In all cases data products will be accurate to better than 0.5 K, validated using data collected at globally distributed diagnostic data set (DDS) sites.
A user information service (UIS) will work together with user applications and services (AUS) to ensure that the GHRSST-PP is able to respond appropriately to user demands. In addition, the GDAC will provide product validation and dissemination services as well as the means for researchers to test and use the In situ and Satellite Data Integration Processing Model (ISDI-PM) operational demonstration code using a large supercomputer.
NASA Astrophysics Data System (ADS)
Sommer, Philipp; Kaplan, Jed
2016-04-01
Accurate modelling of large-scale vegetation dynamics, hydrology, and other environmental processes requires meteorological forcing on daily timescales. While meteorological data with high temporal resolution is becoming increasingly available, simulations for the future or distant past are limited by lack of data and poor performance of climate models, e.g., in simulating daily precipitation. To overcome these limitations, we may temporally downscale monthly summary data to a daily time step using a weather generator. Parameterization of such statistical models has traditionally been based on a limited number of observations. Recent developments in the archiving, distribution, and analysis of "big data" datasets provide new opportunities for the parameterization of a temporal downscaling model that is applicable over a wide range of climates. Here we parameterize a WGEN-type weather generator using more than 50 million individual daily meteorological observations, from over 10'000 stations covering all continents, based on the Global Historical Climatology Network (GHCN) and Synoptic Cloud Reports (EECRA) databases. Using the resulting "universal" parameterization and driven by monthly summaries, we downscale mean temperature (minimum and maximum), cloud cover, and total precipitation, to daily estimates. We apply a hybrid gamma-generalized Pareto distribution to calculate daily precipitation amounts, which overcomes much of the inability of earlier weather generators to simulate high amounts of daily precipitation. Our globally parameterized weather generator has numerous applications, including vegetation and crop modelling for paleoenvironmental studies.
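The hybrid precipitation model described above can be sketched as: draw a wet-day amount from a gamma body, and replace draws beyond a threshold with a generalized Pareto exceedance via its inverse CDF. All parameter values here are hypothetical, not the fitted GHCN/EECRA values:

```python
import random

def daily_precip(shape, scale, thresh, gp_shape, gp_scale, rng=random):
    """One wet-day amount from a gamma body with a generalized Pareto tail."""
    amount = rng.gammavariate(shape, scale)
    if amount <= thresh:
        return amount
    # re-draw the tail from a generalized Pareto above the threshold
    u = rng.random()
    return thresh + gp_scale / gp_shape * ((1 - u) ** (-gp_shape) - 1)

rng = random.Random(42)
sample = [daily_precip(0.8, 5.0, 20.0, 0.2, 6.0, rng) for _ in range(365)]
```

The heavy Pareto tail is what lets a generator reproduce rare, very large daily totals that a pure gamma fit tends to underestimate.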
Quality of life in breast cancer patients--a quantile regression analysis.
Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma
2008-01-01
Quality of life studies play an important role in health care, especially in chronic diseases, in clinical judgment and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life. But when the response is not normally distributed, the results are misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients, using a quantile regression model, and to compare the results to linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors, and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for the global health status for breast cancer patients was 64.92+/-11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Also, emotional functioning and duration of disease statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about predictors of breast cancer patient quality of life.
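Quantile regression differs from linear regression by minimizing the check (pinball) loss rather than squared error, which is what makes it robust to non-normal responses. A minimal stdlib sketch of that loss, with hypothetical data (not the study's):

```python
def pinball_loss(y_true, y_pred, tau):
    """Mean check (pinball) loss; quantile regression for quantile `tau`
    minimizes this instead of the mean squared error."""
    total = 0.0
    for y, q in zip(y_true, y_pred):
        e = y - q
        total += max(tau * e, (tau - 1.0) * e)
    return total / len(y_true)

# A constant predictor equal to the sample median minimizes the tau=0.5 loss.
loss = pinball_loss([1.0, 2.0, 3.0], [2.0, 2.0, 2.0], 0.5)
```

For tau = 0.5 the loss is half the mean absolute error, so the fitted curve tracks the conditional median; other tau values trace out the conditional quartiles compared in the abstract.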
Hyperconnectivity in juvenile myoclonic epilepsy: a network analysis.
Caeyenberghs, K; Powell, H W R; Thomas, R H; Brindley, L; Church, C; Evans, J; Muthukumaraswamy, S D; Jones, D K; Hamandi, K
2015-01-01
Juvenile myoclonic epilepsy (JME) is a common idiopathic (genetic) generalized epilepsy (IGE) syndrome characterized by impairments in executive and cognitive control, affecting independent living and psychosocial functioning. There is a growing consensus that JME is associated with abnormal function of diffuse brain networks, typically affecting frontal and fronto-thalamic areas. Using diffusion MRI and a graph theoretical analysis, we examined bivariate (network-based statistic) and multivariate (global and local) properties of structural brain networks in patients with JME (N = 34) and matched controls. Neuropsychological assessment was performed in a subgroup of 14 patients. Neuropsychometry revealed impaired visual memory and naming in JME patients despite a normal full scale IQ (mean = 98.6). Both JME patients and controls exhibited a small world topology in their white matter networks, with no significant differences in the global multivariate network properties between the groups. The network-based statistic approach identified one subnetwork of hyperconnectivity in the JME group, involving primary motor, parietal and subcortical regions. Finally, there was a significant positive correlation in structural connectivity with cognitive task performance. Our findings suggest that structural changes in JME patients are distributed at a network level, beyond the frontal lobes. The identified subnetwork includes key structures in spike wave generation, along with primary motor areas, which may contribute to myoclonic jerks. We conclude that analyzing the affected subnetworks may provide new insights into understanding seizure generation, as well as the cognitive deficits observed in JME patients.
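Two of the multivariate graph metrics referred to above, the mean clustering coefficient and the characteristic path length (the ingredients of small-world topology), can be computed from an adjacency structure as in this sketch. The toy graph is purely illustrative, not patient connectivity data.

```python
from collections import deque

def clustering(adj):
    """Mean local clustering coefficient of an undirected graph
    given as {node: set_of_neighbours}."""
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # Count edges among the node's neighbours (each pair once).
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def char_path_length(adj):
    """Characteristic path length via BFS (assumes a connected graph)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

# Toy 4-node graph: a triangle (0,1,2) plus a pendant node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
```

A small-world network has clustering well above, and path length close to, those of a degree-matched random graph.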
NASA Astrophysics Data System (ADS)
Zhu, X.
2017-12-01
Based on methods of statistical analysis, the time series of global surface air temperature (SAT) anomalies from 1860-2014 has been characterized by three types of phase changes, identified by dividing the temperature record into different stages. The ability of CMIP5 models to simulate these three types of phase changes was evaluated. The conclusions are as follows: the SAT record from 1860-2014 can be divided into six stages according to trend differences, and this subdivision is statistically significant. Based on trend analysis and the distribution of slopes between any two points (two-point slopes) in every stage, the six stages can be summarized as three types of phase changes: warming, cooling, and hiatus. Between 1860 and 2014, the world experienced three warming phases (1860-1878, 1909-1942, 1975-2004), one cooling phase (1878-1909), and two hiatus phases (1942-1975, 2004-2014). Using this definition method, whether the next year belongs to the previous phase can be estimated; the temperature in 2015 was used as an example to validate the feasibility of this method. The CMIP5 models simulate the warming periods well; however, they cannot reproduce the characteristics of the SAT during the cooling and hiatus periods. As such, projections of future warming phases using the CMIP5 models are credible, but projections of cooling and hiatus events are unreliable.
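The stage classification rests on the sign and size of each stage's least-squares trend; a minimal sketch under assumed thresholds (the ±0.003 K/yr hiatus band and the synthetic series are illustrative, not the paper's criteria or data):

```python
def ols_slope(years, temps):
    """Least-squares trend (units per year) over one stage."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    num = sum((x - my) * (y - mt) for x, y in zip(years, temps))
    den = sum((x - my) ** 2 for x in years)
    return num / den

def classify_stage(slope, hiatus_band=0.003):
    """Label a stage from its trend; the +/-0.003 K/yr band is illustrative."""
    if slope > hiatus_band:
        return "warming"
    if slope < -hiatus_band:
        return "cooling"
    return "hiatus"

stage = list(range(1975, 2005))
temps = [0.015 * (y - 1975) for y in stage]   # synthetic +0.15 K/decade trend
label = classify_stage(ols_slope(stage, temps))
```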
NASA Astrophysics Data System (ADS)
Kim, D.; Youn, J.; Kim, C.
2017-08-01
As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor would be time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from individual arrays automatically. The performance of the proposed algorithm was tested on three sample images, which verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
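A local detection rule of the kind described can be sketched as flagging panels whose mean intensity departs from the array-local mean by more than k standard deviations. The intensity values and the threshold k below are illustrative, not the paper's calibrated settings.

```python
from statistics import mean, stdev

def flag_defective(panel_means, k=2.0):
    """Local detection rule: flag panels whose mean thermal intensity
    exceeds the array-local mean by more than k standard deviations.
    k is the adjustable sensitivity threshold."""
    m, s = mean(panel_means), stdev(panel_means)
    return [i for i, v in enumerate(panel_means) if v - m > k * s]

# One array of panel mean intensities; panel 3 runs hot.
intensities = [30.1, 30.4, 29.8, 38.0, 30.2, 29.9, 30.3, 30.0]
hot = flag_defective(intensities)
```

Because m and s are recomputed per array, the rule is insensitive to the distance-dependent temperature offset that defeats a single global threshold.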
Tang, Guoping; Shafer, Sarah L.; Bartlein, Patrick J.; Holman, Justin O.
2009-01-01
Prognostic vegetation models have been widely used to study the interactions between environmental change and biological systems. This study examines the sensitivity of vegetation model simulations to: (i) the selection of input climatologies representing different time periods and their associated atmospheric CO2 concentrations, (ii) the choice of observed vegetation data for evaluating the model results, and (iii) the methods used to compare simulated and observed vegetation. We use vegetation simulated for Asia by the equilibrium vegetation model BIOME4 as a typical example of vegetation model output. BIOME4 was run using 19 different climatologies and their associated atmospheric CO2 concentrations. The Kappa statistic, Fuzzy Kappa statistic and a newly developed map-comparison method, the Nomad index, were used to quantify the agreement between the biomes simulated under each scenario and the observed vegetation from three different global land- and tree-cover data sets: the global Potential Natural Vegetation data set (PNV), the Global Land Cover Characteristics data set (GLCC), and the Global Land Cover Facility data set (GLCF). The results indicate that the 30-year mean climatology (and its associated atmospheric CO2 concentration) for the time period immediately preceding the collection date of the observed vegetation data produce the most accurate vegetation simulations when compared with all three observed vegetation data sets. The study also indicates that the BIOME4-simulated vegetation for Asia more closely matches the PNV data than the other two observed vegetation data sets. Given the same observed data, the accuracy assessments of the BIOME4 simulations made using the Kappa, Fuzzy Kappa and Nomad index map-comparison methods agree well when the compared vegetation types consist of a large number of spatially continuous grid cells. The results of this analysis can assist model users in designing experimental protocols for simulating vegetation.
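Of the three map-comparison methods named above, the Kappa statistic is the simplest: it measures categorical agreement beyond chance. A minimal sketch of Cohen's Kappa for two flattened categorical maps (the biome labels are hypothetical, not BIOME4 output):

```python
from collections import Counter

def kappa(map_a, map_b):
    """Cohen's Kappa for two equal-length categorical rasters
    (flattened to 1-D sequences of class labels)."""
    n = len(map_a)
    po = sum(a == b for a, b in zip(map_a, map_b)) / n          # observed agreement
    ca, cb = Counter(map_a), Counter(map_b)
    pe = sum(ca[c] * cb.get(c, 0) for c in ca) / (n * n)        # chance agreement
    return (po - pe) / (1.0 - pe)

simulated = ["forest", "forest", "steppe", "desert", "steppe", "forest"]
observed  = ["forest", "steppe", "steppe", "desert", "steppe", "forest"]
score = kappa(simulated, observed)
```

Fuzzy Kappa and the Nomad index extend this idea by crediting near-miss agreement between neighbouring cells, which plain Kappa ignores.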
Badr, R; Hashemi, M; Javadi, G; Movafagh, A; Mahdian, R
2016-01-01
It is well known that the hippocampus has a pivotal role in learning and in the formation and consolidation of memory. Global cerebral ischemia causes severe damage to pyramidal neurons of the CA1 region and usually results in residual neurological deficits following recovery from ischemia. Scientists are investigating the molecular mechanism of apoptosis in order to exploit this form of cell death for clinical treatment. In this investigation, we evaluated the molecular mechanism of FK-506 in apoptosis by quantifying the expression of the BAX and BCL-2 genes in the hippocampus following global ischemia/reperfusion. In the present experimental study, adult male Wistar rats were obtained and housed under standard conditions. Each experimental group consisted of five rats; the groups were normal control, ischemia/reperfusion, and ischemia/reperfusion followed by FK-506. Global ischemia was induced in the animals of the ischemia and drug groups. In the drug group, moreover, two doses of FK-506 were administered: an IV injection and an intra-peritoneal (IP) injection after 48 h. The hippocampus was then removed, RNA was isolated, cDNA was synthesized, and Real-Time PCR was performed. Finally, the obtained data were analyzed statistically (p<0.05). The quantitative results showed that the BAX expression ratio increased approximately three-fold in the ischemia/reperfusion group (3.11 ± 0.28) compared to the untreated control and the drug group (p<0.001). The statistical analysis showed a significant difference in BCL-2 expression between the experimental groups (p<0.001). The mRNA level of BCL-2 decreased in the ischemia/reperfusion group, while it was unaltered in the other groups. The results showed that global cerebral ischemia/reperfusion induced BAX, a pro-apoptotic gene, and that tacrolimus, a neuroprotective drug, inhibited BAX expression and induced expression of the anti-apoptotic gene BCL-2 (Tab. 2, Fig. 3, Ref. 21).
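Relative expression from Real-Time PCR data is commonly quantified with the 2^-ddCt method; a minimal sketch with hypothetical Ct values chosen to reproduce a roughly three-fold BAX increase (the study's exact quantification procedure and raw Ct values are not given in the abstract):

```python
def fold_change(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt method: target-gene Ct values
    are normalized to a reference gene, then treatment is compared
    to control."""
    d_ct_treat = ct_target - ct_ref           # dCt, treated sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl  # dCt, control sample
    return 2.0 ** -(d_ct_treat - d_ct_ctrl)   # 2^-ddCt

# Hypothetical Ct values: BAX amplifies ~1.6 cycles earlier after ischemia.
ratio = fold_change(22.4, 18.0, 24.0, 18.0)
```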
NASA Astrophysics Data System (ADS)
Allen, G. H.; David, C. H.; Andreadis, K. M.; Emery, C. M.; Famiglietti, J. S.
2017-12-01
Earth observing satellites provide valuable near real-time (NRT) information about flood occurrence and magnitude worldwide. This NRT information can be used in early flood warning systems and other flood management applications to save lives and mitigate flood damage. However, these NRT products are only useful to early flood warning systems if they are quickly made available, with sufficient time for flood mitigation actions to be implemented. More specifically, NRT data latency, or the time period between the satellite observation and when the user has access to the information, must be less than the time it takes a flood to travel from the flood observation location to a given downstream point of interest. Yet the paradigm that "lower latency is always better" may not necessarily hold true in river systems due to tradeoffs between data latency and data quality. Further, the existence of statistical breaks in the global distribution of flood wave travel time (i.e. a jagged statistical distribution) would represent preferable latencies for river-observation NRT remote sensing products. Here we present a global analysis of flood wave velocity (i.e. flow celerity) and travel time. We apply a simple kinematic wave model to a global hydrography dataset and calculate flow wave celerity and travel time during bankfull flow conditions. Bankfull flow corresponds to the condition of maximum celerity and thus we present the "worst-case scenario" minimum flow wave travel time. We conduct a similar analysis with respect to the time it takes flood waves to reach the next downstream city, as well as the next downstream reservoir. Finally, we conduct these same analyses, but with regards to the technical capabilities of the planned Surface Water and Ocean Topography (SWOT) satellite mission, which is anticipated to provide waterbody elevation and extent measurements at an unprecedented spatial and temporal resolution. 
We validate these results with discharge records from paired USGS gauge stations located along a diverse collection of river reaches. These results provide a scientific rationale for optimizing the utility of existing and future NRT river-observation products.
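A kinematic-wave travel-time estimate of the kind described above can be sketched as follows: for a wide channel with Manning friction, the flood-wave celerity is about 5/3 of the mean velocity, and travel time is reach length over celerity. The roughness, depth, slope, and reach length are illustrative values, not the study's global hydrography inputs.

```python
def manning_velocity(n, depth, slope):
    """Mean velocity (m/s) for a wide rectangular channel via Manning's
    equation, with hydraulic radius approximated by flow depth."""
    return depth ** (2.0 / 3.0) * slope ** 0.5 / n

def kinematic_celerity(v):
    """Kinematic flood-wave celerity; the 5/3 factor follows from
    Manning friction in a wide channel."""
    return 5.0 * v / 3.0

def travel_time_hours(reach_km, celerity):
    """Time for a flood wave to traverse the reach, in hours."""
    return reach_km * 1000.0 / celerity / 3600.0

v = manning_velocity(n=0.035, depth=3.0, slope=0.0005)
t = travel_time_hours(120.0, kinematic_celerity(v))
```

Comparing such travel times against product latency is exactly the tradeoff the abstract describes: an NRT observation is only actionable if it arrives before the wave does.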
De Barro, Paul; Ahmed, Muhammad Z
2011-01-01
A challenge within the context of cryptic species is the delimitation of individual species within the complex. Statistical parsimony network analysis offers the opportunity to explore limits in situations where there are insufficient species-specific morphological characters to separate taxa. The results also enable us to explore the spread of taxa that have invaded globally. Using a 657 bp portion of mitochondrial cytochrome oxidase 1 from 352 unique haplotypes belonging to the Bemisia tabaci cryptic species complex, the analysis revealed 28 networks plus 7 unconnected individual haplotypes. Of the networks, 24 corresponded to the putative species identified using the rule set devised by Dinsdale et al. (2010). Only two species proposed in Dinsdale et al. (2010) departed substantially from the structure suggested by the analysis. The analysis of the two invasive members of the complex, Mediterranean (MED) and Middle East - Asia Minor 1 (MEAM1), showed that in both cases only a small number of haplotypes represent the majority that have spread beyond the home range; one MEAM1 and three MED haplotypes account for >80% of the GenBank records. Israel is a possible source of the globally invasive MEAM1, whereas MED has two possible sources. The first is the eastern Mediterranean, which has invaded only the USA, primarily Florida and to a lesser extent California. The second comprises western Mediterranean haplotypes that have spread to the USA, Asia and South America. The structure for MED supports two home-range distributions, a Sub-Saharan range and a Mediterranean range. The MEAM1 network supports the Middle East - Asia Minor region. The network analyses show a high level of congruence with the species identified in a previous phylogenetic analysis. The analysis of the two globally invasive members of the complex supports the view that global invasions often involve very small portions of the available genetic diversity.
Boal Carvalho, Pedro; Magalhães, Joana; Dias de Castro, Francisca; Rosa, Bruno; Cotter, José
2017-03-31
Helicobacter pylori eradication has become increasingly difficult as resistance to several antibiotics develops. We aimed to compare Helicobacter pylori eradication rates between triple therapy and sequential therapy in a treatment-naive Portuguese population. This was a prospective randomized trial including consecutive patients referred for first-line Helicobacter pylori eradication treatment. Exclusion criteria were previous gastric surgery/neoplasia, pregnancy/lactation, and allergy to any of the drugs. The compared eradication regimens were triple therapy (pantoprazole, amoxicillin and clarithromycin every 12 hours for 14 days) and sequential therapy (pantoprazole every 12 hours for 10 days, amoxicillin every 12 hours on days 1-5 and clarithromycin plus metronidazole every 12 hours on days 6-10). Eradication success was confirmed with the urea breath test. Statistical analysis was performed with SPSS v21.0 and a p-value < 0.05 was considered statistically significant. Sixty patients were included, 39 (65%) female, with mean age 52 years (SD ± 14.3). Treatment groups were homogeneous for gender, age, indication for treatment and smoking status. No statistical differences were found between sequential and triple therapy eradication rates (86.2% vs 77.4%, p = 0.379); the global eradication rate was 82%. Tobacco consumption was associated with significantly lower eradication success (54.5% vs 87.8%, p = 0.022). In this randomized controlled trial in a naive Portuguese population, we found a satisfactory global Helicobacter pylori eradication rate of 82%, with no statistical difference in efficacy between the triple and sequential regimens. These results support the use of either therapy for first-line eradication of Helicobacter pylori.
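Comparing two eradication rates reduces to a 2x2 contingency test. A stdlib sketch using a Pearson chi-square with 1 df, with counts reconstructed from the reported rates and the 60-patient total (the abstract does not state which test the authors ran, so this is an assumption):

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the 2x2
    table [[a, b], [c, d]], plus its p-value from the chi-square tail."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    chi2 = num / den
    p = erfc(sqrt(chi2 / 2.0))  # upper tail of chi-square with 1 df
    return chi2, p

# Eradicated vs not: sequential (25/29 = 86.2%) against triple (24/31 = 77.4%);
# counts are reconstructed from the reported percentages, not taken from the paper.
chi2, p = chi2_2x2(25, 4, 24, 7)
```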
Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.
Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C
2010-08-06
Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. 
Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in proteins spectral counts reflect the underlying biology of the samples.
de Almeida, C M; da Rosa, W L O; Meereis, C T W; de Almeida, S M; Ribeiro, J S; da Silva, A F; Lund, Rafael Guerra
2018-06-01
The purpose of this study was to evaluate the efficacy of orthodontic bonding systems containing different antimicrobial agents, as well as the influence of antimicrobial agent incorporation on the bonding properties of these materials. Eight databases were searched: PubMed (Medline), Web of Science, Scopus, Lilacs, Ibecs, BBO, Scielo and Google Scholar. Any study that evaluated antimicrobial activity in experimental or commercial orthodontic bonding systems was included. Data were tabulated independently and in duplicate by two authors on a pre-designed data collection form. The global analysis was carried out using a random-effects model, and pooled-effect estimates were obtained by comparing the standardised mean difference of each antimicrobial orthodontic adhesive with the respective control group. A p-value < .05 was considered statistically significant. Thirty-two studies were included in the qualitative analysis; of these, 22 studies were included in the meta-analysis. Antimicrobial agents such as silver nanoparticles, benzalkonium chloride, chlorhexidine, triclosan, cetylpyridinium chloride, Galla chinensis extract, ursolic acid, dimethylaminododecyl methacrylate, dimethylaminohexadecyl methacrylate, 2-methacryloyloxyethyl phosphorylcholine, 1,3,5-triacryloylhexahydro-1,3,5-triazine, zinc oxide and titanium oxide have been incorporated into orthodontic bonding systems. The antimicrobial agent incorporation in orthodontic bonding systems showed higher antimicrobial activity than the control group in agar diffusion (overall standardised mean difference: 3.71; 95% CI 2.98 to 4.43) and optical density tests (0.41; 95% CI -0.05 to 0.86) (p < .05). However, for biofilm, the materials did not present antimicrobial activity (6.78; 95% CI 4.78 to 8.77). Regarding bond strength, the global analysis showed antimicrobial orthodontic bonding systems were statistically similar to the control.
Although there is evidence of antibacterial activity from in vitro studies, clinical and long-term studies are still necessary to confirm the effectiveness of antibacterial orthodontic bonding systems in preventing caries disease.
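Random-effects pooling of standardised mean differences, as used in the meta-analysis above, can be sketched with a DerSimonian-Laird estimate of the between-study variance. The per-study SMDs and variances below are hypothetical, not the included studies' values.

```python
from math import sqrt

def pooled_smd(effects):
    """Inverse-variance pooled standardized mean difference with a
    DerSimonian-Laird between-study variance (tau^2); `effects` is a
    list of (smd, variance) pairs from individual studies."""
    w = [1.0 / v for _, v in effects]
    fixed = sum(wi * e for wi, (e, _) in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic and the DL tau^2 estimate.
    q = sum(wi * (e - fixed) ** 2 for wi, (e, _) in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance.
    wr = [1.0 / (v + tau2) for _, v in effects]
    est = sum(wi * e for wi, (e, _) in zip(wr, effects)) / sum(wr)
    se = sqrt(1.0 / sum(wr))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical per-study SMDs (antimicrobial vs control inhibition zones).
est, ci = pooled_smd([(3.2, 0.40), (4.1, 0.55), (3.6, 0.30)])
```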
ERIC Educational Resources Information Center
Kim, Ki Su
2005-01-01
This article examines the relationship between globalization and national education reforms, especially those of educational systems. Instead of exploring the much debated issues of how globalization affects national educational systems and how the nations react by what kinds of systemic education reform, however, it focuses on what such a method…
NASA Astrophysics Data System (ADS)
Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.
2007-03-01
This study presents an empirical relation that links the layer-integrated depolarization ratios, extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with the cloud effective radius retrieved from MODIS, the cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements. Global statistics of the cloud liquid water content and effective number density are presented.
Lei, Tianli; Chen, Shifeng; Wang, Kai; Zhang, Dandan; Dong, Lin; Lv, Chongning; Wang, Jing; Lu, Jincai
2018-02-01
Bupleuri Radix is a commonly used herb in the clinic, and raw and vinegar-baked Bupleuri Radix are both documented in the Pharmacopoeia of the People's Republic of China. According to the theories of traditional Chinese medicine, Bupleuri Radix possesses different therapeutic effects before and after processing. However, the chemical mechanism of this processing is still unknown. In this study, ultra-high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry, coupled with multivariate statistical analysis including principal component analysis and orthogonal partial least squares-discriminant analysis, was employed to holistically compare raw and vinegar-baked Bupleuri Radix for the first time. As a result, 50 peaks were detected in each of the raw and processed Bupleuri Radix samples, and a total of 49 chemical compounds were identified. Saikosaponin a, saikosaponin d, saikosaponin b3, saikosaponin e, saikosaponin c, saikosaponin b2, saikosaponin b1, 4''-O-acetyl-saikosaponin d, hyperoside and 3',4'-dimethoxy quercetin were identified as potential markers of raw and vinegar-baked Bupleuri Radix. This approach has been successfully applied to the global analysis of raw and vinegar-processed samples. Furthermore, the underlying hepatoprotective mechanism of Bupleuri Radix was predicted and related to the changes in chemical profiling. Copyright © 2017 John Wiley & Sons, Ltd.
Zhao, Xi; Dellandréa, Emmanuel; Chen, Liming; Kakadiaris, Ioannis A
2011-10-01
Three-dimensional face landmarking aims at automatically localizing facial landmarks and has a wide range of applications (e.g., face recognition, face tracking, and facial expression analysis). Existing methods assume neutral facial expressions and unoccluded faces. In this paper, we propose a general learning-based framework for reliable landmark localization on 3-D facial data under challenging conditions (i.e., facial expressions and occlusions). Our approach relies on a statistical model, called 3-D statistical facial feature model, which learns both the global variations in configurational relationships between landmarks and the local variations of texture and geometry around each landmark. Based on this model, we further propose an occlusion classifier and a fitting algorithm. Results from experiments on three publicly available 3-D face databases (FRGC, BU-3-DFE, and Bosphorus) demonstrate the effectiveness of our approach, in terms of landmarking accuracy and robustness, in the presence of expressions and occlusions.
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics
NASA Astrophysics Data System (ADS)
Lazarus, S. M.; Holman, B. P.; Splitt, M. E.
2017-12-01
A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withholding and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
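The EMOS calibration step fits a Gaussian predictive distribution whose mean and variance are linear in the ensemble mean and variance (nonhomogeneous Gaussian regression). A minimal sketch with illustrative coefficients (not fitted values), plus the probability integral transform (PIT) used to check calibration:

```python
from math import erf, sqrt
from statistics import mean, pvariance

def emos_forecast(members, a, b, c, d):
    """EMOS predictive distribution for one wind component:
    N(a + b * ensemble_mean, c + d * ensemble_variance).
    Coefficients a..d are illustrative, not fitted values."""
    mu = a + b * mean(members)
    var = c + d * pvariance(members)
    return mu, var

def pit(obs, mu, var):
    """Probability integral transform of the verifying observation;
    uniform PIT values over many cases indicate a calibrated forecast."""
    return 0.5 * (1.0 + erf((obs - mu) / sqrt(2.0 * var)))

members = [4.1, 5.3, 4.8, 5.9, 4.5]   # hypothetical u-wind members (m/s)
mu, var = emos_forecast(members, a=0.2, b=0.95, c=0.4, d=1.1)
p = pit(5.0, mu, var)
```

In practice a..d are fitted per station by minimizing CRPS or the log score; the gridding step then interpolates those local parameters using the WRF-derived statistical relationships.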
Topological Cacti: Visualizing Contour-based Statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2011-05-26
Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular, their nesting behavior and topology, often represented in the form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or their nesting, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as the width of the cactus trunk and the length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
Stec, Małgorzata; Grzebyk, Mariola
2018-01-01
The European Union (EU), striving to create economic dominance on the global market, has prepared a comprehensive development programme, initially the Lisbon Strategy and then the Strategy Europe 2020. The attainment of the strategic goals included in these prospective development programmes is intended to transform the EU into the most competitive knowledge-based economy in the world. This paper presents a statistical evaluation of the progress being made by EU member states in meeting the Europe 2020 targets. As the basis of the assessment, the authors propose a general synthetic measure in dynamic terms, which allows EU member states to be compared objectively across 10 major statistical indicators. The results indicate that most EU countries show average progress in realising Europe's development programme, which may suggest that the goals may not be achieved in the prescribed time. It is particularly important to monitor the implementation of Europe 2020 in order to arrive at the right decisions which will guarantee the accomplishment of the EU's development strategy.
A global catalogue of Ceres impact craters ≥ 1 km and preliminary analysis
NASA Astrophysics Data System (ADS)
Gou, Sheng; Yue, Zongyu; Di, Kaichang; Liu, Zhaoqin
2018-03-01
The orbital data products of Ceres, including the global LAMO image mosaic and the global HAMO DTM, with resolutions of 35 m/pixel and 135 m/pixel respectively, are utilized in this research to create a global catalogue of impact craters with diameter ≥ 1 km, and their morphometric parameters are calculated. Statistics show: (1) there are 29,219 craters in the catalogue, and the craters have various morphologies, e.g., polygonal craters, floor-fractured craters, complex craters with central peaks, etc.; (2) the smallest identifiable crater size is extended to 1 km, and the crater numbers have been updated compared with the crater catalogue (D ≥ 20 km) released by the Dawn Science Team; (3) the d/D ratios for fresh simple craters, obviously degraded simple craters and polygonal simple craters are 0.11 ± 0.04, 0.05 ± 0.04 and 0.14 ± 0.02 respectively; (4) the d/D ratios for non-polygonal complex craters and polygonal complex craters are 0.08 ± 0.04 and 0.09 ± 0.03. The global crater catalogue created in this work can be further applied to many other scientific investigations, such as comparing d/D ratios with those of other bodies, inferring subsurface properties, determining surface ages, and estimating average erosion rates.
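Depth-to-diameter (d/D) statistics like those in item (3) reduce to a grouped mean and standard deviation over the catalogued craters. A minimal sketch, with invented toy craters rather than the real catalogue:

```python
from statistics import mean, stdev

def dD_stats(craters):
    """Mean and sample standard deviation of depth/diameter ratios,
    grouped by morphological class. Input: (class, depth_km, diameter_km)."""
    by_class = {}
    for cls, depth, diameter in craters:
        by_class.setdefault(cls, []).append(depth / diameter)
    return {cls: (mean(r), stdev(r)) for cls, r in by_class.items()}

# Toy values for illustration only; not from the Ceres catalogue.
sample = [("fresh_simple", 0.55, 5.0), ("fresh_simple", 0.33, 3.0),
          ("polygonal_simple", 0.28, 2.0), ("polygonal_simple", 0.60, 4.0)]
stats = dD_stats(sample)
```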
Random Forests for Global and Regional Crop Yield Predictions.
Jeong, Jig Han; Resop, Jonathan P; Mueller, Nathaniel D; Fleisher, David H; Yun, Kyungdahm; Butler, Ethan E; Timlin, Dennis J; Shim, Kyo-Moon; Gerber, James S; Reddy, Vangimalla R; Kim, Soo-Hyung
2016-01-01
Accurate predictions of crop yield are critical for developing effective agricultural and food policies at the regional and global scales. We evaluated a machine-learning method, Random Forests (RF), for its ability to predict crop yield responses to climate and biophysical variables at global and regional scales in wheat, maize, and potato, in comparison with multiple linear regression (MLR) serving as a benchmark. We used crop yield data from various sources and regions for model training and testing: 1) gridded global wheat grain yield, 2) maize grain yield from US counties over thirty years, and 3) potato tuber and maize silage yield from the northeastern seaboard region. RF was found to be highly capable of predicting crop yields and outperformed the MLR benchmarks in all performance statistics that were compared. For example, the root mean square errors (RMSE) ranged between 6 and 14% of the average observed yield with RF models in all test cases, whereas these values ranged from 14% to 49% for MLR models. Our results show that RF is an effective and versatile machine-learning method for crop yield predictions at regional and global scales owing to its high accuracy and precision, ease of use, and utility in data analysis. RF may, however, lose accuracy when predicting the extreme ends of, or responses beyond the boundaries of, the training data.
Tropospheric Ozone Change from 1980 to 2010 Dominated by Equatorward Redistribution of Emissions
NASA Technical Reports Server (NTRS)
Zhang, Yuqiang; Cooper, Owen R.; Gaudel, Audrey; Thompson, Anne M.; Nedelec, Philippe; Ogino, Shin-Ya; West, J. Jason
2016-01-01
Ozone is an important air pollutant at the surface, and the third most important anthropogenic greenhouse gas in the troposphere. Since 1980, anthropogenic emissions of ozone precursors methane, non-methane volatile organic compounds, carbon monoxide and nitrogen oxides (NOx) have shifted from developed to developing regions. Emissions have thereby been redistributed equatorwards, where they are expected to have a stronger effect on the tropospheric ozone burden due to greater convection, reaction rates and NOx sensitivity. Here we use a global chemical transport model to simulate changes in tropospheric ozone concentrations from 1980 to 2010, and to separate the influences of changes in the spatial distribution of global anthropogenic emissions of short-lived pollutants, the magnitude of these emissions, and the global atmospheric methane concentration. We estimate that the increase in ozone burden due to the spatial distribution change slightly exceeds the combined influences of the increased emission magnitude and global methane. Emission increases in Southeast, East and South Asia may be most important for the ozone change, supported by an analysis of statistically significant increases in observed ozone above these regions. The spatial distribution of emissions dominates global tropospheric ozone, suggesting that the future ozone burden will be determined mainly by emissions from low latitudes.
A Statistical Analysis of the Solar Phenomena Associated with Global EUV Waves
NASA Astrophysics Data System (ADS)
Long, D. M.; Murphy, P.; Graham, G.; Carley, E. P.; Pérez-Suárez, D.
2017-12-01
Solar eruptions are the most spectacular events in our solar system and are associated with many different signatures of energy release including solar flares, coronal mass ejections, global waves, radio emission and accelerated particles. Here, we apply the Coronal Pulse Identification and Tracking Algorithm (CorPITA) to the high-cadence synoptic data provided by the Solar Dynamics Observatory (SDO) to identify and track global waves observed by SDO. 164 of the 362 solar flare events studied (45%) were found to have associated global waves with no waves found for the remaining 198 (55%). A clear linear relationship was found between the median initial velocity and the acceleration of the waves, with faster waves exhibiting a stronger deceleration (consistent with previous results). No clear relationship was found between global waves and type II radio bursts, electrons or protons detected in situ near Earth. While no relationship was found between the wave properties and the associated flare size (with waves produced by flares from B to X-class), more than a quarter of the active regions studied were found to produce more than one wave event. These results suggest that the presence of a global wave in a solar eruption is most likely determined by the structure and connectivity of the erupting active region and the surrounding quiet solar corona rather than by the amount of free energy available within the active region.
Global Fire Trends from Satellite ATSR Instrument Series
NASA Astrophysics Data System (ADS)
Arino, Olivier; Casadio, Stefano; Serpe, Danilo
2010-12-01
Global night-time fire counts for the years 1995 to 2009 have been obtained by using the latest version of the Along Track Scanning Radiometer TOA radiance products (level 1), and related trends have been estimated. Possible biases due to cloud coverage variations have been assumed to be negligible. The sampling number (acquisition frequency) has also been analysed and proved not to influence our results. Global night-time fire trends have been evaluated by inspecting the time series of hot spots aggregated a) at the 2°x2° scale; b) at district/country/region/continent scales; and c) globally. The statistical significance of the estimated trend parameters has been verified by means of the Mann-Kendall test. Results indicate that no trends in the absolute number of spots can be identified at the global scale, that there has been no appreciable shift in the fire season during the last fourteen years, and that statistically significant positive and negative trends are only found when data are aggregated at smaller scales.
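The Mann-Kendall test used above to assess trend significance is simple to sketch. This minimal version omits the tie correction that production implementations apply:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation
    Z score (no tie correction; minimal sketch)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A monotonically increasing series yields the maximum S = n(n-1)/2.
s, z = mann_kendall([1, 2, 3, 4, 5, 6, 7, 8])
```

|Z| > 1.96 corresponds to significance at the 5% level under the normal approximation.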
Mihic, Marko M; Todorovic, Marija Lj; Obradovic, Vladimir Lj; Mitrovic, Zorica M
2016-01-01
Social services aimed at the elderly are facing great challenges caused by the progressive aging of the global population, but also by the constant pressure to spend funds in a rational manner. This paper focuses on analyzing investments in human resources aimed at enhancing home care for the elderly, since many countries have recorded progress in this area over the past years. The goal of this paper is to stress the significance of performing an economic analysis of the investment. The paper combines statistical analysis methods such as correlation and regression analysis, methods of economic analysis, and the scenario method. The economic analysis of investing in human resources for the home care service in Serbia showed that both scenarios (investing in either additional home care hours or more beneficiaries) are cost-efficient. However, the optimal solution, with the positive (and the highest) value of the economic net present value criterion, is to invest in human resources to boost the number of home care hours from 6 to 8 hours per week and increase the number of beneficiaries to 33%. This paper shows how statistical and economic analysis results can be used to evaluate different scenarios and enable quality decision-making based on exact data, in order to improve the health and quality of life of the elderly and spend funds in a rational manner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach, based on the engineering machine/system concept, to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include the various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique that will be referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). CO2 flux data observed at several AmeriFlux sites are used to illustrate and validate the analysis, prediction and globalization capabilities of the proposed engineering approach and the SF-ANOVA technique. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks. The comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu
2016-12-21
A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. So an important aspect of the paper is showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately 10 times smaller than were available with our previous algorithm.
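The core idea of an ordinary-least-squares sensitivity analysis can be sketched as fitting a linear response surface to sampled inputs and ordering factors by coefficient magnitude. This toy version assumes an orthogonal sample design, so each slope reduces to a univariate fit; the factor names and data are invented, not from the paper:

```python
from statistics import mean

def ols_slope(x, y):
    """Simple least-squares slope: cov(x, y) / var(x)."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def rank_sensitivities(samples, response):
    """Order factors by |OLS slope|, largest first.
    samples: dict of factor name -> sampled values (orthogonal design assumed)."""
    coeffs = {k: ols_slope(v, response) for k, v in samples.items()}
    return sorted(coeffs, key=lambda k: abs(coeffs[k]), reverse=True)

# Hypothetical 4-run design: rate coefficient k2 drives the response hardest.
samples = {"k1": [0.0, 1.0, 0.0, 1.0], "k2": [0.0, 0.0, 1.0, 1.0]}
response = [0.1, 0.3, 1.1, 1.4]
order = rank_sensitivities(samples, response)
```

With small samples, the ordering near the cutoff is exactly where the false-positive/false-negative trade-off discussed above arises.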
Coastal Low-Level Wind Jets: A Global Study Based On An Ensemble Of Reanalysis
NASA Astrophysics Data System (ADS)
Cardoso, R. M.; Lima, D. C. A.; Soares, P. M. M.; Semedo, A.
2017-12-01
Reanalysis data are a useful tool for climate and atmospheric studies since they provide physically consistent spatial and temporal information on observable and unobservable atmospheric parameters. Here, we propose an analysis of coastal low-level jets (CLLJs) based on three global reanalyses. The six-hourly data from the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim Reanalysis (ERA-Interim), the Japanese 55-year Reanalysis (JRA-55) and the Modern Era Retrospective-analysis for Research and Applications (MERRA2) are used to build an ensemble of reanalyses for a period encompassing 1980-2016. A detailed global climatology of CLLJs is presented based on the reanalysis ensemble. This gives robustness to the CLLJ representation and also reduces uncertainty. The annual and diurnal cycles as well as the inter-annual variability are analysed in order to evaluate the temporal fluctuations of the frequency of occurrence of CLLJs. The ensemble mean displays a good representation of their seasonal spatial variability. The Oman and Benguela CLLJs show, respectively, a decrease and an increase in frequency of occurrence, which is statistically significant during boreal summer and austral spring for the period of study. The Oman CLLJ is the most intense and occurs at higher altitudes when compared with the other jets during the season in which each CLLJ has its highest mean incidence.
SSS variability inferred from recent SMOS reprocessing at CATDS
NASA Astrophysics Data System (ADS)
Boutin, Jacqueline; Vergely, Jean-Luc; Marchand, Stéphane; Tarot, Stéphane; Hasson, Audrey; Reverdin, Gilles
2017-04-01
The Soil Moisture and Ocean Salinity (SMOS) satellite mission has monitored sea surface salinity (SSS) over the global ocean for over 7 years. In this poster, we present results obtained at the LOCEAN/ACRI-st expertise center using the recent CATDS (Centre Aval de Traitement des Données) SMOS RE05 reprocessing. We find that the correction for systematic errors and the removal of data contaminated by ice and radio frequency interference in fresh regions (river mouths, high latitudes) have been improved with respect to the SMOS CATDS RE04 reprocessing. We analyze SSS variability as observed by SMOS on a wide range of spatial and temporal scales using various statistical indicators such as mean, median, standard deviation, minimum and maximum values, and spectral analysis. We compare them with ARGO interpolated fields (In Situ Analysis System fields) at the global scale and with ship SSS transects from the GOSUD and ORE SSS databases. This allows us 1) to demonstrate and quantify the improvement of the SMOS SSS fields with respect to earlier versions and 2) to study SSS variability, especially at spatial scales between 50 km and 600 km that are not well covered globally by the in situ network. The complementarity of this information with respect to SMAP (Soil Moisture Active Passive) SSS fields will be discussed.
Estimating global cropland production from 1961 to 2010
NASA Astrophysics Data System (ADS)
Han, Pengfei; Zeng, Ning; Zhao, Fang; Lin, Xiaohui
2017-09-01
Global cropland net primary production (NPP) has tripled over the last 50 years, contributing 17-45 % to the increase in the global atmospheric CO2 seasonal amplitude. Although many regional-scale comparisons have been made between statistical data and modeling results, long-term national comparisons across global croplands are scarce due to the lack of detailed spatiotemporal management data. Here, we conducted a simulation study of global cropland NPP from 1961 to 2010 using a process-based model called Vegetation-Global Atmosphere-Soil (VEGAS) and compared the results with Food and Agriculture Organization of the United Nations (FAO) statistical data on both continental and country scales. According to the FAO data, the global cropland NPP was 1.3, 1.8, 2.2, 2.6, 3.0, and 3.6 PgC yr-1 in the 1960s, 1970s, 1980s, 1990s, 2000s, and 2010s, respectively. The VEGAS model captured these major trends on global and continental scales. The NPP increased most notably in the US Midwest, western Europe, and the North China Plain, and increased modestly in Africa and Oceania. However, significant biases remained in some regions such as Africa and Oceania, especially in the temporal evolution. This finding is not surprising, as VEGAS is the first global carbon cycle model with full parameterization representing the Green Revolution. To improve model performance for the different major regions, we modified the default values of the management intensity associated with regional differences in the agricultural Green Revolution to better match the FAO statistical data at the continental level and for selected countries. Across all the selected countries, the updated results reduced the RMSE from 19.0 to 10.5 TgC yr-1 (a ~45 % decrease). The results suggest that these regional differences in model parameterization are due to differences in socioeconomic development.
To better explain the past changes and predict the future trends, it is important to calibrate key parameters on regional scales and develop data sets for land management history.
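The RMSE reduction reported above (19.0 to 10.5 TgC yr-1) is a standard computation; a sketch with purely illustrative numbers, not the VEGAS or FAO series:

```python
import math

def rmse(model, reference):
    """Root mean square error between a modelled and a reference series."""
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(model, reference))
                     / len(reference))

# Illustrative annual NPP values only (arbitrary units).
fao     = [100.0, 110.0, 120.0, 130.0]
default = [ 80.0, 130.0, 100.0, 150.0]   # before regional recalibration
tuned   = [ 95.0, 112.0, 118.0, 133.0]   # after regional recalibration
improvement = 1 - rmse(tuned, fao) / rmse(default, fao)
```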
Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E
2013-11-15
Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (Available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
Zotti, Alessandro; Banzato, Tommaso; Gelain, Maria Elena; Centelleghe, Cinzia; Vaccaro, Calogero; Aresu, Luca
2015-04-25
Increased cortical, or cortical and medullary, echogenicity is one of the most common signs of chronic or acute kidney disease in dogs and cats. Subjective evaluation of echogenicity is reported to be unreliable. Patient- and technique-related factors affect in-vivo quantitative evaluation of the echogenicity of parenchymal organs. The aim of the present study is to investigate the relationship between histopathology and ex-vivo renal cortical echogenicity in dogs and cats, free of any patient- and technique-related biases. Kidney samples were collected from 68 dog and 32 cat cadavers donated by the owners to the Veterinary Teaching Hospital of the University of Padua, and standardized ultrasonographic images of each sample were collected. The echogenicity of the renal cortex was quantitatively assessed by means of the mean gray value (MGV), and then histopathological analysis was performed. Statistical analysis was performed to evaluate the influence of histological lesions on MGV. The ability of MGV to detect pathological changes in the kidneys was calculated for dogs and cats. Statistical analysis revealed that only glomerulosclerosis was an independent determinant of echogenicity in dogs, whereas interstitial nephritis, interstitial necrosis and fibrosis were independent determinants of echogenicity in cats. The global influence of histological lesions on renal echogenicity was higher in cats (23%) than in dogs (12%). Different histopathological lesions influence the echogenicity of the kidneys in dogs and cats. Moreover, MGV is a poor test for distinguishing between normal and pathological kidneys in the dog, with a sensitivity of 58.3% and a specificity of 59.8%. It performs better overall in the cat, resulting in a fair test with a sensitivity of 80.6% and a specificity of 56%.
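The quantities reported above (MGV, sensitivity, specificity) can be sketched as follows; the region of interest and classification outcomes are toy values, not the study's data:

```python
def mean_gray_value(pixels):
    """Mean gray value (MGV) of an ultrasonographic region of interest,
    given as a 2-D list of pixel intensities."""
    flat = [p for row in pixels for p in row]
    return sum(flat) / len(flat)

def sensitivity_specificity(predicted, actual):
    """predicted/actual: booleans per case, True = pathological."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

roi = [[100, 110], [120, 130]]
mgv = mean_gray_value(roi)
# Hypothetical outcomes of an MGV-threshold classifier on five toy cases.
actual    = [True, True, False, False, True]
predicted = [True, False, False, True, True]
sens, spec = sensitivity_specificity(predicted, actual)
```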
NASA Astrophysics Data System (ADS)
Scolini, Camilla; Messerotti, Mauro; Poedts, Stefaan; Rodriguez, Luciano
2018-02-01
In this study we present a statistical analysis of 53 fast Earth-directed halo CMEs observed by the SOHO/LASCO instrument during the period Jan. 2009-Sep. 2015, and we use this CME sample to test the capabilities of a Sun-to-Earth prediction scheme for CME geoeffectiveness. First, we investigate the CME association with other solar activity features by means of multi-instrument observations of the solar magnetic and plasma properties. Second, using coronagraphic images to derive the CME kinematical properties at 0.1 AU, we propagate the events to 1 AU by means of the WSA-ENLIL+Cone model. Simulation results at Earth are compared with in-situ observations at L1. By applying the pressure balance condition at the magnetopause and a solar wind-Kp index coupling function, we estimate the expected magnetospheric compression and geomagnetic activity level, and compare them with global data records. The analysis indicates that 82% of the CMEs arrived at Earth within the following 4 days. Almost all of them compressed the magnetopause below geosynchronous orbit and triggered a geomagnetic storm. Complex, sunspot-rich active regions associated with energetic flares are the most favourable configurations from which geoeffective CMEs originate. The analysis of related SEP events shows that 74% of the CMEs associated with major SEPs were geoeffective. Moreover, SEP production is enhanced in the case of fast and interacting CMEs. In this work we present a first attempt at applying a Sun-to-Earth geoeffectiveness prediction scheme, based on 3D simulations and solar wind-geomagnetic activity coupling functions, to a statistical set of potentially geoeffective halo CMEs. The results of the prediction scheme are in good agreement with geomagnetic activity data records, although further studies fine-tuning the scheme are needed.
Smith, W Brad; Cuenca Lara, Rubí Angélica; Delgado Caballero, Carina Edith; Godínez Valdivia, Carlos Isaías; Kapron, Joseph S; Leyva Reyes, Juan Carlos; Meneses Tovar, Carmen Lourdes; Miles, Patrick D; Oswalt, Sonja N; Ramírez Salgado, Mayra; Song, Xilong Alex; Stinson, Graham; Villela Gaytán, Sergio Armando
2018-05-21
Forests cannot be managed sustainably without reliable data to inform decisions. National Forest Inventories (NFI) tend to report national statistics, with sub-national stratification based on domestic ecological classification systems. It is becoming increasingly important to be able to report statistics on ecosystems that span international borders, as global change and globalization expand stakeholders' spheres of concern. The state of a transnational ecosystem can only be properly assessed by examining the entire ecosystem. In global forest resource assessments, it may be useful to break national statistics down by ecosystem, especially for large countries. The Inventory and Monitoring Working Group (IMWG) of the North American Forest Commission (NAFC) has begun developing a harmonized North American Forest Database (NAFD) for managing forest inventory data, enabling consistent, continental-scale forest assessment supporting ecosystem-level reporting and relational queries. The first iteration of the database contains data describing 1.9 billion ha, including 677.5 million ha of forest. Data harmonization is made challenging by the existence of definitions and methodologies tailored to suit national circumstances, emerging from each country's professional forestry development. This paper reports the methods used to synchronize three national forest inventories, starting with a small suite of variables and attributes.
Gregori, Josep; Méndez, Olga; Katsila, Theodora; Pujals, Mireia; Salvans, Cándida; Villarreal, Laura; Arribas, Joaquin; Tabernero, Josep; Sánchez, Alex; Villanueva, Josep
2014-07-15
Secretome profiling has become a methodology of choice for the identification of tumor biomarkers. We hypothesized that, due to the dynamic nature of secretomes, cellular perturbations could affect not only their composition but also the global amount of protein secreted per cell. We confirmed our hypothesis by measuring the levels of secreted proteins while taking into account the amount of proteome produced per cell. We then established a correlation between cell proliferation and protein secretion that explained the observed changes in global protein secretion. Next, we implemented a normalization that corrects the statistical results of secretome studies for the global protein secretion of the cells, within a generalized linear model (GLM). The application of this normalization to two biological perturbations of tumor cells resulted in drastic changes in the list of statistically significant proteins. Furthermore, we found that known epithelial-to-mesenchymal transition (EMT) effectors were only statistically significant when the normalization was applied. Therefore, the normalization proposed here increases the sensitivity of statistical tests by increasing the number of true positives. From an oncology perspective, the correlation between protein secretion and cellular proliferation suggests that slow-growing tumors could have high protein secretion rates and consequently contribute strongly to tumor paracrine signaling.
Evaluation of different models to estimate the global solar radiation on inclined surface
NASA Astrophysics Data System (ADS)
Demain, C.; Journée, M.; Bertrand, C.
2012-04-01
Global and diffuse solar radiation intensities are, in general, measured on horizontal surfaces, whereas stationary solar conversion systems (both flat-plate solar collectors and solar photovoltaics) are mounted on inclined surfaces to maximize the amount of solar radiation incident on the collector surface. Consequently, the solar radiation incident on a tilted surface has to be determined by converting the solar radiation measured on a horizontal surface to the tilted surface of interest. This study evaluates the performance of 14 models transposing 10-minute, hourly and daily diffuse solar irradiation from horizontal to inclined surfaces. Solar radiation data from 8 months (April to November 2011), covering diverse atmospheric conditions and solar altitudes, measured on the roof of the radiation tower of the Royal Meteorological Institute of Belgium in Uccle (longitude 4.35°, latitude 50.79°), were used for validation purposes. The individual model performance is assessed by an inter-comparison between the calculated and measured global solar radiation on a south-oriented surface tilted at 50.79°, using statistical methods. The relative performance of the different models under different sky conditions has been studied. Comparison of the statistical errors of the different radiation models as a function of the clearness index shows that some models perform better under particular sky conditions. Combining different models, each acting under different sky conditions, can reduce the statistical error between the measured and estimated global solar radiation. As the models described in this paper were developed for hourly data inputs, statistical error indexes are lowest for hourly data and increase for 10-minute and daily data.
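A simple example of a horizontal-to-tilt transposition is the isotropic (Liu-Jordan) model, shown here as a sketch. The paper evaluates 14 models and does not necessarily include this exact formulation, and all input values below are illustrative:

```python
import math

def global_on_tilt(G_h, D_h, beta_deg, theta_deg, theta_z_deg, albedo=0.2):
    """Isotropic (Liu-Jordan) transposition of global horizontal
    irradiance G_h to a surface tilted at beta_deg.

    D_h: diffuse horizontal irradiance; theta_deg: incidence angle on the
    tilted plane; theta_z_deg: solar zenith angle; albedo: ground reflectance.
    """
    B_h = G_h - D_h  # beam (direct) component on the horizontal
    beta = math.radians(beta_deg)
    r_b = math.cos(math.radians(theta_deg)) / math.cos(math.radians(theta_z_deg))
    beam = B_h * r_b
    diffuse = D_h * (1 + math.cos(beta)) / 2        # isotropic sky
    reflected = G_h * albedo * (1 - math.cos(beta)) / 2  # ground-reflected
    return beam + diffuse + reflected

# Illustrative W/m2 values for a surface tilted at the site latitude.
G_t = global_on_tilt(G_h=500.0, D_h=200.0, beta_deg=50.79,
                     theta_deg=20.0, theta_z_deg=40.0)
```

Anisotropic models (e.g. Perez-type) replace the diffuse term with circumsolar and horizon-brightening components, which is why performance varies with sky condition.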
Local indicators of geocoding accuracy (LIGA): theory and application
Jacquez, Geoffrey M; Rommel, Robert
2009-01-01
Background Although sources of positional error in geographic locations (e.g. geocoding error) used for describing and modeling spatial patterns are widely acknowledged, research on how such error impacts the statistical results has been limited. In this paper we explore techniques for quantifying the perturbability of spatial weights to different specifications of positional error. Results We find that a family of curves describes the relationship between perturbability and positional error, and use these curves to evaluate sensitivity of alternative spatial weight specifications to positional error both globally (when all locations are considered simultaneously) and locally (to identify those locations that would benefit most from increased geocoding accuracy). We evaluate the approach in simulation studies, and demonstrate it using a case-control study of bladder cancer in south-eastern Michigan. Conclusion Three results are significant. First, the shape of the probability distributions of positional error (e.g. circular, elliptical, cross) has little impact on the perturbability of spatial weights, which instead depends on the mean positional error. Second, our methodology allows researchers to evaluate the sensitivity of spatial statistics to positional accuracy for specific geographies. This has substantial practical implications since it makes possible routine sensitivity analysis of spatial statistics to positional error arising in geocoded street addresses, global positioning systems, LIDAR and other geographic data. Third, those locations with high perturbability (most sensitive to positional error) and high leverage (that contribute the most to the spatial weight being considered) will benefit the most from increased positional accuracy. These are rapidly identified using a new visualization tool we call the LIGA scatterplot. 
Herein lies a paradox for spatial analysis: For a given level of positional error increasing sample density to more accurately follow the underlying population distribution increases perturbability and introduces error into the spatial weights matrix. In some studies positional error may not impact the statistical results, and in others it might invalidate the results. We therefore must understand the relationships between positional accuracy and the perturbability of the spatial weights in order to have confidence in a study's results. PMID:19863795
Recommended GIS Analysis Methods for Global Gridded Population Data
NASA Astrophysics Data System (ADS)
Frye, C. E.; Sorichetta, A.; Rose, A.
2017-12-01
When using geographic information systems (GIS) to analyze gridded, i.e., raster, population data, analysts need a detailed understanding of several factors that affect raster data processing, and thus the accuracy of the results. Global raster data are most often provided in an unprojected state, usually in the WGS 1984 geographic coordinate system. Most GIS functions and tools evaluate data based on overlay relationships (area) or proximity (distance). Area and distance for global raster data can be calculated either directly on the various earth ellipsoids or after transforming the data to equal-area/equidistant projected coordinate systems so that all locations are analyzed equally. However, unlike when projecting vector data, not all projected coordinate systems can support such analyses equally, and the process of transforming raster data from one coordinate space to another often results in unmanaged loss of data through a process called resampling. Resampling determines which values to use in the result dataset given an imperfect locational match in the input dataset(s). Cell size or resolution, registration, resampling method, statistical type, and whether the raster represents continuous or discrete information all potentially influence the quality of the result. Gridded population data represent estimates of population in each raster cell, and this presentation will provide guidelines for accurately transforming population rasters for analysis in GIS. Resampling impacts the display of high-resolution global gridded population data, and we will discuss how to properly handle pyramid creation using the Aggregate tool with the sum option to create overviews for mosaic datasets.
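To illustrate why the sum statistic matters when generalizing population rasters: nearest-neighbour style resampling does not conserve total population, while block aggregation with sum does. This sketch uses a toy 8x8 grid, not the datasets discussed in the presentation:

```python
import numpy as np

def aggregate_sum(pop, factor):
    """Coarsen a population-count raster by an integer factor using the
    'sum' statistic, so that total population is conserved."""
    h, w = pop.shape
    assert h % factor == 0 and w % factor == 0
    return pop.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

rng = np.random.default_rng(1)
pop = rng.poisson(5.0, size=(8, 8)).astype(float)   # toy population grid

coarse_sum = aggregate_sum(pop, 4)
coarse_nearest = pop[::4, ::4]   # nearest-neighbour style resampling

print("total population, original :", pop.sum())
print("total population, sum agg  :", coarse_sum.sum())
print("total population, nearest  :", coarse_nearest.sum())
```

The sum-aggregated overview reports the same total as the source raster; the nearest-neighbour overview does not, which is why pyramids for population counts should be built with the sum option.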
NASA Astrophysics Data System (ADS)
Ionita, M.; Grosfeld, K.; Scholz, P.; Lohmann, G.
2016-12-01
Sea ice in both polar regions is an important indicator of global climate change and its polar amplification. Consequently, there is broad interest in information on sea ice: its coverage, variability and long-term change. Knowledge of sea ice requires high-quality data on ice extent, thickness and dynamics; however, its predictability depends on various climate parameters and conditions. In order to provide insight into the potential development of a monthly/seasonal signal, we developed a robust statistical model based on ocean heat content, sea surface temperature and atmospheric variables to estimate the September minimum sea ice extent for each year. Although previous statistical attempts at monthly/seasonal forecasts of the September sea ice minimum show relatively reduced skill, here it is shown that more than 97% (r = 0.98) of the September sea ice extent can be predicted three months in advance from the previous months' conditions via a multiple linear regression model based on global sea surface temperature (SST), mean sea level pressure (SLP), air temperature at 850 hPa (TT850), surface winds and sea ice extent persistence. The statistical model is based on the identification of regions with stable teleconnections between the predictors (climatological parameters) and the predictand (here, sea ice extent). The results of our statistical model contribute to the sea ice prediction network for the Sea Ice Outlook report (https://www.arcus.org/sipn) and could provide a tool for identifying regions and climate parameters that are important for sea ice development in the Arctic, and for detecting sensitive and critical regions in global coupled climate models with a focus on sea ice formation.
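A multiple linear regression of this kind reduces to ordinary least squares on lagged predictors. The sketch below uses synthetic stand-ins for the predictors named above (SST, SLP, TT850, winds, persistence), so the coefficients and skill are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-ins for five predictors observed three months before September.
n_years = 40
X = rng.normal(size=(n_years, 5))
beta_true = np.array([-1.2, 0.4, -0.8, 0.3, 1.5])   # assumed, for illustration
y = X @ beta_true + rng.normal(0.0, 0.3, n_years)   # September extent anomaly

# Ordinary least squares with an intercept (multiple linear regression).
A = np.column_stack([np.ones(n_years), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r = np.corrcoef(y, y_hat)[0, 1]
print(f"in-sample correlation r = {r:.3f}")
```

In practice the predictors would be area averages over the teleconnection regions identified by the screening step described in the abstract, and skill would be judged out of sample.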
Spatial Scaling of Global Rainfall and Flood Extremes
NASA Astrophysics Data System (ADS)
Devineni, Naresh; Lall, Upmanu; Xi, Chen; Ward, Philip
2014-05-01
Floods associated with severe storms are a significant source of risk for property, life and supply chains. Property losses tend to be determined as much by the duration and spatial extent of flooding as by the depth and velocity of inundation. Long-duration floods are typically induced by persistent rainfall (up to 30-day duration), as seen recently in Thailand, Pakistan, the Ohio and Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global historical data fields, and also from climate models that project future conditions. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances and floods. We present the first-ever results of a global analysis of the scaling characteristics of extreme rainfall and flood event duration, volumes and contiguous flooded areas resulting from the large-scale organization of long-duration rainfall events. Results are organized by latitude and with reference to the phases of ENSO, and reveal a surprising invariance across latitude. Speculation as to the potential relation to dynamical factors is presented.
Mapping permeability over the surface of the Earth
Gleeson, T.; Smith, L.; Moosdorf, N.; Hartmann, J.; Durr, H.H.; Manning, A.H.; Van Beek, L. P. H.; Jellinek, A. Mark
2011-01-01
Permeability, the ease of fluid flow through porous rocks and soils, is a fundamental but often poorly quantified component in the analysis of regional-scale water fluxes. Permeability is difficult to quantify because it varies over more than 13 orders of magnitude and is heterogeneous and dependent on flow direction. Indeed, at the regional scale, maps of permeability only exist for soil to depths of 1-2 m. Here we use an extensive compilation of results from hydrogeologic models to show that regional-scale (>5 km) permeability of consolidated and unconsolidated geologic units below soil horizons (hydrolithologies) can be characterized in a statistically meaningful way. The representative permeabilities of these hydrolithologies are used to map the distribution of near-surface (on the order of 100 m depth) permeability globally and over North America. The distribution of each hydrolithology is generally scale independent. The near-surface mean permeability is of the order of ∼5 × 10⁻¹⁴ m². The results provide the first global picture of near-surface permeability and will be of particular value for evaluating global water resources and modeling the influence of climate-surface-subsurface interactions on global climate change. Copyright © 2011 by the American Geophysical Union.
Global-scale modes of surface temperature variability on interannual to century timescales
NASA Technical Reports Server (NTRS)
Mann, Michael E.; Park, Jeffrey
1994-01-01
Using 100 years of global temperature anomaly data, we have performed a singular value decomposition of temperature variations in narrow frequency bands to isolate coherent spatio-temporal modes of global climate variability. Statistical significance is determined from confidence limits obtained by Monte Carlo simulations. Secular variance is dominated by a globally coherent trend, with nearly all grid points warming in phase at varying amplitude. A smaller, but significant, share of the secular variance corresponds to a pattern dominated by warming and subsequent cooling in the high-latitude North Atlantic on a roughly centennial timescale. Spatial patterns associated with significant peaks in variance within a broad period range from 2.8 to 5.7 years exhibit characteristic El Nino-Southern Oscillation (ENSO) patterns. A recent transition to a regime of higher ENSO frequency is suggested by our analysis. An interdecadal mode with a 15-to-18-year period and a mode centered at a 7-to-8-year period both exhibit a predominantly North Atlantic Oscillation (NAO) temperature pattern. A potentially significant decadal mode centered on an 11-to-12-year period also exhibits an NAO temperature pattern and may be modulated by the century-scale North Atlantic variability.
Analysis of statistical misconception in terms of statistical reasoning
NASA Astrophysics Data System (ADS)
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person must be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely the ability to count and use formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in terms of statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If 65 is taken as the minimum value for achieving the standard competence of a course, the students' mean values are below that standard. The misconception results indicate which subtopics should be given attention. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. Statistical reasoning skill was assessed by measuring reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
On the establishment and maintenance of a modern conventional terrestrial reference system
NASA Technical Reports Server (NTRS)
Bock, Y.; Zhu, S. Y.
1982-01-01
The frame of the Conventional Terrestrial Reference System (CTS) is defined by an adopted set of coordinates, at a fundamental epoch, of a global network of stations which constitute the vertices of a fundamental polyhedron. A method to estimate this set of coordinates using a combination of modern three-dimensional geodetic systems is presented. Once established, the function of the CTS is twofold. The first is to monitor the external (or global) motions of the polyhedron with respect to the frame of a Conventional Inertial Reference System, i.e., those motions common to all stations. The second is to monitor the internal motions (or deformations) of the polyhedron, i.e., those motions that are not common to all stations. Two possible estimators for use in earth deformation analysis are given and their statistical and physical properties are described.
Global Diffusion Pattern and Hot Spot Analysis of Vaccine-Preventable Diseases
NASA Astrophysics Data System (ADS)
Jiang, Y.; Fan, F.; Zanoni, I. Holly; Li, Y.
2017-10-01
Spatial characteristics reveal the concentration of vaccine-preventable disease in Africa and the Near East, and show that disease dispersion varies by disease. The exception is whooping cough, whose center of concentration is highly variable from year to year. Measles exhibited the only statistically significant spatial autocorrelation among the diseases under investigation. The hottest spots of measles are in Africa and the coldest spots are in the United States; warm spots are in the Near East and cool spots in Western Europe. Finally, cases of measles could not be explained by the independent variables, including the Gini index, health expenditure, or rate of immunization. Since the literature confirms that each of the selected variables is considered a determinant of disease dissemination, it is anticipated that the global dataset of disease cases was influenced by reporting bias.
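Spatial autocorrelation of the kind reported for measles is commonly measured with global Moran's I. A compact sketch with binary distance-band weights on a toy grid (not the study's disease data) shows the expected signs for clustered versus alternating patterns:

```python
import numpy as np

def morans_i(values, coords, band=1.5):
    """Global Moran's I with binary distance-band weights, a basic measure
    of spatial autocorrelation."""
    x = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d > 0) & (d <= band)).astype(float)
    n, W = len(values), w.sum()
    return (n / W) * (x @ w @ x) / (x @ x)

side = 8
coords = np.array([(i, j) for i in range(side) for j in range(side)], float)
clustered = np.array([1.0 if i < side // 2 else 0.0
                      for i in range(side) for j in range(side)])   # two blocks
checker = np.array([(i + j) % 2 for i in range(side) for j in range(side)], float)

print(f"Moran's I, clustered    : {morans_i(clustered, coords):+.3f}")
print(f"Moran's I, checkerboard : {morans_i(checker, coords):+.3f}")
```

Positive I indicates clustering (hot and cold spots, as found for measles); values near or below zero indicate no clustering or alternation. Local hot spot statistics such as Getis-Ord Gi* refine this to individual locations.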
Geodetic positioning using a global positioning system of satellites
NASA Technical Reports Server (NTRS)
Fell, P. J.
1980-01-01
Geodetic positioning using range, integrated Doppler, and interferometric observations from a constellation of twenty-four Global Positioning System satellites is analyzed. A summary of the proposals for geodetic positioning and baseline determination is given, including a description of measurement techniques and comments on rank deficiency and error sources. An analysis-of-variance comparison of range, Doppler, and interferometric time delay to determine their relative geometric strength for baseline determination is included. An analytic examination of the effect of a priori constraints on positioning using simultaneous observations from two stations is presented. Dynamic point positioning and baseline determination using range and Doppler are examined in detail. Models for the error sources influencing dynamic positioning are developed, including a discussion of atomic clock stability, and range and Doppler observation error statistics based on correlated random atomic clock error are derived.
Has the magnitude of floods across the USA changed with global CO2 levels?
Hirsch, Robert M.; Ryberg, Karen R.
2012-01-01
Statistical relationships between annual floods at 200 long-term (85–127 years of record) streamgauges in the coterminous United States and the global mean carbon dioxide concentration (GMCO2) record are explored. The streamgauge locations are limited to those with little or no regulation or urban development. The coterminous US is divided into four large regions and stationary bootstrapping is used to evaluate if the patterns of these statistical associations are significantly different from what would be expected under the null hypothesis that flood magnitudes are independent of GMCO2. In none of the four regions defined in this study is there strong statistical evidence for flood magnitudes increasing with increasing GMCO2. One region, the southwest, showed a statistically significant negative relationship between GMCO2 and flood magnitudes. The statistical methods applied compensate both for the inter-site correlation of flood magnitudes and the shorter-term (up to a few decades) serial correlation of floods.
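The stationary bootstrap referred to here resamples blocks of geometrically distributed length, preserving short-term serial correlation under the null hypothesis of no association. Below is a minimal sketch with a synthetic serially correlated flood series and a monotone CO2 stand-in (hypothetical data, not the streamgauge records):

```python
import numpy as np

rng = np.random.default_rng(4)

def stationary_bootstrap(x, mean_block=5):
    """One stationary-bootstrap resample (blocks of geometrically
    distributed length, wrapping around the end of the series)."""
    n = len(x)
    out = np.empty(n)
    i = rng.integers(n)
    for t in range(n):
        out[t] = x[i]
        i = rng.integers(n) if rng.random() < 1.0 / mean_block else (i + 1) % n
    return out

# Toy stand-ins: a serially correlated annual flood series, a monotone record.
n = 100
floods = np.convolve(rng.normal(size=n + 9), np.ones(10) / 10, mode="valid")
gmco2 = np.linspace(300, 400, n)

obs = np.corrcoef(floods, gmco2)[0, 1]
null = [np.corrcoef(stationary_bootstrap(floods), gmco2)[0, 1] for _ in range(999)]
p = (1 + np.sum(np.abs(null) >= abs(obs))) / 1000
print(f"observed r = {obs:+.3f}, two-sided bootstrap p = {p:.3f}")
```

Because resampled blocks keep the series' short-term autocorrelation, the null distribution of the correlation is wider than under independent shuffling, which is the property the study relies on when judging significance.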
Inferring the anthropogenic contribution to local temperature extremes
Stone, Dáithí A.; Paciorek, Christopher J.; Prabhat; ...
2013-03-19
Here, in PNAS, Hansen et al. document an observed planet-wide increase in the frequency of extremely hot months and a decrease in the frequency of extremely cold months, consistent with earlier studies. This analysis is achieved through aggregation of gridded monthly temperature measurements from all over the planet. Such aggregation is advantageous in achieving statistical sampling power; however, it sacrifices regional specificity. In that light, we find the conclusion of Hansen et al. that “the extreme summer climate anomalies in Texas in 2011, in Moscow in 2010, and in France in 2003 almost certainly would not have occurred in the absence of global warming” to be unsubstantiated by their analysis.
NASA Astrophysics Data System (ADS)
Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.
2018-03-01
We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
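The role of parameter correlations can be seen in a toy example: a parameter with a small model gradient can still be highly influential when it co-varies with an important one. The sketch below is illustrative only and is not the authors' non-parametric estimator; the model, correlation value, and coefficients are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy correlated parameters (e.g. two energies estimated from the same
# quantum-scale computation) and a simple linear model response.
rho = 0.95
cov = np.array([[1.0, rho], [rho, 1.0]])
theta = rng.multivariate_normal([0.0, 0.0], cov, size=20000)
f = theta[:, 0] + 0.1 * theta[:, 1]   # model output

# Gradient-based index ignoring correlations: squared partial derivatives.
grad_index = np.array([1.0, 0.1]) ** 2

# Correlation-aware index: squared correlation of each parameter with output.
corr_index = np.array([np.corrcoef(theta[:, i], f)[0, 1] ** 2 for i in range(2)])

print("independent gradient index :", grad_index)
print("correlation-aware index    :", np.round(corr_index, 3))
```

Ignoring the correlation, the second parameter looks negligible (index 0.01); accounting for it, the parameter is strongly associated with the output because it tracks the first parameter, so the influence ranking changes materially, which is the phenomenon the abstract highlights.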
NASA Astrophysics Data System (ADS)
Nezhadhaghighi, Mohsen Ghasemi
2017-08-01
Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolker, Eugene
Our project focused primarily on the analysis of different types of data produced by global high-throughput technologies, on the integration of gene annotation with gene and protein expression information, and on obtaining better functional annotation of Shewanella genes. Specifically, four of our major activities and achievements include the development of: statistical models for identification and expression proteomics, superior to currently available approaches (including our own earlier ones); approaches to improve gene annotations on the whole-organism scale; standards for annotation, transcriptomics and proteomics approaches; and generalized approaches for the integration of gene annotation with gene and protein expression information.
Whole-Genome Analysis of the SHORT-ROOT Developmental Pathway in Arabidopsis
Busch, Wolfgang; Cui, Hongchang; Wang, Jean Y; Blilou, Ikram; Hassan, Hala; Nakajima, Keiji; Matsumoto, Noritaka; Lohmann, Jan U; Scheres, Ben
2006-01-01
Stem cell function during organogenesis is a key issue in developmental biology. The transcription factor SHORT-ROOT (SHR) is a critical component in a developmental pathway regulating both the specification of the root stem cell niche and the differentiation potential of a subset of stem cells in the Arabidopsis root. To obtain a comprehensive view of the SHR pathway, we used a statistical method called meta-analysis to combine the results of several microarray experiments measuring the changes in global expression profiles after modulating SHR activity. Meta-analysis was first used to identify the direct targets of SHR by combining results from an inducible form of SHR driven by its endogenous promoter, ectopic expression, followed by cell sorting and comparisons of mutant to wild-type roots. Eight putative direct targets of SHR were identified, all with expression patterns encompassing subsets of the native SHR expression domain. Further evidence for direct regulation by SHR came from binding of SHR in vivo to the promoter regions of four of the eight putative targets. A new role for SHR in the vascular cylinder was predicted from the expression pattern of several direct targets and confirmed with independent markers. The meta-analysis approach was then used to perform a global survey of the SHR indirect targets. Our analysis suggests that the SHR pathway regulates root development not only through a large transcription regulatory network but also through hormonal pathways and signaling pathways using receptor-like kinases. Taken together, our results not only identify the first nodes in the SHR pathway and a new function for SHR in the development of the vascular tissue but also reveal the global architecture of this developmental pathway. PMID:16640459
Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-06-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), which included the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
Fasoli, Marianna; Dal Santo, Silvia; Zenoni, Sara; Tornielli, Giovanni Battista; Farina, Lorenzo; Zamboni, Anita; Porceddu, Andrea; Venturini, Luca; Bicego, Manuele; Murino, Vittorio; Ferrarini, Alberto; Delledonne, Massimo; Pezzotti, Mario
2012-09-01
We developed a genome-wide transcriptomic atlas of grapevine (Vitis vinifera) based on 54 samples representing green and woody tissues and organs at different developmental stages as well as specialized tissues such as pollen and senescent leaves. Together, these samples expressed ∼91% of the predicted grapevine genes. Pollen and senescent leaves had unique transcriptomes reflecting their specialized functions and physiological status. However, microarray and RNA-seq analysis grouped all the other samples into two major classes based on maturity rather than organ identity, namely, the vegetative/green and mature/woody categories. This division represents a fundamental transcriptomic reprogramming during the maturation process and was highlighted by three statistical approaches identifying the transcriptional relationships among samples (correlation analysis), putative biomarkers (O2PLS-DA approach), and sets of strongly and consistently expressed genes that define groups (topics) of similar samples (biclustering analysis). Gene coexpression analysis indicated that the mature/woody developmental program results from the reiterative coactivation of pathways that are largely inactive in vegetative/green tissues, often involving the coregulation of clusters of neighboring genes and global regulation based on codon preference. This global transcriptomic reprogramming during maturation has not been observed in herbaceous annual species and may be a defining characteristic of perennial woody plants.
Evaluation of a Soil Moisture Data Assimilation System Over the Conterminous United States
NASA Astrophysics Data System (ADS)
Bolten, J. D.; Crow, W. T.; Zhan, X.; Reynolds, C. A.; Jackson, T. J.
2008-12-01
A data assimilation system has been designed to integrate surface soil moisture estimates from the EOS Advanced Microwave Scanning Radiometer (AMSR-E) with an online soil moisture model used by the USDA Foreign Agricultural Service for global crop estimation. USDA's International Production Assessment Division (IPAD) of the Office of Global Analysis (OGA) ingests global soil moisture within a Crop Assessment Data Retrieval and Evaluation (CADRE) Decision Support System (DSS) to provide nowcasts of crop conditions and agricultural drought. This information is primarily used to derive mid-season crop yield estimates for the improvement of foreign market access for U.S. agricultural products. The CADRE is forced by daily meteorological observations (precipitation and temperature) provided by the Air Force Weather Agency (AFWA) and the World Meteorological Organization (WMO). The integration of AMSR-E observations into the two-layer soil moisture model employed by IPAD can potentially enhance the reliability of the CADRE soil moisture estimates due to AMSR-E's improved repeat time and greater spatial coverage. Assimilation of the AMSR-E soil moisture estimates is accomplished using a 1-D Ensemble Kalman filter (EnKF) at daily time steps. A diagnostic calibration of the filter is performed using innovation statistics, weighting the filter observation and modeling errors for three ranges of vegetation biomass density estimated using historical data from the Advanced Very High Resolution Radiometer (AVHRR). Assessment of the AMSR-E assimilation has been completed over a five-year period for the conterminous United States. To evaluate the ability of the filter to compensate for incorrect precipitation forcing in the model, a data denial approach is employed, comparing soil moisture results obtained from separate model simulations forced with precipitation products of varying uncertainty. 
An analysis of surface and root-zone anomalies is presented for each model simulation over the conterminous United States, as well as statistical assessments for each simulation over various land cover types.
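The 1-D EnKF analysis step at the core of such a system can be sketched in a few lines. This is an illustrative toy with a scalar soil-moisture state and made-up numbers, not the USDA/CADRE implementation:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_var, rng):
    """One 1-D EnKF analysis step: nudge each ensemble member of a
    scalar soil-moisture state toward a perturbed observation."""
    ens_var = np.var(ensemble, ddof=1)           # forecast error variance
    gain = ens_var / (ens_var + obs_err_var)     # Kalman gain (scalar state)
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=ensemble.shape)
    return ensemble + gain * (perturbed - ensemble)

rng = np.random.default_rng(0)
forecast = rng.normal(0.30, 0.05, size=50)       # volumetric soil moisture ensemble
analysis = enkf_update(forecast, obs=0.20, obs_err_var=0.03**2, rng=rng)
# The innovation (obs minus forecast mean) is the quantity whose statistics
# drive the diagnostic calibration of the filter described above.
innovation = 0.20 - forecast.mean()
```

The analysis ensemble moves toward the observation and its spread contracts, which is the behavior the innovation-based calibration tunes.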
Thomas, Diala; Bachy, Manon; Courvoisier, Aurélien; Dubory, Arnaud; Bouloussa, Houssam; Vialle, Raphaël
2015-03-01
Spinopelvic alignment is crucial in assessing an energy-efficient posture in both normal and disease states, such as high-displacement developmental spondylolisthesis (HDDS). The overall effect of local surgical correction of lumbosacral imbalance on the global correction of spinal balance in patients with HDDS remains unclear. This paper reports the progressive spontaneous improvement of global sagittal balance following surgical correction of lumbosacral imbalance in patients with HDDS. The records of 15 patients with HDDS who underwent surgery between 2005 and 2010 were reviewed. The treatment consisted of L4-sacrum reduction and fusion via a posterior approach, resulting in complete correction of lumbosacral kyphosis. Preoperative, 6-month postoperative, and final follow-up postoperative angular measurements were taken from full-spine lateral radiographs obtained with the patient in a standard standing position. Radiographic measurements included pelvic incidence, sacral slope, lumbar lordosis, and thoracic kyphosis. The degree of lumbosacral kyphosis was evaluated by the lumbosacral angle. Because of the small number of patients, nonparametric tests were used for data analysis. Preoperative lumbosacral kyphosis and L-5 anterior slip were corrected by instrumentation. Transient neurological complications were noted in 5 patients. Statistical analysis showed a significant increase of thoracic kyphosis on 6-month postoperative and final follow-up radiographs (p < 0.001). A statistically significant decrease of lumbar lordosis was noted between preoperative and 6-month control radiographs (p < 0.001) and between preoperative and final follow-up radiographs (p < 0.001). Based on the authors' observations, this technique resulted in an effective reduction of L-5 anterior slip and a significant reduction of lumbosacral kyphosis (the lumbosacral angle improving from 69.8° to 105.13°). 
Due to complete reduction of lumbosacral kyphosis and anterior trunk displacement associated with L-5 anterior slipping, lumbar lordosis progressively decreased and thoracic kyphosis progressively increased postoperatively. Adjusting the sagittal trunk balance produced not only pelvic anteversion, but also reciprocal adjustment of lumbar lordosis and thoracic kyphosis, creating a satisfactory level of compensated global sagittal balance.
Ibinson, James W; Vogt, Keith M; Taylor, Kevin B; Dua, Shiv B; Becker, Christopher J; Loggia, Marco; Wasan, Ajay D
2015-12-01
The insula is uniquely located between the temporal and parietal cortices, making it anatomically well-positioned to act as an integrating center between the sensory and affective domains for the processing of painful stimulation. This can be studied through resting-state functional connectivity MRI (fcMRI); however, the lack of a clear methodology for the analysis of fcMRI complicates the interpretation of these data during acute pain. Detected connectivity changes may reflect actual alterations in low-frequency synchronous neuronal activity related to pain, or may be due to changes in global cerebral blood flow or to superimposed task-induced neuronal activity. The primary goal of this study was to investigate the effects of global signal regression (GSR) and task paradigm regression (TPR) on the changes in functional connectivity of the left (contralateral) insula in healthy subjects at rest and during acute painful electric nerve stimulation of the right hand. The use of GSR reduced the size and statistical significance of connectivity clusters and created negative correlation coefficients for some connectivity clusters. TPR with cyclic stimulation gave task versus rest connectivity differences similar to those with a constant task, suggesting that analysis which includes TPR is more accurately reflective of low-frequency neuronal activity. Both GSR and TPR have been inconsistently applied to fcMRI analysis. Based on these results, investigators need to consider the impact GSR and TPR have on connectivity during task performance when attempting to synthesize the literature.
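Both GSR and TPR amount to residualizing each voxel's time series against nuisance regressors before computing connectivity. A minimal sketch with synthetic data (the variable names and signals are illustrative, not from the study):

```python
import numpy as np

def regress_out(data, regressors):
    """Remove nuisance regressors (columns of `regressors`) from each
    column of `data` by ordinary least squares, keeping the residuals."""
    X = np.column_stack([np.ones(data.shape[0]), regressors])
    beta, *_ = np.linalg.lstsq(X, data, rcond=None)
    return data - X @ beta

rng = np.random.default_rng(1)
# Stand-in for a global mean signal shared across voxels:
global_signal = rng.normal(size=200).cumsum()
# Two synthetic voxel time series contaminated by the global signal:
voxels = np.outer(global_signal, [0.5, 1.0]) + rng.normal(size=(200, 2))
cleaned = regress_out(voxels, global_signal[:, None])
```

After residualization the cleaned series are uncorrelated with the regressed-out signal, which is exactly why GSR can flip the sign of some connectivity estimates.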
NASA Astrophysics Data System (ADS)
Crutchfield, J.
2016-12-01
The presentation will discuss the current status of the International Production Assessment Division of the USDA Foreign Agricultural Service for operational monitoring and forecasting of current crop conditions and anticipated production changes, producing monthly, multi-source consensus reports on global crop conditions, including the use of Earth observations (EO) from satellite and in situ sources. The United States Department of Agriculture (USDA) Foreign Agricultural Service (FAS) International Production Assessment Division (IPAD) deals exclusively with global crop production forecasting and agricultural analysis in support of the USDA World Agricultural Outlook Board (WAOB) lockup process and contributions to the World Agricultural Supply and Demand Estimates (WASDE) report. Analysts are responsible for discrete regions or countries and conduct in-depth long-term research into national agricultural statistics, farming systems, and the climatic, environmental, and economic factors affecting crop production. IPAD analysts become highly valued cross-commodity specialists over time and are routinely sought out for specialized analyses to support governmental studies. IPAD is responsible for grain, oilseed, and cotton analysis on a global basis. IPAD is unique in the tools it uses to analyze crop conditions around the world, including custom weather analysis software and databases, satellite imagery, and value-added image interpretation products. It also incorporates all traditional agricultural intelligence resources into its forecasting program, to make the fullest use of available information in its operational commodity forecasts and analysis. International travel and training play an important role in learning about foreign agricultural production systems and in developing analyst knowledge and capabilities.
NASA Astrophysics Data System (ADS)
Kwon, O.; Kim, W.; Kim, J.
2017-12-01
Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, it is very difficult to obtain data on geological structure because of the limits of geological surveying at sea. This study addresses these difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits. By contrast, echo sounding data are easily obtainable, and their reliability is high compared to the above approaches. This study aims to develop an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach. It builds on the structural-geology principle that topographic features reflect geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate spatial statistics of the grid data within the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on a map, and (6) visualize the geological structure on the map. The important elements of this study include the optimal size of the moving window, the choice of spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were run. Finally, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to explore the variations of various spatial statistics. 
By making it easy to designate the type of spatial statistic and the percentile standard, the program allows straightforward analysis of how the inferred geological structure depends on the chosen spatial statistics. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government (Project Number: 13 Construction Research T01).
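Steps (1)-(5) of the algorithm can be illustrated with a small moving-window sketch in Python (the study's own program is in R). The window size (5), the statistic (local standard deviation), and the percentile standard (85) below are arbitrary stand-ins for the "optimal" values the study determines by simulation:

```python
import numpy as np

def window_stat(grid, w, stat=np.std):
    """Steps (2)-(3): slide a w x w window over a bathymetry grid and
    compute a spatial statistic (here the local standard deviation)."""
    ny, nx = grid.shape
    h = w // 2
    out = np.full((ny, nx), np.nan)        # borders stay NaN
    for i in range(h, ny - h):
        for j in range(h, nx - h):
            out[i, j] = stat(grid[i - h:i + h + 1, j - h:j + h + 1])
    return out

def flag_structures(stat_map, pct=85):
    """Steps (4)-(5): mark cells whose statistic exceeds a percentile standard."""
    thresh = np.nanpercentile(stat_map, pct)
    return stat_map >= thresh

# Synthetic seabed: near-flat depths with a fault-like scarp along one column
depth = np.random.default_rng(2).normal(200.0, 1.0, size=(50, 50))
depth[:, 25] += 30.0
flags = flag_structures(window_stat(depth, 5))
```

Cells flagged by the percentile standard cluster along the synthetic scarp, which is the kind of lineament the mapped output would reveal.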
MTHFR gene polymorphism and risk of myeloid leukemia: a meta-analysis.
Dong, Song; Liu, Yueling; Chen, Jieping
2014-09-01
An increasing body of evidence has shown that the amino acid changes at position 1298 might eliminate methylenetetrahydrofolate reductase (MTHFR) enzyme activity, leading to insufficient folic acid and subsequent human chromosome breakage. Epidemiological studies have linked MTHFR single-nucleotide polymorphism (SNP) rs1801131 to myeloid leukemia risk, with considerable discrepancy in their results. We therefore were prompted to clarify this issue by use of a meta-analysis. The search terms were used to cover the possible reports in the MEDLINE, Web of Knowledge, and China National Knowledge Infrastructure (CNKI) databases. Odds ratios were estimated to assess the association of SNP rs1801131 with myeloid leukemia risk. Statistical heterogeneity was detected using the Q-statistic and the I² metric. Subgroup analysis was performed by ethnicity, histological subtype, and Hardy-Weinberg equilibrium (HWE). This meta-analysis of eight publications with a total of 1,114 cases and 3,227 controls revealed no global association. Nor did the subgroup analysis according to histological subtype and HWE show any significant associations. However, Asian individuals who harbored the CC genotype were found to have 1.66-fold higher risk of myeloid leukemia (odds ratio, 1.66; 95% confidence interval, 1.10 to 2.49; P_h = 0.342; I² = 0.114). Our meta-analysis has presented evidence supporting a possible association between the CC genotype of MTHFR SNP rs1801131 and myeloid leukemia in Asian populations.
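The pooled estimate and the Q/I² heterogeneity statistics used in such a meta-analysis can be computed from per-study odds ratios and confidence intervals. A minimal inverse-variance fixed-effect sketch with invented study values (not the eight publications analyzed here):

```python
import math

def pool_odds_ratios(studies):
    """Inverse-variance fixed-effect pooling of odds ratios, with Cochran's Q
    and the I^2 heterogeneity metric. `studies` holds tuples of
    (odds ratio, 95% CI lower bound, 95% CI upper bound)."""
    logs, weights = [], []
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
        logs.append(math.log(or_))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    q = sum(w * (l - pooled) ** 2 for w, l in zip(weights, logs))
    df = len(studies) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0      # I^2 as a fraction
    return math.exp(pooled), q, i2

# Hypothetical per-study (OR, CI low, CI high) values for illustration:
or_pooled, q, i2 = pool_odds_ratios([(1.8, 1.1, 2.9), (1.5, 0.9, 2.5), (1.7, 1.2, 2.4)])
```

With low heterogeneity Q stays near (or below) its degrees of freedom and I² is driven to zero, which is when a fixed-effect pooled estimate is most defensible.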
Decadal power in land air temperatures: Is it statistically significant?
NASA Astrophysics Data System (ADS)
Thejll, Peter A.
2001-12-01
The geographical distribution and properties of the well-known 10-11 year signal in terrestrial temperature records are investigated. By analyzing the Global Historical Climate Network data for surface air temperatures, we verify that the signal is strongest in North America and is similar in nature to that reported earlier by R. G. Currie. The decadal signal is statistically significant for individual stations, but it is not possible to show that the signal is statistically significant globally using strict tests. In North America, during the twentieth century, the decadal variability in the solar activity cycle is associated with the decadal part of the North Atlantic Oscillation index series in such a way that both of these signals correspond to the same spatial pattern of cooling and warming. A method for testing statistical results with Monte Carlo trials on data fields with specified temporal structure and specific spatial correlation retained is presented.
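A Monte Carlo significance test of this kind can be sketched as follows. This toy compares decadal-band spectral power for a single synthetic station series against an AR(1) red-noise null; the paper's method additionally retains spatial correlation across the whole data field:

```python
import numpy as np

def ar1_surrogate(x, rng):
    """Red-noise surrogate matching the series' variance and lag-1
    autocorrelation (a common null model for climate time series)."""
    x0 = x - x.mean()
    r1 = np.corrcoef(x0[:-1], x0[1:])[0, 1]
    eps = rng.normal(0.0, np.sqrt((1 - r1 ** 2) * x0.var()), size=len(x))
    out = np.empty(len(x))
    out[0] = eps[0]
    for i in range(1, len(x)):
        out[i] = r1 * out[i - 1] + eps[i]
    return out

def decadal_power(x, dt=1.0):
    """Spectral power summed over the ~10-11 yr frequency band."""
    f = np.fft.rfftfreq(len(x), dt)
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return p[(f >= 1 / 11.0) & (f <= 1 / 10.0)].sum()

rng = np.random.default_rng(3)
years = np.arange(120)                       # 120 yr of annual anomalies
series = 0.5 * np.sin(2 * np.pi * years / 10.5) + rng.normal(0, 0.5, 120)
obs = decadal_power(series)
null = [decadal_power(ar1_surrogate(series, rng)) for _ in range(500)]
p_value = (1 + sum(n >= obs for n in null)) / (1 + len(null))
```

The p-value is the fraction of surrogates whose decadal-band power matches or exceeds the observed value; a strong embedded 10.5-yr cycle is readily detected against red noise.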
[Generalization of the results of clinical studies through the analysis of subgroups].
Costa, João; Fareleira, Filipa; Ascensão, Raquel; Vaz Carneiro, António
2012-01-01
Subgroup analyses in clinical trials are usually performed to explore potential heterogeneity of the treatment effect in relation to baseline risk, physiopathology, the practical application of therapy, or the under-utilization in clinical practice of effective interventions due to uncertainties about their benefit/risk ratio. When appropriately planned, subgroup analyses are a valid methodology to define benefits in subgroups of patients, thus providing good-quality evidence to support clinical decision making. However, to be correct, subgroup analyses should be defined a priori, kept small in number, fully reported and, most importantly, subjected to statistical tests for interaction. In this paper we present an example from the treatment of post-menopausal osteoporosis, in which the benefits of an intervention with a specific agent (bazedoxifene), where the higher the fracture risk the greater the benefit, were only disclosed after a post-hoc analysis of the initial global trial sample.
Statistical assessment of crosstalk enrichment between gene groups in biological networks.
McCormack, Theodore; Frings, Oliver; Alexeyenko, Andrey; Sonnhammer, Erik L L
2013-01-01
Analyzing groups of functionally coupled genes or proteins in the context of global interaction networks has become an important aspect of bioinformatic investigations. Assessing the statistical significance of crosstalk enrichment between or within groups of genes can be a valuable tool for functional annotation of experimental gene sets. Here we present CrossTalkZ, a statistical method and software to assess the significance of crosstalk enrichment between pairs of gene or protein groups in large biological networks. We demonstrate that the standard z-score is generally an appropriate and unbiased statistic. We further evaluate the ability of four different methods to reliably recover crosstalk within known biological pathways. We conclude that the methods preserving the second-order topological network properties perform best. Finally, we show how CrossTalkZ can be used to annotate experimental gene sets using known pathway annotations and that its performance at this task is superior to gene enrichment analysis (GEA). CrossTalkZ (available at http://sonnhammer.sbc.su.se/download/software/CrossTalkZ/) is implemented in C++, easy to use, fast, accepts various input file formats, and produces a number of statistics. These include z-score, p-value, false discovery rate, and a test of normality for the null distributions.
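The core of such a crosstalk statistic is a z-score comparing the observed number of links between two gene groups with a permutation null. A simplified sketch using node-label shuffling (CrossTalkZ itself supports several permutation schemes, and the abstract notes that those preserving second-order topology, e.g. degree-preserving link permutation, perform best on real networks):

```python
import random

def crosstalk_z(edges, group_a, group_b, n_perm=2000, seed=4):
    """z-score for the number of network links between two node groups,
    against a null distribution built by shuffling node labels."""
    nodes = sorted({u for e in edges for u in e})
    a, b = set(group_a), set(group_b)

    def count(xa, xb):
        return sum((u in xa and v in xb) or (u in xb and v in xa) for u, v in edges)

    obs = count(a, b)
    rng = random.Random(seed)
    null = []
    for _ in range(n_perm):
        perm = nodes[:]
        rng.shuffle(perm)
        relabel = dict(zip(nodes, perm))
        null.append(count({relabel[g] for g in a}, {relabel[g] for g in b}))
    mu = sum(null) / n_perm
    sd = (sum((x - mu) ** 2 for x in null) / (n_perm - 1)) ** 0.5
    return (obs - mu) / sd if sd else float("inf")

# Toy network: dense crosstalk between groups A and B, sparse elsewhere
A, B = ["g1", "g2", "g3"], ["g4", "g5", "g6"]
edges = [(x, y) for x in A for y in B] + [("g7", "g8"), ("g9", "g10"), ("g11", "g12")]
z = crosstalk_z(edges, A, B)
```

With all nine possible A-B links present, the observed count sits far above the shuffled null, giving a strongly positive z-score.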
NASA Astrophysics Data System (ADS)
Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano
2015-04-01
Two statistical approaches are analyzed for two different types of data sets: the seismicity generated by subduction processes on the southern Pacific coast of Mexico between 2005 and 2012, and synthetic seismic data generated by a stick-slip experimental model. The statistical methods used are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The comparison is intended to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior found when the same statistical approaches are applied to the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
NASA Astrophysics Data System (ADS)
Li, Xing; Mao, Fenlan; Lin, Mian; Yadi, Nan
2017-12-01
This research presents a conceptual framework that incorporates organizational learning and innovation as mediating variables between market orientation and organizational performance. The sample comprises 145 companies from the information technology industry in the Scientific Industry Parks. The global model fit is acceptable, and the empirical results support the constructs mentioned above: 1. Market orientation has a positive and direct impact on organizational learning and on administrative and technical innovation. 2. Organizational learning has a positive and direct impact on administrative and technical innovation, but no statistically significant direct impact on performance. 3. Organizational learning does have a positive and indirect impact on performance by means of organizational innovations. 4. The interaction between the two innovation types (administrative and technical) is not statistically significant.
NASA Astrophysics Data System (ADS)
Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert
2017-04-01
Reconstructing and quantifying human impact is an important step toward understanding human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems, and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analysis of fossil pollen records has been proposed to characterize vegetation changes and to gain insights into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it cannot by itself serve as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The information in the modern pollen and vegetation dataset allows a better interpretation of the representativeness of the fossil pollen records and can yield a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated, together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure, and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards. 
The data from central Spain show for several sites that arboreal cover decreased below 5% from the Feudal period onwards (ca. 850 cal a BP), related to increasing human impact on the landscape. At other study sites, arboreal cover remained above 25% despite significant human impact. Overall, the presented examples from two contrasting environments show how cluster analysis and NMDS of modern and fossil pollen data can help provide quantitative insights into anthropogenic land cover changes. Our study extensively discusses and illustrates the possibilities and limitations of statistical analysis of pollen data for quantifying human-induced land use changes.
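The cluster-analysis side of such a workflow can be sketched with a Bray-Curtis dissimilarity, a metric commonly used for percentage data, and a minimal average-linkage agglomeration. The "pollen" spectra below are invented for illustration, not from the study:

```python
import numpy as np

def bray_curtis(p, q):
    """Dissimilarity between two composition vectors (e.g. pollen percentages)."""
    return np.abs(p - q).sum() / (p + q).sum()

def average_linkage(dist):
    """Minimal agglomerative clustering: repeatedly merge the pair of
    clusters with the smallest mean pairwise dissimilarity."""
    clusters = [[i] for i in range(len(dist))]
    merges = []
    while len(clusters) > 1:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([dist[i][j] for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        d, a, b = best
        merges.append((clusters[a], clusters[b], d))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges

# Toy spectra (% arboreal, % grasses, % cereals) for 4 samples: two wooded, two open
samples = np.array([[80, 15, 5], [75, 20, 5], [10, 60, 30], [5, 65, 30]], float)
n = len(samples)
dist = [[bray_curtis(samples[i], samples[j]) for j in range(n)] for i in range(n)]
merges = average_linkage(dist)
```

The two high-arboreal samples merge first, mirroring how fossil samples with similar vegetation signals group together before joining contrasting land-cover states.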
Sun, Gang; Hoff, Steven J; Zelle, Brian C; Nelson, Minda A
2008-12-01
It is vital to forecast gas and particulate matter concentrations and emission rates (GPCER) from livestock production facilities to assess the impact of airborne pollutants on human health, the ecological environment, and global warming. Modeling source air quality is a complex process because of the abundant nonlinear interactions between GPCER and other factors. The objective of this study was to apply statistical methods and a radial basis function (RBF) neural network to predict daily source air quality in Iowa swine deep-pit finishing buildings. The results show that four variables (outdoor and indoor temperature, animal units, and ventilation rate) were identified as relatively important model inputs using statistical methods. It was further demonstrated that only two factors, an environment factor and an animal factor, were capable of explaining more than 94% of the total variability after performing principal component analysis. Introducing fewer, uncorrelated variables to the neural network reduces model structure complexity, minimizes computation cost, and mitigates model overfitting. The RBF network predictions were in good agreement with the actual measurements, with correlation coefficients between 0.741 and 0.995 and very low values of the systemic performance indexes for all the models. These good results indicate that the RBF network can be trained to model these highly nonlinear relationships. Thus, RBF neural network technology combined with multivariate statistical methods is a promising tool for modeling air pollutant emissions.
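The principal component step, where two factors explain more than 94% of the variability, can be illustrated with a small PCA on standardized inputs. The latent-factor construction below is synthetic (not the Iowa data), built so that four correlated inputs reduce to two underlying factors:

```python
import numpy as np

def explained_variance_ratio(X):
    """PCA on standardized inputs via SVD: fraction of total variance
    carried by each principal component."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    s = np.linalg.svd(Z, compute_uv=False)
    return s ** 2 / (s ** 2).sum()

rng = np.random.default_rng(5)
latent = rng.normal(size=(300, 2))   # stand-ins for an "environment" and an "animal" factor
X = np.column_stack([
    latent[:, 0],                              # e.g. outdoor temperature (hypothetical)
    0.8 * latent[:, 0] + 0.2 * latent[:, 1],   # e.g. indoor temperature
    latent[:, 1],                              # e.g. animal units
    latent[:, 0] - latent[:, 1],               # e.g. ventilation rate
]) + 0.05 * rng.normal(size=(300, 4))
ratio = explained_variance_ratio(X)
```

Because the four inputs are linear mixtures of two latent factors plus small noise, the first two components carry nearly all the variance, mirroring the >94% figure reported above; the two component scores would then feed the RBF network in place of the four raw inputs.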
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle-concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.
FGWAS: Functional genome wide association analysis.
Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-10-01
Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs.
NASA Astrophysics Data System (ADS)
Rbaihi, E.; Belafhal, A.; Vander Auwera, J.; Naı̈m, S.; Fayt, A.
1998-09-01
We have measured the FT spectrum of natural OCS from 4800 to 8000 cm⁻¹ with near-Doppler resolution and a line-position accuracy between 2 and 8 × 10⁻⁴ cm⁻¹. For the normal isotopic species ¹⁶O¹²C³²S, 37 vibrational transitions have been analyzed for both frequencies and intensities. We also report six bands of ¹⁶O¹²C³⁴S, five bands of ¹⁶O¹³C³²S, two bands of ¹⁶O¹²C³³S, and two bands of ¹⁸O¹²C³²S. Important effective Herman-Wallis terms are explained by the anharmonic resonances between closely spaced states. As those results complete the study of the Fourier transform spectra of natural carbonyl sulfide from 1800 to 8000 cm⁻¹, a new global rovibrational analysis of ¹⁶O¹²C³²S has been performed. We have determined a set of 148 molecular parameters, and statistical agreement is obtained with all the available experimental data.
Zhang, Yun-Peng; Qian, Bang-Ping; Qiu, Yong; Qu, Zhe; Mao, Sai-Hu; Jiang, Jun; Zhu, Ze-Zhang
2017-08-01
This is a retrospective study to identify the relationship between global sagittal alignment and health-related quality of life (HRQoL) in ankylosing spondylitis (AS) patients with thoracolumbar kyphosis. Little data are available on the correlation between global sagittal alignment and HRQoL in AS. A total of 107 AS patients were included in this study. The radiographic parameters were measured on lateral radiographs of the whole spine, including the sagittal vertical axis (SVA), spinosacral angle (SSA), spinopelvic angle (SPA), and T1 pelvic angle (TPA). HRQoL was assessed using the Oswestry Disability Index questionnaire, the Bath Ankylosing Spondylitis Disease Activity Index, the Bath Ankylosing Spondylitis Functional Index, and the Short Form-36 questionnaire. The patients were divided into 2 groups: group A (n = 76, global kyphosis ≤ 70 degrees) and group B (n = 31, global kyphosis > 70 degrees). Statistical analysis was performed to identify significant differences between these 2 groups. In addition, correlation analysis and multiple regression analysis between radiologic parameters and clinical questionnaires were conducted. With respect to SVA, SSA, SPA, TPA, and HRQoL scores, significant differences were observed between the 2 groups (P < 0.05). Also, SVA, SSA, SPA, and TPA were significantly related to HRQoL. Multiple regression analysis revealed that SVA, SSA, SPA, and TPA were significant parameters in the prediction of HRQoL in AS patients with thoracolumbar kyphosis. Of note, HRQoL related much more to SSA and SPA than to SVA and TPA. AS patients with moderate and severe deformity were demonstrated to be significantly different in terms of SVA, SSA, SPA, TPA, and HRQoL. Moreover, SVA, SSA, SPA, and TPA correlated with HRQoL significantly. In particular, SSA and SPA could better predict HRQoL than SVA and TPA in AS patients with thoracolumbar kyphosis.
Quasi-Global Precipitation as Depicted in the GPCP V2.2 and TMPA V7
NASA Technical Reports Server (NTRS)
Huffman, George J.; Bolvin, David T.; Nelkin, Eric J.; Adler, Robert F.
2012-01-01
After a lengthy incubation period, the year 2012 saw the release of the Global Precipitation Climatology Project (GPCP) Version 2.2 monthly dataset and the TRMM Multi-satellite Precipitation Analysis (TMPA) Version 7. One primary feature of the new data sets is that DMSP SSMIS data are now used, which entailed a great deal of development work to overcome calibration issues. In addition, the GPCP V2.2 included a slight upgrade to the gauge analysis input datasets, particularly over China, while the TMPA V7 saw more-substantial upgrades: 1) The gauge analysis record in Version 6 used the (older) GPCP monitoring product through April 2005 and the CAMS analysis thereafter, which introduced an inhomogeneity. Version 7 uses the Version 6 GPCC Full analysis, switching to the Version 4 Monitoring analysis thereafter. 2) The inhomogeneously processed AMSU record in Version 6 is uniformly processed in Version 7. 3) The TMI and SSMI input data have been upgraded to the GPROF2010 algorithm. The global-change, water cycle, and other user communities are acutely interested in how these data sets compare, as consistency between differently processed, long-term, quasi-global data sets provides some assurance that the statistics computed from them provide a good representation of the atmosphere's behavior. Within resolution differences, the two data sets agree well over land as the gauge data (which tend to dominate the land results) are the same in both. Over ocean the results differ more because the satellite products used for calibration are based on very different algorithms and the dominant input data sets are different. The time series of tropical (30 N-S) ocean average precipitation shows that the TMPA V7 follows the TMI-PR Combined Product calibrator, although running approximately 5% higher on average. The GPCP and TMPA time series are fairly consistent, although the GPCP runs approximately 10% lower than the TMPA, and has a somewhat larger interannual variation. 
In addition, the GPCP and TMPA interannual variations show an apparent phase shift, with GPCP running a few months later. Additional diagnostics will include mean maps and selected scatter plots.
Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B
2011-09-01
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. 
We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
Dynamics of Autotrophic Marine Planktonic Thaumarchaeota in the East China Sea
Hu, Anyi; Yang, Zao; Yu, Chang-Ping; Jiao, Nianzhi
2013-01-01
The ubiquitous and abundant distribution of ammonia-oxidizing Thaumarchaeota in marine environments is now well documented, and their crucial role in the global nitrogen cycle has been highlighted. However, the potential contribution of Thaumarchaeota in the carbon cycle remains poorly understood. Here we present for the first time a seasonal investigation on the shelf region (bathymetry≤200 m) of the East China Sea (ECS) involving analysis of both thaumarchaeal 16S rRNA and autotrophy-related genes (acetyl-CoA carboxylase gene, accA). Quantitative PCR results clearly showed a higher abundance of thaumarchaeal 16S and accA genes in late-autumn (November) than summer (August), whereas the diversity and community structure of autotrophic Thaumarchaeota showed no statistically significant difference between different seasons as revealed by thaumarchaeal accA gene clone libraries. Phylogenetic analysis indicated that shallow ecotypes dominated the autotrophic Thaumarchaeota in the ECS shelf (86.3% of total sequences), while a novel non-marine thaumarchaeal accA lineage was identified in the Changjiang estuary in summer (when freshwater plumes become larger) but not in autumn, implying that Changjiang freshwater discharge played a certain role in transporting terrestrial microorganisms to the ECS. Multivariate statistical analysis indicated that the biogeography of the autotrophic Thaumarchaeota in the shelf water of the ECS was influenced by complex hydrographic conditions. However, an in silico comparative analysis suggested that the diversity and abundance of the autotrophic Thaumarchaeota might be biased by the ‘universal’ thaumarchaeal accA gene primers Cren529F/Cren981R since this primer set is likely to miss some members within particular phylogenetic groups. 
Collectively, this study improved our understanding of the biogeographic patterns of the autotrophic Thaumarchaeota in temperate coastal waters, and suggested that new accA primers with improved coverage and sensitivity across phylogenetic groups are needed to gain a more thorough understanding of the role of the autotrophic Thaumarchaeota in the global carbon cycle. PMID:23565298
Zhou, Yi-Biao; Liang, Song; Wang, Qi-Xing; Gong, Yu-Han; Nie, Shi-Jiao; Nan, Lei; Yang, Ai-Hui; Liao, Qiang; Song, Xiu-Xia; Jiang, Qing-Wu
2014-03-10
HIV-, HCV- and HIV/HCV co-infections among drug users have become a rapidly emerging global public health problem. In order to constrain the dual epidemics of HIV/AIDS and drug use, China has adopted a methadone maintenance treatment program (MMTP) since 2004. Studies of the geographic heterogeneity of HIV and HCV infections at a local scale are sparse, which has critical implications for future MMTP implementation and health policies covering both HIV and HCV prevention among drug users in China. This study aimed to characterize geographic patterns of HIV and HCV prevalence at the township level among drug users in a Yi Autonomous Prefecture in southwest China. Data on demographic and clinical characteristics of all clients in the 11 MMTP clinics of the Yi Autonomous Prefecture from March 2004 to December 2012 were collected. A GIS-based geographic analysis involving geographic autocorrelation analysis and geographic scan statistics was employed to identify the geographic distribution pattern of HIV-, HCV- and co-infections among drug users. A total of 6690 MMTP clients were analyzed. The prevalences of HIV-, HCV-, and co-infection were 25.2%, 30.8%, and 10.9%, respectively. There were significant global and local geographic autocorrelations for HIV-, HCV-, and co-infection. The Moran's I was 0.3015, 0.3449, and 0.3155, respectively (P < 0.0001). Both the geographic autocorrelation analysis and the geographic scan statistical analysis showed that HIV-, HCV-, and co-infections in the prefecture exhibited significant geographic clustering at the township level. The geographic distribution pattern of each infection group was different. HIV-, HCV-, and co-infections among drug users in the Yi Autonomous Prefecture all exhibited substantial geographic heterogeneity at the township level. The geographic distribution patterns of the three groups were different.
These findings imply that site-specific intervention strategies may be needed to better direct currently limited resources toward combating these two viruses.
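The global Moran's I values reported above (0.30-0.35 per infection group) quantify spatial autocorrelation: positive values indicate that similar prevalence values cluster in space. A minimal sketch of the statistic, assuming a generic value vector and a user-supplied spatial weight matrix (the study's actual township adjacency structure is not reproduced here):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for values at n locations given an n x n spatial weight matrix."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()                # deviations from the mean
    num = n * (z @ w @ z)           # n * sum_ij w_ij z_i z_j
    den = w.sum() * (z @ z)         # W * sum_i z_i^2
    return num / den

# Illustrative 4 locations on a line, binary adjacency weights.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
i_clustered = morans_i([1, 1, 5, 5], w)   # spatially clustered values -> positive I
```

Values near +1 indicate strong clustering, values near -1 indicate checkerboard-like dispersion, and values near a small negative expectation (-1/(n-1)) indicate spatial randomness.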
Characterizing and Addressing the Need for Statistical Adjustment of Global Climate Model Data
NASA Astrophysics Data System (ADS)
White, K. D.; Baker, B.; Mueller, C.; Villarini, G.; Foley, P.; Friedman, D.
2017-12-01
As part of its mission to research and measure the effects of the changing climate, the U. S. Army Corps of Engineers (USACE) regularly uses the World Climate Research Programme's Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model dataset. However, these data are generated at a global level and are not fine-tuned for specific watersheds. This often causes CMIP5 output to vary from locally observed patterns in the climate. Several downscaling methods have been developed to increase the resolution of the CMIP5 data and decrease systemic differences to support decision-makers as they evaluate results at the watershed scale. Evaluating preliminary comparisons of observed and projected flow frequency curves over the US revealed a simple framework for water resources decision makers to plan and design water resources management measures under changing conditions using standard tools. Using this framework as a basis, USACE has begun to explore the use of statistical adjustment to alter global climate model data to better match the locally observed patterns while preserving the general structure and behavior of the model data. When paired with careful measurement and hypothesis testing, statistical adjustment can be particularly effective at navigating the compromise between the locally observed patterns and the global climate model structures for decision makers.
Evolution of regional to global paddy rice mapping methods
NASA Astrophysics Data System (ADS)
Dong, J.; Xiao, X.
2016-12-01
Paddy rice agriculture plays an important role in various environmental issues including food security, water use, climate change, and disease transmission. However, regional and global paddy rice maps are surprisingly scarce and sporadic despite numerous efforts in paddy rice mapping algorithms and applications. In this presentation we review the existing paddy rice mapping methods from the literature, ranging from the 1980s to 2015. In particular, we illustrate the evolution of these paddy rice mapping efforts, looking specifically at the future trajectory of paddy rice mapping methodologies. The biophysical features and growth phases of paddy rice were analyzed first, and feature selections for paddy rice mapping were analyzed from spectral, polarimetric, temporal, spatial, and textural aspects. We sorted paddy rice mapping algorithms into four categories: 1) reflectance data and image statistic-based approaches, 2) vegetation index (VI) data and enhanced image statistic-based approaches, 3) VI or RADAR backscatter-based temporal analysis approaches, and 4) phenology-based approaches through remote sensing recognition of key growth phases. The phenology-based approaches, which use unique features of paddy rice (e.g., transplanting) for mapping, have been increasingly adopted. Based on the literature review, we discuss a series of issues for large scale operational paddy rice mapping.
Level set method for image segmentation based on moment competition
NASA Astrophysics Data System (ADS)
Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai
2015-05-01
We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.
The challenges for scientific publishing, 60 years on.
Hausmann, Laura; Murphy, Sean P
2016-10-01
The most obvious difference in science publishing between 'then' and 'now' is the dramatic change in the communication of data and in their interpretation. The democratization of science via the Internet has brought not only benefits but also challenges to publishing, including fraudulent behavior and plagiarism, data and statistics reporting standards, authorship confirmation, and other issues which affect authors, readers, and publishers in different ways. The wide accessibility of data on a global scale permits acquisition and meta-analysis to mine for novel synergies, and has created a highly commercialized environment. As we illustrate here, identifying unacceptable practices leads to changes in the standards for data reporting. This article reviews these benefits and challenges with the aim of providing readers with practical examples and hands-on guidelines. This article is part of the 60th Anniversary special issue. © 2016 International Society for Neurochemistry.
Background noise spectra of global seismic stations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wada, M.M.; Claassen, J.P.
1996-08-01
Over an extended period of time station noise spectra were collected from various sources for use in estimating the detection and location performance of global networks of seismic stations. As the database of noise spectra enlarged and duplicate entries became available, an effort was mounted to more carefully select station noise spectra while discarding others. This report discusses the methodology and criteria by which the noise spectra were selected. It also identifies and illustrates the station noise spectra which survived the selection process and which currently contribute to the modeling efforts. The resulting catalog of noise statistics not only benefits those who model network performance but also those who wish to select stations on the basis of their noise level, as may occur in designing networks or in selecting seismological data for analysis. In view of the various ways by which station noise was estimated by the different contributors, it is advisable that future efforts which predict network performance have available station noise data and spectral estimation methods which are compatible with the statistics underlying seismic noise. This appropriately requires (1) averaging noise over seasonal and/or diurnal cycles, (2) averaging noise over time intervals comparable to those employed by actual detectors, and (3) using logarithmic measures of the noise.
NASA Technical Reports Server (NTRS)
Ricks, Trenton M.; Lacy, Thomas E., Jr.; Bednarcyk, Brett A.; Arnold, Steven M.; Hutchins, John W.
2014-01-01
A multiscale modeling methodology was developed for continuous fiber composites that incorporates a statistical distribution of fiber strengths into coupled multiscale micromechanics/finite element (FE) analyses. A modified two-parameter Weibull cumulative distribution function, which accounts for the effect of fiber length on the probability of failure, was used to characterize the statistical distribution of fiber strengths. A parametric study using the NASA Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) was performed to assess the effect of variable fiber strengths on local composite failure within a repeating unit cell (RUC) and subsequent global failure. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a unidirectional SCS-6/TIMETAL 21S metal matrix composite tensile dogbone specimen at 650 °C. Multiscale progressive failure analyses were performed to quantify the effect of spatially varying fiber strengths on the RUC-averaged and global stress-strain responses and failure. The ultimate composite strengths and distribution of failure locations (predominately within the gage section) reasonably matched the experimentally observed failure behavior. The predicted composite failure behavior suggests that use of macroscale models that exploit global geometric symmetries are inappropriate for cases where the actual distribution of local fiber strengths displays no such symmetries. This issue has not received much attention in the literature. Moreover, the model discretization at a specific length scale can have a profound effect on the computational costs associated with multiscale simulations, which must be balanced against the need for models that yield accurate yet tractable results.
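The length-dependent two-parameter Weibull model described above can be sampled by inverse-transform sampling under the standard weakest-link form P_f(σ; L) = 1 − exp[−(L/L0)(σ/σ0)^m]. A minimal sketch; the scale, modulus, and length values below are illustrative, not the calibrated SCS-6 fiber parameters from the study:

```python
import math
import random

def sample_fiber_strength(rng, sigma0, m, length, l0):
    """Draw one fiber strength from a length-scaled two-parameter Weibull distribution.

    sigma0: scale (reference strength), m: Weibull modulus,
    length: fiber gauge length, l0: reference length for sigma0.
    Inverse CDF: sigma = sigma0 * ((l0/length) * (-ln(1 - u)))**(1/m).
    """
    u = rng.random()
    return sigma0 * ((l0 / length) * (-math.log(1.0 - u))) ** (1.0 / m)

# Weakest-link behavior: longer fibers are statistically weaker.
rng = random.Random(0)
strengths = [sample_fiber_strength(rng, 3000.0, 10.0, 1.0, 1.0) for _ in range(5)]
```

In a multiscale analysis each fiber in the RUC would receive an independent draw, which is what breaks the global geometric symmetry noted in the abstract.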
SERE: single-parameter quality control and sample comparison for RNA-Seq.
Schulze, Stefan K; Kanwar, Rahul; Gölzenleuchter, Meike; Therneau, Terry M; Beutler, Andreas S
2012-10-03
Assessing the reliability of experimental replicates (or global alterations corresponding to different experimental conditions) is a critical step in analyzing RNA-Seq data. Pearson's correlation coefficient r has been widely used in the RNA-Seq field even though its statistical characteristics may be poorly suited to the task. Here we present a single-parameter test procedure for count data, the Simple Error Ratio Estimate (SERE), that can determine whether two RNA-Seq libraries are faithful replicates or globally different. Benchmarking shows that the interpretation of SERE is unambiguous regardless of the total read count or the range of expression differences among bins (exons or genes), a score of 1 indicating faithful replication (i.e., samples are affected only by Poisson variation of individual counts), a score of 0 indicating data duplication, and scores >1 corresponding to true global differences between RNA-Seq libraries. By contrast, the interpretation of Pearson's r is generally ambiguous and highly dependent on sequencing depth and the range of expression levels inherent to the sample (difference between lowest and highest bin count). Cohen's simple Kappa results are also ambiguous and are highly dependent on the choice of bins. For quantifying global sample differences SERE performs similarly to a measure based on the negative binomial distribution yet is simpler to compute. SERE can therefore serve as a straightforward and reliable statistical procedure for the global assessment of pairs or large groups of RNA-Seq datasets by a single statistical parameter.
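One published formulation of SERE is a chi-square-type dispersion of observed counts around expected counts proportional to library size, normalized by the number of genes and samples. A hedged sketch of that formulation; the authors' implementation may differ in gene filtering and edge-case handling:

```python
import numpy as np

def sere(counts):
    """Simple Error Ratio Estimate for a (genes x samples) matrix of raw read counts.

    ~1 for Poisson-faithful replicates, 0 for duplicated data, >1 for global differences.
    """
    y = np.asarray(counts, dtype=float)
    y = y[y.sum(axis=1) > 0]                        # drop genes with zero total count
    n_genes, n_samples = y.shape
    lib = y.sum(axis=0)                             # library sizes
    e = np.outer(y.sum(axis=1), lib) / lib.sum()    # expected counts per gene/sample
    disp = ((y - e) ** 2 / e).sum()                 # chi-square-like total dispersion
    return np.sqrt(disp / (n_genes * (n_samples - 1)))

# A duplicated library pair collapses to SERE = 0 by construction.
s_dup = sere([[10, 10], [5, 5], [20, 20]])
```

Because the expectation is formed per gene from pooled counts, the score is invariant to a global sequencing-depth difference, which is exactly the property Pearson's r lacks.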
NASA Astrophysics Data System (ADS)
Lin, J.
2017-12-01
Recent studies have revealed the issue of globalizing air pollution through complex coupling of atmospheric transport (physical route) and economic trade (socioeconomic route). Recognition of such globalizing air pollution has important implications for understanding the impacts of regional and global consumption (of goods and services) on air quality, public health, climate and the ecosystems. Addressing these questions often requires improved modeling, measurements and economic-emission statistics. This talk will introduce the concept and mechanism of globalizing air pollution, followed by demonstrations based on recent work on modeling, satellite measurement and multi-disciplinary assessment.
Estimating the volume and age of water stored in global lakes using a geo-statistical approach
Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver
2016-01-01
Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 106 km2 (1.8% of global land area), a total shoreline length of 7.2 × 106 km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 103 km3 (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively. PMID:27976671
NASA Astrophysics Data System (ADS)
Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge
2018-04-01
Having great impacts on human lives, global warming and associated sea level rise are believed to be strongly linked to anthropogenic causes. A statistical approach offers a simple yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on the empirical dynamic control system, taking into account the climate variability and deriving parameters from Monte Carlo cross-validation random experiments. For the historical data from 1880 to 2001, our model yielded higher correlations than other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely, the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust, as it notably diminishes the instability associated with varying initial values. Such results suggest that the model not only enhances significantly the global mean reconstructions of temperature and sea level but also may have a potential to improve future projections.
The Impact of Arts Activity on Nursing Staff Well-Being: An Intervention in the Workplace
Karpavičiūtė, Simona; Macijauskienė, Jūratė
2016-01-01
Over 59 million workers are employed in the healthcare sector globally, with a daily risk of being exposed to a complex variety of health and safety hazards. The purpose of this study was to investigate the impact of arts activity on the well-being of nursing staff. During October-December 2014, 115 nursing staff working in a hospital took part in this study, which lasted for 10 weeks. The intervention group (n = 56) took part in silk painting activities once a week. Data was collected using socio-demographic questions, the Warwick-Edinburgh Mental Well-Being Scale, the Short Form-36 Health Survey questionnaire, the Reeder stress scale, and the Multidimensional Fatigue Inventory (before and after art activities in both groups). Statistical data analysis included descriptive statistics (frequency, percentage, mean, standard deviation), non-parametric statistical analysis (Mann-Whitney U test; Wilcoxon signed-rank test), Fisher's exact test and reliability analysis (Cronbach's Alpha). The level of significance was set at p ≤ 0.05. In the intervention group, participation in arts activity tended to have a positive impact on general health and mental well-being, reducing stress and fatigue, awakening creativity and increasing a sense of community at work. The control group did not show any improvements. Of the intervention group, 93% reported enjoyment, with 75% aspiring to continue arts activity in the future. This research suggests that arts activity, as a workplace intervention, can be used to promote nursing staff well-being at work. PMID:27104550
SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliopoulos, AS; Sun, X; Floros, D
Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions.
Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
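The per-scale local statistics described in the Methods can be sketched with plain NumPy sliding windows. This toy version computes only local mean and standard deviation at a single window size, whereas the paper's mechanism builds a full pyramid over several scales with additional statistics (median, MAD, local histograms):

```python
import numpy as np

def local_stats(img, size):
    """Local mean and standard deviation over size x size windows (odd size),
    with reflect padding so the output has the same shape as the input."""
    pad = size // 2
    x = np.pad(np.asarray(img, dtype=float), pad, mode="reflect")
    win = np.lib.stride_tricks.sliding_window_view(x, (size, size))
    return win.mean(axis=(-2, -1)), win.std(axis=(-2, -1))

# A multi-scale "pyramid" is then just the same statistics at growing scales;
# inter-scale differences flag where the noise character changes spatially.
img = np.full((8, 8), 5.0)
pyramid = {s: local_stats(img, s) for s in (3, 5, 7)}
```

In an adaptive filter, the per-pixel standard deviation map from such a pyramid could replace the single global noise parameter of a non-adaptive filter.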
NASA Astrophysics Data System (ADS)
Octavianty, Toharudin, Toni; Jaya, I. G. N. Mindra
2017-03-01
Tuberculosis (TB) is a disease caused by a bacterium, called Mycobacterium tuberculosis, which typically attacks the lungs but can also affect the kidney, spine, and brain (Centers for Disease Control and Prevention). Indonesia had the largest number of TB cases after India (Global Tuberculosis Report 2015 by WHO). The distribution of Mycobacterium tuberculosis genotypes in Indonesia shows high genetic diversity and tends to vary by geographic region. In Bandung city, for instance, the prevalence rate of TB morbidity is quite high. The number of TB patients is count data, so Geographically Weighted Poisson Regression Semiparametric (GWPRS) can be used to determine the factors that significantly influence the number of tuberculosis patients at each observation location. GWPRS is an extension of Poisson regression and GWPR that accounts for geographical factors and allows some variables to act globally and others locally. Using TB data for Bandung city (in 2015), the results identify both global and local variables that influence the number of tuberculosis patients in each sub-district.
Measuring Spatial Dependence for Infectious Disease Epidemiology
Grabowski, M. Kate; Cummings, Derek A. T.
2016-01-01
Global spatial clustering is the tendency of points, here cases of infectious disease, to occur closer together than expected by chance. The extent of global clustering can provide a window into the spatial scale of disease transmission, thereby providing insights into the mechanism of spread, and informing optimal surveillance and control. Here the authors present an interpretable measure of spatial clustering, τ, which can be understood as a measure of relative risk. When biological or temporal information can be used to identify sets of potentially linked and likely unlinked cases, this measure can be estimated without knowledge of the underlying population distribution. The greater our ability to distinguish closely related (i.e., separated by few generations of transmission) from more distantly related cases, the more closely τ will track the true scale of transmission. The authors illustrate this approach using examples from the analyses of HIV, dengue and measles, and provide an R package implementing the methods described. The statistic presented, and measures of global clustering in general, can be powerful tools for analysis of spatially resolved data on infectious diseases. PMID:27196422
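The τ statistic described above can be read as a relative risk: the probability that a pair of cases within a given distance is "related", divided by that probability over all pairs. A minimal sketch, assuming the user supplies the relatedness criterion (in practice derived from biological or temporal linkage between cases); the published R package adds distance bands and bootstrap confidence intervals:

```python
import math

def tau_statistic(points, related, d_max):
    """Relative risk of relatedness for case pairs within d_max.

    points: list of (x, y) case locations.
    related: function(i, j) -> bool, True if cases i and j are potentially linked.
    Returns tau > 1 when related pairs cluster within d_max.
    """
    near_rel = near_tot = all_rel = all_tot = 0
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            r = related(i, j)
            all_tot += 1
            all_rel += r
            if math.dist(points[i], points[j]) <= d_max:
                near_tot += 1
                near_rel += r
    return (near_rel / near_tot) / (all_rel / all_tot)

# Two tight clusters; pairs within a cluster are "related".
pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 0.0), (5.1, 0.0)]
same_cluster = lambda i, j: (i < 2) == (j < 2)
tau = tau_statistic(pts, same_cluster, 1.0)
```

Because τ is a ratio of proportions among cases, it requires no knowledge of the underlying population distribution, which is the key practical advantage noted in the abstract.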
Liaw, Winston; Bazemore, Andrew; Xierali, Imam; Walden, John; Diller, Phillip; Morikawa, Masahiro J
2013-04-01
Global health tracks (GHTs) improve knowledge and skills, but their impact on career plans is unclear. The objective of this analysis was to determine whether GHT participants are more likely to practice in underserved areas than nonparticipants. In this retrospective cohort study, using the 2009 American Medical Association Masterfile, we assessed the practice locations of the 480 graduates from 1980-2008 of two family medicine residencies, Residency 1 and Residency 2. The outcomes of interest were the percentage of graduates in health professional shortage areas (HPSAs), medically underserved areas (MUAs), rural areas, areas of dense poverty, or any area of underservice. Thirty-seven percent of Residency 1 participants and 20% of nonparticipants practiced in HPSAs; 69% of Residency 2 participants and 55.5% of nonparticipants practiced in areas of dense poverty. All other combined and within-residency differences were not statistically significant. These findings neither confirm nor refute the results of prior surveys suggesting that global health training is associated with increased interest in underserved care. Studies involving more GHTs and complementary methods are needed to more precisely elucidate the impact of this training.
NASA Astrophysics Data System (ADS)
Ayanshola, Ayanniyi; Olofintoye, Oluwatosin; Obadofin, Ebenezer
2018-03-01
This study presents the impact of global warming on precipitation patterns in Ilorin, Nigeria, and its implications for the hydrological balance of the Awun basin under the prevailing climate conditions. The study analyzes 39 years of rainfall and temperature data from the relevant stations within the study area. Simulated historical and future datasets from the Coupled Global Climate Model were investigated under the A2 emission scenario. Statistical regression and a Mann-Kendall analysis were performed to determine the nature of the trends in the hydrological variables and their significance levels, while the Soil and Water Assessment Tool (SWAT) was used to estimate the water balance and derive the stream flow and yield of the Awun basin. The study revealed that while minimum and maximum temperatures in Ilorin are increasing, rainfall is generally decreasing. The assessment of the trends in the water balance parameters of the basin indicates that there is no improvement in the water yield as the population increases. This may result in major stresses on the water supply in the near future.
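The Mann-Kendall test mentioned above is a nonparametric trend test based on the sign of all pairwise differences in a time series. The sketch below is a minimal Python implementation using the standard normal approximation; it omits the tie correction and the autocorrelation adjustments that a production hydrological analysis would normally apply.

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test for a time series.

    Returns (S, z): S > 0 suggests an increasing trend, S < 0 a
    decreasing one; |z| > 1.96 indicates significance at the 5% level
    under the normal approximation (no tie correction applied).
    """
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += (series[j] > series[i]) - (series[j] < series[i])
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A strictly increasing 10-point series gives S = 45 with z ≈ 3.94, a significant upward trend, while a strictly decreasing 5-point series gives S = -10 with z below -1.96.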
Tropospheric Ozone Assessment Report: Database and Metrics Data of Global Surface Ozone Observations
Schultz, Martin G.; Schroder, Sabine; Lyapina, Olga; ...
2017-11-27
In support of the first Tropospheric Ozone Assessment Report (TOAR), a relational database of global surface ozone observations has been developed and populated with hourly measurement data and enhanced metadata. A comprehensive suite of ozone data products, including standard statistics, health and vegetation impact metrics, and trend information, is made available through a common data portal and a web interface. These data form the basis of the TOAR analyses focusing on human health, vegetation, and climate-relevant ozone issues, which are part of this special feature. Cooperation among many data centers and individual researchers worldwide made it possible to build the world's largest collection of in-situ hourly surface ozone data, covering the period from 1970 to 2015. By combining the data from almost 10,000 measurement sites around the world with global metadata information, new analyses of surface ozone have become possible, such as the first globally consistent characterisations of measurement sites as either urban or rural/remote. Exploitation of these global metadata allows for new insights into the global distribution and the seasonal and long-term changes of tropospheric ozone, and enables TOAR to perform the first globally consistent analysis of present-day ozone concentrations and recent ozone changes with relevance to health, agriculture, and climate. Considerable effort was made to harmonize and synthesize data formats and metadata information from the various networks and individual data submissions. Extensive quality control was applied to identify questionable and erroneous data, including changes in apparent instrument offsets or calibrations; such data were excluded from TOAR data products. Limitations of a posteriori data quality assurance are discussed. As a result of the work presented here, global coverage of surface ozone data for scientific analysis has been significantly extended.
Yet, large gaps remain in the surface observation network, both in terms of regions without monitoring and in terms of regions that have monitoring programs but no public access to the data archive. Therefore, future improvements to the database will require not only improved data harmonization, but also expanded data sharing and increased monitoring in data-sparse regions.
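Health-relevant ozone metrics of the kind served by the TOAR database, such as the maximum daily 8-hour average (MDA8), are computed from hourly values after quality screening. The sketch below illustrates both steps in Python; the thresholds and rules are illustrative assumptions, not TOAR's actual criteria, which additionally impose data-capture requirements and handle 8-hour windows that cross midnight.

```python
def qc_flags(hourly, low=0.0, high=250.0, max_jump=50.0):
    """Flag implausible hourly ozone values (ppb): readings outside
    [low, high] and hour-to-hour jumps larger than max_jump.

    Note: the jump rule also flags the first good reading after a
    spike; a real QC chain would be more discriminating.
    """
    flags = [not (low <= v <= high) for v in hourly]
    for i in range(1, len(hourly)):
        if abs(hourly[i] - hourly[i - 1]) > max_jump:
            flags[i] = True
    return flags

def mda8(hourly):
    """Maximum 8-hour running mean from one day of hourly values,
    a simplified version of the MDA8 health metric."""
    if len(hourly) < 8:
        raise ValueError("need at least 8 hourly values")
    windows = [sum(hourly[i:i + 8]) / 8 for i in range(len(hourly) - 7)]
    return max(windows)
```

For a day of hourly values 0, 1, ..., 23 ppb the highest 8-hour window is hours 16-23, giving an MDA8 of 19.5 ppb, and a single 400 ppb spike in an otherwise normal series is flagged by both the range and jump rules.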
NASA Astrophysics Data System (ADS)
Hopke, Jill E.
In this dissertation, I study the network structure and content of a transnational movement against hydraulic fracturing and shale development, Global Frackdown. I apply a relational perspective to the study of the role of digital technologies in transnational political organizing. I examine the structure of the social movement through analysis of hyperlinking patterns and qualitative analysis of the content of the ties in one strand of the movement. I explicate three actor types: coordinator, broker, and hyper-local. This research intervenes in the paradigm that considers international actors the key nodes for understanding transnational advocacy networks. I argue that this focus on the international scale obscures the role of globally minded local groups in mediating global issues back to the hyper-local scale. While international NGOs play a coordinating role, local groups with a global worldview can connect transnational movements to the hyper-local scale by networking with groups that are too small to appear in a transnational network. I also examine the movement's messaging on the social media platform Twitter. Findings show that Global Frackdown tweeters engage in the framing practices of movement convergence and solidarity, declarative and targeted engagement, prefabricated messaging, and multilingual tweeting. The episodic, loosely coordinated, and often personalized transnational framing practices of Global Frackdown tweeters support core organizers' goal of promoting the globalness of activism to ban fracking. Global Frackdown activists use Twitter as a tool to advance the movement and to bolster its moral authority, as well as to forge linkages between localized groups on a transnational scale. Lastly, I study the relative prominence of negative messaging about shale development, compared with pro-shale messaging, on Twitter across five hashtags (#fracking, #globalfrackdown, #natgas, #shale, and #shalegas).
I analyze the top actors tweeting using the #fracking hashtag and receiving mentions with the hashtag. Results show statistically significant differences in the sentiment about shale development across the five hashtags. Results also indicate that the discourse on the main contested hashtag #fracking is dominated by activists, both individual activists and organizations.
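One simple structural reading of the "broker" role described above treats brokers as cut vertices: nodes whose removal disconnects otherwise-connected parts of a hyperlink network. The Python sketch below implements that reading over a small undirected graph; it is an illustrative simplification and not the dissertation's method, which combines hyperlink analysis with qualitative reading of tie content.

```python
def components(nodes, edges):
    """Count connected components of an undirected graph."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, count = set(), 0
    for n in nodes:
        if n not in seen:
            count += 1
            stack = [n]
            while stack:
                u = stack.pop()
                if u in seen:
                    continue
                seen.add(u)
                stack.extend(adj[u] - seen)
    return count

def brokers(nodes, edges):
    """Nodes whose removal increases the number of connected
    components, i.e. cut vertices read as structural brokers."""
    base = components(nodes, edges)
    result = []
    for n in nodes:
        rest = [m for m in nodes if m != n]
        kept = [(a, b) for a, b in edges if n not in (a, b)]
        if components(rest, kept) > base:
            result.append(n)
    return result
```

In a star-shaped hyperlink network where a single coordinating site links three otherwise-unconnected local groups, only the hub is identified as a broker.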