Sample records for latin hypercube sampling

  1. Progressive Sampling Technique for Efficient and Robust Uncertainty and Sensitivity Analysis of Environmental Systems Models: Stability and Convergence

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, R.; Hosseini, N.; Razavi, S.

    2016-12-01

    Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models many times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; in contrast, PLHS generates a series of smaller sub-sets (also called `slices') such that: (1) each sub-set is a Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive addition of sub-sets remains a Latin hypercube; and thus (3) the entire sample set is a Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over existing methods, particularly because it largely avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by running only the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
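
    The stratification property described in this abstract (each one-dimensional projection occupies every equal-probability interval exactly once) is the defining feature of any Latin hypercube sample. The following is a minimal sketch of plain one-stage LHS in Python/NumPy for reference; it is not the PLHS algorithm of the abstract, and the function name and sizes are illustrative only.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """Plain one-stage LHS: n points in [0, 1)^d with exactly one point
    in each of the n equal-probability strata of every 1-D projection."""
    rng = np.random.default_rng(seed)
    # One uniform draw inside each of the n strata, per dimension.
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    # An independent random permutation of the strata in each dimension
    # breaks the artificial ordering between columns.
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

sample = latin_hypercube(10, 3, seed=42)
# Every column hits each interval [k/10, (k+1)/10) exactly once.
assert all(len(set(np.floor(sample[:, j] * 10).astype(int))) == 10 for j in range(3))
```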

  2. A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Wyss, Gregory Dane

    2004-07-01

    This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.

  3. A Novel Latin Hypercube Algorithm via Translational Propagation

    PubMed Central

    Pan, Guang; Ye, Pengcheng

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost of constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration (TPSLE) algorithm is developed without using formal optimization. The TPSLE algorithm is based on the idea that a near-optimal Latin hypercube design can be constructed by translating a small initial block of points generated by the successive local enumeration (SLE) algorithm. In effect, TPSLE offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time while having acceptable space-filling and projective properties. PMID:25276844

  4. Optimization of an Advanced Multi-Junction Solar-Cell Design for Space Environments (AM0) Using Nearly Orthogonal Latin Hypercubes

    DTIC Science & Technology

    2017-06-01

    Report fragment (title page and documentation form): Optimization of an Advanced Multi-Junction Solar-Cell Design for Space Environments (AM0) Using Nearly Orthogonal Latin Hypercubes, by Silvio Pueschel, June 2017. The thesis models multi-junction solar cells with Silvaco Atlas simulation software and introduces the nearly orthogonal Latin hypercube (NOLH) design of experiments (DoE).

  5. Identification of stochastic interactions in nonlinear models of structural mechanics

    NASA Astrophysics Data System (ADS)

    Kala, Zdeněk

    2017-07-01

    In this paper, a polynomial approximation is presented with which Sobol sensitivity analysis can be evaluated for all sensitivity indices. A nonlinear FEM model is approximated. The input space is mapped using simulation runs of the Latin hypercube sampling method. The domain of the approximating polynomial is chosen so that a large number of Latin hypercube simulation runs can be applied. The presented method also makes it possible to evaluate higher-order sensitivity indices, which could not be identified for the nonlinear FEM model directly.

  6. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and with VQT samples. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.

  7. Extension of latin hypercube samples with correlated variables.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hora, Stephen Curtis; Helton, Jon Craig; Sallaberry, Cedric J. PhD.

    2006-11-01

    A procedure for extending the size of a Latin hypercube sample (LHS) with rank correlated variables is described and illustrated. The extension procedure starts with an LHS of size m and associated rank correlation matrix C and constructs a new LHS of size 2m that contains the elements of the original LHS and has a rank correlation matrix that is close to the original rank correlation matrix C. The procedure is intended for use in conjunction with uncertainty and sensitivity analysis of computationally demanding models in which it is important to make efficient use of a necessarily limited number of model evaluations.

  8. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; thus LHS UNIX Library/Standalone provides a way to generate multivariate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS UNIX Library/Standalone uses the Latin Hypercube Sampling method (LHS) to generate samples. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, then the n values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
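
    The "restricted pairing" step mentioned above is commonly implemented with an Iman-Conover style procedure: each sampled column is reordered so that its rank order follows a set of correlated scores. The sketch below is a simplified illustration of that idea, not the Sandia LHS code; the helper names and the target correlation matrix are made up for the example.

```python
import numpy as np
from scipy.stats import rankdata, spearmanr

def lhs_uniform(n, d, rng):
    """Plain LHS on the unit hypercube (one point per stratum and dimension)."""
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

def restricted_pairing(samples, target_corr, rng):
    """Reorder each LHS column so its ranks follow correlated normal scores
    (Iman-Conover style restricted pairing); the marginals are unchanged."""
    n, d = samples.shape
    scores = rng.standard_normal((n, d)) @ np.linalg.cholesky(target_corr).T
    paired = np.empty_like(samples)
    for j in range(d):
        ranks = rankdata(scores[:, j], method="ordinal").astype(int) - 1
        paired[:, j] = np.sort(samples[:, j])[ranks]
    return paired

rng = np.random.default_rng(0)
target = np.array([[1.0, 0.7], [0.7, 1.0]])
x = restricted_pairing(lhs_uniform(500, 2, rng), target, rng)
rho, _ = spearmanr(x[:, 0], x[:, 1])
print(rho)   # close to the requested rank correlation of 0.7
```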

  9. Remote sensing data with the conditional latin hypercube sampling and geostatistical approach to delineate landscape changes induced by large chronological physical disturbances.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Wang, Cheng-Long; Yu, Hsiao-Hsuan; Wang, Yung-Chieh

    2009-01-01

    This study applies variogram analyses of normalized difference vegetation index (NDVI) images derived from SPOT HRV images obtained before and after the Chi-Chi earthquake in the Chenyulan watershed, Taiwan, as well as images after four large typhoons, to delineate the spatial patterns, spatial structures and spatial variability of landscapes caused by these large disturbances. The conditional Latin hypercube sampling approach was applied to select samples from multiple NDVI images. Kriging and sequential Gaussian simulation with sufficient samples were then used to generate maps of NDVI images. The variography of the NDVI image results demonstrates that spatial patterns of disturbed landscapes were successfully delineated by variogram analysis in the study areas. The high-magnitude Chi-Chi earthquake created spatial landscape variations in the study area. After the earthquake, the cumulative impacts of typhoons on landscape patterns depended on the magnitudes and paths of the typhoons, but were not always evident in the spatiotemporal variability of landscapes in the study area. The statistics and spatial structures of multiple NDVI images were captured by 3,000 samples from 62,500 grids in the NDVI images. Kriging and sequential Gaussian simulation with the 3,000 samples effectively reproduced spatial patterns of NDVI images. Overall, the proposed approach, which integrates the conditional Latin hypercube sampling approach, variogram, kriging and sequential Gaussian simulation of remotely sensed images, efficiently monitors, samples and maps the effects of large chronological disturbances on spatial characteristics of landscape changes including spatial variability and heterogeneity.

  10. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.

  11. Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping.

    NASA Astrophysics Data System (ADS)

    Hamalainen, Sampsa; Geng, Xiaoyuan; He, Juanxia

    2017-04-01

    The Latin hypercube sampling (LHS) approach to assist with digital soil mapping has been under development for some time; the purpose of this work was to complement LHS with the use of multiple spatial resolutions of covariate datasets and with variability in the number of sampling points produced. This allowed specific sets of LHS points to be produced to fulfil the needs of various partners from multiple projects working in the Ontario and Prince Edward Island provinces of Canada. Secondary soil and environmental attributes are critical inputs required in the development of sampling points by LHS. These include a Digital Elevation Model (DEM) and the covariate datasets produced by a digital terrain analysis performed on the DEM; these covariates often include, but are not limited to, the Topographic Wetness Index (TWI), the Length-Slope (LS) Factor, and Slope, which are continuous data. The number of points created by LHS ranged from 50 to 200, depending on the size of the watershed and, more importantly, the number of soil types found within it. The spatial resolution of the covariates ranged from 5 to 30 m. The iterations within the LHS sampling were run at a level chosen so that the LHS model provided a good spatial representation of the environmental attributes within the watershed. Covariates that are categorical in nature, such as external surficial geology data, were also included in the Latin hypercube sampling approach. Initial results include the use of 1000 iterations within the LHS model; 1000 iterations was consistently a reasonable value for producing sampling points that gave a good spatial representation of the environmental attributes. When working at the same spatial resolution of covariates but modifying only the desired number of sampling points, the resulting point locations showed a strong geospatial relationship when using continuous data. Access to agricultural fields and adjacent land uses is often identified as the greatest deterrent to performing soil sampling for both soil survey and soil attribute validation work; the lack of access can result from poor road access and/or geographical conditions that are difficult for field crews to navigate. This remains a simple yet persistent issue for the scientific community and, in particular, for soils professionals. A future contribution to the Latin hypercube sampling approach will be to assist with ease of access to sampling points: by removing inaccessible locations from the DEM at the outset, the LHS model can be restricted to locations reachable from an adjacent road or trail. To further the approach, a road network geospatial dataset can be included within Geographic Information Systems (GIS) applications to reach the produced points using a shortest-distance network method.

  12. Improvements in sub-grid, microphysics averages using quadrature based approaches

    NASA Astrophysics Data System (ADS)

    Chowdhary, K.; Debusschere, B.; Larson, V. E.

    2013-12-01

    Sub-grid variability in microphysical processes plays a critical role in atmospheric climate models. In order to account for this sub-grid variability, Larson and Schanen (2013) propose placing a probability density function on the sub-grid cloud microphysics quantities, e.g. autoconversion rate, essentially interpreting the cloud microphysics quantities as a random variable in each grid box. Random sampling techniques, e.g. Monte Carlo and Latin Hypercube, can be used to calculate statistics, e.g. averages, on the microphysics quantities, which then feed back into the model dynamics on the coarse scale. We propose an alternate approach using numerical quadrature methods based on deterministic sampling points to compute the statistical moments of microphysics quantities in each grid box. We have performed a preliminary test on the Kessler autoconversion formula, and, upon comparison with Latin Hypercube sampling, our approach shows an increased level of accuracy with a reduction in sample size by almost two orders of magnitude. Application to other microphysics processes is the subject of ongoing research.
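
    As a toy illustration of the trade-off described in this abstract, a handful of Gauss-Hermite quadrature nodes can match the accuracy of a much larger Latin hypercube sample when averaging a nonlinear rate over a normally distributed sub-grid quantity. The rate function and distribution below are invented for the example and are not the Kessler autoconversion formula.

```python
import numpy as np
from scipy.stats import norm

f = lambda x: np.maximum(x, 0.0) ** 2     # stand-in nonlinear process rate
mu, sigma = 0.5, 1.0                       # assumed sub-grid distribution N(mu, sigma^2)

# Deterministic: probabilists' Gauss-Hermite quadrature (weight exp(-x^2 / 2)).
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
quad_mean = np.sum(weights * f(mu + sigma * nodes)) / np.sqrt(2.0 * np.pi)

# Stochastic: a 1-D Latin hypercube sample of the same distribution.
rng = np.random.default_rng(1)
n = 1000
u = rng.permutation((np.arange(n) + rng.random(n)) / n)
lhs_mean = np.mean(f(norm.ppf(u, loc=mu, scale=sigma)))

print(quad_mean, lhs_mean)   # both approximate E[f(X)] with X ~ N(0.5, 1)
```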

  13. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.

  14. Influence of scanning parameters on the estimation accuracy of control points of B-spline surfaces

    NASA Astrophysics Data System (ADS)

    Aichinger, Julia; Schwieger, Volker

    2018-04-01

    This contribution deals with the influence of scanning parameters such as scanning distance, incidence angle, surface quality and sampling width on the average estimated standard deviations of the positions of control points of B-spline surfaces, which are used to model surfaces from terrestrial laser scanning data. The influence of the scanning parameters is analyzed by Monte Carlo based variance analysis. Samples were generated for uncorrelated and correlated data, using Latin hypercube and replicated Latin hypercube sampling algorithms, respectively. The investigations show that the most influential scanning parameter is the distance from the laser scanner to the object. The angle of incidence shows a significant effect for distances of 50 m and longer, while the surface quality contributes only negligible effects. The sampling width has no influence. Optimal scanning parameters are found at the smallest possible object distance, at an angle of incidence close to 0°, and at the highest surface quality. The consideration of correlations improves the estimation accuracy and underlines the importance of complete stochastic models for TLS measurements.

  15. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...

  16. Approximation of the exponential integral (well function) using sampling methods

    NASA Astrophysics Data System (ADS)

    Baalousha, Husam Musa

    2015-04-01

    The exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximate the exponential integral. The new approach is based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained by Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology such as the leaky aquifer integral.
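
    As a hedged sketch of the sampling idea: with the change of variables t = u/x, the well function W(u) = ∫_u^∞ e^(-t)/t dt becomes ∫_0^1 e^(-u/x)/x dx, which can be estimated from a one-dimensional Latin hypercube sample and checked against scipy.special.exp1. The transformation is our own illustration; the paper's exact formulation and sample sizes may differ.

```python
import numpy as np
from scipy.special import exp1   # reference value of the exponential integral E1

def well_function_lhs(u, n=10_000, seed=0):
    """Estimate W(u) = E1(u) = int_0^1 exp(-u/x)/x dx with a 1-D LHS on (0, 1)."""
    rng = np.random.default_rng(seed)
    x = rng.permutation((np.arange(n) + rng.random(n)) / n)
    x = np.clip(x, 1e-12, None)          # guard against a draw landing exactly on 0
    return np.mean(np.exp(-u / x) / x)

u = 0.5
print(well_function_lhs(u), exp1(u))     # both are about 0.5598 for u = 0.5
```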

  17. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. The hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
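
    A minimal unconditional sketch of the LULHS idea, reconstructed from the abstract rather than taken from the authors' code: draw LHS-stratified uniforms, convert them to standard normal scores, and impose spatial correlation through the lower-triangular (Cholesky/LU) factor of an assumed exponential covariance model. The grid size, sill and correlation length below are illustrative.

```python
import numpy as np
from scipy.stats import norm

def lulhs_field(coords, sill=1.0, corr_len=10.0, seed=0):
    """One unconditional realization of a Gaussian random field on the given
    coordinates: LHS-stratified normal scores correlated via a Cholesky factor."""
    rng = np.random.default_rng(seed)
    n = len(coords)
    h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = sill * np.exp(-h / corr_len) + 1e-10 * np.eye(n)   # jitter for stability
    L = np.linalg.cholesky(cov)
    u = rng.permutation((np.arange(n) + rng.random(n)) / n)  # 1-D LHS on (0, 1)
    z = norm.ppf(np.clip(u, 1e-12, 1.0 - 1e-12))             # stratified normal scores
    return L @ z

coords = np.array([(i, j) for i in range(20) for j in range(20)], dtype=float)
field = lulhs_field(coords).reshape(20, 20)   # one realization on a 20 x 20 grid
```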

  18. Analysis of hepatitis C viral dynamics using Latin hypercube sampling

    NASA Astrophysics Data System (ADS)

    Pachpute, Gaurav; Chakrabarty, Siddhartha P.

    2012-12-01

    We consider a mathematical model comprising four coupled ordinary differential equations (ODEs) to study hepatitis C viral dynamics. The model includes the efficacies of a combination therapy of interferon and ribavirin. There are two main objectives of this paper. The first is to approximate the percentage of cases in which there is viral clearance in the absence of treatment, as well as the percentage of response to treatment for various efficacy levels. The other is to better understand and identify the parameters that play a key role in the decline of viral load and can be estimated in a clinical setting. A condition for the stability of the uninfected and the infected steady states is presented. A large number of sample points for the model parameters (which are physiologically feasible) are generated using Latin hypercube sampling. An analysis of the simulated values identifies that approximately 29.85% of cases result in clearance of the virus during the early phase of the infection. Results from the χ2 and Spearman's tests done on the samples indicate a distinctly different distribution for certain parameters in the cases exhibiting viral clearance under the combination therapy.

  19. Uncertainty quantification of resonant ultrasound spectroscopy for material property and single crystal orientation estimation on a complex part

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Mayes, Alexander; Jauriqui, Leanne; Biedermann, Eric; Heffernan, Julieanne; Livings, Richard; Goodlet, Brent; Mazdiyasni, Siamack

    2018-04-01

    A case study is presented evaluating uncertainty in Resonance Ultrasound Spectroscopy (RUS) inversion for single crystal (SX) Ni-based superalloy Mar-M247 cylindrical dog-bone specimens. A number of surrogate models were developed from FEM model solutions, using different sampling schemes (regular grid, Monte Carlo sampling, Latin hypercube sampling) and modeling approaches, N-dimensional cubic spline interpolation and Kriging. Repeated studies were used to quantify the well-posedness of the inversion problem, and the uncertainty was assessed in material property and crystallographic orientation estimates given typical geometric dimension variability in aerospace components. Surrogate model quality was found to be an important factor in inversion results when the model closely represents the test data. One important finding was that, when the model matches the test data well, a Kriging surrogate model using unsorted Latin hypercube sampled data performed as well as the best results from an N-dimensional interpolation model using sorted data. However, both surrogate model quality and mode sorting were found to be less critical when inverting properties from either experimental data or simulated test cases with uncontrolled geometric variation.

  20. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has previously been made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density-dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this relative improvement decreases with an increasing number of sample points and input parameter dimensions. Since the computational time and effort for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is recommended.
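
    A hedged illustration of the two initial-design choices compared in this abstract, scored with the maximin (minimum pairwise distance) space-filling criterion; the subsequent optimization step of OLHS is omitted, and the sizes below are arbitrary.

```python
import numpy as np
from scipy.spatial.distance import pdist

def initial_lhs(n, d, rng, midpoint=False):
    """Initial LHS design: stratum midpoints, or random points within the strata."""
    offset = np.full((n, d), 0.5) if midpoint else rng.random((n, d))
    u = (np.arange(n)[:, None] + offset) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(0)
n, d, reps = 30, 5, 200
maximin = lambda design: pdist(design).min()
random_scores = [maximin(initial_lhs(n, d, rng)) for _ in range(reps)]
midpoint_scores = [maximin(initial_lhs(n, d, rng, midpoint=True)) for _ in range(reps)]
print(np.mean(random_scores), np.mean(midpoint_scores))   # average maximin of each choice
```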

  1. Comparison of Drainmod Based Watershed Scale Models

    Treesearch

    Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya

    2004-01-01

    Watershed scale hydrology and water quality models (DRAINMOD-DUFLOW, DRAINMOD-W, DRAINMOD-GIS and WATGIS) that describe the nitrogen loadings at the outlet of poorly drained watersheds were examined with respect to their accuracy and uncertainty in model predictions. Latin Hypercube Sampling (LHS) was applied to determine the impact of uncertainty in estimating field...

  2. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example when environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving different kinds of parametric variability and uncertainty. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals through α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.

  3. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.

  4. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is always combined with Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models as a result of their complexity, it is always infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...

  5. Sensitivity analysis of static resistance of slender beam under bending

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valeš, Jan

    2016-06-08

    The paper deals with statistical and sensitivity analyses of the resistance of simply supported I-beams under bending. The resistance was solved by the geometrically nonlinear finite element method in the program ANSYS. The beams are modelled with initial geometrical imperfections following the first eigenmode of buckling. The imperfections, together with the geometrical characteristics of the cross section and the material characteristics of the steel, were considered as random quantities. The Latin Hypercube Sampling method was applied to evaluate the statistical and sensitivity analyses of the resistance.

  6. Constraining Unsaturated Hydraulic Parameters Using the Latin Hypercube Sampling Method and Coupled Hydrogeophysical Approach

    NASA Astrophysics Data System (ADS)

    Farzamian, Mohammad; Monteiro Santos, Fernando A.; Khalil, Mohamed A.

    2017-12-01

    The coupled hydrogeophysical approach has proved to be a valuable tool for improving the use of geoelectrical data for hydrological model parameterization. In the coupled approach, hydrological parameters are directly inferred from geoelectrical measurements in a forward manner to eliminate the uncertainty connected to the independent inversion of electrical resistivity data. Several numerical studies have been conducted to demonstrate the advantages of a coupled approach; however, only a few attempts have been made to apply the coupled approach to actual field data. In this study, we developed a 1D coupled hydrogeophysical code to estimate the van Genuchten-Mualem model parameters Ks, n, θr and α from time-lapse vertical electrical sounding data collected during a constant inflow infiltration experiment. van Genuchten-Mualem parameters were sampled using the Latin hypercube sampling method to provide full coverage of the range of each parameter from their distributions. By applying the coupled approach, vertical electrical sounding data were coupled to hydrological models inferred from van Genuchten-Mualem parameter samples to investigate the feasibility of constraining the hydrological model. The key approaches taken in the study are to (1) integrate electrical resistivity and hydrological data while avoiding data inversion, (2) estimate the total water mass recovery of the electrical resistivity data and consider it in the evaluation of the van Genuchten-Mualem parameters and (3) correct the influence of subsurface temperature fluctuations during the infiltration experiment on the electrical resistivity data. The results of the study revealed that the coupled hydrogeophysical approach can improve the value of geophysical measurements in hydrological model parameterization. However, the approach cannot overcome the technical limitations of the geoelectrical method associated with resolution and water mass recovery.

  7. Stochastic Investigation of Natural Frequency for Functionally Graded Plates

    NASA Astrophysics Data System (ADS)

    Karsh, P. K.; Mukhopadhyay, T.; Dey, S.

    2018-03-01

    This paper presents the stochastic natural frequency analysis of functionally graded plates by applying an artificial neural network (ANN) approach. Latin hypercube sampling is utilised to train the ANN model. The proposed algorithm for stochastic natural frequency analysis of FGM plates is validated and verified against the original finite element method and Monte Carlo simulation (MCS). The combined stochastic variation of input parameters such as elastic modulus, shear modulus, Poisson ratio, and mass density is considered. A power law is applied to distribute the material properties across the thickness. The present ANN model reduces the required sample size and is found to be computationally efficient compared to conventional Monte Carlo simulation.

  8. Assessing performance and validating finite element simulations using probabilistic knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolin, Ronald M.; Rodriguez, E. A.

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.

  9. Convoy Protection under Multi-Threat Scenario

    DTIC Science & Technology

    2017-06-01

    Report fragment (documentation form and table of contents): subject terms include antisubmarine warfare, convoy protection, screening, design of experiments, and agent-based simulation; the table of contents lists scenarios for a second red submarine tactic, a design-of-experiments chapter covering a nearly orthogonal Latin hypercube design, and a data analysis chapter.

  10. Characterizing the optimal flux space of genome-scale metabolic reconstructions through modified latin-hypercube sampling.

    PubMed

    Chaudhary, Neha; Tøndel, Kristin; Bhatnagar, Rakesh; dos Santos, Vítor A P Martins; Puchałka, Jacek

    2016-03-01

    Genome-Scale Metabolic Reconstructions (GSMRs), along with optimization-based methods, predominantly Flux Balance Analysis (FBA) and its derivatives, are widely applied for assessing and predicting the behavior of metabolic networks upon perturbation, thereby enabling identification of potential novel drug targets and biotechnologically relevant pathways. The abundance of alternate flux profiles has led to the evolution of methods to explore the complete solution space, aiming to increase the accuracy of predictions. Herein we present a novel, generic algorithm to characterize the entire flux space of a GSMR upon application of FBA, leading to the optimal value of the objective (the optimal flux space). Our method employs modified Latin hypercube sampling (LHS) to effectively border the optimal space, followed by Principal Component Analysis (PCA) to identify and explain the major sources of variability within it. The approach was validated against the elementary mode analysis of a smaller network of Saccharomyces cerevisiae and applied to the GSMR of Pseudomonas aeruginosa PAO1 (iMO1086). It is shown to surpass the commonly used Monte Carlo sampling (MCS) in providing a more uniform coverage for a much larger network with fewer samples. Results show that although many fluxes are identified as variable upon fixing the objective value, the majority of the variability can be reduced to several main patterns arising from a few alternative pathways. In iMO1086, the initial variability of 211 reactions could almost entirely be explained by 7 alternative pathway groups. These findings imply that the possibilities to reroute greater portions of flux may be limited within metabolic networks of bacteria. Furthermore, the optimal flux space is subject to change with environmental conditions. Our method may be a useful device to validate the predictions made by FBA-based tools by describing the optimal flux space associated with these predictions, and thus to improve them.

  11. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    NASA Astrophysics Data System (ADS)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software used to calculate the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve the results obtained by classic Monte Carlo tools. The Latin Hypercube Sampling (LHS) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.

  12. Extending Orthogonal and Nearly Orthogonal Latin Hypercube Designs for Computer Simulation and Experimentation

    DTIC Science & Technology

    2006-12-01

    Report fragment: the code was initially developed to run within the netBeans IDE 5.04 running J2SE 5.0 and, during the course of development, the Eclipse SDK 3.2 was also used; later chapters cover the results from the research, and Chapter V concludes and recommends future research.

  13. Comparison of empirical strategies to maximize GENEHUNTER lod scores.

    PubMed

    Chen, C H; Finch, S J; Mendell, N R; Gordon, D

    1999-01-01

    We compare four strategies for finding the settings of genetic parameters that maximize the lod scores reported in GENEHUNTER 1.2. The four strategies are iterated complete factorial designs, iterated orthogonal Latin hypercubes, evolutionary operation, and numerical optimization. The genetic parameters that are set are the phenocopy rate, penetrance, and disease allele frequency; both recessive and dominant models are considered. We selected the optimization of a recessive model on the Collaborative Study on the Genetics of Alcoholism (COGA) data of chromosome 1 for complete analysis. Convergence to a setting producing a local maximum required the evaluation of over 100 settings (for a time budget of 800 minutes on a Pentium II 300 MHz PC). Two notable local maxima were detected, suggesting the need for a more extensive search before claiming that a global maximum had been found. The orthogonal Latin hypercube design was the best strategy for finding areas that produced high lod scores with small numbers of evaluations. Numerical optimization starting from a region producing high lod scores was the strategy that found the highest maximum observed.

  14. What Friends Are For: Collaborative Intelligence Analysis and Search

    DTIC Science & Technology

    2014-06-01

    Report fragment (documentation form and acronym list): subject terms include Intelligence Community, information retrieval, recommender systems, search engines, social networks, user profiling, and Lucene; the work reports improvements over existing search systems that are shown to be robust to high levels of human error and low similarity between users. Listed acronyms include NOLH (nearly orthogonal Latin hypercubes), P@ (precision at documents), RS (recommender systems), TREC (Text REtrieval Conference), and USM.

  15. Mixing-model Sensitivity to Initial Conditions in Hydrodynamic Predictions

    NASA Astrophysics Data System (ADS)

    Bigelow, Josiah; Silva, Humberto; Truman, C. Randall; Vorobieff, Peter

    2017-11-01

    Amagat and Dalton mixing models were studied to compare their thermodynamic prediction of shock states. Numerical simulations with the Sandia National Laboratories shock hydrodynamic code CTH modeled University of New Mexico (UNM) shock tube laboratory experiments shocking a 1:1 molar mixture of helium (He) and sulfur hexafluoride (SF6). Five input parameters were varied for sensitivity analysis: driver section pressure, driver section density, test section pressure, test section density, and mixture ratio (mole fraction). We show via incremental Latin hypercube sampling (LHS) analysis that significant differences exist between Amagat and Dalton mixing-model predictions. The differences observed in predicted shock speeds, temperatures, and pressures grow more pronounced with higher shock speeds. Supported by NNSA Grant DE-0002913.

  16. Flexible Space-Filling Designs for Complex System Simulations

    DTIC Science & Technology

    2013-06-01

    Report fragment (abstract and documentation form, partially duplicated): "... interior of the experimental region and cannot fit higher-order models. We present a genetic algorithm that constructs space-filling designs with minimal correlations." Subject terms include computer experiments, design of experiments, genetic algorithm, Latin hypercube, response surface methodology, and nearly orthogonal designs.

  17. Least squares polynomial chaos expansion: A review of sampling strategies

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Doostan, Alireza

    2018-04-01

    As non-intrusive polynomial chaos expansion (PCE) techniques have gained growing popularity among researchers, we here provide a comprehensive review of major sampling strategies for least squares based PCE. Traditional sampling methods, such as Monte Carlo, Latin hypercube, quasi-Monte Carlo, optimal design of experiments (ODE), and Gaussian quadratures, as well as more recent techniques, such as coherence-optimal and randomized quadratures, are discussed. We also propose a hybrid sampling method, dubbed alphabetic-coherence-optimal, that employs the so-called alphabetic optimality criteria used in the context of ODE in conjunction with coherence-optimal samples. A comparison between the empirical performance of the selected sampling methods applied to three numerical examples, including high-order PCEs, high-dimensional problems, and low oversampling ratios, is presented to provide a road map for practitioners seeking the most suitable sampling technique for a problem at hand. We observed that the alphabetic-coherence-optimal technique outperforms other sampling methods, especially when high-order ODE are employed and/or the oversampling ratio is low.
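
    A minimal one-dimensional least-squares PCE sketch, using a Legendre basis on [-1, 1] and a Latin hypercube sample as the experimental design; the target function, order and oversampling ratio are made up for illustration and do not reproduce the paper's examples.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
f = lambda x: np.exp(x) * np.sin(3.0 * x)     # stand-in model response

# 1-D Latin hypercube design on [-1, 1] (oversampling ratio ~2 for order 10).
order, n = 10, 22
u = rng.permutation((np.arange(n) + rng.random(n)) / n)
x = 2.0 * u - 1.0

# Least-squares fit of the PCE coefficients on the Legendre basis.
Psi = legendre.legvander(x, order)            # rows of basis-polynomial evaluations
coeffs, *_ = np.linalg.lstsq(Psi, f(x), rcond=None)

x_test = np.linspace(-1.0, 1.0, 201)
err = np.max(np.abs(legendre.legval(x_test, coeffs) - f(x_test)))
print(f"max abs error of the order-{order} PCE surrogate: {err:.2e}")
```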

  18. Estimation of the discharges of the multiple water level stations by multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kazuhiro; Miyamoto, Mamoru; Yamakage, Yuzuru; Tsuda, Morimasa; Yanami, Hitoshi; Anai, Hirokazu; Iwami, Yoichi

    2016-04-01

    This presentation addresses two aspects of parameter identification for estimating the discharges of multiple water level stations by multi-objective optimization: how to adjust the parameters so that the discharges are estimated accurately, and which optimization algorithms are suitable for the parameter identification. Among previous studies, one minimizes the weighted error of the discharges of multiple water level stations by single-objective optimization, while others minimize multiple error assessment functions of the discharge of a single water level station by multi-objective optimization. The distinguishing feature of this presentation is the simultaneous minimization of the errors of the discharges of multiple water level stations by multi-objective optimization. The Abe River basin in Japan is targeted. The basin area is 567.0 km2, and there are thirteen rainfall stations and three water level stations. Nine flood events are investigated; they occurred from 2005 to 2012 and their maximum discharges exceed 1,000 m3/s. The discharges are calculated with the PWRI distributed hydrological model. The basin is partitioned into meshes of 500 m x 500 m, and two-layer tanks are placed on each mesh. Fourteen parameters are adjusted to estimate the discharges accurately: twelve hydrological parameters and two parameters for the initial water levels of the tanks. The three objective functions are the mean squared errors between the observed and calculated discharges at the water level stations. Latin hypercube sampling is a uniform sampling algorithm; the discharges were first calculated for parameter values sampled with a simplified version of Latin hypercube sampling. The observed discharge is surrounded by the calculated discharges, which suggests that it might be possible to estimate the discharge accurately by adjusting the parameters. The discharge of a single water level station can indeed be accurately estimated by using the parameter values optimized for that station; however, in some cases the discharge calculated with the parameter values optimized for one water level station does not match the observed discharge at another water level station. It is important to estimate the discharges of all the water level stations with some degree of accuracy. It turns out to be possible to select parameter values from the Pareto optimal solutions under the condition that, for every water level station, the error normalized by the minimum error of that station is below 3. The optimization performance of five implementations of the algorithms and of a simplified version of Latin hypercube sampling is compared. The five implementations are NSGA2 and PAES from the optimization software inspyred, and MCO_NSGA2R, MOPSOCD and NSGA2R_NSGA2R from the statistical software R. NSGA2, PAES and MOPSOCD are a genetic algorithm, an evolution strategy and a particle swarm optimization, respectively. The number of evaluations of the objective functions is 10,000. The two implementations of NSGA2 in R outperform the others and are promising candidates for the parameter identification of the PWRI distributed hydrological model.

  19. In Silico Oncology: Quantification of the In Vivo Antitumor Efficacy of Cisplatin-Based Doublet Therapy in Non-Small Cell Lung Cancer (NSCLC) through a Multiscale Mechanistic Model

    PubMed Central

    Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios

    2016-01-01

    The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant of reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied to estimate a plausible value range for the cell-killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment, and takes into account the effect of the tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has been used to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with the molecular profile of patients could serve as a basis for reliable personalized predictions. PMID:27657742
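
    A hedged sketch of the Latin hypercube sampling / partial rank correlation coefficient (LHS/PRCC) workflow mentioned above, applied to an invented two-parameter toy response; the model, parameter names and ranges are illustrative only and do not come from the paper.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """Partial rank correlation of each column of X with y, controlling for the
    remaining columns (regression-residual formulation)."""
    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)
    coeffs = []
    for j in range(R.shape[1]):
        others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        coeffs.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(coeffs)

rng = np.random.default_rng(0)
n = 200
# LHS over two parameters: growth rate in [0.1, 1.0], growth fraction in [0.0, 0.5].
u = np.column_stack([rng.permutation((np.arange(n) + rng.random(n)) / n) for _ in range(2)])
growth_rate = 0.1 + 0.9 * u[:, 0]
growth_fraction = 0.5 * u[:, 1]
volume = np.exp(growth_rate * growth_fraction * 30.0)   # toy tumor-volume response

print(prcc(np.column_stack([growth_rate, growth_fraction]), volume))
```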

  20. Erosion Modeling in Central China - Soil Data Acquisition by Conditioned Latin Hypercube Sampling and Incorporation of Legacy Data

    NASA Astrophysics Data System (ADS)

    Stumpf, Felix; Schönbrodt-Stitt, Sarah; Schmidt, Karsten; Behrens, Thorsten; Scholten, Thomas

    2013-04-01

    The Three Gorges Dam at the Yangtze River in Central China is a prominent example of human-induced environmental impact. Over the course of a year the water table at the main river fluctuates by about 30 m due to impoundment and drainage activities. The dynamic water table gives rise to a range of georisks such as soil erosion, mass movements, sediment transport and diffuse matter inputs into the reservoir. Within the framework of the joint Sino-German project YANGTZE GEO, the subproject "Soil Erosion" deals with soil erosion risks and sediment transport pathways into the reservoir. The study site is a small catchment (4.8 km²) in Badong, approximately 100 km upstream of the dam. It is characterized by scattered plots of agricultural land use and resettlements in a largely wooded, steeply sloping and mountainous area. Our research focuses on data acquisition and processing to develop a process-oriented erosion model, for which area-covering knowledge of specific soil properties in the catchment is an essential input. This knowledge is acquired by means of digital soil mapping (DSM), in which soil properties are estimated from covariates and the prediction functions are calibrated with soil property samples. The DSM approach is based on an appropriate sample design that reflects the heterogeneity of the catchment with respect to the covariates influencing the relevant soil properties. Here the covariates, derived by digital terrain analysis, are slope, altitude, profile curvature, plan curvature and aspect. For the development of the sample design, we chose the conditioned Latin hypercube sampling (cLHS) procedure (Minasny and McBratney, 2006). It provides an efficient method for sampling variables from their multivariate distribution: a sample of size n is drawn from multiple variables such that, for each variable, the sample is marginally maximally stratified. The method ensures maximal stratification through two features: first, the number of strata equals the sample size n, and second, the probability of falling in each of the strata is n⁻¹ (McKay et al., 1979). We extended the classical cLHS-with-extremes approach (Schmidt et al., 2012) by incorporating legacy data from previous field campaigns. Instead of identifying precise sample locations by cLHS, we demarcate the multivariate attribute space of the samples based on the histogram borders of each stratum. This widens the spatial scope of the actual cLHS sample locations and allows the incorporation of legacy data lying within that scope. Furthermore, this approach offers extended potential regarding the accessibility of sample sites in the field.
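
    A much-simplified sketch of the conditioned LHS idea (marginal stratification over an existing covariate grid, optimized by a crude annealing loop); the real cLHS objective of Minasny and McBratney (2006) also handles correlation between covariates and categorical covariates, which are omitted here, and all sizes are illustrative.

```python
import numpy as np

def simple_clhs(covariates, n, n_iter=5000, seed=0):
    """Pick n rows of a covariate table whose values are, per covariate, spread
    as evenly as possible over n equal-probability strata (crude annealing)."""
    rng = np.random.default_rng(seed)
    m, d = covariates.shape
    # Stratum edges: quantiles of each covariate at probabilities 0, 1/n, ..., 1.
    edges = np.quantile(covariates, np.linspace(0.0, 1.0, n + 1), axis=0)

    def cost(idx):
        total = 0.0
        for j in range(d):
            counts, _ = np.histogram(covariates[idx, j], bins=edges[:, j])
            total += np.abs(counts - 1).sum()   # deviation from one sample per stratum
        return total

    idx = rng.choice(m, n, replace=False)
    current = cost(idx)
    for it in range(n_iter):
        cand = idx.copy()
        cand[rng.integers(n)] = rng.integers(m)        # swap one candidate site
        if len(set(cand.tolist())) < n:
            continue                                   # keep sites distinct
        c = cost(cand)
        temp = max(1.0 - it / n_iter, 1e-9)            # simple cooling schedule
        if c < current or rng.random() < np.exp((current - c) / temp):
            idx, current = cand, c
    return idx

grid = np.random.default_rng(1).random((5000, 3))      # 5000 candidate cells, 3 covariates
sites = simple_clhs(grid, 10)                          # indices of 10 proposed sample sites
```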

  1. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is implicated in several ocular conditions: papilledema, glaucoma and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) modeling and probabilistic modeling (Latin hypercube simulations, LHS) to consider a range of tissue properties and relevant pressures.

  2. Design Optimization of a Centrifugal Fan with Splitter Blades

    NASA Astrophysics Data System (ADS)

    Heo, Man-Woong; Kim, Jin-Hyuk; Kim, Kwang-Yong

    2015-05-01

    Multi-objective optimization of a centrifugal fan with additionally installed splitter blades was performed to simultaneously maximize the efficiency and pressure rise, using three-dimensional Reynolds-averaged Navier-Stokes equations and a hybrid multi-objective evolutionary algorithm. Two design variables, defining the location of the splitter and the height ratio between the inlet and outlet of the impeller, were selected for the optimization. In addition, the aerodynamic characteristics of the centrifugal fan were investigated with the variation of the design variables in the design space. Latin hypercube sampling was used to select the training points, and response surface approximation models were constructed as surrogate models of the objective functions. With the optimization, both the efficiency and pressure rise of the centrifugal fan with splitter blades were improved considerably compared to the reference model.
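
    A minimal sketch of the generic "LHS training points plus response surface approximation" surrogate workflow described above is given below; the two normalized design variables and the analytic stand-in for the CFD-evaluated objective are assumptions for illustration, not the fan model itself.

```python
import numpy as np
from itertools import combinations_with_replacement
from scipy.stats import qmc

def expensive_objective(x):
    # Analytic stand-in for a CFD-evaluated objective (e.g. fan efficiency).
    return 1.0 - (x[:, 0] - 0.3) ** 2 - 0.5 * (x[:, 1] - 0.7) ** 2

def quadratic_features(X):
    # Full second-order polynomial basis: 1, x_i, x_i * x_j.
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Latin hypercube design over the two normalized design variables.
X_train = qmc.LatinHypercube(d=2, seed=1).random(15)
y_train = expensive_objective(X_train)

# Fit the response surface approximation by least squares.
beta, *_ = np.linalg.lstsq(quadratic_features(X_train), y_train, rcond=None)

# The cheap surrogate can now be searched in place of the expensive simulation.
X_cand = qmc.LatinHypercube(d=2, seed=2).random(2000)
pred = quadratic_features(X_cand) @ beta
print("surrogate optimum near:", X_cand[np.argmax(pred)])
```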

  3. Construction of nested maximin designs based on successive local enumeration and modified novel global harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Yi, Jin; Li, Xinyu; Xiao, Mi; Xu, Junnan; Zhang, Lin

    2017-01-01

    Engineering design often involves different types of simulation, which incur expensive computational costs. Variable-fidelity approximation-based design optimization approaches enable efficient simulation and optimization of the design space using approximation models with different levels of fidelity, and have been widely used in different fields. As the foundation of variable-fidelity approximation models, the selection of sample points, called nested designs, is essential. In this article a novel nested maximin Latin hypercube design is constructed based on successive local enumeration and a modified novel global harmony search algorithm. In the proposed nested designs, successive local enumeration is employed to select sample points for the low-fidelity model, whereas the modified novel global harmony search algorithm is employed to select sample points for the high-fidelity model. A comparative study with multiple criteria and an engineering application are employed to verify the efficiency of the proposed nested designs approach.

  4. Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sigeti, David Edward; Williams, Brian J.; Parsons, D. Kent

    2016-10-18

    Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multivariate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-Hypercube/Orthogonal-Array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm based on linear algebra that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.
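
    One common way to draw Latin-hypercube samples from a multivariate normal defined by a mean vector and covariance matrix, as described above, is to map stratified uniforms through the normal inverse CDF and correlate them with a Cholesky factor; the sketch below shows this generic construction with a small hypothetical mean and covariance, and is not the authors' software.

```python
import numpy as np
from scipy.stats import norm, qmc

def lhs_multivariate_normal(mean, cov, n, seed=None):
    """LHS from N(mean, cov): stratified uniforms are mapped through the
    standard-normal inverse CDF and correlated with a Cholesky factor."""
    d = len(mean)
    u = qmc.LatinHypercube(d=d, seed=seed).random(n)
    u = np.clip(u, 1e-12, 1 - 1e-12)          # guard against ppf(0) or ppf(1)
    z = norm.ppf(u)                           # independent standard normals
    L = np.linalg.cholesky(cov)               # imposes the target covariance
    return mean + z @ L.T

# Hypothetical 3-group mean vector and covariance matrix (illustrative numbers).
mean = np.array([1.0, 2.0, 0.5])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.01]])
samples = lhs_multivariate_normal(mean, cov, n=500, seed=3)
print(np.cov(samples, rowvar=False).round(3))  # should approximate cov
```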

  5. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE PAGES

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.; ...

    2014-12-09

    In this study, polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m³ PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperature at key locations is monitored, as well as the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model generally can predict the temperature at given locations in the system, it is less successful at predicting the pressure response, and the two approaches for propagating uncertainty agree with each other. The importance of each input parameter on the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters. For the pressure response, the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.

  6. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.

    In this study, polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m³ PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperature at key locations is monitored, as well as the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model generally can predict the temperature at given locations in the system, it is less successful at predicting the pressure response, and the two approaches for propagating uncertainty agree with each other. The importance of each input parameter on the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters. For the pressure response, the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.

  7. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
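
    The category-based sampling idea can be illustrated with a small sketch: points are drawn with a prescribed share from each category (here a rare "raining" category) and the per-category means are recombined with the true category fractions; the toy rates and fractions are assumptions, and this is a simplified stand-in for SILHS rather than its actual implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical subgrid "subcolumns": a rare raining category and a dry category.
N = 100_000
raining = rng.random(N) < 0.05
rate = np.where(raining, rng.gamma(2.0, 2.0, N), 0.01 * rng.random(N))
true_mean = rate.mean()                      # target grid-box-averaged rate

def category_weighted_estimate(n_points, frac_from_rain=0.5):
    """Draw a prescribed share of points from each category, then recombine the
    per-category means using the true category fractions."""
    p_rain = raining.mean()
    n_rain = max(1, int(n_points * frac_from_rain))
    rain_mean = rate[rng.choice(np.flatnonzero(raining), n_rain)].mean()
    dry_mean = rate[rng.choice(np.flatnonzero(~raining), n_points - n_rain)].mean()
    return p_rain * rain_mean + (1 - p_rain) * dry_mean

naive = rate[rng.choice(N, 40)].mean()       # plain Monte Carlo, 40 points
smart = category_weighted_estimate(40)       # same budget, category-aware
print(f"true {true_mean:.3f}  naive {naive:.3f}  category-weighted {smart:.3f}")
```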

  8. Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.

  9. Stochastic Multi-Commodity Facility Location Based on a New Scenario Generation Technique

    NASA Astrophysics Data System (ADS)

    Mahootchi, M.; Fattahi, M.; Khakbazan, E.

    2011-11-01

    This paper extends two models for the stochastic multi-commodity facility location problem. The problem is formulated as two-stage stochastic programming. As a main contribution of this study, a new algorithm is applied to efficiently generate scenarios for uncertain, correlated customer demands. This algorithm uses Latin Hypercube Sampling (LHS) and a scenario reduction approach. The relation between customer satisfaction level and cost is considered in model I. A risk measure using Conditional Value-at-Risk (CVaR) is embedded into optimization model II. Here, the structure of the network contains three facility layers: plants, distribution centers, and retailers. The first-stage decisions are the number, locations, and capacities of the distribution centers. In the second stage, the decisions are the production quantities and the volumes transported between plants and customers.
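
    For reference, the Conditional Value-at-Risk of a discrete scenario set can be computed as in the sketch below; the scenario costs and probabilities are purely illustrative and unrelated to the paper's data.

```python
import numpy as np

def cvar(costs, probs, alpha=0.95):
    """Conditional Value-at-Risk: expected cost in the worst (1 - alpha) tail
    of a discrete scenario distribution."""
    order = np.argsort(costs)
    costs, probs = np.asarray(costs, float)[order], np.asarray(probs, float)[order]
    cum = np.cumsum(probs)
    k = np.searchsorted(cum, alpha)           # scenario holding the VaR quantile
    tail = probs.copy()
    tail[:k] = 0.0
    tail[k] = cum[k] - alpha                  # partial weight of the VaR scenario
    return float(tail @ costs / (1.0 - alpha))

# Hypothetical second-stage costs for five (e.g. LHS-generated, reduced) scenarios.
costs = [100, 120, 150, 400, 900]
probs = [0.40, 0.30, 0.20, 0.08, 0.02]
print("CVaR(95%) =", cvar(costs, probs))      # 600.0 for these numbers
```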

  10. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE PAGES

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra; ...

    2017-11-20

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.

  11. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.

  12. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.

  13. Generalized Likelihood Uncertainty Estimation (GLUE) methodology for optimization of extraction in natural products.

    PubMed

    Maulidiani; Rudiyanto; Abas, Faridah; Ismail, Intan Safinar; Lajis, Nordin H

    2018-06-01

    The optimization process is an important aspect of natural product extraction. Herein, an alternative approach is proposed for the optimization of extraction, namely, Generalized Likelihood Uncertainty Estimation (GLUE). The approach combines Latin hypercube sampling, the feasible ranges of the independent variables, Monte Carlo simulation, and threshold criteria on the response variables. The GLUE method is tested on three different techniques, including ultrasound-, microwave-, and supercritical CO2-assisted extraction, utilizing data from previously published reports. The study found that this method can: provide more information on the combined effects of the independent variables on the response variables in the dotty plots; deal with an unlimited number of independent and response variables; consider combined multiple threshold criteria, which are subjective depending on the target of the investigation for the response variables; and provide a range of values with their distribution for the optimization. Copyright © 2018 Elsevier Ltd. All rights reserved.
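
    A minimal sketch of the GLUE-style screening described above (LHS over feasible ranges, multiple threshold criteria on the responses, and a resulting optimal range) is given below; the two normalized process variables and the analytic response functions are illustrative assumptions, not the published extraction models.

```python
import numpy as np
from scipy.stats import qmc

# Analytic stand-ins for two extraction responses (e.g. yield and antioxidant
# activity) over two normalized process variables; real applications would use
# fitted models or experimental surrogates instead.
def responses(x):
    temp, time = x[:, 0], x[:, 1]
    yield_ = 80 - 30 * (temp - 0.6) ** 2 - 20 * (time - 0.5) ** 2
    activity = 60 + 25 * temp * time - 15 * (time - 0.7) ** 2
    return yield_, activity

# GLUE-style screening: LHS over the feasible ranges of the independent
# variables, then keep only parameter sets meeting all threshold criteria.
X = qmc.LatinHypercube(d=2, seed=11).random(5000)
yield_, activity = responses(X)
behavioural = (yield_ >= 78) & (activity >= 62)   # combined multiple thresholds

print(f"{behavioural.sum()} of {len(X)} samples meet the criteria")
for j, name in enumerate(["temperature", "time"]):
    lo, hi = X[behavioural, j].min(), X[behavioural, j].max()
    print(f"optimal {name} range (normalized): {lo:.2f} to {hi:.2f}")
```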

  14. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

    Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a strong influence on lateral car dynamics. This motivates the need for a design that is robust against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem, in which especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces and the achieved improvements confirm the validity of the proposed procedure.

  15. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

    The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The core uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section contains the characteristics of the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.

  16. Efficient hybrid evolutionary algorithm for optimization of a strip coiling process

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Park, Won-Woong; Kim, Dong-Kyu; Im, Yong-Taek; Bureerat, Sujin; Kwon, Hyuck-Cheol; Chun, Myung-Sik

    2015-04-01

    This article proposes an efficient metaheuristic, based on hybridization of teaching-learning-based optimization and differential evolution, to improve the flatness of a strip during a strip coiling process. Differential evolution operators were integrated into teaching-learning-based optimization, with a Latin hypercube sampling technique used to generate the initial population. The objective function was introduced to reduce the axial inhomogeneity of the stress distribution and the maximum compressive stress, calculated by Love's elastic solution within the thin strip, which may cause an irregular surface profile of the strip during the coiling process. The hybrid optimizer and several well-established evolutionary algorithms (EAs) were used to solve the optimization problem. The comparative studies show that the proposed hybrid algorithm outperformed the other EAs in terms of convergence rate and consistency. It was found that the proposed hybrid approach was powerful for process optimization, especially for large-scale design problems.

  17. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed by the Sobol method, which is based on analysis of the variance decomposition, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
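
    As an illustration of the variance-based (Sobol) part of such an analysis, the sketch below estimates first-order Sobol indices with the standard pick-freeze estimator on the Ishigami test function; it is a generic example, not the camera-LiDAR calibration model.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # Standard sensitivity-analysis test function; inputs uniform on [-pi, pi].
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(5)
N, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (N, d))
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

# First-order Sobol indices via the pick-freeze (Saltelli) estimator:
# S_i = E[ f(B) * ( f(A with column i taken from B) - f(A) ) ] / Var(f).
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S_i = np.mean(fB * (ishigami(ABi) - fA)) / var
    print(f"S_{i + 1} ~ {S_i:.3f}")
# Analytical values for a=7, b=0.1 are approximately 0.31, 0.44 and 0.
```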

  18. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.

  19. When to Make Mountains out of Molehills: The Pros and Cons of Simple and Complex Model Calibration Procedures

    NASA Astrophysics Data System (ADS)

    Smith, K. A.; Barker, L. J.; Harrigan, S.; Prudhomme, C.; Hannaford, J.; Tanguy, M.; Parry, S.

    2017-12-01

    Earth and environmental models are relied upon to investigate system responses that cannot otherwise be examined. In simulating physical processes, models have adjustable parameters which may, or may not, have a physical meaning. Determining the values to assign to these model parameters is an enduring challenge for earth and environmental modellers. Selecting different error metrics by which model results are compared to observations will lead to different sets of calibrated model parameters, and thus different model results. Furthermore, models may exhibit `equifinal' behaviour, where multiple combinations of model parameters lead to equally acceptable model performance against observations. These decisions in model calibration introduce uncertainty that must be considered when model results are used to inform environmental decision-making. This presentation focusses on the uncertainties that derive from the calibration of a four-parameter lumped catchment hydrological model (GR4J). The GR models contain an inbuilt automatic calibration algorithm that can satisfactorily calibrate against four error metrics in only a few seconds. However, a single, deterministic model result does not provide information on parameter uncertainty. Furthermore, a modeller interested in extreme events, such as droughts, may wish to calibrate against low-flow-specific error metrics. In a comprehensive assessment, the GR4J model has been run with 500,000 Latin hypercube sampled parameter sets across 303 catchments in the United Kingdom. These parameter sets have been assessed against six error metrics, including two drought-specific metrics. This presentation compares the two approaches, and demonstrates that the inbuilt automatic calibration can outperform the Latin hypercube experiment approach in single-metric performance. However, it is also shown that there are many merits to the more comprehensive assessment, which allows for probabilistic model results, multi-objective optimisation, and better tailoring of the calibration for specific applications such as drought event characterisation. Modellers and decision-makers may be constrained in their choice of calibration method, so it is important that they recognise the strengths and limitations of their chosen approach.

  20. Agent-Based Modeling of China's Rural-Urban Migration and Social Network Structure.

    PubMed

    Fu, Zhaohao; Hao, Lingxin

    2018-01-15

    We analyze China's rural-urban migration and endogenous social network structures using agent-based modeling. The agents, drawn from census micro data, are located in their rural origin with an empirically estimated prior propensity to move. The population-scale social network is a hybrid one, combining observed family ties and locations of origin with a parameter space calibrated from census, survey and aggregate data and sampled using a stepwise Latin hypercube sampling method. At monthly intervals, some agents migrate and these migratory acts change the social network by turning within-nonmigrant connections into between-migrant-nonmigrant connections, turning local connections into nonlocal connections, and adding among-migrant connections. In turn, the changing social network structure updates the migratory propensities of well-connected nonmigrants, who become more likely to move. These two processes iterate over time. Using a core-periphery method developed from the k-core decomposition method, we identify and quantify the network structural changes and map these changes onto the migration acceleration patterns. We conclude that network structural changes are essential for explaining the migration acceleration observed in China during the 1995-2000 period.

  1. Agent-based modeling of China's rural-urban migration and social network structure

    NASA Astrophysics Data System (ADS)

    Fu, Zhaohao; Hao, Lingxin

    2018-01-01

    We analyze China's rural-urban migration and endogenous social network structures using agent-based modeling. The agents, drawn from census micro data, are located in their rural origin with an empirically estimated prior propensity to move. The population-scale social network is a hybrid one, combining observed family ties and locations of origin with a parameter space calibrated from census, survey and aggregate data and sampled using a stepwise Latin hypercube sampling method. At monthly intervals, some agents migrate and these migratory acts change the social network by turning within-nonmigrant connections into between-migrant-nonmigrant connections, turning local connections into nonlocal connections, and adding among-migrant connections. In turn, the changing social network structure updates the migratory propensities of well-connected nonmigrants, who become more likely to move. These two processes iterate over time. Using a core-periphery method developed from the k-core decomposition method, we identify and quantify the network structural changes and map these changes onto the migration acceleration patterns. We conclude that network structural changes are essential for explaining the migration acceleration observed in China during the 1995-2000 period.

  2. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.

  3. The Influence of PV Module Materials and Design on Solder Joint Thermal Fatigue Durability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Silverman, Timothy J.; Kurtz, Sarah

    Finite element model (FEM) simulations have been performed to elucidate the effect of flat plate photovoltaic (PV) module materials and design on PbSn eutectic solder joint thermal fatigue durability. The statistical method of Latin Hypercube sampling was employed to investigate the sensitivity of simulated damage to each input variable. Variables of laminate material properties and their thicknesses were investigated. Using analysis of variance, we determined that the rate of solder fatigue was most sensitive to solder layer thickness, with copper ribbon and silicon thickness being the next two most sensitive variables. By simulating both accelerated thermal cycles (ATCs) and PV cell temperature histories through two characteristic days of service, we determined that the acceleration factor between the ATC and outdoor service was independent of the variables sampled in this study. This result implies that an ATC test will represent a similar time of outdoor exposure for a wide range of module designs. This is an encouraging result for the standard ATC that must be universally applied across all modules.

  4. Dynamics of hepatitis C under optimal therapy and sampling based analysis

    NASA Astrophysics Data System (ADS)

    Pachpute, Gaurav; Chakrabarty, Siddhartha P.

    2013-08-01

    We examine two models for hepatitis C viral (HCV) dynamics, one for monotherapy with interferon (IFN) and the other for combination therapy with IFN and ribavirin. Optimal therapy for both models is determined using the steepest gradient method, by defining an objective functional which minimizes infected hepatocyte levels, virion population and side-effects of the drug(s). The optimal therapies for both models show an initial period of high efficacy, followed by a gradual decline. The period of high efficacy coincides with a significant decrease in the viral load, whereas the efficacy drops after hepatocyte levels are restored. We use the Latin hypercube sampling technique to randomly generate a large number of patient scenarios and study the dynamics of each set under the optimal therapy already determined. Results show an increase in the percentage of responders (indicated by a drop in viral load below detection levels) in the case of combination therapy (72%) compared to monotherapy (57%). Statistical tests performed to study correlations between sample parameters and the time required for the viral load to fall below the detection level show a strong monotonic correlation with the death rate of infected hepatocytes, identifying it as an important factor in deciding individual drug regimens.

  5. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S. M.; Kim, K. Y.

    The printed circuit heat exchanger (PCHE) has recently been considered as a recuperator for the high-temperature gas-cooled reactor. In this work, the zigzag channels of a PCHE have been optimized by using three-dimensional Reynolds-Averaged Navier-Stokes (RANS) analysis and a response surface approximation (RSA) modeling technique to enhance thermal-hydraulic performance. The shear stress transport turbulence model is used as a turbulence closure. The objective function is defined as a linear combination of functions related to the heat transfer and friction loss of the PCHE, respectively. Three geometric design variables, viz., the ratio of the radius of the fillet to the hydraulic diameter of the channels, the ratio of wavelength to hydraulic diameter of the channels, and the ratio of wave height to hydraulic diameter of the channels, are used for the optimization. Design points are selected through Latin-hypercube sampling. The optimal design is determined through the RSA model, which uses RANS calculations at the design points. The results show that the optimum shape considerably enhances the thermal-hydraulic performance compared with the reference shape. (authors)

  7. Uncertain dynamic analysis for rigid-flexible mechanisms with random geometry and material properties

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing; Walker, Paul D.

    2017-02-01

    This paper proposes an uncertain modelling and computational method to analyze dynamic responses of rigid-flexible multibody systems (or mechanisms) with random geometry and material properties. Firstly, the deterministic model for the rigid-flexible multibody system is built with the absolute nodal coordinate formulation (ANCF), in which the flexible parts are modeled using ANCF elements, while the rigid parts are described by ANCF reference nodes (ANCF-RNs). Secondly, uncertainty in the geometry of the rigid parts is expressed as uniform random variables, while uncertainty in the material properties of the flexible parts is modeled as a continuous random field, which is further discretized into Gaussian random variables using a series expansion method. Finally, a non-intrusive numerical method is developed to solve the dynamic equations of systems involving both types of random variables, which systematically integrates the deterministic generalized-α solver with Latin hypercube sampling (LHS) and Polynomial Chaos (PC) expansion. The benchmark slider-crank mechanism is used as a numerical example to demonstrate the characteristics of the proposed method.

  8. Design optimization of hydraulic turbine draft tube based on CFD and DOE method

    NASA Astrophysics Data System (ADS)

    Nam, Mun chol; Dechun, Ba; Xiangji, Yue; Mingri, Jin

    2018-03-01

    In order to improve the performance of a hydraulic turbine draft tube during its design process, the draft tube is optimized in this paper on a multi-disciplinary collaborative design optimization platform by combining computational fluid dynamics (CFD) and design of experiments (DOE). The geometrical design variables are taken from the median section of the draft tube and the cross section of its exit diffuser, and the objective is to maximize the pressure recovery factor (Cp). Sample matrices required for the shape optimization of the draft tube are generated by the optimal Latin hypercube (OLH) method of the DOE technique, and their performance is evaluated through CFD numerical simulation. Subsequently, the main effect analysis and the sensitivity analysis of the geometrical parameters of the draft tube are carried out. Then, the optimal values of the geometrical design variables are determined using the response surface method. The optimized draft tube shows a marked performance improvement over the original design.

  9. Developing an in silico model of the modulation of base excision repair using methoxyamine for more targeted cancer therapeutics.

    PubMed

    Gurkan-Cavusoglu, Evren; Avadhani, Sriya; Liu, Lili; Kinsella, Timothy J; Loparo, Kenneth A

    2013-04-01

    Base excision repair (BER) is a major DNA repair pathway involved in the processing of exogenous non-bulky base damages from certain classes of cancer chemotherapy drugs as well as ionising radiation (IR). Methoxyamine (MX) is a small-molecule chemical inhibitor of BER that has been shown to enhance chemotherapy and/or IR cytotoxicity in human cancers. In this study, the authors have analysed the inhibitory effect of MX on the BER pathway kinetics using a computational model of the repair pathway. The inhibitory effect of MX depends on the BER efficiency. The authors have generated variable-efficiency groups using different sets of protein concentrations generated by Latin hypercube sampling, and they have clustered the simulation results into high-, medium- and low-efficiency repair groups. From analysis of the inhibitory effect of MX on each of the three groups, it is found that the inhibition is most effective for high-efficiency BER, and least effective for low-efficiency repair.

  10. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  11. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of an M-dimensional, unit-radius hypersphere, (ii) relocating the N points on a representative set of N hyperspheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyperellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References: [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
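
    The basic variance-reduction argument for LH over SR sampling can be demonstrated with a one-dimensional toy response of a lognormal conductivity, as below; the response function and sample sizes are arbitrary assumptions, and the SL and ME methods of the abstract are not implemented here.

```python
import numpy as np
from scipy.stats import norm, qmc

def response(logK):
    # Toy flow-like response of a lognormal "conductivity" (illustrative only).
    return np.exp(-0.5 * logK) + 0.1 * logK ** 2

def estimate_mean(n, method, rng):
    if method == "SR":
        u = rng.random((n, 1))                             # simple random sampling
    else:
        u = qmc.LatinHypercube(d=1, seed=rng).random(n)    # Latin hypercube sampling
    logK = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12)).ravel()
    return response(logK).mean()

rng = np.random.default_rng(17)
for method in ("SR", "LH"):
    means = [estimate_mean(20, method, rng) for _ in range(500)]
    print(f"{method}: spread (std) of the estimated mean = {np.std(means):.4f}")
```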

  12. Machine learning from computer simulations with applications in rail vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Taheri, Mehdi; Ahmadian, Mehdi

    2016-05-01

    The application of stochastic modelling for learning the behaviour of multibody dynamics (MBD) models is investigated. Post-processing data from a simulation run are used to train the stochastic model that estimates the relationship between model inputs (suspension relative displacement and velocity) and the output (sum of suspension forces). The stochastic model can be used to reduce the computational burden of the MBD model by replacing a computationally expensive subsystem in the model (the suspension subsystem). With minor changes, the stochastic modelling technique is able to learn the behaviour of a physical system and integrate its behaviour within MBD models. The technique is highly advantageous for MBD models where real-time simulations are necessary, or for models that have a large number of repeated substructures, e.g. modelling a train with a large number of railcars. The fact that the training data are acquired prior to the development of the stochastic model precludes conventional sampling plan strategies such as Latin hypercube sampling plans, where simulations are performed using the inputs dictated by the sampling plan. Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, a sampling plan suitable for the process is developed in which the most space-filling subset of the acquired data, with the number of sample points that best describes the dynamic behaviour of the system under study, is selected as the training data.

  13. Non-intrusive reduced order modeling of nonlinear problems using neural networks

    NASA Astrophysics Data System (ADS)

    Hesthaven, J. S.; Ubbiali, S.

    2018-06-01

    We develop a non-intrusive reduced basis (RB) method for parametrized steady-state partial differential equations (PDEs). The method extracts a reduced basis from a collection of high-fidelity solutions via a proper orthogonal decomposition (POD) and employs artificial neural networks (ANNs), particularly multi-layer perceptrons (MLPs), to accurately approximate the coefficients of the reduced model. The search for the optimal number of neurons and the minimum amount of training samples to avoid overfitting is carried out in the offline phase through an automatic routine, relying upon a joint use of Latin hypercube sampling (LHS) and the Levenberg-Marquardt (LM) training algorithm. This guarantees a complete offline-online decoupling, leading to an efficient RB method, referred to as POD-NN, suitable also for general nonlinear problems with a non-affine parametric dependence. Numerical studies are presented for the nonlinear Poisson equation and for driven cavity viscous flows, modeled through the steady incompressible Navier-Stokes equations. Both physical and geometrical parametrizations are considered. Several results confirm the accuracy of the POD-NN method and show the substantial speed-up enabled at the online stage as compared to a traditional RB strategy.
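
    A compact sketch of the POD-NN idea (POD basis from LHS snapshots, a neural network mapping parameters to reduced coefficients) is given below; the parametrized field is a toy stand-in for a PDE solver, and scikit-learn's default optimizer is used in place of the Levenberg-Marquardt training mentioned in the abstract.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.neural_network import MLPRegressor

# Toy parametrized "high-fidelity" field standing in for a PDE solver output.
x = np.linspace(0.0, 1.0, 200)
def solve(mu):
    return np.sin((1.0 + 2.0 * mu[0]) * np.pi * x) * np.exp(-2.0 * mu[1] * x)

# Offline stage: LHS in parameter space, snapshot matrix, POD basis via SVD.
M = qmc.LatinHypercube(d=2, seed=0).random(100)            # training parameters
S = np.column_stack([solve(mu) for mu in M])               # snapshot matrix
U, s, _ = np.linalg.svd(S, full_matrices=False)
r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.9999)) + 1
V = U[:, :r]                                               # reduced basis
coeffs = (V.T @ S).T                                       # projection coefficients

# Train a network to map parameters to reduced coefficients (offline).
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
net.fit(M, coeffs)

# Online stage: predict coefficients for a new parameter and reconstruct the field.
mu_new = np.array([0.37, 0.62])
u_rb = V @ net.predict(mu_new[None, :])[0]
err = np.linalg.norm(u_rb - solve(mu_new)) / np.linalg.norm(solve(mu_new))
print(f"relative reconstruction error: {err:.2e}")
```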

  14. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    DOE PAGES

    Strydom, Gerhard

    2013-01-01

    The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.

  15. Maximum projection designs for computer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full dimensional space. This can result in poor projections onto lower dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance design to the class of Latin hypercubes can improve one-dimensional projections, but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than a design criterion that ignores projection properties.
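
    The MaxPro criterion can be evaluated directly from a design matrix, as in the sketch below; picking the best of many random Latin hypercubes is only an illustration of the criterion, whereas the paper optimizes it directly.

```python
import numpy as np
from scipy.stats import qmc

def maxpro_criterion(D):
    """MaxPro criterion (Joseph, Gul and Ba): average over point pairs of
    1 / prod_k (x_ik - x_jk)^2, raised to the power 1/p. Smaller is better; it
    blows up whenever two points nearly coincide in ANY single coordinate."""
    n, p = D.shape
    total = 0.0
    for i in range(n - 1):
        diff2 = (D[i] - D[i + 1:]) ** 2           # squared coordinate differences
        total += np.sum(1.0 / np.prod(diff2, axis=1))
    return (2.0 * total / (n * (n - 1))) ** (1.0 / p)

# Crude illustration: score many random Latin hypercubes and keep the best one
# (the paper optimizes the criterion directly rather than by random search).
best = min((qmc.LatinHypercube(d=4, seed=s).random(20) for s in range(200)),
           key=maxpro_criterion)
print("best MaxPro score found:", round(maxpro_criterion(best), 3))
```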

  16. Maximum projection designs for computer experiments

    DOE PAGES

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    2015-03-18

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full dimensional space. This can result in poor projections onto lower dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance design to the class of Latin hypercubes can improve one-dimensional projections, but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than a design criterion that ignores projection properties.

  17. Parameter identification and optimization of slide guide joint of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Sun, B. B.

    2017-11-01

    The joint surface has an important influence on the performance of CNC machine tools. In order to identify the dynamic parameters of the slide guide joint, a parametric finite element model of the joint is established and an optimum design method is used based on finite element simulation and modal testing. The mode that has the most influence on the dynamics of the slip joint is then found through harmonic response analysis. Taking the frequency of this mode as the objective, a sensitivity analysis of the stiffness of each joint surface is carried out using Latin hypercube sampling and Monte Carlo simulation. The result shows that the vertical stiffness of the slip joint surface formed by the bed and the slide plate has the most obvious influence on the structure. Therefore, this stiffness is taken as the optimization variable and its optimal value is obtained by studying the relationship between structural dynamic performance and stiffness. Substituting the stiffness values before and after optimization into the FEM of the machine tool shows that the dynamic performance of the machine tool is improved.

  18. Optimized Reduction of Unsteady Radial Forces in a Singlechannel Pump for Wastewater Treatment

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Hyuk; Cho, Bo-Min; Choi, Young-Seok; Lee, Kyoung-Yong; Peck, Jong-Hyeon; Kim, Seon-Chang

    2016-11-01

    A single-channel pump for wastewater treatment was optimized to reduce unsteady radial force sources caused by impeller-volute interactions. The steady and unsteady Reynolds-averaged Navier-Stokes equations using the shear-stress transport turbulence model were discretized by finite volume approximations and solved on tetrahedral grids to analyze the flow in the single-channel pump. The sweep area of radial force during one revolution and the distance of the sweep-area center of mass from the origin were selected as the objective functions; the two design variables were related to the internal flow cross-sectional area of the volute. These objective functions were integrated into one objective function by applying the weighting factor for optimization. Latin hypercube sampling was employed to generate twelve design points within the design space. A response-surface approximation model was constructed as a surrogate model for the objectives, based on the objective function values at the generated design points. The optimized results showed considerable reduction in the unsteady radial force sources in the optimum design, relative to those of the reference design.

  19. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-22

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in the PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and was generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.

  20. A Novel Approach for Determining Source–Receptor Relationships in Model Simulations: A Case Study of Black Carbon Transport in Northern Hemisphere Winter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Po-Lun; Gattiker, J. R.; Liu, Xiaohong

    2013-06-27

    A Gaussian process (GP) emulator is applied to quantify the contribution of local and remote emissions of black carbon (BC) on the BC concentrations in different regions using a Latin Hypercube sampling strategy for emission perturbations in the offline version of the Community Atmosphere Model Version 5.1 (CAM5) simulations. The source-receptor relationships are computed based on simulations constrained by a standard free-running CAM5 simulation and the ERA-Interim reanalysis product. The analysis demonstrates that the emulator is capable of retrieving the source-receptor relationships based on a small number of CAM5 simulations. Most regions are found susceptible to their local emissions. The emulator also finds that the source-receptor relationships retrieved from the model-driven and the reanalysis-driven simulations are very similar, suggesting that the simulated circulation in CAM5 resembles the assimilated meteorology in ERA-Interim. The robustness of the results provides confidence for applying the emulator to detect dose-response signals in the climate system.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Jim Bouchard

    Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: • Development of time-dependent fire heat release rate profiles (required as input to CFAST), • Calculation of fire severity factors based on CFAST detailed fire modeling, and • Calculation of fire non-suppression probabilities.

  2. A smart Monte Carlo procedure for production costing and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, C.; Stremel, J.

    1996-11-01

    Electric utilities using chronological production costing models to decide whether to buy or sell power over the next week or next few weeks need to determine potential profits or losses under a number of uncertainties. A large amount of money can be at stake--often $100,000 a day or more--and one party of the sale must always take on the risk. In the case of fixed price ($/MWh) contracts, the seller accepts the risk. In the case of cost plus contracts, the buyer must accept the risk. So, modeling uncertainty and understanding the risk accurately can improve the competitive edge of the user. This paper investigates an efficient procedure for representing risks and costs from capacity outages. Typically, production costing models use an algorithm based on some form of random number generator to select resources as available or on outage. These algorithms allow experiments to be repeated and gains and losses to be observed in a short time. The authors perform several experiments to examine the capability of three unit outage selection methods and measure their results. Specifically, a brute force Monte Carlo procedure, a Monte Carlo procedure with Latin Hypercube sampling, and a Smart Monte Carlo procedure with cost stratification and directed sampling are examined.

  3. Estimated Under-Five Deaths Associated with Poor-Quality Antimalarials in Sub-Saharan Africa

    PubMed Central

    Renschler, John P.; Walters, Kelsey M.; Newton, Paul N.; Laxminarayan, Ramanan

    2015-01-01

    Many antimalarials sold in sub-Saharan Africa are poor-quality (falsified, substandard, or degraded), and the burden of disease caused by this problem is inadequately quantified. In this article, we estimate the number of under-five deaths caused by ineffective treatment of malaria associated with consumption of poor-quality antimalarials in 39 sub-Saharan countries. Using Latin hypercube sampling, our estimates were calculated as the product of the number of private sector antimalarials consumed by malaria-positive children in 2013; the proportion of private sector antimalarials consumed that were of poor quality; and the case fatality rate (CFR) of under-five malaria-positive children who did not receive appropriate treatment. An estimated 122,350 (interquartile range [IQR]: 91,577–154,736) under-five malaria deaths were associated with consumption of poor-quality antimalarials, representing 3.75% (IQR: 2.81–4.75%) of all under-five deaths in our sample of 39 countries. There is considerable uncertainty surrounding our results because of gaps in data on case fatality rates and prevalence of poor-quality antimalarials. Our analysis highlights the need for further investigation into the distribution of poor-quality antimalarials and the need for stronger surveillance and regulatory efforts to prevent the sale of poor-quality antimalarials. PMID:25897068
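    The estimate above is a product of three uncertain factors propagated by Latin hypercube sampling and summarized by its median and interquartile range. A hedged sketch of that calculation follows, with placeholder distributions rather than the study's country-level data.

    ```python
    # Product-of-factors uncertainty propagation with LHS (illustrative marginals).
    import numpy as np
    from scipy.stats import qmc, triang

    n = 100_000
    U = qmc.LatinHypercube(d=3, seed=2).random(n)

    # Placeholder triangular marginals for a single hypothetical country:
    consumed  = triang(c=0.5, loc=0.8e6, scale=0.4e6).ppf(U[:, 0])  # antimalarials taken by malaria-positive under-fives
    poor_frac = triang(c=0.5, loc=0.05,  scale=0.25).ppf(U[:, 1])   # proportion of poor quality
    cfr       = triang(c=0.5, loc=0.001, scale=0.004).ppf(U[:, 2])  # case fatality rate without effective treatment

    deaths = consumed * poor_frac * cfr
    q25, q50, q75 = np.percentile(deaths, [25, 50, 75])
    print(f"median {q50:,.0f}  IQR [{q25:,.0f}, {q75:,.0f}]")
    ```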

  4. Uncertainty quantification analysis of the dynamics of an electrostatically actuated microelectromechanical switch model

    NASA Astrophysics Data System (ADS)

    Snow, Michael G.; Bajaj, Anil K.

    2015-08-01

    This work presents an uncertainty quantification (UQ) analysis of a comprehensive model for an electrostatically actuated microelectromechanical system (MEMS) switch. The goal is to elucidate the effects of parameter variations on certain key performance characteristics of the switch. A sufficiently detailed model of the electrostatically actuated switch in the basic configuration of a clamped-clamped beam is developed. This multi-physics model accounts for various physical effects, including the electrostatic fringing field, finite length of electrodes, squeeze film damping, and contact between the beam and the dielectric layer. The performance characteristics of immediate interest are the static and dynamic pull-in voltages for the switch. Numerical approaches for evaluating these characteristics are developed and described. Using Latin Hypercube Sampling and other sampling methods, the model is evaluated to find these performance characteristics when variability in the model's geometric and physical parameters is specified. Response surfaces of these results are constructed via a Multivariate Adaptive Regression Splines (MARS) technique. Using a Direct Simulation Monte Carlo (DSMC) technique on these response surfaces gives smooth probability density functions (PDFs) of the output characteristics when input probability characteristics are specified. The relative variation in the two pull-in voltages due to each of the input parameters is used to determine the critical parameters.

  5. Simulation of Fault Tolerance in a Hypercube Arrangement of Discrete Processors.

    DTIC Science & Technology

    1987-12-01

    [Report front-matter excerpt; contents include: Geometric Properties; Binary Properties; Intel Hypercube Hardware Arrangement; Cube-Connected Cycles properties; CCC Redundancy; Re-Configurable Cube-Connected Cycles. Figures include: Hypercubes of Different Dimensions; Hypercube Properties.]

  6. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) it is more effective and efficient than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than being uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced substantially with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.

  7. Economic consequences of aviation system disruptions: A reduced-form computable general equilibrium analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin

    The state of the art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of economic impacts of various types of transportation system failures due to natural hazards, human related attacks or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the “synthetic data” results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focuses on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.

  8. Photovoltaic frequency–watt curve design for frequency regulation and fast contingency reserves

    DOE PAGES

    Johnson, Jay; Neely, Jason C.; Delhotal, Jarod J.; ...

    2016-09-02

    When renewable energy resources are installed in electricity grids, they typically increase generation variability and displace thermal generator control action and inertia. Grid operators combat these emerging challenges with advanced distributed energy resource (DER) functions to support frequency and provide voltage regulation and protection mechanisms. This paper focuses on providing frequency reserves using autonomous IEC TR 61850-90-7 pointwise frequency-watt (FW) functions that adjust DER active power as a function of measured grid frequency. The importance of incorporating FW functions into a fleet of photovoltaic (PV) systems is demonstrated in simulation. Effects of FW curve design, including curtailment, deadband, and droop, were analyzed against performance metrics using Latin hypercube sampling for 20%, 70%, and 120% PV penetration scenarios on the Hawaiian island of Lanai. To understand the financial implications of FW functions to utilities, a performance function was defined based on monetary costs attributable to curtailed PV production, load shedding, and generator wear. An optimization wrapper was then created to find the best FW function curve for each penetration level. Finally, it was found that in all cases, the utility would save money by implementing appropriate FW functions.

  9. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.

  10. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
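    Latin hypercube sampling combined with the partial rank correlation coefficient (LHS-PRCC) is one of the methods compared above. A small self-contained sketch follows, using an illustrative reproduction-number output rather than the paper's cholera or schistosomiasis models.

    ```python
    # LHS-PRCC: Latin hypercube sample, then partial rank correlation of each input with the output.
    import numpy as np
    from scipy.stats import qmc, rankdata, pearsonr

    def prcc(X, y):
        """Partial rank correlation of each column of X with y."""
        R = np.column_stack([rankdata(col) for col in X.T])
        ry = rankdata(y)
        out = []
        for j in range(R.shape[1]):
            others = np.delete(R, j, axis=1)
            A = np.column_stack([np.ones(len(ry)), others])
            res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
            res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
            out.append(pearsonr(res_x, res_y)[0])
        return np.array(out)

    U = qmc.LatinHypercube(d=3, seed=3).random(500)
    beta  = 0.1 + 0.4*U[:, 0]      # transmission rate (assumed range)
    gamma = 0.05 + 0.2*U[:, 1]     # recovery rate
    mu    = 0.001 + 0.01*U[:, 2]   # mortality rate
    R0 = beta / (gamma + mu)       # toy output: basic reproduction number
    print("PRCC:", np.round(prcc(np.column_stack([beta, gamma, mu]), R0), 2))
    ```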

  11. Feasibility of Stochastic Voltage/VAr Optimization Considering Renewable Energy Resources for Smart Grid

    NASA Astrophysics Data System (ADS)

    Momoh, James A.; Salkuti, Surender Reddy

    2016-06-01

    This paper proposes a stochastic optimization technique for solving the Voltage/VAr control problem including the load demand and Renewable Energy Resources (RERs) variation. RERs introduce stochastic behavior into the system inputs. Voltage/VAr control is one of the primary means of handling power system complexity and reliability, and is therefore a fundamental requirement for all utility companies. A robust and efficient Voltage/VAr optimization technique is needed to meet the peak demand and reduce system losses. Voltages beyond the allowed limits may damage costly substation devices as well as equipment at the consumer end. In particular, RERs introduce additional disturbances, and some RERs are not even capable of meeting the VAr demand. Therefore, there is a strong need for Voltage/VAr control in an RER environment. This paper aims at the development of an optimal scheme for Voltage/VAr control involving RERs. In this paper, the Latin Hypercube Sampling (LHS) method is used to cover the full range of variables by maximally satisfying the marginal distribution. A backward scenario reduction technique is used to reduce the number of scenarios effectively while maximally retaining the fitting accuracy of the samples. The developed optimization scheme is tested on the IEEE 24-bus Reliability Test System (RTS) considering the load demand and RERs variation.

  12. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also equipped with two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.

  13. Using an ensemble smoother to evaluate parameter uncertainty of an integrated hydrological model of Yanqi basin

    NASA Astrophysics Data System (ADS)

    Li, Ning; McLaughlin, Dennis; Kinzelbach, Wolfgang; Li, WenPeng; Dong, XinGuang

    2015-10-01

    Model uncertainty needs to be quantified to provide objective assessments of the reliability of model predictions and of the risk associated with management decisions that rely on these predictions. This is particularly true in water resource studies that depend on model-based assessments of alternative management strategies. In recent decades, Bayesian data assimilation methods have been widely used in hydrology to assess uncertain model parameters and predictions. In this case study, a particular data assimilation algorithm, the Ensemble Smoother with Multiple Data Assimilation (ESMDA) (Emerick and Reynolds, 2012), is used to derive posterior samples of uncertain model parameters and forecasts for a distributed hydrological model of Yanqi basin, China. This model is constructed using the MIKE SHE/MIKE 11 software, which provides coupling between surface and subsurface processes (DHI, 2011a-d). The random samples in the posterior parameter ensemble are obtained by using measurements to update 50 prior parameter samples generated with a Latin Hypercube Sampling (LHS) procedure. The posterior forecast samples are obtained from model runs that use the corresponding posterior parameter samples. Two iterative sample update methods are considered: one based on a perturbed-observation Kalman filter update and one based on a square root Kalman filter update. These alternatives give nearly the same results and converge in only two iterations. The uncertain parameters considered include hydraulic conductivities, drainage and river leakage factors, van Genuchten soil property parameters, and dispersion coefficients. The results show that the uncertainty in many of the parameters is reduced during the smoother updating process, reflecting information obtained from the observations. Some of the parameters are insensitive and do not benefit from measurement information. The correlation coefficients among certain parameters increase in each iteration, although they generally stay below 0.50.

  14. Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide

    NASA Technical Reports Server (NTRS)

    Williams, Winifred I.

    1990-01-01

    This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high performance science analysis workstation. In its current configuration CIPE utilizes a JPL/Caltech Mark 3fp Hypercube with a Sun-4 host. CIPE's design is capable of incorporating other concurrent architectures as well. CIPE provides a programming environment to applications' programmers to shield them from various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications to use only the Sun-4 or to use the Sun-4 with the hypercube. A hypercube program will use the hypercube's data processors and optionally the Weitek floating point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube resident applications, and to communicate parameters to and from the hypercube.

  15. A discrete random walk on the hypercube

    NASA Astrophysics Data System (ADS)

    Zhang, Jingyuan; Xiang, Yonghong; Sun, Weigang

    2018-03-01

    In this paper, we study the scaling for mean first-passage time (MFPT) of random walks on the hypercube and obtain a closed-form formula for the MFPT over all node pairs. We also determine the exponent of scaling efficiency characterizing the random walks and compare it with those of the existing networks. Finally we study the random walks on the hypercube with a located trap and provide a solution of the Kirchhoff index of the hypercube.

  16. Statistical uncertainty analysis applied to the DRAGONv4 code lattice calculations and based on JENDL-4 covariance data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez-Solis, A.; Demaziere, C.; Ekberg, C.

    2012-07-01

    In this paper, multi-group microscopic cross-section uncertainty is propagated through the DRAGON (Version 4) lattice code, in order to perform uncertainty analysis on k-infinity and 2-group homogenized macroscopic cross-section predictions. A statistical methodology is employed for such purposes, where cross-sections of certain isotopes of various elements belonging to the 172-group DRAGLIB library format are considered as normal random variables. This library is based on JENDL-4 data, because JENDL-4 contains the largest amount of isotopic covariance matrices among the different major nuclear data libraries. The aim is to propagate multi-group nuclide uncertainty by running the DRAGONv4 code 500 times, and to assess the output uncertainty of a test case corresponding to a 17 x 17 PWR fuel assembly segment without poison. The chosen sampling strategy for the current study is Latin Hypercube Sampling (LHS). The quasi-random LHS allows a much better coverage of the input uncertainties than simple random sampling (SRS) because it densely stratifies across the range of each input probability distribution. Output uncertainty assessment is based on the tolerance limits concept, where the sample formed by the code calculations is inferred to cover 95% of the output population with at least 95% confidence. This analysis is the first attempt to propagate parameter uncertainties of modern multi-group libraries, which are used to feed advanced lattice codes that perform state of the art resonant self-shielding calculations such as DRAGONv4. (authors)
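    The 95%/95% tolerance-limit argument referenced above is typically justified with a Wilks-type order-statistic bound. The snippet below checks the standard first-order, one-sided sample-size requirement (59 runs) against the 500 DRAGONv4 runs used; whether the authors applied a one- or two-sided limit is not stated here, so this is only the simplest case.

    ```python
    # Wilks-type sample-size check for a one-sided 95%/95% tolerance limit.
    coverage, confidence = 0.95, 0.95

    n = 1
    while 1 - coverage**n < confidence:   # P(sample max bounds 95% of population) >= 95%
        n += 1
    print("minimum one-sided 95/95 sample size:", n)      # -> 59
    print("confidence achieved with 500 runs:", 1 - coverage**500)
    ```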

  17. Physically motivated correlation formalism in hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Roy, Ankita; Rafert, J. Bruce

    2004-05-01

    Most remote sensing data-sets contain a limited number of independent spatial and spectral measurements, beyond which no effective increase in information is achieved. This paper presents a Physically Motivated Correlation Formalism (PMCF), which places both Spatial and Spectral data on an equivalent mathematical footing in the context of a specific Kernel, such that optimal combinations of independent data can be selected from the entire Hypercube via the method of "Correlation Moments". We present an experimental and computational analysis of Hyperspectral data sets using the Michigan Tech VFTHSI [Visible Fourier Transform Hyperspectral Imager] based on a Sagnac Interferometer, adjusted to obtain high SNR levels. The captured Signal Interferograms of different targets - aerial snaps of Houghton and lab-based data (white light, He-Ne laser, discharge tube sources), acquired with customized scans of the targets at the same exposures - are processed using inverse imaging transformations and filtering techniques to obtain the Spectral profiles and generate Hypercubes to compute Spectral/Spatial/Cross Moments. PMCF answers the question of how optimally the entire hypercube should be sampled and finds how many spatial-spectral pixels are required for a particular target recognition.

  18. Parallel and fault-tolerant algorithms for hypercube multiprocessors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aykanat, C.

    1988-01-01

    Several techniques for increasing the performance of parallel algorithms on distributed-memory message-passing multi-processor systems are investigated. These techniques are effectively implemented for the parallelization of the Scaled Conjugate Gradient (SCG) algorithm on a hypercube connected message-passing multi-processor. Significant performance improvement is achieved by using these techniques. The SCG algorithm is used for the solution phase of an FE modeling system. Almost linear speed-up is achieved, and it is shown that hypercube topology is scalable for an FE class of problem. The SCG algorithm is also shown to be suitable for vectorization, and near supercomputer performance is achieved on a vector hypercube multiprocessor by exploiting both parallelization and vectorization. Fault-tolerance issues for the parallel SCG algorithm and for the hypercube topology are also addressed.

  19. Uncertainty and sensitivity analysis of the basic reproduction number of diphtheria: a case study of a Rohingya refugee camp in Bangladesh, November–December 2017

    PubMed Central

    Matsuyama, Ryota; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya

    2018-01-01

    Background A Rohingya refugee camp in Cox’s Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. Methods A renewal process model was devised to estimate the R0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. Results R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Discussion Estimated R0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified. PMID:29629244

  20. Optimization on the impeller of a low-specific-speed centrifugal pump for hydraulic performance improvement

    NASA Astrophysics Data System (ADS)

    Pei, Ji; Wang, Wenjie; Yuan, Shouqi; Zhang, Jinfeng

    2016-09-01

    In order to widen the high-efficiency operating range of a low-specific-speed centrifugal pump, an optimization process considering efficiencies under 1.0Qd and 1.4Qd is proposed. Three parameters, namely, the blade outlet width b2, blade outlet angle β2, and blade wrap angle φ, are selected as design variables. Impellers are generated using the optimal Latin hypercube sampling method. The pump efficiencies are calculated using the software CFX 14.5 at the two operating points selected as objectives. Surrogate models are also constructed to analyze the relationship between the objectives and the design variables. Finally, the particle swarm optimization algorithm is applied to the surrogate model to determine the best combination of the impeller parameters. The results show that the performance curve predicted by numerical simulation is in good agreement with the experimental results. Compared with the efficiencies of the original impeller, the hydraulic efficiencies of the optimized impeller are increased by 4.18% and 0.62% under 1.0Qd and 1.4Qd, respectively. The comparison of the inner flow between the original pump and the optimized one illustrates the improvement in performance. The optimization process can provide a useful reference for performance improvement of other pumps, and even for reduction of pressure fluctuations.
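    The optimization chain described above (optimal LHS design, a surrogate of the CFD efficiencies, and a particle swarm search of the surrogate) can be sketched as below. The variable bounds, the radial-basis surrogate choice, and the stand-in efficiency function are assumptions, not the paper's CFX setup.

    ```python
    # LHS design -> surrogate -> particle swarm search (illustrative only).
    import numpy as np
    from scipy.stats import qmc
    from scipy.interpolate import RBFInterpolator

    lb = np.array([10.0, 20.0, 90.0])    # b2 [mm], beta2 [deg], phi [deg] (assumed bounds)
    ub = np.array([16.0, 40.0, 130.0])

    X = qmc.scale(qmc.LatinHypercube(d=3, seed=4).random(40), lb, ub)

    def cfd_efficiency(x):               # placeholder for the CFX evaluations
        b2, beta2, phi = x
        return -((b2 - 13)**2/9 + (beta2 - 28)**2/100 + (phi - 110)**2/400)

    y = np.array([cfd_efficiency(x) for x in X])
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

    # Minimal particle swarm optimization over the surrogate
    rng = np.random.default_rng(4)
    pos = rng.uniform(lb, ub, size=(30, 3)); vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), surrogate(pos)
    gbest = pbest[np.argmax(pbest_f)]
    for _ in range(100):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7*vel + 1.5*r1*(pbest - pos) + 1.5*r2*(gbest - pos)
        pos = np.clip(pos + vel, lb, ub)
        f = surrogate(pos)
        better = f > pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[np.argmax(pbest_f)]
    print("surrogate optimum (b2, beta2, phi):", np.round(gbest, 2))
    ```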

  1. Geometry control of long-span continuous girder concrete bridge during construction through finite element model updating

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Yan, Quan-sheng; Li, Jian; Hu, Min-yi

    2016-04-01

    In bridge construction, geometry control is critical to ensure that the final constructed bridge has the same shape as designed. A common method is to predict the deflections of the bridge during each construction phase through the associated finite element models, so that the cambers of the bridge during different construction phases can be determined beforehand. These finite element models are mostly based on the design drawings and nominal material properties. However, the errors of these bridge models can be large due to significant uncertainties in the actual properties of the materials used in construction. Therefore, the predicted cambers may not be accurate enough to ensure agreement of the bridge geometry with the design, especially for long-span bridges. In this paper, an improved geometry control method is described, which incorporates finite element (FE) model updating during the construction process based on measured bridge deflections. A method based on the Kriging model and Latin hypercube sampling is proposed to perform the FE model updating due to its simplicity and efficiency. The proposed method has been applied to a long-span continuous girder concrete bridge during its construction. Results show that the method is effective in reducing construction error and ensuring the accuracy of the geometry of the final constructed bridge.
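    A minimal sketch of surrogate-based model updating with an LHS design, in the spirit of the Kriging approach mentioned above: sample an uncertain material parameter, fit a Gaussian-process surrogate of the predicted deflection, and select the parameter value that best reproduces a measured deflection. The one-parameter stand-in FE model and the measured value are purely illustrative.

    ```python
    # LHS prior sample -> Kriging (GP) surrogate -> match measured deflection.
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    E = qmc.scale(qmc.LatinHypercube(d=1, seed=5).random(20), [28.0], [40.0])  # concrete modulus [GPa]

    def fe_deflection(e_gpa):             # stand-in for the construction-stage FE model
        return 0.12 * 34.0 / e_gpa        # midspan deflection [m]

    y = np.array([fe_deflection(e[0]) for e in E])
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=5.0), normalize_y=True)
    gp.fit(E, y)

    measured = 0.105                      # assumed measured deflection at this phase [m]
    grid = np.linspace(28.0, 40.0, 2000).reshape(-1, 1)
    E_updated = grid[np.argmin((gp.predict(grid) - measured) ** 2), 0]
    print(f"updated modulus: {E_updated:.1f} GPa")
    ```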

  2. Uncertainty and sensitivity analysis of the basic reproduction number of diphtheria: a case study of a Rohingya refugee camp in Bangladesh, November-December 2017.

    PubMed

    Matsuyama, Ryota; Akhmetzhanov, Andrei R; Endo, Akira; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya; Nishiura, Hiroshi

    2018-01-01

    A Rohingya refugee camp in Cox's Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. A renewal process model was devised to estimate the R0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Estimated R0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified.

  3. Data Analysis Approaches for the Risk-Informed Safety Margins Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Alfonsi, Andrea; Maljovec, Daniel P.

    2016-09-01

    In the past decades, several numerical simulation codes have been employed to simulate accident dynamics (e.g., RELAP5-3D, RELAP-7, MELCOR, MAAP). In order to evaluate the impact of uncertainties on accident dynamics, several stochastic methodologies have been coupled with these codes. These stochastic methods range from classical Monte-Carlo and Latin Hypercube sampling to stochastic polynomial methods. Similar approaches have been introduced into the risk and safety community where stochastic methods (such as RAVEN, ADAPT, MCDET, ADS) have been coupled with safety analysis codes in order to evaluate the safety impact of timing and sequencing of events. These approaches are usually called Dynamic PRA or simulation-based PRA methods. These uncertainties and safety methods usually generate a large number of simulation runs (database storage may be on the order of gigabytes or higher). The scope of this paper is to present a broad overview of methods and algorithms that can be used to analyze and extract information from large data sets containing time dependent data. In this context, “extracting information” means constructing input-output correlations, finding commonalities, and identifying outliers. Some of the algorithms presented here have been developed or are under development within the RAVEN statistical framework.

  4. Identification of robust adaptation gene regulatory network parameters using an improved particle swarm optimization algorithm.

    PubMed

    Huang, X N; Ren, H P

    2016-05-13

    Robust adaptation is a critical ability of a gene regulatory network (GRN) to survive in a fluctuating environment, in which the system responds to an input stimulus rapidly and then returns to its pre-stimulus steady state in a timely manner. In this paper, the GRN is modeled using the Michaelis-Menten rate equations, which are highly nonlinear differential equations containing 12 undetermined parameters. The robust adaptation is quantitatively described by two conflicting indices. Identifying parameter sets that confer robust adaptation on the GRN is a multi-variable, multi-objective, and multi-peak optimization problem for which it is difficult to obtain satisfactory, and especially high-quality, solutions. A new best-neighbor particle swarm optimization algorithm is proposed to implement this task. The proposed algorithm employs a Latin hypercube sampling method to generate the initial population. The particle crossover operation and elitist preservation strategy are also used in the proposed algorithm. The simulation results revealed that the proposed algorithm could identify multiple solutions in a single run. Moreover, it demonstrated superior performance compared to previous methods in the sense of detecting more high-quality solutions within an acceptable time. The proposed methodology, owing to its universality and simplicity, is useful for providing guidance in designing GRNs with superior robust adaptation.
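    The LHS-based population initialization mentioned above can be sketched in a few lines. The number of particles, the 12-parameter bounds, and the log-space scaling are assumptions for illustration, not the paper's settings.

    ```python
    # LHS initialization of a particle swarm population over 12 parameters.
    import numpy as np
    from scipy.stats import qmc

    n_particles, n_params = 50, 12
    lb = np.full(n_params, 1e-3)          # assumed lower bounds of the rate parameters
    ub = np.full(n_params, 1e2)           # assumed upper bounds

    unit = qmc.LatinHypercube(d=n_params, seed=6).random(n_particles)
    # Sample in log-space so the stratification covers the orders of magnitude evenly
    init_pop = 10 ** qmc.scale(unit, np.log10(lb), np.log10(ub))
    print(init_pop.shape)                 # (50, 12) initial particle positions
    ```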

  5. Efficient stochastic approaches for sensitivity studies of an Eulerian large-scale air pollution model

    NASA Astrophysics Data System (ADS)

    Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.

    2017-10-01

    Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers. Sensitivity analysis of model outputs to variation or natural uncertainties of model inputs is crucial for improving the reliability of mathematical models. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been done. A comparison with Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers has been presented. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model. The sensitivity study has been done for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration, especially for computing sensitivity indices that are small in value. This is crucial, since even small indices may need to be estimated in order to achieve a more accurate distribution of input influence and a more reliable interpretation of the mathematical model results.
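    A small numerical illustration of the comparison discussed above: integrating a smooth test function over the unit hypercube with plain Monte Carlo, Latin hypercube sampling, and a scrambled Sobol sequence. The integrand and sample sizes are illustrative, not those of the air pollution study.

    ```python
    # RMS integration error for MC, LHS, and Sobol sampling on [0,1]^6.
    import numpy as np
    from scipy.stats import qmc

    d, n, reps = 6, 1024, 20
    f = lambda x: np.prod(1 + 0.3*(x - 0.5), axis=1)     # exact integral = 1

    def rms_error(sample_fn):
        errs = [abs(f(sample_fn(r)).mean() - 1.0) for r in range(reps)]
        return np.sqrt(np.mean(np.square(errs)))

    mc    = lambda r: np.random.default_rng(r).random((n, d))
    lhs   = lambda r: qmc.LatinHypercube(d=d, seed=r).random(n)
    sobol = lambda r: qmc.Sobol(d=d, scramble=True, seed=r).random(n)

    for name, fn in [("MC", mc), ("LHS", lhs), ("Sobol", sobol)]:
        print(f"{name:6s} RMS error: {rms_error(fn):.2e}")
    ```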

  6. Assessment of the transportation route of oversize and excessive loads in relation to the load-bearing capacity of existing bridges

    NASA Astrophysics Data System (ADS)

    Doležel, Jiří; Novák, Drahomír; Petrů, Jan

    2017-09-01

    Transportation routes of oversize and excessive loads are currently planned to ensure the transit of the vehicle through critical points on the road. Critical points include at-grade road intersections, bridges, etc. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced reliability analysis methods based on simulation techniques of the Monte Carlo type in combination with nonlinear finite element analysis. The safety index is considered the main criterion of the reliability level of existing structures and is described in current structural design standards, e.g. ISO and the Eurocodes. As an example, the load-bearing capacity of a 60-year-old single-span slab bridge made of precast prestressed concrete girders is determined for the ultimate limit state and the serviceability limit state. The structure’s design load capacity was estimated by a fully probabilistic nonlinear finite element analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the load-bearing capacity levels estimated by deterministic methods for a critical section of the most heavily loaded girders.

  7. Fault tree models for fault tolerant hypercube multiprocessors

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Tuazon, Jezus O.

    1991-01-01

    Three candidate fault tolerant hypercube architectures are modeled, their reliability analyses are compared, and the resulting implications of these methods of incorporating fault tolerance into hypercube multiprocessors are discussed. In the course of performing the reliability analyses, the use of HARP and fault trees in modeling sequence dependent system behaviors is demonstrated.

  8. High-throughput Raman chemical imaging for evaluating food safety and quality

    NASA Astrophysics Data System (ADS)

    Qin, Jianwei; Chao, Kuanglin; Kim, Moon S.

    2014-05-01

    A line-scan hyperspectral system was developed to enable Raman chemical imaging for large sample areas. A custom-designed 785 nm line-laser based on a scanning mirror serves as an excitation source. A 45° dichroic beamsplitter reflects the laser light to form a 24 cm x 1 mm excitation line normally incident on the sample surface. Raman signals along the laser line are collected by a detection module consisting of a dispersive imaging spectrograph and a CCD camera. A hypercube is accumulated line by line as a motorized table moves the samples transversely through the laser line. The system covers a Raman shift range of -648.7-2889.0 cm-1 and a 23 cm wide area. An example application, for authenticating milk powder, was presented to demonstrate the system performance. In four minutes, the system acquired a 512x110x1024 hypercube (56,320 spectra) from four 47-mm-diameter Petri dishes containing four powder samples. Chemical images were created for detecting two adulterants (melamine and dicyandiamide) that had been mixed into the milk powder.

  9. Multi-objective aerodynamic shape optimization of small livestock trailers

    NASA Astrophysics Data System (ADS)

    Gilkeson, C. A.; Toropov, V. V.; Thompson, H. M.; Wilson, M. C. T.; Foxley, N. A.; Gaskell, P. H.

    2013-11-01

    This article presents a formal optimization study of the design of small livestock trailers, within which the majority of animals are transported to market in the UK. The benefits of employing a headboard fairing to reduce aerodynamic drag without compromising the ventilation of the animals' microclimate are investigated using a multi-stage process involving computational fluid dynamics (CFD), optimal Latin hypercube (OLH) design of experiments (DoE) and moving least squares (MLS) metamodels. Fairings are parameterized in terms of three design variables and CFD solutions are obtained at 50 permutations of design variables. Both global and local search methods are employed to locate the global minimum from metamodels of the objective functions and a Pareto front is generated. The importance of carefully selecting an objective function is demonstrated and optimal fairing designs, offering drag reductions in excess of 5% without compromising animal ventilation, are presented.

  10. Critical Concentration Ratio for Solar Thermoelectric Generators

    NASA Astrophysics Data System (ADS)

    ur Rehman, Naveed; Siddiqui, Mubashir Ali

    2016-10-01

    A correlation for determining the critical concentration ratio (CCR) of solar concentrated thermoelectric generators (SCTEGs) has been established, and the significance of the contributing parameters is discussed in detail. For any SCTEG, higher concentration ratio leads to higher temperatures at the hot side of modules. However, the maximum value of this temperature for safe operation is limited by the material properties of the modules and should be considered as an important design constraint. Taking into account this limitation, the CCR can be defined as the maximum concentration ratio usable for a particular SCTEG. The established correlation is based on factors associated with the material and geometric properties of modules, thermal characteristics of the receiver, installation site attributes, and thermal and electrical operating conditions. To reduce the number of terms in the correlation, these factors are combined to form dimensionless groups by applying the Buckingham Pi theorem. A correlation model containing these groups is proposed and fit to a dataset obtained by simulating a thermodynamic (physical) model over sampled values acquired by applying the Latin hypercube sampling (LHS) technique over a realistic distribution of factors. The coefficient of determination and relative error are found to be 97% and ±20%, respectively. The correlation is validated by comparing the predicted results with literature values. In addition, the significance and effects of the Pi groups on the CCR are evaluated and thoroughly discussed. This study will lead to a wide range of opportunities regarding design and optimization of SCTEGs.

  11. Landscape patterns and soil organic carbon stocks in agricultural bocage landscapes

    NASA Astrophysics Data System (ADS)

    Viaud, Valérie; Lacoste, Marine; Michot, Didier; Walter, Christian

    2014-05-01

    Soil organic carbon (SOC) has a crucial impact on global carbon storage at the world scale. SOC spatial variability is controlled by the landscape patterns resulting from the continuous interactions between the physical environment and society. Natural and anthropogenic processes occurring and interplaying at the landscape scale, such as soil redistribution in the lateral and vertical dimensions by tillage and water erosion processes or spatial differentiation of land-use and land-management practices, strongly affect SOC dynamics. Inventories of SOC stocks, reflecting their spatial distribution, are thus key elements to develop relevant management strategies for improving carbon sequestration and mitigating climate change and soil degradation. This study aims to quantify SOC stocks and their spatial distribution in a 1,000-ha agricultural bocage landscape with dairy production as the dominant farming system (Zone Atelier Armorique, LTER Europe, NW France). The site is characterized by high heterogeneity over short distances due to a high diversity of soils with varying waterlogging, soil parent material, topography, land-use and hedgerow density. SOC content and stocks were measured down to 105-cm depth at 200 sampling locations selected using conditioned Latin hypercube sampling. Additional sampling was designed specifically to explore the SOC distribution near hedges: 112 points were sampled at fixed distances on 14 transects perpendicular to hedges. We illustrate the heterogeneity of the spatial and vertical distribution of SOC stocks at the landscape scale, and quantify SOC stocks in the various landscape components. Using multivariate statistics, we discuss the variability and co-variability of the existing spatial organization of cropping systems, environmental factors, and SOM stocks over the landscape. Ultimately, our results may contribute to improving regional or national digital soil mapping approaches, by considering the distribution of SOC stocks within each modeling unit and by accounting for the impact of sensitive ecosystems.

  12. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect, and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum number of samples needed is 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.

  13. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  14. Optimization of High-Dimensional Functions through Hypercube Evaluation

    PubMed Central

    Abiyev, Rahib H.; Tunay, Mustafa

    2015-01-01

    A novel learning algorithm for solving global numerical optimization problems is proposed. The proposed learning algorithm is intense stochastic search method which is based on evaluation and optimization of a hypercube and is called the hypercube optimization (HO) algorithm. The HO algorithm comprises the initialization and evaluation process, displacement-shrink process, and searching space process. The initialization and evaluation process initializes initial solution and evaluates the solutions in given hypercube. The displacement-shrink process determines displacement and evaluates objective functions using new points, and the search area process determines next hypercube using certain rules and evaluates the new solutions. The algorithms for these processes have been designed and presented in the paper. The designed HO algorithm is tested on specific benchmark functions. The simulations of HO algorithm have been performed for optimization of functions of 1000-, 5000-, or even 10000 dimensions. The comparative simulation results with other approaches demonstrate that the proposed algorithm is a potential candidate for optimization of both low and high dimensional functions. PMID:26339237

  15. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    NASA Technical Reports Server (NTRS)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.

  16. Benchmarking hypercube hardware and software

    NASA Technical Reports Server (NTRS)

    Grunwald, Dirk C.; Reed, Daniel A.

    1986-01-01

    It was long a truism in computer systems design that balanced systems achieve the best performance. Message passing parallel processors are no different. To quantify the balance of a hypercube design, an experimental methodology was developed and the associated suite of benchmarks was applied to several existing hypercubes. The benchmark suite includes tests of both processor speed in the absence of internode communication and message transmission speed as a function of communication patterns.

  17. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
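    The replicate comparison described above can be mimicked on a toy problem: three replicates of 1,000 samples per scheme, with the spread of the replicate means indicating convergence. The stand-in consequence function below is illustrative and unrelated to MACCS2.

    ```python
    # LHS vs. SRS replicate comparison on a toy "consequence" function.
    import numpy as np
    from scipy.stats import qmc

    consequence = lambda u: np.exp(1.5*u[:, 0]) * (0.5 + u[:, 1])   # illustrative monotone response

    def replicate_means(scheme, seeds=(0, 1, 2), n=1000, d=2):
        means = []
        for s in seeds:
            if scheme == "LHS":
                u = qmc.LatinHypercube(d=d, seed=s).random(n)
            else:                                   # simple random sampling
                u = np.random.default_rng(s).random((n, d))
            means.append(consequence(u).mean())
        return np.array(means)

    for scheme in ("LHS", "SRS"):
        m = replicate_means(scheme)
        print(f"{scheme}: replicate means {np.round(m, 4)}, spread {np.ptp(m):.4f}")
    ```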

  18. A Bayesian ensemble data assimilation to constrain model parameters and land-use carbon emissions

    NASA Astrophysics Data System (ADS)

    Lienert, Sebastian; Joos, Fortunat

    2018-05-01

    A dynamic global vegetation model (DGVM) is applied in a probabilistic framework and benchmarking system to constrain uncertain model parameters by observations and to quantify carbon emissions from land-use and land-cover change (LULCC). Processes featured in DGVMs include parameters which are prone to substantial uncertainty. To cope with these uncertainties Latin hypercube sampling (LHS) is used to create a 1000-member perturbed parameter ensemble, which is then evaluated with a diverse set of global and spatiotemporally resolved observational constraints. We discuss the performance of the constrained ensemble and use it to formulate a new best-guess version of the model (LPX-Bern v1.4). The observationally constrained ensemble is used to investigate historical emissions due to LULCC (ELUC) and their sensitivity to model parametrization. We find a global ELUC estimate of 158 (108, 211) PgC (median and 90 % confidence interval) between 1800 and 2016. We compare ELUC to other estimates both globally and regionally. Spatial patterns are investigated and estimates of ELUC of the 10 countries with the largest contribution to the flux over the historical period are reported. We consider model versions with and without additional land-use processes (shifting cultivation and wood harvest) and find that the difference in global ELUC is on the same order of magnitude as parameter-induced uncertainty and in some cases could potentially even be offset with appropriate parameter choice.
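
    As a minimal sketch of the ensemble-construction step, the snippet below draws a 1000-member Latin hypercube design over a box of parameter ranges and turns each row into one parameter set; the parameter names and bounds are placeholders, not the LPX-Bern parameters.

```python
# Sketch: build a 1000-member perturbed-parameter ensemble with LHS.
# Parameter names and ranges are hypothetical placeholders.
import numpy as np
from scipy.stats import qmc

param_bounds = {
    "photosynthesis_scale": (0.5, 1.5),    # placeholder range
    "soil_turnover_years":  (10.0, 60.0),  # placeholder range
    "fire_resistance":      (0.0, 1.0),    # placeholder range
}
names = list(param_bounds)
lower = [param_bounds[p][0] for p in names]
upper = [param_bounds[p][1] for p in names]

unit = qmc.LatinHypercube(d=len(names), seed=42).random(n=1000)
ensemble = qmc.scale(unit, lower, upper)          # shape (1000, n_params)

# Each row defines one ensemble member; each member would then be scored
# against the observational constraints.
member_0 = dict(zip(names, ensemble[0]))
print(member_0)
```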

  19. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values on hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte-Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high quality DEM for input data uncertainty minimisation and to improve the accuracy of the stream channel topography determination required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated for their accuracy in representing the estimated roughness values. Finally, Latin Hypercube Sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on a binary wet-dry reasoning with the use of the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
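
    The "fit a distribution to the roughness estimates, then LHS-sample it" step can be sketched as follows; the lognormal choice, the example Manning's n values, and the sample size are assumptions for illustration only.

```python
# Sketch: fit a candidate distribution to Manning's n estimates and draw
# stratified (LHS) samples from it via the inverse CDF. Values are illustrative.
import numpy as np
from scipy.stats import qmc, lognorm

n_estimates = np.array([0.030, 0.033, 0.035, 0.038, 0.041, 0.045, 0.050])

# Fit a lognormal (location fixed at zero) to the field/empirical estimates.
shape, loc, scale = lognorm.fit(n_estimates, floc=0)

# Stratified uniform draws mapped through the fitted inverse CDF.
u = qmc.LatinHypercube(d=1, seed=1).random(n=500).ravel()
manning_samples = lognorm.ppf(u, shape, loc=loc, scale=scale)

# Each sampled value would parameterize one hydraulic-model run; the ensemble
# of wet/dry outputs then yields an inundation probability per cell.
print(manning_samples.mean(), manning_samples.std())
```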

  20. A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel

    2016-04-01

    Uncertainty poses a significant challenge in environmental research and the characterisation and quantification of uncertainty has become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet they are inextricably linked. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from 5 different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria which include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such, the relative importance of each aspect of uncertainty can be determined.

  1. A Monte Carlo risk assessment model for acrylamide formation in French fries.

    PubMed

    Cummins, Enda; Butler, Francis; Gormley, Ronan; Brunton, Nigel

    2009-10-01

    The objective of this study is to estimate the likely human exposure to the group 2a carcinogen, acrylamide, from French fries by Irish consumers by developing a quantitative risk assessment model using Monte Carlo simulation techniques. Various stages in the French-fry-making process were modeled from initial potato harvest, storage, and processing procedures. The model was developed in Microsoft Excel with the @Risk add-on package. The model was run for 10,000 iterations using Latin hypercube sampling. The simulated mean acrylamide level in French fries was calculated to be 317 microg/kg. It was found that females are exposed to smaller levels of acrylamide than males (mean exposure of 0.20 microg/kg bw/day and 0.27 microg/kg bw/day, respectively). Although the carcinogenic potency of acrylamide is not well known, the simulated probability of exceeding the average chronic human dietary intake of 1 microg/kg bw/day (as suggested by WHO) was 0.054 and 0.029 for males and females, respectively. A sensitivity analysis highlighted the importance of the selection of appropriate cultivars with known low reducing sugar levels for French fry production. Strict control of cooking conditions (correlation coefficient of 0.42 and 0.35 for frying time and temperature, respectively) and blanching procedures (correlation coefficient -0.25) were also found to be important in ensuring minimal acrylamide formation.
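
    A compact sketch of the general approach (LHS-driven Monte Carlo exposure, an exceedance probability, and a rank-correlation sensitivity screen) is shown below; all distributions and numbers are invented for illustration and do not reproduce the published @Risk model.

```python
# Sketch: probabilistic dietary exposure with LHS inputs, an exceedance
# probability, and a Spearman rank-correlation sensitivity screen.
# Distributions and numbers are illustrative only.
import numpy as np
from scipy.stats import qmc, norm, lognorm, spearmanr

n = 10_000
u = qmc.LatinHypercube(d=3, seed=7).random(n)

acrylamide_ug_kg = lognorm.ppf(u[:, 0], s=0.6, scale=300.0)  # level in fries
intake_g_day     = lognorm.ppf(u[:, 1], s=0.5, scale=40.0)   # consumption
body_weight_kg   = norm.ppf(u[:, 2], loc=75.0, scale=10.0)

exposure = acrylamide_ug_kg * (intake_g_day / 1000.0) / body_weight_kg  # ug/kg bw/day
print(f"P(exposure > 1 ug/kg bw/day) = {np.mean(exposure > 1.0):.3f}")

# Sensitivity screen: rank correlation of each input with the output.
for name, x in [("acrylamide level", acrylamide_ug_kg),
                ("intake", intake_g_day),
                ("body weight", body_weight_kg)]:
    rho, _ = spearmanr(x, exposure)
    print(f"{name:16s} rho = {rho:+.2f}")
```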

  2. Probability-based methodology for buckling investigation of sandwich composite shells with and without cut-outs

    NASA Astrophysics Data System (ADS)

    Alfano, M.; Bisagni, C.

    2017-01-01

    The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material, but have different stacking sequences and geometric dimensions. One of them presents three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends highly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.

  3. The Mark III Hypercube-Ensemble Computers

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Tuazon, Jesus O.; Lieberman, Don; Pniel, Moshe

    1988-01-01

    Mark III Hypercube concept applied in development of series of increasingly powerful computers. Processor of each node of Mark III Hypercube ensemble is specialized computer containing three subprocessors and shared main memory. Solves problem quickly by simultaneously processing part of problem at each such node and passing combined results to host computer. Disciplines benefitting from speed and memory capacity include astrophysics, geophysics, chemistry, weather, high-energy physics, applied mechanics, image processing, oil exploration, aircraft design, and microcircuit design.

  4. A parallel row-based algorithm with error control for standard-cell replacement on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Sargent, Jeff Scott

    1988-01-01

    A new row-based parallel algorithm for standard-cell placement targeted for execution on a hypercube multiprocessor is presented. Key features of this implementation include a dynamic simulated-annealing schedule, row-partitioning of the VLSI chip image, and two novel approaches to controlling error in parallel cell-placement algorithms: Heuristic Cell-Coloring and Adaptive (Parallel Move) Sequence Control. Heuristic Cell-Coloring identifies sets of noninteracting cells that can be moved repeatedly, and in parallel, with no buildup of error in the placement cost. Adaptive Sequence Control allows multiple parallel cell moves to take place between global cell-position updates. This feedback mechanism is based on an error bound derived analytically from the traditional annealing move-acceptance profile. Placement results are presented for real industry circuits, and the performance of an implementation on the Intel iPSC/2 Hypercube is summarized. This algorithm runs 5 to 16 times faster than a previous program developed for the Hypercube, while producing placements of equivalent quality. An integrated place and route program for the Intel iPSC/2 Hypercube is currently being developed.

  5. Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5

    DOE PAGES

    Qian, Yun; Yan, Huiping; Hou, Zhangshuan; ...

    2015-04-10

    We investigate the sensitivity of precipitation characteristics (mean, extreme and diurnal cycle) to a set of uncertain parameters that influence the qualitative and quantitative behavior of the cloud and aerosol processes in the Community Atmosphere Model (CAM5). We adopt both the Latin hypercube and quasi-Monte Carlo sampling approaches to effectively explore the high-dimensional parameter space and then conduct two large sets of simulations. One set consists of 1100 simulations (cloud ensemble) perturbing 22 parameters related to cloud physics and convection, and the other set consists of 256 simulations (aerosol ensemble) focusing on 16 parameters related to aerosols and cloud microphysics. Results show that for the 22 parameters perturbed in the cloud ensemble, the six having the greatest influences on the global mean precipitation are identified, three of which (related to the deep convection scheme) are the primary contributors to the total variance of the phase and amplitude of the precipitation diurnal cycle over land. The extreme precipitation characteristics are sensitive to fewer parameters. The precipitation does not always respond monotonically to parameter change. The influence of individual parameters does not depend on the sampling approaches or concomitant parameters selected. Generally, the GLM is able to explain more of the parametric sensitivity of global precipitation than local or regional features. The total explained variance for precipitation is primarily due to contributions from the individual parameters (75-90% in total). The total variance shows a significant seasonal variability in the mid-latitude continental regions, but is very small in tropical continental regions.
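
    The two sampling strategies mentioned above can be sketched with scipy's quasi-Monte Carlo module; the dimensionality and normalized bounds below are placeholders rather than the actual CAM5 parameter ranges.

```python
# Sketch: Latin hypercube and quasi-Monte Carlo (Sobol) designs over a
# parameter box. Bounds are placeholders, not CAM5 parameter ranges.
import numpy as np
from scipy.stats import qmc

d = 22                                     # e.g. 22 perturbed cloud parameters
lower, upper = np.zeros(d), np.ones(d)     # placeholder normalized bounds

lhs_unit   = qmc.LatinHypercube(d=d, seed=0).random(1100)
sobol_unit = qmc.Sobol(d=d, scramble=True, seed=0).random(256)   # power of two

lhs_design   = qmc.scale(lhs_unit, lower, upper)    # one row per model run
sobol_design = qmc.scale(sobol_unit, lower, upper)

# Centered discrepancy gives a rough measure of how uniformly each design
# fills the unit hypercube (lower is more uniform).
print("LHS discrepancy:  ", qmc.discrepancy(lhs_unit))
print("Sobol discrepancy:", qmc.discrepancy(sobol_unit))
```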

  6. Simulating maize yield and biomass with spatial variability of soil field capacity

    USGS Publications Warehouse

    Ma, Liwang; Ahuja, Lajpat; Trout, Thomas; Nolan, Bernard T.; Malone, Robert W.

    2015-01-01

    Spatial variability in field soil properties is a challenge for system modelers who use single representative values, such as means, for model inputs, rather than their distributions. In this study, the root zone water quality model (RZWQM2) was first calibrated for 4 yr of maize (Zea mays L.) data at six irrigation levels in northern Colorado and then used to study the effect of spatial variability of soil field capacity (FC), estimated in 96 plots, on maize yield and biomass. The best results were obtained when the crop parameters were fitted along with FCs, with a root mean squared error (RMSE) of 354 kg ha–1 for yield and 1202 kg ha–1 for biomass. When running the model using each of the 96 sets of field-estimated FC values, instead of calibrating FCs, the average simulated yield and biomass from the 96 runs were close to measured values with an RMSE of 376 kg ha–1 for yield and 1504 kg ha–1 for biomass. When an average of the 96 FC values for each soil layer was used, simulated yield and biomass were also acceptable with an RMSE of 438 kg ha–1 for yield and 1627 kg ha–1 for biomass. Therefore, when there are large numbers of FC measurements, an average value might be sufficient for model inputs. However, when the ranges of FC measurements were known for each soil layer, a sampled distribution of FCs using Latin hypercube sampling (LHS) might be used for model inputs.

  7. Hyperswitch Network For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.

  8. Multiobjective robust design of the double wishbone suspension system based on particle swarm optimization.

    PubMed

    Cheng, Xianfu; Lin, Yuqun

    2014-01-01

    The performance of the suspension system is one of the most important factors in vehicle design. For the double wishbone suspension system, the conventional deterministic optimization does not consider any deviations of design parameters, so design sensitivity analysis and robust optimization design are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established in the software ADAMS. The sensitivity analysis is utilized to determine the main design variables. Then, the simulation experiment is arranged, and the Latin hypercube design is adopted to find the initial points. The Kriging model is employed for fitting the mean and variance of the quality characteristics according to the simulation results. Further, a particle swarm optimization method based on the simple PSO algorithm is applied, and the tradeoff between the mean and the deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.
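
    The "Latin hypercube design points, then Kriging metamodel" step can be sketched with scikit-learn's Gaussian process regressor; the analytic test function below stands in for the ADAMS suspension simulation and is purely illustrative.

```python
# Sketch: fit a Kriging (Gaussian process) metamodel on a Latin hypercube design.
# The analytic test function stands in for the expensive multibody simulation.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulation(x):
    """Placeholder for the suspension simulation (2 normalized design variables)."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * (x[:, 1] - 0.5) ** 2

X = qmc.LatinHypercube(d=2, seed=3).random(n=40)    # initial design points
y = simulation(X)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# The surrogate returns a mean and a standard deviation, which a robust
# optimizer (e.g. PSO) can trade off instead of re-running the simulation.
mean, std = gp.predict(np.array([[0.4, 0.6]]), return_std=True)
print(mean, std)
```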

  9. Soil organic carbon content assessment in a heterogeneous landscape: comparison of digital soil mapping and visible and near Infrared spectroscopy approaches

    NASA Astrophysics Data System (ADS)

    Michot, Didier; Fouad, Youssef; Pascal, Pichelin; Viaud, Valérie; Soltani, Inès; Walter, Christian

    2017-04-01

    The aims of this study are: i) to assess SOC content distribution in a heterogeneous landscape according to the global soil map (GSM) project recommendations; ii) to compare the prediction performance of digital soil mapping (DSM) and visible-near infrared (Vis-NIR) spectroscopy approaches. The study area of 140 ha, located at Plancoët, surrounds the unique mineral spring water of Brittany (Western France). It is a hillock characterized by a heterogeneous landscape mosaic with different types of forest, permanent pastures and wetlands along a small coastal river. We acquired two independent datasets: j) 50 points selected using a conditioned Latin hypercube sampling (cLHS); jj) 254 points corresponding to the GSM grid. Soil samples were collected in three layers (0-5, 20-25 and 40-50 cm) for both sampling strategies. SOC content was only measured in cLHS soil samples, while Vis-NIR spectra were measured on all the collected samples. For the DSM approach, a machine-learning algorithm (Cubist) was applied on the cLHS calibration data to build rule-based models linking soil carbon content in the different layers with environmental covariates, derived from a digital elevation model, geological variables, land use data and existing large scale soil maps. For the spectroscopy approach, we used two calibration datasets: k) the local cLHS; kk) a subset selected from the regional spectral database of Brittany after a PCA with a hierarchical clustering analysis and spiked with local cLHS spectra. The PLS regression algorithm with "leave-one-out" cross validation was performed for both calibration datasets. SOC contents for the 3 layers of the GSM grid were predicted using the different approaches and were compared with each other. Their prediction performance was evaluated by the following parameters: R2, RMSE and RPD. Both approaches led to satisfactory predictions for SOC content with an advantage for the spectral approach, particularly as regards the pertinence of the variation range.
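
    A minimal sketch of the PLSR-with-leave-one-out step is given below; the random spectra and synthetic SOC values merely stand in for the cLHS calibration data, and the number of latent components is an arbitrary choice.

```python
# Sketch: PLS regression of SOC content on Vis-NIR spectra with leave-one-out
# cross-validation. Random data stand in for the cLHS calibration set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
spectra = rng.random((50, 200))              # 50 cLHS samples x 200 wavelengths
soc = 10 + 5 * spectra[:, 50] + rng.normal(0, 0.5, 50)   # synthetic SOC (g/kg)

pls = PLSRegression(n_components=8)
pred = cross_val_predict(pls, spectra, soc, cv=LeaveOneOut()).ravel()

rmse = np.sqrt(np.mean((pred - soc) ** 2))
rpd = soc.std(ddof=1) / rmse                 # residual prediction deviation
r2 = np.corrcoef(pred, soc)[0, 1] ** 2
print(f"RMSE={rmse:.2f}  RPD={rpd:.2f}  R2={r2:.2f}")
```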

  10. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, the problem size possible on the hypercube, with 128 megabytes of memory in a 32-node configuration, was compared with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.

  11. Optimal cube-connected cube multiprocessors

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Wu, Jie

    1993-01-01

    Many CFD (computational fluid dynamics) and other scientific applications can be partitioned into subproblems. However, in general, the partitioned subproblems are very large. They demand high performance computing power themselves, and the solutions of the subproblems have to be combined at each time step. The cube-connected cube (CCCube) architecture is studied. The CCCube architecture is an extended hypercube structure with each node represented as a cube. It requires fewer physical links between nodes than the hypercube, and provides the same communication support as the hypercube does on many applications. The reduced physical links can be used to enhance the bandwidth of the remaining links and, therefore, enhance the overall performance. The concept and the method to obtain optimal CCCubes, which are the CCCubes with a minimum number of links under a given total number of nodes, are proposed. The superiority of optimal CCCubes over standard hypercubes was also shown in terms of the link usage in the embedding of a binomial tree. A useful computation structure based on a semi-binomial tree for divide-and-conquer parallel algorithms was identified. It was shown that this structure can be implemented in optimal CCCubes without performance degradation compared with regular hypercubes. The result presented should provide a useful approach to the design of scientific parallel computers.

  12. Modeling Soil Organic Carbon Variation Along Climatic and Topographic Trajectories in the Central Andes

    NASA Astrophysics Data System (ADS)

    Gavilan, C.; Grunwald, S.; Quiroz, R.; Zhu, L.

    2015-12-01

    The Andes represent the largest and highest mountain range in the tropics. Geological and climatic differentiation favored landscape and soil diversity, resulting in ecosystems adapted to very different climatic patterns. Although several studies support the fact that the Andes are a vast sink of soil organic carbon (SOC), only a few have quantified this variable in situ. Estimating the spatial distribution of SOC stocks in data-poor and/or poorly accessible areas, like the Andean region, is challenging due to the lack of recent soil data at high spatial resolution and the wide range of coexistent ecosystems. Thus, the sampling strategy is vital in order to ensure the whole range of environmental covariates (EC) controlling SOC dynamics is represented. This approach captures the variability of the area, which leads to more efficient statistical estimates and improves the modeling process. The objectives of this study were to i) characterize and model the spatial distribution of SOC stocks in the Central Andean region using soil-landscape modeling techniques, and to ii) validate and evaluate the model for predicting SOC content in the area. For that purpose, three representative study areas were identified and a suite of variables including elevation, mean annual temperature, annual precipitation and Normalized Difference Vegetation Index (NDVI), among others, was selected as EC. A stratified random sampling design (namely, conditioned Latin hypercube sampling) was implemented and a total of 400 sampling locations were identified. At all sites, four composite topsoil samples (0-30 cm) were collected within a 2 m radius. SOC content was measured using dry combustion and SOC stocks were estimated using bulk density measurements. Regression Kriging was used to map the spatial variation of SOC stocks. The accuracy, fit and bias of SOC models were assessed using a rigorous validation assessment. This study produced the first comprehensive, geospatial SOC stock assessment in this undersampled region that serves as a baseline reference to assess potential impacts of climate and land use change.

  13. Retrospective and current risks of mercury to panthers in the Florida Everglades.

    PubMed

    Barron, Mace G; Duvall, Stephanie E; Barron, Kyle J

    2004-04-01

    Florida panthers are an endangered species inhabiting south Florida. Hg has been suggested as a causative factor for low populations and some reported panther deaths, but a quantitative assessment of risks has never been performed. This study quantitatively evaluated retrospective (pre-1992) and current (2002) risks of chronic dietary Hg exposures to panthers in the Florida Everglades. A probabilistic assessment of Hg risks was performed using a dietary exposure model and Latin Hypercube sampling that incorporated the variability and uncertainty in ingestion rate, diet, body weight, and mercury exposure of panthers. Hazard quotients (HQs) for retrospective risks ranged from less than 0.1 to 20, with a 46% probability of exceeding chronic dietary thresholds for methylmercury. Retrospective risks of developing clinical symptoms, including ataxia and convulsions, had an HQ range of <0.1-5.4 with a 17% probability of exceeding an HQ of 1. Current risks were substantially lower (4% probability of exceedances; HQ range <0.1-3.5) because of an estimated 70-90% decline in Hg exposure to panthers over the last decade. Under worst case conditions of panthers consuming only raccoons from the most contaminated area of the Everglades, the current risk of developing clinical symptoms that may lead to death was 4.6%. The current risk of mercury poisoning for panthers with a diversified diet was 0.1% (HQ range of <0.1-1.4). The results of this assessment indicate that past Hg exposures likely adversely affected panthers in the Everglades, but current risks of Hg are low.
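
    The probabilistic hazard-quotient calculation can be sketched as below; the exposure distributions and the toxicity reference value are invented for illustration and are not the panther-specific values used in the study.

```python
# Sketch: probabilistic hazard quotient (HQ) for a dietary Hg exposure model.
# Distributions and the toxicity reference value are illustrative only.
import numpy as np
from scipy.stats import qmc, lognorm, norm

n = 10_000
u = qmc.LatinHypercube(d=3, seed=11).random(n)

prey_hg_mg_kg = lognorm.ppf(u[:, 0], s=0.9, scale=0.3)               # Hg in prey
intake_kg_day = norm.ppf(u[:, 1], loc=0.9, scale=0.2).clip(min=0.1)  # prey eaten
body_mass_kg  = norm.ppf(u[:, 2], loc=50.0, scale=8.0)

dose = prey_hg_mg_kg * intake_kg_day / body_mass_kg   # mg Hg / kg bw / day
trv = 0.02                                            # illustrative chronic threshold
hq = dose / trv

print(f"HQ range: {hq.min():.2f} - {hq.max():.2f}")
print(f"P(HQ > 1) = {np.mean(hq > 1.0):.3f}")
```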

  14. The simulation of a two-dimensional (2D) transport problem in a rectangular region with Lattice Boltzmann method with two-relaxation-time

    NASA Astrophysics Data System (ADS)

    Sugiyanto, S.; Hardyanto, W.; Marwoto, P.

    2018-03-01

    Transport phenomena are found in many problems across engineering and industrial sectors. We analyzed a Lattice Boltzmann method with Two-Relaxation Time (LTRT) collision operators for the simulation of a pollutant moving through a medium as a two-dimensional (2D) transport problem in a rectangular region model. This model consists of a 2D rectangular region of length 54 (x) and width 27 (y) with an isotropic homogeneous medium. Initially, the concentration is zero and is distributed evenly throughout the region of interest. A concentration of 1 is maintained at 9 < y < 18, whereas a concentration of zero is maintained at 0 < y < 9 and 18 < y < 27. A specific discharge (Darcy velocity) of 1.006 is assumed. A diffusion coefficient of 0.8333 is distributed uniformly with a uniform porosity of 0.35. A computer program is written in MATLAB to compute the concentration of the pollutant at any specified place and time. The program shows that the LTRT solution with quadratic equilibrium distribution functions (EDFs) and relaxation time τa = 1.0 is in good agreement with other numerical solution methods such as 3DLEWASTE (Hybrid Three-dimensional Lagrangian-Eulerian Finite Element Model of Waste Transport Through Saturated-Unsaturated Media) obtained by Yeh and 3DFEMWATER-LHS (Three-dimensional Finite Element Model of Water Flow Through Saturated-Unsaturated Media with Latin Hypercube Sampling) obtained by Hardyanto.

  15. Development of a decision support tool for seasonal water supply management incorporating system uncertainties and operational constraints

    NASA Astrophysics Data System (ADS)

    Wang, H.; Asefa, T.

    2017-12-01

    A real-time decision support tool (DST) for a water supply system would consider system uncertainties, e.g., uncertain streamflow and demand, as well as operational constraints and infrastructure outage (e.g., pump station shutdown, an offline reservoir due to maintenance). Such a DST is often used by water managers for resource allocation and delivery to customers. Although most seasonal DSTs used by water managers recognize these system uncertainties and operational constraints, most use only historical information or assume a deterministic outlook of the water supply system. This study presents a seasonal DST that incorporates rainfall/streamflow uncertainties, seasonal demand outlook and system operational constraints. Large-scale climate information is captured through a rainfall simulator driven by a Bayesian non-homogeneous Markov Chain Monte Carlo model that allows non-stationary transition probabilities contingent on the Nino 3.4 index. An ad-hoc seasonal demand forecasting model considers weather conditions explicitly and socio-economic factors implicitly. Latin Hypercube sampling is employed to effectively sample probability density functions of flow and demand. Seasonal system operation is modelled as a mixed-integer optimization problem that aims at minimizing operational costs. It embeds the flexibility of modifying operational rules at different components, e.g., surface water treatment plants, desalination facilities, and groundwater pumping stations. The proposed framework is illustrated at a wholesale water supplier in the Southeastern United States, Tampa Bay Water. The use of the tool is demonstrated in providing operational guidance in a typical drawdown and refill cycle of a regional reservoir. The DST provided: 1) a probabilistic outlook of reservoir storage and the chance of a successful refill by the end of the rainy season; 2) operational expectations for large infrastructures (e.g., high service pumps and booster stations) throughout the season. Other potential uses of such a DST are also discussed.

  16. Estimation of ice activation parameters within a particle tracking Lagrangian cloud model using the ensemble Kalman filter to match ISDAC golden case observations

    NASA Astrophysics Data System (ADS)

    Reisner, J. M.; Dubey, M. K.

    2010-12-01

    To both quantify and reduce uncertainty in ice activation parameterizations for stratus clouds occurring in the temperature range between -5 and -10 °C, ensemble simulations of an ISDAC golden case have been conducted. To formulate the ensemble, three parameters found within an ice activation model have been sampled using a Latin hypercube technique over a parameter range that induces large variability in both number and mass of ice. The ice activation model is contained within a Lagrangian cloud model that simulates particle number as a function of radius for cloud ice, snow, graupel, cloud, and rain particles. A unique aspect of this model is that it produces very low levels of numerical diffusion that enable the model to accurately resolve the sharp cloud edges associated with the ISDAC stratus deck. Another important aspect of the model is that near the cloud edges the number of particles can be significantly increased to reduce sampling errors and accurately resolve physical processes such as collision-coalescence that occur in this region. Thus, given these relatively low numerical errors, as compared to traditional bin models, the sensitivity of a stratus deck to changes in parameters found within the activation model can be examined without fear of numerical contamination. Likewise, once the ensemble has been completed, ISDAC observations can be incorporated into a Kalman filter to optimally estimate the ice activation parameters and reduce overall model uncertainty. Hence, this work will highlight the ability of an ensemble Kalman filter system coupled to a highly accurate numerical model to estimate important parameters found within microphysical parameterizations containing high uncertainty.

  17. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    PubMed Central

    An, Yongkai; Lu, Wenxi; Cheng, Weiguo

    2015-01-01

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with the western Jilin Province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu County and Qian Gorlos County to supply water to Daan County. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region for input variables. A surrogate model of the numerical simulation model of groundwater flow was developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, which indicates a high approximation accuracy. A comparison between the surrogate-based simulation optimization model and the conventional simulation optimization model for solving the same optimization problem shows that the former needs only 5.5 hours, whereas the latter needs 25 days. The above results indicate that the surrogate model developed in this study could not only considerably reduce the computational burden of the simulation optimization process, but also maintain high computational accuracy. This can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately. PMID:26264008

  18. Determining Reduced Order Models for Optimal Stochastic Reduced Order Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonney, Matthew S.; Brake, Matthew R.W.

    2015-08-01

    The use of parameterized reduced order models (PROMs) within the stochastic reduced order model (SROM) framework is a logical progression for both methods. In this report, five different parameterized reduced order models are selected and critiqued against each other and against the truth model for the example of the Brake-Reuss beam. The models are: a Taylor series using finite difference, a proper orthogonal decomposition of the output, a Craig-Bampton representation of the model, a method that uses Hyper-Dual numbers to determine the sensitivities, and a Meta-Model method that uses the Hyper-Dual results and constructs a polynomial curve to better represent the output data. The methods are compared against a parameter sweep and a distribution propagation where the first four statistical moments are used as a comparison. Each method produces very accurate results, with the Craig-Bampton reduction being the least accurate. The models are also compared based on time requirements for the evaluation of each model, where the Meta-Model requires the least computation time by a significant margin. Each of the five models provided accurate results in a reasonable time frame. The determination of which model to use is dependent on the availability of the high-fidelity model and how many evaluations can be performed. The output distribution is examined by using a large Monte Carlo simulation along with reduced simulations using Latin hypercube sampling and the stochastic reduced order model sampling technique. Both techniques produced accurate results. The stochastic reduced order modeling technique produced less error when compared to an exhaustive sampling for the majority of methods.

  19. Estimating the Non-Monetary Burden of Neurocysticercosis in Mexico

    PubMed Central

    Bhattarai, Rachana; Budke, Christine M.; Carabin, Hélène; Proaño, Jefferson V.; Flores-Rivera, Jose; Corona, Teresa; Ivanek, Renata; Snowden, Karen F.; Flisser, Ana

    2012-01-01

    Background Neurocysticercosis (NCC) is a major public health problem in many developing countries where health education, sanitation, and meat inspection infrastructure are insufficient. The condition occurs when humans ingest eggs of the pork tapeworm Taenia solium, which then develop into larvae in the central nervous system. Although NCC is endemic in many areas of the world and is associated with considerable socio-economic losses, the burden of NCC remains largely unknown. This study provides the first estimate of disability adjusted life years (DALYs) associated with NCC in Mexico. Methods DALYs lost for symptomatic cases of NCC in Mexico were estimated by incorporating morbidity and mortality due to NCC-associated epilepsy, and morbidity due to NCC-associated severe chronic headaches. Latin hypercube sampling methods were employed to sample the distributions of uncertain parameters and to estimate 95% credible regions (95% CRs). Findings In Mexico, 144,433 and 98,520 individuals are estimated to suffer from NCC-associated epilepsy and NCC-associated severe chronic headaches, respectively. A total of 25,341 (95% CR: 12,569–46,640) DALYs were estimated to be lost due to these clinical manifestations, with 0.25 (95% CR: 0.12–0.46) DALY lost per 1,000 person-years of which 90% was due to NCC-associated epilepsy. Conclusion This is the first estimate of DALYs associated with NCC in Mexico. However, this value is likely to be underestimated since only the clinical manifestations of epilepsy and severe chronic headaches were included. In addition, due to limited country specific data, some parameters used in the analysis were based on systematic reviews of the literature or primary research from other geographic locations. Even with these limitations, our estimates suggest that healthy years of life are being lost due to NCC in Mexico. PMID:22363827

  20. High-speed mid-infrared hyperspectral imaging using quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Kelley, David B.; Goyal, Anish K.; Zhu, Ninghui; Wood, Derek A.; Myers, Travis R.; Kotidis, Petros; Murphy, Cara; Georgan, Chelsea; Raz, Gil; Maulini, Richard; Müller, Antoine

    2017-05-01

    We report on a standoff chemical detection system using widely tunable external-cavity quantum cascade lasers (ECQCLs) to illuminate target surfaces in the mid infrared (λ = 7.4 - 10.5 μm). Hyperspectral images (hypercubes) are acquired by synchronously operating the EC-QCLs with a LN2-cooled HgCdTe camera. The use of rapidly tunable lasers and a high-frame-rate camera enables the capture of hypercubes with 128 x 128 pixels and >100 wavelengths in <0.1 s. Furthermore, raster scanning of the laser illumination allowed imaging of a 100-cm2 area at 5-m standoff. Raw hypercubes are post-processed to generate a hypercube that represents the surface reflectance relative to that of a diffuse reflectance standard. Results will be shown for liquids (e.g., silicone oil) and solid particles (e.g., caffeine, acetaminophen) on a variety of surfaces (e.g., aluminum, plastic, glass). Signature spectra are obtained for particulate loadings of RDX on glass of <1 μg/cm2.

  1. Performance of a parallel code for the Euler equations on hypercube computers

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Chan, Tony F.; Jesperson, Dennis C.; Tuminaro, Raymond S.

    1990-01-01

    The performance of hypercubes was evaluated on a computational fluid dynamics problem, and the parallel-environment issues that must be addressed were considered, such as algorithm changes, implementation choices, programming effort, and programming environment. The evaluation focuses on a widely used fluid dynamics code, FLO52, which solves the two dimensional steady Euler equations describing flow around an airfoil. The code development experience is described, including interacting with the operating system, utilizing the message-passing communication system, and code modifications necessary to increase parallel efficiency. Results from two hypercube parallel computers (a 16-node iPSC/2, and a 512-node NCUBE/ten) are discussed and compared. In addition, a mathematical model of the execution time was developed as a function of several machine and algorithm parameters. This model accurately predicts the actual run times obtained and is used to explore the performance of the code in interesting yet physically realizable regions of the parameter space. Based on this model, predictions about future hypercubes are made.

  2. Experimental demonstration of the optical multi-mesh hypercube: scaleable interconnection network for multiprocessors and multicomputers.

    PubMed

    Louri, A; Furlonge, S; Neocleous, C

    1996-12-10

    A prototype of a novel topology for scaleable optical interconnection networks called the optical multi-mesh hypercube (OMMH) is experimentally demonstrated at data rates as high as 150 Mbit/s (2^7 - 1 non-return-to-zero pseudo-random data pattern) at a bit error rate of 10^-13 per link by the use of commercially available devices. OMMH is a scaleable network [Appl. Opt. 33, 7558 (1994); J. Lightwave Technol. 12, 704 (1994)] architecture that combines the positive features of the hypercube (small diameter, connectivity, symmetry, simple routing, and fault tolerance) and the mesh (constant node degree and size scaleability). The optical implementation method is divided into two levels: high-density local connections for the hypercube modules, and high-bit-rate, low-density, long connections for the mesh links connecting the hypercube modules. Free-space imaging systems utilizing vertical-cavity surface-emitting laser (VCSEL) arrays, lenslet arrays, space-invariant holographic techniques, and photodiode arrays are demonstrated for the local connections. Optobus fiber interconnects from Motorola are used for the long-distance connections. The OMMH was optimized to operate at the data rate of Motorola's Optobus (10-bit-wide, VCSEL-based bidirectional data interconnects at 150 Mbits/s). Difficulties encountered included the varying fan-out efficiencies of the different orders of the hologram, misalignment sensitivity of the free-space links, low power (1 mW) of the individual VCSELs, and noise.

  3. Requirements analysis for a hardware, discrete-event, simulation engine accelerator

    NASA Astrophysics Data System (ADS)

    Taylor, Paul J., Jr.

    1991-12-01

    An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description Language (VHDL), for a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.

  4. Regional prediction of soil organic carbon content over temperate croplands using visible near-infrared airborne hyperspectral imagery and synchronous field spectra

    NASA Astrophysics Data System (ADS)

    Vaudour, E.; Gilliot, J. M.; Bel, L.; Lefevre, J.; Chehdi, K.

    2016-07-01

    This study aimed at identifying the potential of Vis-NIR airborne hyperspectral AISA-Eagle data for predicting the topsoil organic carbon (SOC) content of bare cultivated soils over a large peri-urban area (221 km2) with both contrasted soils and SOC contents, located in the western region of Paris, France. Soil types comprised haplic luvisols, calcaric cambisols and colluvic cambisols. Airborne AISA-Eagle data (400-1000 nm, 126 bands) with 1 m-resolution were acquired on 17 April 2013 over 13 tracks. Tracks were atmospherically corrected and then mosaicked at a 2 m-resolution using a set of 24 synchronous field spectra of bare soils, black and white targets and impervious surfaces. The land use identification system layer (RPG) of 2012 was used to mask non-agricultural areas; then, calculation and thresholding of NDVI from an atmospherically corrected SPOT image acquired the same day enabled mapping of agricultural fields with bare soil. A total of 101 sites sampled either in 2013 or in the 3 previous years and in 2015 were identified as bare by means of this map. Predictions were made from the mosaic AISA spectra which were related to topsoil SOC contents by means of partial least squares regression (PLSR). Regression robustness was evaluated through a series of 1000 bootstrap data sets of calibration-validation samples, considering 74 sites outside cloud shadows only, and different sampling strategies for selecting calibration samples. Validation root-mean-square errors (RMSE) ranged between 3.73 and 4.49 g kg-1, with a median of ∼4 g kg-1. The best-performing models in terms of coefficient of determination (R2) and Residual Prediction Deviation (RPD) values were the calibration models derived either from Kennard-Stone or conditioned Latin Hypercube sampling on smoothed spectra. The most generalizable model, leading to the lowest RMSE values of 3.73 g kg-1 at the regional scale and 1.44 g kg-1 at the within-field scale with low bias, was the leave-one-out cross-validated PLSR model constructed with the 28 near-synchronous samples and raw spectra.

  5. Concurrent Image Processing Executive (CIPE)

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1988-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.

  6. Multiphase complete exchange on Paragon, SP2 and CS-2

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1995-01-01

    The overhead of interprocessor communication is a major factor in limiting the performance of parallel computer systems. The complete exchange is the severest communication pattern in that it requires each processor to send a distinct message to every other processor. This pattern is at the heart of many important parallel applications. On hypercubes, multiphase complete exchange has been developed and shown to provide optimal performance over varying message sizes. Most commercial multicomputer systems do not have a hypercube interconnect. However, they use special purpose hardware and dedicated communication processors to achieve very high performance communication and can be made to emulate the hypercube quite well. Multiphase complete exchange has been implemented on three contemporary parallel architectures: the Intel Paragon, IBM SP2 and Meiko CS-2. The essential features of these machines are described and their basic interprocessor communication overheads are discussed. The performance of multiphase complete exchange is evaluated on each machine. It is shown that the theoretical ideas developed for hypercubes are also applicable in practice to these machines and that multiphase complete exchange can lead to major savings in execution time over traditional solutions.
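
    For reference, the complete-exchange pattern itself (each node sends a distinct message to every other node) can be scheduled as 2^d - 1 pairwise-exchange steps with the partner given by an XOR of node numbers. The sketch below only illustrates this standard direct schedule; the multiphase algorithm of the abstract groups these exchanges differently to trade message count against message size.

```python
# Sketch: direct complete-exchange schedule for a d-dimensional hypercube of
# 2**d nodes. In step k, every node exchanges with the node whose number
# differs by the XOR mask k. This illustrates the communication pattern only,
# not the multiphase algorithm itself.
def complete_exchange_schedule(d):
    p = 1 << d                      # number of nodes
    schedule = []
    for k in range(1, p):           # one step per nonzero XOR mask
        pairs = [(rank, rank ^ k) for rank in range(p) if rank < (rank ^ k)]
        schedule.append(pairs)
    return schedule

for step, pairs in enumerate(complete_exchange_schedule(3), start=1):
    print(f"step {step}: exchange pairs {pairs}")
```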

  7. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation, the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.

  8. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation, the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.

  9. Modeling uncertainty and correlation in soil properties using Restricted Pairing and implications for ensemble-based hillslope-scale soil moisture and temperature estimation

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Entekhabi, D.; Bras, R. L.

    2007-12-01

    Soil hydraulic and thermal properties (SHTPs) affect both the rate of moisture redistribution in the soil column and the volumetric soil water capacity. Adequately constraining these properties through field and lab analysis to parameterize spatially-distributed hydrology models is often prohibitively expensive. Because SHTPs vary significantly at small spatial scales, individual soil samples are also only reliably indicative of local conditions, and these properties remain a significant source of uncertainty in soil moisture and temperature estimation. In ensemble-based soil moisture data assimilation, uncertainty in the model-produced prior estimate due to associated uncertainty in SHTPs must be taken into account to avoid under-dispersive ensembles. To treat SHTP uncertainty for purposes of supplying inputs to a distributed watershed model, we use the restricted pairing (RP) algorithm, an extension of Latin Hypercube (LH) sampling. The RP algorithm generates an arbitrary number of SHTP combinations by sampling the appropriate marginal distributions of the individual soil properties using the LH approach, while imposing a target rank correlation among the properties. A previously-published meta-database of 1309 soils representing 12 textural classes is used to fit appropriate marginal distributions to the properties and compute the target rank correlation structure, conditioned on soil texture. Given categorical soil textures, our implementation of the RP algorithm generates an arbitrarily-sized ensemble of realizations of the SHTPs required as input to the TIN-based Realtime Integrated Basin Simulator with vegetation dynamics (tRIBS+VEGGIE) distributed parameter ecohydrology model. Soil moisture ensembles simulated with RP-generated SHTPs exhibit less variance than ensembles simulated with SHTPs generated by a scheme that neglects correlation among properties. Neglecting correlation among SHTPs can lead to physically unrealistic combinations of parameters that exhibit implausible hydrologic behavior when input to the tRIBS+VEGGIE model.
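
    The restricted-pairing idea, drawing each marginal by LHS and then reordering the columns so their ranks follow a target correlation, can be sketched in the Iman-Conover style shown below; the two properties, their marginals, and the target rank correlation are placeholders, not values from the soils meta-database.

```python
# Sketch: impose a target rank correlation on independently drawn LHS marginals
# by reordering each column (an Iman-Conover-style restricted pairing).
# Marginals and the target correlation are illustrative placeholders.
import numpy as np
from scipy.stats import qmc, lognorm, norm, spearmanr

n = 500
target_corr = np.array([[1.0, 0.7],
                        [0.7, 1.0]])        # e.g. Ksat vs. porosity

# 1. LHS draws from each marginal independently.
u = qmc.LatinHypercube(d=2, seed=5).random(n)
ksat     = lognorm.ppf(u[:, 0], s=1.0, scale=1e-5)      # placeholder marginal
porosity = norm.ppf(u[:, 1], loc=0.45, scale=0.05)      # placeholder marginal

# 2. Correlated "scores" carrying the target dependence structure.
scores = np.random.default_rng(5).multivariate_normal(np.zeros(2), target_corr, size=n)

# 3. Reorder each sampled column so its ranks match the ranks of the scores.
samples = np.column_stack([ksat, porosity])
paired = np.empty_like(samples)
for j in range(samples.shape[1]):
    ranks = np.argsort(np.argsort(scores[:, j]))        # rank of each score
    paired[:, j] = np.sort(samples[:, j])[ranks]

rho, _ = spearmanr(paired[:, 0], paired[:, 1])
print(f"achieved rank correlation: {rho:.2f}")
```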

  10. Parallel spatial direct numerical simulations on the Intel iPSC/860 hypercube

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Zubair, Mohammad

    1993-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube is documented. The direct numerical simulation approach is used to compute spatially evolving disturbances associated with the laminar-to-turbulent transition in boundary-layer flows. The feasibility of using the PSDNS on the hypercube to perform transition studies is examined. The results indicate that the direct numerical simulation approach can effectively be parallelized on a distributed-memory parallel machine. By increasing the number of processors, nearly ideal linear speedups are achieved with nonoptimized routines; slower-than-linear speedups are achieved with optimized (machine-dependent library) routines. This slower-than-linear speedup results because the Fast Fourier Transform (FFT) routine dominates the computational cost and because that routine itself achieves less-than-ideal speedups. However, with the machine-dependent routines, the total computational cost decreases by a factor of 4 to 5 compared with standard FORTRAN routines. The computational cost increases linearly with spanwise, wall-normal, and streamwise grid refinements. The hypercube with 32 processors was estimated to require approximately twice the amount of Cray supercomputer single processor time to complete a comparable simulation; however, it is estimated that a subgrid-scale model which reduces the required number of grid points and becomes a large-eddy simulation (PSLES) would reduce the computational cost and memory requirements by a factor of 10 over the PSDNS. This PSLES implementation would enable transition simulations on the hypercube at a reasonable computational cost.

  11. Parallel computation for biological sequence comparison: comparing a portable model to the native model for the Intel Hypercube.

    PubMed

    Nadkarni, P M; Miller, P L

    1991-01-01

    A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations.

  12. System health monitoring using multiple-model adaptive estimation techniques

    NASA Astrophysics Data System (ADS)

    Sifford, Stanley Ryan

    Monitoring system health for fault detection and diagnosis by tracking system parameters concurrently with state estimates is approached using a new multiple-model adaptive estimation (MMAE) method. This novel method is called GRid-based Adaptive Parameter Estimation (GRAPE). GRAPE expands existing MMAE methods by using new techniques to sample the parameter space. GRAPE expands on MMAE with the hypothesis that sample models can be applied and resampled without relying on a predefined set of models. GRAPE is initially implemented in a linear framework using Kalman filter models. A more generalized GRAPE formulation is presented using extended Kalman filter (EKF) models to represent nonlinear systems. GRAPE can handle both time invariant and time varying systems as it is designed to track parameter changes. Two techniques are presented to generate parameter samples for the parallel filter models. The first approach is called selected grid-based stratification (SGBS). SGBS divides the parameter space into equally spaced strata. The second approach uses Latin Hypercube Sampling (LHS) to determine the parameter locations and minimize the total number of required models. LHS is particularly useful when the parameter dimensions grow. Adding more parameters does not require the model count to increase for LHS. Each resample is independent of the prior sample set other than the location of the parameter estimate. SGBS and LHS can be used for both the initial sample and subsequent resamples. Furthermore, resamples are not required to use the same technique. Both techniques are demonstrated for both linear and nonlinear frameworks. The GRAPE framework further formalizes the parameter tracking process through a general approach for nonlinear systems. These additional methods allow GRAPE to either narrow the focus to converged values within a parameter range or expand the range in the appropriate direction to track the parameters outside the current parameter range boundary. Customizable rules define the specific resample behavior when the GRAPE parameter estimates converge. Convergence itself is determined from the derivatives of the parameter estimates using a simple moving average window to filter out noise. The system can be tuned to match the desired performance goals by making adjustments to parameters such as the sample size, convergence criteria, resample criteria, initial sampling method, resampling method, confidence in prior sample covariances, sample delay, and others.
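
    As a rough illustration of the LHS-based parameter sampling described above, the sketch below draws parameter locations for a bank of parallel filter models from a Latin hypercube over the current parameter range; the parameter names and bounds are hypothetical, and the construction of the individual Kalman filters is reduced to a stub.

        import numpy as np
        from scipy.stats import qmc

        def sample_filter_parameters(bounds, n_models, seed=0):
            """Latin hypercube sample of parameter locations for a bank of parallel filter models.

            bounds: dict mapping parameter name -> (low, high). Adding another parameter
            does not force the number of models to grow, the property noted above for LHS.
            """
            names = list(bounds)
            lo = np.array([bounds[k][0] for k in names])
            hi = np.array([bounds[k][1] for k in names])
            pts = qmc.scale(qmc.LatinHypercube(d=len(names), seed=seed).random(n_models), lo, hi)
            return [dict(zip(names, row)) for row in pts]

        # Hypothetical parameters tracked alongside the state estimate
        models = sample_filter_parameters({"damping": (0.1, 2.0), "stiffness": (50.0, 200.0)}, n_models=8)
        for m in models:
            print(m)  # each parameter set would parameterize one (extended) Kalman filter in the bank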

  13. Developing a Screening Algorithm for Type II Diabetes Mellitus in the Resource-Limited Setting of Rural Tanzania.

    PubMed

    West, Caroline; Ploth, David; Fonner, Virginia; Mbwambo, Jessie; Fredrick, Francis; Sweat, Michael

    2016-04-01

    Noncommunicable diseases are on pace to outnumber infectious diseases as the leading cause of death in sub-Saharan Africa, yet many questions remain unanswered concerning effective methods of screening for type II diabetes mellitus (DM) in this resource-limited setting. We aim to design a screening algorithm for type II DM that optimizes the sensitivity and specificity of identifying individuals with undiagnosed DM, as well as affordability to health systems and individuals. Baseline demographic and clinical data, including hemoglobin A1c (HbA1c), were collected from 713 participants using probability sampling of the general population. We used these data, along with model parameters obtained from the literature, to mathematically model 8 proposed DM screening algorithms, optimizing the sensitivity and specificity using Monte Carlo and Latin Hypercube simulation. An algorithm that combines risk assessment and measurement of fasting blood glucose was found to be superior for the most resource-limited settings (sensitivity 68%, specificity 99%, and a cost of $2.94 per patient with DM identified). Incorporating HbA1c testing improves the sensitivity to 75.62% but raises the cost per DM case identified to $6.04. The preferred algorithms are heavily biased toward diagnosing those with more severe cases of DM. Using basic risk assessment tools and fasting blood sugar testing in lieu of HbA1c testing in resource-limited settings could allow for significantly more feasible DM screening programs with reasonable sensitivity and specificity. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  14. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    NASA Astrophysics Data System (ADS)

    Bennett, Katrina E.; Urrego Blanco, Jorge R.; Jonko, Alexandra; Bohn, Theodore J.; Atchley, Adam L.; Urban, Nathan M.; Middleton, Richard S.

    2018-01-01

    The Colorado River Basin is a fundamentally important river for society, ecology, and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent, and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. We combine global sensitivity analysis with a space-filling Latin Hypercube Sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach. We find that snow-dominated regions are much more sensitive to uncertainties in VIC parameters. Although baseflow and runoff changes respond to parameters used in previous sensitivity studies, we discover new key parameter sensitivities. For instance, changes in runoff and evapotranspiration are sensitive to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI) in the VIC model. It is critical for improved modeling to narrow uncertainty in these parameters through improved observations and field studies. This is important because LAI and albedo are anticipated to change under future climate and narrowing uncertainty is paramount to advance our application of models such as VIC for water resource management.

  15. Sensitivity Analysis of the Sheet Metal Stamping Processes Based on Inverse Finite Element Modeling and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Yu, Maolin; Du, R.

    2005-08-01

    Sheet metal stamping is one of the most commonly used manufacturing processes, and hence, much research has been carried out for economic gain. Searching through the literature, however, it is found that many problems remain unsolved. For example, it is well known that for the same press, the same workpiece material, and the same set of dies, the product quality may vary owing to a number of factors, such as the inhomogeneity of the workpiece material, loading error, and lubrication. At present, few methods can predict the quality variation, let alone identify what contributes to it. As a result, trial-and-error is still needed on the shop floor, causing additional cost and time delay. This paper introduces a new approach to predict the product quality variation and identify the sensitive design/process parameters. The new approach is based on a combination of inverse Finite Element Modeling (FEM) and Monte Carlo simulation (more specifically, the Latin Hypercube Sampling (LHS) approach). With acceptable accuracy, the inverse FEM (also called one-step FEM) requires much less computation than the usual incremental FEM and hence can be used to predict the quality variations under various conditions. LHS is a statistical method through which the sensitivity analysis can be carried out. The result of the sensitivity analysis has clear physical meaning and can be used to optimize the die design and/or the process design. Two simulation examples are presented, including drawing a rectangular box and drawing a two-step rectangular box.
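
    The LHS-based sensitivity analysis described in this record can be sketched generically: sample the uncertain design/process factors with a Latin hypercube, evaluate the process model for each sample, and rank the factors by their correlation with the output. The factor names, ranges, and the toy response function standing in for the inverse-FEM prediction are illustrative assumptions.

        import numpy as np
        from scipy.stats import qmc, spearmanr

        # Hypothetical uncertain factors: blank thickness [mm], friction coefficient [-], holder force [kN]
        lower = np.array([0.8, 0.05, 50.0])
        upper = np.array([1.2, 0.20, 150.0])
        x = qmc.scale(qmc.LatinHypercube(d=3, seed=0).random(200), lower, upper)

        def max_thinning(sample):
            """Placeholder stand-in for the inverse-FEM quality prediction."""
            t, mu, f = sample
            return 0.25 - 0.12 * t + 0.8 * mu + 0.0006 * f

        y = np.apply_along_axis(max_thinning, 1, x)

        # Rank (Spearman) correlation of each factor with the response as a simple sensitivity measure
        for name, col in zip(["thickness", "friction", "holder force"], x.T):
            rho, _ = spearmanr(col, y)
            print(f"{name:>13s}: rho = {rho:+.2f}")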

  16. An Elitist Multiobjective Tabu Search for Optimal Design of Groundwater Remediation Systems.

    PubMed

    Yang, Yun; Wu, Jianfeng; Wang, Jinguo; Zhou, Zhifang

    2017-11-01

    This study presents a new multiobjective evolutionary algorithm (MOEA), the elitist multiobjective tabu search (EMOTS), and incorporates it with MODFLOW/MT3DMS to develop a groundwater simulation-optimization (SO) framework, based on modular design, for the optimal design of groundwater remediation systems using the pump-and-treat (PAT) technique. The most notable improvements of EMOTS over the original multiple objective tabu search (MOTS) lie in the elitist strategy, the selection strategy, and the neighborhood move rule. The elitist strategy maintains all nondominated solutions throughout the later search process for better convergence to the true Pareto front. The elitism-based selection operator is modified to choose the two most remote solutions from the current candidate list as seed solutions, increasing the diversity of the search space. Moreover, neighborhood solutions are uniformly generated using Latin hypercube sampling (LHS) in the bounded neighborhood space around each seed solution. To demonstrate the performance of the EMOTS, we consider a synthetic groundwater remediation example. The problem formulation consists of two objective functions with continuous decision variables (pumping rates) while meeting water quality requirements. In particular, a sensitivity analysis is carried out on the synthetic case to determine the optimal combination of the heuristic parameters. Furthermore, the EMOTS is successfully applied to evaluate remediation options at the field site of the Massachusetts Military Reservation (MMR) in Cape Cod, Massachusetts. For both the hypothetical and the large-scale field remediation sites, the EMOTS-based SO framework is demonstrated to outperform the original MOTS in the performance metrics of optimality and diversity of nondominated frontiers, with desirable stability and robustness. © 2017, National Ground Water Association.
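
    The neighborhood move rule above (uniform LHS generation of neighbor solutions in a bounded region around each seed solution) can be sketched as follows; the decision-variable bounds, neighborhood radius, and sample count are illustrative assumptions rather than the settings used in the study.

        import numpy as np
        from scipy.stats import qmc

        def lhs_neighborhood(seed_solution, radius, lower, upper, n_neighbors, rng_seed=0):
            """Generate neighbor solutions with a Latin hypercube in a box around a seed solution."""
            seed_solution = np.asarray(seed_solution, dtype=float)
            lo = np.maximum(seed_solution - radius, lower)  # clip the neighborhood to the feasible box
            hi = np.minimum(seed_solution + radius, upper)
            unit = qmc.LatinHypercube(d=seed_solution.size, seed=rng_seed).random(n_neighbors)
            return qmc.scale(unit, lo, hi)

        # Example: pumping rates (m^3/day) of three wells, each bounded between 0 and 500
        neighbors = lhs_neighborhood(seed_solution=[120.0, 300.0, 50.0], radius=40.0,
                                     lower=np.zeros(3), upper=np.full(3, 500.0), n_neighbors=10)
        print(neighbors.round(1))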

  17. Acute effect of Vagus nerve stimulation parameters on cardiac chronotropic, inotropic, and dromotropic responses

    NASA Astrophysics Data System (ADS)

    Ojeda, David; Le Rolle, Virginie; Romero-Ugalde, Hector M.; Gallet, Clément; Bonnet, Jean-Luc; Henry, Christine; Bel, Alain; Mabo, Philippe; Carrault, Guy; Hernández, Alfredo I.

    2017-11-01

    Vagus nerve stimulation (VNS) is an established therapy for drug-resistant epilepsy and depression, and is considered a potential therapy for other pathologies, including Heart Failure (HF) or inflammatory diseases. In the case of HF, several experimental studies on animals have shown an improvement in cardiac function and a reverse remodeling of the cardiac cavity when VNS is applied. However, recent clinical trials have not been able to reproduce the same response in humans. One of the hypotheses to explain this lack of response relates to the way in which stimulation parameters are defined. The combined effect of VNS parameters is still poorly known, especially in the case of VNS delivered synchronously with cardiac activity. In this paper, we propose a methodology to analyze the acute cardiovascular effects of VNS parameters individually, as well as their interactive effects. A Latin hypercube sampling method was applied to design a uniform experimental plan. Data gathered from this experimental plan were used to produce a Gaussian process regression (GPR) model in order to estimate unobserved VNS sequences. Finally, a Morris screening sensitivity analysis method was applied to each obtained GPR model. Results highlight dominant effects of pulse current, pulse width, and number of pulses over frequency and delay and, more importantly, the degree of interaction between these parameters on the most important acute cardiovascular responses. In particular, strong interacting effects between current and pulse width were found. Similar sensitivity profiles were observed for chronotropic, dromotropic, and inotropic effects. These findings are of primary importance for the future development of closed-loop, personalized neuromodulation technologies.
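
    The modeling chain described here (an LHS experimental plan, a Gaussian process regression model fitted to the measured responses, and prediction of unobserved stimulation sequences) can be sketched with scikit-learn; the parameter names, ranges, and the synthetic response below are illustrative assumptions, not the experimental data.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Hypothetical VNS parameters: pulse current [mA], pulse width [us], number of pulses
        lower = np.array([0.1, 100.0, 1.0])
        upper = np.array([2.5, 500.0, 20.0])
        X = qmc.scale(qmc.LatinHypercube(d=3, seed=3).random(40), lower, upper)

        # Synthetic acute response (e.g., relative heart-rate change) standing in for measurements
        y = -0.12 * X[:, 0] * (X[:, 1] / 500.0) - 0.01 * X[:, 2] \
            + 0.02 * np.random.default_rng(3).normal(size=len(X))

        gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 100.0, 5.0]) + WhiteKernel(),
                                       normalize_y=True).fit(X, y)

        # Estimate the response of an unobserved stimulation sequence, with uncertainty
        mean, std = gpr.predict(np.array([[1.2, 250.0, 10.0]]), return_std=True)
        print(f"predicted response: {mean[0]:.3f} +/- {std[0]:.3f}")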

  18. Analysis of the influence of handset phone position on RF exposure of brain tissue.

    PubMed

    Ghanmi, Amal; Varsier, Nadège; Hadjem, Abdelhamid; Conil, Emmanuelle; Picon, Odile; Wiart, Joe

    2014-12-01

    Exposure to mobile phone radio frequency (RF) electromagnetic fields depends on many different parameters. For epidemiological studies investigating the risk of brain cancer linked to RF exposure from mobile phones, it is of great interest to characterize brain tissue exposure and to know which parameters this exposure is sensitive to. One such parameter is the position of the phone during communication. In this article, we analyze the influence of the phone position on brain exposure by comparing the specific absorption rate (SAR) induced in the head by two different mobile phone models operating in Global System for Mobile Communications (GSM) frequency bands. To achieve this objective, 80 different phone positions were chosen using an experimental design based on Latin hypercube sampling (LHS) to select a representative set of positions. The SAR averaged over 10 g (SAR10g) in the head, the SAR averaged over 1 g (SAR1g) in the brain, and the averaged SAR in different anatomical brain structures were estimated at 900 and 1800 MHz for the 80 positions. The results illustrate that SAR distributions inside the brain area are sensitive to the position of the mobile phone relative to the head. The results also show that for 5-10% of the studied positions the SAR10g in the head and the SAR1g in the brain can be 20% higher than the SAR estimated for the standard cheek position, and that the Specific Anthropomorphic Mannequin (SAM) model is conservative for 95% of all the studied positions. © 2014 Wiley Periodicals, Inc.

  19. The formation of continuous opinion dynamics based on a gambling mechanism and its sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Alexandre Wang, Qiuping; Li, Wei; Cai, Xu

    2017-09-01

    The formation of continuous opinion dynamics is investigated based on a virtual gambling mechanism where agents fight for a limited resource. We propose a model with agents holding opinions between -1 and 1. Agents are segregated into two cliques according to the sign of their opinions. Local communication happens only when the opinion distance between the corresponding agents is no larger than a pre-defined confidence threshold. Theoretical analysis of special cases provides a deep understanding of the roles of both the resource allocation parameter and the confidence threshold in the formation of opinion dynamics. For a sparse network, the evolution of opinion dynamics is negligible in the region of low confidence threshold when mindless agents are absent. Numerical results also imply that, in the presence of economic agents, a high confidence threshold is required for apparent clustering of agents in opinion. Moreover, a consensus state is generated only when the following three conditions are satisfied simultaneously: mindless agents are absent, the resource is concentrated in one clique, and the confidence threshold tends to a critical value (= 1.25 + 2/k_a, for k_a > 8/3, where k_a is the average number of friends of individual agents). For a fixed confidence threshold and resource allocation parameter, the most chaotic steady state of the dynamics occurs when the fraction of mindless agents is about 0.7. It is also demonstrated that economic agents are more likely to win at gambling, compared to mindless ones. Finally, the importance of the three involved parameters in establishing the uncertainty of the model response is quantified in terms of Latin hypercube sampling-based sensitivity analysis.

  20. Optimization design of energy deposition on single expansion ramp nozzle

    NASA Astrophysics Data System (ADS)

    Ju, Shengjun; Yan, Chao; Wang, Xiaoyong; Qin, Yupei; Ye, Zhifei

    2017-11-01

    Optimization design has been widely used in the aerodynamic design process of scramjets. The single expansion ramp nozzle is an important component of scramjets, producing most of the thrust force. A new concept of improving the aerodynamic performance of the scramjet nozzle with energy deposition is presented. The essence of the method is to create a heated region in the inner flow field of the scramjet nozzle. In the current study, the two-dimensional coupled implicit compressible Reynolds-Averaged Navier-Stokes equations and Menter's shear stress transport turbulence model have been applied to numerically simulate the flow fields of the single expansion ramp nozzle with and without energy deposition. The numerical results show that energy deposition can be an effective method to improve the force characteristics of the scramjet nozzle: the thrust coefficient CT increases by 6.94% and the lift coefficient CN decreases by 26.89%. Further, the non-dominated sorting genetic algorithm coupled with a Radial Basis Function neural network surrogate model has been employed to determine the optimum location and density of the energy deposition. The thrust coefficient CT and lift coefficient CN are selected as objective functions, and the sampling points are obtained numerically using a Latin hypercube design method. The optimized thrust coefficient CT further increases by 1.94%, while the optimized lift coefficient CN further decreases by 15.02%. At the same time, the optimized performance is in good and reasonable agreement with the numerical predictions. The findings suggest that scramjet nozzle design and performance can benefit from the application of energy deposition.

  1. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
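
    The LHS-based error propagation idea behind REPTool can be sketched in a few lines of Python: perturb each input raster according to a user-specified error model, run the geospatial model for every realization, and summarize the per-cell distribution of the output. The rasters, error magnitudes, and the simple water-balance model below are illustrative assumptions, not REPTool's actual interface.

        import numpy as np
        from scipy.stats import qmc, norm

        shape = (50, 50)                          # raster dimensions (illustrative)
        precip = np.full(shape, 800.0)            # input raster 1: precipitation [mm]
        et = np.full(shape, 500.0)                # input raster 2: evapotranspiration [mm]
        sigma = {"precip": 40.0, "et": 60.0}      # spatially invariant standard errors (assumed)

        n_real = 200
        u = qmc.LatinHypercube(d=2, seed=7).random(n_real)  # one LHS dimension per uncertain input

        outputs = np.empty((n_real,) + shape)
        for i, (u_p, u_e) in enumerate(u):
            # spatially invariant error: one quantile per realization, applied to the whole raster
            p_i = precip + norm.ppf(u_p, scale=sigma["precip"])
            e_i = et + norm.ppf(u_e, scale=sigma["et"])
            outputs[i] = p_i - e_i                # toy geospatial model: recharge = P - ET

        # Per-cell prediction uncertainty of the model output
        print("cell-wise mean:", outputs.mean(axis=0)[0, 0])
        print("cell-wise 90% interval:", np.percentile(outputs, [5, 95], axis=0)[:, 0, 0])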

  2. 3-D ballistic transport of ellipsoidal volcanic projectiles considering horizontal wind field and variable shape-dependent drag coefficients

    NASA Astrophysics Data System (ADS)

    Bertin, Daniel

    2017-02-01

    An innovative 3-D numerical model for the dynamics of volcanic ballistic projectiles is presented here. The model focuses on ellipsoidal particles and improves previous approaches by considering horizontal wind field, virtual mass forces, and drag forces subjected to variable shape-dependent drag coefficients. Modeling suggests that the projectile's launch velocity and ejection angle are first-order parameters influencing ballistic trajectories. The projectile's density and minor radius are second-order factors, whereas both intermediate and major radii of the projectile are of third order. Comparing output parameters, assuming different input data, highlights the importance of considering a horizontal wind field and variable shape-dependent drag coefficients in ballistic modeling, which suggests that they should be included in every ballistic model. On the other hand, virtual mass forces should be discarded since they almost do not contribute to ballistic trajectories. Simulation results were used to constrain some crucial input parameters (launch velocity, ejection angle, wind speed, and wind azimuth) of the block that formed the biggest and most distal ballistic impact crater during the 1984-1993 eruptive cycle of Lascar volcano, Northern Chile. Subsequently, up to 10^6 simulations were performed, whereas nine ejection parameters were defined by a Latin-hypercube sampling approach. Simulation results were summarized as a quantitative probabilistic hazard map for ballistic projectiles. Transects were also done in order to depict aerial hazard zones based on the same probabilistic procedure. Both maps combined can be used as a hazard prevention tool for ground and aerial transits nearby unresting volcanoes.

  3. Evaluation of incremental reactivity and its uncertainty in Southern California.

    PubMed

    Martien, Philip T; Harley, Robert A; Milford, Jana B; Russell, Armistead G

    2003-04-15

    The incremental reactivity (IR) and relative incremental reactivity (RIR) of carbon monoxide and 30 individual volatile organic compounds (VOC) were estimated for the South Coast Air Basin using two photochemical air quality models: a 3-D, grid-based model and a vertically resolved trajectory model. Both models include an extended version of the SAPRC99 chemical mechanism. For the 3-D modeling, the decoupled direct method (DDM-3D) was used to assess reactivities. The trajectory model was applied to estimate uncertainties in reactivities due to uncertainties in chemical rate parameters, deposition parameters, and emission rates using Monte Carlo analysis with Latin hypercube sampling. For most VOC, RIRs were found to be consistent in rankings with those produced by Carter using a box model. However, 3-D simulations show that coastal regions, upwind of most of the emissions, have comparatively low IR but higher RIR than predicted by box models for C4-C5 alkenes and carbonyls that initiate the production of HOx radicals. Biogenic VOC emissions were found to have a lower RIR than predicted by box model estimates, because emissions of these VOC were mostly downwind of the areas of primary ozone production. Uncertainties in RIR of individual VOC were found to be dominated by uncertainties in the rate parameters of their primary oxidation reactions. The coefficient of variation (COV) of most RIR values ranged from 20% to 30%, whereas the COV of absolute incremental reactivity ranged from about 30% to 40%. In general, uncertainty and variability both decreased when relative rather than absolute reactivity metrics were used.

  4. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    NASA Astrophysics Data System (ADS)

    He, F.

    2015-12-01

    This work studies the effect of parameter uncertainty on tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, as implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation, and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP), and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable in degree to the uncertainty caused by model structure (e.g., grid) and initial conditions (e.g., sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.

  5. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NASA Astrophysics Data System (ADS)

    Sperna Weiland, Frederiek C.; Vrugt, Jasper A.; van Beek, Rens (L.) P. H.; Weerts, Albrecht H.; Bierkens, Marc F. P.

    2015-10-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we focus on large-scale hydrologic modeling and analyze the effect of parameter and rainfall data uncertainty on simulated discharge dynamics with the global hydrologic model PCR-GLOBWB. We use three rainfall data products: the CFSR reanalysis, the ERA-Interim reanalysis, and a combined ERA-40 reanalysis and CRU dataset. Parameter uncertainty is derived from Latin Hypercube Sampling (LHS) using monthly discharge data from five of the largest river systems in the world. Our results demonstrate that the default parameterization of PCR-GLOBWB, derived from global datasets, can be improved by calibrating the model against monthly discharge observations. Yet, it is difficult to find a single parameterization of PCR-GLOBWB that works well for all five river basins considered herein and shows consistent performance during both the calibration and evaluation periods. Still, there may be possibilities for regionalization based on catchment similarities. Our simulations illustrate that parameter uncertainty constitutes only a minor part of predictive uncertainty. Thus, the apparent dichotomy between simulations of global-scale hydrologic behavior and actual data cannot be resolved by simply increasing the model complexity of PCR-GLOBWB and resolving sub-grid processes. Instead, it would be more productive to improve the characterization of global rainfall amounts at spatial resolutions of 0.5° and smaller.

  6. Hypercube technology

    NASA Technical Reports Server (NTRS)

    Parker, Jay W.; Cwik, Tom; Ferraro, Robert D.; Liewer, Paulett C.; Patterson, Jean E.

    1991-01-01

    The JPL designed MARKIII hypercube supercomputer has been in application service since June 1988 and has had successful application to a broad problem set including electromagnetic scattering, discrete event simulation, plasma transport, matrix algorithms, neural network simulation, image processing, and graphics. Currently, problems that are not homogeneous are being attempted, and, through this involvement with real world applications, the software is evolving to handle the heterogeneous class problems efficiently.

  7. Parallel computation for biological sequence comparison: comparing a portable model to the native model for the Intel Hypercube.

    PubMed Central

    Nadkarni, P. M.; Miller, P. L.

    1991-01-01

    A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations. PMID:1807632

  8. Communication overhead on the Intel iPSC-860 hypercube

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1990-01-01

    Experiments were conducted on the Intel iPSC-860 hypercube in order to evaluate the overhead of interprocessor communication. It is demonstrated that: (1) contrary to popular belief, the distance between two communicating processors has a significant impact on communication time, (2) edge contention can increase communication time by a factor of more than 7, and (3) node contention has no measurable impact.

  9. RISC-type microprocessors may revolutionize aerospace simulation

    NASA Astrophysics Data System (ADS)

    Jackson, Albert S.

    The author explores the application of RISC (reduced instruction set computer) processors in massively parallel computer (MPC) designs for aerospace simulation. The MPC approach is shown to be well adapted to the needs of aerospace simulation. It is shown that any of the three common types of interconnection schemes used with MPCs are effective for general-purpose simulation, although the bus-or switch-oriented machines are somewhat easier to use. For partial differential equation models, the hypercube approach at first glance appears more efficient because the nearest-neighbor connections required for three-dimensional models are hardwired in a hypercube machine. However, the data broadcast ability of a bus system, combined with the fact that data can be transmitted over a bus as soon as it has been updated, makes the bus approach very competitive with the hypercube approach even for these types of models.

  10. Generalized hypercube structures and hyperswitch communication network

    NASA Technical Reports Server (NTRS)

    Young, Steven D.

    1992-01-01

    This paper discusses an ongoing study that uses a recent development in communication control technology to implement hybrid hypercube structures. These architectures are similar to binary hypercubes, but they also provide added connectivity between the processors. This added connectivity increases communication reliability while decreasing the latency of interprocessor message passing. Because these factors directly determine the speed that can be obtained by multiprocessor systems, these architectures are attractive for applications such as remote exploration and experimentation, where high performance and ultrareliability are required. This paper describes and enumerates these architectures and discusses how they can be implemented with a modified version of the hyperswitch communication network (HCN). The HCN is analyzed because it has three attractive features that enable these architectures to be effective: speed, fault tolerance, and the ability to pass multiple messages simultaneously through the same hyperswitch controller.

  11. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  12. Detailed design of a lattice composite fuselage structure by a mixed optimization method

    NASA Astrophysics Data System (ADS)

    Liu, D.; Lohse-Busch, H.; Toropov, V.; Hühne, C.; Armani, U.

    2016-10-01

    In this article, a procedure for designing a lattice fuselage barrel is developed. It comprises three stages: first, topology optimization of an aircraft fuselage barrel is performed with respect to weight and structural performance to obtain the conceptual design. The interpretation of the optimal result is given to demonstrate the development of this new lattice airframe concept for the fuselage barrel. Subsequently, parametric optimization of the lattice aircraft fuselage barrel is carried out using genetic algorithms on metamodels generated with genetic programming from a 101-point optimal Latin hypercube design of experiments. The optimal design is achieved in terms of weight savings subject to stability, global stiffness and strain requirements, and then verified by the fine mesh finite element simulation of the lattice fuselage barrel. Finally, a practical design of the composite skin complying with the aircraft industry lay-up rules is presented. It is concluded that the mixed optimization method, combining topology optimization with the global metamodel-based approach, allows the problem to be solved with sufficient accuracy and provides the designers with a wealth of information on the structural behaviour of the novel anisogrid composite fuselage design.

  13. Matched-filter algorithm for subpixel spectral detection in hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Borough, Howard C.

    1991-11-01

    Hyperspectral imagery, spatial imagery with associated wavelength data for every pixel, offers significant potential for improved detection and identification of certain classes of targets. The ability to make spectral identifications of objects which only partially fill a single pixel (due to range or small size) is of considerable interest. Multiband imagery such as Landsat's 5- and 7-band imagery has demonstrated significant utility in the past. Hyperspectral imaging systems with hundreds of spectral bands offer improved performance. To explore the application of different subpixel spectral detection algorithms, a synthesized set of hyperspectral image data (hypercubes) was generated utilizing NASA earth resources and other spectral data. The data were modified using LOWTRAN 7 to model the illumination, atmospheric contributions, attenuations, and viewing geometry to represent a nadir view from 10,000 ft altitude. The base hypercube (HC) represented 16 by 21 spatial pixels with 101 wavelength samples from 0.5 to 2.5 micrometers for each pixel. Insertions were made into the base data with random location, random pixel percentage, and random material. Fifteen different hypercubes were generated for blind testing of candidate algorithms. An algorithm utilizing a matched filter in the spectral dimension proved surprisingly good, yielding 100% detection for pixels filled greater than 40% with a standard camouflage paint, and a 50% probability of detection for pixels filled 20% with the paint, with no false alarms. The false alarm rate as a function of the number of spectral bands, in the range from 101 down to 12 bands, was measured and found to increase from zero to 50%, illustrating the value of a large number of spectral bands. This test was on imagery without system noise; the next step is to incorporate typical system noise sources.
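
    A matched filter in the spectral dimension, of the kind evaluated above, is commonly computed as the background-whitened projection of each pixel spectrum onto a target signature. The sketch below is a generic version of that idea; the synthetic hypercube, target spectrum, and fill fraction are illustrative assumptions rather than the data used in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        rows, cols, bands = 16, 21, 101  # spatial size and band count as in the base hypercube

        # Synthetic background hypercube plus one pixel containing a 30% target fill
        cube = rng.normal(loc=0.3, scale=0.02, size=(rows, cols, bands))
        target = 0.3 + 0.2 * np.sin(np.linspace(0, 3 * np.pi, bands))  # hypothetical target signature
        cube[5, 7] = 0.7 * cube[5, 7] + 0.3 * target

        # Classical spectral matched filter: whiten by the background covariance
        pixels = cube.reshape(-1, bands)
        mu = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(bands)  # regularized covariance estimate
        w = np.linalg.solve(cov, target - mu)
        scores = (pixels - mu) @ w / ((target - mu) @ w)  # normalized so a fully filled pixel scores ~1

        score_map = scores.reshape(rows, cols)
        print("highest score at pixel:", np.unravel_index(score_map.argmax(), score_map.shape))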

  14. Latin American Study of Nutrition and Health (ELANS): rationale and study design.

    PubMed

    Fisberg, M; Kovalskys, I; Gómez, G; Rigotti, A; Cortés, L Y; Herrera-Cuenca, M; Yépez, M C; Pareja, R G; Guajardo, V; Zimberg, I Z; Chiavegatto Filho, A D P; Pratt, M; Koletzko, B; Tucker, K L

    2016-01-30

    Obesity is growing at an alarming rate in Latin America. Lifestyle behaviours such as physical activity and dietary intake have been largely associated with obesity in many countries; however studies that combine nutrition and physical activity assessment in representative samples of Latin American countries are lacking. The aim of this study is to present the design rationale of the Latin American Study of Nutrition and Health/Estudio Latinoamericano de Nutrición y Salud (ELANS) with a particular focus on its quality control procedures and recruitment processes. The ELANS is a multicenter cross-sectional nutrition and health surveillance study of a nationally representative sample of urban populations from eight Latin American countries (Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, Perú and Venezuela). A standard study protocol was designed to evaluate the nutritional intakes, physical activity levels, and anthropometric measurements of 9000 enrolled participants. The study was based on a complex, multistage sample design and the sample was stratified by gender, age (15 to 65 years old) and socioeconomic level. A small-scale pilot study was performed in each country to test the procedures and tools. This study will provide valuable information and a unique dataset regarding Latin America that will enable cross-country comparisons of nutritional statuses that focus on energy and macro- and micronutrient intakes, food patterns, and energy expenditure. Clinical Trials NCT02226627.

  15. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost and higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game theory reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model to describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality for schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that the use of a multi-player dynamic Nash bargaining game provides a solution for the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. A note is provided that argues this multi-player dynamic Nash bargaining game also provides the solution to Freeman Dyson's problem of finding a way to label systems as good or bad.

  16. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 for aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval was (0.0015, 0.0163) at the 90% confidence level calculated using fuzzy set theory, and (0.00016, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method, based on a different theory, has its own set of advantages and limitations; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.

  17. A new sampling scheme for developing metamodels with the zeros of Chebyshev polynomials

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing

    2015-09-01

    The accuracy of metamodelling is determined by both the sampling and approximation. This article proposes a new sampling method based on the zeros of Chebyshev polynomials to capture the sampling information effectively. First, the zeros of one-dimensional Chebyshev polynomials are applied to construct Chebyshev tensor product (CTP) sampling, and the CTP is then used to construct high-order multi-dimensional metamodels using the 'hypercube' polynomials. Secondly, the CTP sampling is further enhanced to develop Chebyshev collocation method (CCM) sampling, to construct the 'simplex' polynomials. The samples of CCM are randomly and directly chosen from the CTP samples. Two widely studied sampling methods, namely the Smolyak sparse grid and Hammersley, are used to demonstrate the effectiveness of the proposed sampling method. Several numerical examples are utilized to validate the approximation accuracy of the proposed metamodel under different dimensions.
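
    The building block of the CTP sampling described above is the set of zeros of a one-dimensional Chebyshev polynomial, combined across dimensions as a tensor product. A small sketch, with the number of nodes per dimension and the design bounds chosen arbitrarily:

        import numpy as np
        from itertools import product

        def chebyshev_zeros(n):
            """Zeros of the degree-n Chebyshev polynomial of the first kind on [-1, 1]."""
            k = np.arange(1, n + 1)
            return np.cos((2 * k - 1) * np.pi / (2 * n))

        def ctp_samples(n_per_dim, lower, upper):
            """Chebyshev tensor product samples mapped to the box defined by lower and upper."""
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            axes = [chebyshev_zeros(n_per_dim) for _ in lower]
            grid = np.array(list(product(*axes)))                 # tensor product on [-1, 1]^d
            return lower + (grid + 1.0) / 2.0 * (upper - lower)   # affine map to the design box

        pts = ctp_samples(n_per_dim=5, lower=[0.0, -2.0], upper=[1.0, 2.0])
        print(pts.shape)  # (25, 2): 5 nodes per dimension in 2 dimensions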

  18. Concurrent electromagnetic scattering analysis

    NASA Technical Reports Server (NTRS)

    Patterson, Jean E.; Cwik, Tom; Ferraro, Robert D.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Parker, Jay

    1989-01-01

    The computational power of the hypercube parallel computing architecture is applied to the solution of large-scale electromagnetic scattering and radiation problems. Three analysis codes have been implemented. A Hypercube Electromagnetic Interactive Analysis Workstation was developed to aid in the design and analysis of metallic structures such as antennas and to facilitate the use of these analysis codes. The workstation provides a general user environment for specification of the structure to be analyzed and graphical representations of the results.

  19. Asynchronous Communication Scheme For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Madan, Herb S.

    1988-01-01

    Scheme devised for asynchronous-message communication system for Mark III hypercube concurrent-processor network. Network consists of up to 1,024 processing elements connected electrically as though were at corners of 10-dimensional cube. Each node contains two Motorola 68020 processors along with Motorola 68881 floating-point processor utilizing up to 4 megabytes of shared dynamic random-access memory. Scheme intended to support applications requiring passage of both polled or solicited and unsolicited messages.

  20. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

  1. Virtual time and time warp on the JPL hypercube. [operating system implementation for distributed simulation

    NASA Technical Reports Server (NTRS)

    Jefferson, David; Beckman, Brian

    1986-01-01

    This paper describes the concept of virtual time and its implementation in the Time Warp Operating System at the Jet Propulsion Laboratory. Virtual time is a distributed synchronization paradigm that is appropriate for distributed simulation, database concurrency control, real time systems, and coordination of replicated processes. The Time Warp Operating System is targeted toward the distributed simulation application and runs on a 32-node JPL Mark II Hypercube.

  2. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  3. Novel snapshot hyperspectral imager for fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Chandler, Lynn; Chandler, Andrea; Periasamy, Ammasi

    2018-02-01

    Hyperspectral imaging has emerged as a new technique for the identification and classification of biological tissue. Benefiting from recent developments in sensor technology, the new class of hyperspectral imagers can capture entire hypercubes in a single shot, showing great potential for real-time imaging in the biomedical sciences. This paper explores the use of a SnapShot imager for fluorescence imaging via microscope for the very first time. Utilizing the latest imaging sensor, the SnapShot imager is both compact and attachable via C-mount to any commercially available light microscope. Using this setup, fluorescence hypercubes of several cells were generated, containing both spatial and spectral information. The fluorescence images were acquired in a single-shot operation over the entire emission range from visible to near infrared (VIS-IR). The paper presents hypercubes obtained from example tissues (475-630 nm). This study demonstrates the potential for application in cell biology and biomedical settings for real-time monitoring.

  4. Inferring patterns in mitochondrial DNA sequences through hypercube independent spanning trees.

    PubMed

    Silva, Eduardo Sant Ana da; Pedrini, Helio

    2016-03-01

    Given a graph G, a set of spanning trees rooted at a vertex r of G is said to be vertex/edge independent if, for each vertex v of G, v≠r, the paths from r to v in any pair of trees are vertex/edge disjoint. Independent spanning trees (ISTs) provide a number of advantages in data broadcasting due to their fault-tolerant properties. For this reason, some studies have addressed the issue by providing mechanisms for constructing independent spanning trees efficiently. In this work, we investigate how to construct independent spanning trees on hypercubes, generated from spanning binomial trees, and how to use them to predict mitochondrial DNA sequence parts through paths on the hypercube. The prediction works both for inferring mitochondrial DNA sequences composed of six bases and for inferring anomalies that probably should not belong to the mitochondrial DNA standard. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Multiphase complete exchange on a circuit switched hypercube

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1991-01-01

    On a distributed memory parallel computer, the complete exchange (all-to-all personalized) communication pattern requires each of n processors to send a different block of data to each of the remaining n - 1 processors. This pattern is at the heart of many important algorithms, most notably the matrix transpose. For a circuit-switched hypercube of dimension d (n = 2^d), two algorithms for achieving complete exchange are known: (1) the Standard Exchange approach, which employs d transmissions of 2^(d-1) blocks each and is useful for small block sizes, and (2) the Optimal Circuit Switched algorithm, which employs 2^d - 1 transmissions of 1 block each and is best for large block sizes. A unified multiphase algorithm is described that includes these two algorithms as special cases. The complete exchange on a hypercube of dimension d and block size m is achieved by carrying out k partial exchanges on subcubes of dimension d_i, where the d_i sum to d (sum over i = 1..k of d_i = d), with effective block size m_i = m*2^(d-d_i) in phase i. When k = d and all d_i = 1, this corresponds to algorithm (1) above. For the case of k = 1 and d_1 = d, this becomes the circuit-switched algorithm (2). Varying the subcube dimensions d_i changes the effective block size and permits a compromise between the data permutation and block transmission overhead of (1) and the startup overhead of (2). For a hypercube of dimension d, the number of possible combinations of subcubes is p(d), the number of partitions of the integer d. This is a very slowly growing function, and it is feasible to search over these partitions for the best combination for a given message size. The approach was analyzed for, and implemented on, the Intel iPSC-860 circuit-switched hypercube. Measurements show good agreement with predictions and demonstrate that the multiphase approach can substantially improve performance for block sizes in the 0 to 160 byte range. This range, which corresponds to 0 to 40 floating point numbers per processor, is commonly encountered in practical numeric applications. The multiphase technique is applicable to all circuit-switched hypercubes that use the common e-cube routing strategy.
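
    Under an assumed linear communication cost model (a per-message startup time plus a per-byte transfer time), choosing the best phase combination reduces to a search over the integer partitions of d, as the abstract notes. A small sketch of that search follows; the cost constants are placeholders, not measured iPSC-860 values.

        T_STARTUP = 95e-6  # assumed per-message startup time [s]
        T_BYTE = 0.4e-6    # assumed per-byte transfer time [s]

        def partitions(d, largest=None):
            """All partitions of the integer d as non-increasing tuples."""
            largest = d if largest is None else largest
            if d == 0:
                return [()]
            return [(part,) + rest
                    for part in range(min(d, largest), 0, -1)
                    for rest in partitions(d - part, part)]

        def multiphase_cost(parts, d, block_bytes):
            """Estimated cost of a multiphase complete exchange with subcube dimensions `parts`."""
            cost = 0.0
            for d_i in parts:
                m_i = block_bytes * 2 ** (d - d_i)  # effective block size in phase i
                cost += (2 ** d_i - 1) * (T_STARTUP + T_BYTE * m_i)
            return cost

        d, block_bytes = 5, 64
        best = min(partitions(d), key=lambda p: multiphase_cost(p, d, block_bytes))
        print("best phase combination (subcube dimensions):", best)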

  6. Integrating satellite actual evapotranspiration patterns into distributed model parametrization and evaluation for a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.

    2016-12-01

    Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To this end, it is important to identify the model parameters that can change spatial patterns before the satellite-based hydrologic model calibration. Our study rests on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET). Second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model is selected because it allows a change in the spatial distribution of key soil parameters through the calibration of pedo-transfer function parameters and includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison with the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function employing remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of the 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas streamflow dynamics are sensitive to the pedo-transfer function parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow does not reduce the spatial errors in AET and improves only the streamflow simulations. We will further examine the results of model calibration using only spatial objective functions that measure the association between observed and simulated AET maps, and another case combining spatial and streamflow metrics.

  7. Modeling bronchial circulation with application to soluble gas exchange: description and sensitivity analysis.

    PubMed

    Bui, T D; Dabdub, D; George, S C

    1998-06-01

    The steady-state exchange of inert gases across an in situ canine trachea has recently been shown to be limited equally by diffusion and perfusion over a wide range (0.01-350) of blood solubilities (betablood; ml . ml-1 . atm-1). Hence, we hypothesize that the exchange of ethanol (betablood = 1,756 at 37 degrees C) in the airways depends on the blood flow rate from the bronchial circulation. To test this hypothesis, the dynamics of the bronchial circulation were incorporated into an existing model that describes the simultaneous exchange of heat, water, and a soluble gas in the airways. A detailed sensitivity analysis of key model parameters was performed by using the method of Latin hypercube sampling. The model accurately predicted a previously reported experimental exhalation profile of ethanol (R2 = 0.991) as well as the end-exhalation airstream temperature (34.6 degrees C). The model predicts that 27, 29, and 44% of exhaled ethanol in a single exhalation are derived from the tissues of the mucosa and submucosa, the bronchial circulation, and the tissue exterior to the submucosa (which would include the pulmonary circulation), respectively. Although the concentration of ethanol in the bronchial capillary decreased during inspiration, the three key model outputs (end-exhaled ethanol concentration, the slope of phase III, and end-exhaled temperature) were all statistically insensitive (P > 0.05) to the parameters describing the bronchial circulation. In contrast, the model outputs were all sensitive (P < 0.05) to the thickness of tissue separating the core body conditions from the bronchial smooth muscle. We conclude that both the bronchial circulation and the pulmonary circulation impact soluble gas exchange when the entire conducting airway tree is considered.

  8. Global sensitivity analysis of water age and temperature for informing salmonid disease management

    NASA Astrophysics Data System (ADS)

    Javaheri, Amir; Babbar-Sebens, Meghna; Alexander, Julie; Bartholomew, Jerri; Hallett, Sascha

    2018-06-01

    Many rivers in the Pacific Northwest region of North America are anthropogenically manipulated via dam operations, leading to system-wide impacts on hydrodynamic conditions and aquatic communities. Understanding how dam operations alter abiotic and biotic variables is important for designing management actions. For example, in the Klamath River, dam outflows could be manipulated to alter water age and temperature to reduce the risk of parasite infections in salmon by diluting or altering the viability of parasite spores. However, the sensitivity of water age and temperature to riverine conditions such as bathymetry can affect outcomes from dam operations. To examine this issue in detail, we conducted a global sensitivity analysis of water age and temperature to a comprehensive set of hydraulic and meteorological parameters in the Klamath River, California, where management of salmonid disease is a high priority. We applied an analysis technique that combined Latin-hypercube and one-at-a-time sampling methods, and included simulation runs with the hydrodynamic numerical model of the Lower Klamath. We found that flow rate and bottom roughness were the two most important parameters influencing water age. Water temperature was most sensitive to inflow temperature, followed by air temperature, solar radiation, wind speed, flow rate, and wet bulb temperature, in that order. Our results are relevant for managers because they provide a framework for predicting how water within 'high infection risk' sections of the river will respond to dam water (low infection risk) input. Moreover, these data will be useful for prioritizing the use of water age (dilution) versus temperature (spore viability) under certain contexts when considering flow manipulation as a method to reduce the risk of infection and disease in Klamath River salmon.
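
    The combined Latin-hypercube and one-at-a-time (LH-OAT) screening idea used above can be sketched in a few lines. The perturbation size, the normalization of the partial effects, and the toy function standing in for the hydrodynamic model are all assumptions made for illustration.

        import numpy as np
        from scipy.stats import qmc

        def lh_oat_sensitivity(model, bounds, n_points=20, delta=0.05, seed=0):
            # Latin hypercube one-at-a-time (LH-OAT) screening sketch.
            # model  : callable mapping a 1-D parameter vector to a scalar output
            # bounds : (p, 2) array of [low, high] for each parameter
            # Returns the mean absolute relative partial effect of each parameter.
            bounds = np.asarray(bounds, float)
            p = bounds.shape[0]
            sampler = qmc.LatinHypercube(d=p, seed=seed)
            base = qmc.scale(sampler.random(n_points), bounds[:, 0], bounds[:, 1])
            effects = np.zeros(p)
            for x in base:
                y0 = model(x)
                for j in range(p):
                    x_pert = x.copy()
                    x_pert[j] *= (1.0 + delta)      # one-at-a-time perturbation
                    y1 = model(x_pert)
                    effects[j] += abs((y1 - y0) / (y0 if y0 != 0 else 1.0)) / delta
            return effects / n_points

        # Toy stand-in for a model output (e.g. mean water age at a river section)
        toy = lambda x: x[0] ** 2 + 0.1 * x[1] + 0.01 * x[2]
        print(lh_oat_sensitivity(toy, [[0.5, 2.0], [0.0, 1.0], [0.0, 1.0]]))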

  9. Threshold conditions for integrated pest management models with pesticides that have residual effects.

    PubMed

    Tang, Sanyi; Liang, Juhua; Tan, Yuanshun; Cheke, Robert A

    2013-01-01

    Impulsive differential equations (hybrid dynamical systems) can provide a natural description of pulse-like actions such as when a pesticide kills a pest instantly. However, pesticides may have long-term residual effects, with some remaining active against pests for several weeks, months or years. Therefore, a more realistic method for modelling chemical control in such cases is to use continuous or piecewise-continuous periodic functions which affect growth rates. How to evaluate the effects of the duration of the pesticide residual effectiveness on successful pest control is key to the implementation of integrated pest management (IPM) in practice. To address these questions in detail, we have modelled IPM including residual effects of pesticides in terms of fixed pulse-type actions. The stability threshold conditions for pest eradication are given. Moreover, effects of the killing efficiency rate and the decay rate of the pesticide on the pest and on its natural enemies, the duration of residual effectiveness, the number of pesticide applications and the number of natural enemy releases on the threshold conditions are investigated with regard to the extent of depression or resurgence resulting from pulses of pesticide applications and predator releases. Latin Hypercube Sampling/Partial Rank Correlation uncertainty and sensitivity analysis techniques are employed to investigate the key control parameters which are most significantly related to threshold values. The findings combined with Volterra's principle confirm that when the pesticide has a strong effect on the natural enemies, repeated use of the same pesticide can result in target pest resurgence. The results also indicate that there exists an optimal number of pesticide applications which can suppress the pest most effectively, and this may help in the design of an optimal control strategy.
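
    A minimal sketch of the Latin Hypercube Sampling/Partial Rank Correlation (LHS/PRCC) workflow mentioned above. The toy threshold function and the parameter ranges are hypothetical placeholders, not the model analysed in the paper.

        import numpy as np
        from scipy.stats import qmc, rankdata

        def prcc(X, y):
            # Partial rank correlation coefficient of each column of X with y.
            R = np.column_stack([rankdata(X[:, j]) for j in range(X.shape[1])])
            r_y = rankdata(y)
            out = np.zeros(X.shape[1])
            for j in range(X.shape[1]):
                others = np.delete(R, j, axis=1)
                A = np.column_stack([np.ones(len(r_y)), others])
                # residuals after removing the rank-linear effect of the other inputs
                res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
                res_y = r_y - A @ np.linalg.lstsq(A, r_y, rcond=None)[0]
                out[j] = np.corrcoef(res_x, res_y)[0, 1]
            return out

        # Toy stand-in for a pest-eradication threshold as a function of three
        # hypothetical parameters (kill efficiency, decay rate, release number).
        design = qmc.LatinHypercube(d=3, seed=1).random(200)
        X = qmc.scale(design, [0.1, 0.01, 1.0], [0.9, 0.5, 20.0])
        y = X[:, 0] / (X[:, 1] + 0.05) + 0.2 * X[:, 2]
        print(prcc(X, y))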

  10. Environmental drivers of spatial patterns of topsoil nitrogen and phosphorus under monsoon conditions in a complex terrain of South Korea

    PubMed Central

    Choi, Kwanghun; Spohn, Marie; Park, Soo Jin; Huwe, Bernd; Ließ, Mareike

    2017-01-01

    Nitrogen (N) and phosphorus (P) in topsoils are critical for plant nutrition. Relatively little is known about the spatial patterns of N and P in the organic layer of mountainous landscapes. Therefore, the spatial distributions of N and P in both the organic layer and the A horizon were analyzed using a light detection and ranging (LiDAR) digital elevation model and vegetation metrics. The objective of the study was to analyze the effect of vegetation and topography on the spatial patterns of N and P in a small watershed covered by forest in South Korea. Soil samples were collected using the conditioned Latin hypercube method. LiDAR vegetation metrics, the normalized difference vegetation index (NDVI), and terrain parameters were derived as predictors. Spatially explicit predictions of N/P ratios were obtained using a random forest with uncertainty analysis. We tested different strategies of model validation (repeated 2-fold to 20-fold and leave-one-out cross validation). Repeated 10-fold cross validation was selected for model validation due to its comparatively high accuracy and low variance of prediction. Surface curvature was the best predictor of P contents in the organic layer and in the A horizon, while LiDAR vegetation metrics and NDVI were important predictors of N in the organic layer. N/P ratios increased with surface curvature and were higher on the convex upper slope than on the concave lower slope. This was due to P enrichment of the soil on the lower slope and a more even spatial distribution of N. Our digital soil maps showed that the topsoils on the upper slopes contained relatively little P. These findings are critical for understanding N and P dynamics in mountainous ecosystems. PMID:28837590

  11. Towards a consensus-based biokinetic model for green microalgae - The ASM-A.

    PubMed

    Wágner, Dorottya S; Valverde-Pérez, Borja; Sæbø, Mariann; Bregua de la Sotilla, Marta; Van Wagenen, Jonathan; Smets, Barth F; Plósz, Benedek Gy

    2016-10-15

    Cultivation of microalgae in open ponds and closed photobioreactors (PBRs) using wastewater resources offers an opportunity for biochemical nutrient recovery. Effective reactor system design and process control of PBRs requires process models. Several models with different complexities have been developed to predict microalgal growth. However, none of these models can effectively describe all the relevant processes when microalgal growth is coupled with nutrient removal and recovery from wastewaters. Here, we present a mathematical model developed to simulate green microalgal growth (ASM-A) using the systematic approach of the activated sludge modelling (ASM) framework. The process model - identified based on a literature review and using new experimental data - accounts for factors influencing photoautotrophic and heterotrophic microalgal growth, nutrient uptake and storage (i.e. Droop model) and decay of microalgae. Model parameters were estimated using laboratory-scale batch and sequenced batch experiments using the novel Latin Hypercube Sampling based Simplex (LHSS) method. The model was evaluated using independent data obtained in a 24-L PBR operated in sequenced batch mode. Identifiability of the model was assessed. The model can effectively describe microalgal biomass growth, ammonia and phosphate concentrations as well as the phosphorus storage using a set of average parameter values estimated with the experimental data. A statistical analysis of simulation and measured data suggests that culture history and substrate availability can introduce significant variability on parameter values for predicting the reaction rates for bulk nitrate and the intracellularly stored nitrogen state-variables, thereby requiring scenario specific model calibration. ASM-A was identified using standard cultivation medium and it can provide a platform for extensions accounting for factors influencing algal growth and nutrient storage using wastewater resources. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Cystic echinococcosis in marketed offal of sheep in Basrah, Iraq: Abattoir-based survey and a probabilistic model estimation of the direct economic losses due to hydatid cyst.

    PubMed

    Abdulhameed, Mohanad F; Habib, Ihab; Al-Azizz, Suzan A; Robertson, Ian

    2018-02-01

    Cystic echinococcosis (CE) is a highly endemic parasitic zoonosis in Iraq with substantial impacts on livestock productivity and human health. The objectives of this study were to determine the abattoir-based occurrence of CE in marketed offal of sheep in Basrah province, Iraq, and to estimate, using a probabilistic modelling approach, the direct economic losses due to hydatid cysts. Based on detailed visual meat inspection, the active abattoir survey in this study detected hydatid cysts in 7.3% (95% CI: 5.4; 9.6) of 631 examined sheep carcasses. Post-mortem lesions of hydatid cyst were concurrently present in the livers and lungs of more than half (54.3%; 25/46) of the positive sheep. Direct economic losses due to hydatid cysts in marketed offal were estimated using data from government reports, the abattoir survey completed in this study, and the expert opinions of local veterinarians and butchers. A Monte-Carlo simulation model was developed in a spreadsheet utilizing Latin Hypercube sampling to account for uncertainty in the input parameters. The model estimated the average annual economic losses associated with hydatid cysts in the liver and lungs of sheep marketed for human consumption in Basrah to be US$72,470 (90% Confidence Interval (CI); ±11,302). The mean proportion of annual losses in meat product value (carcasses and offal) due to hydatid cysts in the liver and lungs of sheep marketed in Basrah province was estimated as 0.42% (90% CI; ±0.21). These estimates suggest that CE is responsible for considerable livestock-associated monetary losses in the south of Iraq. These findings can be used to inform different regional CE control program options in Iraq.
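
    A spreadsheet Monte-Carlo model with Latin hypercube sampling, as described above, can be mimicked in a few lines of Python. Every distribution and value below (prevalence, annual throughput, loss per affected carcass) is an illustrative assumption, not the study's input data.

        import numpy as np
        from scipy.stats import qmc, norm, uniform

        # Hypothetical input distributions (placeholders only):
        u = qmc.LatinHypercube(d=3, seed=42).random(10_000)
        prevalence = norm.ppf(u[:, 0], loc=0.073, scale=0.011)      # ~7.3% occurrence
        n_sheep    = uniform.ppf(u[:, 1], loc=180_000, scale=60_000)  # sheep marketed/yr
        loss_per   = uniform.ppf(u[:, 2], loc=4.0, scale=3.0)         # US$ per affected carcass

        annual_loss = prevalence * n_sheep * loss_per
        print(f"mean = {annual_loss.mean():,.0f} US$, "
              f"90% interval = {np.percentile(annual_loss, [5, 95])}")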

  13. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    NASA Astrophysics Data System (ADS)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts, they also provide the opportunity to obtain a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology) and the study site is the Des Anglais watershed, a 690 km2 river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were also used in a calibration exercise to determine the standard deviations of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observations during rainfall events. Assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observations at both the outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.

  14. Uncertainty analysis for absorbed dose from a brain receptor imaging agent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aydogan, B.; Miller, L.F.; Sparks, R.B.

    Absorbed dose estimates are known to contain uncertainties. A recent literature search indicates that prior to this study no rigorous investigation of the uncertainty associated with absorbed dose had been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent 123I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the Latin hypercube sampling method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population, the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+07 Gy/MBq and 4.65E+07 Gy/MBq with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving 123I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.
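
    The confidence-interval calculation can be sketched as follows, assuming hypothetical lognormal distributions for residence time and organ mass and an assumed inverse-mass scaling of the S value; none of these are the study's fitted values.

        import numpy as np
        from scipy.stats import qmc, lognorm

        # Latin hypercube sample of the two uncertain inputs (assumed distributions)
        u = qmc.LatinHypercube(d=2, seed=7).random(5_000)
        residence_time = lognorm.ppf(u[:, 0], s=0.4, scale=2.0)    # hours (placeholder)
        organ_mass     = lognorm.ppf(u[:, 1], s=0.25, scale=1.8)   # kg (placeholder)
        s_value = 1.0e-5 / organ_mass                              # assumed inverse-mass scaling
        dose = residence_time * s_value

        lo, mid, hi = np.percentile(dose, [2.5, 50, 97.5])
        print(f"median = {mid:.2e}, 95% interval = [{lo:.2e}, {hi:.2e}]")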

  15. Simplifying the complexity of a coupled carbon turnover and pesticide degradation model

    NASA Astrophysics Data System (ADS)

    Marschmann, Gianna; Erhardt, André H.; Pagel, Holger; Kügler, Philipp; Streck, Thilo

    2016-04-01

    The mechanistic one-dimensional model PECCAD (PEsticide degradation Coupled to CArbon turnover in the Detritusphere; Pagel et al. 2014, Biogeochemistry 117, 185-204) has been developed as a tool to elucidate regulation mechanisms of pesticide degradation in soil. A feature of this model is that it integrates functional traits of microorganisms, identifiable by molecular tools, and physicochemical processes such as transport and sorption that control substrate availability. Predicting the behavior of microbially active interfaces demands a fundamental understanding of factors controlling their dynamics. Concepts from dynamical systems theory allow us to study general properties of the model such as its qualitative behavior, intrinsic timescales and dynamic stability: Using a Latin hypercube method we sampled the parameter space for physically realistic steady states of the PECCAD ODE system and set up a numerical continuation and bifurcation problem with the open-source toolbox MatCont in order to obtain a complete classification of the dynamical system's behaviour. Bifurcation analysis reveals an equilibrium state of the system entirely controlled by fungal kinetic parameters. The equilibrium is generally unstable in response to small perturbations except for a small band in parameter space where the pesticide pool is stable. Time scale separation is a phenomenon that occurs in almost every complex open physical system. Motivated by the notion of "initial-stage" and "late-stage" decomposers and the concept of r-, K- or L-selected microbial life strategies, we test the applicability of geometric singular perturbation theory to identify fast and slow time scales of PECCAD. Revealing a generic fast-slow structure would greatly simplify the analysis of complex models of organic matter turnover by reducing the number of unknowns and parameters and providing a systematic mathematical framework for studying their properties.

  16. Effects of surface area and inflow on the performance of stormwater best management practices with uncertainty analysis.

    PubMed

    Park, Daeryong; Roesner, Larry A

    2013-09-01

    The performance of stormwater best management practices (BMPs) is affected by BMP geometric and hydrologic factors. The objective of this study was to investigate the effect of BMP surface area and inflow on BMP performance using the k-C* model with uncertainty analysis. Observed total suspended solids (TSS) data from detention basins and retention ponds in the International Stormwater BMP Database were used to build and evaluate the model. Detention basins are regarded as dry ponds because they do not always hold water, whereas retention ponds have a permanent pool and are considered wet ponds. In this study, Latin hypercube sampling (LHS) was applied to account for uncertainty in both the influent event mean concentration (EMC), Cin, and the areal removal constant, k. The latter was estimated from the hydraulic loading rate, q, through a power-function relationship. Results show that the effluent EMC, Cout, decreased as inflow decreased and as BMP surface area increased in both detention basins and retention ponds. However, the way Cout changed with inflow and BMP surface area for detention basins differed from that for retention ponds. Specifically, Cin was more strongly associated with the performance of the k-C* model for detention basins than were BMP surface area and inflow. For retention ponds, however, the results suggest that BMP surface area and inflow influenced changes in Cout as well as Cin. These results suggest that the sensitive factors in the performance of the k-C* model are limited to Cin for detention basins, whereas BMP surface area, inflow, and Cin are all important for retention ponds.
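
    A sketch of the uncertainty propagation, assuming the common first-order form of the k-C* model, Cout = C* + (Cin - C*)·exp(-k/q) with q = Q/A the hydraulic loading rate; the input distributions, background concentration and BMP dimensions below are placeholders, not values taken from the BMP Database.

        import numpy as np
        from scipy.stats import qmc, lognorm

        def k_c_star(c_in, k, q, c_star=10.0):
            # One common form of the k-C* model: first-order approach to a
            # background concentration C*; q = Q/A is the hydraulic loading rate.
            return c_star + (c_in - c_star) * np.exp(-k / q)

        # LHS over influent EMC and the areal removal constant (assumed distributions)
        u = qmc.LatinHypercube(d=2, seed=3).random(2_000)
        c_in = lognorm.ppf(u[:, 0], s=0.6, scale=80.0)      # mg/L TSS (placeholder)
        k    = lognorm.ppf(u[:, 1], s=0.4, scale=30.0)      # m/yr (placeholder)
        A, Q = 2_000.0, 50_000.0                            # m2 surface area, m3/yr inflow
        c_out = k_c_star(c_in, k, q=Q / A)
        print(c_out.mean(), np.percentile(c_out, [5, 95]))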

  17. Using palaeoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    Phipps, Steven; King, Matt; Roberts, Jason; White, Duanne

    2017-04-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthermore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modelling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how palaeoclimate data can improve our ability to predict the future evolution of the AIS. A 50-member perturbed-physics ensemble is generated, spanning uncertainty in the parameterisations of three key physical processes within the model: (i) the stress balance within the ice sheet, (ii) basal sliding and (iii) calving of ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum (~21,000 years ago) to present. Palaeoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.
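
    Generating such a perturbed-physics design takes only a few lines. The parameter names and ranges below are placeholders standing in for the PISM parameterisations, not the values used in the study.

        import numpy as np
        from scipy.stats import qmc

        # 50-member Latin hypercube design over three illustrative parameters
        names = ["enhancement_factor", "sliding_exponent", "calving_threshold"]
        lower = [1.0, 1.0, 1.0e4]
        upper = [5.0, 3.0, 1.0e5]

        design = qmc.LatinHypercube(d=len(names), seed=2024).random(50)
        ensemble = qmc.scale(design, lower, upper)

        # Each row would be written out as one model configuration, e.g.:
        for i, member in enumerate(ensemble):
            params = dict(zip(names, member))
            print(f"member {i:02d}: " + ", ".join(f"{k}={v:.3g}" for k, v in params.items()))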

  18. Using paleoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    King, M. A.; Phipps, S. J.; Roberts, J. L.; White, D.

    2016-12-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthemore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modeling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how paleoclimate data can improve our ability to predict the future evolution of the AIS. A large, perturbed-physics ensemble is generated, spanning uncertainty in the parameterizations of four key physical processes within ice sheet models: ice rheology, ice shelf calving, and the stress balances within ice sheets and ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum ( 21,000 years ago) to present. Paleoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.

  19. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles

    2014-03-01

    A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah were conducted that included 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The range of total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when the results were plotted against any of the uncertain parameters, with no parameter manifesting dominant effects on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, further reduction of the predicted hydrogen uncertainty would require reducing all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point value estimates. Probabilistic analyses such as the one demonstrated in this work are important to properly characterize the response of complex systems such as severe accident progression in nuclear power plants.

  20. Constructing Rigorous and Broad Biosurveillance Networks for Detecting Emerging Zoonotic Outbreaks

    PubMed Central

    Brown, Mac; Moore, Leslie; McMahon, Benjamin; Powell, Dennis; LaBute, Montiago; Hyman, James M.; Rivas, Ariel; Jankowski, Mark; Berendzen, Joel; Loeppky, Jason; Manore, Carrie; Fair, Jeanne

    2015-01-01

    Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk of infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short- and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results with a known outbreak, using spatial analysis of all the simulation runs to show that the progression matched closely with the observed locations of the farms infected in the 2006-2007 epidemic. PMID:25946164

  1. Revolution or evolution? An analysis of E-health innovation and impact using a hypercube model.

    PubMed

    Wu, Jen-Her; Huang, An-Sheng; Hisa, Tzyh-Lih; Tsai, Hsien-Tang

    2006-01-01

    This study utilises a hypercube innovation model to analyse the changes in both healthcare informatics and medical related delivery models based on the innovations from Tele-healthcare, electronic healthcare (E-healthcare), to mobile healthcare (M-healthcare). Further, the critical impacts of these E-health innovations on the stakeholders: healthcare customers, hospitals, healthcare complementary providers and healthcare regulators are identified. Thereafter, the critical capabilities for adopting each innovation are discussed.

  2. Implementation and Performance Analysis of Parallel Assignment Algorithms on a Hypercube Computer.

    DTIC Science & Technology

    1987-12-01

    coupled processors because of the degree of interaction between processors imposed by the global memory [HwB84]. Another sub-class of MIMD... interaction between the individual processors [MuA87]. Many of the commercial MIMD computers available today are loosely coupled [HwB84]. 2.1.3 The Hypercube... Alpha-beta is a method usually employed in the solution of two-person zero-sum games like chess and checkers [Qui87]. The basic approach of the alpha

  3. Advanced Stochastic Collocation Methods for Polynomial Chaos in RAVEN

    NASA Astrophysics Data System (ADS)

    Talbot, Paul W.

    As experiment complexity in fields such as nuclear engineering continually increases, so does the demand for robust computational methods to simulate them. In many simulations, input design parameters and intrinsic experiment properties are sources of uncertainty. Often small perturbations in uncertain parameters have significant impact on the experiment outcome. For instance, in nuclear fuel performance, small changes in fuel thermal conductivity can greatly affect maximum stress on the surrounding cladding. The difficulty quantifying input uncertainty impact in such systems has grown with the complexity of numerical models. Traditionally, uncertainty quantification has been approached using random sampling methods like Monte Carlo. For some models, the input parametric space and corresponding response output space is sufficiently explored with few low-cost calculations. For other models, it is computationally costly to obtain good understanding of the output space. To combat the expense of random sampling, this research explores the possibilities of using advanced methods in Stochastic Collocation for generalized Polynomial Chaos (SCgPC) as an alternative to traditional uncertainty quantification techniques such as Monte Carlo (MC) and Latin Hypercube Sampling (LHS) methods for applications in nuclear engineering. We consider traditional SCgPC construction strategies as well as truncated polynomial spaces using Total Degree and Hyperbolic Cross constructions. We also consider applying anisotropy (unequal treatment of different dimensions) to the polynomial space, and offer methods whereby optimal levels of anisotropy can be approximated. We contribute development to existing adaptive polynomial construction strategies. Finally, we consider High-Dimensional Model Reduction (HDMR) expansions, using SCgPC representations for the subspace terms, and contribute new adaptive methods to construct them. We apply these methods on a series of models of increasing complexity. We use analytic models of various levels of complexity, then demonstrate performance on two engineering-scale problems: a single-physics nuclear reactor neutronics problem, and a multiphysics fuel cell problem coupling fuels performance and neutronics. Lastly, we demonstrate sensitivity analysis for a time-dependent fuels performance problem. We demonstrate the application of all the algorithms in RAVEN, a production-level uncertainty quantification framework.

  4. Curriculum Guide for Latin Heritage in Secondary Schools.

    ERIC Educational Resources Information Center

    Cutt, Eula; And Others

    A statement of educational objectives and an outline of course content for one year of Latin study are followed by a wide variety of sample lessons. They include lessons on grammar, etymology, and civilization. Pattern drills treating verbs and nouns are included. Sections containing sample conversations, stories, story-telling and note-taking,…

  5. Emulation and Sensitivity Analysis of the Community Multiscale Air Quality Model for a UK Ozone Pollution Episode.

    PubMed

    Beddows, Andrew V; Kitwiroon, Nutthida; Williams, Martin L; Beevers, Sean D

    2017-06-06

    Gaussian process emulation techniques have been used with the Community Multiscale Air Quality model, simulating the effects of input uncertainties on ozone and NO2 output, to allow robust global sensitivity analysis (SA). A screening process ranked the effect of perturbations in 223 inputs, isolating the 30 most influential from emissions, boundary conditions (BCs), and reaction rates. Community Multiscale Air Quality (CMAQ) simulations of a July 2006 ozone pollution episode in the UK were made with input values for these variables plus ozone dry deposition velocity chosen according to a 576 point Latin hypercube design. Emulators trained on the output of these runs were used in variance-based SA of the model output to input uncertainties. Performing these analyses for every hour of a 21 day period spanning the episode and several days on either side allowed the results to be presented as a time series of sensitivity coefficients, showing how the influence of different input uncertainties changed during the episode. This is one of the most complex models to which these methods have been applied, and here, they reveal detailed spatiotemporal patterns of model sensitivities, with NO and isoprene emissions, NO2 photolysis, ozone BCs, and deposition velocity being among the most influential input uncertainties.
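
    The emulate-then-analyse workflow can be sketched as follows with a toy stand-in for the air quality model; the Gaussian process kernel, the design size and the toy response are assumptions made for illustration, not the CMAQ setup.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def first_order_sobol(f, dim, n=4096, seed=0):
            # Saltelli-style first-order Sobol indices for f on the unit hypercube.
            rng = np.random.default_rng(seed)
            A, B = rng.random((n, dim)), rng.random((n, dim))
            fA, fB = f(A), f(B)
            var = np.var(np.concatenate([fA, fB]))
            S = np.empty(dim)
            for i in range(dim):
                ABi = A.copy()
                ABi[:, i] = B[:, i]
                S[i] = np.mean(fB * (f(ABi) - fA)) / var
            return S

        # Toy "air-quality model": peak ozone as an assumed function of three
        # scaled inputs (NOx emissions, VOC emissions, boundary-condition ozone).
        def toy_model(X):
            return 40 + 30 * X[:, 1] * (1 - 0.5 * X[:, 0]) + 15 * X[:, 2]

        # 1. training design (128 points here; the study used a 576-point design)
        design = qmc.LatinHypercube(d=3, seed=1).random(128)
        y_train = toy_model(design)

        # 2. Gaussian process emulator of the expensive model
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(design, y_train)

        # 3. variance-based SA runs on the cheap emulator, not the model itself
        print(first_order_sobol(lambda X: gp.predict(X), dim=3))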

  6. Energies and radial distributions of Bs mesons - the effect of hypercubic blocking

    NASA Astrophysics Data System (ADS)

    Koponen, Jonna

    2006-12-01

    This is a follow-up to our earlier work for the energies and the charge (vector) and matter (scalar) distributions for S-wave states in a heavy-light meson, where the heavy quark is static and the light quark has a mass about that of the strange quark. We study the radial distributions of higher angular momentum states, namely P- and D-wave states, using a "fuzzy" static quark. A new improvement is the use of hypercubic blocking in the time direction, which effectively constrains the heavy quark to move within a 2a hypercube (a is the lattice spacing). The calculation is carried out with dynamical fermions on a 16^3 × 32 lattice with a ≈ 0.10 fm generated using the non-perturbatively improved clover action. The configurations were generated by the UKQCD Collaboration using lattice action parameters β = 5.2, c_SW = 2.0171 and κ = 0.1350. In nature the closest equivalent of this heavy-light system is the Bs meson. Attempts are now being made to understand these results in terms of the Dirac equation.

  7. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.

    PubMed

    Barhen, J; Toomarian, N; Protopopescu, V

    1987-12-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  8. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot

    NASA Technical Reports Server (NTRS)

    Barhen, Jacob; Toomarian, N.; Protopopescu, V.

    1987-01-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  9. Maximum Torque and Momentum Envelopes for Reaction Wheel Arrays

    NASA Technical Reports Server (NTRS)

    Reynolds, R. G.; Markley, F. Landis

    2001-01-01

    Spacecraft reaction wheel maneuvers are limited by the maximum torque and/or angular momentum which the wheels can provide. For an n-wheel configuration, the torque or momentum envelope can be obtained by projecting the n-dimensional hypercube, representing the domain boundary of individual wheel torques or momenta, into three dimensional space via the 3xn matrix of wheel axes. In this paper, the properties of the projected hypercube are discussed, and algorithms are proposed for determining this maximal torque or momentum envelope for general wheel configurations. Practical implementation strategies for specific wheel configurations are also considered.
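
    Because the set of achievable wheel momenta is a convex polytope (the n-cube), its image under the 3xn wheel-axis matrix is the convex hull of the projected cube vertices, which gives a direct, if brute-force, way to compute the envelope. The sketch below uses this fact; the 4-wheel pyramid geometry is illustrative, not taken from the paper.

        import numpy as np
        from itertools import product
        from scipy.spatial import ConvexHull

        def momentum_envelope(W, h_max=1.0):
            # W is the 3 x n matrix of wheel spin-axis unit vectors; each wheel can
            # hold +/- h_max.  The envelope is the convex hull of the projections of
            # the 2**n cube vertices, so this brute force suits only modest n.
            n = W.shape[1]
            vertices = np.array(list(product([-h_max, h_max], repeat=n)))   # (2**n, n)
            projected = vertices @ W.T                                      # (2**n, 3)
            return ConvexHull(projected)

        # Example: a common 4-wheel pyramid configuration (illustrative axes)
        beta = np.deg2rad(54.7)
        W = np.array([[np.cos(beta), 0.0, -np.cos(beta), 0.0],
                      [0.0, np.cos(beta), 0.0, -np.cos(beta)],
                      [np.sin(beta), np.sin(beta), np.sin(beta), np.sin(beta)]])
        hull = momentum_envelope(W)
        print(f"envelope volume = {hull.volume:.3f} (units of h_max**3)")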

  10. Within-field and regional-scale accuracies of topsoil organic carbon content prediction from an airborne visible near-infrared hyperspectral image combined with synchronous field spectra for temperate croplands

    NASA Astrophysics Data System (ADS)

    Vaudour, Emmanuelle; Gilliot, Jean-Marc; Bel, Liliane; Lefevre, Josias; Chehdi, Kacem

    2016-04-01

    This study was carried out in the framework of the TOSCA-PLEIADES-CO project of the French Space Agency and benefited from data of the earlier PROSTOCK-Gessol3 project supported by the French Environment and Energy Management Agency (ADEME). It aimed at identifying the potential of airborne hyperspectral visible near-infrared AISA-Eagle data for predicting the topsoil organic carbon (SOC) content of bare cultivated soils over a large peri-urban area (221 km2) with intensive annual crop cultivation and both contrasted soils and SOC contents, located in the western region of Paris, France. Soils comprise hortic or glossic luvisols, calcaric, rendzic cambisols and colluvic cambisols. Airborne AISA-Eagle images (400-1000 nm, 126 bands) with 1 m resolution were acquired on 17 April 2013 over 13 tracks. Tracks were atmospherically corrected and then mosaicked at a 2 m resolution using a set of 24 synchronous field spectra of bare soils, black and white targets, and impervious surfaces. The land use identification system layer (RPG) of 2012 was used to mask non-agricultural areas; calculation and thresholding of NDVI from an atmospherically corrected SPOT4 image acquired the same day then made it possible to map agricultural fields with bare soil. A total of 101 sites, which were sampled either at the regional scale or within one field, were identified as bare by means of this map. Predictions were made from the mosaic AISA spectra, which were related to SOC contents by means of partial least squares regression (PLSR). Regression robustness was evaluated through a series of 1000 bootstrap data sets of calibration-validation samples, considering only the 75 sites outside cloud shadows, and different sampling strategies for selecting calibration samples. Validation root-mean-square errors (RMSE) ranged between 3.73 and 4.49 g kg-1, with a median of about 4 g kg-1. The best-performing models in terms of coefficient of determination (R²) and Residual Prediction Deviation (RPD) values were the calibration models derived either from Kennard-Stone or conditioned Latin hypercube sampling on smoothed spectra. However, the most generalizable model, leading to the lowest RMSE values of 3.73 g kg-1 at the regional scale and 1.44 g kg-1 at the within-field scale and low validation bias, was the cross-validated leave-one-out PLSR model constructed with the 28 near-synchronous samples and raw spectra.

  11. Living conditions and health. A population-based study of labour migrants and Latin American refugees in Sweden and those who were repatriated.

    PubMed

    Sundquist, J

    1995-06-01

    To examine whether there are differences in living conditions and self-rated health between South European labour migrants and Latin American refugees and those who were repatriated to Latin America. Analysis of data from a survey (face-to-face interviews) in 1991 of 338 Latin American refugees and 60 repatriated refugees. A random sample of 161 South European and 396 Finnish labour migrants from the Swedish Annual Level-of-Living Surveys 1980-1981 and 1988-89 was analysed. A random sample of 1,159 age-, sex- and education-matched Swedes served as controls. Lund, a medium-sized town in southern Sweden, Santiago and Montevideo, capitals of Chile and Uruguay, respectively, and Sweden. Labour migrants and refugees in particular lived in rented flats while Swedes lived in privately-owned one-family homes. All immigrants and in particular repatriated Latin Americans had low material standard and meagre economic resources compared with Swedes. Being a Latin American refugee, a South European or Finnish labour migrant were independent risk indicators of self-rated poor health in logistic regression (multivariate analyses). Not feeling secure in everyday life and poor leisure opportunities were independent risk factors for poor health with an estimated odds ratio of 3.13(2.09-4.45) and 1.57(1.22-2.00), respectively. This study shows a clear ethnic segregation in housing and other living conditions between Swedes and immigrants, where Latin American refugees and repatriated Latin Americans were most vulnerable. All immigrants had increased self-rated poor health compared with Swedes. Being an immigrant was a risk factor for poor health of equal importance to more traditional risk factors such as lifestyle factors.

  12. Common mental disorders in primary health care: differences between Latin American-born and Spanish-born residents in Madrid, Spain.

    PubMed

    Salinero-Fort, Miguel A; Jiménez-García, Rodrigo; de Burgos-Lunar, Carmen; Chico-Moraleja, Rosa M; Gómez-Campelo, Paloma

    2015-03-01

    Our main objective was to estimate and compare the prevalence of the most common mental disorders between Latin American-born and Spanish-born patients in Madrid, Spain. We also analyzed sociodemographic factors associated with these disorders and the role of the length of residency for Latin American-born patients. We performed a cross-sectional study to compare Latin American-born (n = 691) and Spanish-born outpatients (n = 903) from 15 primary health care centers in Madrid, Spain. The Primary Care Evaluation of Mental Disorders was used to diagnose common mental disorders. Sociodemographic, psychosocial, and migration data were collected. We detected common mental disorders in 49.9 % (95 % CI = 47.4-52.3 %) of the total sample. Values were higher in Latin American-born patients than in Spanish-born patients for any disorder (57.8 % vs. 43.9 %, p < 0.001), mood disorders (40.1 % vs. 34.8 %, p = 0.030), anxiety disorders (20.5 % vs. 15.3 %, p = 0.006), and somatoform disorders (18.1 % vs. 6.6 %, p < 0.001). There were no statistically significant differences in prevalence between Latin American-born patients with less than 5 years of residency and Latin American-born residents with 5 or more years of residency. Finally, multivariate analysis shows that gender, having/not having children, monthly income, geographic origin, and social support were significantly associated with several disorders. The sample was neither population-based nor representative of the general immigrant or autochthonous populations. The study provides further evidence of the high prevalence of common mental disorders in Latin American-born patients in Spain compared with Spanish-born patients.

  13. Accounting for uncertainty in model-based prevalence estimation: paratuberculosis control in dairy herds.

    PubMed

    Davidson, Ross S; McKendrick, Iain J; Wood, Joanna C; Marion, Glenn; Greig, Alistair; Stevenson, Karen; Sharp, Michael; Hutchings, Michael R

    2012-09-10

    A common approach to the application of epidemiological models is to determine a single (point estimate) parameterisation using the information available in the literature. However, in many cases there is considerable uncertainty about parameter values, reflecting both the incomplete nature of current knowledge and natural variation, for example between farms. Furthermore model outcomes may be highly sensitive to different parameter values. Paratuberculosis is an infection for which many of the key parameter values are poorly understood and highly variable, and for such infections there is a need to develop and apply statistical techniques which make maximal use of available data. A technique based on Latin hypercube sampling combined with a novel reweighting method was developed which enables parameter uncertainty and variability to be incorporated into a model-based framework for estimation of prevalence. The method was evaluated by applying it to a simulation of paratuberculosis in dairy herds which combines a continuous time stochastic algorithm with model features such as within herd variability in disease development and shedding, which have not been previously explored in paratuberculosis models. Generated sample parameter combinations were assigned a weight, determined by quantifying the model's resultant ability to reproduce prevalence data. Once these weights are generated the model can be used to evaluate other scenarios such as control options. To illustrate the utility of this approach these reweighted model outputs were used to compare standard test and cull control strategies both individually and in combination with simple husbandry practices that aim to reduce infection rates. The technique developed has been shown to be applicable to a complex model incorporating realistic control options. For models where parameters are not well known or subject to significant variability, the reweighting scheme allowed estimated distributions of parameter values to be combined with additional sources of information, such as that available from prevalence distributions, resulting in outputs which implicitly handle variation and uncertainty. This methodology allows for more robust predictions from modelling approaches by allowing for parameter uncertainty and combining different sources of information, and is thus expected to be useful in application to a large number of disease systems.

  14. Influence of Population Variation of Physiological Parameters in Computational Models of Space Physiology

    NASA Technical Reports Server (NTRS)

    Myers, J. G.; Feola, A.; Werner, C.; Nelson, E. S.; Raykin, J.; Samuels, B.; Ethier, C. R.

    2016-01-01

    The earliest manifestations of Visual Impairment and Intracranial Pressure (VIIP) syndrome become evident after months of spaceflight and include a variety of ophthalmic changes, including posterior globe flattening and distension of the optic nerve sheath. Prevailing evidence links the occurrence of VIIP to the cephalic fluid shift induced by microgravity and the subsequent pressure changes around the optic nerve and eye. Deducing the etiology of VIIP is challenging due to the wide range of physiological parameters that may be influenced by spaceflight and are required to address a realistic spectrum of physiological responses. Here, we report on the application of an efficient approach to interrogating physiological parameter space through computational modeling. Specifically, we assess the influence of uncertainty in input parameters for two models of VIIP syndrome: a lumped-parameter model (LPM) of the cardiovascular and central nervous systems, and a finite-element model (FEM) of the posterior eye, optic nerve head (ONH) and optic nerve sheath. Methods: To investigate the parameter space in each model, we employed Latin hypercube sampling partial rank correlation coefficient (LHSPRCC) strategies. LHS techniques outperform Monte Carlo approaches by enforcing efficient sampling across the entire range of all parameters. The PRCC method estimates the sensitivity of model outputs to these parameters while adjusting for the linear effects of all other inputs. The LPM analysis addressed uncertainties in 42 physiological parameters, such as initial compartmental volume and nominal compartment percentage of total cardiac output in the supine state, while the FEM evaluated the effects on biomechanical strain from uncertainties in 23 material and pressure parameters for the ocular anatomy. Results and Conclusion: The LPM analysis identified several key factors including high sensitivity to the initial fluid distribution. The FEM study found that intraocular pressure and intracranial pressure had dominant impact on the peak strains in the ONH and retro-laminar optic nerve, respectively; optic nerve and lamina cribrosa stiffness were also important. This investigation illustrates the ability of LHSPRCC to identify the most influential physiological parameters, which must therefore be well-characterized to produce the most accurate numerical results.

  15. Performance of a plasma fluid code on the Intel parallel computers

    NASA Technical Reports Server (NTRS)

    Lynch, V. E.; Carreras, B. A.; Drake, J. B.; Leboeuf, J. N.; Liewer, P.

    1992-01-01

    One approach to improving the real-time efficiency of plasma turbulence calculations is to use a parallel algorithm. A parallel algorithm for plasma turbulence calculations was tested on the Intel iPSC/860 hypercube and the Touchstone Delta machine. Using the 128 processors of the Intel iPSC/860 hypercube, a factor of 5 improvement over a single-processor CRAY-2 is obtained. For the Touchstone Delta machine, the corresponding improvement factor is 16. For plasma edge turbulence calculations, an extrapolation of the present results to the Intel (sigma) machine gives an improvement factor close to 64 over the single-processor CRAY-2.

  16. A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Rao, Hariprasad Nannapaneni

    1989-01-01

    The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.

  17. Model Based Predictive Control of Multivariable Hammerstein Processes with Fuzzy Logic Hypercube Interpolated Models

    PubMed Central

    Coelho, Antonio Augusto Rodrigues

    2016-01-01

    This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system where membership functions act as kernel functions of an interpolator. Conjunction of membership functions in a unit hypercube space enables multivariable interpolation in N dimensions. Membership functions act as interpolation kernels, such that the choice of membership functions determines interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the modeling problem of static nonlinearities since it is capable of modeling both a function and its inverse function. Three study cases from the literature are presented: a single-input single-output (SISO) system, a MISO and a MIMO system. Good results are obtained regarding performance metrics such as set-point tracking, control variation and robustness. The results demonstrate the applicability of the proposed method in modeling Hammerstein nonlinearities and their inverse functions for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC). PMID:27657723

  18. The Sensation Seeking Scale (SSS-V) and Its Use in Latin American Adolescents: Alcohol Consumption Pattern as an External Criterion for Its Validation.

    PubMed

    Schmidt, Vanina; Molina, María Fernanda; Raimundi, María Julia

    2017-11-01

    Sensation Seeking is a trait defined by the seeking of varied, novel, complex, and intense situations and experiences, and the willingness to take physical, social, and financial risks for the sake of such experience. The Sensation Seeking Scale (SSS-V) is the most widely used measure to assess this construct. In previous studies a variety of psychometric limitations were found when using the SSS-V with Latin American population. The purpose of this study is to present additional psychometric properties for its use with Latin American adolescents. It was applied to a 506 adolescent sample (from 12 to 20 years). The result is a scale of 22 items that cover four factors. It seems that sensation seeking among Latin American adolescents can be described in terms of four factors, but with some slightly content differences from what is usually found in adult samples from other countries. Future lines of research are proposed.

  19. The Sensation Seeking Scale (SSS-V) and Its Use in Latin American Adolescents: Alcohol Consumption Pattern as an External Criterion for Its Validation

    PubMed Central

    Schmidt, Vanina; Molina, María Fernanda; Raimundi, María Julia

    2017-01-01

    Sensation Seeking is a trait defined by the seeking of varied, novel, complex, and intense situations and experiences, and the willingness to take physical, social, and financial risks for the sake of such experience. The Sensation Seeking Scale (SSS-V) is the most widely used measure to assess this construct. In previous studies a variety of psychometric limitations were found when using the SSS-V with Latin American population. The purpose of this study is to present additional psychometric properties for its use with Latin American adolescents. It was applied to a 506 adolescent sample (from 12 to 20 years). The result is a scale of 22 items that cover four factors. It seems that sensation seeking among Latin American adolescents can be described in terms of four factors, but with some slightly content differences from what is usually found in adult samples from other countries. Future lines of research are proposed. PMID:29358988

  20. Assessing Degree of Susceptibility to Landslide Hazard

    NASA Astrophysics Data System (ADS)

    Sheridan, M. F.; Cordoba, G. A.; Delgado, H.; Stefanescu, R.

    2013-05-01

    The modeling of hazardous mass flows, both dry and water saturated, is currently an area of active research, and several stable models have now emerged that have differing degrees of physical and mathematical fidelity. Models based on the early work of Savage and Hutter (1989) assume that very large dense granular flows can be modeled as incompressible continua governed by a Coulomb failure criterion. Based on this concept, Patra et al. (2005) developed a code for dry avalanches, which proposes a thin-layer mathematical model similar to the shallow-water equations. This concept was implemented in the widely-used TITAN2D program, which integrates the shock-capturing Godunov solution methodology for the equation system. We propose a method to assess the susceptibility of specific locations to landslides following heavy tephra fall using the TITAN2D code. Successful application requires that the range of several uncertainties be framed in the selection of model input data: 1) initial conditions, such as the volume and location of origin of the landslide, 2) bed and internal friction parameters, and 3) digital elevation model (DEM) uncertainties. Among the possible ways of coping with these uncertainties, we chose to use Latin Hypercube Sampling (LHS). This statistical technique reduces a computationally intractable problem to such an extent that it is possible to apply it, even with current personal computers. LHS requires that there is only one sample in each row and each column of the sampling matrix, where each (multi-dimensional) row corresponds to one uncertainty. LHS requires less than 10% of the sample runs needed by Monte Carlo approaches to achieve a stable solution. In our application, LHS provides model sampling for 4 input parameters: initial random volumes, UTM location (x and y), and bed friction. We developed a simple Octave script to link the output of LHS with TITAN2D. In this way, TITAN2D can run several times with successively different initial conditions provided by the LHS routine. Finally, the set of results from TITAN2D is processed to obtain the distribution of maximum exceedance probability given that a landslide occurs at a place of interest. We apply this method to find sectors least prone to be affected by landslides in a region along the Panamerican Highway in the southern part of Colombia. The goal of such a study is to provide decision makers with information to improve their assessments regarding permissions for development along the highway.
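
    As a concrete illustration of the one-sample-per-row-and-column property mentioned above, the sketch below draws a small Latin hypercube design for four TITAN2D-style inputs (initial volume, UTM x, UTM y, bed friction); the parameter ranges and sample size are placeholder assumptions, not values from the study:

        import numpy as np

        def latin_hypercube(n_samples, bounds, seed=None):
            """Basic LHS: each variable's range is split into n_samples equal strata and exactly
            one point is drawn per stratum; shuffling the strata independently per variable
            leaves a single sample in each row and column of the sampling matrix."""
            rng = np.random.default_rng(seed)
            n_dim = len(bounds)
            strata = rng.permuted(np.tile(np.arange(n_samples), (n_dim, 1)), axis=1).T
            u = (strata + rng.random((n_samples, n_dim))) / n_samples    # stratified in [0, 1)
            lo = np.array([b[0] for b in bounds])
            hi = np.array([b[1] for b in bounds])
            return lo + u * (hi - lo)

        # Hypothetical ranges: volume (m^3), UTM easting (m), UTM northing (m), bed friction (deg).
        bounds = [(1e4, 1e6), (620_000, 640_000), (130_000, 150_000), (15.0, 35.0)]
        samples = latin_hypercube(64, bounds, seed=42)
        print(samples.shape)        # (64, 4): one TITAN2D run per row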

  1. Reduced-Order Model for the Geochemical Impacts of Carbon Dioxide, Brine and Trace Metal Leakage into an Unconfined, Oxidizing Carbonate Aquifer, Version 2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacon, Diana H.

    2013-03-31

    The National Risk Assessment Partnership (NRAP) consists of five U.S. DOE national laboratories collaborating to develop a framework for predicting the risks associated with carbon sequestration. The approach taken by NRAP is to divide the system into components, including injection target reservoirs, wellbores, natural pathways including faults and fractures, groundwater, and the atmosphere; to develop a detailed, physics- and chemistry-based model of each component; to use the results of the detailed models to develop efficient, simplified models, termed reduced-order models (ROMs), for each component; and finally to integrate the component ROMs into a system model that calculates risk profiles for the site. This report details the development of the Groundwater Geochemistry ROM for the Edwards Aquifer at PNNL. The Groundwater Geochemistry ROM for the Edwards Aquifer uses a Wellbore Leakage ROM developed at LANL as input. The detailed model, using the STOMP simulator, covers a 5x8 km area of the Edwards Aquifer near San Antonio, Texas. The model includes heterogeneous hydraulic properties and equilibrium, kinetic, and sorption reactions between groundwater, leaked CO2 gas, brine, and the aquifer carbonate and clay minerals. Latin Hypercube sampling was used to generate 1024 samples of input parameters. For each of these input samples, the STOMP simulator was used to predict the flux of CO2 to the atmosphere, and the volume, length and width of the aquifer where pH was less than the MCL standard, and TDS, arsenic, cadmium and lead exceeded MCL standards. In order to decouple the Wellbore Leakage ROM from the Groundwater Geochemistry ROM, the response surface was transformed to replace Wellbore Leakage ROM input parameters with instantaneous and cumulative CO2 and brine leakage rates. The most sensitive parameters proved to be the CO2 and brine leakage rates from the well, with equilibrium coefficients for calcite and dolomite, as well as the number of illite and kaolinite sorption sites, proving to be of secondary importance. The Groundwater Geochemistry ROM was developed using nonlinear regression to fit the response surface with a quadratic polynomial. The goodness of fit was excellent for the CO2 flux to the atmosphere, and very good for predicting the volumes of groundwater exceeding the pH, TDS, As, Cd and Pb threshold values.
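
    A minimal sketch of the reduced-order-model construction described above, namely fitting a quadratic polynomial response surface by least squares to outputs evaluated at Latin hypercube sample points; the two-input toy response below stands in for the STOMP simulations and is not the Edwards Aquifer ROM itself:

        import numpy as np
        from itertools import combinations_with_replacement

        def quadratic_design_matrix(X):
            """Columns: 1, x_i, and all products x_i*x_j (full quadratic polynomial)."""
            n, d = X.shape
            cols = [np.ones(n)] + [X[:, i] for i in range(d)]
            cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(d), 2)]
            return np.column_stack(cols)

        rng = np.random.default_rng(0)
        # Stand-in for 1024 LHS samples of two inputs (e.g. CO2 and brine leakage rates).
        n = 1024
        u = (rng.permuted(np.tile(np.arange(n), (2, 1)), axis=1).T + rng.random((n, 2))) / n
        X = u * [1e-3, 5e-4]                              # hypothetical leakage-rate ranges
        y = 2.0 + 300.0 * X[:, 0] + 80.0 * X[:, 1] - 5e4 * X[:, 0] * X[:, 1]   # toy "simulator"
        coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)  # regression fit
        print(coef)                                       # recovered polynomial coefficients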

  2. iTOUGH2 V6.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan A.

    2010-11-01

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in geosciences, reservoir engineering, and other application areas. It supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files. This link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analysis. A detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and the publications cited there. Hardware requirements: multi-platform. Related/auxiliary software: PVM (if running in parallel).

  3. iTOUGH2 v7.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL

    2016-09-15

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences and reservoir engineering and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files, via the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.

  4. Deriving global parameter estimates for the Noah land surface model using FLUXNET and machine learning

    NASA Astrophysics Data System (ADS)

    Chaney, Nathaniel W.; Herman, Jonathan D.; Ek, Michael B.; Wood, Eric F.

    2016-11-01

    With their origins in numerical weather prediction and climate modeling, land surface models aim to accurately partition the surface energy balance. An overlooked challenge in these schemes is the role of model parameter uncertainty, particularly at unmonitored sites. This study provides global parameter estimates for the Noah land surface model using 85 eddy covariance sites in the global FLUXNET network. The at-site parameters are first calibrated using a Latin Hypercube-based ensemble of the most sensitive parameters, determined by the Sobol method, to be the minimum stomatal resistance (rs,min), the Zilitinkevich empirical constant (Czil), and the bare soil evaporation exponent (fxexp). Calibration leads to an increase in the mean Kling-Gupta Efficiency performance metric from 0.54 to 0.71. These calibrated parameter sets are then related to local environmental characteristics using the Extra-Trees machine learning algorithm. The fitted Extra-Trees model is used to map the optimal parameter sets over the globe at a 5 km spatial resolution. The leave-one-out cross validation of the mapped parameters using the Noah land surface model suggests that there is the potential to skillfully relate calibrated model parameter sets to local environmental characteristics. The results demonstrate the potential to use FLUXNET to tune the parameterizations of surface fluxes in land surface models and to provide improved parameter estimates over the globe.
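
    A sketch of the regionalization step described above, in which an Extra-Trees regressor maps site environmental characteristics onto calibrated Noah parameters (rs,min, Czil, fxexp); the feature set and the random training data are placeholder assumptions, not the FLUXNET data:

        import numpy as np
        from sklearn.ensemble import ExtraTreesRegressor

        rng = np.random.default_rng(1)
        # Hypothetical site descriptors: mean annual temperature, precipitation, LAI, sand fraction.
        X_sites = rng.random((85, 4)) * [30.0, 2000.0, 6.0, 1.0]
        # Hypothetical calibrated parameter sets per site: [rs_min, Czil, fxexp].
        theta = np.column_stack([
            40.0 + 200.0 * rng.random(85),
            0.05 + 0.9 * rng.random(85),
            0.2 + 3.8 * rng.random(85),
        ])

        # Multi-output regression from environment to calibrated parameters.
        model = ExtraTreesRegressor(n_estimators=400, random_state=0)
        model.fit(X_sites, theta)

        # "Mapping" step: predict parameter sets for unmonitored grid cells.
        X_grid = rng.random((1000, 4)) * [30.0, 2000.0, 6.0, 1.0]
        theta_grid = model.predict(X_grid)
        print(theta_grid.shape)     # (1000, 3): one parameter set per grid cell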

  5. [The Latin American Psychiatrist: profile and degree of satisfaction with the specialty].

    PubMed

    Córdoba, R N; Cano, J F; Alzate, M; Olarte, A F; Salazar, I; Cendales, R

    2009-01-01

    The primary objective is to describe the profile of the psychiatrist members of a national psychiatry association in 19 Latin American countries (Argentina, Bolivia, Brazil, Colombia, Costa Rica, Cuba, Dominican Republic, Ecuador, Chile, El Salvador, Guatemala, Honduras, Mexico, Nicaragua, Peru, Panama, Paraguay, Uruguay, and Venezuela). Secondary objectives are to evaluate job satisfaction and examine the factors related to job satisfaction. A total of 8,028 psychiatrists, members of a national psychiatry association in Latin America, were identified. A probabilistic stratified sample of 2,465 psychiatrists was designed and they were asked to fill out an anonymous electronic survey. A sample of 1,292 Latin American psychiatrists was obtained between April 2005 and July 2006 (52.4% of the designed sample). Response rates exceeded 70% in 11 countries. Mean age was 48.2 years and mean experience was 18.2 years; 63.8% were male and 99.9% of the surveyed psychiatrists were working as psychiatrists. Most of the respondents reported being satisfied with their quality of life (70.8%), and a larger percentage reported being satisfied with their work (86.4%). However, 35.3% of the psychiatrists reported being dissatisfied with the income received from their work as psychiatrists. Factors associated with job dissatisfaction are described. In Latin America, there is lower satisfaction with the incomes obtained from psychiatric practice and with quality of life. Nonetheless, the level of commitment to the profession itself and job satisfaction remain similar to those reported in developed countries.

  6. Optimal design of a piezoelectric transducer for exciting guided wave ultrasound in rails

    NASA Astrophysics Data System (ADS)

    Ramatlo, Dineo A.; Wilke, Daniel N.; Loveday, Philip W.

    2017-02-01

    An existing Ultrasonic Broken Rail Detection System installed in South Africa on a heavy duty railway line is currently being upgraded to include defect detection and location. To accomplish this, an ultrasonic piezoelectric transducer to strongly excite a guided wave mode with energy concentrated in the web (web mode) of a rail is required. A previous study demonstrated that the recently developed SAFE-3D (Semi-Analytical Finite Element - 3 Dimensional) method can effectively predict the guided waves excited by a resonant piezoelectric transducer. In this study, the SAFE-3D model is used in the design optimization of a rail web transducer. A bound-constrained optimization problem was formulated to maximize the energy transmitted by the transducer in the web mode when driven by a pre-defined excitation signal. Dimensions of the transducer components were selected as the three design variables. A Latin hypercube sampled design of experiments that required a total of 500 SAFE-3D analyses in the design space was employed in a response surface-based optimization approach. The Nelder-Mead optimization algorithm was then used to find an optimal transducer design on the constructed response surface. The radial basis function response surface was first verified by comparing a number of predicted responses against the computed SAFE-3D responses. The performance of the optimal transducer predicted by the optimization algorithm on the response surface was also verified to be sufficiently accurate using SAFE-3D. The computational advantages of SAFE-3D in optimal transducer design are noteworthy as more than 500 analyses were performed. The optimal design was then manufactured and experimental measurements were used to validate the predicted performance. The adopted design method has demonstrated the capability to automate the design of transducers for a particular rail cross-section and frequency range.
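
    A sketch of the response-surface optimization workflow described above, assuming a radial basis function surrogate built on Latin hypercube design points and searched with Nelder-Mead; the three normalized design variables and the toy objective stand in for the SAFE-3D analyses:

        import numpy as np
        from scipy.interpolate import RBFInterpolator
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)

        def toy_energy(x):                  # stand-in for a SAFE-3D web-mode energy analysis
            return -np.exp(-np.sum((x - [0.6, 0.3, 0.7]) ** 2, axis=-1))

        # 500-point Latin hypercube design over three normalized transducer dimensions.
        n, d = 500, 3
        X = (rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T + rng.random((n, d))) / n
        y = toy_energy(X)

        surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

        # Nelder-Mead search on the surrogate (minimizing negative transmitted energy).
        res = minimize(lambda x: surrogate(x[None, :])[0], x0=np.full(d, 0.5),
                       method="Nelder-Mead")
        print(res.x)    # candidate optimum, to be verified with a full SAFE-3D analysis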

  7. Development and Implementation of the DTOPLATS-MP land surface model over the Continental US at 30 meters

    NASA Astrophysics Data System (ADS)

    Chaney, N.; Wood, E. F.

    2014-12-01

    The increasing accessibility of high-resolution land data (< 100 m) and high-performance computing allows improved parameterizations of subgrid hydrologic processes in macroscale land surface models. Continental-scale fully distributed modeling at these spatial scales is possible; however, its practicality for operational use is still unknown due to uncertainties in input data, model parameters, and storage requirements. To address these concerns, we propose a modeling framework that provides the spatial detail of a fully distributed model yet maintains the benefits of a semi-distributed model. In this presentation we will introduce DTOPLATS-MP, a coupling between the NOAH-MP land surface model and the Dynamic TOPMODEL hydrologic model. This new model captures a catchment's spatial heterogeneity by clustering high-resolution land datasets (soil, topography, and land cover) into hundreds of hydrologically similar units (HSUs). A prior DEM analysis defines the connections between each HSU. At each time step, the 1D land surface model updates each HSU; the HSUs then interact laterally via the subsurface and surface. When compared to the fully distributed form of the model, this framework allows a significant decrease in computation and storage while providing most of the same information and enabling parameter transferability. As a proof of concept, we will show how this new modeling framework can be run over CONUS at a 30-meter spatial resolution. For each catchment in the WBD HUC-12 dataset, the model is run between 2002 and 2012 using available high-resolution continental-scale land and meteorological datasets over CONUS (dSSURGO, NLCD, NED, and NCEP Stage IV). For each catchment, the model is run with 1000 model parameter sets obtained from a Latin hypercube sample. This exercise will illustrate the feasibility of running the model operationally at continental scales while accounting for model parameter uncertainty.

  8. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. In 2007, Brookhaven National Laboratory (BNL) entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaborative endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. This report was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined with imperfect correlation. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. The results are summarized in Section 5 and Appendices G through I.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object, reducing it to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
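
    A minimal sketch of the surrogate step described above, i.e. a Gaussian process fit to drag coefficients computed at Latin hypercube sample points; the two-parameter toy drag function and the choice of library are illustrative assumptions, not the "Automated RSM" code itself:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        rng = np.random.default_rng(7)

        # LHS over two normalized inputs (e.g. speed ratio and surface temperature ratio).
        n, d = 1000, 2
        X = (rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T + rng.random((n, d))) / n
        cd = 2.2 + 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * np.sin(6 * X[:, 0])  # toy drag coefficient

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                                      normalize_y=True)
        gp.fit(X, cd)

        mean, std = gp.predict(np.array([[0.4, 0.8]]), return_std=True)
        print(mean[0], std[0])      # interpolated drag coefficient with uncertainty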

  10. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequence identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
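
    A sketch of the reliability-physics idea described above, in which the human error probability is the chance that the operators' performance time exceeds the available phenomenological time; the lognormal and Weibull distributions and their parameters are hypothetical, not the values estimated for MNR:

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000

        # Phenomenological time (min): time available before core damage, e.g. summarized
        # from thermal-hydraulic response-surface results (hypothetical lognormal).
        t_phenom = rng.lognormal(mean=np.log(45.0), sigma=0.35, size=n)

        # Performance time (min): time operators need to relocate the core,
        # e.g. elicited from operator interviews (hypothetical Weibull, scale 20 min).
        t_perform = 20.0 * rng.weibull(a=2.5, size=n)

        # Human error probability: the manual action is not completed in time.
        hep = np.mean(t_perform > t_phenom)
        print(f"HEP ~ {hep:.4f}")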

  11. Constructing rigorous and broad biosurveillance networks for detecting emerging zoonotic outbreaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Mac; Moore, Leslie; McMahon, Benjamin

    Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios, and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk for infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results with a known outbreak, using spatial analysis of all the simulation runs to show the progression matched closely with the observed location of the farms infected in the 2006-2007 epidemic.

  12. Drivers And Uncertainties Of Increasing Global Water Scarcity

    NASA Astrophysics Data System (ADS)

    Scherer, L.; Pfister, S.

    2015-12-01

    Water scarcity threatens ecosystems and human health and hampers economic development. It generally depends on the ratio of water consumption to availability. We calculated global, spatially explicit water stress indices (WSIs) which describe the vulnerability to additional water consumption on a scale from 0 (low) to 1 (high) and compare them for the decades 1981-1990 and 2001-2010. Input data are obtained from a multi-model ensemble at a resolution of 0.5 degrees. The variability among the models was used to run 1000 Monte Carlo simulations (Latin hypercube sampling) and to subsequently estimate uncertainties of the WSIs. Globally, a trend of increasing water scarcity can be observed; however, uncertainties are large. The probability that this trend is actually occurring is as low as 53%. The increase in WSIs is driven more by higher water use than by lower water availability. Water availability is only 40% likely to decrease whereas water consumption is 67% likely to increase. Independent of the trend, we are already living under water-scarce conditions, which is reflected in a consumption-weighted average of monthly WSIs of 0.51 in the recent decade. Its coefficient of variation of 0.8 points to the high uncertainties entailed, which might still hide poor model performance where all models consistently over- or underestimate water availability or use. Especially in arid areas, models generally overestimate availability. Although we do not traverse the planetary boundary of freshwater use, as global water availability is sufficient, local water scarcity might be high. Therefore, the regionalized assessment of WSIs under uncertainty helps to focus on specific regions to optimise water consumption. These global results can also help to raise awareness of water scarcity, and to suggest relevant measures such as more water-efficient technologies to international companies, which have to deal with complex and distributed supply chains (e.g. in food production).

  13. Probabilistic seismic history matching using binary images

    NASA Astrophysics Data System (ADS)

    Davolio, Alessandra; Schiozer, Denis Jose

    2018-02-01

    Currently, the goal of history-matching procedures is not only to provide a model matching any observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin Hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included into history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude, or pressure and saturation. In any case, seismic inversions and/or modeling are required, which can be time consuming. An alternative to avoid these procedures is using binary images in SHM, as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build-up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS. The methodology is very flexible and successfully utilized the addition of binary images for seismic objective functions. Results proved the good convergence of the method in a few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well matching quality. The second case provided models presenting pore pressure changes according to the expected dynamic behavior (pressurized zones) observed on 4DS data. The use of binary images in SHM is relatively new, with few examples in the literature. This work enriches this discussion by presenting a new application to match pressure in a reservoir segment with complex pressure behavior.

  14. Assessing the impact of intervention strategies against Taenia solium cysticercosis using the EPICYST transmission model.

    PubMed

    Winskill, Peter; Harrison, Wendy E; French, Michael D; Dixon, Matthew A; Abela-Ridder, Bernadette; Basáñez, María-Gloria

    2017-02-09

    The pork tapeworm, Taenia solium, and associated human infections, taeniasis, cysticercosis and neurocysticercosis, are serious public health problems, especially in developing countries. The World Health Organization (WHO) has set goals for having a validated strategy for control and elimination of T. solium taeniasis/cysticercosis by 2015 and interventions scaled-up in selected countries by 2020. Timely achievement of these internationally-endorsed targets requires that the relative benefits and effectiveness of potential interventions be explored rigorously within a quantitative framework. A deterministic, compartmental transmission model (EPICYST) was developed to capture the dynamics of the taeniasis/cysticercosis disease system in the human and pig hosts. Cysticercosis prevalence in humans, an outcome of high epidemiological and clinical importance, was explicitly modelled. A next-generation matrix approach was used to derive an expression for the basic reproduction number, R0. A full sensitivity analysis was performed using a methodology based on Latin hypercube sampling and the partial rank correlation coefficient index. EPICYST outputs indicate that chemotherapeutic intervention targeted at humans or pigs would be highly effective at reducing taeniasis and cysticercosis prevalence when applied singly, with annual chemotherapy of humans and pigs resulting, respectively, in 94 and 74% of human cysticercosis cases averted. Improved sanitation, meat inspection and animal husbandry are less effective but are still able to reduce prevalence singly or in combination. The value of R0 for taeniasis was estimated at 1.4 (95% credible interval: 0.5-3.6). Human- and pig-targeted drug-focussed interventions appear to be the most efficacious approach from the options currently available. The model presented is a forward step towards developing an informed control and elimination strategy for cysticercosis. Together with its validation against field data, EPICYST will be a valuable tool to help reach the WHO goals and to conduct economic evaluations of interventions in varying epidemiological settings.
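
    A minimal sketch of the LHS-PRCC sensitivity analysis named above: parameters are Latin hypercube sampled, the model output is computed for each sample, all quantities are rank-transformed, and the partial correlation of each parameter with the output is computed after regressing out the remaining parameters; the three-parameter toy model below is a placeholder for EPICYST:

        import numpy as np
        from scipy.stats import rankdata

        def lhs(n, d, rng):
            return (rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T + rng.random((n, d))) / n

        def prcc(X, y):
            """Partial rank correlation coefficient of each column of X with y."""
            R = np.column_stack([rankdata(c) for c in X.T])
            ry = rankdata(y)
            out = []
            for j in range(R.shape[1]):
                others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
                res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
                res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
                out.append(np.corrcoef(res_x, res_y)[0, 1])
            return np.array(out)

        rng = np.random.default_rng(5)
        theta = lhs(1000, 3, rng)              # e.g. treatment rate, contact rate, pig turnover
        prevalence = 0.5 - 0.3 * theta[:, 0] + 0.2 * theta[:, 1] ** 2 + 0.01 * rng.random(1000)
        print(prcc(theta, prevalence))         # monotone influence of each parameter on the output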

  15. Performance Improvement of a Return Channel in a Multistage Centrifugal Compressor Using Multiobjective Optimization.

    PubMed

    Nishida, Yoshifumi; Kobayashi, Hiromi; Nishida, Hideo; Sugimura, Kazuyuki

    2013-05-01

    The effect of the design parameters of a return channel on the performance of a multistage centrifugal compressor was numerically investigated, and the shape of the return channel was optimized using a multiobjective optimization method based on a genetic algorithm to improve the performance of the centrifugal compressor. The results of sensitivity analysis using Latin hypercube sampling suggested that the inlet-to-outlet area ratio of the return vane affected the total pressure loss in the return channel, and that the inlet-to-outlet radius ratio of the return vane affected the outlet flow angle from the return vane. Moreover, this analysis suggested that the number of return vanes affected both the loss and the flow angle at the outlet. As a result of optimization, the number of return vanes was increased from 14 to 22 and the area ratio was decreased from 0.71 to 0.66. The radius ratio was also decreased from 2.1 to 2.0. Performance tests on a centrifugal compressor with two return channels (the original design and the optimized design) were carried out using a two-stage test apparatus. The measured flow distribution exhibited a swirl flow in the center region and a reversed swirl flow near the hub and shroud sides. The exit flow of the optimized design was more uniform than that of the original design. For the optimized design, the overall two-stage efficiency and pressure coefficient were increased by 0.7% and 1.5%, respectively. Moreover, the second-stage efficiency and pressure coefficient were respectively increased by 1.0% and 3.2%. It is considered that the increase in the second-stage efficiency was caused by the increased uniformity of the flow, and the rise in the pressure coefficient was caused by a decrease in the residual swirl flow. It was thus concluded from the numerical and experimental results that the optimized return channel improved the performance of the multistage centrifugal compressor.

  16. Capturing spatial heterogeneity of soil organic carbon under changing climate

    NASA Astrophysics Data System (ADS)

    Mishra, U.; Fan, Z.; Jastrow, J. D.; Matamala, R.; Vitharana, U.

    2015-12-01

    The spatial heterogeneity of the land surface affects water, energy, and greenhouse gas exchanges with the atmosphere. Designing observation networks that capture land surface spatial heterogeneity is a critical scientific challenge. Here, we present a geospatial approach to capture the existing spatial heterogeneity of soil organic carbon (SOC) stocks across Alaska, USA. We used the standard deviation of 556 georeferenced SOC profiles previously compiled in Mishra and Riley (2015, Biogeosciences, 12:3993-4004) to calculate the number of observations that would be needed to reliably estimate Alaskan SOC stocks. This analysis indicated that 906 randomly distributed observation sites would be needed to quantify the mean value of SOC stocks across Alaska at a confidence interval of ± 5 kg m-2. We then used soil-forming factors (climate, topography, land cover types, surficial geology) to identify the locations of appropriately distributed observation sites by using the conditioned Latin hypercube sampling approach. Spatial correlation and variogram analyses demonstrated that the spatial structures of soil-forming factors were adequately represented by these 906 sites. Using the spatial correlation length of existing SOC observations, we identified that 484 new observation sites would be needed to provide the best estimate of the present status of SOC stocks in Alaska. We then used average decadal projections (2020-2099) of precipitation, temperature, and length of growing season for three representative concentration pathway (RCP 4.5, 6.0, and 8.5) scenarios of the Intergovernmental Panel on Climate Change to investigate whether the locations of the identified observation sites will shift under future climate. Our results showed that 12-41 additional observation sites (depending on the emission scenario) will be required to capture the impact of projected climatic conditions by 2100 on the spatial heterogeneity of Alaskan SOC stocks. Our results represent an ideal distribution of observation sites across Alaska that captures the land surface spatial heterogeneity and can be used in efforts to quantify SOC stocks, monitor greenhouse gas emissions, and benchmark Earth System Model results.
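
    A small worked sketch of the sample-size step described above, translating the spatial variability of SOC profiles into the number of randomly located sites needed for a ± 5 kg m-2 confidence interval on the mean; the standard deviation used here is a hypothetical placeholder, not the value derived from the 556 Alaskan profiles:

        import math

        def sites_needed(sd, half_width, z=1.96):
            """Smallest n such that z * sd / sqrt(n) <= half_width (95% CI by default)."""
            return math.ceil((z * sd / half_width) ** 2)

        sd_soc = 75.0        # hypothetical spatial standard deviation of SOC stocks, kg m-2
        print(sites_needed(sd_soc, half_width=5.0))   # number of randomly located observation sites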

  17. A clinically parameterized mathematical model of Shigella immunity to inform vaccine design

    PubMed Central

    Wahid, Rezwanul; Toapanta, Franklin R.; Simon, Jakub K.; Sztein, Marcelo B.

    2018-01-01

    We refine and clinically parameterize a mathematical model of the humoral immune response against Shigella, a diarrheal bacteria that infects 80-165 million people and kills an estimated 600,000 people worldwide each year. Using Latin hypercube sampling and Monte Carlo simulations for parameter estimation, we fit our model to human immune data from two Shigella EcSf2a-2 vaccine trials and a rechallenge study in which antibody and B-cell responses against Shigella′s lipopolysaccharide (LPS) and O-membrane proteins (OMP) were recorded. The clinically grounded model is used to mathematically investigate which key immune mechanisms and bacterial targets confer immunity against Shigella and to predict which humoral immune components should be elicited to create a protective vaccine against Shigella. The model offers insight into why the EcSf2a-2 vaccine had low efficacy and demonstrates that at a group level a humoral immune response induced by EcSf2a-2 vaccine or wild-type challenge against Shigella′s LPS or OMP does not appear sufficient for protection. That is, the model predicts an uncontrolled infection of gut epithelial cells that is present across all best-fit model parameterizations when fit to EcSf2a-2 vaccine or wild-type challenge data. Using sensitivity analysis, we explore which model parameter values must be altered to prevent the destructive epithelial invasion by Shigella bacteria and identify four key parameter groups as potential vaccine targets or immune correlates: 1) the rate that Shigella migrates into the lamina propria or epithelium, 2) the rate that memory B cells (BM) differentiate into antibody-secreting cells (ASC), 3) the rate at which antibodies are produced by activated ASC, and 4) the Shigella-specific BM carrying capacity. This paper underscores the need for a multifaceted approach in ongoing efforts to design an effective Shigella vaccine. PMID:29304144

  18. A clinically parameterized mathematical model of Shigella immunity to inform vaccine design.

    PubMed

    Davis, Courtney L; Wahid, Rezwanul; Toapanta, Franklin R; Simon, Jakub K; Sztein, Marcelo B

    2018-01-01

    We refine and clinically parameterize a mathematical model of the humoral immune response against Shigella, a diarrheal bacteria that infects 80-165 million people and kills an estimated 600,000 people worldwide each year. Using Latin hypercube sampling and Monte Carlo simulations for parameter estimation, we fit our model to human immune data from two Shigella EcSf2a-2 vaccine trials and a rechallenge study in which antibody and B-cell responses against Shigella's lipopolysaccharide (LPS) and O-membrane proteins (OMP) were recorded. The clinically grounded model is used to mathematically investigate which key immune mechanisms and bacterial targets confer immunity against Shigella and to predict which humoral immune components should be elicited to create a protective vaccine against Shigella. The model offers insight into why the EcSf2a-2 vaccine had low efficacy and demonstrates that at a group level a humoral immune response induced by EcSf2a-2 vaccine or wild-type challenge against Shigella's LPS or OMP does not appear sufficient for protection. That is, the model predicts an uncontrolled infection of gut epithelial cells that is present across all best-fit model parameterizations when fit to EcSf2a-2 vaccine or wild-type challenge data. Using sensitivity analysis, we explore which model parameter values must be altered to prevent the destructive epithelial invasion by Shigella bacteria and identify four key parameter groups as potential vaccine targets or immune correlates: 1) the rate that Shigella migrates into the lamina propria or epithelium, 2) the rate that memory B cells (BM) differentiate into antibody-secreting cells (ASC), 3) the rate at which antibodies are produced by activated ASC, and 4) the Shigella-specific BM carrying capacity. This paper underscores the need for a multifaceted approach in ongoing efforts to design an effective Shigella vaccine.

  19. Assessing the Impact of Model Parameter Uncertainty in Simulating Grass Biomass Using a Hybrid Carbon Allocation Strategy

    NASA Astrophysics Data System (ADS)

    Reyes, J. J.; Adam, J. C.; Tague, C.

    2016-12-01

    Grasslands play an important role in agricultural production as forage for livestock; they also provide a diverse set of ecosystem services including soil carbon (C) storage. The partitioning of C between above and belowground plant compartments (i.e. allocation) is influenced by both plant characteristics and environmental conditions. The objectives of this study are to 1) develop and evaluate a hybrid C allocation strategy suitable for grasslands, and 2) apply this strategy to examine the importance of various parameters related to biogeochemical cycling, photosynthesis, allocation, and soil water drainage on above and belowground biomass. We include allocation as an important process in quantifying the model parameter uncertainty, which identifies the most influential parameters and what processes may require further refinement. For this, we use the Regional Hydro-ecologic Simulation System, a mechanistic model that simulates coupled water and biogeochemical processes. A Latin hypercube sampling scheme was used to develop parameter sets for calibration and evaluation of allocation strategies, as well as parameter uncertainty analysis. We developed the hybrid allocation strategy to integrate both growth-based and resource-limited allocation mechanisms. When evaluating the new strategy simultaneously for above and belowground biomass, it produced a larger number of less biased parameter sets: 16% more compared to resource-limited and 9% more compared to growth-based. This also demonstrates its flexible application across diverse plant types and environmental conditions. We found that higher parameter importance corresponded to sub- or supra-optimal resource availability (i.e. water, nutrients) and temperature ranges (i.e. too hot or cold). For example, photosynthesis-related parameters were more important at sites warmer than the theoretical optimal growth temperature. Therefore, larger values of parameter importance indicate greater relative sensitivity in adequately representing the relevant process to capture limiting resources or manage atypical environmental conditions. These results may inform future experimental work by focusing efforts on quantifying specific parameters under various environmental conditions or across diverse plant functional types.

  20. Constructing rigorous and broad biosurveillance networks for detecting emerging zoonotic outbreaks

    DOE PAGES

    Brown, Mac; Moore, Leslie; McMahon, Benjamin; ...

    2015-05-06

    Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios, and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk for infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results with a known outbreak, using spatial analysis of all the simulation runs to show the progression matched closely with the observed location of the farms infected in the 2006-2007 epidemic.

  1. [Empathy, inter-professional collaboration, and lifelong medical learning in Spanish and Latin-American physicians-in-training who start their postgraduate training in hospitals in Spain. Preliminary outcomes].

    PubMed

    San-Martín, Montserrat; Roig-Carrera, Helena; Villalonga-Vadell, Rosa M; Benito-Sevillano, Carmen; Torres-Salinas, Miquel; Claret-Teruel, Gemma; Robles, Bernabé; Sans-Boix, Antonia; Alcorta-Garza, Adelina; Vivanco, Luis

    2017-01-01

    To identify similarities and differences in empathy, abilities toward inter-professional collaboration, and lifelong medical learning between Spanish and Latin-American physicians-in-training who start their postgraduate training in teaching hospitals in Spain. Observational study using self-administered questionnaires. Five teaching hospitals in the province of Barcelona, Spain. Spanish and Latin-American physicians-in-training who started their first year of postgraduate medical training. Empathy was measured using the Jefferson scale of empathy. Abilities for inter-professional collaboration were measured using the Jefferson scale of attitudes towards nurse-physician collaboration. Learning was measured using the Jefferson scale of lifelong medical learning. From a sample of 156 physicians-in-training, 110 from Spain and 40 from Latin America, the Spanish group showed the highest empathy (p<.05). On the other hand, Latin-American physicians had the highest scores in lifelong learning abilities (p<.001). A positive relationship was found between empathy and inter-professional collaboration for the whole sample (r=+0.34; p<.05). These results confirm previous preliminary data and underline the positive influence of empathy on the development of inter-professional collaboration abilities. In Latin-American physicians who start postgraduate training programs, lifelong learning abilities have a positive influence on the development of other professional competencies. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  2. A hypercube compact neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rostykus, P.L.; Somani, A.K.

    1988-09-01

    A major problem facing implementation of neural networks is the connection problem. One popular tradeoff is to remove connections, but random disconnection severely degrades the capabilities. The hypercube-based Compact Neural Network (CNN) has a structured architecture that, combined with a rearrangement of the memory vectors, gives a larger input space and more graceful degradation than a cost-equivalent network with more connections. The CNNs are based on a Hopfield network. The changes from the Hopfield net include states of -1 and +1; when a node's input evaluates to 0, it is not biased either positive or negative but instead resumes its previous state. Here L is the number of processing elements (PEs), N is the number of memories, and t_ij are the weights between nodes i and j.
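
    A small sketch of the update rule described above, assuming a conventional Hopfield-style network with -1/+1 states in which a node whose weighted input sums to zero keeps its previous state; the random memories are placeholders, and the hypercube connection structure of the CNN is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(2)
        L, N = 32, 3                                   # L processing elements, N stored memories
        memories = rng.choice([-1, 1], size=(N, L))
        T = (memories.T @ memories) / L                # Hebbian weights t_ij
        np.fill_diagonal(T, 0.0)

        def update(state, T, sweeps=5):
            s = state.copy()
            for _ in range(sweeps):
                for i in range(len(s)):
                    h = T[i] @ s
                    if h != 0.0:                       # zero input: resume (keep) previous state
                        s[i] = 1 if h > 0 else -1
            return s

        probe = memories[0].copy()
        probe[:3] *= -1                                # corrupt three bits of the first memory
        print(np.array_equal(update(probe, T), memories[0]))   # recall of the stored memory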

  3. Mapping implicit spectral methods to distributed memory architectures

    NASA Technical Reports Server (NTRS)

    Overman, Andrea L.; Vanrosendale, John

    1991-01-01

    Spectral methods have proven invaluable in the numerical simulation of PDEs (Partial Differential Equations), but the frequent global communication required raises a fundamental barrier to their use on highly parallel architectures. To explore this issue, a 3-D implicit spectral method was implemented on an Intel hypercube. Utilization of about 50 percent was achieved on a 32-node iPSC/860 hypercube for a 64 x 64 x 64 Fourier-spectral grid; finer grids yield higher utilizations. Chebyshev-spectral grids are more problematic, since plane-relaxation based multigrid is required. However, by using a semicoarsening multigrid algorithm, and by relaxing all multigrid levels concurrently, relatively high utilizations were also achieved in this harder case.

  4. Maximum Torque and Momentum Envelopes for Reaction Wheel Arrays

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Reynolds, Reid G.; Liu, Frank X.; Lebsock, Kenneth L.

    2009-01-01

    Spacecraft reaction wheel maneuvers are limited by the maximum torque and/or angular momentum that the wheels can provide. For an n-wheel configuration, the torque or momentum envelope can be obtained by projecting the n-dimensional hypercube, representing the domain boundary of individual wheel torques or momenta, into three dimensional space via the 3xn matrix of wheel axes. In this paper, the properties of the projected hypercube are discussed, and algorithms are proposed for determining this maximal torque or momentum envelope for general wheel configurations. Practical strategies for distributing a prescribed torque or momentum among the n wheels are presented, with special emphasis on configurations of four, five, and six wheels.
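
    A sketch of the envelope construction described above: the 2^n vertices of the n-dimensional hypercube of individual wheel torques are projected through the 3xn matrix of wheel axes, and their convex hull is the maximum torque envelope; the four-wheel pyramid geometry and per-wheel torque limit are illustrative assumptions, not taken from the paper:

        import itertools
        import numpy as np
        from scipy.spatial import ConvexHull

        # Wheel spin axes as columns of the 3 x n projection matrix (hypothetical 4-wheel pyramid).
        beta = np.deg2rad(54.73)
        W = np.array([[np.cos(beta),  0.0, -np.cos(beta),  0.0],
                      [0.0,  np.cos(beta),  0.0, -np.cos(beta)],
                      [np.sin(beta), np.sin(beta), np.sin(beta), np.sin(beta)]])

        tau_max = 0.2   # assumed torque limit per wheel, N*m
        corners = np.array(list(itertools.product([-tau_max, tau_max], repeat=W.shape[1])))
        envelope = ConvexHull(corners @ W.T)   # project the 2^n hypercube corners into torque space

        print(envelope.volume)                 # volume of the achievable torque set
        print(len(envelope.vertices))          # corners of the projected polytope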

  5. Ordered fast Fourier transforms on a massively parallel hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Tong, Charles; Swarztrauber, Paul N.

    1991-01-01

    The present evaluation of alternative designs for ordered radix-2 decimation-in-frequency FFT algorithms on massively parallel hypercube processors gives attention to reducing communication, which dominates the computation time. A combination of the order and computational phases of the FFT is accordingly employed, in conjunction with sequence-to-processor maps which reduce communication. Two orderings, 'standard' and 'cyclic', in which the order of the transform is the same as that of the input sequence, can be implemented with ease on the Connection Machine (where orderings are determined by geometries and priorities). A parallel method for trigonometric coefficient computation is presented which does not employ trigonometric functions or interprocessor communication.

  6. PC-CUBE: A Personal Computer Based Hypercube

    NASA Technical Reports Server (NTRS)

    Ho, Alex; Fox, Geoffrey; Walker, David; Snyder, Scott; Chang, Douglas; Chen, Stanley; Breaden, Matt; Cole, Terry

    1988-01-01

    PC-CUBE is an ensemble of IBM PCs or close compatibles connected in the hypercube topology with ordinary computer cables. Communication occurs at a rate of 115.2 kbaud via the RS-232 serial links. Available for PC-CUBE are the Crystalline Operating System III (CrOS III), the Mercury Operating System, and CUBIX and PLOTIX, which are parallel I/O and graphics libraries. A CrOS performance monitor was developed to facilitate the measurement of communication and computation time of a program and their effects on performance. Also available are CXLISP, a parallel version of the XLISP interpreter; GRAFIX, some graphics routines for the EGA and CGA; and a general execution profiler for determining execution time spent by program subroutines. PC-CUBE provides a programming environment similar to that of other hypercube systems running CrOS III, Mercury and CUBIX. In addition, every node (personal computer) has its own graphics display monitor and storage devices. These allow data to be displayed or stored at every processor, which has much instructional value and enables easier debugging of applications. Some application programs taken from the book Solving Problems on Concurrent Processors (Fox 88) were implemented with graphics enhancements on PC-CUBE. The applications range from the Mandelbrot set, the Laplace equation, the wave equation, and long-range force interaction to WaTor, an ecological simulation.

  7. Ordered fast Fourier transforms on a massively parallel hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Tong, Charles; Swarztrauber, Paul N.

    1989-01-01

    Design alternatives for ordered Fast Fourier Transform (FFT) algorithms were examined on massively parallel hypercube multiprocessors such as the Connection Machine. Particular emphasis is placed on reducing communication, which is known to dominate the overall computing time. To this end, the order and computational phases of the FFT were combined, and sequence-to-processor maps that reduce communication were used. The class of ordered transforms is expanded to include any FFT in which the order of the transform is the same as that of the input sequence. Two such orderings are examined, namely standard-order and A-order, which can be implemented with equal ease on the Connection Machine, where orderings are determined by geometries and priorities. If the sequence has N = 2^r elements and the hypercube has P = 2^d processors, then a standard-order FFT can be implemented with d + r/2 + 1 parallel transmissions. An A-order sequence can be transformed with 2d - r/2 parallel transmissions, which is r - d + 1 fewer than the standard order. A parallel method for computing the trigonometric coefficients is presented that does not use trigonometric functions or interprocessor communication. A performance of 0.9 GFLOPS was obtained for an A-order transform on the Connection Machine.

  8. Fossil vs. non-fossil sources of fine carbonaceous aerosols in four Chinese cities during the extreme winter haze episode of 2013

    NASA Astrophysics Data System (ADS)

    Zhang, Y.-L.; Huang, R.-J.; El Haddad, I.; Ho, K.-F.; Cao, J.-J.; Han, Y.; Zotter, P.; Bozzetti, C.; Daellenbach, K. R.; Canonaco, F.; Slowik, J. G.; Salazar, G.; Schwikowski, M.; Schnelle-Kreis, J.; Abbaszade, G.; Zimmermann, R.; Baltensperger, U.; Prévôt, A. S. H.; Szidat, S.

    2015-02-01

    During winter 2013, extremely high concentrations (i.e., 4-20 times higher than the World Health Organization guideline) of PM2.5 (particulate matter with an aerodynamic diameter < 2.5 μm) mass concentrations (24 h samples) were found in four major cities in China including Xi'an, Beijing, Shanghai and Guangzhou. Statistical analysis of a combined data set from elemental carbon (EC), organic carbon (OC), 14C and biomass-burning marker measurements using Latin hypercube sampling allowed a quantitative source apportionment of carbonaceous aerosols. Based on 14C measurements of EC fractions (six samples each city), we found that fossil emissions from coal combustion and vehicle exhaust dominated EC with a mean contribution of 75 ± 8% across all sites. The remaining 25 ± 8% was exclusively attributed to biomass combustion, consistent with the measurements of biomass-burning markers such as anhydrosugars (levoglucosan and mannosan) and water-soluble potassium (K+). With a combination of the levoglucosan-to-mannosan and levoglucosan-to-K+ ratios, the major source of biomass burning in winter in China is suggested to be combustion of crop residues. The contribution of fossil sources to OC was highest in Beijing (58 ± 5%) and decreased from Shanghai (49 ± 2%) to Xi'an (38 ± 3%) and Guangzhou (35 ± 7%). Generally, a larger fraction of fossil OC was from secondary origins than primary sources for all sites. Non-fossil sources accounted on average for 55 ± 10 and 48 ± 9% of OC and total carbon (TC), respectively, which suggests that non-fossil emissions were very important contributors of urban carbonaceous aerosols in China. The primary biomass-burning emissions accounted for 40 ± 8, 48 ± 18, 53 ± 4 and 65 ± 26% of non-fossil OC for Xi'an, Beijing, Shanghai and Guangzhou, respectively. Other non-fossil sources excluding primary biomass burning were mainly attributed to formation of secondary organic carbon (SOC) from non-fossil precursors such as biomass-burning emissions. For each site, we also compared samples from moderately to heavily polluted days according to particulate matter mass. Despite a significant increase of the absolute mass concentrations of primary emissions from both fossil and non-fossil sources during the heavily polluted events, their relative contribution to TC was even decreased, whereas the portion of SOC was consistently increased at all sites. This observation indicates that SOC was an important fraction in the increment of carbonaceous aerosols during the haze episode in China.

  9. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314

  10. National working conditions surveys in Latin America: comparison of methodological characteristics.

    PubMed

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G

    2015-01-01

    High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region.

  11. Stochastic species abundance models involving special copulas

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry E.

    2018-01-01

    Copulas offer a very general tool to describe the dependence structure of random variables supported by the hypercube. Inspired by problems of species abundances in biology, we study three distinct toy models where copulas play a key role. In the first, a Marshall-Olkin copula arises in a species extinction model with catastrophe. In the second, a quasi-copula problem arises in a flagged species abundance model. In the third model, we study completely random species abundance models in the hypercube, namely those that are not of product type, have uniform margins and are singular. These can be understood from a singular copula supported by an inflated simplex. An exchangeable singular Dirichlet copula is also introduced, together with its induced completely random species abundance vector.
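    As a loose illustration of the first toy model, the bivariate Marshall-Olkin construction can be sampled from three independent exponential "shock" times; the Python sketch below (with arbitrary rates not taken from the paper) produces pairs with uniform margins whose joint law is the Marshall-Olkin survival copula, including the singular component generated by the common shock.

      import numpy as np

      rng = np.random.default_rng(0)

      # Shock rates: one catastrophe per species plus a common catastrophe.
      # (Arbitrary illustrative values, not from the paper.)
      lam1, lam2, lam12 = 1.0, 2.0, 0.5
      n = 10_000

      z1 = rng.exponential(1 / lam1, n)    # shock hitting species 1 only
      z2 = rng.exponential(1 / lam2, n)    # shock hitting species 2 only
      z12 = rng.exponential(1 / lam12, n)  # common (catastrophic) shock

      x1 = np.minimum(z1, z12)             # extinction time of species 1
      x2 = np.minimum(z2, z12)             # extinction time of species 2

      # Each lifetime passed through its own survival function has a uniform
      # margin; the joint law of (u1, u2) is the Marshall-Olkin survival copula.
      u1 = np.exp(-(lam1 + lam12) * x1)
      u2 = np.exp(-(lam2 + lam12) * x2)

      print(np.corrcoef(u1, u2)[0, 1])     # positive dependence from the common shock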

  12. Performance analysis of parallel branch and bound search with the hypercube architecture

    NASA Technical Reports Server (NTRS)

    Mraz, Richard T.

    1987-01-01

    With the availability of commercial parallel computers, researchers are examining new classes of problems which might benefit from parallel computing. This paper presents results of an investigation of the class of search-intensive problems. The specific problem discussed is the Least-Cost Branch and Bound search method of deadline job scheduling. An object-oriented design methodology was used to map the problem onto a parallel solution. While the initial design was good for a prototype, the best performance resulted from fine-tuning the algorithm for a specific computer. The experiments analyze the computation time, the speedup over a VAX 11/785, and the load balance of the problem when using a loosely coupled multiprocessor system based on the hypercube architecture.

  13. A site-oriented supercomputer for theoretical physics: The Fermilab Advanced Computer Program Multi-Array Processor System (ACPMAPS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, T.; Atac, R.; Cook, A.

    1989-03-06

    The ACPMAPS multiprocessor is a highly cost-effective, local-memory parallel computer with a hypercube or compound hypercube architecture. Communication requires the attention of only the two communicating nodes. The design is aimed at floating-point-intensive, grid-like problems, particularly those with extreme computing requirements. The processing nodes of the system are single-board array processors, each with a peak power of 20 Mflops, supported by 8 Mbytes of data and 2 Mbytes of instruction memory. The system currently being assembled has a peak power of 5 Gflops. The nodes are based on the Weitek XL chip set. The system delivers performance at approximately $300/Mflop. 8 refs., 4 figs.

  14. Daily Family Conflict and Emotional Distress among Adolescents from Latin American, Asian, and European Backgrounds

    ERIC Educational Resources Information Center

    Chung, Grace H.; Flook, Lisa; Fuligni, Andrew J.

    2009-01-01

    The authors employed a daily diary method to assess daily frequencies of interparental and parent-adolescent conflict over a 2-week period and their implications for emotional distress across the high school years in a longitudinal sample of 415 adolescents from Latin American, Asian, and European backgrounds. Although family conflict remained…

  15. Beyond Growth: The Next Stage in Language and Area Studies.

    DTIC Science & Technology

    1984-04-01

    [Search-result excerpt consisting of fragmented table material on language and area studies: enrollment, applicant and sample-size figures for Latin American languages (Spanish, Portuguese, Quechua, Quiche Maya), which enjoy a favorable situation on all four dimensions surveyed, alongside Middle Eastern languages (colloquial Arabic); sample sizes range from about 10 to over 7,000.]

  16. Cross cultural analysis of factors associated with age at natural menopause among Latin-American immigrants to Madrid and their Spanish neighbors.

    PubMed

    Pérez-Alcalá, Irene; Sievert, Lynnette Leidy; Obermeyer, Carla Makhlouf; Reher, David Sven

    2013-01-01

    In this study, age at menopause was examined in relation to demographic and life style factors among Latin-American immigrants to Madrid and their Spanish counterparts. Respondents were drawn from the Decisions at Menopause Study (2002-2003) and from a recent sample of Latin-American immigrants to Madrid (2010-2011). The final sample included 484 women after excluding women with induced menopause and use of HT. Probit analyses and Cox proportional hazard models were used to estimate median age at menopause and to identify factors associated with an early age at menopause. Median estimated age at natural menopause was 52.0 years (51.2-53.0) for Spanish women and 50.5 years (49.9-51.2) for Latin-American women. Immigrant women were more likely to reach menopause at an earlier age after controlling for confounding factors. Nulliparity and lower levels of education were associated with an earlier age at menopause. A higher body mass index was associated with a later age at menopause in the Spanish model. Among the Latin-American sample, women from the Dominican Republic and women who underwent menopause before migrating were more likely to reach menopause at an earlier age. The results reported here demonstrate that early life events, including place of birth, and later life events, such as timing of migration, were associated with age at menopause. This study highlights the importance of taking into account differences in the age of onset of menopause in the multicultural population of Madrid when considering the health of women at midlife and beyond. Copyright © 2013 Wiley Periodicals, Inc.

  17. Polyhedra and Higher Dimensions.

    ERIC Educational Resources Information Center

    Scott, Paul

    1988-01-01

    Describes the definition and characteristics of a regular polyhedron, tessellation, and pseudopolyhedra with diagrams. Discusses the nature of simplex, hypercube, and cross-polytope in the fourth dimension and beyond. (YP)

  18. Homework and Primary-School Students' Academic Achievement in Latin America

    ERIC Educational Resources Information Center

    Murillo, F. Javier; Martinez-Garrido, Cynthia

    2014-01-01

    This paper explores teachers' habits (1) in terms of setting homework for their students and (2) in terms of building on homework in the classroom. Based on data collected in UNESCO's Second Regional Comparative and Explanatory Study (SERCE), the sample size of this analysis is about 200,000 Primary Grade 3 and 6 students in 16 Latin American…

  19. Using Response Surface Methods to Correlate the Modal Test of an Inflatable Test Article

    NASA Technical Reports Server (NTRS)

    Gupta, Anju

    2013-01-01

    This paper presents a practical application of response surface methods (RSM) to correlate a finite element model of a structural modal test. The test article is a quasi-cylindrical inflatable structure which primarily consists of a fabric weave, with an internal bladder and metallic bulkheads on either end. To limit model size, the fabric weave was simplified by representing it with shell elements. The task at hand is to represent the material behavior of the weave. The success of the model correlation is measured by comparing the four major modal frequencies of the analysis model to the four major modal frequencies of the test article. Given that only individual strap material properties were provided and material properties of the overall weave were not available, defining the material properties of the finite element model became very complex. First it was necessary to determine which material properties (modulus of elasticity in the hoop and longitudinal directions, shear modulus, Poisson's ratio, etc.) affected the modal frequencies. Then a Latin hypercube design over the parameter space was created to form an efficiently distributed, finite case set. Each case was then analyzed, and the results were used as input to the RSM. In the resulting response surface it was possible to see how each material parameter affected the modal frequencies of the analysis model. If the modal frequencies of the analysis model and its corresponding parameters match the test with acceptable accuracy, the model correlation can be considered successful.
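    The workflow described here (a space-filling sample over a few material parameters followed by a response-surface fit) can be sketched generically. The Python snippet below is a toy illustration only: the parameter names, ranges and stand-in frequency function are invented for the example and scipy's stats.qmc module is assumed to be available; none of it is taken from the test article.

      import numpy as np
      from scipy.stats import qmc

      # Hypothetical weave parameters and ranges (illustrative only), in GPa.
      lower = np.array([1.0, 1.0, 0.1])     # E_hoop, E_long, G_shear lower bounds
      upper = np.array([10.0, 10.0, 2.0])   # upper bounds

      # Latin hypercube design over the 3-D parameter space.
      design = qmc.LatinHypercube(d=3, seed=1).random(n=30)
      X = qmc.scale(design, lower, upper)

      def first_modal_freq(p):
          """Stand-in for running the FE modal analysis at one parameter set."""
          return 5.0 + 0.8 * np.sqrt(p[0]) + 0.5 * np.sqrt(p[1]) + 2.0 * p[2]

      y = np.apply_along_axis(first_modal_freq, 1, X)

      # Quadratic (diagonal) response surface fitted by least squares.
      A = np.column_stack([np.ones(len(X)), X, X**2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(coef)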

  20. Robust human body model injury prediction in simulated side impact crashes.

    PubMed

    Golman, Adam J; Danelson, Kerry A; Stitzel, Joel D

    2016-01-01

    This study developed a parametric methodology to robustly predict occupant injuries sustained in real-world crashes using a finite element (FE) human body model (HBM). One hundred and twenty near-side impact motor vehicle crashes were simulated over a range of parameters using Toyota RAV4 (bullet vehicle) and Ford Taurus (struck vehicle) FE models and a validated HBM, the Total HUman Model for Safety (THUMS). Three bullet vehicle crash parameters (speed, location and angle) and two occupant parameters (seat position and age) were varied using a Latin hypercube design of experiments. Four injury metrics (head injury criterion, half deflection, thoracic trauma index and pelvic force) were used to calculate injury risk. Rib fracture prediction and lung strain metrics were also analysed. As hypothesized, bullet speed had the greatest effect on each injury measure. Injury risk was reduced when the bullet location was further from the B-pillar or when the bullet angle was more oblique. Age correlated strongly with rib fracture frequency and lung strain severity. The injuries from a real-world crash were predicted using two different methods: (1) subsampling the injury predictors from the 12 simulations that best matched the crush profile and (2) using regression models. Both injury prediction methods successfully predicted the case occupant's low risk of pelvic injury and high risk of thoracic injury, rib fractures and high lung strains, with tight confidence intervals. This parametric methodology was successfully used to explore crash parameter interactions and to robustly predict real-world injuries.

  1. Effect sizes and cut-off points: a meta-analytical review of burnout in Latin American countries.

    PubMed

    García-Arroyo, Jose; Osca Segovia, Amparo

    2018-05-02

    Burnout is a highly prevalent, globalized health issue that causes significant physical and psychological health problems. In Latin America, research on this topic has increased in recent years; however, there are no studies comparing results across countries, nor normative reference cut-off points. The present meta-analysis examines the intensity of burnout (emotional exhaustion, cynicism and personal accomplishment) in 58 adult nonclinical samples from 8 countries (Argentina, Brazil, Chile, Colombia, Ecuador, Mexico, Peru and Venezuela). We found low intensity of burnout overall, but there are significant differences between countries in emotional exhaustion explained by occupation and language. Social and human service professionals (police officers, social workers, public administration staff) are more exhausted than health professionals (physicians, nurses) or teachers. The Portuguese-language samples score higher in emotional exhaustion than the Spanish-language samples, supporting the theory of cultural relativism. Demographic variables (sex, age) and study variables (sample size, instrument) were not found to be significant predictors of burnout. The effect sizes and confidence intervals found are proposed as a useful baseline for research and medical diagnosis of burnout in Latin American countries.

  2. Monte Carlo Study of Four-Dimensional Self-avoiding Walks of up to One Billion Steps

    NASA Astrophysics Data System (ADS)

    Clisby, Nathan

    2018-04-01

    We study self-avoiding walks on the four-dimensional hypercubic lattice via Monte Carlo simulations of walks with up to one billion steps. We study the expected logarithmic corrections to scaling, and find convincing evidence in support of the scaling form predicted by the renormalization group, with an estimate for the power of the logarithmic factor of 0.2516(14), which is consistent with the predicted value of 1/4. We also characterize the behaviour of the pivot algorithm for sampling four-dimensional self-avoiding walks, and conjecture that the probability of a pivot move being successful for an N-step walk is O([ log N ]^{-1/4}).

  3. Domain decomposition methods in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Gropp, William D.; Keyes, David E.

    1991-01-01

    The divide-and-conquer paradigm of iterative domain decomposition, or substructuring, has become a practical tool in computational fluid dynamic applications because of its flexibility in accommodating adaptive refinement through locally uniform (or quasi-uniform) grids, its ability to exploit multiple discretizations of the operator equations, and the modular pathway it provides towards parallelism. These features are illustrated on the classic model problem of flow over a backstep using Newton's method as the nonlinear iteration. Multiple discretizations (second-order in the operator and first-order in the preconditioner) and locally uniform mesh refinement pay dividends separately, and they can be combined synergistically. Sample performance results are included from an Intel iPSC/860 hypercube implementation.

  4. Efficient uncertainty quantification in fully-integrated surface and subsurface hydrologic simulations

    NASA Astrophysics Data System (ADS)

    Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.

    2018-01-01

    Although high performance computers and advanced numerical methods have made the application of fully-integrated surface and subsurface flow and transport models such as HydroGeoSphere commonplace, run times for large complex basin models can still be on the order of days to weeks, thus limiting the usefulness of traditional workhorse algorithms for uncertainty quantification (UQ) such as Latin hypercube simulation (LHS) or Monte Carlo simulation (MCS), which generally require thousands of simulations to achieve an acceptable level of accuracy. In this paper we investigate non-intrusive polynomial chaos expansion (PCE) for uncertainty quantification, which, in contrast to random sampling methods (e.g., LHS and MCS), represents a model response of interest as a weighted sum of polynomials over the random inputs. Once a chaos expansion has been constructed, approximating the mean, covariance, probability density function, cumulative distribution function, and other common statistics as well as local and global sensitivity measures is straightforward and computationally inexpensive, thus making PCE an attractive UQ method for hydrologic models with long run times. Our polynomial chaos implementation was validated through comparison with analytical solutions as well as solutions obtained via LHS for simple numerical problems. It was then used to quantify parametric uncertainty in a series of numerical problems with increasing complexity, including a two-dimensional fully-saturated, steady flow and transient transport problem with six uncertain parameters and one quantity of interest; a one-dimensional variably-saturated column test involving transient flow and transport, four uncertain parameters, and two quantities of interest at 101 spatial locations and five different times each (1010 total); and a three-dimensional fully-integrated surface and subsurface flow and transport problem for a small test catchment involving seven uncertain parameters and three quantities of interest at 241 different times each. Numerical experiments show that polynomial chaos is an effective and robust method for quantifying uncertainty in fully-integrated hydrologic simulations, which provides a rich set of features and is computationally efficient. Our approach has the potential for significant speedup over existing sampling-based methods when the number of uncertain model parameters is modest (≤ 20). To our knowledge, this is the first implementation of the algorithm in a comprehensive, fully-integrated, physically-based three-dimensional hydrosystem model.
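    To make the contrast with sampling-based UQ concrete, here is a minimal non-intrusive polynomial chaos sketch in Python for a single standard-normal input, fitted by least-squares regression on a handful of model evaluations. The "model", truncation order and sample size are invented for illustration and are unrelated to HydroGeoSphere.

      import numpy as np
      from math import factorial
      from numpy.polynomial import hermite_e as He   # probabilists' Hermite basis

      rng = np.random.default_rng(2)

      def model(xi):
          """Stand-in for an expensive simulator with one standard-normal input."""
          return np.exp(0.3 * xi) + 0.1 * xi**2

      # A few "simulations" at sampled input values.
      xi = rng.standard_normal(50)
      y = model(xi)

      # Fit a degree-4 Hermite chaos expansion by least squares.
      degree = 4
      Psi = np.column_stack([He.hermeval(xi, np.eye(degree + 1)[k])
                             for k in range(degree + 1)])
      coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

      # Mean and variance follow directly from the coefficients:
      # E[y] = c0 and Var[y] = sum_k k! * c_k^2 for probabilists' Hermite polynomials.
      mean = coef[0]
      var = sum(factorial(k) * coef[k] ** 2 for k in range(1, degree + 1))
      print(mean, var)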

  5. CROSS CULTURAL ANALYSIS OF DETERMINANTS OF HOT FLASHES AND NIGHT SWEATS: LATIN-AMERICAN IMMIGRANTS TO MADRID AND THEIR SPANISH NEIGHBORS

    PubMed Central

    Pérez-Alcalá, Irene; Sievert, Lynnette Leidy; Obermeyer, Carla Makhlouf; Reher, David Sven

    2013-01-01

    Objective This study applies a biocultural perspective to better understand the determinants of hot flashes and night sweats within immigrant and local populations in Madrid, Spain. Methods A combined sample of 575 women from Madrid, aged 45 to 55, was drawn from two studies. The Spanish sample (n=274) participated in the Decisions at Menopause Study (DAMES) in 2000–2002. The Latin-American sample (n=301) was drawn from immigrants to Madrid in 2010–2011. Chi-square analyses and logistic regression models were carried out on the combined sample, controlling for origin of provenance. Results Forty-four percent of the women reported hot flashes, 36% reported night sweats and 26% reported both symptoms. Compared to Spanish women, Latin-American women were less likely to report hot flashes (OR 0.7, 95% CI 0.4–0.9) after controlling for demographic variables and menopausal status. The same was not found for night sweats or for both symptoms combined. Determinants of hot flashes differed from determinants of night sweats. Conclusions Because determinants differed, hot flashes and night sweats should be queried and analyzed separately. Latin-American women were less likely to report hot flashes, but not night sweats or both symptoms combined. More research is needed to clarify the differences in reported hot flashes, as the lower reporting among immigrants could reflect a cultural rather than a biological phenomenon. PMID:23571525

  6. Exploring the spatial variability of soil properties in an Alfisol Catena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosemary, F.; Vitharana, U. W. A.; Indraratne, S. P.

    Detailed digital soil maps showing the spatial heterogeneity of soil properties consistent with the landscape are required for site-specific management of plant nutrients, land use planning and process-based environmental modeling. We characterized the short-scale spatial heterogeneity of soil properties in an Alfisol catena in a tropical landscape of Sri Lanka. The impact of different land uses (paddy, vegetable and uncultivated) was examined to assess the impact of anthropogenic activities on the variability of soil properties at the catenary level. Conditioned Latin hypercube sampling was used to collect 58 geo-referenced topsoil samples (0–30 cm) from the study area. Soil samples were analyzed for pH, electrical conductivity (EC), organic carbon (OC), cation exchange capacity (CEC) and texture. The spatial correlation between soil properties was analyzed by computing cross-variograms and subsequently fitting theoretical models. Spatial distribution maps were developed using ordinary kriging. The ranges of soil properties, pH: 4.3–7.9; EC: 0.01–0.18 dS m^-1; OC: 0.1–1.37%; CEC: 0.44–11.51 cmol(+) kg^-1; clay: 1.5–25%; and sand: 59.1–84.4%, and their coefficients of variation indicated a large variability in the study area. Electrical conductivity and pH showed a strong spatial correlation, reflected by a cross-variogram close to the hull of perfect correlation. Moreover, cross-variograms calculated for EC and clay, CEC and OC, CEC and clay, and CEC and pH indicated weak positive spatial correlations between these properties. The relative nugget effect (RNE) calculated from variograms showed strongly structured spatial variability for pH, EC and sand content (RNE < 25%), while CEC, organic carbon and clay content showed moderately structured spatial variability (25% < RNE < 75%). Spatial dependencies for the examined soil properties ranged from 48 to 984 m. Mixed-effects model fitting followed by Tukey's post-hoc test showed a significant effect of land use on the spatial variability of EC. Our study revealed a structured variability of topsoil properties in the selected tropical Alfisol catena. Except for EC, the observed variability was not modified by land use. The investigated soil properties showed distinct spatial structures at different scales and magnitudes of strength. Our results will be useful for digital soil mapping, site-specific management of soil properties, developing appropriate land use plans and quantifying anthropogenic impacts on the soil system.
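    Analyses of this kind rest on empirical (semi)variograms computed from the geo-referenced samples. A minimal Python sketch of that computation is given below; the coordinates and values are synthetic stand-ins, since the survey data themselves are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic stand-ins for 58 geo-referenced topsoil samples (x, y in metres).
      coords = rng.uniform(0, 1000, size=(58, 2))
      values = rng.normal(6.0, 0.8, size=58)       # e.g. pH-like values

      # Pairwise separation distances and half squared differences.
      dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
      gamma = 0.5 * (values[:, None] - values[None, :]) ** 2
      iu = np.triu_indices(len(values), k=1)
      dist, gamma = dist[iu], gamma[iu]

      # Bin by lag distance to obtain the empirical semivariogram.
      bins = np.linspace(0, 1000, 11)
      centres = 0.5 * (bins[:-1] + bins[1:])
      semivar = [gamma[(dist >= lo) & (dist < hi)].mean()
                 if np.any((dist >= lo) & (dist < hi)) else np.nan
                 for lo, hi in zip(bins[:-1], bins[1:])]
      print(list(zip(centres, semivar)))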

  7. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube.
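    A small Python sketch of the construction described here (tabulating Hamming distances between codons under the two-bit encoding C=00, U=10, G=11, A=01, together with their frequency distribution) might look as follows; the 64 x 64 layout is just one convenient way to arrange the matrix.

      from itertools import product
      from collections import Counter

      # Two-bit code for the nucleotides, as stated in the abstract.
      code = {"C": "00", "U": "10", "G": "11", "A": "01"}

      codons = ["".join(c) for c in product("CUGA", repeat=3)]    # 64 codons
      bits = {c: "".join(code[n] for n in c) for c in codons}     # 6-bit words

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      # 64 x 64 Hamming-distance matrix between codon bit-representations.
      D = [[hamming(bits[a], bits[b]) for b in codons] for a in codons]

      # Frequency distribution of the distances 0..6 (binomial, since the
      # two-bit code maps the 64 codons onto all 64 six-bit words).
      freq = Counter(d for row in D for d in row)
      print(sorted(freq.items()))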

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, J.; Derrida, B.

    The problem of directed polymers on disordered hierarchical and hypercubic lattices is considered. For the hierarchical lattices the problem can be reduced to the study of the stable laws for combining random variables in a nonlinear way. The authors present the results of numerical simulations of two hierarchical lattices, finding evidence of a phase transition in one case. For a limiting case they extend the perturbation theory developed by Derrida and Griffiths to nonzero temperature and to higher order and use this approach to calculate thermal and geometrical properties (overlaps) of the model. In this limit they obtain an interpolation formula, allowing one to obtain the noninteger moments of the partition function from the integer moments. They obtain bounds for the transition temperature for hierarchical and hypercubic lattices, and some similarities between the problem on the two different types of lattice are discussed.

  9. Water-borne protozoa parasites: The Latin American perspective.

    PubMed

    Rosado-García, Félix Manuel; Guerrero-Flórez, Milena; Karanis, Gabriele; Hinojosa, María Del Carmen; Karanis, Panagiotis

    2017-07-01

    Health systems, sanitation and water access have certain limitations in the nations of Latin America (LA): typical issues of developing countries. Water is often contaminated and therefore unhealthy for consumers and users. Information on the prevalence and detection of waterborne parasitic protozoa is limited or not available in LA. Only a few reports have been published in this field during the last forty years, and Brazil leads the list, which includes countries in South America, Mexico within the Central American region, and Caribbean islands. From 1979 to 2015, 16 outbreaks of waterborne protozoa were reported in Latin American countries. T. gondii and C. cayetanensis were the protozoa that caused the most outbreaks, and Giardia spp. and Cryptosporidium spp. were the protozoa most frequently found in water samples. On the other hand, Latin American countries lack a coherent methodology for the detection of protozoa in water samples, even though the whole of LA is highly vulnerable to extreme weather events related to waterborne infections; only Brazil and Colombia have implemented some laws in their surveillance systems. It would be important to coordinate surveillance systems among all countries for early detection of and measures against waterborne protozoa, and to establish effective and suitable diagnostic tools according to each country's economic strength and particular needs. Copyright © 2017 Elsevier GmbH. All rights reserved.

  10. Statistics of lattice animals

    NASA Astrophysics Data System (ADS)

    Hsu, Hsiao-Ping; Nadler, Walder; Grassberger, Peter

    2005-07-01

    The scaling behavior of randomly branched polymers in a good solvent is studied in two to nine dimensions, modeled by lattice animals on simple hypercubic lattices. For the simulations, we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PERM) used extensively for linear polymers. We obtain high statistics of animals with up to several thousand sites in all dimensions 2 ⩽ d ⩽ 9. The partition sum (number of different animals) and gyration radii are estimated. In all dimensions we verify the Parisi-Sourlas prediction, and we verify all exactly known critical exponents in dimensions 2, 3, 4, and ⩾ 8. In addition, we present the hitherto most precise estimates for growth constants in d ⩾ 3. For clusters with one site attached to an attractive surface, we verify the superuniversality of the cross-over exponent at the adsorption transition predicted by Janssen and Lyssy.

  11. Molecular Diversity of Trypanosoma cruzi Detected in the Vector Triatoma protracta from California, USA.

    PubMed

    Shender, Lisa A; Lewis, Michael D; Rejmanek, Daniel; Mazet, Jonna A K

    2016-01-01

    Trypanosoma cruzi, the causative agent of Chagas disease in humans and dogs, is a vector-borne zoonotic protozoan parasite that can cause fatal cardiac disease. While Chagas disease is recognized as the most economically important parasitic infection in Latin America, its incidence in the United States of America (US) may be underreported and even increasing. The extensive genetic diversity of T. cruzi in Latin America is well-documented and likely influences disease progression, severity and treatment efficacy; however, little is known regarding T. cruzi strains endemic to the US. It is therefore important to expand our knowledge of US T. cruzi strains, to improve upon the recognition of and response to locally acquired infections. We conducted a study of T. cruzi molecular diversity in California, augmenting sparse genetic data from southern California and for the first time investigating genetic sequences from northern California. The vector Triatoma protracta was collected from southern (Escondido and Los Angeles) and northern (Vallecito) California regions. Samples were initially screened via sensitive nuclear repetitive DNA and kinetoplast minicircle DNA PCR assays, yielding an overall prevalence of approximately 28% and 55% for southern and northern California regions, respectively. Positive samples were further processed to identify discrete typing units (DTUs), revealing both TcI and TcIV lineages in southern California, but only TcI in northern California. Phylogenetic analyses (targeting COII-ND1, TR and RB19 genes) were performed on a subset of positive samples to compare Californian T. cruzi samples to strains from other US regions and Latin America. Results indicated that within the TcI DTU, California sequences were similar to those from the southeastern US, as well as to several isolates from Latin America responsible for causing Chagas disease in humans. Triatoma protracta populations in California are frequently infected with T. cruzi. Our data extend the northern limits of the range of TcI and identify a novel genetic exchange event between TcI and TcIV. High similarity between sequences from California and specific Latin American strains indicates US strains may be equally capable of causing human disease. Additional genetic characterization of Californian and other US T. cruzi strains is recommended.

  12. Admixture in Latin America: Geographic Structure, Phenotypic Diversity and Self-Perception of Ancestry Based on 7,342 Individuals

    PubMed Central

    Ruiz-Linares, Andrés; Adhikari, Kaustubh; Acuña-Alonzo, Victor; Quinto-Sanchez, Mirsha; Jaramillo, Claudia; Arias, William; Fuentes, Macarena; Pizarro, María; Everardo, Paola; de Avila, Francisco; Gómez-Valdés, Jorge; León-Mimila, Paola; Hunemeier, Tábita; Ramallo, Virginia; Silva de Cerqueira, Caio C.; Burley, Mari-Wyn; Konca, Esra; de Oliveira, Marcelo Zagonel; Veronez, Mauricio Roberto; Rubio-Codina, Marta; Attanasio, Orazio; Gibbon, Sahra; Ray, Nicolas; Gallo, Carla; Poletti, Giovanni; Rosique, Javier; Schuler-Faccini, Lavinia; Salzano, Francisco M.; Bortolini, Maria-Cátira; Canizales-Quinteros, Samuel; Rothhammer, Francisco; Bedoya, Gabriel; Balding, David; Gonzalez-José, Rolando

    2014-01-01

    The current genetic makeup of Latin America has been shaped by a history of extensive admixture between Africans, Europeans and Native Americans, a process taking place within the context of extensive geographic and social stratification. We estimated individual ancestry proportions in a sample of 7,342 subjects ascertained in five countries (Brazil, Chile, Colombia, México and Perú). These individuals were also characterized for a range of physical appearance traits and for self-perception of ancestry. The geographic distribution of admixture proportions in this sample reveals extensive population structure, illustrating the continuing impact of demographic history on the genetic diversity of Latin America. Significant ancestry effects were detected for most phenotypes studied. However, ancestry generally explains only a modest proportion of total phenotypic variation. Genetically estimated and self-perceived ancestry correlate significantly, but certain physical attributes have a strong impact on self-perception and bias self-perception of ancestry relative to genetically estimated ancestry. PMID:25254375

  13. Admixture in Latin America: geographic structure, phenotypic diversity and self-perception of ancestry based on 7,342 individuals.

    PubMed

    Ruiz-Linares, Andrés; Adhikari, Kaustubh; Acuña-Alonzo, Victor; Quinto-Sanchez, Mirsha; Jaramillo, Claudia; Arias, William; Fuentes, Macarena; Pizarro, María; Everardo, Paola; de Avila, Francisco; Gómez-Valdés, Jorge; León-Mimila, Paola; Hunemeier, Tábita; Ramallo, Virginia; Silva de Cerqueira, Caio C; Burley, Mari-Wyn; Konca, Esra; de Oliveira, Marcelo Zagonel; Veronez, Mauricio Roberto; Rubio-Codina, Marta; Attanasio, Orazio; Gibbon, Sahra; Ray, Nicolas; Gallo, Carla; Poletti, Giovanni; Rosique, Javier; Schuler-Faccini, Lavinia; Salzano, Francisco M; Bortolini, Maria-Cátira; Canizales-Quinteros, Samuel; Rothhammer, Francisco; Bedoya, Gabriel; Balding, David; Gonzalez-José, Rolando

    2014-09-01

    The current genetic makeup of Latin America has been shaped by a history of extensive admixture between Africans, Europeans and Native Americans, a process taking place within the context of extensive geographic and social stratification. We estimated individual ancestry proportions in a sample of 7,342 subjects ascertained in five countries (Brazil, Chile, Colombia, México and Perú). These individuals were also characterized for a range of physical appearance traits and for self-perception of ancestry. The geographic distribution of admixture proportions in this sample reveals extensive population structure, illustrating the continuing impact of demographic history on the genetic diversity of Latin America. Significant ancestry effects were detected for most phenotypes studied. However, ancestry generally explains only a modest proportion of total phenotypic variation. Genetically estimated and self-perceived ancestry correlate significantly, but certain physical attributes have a strong impact on self-perception and bias self-perception of ancestry relative to genetically estimated ancestry.

  14. Systematic review of Latin American national oral health surveys in adults.

    PubMed

    Duran, Doris; Monsalves, Maria Jose; Aubert, Josefina; Zarate, Victor; Espinoza, Iris

    2018-04-27

    Oral diseases represent a main public health problem worldwide. There is scarce information about oral health indicators in adults in middle-income countries in Latin America and Africa. The objective was to identify and describe national health surveys with nationally representative samples that included oral health assessments of adults in Latin America. A systematic review was conducted in scientific and regional bibliographic databases (PubMed, SciELO, WoS and Embase); this was complemented with searches in grey literature (Google Scholar, Open Grey and government health organization websites), from August 2016 to May 2017 (covering publications from 2000 to date). Studies conducted, supervised or funded by Ministries of Health or National Health Institutes were included. Data extracted included country, year, methods, interview and dental examination. Two researchers independently performed the search and data extraction. Results were discussed as a group. Only 5 countries in Latin America have developed national health surveys evaluating the dental status of adults with overall nationally representative samples during 2000-2015: Brazil, Colombia, Panama, Chile and Uruguay. The main differences were observed in the type of dental indicators selected, the measure of access to dental services and the professional who performed the dental examination. While some surveys were specifically designed as oral health surveys (Brazil, Colombia, Panama and Uruguay) and the examination was performed by dentists, other surveys represent a module within a general health survey (Chile) and the examination was performed by nurses. Only a small number of Latin American countries report research on dental status using nationally representative samples. Most of these studies have been conducted as national oral health surveys, and fieldwork was carried out by dentists. The development of oral health research in this part of the world should be promoted, as these surveys provide relevant information to monitor oral health and evaluate the effectiveness of health programmes. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. High-Accuracy Finite Element Method: Benchmark Calculations

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Vinitsky, Sergue; Chuluunbaatar, Ochbadrakh; Chuluunbaatar, Galmandakh; Gerdt, Vladimir; Derbov, Vladimir; Góźdź, Andrzej; Krassovitskiy, Pavel

    2018-02-01

    We describe a new high-accuracy finite element scheme with simplex elements for solving elliptic boundary-value problems and show its efficiency on benchmark solutions of the Helmholtz equation for the triangular membrane and the hypercube.

  16. Characterizing commercial oil palm expansion in Latin America: land use change and trade

    NASA Astrophysics Data System (ADS)

    Furumo, Paul Richard; Aide, T. Mitchell

    2017-02-01

    Commodity crop expansion has increased with the globalization of production systems and consumer demand, linking distant socio-ecological systems. Oil palm plantations are expanding in the tropics to satisfy growing oilseed and biofuel markets, and much of this expansion has caused extensive deforestation, especially in Asia. In Latin America, palm oil output has doubled since 2001, and the majority of expansion seems to be occurring on non-forested lands. We used MODIS satellite imagery (250 m resolution) to map current oil palm plantations in Latin America and determined prior land use and land cover (LULC) using high-resolution images in Google Earth. In addition, we compiled trade data to determine where Latin American palm oil flows, in order to better understand the underlying drivers of expansion in the region. Based on a sample of 342 032 ha of oil palm plantations across Latin America, we found that 79% replaced previously intervened lands (e.g. pastures, croplands, bananas), primarily cattle pastures (56%). The remaining 21% came from areas that were classified as woody vegetation (e.g. forests), most notably in the Amazon and the Petén region in northern Guatemala. Latin America is a net exporter of palm oil but the majority of palm oil exports (70%) stayed within the region, with Mexico importing about half. Growth of the oil palm sector may be driven by global factors, but environmental and economic outcomes vary between regions (i.e. Asia and Latin America), within regions (i.e. Colombia and Peru), and within single countries (i.e. Guatemala), suggesting that local conditions are influential. The present trend of oil palm expanding onto previously cleared lands, guided by roundtable certifications programs, provides an opportunity for more sustainable development of the oil palm sector in Latin America.

  17. A probabilistic assessment of calcium carbonate export and dissolution in the modern ocean

    NASA Astrophysics Data System (ADS)

    Battaglia, G.; Steinacher, M.; Joos, F.

    2015-12-01

    The marine cycle of calcium carbonate (CaCO3) is an important element of the carbon cycle and co-governs the distribution of carbon and alkalinity within the ocean. However, CaCO3 fluxes and mechanisms governing CaCO3 dissolution are highly uncertain. We present an observationally-constrained, probabilistic assessment of the global and regional CaCO3 budgets. Parameters governing pelagic CaCO3 export fluxes and dissolution rates are sampled using a Latin-hypercube scheme to construct a 1000-member ensemble with the Bern3D ocean model. Ensemble results are constrained by comparing simulated and observation-based fields of excess dissolved calcium carbonate (TA*). The minerals calcite and aragonite are modelled explicitly and ocean-sediment fluxes are considered. For local dissolution rates either a strong, a weak or no dependency on CaCO3 saturation is assumed. Median (68 % confidence interval) global CaCO3 export is 0.82 (0.67-0.98) Gt PIC yr-1, within the lower half of previously published estimates (0.4-1.8 Gt PIC yr-1). The spatial pattern of CaCO3 export is broadly consistent with earlier assessments. Export is large in the Southern Ocean, the tropical Indo-Pacific and the northern Pacific, and relatively small in the Atlantic. Dissolution within the 200 to 1500 m depth range (0.33; 0.26-0.40 Gt PIC yr-1) is substantially lower than inferred from the TA*-CFC age method (1 ± 0.5 Gt PIC yr-1). The latter estimate is likely biased high as the TA*-CFC method neglects transport. The constrained results are robust across a range of diapycnal mixing coefficients and, thus, ocean circulation strengths. Modelled ocean circulation and transport time scales for the different setups were further evaluated with CFC11 and radiocarbon observations. Parameters and mechanisms governing dissolution are hardly constrained by either the TA* data or the current compilation of CaCO3 flux measurements, such that model realisations with and without saturation-dependent dissolution achieve skill. We suggest applying saturation-independent dissolution rates in Earth System Models to minimise computational costs.

  18. Understanding controls of hydrologic processes across two headwater monolithological catchments using model-data synthesis

    NASA Astrophysics Data System (ADS)

    Xiao, D.; Shi, Y.; Hoagland, B.; Del Vecchio, J.; Russo, T. A.; DiBiase, R. A.; Li, L.

    2017-12-01

    How do watershed hydrologic processes differ in catchments derived from different lithologies? This study compares two first-order, deciduous forest watersheds in Pennsylvania: a sandstone watershed, Garner Run (GR, 1.34 km2), and a shale-derived watershed, Shale Hills (SH, 0.08 km2). Both watersheds are simulated using a combination of national datasets and field measurements, and a physics-based land surface hydrologic model, Flux-PIHM. We aim to evaluate the effects of lithology on watershed hydrology and assess whether we can simulate a new watershed without intensive measurements, i.e., directly use calibration information from one watershed (SH) to reproduce the hydrologic dynamics of another watershed (GR). Without any calibration, the model at GR based on national datasets and calibration information from SH cannot capture some discharge peaks or the baseflow during dry periods. The model prediction agrees well with the GR field discharge and soil moisture after calibrating the soil hydraulic parameters using the uncertainty-based Hornberger-Spear-Young algorithm and the Latin hypercube sampling method. In agreement with the field observations and national datasets, the difference in parameter values shows that the sandstone watershed has a larger average soil pore diameter, greater water storage created by porosity, lower water retention ability, and greater preferential flow. The water budget calculation shows that the riparian zone and the colluvial valley serve as buffer zones that store water at GR. Using the same procedure, we compared Flux-PIHM simulations with and without a field-measured surface boulder map at GR. When the boulder map is used, the prediction of areally averaged soil moisture is improved without performing extra calibration. When calibrated separately, the cases with and without the boulder map yield different calibration values, but their hydrologic predictions are similar, showing equifinality. The calibrated soil hydraulic parameter values in the case with the boulder map are more physically plausible than those in the case without it. We switched the topography and soil properties between GR and SH, and the results indicate that the hydrologic processes are more sensitive to changes in domain topography than to changes in the soil properties.

  19. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    Artificial hydraulic fracturing is being used widely to stimulate production of oil, natural gas, and geothermal reservoirs with low natural permeability. Optimization of field design and operation is limited by the incomplete characterization of the reservoir, as well as the complexity of the hydrological and geomechanical processes that control the fracturing. Thus, there are a variety of uncertainties associated with the pre-existing fracture distribution, rock mechanics, and hydraulic-fracture engineering that require evaluation of their impact on the optimized design. In this study, a multiple-stage scheme was employed to evaluate the uncertainty. We first defined the ranges and distributions of 11 input parameters that characterize the natural fracture topology, in situ stress, geomechanical behavior of the rock matrix and joint interfaces, and pumping operation, to cover a wide spectrum of potential conditions expected for a natural reservoir. These parameters were then sampled 1,000 times in an 11-dimensional parameter space constrained by the specified ranges using the Latin hypercube method. These 1,000 parameter sets were fed into the fracture simulators, and the outputs were used to construct three designed objective functions, i.e., fracture density, opened fracture length and area density. Using PSUADE, three response surfaces (11-dimensional) of the objective functions were developed and global sensitivity was analyzed to identify the most sensitive parameters for the objective functions representing fracture connectivity, which are critical for sweep efficiency of the recovery process. The second-stage, high-resolution response surfaces were constructed with the dimension reduced to the number of most sensitive parameters. An additional response surface, with respect to the objective function of the fractal dimension of the fracture distributions, was constructed in this stage. Based on these response surfaces, comprehensive uncertainty analyses were conducted relating input parameters and objective functions. In addition, reduced-order emulation models resulting from this analysis can be used for optimal control of hydraulic fracturing. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
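    The first stage of such a study, drawing a space-filling sample constrained to specified parameter ranges, can be sketched in a few lines of Python; the dimension matches the abstract but the ranges below are placeholders, not the study's actual parameterisation.

      import numpy as np

      def latin_hypercube(n_samples, n_dims, rng):
          """Plain-numpy Latin hypercube: one point per stratum in every dimension."""
          jitter = rng.uniform(size=(n_samples, n_dims))
          perms = np.argsort(rng.uniform(size=(n_samples, n_dims)), axis=0)
          return (perms + jitter) / n_samples

      rng = np.random.default_rng(4)

      # 1,000 samples in an 11-dimensional unit hypercube, rescaled to ranges
      # (the ranges are placeholders, not those used in the study).
      lower = np.zeros(11)
      upper = np.full(11, 10.0)
      unit = latin_hypercube(1000, 11, rng)
      samples = lower + unit * (upper - lower)

      # Each one-dimensional projection places one point in each of the 1,000
      # equal-probability strata.
      strata = np.clip(np.floor(unit[:, 0] * 1000).astype(int), 0, 999)
      print(len(np.unique(strata)))   # expected: 1000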

  20. Male Circumcision and the Epidemic Emergence of HIV-2 in West Africa.

    PubMed

    Sousa, João Dinis; Temudo, Marina Padrão; Hewlett, Barry Stephen; Camacho, Ricardo Jorge; Müller, Viktor; Vandamme, Anne-Mieke

    2016-01-01

    Epidemic HIV-2 (groups A and B) emerged in humans circa 1930-40. Its closest ancestors are SIVsmm infecting sooty mangabeys from southwestern Côte d'Ivoire. The earliest large-scale serological surveys of HIV-2 in West Africa (1985-91) show a patchy spread. Côte d'Ivoire and Guinea-Bissau had the highest prevalence rates by then, and phylogeographical analysis suggests they were the earliest epicenters. Wars and parenteral transmission have been hypothesized to have promoted HIV-2 spread. Male circumcision (MC) is known to correlate negatively with HIV-1 prevalence in Africa, but studies examining this issue for HIV-2 are lacking. We reviewed published HIV-2 serosurveys for 30 cities of all West African countries and obtained credible estimates of real prevalence through Bayesian estimation. We estimated past MC rates of 218 West African ethnic groups, based on ethnographic literature and fieldwork. We collected demographic tables specifying the ethnic partition in cities. Uncertainty was incorporated by defining plausible ranges of parameters (e.g. timing of introduction, proportion circumcised). We generated 1,000 sets of past MC rates per city using Latin Hypercube Sampling with different parameter combinations, and explored the correlation between HIV-2 prevalence and estimated MC rate (both logit-transformed) in the 1,000 replicates. Our survey reveals that, in the early 20th century, MC was far less common and geographically more variable than nowadays. HIV-2 prevalence in 1985-91 and MC rates in 1950 were negatively correlated (Spearman rho = -0.546, IQR: -0.553 to -0.546, p≤0.0021). Cities in Guinea-Bissau and Côte d'Ivoire had markedly lower MC rates. In addition, MC was uncommon in rural southwestern Côte d'Ivoire in 1930. The differential HIV-2 spread in West Africa correlates with different historical MC rates. We suggest HIV-2 only formed early substantial foci in cities with substantial uncircumcised populations. Lack of MC in rural areas exposed to bushmeat may have had a role in successful HIV-2 emergence.

  1. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the Numpy, Scipy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates. We will discuss the uncertainty of SWGW exchange estimates using an ET model that partitions the watershed into open water and wetland land-cover types. We will also discuss the uncertainty of SWGW exchange estimates calculated using ET models partitioned into additional land-cover types.
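    A minimal sketch of the kind of workflow the abstract describes, using the packages it names (pyDOE for the Latin hypercube design and scipy.stats for the marginal distributions), is given below. The budget terms, distributions and residual formula are purely illustrative assumptions, not the study's actual values.

      import numpy as np
      from pyDOE import lhs            # Latin hypercube designs, as named above
      from scipy import stats

      n = 10_000

      # Unit-hypercube Latin hypercube design over four uncertain budget terms.
      design = lhs(4, samples=n)

      # Map each column through an assumed marginal distribution (values invented).
      rain = stats.norm(loc=120.0, scale=15.0).ppf(design[:, 0])      # mm/month
      et = stats.norm(loc=90.0, scale=25.0).ppf(design[:, 1])         # mm/month
      canal_out = stats.norm(loc=20.0, scale=5.0).ppf(design[:, 2])   # mm/month
      storage_change = stats.norm(loc=5.0, scale=10.0).ppf(design[:, 3])

      # SW/GW exchange as the water-budget residual, with its sampled uncertainty.
      exchange = rain - et - canal_out - storage_change
      print(exchange.mean(), np.percentile(exchange, [2.5, 97.5]))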

  2. Colorado River basin sensitivity to disturbance impacts

    NASA Astrophysics Data System (ADS)

    Bennett, K. E.; Urrego-Blanco, J. R.; Jonko, A. K.; Vano, J. A.; Newman, A. J.; Bohn, T. J.; Middleton, R. S.

    2017-12-01

    The Colorado River basin is an important river for the food-energy-water nexus in the United States and is projected to change under future scenarios of increased CO2 emissions and warming. Streamflow estimates to consider climate impacts occurring as a result of this warming are often provided using modeling tools which rely on uncertain inputs. To fully understand impacts on streamflow, sensitivity analysis can help determine how models respond under changing disturbances such as climate and vegetation. In this study, we conduct a global sensitivity analysis with a space-filling Latin hypercube sampling of the model parameter space and statistical emulation of the Variable Infiltration Capacity (VIC) hydrologic model to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in VIC. Additionally, we examine sensitivities of basin-wide model simulations using an approach that incorporates changes in temperature, precipitation and vegetation to consider impact responses for snow-dominated headwater catchments, low-elevation arid basins, and the upper and lower river basins. We find that for the Colorado River basin, snow-dominated regions are more sensitive to uncertainties. Newly identified parameter sensitivities include the sensitivity of runoff/evapotranspiration to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI). Basin-wide streamflow sensitivities to precipitation, temperature and vegetation vary seasonally and also between sub-basins, with the largest sensitivities for smaller, snow-driven headwater systems where forests are dense. For a major headwater basin, 1 ºC of warming equaled a 30% loss of forest cover, while a 10% precipitation loss equaled a 90% decline in forest cover. Scenarios utilizing multiple disturbances led to unexpected results where changes could either magnify or diminish extremes, such as low and peak flows and streamflow timing, depending on the strength and direction of the forcing. These results indicate the importance of understanding model sensitivities under disturbance impacts in order to manage these shifts, plan for future water resource changes and determine how the impacts will affect the sustainability and adaptability of food-energy-water systems.

  3. The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Prinn, R. G.

    2013-12-01

    The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that we need to evaluate policies based on their ability to lower risk, and to re-evaluate decisions over time as new knowledge is gained. Reference: R. G. Prinn, Development and Application of Earth System Models, Proceedings of the National Academy of Sciences, June 15, 2012, http://www.pnas.org/cgi/doi/10.1073/pnas.1107470109.

  4. Development and Implementation of a Formal Framework for Bottom-up Uncertainty Analysis of Input Emissions: Case Study of Residential Wood Combustion

    NASA Astrophysics Data System (ADS)

    Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.

    2016-12-01

    We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
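
    The normally distributed LHS realizations described above can be generated by stratifying the unit interval and mapping each stratum through the normal inverse CDF. The sketch below is an illustration only; the means and standard deviations are placeholders, not the inventory values used in the study.

```python
import numpy as np
from scipy.stats import norm

def lhs_normal(n_samples, means, stds, rng=None):
    """Latin hypercube realizations of independent normally distributed parameters."""
    rng = np.random.default_rng(rng)
    n_dims = len(means)
    # Stratified uniforms, shuffled independently per dimension.
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_samples), j]
    # Map each stratum through the normal inverse CDF.
    return norm.ppf(u, loc=np.asarray(means), scale=np.asarray(stds))

# Placeholder inventory parameters: activity data, emission factor, wood moisture.
realizations = lhs_normal(1000, means=[1.0, 7.3, 0.25], stds=[0.2, 1.5, 0.05], rng=0)
print(realizations.mean(axis=0), realizations.std(axis=0))
```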

  5. A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision

    NASA Technical Reports Server (NTRS)

    Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

    We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters which are model inputs. The uncertainty represents a lower bound to the total model uncertainty assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin Hypercube Sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the calculated 46% 1σ model uncertainty, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle. These discrepancies unambiguously indicate model formulation problems and provide a measure of model performance which can be used in attempts to improve such models.
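
    The multiple linear regression analysis mentioned above is a common way to rank which inputs drive the ensemble variance. The sketch below is generic and hedged: the toy data stand in for the 419 Monte Carlo runs, and none of the coefficients come from the study.

```python
import numpy as np

def standardized_regression_coefficients(X, y):
    """Rank inputs by their standardized multiple-linear-regression coefficients."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    A = np.column_stack([np.ones(len(ys)), Xs])
    beta, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return beta[1:]  # one SRC per input parameter

# Toy stand-in for an LHS ensemble: 3 rate coefficients, one trend output.
rng = np.random.default_rng(1)
X = rng.normal(size=(419, 3))
y = 0.8 * X[:, 0] - 0.1 * X[:, 1] + 0.02 * rng.normal(size=419)
print(standardized_regression_coefficients(X, y))
```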

  6. Male Circumcision and the Epidemic Emergence of HIV-2 in West Africa

    PubMed Central

    Hewlett, Barry Stephen; Camacho, Ricardo Jorge

    2016-01-01

    Background Epidemic HIV-2 (groups A and B) emerged in humans circa 1930–40. Its closest ancestors are SIVsmm infecting sooty mangabeys from southwestern Côte d'Ivoire. The earliest large-scale serological surveys of HIV-2 in West Africa (1985–91) show a patchy spread. Côte d'Ivoire and Guinea-Bissau had the highest prevalence rates by then, and phylogeographical analysis suggests they were the earliest epicenters. Wars and parenteral transmission have been hypothesized to have promoted HIV-2 spread. Male circumcision (MC) is known to correlate negatively with HIV-1 prevalence in Africa, but studies examining this issue for HIV-2 are lacking. Methods We reviewed published HIV-2 serosurveys for 30 cities of all West African countries and obtained credible estimates of real prevalence through Bayesian estimation. We estimated past MC rates of 218 West African ethnic groups, based on ethnographic literature and fieldwork. We collected demographic tables specifying the ethnic partition in cities. Uncertainty was incorporated by defining plausible ranges of parameters (e.g. timing of introduction, proportion circumcised). We generated 1,000 sets of past MC rates per city using Latin Hypercube Sampling with different parameter combinations, and explored the correlation between HIV-2 prevalence and estimated MC rate (both logit-transformed) in the 1,000 replicates. Results and Conclusions Our survey reveals that, in the early 20th century, MC was far less common and geographically more variable than nowadays. HIV-2 prevalence in 1985–91 and MC rates in 1950 were negatively correlated (Spearman rho = -0.546, IQR: -0.553 to -0.546, p≤0.0021). Guinea-Bissau and Côte d'Ivoire cities had markedly lower MC rates. In addition, MC was uncommon in rural southwestern Côte d'Ivoire in 1930. The differential HIV-2 spread in West Africa correlates with different historical MC rates. We suggest HIV-2 only formed early substantial foci in cities with substantial uncircumcised populations. Lack of MC in rural areas exposed to bushmeat may have had a role in successful HIV-2 emergence. PMID:27926927
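
    The correlation step described above, computing a Spearman coefficient between logit-transformed prevalence and estimated circumcision rates across many sampled parameter sets, can be sketched as follows. All numbers, and the way the replicate MC rates are generated here, are placeholders for illustration and do not reproduce the study's data or its LHS parameter ranges.

```python
import numpy as np
from scipy.stats import spearmanr

def logit(p):
    return np.log(p / (1.0 - p))

rng = np.random.default_rng(6)
n_cities, n_reps = 30, 1000
prevalence = rng.uniform(0.005, 0.1, n_cities)   # hypothetical city-level prevalences

rhos = []
for _ in range(n_reps):
    # Hypothetical per-replicate circumcision-rate estimates: loosely anti-correlated
    # with prevalence, plus sampled uncertainty in the historical parameters.
    mc_rate = np.clip(0.8 - 2.0 * prevalence + rng.uniform(-0.15, 0.15, n_cities),
                      0.01, 0.99)
    rho, _ = spearmanr(logit(prevalence), logit(mc_rate))
    rhos.append(rho)

print(np.median(rhos), np.percentile(rhos, [25, 75]))
```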

  7. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2) a quadratic response surface (QUAD). The LHS techniques do not require derivatives of the response variable and are subsequently relatively insensitive to numerical noise. To compare the LIN and QUAD methods to the MV method, a direct LHS analysis (DLHS) was performed using the full grid and timestep resolved finite element model. The surrogate response models (LIN and QUAD) are shown to give acceptable values of the mean and standard deviation when compared to the fully converged DLHS model.
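
    A quadratic surrogate response surface of the kind referred to above (QUAD) can be fit by least squares to a modest number of model runs and then evaluated cheaply inside a Monte Carlo loop. This is a generic sketch on toy data, not the foam decomposition model or its 25 input parameters.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x_i, and x_i*x_j (i<=j) -- a full quadratic response surface."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

rng = np.random.default_rng(3)
X = rng.uniform(size=(60, 4))            # stand-in for an LHS design over 4 inputs
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] ** 2 + 0.01 * rng.normal(size=60)
coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Cheap uncertainty propagation: evaluate the surrogate on a large Monte Carlo sample.
Xmc = rng.uniform(size=(100_000, 4))
ymc = quadratic_design_matrix(Xmc) @ coeffs
print(ymc.mean(), ymc.std())
```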

  8. Uncertainty and sensitivity assessments of an agricultural-hydrological model (RZWQM2) using the GLUE method

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin

    2016-03-01

    Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models because of the many uncertainties in their inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced into the GLUE analysis for the predictions of the following four model output responses: the total water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can be used to simulate crop growth and water-nitrogen migration and transformation in a wheat-maize crop rotation planting system. The uncertainty analysis using the GLUE method showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics during the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters for soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for RZWQM2 outputs obtained with the CL function were lower and more uniform than those from likelihood functions based on an individual calibration criterion. This successful application of the GLUE method for determining the uncertainty and sensitivity of RZWQM2 could provide a reference for the optimization of model parameters with different emphases according to research interests.
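
    The GLUE procedure essentially weights LHS-sampled parameter sets by a likelihood measure and discards "non-behavioural" runs. The sketch below uses a single Nash-Sutcliffe-style likelihood as a simplified stand-in for the paper's combined likelihood function; the threshold and synthetic data are illustrative assumptions.

```python
import numpy as np

def glue_weights(simulated, observed, threshold=0.3):
    """Nash-Sutcliffe-style likelihoods; runs below the threshold are non-behavioural."""
    sse = ((simulated - observed) ** 2).sum(axis=1)
    var = ((observed - observed.mean()) ** 2).sum()
    nse = 1.0 - sse / var
    weights = np.where(nse > threshold, nse, 0.0)
    total = weights.sum()
    return weights / total if total > 0 else weights

rng = np.random.default_rng(7)
observed = np.sin(np.linspace(0, 3, 50))
# 200 stand-in model runs with varying error levels (replace with LHS-driven runs).
simulated = observed + rng.normal(scale=rng.uniform(0.05, 0.5, size=(200, 1)),
                                  size=(200, 50))
w = glue_weights(simulated, observed)
print("behavioural runs:", (w > 0).sum())
```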

  9. The relationship between national-level carbon dioxide emissions and population size: an assessment of regional and temporal variation, 1960-2005.

    PubMed

    Jorgenson, Andrew K; Clark, Brett

    2013-01-01

    This study examines the regional and temporal differences in the statistical relationship between national-level carbon dioxide emissions and national-level population size. The authors analyze panel data from 1960 to 2005 for a diverse sample of nations, and employ descriptive statistics and rigorous panel regression modeling techniques. Initial descriptive analyses indicate that all regions experienced overall increases in carbon emissions and population size during the 45-year period of investigation, but with notable differences. For carbon emissions, the sample of countries in Asia experienced the largest percent increase, followed by countries in Latin America, Africa, and lastly the sample of relatively affluent countries in Europe, North America, and Oceania combined. For population size, the sample of countries in Africa experienced the largest percent increase, followed by countries in Latin America, Asia, and the combined sample of countries in Europe, North America, and Oceania. Findings for two-way fixed effects panel regression elasticity models of national-level carbon emissions indicate that the estimated elasticity coefficient for population size is much smaller for nations in Africa than for nations in other regions of the world. Regarding potential temporal changes, from 1960 to 2005 the estimated elasticity coefficient for population size decreased by 25% for the sample of African countries, 14% for the sample of Asian countries, 6.5% for the sample of Latin American countries, but remained the same in size for the sample of countries in Europe, North America, and Oceania. Overall, while population size continues to be the primary driver of total national-level anthropogenic carbon dioxide emissions, the findings of this study highlight the need for future research and policies to recognize that the actual impacts of population size on national-level carbon emissions differ across both time and region.

  10. Development of a panel of genome-wide ancestry informative markers to study admixture throughout the Americas.

    PubMed

    Galanter, Joshua Mark; Fernandez-Lopez, Juan Carlos; Gignoux, Christopher R; Barnholtz-Sloan, Jill; Fernandez-Rozadilla, Ceres; Via, Marc; Hidalgo-Miranda, Alfredo; Contreras, Alejandra V; Figueroa, Laura Uribe; Raska, Paola; Jimenez-Sanchez, Gerardo; Zolezzi, Irma Silva; Torres, Maria; Ponte, Clara Ruiz; Ruiz, Yarimar; Salas, Antonio; Nguyen, Elizabeth; Eng, Celeste; Borjas, Lisbeth; Zabala, William; Barreto, Guillermo; González, Fernando Rondón; Ibarra, Adriana; Taboada, Patricia; Porras, Liliana; Moreno, Fabián; Bigham, Abigail; Gutierrez, Gerardo; Brutsaert, Tom; León-Velarde, Fabiola; Moore, Lorna G; Vargas, Enrique; Cruz, Miguel; Escobedo, Jorge; Rodriguez-Santana, José; Rodriguez-Cintrón, William; Chapela, Rocio; Ford, Jean G; Bustamante, Carlos; Seminara, Daniela; Shriver, Mark; Ziv, Elad; Burchard, Esteban Gonzalez; Haile, Robert; Parra, Esteban; Carracedo, Angel

    2012-01-01

    Most individuals throughout the Americas are admixed descendants of Native American, European, and African ancestors. Complex historical factors have resulted in varying proportions of ancestral contributions between individuals within and among ethnic groups. We developed a panel of 446 ancestry informative markers (AIMs) optimized to estimate ancestral proportions in individuals and populations throughout Latin America. We used genome-wide data from 953 individuals from diverse African, European, and Native American populations to select AIMs optimized for each of the three main continental populations that form the basis of modern Latin American populations. We selected markers on the basis of locus-specific branch length to be informative, well distributed throughout the genome, capable of being genotyped on widely available commercial platforms, and applicable throughout the Americas by minimizing within-continent heterogeneity. We then validated the panel in samples from four admixed populations by comparing ancestry estimates based on the AIMs panel to estimates based on genome-wide association study (GWAS) data. The panel provided balanced discriminatory power among the three ancestral populations and accurate estimates of individual ancestry proportions (R² > 0.9 for ancestral components with significant between-subject variance). Finally, we genotyped samples from 18 populations from Latin America using the AIMs panel and estimated variability in ancestry within and between these populations. This panel and its reference genotype information will be useful resources to explore population history of admixture in Latin America and to correct for the potential effects of population stratification in admixed samples in the region.

  11. Development of a Panel of Genome-Wide Ancestry Informative Markers to Study Admixture Throughout the Americas

    PubMed Central

    Galanter, Joshua Mark; Fernandez-Lopez, Juan Carlos; Gignoux, Christopher R.; Barnholtz-Sloan, Jill; Fernandez-Rozadilla, Ceres; Via, Marc; Hidalgo-Miranda, Alfredo; Contreras, Alejandra V.; Figueroa, Laura Uribe; Raska, Paola; Jimenez-Sanchez, Gerardo; Silva Zolezzi, Irma; Torres, Maria; Ponte, Clara Ruiz; Ruiz, Yarimar; Salas, Antonio; Nguyen, Elizabeth; Eng, Celeste; Borjas, Lisbeth; Zabala, William; Barreto, Guillermo; Rondón González, Fernando; Ibarra, Adriana; Taboada, Patricia; Porras, Liliana; Moreno, Fabián; Bigham, Abigail; Gutierrez, Gerardo; Brutsaert, Tom; León-Velarde, Fabiola; Moore, Lorna G.; Vargas, Enrique; Cruz, Miguel; Escobedo, Jorge; Rodriguez-Santana, José; Rodriguez-Cintrón, William; Chapela, Rocio; Ford, Jean G.; Bustamante, Carlos; Seminara, Daniela; Shriver, Mark; Ziv, Elad; Gonzalez Burchard, Esteban; Haile, Robert

    2012-01-01

    Most individuals throughout the Americas are admixed descendants of Native American, European, and African ancestors. Complex historical factors have resulted in varying proportions of ancestral contributions between individuals within and among ethnic groups. We developed a panel of 446 ancestry informative markers (AIMs) optimized to estimate ancestral proportions in individuals and populations throughout Latin America. We used genome-wide data from 953 individuals from diverse African, European, and Native American populations to select AIMs optimized for each of the three main continental populations that form the basis of modern Latin American populations. We selected markers on the basis of locus-specific branch length to be informative, well distributed throughout the genome, capable of being genotyped on widely available commercial platforms, and applicable throughout the Americas by minimizing within-continent heterogeneity. We then validated the panel in samples from four admixed populations by comparing ancestry estimates based on the AIMs panel to estimates based on genome-wide association study (GWAS) data. The panel provided balanced discriminatory power among the three ancestral populations and accurate estimates of individual ancestry proportions (R2>0.9 for ancestral components with significant between-subject variance). Finally, we genotyped samples from 18 populations from Latin America using the AIMs panel and estimated variability in ancestry within and between these populations. This panel and its reference genotype information will be useful resources to explore population history of admixture in Latin America and to correct for the potential effects of population stratification in admixed samples in the region. PMID:22412386

  12. Performance Evaluation of Parallel Branch and Bound Search with the Intel iPSC (Intel Personal SuperComputer) Hypercube Computer.

    DTIC Science & Technology

    1986-12-01

    Table of contents excerpt: III. Analysis of Parallel Design; Parallel Abstract Data Types; Abstract Data Type; Parallel ADT; Data-Structure Design; Object-Oriented Design.

  13. A 2D electrostatic PIC code for the Mark III Hypercube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferraro, R.D.; Liewer, P.C.; Decyk, V.K.

    We have implemented a 2D electrostatic plasma particle-in-cell (PIC) simulation code on the Caltech/JPL Mark IIIfp Hypercube. The code simulates plasma effects by evolving in time the trajectories of thousands to millions of charged particles subject to their self-consistent fields. Each particle's position and velocity is advanced in time using a leap-frog method for integrating Newton's equations of motion in electric and magnetic fields. The electric field due to these moving charged particles is calculated on a spatial grid at each time step by solving Poisson's equation in Fourier space. These two tasks represent the largest part of the computation. To obtain efficient operation on a distributed-memory parallel computer, we are using the General Concurrent PIC (GCPIC) algorithm previously developed for a 1D parallel PIC code.
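
    The particle push at the heart of such a PIC code is the leap-frog update of velocities and positions. The sketch below is a deliberately simplified 1D, periodic-domain version with a prescribed field, not the 2D GCPIC implementation or its Fourier-space Poisson solve.

```python
import numpy as np

def leapfrog_push(x, v, efield_at, qm, dt, length):
    """Advance particle positions and velocities by one leap-frog step."""
    v = v + qm * efield_at(x) * dt    # kick: velocity advanced to t + dt/2
    x = (x + v * dt) % length         # drift: position advanced to t + dt (periodic)
    return x, v

# Toy field: a standing sinusoidal electric field on a periodic 1D domain.
length = 2 * np.pi
efield_at = lambda x: 0.1 * np.sin(x)

rng = np.random.default_rng(0)
x = rng.uniform(0, length, 10_000)
v = rng.normal(0, 1, 10_000)
for _ in range(100):
    x, v = leapfrog_push(x, v, efield_at, qm=-1.0, dt=0.05, length=length)
print(x.mean(), v.std())
```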

  14. Discrete sensitivity derivatives of the Navier-Stokes equations with a parallel Krylov solver

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Taylor, Arthur C., III

    1994-01-01

    This paper solves an 'incremental' form of the sensitivity equations derived by differentiating the discretized thin-layer Navier Stokes equations with respect to certain design variables of interest. The equations are solved with a parallel, preconditioned Generalized Minimal RESidual (GMRES) solver on a distributed-memory architecture. The 'serial' sensitivity analysis code is parallelized by using the Single Program Multiple Data (SPMD) programming model, domain decomposition techniques, and message-passing tools. Sensitivity derivatives are computed for low and high Reynolds number flows over a NACA 1406 airfoil on a 32-processor Intel Hypercube, and found to be identical to those computed on a single-processor Cray Y-MP. It is estimated that the parallel sensitivity analysis code has to be run on 40-50 processors of the Intel Hypercube in order to match the single-processor processing time of a Cray Y-MP.

  15. SIAM Conference on Parallel Processing for Scientific Computing, 4th, Chicago, IL, Dec. 11-13, 1989, Proceedings

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack (Editor); Messina, Paul (Editor); Sorensen, Danny C. (Editor); Voigt, Robert G. (Editor)

    1990-01-01

    Attention is given to such topics as an evaluation of block algorithm variants in LAPACK, a large-grain parallel sparse system solver, a multiprocessor method for the solution of the generalized eigenvalue problem on an interval, and a parallel QR algorithm for iterative subspace methods on the CM-2. A discussion of numerical methods includes the topics of asynchronous numerical solutions of PDEs on parallel computers, parallel homotopy curve tracking on a hypercube, and solving Navier-Stokes equations on the Cedar Multi-Cluster system. A section on differential equations includes a discussion of a six-color procedure for the parallel solution of elliptic systems using the finite quadtree structure, data-parallel algorithms for the finite element method, and domain decomposition methods in aerodynamics. Topics dealing with massively parallel computing include hypercubes vs. 2-dimensional meshes and massively parallel computation of conservation laws. Performance and tools are also discussed.

  16. Optimization of Heat Exchangers with Dimpled Surfaces to Improve the Performance in Thermoelectric Generators Using a Kriging Model

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Wang, Yiping; Wang, Tao; Yang, Xue; Deng, Yadong; Su, Chuqi

    2017-05-01

    Thermoelectric generators (TEGs) have become a topic of interest for vehicle exhaust energy recovery. Electrical power generation is deeply influenced by temperature differences, temperature uniformity and the topological structures of TEGs. When dimpled surfaces are adopted in heat exchangers, the heat transfer rates can be augmented with a minimal pressure drop. However, the temperature distribution shows a large gradient along the flow direction, which has adverse effects on power generation. In the current study, the heat exchanger performance was studied in a computational fluid dynamics (CFD) model. The dimple depth, dimple print diameter, and channel height were chosen as design variables. The objective function was defined as a combination of average temperature, temperature uniformity and pressure loss. The optimal Latin hypercube method was used as the design-of-experiments approach to determine the experiment points and analyze the sensitivity of the design variables. A Kriging surrogate model was built and verified against the database resulting from the CFD simulations. A multi-island genetic algorithm was used to optimize the structure of the heat exchanger based on the surrogate model. The results showed that the average temperature of the heat exchanger was most sensitive to the dimple depth. The pressure loss and temperature uniformity were most sensitive to the channel rear height parameter, h2. With an optimal design of the channel structure, the temperature uniformity can be greatly improved compared with the initial exchanger, although the additional pressure loss also increased.
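
    The Kriging surrogate itself amounts to interpolation with a covariance kernel over the design points. Below is a bare-bones zero-mean Gaussian-process (simple Kriging) sketch with a squared-exponential kernel; the kernel choice, length scale and toy response are assumptions, not the study's CFD-trained model.

```python
import numpy as np

def fit_gp(X, y, length_scale=0.3, noise=1e-8):
    """Zero-mean GP (simple Kriging) fit with a squared-exponential kernel."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2 / length_scale ** 2) + noise * np.eye(len(X))
    return np.linalg.solve(K, y)          # weights alpha = K^{-1} y

def predict_gp(Xnew, X, alpha, length_scale=0.3):
    d2 = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Ks = np.exp(-0.5 * d2 / length_scale ** 2)
    return Ks @ alpha

rng = np.random.default_rng(5)
X = rng.uniform(size=(40, 3))      # stand-in for an optimal-LHS design over 3 variables
y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2]
alpha = fit_gp(X, y)
print(predict_gp(rng.uniform(size=(5, 3)), X, alpha))
```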

  17. A Novel Approach for Determining Source-Receptor Relationships of Aerosols in Model Simulations

    NASA Astrophysics Data System (ADS)

    Ma, P.; Gattiker, J.; Liu, X.; Rasch, P. J.

    2013-12-01

    The climate modeling community usually performs sensitivity studies in a 'one-factor-at-a-time' fashion. However, owing to the a priori unknown complexity and nonlinearity of the climate system and simulation response, it is computationally expensive to systematically identify the cause-and-effect relationships of multiple factors in climate models. In this study, we use a Gaussian Process emulator, based on a small number of Community Atmosphere Model Version 5.1 (CAM5) simulations (constrained by meteorological reanalyses) using a Latin Hypercube experimental design, to demonstrate that it is possible to characterize model behavior accurately and very efficiently without any modifications to the model itself. We use the emulator to characterize the source-receptor relationships of black carbon (BC), focusing specifically on describing the constituent burden and surface deposition rates from emissions in various regions. Our results show that the emulator is capable of quantifying the contribution of aerosol burden and surface deposition from different source regions, finding that most current Arctic BC comes from remote sources. We also demonstrate that the sensitivity of the BC burdens to emission perturbations differs among source regions. For example, emission growth in Africa, where dry convection is strong, results in a moderate increase of the BC burden over the globe, while the same emission growth in the Arctic leads to a significant increase of local BC burdens and surface deposition rates. These results provide insights into the dynamical, physical, and chemical processes of the climate model, and the conclusions may have policy implications for making cost-effective global and regional pollution management strategies.

  18. Vertical overlap of probability density functions of cloud and precipitation hydrometeors: CLOUD AND PRECIPITATION PDF OVERLAP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovchinnikov, Mikhail; Lim, Kyo-Sun Sunny; Larson, Vincent E.

    Coarse-resolution climate models increasingly rely on probability density functions (PDFs) to represent subgrid-scale variability of prognostic variables. While PDFs characterize the horizontal variability, a separate treatment is needed to account for the vertical structure of clouds and precipitation. When sub-columns are drawn from these PDFs for microphysics or radiation parameterizations, appropriate vertical correlations must be enforced via PDF overlap specifications. This study evaluates the representation of PDF overlap in the Subgrid Importance Latin Hypercube Sampler (SILHS) employed in the assumed-PDF turbulence and cloud scheme called Cloud Layers Unified By Binormals (CLUBB). PDF overlap in CLUBB-SILHS simulations of continental and tropical oceanic deep convection is compared with the overlap of PDFs of various microphysics variables in cloud-resolving model (CRM) simulations of the same cases that explicitly predict the 3D structure of cloud and precipitation fields. CRM results show that PDF overlap varies significantly between different hydrometeor types, as well as between PDFs of mass and number mixing ratios for each species, a distinction that the current SILHS implementation does not make. In CRM simulations that explicitly resolve cloud and precipitation structures, faster-falling species, such as rain and graupel, exhibit significantly higher coherence in their vertical distributions than slow-falling cloud liquid and ice. These results suggest that to improve the overlap treatment in the sub-column generator, the PDF correlations need to depend on hydrometeor properties, such as fall speeds, in addition to the currently implemented dependency on the turbulent convective length scale.

  19. Social accountability of medical schools and academic primary care training in Latin America: principles but not practice.

    PubMed

    Puschel, Klaus; Rojas, Paulina; Erazo, Alvaro; Thompson, Beti; Lopez, Jorge; Barros, Jorge

    2014-08-01

    Latin America has one of the highest rates of health disparities in the world and is experiencing a steep increase in its number of medical schools. It is not clear if medical school authorities consider social responsibility, defined as the institutional commitment to contribute to the improvement of community well-being, as a priority and if there are any organizational strategies that could reduce health disparities. To study the significance and relevance of social responsibility in the academic training of medical schools in Latin America. The study combined a qualitative thematic literature review of three databases with a quantitative design based on a sample of nine Latin American and non-Latin American countries. The thematic analysis showed high agreement among academic groups on considering medical schools as 'moral agents', part of a 'social contract' and with an institutional responsibility to reduce health disparities mainly through the implementation of strong academic primary care programs. The quantitative analysis showed a significant association between higher development of academic primary care programs and lower level of health disparities by country (P = 0.028). However, the data showed that most Latin American medical schools did not prioritize graduate primary care training. The study shows a discrepancy between the importance given to social responsibility and academic primary care training in Latin America and the practices implemented by medical schools. It highlights the need to refocus medical education policies in the region.

  20. Asthma control in Latin America: the Asthma Insights and Reality in Latin America (AIRLA) survey.

    PubMed

    Neffen, Hugo; Fritscher, Carlos; Schacht, Francisco Cuevas; Levy, Gur; Chiarella, Pascual; Soriano, Joan B; Mechali, Daniel

    2005-03-01

    The aims of this survey were (1) to assess the quality of asthma treatment and control in Latin America, (2) to determine how closely asthma management guidelines are being followed, and (3) to assess perception, knowledge and attitudes related to asthma in Latin America. We surveyed a household sample of 2,184 adults or parents of children with asthma in 2003 in 11 countries in Latin America. Respondents were asked about healthcare utilization, symptom severity, activity limitations and medication use. Daytime asthma symptoms were reported by 56% of the respondents, and 51% reported being awakened by their asthma at night. More than half of those surveyed had been hospitalized, attended a hospital emergency service or made unscheduled emergency visits to other healthcare facilities for asthma during the previous year. Patient perception of asthma control did not match symptom severity, even in patients with severe persistent asthma, 44.7% of whom regarded their disease as being well or completely controlled. Only 2.4% (2.3% adults and 2.6% children) met all criteria for asthma control. Although 37% reported treatment with prescription medications, only 6% were using inhaled corticosteroids. Most adults (79%) and children (68%) in this survey reported that asthma symptoms limited their activities. Absence from school and work was reported by 58% of the children and 31% of adults, respectively. Asthma control in Latin America falls short of goals in international guidelines, and in many aspects asthma care and control in Latin America suffer from the same shortcomings as in other areas of the world.

  1. Socioeconomic and environmental determinants of adolescent asthma in urban Latin America: an ecological analysis.

    PubMed

    Fattore, Gisel Lorena; Santos, Carlos Antonio de Souza Teles; Barreto, Mauricio Lima

    2015-11-01

    The prevalence of asthma is high in urban areas of many Latin American countries, where societies show high levels of inequality and different levels of development. This study aimed to examine the relationship between the prevalence of asthma symptoms in adolescents living in Latin American urban centers and socioeconomic and environmental determinants measured at the ecological level. Asthma symptom prevalence data were obtained from the International Study of Asthma and Allergies in Childhood (ISAAC) phase III. A hierarchical conceptual framework was defined and the explanatory variables were organized in three levels: distal, intermediate, and proximal. Linear regression models weighted by sample size were fitted between asthma prevalence and the selected variables. Asthma prevalence was positively associated with the Gini index, water supply and homicide rate, and inversely associated with the Human Development Index, crowding and adequate sanitation. This study provides evidence of the potential influence of poverty and social inequalities on current wheezing in adolescents in a complex social context like Latin America.

  2. Design of PREVENCION: a population-based study of cardiovascular disease in Peru.

    PubMed

    Medina-Lezama, Josefina; Chirinos, Julio A; Zea Díaz, Humberto; Morey, Oscar; Bolanos, Juan F; Munoz-Atahualpa, Edgar; Chirinos-Pacheco, Julio

    2005-11-02

    Latin America is undergoing the epidemiologic transition that occurred earlier in developed countries, and is likely to face a gigantic epidemic of heart disease in the next few years unless urgent action is taken. The first essential component of any effective cardiovascular disease (CVD) control program is to establish reliable estimates of cardiovascular disease-related morbidity and mortality. However, such data from population-based studies in Latin America are still lacking. In this paper, we present the design and operation of PREVENCION (Estudio Peruano de Prevalencia de Enfermedades Cardiovasculares, for Peruvian Study of the Prevalence of Cardiovascular diseases). PREVENCION is an ongoing population-based study on a representative sample of the civilian non-institutionalized population of the second largest city in Peru. Its population is comparable to the rest of the Peruvian urban population and closely resembles other Latin American populations in countries such as Bolivia and Ecuador. Our study will contribute to the enormous task of understanding and preventing CVD in Latin America.

  3. A framework for building hypercubes using MapReduce

    NASA Astrophysics Data System (ADS)

    Tapiador, D.; O'Mullane, W.; Brown, A. G. A.; Luri, X.; Huedo, E.; Osuna, P.

    2014-05-01

    The European Space Agency's Gaia mission will create the largest and most precise three dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalog will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of the deployment of the framework on a public cloud provider, benchmark against other popular solutions available (that are not always the best for such ad-hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis techniques workshops.
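
    The core of such a framework is a map step that bins each catalogue row into a multidimensional cell and a reduce step that sums counts per cell. The sketch below mimics that flow in a single process; the column meanings and bin edges are hypothetical and there is no Hadoop dependency.

```python
from collections import defaultdict
import numpy as np

def map_phase(records, edges):
    """Mapper: emit (bin_index_tuple, 1) for each record's cell in the hypercube."""
    for rec in records:
        key = tuple(int(np.digitize(v, e)) for v, e in zip(rec, edges))
        yield key, 1

def reduce_phase(pairs):
    """Reducer: sum counts per hypercube cell."""
    cube = defaultdict(int)
    for key, count in pairs:
        cube[key] += count
    return cube

# Toy stand-in for catalogue rows (e.g. magnitude, colour, parallax).
rng = np.random.default_rng(2)
records = rng.normal(size=(10_000, 3))
edges = [np.linspace(-3, 3, 13)] * 3
cube = reduce_phase(map_phase(records, edges))
print(len(cube), "non-empty cells")
```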

  4. Imported Dengue Infection in a Spanish Hospital with a High Proportion of Travelers from Africa: A 9-Year Retrospective Study.

    PubMed

    Toro, Carlos; Trevisi, Patricia; López-Quintana, Beatriz; Amor, Aránzazu; Iglesias, Nuria; Subirats, Mercedes; de Guevara, Concepción Ladrón; Lago, Mar; Arsuaga, Marta; de la Calle-Prieto, Fernando; Herrero, Dolores; Rubio, Margarita; Puente, Sabino; Baquero, Margarita

    2017-03-01

    Epidemiological data on dengue in Africa are still scarce. We investigated imported dengue infection among travelers with a high proportion of subjects from Africa over a 9-year period. From January 2005 to December 2013, blood samples from travelers with clinical suspicion of dengue were analyzed. Dengue was diagnosed using serological, antigen detection, and molecular methods. Subjects were classified according to birthplace (Europeans versus non-Europeans) and last country visited. Overall, 10,307 serum samples corresponding to 8,295 patients were studied; 62% were European travelers, most of them from Spain, and 35.9% were non-Europeans, the majority of whom were born in Africa (mainly Equatorial Guinea) and Latin America (mainly Bolivia, Ecuador, and Colombia). A total of 492 cases of dengue were identified, the highest number of cases corresponding to subjects who had traveled from Africa (N = 189), followed by Latin America (N = 174) and Asia (N = 113). The rate of cases for Africa (4.5%) was lower than that for Asia (9%) and Latin America (6.1%). Three peaks of dengue were found (2007, 2010, and 2013), which correlated with African cases. A total of 2,157 past dengue infections were diagnosed. Non-Europeans who had traveled from Africa had the highest rate of past infection (67.8%), compared with non-Europeans traveling from Latin America (38.7%) or Asia (35%). Dengue infection in certain regions of Africa is underreported and the burden of the disease may have a magnitude similar to that in endemic countries in Latin America. It is necessary to consider dengue in the differential diagnosis of other febrile diseases in Africa.

  5. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    PubMed Central

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  6. Sampling methodologies for epidemiologic surveillance of men who have sex with men and transgender women in Latin America: an empiric comparison of convenience sampling, time space sampling, and respondent driven sampling.

    PubMed

    Clark, J L; Konda, K A; Silva-Santisteban, A; Peinado, J; Lama, J R; Kusunoki, L; Perez-Brumer, A; Pun, M; Cabello, R; Sebastian, J L; Suarez-Ognio, L; Sanchez, J

    2014-12-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants' self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods.

  7. RTNN: The New Parallel Machine in Zaragoza

    NASA Astrophysics Data System (ADS)

    Sijs, A. J. V. D.

    I report on the development of RTNN, a parallel computer designed as a 4^4 hypercube of 256 T9000 transputer nodes, each with 8 MB memory. The peak performance of the machine is expected to be 2.5 Gflops.

  8. Fault tolerant hypercube computer system architecture

    NASA Technical Reports Server (NTRS)

    Madan, Herb S. (Inventor); Chow, Edward (Inventor)

    1989-01-01

    A fault-tolerant multiprocessor computer system of the hypercube type comprising a hierarchy of computers of like kind which can be functionally substituted for one another as necessary is disclosed. Communication between the working nodes is via one communications network while communications between the working nodes and watch dog nodes and load balancing nodes higher in the structure is via another communications network separate from the first. A typical branch of the hierarchy reporting to a master node or host computer comprises, a plurality of first computing nodes; a first network of message conducting paths for interconnecting the first computing nodes as a hypercube. The first network provides a path for message transfer between the first computing nodes; a first watch dog node; and a second network of message connecting paths for connecting the first computing nodes to the first watch dog node independent from the first network, the second network provides an independent path for test message and reconfiguration affecting transfers between the first computing nodes and the first switch watch dog node. There is additionally, a plurality of second computing nodes; a third network of message conducting paths for interconnecting the second computing nodes as a hypercube. The third network provides a path for message transfer between the second computing nodes; a fourth network of message conducting paths for connecting the second computing nodes to the first watch dog node independent from the third network. The fourth network provides an independent path for test message and reconfiguration affecting transfers between the second computing nodes and the first watch dog node; and a first multiplexer disposed between the first watch dog node and the second and fourth networks for allowing the first watch dog node to selectively communicate with individual ones of the computing nodes through the second and fourth networks; as well as, a second watch dog node operably connected to the first multiplexer whereby the second watch dog node can selectively communicate with individual ones of the computing nodes through the second and fourth networks. The branch is completed by a first load balancing node; and a second multiplexer connected between the first load balancing node and the first and second watch dog nodes, allowing the first load balancing node to selectively communicate with the first and second watch dog nodes.
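
    Independently of the watch-dog and load-balancing layers described in the patent, the defining property of the hypercube interconnection is that two nodes are neighbours exactly when their binary addresses differ in a single bit. A minimal sketch:

```python
def hypercube_neighbours(node, dimension):
    """Return the nodes directly connected to `node` in a d-dimensional hypercube."""
    return [node ^ (1 << bit) for bit in range(dimension)]

# In a 4-dimensional (16-node) hypercube, node 0b0101 has four neighbours,
# each differing from it in exactly one address bit.
print([bin(n) for n in hypercube_neighbours(0b0101, 4)])
```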

  9. Predictive Array Design. A method for sampling combinatorial chemistry library space.

    PubMed

    Lipkin, M J; Rose, V S; Wood, J

    2002-01-01

    A method, Predictive Array Design, is presented for sampling combinatorial chemistry space and selecting a subarray for synthesis based on the experimental design method of Latin Squares. The method is appropriate for libraries with three sites of variation. Libraries with four sites of variation can be designed using the Graeco-Latin Square. Simulated annealing is used to optimise the physicochemical property profile of the sub-array. The sub-array can be used to make predictions of the activity of compounds in the all combinations array if we assume each monomer has a relatively constant contribution to activity and that the activity of a compound is composed of the sum of the activities of its constitutive monomers.
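
    The Latin-square idea behind such a design can be sketched as follows: for a library with three sites of variation, rows and columns index two of the monomer sets and the cell value selects the third, so each monomer appears exactly once per row and per column. This toy sketch omits the simulated-annealing optimisation of the property profile, and the library size is hypothetical.

```python
import numpy as np

def latin_square(n, rng=None):
    """Randomised cyclic Latin square: each value appears once per row and column."""
    rng = np.random.default_rng(rng)
    base = (np.arange(n)[:, None] + np.arange(n)[None, :]) % n
    rows, cols = rng.permutation(n), rng.permutation(n)
    return base[rows][:, cols]

# Library with three sites of variation, each with 8 monomers:
# rows index site A, columns index site B, cell values index site C.
square = latin_square(8, rng=11)
sub_array = [(a, b, square[a, b]) for a in range(8) for b in range(8)]
print(len(sub_array), "compounds sampled out of", 8 ** 3)
```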

  10. Progress towards elimination of trans-fatty acids in foods commonly consumed in four Latin American cities.

    PubMed

    Monge-Rojas, Rafael; Colón-Ramos, Uriyoán; Jacoby, Enrique; Alfaro, Thelma; Tavares do Carmo, Maria das Graças; Villalpando, Salvador; Bernal, Claudio

    2017-09-01

    To assess progress towards the elimination of trans-fatty acids (TFA) in foods after the 2008 Pan American Health Organization (PAHO) recommendation of virtual elimination of TFA in Latin America. A descriptive, comparative analysis of foods that were likely to contain TFA and were commonly consumed in four cities in Latin America. San José (Costa Rica), Mexico City (Mexico), Rio de Janeiro (Brazil), Buenos Aires (Argentina). Foods from each city were sampled in 2011; TFA content was analysed using GC. The TFA content of selected foods was also monitored in 2016. Between 2011 and 2016, there was a significant decrease in the TFA content of the sampled foods across all sites, particularly in Buenos Aires (from a range of 12.6-34.8% in 2011-2012 to nearly 0% in 2015-2016). All sampled products met the recommended levels of TFA content set by the PAHO. TFA were replaced with a mixture of saturated and unsaturated fats. Our results indicate a virtual elimination of TFA from major food sources in the cities studied. This could be due to a combination of factors, including recommendations by national and global public health authorities and voluntary and/or mandatory food reformulation by the food industry.

  11. Molecular Diversity of Trypanosoma cruzi Detected in the Vector Triatoma protracta from California, USA

    PubMed Central

    Shender, Lisa A.; Lewis, Michael D.; Rejmanek, Daniel; Mazet, Jonna A. K.

    2016-01-01

    Background Trypanosoma cruzi, causative agent of Chagas disease in humans and dogs, is a vector-borne zoonotic protozoan parasite that can cause fatal cardiac disease. While recognized as the most economically important parasitic infection in Latin America, the incidence of Chagas disease in the United States of America (US) may be underreported and even increasing. The extensive genetic diversity of T. cruzi in Latin America is well-documented and likely influences disease progression, severity and treatment efficacy; however, little is known regarding T. cruzi strains endemic to the US. It is therefore important to expand our knowledge on US T. cruzi strains, to improve upon the recognition of and response to locally acquired infections. Methodology/Principal Findings We conducted a study of T. cruzi molecular diversity in California, augmenting sparse genetic data from southern California and for the first time investigating genetic sequences from northern California. The vector Triatoma protracta was collected from southern (Escondido and Los Angeles) and northern (Vallecito) California regions. Samples were initially screened via sensitive nuclear repetitive DNA and kinetoplast minicircle DNA PCR assays, yielding an overall prevalence of approximately 28% and 55% for southern and northern California regions, respectively. Positive samples were further processed to identify discrete typing units (DTUs), revealing both TcI and TcIV lineages in southern California, but only TcI in northern California. Phylogenetic analyses (targeting COII-ND1, TR and RB19 genes) were performed on a subset of positive samples to compare Californian T. cruzi samples to strains from other US regions and Latin America. Results indicated that within the TcI DTU, California sequences were similar to those from the southeastern US, as well as to several isolates from Latin America responsible for causing Chagas disease in humans. Conclusions/Significance Triatoma protracta populations in California are frequently infected with T. cruzi. Our data extend the northern limits of the range of TcI and identify a novel genetic exchange event between TcI and TcIV. High similarity between sequences from California and specific Latin American strains indicates US strains may be equally capable of causing human disease. Additional genetic characterization of Californian and other US T. cruzi strains is recommended. PMID:26797311

  12. Inefficiency in Latin-American market indices

    NASA Astrophysics Data System (ADS)

    Zunino, L.; Tabak, B. M.; Pérez, D. G.; Garavaglia, M.; Rosso, O. A.

    2007-11-01

    We explore the deviations from efficiency in the returns and volatility returns of Latin-American market indices. Two different approaches are considered. The dynamics of the Hurst exponent is obtained via a wavelet rolling-sample approach, quantifying the degree of long memory exhibited by the stock market indices under analysis. On the other hand, the Tsallis q entropic index is measured in order to take into account deviations from the Gaussian hypothesis. Different dynamic rankings of inefficiency are obtained, each reflecting a different source of inefficiency. Comparing with the results obtained for a developed country (US), we confirm a similar degree of long-range dependence for our emerging markets. Moreover, we show that the inefficiency in the Latin-American countries comes principally from the non-Gaussian form of the probability distributions.
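
    The paper estimates the Hurst exponent with a wavelet rolling-sample approach; a simpler classical alternative, shown here only to illustrate what the exponent measures, is the rescaled-range (R/S) estimator, which gives a value close to 0.5 for an uncorrelated (efficient) return series.

```python
import numpy as np

def hurst_rs(returns, window_sizes=(16, 32, 64, 128, 256)):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = []
        for start in range(0, len(returns) - n + 1, n):
            chunk = returns[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()       # range of cumulative deviations
            s = chunk.std()                 # chunk standard deviation
            if s > 0:
                rs.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    slope, _ = np.polyfit(log_n, log_rs, 1)  # H is the log-log slope
    return slope

rng = np.random.default_rng(4)
print(hurst_rs(rng.normal(size=4096)))   # close to 0.5 for an uncorrelated series
```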

  13. Global sensitivity analysis in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Tsvetkova, O.; Ouarda, T. B.

    2012-12-01

    Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, or output variable. It also provides ways to calculate explicit measures of importance of input variables (first-order and total-effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study a methodology for conducting global SA of wind energy assessment at a planning stage is proposed. Three sampling strategies, which are part of the SA procedure, were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS) and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified by ranking the total-effect sensitivity indices. The results of the present research show that the brute force method is best for wind assessment purposes, and SBSS outperforms the other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments, 2) the brute force method should be recommended for conducting sensitivity analysis in wind resource assessment, and 3) little variation in the Weibull scale parameter causes significant variation in energy production. The presence of the two distribution parameters among the top three influential variables (the Weibull shape and scale) emphasizes the importance of (a) choosing an appropriate distribution to model the wind regime at a site and (b) accurately estimating the probability distribution parameters. This can be labeled as the most important conclusion of this research because it opens a field for further research, which the authors believe could change the wind energy field tremendously.
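
    The practical difference between the sampling strategies compared above can be seen by repeating a small design many times and comparing the spread of the resulting estimates. The sketch below contrasts LHS with pseudo-random sampling on a toy response; the response function and parameter scalings are invented for illustration, and Sobol' sequences are omitted.

```python
import numpy as np

def lhs_uniform(n, d, rng):
    """Latin hypercube sample on the unit hypercube."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

def energy_model(u):
    """Toy stand-in for lifetime energy production as a function of 3 scaled inputs."""
    shape, scale, lifetime = 1.5 + u[:, 0], 6 + 4 * u[:, 1], 15 + 10 * u[:, 2]
    return scale ** 3 * lifetime / shape     # arbitrary smooth response

rng = np.random.default_rng(9)
reps = 200
est_lhs = [energy_model(lhs_uniform(64, 3, rng)).mean() for _ in range(reps)]
est_prs = [energy_model(rng.random((64, 3))).mean() for _ in range(reps)]
print("std of mean estimate  LHS:", np.std(est_lhs), " PRS:", np.std(est_prs))
```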

  14. Genotyping non-small cell lung cancer (NSCLC) in Latin America.

    PubMed

    Arrieta, Oscar; Cardona, Andrés Felipe; Federico Bramuglia, Guillermo; Gallo, Aly; Campos-Parra, Alma D; Serrano, Silvia; Castro, Marcelo; Avilés, Alejandro; Amorin, Edgar; Kirchuk, Ricardo; Cuello, Mauricio; Borbolla, José; Riemersma, Omar; Becerra, Henry; Rosell, Rafael

    2011-11-01

    The frequency of mutations in EGFR and KRAS in non-small cell lung cancer (NSCLC) differs between ethnic groups; however, there is no information for Latin-American populations. A total of 1150 biopsies from NSCLC patients in Latin America (Argentina, Colombia, Peru, and Mexico) were used; genomic DNA was extracted to perform direct sequencing of the EGFR gene (exons 18 and 21) and of the KRAS gene in 650 samples. In Mexico, Scorpions ARMS was also used to obtain a genetic profile. We report the frequency of mutations in the EGFR and KRAS genes in four Latin-American countries (n = 1150). The frequency of EGFR mutations in NSCLC was 33.2% (95% confidence interval [CI] 30.5-35.9) (Argentina 19.3%, Colombia 24.8%, Mexico 31.2%, and Peru 67%). The frequency of KRAS mutations was 16.6% (95% CI 13.8-19.4). EGFR mutations were independently associated with adenocarcinoma histology, older age, non-smoking status, and absence of KRAS mutations. The overall response rate to tyrosine kinase inhibitors in EGFR-mutated patients (n = 56) was 62.5% (95% CI 50-75), with a median overall survival of 16.5 months (95% CI 12.4-20.6). Our findings suggest that the frequency of EGFR mutations in Latin America lies between that of Asian and Caucasian populations and therefore support the genetic heterogeneity of NSCLC around the world.

  15. Hypercube Expert System Shell - Applying Production Parallelism.

    DTIC Science & Technology

    1989-12-01

    possible processor organizations, or interconnection methods, for parallel architectures. The following are examples of commonly used interconnection...this timing analysis because match speed-up available from production parallelism is proportional to the average number of affected productions

  16. Experiences and results multitasking a hydrodynamics code on global and local memory machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.

    1987-01-01

    A one-dimensional, time-dependent Lagrangian hydrodynamics code using a Godunov solution method has been multitasked for the Cray X-MP/48, the Intel iPSC hypercube, the Alliant FX series and the IBM RP3 computers. Actual multitasking results have been obtained for the Cray, Intel and Alliant computers, and simulated results were obtained for the Cray and RP3 machines. The differences in the methods required to multitask on each of the machines are discussed. Results are presented for a sample problem involving a shock wave moving down a channel. Comparisons are made between theoretical speedups, predicted by Amdahl's law, and the actual speedups obtained. The problems of debugging on the different machines are also described.
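
    For reference, the theoretical speedups mentioned above follow the standard form of Amdahl's law, where p is the parallelizable fraction of the work and N is the number of processors; the abstract does not give the p values for this code, so the law is quoted only in its generic form:

      S(N) = \frac{1}{(1 - p) + \dfrac{p}{N}}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p}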

  17. Social anxiety and social norms in individualistic and collectivistic countries

    PubMed Central

    Schreier, Sina-Simone; Heinrichs, Nina; Alden, Lynn; Rapee, Ronald M.; Hofmann, Stefan G.; Chen, Junwen; Ja Oh, Kyung; Bögels, Susan

    2010-01-01

    Background Social anxiety is assumed to be related to cultural norms across countries. Heinrichs and colleagues [1] compared individualistic and collectivistic countries and found higher social anxiety and more positive attitudes toward socially avoidant behaviors in collectivistic than in individualistic countries. However, the authors failed to include Latin American countries in the collectivistic group. Methods To provide support for these earlier results within an extended sample of collectivistic countries, 478 undergraduate students from individualistic countries were compared with 388 undergraduate students from collectivistic countries (including East Asian and Latin American) via self-report of social anxiety and social vignettes assessing social norms. Results As expected, the results of Heinrichs and colleagues [1] were replicated for the individualistic and Asian countries but not for Latin American countries. Latin American countries displayed the lowest social anxiety levels, whereas the collectivistic East Asian group displayed the highest. Conclusions These findings indicate that while culture-mediated social norms affect social anxiety and might help to shed light on the etiology of social anxiety disorder, the dimension of individualism-collectivism may not fully capture the relevant norms. PMID:21049538

  18. The Relationship between National-Level Carbon Dioxide Emissions and Population Size: An Assessment of Regional and Temporal Variation, 1960–2005

    PubMed Central

    Jorgenson, Andrew K.; Clark, Brett

    2013-01-01

    This study examines the regional and temporal differences in the statistical relationship between national-level carbon dioxide emissions and national-level population size. The authors analyze panel data from 1960 to 2005 for a diverse sample of nations, and employ descriptive statistics and rigorous panel regression modeling techniques. Initial descriptive analyses indicate that all regions experienced overall increases in carbon emissions and population size during the 45-year period of investigation, but with notable differences. For carbon emissions, the sample of countries in Asia experienced the largest percent increase, followed by countries in Latin America, Africa, and lastly the sample of relatively affluent countries in Europe, North America, and Oceania combined. For population size, the sample of countries in Africa experienced the largest percent increase, followed by countries in Latin America, Asia, and the combined sample of countries in Europe, North America, and Oceania. Findings for two-way fixed effects panel regression elasticity models of national-level carbon emissions indicate that the estimated elasticity coefficient for population size is much smaller for nations in Africa than for nations in other regions of the world. Regarding potential temporal changes, from 1960 to 2005 the estimated elasticity coefficient for population size decreased by 25% for the sample of African countries, 14% for the sample of Asian countries, and 6.5% for the sample of Latin American countries, but remained the same for the sample of countries in Europe, North America, and Oceania. Overall, while population size continues to be the primary driver of total national-level anthropogenic carbon dioxide emissions, the findings of this study highlight the need for future research and policies to recognize that the actual impacts of population size on national-level carbon emissions differ across both time and region. PMID:23437323

  19. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, and under these circumstances the applicability of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and aquifer recharge are considered uncertain. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping rates and uncertain parameters, generating input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing the total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of saltwater intrusion, are considered. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution. The reliability concept is incorporated as the percentage of surrogate models that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. It was found that all optimal solutions corresponding to a reliability level of 0.99 satisfy all the constraints, and that constraint violation increases as the reliability level decreases. Thus, ensemble surrogate model based simulation-optimization was found to be useful in deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
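
    The reliability measure described above can be stated compactly in code: for a candidate pumping strategy, reliability is the fraction of ensemble surrogates whose predicted salinity stays within the allowed limit. The sketch below is a hedged illustration with placeholder surrogates, not the genetic-programming models or FEMWATER responses of the study.

      # Hedged sketch: reliability of a pumping candidate as the fraction of
      # surrogate models whose salinity prediction satisfies the constraint.
      import numpy as np

      def reliability(candidate, surrogates, salinity_limit):
          """Fraction of surrogates predicting salinity <= limit for this candidate."""
          predictions = np.array([s(candidate) for s in surrogates])
          return float(np.mean(predictions <= salinity_limit))

      # Toy ensemble: each "surrogate" perturbs a simple linear response differently.
      rng = np.random.default_rng(0)
      surrogates = [(lambda q, a=a: 0.8 * q.sum() + a)
                    for a in rng.normal(0.0, 0.5, size=20)]
      candidate_pumping = np.array([1.2, 0.7, 0.9])   # hypothetical pumping rates
      print(reliability(candidate_pumping, surrogates, salinity_limit=2.5))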

  20. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Astrophysics Data System (ADS)

    Godines, Cody R.; Manteufel, Randall D.

    2002-12-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
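
    The comparison reported above (lower estimation error for LHS than for crude Monte Carlo at equal sample size) can be reproduced qualitatively on a toy response; the sketch below is a hedged illustration and uses a simple quadratic function rather than the SAE test cases or the NESSUS responses.

      # Hedged sketch: repeat mean and 99th-percentile estimation with MC and LHS
      # at the same sample size and compare the scatter of the estimates.
      import numpy as np
      from scipy.stats import qmc, norm

      def response(x):
          return x[:, 0] ** 2 + 2.0 * x[:, 1]          # toy response of two standard normals

      def estimate(sample_fn, n):
          u = np.clip(sample_fn(n), 1e-12, 1 - 1e-12)  # uniforms on (0, 1)
          y = response(norm.ppf(u))                    # transform to standard normals
          return y.mean(), np.percentile(y, 99)

      n, reps = 200, 300
      rng = np.random.default_rng(0)
      mc  = np.array([estimate(lambda m: rng.random((m, 2)), n) for _ in range(reps)])
      lhs = np.array([estimate(lambda m: qmc.LatinHypercube(d=2).random(m), n) for _ in range(reps)])
      for name, est in (("MC", mc), ("LHS", lhs)):
          print(name, "std of mean estimate:", est[:, 0].std(),
                "std of P99 estimate:", est[:, 1].std())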

  1. SPOTting model parameters using a ready-made Python package

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraft, Philipp; Breuer, Lutz

    2015-04-01

    The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, such as the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of the tools are closed source. Due to this, the choice of a specific parameter estimation method is sometimes more dependent on its availability than on its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method. Further, it enables testing and comparing different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood Estimation, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for optimization methods. Here we see simple algorithms like MCMC struggling to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Thirdly, we apply an uncertainty analysis to a one-dimensional physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany). Simulation results are evaluated with measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different, equally optimal solutions with some of the algorithms. The case studies reveal that the implemented SPOT methods work sufficiently well. They further show the benefit of having one tool at hand that includes a number of parameter search methods, likelihood functions and a priori parameter distributions within one platform-independent package.
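
    The first case study above (probing the Rosenbrock response surface) can be sketched independently of the SPOT package; the snippet below is a hedged stand-alone illustration using scipy's Latin hypercube engine, not the SPOT/spotpy API, and simply keeps the best of a batch of LHS samples.

      # Hedged sketch: LHS exploration of the Rosenbrock function (optimum at (1, 1)).
      import numpy as np
      from scipy.stats import qmc

      def rosenbrock(x, y):
          return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

      u = qmc.LatinHypercube(d=2, seed=7).random(2000)
      xy = qmc.scale(u, [-2.0, -2.0], [2.0, 2.0])
      values = rosenbrock(xy[:, 0], xy[:, 1])
      best = xy[np.argmin(values)]
      print("best LHS sample:", best, "objective:", values.min())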

  2. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure are carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization in both 2-D and 1-D was assessed. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. Copyright © 2015 Elsevier Ltd. All rights reserved.
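
    The meta-modelling step described above can be illustrated with a hedged sketch: Latin hypercube samples of two boundary-condition factors are pushed through a placeholder for the CFD-calibrated D values, and a simple correlation with an interaction term is fitted by least squares. The factor names, ranges and the cfd_experiment function are assumptions for illustration, not the iCFD model.

      # Hedged sketch: fit a correlation (meta-model) for the pseudo-dispersion
      # coefficient D from LHS-designed numerical experiments.
      import numpy as np
      from scipy.stats import qmc

      def cfd_experiment(q, sv):
          """Placeholder for the D value calibrated against one CFD run."""
          return 0.5 + 0.3 * q + 0.2 * sv + 0.1 * q * sv

      u = qmc.LatinHypercube(d=2, seed=3).random(50)      # 50 boundary-condition sets
      X = qmc.scale(u, [0.5, 1.0], [3.0, 5.0])            # e.g. inflow rate, settling velocity
      D = cfd_experiment(X[:, 0], X[:, 1])

      A = np.column_stack([np.ones(len(D)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
      coeffs, *_ = np.linalg.lstsq(A, D, rcond=None)      # meta-model coefficients
      print("fitted correlation for D:", coeffs)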

  3. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Technical Reports Server (NTRS)

    Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)

    2002-01-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.

  4. Fossil vs. non-fossil sources of fine carbonaceous aerosols in four Chinese cities during the extreme winter haze episode in 2013

    NASA Astrophysics Data System (ADS)

    Zhang, Y.-L.; Huang, R.-J.; El Haddad, I.; Ho, K.-F.; Cao, J.-J.; Han, Y.; Zotter, P.; Bozzetti, C.; Daellenbach, K. R.; Canonaco, F.; Slowik, J. G.; Salazar, G.; Schwikowski, M.; Schnelle-Kreis, J.; Abbaszade, G.; Zimmermann, R.; Baltensperger, U.; Prévôt, A. S. H.; Szidat, S.

    2014-10-01

    During winter 2013, extremely high concentrations (i.e. 4-20 times higher than the World Health Organization guideline) of PM2.5 (particulate matter with an aerodynamic diameter <2.5 μm) were reported in several large cities in China. In this work, source apportionment of fine carbonaceous aerosols during this haze episode was conducted at four major cities in China: Xian, Beijing, Shanghai and Guangzhou. An effective statistical analysis of a combined dataset from elemental carbon (EC) and organic carbon (OC), radiocarbon (14C) and biomass-burning marker measurements using Latin-hypercube sampling allowed a quantitative source apportionment of carbonaceous aerosols. We found that fossil emissions from coal combustion and vehicle exhaust dominated EC with a mean contribution of 75 ± 8% at all sites. The remaining 25 ± 8% was exclusively attributed to biomass combustion, consistent with the measurements of biomass-burning markers such as anhydrosugars (levoglucosan and mannosan) and water-soluble potassium (K+). Based on the levoglucosan-to-mannosan and levoglucosan-to-K+ ratios, the major source of biomass burning in winter in China is suggested to be combustion of crop residues. The contribution of fossil sources to OC was highest in Beijing (58 ± 5%) and decreased from Shanghai (49 ± 2%) to Xian (38 ± 3%) and Guangzhou (35 ± 7%). Generally, a larger fraction of fossil OC was from secondary rather than primary origins at all sites. Non-fossil sources accounted on average for 55 ± 10% and 48 ± 9% of OC and TC, respectively, which suggests that non-fossil emissions were very important contributors to urban carbonaceous aerosols in China. Primary biomass-burning emissions accounted for 40 ± 8%, 48 ± 18%, 53 ± 4% and 65 ± 26% of non-fossil OC for Xian, Beijing, Shanghai and Guangzhou, respectively. Other non-fossil sources, excluding primary biomass burning, were mainly attributed to the formation of secondary organic carbon (SOC) from non-fossil precursors such as biomass-burning emissions. For each site, we also compared samples from moderately and heavily polluted days, classified according to particulate matter mass. Despite a significant increase in the absolute mass concentrations of primary emissions from both fossil and non-fossil sources during the heavily polluted events, their relative contribution to TC actually decreased, whereas the portion of SOC consistently increased at all sites. This observation indicates that SOC was an important fraction of the increment in carbonaceous aerosols during the haze episode in China.
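
    The radiocarbon-based apportionment above rests on a simple mass balance: fossil carbon is free of 14C, so the non-fossil fraction of carbon follows from the measured 14C content relative to a contemporary (non-fossil) reference. The hedged sketch below propagates illustrative measurement and reference uncertainties with Latin hypercube sampling; all numbers are placeholders, not the values of the study.

      # Hedged sketch: non-fossil / fossil carbon fractions from a 14C mass balance,
      # with LHS propagation of assumed uncertainties.
      import numpy as np
      from scipy.stats import qmc

      u = qmc.LatinHypercube(d=2, seed=11).random(10000)
      f14c_sample = 0.45 + 0.04 * (u[:, 0] - 0.5)        # measured F14C of the sample
      f14c_nonfossil = 1.04 + 0.06 * u[:, 1]             # reference F14C of non-fossil carbon

      f_nonfossil = f14c_sample / f14c_nonfossil         # non-fossil carbon fraction
      f_fossil = 1.0 - f_nonfossil
      print(f"fossil fraction: {np.median(f_fossil):.2f} "
            f"({np.percentile(f_fossil, 2.5):.2f}-{np.percentile(f_fossil, 97.5):.2f})")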

  5. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, an optimum configuration of the design that is more stable than the initial design is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.

  6. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
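
    The coupled global-local strategy mentioned above (a population-based global search feeding a Levenberg-Marquardt refinement) can be caricatured in a few lines; the sketch below is a hedged illustration that replaces the Particle Swarm stage with Latin hypercube starting points and uses a toy residual function, so it is not the MADS implementation.

      # Hedged sketch: LHS-generated starting points refined by local
      # Levenberg-Marquardt least squares; keep the best fit.
      import numpy as np
      from scipy.optimize import least_squares
      from scipy.stats import qmc

      def residuals(p):
          """Toy calibration residuals with the optimum at p = (1.0, 2.0)."""
          return np.array([p[0] - 1.0, 3.0 * (p[1] - 2.0), p[0] * p[1] - 2.0])

      starts = qmc.scale(qmc.LatinHypercube(d=2, seed=5).random(8), [-5, -5], [5, 5])
      fits = [least_squares(residuals, x0, method="lm") for x0 in starts]
      best = min(fits, key=lambda r: r.cost)
      print("best parameters:", best.x, "cost:", best.cost)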

  7. PCLIPS: Parallel CLIPS

    NASA Technical Reports Server (NTRS)

    Hall, Lawrence O.; Bennett, Bonnie H.; Tello, Ivan

    1994-01-01

    A parallel version of CLIPS 5.1 has been developed to run on Intel Hypercubes. The user interface is the same as that for CLIPS, with some added commands to allow for parallel calls. A complete version of CLIPS runs on each node of the hypercube. The system has been instrumented to display the time spent in the match, recognize, and act cycles on each node. Only rule-level parallelism is supported. Parallel commands enable the assertion and retraction of facts to/from remote nodes' working memory. Parallel CLIPS was used to implement a knowledge-based command, control, communications, and intelligence (C(sup 3)I) system to demonstrate the fusion of high-level, disparate sources. We discuss the nature of the information fusion problem, our approach, and the implementation. Parallel CLIPS has also been used to run several benchmark parallel knowledge bases, such as one to set up a cafeteria. Results from running Parallel CLIPS with parallel knowledge-base partitions indicate that significant speed increases, including superlinear in some cases, are possible.

  8. Optimisation and evaluation of hyperspectral imaging system using machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Suthar, Gajendra; Huang, Jung Y.; Chidangil, Santhosh

    2017-10-01

    Hyperspectral imaging (HSI), also called imaging spectrometry, originated in remote sensing. Hyperspectral imaging is an emerging imaging modality for medical applications, especially in disease diagnosis and image-guided surgery. HSI acquires a three-dimensional dataset called a hypercube, with two spatial dimensions and one spectral dimension. Spatially resolved spectral imaging obtained by HSI provides diagnostic information about an object's physiology, morphology, and composition. The present work involves testing and evaluating the performance of a hyperspectral imaging system. The methodology involved manually taking reflectance measurements of the object in many images or scans of the object. The objects used for the evaluation of the system were cabbage and tomato. The data are then converted to the required format and the analysis is done using a machine learning algorithm. The machine learning algorithms applied were able to distinguish between the objects present in the hypercube obtained by the scan. It was concluded from the results that the system was working as expected. This was observed from the different spectra obtained using the machine learning algorithm.
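
    The analysis step described above amounts to flattening the hypercube into a pixel-by-band matrix and training a supervised classifier on labelled pixels. The hedged sketch below uses a synthetic cube and synthetic labels in place of the cabbage and tomato scans; the abstract does not name the specific algorithm, so a random forest is used purely as an example.

      # Hedged sketch: per-pixel classification of a (rows x cols x bands) hypercube.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      cube = rng.random((64, 64, 120))                # synthetic hypercube: 64x64 pixels, 120 bands
      labels = (cube[:, :, 60] > 0.5).astype(int)     # synthetic per-pixel ground truth

      X = cube.reshape(-1, cube.shape[-1])            # pixels x bands
      y = labels.ravel()
      clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
      print("training accuracy:", clf.score(X, y))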

  9. Relation between the global burden of disease and randomized clinical trials conducted in Latin America published in the five leading medical journals.

    PubMed

    Perel, Pablo; Miranda, J Jaime; Ortiz, Zulma; Casas, Juan Pablo

    2008-02-27

    Since 1990, non-communicable diseases and injuries have accounted for the majority of deaths and disability-adjusted life years in Latin America. We analyzed the relationship between the global burden of disease and Randomized Clinical Trials (RCTs) conducted in Latin America that were published in the five leading medical journals. We included all RCTs in humans, conducted exclusively in Latin American countries, and published in any of the following journals: Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine. We described the trials and reported the number of RCTs according to the main categories of the global burden of disease. Sixty-six RCTs were identified. Communicable diseases accounted for 38 (57%) reports. Maternal, perinatal, and nutritional conditions accounted for 19 (29%) trials. Non-communicable diseases represent 48% of the global burden of disease but only 14% of reported trials. No trial addressed injuries despite their 18% contribution to the burden of disease in 2000. A poor correlation between the burden of disease and RCT publications was found. Non-communicable diseases and injuries account for up to two thirds of the burden of disease in Latin America, but these topics are seldom addressed in published RCTs in the selected sample of journals. Funding bodies of health research and editors should be aware of the increasing burden of non-communicable diseases and injuries occurring in Latin America to ensure that this growing epidemic is not neglected in the research agenda and not affected by publication bias.

  10. Salivary gland tumours in a Mexican sample. A retrospective study.

    PubMed

    Ledesma-Montes, C; Garces-Ortiz, M

    2002-01-01

    Salivary gland tumours are an important part of Oral and Maxillofacial Pathology; unfortunately, only a few studies on these tumours have been done in Latin-American populations. The aim of this study was to compare demographic data on salivary gland tumours in a Mexican sample with those previously published from Latin American and non-Latin American countries. All cases of salivary gland tumours or lesions diagnosed in our service were reviewed. Of the reviewed cases, 67 were confirmed as salivary gland tumours. Of these, 64.2% were benign neoplasms, 35.8% were malignant, and a slight female predominance (56.7%) was found. The most common location was the palate, followed by the lips and the floor of the mouth. The mean age for benign tumours was 40.6 years, with female predominance (60.5%). The mean age for malignant tumours was 41 years, and female predominance was found again. The palate followed by the retromolar area were the usual locations. Pleomorphic adenoma (58.2%), mucoepidermoid carcinoma (17.9%) and adenoid cystic carcinoma (11.9%) were the most frequent neoplasms. All retromolar cases were malignant and all submandibular gland tumours were benign. We found a high proportion of salivary gland neoplasms in children. Our results show that differences exist between the studied tumours in our sample and previously reported series. These differences may be related to race and geographical location.

  11. Stroop Color-Word Interference Test: Normative data for the Latin American Spanish speaking adult population.

    PubMed

    Rivera, D; Perrin, P B; Stevens, L F; Garza, M T; Weil, C; Saracho, C P; Rodríguez, W; Rodríguez-Agudelo, Y; Rábago, B; Weiler, G; García de la Cadena, C; Longoni, M; Martínez, C; Ocampo-Barba, N; Aliaga, A; Galarza-Del-Angel, J; Guerra, A; Esenarro, L; Arango-Lasprilla, J C

    2015-01-01

    To generate normative data on the Stroop Test across 11 countries in Latin America, with country-specific adjustments for gender, age, and education, where appropriate. The sample consisted of 3,977 healthy adults who were recruited from Argentina, Bolivia, Chile, Cuba, El Salvador, Guatemala, Honduras, Mexico, Paraguay, Peru, and Puerto Rico. Each subject was administered the Stroop Test as part of a larger neuropsychological battery. A standardized five-step statistical procedure was used to generate the norms. The final multiple linear regression models explained 14-36% of the variance in Stroop Word scores, 12-41% of the variance in Stroop Color scores, 14-36% of the variance in Stroop Word-Color scores, and 4-15% of the variance in Stroop Interference scores. Although t-tests showed significant differences between men and women on the Stroop Test, none of the countries had an effect size larger than 0.3. As a result, gender-adjusted norms were not generated. This is the first normative multicenter study conducted in Latin America to create norms for the Stroop Test in a Spanish-speaking sample. This study will therefore have important implications for the future of neuropsychology research and practice throughout the region.

  12. The prevalence of radiographic vertebral fractures in Latin American countries: the Latin American Vertebral Osteoporosis Study (LAVOS).

    PubMed

    Clark, P; Cons-Molina, F; Deleze, M; Ragi, S; Haddock, L; Zanchetta, J R; Jaller, J J; Palermo, L; Talavera, J O; Messina, D O; Morales-Torres, J; Salmeron, J; Navarrete, A; Suarez, E; Pérez, C M; Cummings, S R

    2009-02-01

    In the first population-based study of vertebral fractures in Latin America, we found a prevalence of 11.18% (95% CI 9.23-13.4) of radiographically ascertained vertebral fractures in a random sample of 1,922 women from cities within five different countries. These figures are similar to findings from studies in Beijing, China, and some regions of Europe, and slightly lower than those found in the USA using the same standardized methodology. We report the first study of radiographic vertebral fractures in Latin America. An age-stratified random sample of 1,922 women aged 50 years and older from Argentina, Brazil, Colombia, Mexico, and Puerto Rico were included. In all cases a standardized questionnaire and lateral X-rays of the lumbar and thoracic spine were obtained after informed consent. A standardized prevalence of 11.18% (95% CI 9.23-13.4) was found. The prevalence was similar in all five countries, increasing from 6.9% (95% CI 4.6-9.1) in women aged 50-59 years to 27.8% (95% CI 23.1-32.4) in those 80 years and older (p for trend < 0.001). Among the different risk factors, self-reported height loss, OR = 1.63 (95% CI: 1.18-2.25), and previous history of fracture, OR = 1.52 (95% CI: 1.14-2.03), were significantly (p < 0.003 and p < 0.04, respectively) associated with the presence of radiographic vertebral fractures in the multivariate analysis. In the bivariate analyses, HRT was associated with a 35% lower risk, OR = 0.65 (95% CI: 0.46-0.93), and physical activity with a 27% lower risk of having a vertebral fracture, OR = 0.73 (95% CI: 0.55-0.98), but these were not statistically significant in multivariate analyses. We conclude that radiographically ascertained vertebral fractures are common in Latin America. Health authorities in the region should be aware of this and consider implementing measures to prevent vertebral fractures.

  13. Stepwise sensitivity analysis from qualitative to quantitative: Application to the terrestrial hydrological modeling of a Conjunctive Surface-Subsurface Process (CSSP) land surface model

    NASA Astrophysics Data System (ADS)

    Gan, Yanjun; Liang, Xin-Zhong; Duan, Qingyun; Choi, Hyun Il; Dai, Yongjiu; Wu, Huan

    2015-06-01

    An uncertainty quantification framework was employed to examine the sensitivities of 24 model parameters from a newly developed Conjunctive Surface-Subsurface Process (CSSP) land surface model (LSM). The sensitivity analysis (SA) was performed over 18 representative watersheds in the contiguous United States to examine the influence of model parameters in the simulation of terrestrial hydrological processes. Two normalized metrics, relative bias (RB) and Nash-Sutcliffe efficiency (NSE), were adopted to assess the fit between simulated and observed streamflow discharge (SD) and evapotranspiration (ET) for a 14 year period. SA was conducted using a multiobjective two-stage approach, in which the first stage was a qualitative SA using the Latin Hypercube-based One-At-a-Time (LH-OAT) screening, and the second stage was a quantitative SA using the Multivariate Adaptive Regression Splines (MARS)-based Sobol' sensitivity indices. This approach combines the merits of qualitative and quantitative global SA methods, and is effective and efficient for understanding and simplifying large, complex system models. Ten of the 24 parameters were identified as important across different watersheds. The contribution of each parameter to the total response variance was then quantified by Sobol' sensitivity indices. Generally, parameter interactions contribute the most to the response variance of the CSSP, and only 5 out of 24 parameters dominate model behavior. Four photosynthetic and respiratory parameters are shown to be influential to ET, whereas reference depth for saturated hydraulic conductivity is the most influential parameter for SD in most watersheds. Parameter sensitivity patterns mainly depend on hydroclimatic regime, as well as vegetation type and soil texture. This article was corrected on 26 JUN 2015. See the end of the full text for details.
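
    The quantitative stage described above produces Sobol' first-order and total-effect indices. The hedged sketch below computes such indices for a cheap three-parameter toy function using the SALib package with Saltelli sampling; it is meant only to show what the indices look like, not to reproduce the MARS-based surrogate approach or the CSSP parameters (the parameter names and bounds are invented).

      # Hedged sketch: Sobol' indices for a toy response with SALib (assumed installed).
      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {
          "num_vars": 3,
          "names": ["ksat_depth", "leaf_c_n", "soil_albedo"],   # hypothetical parameters
          "bounds": [[0.5, 5.0], [20.0, 60.0], [0.05, 0.30]],
      }
      X = saltelli.sample(problem, 1024)
      Y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 0] * X[:, 2]   # toy model response
      Si = sobol.analyze(problem, Y)
      print("first-order indices:", Si["S1"])
      print("total-effect indices:", Si["ST"])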

  14. Optimization of a simplified automobile finite element model using time varying injury metrics.

    PubMed

    Gaewsky, James P; Danelson, Kerry A; Weaver, Caitlin M; Stitzel, Joel D

    2014-01-01

    In 2011, frontal crashes accounted for 55% of passenger car injuries in the United States, with 10,277 fatalities and 866,000 injuries. To better understand frontal crash injury mechanisms, human body finite element models (FEMs) can be used to reconstruct Crash Injury Research and Engineering Network (CIREN) cases. A limitation of this method is the paucity of vehicle FEMs; therefore, we developed a functionally equivalent simplified vehicle model. The New Car Assessment Program (NCAP) data for our selected vehicle were from a frontal collision with a Hybrid III (H3) Anthropomorphic Test Device (ATD) occupant. From the NCAP test reports, the vehicle geometry was created and the H3 ATD was positioned. The material and component properties optimized using a variation study process were: steering column shear bolt fracture force and stroke resistance, seatbelt pretensioner force, frontal and knee bolster airbag stiffness, and belt friction through the D-ring. These parameters were varied using three successive Latin Hypercube Designs of Experiments with 130-200 simulations each. The H3 injury response was compared to the reported NCAP frontal test results for the head, chest and pelvis accelerations, and the seat belt and femur forces. The phase, magnitude, and comprehensive error factors from a Sprague and Geers analysis were calculated for each injury metric and then combined to determine the simulations with the best match to the crash test. Sprague and Geers analyses typically yield error factors ranging from 0 to 1, with lower scores being more optimized. The total body injury response error factor for the most optimized simulation from each round of the variation study decreased from 0.466 to 0.395 to 0.360. This procedure for optimizing vehicle FEMs is a valuable tool for conducting future CIREN case reconstructions in a variety of vehicles.
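
    The Sprague and Geers metric used above compares a measured and a simulated time history through separate magnitude and phase error factors that combine into a comprehensive factor. The hedged sketch below implements the commonly cited form of these factors on placeholder sine signals, not the actual CIREN or NCAP traces.

      # Hedged sketch: Sprague & Geers magnitude (M), phase (P) and comprehensive (C)
      # error factors for two signals sampled on the same time base.
      import numpy as np

      def sprague_geers(measured, computed, t):
          mm = np.trapz(measured * measured, t)
          cc = np.trapz(computed * computed, t)
          mc = np.trapz(measured * computed, t)
          M = np.sqrt(cc / mm) - 1.0
          P = np.arccos(np.clip(mc / np.sqrt(mm * cc), -1.0, 1.0)) / np.pi
          return M, P, float(np.hypot(M, P))

      t = np.linspace(0.0, 0.15, 500)                     # 150 ms pulse
      measured = 40.0 * np.sin(2 * np.pi * 10 * t)
      computed = 36.0 * np.sin(2 * np.pi * 10 * t - 0.2)  # smaller and slightly delayed
      print(sprague_geers(measured, computed, t))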

  15. Constraining a complex biogeochemical model for CO2 and N2O emission simulations from various land uses by model-data fusion

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraus, David; Kiese, Ralf; Breuer, Lutz

    2017-07-01

    This study presents the results of a combined measurement and modelling strategy to analyse N2O and CO2 emissions from adjacent arable land, forest and grassland sites in Hesse, Germany. The measured emissions reveal seasonal patterns and management effects, including fertilizer application, tillage, harvest and grazing. The measured annual N2O fluxes are 4.5, 0.4 and 0.1 kg N ha-1 a-1, and the CO2 fluxes are 20.0, 12.2 and 3.0 t C ha-1 a-1 for the arable land, grassland and forest sites, respectively. An innovative model-data fusion concept based on a multicriteria evaluation (soil moisture at different depths, yield, CO2 and N2O emissions) is used to rigorously test the LandscapeDNDC biogeochemical model. The model is run in a Latin-hypercube-based uncertainty analysis framework to constrain model parameter uncertainty and derive behavioural model runs. The results indicate that the model is generally capable of predicting trace gas emissions, as evaluated with RMSE as the objective function. The model shows a reasonable performance in simulating the ecosystem C and N balances. The model-data fusion concept helps to detect remaining model errors, such as missing (e.g. freeze-thaw cycling) or incomplete model processes (e.g. respiration rates after harvest). This concept further elucidates the identification of missing model input sources (e.g. the uptake of N through shallow groundwater on grassland during the vegetation period) and uncertainty in the measured validation data (e.g. forest N2O emissions in winter months). Guidance is provided to improve the model structure and field measurements to further advance landscape-scale model predictions.

  16. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

    A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., the retardation and attenuation factors), allowing the estimate of the leaching fraction of volatile chemicals to be updated based on recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as the recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and a non-leacher, which were determined under local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant, and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) in a way that was consistent with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.

  17. Soil hydraulic material properties and layered architecture from time-lapse GPR

    NASA Astrophysics Data System (ADS)

    Jaumann, Stefan; Roth, Kurt

    2018-04-01

    Quantitative knowledge of the subsurface material distribution and its effective soil hydraulic material properties is essential to predict soil water movement. Ground-penetrating radar (GPR) is a noninvasive and nondestructive geophysical measurement method that is suitable to monitor hydraulic processes. Previous studies showed that the GPR signal from a fluctuating groundwater table is sensitive to the soil water characteristic and the hydraulic conductivity function. In this work, we show that the GPR signal originating from both the subsurface architecture and the fluctuating groundwater table is suitable to estimate the position of layers within the subsurface architecture together with the associated effective soil hydraulic material properties with inversion methods. To that end, we parameterize the subsurface architecture, solve the Richards equation, convert the resulting water content to relative permittivity with the complex refractive index model (CRIM), and solve Maxwell's equations numerically. In order to analyze the GPR signal, we implemented a new heuristic algorithm that detects relevant signals in the radargram (events) and extracts the corresponding signal travel time and amplitude. This algorithm is applied to simulated as well as measured radargrams and the detected events are associated automatically. Using events instead of the full wave regularizes the inversion focussing on the relevant measurement signal. For optimization, we use a global-local approach with preconditioning. Starting from an ensemble of initial parameter sets drawn with a Latin hypercube algorithm, we sequentially couple a simulated annealing algorithm with a Levenberg-Marquardt algorithm. The method is applied to synthetic as well as measured data from the ASSESS test site. We show that the method yields reasonable estimates for the position of the layers as well as for the soil hydraulic material properties by comparing the results to references derived from ground truth data as well as from time domain reflectometry (TDR).
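
    The CRIM conversion mentioned above maps volumetric water content to bulk relative permittivity by volume-weighting the square roots of the phase permittivities. The hedged sketch below uses typical textbook values for porosity and the solid, water and air permittivities; they are illustrative assumptions, not the ASSESS site parameters.

      # Hedged sketch: bulk relative permittivity of a three-phase soil from the
      # complex refractive index model (CRIM).
      import numpy as np

      def crim_permittivity(theta, porosity=0.35, eps_solid=5.0, eps_water=80.0, eps_air=1.0):
          sqrt_eps = ((1.0 - porosity) * np.sqrt(eps_solid)
                      + theta * np.sqrt(eps_water)
                      + (porosity - theta) * np.sqrt(eps_air))
          return sqrt_eps ** 2

      theta = np.array([0.05, 0.15, 0.25, 0.35])          # volumetric water contents
      print(crim_permittivity(theta))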

  18. Energy intake and food sources of eight Latin American countries: results from the Latin American Study of Nutrition and Health (ELANS).

    PubMed

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Pareja, Rossina G; Yépez García, Martha C; Cortés Sanabria, Lilia Y; Herrera-Cuenca, Marianella; Rigotti, Attilio; Guajardo, Viviana; Zalcman Zimberg, Ioná; Nogueira Previdelli, Agatha; Moreno, Luis A; Koletzko, Berthold

    2018-05-31

    Few previous studies in Latin America (LA) have provided data on dietary intake composition with a standardized methodology. The present study aimed to characterize energy intake (EI) and to describe the main food sources of energy in representative samples of the urban population from eight LA countries from the Latin American Study in Nutrition and Health (ELANS). Cross-sectional study. Usual dietary intake was assessed with two non-consecutive 24 h dietary recalls. Urban areas from eight countries (Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, Peru, Venezuela), September 2014 to July 2015. Adolescents and adults aged 15-65 years. The final sample comprised 9218 individuals, of whom 6648 (72·1 %) were considered plausible reporters. Overall, mean EI was 8196 kJ/d (1959 kcal/d), with a balanced distribution of macronutrients (54 % carbohydrate, 30 % fat, 16 % protein). The main food sources of energy were grains, pasta and bread (28 %), followed by meat and eggs (19 %), oils and fats (10 %), non-alcoholic homemade beverages (6 %) and ready-to-drink beverages (6 %). More than 25 % of EI was provided by food sources rich in sugar and fat, like sugary drinks, pastries, chips and candies. Meanwhile, only 18 % of EI came from food sources rich in fibre and micronutrients, such as whole grains, roots, fruits, vegetables, beans, fish and nuts. No critical differences were observed by gender or age. Public health efforts oriented to diminishing consumption of refined carbohydrates, meats, oils and sugar and to increasing nutrient-dense foods are a priority in the region to drive a shift toward a healthier diet.

  19. The Discriminatory Ability of the Fibromyalgia Rapid Screening Tool (FiRST): An International Study in Spain and Four Latin American Countries.

    PubMed

    Collado, Antonio; Torres, Xavier; Messina, Osvaldo D; Vidal, Luis F; Clark, Patricia; Ríos, Carlos; Solé, Emília; Arias, Anna; Perrot, Serge; Salomon, Patricia A

    2016-05-01

    To assess the transcultural equivalency of the Spanish version of the Fibromyalgia Rapid Screening Tool (FiRST) and its discriminatory ability in different Latin American samples. Validation study. Departments of Rheumatology in general hospitals and private centers; fibromyalgia unit in a university hospital. 350 chronic pain patients from Spain, Argentina, Mexico, Peru, and Ecuador. The cultural relevance of the Spanish version of the FiRST was evaluated. The ability of the FiRST as a screening tool for fibromyalgia was assessed by logistic regression analysis. To determine the degree to which potential confounders, such as differences in demographics, pain, affective distress, catastrophizing, and disability, might affect the discriminatory ability, the tool was reassessed by hierarchical multivariate logistic regression. Slightly different versions of the FiRST were recommended for use in each Latin American subsample. The FiRST showed acceptable criterion validity and was able to discriminate between fibromyalgia and non-fibromyalgia patients even after controlling for the effect of potential confounders. However, low specificities were observed in samples from Spain and Mexico. The Spanish version of the FiRST may be used as a screening tool for fibromyalgia in several Latin American subsamples, even in those patients with high scores on potential confounders. In Spain and Mexico, the low specificity of the FiRST suggests, however, that it would be best used to support a suspected diagnosis of fibromyalgia, rather than to exclude the diagnosis. © 2015 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Application of positive-real functions in hyperstable discrete model-reference adaptive system design.

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.

    1972-01-01

    Proposal of an algorithmic procedure, based on mathematical programming methods, to design compensators for hyperstable discrete model-reference adaptive systems (MRAS). The objective of the compensator is to render the MRAS insensitive to initial parameter estimates within a maximized hypercube in the model parameter space.

  1. Parallel Algorithm Solves Coupled Differential Equations

    NASA Technical Reports Server (NTRS)

    Hayashi, A.

    1987-01-01

    Numerical methods adapted to concurrent processing. Algorithm solves set of coupled partial differential equations by numerical integration. Adapted to run on hypercube computer, algorithm separates problem into smaller problems solved concurrently. Increase in computing speed with concurrent processing over that achievable with conventional sequential processing appreciable, especially for large problems.

  2. NASA Tech Briefs, November/December 1986, Special Edition

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.

  3. Acculturative stress in Latin-American immigrants: an assessment proposal.

    PubMed

    Ruiz Hernández, José Antonio; Torrente Hernández, Ginesa; Rodríguez González, Angel; Ramírez de la Fe, María del Carmen

    2011-05-01

    The aim of this paper was to develop an instrument to assess the levels of stress experienced by Latin-American immigrants in their acculturative process in Spain. A sample of 692 immigrants from Latin America, aged 20 to 63 years, took part in this study (54.9% males and 45.1% females). A 24-item questionnaire with high reliability (.92) was developed. Six factors related to acculturative stress were found: 1) discrimination and rejection; 2) differences with the out-group (native Spaniards); 3) citizenship problems and legality; 4) problems concerning social relationships with other immigrants; 5) nostalgia and longing; and 6) family break-up. Our findings show that participants have a high level of stress related to nostalgia and longing, family break-up, and the perception of discrimination and rejection by natives. The usefulness of the instrument and its applications and restrictions are discussed.

  4. Symptomatic Dengue in Children in 10 Asian and Latin American Countries.

    PubMed

    L'Azou, Maïna; Moureau, Annick; Sarti, Elsa; Nealon, Joshua; Zambrano, Betzana; Wartel, T Anh; Villar, Luis; Capeding, Maria R Z; Ochiai, R Leon

    2016-03-24

    The control groups in two phase 3 trials of dengue vaccine efficacy included two large regional cohorts that were followed up for dengue infection. These cohorts provided a sample for epidemiologic analyses of symptomatic dengue in children across 10 countries in Southeast Asia and Latin America in which dengue is endemic. We monitored acute febrile illness and virologically confirmed dengue (VCD) in 3424 healthy children, 2 to 16 years of age, in Asia (Indonesia, Malaysia, the Philippines, Thailand, and Vietnam) from June 2011 through December 2013 and in 6939 children, 9 to 18 years of age, in Latin America (Brazil, Colombia, Honduras, Mexico, and Puerto Rico) from June 2011 through April 2014. Acute febrile episodes were determined to be VCD by means of a nonstructural protein 1 antigen immunoassay and reverse-transcriptase-polymerase-chain-reaction assays. Dengue hemorrhagic fever was defined according to 1997 World Health Organization criteria. Approximately 10% of the febrile episodes in each cohort were confirmed to be VCD, with 319 VCD episodes (4.6 episodes per 100 person-years) occurring in the Asian cohort and 389 VCD episodes (2.9 episodes per 100 person-years) occurring in the Latin American cohort; no trend according to age group was observed. The incidence of dengue hemorrhagic fever was less than 0.3 episodes per 100 person-years in each cohort. The percentage of VCD episodes requiring hospitalization was 19.1% in the Asian cohort and 11.1% in the Latin American cohort. In comparable age groups (9 to 12 years and 13 to 16 years), the burden of dengue was higher in Asia than in Latin America. The burdens of dengue were substantial in the two regions and in all age groups. Burdens varied widely according to country, but the rates were generally higher and the disease more frequently severe in Asian countries than in Latin American countries. (Funded by Sanofi Pasteur; CYD14 and CYD15 ClinicalTrials.gov numbers, NCT01373281 and NCT01374516.).

  5. Social anxiety and social norms in individualistic and collectivistic countries.

    PubMed

    Schreier, Sina-Simone; Heinrichs, Nina; Alden, Lynn; Rapee, Ronald M; Hofmann, Stefan G; Chen, Junwen; Oh, Kyung Ja; Bögels, Susan

    2010-12-01

    Social anxiety is assumed to be related to cultural norms across countries. Heinrichs et al. [2006: Behav Res Ther 44:1187-1197] compared individualistic and collectivistic countries and found higher social anxiety and more positive attitudes toward socially avoidant behaviors in collectivistic rather than in individualistic countries. However, the authors failed to include Latin American countries in the collectivistic group. To provide support for these earlier results within an extended sample of collectivistic countries, 478 undergraduate students from individualistic countries were compared with 388 undergraduate students from collectivistic countries (including East Asian and Latin American) via self-report of social anxiety and social vignettes assessing social norms. As expected, the results of Heinrichs et al. [2006: Behav Res Ther 44:1187-1197] were replicated for the individualistic and Asian countries, but not for Latin American countries. Latin American countries displayed the lowest social anxiety levels, whereas the collectivistic East Asian group displayed the highest. These findings indicate that while culture-mediated social norms affect social anxiety and might help to shed light on the etiology of social anxiety disorder, the dimension of individualism-collectivism may not fully capture the relevant norms. © 2010 Wiley-Liss, Inc.

  6. Relation between the Global Burden of Disease and Randomized Clinical Trials Conducted in Latin America Published in the Five Leading Medical Journals

    PubMed Central

    Perel, Pablo; Miranda, J. Jaime; Ortiz, Zulma; Casas, Juan Pablo

    2008-01-01

    Background Since 1990 non communicable diseases and injuries account for the majority of death and disability-adjusted life years in Latin America. We analyzed the relationship between the global burden of disease and Randomized Clinical Trials (RCTs) conducted in Latin America that were published in the five leading medical journals. Methodology/Principal Findings We included all RCTs in humans, exclusively conducted in Latin American countries, and published in any of the following journals: Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine. We described the trials and reported the number of RCTs according to the main categories of the global burden of disease. Sixty-six RCTs were identified. Communicable diseases accounted for 38 (57%) reports. Maternal, perinatal, and nutritional conditions accounted for 19 (29%) trials. Non-communicable diseases represent 48% of the global burden of disease but only 14% of reported trials. No trial addressed injuries despite its 18% contribution to the burden of disease in 2000. Conclusions/Significance A poor correlation between the burden of disease and RCTs publications was found. Non communicable diseases and injuries account for up to two thirds of the burden of disease in Latin America but these topics are seldom addressed in published RCTs in the selected sample of journals. Funding bodies of health research and editors should be aware of the increasing burden of non communicable diseases and injuries occurring in Latin America to ensure that this growing epidemic is not neglected in the research agenda and not affected by publication bias. PMID:18301772

  7. Prediction and uncertainty analysis of surface and groundwater exchange in a Rhine floodplain in south-west Germany

    NASA Astrophysics Data System (ADS)

    Maier, Nadine; Breuer, Lutz; Kraft, Philipp

    2017-04-01

    Inundations and the resulting exchange between surface water and groundwater are of importance for all floodplain ecosystems. Because of the high groundwater level in floodplains and the groundwater dependence of floodplain vegetation habitat models of floodplains should include detailed information of groundwater and surface water dynamics. Such models can, for example, serve as a basis for restoration measures, focusing on the re-establishment of rare species. To capture these groundwater and surface water dynamics we use a distributed model approach to simulate the groundwater levels in a floodplain stream section of the Rhine in Hesse, Germany (14.8 km2). This area is part of the large nature reserve "Kühkopf-Knoblochsaue" and hosts rare and endangered flora and fauna. We developed a physical-deterministic model of a floodplain to simulate the groundwater situation and the flooding events in the floodplain. The model is built with the Catchment Modeling Framework (CMF) and includes the interaction of groundwater and surface water flow. To reduce the computation time of the model, we used a simple flood distribution scheme instead of solving the St. Venant equation for surface water fluxes. The floodplain is split into two sub-regions, according to the two nature reserve regions with the same model setup. Each model divides the study area laterally into irregular polygonal cells (270 - 400) with different sizes (114 - 480'000 m2), based on similar elevation and land use. For each sub-region the water level of the Rhine and the groundwater levels of three monitoring wells at the boundary of the model area are used as driving factors. As predictor variables we use observation data from four to six different groundwater monitoring wells in the sub-regions. For each model we run 5,000 simulations following a Latin Hypercube sampling procedure to investigate parameter uncertainty and derive behavioral model runs. We received RMSEs between 0.18 and 0.28 m for the different groundwater wells for the calibration period of 2.5 years and RMSEs between 0.16 and 0.23 m for the validation period of 9.5 years. Finally, we derived hydrological predictors (e.g. longest flooding period, amount of flooding days during the vegetation period, etc) from the model runs for following habitat models.
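
    The parameter screening described above can be sketched in a few lines: draw a Latin hypercube sample over the model's parameter ranges, score each run against observed groundwater levels with RMSE, and retain the behavioral runs below a threshold. This is a minimal sketch only; the parameter names, ranges, and the `run_model` stand-in are hypothetical placeholders, not the CMF floodplain model itself.

    ```python
    # Sketch: Latin hypercube screening of a groundwater model (hypothetical stand-in).
    # Assumes SciPy >= 1.7 for scipy.stats.qmc; parameter names and ranges are illustrative.
    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(42)

    def run_model(params, t):
        """Hypothetical surrogate for the floodplain model: one groundwater-level series."""
        ksat, porosity, riverbed_resistance = params
        return 1.5 + 0.3 * np.sin(2 * np.pi * t / 365.0) * ksat / (porosity * riverbed_resistance)

    t = np.arange(912)                      # ~2.5-year calibration period, daily steps
    observed = run_model((1.0, 0.3, 5.0), t) + rng.normal(0, 0.05, t.size)

    # 1. Space-filling sample of the parameter space (5,000 runs, 3 parameters).
    l_bounds = [0.1, 0.05, 1.0]             # lower bounds (illustrative)
    u_bounds = [10.0, 0.5, 50.0]            # upper bounds (illustrative)
    sampler = qmc.LatinHypercube(d=3, seed=42)
    params = qmc.scale(sampler.random(n=5000), l_bounds, u_bounds)

    # 2. Score every run with RMSE against the observed wells.
    rmse = np.array([np.sqrt(np.mean((run_model(p, t) - observed) ** 2)) for p in params])

    # 3. Keep "behavioral" runs below a subjective RMSE threshold (here 0.2 m).
    behavioral = params[rmse < 0.2]
    print(f"{len(behavioral)} behavioral runs out of {len(params)}")
    ```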

  8. RAVEN Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    2016-06-01

    RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via python interfaces. RAVEN is capable of investigating system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities, such as constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented having RELAP-7 as a principal focus, are easily deployable to other system codes. For this reason, several side activities have been completed (e.g. RELAP5-3D, any MOOSE-based App, etc.) or are currently ongoing to couple RAVEN with several other software packages. The aim of this document is to provide a set of commented examples that can help the user to become familiar with the RAVEN code usage.

  9. Characterizing Drought Events from a Hydrological Model Ensemble

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia

    2017-04-01

    Hydrological droughts are a slow onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale, and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital in London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK. The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event detection and characterization. This ensemble approach allows for uncertainty estimates and confidence intervals to be explored in simulations of drought event characteristics, such as duration and severity, which would not otherwise be available from a deterministic approach. The acquired understanding of uncertainty in drought events may then be applied to historic drought reconstructions, supplying evidence which could prove vital in decision making scenarios.
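
    The ensemble construction described above boils down to sampling model parameters with a Latin hypercube and ranking the runs against more than one objective. The sketch below illustrates only that sampling-and-selection logic on a much smaller ensemble; the `gr4j_like` function, its parameter ranges, and the synthetic flow record are hypothetical stand-ins for the airGR models and observed catchment data.

    ```python
    # Sketch: multi-objective screening of a lumped rainfall-runoff model over an LHS ensemble.
    # A small stand-in for the 500,000-member experiment; model and data are synthetic.
    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(0)
    t = np.arange(3650)
    obs = 5 + 3 * np.sin(2 * np.pi * t / 365) + rng.gamma(2.0, 1.0, t.size)   # synthetic flow

    def gr4j_like(theta, t):
        """Hypothetical four-parameter model, not the real GR4J equations."""
        x1, x2, x3, x4 = theta
        return x1 + x2 * np.sin(2 * np.pi * (t - x4) / 365) + x3

    def nse(sim, obs):
        return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    lo, hi = [0, 0, 0, 0], [20, 10, 10, 60]
    theta = qmc.scale(qmc.LatinHypercube(d=4, seed=0).random(2000), lo, hi)

    # Two objectives: NSE on flows (high-flow fit) and NSE on log-flows (low-flow/drought fit).
    sims = [np.clip(gr4j_like(th, t), 1e-6, None) for th in theta]
    scores = np.array([[nse(s, obs), nse(np.log(s), np.log(obs))] for s in sims])

    def pareto_mask(s):
        """Keep the non-dominated parameter sets across both objectives."""
        keep = np.ones(len(s), dtype=bool)
        for i, row in enumerate(s):
            keep[i] = not np.any(np.all(s >= row, axis=1) & np.any(s > row, axis=1))
        return keep

    best = theta[pareto_mask(scores)]
    print(f"{len(best)} non-dominated parameter sets retained")
    ```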

  10. Coach simplified structure modeling and optimization study based on the PBM method

    NASA Astrophysics Data System (ADS)

    Zhang, Miaoli; Ren, Jindong; Yin, Ying; Du, Jian

    2016-09-01

    For the coach industry, rapid modeling and efficient optimization methods are desirable for structure modeling and optimization based on simplified structures, especially for use early in the concept phase, with the ability to express the mechanical properties of the structure accurately and with flexible section forms. However, present dimension-based methods cannot easily meet these requirements. To achieve these goals, the property-based modeling (PBM) beam modeling method is studied, based on PBM theory and in keeping with the characteristic of coach structures that beams are the main components. For a beam component of given length, the mechanical characteristics are primarily determined by the section properties. Four section parameters are adopted to describe the mechanical properties of a beam: the section area, the principal moments of inertia about the two principal axes, and the torsion constant of the section. Based on an equivalent stiffness strategy, expressions for these section parameters are derived, and the PBM beam element is implemented in HyperMesh software. A case study is carried out using this method, in which the structure of a passenger coach is simplified. The model precision is validated by comparing the basic performance of the total structure with that of the original structure, including the bending and torsion stiffness and the first-order bending and torsional modal frequencies. Sensitivity analysis is conducted to choose design variables. An optimal Latin hypercube experiment design is adopted to sample the test points, and polynomial response surfaces are used to fit these points. To improve the bending and torsion stiffness and the first-order torsional frequency, and taking the allowable maximum stresses of the braking and left-turning conditions as constraints, multi-objective optimization of the structure is conducted using the NSGA-II genetic algorithm on the ISIGHT platform. The Pareto solution set is obtained, and the strategy for selecting the final solution is discussed. The case study demonstrates that the mechanical performance of the structure can be well modeled and simulated by PBM beams. Because of the merits of fewer parameters and convenience of use, this method is suitable for application in the concept stage. Another merit is that the optimization results are requirements for the mechanical performance of the beam section rather than for its shape and dimensions, bringing flexibility to the subsequent design.
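
    One building block of this workflow, fitting a polynomial response surface to Latin hypercube design points, can be sketched as follows. The `torsional_stiffness` function and the three design variables are hypothetical stand-ins for the finite-element responses of the coach structure; the PBM modeling and the NSGA-II optimization on ISIGHT are not reproduced.

    ```python
    # Sketch: quadratic response surface fitted to Latin hypercube design points.
    # Response function and variable bounds are illustrative placeholders only.
    import numpy as np
    from scipy.stats import qmc
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    def torsional_stiffness(x):
        """Hypothetical response depending on three beam section parameters."""
        area, iy, j = x.T
        return 1e3 * j + 50 * area * iy - 2 * area ** 2

    # Latin hypercube design over three section parameters (bounds are illustrative).
    design = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(60), [1, 1, 1], [10, 20, 5])
    response = torsional_stiffness(design)

    # Second-order polynomial response surface, including interaction terms.
    surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surrogate.fit(design, response)

    # The cheap surrogate can now stand in for the FE model inside an optimizer.
    candidate = np.array([[5.0, 10.0, 2.5]])
    print(surrogate.predict(candidate))
    ```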

  11. A parallel offline CFD and closed-form approximation strategy for computationally efficient analysis of complex fluid flows

    NASA Astrophysics Data System (ADS)

    Allphin, Devin

    Computational fluid dynamics (CFD) solution approximations for complex fluid flow problems have become a common and powerful engineering analysis technique. These tools, though qualitatively useful, remain limited in practice by their underlying inverse relationship between simulation accuracy and overall computational expense. While a great volume of research has focused on remedying these issues inherent to CFD, one traditionally overlooked area of resource reduction for engineering analysis concerns the basic definition and determination of functional relationships for the studied fluid flow variables. This artificial relationship-building technique, called meta-modeling or surrogate/offline approximation, uses design of experiments (DOE) theory to efficiently approximate non-physical coupling between the variables of interest in a fluid flow analysis problem. By mathematically approximating these variables, DOE methods can effectively reduce the required quantity of CFD simulations, freeing computational resources for other analytical focuses. An idealized interpretation of a fluid flow problem can also be employed to create suitably accurate approximations of fluid flow variables for the purposes of engineering analysis. When used in parallel with a meta-modeling approximation, a closed-form approximation can provide useful feedback concerning proper construction, suitability, or even necessity of an offline approximation tool. It also provides a short-circuit pathway for further reducing the overall computational demands of a fluid flow analysis, again freeing resources for otherwise unsuitable resource expenditures. To validate these inferences, a design optimization problem was presented requiring the inexpensive estimation of aerodynamic forces applied to a valve operating on a simulated piston-cylinder heat engine. The determination of these forces was to be found using parallel surrogate and exact approximation methods, thus evidencing the comparative benefits of this technique. For the offline approximation, latin hypercube sampling (LHS) was used for design space filling across four (4) independent design variable degrees of freedom (DOF). Flow solutions at the mapped test sites were converged using STAR-CCM+ with aerodynamic forces from the CFD models then functionally approximated using Kriging interpolation. For the closed-form approximation, the problem was interpreted as an ideal 2-D converging-diverging (C-D) nozzle, where aerodynamic forces were directly mapped by application of the Euler equation solutions for isentropic compression/expansion. A cost-weighting procedure was finally established for creating model-selective discretionary logic, with a synthesized parallel simulation resource summary provided.
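
    The offline-approximation half of this strategy, LHS design-space filling followed by Kriging interpolation, is sketched below under stated assumptions: the `aero_force` function is a hypothetical stand-in for the converged STAR-CCM+ results, and scikit-learn's Gaussian process regressor plays the role of the Kriging tool.

    ```python
    # Sketch: LHS design over four DOFs with a Kriging (Gaussian process) surrogate.
    # Force model, variable names, and bounds are illustrative, not the study's data.
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def aero_force(x):
        lift_angle, valve_lift, rpm, pressure_ratio = x.T
        return pressure_ratio * np.sin(lift_angle) * valve_lift + 1e-4 * rpm

    # 1. Fill the 4-D design space with a Latin hypercube.
    lo, hi = [0.0, 1.0, 500.0, 1.1], [0.5, 10.0, 6000.0, 3.0]
    X = qmc.scale(qmc.LatinHypercube(d=4, seed=7).random(40), lo, hi)
    y = aero_force(X)                       # stand-in for CFD runs at the mapped sites

    # 2. Fit the Kriging surrogate and query it at an untried design point.
    kernel = ConstantKernel() * RBF(length_scale=[0.1, 2.0, 1000.0, 0.5])
    kriging = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    kriging.fit(X, y)
    mean, std = kriging.predict(np.array([[0.25, 5.0, 3000.0, 2.0]]), return_std=True)
    print(mean, std)    # prediction plus its uncertainty, cheap relative to a CFD solve
    ```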

  12. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ampomah, William; Balch, Robert; Will, Robert

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.

  13. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE PAGES

    Ampomah, William; Balch, Robert; Will, Robert; ...

    2017-07-01

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
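
    The uncertainty-screening step named here (LHS over uncertain variables followed by a sensitivity measure used to shortlist control variables) can be illustrated with a simple rank-correlation screen. The objective function below is a hypothetical proxy, not the FWU reservoir model, and the variable names and ranges are illustrative only.

    ```python
    # Sketch: LHS-based uncertainty screening with a rank-correlation sensitivity measure.
    import numpy as np
    from scipy.stats import qmc, spearmanr

    def co_optimization_objective(x):
        """Hypothetical weighted objective combining oil recovery and CO2 storage."""
        bhp, wag_cycle, inj_rate, kv_kh = x.T
        recovery = 0.2 * np.log(bhp) + 0.05 * inj_rate - 0.02 * wag_cycle
        storage = 0.5 + 0.3 * kv_kh + 0.01 * inj_rate
        return 0.5 * recovery + 0.5 * storage

    names = ["BHP", "WAG cycle", "injection rate", "Kv/Kh"]
    lo, hi = [2000.0, 0.5, 10.0, 0.01], [4500.0, 3.0, 60.0, 0.5]
    X = qmc.scale(qmc.LatinHypercube(d=4, seed=3).random(1000), lo, hi)
    y = co_optimization_objective(X)

    # Rank-correlation sensitivity: larger |rho| -> stronger influence on the objective,
    # a simple way to shortlist control variables before the optimization step.
    for i, name in enumerate(names):
        rho, _ = spearmanr(X[:, i], y)
        print(f"{name:15s} Spearman rho = {rho:+.2f}")
    ```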

  14. Social capital, income inequality and the social gradient in self-rated health in Latin America: A fixed effects analysis.

    PubMed

    Vincens, Natalia; Emmelin, Maria; Stafström, Martin

    2018-01-01

    Latin America is the most unequal region in the world. The current sustainable development agenda increased attention to health inequity and its determinants in the region. Our aim is to investigate the social gradient in health in Latin America and assess the effects of social capital and income inequality on it. We used cross-sectional data from the World Values Survey and the World Bank. Our sample included 10,426 respondents in eight Latin American countries. Self-rated health was used as the outcome. Education level was the socioeconomic position indicator. We measured social capital by associational membership, civic participation, generalized trust, and neighborhood trust indicators at both individual and country levels. Income inequality was operationalized using the Gini index at country-level. We employed fixed effects logistic regressions and cross-level interactions to assess the impact of social capital and income inequality on the heath gradient, controlling for country heterogeneity. Education level was independently associated with self-rated health, representing a clear social gradient in health, favoring individuals in higher socioeconomic positions. Generalized and neighborhood trust at country-level moderated the effect on the association between socioeconomic position and health, yet favoring individuals in lower socioeconomic positions, especially in lower inequality countries, despite their lower individual social capital. Our findings suggest that collective rather than individual social capital can impact the social gradient in health in Latin America, explaining health inequalities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. School-based programs aimed at the prevention and treatment of obesity: evidence-based interventions for youth in Latin America.

    PubMed

    Lobelo, Felipe; Garcia de Quevedo, Isabel; Holub, Christina K; Nagle, Brian J; Arredondo, Elva M; Barquera, Simón; Elder, John P

    2013-09-01

    Rapidly rising childhood obesity rates constitute a public health priority in Latin America which makes it imperative to develop evidence-based strategies. Schools are a promising setting but to date it is unclear how many school-based obesity interventions have been documented in Latin America and what level of evidence can be gathered from such interventions. We performed a systematic review of papers published between 1965 and December 2010. Interventions were considered eligible if they had a school-based component, were done in Latin America, evaluated an obesity related outcome (body mass index [BMI], weight, %body fat, waist circumference, BMI z-score), and compared youth exposed vs not exposed. Ten studies were identified as having a school-based component. Most interventions had a sample of normal and overweight children. The most successful interventions focused on prevention rather than treatment, had longer follow-ups, a multidisciplinary team, and fewer limitations in execution. Three prevention and 2 treatment interventions found sufficient improvements in obesity-related outcomes. We found sufficient evidence to recommend school-based interventions to prevent obesity among youth in Latin America. Evidence-based interventions in the school setting should be promoted as an important component for integrated programs, policies, and monitoring frameworks designed to reverse the childhood obesity in the region. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  16. Insights, attitudes, and perceptions about asthma and its treatment: findings from a multinational survey of patients from Latin America.

    PubMed

    Maspero, Jorge F; Jardim, Jose R; Aranda, Alvaro; Tassinari C, Paolo; Gonzalez-Diaz, Sandra N; Sansores, Raul H; Moreno-Cantu, Jorge J; Fish, James E

    2013-11-04

    In 2011 the Latin America Asthma Insight and Management (LA AIM) survey explored the realities of living with asthma. We investigated perception, knowledge, and attitudes related to asthma among Latin American asthma patients. Asthma patients aged ≥12 years from four Latin American countries (Argentina, Brazil, Mexico, Venezuela) and the Commonwealth of Puerto Rico responded to questions during face-to-face interviews. A sample size of 2,169 patients (approximately 400 patients/location) provided an accurate representation of asthma patients' opinions. Questions probed respondents' views on topics such as levels of asthma control, frequency and duration of exacerbations, and current and recent use of asthma medications. A total of 2,169 adults or parents of children with asthma participated in the LA AIM survey. At least 20% of respondents experienced symptoms every day or night or most days or nights. Although 60% reported their disease as well or completely controlled, only 8% met guideline criteria for well-controlled asthma. 47% of respondents reported episodes when their asthma symptoms were more frequent or severe than normal, and 44% reported seeking acute care for asthma in the past year. Asthma patients in Latin America overestimated their degree of asthma control. The LA AIM survey demonstrated the discrepancy between patient perception of asthma control and guideline-mandated criteria. Additional education is required to teach patients that, by more closely following asthma management strategies outlined by current guidelines more patients can achieve adequate asthma control.

  17. Super and parallel computers and their impact on civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamat, M.P.

    1986-01-01

    This book presents the papers given at a conference on the use of supercomputers in civil engineering. Topics considered at the conference included solving nonlinear equations on a hypercube, a custom architectured parallel processing system, distributed data processing, algorithms, computer architecture, parallel processing, vector processing, computerized simulation, and cost benefit analysis.

  18. Assignment Of Finite Elements To Parallel Processors

    NASA Technical Reports Server (NTRS)

    Salama, Moktar A.; Flower, Jon W.; Otto, Steve W.

    1990-01-01

    Elements assigned approximately optimally to subdomains. Mapping algorithm based on simulated-annealing concept used to minimize approximate time required to perform finite-element computation on hypercube computer or other network of parallel data processors. Mapping algorithm needed when shape of domain complicated or otherwise not obvious what allocation of elements to subdomains minimizes cost of computation.
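
    The simulated-annealing idea behind such a mapping can be sketched with a toy cost model that penalizes load imbalance and edges cut between processors. The element graph, cost weights, and cooling schedule below are illustrative choices, not the algorithm from the brief itself.

    ```python
    # Sketch: simulated-annealing assignment of finite elements to processors.
    import math
    import random

    random.seed(0)
    n_elements, n_procs = 64, 8
    # Hypothetical element adjacency: a simple 1-D chain of elements.
    edges = [(i, i + 1) for i in range(n_elements - 1)]

    def cost(assign):
        loads = [assign.count(p) for p in range(n_procs)]
        imbalance = max(loads) - min(loads)           # uneven work across processors
        cut = sum(1 for a, b in edges if assign[a] != assign[b])   # communication edges
        return imbalance + 0.5 * cut

    assign = [random.randrange(n_procs) for _ in range(n_elements)]
    current = cost(assign)
    T = 5.0
    for step in range(20000):
        i, p = random.randrange(n_elements), random.randrange(n_procs)
        old = assign[i]
        assign[i] = p
        new = cost(assign)
        # Accept improvements always; accept uphill moves with Boltzmann probability.
        if new <= current or random.random() < math.exp((current - new) / T):
            current = new
        else:
            assign[i] = old
        T *= 0.9997                                   # slow geometric cooling

    print("final cost:", current)
    ```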

  19. 3D Navier-Stokes Flow Analysis for a Large-Array Multiprocessor

    DTIC Science & Technology

    1989-04-17

    computer, Alliant's FX/8, Intel's Hypercube, and Encore's Multimax. Unfortunately, the current algorithms have been developed primarily for SISD machines... "Reversing and Thrust-Vectoring Nozzle Flows," Ph.D. Dissertation in the Dept. of Aero. and Astro., Univ. of Wash., Washington, 1986. [11] Anderson

  20. Distributed Memory Compiler Methods for Irregular Problems - Data Copy Reuse and Runtime Partitioning

    DTIC Science & Technology

    1991-09-01

    addition, support for Saltz was provided by NSF from NSF Grant ASC-8819374. 1. Introduction. Over the past few years, we have developed methods needed to... network. In Third Conf. on Hypercube Concurrent Computers and Applications, pages 241-27278, 1988. [17] G. Fox, S. Hiranandani, K. Kennedy, C. Koelbel

  1. Taste symmetry breaking with hypercubic-smeared staggered fermions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bae, Taegil; Adams, David H.; Kim, Hyung-Jin

    2008-05-01

    We study the impact of hypercubic (HYP) smearing on the size of taste-breaking for staggered fermions, comparing to unimproved and to asqtad-improved staggered fermions. As in previous studies, we find a substantial reduction in taste-breaking compared to unimproved staggered fermions (by a factor of 4-7 on lattices with spacing a ≈ 0.1 fm). In addition, we observe that discretization effects of next-to-leading order in the chiral expansion (O(a²p²)) are markedly reduced by HYP smearing. Compared to asqtad valence fermions, we find that taste-breaking in the pion spectrum is reduced by a factor of 2.5-3, down to a level comparable to the expected size of generic O(a²) effects. Our results suggest that, once one reaches a lattice spacing of a ≈ 0.09 fm, taste-breaking will be small enough after HYP smearing that one can use a modified power counting in which O(a²) < ...

  2. A parallel expert system for the control of a robotic air vehicle

    NASA Technical Reports Server (NTRS)

    Shakley, Donald; Lamont, Gary B.

    1988-01-01

    Expert systems can be used to govern the intelligent control of vehicles, for example the Robotic Air Vehicle (RAV). Due to the nature of the RAV system the associated expert system needs to perform in a demanding real-time environment. The use of a parallel processing capability to support the associated expert system's computational requirement is critical in this application. Thus, algorithms for parallel real-time expert systems must be designed, analyzed, and synthesized. The design process incorporates a consideration of the rule-set/face-set size along with representation issues. These issues are looked at in reference to information movement and various inference mechanisms. Also examined is the process involved with transporting the RAV expert system functions from the TI Explorer, where they are implemented in the Automated Reasoning Tool (ART), to the iPSC Hypercube, where the system is synthesized using Concurrent Common LISP (CCLISP). The transformation process for the ART to CCLISP conversion is described. The performance characteristics of the parallel implementation of these expert systems on the iPSC Hypercube are compared to the TI Explorer implementation.

  3. Hypercat - Hypercube of AGN tori

    NASA Astrophysics Data System (ADS)

    Nikutta, Robert; Lopez-Rodriguez, Enrique; Ichikawa, Kohei; Levenson, Nancy A.; Packham, Christopher C.

    2018-06-01

    AGN unification and observations hold that a dusty torus obscures the central accretion engine along some lines of sight. SEDs of dust tori have been modeled for a long time, but resolved emission morphologies have not been studied in much detail, because resolved observations are only possible recently (VLTI,ALMA) and in the near future (TMT,ELT,GMT). Some observations challenge a simple torus model, because in several objects most of MIR emission appears to emanate from polar regions high above the equatorial plane, i.e. not where the dust supposedly resides.We introduce our software framework and hypercube of AGN tori (Hypercat) made with CLUMPY (www.clumpy.org), a large set of images (6 model parameters + wavelength) to facilitate studies of emission and dust morphologies. We make use of Hypercat to study the morphological properties of the emission and dust distributions as function of model parameters. We find that a simple clumpy torus can indeed produce 10-micron emission patterns extended in polar directions, with extension ratios compatible with those found in observations. We are able to constrain the range of parameters that produce such morphologies.
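
    The core data structure described here is a hypercube of model images indexed by physical parameters plus wavelength, which can be queried at off-grid parameter values by interpolation. The sketch below shows that idea on a made-up, low-resolution grid using SciPy's regular-grid interpolator; the axes, resolutions, and toy image generator are assumptions, and Hypercat's actual file format and API are not reproduced.

    ```python
    # Sketch: interpolating an image hypercube on a regular parameter grid.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Two toy model parameters plus wavelength, each sampled coarsely.
    incl = np.linspace(0, 90, 4)          # inclination (deg)
    tauv = np.linspace(10, 80, 5)         # optical depth
    wave = np.array([2.2, 10.0, 30.0])    # wavelength (micron)
    npix = 16

    def toy_image(i, t, w):
        """Made-up brightness map standing in for a CLUMPY model image."""
        y, x = np.mgrid[-1:1:npix * 1j, -1:1:npix * 1j]
        return np.exp(-(x ** 2 + (y * (1 + i / 90)) ** 2) * t / w)

    cube = np.array([[[toy_image(i, t, w) for w in wave] for t in tauv] for i in incl])
    # cube has shape (len(incl), len(tauv), len(wave), npix, npix)

    interp = RegularGridInterpolator((incl, tauv, wave), cube)
    image = interp([47.3, 35.0, 10.0])[0]   # image at an off-grid parameter combination
    print(image.shape)
    ```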

  4. Frequency of early vascular aging and associated risk factors among an adult population in Latin America: the OPTIMO study.

    PubMed

    Botto, Fernando; Obregon, Sebastian; Rubinstein, Fernando; Scuteri, Angelo; Nilsson, Peter M; Kotliar, Carol

    2018-03-01

    The main objective was to estimate the frequency of early vascular aging (EVA) in a sample of subjects from Latin America, with emphasis on young adults. We included 1416 subjects from 12 countries in Latin America who provided information about lifestyle, cardiovascular risk factors (CVRF), and anthropometrics. We measured pulse wave velocity (PWV) as a marker of arterial stiffness, and blood pressure (BP) using an oscillometric device (Mobil-O-Graph). To determine the frequency of EVA, we used multiple linear regression to estimate each subject's PWV expected for his/her age and systolic BP, and compared with observed values to obtain standardized residuals (z-scores). We defined EVA when the z-score was ≥1.96. Finally, a multivariable logistic regression analysis was performed to determine baseline characteristics associated with EVA. Mean age was 49.9 ± 15.5 years, and 50.3% were male. Mean PWV was 7.52 m/s (SD 1.97), mean systolic BP was 125.3 mmHg (SD 16.7) and mean diastolic BP was 78.9 mmHg (SD 12.2). The frequency of EVA was 5.7% in the total population, 9.8% in adults aged 40 years or younger, and 18.7% in those aged 30 years or younger. In these young adults, multiple logistic regression analyses demonstrated that dyslipidemia and hypertension showed an independent association with EVA, and smoking a borderline association (p = 0.07). In conclusion, the frequency of EVA in a sample from Latin America was around 6%, with higher rates in young adults. These results support screening for CVRF and EVA during early adulthood.
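
    The residual-based definition of EVA used here (regress PWV on age and systolic BP, standardize the residuals, flag z ≥ 1.96) is easy to sketch. The data below are synthetic and the coefficients invented; only the z-score logic described in the abstract is illustrated.

    ```python
    # Sketch: flagging early vascular aging (EVA) from standardized regression residuals.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(5)
    n = 1416
    age = rng.uniform(20, 80, n)
    sbp = rng.normal(125, 17, n)
    pwv = 4 + 0.08 * age + 0.01 * sbp + rng.normal(0, 1.0, n)    # synthetic PWV (m/s)

    # Expected PWV for each subject's age and systolic BP.
    X = np.column_stack([age, sbp])
    model = LinearRegression().fit(X, pwv)

    residuals = pwv - model.predict(X)
    z = (residuals - residuals.mean()) / residuals.std(ddof=1)   # standardized residuals

    eva = z >= 1.96                                              # EVA definition from the study
    print(f"EVA frequency: {100 * eva.mean():.1f}%")
    ```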

  5. Latin American immigrants have limited access to health insurance in Japan: a cross sectional study

    PubMed Central

    2012-01-01

    Background Japan provides universal health insurance to all legal residents. Prior research has suggested that immigrants to Japan disproportionately lack health insurance coverage, but no prior study has used rigorous methodology to examine this issue among Latin American immigrants in Japan. The aim of our study, therefore, was to assess the pattern of health insurance coverage and predictors of uninsurance among documented Latin American immigrants in Japan. Methods We used a cross sectional, mixed method approach using a probability proportional to estimated size sampling procedure. Of 1052 eligible Latin American residents mapped through extensive fieldwork in selected clusters, 400 immigrant residents living in Nagahama City, Japan were randomly selected for our study. Data were collected through face-to-face interviews using a structured questionnaire developed from qualitative interviews. Results Our response rate was 70.5% (n = 282). Respondents were mainly from Brazil (69.9%), under 40 years of age (64.5%) and had lived in Japan for 9.45 years (SE 0.44; median, 8.00). We found a high prevalence of uninsurance (19.8%) among our sample compared with the estimated national average of 1.3% in the general population. Among the insured full time workers (n = 209), 55.5% were not covered by the Employee's Health Insurance. Many immigrants cited financial trade-offs as the main reasons for uninsurance. Lacking of knowledge that health insurance is mandatory in Japan, not having a chronic disease, and having one or no children were strong predictors of uninsurance. Conclusions Lack of health insurance for immigrants in Japan is a serious concern for this population as well as for the Japanese health care system. Appropriate measures should be taken to facilitate access to health insurance for this vulnerable population. PMID:22443284

  6. Scalability of Parallel Spatial Direct Numerical Simulations on Intel Hypercube and IBM SP1 and SP2

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Hanebutte, Ulf R.; Zubair, Mohammad

    1995-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube and IBM SP1 and SP2 parallel computers is documented. Spatially evolving disturbances associated with the laminar-to-turbulent transition in boundary-layer flows are computed with the PSDNS code. The feasibility of using the PSDNS to perform transition studies on these computers is examined. The results indicate that PSDNS approach can effectively be parallelized on a distributed-memory parallel machine by remapping the distributed data structure during the course of the calculation. Scalability information is provided to estimate computational costs to match the actual costs relative to changes in the number of grid points. By increasing the number of processors, slower than linear speedups are achieved with optimized (machine-dependent library) routines. This slower than linear speedup results because the computational cost is dominated by FFT routine, which yields less than ideal speedups. By using appropriate compile options and optimized library routines on the SP1, the serial code achieves 52-56 M ops on a single node of the SP1 (45 percent of theoretical peak performance). The actual performance of the PSDNS code on the SP1 is evaluated with a "real world" simulation that consists of 1.7 million grid points. One time step of this simulation is calculated on eight nodes of the SP1 in the same time as required by a Cray Y/MP supercomputer. For the same simulation, 32-nodes of the SP1 and SP2 are required to reach the performance of a Cray C-90. A 32 node SP1 (SP2) configuration is 2.9 (4.6) times faster than a Cray Y/MP for this simulation, while the hypercube is roughly 2 times slower than the Y/MP for this application. KEY WORDS: Spatial direct numerical simulations; incompressible viscous flows; spectral methods; finite differences; parallel computing.

  7. A Shared Heritage: Afro-Latin@s and Black History

    ERIC Educational Resources Information Center

    Busey, Christopher L.; Cruz, Bárbara C.

    2015-01-01

    As the Latin@ population continues to grow in the United States, it is imperative that social studies teachers are aware of the rich history and sociocultural complexities of Latin@ identity. In particular, there is a large population of Latin@s of African descent throughout Latin America, the Caribbean, and North America. However, Afro-Latin@s…

  8. Detection and follow-up of chronic obstructive pulmonary disease (COPD) and risk factors in the Southern Cone of Latin America: the pulmonary risk in South America (PRISA) study.

    PubMed

    Rubinstein, Adolfo L; Irazola, Vilma E; Bazzano, Lydia A; Sobrino, Edgardo; Calandrelli, Matías; Lanas, Fernando; Lee, Alison G; Manfredi, Jose A; Olivera, Héctor; Ponzo, Jacqueline; Seron, Pamela; He, Jiang

    2011-06-01

    The World Health Organization has estimated that by 2030, chronic obstructive pulmonary disease will be the third leading cause of death worldwide. Most knowledge of chronic obstructive pulmonary disease is based on studies performed in Europe or North America and little is known about the prevalence, patient characteristics and change in lung function over time in patients in developing countries, such as those of Latin America. This lack of knowledge is in sharp contrast to the high levels of tobacco consumption and exposure to biomass fuels exhibited in Latin America, both major risk factors for the development of chronic obstructive pulmonary disease. Studies have also demonstrated that most Latin American physicians frequently do not follow international chronic obstructive pulmonary disease diagnostic and treatment guidelines. The PRISA Study will expand the current knowledge regarding chronic obstructive pulmonary disease and risk factors in Argentina, Chile and Uruguay to inform policy makers and health professionals on the best policies and practices to address this condition. PRISA is an observational, prospective cohort study with at least four years of follow-up. In the first year, PRISA has employed a randomized three-staged stratified cluster sampling strategy to identify 6,000 subjects from Marcos Paz and Bariloche, Argentina, Temuco, Chile, and Canelones, Uruguay. Information, such as comorbidities, socioeconomic status and tobacco and biomass exposure, will be collected and spirometry, anthropometric measurements, blood sampling and electrocardiogram will be performed. In year four, subjects will have repeat measurements taken. There is no longitudinal data on chronic obstructive pulmonary disease incidence and risk factors in the southern cone of Latin America, therefore this population-based prospective cohort study will fill knowledge gaps in the prevalence and incidence of chronic obstructive pulmonary disease, patient characteristics and changes in lung function over time as well as quality of life and health care resource utilization. Information gathered during the PRISA Study will inform public health interventions and prevention practices to reduce risk of COPD in the region.

  9. Maternal control, cognitive style, and childhood anxiety: a test of a theoretical model in a multi-ethnic sample.

    PubMed

    Creveling, C Christiane; Varela, R Enrique; Weems, Carl F; Corey, David M

    2010-08-01

    This study tested a theoretical model of the interrelations among controlling parenting, negative cognitive styles, children's anxiety, and race/ethnicity. The model suggests that, in general, cognitive style mediates the relation between maternal control and child anxiety but that the set of associations may differ as a function of ethnicity. African American (n = 235), Latin American (n = 56), and European American (n = 136) children completed measures of their anxiety, cognitive schemas reflecting impaired autonomy/performance and disconnection/rejection domains, and maternal control. Results indicated that a disconnection/rejection negative cognitive style mediated the effect of perceived maternal control on childhood anxiety only for the European American group. Maternal control was associated with the impaired autonomy/performance cognitive style for each of the three ethnic groups and with a disconnection/rejection cognitive style only for the European American and Latin American groups. Maternal control had an indirect effect on anxiety through the disconnection/rejection cognitive style for the Latin American group. The results are discussed in terms of how the model presented extends current theories of anxiety problems to African American and Latin American children by noting that significant cultural variations may exist in how parenting practices and cognitive styles relate to children's anxiety levels.

  10. [Design and validation of a scale to assess Latin American medical students' perception on the labour of the first level of health care].

    PubMed

    Mayta-Tristán, Percy; Mezones-Holguín, Edward; Pereyra-Elías, Reneé; Montenegro-Idrogo, Juan J; Mejia, Christian R; Dulanto-Pizzorni, Andrés; Muñoz, Sergio R

    2013-04-01

    To design and validate a scale to assess Latin American medical students' perception of the labor at the first level of health care (FLHC). An observational, analytic and multicentre study was carried out in two phases: i) a self-administered questionnaire regarding perceptions of FLHC labor was designed; ii) this questionnaire was applied to medical students from 18 universities in eight Spanish-speaking Latin American countries. An exploratory factor analysis (EFA) was performed through a principal components analysis with orthogonal varimax rotation. Sample adequacy was evaluated. Factor extraction was based on Kaiser's criteria, Cattell's scree test and the explained variance (>5%). Internal consistency was measured with Cronbach's alpha. 423 students were included in the analysis; 53.4% were from Peruvian universities. After the EFA, the questionnaire retained 11 items, distributed across three domains that together explained 55.47% of the total variance: i) perceptions concerning the FLHC physician; ii) perceptions concerning FLHC labor; and iii) perceptions about the economic consequences of working in FLHC. The scale is composed of three domains and can be used to assess perceptions of medical work at the first level of health care among Spanish-speaking Latin American medical students.
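
    The two analysis steps named here, a varimax-rotated exploratory factor analysis and Cronbach's alpha, can be sketched on synthetic Likert-style data. This uses scikit-learn's FactorAnalysis (with varimax rotation) as a stand-in for the PCA-based EFA reported in the study; the simulated item scores and loadings do not correspond to the actual FLHC scale.

    ```python
    # Sketch: varimax-rotated factor analysis plus Cronbach's alpha on synthetic item scores.
    # Assumes scikit-learn >= 0.24 for FactorAnalysis(rotation="varimax").
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(11)
    n_respondents, n_items = 423, 11
    latent = rng.normal(size=(n_respondents, 3))                # three underlying domains
    loadings = rng.uniform(0.5, 1.0, size=(3, n_items))
    items = latent @ loadings + rng.normal(0, 0.7, size=(n_respondents, n_items))

    fa = FactorAnalysis(n_components=3, rotation="varimax").fit(items)
    print(np.round(fa.components_.T, 2))                        # item loadings per factor

    def cronbach_alpha(x):
        """Internal consistency of a set of items (rows = respondents)."""
        k = x.shape[1]
        item_var = x.var(axis=0, ddof=1).sum()
        total_var = x.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
    ```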

  11. Research into the Architecture of CAD Based Robot Vision Systems

    DTIC Science & Technology

    1988-02-09

    Vision ... and "Automatic Generation of Recognition Features for Computer Vision," Mudge, Turney and Volz, published in Robotica (1987). All of the... Occluded Parts," (T.N. Mudge, J.L. Turney, and R.A. Volz), Robotica, vol. 5, 1987, pp. 117-127. 5. "Vision Algorithms for Hypercube Machines," (T.N. Mudge

  12. A graph with fractional revival

    NASA Astrophysics Data System (ADS)

    Bernard, Pierre-Antoine; Chan, Ada; Loranger, Érika; Tamon, Christino; Vinet, Luc

    2018-02-01

    An example of a graph that admits balanced fractional revival between antipodes is presented. It is obtained by establishing the correspondence between the quantum walk on a hypercube where the opposite vertices across the diagonals of each face are connected, and the coherent transport of single excitations in the extension of the Krawtchouk spin chain with next-to-nearest-neighbour interactions.
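
    As a point of reference for the antipodal-transport language used here, the sketch below sets up a continuous-time quantum walk on the plain n-cube and evaluates the transition amplitude between antipodal vertices via the matrix exponential. It does not build the paper's modified graph (with opposite vertices across each face's diagonals also connected); for the unmodified hypercube the walk exhibits perfect state transfer to the antipode at t = π/2, whereas fractional revival would leave the probability split between a vertex and its antipode.

    ```python
    # Sketch: continuous-time quantum walk on the n-cube, amplitude between antipodal vertices.
    import numpy as np
    from scipy.linalg import expm

    n = 4                                     # dimension of the hypercube
    N = 2 ** n

    # Adjacency matrix: vertices are bit strings, edges join strings differing in one bit.
    A = np.zeros((N, N))
    for v in range(N):
        for bit in range(n):
            A[v, v ^ (1 << bit)] = 1.0

    t = np.pi / 2
    U = expm(-1j * A * t)                     # unitary evolution of the walk
    antipode = N - 1                          # bitwise complement of vertex 0
    print(abs(U[0, antipode]) ** 2)           # transfer probability from 0 to its antipode
    ```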

  13. Dietary Interventions and Blood Pressure in Latin America - Systematic Review and Meta-Analysis

    PubMed Central

    Mazzaro, Caroline Cantalejo; Klostermann, Flávia Caroline; Erbano, Bruna Olandoski; Schio, Nicolle Amboni; Guarita-Souza, Luiz César; Olandoski, Marcia; Faria-Neto, José Rocha; Baena, Cristina Pellegrino

    2014-01-01

    Background High blood pressure is the major risk factor for cardiovascular disease. Low blood pressure control rates in Latin American populations emphasize the need for gathering evidence on effective therapies. Objective To evaluate the effects of dietary interventions on blood pressure in Latin American populations. Methods Systematic review. Electronic databases (MEDLINE/PubMed, Embase, Cochrane Library, CINAHL, Web of Science, Scopus, SciELO, LILACS and VHL) were searched and manual search for studies published up to April 2013 was performed. Parallel studies about dietary interventions in Latin American adult populations assessing arterial blood pressure (mm Hg) before and after intervention were included. Results Of the 405 studies identified, 10 randomized controlled trials were included and divided into 3 subgroups according to the proposed dietary intervention. There was a non-significant reduction in systolic blood pressure in the subgroups of mineral replacement (-4.82; 95% CI: -11.36 to 1.73) and complex pattern diets (-3.17; 95% CI: -7.62 to 1.28). Regarding diastolic blood pressure, except for the hyperproteic diet subgroup, all subgroups showed a significant reduction in blood pressure: -4.66 mmHg (95% CI: -9.21 to -0.12) and -4.55 mmHg (95% CI: -7.04 to -2.06) for mineral replacement and complex pattern diets, respectively. Conclusion Available evidence on the effects of dietary changes on blood pressure in Latin American populations indicates a homogeneous effect of those interventions, although not significant for systolic blood pressure. Samples were small and the quality of the studies was generally low. Larger studies are required to build robust evidence. PMID:24676220

  14. Dietary interventions and blood pressure in Latin America - systematic review and meta-analysis.

    PubMed

    Mazzaro, Caroline Cantalejo; Klostermann, Flávia Caroline; Erbano, Bruna Olandoski; Schio, Nicolle Amboni; Guarita-Souza, Luiz César; Olandoski, Marcia; Faria-Neto, José Rocha; Baena, Cristina Pellegrino

    2014-04-01

    High blood pressure is the major risk factor for cardiovascular disease. Low blood pressure control rates in Latin American populations emphasize the need for gathering evidence on effective therapies. To evaluate the effects of dietary interventions on blood pressure in Latin American populations. Systematic review. Electronic databases (MEDLINE/PubMed, Embase, Cochrane Library, CINAHL, Web of Science, Scopus, SciELO, LILACS and VHL) were searched and manual search for studies published up to April 2013 was performed. Parallel studies about dietary interventions in Latin American adult populations assessing arterial blood pressure (mm Hg) before and after intervention were included. Of the 405 studies identified, 10 randomized controlled trials were included and divided into 3 subgroups according to the proposed dietary intervention. There was a non-significant reduction in systolic blood pressure in the subgroups of mineral replacement (-4.82; 95% CI: -11.36 to 1.73) and complex pattern diets (-3.17; 95% CI: -7.62 to 1.28). Regarding diastolic blood pressure, except for the hyperproteic diet subgroup, all subgroups showed a significant reduction in blood pressure: -4.66 mmHg (95% CI: -9.21 to -0.12) and -4.55 mmHg (95% CI: -7.04 to -2.06) for mineral replacement and complex pattern diets, respectively. Available evidence on the effects of dietary changes on blood pressure in Latin American populations indicates a homogeneous effect of those interventions, although not significant for systolic blood pressure. Samples were small and the quality of the studies was generally low. Larger studies are required to build robust evidence.

  15. Health-related quality of life of latin-american immigrants and spanish-born attended in spanish primary health care: socio-demographic and psychosocial factors.

    PubMed

    Salinero-Fort, Miguel Ángel; Gómez-Campelo, Paloma; Bragado-Alvárez, Carmen; Abánades-Herranz, Juan Carlos; Jiménez-García, Rodrigo; de Burgos-Lunar, Carmen

    2015-01-01

    This study compares the health-related quality of life of Spanish-born and Latin American-born individuals settled in Spain. Socio-demographic and psychosocial factors associated with health-related quality of life are analyzed. A cross-sectional Primary Health Care multi center-based study of Latin American-born (n = 691) and Spanish-born (n = 903) outpatients from 15 Primary Health Care Centers (Madrid, Spain). The Medical Outcomes Study 36-Item Short Form Health Survey (SF-36) was used to assess health-related quality of life. Socio-demographic, psychosocial, and specific migration data were also collected. Compared to Spanish-born participants, Latin American-born participants reported higher health-related quality of life in the physical functioning and vitality dimensions. Across the entire sample, Latin American-born participants, younger participants, men and those with high social support reported significantly higher levels of physical health. Men with higher social support and a higher income reported significantly higher mental health. When stratified by gender, data show that for men physical health was only positively associated with younger age. For women, in addition to age, social support and marital status were significantly related. Both men and women with higher social support and income had significantly better mental health. Finally, for immigrants, the physical and mental health components of health-related quality of life were not found to be significantly associated with any of the pre-migration factors or conditions of migration. Only the variable "exposure to political violence" was significantly associated with the mental health component (p = 0.014). The key factors to understanding HRQoL among Latin American-born immigrants settled in Spain are age, sex and social support. Therefore, strategies to maintain optimal health outcomes in these immigrant communities should include public policies on social inclusion in the host society and focus on improving social support networks in order to foster and maintain the health and HRQoL of this group.

  16. A pilot test of the Latin active hip hop intervention to increase physical activity among low-income Mexican-American adolescents.

    PubMed

    Romero, Andrea J

    2012-01-01

    The primary purpose of the current study was to develop, implement, and evaluate a hip hop dance intervention, Latin Active, among low-income Mexican-American adolescents. Mexican-descent adolescents tend to have disproportionate rates of low physical activity, overweight status, and obesity. The study used a 5-week intervention design with pretest and post-test self-report measures. Charter middle school (grades 6-9) health/science classes in a low-income neighborhood were the setting for the Latin Active intervention. Overall, 81 participants were recruited; 73 (n = 41, female; n = 32, male) provided active parental consent to complete pretest/post-test surveys. Intervention: The Latin Active program included 10 interactive 50-minute lessons that were delivered twice a week during science/health classes. The curriculum was created on the basis of Social Cognitive Theory, Critical Hip Hop Pedagogy, and feedback from key stakeholders. The lessons focused on increasing physical activity as well as on neighborhood barriers. The self-report pretest (n = 73) and post-test (n = 56) surveys included measures for frequency of vigorous physical activity, self-efficacy, and neighborhood barriers. Analysis: Paired-sample t-test analyses were conducted to assess mean differences from pretest to post-test results for intervention outcomes by gender. The Latin Active program (with 77% retention at post-test) significantly increased vigorous physical activity and dance (p < .05) and increased self-efficacy (p < .05) among girls, and it decreased perception of neighborhood barriers (p < .05) among boys. Conclusion: A hip hop physical activity program, Latin Active demonstrated preliminary efficacy to increase girls' vigorous physical activity and to decrease boys' perception of neighborhood barriers to physical activity. Future research will need to use a randomized, controlled design and investigate the effect of the program on measures of body mass index.

  17. Sociopolitical development, work salience, and vocational expectations among low socioeconomic status African American, Latin American, and Asian American youth.

    PubMed

    Diemer, Matthew A; Wang, Qiu; Moore, Traymanesha; Gregory, Shannon R; Hatcher, Keisha M; Voight, Adam M

    2010-05-01

    Structural barriers constrain marginalized youths' development of work salience and vocational expectations. Sociopolitical development (SPD), the consciousness of, and motivation to reduce, sociopolitical inequality, may facilitate the negotiation of structural constraints. A structural model of SPD's impact on work salience and vocational expectations was proposed and its generalizability tested among samples of low-socioeconomic-status African American, Latin American, and Asian American youth, with Educational Longitudinal Study data. Measurement and temporal invariance of these constructs was first established before testing the proposed model across the samples. Across the three samples, 10th-grade SPD had significant effects on 10th-grade work salience and vocational expectations; 12th-grade SPD had a significant effect on 12th-grade work salience. Tenth-grade SPD had significant indirect effects on 12th-grade work salience and on 12th-grade vocational expectations for all three samples. These results suggest that SPD facilitates the agentic negotiation of constraints on the development of work salience and vocational expectations. Given the impact of adolescent career development on adult occupational attainment, SPD may also foster social mobility among youth constrained by an inequitable opportunity structure. 2010 APA, all rights reserved

  18. Serological evaluation for Chagas disease in migrants from Latin American countries resident in Rome, Italy.

    PubMed

    Pane, Stefania; Giancola, Maria Letizia; Piselli, Pierluca; Corpolongo, Angela; Repetto, Ernestina; Bellagamba, Rita; Cimaglia, Claudia; Carrara, Stefania; Ghirga, Piero; Oliva, Alessandra; Bevilacqua, Nazario; Al Rousan, Ahmad; Nisii, Carla; Ippolito, Giuseppe; Nicastri, Emanuele

    2018-05-08

Chagas disease (CD) is a systemic parasitic infection caused by the protozoan Trypanosoma cruzi, whose chronic phase may lead to cardiac and intestinal disorders. Although CD is endemic in Latin America, where it is transmitted mainly by vectors, large-scale migration to other countries has turned it into a global health problem because of its alternative transmission routes: blood transfusion, tissue transplantation, and congenital transmission. The aim of this study was to compare the performance of two commercially available tests for serological diagnosis of CD in a group of Latin American migrants living in a non-endemic setting (Rome, Italy). The study was based on a cross-sectional analysis of seroprevalence in this group. Epidemiological risk factors associated with CD were also evaluated in this study population. The present study was conducted on 368 subjects from the Latin American community resident in Rome. Following WHO guidelines, we employed a diagnostic strategy based on two tests to detect IgG antibodies against T. cruzi in the blood (a lysate antigen-based ELISA and a chemiluminescent microparticle immunoassay, CMIA, composed of multiple recombinant antigens), followed by a third test (an immunochromatographic assay) on discordant samples. Our diagnostic approach produced 319/368 (86.7%) concordant negative and 30/368 (8.1%) concordant positive results after the first screening. Discrepancies were obtained for 19/368 (5.2%) samples that were tested using the third assay, obtaining 2 more positive and 17 negative results. The final count of positive samples was 32/368 (8.7% of the tested population). Increasing age, birth in Bolivia, and previous residence in a mud house were independent factors associated with T. cruzi-positive serology. Serological diagnosis of CD is still challenging because of the lack of a reference standard serological assay. Our results reaffirm the importance of performing CD screening in non-endemic countries; employing a fully automated and highly sensitive CMIA assay first could be a cost- and resource-effective strategy for mass screening of low-risk patients. However, our results also suggest that the WHO strategy of using two different serological assays, combined with epidemiological information, remains the best approach for patients coming from endemic countries.
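    The two-assay screening with a tie-breaker on discordant samples lends itself to a simple decision rule. The sketch below is illustrative only (the function and argument names are invented, not from the study) and assumes the third, immunochromatographic assay is run only when the ELISA and CMIA disagree.

```python
def chagas_serostatus(elisa_pos: bool, cmia_pos: bool, tie_breaker=None) -> str:
    """Two-assay screening with a third test on discordant samples.

    A sketch of the WHO-style strategy described above; names are
    hypothetical and no real assay interface is implied.
    """
    if elisa_pos and cmia_pos:
        return "positive"
    if not elisa_pos and not cmia_pos:
        return "negative"
    # Discordant first-line results: defer to the immunochromatographic assay
    if tie_breaker is None:
        return "discordant: run immunochromatographic assay"
    return "positive" if tie_breaker else "negative"

print(chagas_serostatus(True, False, tie_breaker=True))   # -> "positive"
```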

  19. A soil-specific agro-ecological strategy for sustainable production in Argentina farm fields

    NASA Astrophysics Data System (ADS)

    Zamora, Martin; Barbera, Agustin; Castro-Franco, Mauricio; Hansson, Alejandro; Domenech, Marisa

    2017-04-01

Increasing application frequencies and doses of pesticides (notably glyphosate) and fertilizers, deterioration of soil structure, biotic balance and fertility, and groundwater pollution characterize the current Argentinian agricultural model. In this context, agro-ecological innovations are needed to develop a truly sustainable agriculture while enhancing the food supply. Precision agriculture technologies can strengthen the expansion of agro-ecological farming in experimental farm fields. The aim of this study was to propose a soil-specific agro-ecological strategy for sustainable production at field scale, focused on the use of soil sensors and digital soil mapping techniques. The strategy was developed in a 15-hectare farm field in transition to agro-ecological management, located at the Barrow Experimental Station (Lat: -38.322844, Lon: -60.25572), Argentina. The strategy included five steps: (i) to measure apparent electrical conductivity (ECa) and elevation within the agro-ecological farm field; (ii) to apply a clustering method based on the MULTISPATI-PCA algorithm to delimit three soil-specific zones (Z1, Z2 and Z3); (iii) to select three soil sampling points per zone using the conditioned Latin hypercube method, with elevation and ECa as auxiliary information; (iv) to collect soil samples at 2-10 cm depth at each point and determine in the laboratory total organic carbon content (TOC), cation-exchange capacity (CEC), pH and phosphorus availability (P-Bray), with soil bulk density (SBD) additionally measured at 0-20 cm depth; and finally (v) to recommend a management strategy for each soil-specific zone. Important differences in soil properties among zones suggest that the strategy was able to support soil-specific agro-ecological management practices. pH and P-Bray were significantly (p<0.05) higher in Z1 than in Z2 and Z3. TOC did not show significant differences among zones, but it was highest in Z2. CEC was significantly (p<0.05) lower in Z3 than in the other zones. SBD did not show significant differences among zones, although it was highest in Z1. From these results, we propose an agro-ecological strategy based on continuous nutrient cycling. During the first two years, P-Bray levels will be adjusted among zones by using different external phosphorus sources. Only in Z3 will this be achieved by adding P fertilizer and by rotating plots with a high stocking rate; the aim is to increase soil organic matter content and CEC. Furthermore, P will be supplied through manure, because the animal diet will include wheat husk, in order to achieve similar P levels among zones. The proposed strategy demonstrates that soil-specific agro-ecological management enables a sustainable scheme for Argentinian agro-productive systems.
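    Conditioned Latin hypercube sampling of the kind used in step (iii) is usually solved by a stochastic search that picks, from all candidate locations, a subset whose auxiliary values (here ECa and elevation) form an approximate Latin hypercube. The sketch below is a minimal, generic illustration of that idea in Python/NumPy, not the software used in the study; the synthetic auxiliary data, annealing schedule, and cost function are all assumptions.

```python
import numpy as np

def clhs(aux, n_samples, n_iter=5000, temp=1.0, cooling=0.999, rng=None):
    """Minimal conditioned Latin hypercube sampling sketch.

    aux : (N, k) array of auxiliary variables (e.g. ECa, elevation)
          observed at N candidate locations.
    Returns indices of n_samples locations whose auxiliary values
    approximate a Latin hypercube over the k variables.
    """
    rng = np.random.default_rng(rng)
    N, k = aux.shape
    # Quantile edges defining n_samples strata per auxiliary variable
    edges = np.quantile(aux, np.linspace(0, 1, n_samples + 1), axis=0)

    def cost(idx):
        c = 0.0
        for j in range(k):
            counts, _ = np.histogram(aux[idx, j], bins=edges[:, j])
            c += np.abs(counts - 1).sum()   # ideal: one point per stratum
        return c

    idx = rng.choice(N, n_samples, replace=False)
    best, best_cost = idx.copy(), cost(idx)
    cur_cost = best_cost
    for _ in range(n_iter):
        cand = idx.copy()
        # swap one selected point for an unselected candidate
        cand[rng.integers(n_samples)] = rng.choice(np.setdiff1d(np.arange(N), cand))
        c = cost(cand)
        if c < cur_cost or rng.random() < np.exp((cur_cost - c) / max(temp, 1e-9)):
            idx, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = idx.copy(), c
        temp *= cooling
    return best

# Synthetic ECa (mS/m) and elevation (m) for 500 candidate cells (invented values)
aux = np.column_stack([np.random.gamma(2.0, 10.0, 500),
                       np.random.normal(120.0, 5.0, 500)])
print(clhs(aux, n_samples=9))   # e.g. three points for each of three zones
```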

  20. Comparative Analysis of Reconstructed Image Quality in a Simulated Chromotomographic Imager

    DTIC Science & Technology

    2014-03-01

    quality . This example uses five basic images a backlit bar chart with random intensity, 100 nm separation. A total of 54 initial target...compared for a variety of scenes. Reconstructed image quality is highly dependent on the initial target hypercube so a total of 54 initial target...COMPARATIVE ANALYSIS OF RECONSTRUCTED IMAGE QUALITY IN A SIMULATED CHROMOTOMOGRAPHIC IMAGER THESIS

  1. Different Formulations of the Orthogonal Array Problem and Their Symmetries

    DTIC Science & Technology

    2014-06-19

t). Note that OAP(k, s, t, λ) is a polytope, not an unbounded polyhedron, because it can be embedded inside the hypercube [0, λ]^m. Theorem 5. The OAP(k...3.1) is a facet since otherwise OAP(k, s, t, λ) would be an unbounded polyhedron. Then there exists N_y ∈ R^m satisfying all but the 28 facet defining

  2. Hypercube Solutions for Conjugate Directions

    DTIC Science & Technology

    1991-12-01

1800s, Charles Babbage had designed his Difference Engine and proceeded to the more advanced Analytical Engine. These machines were never completed...Consider his motivation. The following example was frequently cited by Charles Babbage (1792-1871) to justify the construction of his first computing...360 LIST OF REFERENCES [1] P. Morrison and E. Morrison, editors. Charles Babbage and His Calculating Engines. Dover Publications, Inc., New

  3. HERMIES-I: a mobile robot for navigation and manipulation experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbin, C.R.; Barhen, J.; de Saussure, G.

    1985-01-01

The purpose of this paper is to report the current status of investigations ongoing at the Center for Engineering Systems Advanced Research (CESAR) in the areas of navigation and manipulation in unstructured environments. The HERMIES-I mobile robot, a prototype of a series which contains many of the major features needed for remote work in hazardous environments, is discussed. Initial experimental work at CESAR has begun in the area of navigation. It briefly reviews some of the ongoing research in autonomous navigation and describes initial research with HERMIES-I and associated graphic simulation. Since the HERMIES robots will generally be composed of a variety of asynchronously controlled hardware components (such as manipulator arms, digital image sensors, sonars, etc.) it seems appropriate to consider future development of the HERMIES brain as a hypercube ensemble machine with concurrent computation and associated message passing. The basic properties of such a hypercube architecture are presented. Decision-making under uncertainty eventually permeates all of our work. Following a survey of existing analytical approaches, it was decided that a stronger theoretical basis is required. As such, this paper presents the framework for a recently developed hybrid uncertainty theory. 21 refs., 2 figs.

  4. Critical and Griffiths-McCoy singularities in quantum Ising spin glasses on d-dimensional hypercubic lattices: A series expansion study.

    PubMed

    Singh, R R P; Young, A P

    2017-08-01

    We study the ±J transverse-field Ising spin-glass model at zero temperature on d-dimensional hypercubic lattices and in the Sherrington-Kirkpatrick (SK) model, by series expansions around the strong-field limit. In the SK model and in high dimensions our calculated critical properties are in excellent agreement with the exact mean-field results, surprisingly even down to dimension d=6, which is below the upper critical dimension of d=8. In contrast, at lower dimensions we find a rich singular behavior consisting of critical and Griffiths-McCoy singularities. The divergence of the equal-time structure factor allows us to locate the critical coupling where the correlation length diverges, implying the onset of a thermodynamic phase transition. We find that the spin-glass susceptibility as well as various power moments of the local susceptibility become singular in the paramagnetic phase before the critical point. Griffiths-McCoy singularities are very strong in two dimensions but decrease rapidly as the dimension increases. We present evidence that high enough powers of the local susceptibility may become singular at the pure-system critical point.

  5. A parallel strategy for implementing real-time expert systems using CLIPS

    NASA Technical Reports Server (NTRS)

    Ilyes, Laszlo A.; Villaseca, F. Eugenio; Delaat, John

    1994-01-01

As evidenced by current literature, there appears to be a continued interest in the study of real-time expert systems. It is generally recognized that speed of execution is only one consideration when designing an effective real-time expert system. Some other features one must consider are the expert system's ability to perform temporal reasoning, handle interrupts, prioritize data, contend with data uncertainty, and perform context focusing as dictated by the incoming data to the expert system. This paper presents a strategy for implementing a real-time expert system on the iPSC/860 hypercube parallel computer using CLIPS. The strategy takes into consideration not only the execution time of the software, but also those features which define a true real-time expert system. The methodology is then demonstrated using a practical implementation of an expert system which performs diagnostics on the Space Shuttle Main Engine (SSME). This particular implementation uses an eight-node hypercube to process ten sensor measurements in order to simultaneously diagnose five different failure modes within the SSME. The main program is written in ANSI C and embeds CLIPS to better facilitate and debug the rule-based expert system.
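    As an illustration of the hardware side of such a design, the toy sketch below shows how sensor streams might be spread over an eight-node (three-dimensional) binary hypercube and how each node finds its direct neighbors by flipping one address bit. It is not the authors' SSME code and says nothing about the CLIPS rule base; the names and the round-robin assignment are assumptions made purely for illustration.

```python
# In a d-dimensional binary hypercube, node ids are d-bit integers and two
# nodes are directly connected iff their ids differ in exactly one bit.

def neighbors(node: int, dim: int) -> list[int]:
    """Direct neighbors of `node` in a `dim`-dimensional hypercube."""
    return [node ^ (1 << b) for b in range(dim)]

def assign_sensors(n_sensors: int, dim: int) -> dict[int, list[int]]:
    """Round-robin assignment of sensor streams to the 2**dim nodes."""
    mapping = {node: [] for node in range(2 ** dim)}
    for s in range(n_sensors):
        mapping[s % (2 ** dim)].append(s)
    return mapping

dim = 3                          # eight-node hypercube, as in the SSME example
print(assign_sensors(10, dim))   # ten sensor measurements spread over 8 nodes
print(neighbors(0b101, dim))     # node 5 talks directly to nodes 4, 7, and 1
```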

  6. Parallel algorithms for quantum chemistry. I. Integral transformations on a hypercube multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, R.A.; Binkley, J.S.; Colvin, M.E.

    1987-02-15

For many years it has been recognized that fundamental physical constraints such as the speed of light will limit the ultimate speed of single processor computers to less than about three billion floating point operations per second (3 GFLOPS). This limitation is becoming increasingly restrictive as commercially available machines are now within an order of magnitude of this asymptotic limit. A natural way to avoid this limit is to harness together many processors to work on a single computational problem. In principle, these parallel processing computers have speeds limited only by the number of processors one chooses to acquire. The usefulness of potentially unlimited processing speed to a computationally intensive field such as quantum chemistry is obvious. If these methods are to be applied to significantly larger chemical systems, parallel schemes will have to be employed. For this reason we have developed distributed-memory algorithms for a number of standard quantum chemical methods. We are currently implementing these on a 32 processor Intel hypercube. In this paper we present our algorithm and benchmark results for one of the bottleneck steps in quantum chemical calculations: the four index integral transformation.
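    For readers unfamiliar with the bottleneck step mentioned here, the four-index transformation rotates two-electron integrals from the atomic-orbital (AO) to the molecular-orbital (MO) basis. The serial NumPy sketch below shows the standard sequence of four quarter-transformations that parallel hypercube schemes distribute across nodes; the random arrays are placeholders, and no parallelism or permutational symmetry is shown.

```python
import numpy as np

n = 10                                      # number of basis functions (toy size)
ao = np.random.rand(n, n, n, n)             # placeholder AO integrals (pq|rs)
C = np.linalg.qr(np.random.rand(n, n))[0]   # placeholder MO coefficient matrix

# Four successive quarter-transformations: O(n^5) work instead of the naive O(n^8)
t1 = np.einsum('pqrs,pi->iqrs', ao, C)
t2 = np.einsum('iqrs,qj->ijrs', t1, C)
t3 = np.einsum('ijrs,rk->ijks', t2, C)
mo = np.einsum('ijks,sl->ijkl', t3, C)      # (ij|kl) in the MO basis
print(mo.shape)
```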

  7. Critical and Griffiths-McCoy singularities in quantum Ising spin glasses on d-dimensional hypercubic lattices: A series expansion study

    NASA Astrophysics Data System (ADS)

    Singh, R. R. P.; Young, A. P.

    2017-08-01

We study the ±J transverse-field Ising spin-glass model at zero temperature on d-dimensional hypercubic lattices and in the Sherrington-Kirkpatrick (SK) model, by series expansions around the strong-field limit. In the SK model and in high dimensions our calculated critical properties are in excellent agreement with the exact mean-field results, surprisingly even down to dimension d=6, which is below the upper critical dimension of d=8. In contrast, at lower dimensions we find a rich singular behavior consisting of critical and Griffiths-McCoy singularities. The divergence of the equal-time structure factor allows us to locate the critical coupling where the correlation length diverges, implying the onset of a thermodynamic phase transition. We find that the spin-glass susceptibility as well as various power moments of the local susceptibility become singular in the paramagnetic phase before the critical point. Griffiths-McCoy singularities are very strong in two dimensions but decrease rapidly as the dimension increases. We present evidence that high enough powers of the local susceptibility may become singular at the pure-system critical point.

  8. Spacecraft On-Board Information Extraction Computer (SOBIEC)

    NASA Technical Reports Server (NTRS)

    Eisenman, David; Decaro, Robert E.; Jurasek, David W.

    1994-01-01

The Jet Propulsion Laboratory is the Technical Monitor on an SBIR Program issued for Irvine Sensors Corporation to develop a highly compact, dual-use massively parallel processing node known as SOBIEC. SOBIEC couples 3D memory stacking technology provided by nCUBE. The node contains sufficient network Input/Output to implement up to an order-13 binary hypercube. The benefit of this network is that it scales linearly as more processors are added, and it is a superset of other commonly used interconnect topologies such as meshes, rings, toroids, and trees. In this manner, a distributed processing network can be easily devised and supported. The SOBIEC node has sufficient memory for most multi-computer applications, and also supports external memory expansion and DMA interfaces. The SOBIEC node is supported by a mature set of software development tools from nCUBE. The nCUBE operating system (OS) provides configuration and operational support for up to 8000 SOBIEC processors in an order-13 binary hypercube or any subset or partition(s) thereof. The OS is UNIX (USL SVR4) compatible, with C, C++, and FORTRAN compilers readily available. A stand-alone development system is also available to support SOBIEC test and integration.
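    The claim that a binary hypercube is a superset of rings, meshes, and toroids rests on embeddings such as the binary-reflected Gray code, sketched below. This is a textbook construction, not anything specific to SOBIEC or the nCUBE OS.

```python
# A ring of 2**d processors embeds into a d-dimensional hypercube via the
# binary-reflected Gray code: consecutive ring positions map to node ids that
# differ in exactly one bit, i.e. to physically adjacent hypercube nodes.

def gray(i: int) -> int:
    return i ^ (i >> 1)

d = 4
ring = [gray(i) for i in range(2 ** d)]
# Every consecutive pair (including the wrap-around) differs in exactly one bit
assert all(bin(ring[i] ^ ring[(i + 1) % len(ring)]).count("1") == 1
           for i in range(len(ring)))
print(ring)
```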

  9. Patterns of admixture and population structure in native populations of Northwest North America.

    PubMed

    Verdu, Paul; Pemberton, Trevor J; Laurent, Romain; Kemp, Brian M; Gonzalez-Oliver, Angelica; Gorodezky, Clara; Hughes, Cris E; Shattuck, Milena R; Petzelt, Barbara; Mitchell, Joycelynn; Harry, Harold; William, Theresa; Worl, Rosita; Cybulski, Jerome S; Rosenberg, Noah A; Malhi, Ripan S

    2014-08-01

    The initial contact of European populations with indigenous populations of the Americas produced diverse admixture processes across North, Central, and South America. Recent studies have examined the genetic structure of indigenous populations of Latin America and the Caribbean and their admixed descendants, reporting on the genomic impact of the history of admixture with colonizing populations of European and African ancestry. However, relatively little genomic research has been conducted on admixture in indigenous North American populations. In this study, we analyze genomic data at 475,109 single-nucleotide polymorphisms sampled in indigenous peoples of the Pacific Northwest in British Columbia and Southeast Alaska, populations with a well-documented history of contact with European and Asian traders, fishermen, and contract laborers. We find that the indigenous populations of the Pacific Northwest have higher gene diversity than Latin American indigenous populations. Among the Pacific Northwest populations, interior groups provide more evidence for East Asian admixture, whereas coastal groups have higher levels of European admixture. In contrast with many Latin American indigenous populations, the variance of admixture is high in each of the Pacific Northwest indigenous populations, as expected for recent and ongoing admixture processes. The results reveal some similarities but notable differences between admixture patterns in the Pacific Northwest and those in Latin America, contributing to a more detailed understanding of the genomic consequences of European colonization events throughout the Americas.

  10. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS).

    PubMed

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G; Herrera-Cuenca, Marianella; Zimberg, Ioná Z; Tucker, Katherine L; Koletzko, Berthold; Pratt, Michael

    2015-09-16

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition to a single database to enable cross-country nutritional intake comparisons. Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate dietary intake of 9000 participants enrolled. Two 24-h recalls using the Multiple Pass Method were applied among the individuals of all countries. Data from 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A food matching standardized procedure involving nutritional equivalency of local food reported by the study participants with foods available in the NDS-R database was strictly conducted by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro- and micro-nutrient intake within this region.

  11. Patterns of Admixture and Population Structure in Native Populations of Northwest North America

    PubMed Central

    Verdu, Paul; Pemberton, Trevor J.; Laurent, Romain; Kemp, Brian M.; Gonzalez-Oliver, Angelica; Gorodezky, Clara; Hughes, Cris E.; Shattuck, Milena R.; Petzelt, Barbara; Mitchell, Joycelynn; Harry, Harold; William, Theresa; Worl, Rosita; Cybulski, Jerome S.; Rosenberg, Noah A.; Malhi, Ripan S.

    2014-01-01

    The initial contact of European populations with indigenous populations of the Americas produced diverse admixture processes across North, Central, and South America. Recent studies have examined the genetic structure of indigenous populations of Latin America and the Caribbean and their admixed descendants, reporting on the genomic impact of the history of admixture with colonizing populations of European and African ancestry. However, relatively little genomic research has been conducted on admixture in indigenous North American populations. In this study, we analyze genomic data at 475,109 single-nucleotide polymorphisms sampled in indigenous peoples of the Pacific Northwest in British Columbia and Southeast Alaska, populations with a well-documented history of contact with European and Asian traders, fishermen, and contract laborers. We find that the indigenous populations of the Pacific Northwest have higher gene diversity than Latin American indigenous populations. Among the Pacific Northwest populations, interior groups provide more evidence for East Asian admixture, whereas coastal groups have higher levels of European admixture. In contrast with many Latin American indigenous populations, the variance of admixture is high in each of the Pacific Northwest indigenous populations, as expected for recent and ongoing admixture processes. The results reveal some similarities but notable differences between admixture patterns in the Pacific Northwest and those in Latin America, contributing to a more detailed understanding of the genomic consequences of European colonization events throughout the Americas. PMID:25122539

  12. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS)

    PubMed Central

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G.; Herrera-Cuenca, Marianella; Zimberg, Ioná Z.; Tucker, Katherine L.; Koletzko, Berthold; Pratt, Michael

    2015-01-01

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition to a single database to enable cross-country nutritional intake comparisons. Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate dietary intake of 9000 participants enrolled. Two 24-h recalls using the Multiple Pass Method were applied among the individuals of all countries. Data from 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A food matching standardized procedure involving nutritional equivalency of local food reported by the study participants with foods available in the NDS-R database was strictly conducted by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro- and micro-nutrient intake within this region. PMID:26389952

  13. Association of the duration of residence with obesity-related eating habits and dietary patterns among Latin-American immigrants in Spain.

    PubMed

    Marín-Guerrero, A C; Rodríguez-Artalejo, Fernando; Guallar-Castillón, P; López-García, Esther; Gutiérrez-Fisac, Juan L

    2015-01-28

    The dietary patterns of immigrants usually change with the duration of residence and progressively resemble those of the host country. However, very few studies have investigated individuals migrating to countries with a high-quality diet, such as the Mediterranean diet (MD), and none has yet focused on Latin-American immigrants. The present study examined the association of the duration of residence with obesity-related eating habits and dietary patterns among Latin-American immigrants residing in Spain. A cross-sectional study was conducted in 2008-10 in a representative sample of the adult population residing in Spain. Adherence to the MD was defined as a MD Adherence Screener score ≥ 9. Analyses were conducted by including 419 individuals aged 18-64 years born in Latin-American countries. Compared with immigrants residing in Spain for < 5 years, those residing for ≥ 10 years accounted for a lower percentage of individuals who habitually ate at fast-food restaurants and never trimmed visible fat from meat. Moreover, these immigrants were found to have a lower intake of sugary beverages and a higher intake of Na, saturated fat, fibre, olive oil, vegetables and fish and to more frequently strictly adhere to the MD. A longer duration of residence in Spain was found to be associated with both healthy and unhealthy changes in some eating habits and dietary patterns among Latin-American immigrants. Some of the healthy changes observed contrasted the 'Westernisation' of the diet reported in studies conducted in other Western countries. The results of the present study support the role of the food environment of the host country in the modification of the dietary patterns of immigrants.

  14. What predicts the quality of advanced cancer care in Latin America? A look at five countries: Argentina, Brazil, Cuba, Mexico, and Peru.

    PubMed

    Torres Vigil, Isabel; Aday, Lu Ann; De Lima, Liliana; Cleeland, Charles S

    2007-09-01

    Cancer is now a leading cause of death among adults in most Latin American nations. Yet, until recently, there has been limited research on the quality of, and access to, advanced cancer care in developing regions such as Latin America. This landmark, cross-national study assessed the quality of advanced cancer care in five Latin American countries by surveying a convenience sample of 777 physicians and nurses, and identifying the most salient influences on their quality-of-care assessments based on multiple linear regression analyses. Strategies for disseminating this survey included mass mailings, distribution at professional meetings/conferences, collaboration with Latin American institutions, professional organizations, and the Pan American Health Organization, and online posting. Results indicate that the respondents' assessments of the quality of, access to, and affordability of advanced cancer care varied significantly across nations (P<0.001). The strongest predictor of providers' national-level assessments of the quality of care was their ratings of access to advanced cancer care (Beta=0.647). Other predictors included affordability of care, country (Cuba vs. the other four countries), income-gap quintile, and institutional availability of opioid analgesics. Low prioritization of palliative care in both health care policy formulation and provider education also predicted the quality-of-care ratings. Findings from this study suggest that providers from five different nations hold similar equitable notions of quality care that are dependent on the provision of accessible and affordable care. Measures of social equity, such as the income-gap quintile of nations, and measures of policy barriers, such as the scale developed in this study, should be replicated in future studies to enable policy makers to assess and improve advanced cancer care in their countries.

  15. Impact of bullying victimization on suicide and negative health behaviors among adolescents in Latin America.

    PubMed

    Romo, Matthew L; Kelvin, Elizabeth A

    2016-11-01

    To compare the prevalence of bullying victimization, suicidal ideation, suicidal attempts, and negative health behaviors (current tobacco use, recent heavy alcohol use, truancy, involvement in physical fighting, and unprotected sexual intercourse) in five different Latin American countries and determine the association of bullying victimization with these outcomes, exploring both bullying type and frequency. Study data were from Global School-based Student Health Surveys from Bolivia, Costa Rica, Honduras, Peru, and Uruguay, which covered nationally representative samples of school-going adolescents. The surveys used a two-stage clustered sample design, sampling schools and then classrooms. Logistic regression models were run to determine the statistical significance of associations with bullying. Among the 14 560 school-going adolescents included in this study, the prevalence of any bullying victimization in the past 30 days was 37.8%. Bullying victimization was associated with greater odds of suicidal ideation with planning (adjusted odds ratio (AOR): 3.12; P < 0.0001) and at least one suicide attempt (AOR: 3.07; P < 0.0001). An increasing exposure-response effect of increasing days of bullying victimization on suicide outcomes was also observed. Bullying victimization was associated with higher odds of current tobacco use (AOR: 2.14; P < 0.0001); truancy (AOR: 1.76; P < 0.0001); physical fighting (AOR: 2.40; P < 0.0001); and unprotected sexual intercourse (AOR: 1.77; P < 0.0001). Although the prevalence of bullying victimization varied by country, its association with suicidal ideation and behavior and negative health behaviors remained relatively consistent. Addressing bullying needs to be made a priority in Latin America, and an integrated approach that also includes mental and physical health promotion is needed.

  16. Light and Dark Waves.

    ERIC Educational Resources Information Center

    Parker, Henry H.; Sturdivant, Ann

    1966-01-01

Several Latin textbooks are described and evaluated in terms of their effectiveness in teaching language mastery. These are: (1) "Using Latin," Scott, Foresman, (2) "Latin for Americans," Macmillan, (3) "Lingua Latina," by Burns, Medicus, and Sherburne, and (4) "A Basic Course in Latin," "An Intermediate Course in Latin--Reading," and "An…

  17. Integrated Pest Management of Coffee Berry Borer: Strategies from Latin America that Could Be Useful for Coffee Farmers in Hawaii.

    PubMed

    Aristizábal, Luis F; Bustillo, Alex E; Arthurs, Steven P

    2016-02-03

    The coffee berry borer (CBB), Hypothenemus hampei Ferrari (Coleoptera: Curculionidae: Scolytinae) is the primary arthropod pest of coffee plantations worldwide. Since its detection in Hawaii (September 2010), coffee growers are facing financial losses due to reduced quality of coffee yields. Several control strategies that include cultural practices, biological control agents (parasitoids), chemical and microbial insecticides (entomopathogenic fungi), and a range of post-harvest sanitation practices have been conducted to manage CBB around the world. In addition, sampling methods including the use of alcohol based traps for monitoring CBB populations have been implemented in some coffee producing countries in Latin America. It is currently unclear which combination of CBB control strategies is optimal under economical, environmental, and sociocultural conditions of Hawaii. This review discusses components of an integrated pest management program for CBB. We focus on practical approaches to provide guidance to coffee farmers in Hawaii. Experiences of integrated pest management (IPM) of CBB learned from Latin America over the past 25 years may be relevant for establishing strategies of control that may fit under Hawaiian coffee farmers' conditions.

  18. Perception of Ethical Misconduct by Neuropsychology Professionals in Latin America.

    PubMed

    Panyavin, Ivan S; Goldberg-Looney, Lisa D; Rivera, Diego; Perrin, Paul B; Arango-Lasprilla, Juan Carlos

    2015-08-01

    To date, extremely limited research has focused on the ethical aspects of clinical neuropsychology practice in Latin America. The current study aimed to identify the frequency of perceived ethical misconduct in a sample of 465 self-identified neuropsychology professionals from Latin America in order to better guide policies for training and begin to establish standards for practitioners in the region. Frequencies of neuropsychologists who knew another professional engaging in ethical misconduct ranged from 1.1% to 60.4% in the areas of research, clinical care, training, and professional relationships. The most frequently reported perceived misconduct was in the domain of professional training and expertise, with nearly two thirds of participants knowing other professionals who do not possess adequate training to be working as neuropsychologists. The least frequently reported perceived misconduct was in the domain of professional relationships. Nearly one third of participants indicated that they had never received formal training in professional ethics. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Rey-Osterrieth Complex Figure - copy and immediate recall: Normative data for the Latin American Spanish speaking adult population.

    PubMed

    Rivera, D; Perrin, P B; Morlett-Paredes, A; Galarza-Del-Angel, J; Martínez, C; Garza, M T; Saracho, C P; Rodríguez, W; Rodríguez-Agudelo, Y; Rábago, B; Aliaga, A; Schebela, S; Luna, M; Longoni, M; Ocampo-Barba, N; Fernández, E; Esenarro, L; García-Egan, P; Arango-Lasprilla, J C

    2015-01-01

To generate normative data on the Rey-Osterrieth Complex Figure Test (ROCF) across 11 countries in Latin America, with country-specific adjustments for gender, age, and education, where appropriate. The sample consisted of 3,977 healthy adults who were recruited from Argentina, Bolivia, Chile, Cuba, El Salvador, Guatemala, Honduras, Mexico, Paraguay, Peru, and Puerto Rico. Each subject was administered the ROCF as part of a larger neuropsychological battery. A standardized five-step statistical procedure was used to generate the norms. The final multiple linear regression models explained 7-34% of the variance in ROCF copy scores and 21-41% of the variance in immediate recall scores. Although t-tests showed significant differences between men and women on ROCF copy and immediate recall scores, none of the countries had an effect size larger than 0.3. As a result, gender-adjusted norms were not generated. The present study is the first to create norms for the ROCF in Latin America. As a result, this study will have important implications for the formation and practice of neuropsychology in this region.

  20. Latin American scientific contribution to ecology.

    PubMed

    Wojciechowski, Juliana; Ceschin, Fernanda; Pereto, Suelen C A S; Ribas, Luiz G S; Bezerra, Luis A V; Dittrich, Jaqueline; Siqueira, Tadeu; Padial, André A

    2017-01-01

Latin America embodies countries of special interest for ecological studies, given that areas with great value for biodiversity are located within their territories. This highlights the importance of an evaluation of ecological research in the Latin American region. We assessed the participation of Latin American researchers in ecological journals, examined patterns of international collaboration, and characterized the main features of the articles. Although Latin American publications have increased over fourteen years, they accounted for up to 9% of publications in Ecology. Brazil led the scientific production in Latin America, followed by Argentina and Mexico. In general, Latin American articles represented a low percentage of most journals' total publications, with a particularly low presence in high-impact-factor journals. Half of the Latin American publications involved international collaboration. Articles with more than five authors and with international collaboration were the most cited. Descriptive studies, mainly based on older theories, are still the majority, suggesting that Ecology is still in a developing stage in Latin America.

  1. Vision-Based Navigation and Parallel Computing

    DTIC Science & Technology

    1990-08-01

5.8. Behzad Kamgar-Parsi and Behrooz Kamgar-Parsi, "On Problem Solving with Hopfield Neural Networks", CAR-TR-462, CS-TR...Second, the hypercube connections support logarithmic implementations of fundamental parallel algorithms, such as grid permutations and scan...the pose space. It also uses a set of virtual processors to represent an orthogonal projection grid, and projections of the six-dimensional pose space

  2. Hypercat - Hypercube of Clumpy AGN Tori

    NASA Astrophysics Data System (ADS)

    Nikutta, Robert; Lopez-Rodriguez, Enrique; Ichikawa, Kohei; Levenson, Nancy; Packham, Christopher C.

    2017-06-01

Dusty tori surrounding the central engines of Active Galactic Nuclei (AGN) are required by the Unification Paradigm and are supported by many observations, e.g. a variable nuclear absorber (sometimes Compton-thick) in X-rays, reverberation mapping in the optical/UV, hot dust emission and SED shapes in the NIR/MIR, and molecular and cool-dust tori observed with ALMA in the sub-mm. While models of AGN torus SEDs have been developed and utilized for a long time, the study of the resolved emission morphology (brightness maps) has so far been under-appreciated, presumably because resolved observations of the central parsec in AGN have become possible only very recently. Currently, only NIR+MIR interferometry is capable of resolving the nuclear dust emission (but not of producing images, until MATISSE comes online). Furthermore, MIR interferometry has also delivered puzzling results, e.g. that in some resolved sources the light emanates preferentially from polar directions above the "torus" system, and not from the equatorial plane, where most of the dust is located. We are preparing the release of a panchromatic, fully interpolable hypercube of brightness maps and projected dust images for a large number of CLUMPY torus models (Nenkova+2008), which will help facilitate studies of resolved AGN emission and dust morphologies. Together with the cube we will release a comprehensive set of open-source tools (Python) that will enable researchers to work efficiently with this large hypercube: easy sub-cube selection and memory-mapping (mitigating the too-big-for-RAM problem); multi-dimensional image interpolation (an image at any wavelength and model parameter combination); simulation of observations with telescopes (compute/provide and apply a PSF) and interferometers (visibilities); and analysis of images with respect to the power contained at all scales and orientations (via 2D steerable wavelets), addressing the seemingly puzzling results mentioned above. A series of papers is in preparation, aiming at solving the puzzles and at making predictions about the resolvability of all nearby AGN tori with any combination of current and future instruments (e.g. VLTI+MATISSE, TMT+MICHI, GMT, ELT, JWST, ALMA).
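    The "image at any wavelength and model parameter combination" capability amounts to multi-dimensional interpolation over a regular parameter grid. The sketch below illustrates the idea with SciPy on a toy cube that stores one scalar per model (say, an integrated nuclear flux); the parameter names, grids, and values are invented, and the released tools interpolate full brightness maps rather than scalars.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy stand-in for a model hypercube: one scalar per model, tabulated on a
# regular grid of two (hypothetical) model parameters and wavelength.
sigma_grid = np.linspace(15, 75, 5)       # torus angular width (deg)
incl_grid = np.linspace(0, 90, 7)         # viewing inclination (deg)
wave_grid = np.linspace(2.2, 12.0, 9)     # wavelength (micron)
cube = np.random.rand(sigma_grid.size, incl_grid.size, wave_grid.size)

interp = RegularGridInterpolator((sigma_grid, incl_grid, wave_grid), cube)

# Interpolated value at an off-grid parameter/wavelength combination
print(interp([[45.0, 60.0, 8.7]]))
```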

  3. Latin and the World of Work: Career Education and Foreign Languages.

    ERIC Educational Resources Information Center

    Kennedy, Dora F.; And Others

    This curriculum guide for high school Latin courses emphasizes the usefulness of a knowledge of Latin for career preparation. As a supplement to standard Latin textbooks, a variety of classroom material is offered. The instructional approach revolves around the relation of English and Latin vocabulary acquisition and etymological knowledge on the…

  4. How Do You Say "MOO" in Latin? Assessing Student Learning and Motivation in Beginning Latin.

    ERIC Educational Resources Information Center

    Gruber-Miller, John; Benton, Cindy

    2001-01-01

    Assesses the value of VRoma for Latin language learning. In particular, discusses three exercises that were developed that combine Latin language and Roman culture in order to help students reinforce their Latin skills and gain a more in-depth understanding of ancient Roman society. (Author/VWL)

  5. Latin and Magic Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2005-01-01

    Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…
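    For readers who have not met the construction: the simplest Latin square is the cyclic one, in which cell (r, c) holds (r + c) mod n, so every symbol appears exactly once in each row and column. A minimal sketch:

```python
def cyclic_latin_square(n: int) -> list[list[int]]:
    """n x n Latin square by cyclic shifts: cell (r, c) = (r + c) mod n."""
    return [[(r + c) % n for c in range(n)] for r in range(n)]

for row in cyclic_latin_square(5):
    print(row)
```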

  6. Diagnostic labeling of COPD in five Latin American cities.

    PubMed

    Tálamo, Carlos; de Oca, Maria Montes; Halbert, Ron; Perez-Padilla, Rogelio; Jardim, José Roberto B; Muiño, Adriana; Lopez, Maria Victorina; Valdivia, Gonzalo; Pertuzé, Julio; Moreno, Dolores; Menezes, Ana Maria B

    2007-01-01

COPD is a major worldwide problem with a rising prevalence. Despite its importance, there is a lack of information regarding underdiagnosis and misdiagnosis of COPD in different countries. As part of the Proyecto Latinoamericano de Investigación en Obstrucción Pulmonar study, we examined the relationship between prior diagnostic label and airway obstruction in the metropolitan areas of five Latin American cities (São Paulo, Santiago, Mexico City, Montevideo, and Caracas). A two-stage sampling strategy was used in each of the five areas to obtain probability samples of adults aged ≥ 40 years. Participants completed a questionnaire that included questions on prior diagnoses, and prebronchodilator and postbronchodilator spirometry. A study diagnosis of COPD was based on airway obstruction, defined as a postbronchodilator FEV1/FVC < 0.70. Valid spirometry and prior diagnosis information was obtained for 5,303 participants; 758 subjects had a study diagnosis of COPD, of which 672 cases (88.7%) had not been previously diagnosed. The prevalence of undiagnosed COPD was 12.7%, ranging from 6.9% in Mexico City to 18.2% in Montevideo. Among 237 subjects with a prior COPD diagnosis, only 86 subjects (36.3%) had postbronchodilator FEV1/FVC < 0.7, while 151 subjects (63.7%) had normal spirometric values. In the same group of 237 subjects, only 34% reported ever undergoing spirometry prior to our study. Inaccurate diagnostic labeling of COPD represents an important health problem in Latin America. One possible explanation is the low rate of spirometry for COPD diagnosis.

  7. Midwifery practice and maternity services: A multisite descriptive study in Latin America and the Caribbean.

    PubMed

    Binfa, Lorena; Pantoja, Loreto; Ortiz, Jovita; Cavada, Gabriel; Schindler, Peter; Burgos, Rosa Ypania; Maganha E Melo, Célia Regina; da Silva, Lúcia Cristina Florentino Pereira; Lima, Marlise de Oliveira Pimentel; Hernández, Laura Valli; Schlenker Rm, Rosana; Sánchez, Verdún; Rojas, Mirian Solis; Huamán, Betty Cruz; Chauca, Maria Luisa Torres; Cillo, Alicia; Lofeudo, Susana; Zapiola, Sandra; Weeks, Fiona; Foster, Jennifer

    2016-09-01

Over the past three decades there has been a social movement in Latin American countries (LAC) to support humanised, physiologic birth. Rates of caesarean section overall in Latin America are approximately 35%, increasing up to 85% in some cases. There are many factors related to poor outcomes with regard to maternal and newborn/infant health in LAC countries. Maternal and perinatal outcome data within and between countries are scarce and inaccurate. The aims of this study were to: i) describe selected obstetric and neonatal outcomes of women who received midwifery care, and ii) identify the level of maternal well-being after experiencing midwifery care in 6 Latin American countries. This was a cross-sectional and descriptive study, conducted in selected maternity units in Argentina, Brazil, Chile, the Dominican Republic, Peru, and Uruguay. Quantitative methods were used to measure midwifery processes of care and maternal perceptions of well-being in labour and childbirth through a validated survey of maternal well-being and an adapted version of the American College of Nurse-Midwives (ACNM) standardized antepartum and intrapartum data set. The setting comprised maternity units from 6 Latin American countries. The final sample was a convenience sample, and the total number of participants for all sites in the six countries was 3009 low-risk women. For the countries reporting, overall, 82% of these low-risk women had spontaneous vaginal deliveries. The rate of caesarean section was 16%; the Dominican Republic had the highest rate of caesarean sections (30%) and Peru had the lowest rate (4%). The use of oxytocin in labour was widely variable, although overall there was a high proportion of women whose labour was augmented or induced. Ambulation was common, with the lowest proportion of women ambulating in labour in Chile (48%), followed by Uruguay (50%), Peru (65%), and Brazil (85%). The presence of continuous support was highest in Uruguay (93%), Chile (75%) and Argentina (55%), and Peru had the lowest (22%). Episiotomies are still prevalent in all countries; the lowest rate was reported in the Dominican Republic (22%), and the highest rates were 52% and 53% (Chile and Peru, respectively). An optimal maternal well-being score had a prevalence of 43.5% and an adequate score 30.8%; 25% of the total sample of women rated their well-being during labour and childbirth as poor. Despite evidence-based guidelines and recommendations, birth is not managed accordingly in most cases. Women feel that care is adequate, although some women report mistreatment. More research is needed to understand why such high levels of intervention exist and to test the implementation of evidence-based practices in local settings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Star Trek with Latin. Teacher's Guide. Tentative Edition.

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph; And Others

    The purpose of this guide is to assist Latin and English teachers with some background in Latin to expand the English vocabulary and reading skills of students through the study of Latin roots, prefixes, and suffixes. The introductory material in the guide provides general notes on the teaching of Latin in the Philadelphia School District,…

  9. ON THE CONSTRUCTION OF LATIN SQUARES COUNTERBALANCED FOR IMMEDIATE SEQUENTIAL EFFECTS.

    ERIC Educational Resources Information Center

    HOUSTON, TOM R., JR.

This report is one of a series describing new developments in the area of research methodology. It deals with Latin squares as a control for progressive and adjacency effects in experimental designs. The history of Latin squares is also reviewed, and several algorithms for the construction of Latin and Greco-Latin squares are proposed. The report…
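    A standard way to counterbalance immediate sequential (first-order carryover) effects is the Williams design: for an even number of treatments n, a single balanced Latin square in which every treatment immediately follows every other treatment exactly once. The sketch below builds it from the usual interleaved first row; it is a generic illustration, not one of the algorithms proposed in the report (for odd n, the square and its column-reversed mirror are both needed).

```python
def williams_square(n: int) -> list[list[int]]:
    """Balanced Latin square (Williams design) for even n: every treatment
    immediately follows every other treatment exactly once across rows."""
    first, lo, hi = [0], 1, n - 1
    while len(first) < n:          # interleave: 0, n-1, 1, n-2, 2, ...
        first.append(hi); hi -= 1
        if len(first) < n:
            first.append(lo); lo += 1
    return [[(x + r) % n for x in first] for r in range(n)]

for row in williams_square(4):
    print(row)        # 0 3 1 2 / 1 0 2 3 / 2 1 3 0 / 3 2 0 1
```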

  10. Cystic Echinococcosis in the Province of Álava, North Spain: The Monetary Burden of a Disease No Longer under Surveillance

    PubMed Central

    Carabin, Hélène; Balsera-Rodríguez, Francisco J.; Rebollar-Sáenz, José; Benner, Christine T.; Benito, Aitziber; Fernández-Crespo, Juan C.; Carmena, David

    2014-01-01

Cystic echinococcosis (CE) is endemic in Spain but has been considered non-endemic in the province of Álava, Northern Spain, since 1997. However, Álava is surrounded by autonomous regions with some of the highest CE prevalence proportions in the nation, casting doubt on the current classification. The purpose of this study is to estimate the frequency of CE in humans and animals and to use these data to determine the societal cost incurred due to CE in the Álava population in 2005. We have identified epidemiological and clinical data from surveillance and hospital records, prevalence data in intermediate (sheep and cattle) host species from abattoir records, and economic data from national and regional official institutions. Direct costs (diagnosis, treatment, medical care in humans and condemnation of offal in livestock species) and indirect costs (productivity losses in humans and reduction in growth, fecundity and milk production in livestock) were modelled using the Latin hypercube method under five different scenarios reflecting different assumptions regarding the prevalence of asymptomatic cases and associated productivity losses in humans. A total of 13 human CE cases were reported in 2005. The median total cost (95% credible interval) of CE in humans and animals in Álava in 2005 was estimated to range between €61,864 (95% CI: €47,304–€76,590) and €360,466 (95% CI: €76,424–€752,469), with human-associated losses ranging from 57% to 93% of the total losses, depending on the scenario used. Our data provide evidence that CE is still very much present in Álava and imposes substantial costs on the province every year. We expect this information to prove valuable for public health agencies and policy-makers, as it seems advisable to reinstate appropriate surveillance and monitoring systems and to implement effective control measures that avoid the spread and recrudescence of the disease. PMID:25102173
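    The Latin hypercube costing exercise described here boils down to sampling uncertain inputs with LHS and pushing them through a cost equation, then summarizing the resulting distribution. The sketch below shows the mechanics with scipy.stats.qmc; every number, distribution, and cost term is invented for illustration and does not reproduce the study's model or scenarios.

```python
import numpy as np
from scipy.stats import qmc

# All bounds and terms below are hypothetical placeholders, not study inputs.
sampler = qmc.LatinHypercube(d=3, seed=1)
u = sampler.random(n=10_000)                 # LHS points on the unit cube
lo = [3_000, 500, 10]   # treatment cost (EUR), productivity loss (EUR), human cases
hi = [9_000, 4_000, 20]
x = qmc.scale(u, lo, hi)
treat, prod, cases = x[:, 0], x[:, 1], x[:, 2]

livestock = 15_000                           # fixed offal-condemnation term (EUR)
total = cases * (treat + prod) + livestock   # toy cost equation

print(np.percentile(total, [2.5, 50, 97.5])) # median and a 95% interval
```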

  11. Assessing the Impact of Climate Change on Extreme Streamflow and Reservoir Operation for Nuuanu Watershed, Oahu, Hawaii

    NASA Astrophysics Data System (ADS)

    Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.

    2016-12-01

    Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide due to climate change influencing the water cycle. This is particularly critical for tropical islands where the local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume and outflow for the Nuuanu watershed, using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated to daily streamflow using the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified with Nash-Sutcliffe Efficiency that resulted in acceptable values of 0.58 to 0.88, whereby more than 90% of observations were bracketed within 95% model prediction uncertainty interval for both calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume and outflow was assessed under the Representative Concentration Pathways of 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows ranging from -44% to 20% and -50% to -2%, respectively, compared to baseline. Consequently, the amount of water stored in Nuuanu reservoir will be decreased up to 27% while the corresponding outflow rates are expected to decrease up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to rainfall change when compared to temperature and solar radiation changes. It is concluded that the decrease in extreme low and peak flows can have serious consequences, such as flooding, drought, with detrimental effects on riparian ecological functioning. This study's results are expected to aid in reservoir operation as well as in identifying appropriate climate change adaptation strategies.
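    The LH-OAT screening used here combines a Latin hypercube over the parameter space with a one-factor-at-a-time perturbation around each sampled point. The sketch below is a simplified, generic version: it reports absolute rather than relative partial effects, uses a toy response function in place of SWAT, and assumes hypothetical parameter bounds.

```python
import numpy as np
from scipy.stats import qmc

def lh_oat(model, bounds, n_lh=20, frac=0.05, seed=0):
    """Latin Hypercube One-factor-At-a-Time sensitivity sketch.

    model  : callable mapping a 1-D parameter vector to a scalar output
    bounds : (k, 2) array of [low, high] per parameter
    Returns the mean absolute partial effect per parameter, averaged over
    the n_lh Latin hypercube base points.
    """
    bounds = np.asarray(bounds, float)
    k = len(bounds)
    base = qmc.scale(qmc.LatinHypercube(d=k, seed=seed).random(n_lh),
                     bounds[:, 0], bounds[:, 1])
    effects = np.zeros(k)
    for x in base:
        y0 = model(x)
        for j in range(k):
            xp = x.copy()
            delta = frac * (bounds[j, 1] - bounds[j, 0])
            # perturb upward unless that would leave the parameter range
            xp[j] = x[j] + delta if x[j] + delta <= bounds[j, 1] else x[j] - delta
            effects[j] += abs(model(xp) - y0) / delta
    return effects / n_lh

# Toy stand-in for a streamflow response (parameter meanings hypothetical)
toy = lambda p: 2.0 * p[0] - 0.5 * p[1] ** 2 + 0.1 * p[0] * p[2]
print(lh_oat(toy, bounds=[[0, 1], [0, 2], [0, 5]]))
```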

  12. Sensitivity analysis of some critical factors affecting simulated intrusion volumes during a low pressure transient event in a full-scale water distribution system.

    PubMed

    Ebacher, G; Besner, M C; Clément, B; Prévost, M

    2012-09-01

    Intrusion events caused by transient low pressures may result in the contamination of a water distribution system (DS). This work aims at estimating the range of potential intrusion volumes that could result from a real downsurge event caused by a momentary pump shutdown. A model calibrated with transient low pressure recordings was used to simulate total intrusion volumes through leakage orifices and submerged air vacuum valves (AVVs). Four critical factors influencing intrusion volumes were varied: the external head of (untreated) water on leakage orifices, the external head of (untreated) water on submerged air vacuum valves, the leakage rate, and the diameter of AVVs' outlet orifice (represented by a multiplicative factor). Leakage orifices' head and AVVs' orifice head levels were assessed through fieldwork. Two sets of runs were generated as part of two statistically designed experiments. A first set of 81 runs was based on a complete factorial design in which each factor was varied over 3 levels. A second set of 40 runs was based on a latin hypercube design, better suited for experimental runs on a computer model. The simulations were conducted using commercially available transient analysis software. Responses, measured by total intrusion volumes, ranged from 10 to 366 L. A second degree polynomial was used to analyze the total intrusion volumes. Sensitivity analyses of both designs revealed that the relationship between the total intrusion volume and the four contributing factors is not monotonic, with the AVVs' orifice head being the most influential factor. When intrusion through both pathways occurs concurrently, interactions between the intrusion flows through leakage orifices and submerged AVVs influence intrusion volumes. When only intrusion through leakage orifices is considered, the total intrusion volume is more largely influenced by the leakage rate than by the leakage orifices' head. The latter mainly impacts the extent of the area affected by intrusion. Copyright © 2012 Elsevier Ltd. All rights reserved.
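    A Latin hypercube design of this size feeds naturally into a second-degree polynomial response surface. The sketch below generates a 40-run design over four factors and fits a full quadratic by least squares; the factor ranges and the simulated "intrusion volume" response are placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import qmc
from itertools import combinations_with_replacement

rng = np.random.default_rng(7)
n, k = 40, 4
# Hypothetical factor ranges, scaled from a Latin hypercube design
X = qmc.scale(qmc.LatinHypercube(d=k, seed=7).random(n),
              [0.5, 0.5, 0.5, 0.5], [3.0, 3.0, 3.0, 3.0])

# Stand-in for the transient-analysis output (total intrusion volume, L)
y = 20 + 15*X[:, 0] + 8*X[:, 1] - 3*X[:, 0]*X[:, 1] + 2*X[:, 2]**2 + rng.normal(0, 2, n)

# Design matrix with intercept, linear, and all quadratic/interaction terms
cols = [np.ones(n)] + [X[:, i] for i in range(k)]
cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(k), 2)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef.round(2))
```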

  13. Daily family conflict and emotional distress among adolescents from Latin American, Asian, and European backgrounds.

    PubMed

    Chung, Grace H; Flook, Lisa; Fuligni, Andrew J

    2009-09-01

    The authors employed a daily diary method to assess daily frequencies of interparental and parent-adolescent conflict over a 2-week period and their implications for emotional distress across the high school years in a longitudinal sample of 415 adolescents from Latin American, Asian, and European backgrounds. Although family conflict remained fairly infrequent among all ethnic backgrounds across the high school years, its impact on emotional distress was significant across ethnicity and gender. In addition, parent-adolescent conflict significantly mediated the association between interparental conflict and emotional distress. These associations were observed at both the individual and the daily levels, providing evidence for both the chronic and episodic implications of family conflict for adolescents' emotional adjustment.

  14. Allozyme differentiation and biosystematics of the Californian closed-cone pines (Pinus subsect. Oocarpae)

    Treesearch

    Constance I. Millar; Steven H. Strauss; M. Thompson Conkle; Robert D. Westfall

    1988-01-01

    Allozyme differentiation at 32 loci was studied in the three Californian species of Pinus subsect. Oocarpae: P. attenuata, P. muricata, and P. radiata, and in a small sample of a Latin American species of the subsection, P. oocarpa. The Californian species...

  15. Between the national and the universal: natural history networks in Latin America in the nineteenth and twentieth centuries.

    PubMed

    Duarte, Regina Horta

    2013-12-01

    This essay examines contemporary Latin American historical writing about natural history from the nineteenth through the twentieth centuries. Natural history is a "network science," woven out of connections and communications between diverse people and centers of scholarship, all against a backdrop of complex political and economic changes. Latin American naturalists navigated a tension between promoting national science and participating in "universal" science. These tensions between the national and the universal have also been reflected in historical writing on Latin America. Since the 1980s, narratives that recognize Latin Americans' active role have become more notable within the renewal of the history of Latin American science. However, the nationalist slant of these approaches has kept Latin American historiography on the margins. The networked nature of natural history and Latin America's active role in it afford an opportunity to end the historiographic isolation of Latin America and situate it within world history.

  16. Health-Related Quality of Life of Latin-American Immigrants and Spanish-Born Attended in Spanish Primary Health Care: Socio-Demographic and Psychosocial Factors

    PubMed Central

    Salinero-Fort, Miguel Ángel; Gómez-Campelo, Paloma; Bragado-Alvárez, Carmen; Abánades-Herranz, Juan Carlos; Jiménez-García, Rodrigo; de Burgos-Lunar, Carmen

    2015-01-01

    Background This study compares the health-related quality of life of Spanish-born and Latin American-born individuals settled in Spain. Socio-demographic and psychosocial factors associated with health-related quality of life are analyzed. Methods A cross-sectional, multicenter Primary Health Care-based study of Latin American-born (n = 691) and Spanish-born (n = 903) outpatients from 15 Primary Health Care Centers (Madrid, Spain). The Medical Outcomes Study 36-Item Short Form Health Survey (SF-36) was used to assess health-related quality of life. Socio-demographic, psychosocial, and specific migration data were also collected. Results Compared to Spanish-born participants, Latin American-born participants reported higher health-related quality of life in the physical functioning and vitality dimensions. Across the entire sample, Latin American-born participants, younger participants, men and those with high social support reported significantly higher levels of physical health. Men with higher social support and a higher income reported significantly higher mental health. When stratified by gender, data show that for men, physical health was positively associated only with younger age. For women, in addition to age, social support and marital status were significantly related. Both men and women with higher social support and income had significantly better mental health. Finally, for immigrants, the physical and mental health components of health-related quality of life were not found to be significantly associated with any of the pre-migration factors or conditions of migration. Only the variable “exposure to political violence” was significantly associated with the mental health component (p = 0.014). Conclusions The key factors for understanding HRQoL among Latin American-born immigrants settled in Spain are age, sex and social support. Therefore, strategies to maintain optimal health outcomes in these immigrant communities should include public policies on social inclusion in the host society and focus on improving social support networks in order to foster and maintain the health and HRQoL of this group. PMID:25835714

  17. Characterization of microflora in Latin-style cheeses by next-generation sequencing technology.

    PubMed

    Lusk, Tina S; Ottesen, Andrea R; White, James R; Allard, Marc W; Brown, Eric W; Kase, Julie A

    2012-11-07

    Cheese contamination can occur at numerous stages in the manufacturing process including the use of improperly pasteurized or raw milk. Of concern is the potential contamination by Listeria monocytogenes and other pathogenic bacteria that find the high moisture levels and moderate pH of popular Latin-style cheeses like queso fresco a hospitable environment. In the investigation of a foodborne outbreak, samples typically undergo enrichment in broth for 24 hours followed by selective agar plating to isolate bacterial colonies for confirmatory testing. The broth enrichment step may also enable background microflora to proliferate, which can confound subsequent analysis if not inhibited by effective broth or agar additives. We used 16S rRNA gene sequencing to provide a preliminary survey of bacterial species associated with three brands of Latin-style cheeses after 24-hour broth enrichment. Brand A showed a greater diversity than the other two cheese brands (Brands B and C) at nearly every taxonomic level except phylum. Brand B showed the least diversity and was dominated by a single bacterial taxon, Exiguobacterium, not previously reported in cheese. This genus was also found in Brand C, although Lactococcus was prominent, an expected finding since this bacterium belongs to the group of lactic acid bacteria (LAB) commonly found in fermented foods. The contrasting diversity observed in Latin-style cheese was surprising, demonstrating that despite similarity of cheese type, raw materials and cheese-making conditions appear to play a critical role in the microflora composition of the final product. The high bacterial diversity associated with Brand A suggests it may have been prepared with raw materials of high bacterial diversity or influenced by the ecology of the processing environment. Additionally, the presence of Exiguobacterium in high proportions (96%) in Brand B and, to a lesser extent, Brand C (46%), may have been influenced by the enrichment process. This study is the first to define Latin-style cheese microflora using Next-Generation Sequencing. These valuable preliminary data will direct selective tailoring of agar formulations to improve culture-based detection of pathogens in Latin-style cheese.

  18. Characterization of microflora in Latin-style cheeses by next-generation sequencing technology

    PubMed Central

    2012-01-01

    Background Cheese contamination can occur at numerous stages in the manufacturing process including the use of improperly pasteurized or raw milk. Of concern is the potential contamination by Listeria monocytogenes and other pathogenic bacteria that find the high moisture levels and moderate pH of popular Latin-style cheeses like queso fresco a hospitable environment. In the investigation of a foodborne outbreak, samples typically undergo enrichment in broth for 24 hours followed by selective agar plating to isolate bacterial colonies for confirmatory testing. The broth enrichment step may also enable background microflora to proliferate, which can confound subsequent analysis if not inhibited by effective broth or agar additives. We used 16S rRNA gene sequencing to provide a preliminary survey of bacterial species associated with three brands of Latin-style cheeses after 24-hour broth enrichment. Results Brand A showed a greater diversity than the other two cheese brands (Brands B and C) at nearly every taxonomic level except phylum. Brand B showed the least diversity and was dominated by a single bacterial taxon, Exiguobacterium, not previously reported in cheese. This genus was also found in Brand C, although Lactococcus was prominent, an expected finding since this bacterium belongs to the group of lactic acid bacteria (LAB) commonly found in fermented foods. Conclusions The contrasting diversity observed in Latin-style cheese was surprising, demonstrating that despite similarity of cheese type, raw materials and cheese-making conditions appear to play a critical role in the microflora composition of the final product. The high bacterial diversity associated with Brand A suggests it may have been prepared with raw materials of high bacterial diversity or influenced by the ecology of the processing environment. Additionally, the presence of Exiguobacterium in high proportions (96%) in Brand B and, to a lesser extent, Brand C (46%), may have been influenced by the enrichment process. This study is the first to define Latin-style cheese microflora using Next-Generation Sequencing. These valuable preliminary data will direct selective tailoring of agar formulations to improve culture-based detection of pathogens in Latin-style cheese. PMID:23134566

  19. Beyond Strain: Personal Strengths and Mental Health of Mexican and Argentinean Dementia Caregivers.

    PubMed

    Sutter, Megan; Perrin, Paul B; Peralta, Silvina Victoria; Stolfi, Miriam E; Morelli, Eliana; Peña Obeso, Leticia Aracely; Arango-Lasprilla, Juan Carlos

    2016-07-01

    Life expectancy is increasing in Latin America, resulting in the need for more family caregivers for older adults with dementia. The purpose of the current study was to examine the relationships between personal strengths (optimism, sense of coherence [SOC], and resilience) and the mental health of dementia caregivers from Latin America. Primary family dementia caregivers (n = 127) were identified via convenience sampling at the Instituto de Neurociencias de San Lucas, Argentina, and CETYS University, in Baja California, Mexico, and completed measures of these constructs. Personal strengths explained between 32% and 50% of the variance in caregiver mental health. In a series of hierarchical multiple regressions, more manageability (β = -.38, p = .001), general resilience (β = -.24, p = .012), and social competence (β = -.21, p = .034) were uniquely associated with lower depression. Greater comprehensibility (β = -.28, p = .008) was uniquely associated with decreased burden, and manageability was marginally related (β = -.21, p < .10). Greater optimism (β = .37, p < .001) and manageability (β = .27, p = .004) were uniquely associated with increased life satisfaction. The personal strengths of caregivers in Latin America may be particularly important for their mental health because of the culturally embedded sense of duty toward older family members. Incorporating strengths-based approaches into research on caregiver interventions in regions, such as Latin America, where caregiving is a highly culturally valued role may have the potential to improve the mental health of dementia caregivers. © The Author(s) 2015.

  20. Forecasting Zika Incidence in the 2016 Latin America Outbreak Combining Traditional Disease Surveillance with Search, Social Media, and News Report Data

    PubMed Central

    McGough, Sarah F.; Brownstein, John S.; Hawkins, Jared B.; Santillana, Mauricio

    2017-01-01

    Background Over 400,000 people across the Americas are thought to have been infected with Zika virus as a consequence of the 2015–2016 Latin American outbreak. Official government-led case count data in Latin America are typically delayed by several weeks, making it difficult to track the disease in a timely manner. Thus, timely disease tracking systems are needed to design and assess interventions to mitigate disease transmission. Methodology/Principal Findings We combined information from Zika-related Google searches, Twitter microblogs, and the HealthMap digital surveillance system with historical Zika suspected case counts to track and predict estimates of suspected weekly Zika cases during the 2015–2016 Latin American outbreak, up to three weeks ahead of the publication of official case data. We evaluated the predictive power of these data and used a dynamic multivariable approach to retrospectively produce predictions of weekly suspected cases for five countries: Colombia, El Salvador, Honduras, Venezuela, and Martinique. Models that combined Google (and Twitter data where available) with autoregressive information showed the best out-of-sample predictive accuracy for 1-week ahead predictions, whereas models that used only Google and Twitter typically performed best for 2- and 3-week ahead predictions. Significance Given the significant delay in the release of official government-reported Zika case counts, we show that these Internet-based data streams can be used as timely and complementary ways to assess the dynamics of the outbreak. PMID:28085877
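
    A hedged sketch of the kind of dynamic multivariable regression described above: next week's suspected case count is regressed on the current count (an autoregressive term) and a current Internet-based signal, with the model refitted every week on a growing window. The synthetic series and the exact model form are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def one_week_ahead_forecast(cases, searches, train_weeks=20):
            """Refit a simple lag-1 regression each week and predict the next week."""
            preds = []
            for t in range(train_weeks, len(cases) - 1):
                X = np.column_stack([np.ones(t), cases[:t], searches[:t]])
                y = cases[1:t + 1]                      # next-week targets
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                preds.append(np.array([1.0, cases[t], searches[t]]) @ beta)
            return np.array(preds)

        # Synthetic weekly case counts and a correlated search-volume signal.
        rng = np.random.default_rng(0)
        cases = np.abs(np.cumsum(rng.normal(50, 20, 60)))
        searches = cases * 0.8 + rng.normal(0, 10, 60)
        print(one_week_ahead_forecast(cases, searches)[:5])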

  1. Energy expenditure during competitive Latin American dancing simulation.

    PubMed

    Massidda, Myosotis; Cugusi, Lucia; Ibba, Maurizio; Tradori, Iosto; Calò, Carla Maria

    2011-12-01

    The aims of this study were to estimate the energy expenditure (EE) and the intensity of physical activity (PA) during a competitive simulation of Latin American dancing and to evaluate the differences in PA and EE values between the sexes, between different dance types, and between the various phases of the competition. Ten Italian dancers (five couples, 5 males and 5 females) competing in Latin American dancing at the international level were examined in this study. The EE (kcal) was measured during the semifinal and final phases of the competition using the SenseWear Pro Armband (SWA). Paired-sample t-tests were used to determine differences in the metabolic equivalent (MET) and EE values between the semifinal and final phases and among the individual dances. One-way analysis of variance was used to analyze the differences in the MET and EE values between the sexes. The intensity of PA during the dance sequence ranged from moderate (3 to 6 METs) to vigorous (6 to 9 METs). The male dancers had higher EE values than the female dancers during all phases of the simulation. Similar MET values were observed in both sexes. The PA intensity during the final phase was vigorous for 56% of the dance time. Of all the dance styles, the rumba had the lowest MET and EE values. Our results demonstrate that competitive Latin American dancing is a strenuous form of exercise and suggest that monitoring variables during normal training can improve training protocols and the dancers' fitness levels.

  2. Boundary conditions, dimensionality, topology and size dependence of the superconducting transition temperature

    NASA Astrophysics Data System (ADS)

    Fink, Herman J.; Haley, Stephen B.; Giuraniuc, Claudiu V.; Kozhevnikov, Vladimir F.; Indekeu, Joseph O.

    2005-11-01

    For various sample geometries (slabs, cylinders, spheres, hypercubes), de Gennes' boundary condition parameter b is used to study its effect upon the transition temperature Tc of a superconductor. For b > 0 the order parameter at the surface is decreased, and as a consequence Tc is reduced, while for b < 0 the order parameter at the surface is increased, thereby enhancing Tc of a specimen in zero magnetic field. Exact solutions, derived by Fink and Haley (Int. J. mod. Phys. B, 17, 2171 (2003)), of the order parameter of a slab of finite thickness as a function of temperature are presented, both for reduced and enhanced transition (nucleation) temperatures. At the nucleation temperature the order parameter approaches zero. This concise review closes with a link established between de Gennes' microscopic boundary condition and the Ginzburg-Landau phenomenological approach, and a discussion of some relevant experiments. For example, applying the boundary condition with b < 0 to tin whiskers elucidates the increase of Tc with strain.

  3. Global sensitivity and uncertainty analysis of an atmospheric chemistry transport model: the FRAME model (version 9.15.0) as a case study

    NASA Astrophysics Data System (ADS)

    Aleksankina, Ksenia; Heal, Mathew R.; Dore, Anthony J.; Van Oijen, Marcel; Reis, Stefan

    2018-04-01

    Atmospheric chemistry transport models (ACTMs) are widely used to underpin policy decisions associated with the impact of potential changes in emissions on future pollutant concentrations and deposition. It is therefore essential to have a quantitative understanding of the uncertainty in model output arising from uncertainties in the input pollutant emissions. ACTMs incorporate complex and non-linear descriptions of chemical and physical processes which means that interactions and non-linearities in input-output relationships may not be revealed through the local one-at-a-time sensitivity analysis typically used. The aim of this work is to demonstrate a global sensitivity and uncertainty analysis approach for an ACTM, using as an example the FRAME model, which is extensively employed in the UK to generate source-receptor matrices for the UK Integrated Assessment Model and to estimate critical load exceedances. An optimised Latin hypercube sampling design was used to construct model runs within a ±40% variation range for the UK emissions of SO2, NOx, and NH3, from which regression coefficients for each input-output combination and each model grid (> 10 000 across the UK) were calculated. Surface concentrations of SO2, NOx, and NH3 (and of deposition of S and N) were found to be predominantly sensitive to the emissions of the respective pollutant, while sensitivities of secondary species such as HNO3 and particulate SO4^2-, NO3^-, and NH4^+ to pollutant emissions were more complex and geographically variable. The uncertainties in model output variables were propagated from the uncertainty ranges reported by the UK National Atmospheric Emissions Inventory for the emissions of SO2, NOx, and NH3 (±4, ±10, and ±20%, respectively). The uncertainties in the surface concentrations of NH3 and NOx and the depositions of NHx and NOy were dominated by the uncertainties in emissions of NH3 and NOx, respectively, whilst concentrations of SO2 and deposition of SOy were affected by the uncertainties in both SO2 and NH3 emissions. Likewise, the relative uncertainties in the modelled surface concentrations of each of the secondary pollutant variables (NH4^+, NO3^-, SO4^2-, and HNO3) were due to uncertainties in at least two input variables. In all cases the spatial distribution of relative uncertainty was found to be geographically heterogeneous. The global methods used here can be applied to conduct sensitivity and uncertainty analyses of other ACTMs.
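
    The regression-based global sensitivity step described above can be sketched generically: emission inputs are varied by ±40% with a Latin hypercube design and standardized regression coefficients are estimated for a model output. The toy output function below merely stands in for a FRAME grid-cell response; it is not the FRAME model.

        import numpy as np
        from scipy.stats import qmc

        n_runs, names = 200, ["SO2", "NOx", "NH3"]
        # Emission scaling factors within +/-40% of the baseline.
        X = qmc.scale(qmc.LatinHypercube(d=3, seed=42).random(n_runs),
                      [0.6] * 3, [1.4] * 3)

        # Illustrative surrogate output (e.g. a surface concentration in one grid cell).
        y = 2.0 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] \
            + np.random.default_rng(1).normal(0, 0.05, n_runs)

        # Standardized regression coefficients from an ordinary least-squares fit.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(np.column_stack([np.ones(n_runs), Xs]), ys, rcond=None)
        for name, b in zip(names, src[1:]):
            print(f"SRC[{name}] = {b:+.3f}")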

  4. Influenza A virus in swine breeding herds: Combination of vaccination and biosecurity practices can reduce likelihood of endemic piglet reservoir.

    PubMed

    White, L A; Torremorell, M; Craft, M E

    2017-03-01

    Recent modelling and empirical work on influenza A virus (IAV) suggests that piglets play an important role as an endemic reservoir. The objective of this study is to test intervention strategies aimed at reducing the incidence of IAV in piglets and, ideally, preventing piglets from becoming exposed in the first place. These interventions include biosecurity measures, vaccination, and management options that swine producers may employ individually or jointly to control IAV in their herds. We have developed a stochastic Susceptible-Exposed-Infectious-Recovered-Vaccinated (SEIRV) model that reflects the spatial organization of a standard breeding herd and accounts for the different production classes of pigs therein. Notably, this model allows for loss of immunity for vaccinated and recovered animals, and for vaccinated animals to have different latency and infectious periods from unvaccinated animals as suggested by the literature. The interventions tested include: (1) varied timing of gilt introductions to the breeding herd, (2) gilt separation (no indirect transmission to or from the gilt development unit), (3) gilt vaccination upon arrival to the farm, (4) early weaning, and (5) vaccination strategies of sows with different timing (mass and pre-farrow) and efficacy (homologous vs. heterologous). We conducted a Latin Hypercube Sampling and Partial Rank Correlation Coefficient (LHS-PRCC) analysis combined with a random forest analysis to assess the relative importance of each epidemiological parameter in determining epidemic outcomes. In concert, mass vaccination, early weaning of piglets (removal 0-7 days after birth), gilt separation, gilt vaccination, and longer periods between introductions of gilts (6 months) were the most effective at reducing prevalence. Endemic prevalence overall was reduced by 51% relative to the null case; endemic prevalence in piglets was reduced by 74%; and IAV was eliminated completely from the herd in 23% of all simulations. Importantly, elimination of IAV was most likely to occur within the first few days of an epidemic. The latency period, infectious period, duration of immunity, and transmission rate for piglets with maternal immunity had the highest correlation with three separate measures of IAV prevalence; therefore, these are parameters that warrant increased attention for obtaining empirical estimates. Our findings support other studies suggesting that piglets play a key role in maintaining IAV in breeding herds. We recommend biosecurity measures in combination with targeted homologous vaccination or vaccines that provide wider cross-protective immunity to prevent incursions of virus to the farm and subsequent establishment of an infected piglet reservoir. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
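
    A compact sketch of the PRCC half of the LHS-PRCC analysis mentioned above: inputs and output are rank-transformed and, for each parameter, the influence of all other parameters is regressed out of both sides before correlating the residuals. The toy data are illustrative; the SEIRV model itself is not reproduced here.

        import numpy as np
        from scipy.stats import rankdata

        def prcc(X, y):
            """Partial rank correlation coefficient of each column of X with y."""
            Xr = np.apply_along_axis(rankdata, 0, X)
            yr = rankdata(y)
            n, d = Xr.shape
            out = np.zeros(d)
            for j in range(d):
                others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
                res_x = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
                res_y = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
                out[j] = np.corrcoef(res_x, res_y)[0, 1]
            return out

        # Toy check: the output depends strongly on parameter 0, weakly on parameter 1.
        rng = np.random.default_rng(3)
        X = rng.uniform(size=(500, 3))
        y = 5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500)
        print(prcc(X, y))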

  5. Calibration and parameterization of a semi-distributed hydrological model to support sub-daily ensemble flood forecasting; a watershed in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida Bressiani, D.; Srinivasan, R.; Mendiondo, E. M.

    2013-12-01

    The use of distributed and semi-distributed models to represent the processes and dynamics of watersheds has increased in recent years. These models are important tools to predict and forecast the hydrological responses of watersheds, and they can support disaster risk management and planning. However, they usually have many parameters, many of which are not known owing to the spatial and temporal variability of the processes, especially in developing countries; a robust and sensible calibration is therefore very important. This study conducted a sub-daily calibration and parameterization of the Soil & Water Assessment Tool (SWAT) for a 12,600 km2 watershed in southeast Brazil and used ensemble forecasts to evaluate whether the model can serve as a tool for flood forecasting. The Piracicaba Watershed, in São Paulo State, is mainly rural but has a population of about 4 million in highly relevant urban areas and three cities on the list of critical cities of the National Center for Natural Disasters Monitoring and Alerts. For calibration, the watershed was divided into areas with similar hydrological characteristics, and one gauging station was chosen for calibration in each of these areas; this procedure was adopted to evaluate the effectiveness of calibrating in fewer places, since areas with the same combination of groundwater, soil, land-use and slope characteristics should have similar parameters, making calibration a less time-consuming task. The sensitivity analysis and calibration were performed with the SWAT-CUP software using the Sequential Uncertainty Fitting Version 2 (SUFI-2) optimization algorithm, which uses a Latin hypercube sampling scheme in an iterative process. Calibration and validation performance was evaluated with the Nash-Sutcliffe efficiency coefficient (NSE), coefficient of determination (r2), root mean square error (RMSE), and percent bias (PBIAS), with monthly average values of NSE around 0.70, r2 of 0.9, normalized RMSE of 0.01, and PBIAS of 10. Past events were analysed to evaluate the possibility of using the SWAT model developed for the Piracicaba watershed as a tool for ensemble flood forecasting. For the ensemble evaluation, members from the numerical model Eta were used. Eta is an atmospheric model used for research and operational purposes, with 5 km resolution, updated twice a day (00 and 12 UTC) for a ten-day horizon, with hourly precipitation and weather estimates. The parameterized SWAT model performed well overall for ensemble flood forecasting.
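
    For reference, the four goodness-of-fit measures quoted above can be computed as in the short sketch below. Sign conventions for PBIAS vary between studies; the common sum(obs - sim)/sum(obs) form is used here and may differ from the authors' choice.

        import numpy as np

        def performance_metrics(obs, sim):
            """NSE, r2, RMSE and PBIAS for paired observed/simulated series."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
            r2 = np.corrcoef(obs, sim)[0, 1] ** 2
            rmse = np.sqrt(np.mean((obs - sim) ** 2))
            pbias = 100.0 * np.sum(obs - sim) / np.sum(obs)
            return {"NSE": nse, "r2": r2, "RMSE": rmse, "PBIAS": pbias}

        print(performance_metrics([10, 12, 18, 25, 14], [11, 13, 16, 23, 15]))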

  6. Predicting the impact of insecticide-treated bed nets on malaria transmission: the devil is in the detail.

    PubMed

    Gu, Weidong; Novak, Robert J

    2009-11-16

    Insecticide-treated bed nets (ITNs), including long-lasting insecticidal nets (LLINs), play a primary role in global campaigns to roll back malaria in tropical Africa. Effectiveness of treated nets depends on direct impacts on individual mosquitoes, including killing and excito-repellency, which vary considerably among vector species due to variations in host-seeking behaviours. While monitoring and evaluation programmes of ITNs have focused on morbidity and all-cause mortality in humans, the local entomological context receives little attention. Without knowing the dynamics of local vector species and their responses to treated nets, it is difficult to predict clinical outcomes when ITN applications are scaled up across the African continent. Sound model frameworks incorporating intricate interactions between mosquitoes and treated nets are needed to develop the predictive capacity for scale-up applications of ITNs. An established agent-based model was extended to incorporate the direct outcomes, e.g. killing and avoidance, of individual mosquitoes exposed to ITNs in a hypothetical village setting with 50 houses and 90 aquatic habitats. Individual mosquitoes were tracked throughout the life cycle across the landscape. Four levels of coverage, i.e. 40, 60, 80 and 100%, were applied at the household level, with treated houses having only one bed net. Using a Latin hypercube sampling scheme, parameters governing killing, diverting and personal protection of net users were evaluated for their relative roles in containing mosquito populations, entomological inoculation rates (EIRs) and malaria incidence. There were substantial gaps in coverage between households and individual persons, and 100% household coverage resulted in circa 50% coverage of the population. The results show that applications of ITNs could give rise to varying impacts on population-level metrics depending on values of parameters governing interactions of mosquitoes and treated nets at the individual level. The most significant factor in determining effectiveness was the killing capability of treated nets. A strong excito-repellent effect of impregnated nets might lead to higher risk exposure for non-users of bed nets. Given the variability among vector mosquitoes in host-seeking behaviours and responses to treated nets, it is anticipated that scale-up applications of ITNs might produce varying degrees of success depending on local entomological and epidemiological contexts. This study highlights that increased ITN coverage led to significant reduction in risk exposure and malaria incidence only when treated nets yielded high killing effects. It is necessary to test the efficacy of treated nets on locally dominant vector mosquitoes, at least in the laboratory, for monitoring and evaluation of ITN programmes.

  7. Newcomer Immigrant Adolescents and Ambiguous Discrimination: The Role of Cognitive Appraisal

    ERIC Educational Resources Information Center

    Patel, Sita G.; Tabb, Kevin M.; Strambler, Michael J.; Eltareb, Fazia

    2015-01-01

    Cognitive appraisal has been shown to mediate the relationship between stressors and internalizing symptoms, but not among newcomer immigrant youth facing ambiguous discrimination. Using a mixed-methods design with a sample of newcomer adolescents from African, Arab, Asian, Caribbean, European, and Latin American countries, this study measured the…

  8. "The South American Way": Hollywood Looks at Latins and at Latin America.

    ERIC Educational Resources Information Center

    Aiex, Nola Kortner

    Latin elements or themes made for the North American market have been used in American films, but at the same time these films have been playing in a Latin American market, making it useful to examine how Latin America has been portrayed in these films. The taste for exotic locales and themes is an element that has been present since the…

  9. Computer program to minimize prediction error in models from experiments with 16 hypercube points and 0 to 6 center points

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1982-01-01

    A previous report described a backward deletion procedure of model selection that was optimized for minimum prediction error and which used a multiparameter combination of the F-distribution and an order-statistics distribution of Cochran's. A computer program is described that applies the previously optimized procedure to real data. The use of the program is illustrated by examples.

  10. Software Techniques for Non-Von Neumann Architectures

    DTIC Science & Technology

    1990-01-01

    Record fragments (machine-characteristics table): Comm. topology: programmable Benes network; hypercubic lattice for QCD. Control: centralized. Assignment: static. Memory: shared. Synchronization: universal. Max CPUs: 566 processor boards (each = 4 floating point units, 2 multipliers). CPU size: 32-bit floating point chips. Performance: 11.4 Gflops. Market: quantum chromodynamics (QCD). ... functions there should exist a capability to define hierarchies and lattices of complex objects. A complex object can be made up of a set of simple objects

  11. On the Local Equivalence Between the Canonical and the Microcanonical Ensembles for Quantum Spin Systems

    NASA Astrophysics Data System (ADS)

    Tasaki, Hal

    2018-06-01

    We study a quantum spin system on the d-dimensional hypercubic lattice Λ with N = L^d sites with periodic boundary conditions. We take an arbitrary translation-invariant short-ranged Hamiltonian. For this system, we consider both the canonical ensemble with inverse temperature β_0 and the microcanonical ensemble with the corresponding energy U_N(β_0). For an arbitrary self-adjoint operator Â whose support is contained in a hypercubic block B inside Λ, we prove that the expectation values of Â with respect to these two ensembles are close to each other for large N provided that β_0 is sufficiently small and the number of sites in B is o(N^{1/2}). This establishes the equivalence of ensembles on the level of local states in a large but finite system. The result is essentially that of Brandao and Cramer (here restricted to the case of the canonical and the microcanonical ensembles), but we prove improved estimates in an elementary manner. We also review and prove standard results on the thermodynamic limits of thermodynamic functions and the equivalence of ensembles in terms of thermodynamic functions. The present paper assumes only elementary knowledge of quantum statistical mechanics and quantum spin systems.

  12. Growing a hypercubical output space in a self-organizing feature map.

    PubMed

    Bauer, H U; Villmann, T

    1997-01-01

    Neural maps project data from an input space onto neuron positions in an (often lower-dimensional) output space grid in a neighborhood-preserving way, with neighboring neurons in the output space responding to neighboring data points in the input space. A map-learning algorithm can achieve optimal neighborhood preservation only if the output space topology roughly matches the effective structure of the data in the input space. We here present a growth algorithm, called the GSOM or growing self-organizing map, which enhances a widespread map self-organization process, Kohonen's self-organizing feature map (SOFM), by an adaptation of the output space grid during learning. The GSOM restricts the output space structure to a general hypercubical shape, with the overall dimensionality of the grid and its extensions along the different directions being subject to adaptation. This constraint meets the demands of many larger information processing systems, of which the neural map can be a part. We apply our GSOM-algorithm to three examples, two of which involve real world data. Using recently developed methods for measuring the degree of neighborhood preservation in neural maps, we find the GSOM-algorithm to produce maps which preserve neighborhoods in a nearly optimal fashion.
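
    The neighborhood-preserving update that the GSOM builds on can be illustrated with a few lines: a winning node is found on the output grid and every codebook vector is pulled toward the input, weighted by its grid distance to the winner. Only this underlying SOFM step is sketched here, under assumed learning parameters; the GSOM's grid-growing logic is indicated in the comment and not implemented.

        import numpy as np

        def sofm_step(weights, grid, x, lr=0.1, sigma=1.0):
            """One Kohonen SOFM update on a hypercubical output grid.
            weights: (n_nodes, n_inputs) codebook vectors
            grid:    (n_nodes, grid_dim) node positions on the output lattice
            A GSOM would additionally grow `grid` along whichever direction
            improves neighborhood preservation; that step is omitted here."""
            winner = np.argmin(np.sum((weights - x) ** 2, axis=1))
            dist2 = np.sum((grid - grid[winner]) ** 2, axis=1)
            h = np.exp(-dist2 / (2.0 * sigma ** 2))      # neighborhood kernel
            weights += lr * h[:, None] * (x - weights)
            return weights

        # A 4x4 output grid mapping 3-dimensional inputs.
        grid = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
        rng = np.random.default_rng(0)
        weights = rng.uniform(size=(16, 3))
        for x in rng.uniform(size=(1000, 3)):
            weights = sofm_step(weights, grid, x)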

  13. Application of Deep Learning Architectures for Accurate and Rapid Detection of Internal Mechanical Damage of Blueberry Using Hyperspectral Transmittance Data.

    PubMed

    Wang, Zhaodi; Hu, Menghan; Zhai, Guangtao

    2018-04-07

    Deep learning has become a widely used, powerful tool in many research fields, although not yet widely in agricultural technologies. In this work, two deep convolutional neural networks (CNN), viz. Residual Network (ResNet) and its improved version named ResNeXt, are used to detect internal mechanical damage of blueberries using hyperspectral transmittance data. The original structure and size of the hypercubes are adapted for the deep CNN training. To ensure that the models are applicable to hypercubes, we adjust the number of filters in the convolutional layers. Moreover, a total of 5 traditional machine learning algorithms, viz. Sequential Minimal Optimization (SMO), Linear Regression (LR), Random Forest (RF), Bagging and Multilayer Perceptron (MLP), are run as comparison experiments. In terms of model assessment, k-fold cross validation is used to show that model performance does not vary with different partitions of the dataset. In real-world applications, selling damaged berries leads to greater losses than discarding sound ones. Thus, precision, recall, and F1-score are also used as evaluation indicators alongside accuracy to quantify the false positive rate. The first three indicators are seldom used by investigators in the agricultural engineering domain. Furthermore, ROC curves and Precision-Recall curves are plotted to visualize the performance of the classifiers. The fine-tuned ResNet/ResNeXt achieve average accuracy and F1-score of 0.8844/0.8784 and 0.8952/0.8905, respectively. Classifiers SMO/LR/RF/Bagging/MLP obtain average accuracy and F1-score of 0.8082/0.7606/0.7314/0.7113/0.7827 and 0.8268/0.7796/0.7529/0.7339/0.7971, respectively. The two deep learning models achieve better classification performance than the traditional machine learning methods. Classification of each test sample takes only 5.2 ms and 6.5 ms for ResNet and ResNeXt, respectively, indicating that the deep learning framework has great potential for online fruit sorting. The results of this study demonstrate the potential of deep CNNs for analyzing the internal mechanical damage of fruit.
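
    The per-class indicators reported above (precision, recall and F1-score) can be computed alongside accuracy as in the brief sketch below; the label arrays are made-up stand-ins for one test fold, not data from the study.

        import numpy as np
        from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

        # 1 = internally damaged berry, 0 = sound berry (illustrative labels only).
        y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
        y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 1])

        print("accuracy :", accuracy_score(y_true, y_pred))
        print("precision:", precision_score(y_true, y_pred))  # flagged-damaged that are truly damaged
        print("recall   :", recall_score(y_true, y_pred))     # truly damaged berries that are caught
        print("f1       :", f1_score(y_true, y_pred))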

  14. Application of Deep Learning Architectures for Accurate and Rapid Detection of Internal Mechanical Damage of Blueberry Using Hyperspectral Transmittance Data

    PubMed Central

    Hu, Menghan; Zhai, Guangtao

    2018-01-01

    Deep learning has become a widely used, powerful tool in many research fields, although not yet widely in agricultural technologies. In this work, two deep convolutional neural networks (CNN), viz. Residual Network (ResNet) and its improved version named ResNeXt, are used to detect internal mechanical damage of blueberries using hyperspectral transmittance data. The original structure and size of the hypercubes are adapted for the deep CNN training. To ensure that the models are applicable to hypercubes, we adjust the number of filters in the convolutional layers. Moreover, a total of 5 traditional machine learning algorithms, viz. Sequential Minimal Optimization (SMO), Linear Regression (LR), Random Forest (RF), Bagging and Multilayer Perceptron (MLP), are run as comparison experiments. In terms of model assessment, k-fold cross validation is used to show that model performance does not vary with different partitions of the dataset. In real-world applications, selling damaged berries leads to greater losses than discarding sound ones. Thus, precision, recall, and F1-score are also used as evaluation indicators alongside accuracy to quantify the false positive rate. The first three indicators are seldom used by investigators in the agricultural engineering domain. Furthermore, ROC curves and Precision-Recall curves are plotted to visualize the performance of the classifiers. The fine-tuned ResNet/ResNeXt achieve average accuracy and F1-score of 0.8844/0.8784 and 0.8952/0.8905, respectively. Classifiers SMO/LR/RF/Bagging/MLP obtain average accuracy and F1-score of 0.8082/0.7606/0.7314/0.7113/0.7827 and 0.8268/0.7796/0.7529/0.7339/0.7971, respectively. The two deep learning models achieve better classification performance than the traditional machine learning methods. Classification of each test sample takes only 5.2 ms and 6.5 ms for ResNet and ResNeXt, respectively, indicating that the deep learning framework has great potential for online fruit sorting. The results of this study demonstrate the potential of deep CNNs for analyzing the internal mechanical damage of fruit. PMID:29642454

  15. A methodology for system-of-systems design in support of the engineering team

    NASA Astrophysics Data System (ADS)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts reasoning about the major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases, when most of the costs are locked in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an illustrative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that, using the mixed hypercube to sample the design space, an optimal solution is reached with limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research project co-financed by Piedmont Region and firms and universities of the Piedmont Aerospace District within the P.O.R-F.E.S.R. 2007-2013 program.
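
    The response-surface step of such a parametric methodology can be illustrated with a generic second-order fit over a sampled design: a full quadratic design matrix (intercept, linear, two-way interaction and pure quadratic terms) is assembled and fitted by least squares, giving a cheap surrogate for further sensitivity or optimisation studies. The design variables and toy response below are hypothetical and do not reproduce the mixed-hypercube approach itself.

        import numpy as np
        from itertools import combinations

        def quadratic_design_matrix(X):
            """Intercept, linear, two-way interaction and pure quadratic columns."""
            cols = [np.ones(len(X))]
            cols += [X[:, j] for j in range(X.shape[1])]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
            cols += [X[:, j] ** 2 for j in range(X.shape[1])]
            return np.column_stack(cols)

        # Toy design over three normalized design variables in [-1, 1].
        rng = np.random.default_rng(7)
        X = rng.uniform(-1, 1, size=(30, 3))
        y = 2 + X[:, 0] - 0.5 * X[:, 1] * X[:, 2] + 0.3 * X[:, 2] ** 2 + rng.normal(0, 0.01, 30)

        coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
        y_hat = quadratic_design_matrix(np.array([[0.2, -0.4, 0.9]])) @ coeffs
        print(y_hat)  # surrogate prediction at an untried design point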

  16. [Use of blood lead data to evaluate and prevent childhood lead poisoning in Latin America].

    PubMed

    Romieu, Isabelle

    2003-01-01

    Exposure to lead is a widespread and serious threat to the health of children in Latin America. Health officials should monitor sources of exposure and health outcomes to design, implement, and evaluate prevention and control activities. To evaluate the magnitude of lead as a public health problem, three key elements must be defined: 1) the potential sources of exposure, 2) the indicators to evaluate health effects and environmental exposure, and 3) the sampling methods for the population at risk. Several strategies can be used to select the study population depending on the study objectives, the time limitations, and the available resources. If the objective is to evaluate the magnitude and sources of the problem, the following sampling methods can be used: 1) population-based random sampling; 2) facility-based random sampling within hospitals, daycare centers, or schools; 3) target sampling of high risk groups; 4) convenience sampling of volunteers; and 5) case reporting (which can lead to the identification of populations at risk and sources of exposures). For all sampling methods, information gathering should include the use of a questionnaire to collect general information on the participants and on potential local sources of exposure, as well as the collection of biological samples. In interpreting data, one should consider the type of sampling used and the non-response rates, as well as factors that might influence blood lead measurements, such as age and seasonal variability. Blood lead measurements should be integrated in an overall strategy to prevent lead toxicity in children. The English version of this paper is available at: http://www.insp.mx/salud/index.html.

  17. French Latin Association Demands Full Restoration of Secondary Classics Curriculum

    ERIC Educational Resources Information Center

    Denegri, Raymond

    1978-01-01

    Discusses the demands of the Association pour la Defense du Latin that Latin be returned to the French secondary school curriculum and that the Classics-Science baccalaureat be restored as one of the French secondary programs. Benefits of studying Latin are enumerated. (KM)

  18. Tobacco industry success in preventing regulation of secondhand smoke in Latin America: the "Latin Project"

    PubMed Central

    Barnoya, J; Glantz, S

    2002-01-01

    Objective: To examine the tobacco industry's strategy to avoid regulations on secondhand smoke exposure in Latin America. Methods: Systematic search of tobacco industry documents available through the internet. All available materials, including confidential reports regarding research, lobbying, and internal memoranda exchanged between the tobacco industry representatives, tobacco industry lawyers, and key players in Latin America. Results: In Latin America, Philip Morris International and British American Tobacco, working through the law firm Covington & Burling, developed a network of well placed physicians and scientists through their "Latin Project" to generate scientific arguments minimising secondhand smoke as a health hazard, produce low estimates of exposure, and to lobby against smoke-free workplaces and public places. The tobacco industry's role was not disclosed. Conclusions: The strategies used by the industry have been successful in hindering development of public health programmes on secondhand smoke. Latin American health professionals need to be aware of this industry involvement and must take steps to counter it to halt the tobacco epidemic in Latin America. PMID:12432156

  19. Gaps In Primary Care And Health System Performance In Six Latin American And Caribbean Countries.

    PubMed

    Macinko, James; Guanais, Frederico C; Mullachery, Pricila; Jimenez, Geronimo

    2016-08-01

    The rapid demographic and epidemiological transitions occurring in Latin America and the Caribbean have led to high levels of noncommunicable diseases in the region. In addition to reduced risk factors for chronic conditions, a strong health system for managing chronic conditions is vital. This study assessed the extent to which populations in six Latin American and Caribbean countries receive high-quality primary care, and it examined the relationship between experiences with care and perceptions of health system performance. We applied a validated survey on access, use, and satisfaction with health care services to nationally representative samples of the populations of Brazil, Colombia, El Salvador, Jamaica, Mexico, and Panama. Respondents reported considerable gaps in the ways in which primary care is organized, financed, and delivered. Nearly half reported using the emergency department for a condition they considered treatable in a primary care setting. Reports of more primary care problems were associated with worse perceptions of health system performance and quality and less receipt of preventive care. Urgent attention to primary care performance is required as the region's population continues to age at an unprecedented rate. Project HOPE—The People-to-People Health Foundation, Inc.

  20. Hepatitis E virus: An ancient hidden enemy in Latin America

    PubMed Central

    Fierro, Nora A; Realpe, Mauricio; Meraz-Medina, Tzintli; Roman, Sonia; Panduro, Arturo

    2016-01-01

    Hepatitis E virus (HEV) infection is a common cause of acute clinical hepatitis worldwide. HEV is an RNA-containing virus and the only member of the genus Hepevirus in the family Hepeviridae. Human HEV is classified into four genotypes widely distributed across the world. The virus is mainly transmitted via the fecal-oral route, and water-borne epidemics have become characteristic of hepatitis E in developing countries, including those in Latin America. The zoonotic potential of HEV is broadly recognized. Thus, there is an urgent need to re-evaluate virus transmission scenarios and to enforce epidemiological surveillance systems. Additionally, it is known that HEV infections, initially defined as self-limiting, can also take chronic courses in immunocompromised patients. Moreover, we recently reported a high seroprevalence of HEV in samples from cirrhotic patients with no other etiological agents present, suggesting the potential role of HEV in the development of chronic liver illness. In this review, HEV genomic variability, transmission, chronic infectious course, zoonotic potential and treatment are discussed. Focus is placed on the impact of HEV infection in Latin America, to support the development of specific control strategies and the handling of this important and typically imperceptible viral infection. PMID:26900289

  1. Integrated Pest Management of Coffee Berry Borer: Strategies from Latin America that Could Be Useful for Coffee Farmers in Hawaii

    PubMed Central

    Aristizábal, Luis F.; Bustillo, Alex E.; Arthurs, Steven P.

    2016-01-01

    The coffee berry borer (CBB), Hypothenemus hampei Ferrari (Coleoptera: Curculionidae: Scolytinae) is the primary arthropod pest of coffee plantations worldwide. Since its detection in Hawaii (September 2010), coffee growers are facing financial losses due to reduced quality of coffee yields. Several control strategies that include cultural practices, biological control agents (parasitoids), chemical and microbial insecticides (entomopathogenic fungi), and a range of post-harvest sanitation practices have been conducted to manage CBB around the world. In addition, sampling methods including the use of alcohol based traps for monitoring CBB populations have been implemented in some coffee producing countries in Latin America. It is currently unclear which combination of CBB control strategies is optimal under economical, environmental, and sociocultural conditions of Hawaii. This review discusses components of an integrated pest management program for CBB. We focus on practical approaches to provide guidance to coffee farmers in Hawaii. Experiences of integrated pest management (IPM) of CBB learned from Latin America over the past 25 years may be relevant for establishing strategies of control that may fit under Hawaiian coffee farmers’ conditions. PMID:26848690

  2. Social determinants of childhood asthma symptoms: an ecological study in urban Latin America.

    PubMed

    Fattore, Gisel L; Santos, Carlos A T; Barreto, Mauricio L

    2014-04-01

    Asthma is an important public health problem in urban Latin America. This study aimed to analyze the role of socioeconomic and environmental factors as potential determinants of the prevalence of asthma symptoms in children from Latin American (LA) urban centers. We selected 31 LA urban centers with complete data, and an ecological analysis was performed. According to our theoretical framework, the explanatory variables were classified into three levels: distal, intermediate, and proximate. The association between variables in the three levels and prevalence of asthma symptoms was examined by bivariate and multivariate linear regression analysis weighted by sample size. In a second stage, we fitted several linear regression models, sequentially introducing the variables according to the predefined hierarchy. In the final hierarchical model, the Gini Index, crowding, sanitation, variation in infant mortality rates, and homicide rates explained a large part of the variance in asthma prevalence between centers (R^2 = 75.0%). We found a strong association between socioeconomic and environmental variables and the prevalence of asthma symptoms in LA urban children, and according to our hierarchical framework and the results found, we suggest that social inequality (measured by the Gini Index) is a central determinant of the high prevalence of asthma in LA.

  3. Violence reported by the immigrant population is high as compared with the native population in southeast Spain.

    PubMed

    Colorado-Yohar, S; Tormo, M J; Salmerón, D; Dios, S; Ballesta, M; Navarro, C

    2012-11-01

    Immigrants constitute a population vulnerable to the problem of violence. This study sought to ascertain the prevalence of violence reported by the immigrant population in the Murcian Region of Spain and characterize the related factors, taking the native population as reference. A cross-sectional study was carried out based on a representative population sample of people of Latin American (n = 672; 48% women), Moroccan (n = 361; 25% women), and Spanish (n = 1,303; 66% women) origin, aged 16 to 64 years. Using a specific questionnaire, the prevalence of violence in the preceding year was assessed. The results were compared with those for Spaniards from the 2006 National Health Survey (NHS). Multivariate logistic regression models were used to study the factors associated with violence having been reported in each group, both separately and in immigrants versus Spaniards. Finally, the cause and place of the last aggression were studied. The prevalence of violence was 6.5% in Latin Americans, 12.0% in Moroccans, and 2.7% in Spaniards. Discrimination was the principal violence-related factor in all three groups. Among Latin Americans, low educational level was also associated with violence. Among Moroccans, those who had perceived discrimination showed the greatest differences in prevalence of violence compared with natives. Intimate partner violence (IPV) registered a prevalence of below 2%. In conclusion, reported violence in this study was infrequent overall but higher among immigrants than among natives. The principal violence-related factor was discrimination. More studies of this type are called for to characterize the problem in other population-representative samples.

  4. Stress and psychopathology in latin-american immigrants: the role of coping strategies.

    PubMed

    Patiño, Camila; Kirchner, Teresa

    2010-01-01

    Increased migration into Spain requires the development of preventive strategies that help both immigrants and the host society to deal with the associated risk factors and thus avoid the emergence of psychopathology. To determine the level of psychopathology in Latin-American immigrants who reside in Barcelona and its relationship to the coping strategies used to mitigate the effects of the stress linked to migration. The sample comprised 210 Latin-American immigrants over the age of 18. Sampling was based on consecutive cases, and participants were contacted through an NGO. Employment is the stressor that most affects immigrants. Psychopathological symptoms are common among the immigrant population, and there is a relationship between the use of avoidance coping strategies and greater symptomatology. The longer immigrants have been in the host country, the less they make use of approach strategies. The migratory process produces high levels of stress that are linked to psychopathology. Being subjected to a prolonged stressor has a destabilizing effect on both mental and physical health and can lead to a deterioration in social relationships due to more intense feelings of anger and frustration. Coping strategies appear to be more widely used among immigrants than in the indigenous population, and this may indicate the high levels of stress to which the former are subject and the attempts they make to deal with it. The limitations of the study include the source of data collection and the fact that most of the instruments used have not been validated in the participants' countries of origin. Copyright 2009 S. Karger AG, Basel.

  5. Nasal allergies in the Latin American population: results from the Allergies in Latin America survey.

    PubMed

    Neffen, Hugo; Mello, Joao F; Sole, Dirceu; Naspitz, Charles K; Dodero, Alberto Eduardo; Garza, Héctor León; Guerra, Edgard Novelo; Baez-Loyola, Carlos; Boyle, John M; Wingertzahn, Mark A

    2010-01-01

    Allergies in Latin America is the first cross-national survey that describes the symptoms, impact, and treatment of nasal allergies (NAs) in individuals ≥4 years old in Latin America (LA). In total, 22,012 households across the Latin American countries of Argentina, Brazil, Chile, Colombia, Ecuador, Mexico, Peru, and Venezuela were screened for children, adolescents, and adults with a diagnosis of NA and either symptoms or treatment in the past 12 months. A total of 1088 adults and 457 children and adolescents were included and the sample was probability based to ensure valid statistical inference to the population. Approximately 7% of the LA population was diagnosed with NAs with two of three respondents stating that their allergies were seasonal or intermittent in nature. A general practice physician or otolaryngologist diagnosed the majority of individuals surveyed. Nasal congestion was the most common and bothersome symptom of NAs. Sufferers indicated that their symptoms affected productivity and sleep and had a negative impact on quality of life. Two-thirds of patients reported taking some type of medication for their NAs, with a roughly equal percentage of patients reporting taking over-the-counter versus prescription medications. Changing medications was most commonly done in those reporting inadequate efficacy. The most common reasons cited for dissatisfaction with current medications were related to inadequate effectiveness, effectiveness wearing off with chronic use, failure to provide 24-hour relief, and bothersome side effects (e.g., unpleasant taste and retrograde drainage into the esophagus). Findings from this cross-national survey on NAs have confirmed a high prevalence of physician-diagnosed NAs and a considerable negative impact on daily quality of life and work productivity as well as substantial disease management challenges in LA. Through identification of disease impact on the LA population and further defining treatment gaps, clinicians in LA may better understand and treat NAs, thus leading to improvements in overall patient satisfaction and quality of life.

  6. Latin Curriculum Standards. Revised.

    ERIC Educational Resources Information Center

    Delaware State Dept. of Public Instruction, Dover.

    Delaware's state standards for the Latin curriculum in the public schools are presented. An introductory section outlines the goals of the Latin program for reading, cultural awareness, grammar, writing, and oral language and briefly discusses the philosophy of and approaches to Latin instruction in elementary and middle schools. Three subsequent…

  7. Re-evaluation of the sorption behaviour of Bromide and Sulfamethazine under field conditions using leaching data and modelling methods

    NASA Astrophysics Data System (ADS)

    Gassmann, Matthias; Olsson, Oliver; Höper, Heinrich; Hamscher, Gerd; Kümmerer, Klaus

    2016-04-01

    The simulation of reactive transport in the aquatic environment is hampered by the ambiguity of environmental fate process conceptualizations for a specific substance in the literature. Concepts are usually identified by experimental studies and inverse modelling under controlled lab conditions in order to reduce environmental uncertainties such as uncertain boundary conditions and input data. However, since environmental conditions affect substance behaviour, a re-evaluation might be necessary under environmental conditions which might, in turn, be affected by uncertainties. Using a combination of experimental data and simulations of the leaching behaviour of the veterinary antibiotic Sulfamethazine (SMZ; synonym: sulfadimidine) and the hydrological tracer Bromide (Br) in a field lysimeter, we re-evaluated the sorption concepts of both substances under uncertain field conditions. Sampling data from a field lysimeter experiment, in which both substances were applied twice a year with manure and sampled at the bottom of two lysimeters during three subsequent years, were used for model set-up and evaluation. The total amounts of leached SMZ and Br were 22 μg and 129 mg, respectively. A reactive transport model was parameterized to the conditions of the two lysimeters filled with monoliths (depth 2 m, area 1 m²) of a sandy soil showing a low pH value under which Bromide is sorptive. We used different sorption concepts such as constant and organic-carbon dependent sorption coefficients and instantaneous and kinetic sorption equilibrium. Combining the sorption concepts resulted in four scenarios per substance with different equations for sorption equilibrium and sorption kinetics. The GLUE (Generalized Likelihood Uncertainty Estimation) method was applied to each scenario using parameter ranges found in experimental and modelling studies. The parameter spaces for each scenario were sampled using a Latin Hypercube method which was refined around local model efficiency maxima. Results of the cumulative SMZ leaching simulations suggest that the best combination of concepts is instantaneous, organic-carbon-dependent sorption, which is consistent with the literature. The best Nash-Sutcliffe efficiency (Neff) was 0.96 and the 5th and 95th percentiles of the uncertainty estimation were 18 and 27 μg. In contrast, both scenarios of kinetic Br sorption had similar results (Neff = 0.99, uncertainty bounds 110-176 mg and 112-176 mg) but were clearly better than the instantaneous sorption scenarios. Therefore, only the concept of sorption kinetics could be identified for Br modelling, whereas both tested sorption equilibrium coefficient concepts performed equally well. The reasons for this specific case of equifinality may be uncertainties of model input data under field conditions or an insensitivity of the sorption equilibrium method due to relatively low adsorption of Br. Our results show that it may be possible to identify or at least falsify specific sorption concepts under uncertain field conditions using a long-term leaching experiment and modelling methods. Cases of environmental fate concept equifinality raise the possibility of future model structure uncertainty analysis using an ensemble of models with different environmental fate concepts.
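
    The GLUE step described above can be sketched generically: parameter sets whose efficiency exceeds a behavioural threshold are retained, their predictions are weighted by that efficiency, and likelihood-weighted 5th and 95th percentiles form the uncertainty bounds. The threshold, weighting and toy data below are illustrative assumptions, not the exact choices of the study.

        import numpy as np

        def glue_bounds(efficiency, predictions, threshold=0.5, quantiles=(0.05, 0.95)):
            """Likelihood-weighted GLUE uncertainty bounds at every time step.
            efficiency:  (n_sets,) likelihood measure, e.g. Nash-Sutcliffe efficiency
            predictions: (n_sets, n_times) simulated output per parameter set"""
            efficiency = np.asarray(efficiency, float)
            predictions = np.asarray(predictions, float)
            keep = efficiency > threshold                 # behavioural sets only
            w = efficiency[keep] / efficiency[keep].sum()
            bounds = []
            for t in range(predictions.shape[1]):
                order = np.argsort(predictions[keep, t])
                cdf = np.cumsum(w[order])
                vals = predictions[keep, t][order]
                bounds.append([vals[np.searchsorted(cdf, q)] for q in quantiles])
            return np.array(bounds)                       # shape (n_times, 2)

        # Toy usage: 100 Latin-hypercube parameter sets, 50 output time steps.
        rng = np.random.default_rng(11)
        preds = rng.normal(20, 3, size=(100, 50))
        eff = rng.uniform(0.2, 0.99, 100)
        print(glue_bounds(eff, preds)[:3])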

  8. Twenty-four cases of imported zika virus infections diagnosed by molecular methods.

    PubMed

    Alejo-Cancho, Izaskun; Torner, Nuria; Oliveira, Inés; Martínez, Ana; Muñoz, José; Jane, Mireia; Gascón, Joaquim; Requena-Méndez, Ana; Vilella, Anna; Marcos, M Ángeles; Pinazo, María Jesús; Gonzalo, Verónica; Rodriguez, Natalia; Martínez, Miguel J

    2016-10-01

    Zika virus is an emerging flavivirus widely spreading through Latin America. Molecular diagnosis of the infection can be performed using serum, urine and saliva samples, although a well-defined diagnostic algorithm is not yet established. We describe a series of 24 cases of imported zika virus infection into Catalonia (northeastern Spain). Based on our findings, testing of paired serum and urine samples is recommended. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Phylogenetic Analysis of Rubella Virus Strains from an Outbreak in Madrid, Spain, from 2004 to 2005

    PubMed Central

    Martínez-Torres, A. O.; Mosquera, M. M.; Sanz, J. C.; Ramos, B.; Echevarría, J. E.

    2009-01-01

    An outbreak of rubella affected 460 individuals in 2004 and 2005 in the community of Madrid, Spain. Most of the patients were nonvaccinated Latin American immigrants or Spanish males. This study presents the first data on rubella virus genotypes in Spain. Forty selected clinical samples (2 urine, 5 serum, 3 blood, 2 saliva, and 28 pharyngeal exudate samples) from 40 cases were collected. The 739-nucleotide sequence recommended by the World Health Organization obtained from viral RNA in these samples was analyzed by using the MEGA v4.0 software. Seventeen isolates were obtained from the 40 clinical samples from the outbreak, including two from congenital rubella syndrome cases. Only viral RNA of genotype 1j was detected in both isolates and clinical specimens. Two amino acid variations, G253C and T394S, which are involved in neutralization epitopes, arose during the outbreak, but apparently there was no positive selection of either of them. The origin of the outbreak remains unknown because of poor virologic surveillance in Latin America and the African countries neighboring Spain. On the other hand, this is the first report of this genotype in Europe. The few published sequences of genotype 1j indicate that it comes from Japan and the Philippines, but there are no epidemiological data supporting this as the origin of the Madrid outbreak. PMID:19020066

  10. The Bologna Process from a Latin American Perspective

    ERIC Educational Resources Information Center

    Brunner, Jose Joaquin

    2009-01-01

    Although Latin America's geography, history, and languages might seem a suitable foundation for a Bologna-type process, the development of a common Latin American higher education and research area meets predictable difficulties. The reasons are to be found in the continent's historic and modern institutional patterns. Latin American governments…

  11. Latin America and the United States: What Do United States History Textbooks Tell Us?

    ERIC Educational Resources Information Center

    Fleming, Dan B.

    1982-01-01

    Evaluates how U.S.-Latin American relations are presented in high school U.S. history textbooks. An examination of 10 textbooks published between 1977-81 revealed inadequate coverage of Latin American cultural diversity and United States foreign policy from the Latin American perspective. (AM)

  12. Atlanta Public Schools Latin Guide.

    ERIC Educational Resources Information Center

    Atlanta Public Schools, GA.

    This teacher's guide outlines the basic objectives and the content of the Atlanta Public Schools Latin program and suggests resources and methods to achieve the stated goals. The philosophy and general objectives of the program are presented. Course outlines include: (1) Beginning Latin, (2) Intermediate Latin, (3) Vergil's "Aeneid," (4) Ovid:…

  13. HRD in Latin America. Symposium 5. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This document contains three papers on human resource development (HRD) in Latin America. "Looking at the Literature on Workplace Democracy in Latin America: Factors in Favor and Against It" (Max U. Montesino) discusses selected issues related to workplace democracy in Latin America and identifies salient issues for further research,…

  14. Apuntes sobre Latinismos en Espanol (Notes on Latinisms in Spanish)

    ERIC Educational Resources Information Center

    Perez B., L. A.

    1977-01-01

    Several Latinisms appear in Latin American Spanish, which would logically be farther from its Latin roots than Spanish in Spain. The existence of these elements and their importance as linguistic facts is analyzed here. Four words are treated: "Cliente," "cuadrar," "cuarto" and "rabula." (Text is in Spanish.) (CHK)

  15. Textbooks in Greek and Latin: 1975 List

    ERIC Educational Resources Information Center

    McCarty, Thomas G.

    1975-01-01

    List of textbooks in Greek and Latin for 1975. Subject, title, publisher and price are noted. Greek and Latin works are listed separately under the eight categories of texts, beginner's books, grammars, books about the language, readers and anthologies, composition, dictionaries, and New Testament Greek and Later Latin. (RM)

  16. Internationalizing Business Education in Latin America: Issues and Challenges

    ERIC Educational Resources Information Center

    Elahee, Mohammad; Norbis, Mario

    2009-01-01

    This article examines the extent of internationalization of business education in Latin America and identifies the key challenges facing the Latin American business schools. Based on a survey of the business schools that are members of CLADEA (Consejo Latinoamericano de Escuelas de Administracion--Latin American Council of Management Schools), and…

  17. Support and Conflict in Ethnically Diverse Young Adults' Relationships with Parents and Friends

    ERIC Educational Resources Information Center

    Moilanen, Kristin L.; Raffaelli, Marcela

    2010-01-01

    We examined support and conflict with parents and close friends in a sample of ethnically diverse young adults (European-, Asian-, Cuban-, Latin-, and Mexican Americans). College students (N = 495) completed six subscales from the Network of Relationships Inventory (NRI; Furman & Buhrmester, 1985). Friends were rated higher than parents on…

  18. Parallel Processing and Scientific Applications

    DTIC Science & Technology

    1992-11-30

    Only extraction fragments of this report were recovered, chiefly pieces of its reference list: "Lattice QCD Calculations on the Connection Machine," SIAM News 24, 1 (May 1991); C. F. Baillie and D. A. Johnston, "Crumpling Dynamically Triangulated…"; Sharpe, "QCD with Dynamical Wilson Fermions II," Phys. Rev. D44, 3272 (1991); and R. Gupta and C. F. Baillie, "Critical Behavior of the 2D XY Model," Phys. Rev. A body-text fragment describes three surface discretisations: a hypercubic lattice; a surface randomly triangulated once at the beginning of the simulation; and a third in which the random…

  19. Mapping Flows onto Networks to Optimize Organizational Processes

    DTIC Science & Technology

    2005-01-01

    Only extraction fragments of the reference list were recovered: G. Porter, "Assessments of Simulated Performance of Alternative Architectures for Command and Control: The Role of Coordination," Proceedings of the 1999 Command & Control Research & Technology Symposium, NWC, Newport, RI, June 1999, pp. 123-143; [Iverson95] M. Iverson, F. Ozguner, G. Follen (…Technology Symposium, NPS, Monterrey, CA, June 2002); [Wu88] Min-You Wu and D. Gajski, "A Programming Aid for Hypercube Architectures," The Journal of Supercomputing, 2 (1988), pp. 349-372.

  20. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
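
    The challenge/countersign scheme sketched above can be illustrated in a few lines: both sides hold an identical multidimensional matrix of random alphanumeric codes, the host issues a coordinate as the challenge, and the user answers with the code stored at that coordinate. The matrix shape, code length, and helper names below are invented for the example and are not taken from the original report.

```python
# Hedged illustration of a challenge/countersign lookup in a shared code matrix.
import secrets
import string
import numpy as np

ALPHABET = string.ascii_uppercase + string.digits

def make_code_matrix(shape=(4, 4, 4), code_len=5, seed=None):
    """Fill an n-dimensional matrix with random alphanumeric codes."""
    rng = np.random.default_rng(seed)
    codes = ["".join(rng.choice(list(ALPHABET), code_len)) for _ in range(int(np.prod(shape)))]
    return np.array(codes).reshape(shape)

# Both parties hold the same copy (e.g. distributed on a card or disk and changed often).
matrix = make_code_matrix(seed=7)

challenge = tuple(secrets.randbelow(s) for s in matrix.shape)   # host picks a coordinate
countersign = matrix[challenge]                                 # user looks up the code
print(challenge, countersign)                                   # host verifies the reply against its copy
```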

  1. Socioeconomic Status Is Not Related with Facial Fluctuating Asymmetry: Evidence from Latin-American Populations

    PubMed Central

    Quinto-Sánchez, Mirsha; Cintas, Celia; Silva de Cerqueira, Caio Cesar; Ramallo, Virginia; Acuña-Alonzo, Victor; Adhikari, Kaustubh; Castillo, Lucía; Gomez-Valdés, Jorge; Everardo, Paola; De Avila, Francisco; Hünemeier, Tábita; Jaramillo, Claudia; Arias, Williams; Fuentes, Macarena; Gallo, Carla; Poletti, Giovani; Schuler-Faccini, Lavinia; Bortolini, Maria Cátira; Canizales-Quinteros, Samuel; Rothhammer, Francisco; Bedoya, Gabriel; Rosique, Javier; Ruiz-Linares, Andrés; González-José, Rolando

    2017-01-01

    The expression of facial asymmetries has been recurrently related with poverty and/or disadvantaged socioeconomic status. Departing from the developmental instability theory, previous approaches attempted to test the statistical relationship between the stress experienced by individuals raised in poor conditions and an increase in facial and corporal asymmetry. Here we further evaluate this hypothesis in a large sample of admixed Latin American individuals by exploring whether individuals of low socioeconomic status tend to exhibit greater facial fluctuating asymmetry values. To do so, we implement Procrustes analysis of variance and Hierarchical Linear Modelling (HLM) to estimate potential associations between facial fluctuating asymmetry values and socioeconomic status. We report significant relationships between facial fluctuating asymmetry values and age, sex, and genetic ancestry, while socioeconomic status failed to exhibit any strong statistical relationship with facial asymmetry. These results persist after the effect of heterozygosity (a proxy for genetic ancestry) is controlled for in the model. Our results indicate that, at least in the studied sample, there is no relationship between socioeconomic stress (understood here as low socioeconomic status) and facial asymmetries. PMID:28060876
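
    As a rough illustration of the hierarchical modelling step described above (fluctuating asymmetry regressed on socioeconomic status, age, sex, and ancestry, with individuals grouped by sampling site), the snippet below fits a mixed linear model to synthetic data. Variable names, groupings, and the data itself are assumptions made for the example, not the study's dataset.

```python
# Sketch of an HLM-style analysis with a random intercept per sampling site.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "fa": rng.normal(1.0, 0.2, n),                    # fluctuating asymmetry score
    "ses": rng.integers(1, 6, n),                     # socioeconomic status category
    "age": rng.integers(18, 60, n),
    "sex": rng.choice(["F", "M"], n),
    "ancestry": rng.uniform(0, 1, n),                 # e.g. one ancestry proportion
    "site": rng.choice(["BR", "CO", "MX", "PE", "CL"], n),
})

model = smf.mixedlm("fa ~ ses + age + C(sex) + ancestry", df, groups=df["site"]).fit()
print(model.summary())
```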

  3. Look for the Latin Word: A Gamebook on English Derivatives and Cognates to Accompany "How the Romans Lived and Spoke (Romani Viventes et Dicentes): A Humanistic Approach to Latin for Children in the Fifth Grade".

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph; And Others

    This gamebook is intended to assist the elementary school Latin teacher in introducing the reading and writing of English derivatives and cognates after these have been mastered audiolingually in the Latin course "How the Romans Lived and Spoke (Romani Viventes et Dicentes): A Humanistic Approach to Latin for Children in the Fifth Grade." (For the…

  4. [Scientific journals of medical students in Latin-America].

    PubMed

    Cabrera-Samith, Ignacio; Oróstegui-Pinilla, Diana; Angulo-Bazán, Yolanda; Mayta-Tristán, Percy; Rodríguez-Morales, Alfonso J

    2010-11-01

    This article deals with the history and evolution of students' scientific journals in Latin America: their beginnings, how many still exist, and their future prospects. Relevant events show the growth of students' scientific journals in Latin America and how they are working together to improve their quality. The article is addressed not only to Latin American readers but also to readers worldwide. Latin American medical students are consistently working together to publish scientific research, whose quality is constantly improving.

  5. Sickle cell in Latin America and the United States [corrected].

    PubMed

    Huttle, Alexandra; Maestre, Gladys E; Lantigua, Rafael; Green, Nancy S

    2015-07-01

    Latin Americans are an underappreciated population affected by sickle cell disease (SCD). Sickle trait and SCD exist throughout Latin America and U.S. Latino communities. We describe the epidemiology and genetic heterogeneity of SCD among Latin Americans, and fetal hemoglobin expression. National population-based newborn screening for SCD is limited to Brazil, Costa Rica, and the U.S. Available and extrapolated data suggest that over 6,000 annual births and 100,000-150,000 Latin Americans are affected by SCD. This comprehensive review highlights the substantial numbers and population distribution of SCD and sickle trait in Latin America, and where national newborn screening programs for SCD exist. © 2015 Wiley Periodicals, Inc.

  6. Hospital malnutrition in Latin America: A systematic review.

    PubMed

    Correia, Maria Isabel T D; Perman, Mario Ignacio; Waitzberg, Dan Linetzky

    2017-08-01

    Disease-related malnutrition is a major public health issue in both industrialised and emerging countries. The reported prevalence in hospitalised adults ranges from 20% to 50%. Initial reports from emerging countries suggested a higher prevalence compared with other regions, with limited data on outcomes and costs. We performed a systematic literature search for articles on disease-related malnutrition in Latin American countries published between January 1995 and September 2014. Studies reporting data on the prevalence, clinical outcomes, or economic costs of malnutrition in an adult (≥18 years) inpatient population with a sample size of ≥30 subjects were eligible for inclusion. Methodological quality of the studies was assessed by two independent reviewers using published criteria. We identified 1467 citations; of these, 66 studies including 29,474 patients in 12 Latin American countries met the criteria for inclusion. There was considerable variability in methodology and in the reported prevalence of disease-related malnutrition; however, prevalence was consistently in the range of 40%-60% at the time of admission, with several studies reporting an increase in prevalence with increasing duration of hospitalisation. Disease-related malnutrition was associated with an increase in infectious and non-infectious clinical complications, length of hospital stay, and costs. Disease-related malnutrition is a highly prevalent condition that imposes a substantial health and economic burden on the countries of Latin America. Further research is necessary to characterise screening/assessment practices and identify evidence-based solutions to this persistent and costly public health issue. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  7. Multiple syndemic psychosocial factors are associated with reduced engagement in HIV care among a multinational, online sample of HIV-infected MSM in Latin America.

    PubMed

    Biello, Katie B; Oldenburg, Catherine E; Safren, Steven A; Rosenberger, Joshua G; Novak, David S; Mayer, Kenneth H; Mimiaga, Matthew J

    2016-01-01

    Latin America has some of the highest levels of antiretroviral therapy (ART) coverage of any developing region in the world. Early initiation and optimal adherence to ART are necessary for improved health outcomes and reduction in onward transmission. Previous work has demonstrated the role of psychosocial problems as barriers to uptake and adherence to ART, and recently, a syndemic framework has been applied to the role of multiple psychosocial syndemic factors and adherence to ART, in the USA. However, to our knowledge, these associations have not been investigated outside of the USA, nor in a multi-country context. To address these gaps, we assessed the association between multiple co-occurring psychosocial factors and engagement in HIV-related medical care and adherence to ART among a large, multinational sample of sexually-active HIV-infected men who have sex with men in Latin America. Among the 2020 respondents, 80.7% reported currently receiving HIV-related medical care, 72.3% reported currently receiving ART; among those, 62.5% reported 100% adherence. Compared with experiencing no psychosocial health problems, experiencing five or more psychosocial health problems is associated with 42% lower odds of currently receiving HIV-related medical care (adjusted odds ratio, aOR = 0.58, 95% CI 0.36, 0.95) and of currently receiving ART (aOR = 0.58, 95% CI 0.38, 0.91). The number of psychosocial health problems experienced was associated with self-reported ART adherence in a dose-response relationship; compared to those with none of the factors, individuals with one syndemic factor had 23% lower odds (aOR = 0.77, 95% CI 0.60, 0.97) and individuals with five or more syndemic factors had 72% lower odds (aOR = 0.28, 95% CI 0.14, 0.55) of reporting being 100% adherent to ART. Addressing co-occurring psychosocial problems as potential barriers to uptake and adherence of ART in Latin America may improve the effectiveness of secondary prevention interventions.

  8. Spatially distributed environmental fate modelling of terbuthylazine in a mesoscale agricultural catchment using passive sampler data

    NASA Astrophysics Data System (ADS)

    Gassmann, Matthias; Farlin, Julien; Gallé, Tom

    2017-04-01

    Agricultural application of herbicides often leads to significant herbicide losses to receiving rivers. The impact of agricultural practices on water pollution can be assessed by process-based reactive transport modelling using catchment scale models. Prior to investigations of management practices, these models have to be calibrated using sampling data. However, most previous studies only used concentrations at the catchment outlet for model calibration and validation. Thus, even if the applied model is spatially distributed, predicted spatial differences of pesticide loss cannot be directly compared to observations. In this study, we applied the spatially distributed reactive transport model Zin-AgriTra in the mesoscale (78 km²) catchment of the Wark River in Luxembourg in order to simulate concentrations of terbuthylazine in river water. In contrast to former studies, we used six sampling points, equipped with passive samplers, for pesticide model validation. Three samplers were located in the main channel of the river and three in smaller tributaries. At each sampling point, event mean concentrations of six events from May to July 2011 were calculated by subtracting the baseflow mass from the total collected mass, assuming time-proportional uptake by the passive samplers. Continuous discharge measurements and high-resolution autosampling during events allowed for accurate load calculations at the outlet. Detailed information about maize cultivation in the catchment and nation-wide terbuthylazine application statistics (341 g/ha in the 3rd week of May) were used to define the pesticide input function of the model. The hydrological model was manually calibrated to fit baseflow and spring/summer events. Substance fluxes were calibrated using a Latin Hypercube of physico-chemical substance characteristics as provided by the literature: surface soil half-lives of 10-35 d, Freundlich KOC of 150-330 ml/g, Freundlich n of 0.9-1, and adsorption/desorption kinetics of 20-80 1/d. Daily discharge simulations resulted in high Kling-Gupta efficiencies (KGE) for the calibration and the validation period (KGE > 0.70). Overall, terbuthylazine concentrations could be successfully reproduced, with maximum KGE > 0.90 for all concentrations in the catchment and loads at the outlet. The generally lower concentrations in the tributaries measured by the passive samplers and the declining concentrations towards the outlet in the main channel could be reproduced by the model. The model simulated overland flow to be the major source of terbuthylazine in the main channel and soil water fluxes to be the most important pathways in the tributaries. Simulation results suggest that less than 0.01 % of the applied terbuthylazine mass was exported to the river in the Wark catchment and that less than 5 % of the exported mass originated from the sampled tributaries. In addition to the calibration of substance characteristics, passive sampler data were helpful in the model setup of application-field connectivity. Since the spatial resolution of the model was 50 m, input maps sometimes showed a field to be directly connected to a river, whereas in reality it was separated from it by a 30 m wide field or forest strip. Such misconfigurations, leading to high concentrations in tributaries, could easily be identified by comparing model results to passive sampler data.
    In conclusion, assigning the different transport pathways of terbuthylazine to the rivers by model simulation was aided by the additional spatial information on pesticide concentrations gained from the passive samplers.
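
    The literature ranges quoted above lend themselves to a compact illustration of the Latin Hypercube parameter sampling. The sketch below only generates the parameter sets; feeding them to the Zin-AgriTra model is outside its scope, and the sample size is an arbitrary choice.

```python
# Latin Hypercube sample over the quoted terbuthylazine parameter ranges (sketch).
from scipy.stats import qmc

names = ["half_life_d", "freundlich_koc_ml_g", "freundlich_n", "sorption_kinetics_1_d"]
l_bounds = [10.0, 150.0, 0.9, 20.0]
u_bounds = [35.0, 330.0, 1.0, 80.0]

lhs = qmc.LatinHypercube(d=4, seed=2)
parameter_sets = qmc.scale(lhs.random(n=200), l_bounds, u_bounds)

for row in parameter_sets[:3]:                       # preview a few candidate sets
    print(dict(zip(names, row.round(3))))
```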

  9. Composition and sources of carbonaceous aerosols in Northern Europe during winter

    NASA Astrophysics Data System (ADS)

    Glasius, M.; Hansen, A. M. K.; Claeys, M.; Henzing, J. S.; Jedynska, A. D.; Kasper-Giebl, A.; Kistler, M.; Kristensen, K.; Martinsson, J.; Maenhaut, W.; Nøjgaard, J. K.; Spindler, G.; Stenström, K. E.; Swietlicki, E.; Szidat, S.; Simpson, D.; Yttri, K. E.

    2018-01-01

    Sources of elemental carbon (EC) and organic carbon (OC) in atmospheric aerosols (carbonaceous aerosols) were investigated by collection of weekly aerosol filter samples at six background sites in Northern Europe (Birkenes, Norway; Vavihill, Sweden; Risoe, Denmark; Cabauw and Rotterdam in The Netherlands; Melpitz, Germany) during winter 2013. ¹⁴C analysis and a set of molecular tracers were used to constrain the sources of EC and OC. During the four-week campaign, most sites (in particular those in Germany and The Netherlands) were affected by an episode during the first two weeks with high concentrations of aerosol, as continental air masses were transported westward. The analysis results showed a clear, increasing north-to-south gradient for most molecular tracers. Total carbon (TC = OC + EC) at Birkenes showed an average concentration of 0.5 ± 0.3 μg C m⁻³, whereas the average concentration at Melpitz was 6.0 ± 4.3 μg C m⁻³. One weekly mean TC concentration as high as 11 μg C m⁻³ was observed at Melpitz. Average levoglucosan concentrations varied by an order of magnitude from 25 ± 13 ng m⁻³ (Birkenes) to 249 ± 13 ng m⁻³ (Melpitz), while concentrations of tracers of fungal spores (arabitol and mannitol) and vegetative debris (cellulose) were very low, showing a minor influence of primary biological aerosol particles during the North European winter. The fraction of modern carbon generally varied from 0.57 (Melpitz) to 0.91 (Birkenes), showing an opposite trend compared to the molecular tracers and TC. Total concentrations of 10 biogenic and anthropogenic carboxylic acids, mainly of secondary origin, were 4-53 ng m⁻³, with the lowest concentrations observed at Birkenes and the highest at Melpitz. However, the highest relative concentrations of carboxylic acids (normalized to TC) were observed at the most northern sites. Levels of organosulphates and nitrooxy organosulphates varied more than two orders of magnitude, from 2 to 414 ng m⁻³, between individual sites and samples. The three sites Melpitz, Rotterdam and Cabauw, located closest to source regions in continental Europe, showed very high levels of organosulphates and nitrooxy organosulphates (up to 414 ng m⁻³) during the first two weeks of the study, while low levels (<7 ng m⁻³) were found at all sites except Melpitz during the last week. The large variation in organosulphate levels probably reflects differences in the presence of acidic sulphate aerosols, known from laboratory studies to accelerate the formation of these compounds. On average, the ratio of organic sulphate to inorganic sulphate was 1.5 ± 1.0% (range 0.1-3.4%). Latin-hypercube source apportionment techniques identified biomass burning as the major source of OC for all samples at all sites (typically >40% of TC), while use and combustion of fossil fuels was the second most important source. Furthermore, EC from biomass burning accounted for 7-16% of TC, whereas EC from fossil sources contributed <2-23% of TC, of which the highest percentages were observed for low-concentration aerosol samples. Unresolved non-fossil sources (such as cooking and biogenic secondary organic aerosols) did not account for more than 5-12% of TC. The results confirm that wood combustion is a major source of OC and EC in Northern Europe during winter.
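
    The "Latin-hypercube source apportionment" mentioned above can be caricatured with a Monte Carlo mass balance: uncertain conversion factors are drawn from a Latin Hypercube, and total carbon is then split into fossil, biomass-burning, and other non-fossil fractions. The measured values, factor ranges, and the simple two-factor balance below are assumptions made for illustration; they are not the campaign data or the authors' apportionment model.

```python
# Toy LHS-based apportionment of total carbon (TC) into source fractions.
import numpy as np
from scipy.stats import qmc

tc = 6.0             # total carbon, ug C m-3 (illustrative)
levoglucosan = 0.25  # ug m-3 (illustrative)
f_modern = 0.60      # measured fraction of modern carbon (illustrative)

# Uncertain factors: TC/levoglucosan emission ratio for wood burning, and the
# fraction-modern reference value of contemporary (non-fossil) carbon.
l_bounds, u_bounds = [6.0, 1.05], [12.0, 1.25]
lhs = qmc.LatinHypercube(d=2, seed=3)
factors = qmc.scale(lhs.random(n=10000), l_bounds, u_bounds)

tc_bb = levoglucosan * factors[:, 0]            # biomass-burning carbon
tc_nonfossil = tc * f_modern / factors[:, 1]    # all contemporary (non-fossil) carbon
tc_fossil = tc - tc_nonfossil                   # fossil carbon
tc_other = np.clip(tc_nonfossil - tc_bb, 0.0, None)

for name, x in [("biomass burning", tc_bb), ("fossil", tc_fossil), ("other non-fossil", tc_other)]:
    lo, med, hi = np.percentile(x / tc * 100, [10, 50, 90])
    print(f"{name:>17s}: median {med:4.1f}% of TC (10th-90th pct {lo:.1f}-{hi:.1f}%)")
```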

  10. Latin American Universities and the Bologna Process: From Commercialisation to the "Tuning" Competencies Project

    ERIC Educational Resources Information Center

    Aboites, Hugo

    2010-01-01

    Through the "Tuning-Latin America" competencies project, Latin American universities have been incorporated into the Bologna Process. In 2003 the European Commission approved an initiative of this project for Latin America and began to promote it among ministries, university presidents' organisations and other institutions in Latin…

  11. Considerations for Integrating Technology in Developing Communities in Latin America

    ERIC Educational Resources Information Center

    Ponte, Daniela Núñez; Cullen, Theresa A.

    2013-01-01

    This article discusses issues related to introducing new information and communication technologies (ICT) into Latin American countries. Latin American countries are gaining world focus with political changes such as the death of Hugo Chavez in Venezuela and the election of the first Latin American Pope. This region will host the World Cup,…

  12. Latin American Immigrant Women and Intergenerational Sex Education

    ERIC Educational Resources Information Center

    Alcalde, Maria Cristina; Quelopana, Ana Maria

    2013-01-01

    People of Latin American descent make up the largest and fastest-growing minority group in the USA. Rates of pregnancy, childbirth, and sexually transmitted infections among people of Latin American descent are higher than among other ethnic groups. This paper builds on research that suggests that among families of Latin American descent, mothers…

  13. Latin and Cross Latin Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2008-01-01

    Latin squares were first introduced and studied by the famous mathematician Leonhard Euler in the 1700s. Through the years, Latin squares have been used in areas such as statistics, graph theory, coding theory, the generation of random numbers as well as in the design and analysis of experiments. Recently, with the international popularity of…

  14. Colloquamur Latine cum Pueris Puellesque: Latin in the Middle School.

    ERIC Educational Resources Information Center

    Graber, Charles F.; Norton, Harriet S.

    Guidelines for the development of a Latin curriculum for the middle school, with specific suggestions as to content and methodology, are presented in this manual. The material, oriented toward new approaches in the teaching of the Latin and Graeco-Roman cultures, strives to develop proficiency in the skills of listening, speaking, reading, and…

  15. Machine learning for predicting soil classes in three semi-arid landscapes

    USGS Publications Warehouse

    Brungard, Colby W.; Boettinger, Janis L.; Duniway, Michael C.; Wills, Skye A.; Edwards, Thomas C.

    2015-01-01

    Mapping the spatial distribution of soil taxonomic classes is important for informing soil use and management decisions. Digital soil mapping (DSM) can quantitatively predict the spatial distribution of soil taxonomic classes. Key components of DSM are the method and the set of environmental covariates used to predict soil classes. Machine learning is a general term for a broad set of statistical modeling techniques. Many different machine learning models have been applied in the literature and there are different approaches for selecting covariates for DSM. However, there is little guidance as to which, if any, machine learning model and covariate set might be optimal for predicting soil classes across different landscapes. Our objective was to compare multiple machine learning models and covariate sets for predicting soil taxonomic classes at three geographically distinct areas in the semi-arid western United States of America (southern New Mexico, southwestern Utah, and northeastern Wyoming). All three areas were the focus of digital soil mapping studies. Sampling sites at each study area were selected using conditioned Latin hypercube sampling (cLHS). We compared models that had been used in other DSM studies, including clustering algorithms, discriminant analysis, multinomial logistic regression, neural networks, tree based methods, and support vector machine classifiers. Tested machine learning models were divided into three groups based on model complexity: simple, moderate, and complex. We also compared environmental covariates derived from digital elevation models and Landsat imagery that were divided into three different sets: 1) covariates selected a priori by soil scientists familiar with each area and used as input into cLHS, 2) the covariates in set 1 plus 113 additional covariates, and 3) covariates selected using recursive feature elimination. Overall, complex models were consistently more accurate than simple or moderately complex models. Random forests (RF) using covariates selected via recursive feature elimination was consistently the most accurate, or was among the most accurate, classifiers between study areas and between covariate sets within each study area. We recommend that for soil taxonomic class prediction, complex models and covariates selected by recursive feature elimination be used. Overall classification accuracy in each study area was largely dependent upon the number of soil taxonomic classes and the frequency distribution of pedon observations between taxonomic classes. Individual subgroup class accuracy was generally dependent upon the number of soil pedon observations in each taxonomic class. The number of soil classes is related to the inherent variability of a given area. The imbalance of soil pedon observations between classes is likely related to cLHS. Imbalanced frequency distributions of soil pedon observations between classes must be addressed to improve model accuracy. Solutions include increasing the number of soil pedon observations in classes with few observations or decreasing the number of classes. Spatial predictions using the most accurate models generally agree with expected soil–landscape relationships. Spatial prediction uncertainty was lowest in areas of relatively low relief for each study area.
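
    As an illustration of the combination the abstract reports as most accurate (a complex model, here a random forest, with covariates chosen by recursive feature elimination), the sketch below uses scikit-learn on synthetic data. The feature matrix, class labels, and parameter choices are stand-ins, not the study's covariates or pedon observations.

```python
# Random forest soil-class prediction with RFE-selected covariates (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for terrain/spectral covariates and soil taxonomic classes.
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           n_classes=5, n_clusters_per_class=1, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
selector = RFE(rf, n_features_to_select=10, step=5).fit(X, y)

acc = cross_val_score(rf, X[:, selector.support_], y, cv=5).mean()
print(f"Cross-validated accuracy with {selector.support_.sum()} RFE-selected covariates: {acc:.2f}")
```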

  16. The use of Latin terminology in medical case reports: quantitative, structural, and thematic analysis.

    PubMed

    Lysanets, Yuliia V; Bieliaieva, Olena M

    2018-02-23

    This paper focuses on the prevalence of Latin terms and terminological collocations in the issues of Journal of Medical Case Reports (February 2007-August 2017) and discusses the role of Latin terminology in the contemporary process of writing medical case reports. The objective of the research is to study the frequency of using Latin terminology in English-language medical case reports, thus providing relevant guidelines for medical professionals who deal with this genre and drawing their attention to the peculiarities of using Latin in case reports. The selected medical case reports are considered, using methods of quantitative examination and structural, narrative, and contextual analyses. We developed structural and thematic typologies of Latin terms and expressions, and we conducted a quantitative analysis that enabled us to observe the tendencies in using these lexical units in medical case reports. The research revealed that the use of Latin fully complies with the communicative strategies of medical case reports as a genre. Owing to the fact that Latin medical lexis is internationally adopted and understood worldwide, it promotes the conciseness of medical case reports, as well as contributes to their narrative style and educational intentions. The adequate use of Latin terms in medical case reports is an essential prerequisite of effective sharing of one's clinical findings with fellow researchers from all over the world. Therefore, it is highly important to draw students' attention to Latin terms and expressions that are used in medical case reports most frequently. Hence, the analysis of structural, thematic, and contextual features of Latin terms in case reports should be an integral part of curricula at medical universities.

  17. Atmospheric Concentrations of New Persistent Organic Pollutants and Emerging Chemicals of Concern in the Group of Latin America and Caribbean (GRULAC) Region.

    PubMed

    Rauert, Cassandra; Harner, Tom; Schuster, Jasmin K; Eng, Anita; Fillmann, Gilberto; Castillo, Luisa Eugenia; Fentanes, Oscar; Villa Ibarra, Martín; Miglioranza, Karina S B; Moreno Rivadeneira, Isabel; Pozo, Karla; Aristizábal Zuluaga, Beatriz Helena

    2018-06-15

    A special initiative was run by the Global Atmospheric Passive Sampling (GAPS) Network to provide atmospheric data on a range of emerging chemicals of concern and candidate and new persistent organic pollutants in the Group of Latin America and Caribbean (GRULAC) region. Regional-scale data for a range of flame retardants (FRs) including polybrominated diphenyl ethers (PBDEs), organophosphate esters (OPEs), and a range of alternative FRs (novel FRs) are reported over 2 years of sampling with low detection frequencies of the novel FRs. Atmospheric concentrations of the OPEs were an order of magnitude higher than all other FRs, with similar profiles at all sites. Regional-scale background concentrations of the poly- and perfluoroalkyl substances (PFAS), including the neutral PFAS (n-PFAS) and perfluoroalkyl acids (PFAAs), and the volatile methyl siloxanes (VMS) are also reported. Ethyl perfluorooctane sulfonamide (EtFOSA) was detected at highly elevated concentrations in Brazil and Colombia, in line with the use of the pesticide sulfluramid in this region. Similar concentrations of the perfluoroalkyl sulfonates (PFAS) were detected throughout the GRULAC region regardless of location type, and the VMS concentrations in air increased with the population density of sampling locations. This is the first report of atmospheric concentrations of the PFAAs and VMS from this region.

  18. From magic to science: a journey throughout Latin American medical mycology.

    PubMed

    San-Blas, G

    2000-01-01

    The start of Latin America's love story with fungi may be placed in pre-Hispanic times, when the use of fungi in both ritual ceremonies and daily life was common to the native civilizations. But the medical mycology discipline in Latin America started at the end of the 19th Century. At that time, scholars such as A. Posadas, R. Seeber, A. Lutz and P. Almeida discovered agents of fungal diseases, the study of which has influenced the regional research ever since. Heirs to them are the researchers who today thrive in regional Universities and Research Institutes. Two current initiatives improve cooperation among Latin American medical mycologists. First, the periodical organization of International Paracoccidioidomycosis Meetings (seven so far, from 1979 to 1999); second, the creation of the Latin American Association for Mycology in 1991 (three Congresses, from 1993 to 1999). Latin American publications have increased in international specialized journals such as that from our Society (ISHAM) (from 8% in 1967 to 19% in 1999), and the Iberoamerican Journal of Mycology (Revista Iberoamericana de Micologia; > 40% from 1997 to 1999). In addition, Latin American participation at ISHAM International Congresses has risen from 6.9% in 1975 to 21.3% in 1997, and 43.2% at the 14th ISHAM Congress, held for the first time in a Latin American country, Argentina. A significant contribution of women to the scientific establishment of Latin American medical mycology (e.g., in 1987, 45% of Latin American papers published in the Journal of Medical and Veterinary Mycology had women as authors or coauthors, vs. 18% for other regions) suggests a better academic consideration of Latin American women compared with their counterparts in the developed world. Taken together, all these figures reflect the enthusiasm of our Latin American colleagues in the field, despite the difficulties that afflict our region and affect our work.

  19. Education, Adjustment, and Democracy in Latin America. Development Discussion Paper No. 363.

    ERIC Educational Resources Information Center

    Reimers, Fernando M.

    This document examines changes in Latin American economies and educational systems during the 1980s and the responses of the Latin American democracies. Following a description of changes in Latin American public expenditures in the 1980s and subsequent adjustments in education expenditures, the dynamics of the adjustment in Costa Rica and…

  20. "Dance for Your Health": Exploring Social Latin Dancing for Community Health Promotion

    ERIC Educational Resources Information Center

    Iuliano, Joseph E.; Lutrick, Karen; Maez, Paula; Nacim, Erika; Reinschmidt, Kerstin

    2017-01-01

    The goal of "Dance for Your Health" was to explore the relationship between social Latin dance and health as described by members of the Tucson social Latin dance community. Social Latin dance was selected because of the variety of dances, cultural relevance and popularity in Tucson, and the low-key, relaxed atmosphere. Dance has been…

  1. Latin America: A Filmic Approach. Latin American Studies Program, Film Series No. 1.

    ERIC Educational Resources Information Center

    Campbell, Leon G.; And Others

    This document describes a university course designed to provide an historical understanding of Latin America through feature films. The booklet contains an introductory essay on the teaching of a film course on Latin America, a general discussion of strengths and weaknesses of student analyses of films, and nine analyses written by students during…

  2. She Doesn't Even Act Mexican: Smartness Trespassing in the New South

    ERIC Educational Resources Information Center

    Carrillo, Juan F.; Rodriguez, Esmeralda

    2016-01-01

    Historically, Latin@s have experienced deficit thinking as it relates to their perceived academic abilities. In North Carolina's emerging Latin@ population, the first waves of K-12 students are writing a new chapter in the state's history. Yet, there is a dearth of research that focuses on the identities of academically successful Latin@ students…

  3. LATIN FOR SECONDARY SCHOOLS (A GUIDE TO MINIMUM ESSENTIALS).

    ERIC Educational Resources Information Center

    LEAMON, M. PHILLIP; AND OTHERS

    A SET OF MINIMUM ESSENTIALS FOR EACH LEVEL OF A 4-YEAR SEQUENCE OF LATIN IN SECONDARY SCHOOLS IS PRESENTED IN THIS CURRICULUM GUIDE. FOLLOWING STATEMENTS OF THE OBJECTIVES OF LATIN STUDY--READING THE LATIN OF THE GREAT ROMAN AUTHORS, ATTAINING A LINGUISTIC PROFICIENCY, AND ACQUIRING A WIDER HISTORICAL AND CULTURAL AWARENESS--THE GUIDE OUTLINES FOR…

  4. Word Power through Latin; A Curriculum Resource.

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph

    This curriculum guide is intended to assist Latin teachers in the School District of Philadelphia in achieving one of the goals of Latin instruction: the development of word power in English through a structured study of Latin roots and affixes. The guide may be used in two different ways. First, it may form the basis for a separate…

  5. Machismo and Virginidad: Sex Roles in Latin America. Discussion Paper 79-10.

    ERIC Educational Resources Information Center

    Quinones, Julio

    The purpose of this paper is to present a view of Latin American males and females that describes the situation in Latin America more accurately than the current stereotypical view accepted in the United States. The author discusses the roots of the North American misconception, citing differences between Latin American and North American cultures…

  6. Effective promotion of breastfeeding among Latin American women newly immigrated to the United States.

    PubMed

    Denman-Vitale, S; Murillo, E K

    1999-07-01

    Across the United States, advanced practice nurses (APNs) are increasingly encountering recently immigrated Latin American populations. This article provides an overview of the situation of Latin Americans in the United States and discusses aspects of Latin American culture such as respeto (respect), confianza (confidence), the importance of family, and the value of a personal connection. Strategies that will assist practitioners in incorporating culturally holistic principles in the promotion of breastfeeding among Latin American women who are new arrivals in the United States are described. If practitioners are to respond to the increasing numbers of Latin American women who need health care services, and also provide thorough, holistic health care, then health care activities must be integrated with cultural competence.

  7. Emotion Socialization and Ethnicity: An Examination of Practices and Outcomes in African American, Asian American, and Latin American Families

    PubMed Central

    Morelen, Diana; Thomassin, Kristel

    2013-01-01

    The current review paper summarizes the literature on parental emotion socialization in ethnically diverse families in the United States. Models of emotion socialization have been primarily developed using samples of European American parents and children. As such, current categorizations of “adaptive” and “maladaptive” emotion socialization practices may not be applicable to individuals from different ethnic backgrounds. The review examines current models of emotion socialization, with particular attention paid to the demographic breakdown of the studies used to develop these models. Additionally, the review highlights studies examining emotion socialization practices in African American, Asian American, and Latin American families. The review is synthesized with summarizing themes of similarities and differences across ethnic groups, and implications for culturally sensitive research and practice are discussed. PMID:23766738

  8. A unified model of the standard genetic code.

    PubMed

    José, Marco V; Zamudio, Gabriel S; Morgado, Eberto R

    2017-03-01

    The Rodin-Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but they are biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N any of them). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether these results cannot be attained, neither in two nor in three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model.
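
    The 6D-hypercube statement above has a simple concrete reading: assign each nucleotide two bits, so every codon becomes a 6-bit string, i.e. a vertex of the six-dimensional hypercube, and single-bit changes correspond to hypercube edges. The particular bit assignment below is an arbitrary illustrative choice, not the encoding used in the paper.

```python
# Codons as vertices of the 6D hypercube (illustrative encoding).
from itertools import product

BITS = {"C": (0, 0), "U": (0, 1), "G": (1, 0), "A": (1, 1)}   # assumed 2-bit encoding

def codon_to_vertex(codon):
    return tuple(bit for base in codon for bit in BITS[base])

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

codons = ["".join(c) for c in product("ACGU", repeat=3)]
assert len(codons) == 64 == 2 ** 6                # 64 codons <-> 64 hypercube vertices

# AUG and GUG differ in one bit under this encoding, i.e. they are adjacent vertices.
print(hamming(codon_to_vertex("AUG"), codon_to_vertex("GUG")))
```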

  9. Optimizing Radiometric Processing and Feature Extraction of Drone Based Hyperspectral Frame Format Imagery for Estimation of Yield Quantity and Quality of a Grass Sward

    NASA Astrophysics Data System (ADS)

    Näsi, R.; Viljanen, N.; Oliveira, R.; Kaivosoja, J.; Niemeläinen, O.; Hakala, T.; Markelin, L.; Nezami, S.; Suomalainen, J.; Honkavaara, E.

    2018-04-01

    Light-weight 2D format hyperspectral imagers operable from unmanned aerial vehicles (UAV) have become common in various remote sensing tasks in recent years. Using these technologies, the area of interest is covered by multiple overlapping hypercubes, in other words, multiview hyperspectral photogrammetric imagery, and each object point appears in many, even tens of, individual hypercubes. The common practice is to calculate hyperspectral orthomosaics utilizing only the most nadir areas of the images. However, the redundancy of the data gives potential for much more versatile and thorough feature extraction. We investigated various options for extracting spectral features in the grass sward quantity evaluation task. In addition to the various sets of spectral features, we used photogrammetry-based ultra-high density point clouds to extract features describing the canopy 3D structure. A machine learning technique based on the Random Forest algorithm was used to estimate the fresh biomass. Results showed high accuracies for all investigated feature sets. Estimation using the multiview data provided approximately 10 % better results than the most nadir orthophotos. The utilization of the photogrammetric 3D features improved estimation accuracy by approximately 40 % compared to approaches where only spectral features were applied. The best estimation RMSE of 239 kg/ha (6.0 %) was obtained with the multiview anisotropy-corrected data set and the 3D features.
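
    The estimation step described above (a Random Forest regressor fed with combined spectral and photogrammetric 3D features, evaluated by RMSE) can be sketched as follows. The synthetic features and biomass values are placeholders for the real sward dataset, and the feature construction is deliberately simplified.

```python
# Random Forest biomass regression on combined spectral + 3D features (sketch).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
spectral = rng.normal(size=(n, 30))            # e.g. band reflectances / spectral indices
structure = rng.normal(size=(n, 5))            # e.g. canopy-height percentiles from point clouds
X = np.hstack([spectral, structure])
biomass = 2000 + 300 * structure[:, 0] + 150 * spectral[:, 3] + rng.normal(0, 100, n)  # kg/ha

X_tr, X_te, y_tr, y_te = train_test_split(X, biomass, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
print(f"RMSE: {rmse:.0f} kg/ha ({rmse / y_te.mean() * 100:.1f} %)")
```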

  10. A multi-satellite orbit determination problem in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Deakyne, M. S.; Anderle, R. J.

    1988-01-01

    The Engineering Orbit Analysis Unit at GE Valley Forge used an Intel Hypercube Parallel Processor to investigate the performance of parallel processors, and to gain experience with them, on a multi-satellite orbit determination problem. A general study was selected in which major blocks of computation for the multi-satellite orbit computations were used as units to be assigned to the various processors on the Hypercube. Problems encountered or successes achieved in addressing the orbit determination problem would then be more likely to transfer to other parallel processors. The prime objective was to study the algorithm so as to allow processing of observations later in time than those employed in the state update. Expertise in ephemeris determination was exploited in addressing these problems, and this familiarity was used to bring a realism to the study that would highlight problems which might not otherwise be anticipated. Secondary objectives were to gain experience with a non-trivial problem in a parallel processing environment; to explore the necessary interplay of serial and parallel sections of the algorithm through timing studies; and to explore granularity (coarse vs. fine grain) in order to find the limit above which there is a risk of starvation, with the majority of nodes idle, and below which the overhead associated with splitting the problem may require more work and communication time than is useful.

  11. Public Policies and Interventions for Diabetes in Latin America: a Scoping Review.

    PubMed

    Kaselitz, Elizabeth; Rana, Gurpreet K; Heisler, Michele

    2017-08-01

    Successful interventions are needed to diagnose and manage type 2 diabetes (T2DM) in Latin America, a region that is experiencing a significant rise in rates of T2DM. Complementing an earlier review exploring diabetes prevention efforts in Latin America, this scoping review examines the literature on (1) policies and governmental programs intended to improve diabetes diagnosis and treatment in Latin America and (2) interventions to improve diabetes management in Latin America. It concludes with a brief discussion of promising directions for future research. Governmental policies and programs for the diagnosis and treatment of diabetes in different Latin American countries have been implemented, but their efficacy to date has not been rigorously evaluated. There are some promising intervention approaches in Latin America to manage diabetes that have been evaluated. Some of these utilize multidisciplinary teams, a relatively resource-intensive approach difficult to replicate in low-resource settings. Other evaluated interventions in Latin America have successfully leveraged mobile health tools, trained peer volunteers, and community health workers (CHWs) to improve diabetes management and outcomes. There are some promising approaches and large-scale governmental efforts underway to curb the growing burden of type 2 diabetes in Latin America. While some of these interventions have been rigorously evaluated, further research is warranted to determine their effectiveness, cost, and scalability in this region.

  12. Foreign Direct Investment and Trade Openness: The Case of Developing Economies

    ERIC Educational Resources Information Center

    Liargovas, Panagiotis G.; Skandalis, Konstantinos S.

    2012-01-01

    This paper examines the importance of trade openness for attracting Foreign Direct Investment (FDI) inflows, using a sample of 36 developing economies for the period 1990-2008. It provides a direct test of causality between FDI inflows, trade openness and other key variables in developing regions of the world: Latin America, Asia, Africa, CIS…

  13. Risk and Resilience in Rural Communities: The Experiences of Immigrant Latina Mothers

    ERIC Educational Resources Information Center

    Raffaelli, Marcela; Tran, Steve P.; Wiley, Angela R.; Galarza-Heras, Maria; Lazarevic, Vanja

    2012-01-01

    Immigrants from Latin America are increasingly settling in rural U.S. communities that welcome them as workers but are often unprepared to address their needs and promote their well-being. Building on recent descriptive studies, we examined factors associated with individual and family well-being in a sample of 112 immigrant Latina mothers (mean…

  14. La Biblioteca Latino Americana: User Survey (San Jose Public Library). Studies in Librarianship No. 2.

    ERIC Educational Resources Information Center

    Johnstone, James C.; And Others

    To assist a neighborhood committee in applying for federal funding of a bilingual/bicultural library with a distinct Latin American emphasis, a student research group from San Jose State University designed and administered a bilingual questionnaire to a stratified sample of 400 households in the Gardner District of San Jose, California. The…

  15. "Should We Carry Our Passports with Us?": Resisting Violence against Latin@s in Light of Political Rhetoric

    ERIC Educational Resources Information Center

    Bondy, Jennifer M.

    2017-01-01

    In this article, Jennifer Bondy argues that educators should teach students about the emotions that drive racialized violence. In order to do this, she focuses on political rhetoric that locates Latin@s as illegal aliens, criminals, and questionably American. She begins with her professional experiences as a social studies teacher who has worked…

  16. From Latin Americans to Latinos: Latin American Immigration in US: The Unwanted Children

    ERIC Educational Resources Information Center

    Moraña, Ana

    2007-01-01

    It is my understanding that Latin American immigrants in the United States, during the contested process of becoming Latinos (US citizens or the offspring of Latin Americans born in US) are for the most part socially portrayed as unwanted, messy children who need to be educated before they can become American citizens. Whether they can be called…

  17. Latin American Culture Studies: Information and Materials for Teaching About Latin America. Revised Edition.

    ERIC Educational Resources Information Center

    Glab, Edward, Jr., Ed.

    This resource manual provides ideas, lesson plans, course outlines, arts and crafts projects, games, and other materials for teaching K-12 students about Latin America. A major objective is to help students understand and appreciate the diverse Latin American culture. There are six chapters in this volume. Chapter one discusses key ideas that can…

  18. History of primary vasculitis in Latin America.

    PubMed

    Iglesias Gammara, Antonio; Coral, Paola; Quintana, Gerardo; Toro, Carlos E; Flores, Luis Felipe; Matteson, Eric L; Restrepo, José Félix

    2010-03-01

    A literature review utilizing Fepafem, Bireme, LiLacs, Scielo Colombia, Scielo Internacional, former MedLine, Pubmed, and BVS Colombia as well as manual searches in the libraries of major Latin American universities was performed to study vasculitis in Latin America. Since 1945, a total of 752 articles have been published by Latin American authors. However, only a minority are devoted to primary vasculitides, and even fewer have been published in indexed journals. Approximately 126 are in OLD, Medline, Pubmed, Bireme, and Scielo. Most publications are from Mexico, followed by Brazil and Colombia. Systematic studies of the epidemiology of primary idiopathic vasculitis are available for a few countries, i.e. Brazil, Mexico, Colombia, Chile, and Peru. Takayasu arteritis and ANCA-associated vasculitis are the best studied forms of vasculitis in Latin America. Interest and expertise in vasculitis is growing in Latin America, as reflected in the increased number of published articles from this region of the world in the last decade. Racial and environmental factors are possibly responsible for the differential expression of various types of primary vasculitis observed in Latin America. With time, the unique features, epidemiology, and better treatment strategies for idiopathic vasculitides in Latin America will emerge.

  19. The history of Latin teeth names.

    PubMed

    Šimon, František

    2015-01-01

    This paper aims to give an account of the Latin naming of the different types of teeth by reviewing relevant historical and contemporary literature. The paper presents etymologies of Latin or Greek teeth names, their development, variants and synonyms, and sometimes the names of their authors. The Greek names did not have the status of official terms, but the Latin terms for particular types of teeth gradually established themselves. Names for the incisors, canines and molars are Latin calques for the Greek ones (tomeis, kynodontes, mylai), dens serotinus is an indirect calque of the Greek name (odús) opsigonos, and the term pre-molar is created in the way which is now common in modern anatomical terminology, using the prefix prae- = pre and the adjective molaris. The Latin terms dentes canini and dentes molares occur in the Classical Latin literature, the term (dentes) incisivi is found first time in medieval literature, and the terms dentes premolares and dens serotinus are modern-age ones.

  20. Founder effects and stochastic dispersal at the continental scale of the fungal pathogen of bananas Mycosphaerella fijiensis.

    PubMed

    Rivas, Gonzalo-Galileo; Zapater, Marie-Françoise; Abadie, Catherine; Carlier, Jean

    2004-02-01

    The worldwide destructive epidemic of the fungus Mycosphaerella fijiensis on banana started recently, spreading from South-East Asia. The founder effects detected in the global population structure of M. fijiensis reflected rare migration events among continents through movements of infected plant material. The main objective of this work was to infer gene flow and dispersal processes of M. fijiensis at the continental scale from population structure analysis in recently invaded regions. Samples of isolates were collected from banana plantations in 13 countries in Latin America and the Caribbean and in Africa. The isolates were analysed using polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) and microsatellite molecular markers. The results indicate that a high level of genetic diversity was maintained at the plantation and the plant scales. The loci were at gametic equilibrium in most of the samples analysed, supporting the hypothesis of the existence of random-mating populations of M. fijiensis, even at the plant scale. A low level of gene diversity was observed in some populations from the Africa and Latin America-Caribbean regions. Nearly half the populations analysed showed a significant deviation from mutation-drift equilibrium with gene diversity excess. Finally, a high level of genetic differentiation was detected between populations from Africa (FST = 0.19) and from the Latin America-Caribbean region (FST = 0.30). These results show that founder effects accompanied the recent invasion of M. fijiensis in both regions, suggesting stochastic spread of the disease at the continental scale. This spread might be caused by either the limited dispersal of ascospores or by movements of infected plant material.
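
    The differentiation statistics quoted above (FST = 0.19 and 0.30) can be illustrated with a back-of-the-envelope calculation of Nei's FST (GST) for a single biallelic locus from per-population allele frequencies. The frequencies below are invented for the example, and equal population sizes are assumed.

```python
# Nei's FST (GST) for one biallelic locus across several populations (sketch).
import numpy as np

p = np.array([0.9, 0.5, 0.2])             # allele frequency in each sampled population
h_s = np.mean(2 * p * (1 - p))            # mean within-population expected heterozygosity
p_bar = p.mean()
h_t = 2 * p_bar * (1 - p_bar)             # total expected heterozygosity from the pooled frequency
fst = (h_t - h_s) / h_t
print(f"FST = {fst:.2f}")                 # larger values indicate stronger differentiation
```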
