Sample records for conditional Latin hypercube

  1. Optimization of an Advanced Multi-Junction Solar-Cell Design for Space Environments (AM0) Using Nearly Orthogonal Latin Hypercubes

    DTIC Science & Technology

    2017-06-01

    Thesis by Silvio Pueschel, June 2017. The work models advanced multi-junction solar cells for space environments (AM0) with Silvaco Atlas simulation software and introduces the nearly orthogonal Latin hypercube (NOLH) design of experiments (DoE)...

  2. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS-selected samples. Spatial statistical results indicate that, in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples or VQT samples. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.

  3. A Novel Latin Hypercube Algorithm via Translational Propagation

    PubMed Central

    Pan, Guang; Ye, Pengcheng

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost of constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration (TPSLE) algorithm is developed without using formal optimization. The TPSLE algorithm is based on the insight that a near-optimal Latin hypercube design can be constructed by translating a simple initial block of a few points, generated by the successive local enumeration (SLE) algorithm, as a building block. In effect, the TPSLE algorithm offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time while retaining acceptable space-filling and projective properties. PMID:25276844

  4. Remote sensing data with the conditional latin hypercube sampling and geostatistical approach to delineate landscape changes induced by large chronological physical disturbances.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Wang, Cheng-Long; Yu, Hsiao-Hsuan; Wang, Yung-Chieh

    2009-01-01

    This study applies variogram analyses of normalized difference vegetation index (NDVI) images derived from SPOT HRV images obtained before and after the Chi-Chi earthquake in the Chenyulan watershed, Taiwan, as well as images after four large typhoons, to delineate the spatial patterns, spatial structures and spatial variability of landscapes caused by these large disturbances. The conditional Latin hypercube sampling approach was applied to select samples from multiple NDVI images. Kriging and sequential Gaussian simulation with sufficient samples were then used to generate maps of NDVI images. The variography of the NDVI images demonstrates that spatial patterns of disturbed landscapes were successfully delineated by variogram analysis in the study areas. The high-magnitude Chi-Chi earthquake created spatial landscape variations in the study area. After the earthquake, the cumulative impacts of typhoons on landscape patterns depended on the magnitudes and paths of the typhoons, but were not always evident in the spatiotemporal variability of landscapes in the study area. The statistics and spatial structures of multiple NDVI images were captured by 3,000 samples from 62,500 grids in the NDVI images. Kriging and sequential Gaussian simulation with the 3,000 samples effectively reproduced spatial patterns of the NDVI images. Overall, the proposed approach, which integrates the conditional Latin hypercube sampling approach, variograms, kriging and sequential Gaussian simulation of remotely sensed images, efficiently monitors, samples and maps the effects of large chronological disturbances on spatial characteristics of landscape changes, including spatial variability and heterogeneity.

  5. Progressive Sampling Technique for Efficient and Robust Uncertainty and Sensitivity Analysis of Environmental Systems Models: Stability and Convergence

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, R.; Hosseini, N.; Razavi, S.

    2016-12-01

    Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models several times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides an increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; by contrast, PLHS generates a series of smaller sub-sets (also called `slices') such that: (1) each sub-set is a Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the union of progressively added sub-sets remains a Latin hypercube; and thus (3) the entire sample set is a Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over the existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by only running the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
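
    For readers who want the invariant in concrete terms, here is a minimal Python sketch (the function name and the [0,1) scaling are our assumptions, not the authors' code) that checks the Latin hypercube property PLHS preserves, and shows that independently generated slices do not preserve it on their own:

```python
import numpy as np
from scipy.stats import qmc

def is_latin_hypercube(sample):
    """Check the invariant PLHS preserves: in every 1-D projection,
    each of the n equal-probability strata contains exactly one point.
    `sample` is an (n, d) array with values in [0, 1)."""
    n, d = sample.shape
    strata = np.floor(sample * n).astype(int)   # stratum index of each point
    return all(len(np.unique(strata[:, j])) == n for j in range(d))

# Two independently generated LHS slices are each Latin hypercubes,
# but their union generally is not -- the problem PLHS addresses.
a = qmc.LatinHypercube(d=2, seed=1).random(10)
b = qmc.LatinHypercube(d=2, seed=2).random(10)
print(is_latin_hypercube(a), is_latin_hypercube(b))   # True True
print(is_latin_hypercube(np.vstack([a, b])))          # almost surely False
```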

  6. A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Wyss, Gregory Dane

    2004-07-01

    This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.

  7. Identification of stochastic interactions in nonlinear models of structural mechanics

    NASA Astrophysics Data System (ADS)

    Kala, Zdeněk

    2017-07-01

    In the paper, a polynomial approximation is presented by which Sobol sensitivity analysis can be evaluated for all sensitivity indices. The nonlinear FEM model is approximated, and the input space is mapped using simulation runs of the Latin hypercube sampling method. The domain of the approximation polynomial is chosen so that a large number of Latin hypercube simulation runs can be applied. The presented method also makes it possible to evaluate higher-order sensitivity indices, which could not be identified with the nonlinear FEM alone.

  8. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin hypercube sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.

  9. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is involved/implicated in several ocular conditions: papilledema, glaucoma and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) and probabilistic modeling (Latin Hypercube Simulations (LHS)) to consider a range of tissue properties and relevant pressures.

  10. Extension of latin hypercube samples with correlated variables.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hora, Stephen Curtis; Helton, Jon Craig; Sallaberry, Cedric J. PhD.

    2006-11-01

    A procedure for extending the size of a Latin hypercube sample (LHS) with rank correlated variables is described and illustrated. The extension procedure starts with an LHS of size m and associated rank correlation matrix C and constructs a new LHS of size 2m that contains the elements of the original LHS and has a rank correlation matrix that is close to the original rank correlation matrix C. The procedure is intended for use in conjunction with uncertainty and sensitivity analysis of computationally demanding models in which it is important to make efficient use of a necessarily limited number of model evaluations.
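
    A minimal sketch of the doubling step described above, assuming uniform marginals on [0,1) and a hypothetical function name; restoring the rank correlation matrix C, which is the paper's actual focus, is deliberately omitted:

```python
import numpy as np

def extend_lhs(sample, rng=None):
    """Extend an (m, d) Latin hypercube sample on [0,1) to size 2m.
    Each coarse stratum of width 1/m holds exactly one point, so the m
    points occupy m distinct fine strata of width 1/(2m); new points fill
    the m empty fine strata, one each, and are paired at random."""
    rng = np.random.default_rng(rng)
    m, d = sample.shape
    new_cols = []
    for j in range(d):
        occupied = np.floor(sample[:, j] * 2 * m).astype(int)
        empty = np.setdiff1d(np.arange(2 * m), occupied)  # exactly m strata
        vals = (empty + rng.random(m)) / (2 * m)          # one draw per stratum
        new_cols.append(rng.permutation(vals))            # random pairing
    return np.vstack([sample, np.column_stack(new_cols)])
```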

  11. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
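
    As a rough illustration of the propagation step (not the authors' code; the coefficient and covariate distributions below are invented placeholders), model error and data error can be sampled together with a Latin hypercube and prediction intervals read off the resulting probabilities:

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical one-covariate logistic model: P = 1 / (1 + exp(-(b0 + b1*x))).
# Coefficient uncertainty (model error) and covariate uncertainty (data error)
# are sampled jointly with a Latin hypercube, mirroring the paper's approach.
u = qmc.LatinHypercube(d=3, seed=1).random(n=1000)
b0 = norm(loc=-2.0, scale=0.3).ppf(u[:, 0])   # assumed intercept distribution
b1 = norm(loc=0.8, scale=0.1).ppf(u[:, 1])    # assumed slope distribution
x = norm(loc=1.5, scale=0.5).ppf(u[:, 2])     # assumed covariate distribution
p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))      # vulnerability probabilities
print(np.percentile(p, [5, 50, 95]))          # prediction interval summary
```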

  12. Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping.

    NASA Astrophysics Data System (ADS)

    Hamalainen, Sampsa; Geng, Xiaoyuan; He, Juanxia

    2017-04-01

    The Latin Hypercube Sampling (LHS) approach to assist with Digital Soil Mapping has been in development for some time; the purpose of this work was to complement LHS with the use of multiple spatial resolutions of covariate datasets and variability in the range of sampling points produced. This allowed specific sets of LHS points to be produced to fulfil the needs of various partners from multiple projects working in the Ontario and Prince Edward Island provinces of Canada. Secondary soil and environmental attributes are critical inputs required in the development of sampling points by LHS. These include a required Digital Elevation Model (DEM) and subsequent covariate datasets produced by a digital terrain analysis performed on the DEM. These additional covariates often include, but are not limited to, Topographic Wetness Index (TWI), Length-Slope (LS) Factor, and Slope, which are continuous data. The number of points created in LHS ranged from 50 to 200, depending on the size of the watershed and, more importantly, the number of soil types found within it. The spatial resolution of the covariates included within the work ranged from 5 to 30 m. The iterations within the LHS sampling were run at an optimal level so that the LHS model provided a good spatial representation of the environmental attributes within the watershed. Additional covariates that are categorical in nature, such as external surficial geology data, were also included in the Latin hypercube sampling approach. Initial results of the work include using 1,000 iterations within the LHS model; 1,000 iterations was consistently a reasonable value for producing sampling points that gave a good spatial representation of the environmental attributes. When working at the same spatial resolution of covariates but modifying the desired number of sampling points produced, the change in point location showed a strong geospatial relationship when using continuous data. Access to agricultural fields and adjacent land uses is often cited as the greatest deterrent to performing soil sampling for both soil survey and soil attribute validation work. The lack of access can result from poor road access and/or geographical conditions that are difficult for field crews to navigate. This is a simple yet persistent issue for the scientific community and, in particular, soils professionals. The ability to ease access to sampling points will be a future contribution to the Latin Hypercube Sampling (LHS) approach: by removing inaccessible locations from the DEM in the initial instance, the LHS model can be restricted to locations with access from an adjacent road or trail. To further the approach, a road network geospatial dataset can be included within Geographic Information Systems (GIS) applications to reach already-produced points using a shortest-distance network method.

  13. Analysis of hepatitis C viral dynamics using Latin hypercube sampling

    NASA Astrophysics Data System (ADS)

    Pachpute, Gaurav; Chakrabarty, Siddhartha P.

    2012-12-01

    We consider a mathematical model comprising four coupled ordinary differential equations (ODEs) to study hepatitis C viral dynamics. The model includes the efficacies of a combination therapy of interferon and ribavirin. There are two main objectives of this paper. The first is to approximate the percentage of cases in which there is viral clearance in the absence of treatment, as well as the percentage of response to treatment at various efficacy levels. The other is to better understand and identify the parameters that play a key role in the decline of viral load and can be estimated in a clinical setting. A condition for the stability of the uninfected and the infected steady states is presented. A large number of sample points for the model parameters (which are physiologically feasible) are generated using Latin hypercube sampling. An analysis of the simulated values identifies that approximately 29.85% of cases result in clearance of the virus during the early phase of the infection. Results from the χ² and Spearman's tests on the samples indicate a distinctly different distribution of certain parameters for the cases exhibiting viral clearance under the combination therapy.

  14. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions, so the LHS UNIX Library/Standalone provides a way to generate multivariate samples. The LHS samples can be generated either from a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each variable are paired in a random manner with the values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
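
    The mechanics described above are easy to sketch. The following minimal Python illustration mirrors the same scheme (equal-probability intervals, one random draw per interval, random pairing across variables); it is not the Sandia Fortran implementation and omits the restricted-pairing correlation control:

```python
import numpy as np
from scipy.stats import norm

def lhs(n, dists, rng=None):
    """Basic Latin hypercube sampling: for each variable, split [0,1] into n
    equal-probability intervals, draw one uniform point per interval, map it
    through the variable's inverse CDF, then pair values across variables at
    random (no correlation control, unlike the LHS library)."""
    rng = np.random.default_rng(rng)
    samples = []
    for dist in dists:
        u = (np.arange(n) + rng.random(n)) / n        # one point per stratum
        samples.append(rng.permutation(dist.ppf(u)))  # random pairing
    return np.column_stack(samples)

# Example: two normal inputs sampled simultaneously.
X = lhs(100, [norm(0, 1), norm(5, 2)], rng=0)
print(X.mean(axis=0), X.std(axis=0))
```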

  15. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.

  16. Convoy Protection under Multi-Threat Scenario

    DTIC Science & Technology

    2017-06-01

    Report excerpt: subject terms include antisubmarine warfare, convoy protection, screening, design of experiments, and agent-based simulation. The table of contents lists scenarios 33-36 (Red submarine tactic 2), a design-of-experiments chapter featuring a nearly orthogonal Latin hypercube design, and a data analysis chapter.

  17. FORMAL UNCERTAINTY ANALYSIS OF A LAGRANGIAN PHOTOCHEMICAL AIR POLLUTION MODEL. (R824792)

    EPA Science Inventory

    This study applied Monte Carlo analysis with Latin hypercube sampling to evaluate the effects of uncertainty in air parcel trajectory paths, emissions, rate constants, deposition affinities, mixing heights, and atmospheric stability on predictions from a vertically...

  18. Improvements in sub-grid, microphysics averages using quadrature based approaches

    NASA Astrophysics Data System (ADS)

    Chowdhary, K.; Debusschere, B.; Larson, V. E.

    2013-12-01

    Sub-grid variability in microphysical processes plays a critical role in atmospheric climate models. In order to account for this sub-grid variability, Larson and Schanen (2013) propose placing a probability density function on the sub-grid cloud microphysics quantities, e.g. autoconversion rate, essentially interpreting the cloud microphysics quantities as a random variable in each grid box. Random sampling techniques, e.g. Monte Carlo and Latin Hypercube, can be used to calculate statistics, e.g. averages, on the microphysics quantities, which then feed back into the model dynamics on the coarse scale. We propose an alternate approach using numerical quadrature methods based on deterministic sampling points to compute the statistical moments of microphysics quantities in each grid box. We have performed a preliminary test on the Kessler autoconversion formula, and, upon comparison with Latin Hypercube sampling, our approach shows an increased level of accuracy with a reduction in sample size by almost two orders of magnitude. Application to other microphysics processes is the subject of ongoing research.
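
    A hedged sketch of the comparison described above, for a Kessler-type threshold autoconversion rate with made-up sub-grid statistics (the rate constant, threshold, and moments below are placeholder assumptions): it contrasts a 16-point Gauss-Hermite estimate of the sub-grid average with a Latin hypercube estimate using the same budget.

```python
import numpy as np
from scipy.stats import norm, qmc

# E[f(q)] for a Kessler-type autoconversion rate f(q) = k * max(q - q_crit, 0),
# with sub-grid moisture q ~ N(mu, sigma^2).
k, q_crit = 1e-3, 0.5e-3      # assumed rate constant and threshold
mu, sigma = 0.6e-3, 0.2e-3    # assumed sub-grid mean and spread

f = lambda q: k * np.maximum(q - q_crit, 0.0)

# Gauss-Hermite quadrature: E[f] = (1/sqrt(pi)) * sum_i w_i f(mu + sqrt(2)*sigma*x_i)
x, w = np.polynomial.hermite.hermgauss(16)
quad = np.sum(w * f(mu + np.sqrt(2) * sigma * x)) / np.sqrt(np.pi)

# Latin hypercube estimate with the same number of evaluations
u = qmc.LatinHypercube(d=1, seed=0).random(16).ravel()
lhs = f(norm(mu, sigma).ppf(u)).mean()
print(quad, lhs)
```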

  19. Comparison of Drainmod Based Watershed Scale Models

    Treesearch

    Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya

    2004-01-01

    Watershed scale hydrology and water quality models (DRAINMOD-DUFLOW, DRAINMOD-W, DRAINMOD-GIS and WATGIS) that describe the nitrogen loadings at the outlet of poorly drained watersheds were examined with respect to their accuracy and uncertainty in model predictions. Latin Hypercube Sampling (LHS) was applied to determine the impact of uncertainty in estimating field...

  20. Extending Orthogonal and Nearly Orthogonal Latin Hypercube Designs for Computer Simulation and Experimentation

    DTIC Science & Technology

    2006-12-01

    Report excerpt: the code was initially developed to be run within the NetBeans IDE 5.04 running J2SE 5.0; during the course of the development, Eclipse SDK 3.2... covers the results from the research. Chapter V concludes and recommends future research.

  1. Comparison of empirical strategies to maximize GENEHUNTER lod scores.

    PubMed

    Chen, C H; Finch, S J; Mendell, N R; Gordon, D

    1999-01-01

    We compare four strategies for finding the settings of genetic parameters that maximize the lod scores reported in GENEHUNTER 1.2. The four strategies are iterated complete factorial designs, iterated orthogonal Latin hypercubes, evolutionary operation, and numerical optimization. The genetic parameters that are set are the phenocopy rate, penetrance, and disease allele frequency; both recessive and dominant models are considered. We selected the optimization of a recessive model on the Collaborative Study on the Genetics of Alcoholism (COGA) data of chromosome 1 for complete analysis. Convergence to a setting producing a local maximum required the evaluation of over 100 settings (for a time budget of 800 minutes on a Pentium II 300 MHz PC). Two notable local maxima were detected, suggesting the need for a more extensive search before claiming that a global maximum had been found. The orthogonal Latin hypercube design was the best strategy for finding areas that produced high lod scores with small numbers of evaluations. Numerical optimization starting from a region producing high lod scores was the strategy that found the highest maximum observed.

  2. Influence of scanning parameters on the estimation accuracy of control points of B-spline surfaces

    NASA Astrophysics Data System (ADS)

    Aichinger, Julia; Schwieger, Volker

    2018-04-01

    This contribution deals with the influence of scanning parameters such as scanning distance, incidence angle, surface quality and sampling width on the average estimated standard deviations of the positions of control points of B-spline surfaces, which are used to model surfaces from terrestrial laser scanning data. The influence of the scanning parameters is analyzed by Monte Carlo based variance analysis. Samples were generated for uncorrelated and correlated data, using Latin hypercube and replicated Latin hypercube sampling algorithms, respectively. The investigations show that the most influential scanning parameter is the distance from the laser scanner to the object. The angle of incidence shows a significant effect for distances of 50 m and longer, while the surface quality contributes only negligible effects. The sampling width has no influence. Optimal scanning parameters are obtained at the smallest possible object distance, an angle of incidence close to 0°, and the highest surface quality. The consideration of correlations improves the estimation accuracy and underlines the importance of complete stochastic models for TLS measurements.

  3. What Friends Are For: Collaborative Intelligence Analysis and Search

    DTIC Science & Technology

    2014-06-01

    Report excerpt: subject terms include Intelligence Community, information retrieval, recommender systems, search engines, social networks, user profiling, and Lucene... improvements over existing search systems. The improvements are shown to be robust to high levels of human error and low similarity between users... Listed abbreviations include NOLH (nearly orthogonal Latin hypercubes), P@ (precision at documents), RS (recommender systems), and TREC (Text REtrieval Conference).

  4. Mixing-model Sensitivity to Initial Conditions in Hydrodynamic Predictions

    NASA Astrophysics Data System (ADS)

    Bigelow, Josiah; Silva, Humberto; Truman, C. Randall; Vorobieff, Peter

    2017-11-01

    Amagat and Dalton mixing models were studied to compare their thermodynamic predictions of shock states. Numerical simulations with the Sandia National Laboratories shock hydrodynamics code CTH modeled University of New Mexico (UNM) shock tube laboratory experiments shocking a 1:1 molar mixture of helium (He) and sulfur hexafluoride (SF6). Five input parameters were varied for sensitivity analysis: driver section pressure, driver section density, test section pressure, test section density, and mixture ratio (mole fraction). We show via incremental Latin hypercube sampling (LHS) analysis that significant differences exist between Amagat and Dalton mixing-model predictions. The differences observed in predicted shock speeds, temperatures, and pressures grow more pronounced with higher shock speeds. Supported by NNSA Grant DE-0002913.

  5. Flexible Space-Filling Designs for Complex System Simulations

    DTIC Science & Technology

    2013-06-01

    Report excerpt: ...interior of the experimental region and cannot fit higher-order models. We present a genetic algorithm that constructs space-filling designs with minimal correlations. Subject terms include computer experiments, design of experiments, genetic algorithm, Latin hypercube, response surface methodology, and nearly orthogonal designs.

  6. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
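
    The first three detection steps map directly onto standard statistical tests. A small sketch on a synthetic input/output pair (the two-phase flow model itself is not reproduced; data and bin counts are placeholders):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.random(500)                          # sampled input variable
y = x**2 + 0.1 * rng.standard_normal(500)   # synthetic model output

print(stats.pearsonr(x, y))                  # (1) linear relationship
print(stats.spearmanr(x, y))                 # (2) monotonic relationship

# (3) trend in central tendency: Kruskal-Wallis across quantile bins of x
bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
groups = [y[bins == b] for b in range(5)]
print(stats.kruskal(*groups))
```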

  7. Uncertainty quantification of resonant ultrasound spectroscopy for material property and single crystal orientation estimation on a complex part

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Mayes, Alexander; Jauriqui, Leanne; Biedermann, Eric; Heffernan, Julieanne; Livings, Richard; Goodlet, Brent; Mazdiyasni, Siamack

    2018-04-01

    A case study is presented evaluating uncertainty in resonant ultrasound spectroscopy (RUS) inversion for single-crystal (SX) Ni-based superalloy Mar-M247 cylindrical dog-bone specimens. A number of surrogate models were developed from FEM model solutions, using different sampling schemes (regular grid, Monte Carlo sampling, Latin hypercube sampling) and modeling approaches, namely N-dimensional cubic spline interpolation and Kriging. Repeated studies were used to quantify the well-posedness of the inversion problem, and the uncertainty in material property and crystallographic orientation estimates was assessed given typical geometric dimension variability in aerospace components. Surrogate model quality was found to be an important factor in inversion results when the model closely represents the test data. One important discovery was that when the model matches the test data well, a Kriging surrogate model using unsorted Latin hypercube sampled data performed as well as the best results from an N-dimensional interpolation model using sorted data. However, both surrogate model quality and mode sorting were found to be less critical when inverting properties from either experimental data or simulated test cases with uncontrolled geometric variation.

  8. Approximation of the exponential integral (well function) using sampling methods

    NASA Astrophysics Data System (ADS)

    Baalousha, Husam Musa

    2015-04-01

    The exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximate the exponential integral. The new approach is based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained by the Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology, such as the leaky aquifer integral.
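
    As a sketch of the sampling idea (plain LHS only; the OA and OA-LH designs from the paper are not reproduced), the well function can be rewritten as an integral over (0,1) via the substitution y = u/x and averaged over LHS draws, benchmarked against scipy.special.exp1:

```python
import numpy as np
from scipy.special import exp1
from scipy.stats import qmc

def well_function_lhs(u, n=20000, seed=0):
    """LHS estimate of the well function W(u) = E1(u), using the identity
    E1(u) = integral over (0,1) of exp(-u/x)/x dx (substitution y = u/x)."""
    x = qmc.LatinHypercube(d=1, seed=seed).random(n).ravel()
    x = np.clip(x, 1e-12, None)       # guard against x == 0
    return float(np.mean(np.exp(-u / x) / x))

u = 0.5
print(well_function_lhs(u), exp1(u))  # LHS estimate vs SciPy benchmark
```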

  9. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is always combined with Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models as a result of their complexity, it is always infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...

  10. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of uncertainty associated with groundwater quality models is often of critical importance, as for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of differing natures. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need for assessment of the plausible range of model outputs. An improved systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals through α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.

  11. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models, which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this relative improvement decreases with increasing number of sample points and input parameter dimensions. Since the computational time and effort for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is thus recommended.
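
    The two initial designs compared in the study differ only in where the point is placed within each stratum. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def initial_lhs(n, d, midpoint=True, rng=None):
    """Generate an initial design for OLHS optimization.
    midpoint=True places each point at the center of its stratum
    (midpoint LHS); midpoint=False draws it uniformly within the
    stratum (random LHS) -- the two options compared in the paper."""
    rng = np.random.default_rng(rng)
    ranks = np.column_stack([rng.permutation(n) for _ in range(d)])
    offset = 0.5 if midpoint else rng.random((n, d))
    return (ranks + offset) / n

print(initial_lhs(5, 2, midpoint=True, rng=0))
print(initial_lhs(5, 2, midpoint=False, rng=0))
```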

  12. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.

  13. Sensitivity analysis of static resistance of slender beam under bending

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valeš, Jan

    2016-06-08

    The paper deals with statistical and sensitivity analyses of the resistance of simply supported I-beams under bending. The resistance was solved by a geometrically nonlinear finite element method in the program ANSYS. The beams are modelled with initial geometrical imperfections following the first buckling eigenmode. The imperfections, together with the geometrical characteristics of the cross-section and the material characteristics of steel, were considered as random quantities. The Latin hypercube sampling method was applied to evaluate the statistical and sensitivity analyses of the resistance.

  14. Characterizing the optimal flux space of genome-scale metabolic reconstructions through modified latin-hypercube sampling.

    PubMed

    Chaudhary, Neha; Tøndel, Kristin; Bhatnagar, Rakesh; dos Santos, Vítor A P Martins; Puchałka, Jacek

    2016-03-01

    Genome-Scale Metabolic Reconstructions (GSMRs), along with optimization-based methods, predominantly Flux Balance Analysis (FBA) and its derivatives, are widely applied for assessing and predicting the behavior of metabolic networks upon perturbation, thereby enabling identification of potential novel drug targets and biotechnologically relevant pathways. The abundance of alternate flux profiles has led to the evolution of methods to explore the complete solution space, aiming to increase the accuracy of predictions. Herein we present a novel, generic algorithm to characterize the entire flux space of a GSMR upon application of FBA, leading to the optimal value of the objective (the optimal flux space). Our method employs modified Latin hypercube sampling (LHS) to effectively border the optimal space, followed by principal component analysis (PCA) to identify and explain the major sources of variability within it. The approach was validated with the elementary mode analysis of a smaller network of Saccharomyces cerevisiae and applied to the GSMR of Pseudomonas aeruginosa PAO1 (iMO1086). It is shown to surpass the commonly used Monte Carlo sampling (MCS) in providing a more uniform coverage for a much larger network with fewer samples. Results show that although many fluxes are identified as variable upon fixing the objective value, the majority of the variability can be reduced to several main patterns arising from a few alternative pathways. In iMO1086, the initial variability of 211 reactions could almost entirely be explained by 7 alternative pathway groups. These findings imply that the possibilities to reroute greater portions of flux may be limited within the metabolic networks of bacteria. Furthermore, the optimal flux space is subject to change with environmental conditions. Our method may be a useful device for validating the predictions made by FBA-based tools by describing the optimal flux space associated with these predictions, and thus for improving them.
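
    A sketch of the PCA step only (the LHS bordering of the optimal flux space is not reproduced; the flux matrix below is a random placeholder standing in for FBA solutions):

```python
import numpy as np

# Placeholder flux samples: 500 alternate optima x 211 variable reactions.
rng = np.random.default_rng(0)
fluxes = rng.random((500, 211))

# PCA via SVD of the centered sample matrix.
centered = fluxes - fluxes.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.cumsum(explained)[:7])   # share of variance explained by 7 components
```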

  15. Stochastic Multi-Commodity Facility Location Based on a New Scenario Generation Technique

    NASA Astrophysics Data System (ADS)

    Mahootchi, M.; Fattahi, M.; Khakbazan, E.

    2011-11-01

    This paper extends two models for the stochastic multi-commodity facility location problem. The problem is formulated as two-stage stochastic programming. As the main point of this study, a new algorithm is applied to efficiently generate scenarios for uncertain, correlated customers' demands. This algorithm uses Latin Hypercube Sampling (LHS) and a scenario reduction approach. The relation between customer satisfaction level and cost is considered in model I. A risk measure using Conditional Value-at-Risk (CVaR) is embedded into optimization model II. Here, the structure of the network contains three facility layers: plants, distribution centers, and retailers. The first-stage decisions are the number, locations, and capacities of distribution centers. In the second stage, the decisions are the amounts of production and the volumes of transportation between plants and customers.

  16. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-22

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.

  17. Estimation of the discharges of the multiple water level stations by multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kazuhiro; Miyamoto, Mamoru; Yamakage, Yuzuru; Tsuda, Morimasa; Yanami, Hitoshi; Anai, Hirokazu; Iwami, Yoichi

    2016-04-01

    This presentation addresses two aspects of parameter identification for estimating the discharges of multiple water level stations by multi-objective optimization: how to adjust the parameters to estimate the discharges accurately, and which optimization algorithms are suitable for the parameter identification. Among previous studies, one minimizes the weighted error of the discharges of multiple water level stations by single-objective optimization, while others minimize multiple error assessment functions of the discharge of a single water level station by multi-objective optimization. This presentation instead simultaneously minimizes the errors of the discharges of multiple water level stations by multi-objective optimization. The Abe River basin in Japan is targeted. The basin area is 567.0 km2. There are thirteen rainfall stations and three water level stations. Nine flood events are investigated; they occurred from 2005 to 2012 and their maximum discharges exceed 1,000 m3/s. The discharges are calculated with the PWRI distributed hydrological model. The basin is partitioned into meshes of 500 m x 500 m, and two-layer tanks are placed on each mesh. Fourteen parameters are adjusted to estimate the discharges accurately: twelve hydrological parameters and two parameters for the initial water levels of the tanks. The three objective functions are the mean squared errors between the observed and calculated discharges at the water level stations. Latin hypercube sampling is a uniform sampling algorithm; the discharges were first calculated for parameter values sampled by a simplified version of Latin hypercube sampling, and the observed discharge was found to be surrounded by the calculated discharges, suggesting that accurate estimation by parameter adjustment is possible. It is true that the discharge of a water level station can be accurately estimated by setting the parameter values optimized for that station; however, in some cases the discharge calculated with parameter values optimized for one water level station does not match the observed discharge at another station. It is important to estimate the discharges of all the water level stations with some degree of accuracy. It turns out to be possible to select parameter values from the Pareto optimal solutions under the condition that, at every water level station, the error normalized by that station's minimum error is under 3 (see the sketch below). The optimization performance of five implementations of the algorithms and a simplified version of Latin hypercube sampling are compared. The five implementations are NSGA2 and PAES from the optimization software inspyred, and MCO_NSGA2R, MOPSOCD and NSGA2R_NSGA2R from the statistical software R. NSGA2, PAES and MOPSOCD are a genetic algorithm, an evolution strategy and a particle swarm optimization, respectively. The number of evaluations of the objective functions is 10,000. The two NSGA2 implementations in R outperform the others; they are promising candidates for the parameter identification of the PWRI distributed hydrological model.
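
    The selection rule described above reduces to a simple filter over the Pareto set. A sketch with placeholder errors:

```python
import numpy as np

# Keep Pareto solutions whose error at every station is under 3x that
# station's minimum error (placeholder MSEs for 50 solutions, 3 stations).
errors = np.random.default_rng(1).random((50, 3))
normalized = errors / errors.min(axis=0)
selected = errors[(normalized < 3).all(axis=1)]
print(selected.shape)   # surviving candidate parameter sets
```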

  18. Constraining Unsaturated Hydraulic Parameters Using the Latin Hypercube Sampling Method and Coupled Hydrogeophysical Approach

    NASA Astrophysics Data System (ADS)

    Farzamian, Mohammad; Monteiro Santos, Fernando A.; Khalil, Mohamed A.

    2017-12-01

    The coupled hydrogeophysical approach has proved to be a valuable tool for improving the use of geoelectrical data for hydrological model parameterization. In the coupled approach, hydrological parameters are directly inferred from geoelectrical measurements in a forward manner to eliminate the uncertainty connected to the independent inversion of electrical resistivity data. Several numerical studies have been conducted to demonstrate the advantages of a coupled approach; however, only a few attempts have been made to apply the coupled approach to actual field data. In this study, we developed a 1D coupled hydrogeophysical code to estimate the van Genuchten-Mualem model parameters, K_s, n, θ_r and α, from time-lapse vertical electrical sounding data collected during a constant inflow infiltration experiment. van Genuchten-Mualem parameters were sampled using the Latin hypercube sampling method to provide a full coverage of the range of each parameter from their distributions. By applying the coupled approach, vertical electrical sounding data were coupled to hydrological models inferred from van Genuchten-Mualem parameter samples to investigate the feasibility of constraining the hydrological model. The key approaches taken in the study are to (1) integrate electrical resistivity and hydrological data and avoid data inversion, (2) estimate the total water mass recovery of electrical resistivity data and consider it in van Genuchten-Mualem parameter evaluation and (3) correct the influence of subsurface temperature fluctuations during the infiltration experiment on electrical resistivity data. The results of the study revealed that the coupled hydrogeophysical approach can improve the value of geophysical measurements in hydrological model parameterization. However, the approach cannot overcome the technical limitations of the geoelectrical method associated with resolution and with water mass recovery.

  19. Stochastic Investigation of Natural Frequency for Functionally Graded Plates

    NASA Astrophysics Data System (ADS)

    Karsh, P. K.; Mukhopadhyay, T.; Dey, S.

    2018-03-01

    This paper presents the stochastic natural frequency analysis of functionally graded material (FGM) plates by applying an artificial neural network (ANN) approach. Latin hypercube sampling is utilised to train the ANN model. The proposed algorithm for stochastic natural frequency analysis of FGM plates is validated and verified against the original finite element method and Monte Carlo simulation (MCS). The combined stochastic variation of input parameters such as elastic modulus, shear modulus, Poisson ratio, and mass density is considered. A power law is applied to distribute the material properties across the thickness. The present ANN model reduces the sample size and is found to be computationally efficient compared to conventional Monte Carlo simulation.
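
    A rough sketch of the surrogate idea (the FEM frequency response is replaced by a stand-in function, and scikit-learn's MLPRegressor is our assumption, not the authors' network):

```python
import numpy as np
from scipy.stats import qmc
from sklearn.neural_network import MLPRegressor

# LHS-sampled inputs: scaled stand-ins for elastic modulus, shear modulus,
# Poisson ratio, and mass density.
X = qmc.LatinHypercube(d=4, seed=0).random(200)
y = np.sin(X @ np.array([1.0, 2.0, 0.5, 1.5]))   # placeholder for FEM frequency

# Train the ANN surrogate on the LHS design, then reuse it in place of FEM.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict(X[:3]), y[:3])
```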

  20. Erosion Modeling in Central China - Soil Data Acquisition by Conditioned Latin Hypercube Sampling and Incorporation of Legacy Data

    NASA Astrophysics Data System (ADS)

    Stumpf, Felix; Schönbrodt-Stitt, Sarah; Schmidt, Karsten; Behrens, Thorsten; Scholten, Thomas

    2013-04-01

    The Three Gorges Dam at the Yangtze River in Central China is a prominent example of human-induced environmental impacts. Throughout the year the water table at the main river fluctuates by about 30 m due to impoundment and drainage activities. The dynamic water table implies a range of georisks such as soil erosion, mass movements, sediment transport and diffuse matter inputs into the reservoir. Within the framework of the joint Sino-German project YANGTZE GEO, the subproject "Soil Erosion" deals with soil erosion risks and sediment transport pathways into the reservoir. The study site is a small catchment (4.8 km²) in Badong, approximately 100 km upstream of the dam. It is characterized by scattered plots of agricultural land use and resettlements in a largely wooded, steeply sloping and mountainous area. Our research is focused on data acquisition and processing to develop a process-oriented erosion model, for which area-covering knowledge of specific soil properties in the catchment is an intrinsic input parameter. This knowledge is acquired by means of digital soil mapping (DSM), whereby soil properties are estimated from covariates and the estimation functions are calibrated with soil property samples. The DSM approach is based on an appropriate sample design that reflects the heterogeneity of the catchment with regard to the covariates influencing the relevant soil properties. In this approach the covariates, derived by digital terrain analysis, are slope, altitude, profile curvature, plan curvature, and aspect. For the development of the sample design, we chose the conditioned Latin hypercube sampling (cLHS) procedure (Minasny and McBratney, 2006). It provides an efficient method of sampling variables from their multivariate distribution: a sample of size n is drawn from multiple variables such that for each variable the sample is marginally maximally stratified. The method ensures maximal stratification by two features: first, the number of strata equals the sample size n, and second, the probability of falling in each of the strata is 1/n (McKay et al., 1979). We extended the classical cLHS-with-extremes approach (Schmidt et al., 2012) by incorporating legacy data from previous field campaigns. Instead of identifying precise sample locations by cLHS, we demarcate the multivariate attribute space of the samples based on the histogram borders of each stratum. This widens the spatial scope of the actual cLHS sample locations and allows the incorporation of legacy data lying within that scope. Furthermore, this approach provides extended potential regarding the accessibility of sample sites in the field.
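
    A greatly simplified sketch of the cLHS idea for continuous covariates (random-swap annealing on a stratum-count objective; not the Minasny-McBratney implementation, which also handles correlation and categorical terms):

```python
import numpy as np

def clhs(X, n, iters=5000, seed=0):
    """Pick n rows of covariate table X so that each covariate has roughly
    one sample per quantile stratum (the marginal stratification cLHS aims
    for), via random-swap annealing on a counts-based objective."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    edges = [np.quantile(X[:, j], np.linspace(0, 1, n + 1)) for j in range(d)]
    strata = np.column_stack([
        np.clip(np.searchsorted(edges[j], X[:, j], side="right") - 1, 0, n - 1)
        for j in range(d)])

    def objective(idx):
        # Sum over covariates of |count per stratum - 1|; 0 is perfect.
        return sum(np.abs(np.bincount(strata[idx, j], minlength=n) - 1).sum()
                   for j in range(d))

    idx = rng.choice(N, n, replace=False)
    best = objective(idx)
    for _ in range(iters):
        cand = idx.copy()
        cand[rng.integers(n)] = rng.integers(N)   # swap one sample location
        if len(set(cand)) < n:
            continue                              # reject duplicate rows
        obj = objective(cand)
        if obj <= best:
            idx, best = cand, obj
    return idx

X = np.random.default_rng(1).random((2000, 5))    # placeholder covariate table
print(clhs(X, n=20))                              # selected sample locations
```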

  1. Maximum projection designs for computer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full-dimensional space. This can result in poor projections onto lower-dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance designs to the class of Latin hypercubes can improve one-dimensional projections but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than that of a design criterion that ignores projection properties.

  2. Maximum projection designs for computer experiments

    DOE PAGES

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    2015-03-18

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full-dimensional space. This can result in poor projections onto lower-dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance designs to the class of Latin hypercubes can improve one-dimensional projections but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than that of a design criterion that ignores projection properties.

  3. Assessing performance and validating finite element simulations using probabilistic knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolin, Ronald M.; Rodriguez, E. A.

    Two probabilistic approaches for assessing performance are presented. The first approach assesses the probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure over all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.

  4. In Silico Oncology: Quantification of the In Vivo Antitumor Efficacy of Cisplatin-Based Doublet Therapy in Non-Small Cell Lung Cancer (NSCLC) through a Multiscale Mechanistic Model

    PubMed Central

    Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios

    2016-01-01

    The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastases and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant of reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work, a mechanistic model of cancer response to treatment is applied to estimate a plausible value range for the cell-killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia, and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment, and takes into account the effect of the tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine, or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has been employed to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with the molecular profiles of patients could serve as a basis for reliable personalized predictions. PMID:27657742
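
    As a sketch of the LHS/PRCC machinery referred to above (the three-parameter toy response and its ranges are assumptions for illustration, not values from the paper), the following Python fragment draws a Latin hypercube design and computes partial rank correlation coefficients against a stand-in model output.

        import numpy as np
        from scipy.stats import qmc, rankdata

        def prcc(params, output):
            # Partial rank correlation of each column of params with output.
            R = np.column_stack([rankdata(c) for c in params.T])
            y = rankdata(output)
            coeffs = []
            for j in range(R.shape[1]):
                A = np.column_stack([np.delete(R, j, axis=1), np.ones(len(y))])
                # Residuals after removing the linear effect of the other parameters.
                rx = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
                ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
                coeffs.append(np.corrcoef(rx, ry)[0, 1])
            return np.array(coeffs)

        # Hypothetical tumour-growth toy model with three LHS-sampled parameters.
        U = qmc.LatinHypercube(d=3, seed=0).random(200)
        growth_rate, growth_frac, kill_eff = (
            qmc.scale(U, [0.01, 0.1, 0.0], [0.1, 0.9, 1.0]).T
        )
        volume = growth_rate * growth_frac * np.exp(-2.0 * kill_eff)
        print(prcc(np.column_stack([growth_rate, growth_frac, kill_eff]), volume))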

  5. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    NASA Astrophysics Data System (ADS)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a software fragment used to calculate the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents a possible improvement of the results obtained by classic Monte Carlo tools: the Latin hypercube sampling (LHS) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
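
    The motivation for swapping simple sampling for LHS can be demonstrated with a short Python experiment, assuming a smooth toy integrand in place of the coordinate-metrology software under study: both estimators of the mean are unbiased, but the LHS estimate typically shows a markedly smaller standard deviation across repetitions.

        import numpy as np
        from scipy.stats import qmc

        # Toy integrand standing in for the measurand computation (an assumption).
        def f(x):
            return np.sin(np.pi * x[:, 0]) * x[:, 1] ** 2

        def estimator_std(method, n=100, reps=200, seed=0):
            rng = np.random.default_rng(seed)
            means = []
            for _ in range(reps):
                if method == "mc":
                    x = rng.random((n, 2))                         # simple sampling
                else:
                    s = int(rng.integers(2**31))
                    x = qmc.LatinHypercube(d=2, seed=s).random(n)  # stratified
                means.append(f(x).mean())
            return np.std(means)

        print("simple MC std:", estimator_std("mc"))
        print("LHS std      :", estimator_std("lhs"))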

  6. Simulation of Fault Tolerance in a Hypercube Arrangement of Discrete Processors.

    DTIC Science & Technology

    1987-12-01

    [Front matter extracted from the report (table of contents and list of figures); recoverable headings: Geometric Properties; Binary Properties; Intel Hypercube Hardware Arrangement; Cube-Connected Cycles; Properties of the CCC; CCC Redundancy; Re-Configurable Cube-Connected Cycles; Figures: Hypercubes of Different Dimensions; Hypercube Properties.]

  7. Design Optimization of a Centrifugal Fan with Splitter Blades

    NASA Astrophysics Data System (ADS)

    Heo, Man-Woong; Kim, Jin-Hyuk; Kim, Kwang-Yong

    2015-05-01

    Multi-objective optimization of a centrifugal fan with additionally installed splitter blades was performed to simultaneously maximize the efficiency and pressure rise, using three-dimensional Reynolds-averaged Navier-Stokes equations and a hybrid multi-objective evolutionary algorithm. Two design variables, defining the location of the splitter and the height ratio between the inlet and outlet of the impeller, were selected for the optimization. In addition, the aerodynamic characteristics of the centrifugal fan were investigated as the design variables varied across the design space. Latin hypercube sampling was used to select the training points, and response surface approximation models were constructed as surrogate models of the objective functions. With the optimization, both the efficiency and the pressure rise of the centrifugal fan with splitter blades were improved considerably compared to the reference model.
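
    A minimal version of this surrogate workflow, with a stand-in function in place of the RANS solver and hypothetical unit-interval design bounds, might look as follows in Python: LHS training points, a quadratic response surface fitted by least squares, and a search performed on the cheap surrogate.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import qmc

        # Stand-in for an expensive CFD evaluation of fan efficiency (hypothetical).
        def simulate(x):
            return -((x[:, 0] - 0.6) ** 2 + 2.0 * (x[:, 1] - 0.3) ** 2)

        # Quadratic response-surface basis: 1, x1, x2, x1^2, x2^2, x1*x2.
        def basis(X):
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

        X = qmc.LatinHypercube(d=2, seed=0).random(30)   # 30 LHS training points
        coef, *_ = np.linalg.lstsq(basis(X), simulate(X), rcond=None)

        # Search on the cheap surrogate rather than the CFD model itself.
        res = minimize(lambda x: float(-(basis(x[None, :]) @ coef)),
                       x0=[0.5, 0.5], bounds=[(0.0, 1.0), (0.0, 1.0)])
        print("surrogate optimum (true optimum is (0.6, 0.3)):", res.x)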

  8. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE PAGES

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.; ...

    2014-12-09

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m³ PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperatures at key locations are monitored, as is the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model can generally predict the temperature at given locations in the system, it is less successful at predicting the pressure response; the two approaches for propagating uncertainty agree with each other. The importance of each input parameter for the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters, while for the pressure response the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.

  9. When to Make Mountains out of Molehills: The Pros and Cons of Simple and Complex Model Calibration Procedures

    NASA Astrophysics Data System (ADS)

    Smith, K. A.; Barker, L. J.; Harrigan, S.; Prudhomme, C.; Hannaford, J.; Tanguy, M.; Parry, S.

    2017-12-01

    Earth and environmental models are relied upon to investigate system responses that cannot otherwise be examined. In simulating physical processes, models have adjustable parameters which may, or may not, have a physical meaning. Determining the values to assign to these model parameters is an enduring challenge for earth and environmental modellers. Selecting different error metrics by which the model's results are compared to observations will lead to different sets of calibrated model parameters, and thus different model results. Furthermore, models may exhibit 'equifinal' behaviour, where multiple combinations of model parameters lead to equally acceptable model performance against observations. These decisions in model calibration introduce uncertainty that must be considered when model results are used to inform environmental decision-making. This presentation focusses on the uncertainties that derive from the calibration of a four-parameter lumped catchment hydrological model (GR4J). The GR models contain an inbuilt automatic calibration algorithm that can satisfactorily calibrate against four error metrics in only a few seconds. However, a single, deterministic model result does not provide information on parameter uncertainty. Furthermore, a modeller interested in extreme events, such as droughts, may wish to calibrate against more low-flow-specific error metrics. In a comprehensive assessment, the GR4J model has been run with 500,000 Latin-hypercube-sampled parameter sets across 303 catchments in the United Kingdom. These parameter sets have been assessed against six error metrics, including two drought-specific metrics. This presentation compares the two approaches and demonstrates that the inbuilt automatic calibration can outperform the Latin hypercube experiment in single-metric assessed performance. However, it is also shown that the more comprehensive assessment has many merits: it allows for probabilistic model results, multi-objective optimisation, and better tailoring of the calibration to specific applications such as drought event characterisation. Modellers and decision-makers may be constrained in their choice of calibration method, so it is important that they recognise the strengths and limitations of their chosen approach.

  10. Validation of heat transfer, thermal decomposition, and container pressurization of polyurethane foam using mean value and Latin hypercube sampling approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah N.; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container in order to protect them from hostile environments or from accidents such as fire. In fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam is presented and compared to experimental results to assess the validity of a 3-D finite element model of the heat transfer and degradation processes. In this series of experiments, 320 kg/m³ PMDI foam in a 0.2 L sealed steel container is heated to 1,073 K at a rate of 150 K/min. The experiment ends when the can breaches due to the buildup of pressure. The temperatures at key locations are monitored, as is the internal pressure of the can. Both experimental uncertainty and computational uncertainty are examined and compared. The mean value (MV) method and the Latin hypercube sampling (LHS) approach are used to propagate the uncertainty through the model. The results of both the MV method and the LHS approach show that while the model can generally predict the temperature at given locations in the system, it is less successful at predicting the pressure response; the two approaches for propagating uncertainty agree with each other. The importance of each input parameter for the simulation results is also investigated, showing that for the temperature response the conductivity of the steel container and the effective conductivity of the foam are the most important parameters, while for the pressure response the activation energy, effective conductivity, and specific heat are most important. The comparison to experiments and the identification of the drivers of uncertainty allow for targeted development of the computational model and for definition of the experiments necessary to improve accuracy.

  11. Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide

    NASA Technical Reports Server (NTRS)

    Williams, Winifred I.

    1990-01-01

    This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high-performance science analysis workstation. In its current configuration, CIPE utilizes a JPL/Caltech Mark 3fp Hypercube with a Sun-4 host; its design is capable of incorporating other concurrent architectures as well. CIPE provides a programming environment that shields application programmers from various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications that use only the Sun-4 or that use the Sun-4 with the hypercube. A hypercube program will use the hypercube's data processors and, optionally, the Weitek floating-point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube-resident applications, and communicate parameters to and from the hypercube.

  12. A discrete random walk on the hypercube

    NASA Astrophysics Data System (ADS)

    Zhang, Jingyuan; Xiang, Yonghong; Sun, Weigang

    2018-03-01

    In this paper, we study the scaling of the mean first-passage time (MFPT) of random walks on the hypercube and obtain a closed-form formula for the MFPT over all node pairs. We also determine the exponent of scaling efficiency characterizing the random walks and compare it with those of existing networks. Finally, we study random walks on the hypercube with a located trap and provide a solution for the Kirchhoff index of the hypercube.
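
    The Kirchhoff index mentioned above can be checked numerically for a small hypercube; the Python sketch below (dimension d = 4 chosen arbitrarily) builds the adjacency matrix by single-bit flips and sums the reciprocals of the non-zero Laplacian eigenvalues.

        import numpy as np

        d = 4                       # hypercube dimension (arbitrary small example)
        N = 2 ** d
        A = np.zeros((N, N))
        for u in range(N):
            for j in range(d):
                A[u, u ^ (1 << j)] = 1.0   # neighbours differ in exactly one bit
        Lap = np.diag(A.sum(axis=1)) - A   # graph Laplacian
        eig = np.linalg.eigvalsh(Lap)      # ascending; eig[0] is the zero mode
        # Kirchhoff index: N times the sum of reciprocal non-zero eigenvalues.
        print("Kirchhoff index of Q_%d:" % d, N * (1.0 / eig[1:]).sum())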

  13. Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and the Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter for the simulation result is also investigated. These results show that further development of the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.

  14. Parallel and fault-tolerant algorithms for hypercube multiprocessors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aykanat, C.

    1988-01-01

    Several techniques for increasing the performance of parallel algorithms on distributed-memory message-passing multiprocessor systems are investigated. These techniques are effectively implemented for the parallelization of the Scaled Conjugate Gradient (SCG) algorithm on a hypercube-connected message-passing multiprocessor, and significant performance improvement is achieved. The SCG algorithm is used for the solution phase of an FE modeling system. Almost linear speed-up is achieved, and it is shown that the hypercube topology is scalable for this class of FE problems. The SCG algorithm is also shown to be suitable for vectorization, and near-supercomputer performance is achieved on a vector hypercube multiprocessor by exploiting both parallelization and vectorization. Fault-tolerance issues for the parallel SCG algorithm and for the hypercube topology are also addressed.

  15. Emulation and Sensitivity Analysis of the Community Multiscale Air Quality Model for a UK Ozone Pollution Episode.

    PubMed

    Beddows, Andrew V; Kitwiroon, Nutthida; Williams, Martin L; Beevers, Sean D

    2017-06-06

    Gaussian process emulation techniques have been used with the Community Multiscale Air Quality (CMAQ) model, simulating the effects of input uncertainties on ozone and NO2 output, to allow robust global sensitivity analysis (SA). A screening process ranked the effect of perturbations in 223 inputs, isolating the 30 most influential from emissions, boundary conditions (BCs), and reaction rates. CMAQ simulations of a July 2006 ozone pollution episode in the UK were made with input values for these variables, plus ozone dry deposition velocity, chosen according to a 576-point Latin hypercube design. Emulators trained on the output of these runs were used in variance-based SA of the model output with respect to the input uncertainties. Performing these analyses for every hour of a 21-day period spanning the episode, plus several days on either side, allowed the results to be presented as a time series of sensitivity coefficients, showing how the influence of different input uncertainties changed during the episode. This is one of the most complex models to which these methods have been applied, and here they reveal detailed spatiotemporal patterns of model sensitivities, with NO and isoprene emissions, NO2 photolysis, ozone BCs, and deposition velocity being among the most influential input uncertainties.
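
    The emulation step can be sketched as follows, with scikit-learn's Gaussian process regressor standing in for whatever emulator implementation the authors used; the two-input toy function and the crude main-effect screening are illustrative assumptions only.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def slow_model(x):
            # Stand-in for an expensive CMAQ-like run with two uncertain inputs.
            return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

        X = qmc.LatinHypercube(d=2, seed=0).random(64)     # 64-point LHS design
        gp = GaussianProcessRegressor(kernel=RBF([0.2, 0.2]),
                                      normalize_y=True).fit(X, slow_model(X))

        # Crude main-effect screening: sweep one input, hold the other at 0.5.
        grid, base = np.linspace(0.0, 1.0, 50), np.full(50, 0.5)
        for j in range(2):
            pts = np.column_stack([grid if k == j else base for k in range(2)])
            print("input", j, "main-effect variance:", gp.predict(pts).var())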

  16. Fault tree models for fault tolerant hypercube multiprocessors

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Tuazon, Jezus O.

    1991-01-01

    Three candidate fault-tolerant hypercube architectures are modeled, their reliability analyses are compared, and the resulting implications of these methods of incorporating fault tolerance into hypercube multiprocessors are discussed. In the course of performing the reliability analyses, the use of HARP and fault trees in modeling sequence-dependent system behaviors is demonstrated.

  17. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE PAGES

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra; ...

    2017-11-20

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.

  18. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.

  19. Multi-objective aerodynamic shape optimization of small livestock trailers

    NASA Astrophysics Data System (ADS)

    Gilkeson, C. A.; Toropov, V. V.; Thompson, H. M.; Wilson, M. C. T.; Foxley, N. A.; Gaskell, P. H.

    2013-11-01

    This article presents a formal optimization study of the design of small livestock trailers, in which the majority of animals are transported to market in the UK. The benefits of employing a headboard fairing to reduce aerodynamic drag without compromising the ventilation of the animals' microclimate are investigated using a multi-stage process involving computational fluid dynamics (CFD), optimal Latin hypercube (OLH) design of experiments (DoE) and moving least squares (MLS) metamodels. Fairings are parameterized in terms of three design variables, and CFD solutions are obtained at 50 combinations of the design variables. Both global and local search methods are employed to locate the global minimum from metamodels of the objective functions, and a Pareto front is generated. The importance of carefully selecting an objective function is demonstrated, and optimal fairing designs, offering drag reductions in excess of 5% without compromising animal ventilation, are presented.

  20. Generalized Likelihood Uncertainty Estimation (GLUE) methodology for optimization of extraction in natural products.

    PubMed

    Maulidiani; Rudiyanto; Abas, Faridah; Ismail, Intan Safinar; Lajis, Nordin H

    2018-06-01

    Optimization is an important aspect of natural product extraction. Herein, an alternative approach is proposed for optimization of extraction, namely the Generalized Likelihood Uncertainty Estimation (GLUE). The approach combines Latin hypercube sampling, the feasible ranges of the independent variables, Monte Carlo simulation, and threshold criteria for the response variables. The GLUE method is tested on three different techniques, including ultrasound-, microwave-, and supercritical CO2-assisted extractions, utilizing data from previously published reports. The study found that this method can: provide more information on the combined effects of the independent variables on the response variables in the dotty plots; deal with an unlimited number of independent and response variables; consider combined multiple threshold criteria, which can be chosen to suit the target of the investigation; and provide a range of values, with their distribution, for the optimization.
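
    Below is a condensed Python rendering of the GLUE recipe described above, with a hypothetical extraction-yield response standing in for the published models: the feasible ranges are sampled by LHS, a threshold criterion is applied to the response, and the retained behavioural parameter sets are those whose spread would populate the dotty plots.

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical yield model as a function of time (min) and temperature (C).
        def extraction_yield(time_min, temp_c):
            return 40 + 0.3 * time_min + 0.2 * temp_c - 0.002 * (temp_c - 60) ** 2

        U = qmc.LatinHypercube(d=2, seed=0).random(5000)
        time_min, temp_c = qmc.scale(U, [5, 30], [60, 90]).T   # feasible ranges
        y = extraction_yield(time_min, temp_c)

        # Threshold criterion on the response: keep the top 5% of yields.
        behavioural = y >= np.quantile(y, 0.95)
        print("behavioural sets:", behavioural.sum())
        print("time range:", time_min[behavioural].min(), time_min[behavioural].max())
        print("temp range:", temp_c[behavioural].min(), temp_c[behavioural].max())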

  1. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

    Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a high influence on lateral car dynamics. This motivates the need for robust design against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem, in which especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces, and the achieved improvements confirm the validity of the proposed procedure.

  2. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

    The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterises the basic variables, including a specification of their statistics (means and variances). The problem of correlated variables is then discussed, and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
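
    One common way to impose assumed correlations on LHS-generated variables is a Gaussian-copula transform, sketched below in Python with hypothetical marginals for two basic variables; note that the transform reproduces the target rank correlation only approximately and slightly perturbs the marginal stratification (rank-based procedures such as Iman-Conover are the more rigorous alternative).

        import numpy as np
        from scipy.stats import norm, qmc

        target_corr = np.array([[1.0, 0.6],
                                [0.6, 1.0]])
        C = np.linalg.cholesky(target_corr)

        U = qmc.LatinHypercube(d=2, seed=0).random(1000)  # independent LHS uniforms
        Z = norm.ppf(U) @ C.T                             # correlated standard normals
        V = norm.cdf(Z)                                   # correlated uniforms

        # Hypothetical marginals for two basic variables of the hydraulic model.
        roughness = 0.02 + 0.03 * V[:, 0]                 # uniform on [0.02, 0.05]
        discharge = norm.ppf(V[:, 1], loc=150.0, scale=20.0)

        ranks = lambda a: a.argsort().argsort()
        print("achieved rank correlation:",
              np.corrcoef(ranks(roughness), ranks(discharge))[0, 1])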

  3. Efficient hybrid evolutionary algorithm for optimization of a strip coiling process

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Park, Won-Woong; Kim, Dong-Kyu; Im, Yong-Taek; Bureerat, Sujin; Kwon, Hyuck-Cheol; Chun, Myung-Sik

    2015-04-01

    This article proposes an efficient metaheuristic, based on the hybridization of teaching-learning-based optimization and differential evolution, to improve the flatness of a strip during a strip coiling process. Differential evolution operators were integrated into teaching-learning-based optimization, with a Latin hypercube sampling technique used to generate the initial population. The objective function was introduced to reduce the axial inhomogeneity of the stress distribution and the maximum compressive stress, calculated by Love's elastic solution within the thin strip, which may cause an irregular surface profile of the strip during coiling. The hybrid optimizer and several well-established evolutionary algorithms (EAs) were used to solve the optimization problem. The comparative studies show that the proposed hybrid algorithm outperformed the other EAs in terms of convergence rate and consistency. The proposed hybrid approach was found to be powerful for process optimization, especially for large-scale design problems.

  4. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and on the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests, we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.

  5. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.

  6. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  7. Optimization of High-Dimensional Functions through Hypercube Evaluation

    PubMed Central

    Abiyev, Rahib H.; Tunay, Mustafa

    2015-01-01

    A novel learning algorithm for solving global numerical optimization problems is proposed. The proposed learning algorithm is an intense stochastic search method based on the evaluation and optimization of a hypercube, called the hypercube optimization (HO) algorithm. The HO algorithm comprises an initialization and evaluation process, a displacement-shrink process, and a searching space process. The initialization and evaluation process initializes the solution and evaluates the solutions in a given hypercube. The displacement-shrink process determines displacements and evaluates objective functions using new points, and the searching space process determines the next hypercube using certain rules and evaluates the new solutions. The algorithms for these processes are designed and presented in the paper. The designed HO algorithm is tested on specific benchmark functions. Simulations of the HO algorithm have been performed for optimization of functions of 1000, 5000, or even 10000 dimensions. The comparative simulation results with other approaches demonstrate that the proposed algorithm is a potential candidate for optimization of both low- and high-dimensional functions. PMID:26339237

  8. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    NASA Technical Reports Server (NTRS)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.

  9. Benchmarking hypercube hardware and software

    NASA Technical Reports Server (NTRS)

    Grunwald, Dirk C.; Reed, Daniel A.

    1986-01-01

    It was long a truism in computer systems design that balanced systems achieve the best performance. Message passing parallel processors are no different. To quantify the balance of a hypercube design, an experimental methodology was developed and the associated suite of benchmarks was applied to several existing hypercubes. The benchmark suite includes tests of both processor speed in the absence of internode communication and message transmission speed as a function of communication patterns.

  10. A Monte Carlo risk assessment model for acrylamide formation in French fries.

    PubMed

    Cummins, Enda; Butler, Francis; Gormley, Ronan; Brunton, Nigel

    2009-10-01

    The objective of this study is to estimate the likely human exposure to the group 2a carcinogen, acrylamide, from French fries by Irish consumers by developing a quantitative risk assessment model using Monte Carlo simulation techniques. Various stages in the French-fry-making process were modeled from initial potato harvest, storage, and processing procedures. The model was developed in Microsoft Excel with the @Risk add-on package. The model was run for 10,000 iterations using Latin hypercube sampling. The simulated mean acrylamide level in French fries was calculated to be 317 microg/kg. It was found that females are exposed to smaller levels of acrylamide than males (mean exposure of 0.20 microg/kg bw/day and 0.27 microg/kg bw/day, respectively). Although the carcinogenic potency of acrylamide is not well known, the simulated probability of exceeding the average chronic human dietary intake of 1 microg/kg bw/day (as suggested by WHO) was 0.054 and 0.029 for males and females, respectively. A sensitivity analysis highlighted the importance of the selection of appropriate cultivars with known low reducing sugar levels for French fry production. Strict control of cooking conditions (correlation coefficient of 0.42 and 0.35 for frying time and temperature, respectively) and blanching procedures (correlation coefficient -0.25) were also found to be important in ensuring minimal acrylamide formation.

  11. Probability-based methodology for buckling investigation of sandwich composite shells with and without cut-outs

    NASA Astrophysics Data System (ADS)

    Alfano, M.; Bisagni, C.

    2017-01-01

    The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses the development of a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method, with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material but have different stacking sequences and geometric dimensions. One of them has three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends strongly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.

  12. The Mark III Hypercube-Ensemble Computers

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Tuazon, Jesus O.; Lieberman, Don; Pniel, Moshe

    1988-01-01

    The Mark III Hypercube concept has been applied in the development of a series of increasingly powerful computers. The processor at each node of a Mark III Hypercube ensemble is a specialized computer containing three subprocessors and shared main memory. The ensemble solves a problem quickly by simultaneously processing part of the problem at each node and passing the combined results to a host computer. Disciplines benefitting from the speed and memory capacity include astrophysics, geophysics, chemistry, weather, high-energy physics, applied mechanics, image processing, oil exploration, aircraft design, and microcircuit design.

  13. Threshold conditions for integrated pest management models with pesticides that have residual effects.

    PubMed

    Tang, Sanyi; Liang, Juhua; Tan, Yuanshun; Cheke, Robert A

    2013-01-01

    Impulsive differential equations (hybrid dynamical systems) can provide a natural description of pulse-like actions such as when a pesticide kills a pest instantly. However, pesticides may have long-term residual effects, with some remaining active against pests for several weeks, months or years. Therefore, a more realistic method for modelling chemical control in such cases is to use continuous or piecewise-continuous periodic functions which affect growth rates. How to evaluate the effects of the duration of pesticide residual effectiveness on successful pest control is key to the implementation of integrated pest management (IPM) in practice. To address these questions in detail, we have modelled IPM including residual effects of pesticides in terms of fixed pulse-type actions. The stability threshold conditions for pest eradication are given. Moreover, the effects on the threshold conditions of the killing efficiency rate and decay rate of the pesticide acting on the pest and its natural enemies, of the duration of residual effectiveness, of the number of pesticide applications, and of the number of natural enemy releases are investigated with regard to the extent of depression or resurgence resulting from pulses of pesticide applications and predator releases. Latin Hypercube Sampling/Partial Rank Correlation uncertainty and sensitivity analysis techniques are employed to investigate the key control parameters which are most significantly related to the threshold values. The findings, combined with Volterra's principle, confirm that when the pesticide has a strong effect on the natural enemies, repeated use of the same pesticide can result in target pest resurgence. The results also indicate that there exists an optimal number of pesticide applications which can suppress the pest most effectively, and this may help in the design of an optimal control strategy.

  14. A parallel row-based algorithm with error control for standard-cell replacement on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Sargent, Jeff Scott

    1988-01-01

    A new row-based parallel algorithm for standard-cell placement targeted for execution on a hypercube multiprocessor is presented. Key features of this implementation include a dynamic simulated-annealing schedule, row-partitioning of the VLSI chip image, and two novel approaches to controlling error in parallel cell-placement algorithms: Heuristic Cell-Coloring and Adaptive (Parallel Move) Sequence Control. Heuristic Cell-Coloring identifies sets of noninteracting cells that can be moved repeatedly, and in parallel, with no buildup of error in the placement cost. Adaptive Sequence Control allows multiple parallel cell moves to take place between global cell-position updates. This feedback mechanism is based on an error bound derived analytically from the traditional annealing move-acceptance profile. Placement results are presented for real industry circuits, and the performance of an implementation on the Intel iPSC/2 Hypercube is summarized. This algorithm runs 5 to 16 times faster than a previous program developed for the Hypercube, while producing placement of equivalent quality. An integrated place-and-route program for the Intel iPSC/2 Hypercube is currently being developed.

  15. Hyperswitch Network For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high-speed data transfer. The proposed hyperswitch network is based on mixed static and dynamic topologies. The routing header is modified in response to congestion or faults encountered as the path is established. A static topology meets the requirement if nodes have switching elements that perform the necessary routing-header revisions dynamically. The hypercube topology is now being implemented with a switching element in each computer node, with the aim of designing a very richly interconnected multicomputer system. The interconnection network connects a great number of small computer nodes, using a fixed hypercube topology characterized by point-to-point links between nodes.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S. M.; Kim, K. Y.

    The printed circuit heat exchanger (PCHE) has recently been considered as a recuperator for the high-temperature gas-cooled reactor. In this work, the zigzag channels of a PCHE have been optimized by using three-dimensional Reynolds-averaged Navier-Stokes (RANS) analysis and a response surface approximation (RSA) modeling technique to enhance thermal-hydraulic performance. The shear stress transport turbulence model is used as a turbulence closure. The objective function is defined as a linear combination of functions related to the heat transfer and the friction loss of the PCHE, respectively. Three geometric design variables, viz., the ratio of the fillet radius to the hydraulic diameter of the channels, the ratio of the wavelength to the hydraulic diameter, and the ratio of the wave height to the hydraulic diameter, are used for the optimization. Design points are selected through Latin hypercube sampling. The optimal design is determined through the RSA model, which uses RANS calculations at the design points. The results show that the optimum shape considerably enhances the thermal-hydraulic performance compared with a reference shape. (authors)

  17. Multiobjective robust design of the double wishbone suspension system based on particle swarm optimization.

    PubMed

    Cheng, Xianfu; Lin, Yuqun

    2014-01-01

    The performance of the suspension system is one of the most important factors in the vehicle design. For the double wishbone suspension system, the conventional deterministic optimization does not consider any deviations of design parameters, so design sensitivity analysis and robust optimization design are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established by software ADAMS. The sensitivity analysis is utilized to determine main design variables. Then, the simulation experiment is arranged and the Latin hypercube design is adopted to find the initial points. The Kriging model is employed for fitting the mean and variance of the quality characteristics according to the simulation results. Further, a particle swarm optimization method based on simple PSO is applied and the tradeoff between the mean and deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.

  18. Uncertain dynamic analysis for rigid-flexible mechanisms with random geometry and material properties

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing; Walker, Paul D.

    2017-02-01

    This paper proposes an uncertainty modelling and computational method to analyze the dynamic responses of rigid-flexible multibody systems (mechanisms) with random geometry and material properties. Firstly, the deterministic model of the rigid-flexible multibody system is built with the absolute nodal coordinate formulation (ANCF), in which the flexible parts are modeled using ANCF elements, while the rigid parts are described by ANCF reference nodes (ANCF-RNs). Secondly, uncertainty in the geometry of the rigid parts is expressed as uniform random variables, while uncertainty in the material properties of the flexible parts is modeled as a continuous random field, which is further discretized into Gaussian random variables using a series expansion method. Finally, a non-intrusive numerical method is developed to solve the dynamic equations of systems involving both types of random variables, which systematically integrates the deterministic generalized-α solver with Latin hypercube sampling (LHS) and polynomial chaos (PC) expansion. The benchmark slider-crank mechanism is used as a numerical example to demonstrate the characteristics of the proposed method.

  19. Design optimization of hydraulic turbine draft tube based on CFD and DOE method

    NASA Astrophysics Data System (ADS)

    Nam, Mun chol; Dechun, Ba; Xiangji, Yue; Mingri, Jin

    2018-03-01

    In order to improve the performance of the hydraulic turbine draft tube during the design process, optimization of the draft tube is performed on a multi-disciplinary collaborative design optimization platform by combining computational fluid dynamics (CFD) and design of experiments (DOE). The geometrical design variables are the median section of the draft tube and the cross section of its exit diffuser, and the objective is to maximize the pressure recovery factor (Cp). Sample matrices required for the shape optimization of the draft tube are generated by the optimal Latin hypercube (OLH) method of the DOE technique, and their performances are evaluated through CFD numerical simulation. Subsequently, the main effect analysis and the sensitivity analysis of the geometrical parameters of the draft tube are accomplished. Then, optimal values of the geometrical design variables are determined using the response surface method. The optimization result of the draft tube shows a marked performance improvement over the original.

  20. Developing an in silico model of the modulation of base excision repair using methoxyamine for more targeted cancer therapeutics.

    PubMed

    Gurkan-Cavusoglu, Evren; Avadhani, Sriya; Liu, Lili; Kinsella, Timothy J; Loparo, Kenneth A

    2013-04-01

    Base excision repair (BER) is a major DNA repair pathway involved in the processing of exogenous non-bulky base damages from certain classes of cancer chemotherapy drugs as well as ionising radiation (IR). Methoxyamine (MX) is a small-molecule chemical inhibitor of BER that is shown to enhance chemotherapy and/or IR cytotoxicity in human cancers. In this study, the authors have analysed the inhibitory effect of MX on BER pathway kinetics using a computational model of the repair pathway. The inhibitory effect of MX depends on the BER efficiency. The authors generated variable-efficiency groups using different sets of protein concentrations produced by Latin hypercube sampling, and clustered the simulation results into high-, medium- and low-efficiency repair groups. From analysis of the inhibitory effect of MX on each of the three groups, it is found that the inhibition is most effective for high-efficiency BER, and least effective for low-efficiency repair.

  1. Construction of nested maximin designs based on successive local enumeration and modified novel global harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Yi, Jin; Li, Xinyu; Xiao, Mi; Xu, Junnan; Zhang, Lin

    2017-01-01

    Engineering design often involves different types of simulation, which entails expensive computation. Variable-fidelity approximation-based design optimization approaches can realize effective simulation and efficient optimization of the design space using approximation models with different levels of fidelity, and have been widely used in different fields. As the foundation of variable-fidelity approximation models, the selection of sample points, called nested designs, is essential. In this article, a novel nested maximin Latin hypercube design is constructed based on successive local enumeration and a modified novel global harmony search algorithm. In the proposed nested designs, successive local enumeration is employed to select sample points for the low-fidelity model, whereas the modified novel global harmony search algorithm is employed to select sample points for the high-fidelity model. A comparative study with multiple criteria and an engineering application verify the efficiency of the proposed nested design approach.
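
    As a baseline illustrating what the maximin criterion demands (not the successive local enumeration or harmony-search machinery of the article), the Python sketch below generates many random Latin hypercubes and keeps the one whose smallest pairwise distance is largest; a nested design would additionally require the high-fidelity points to be a subset of the low-fidelity design.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import qmc

        def maximin_lhs(n, d, tries=500, seed=0):
            # Keep the random Latin hypercube with the largest minimal distance.
            rng = np.random.default_rng(seed)
            best, best_dmin = None, -np.inf
            for _ in range(tries):
                X = qmc.LatinHypercube(d=d, seed=int(rng.integers(2**31))).random(n)
                dmin = pdist(X).min()
                if dmin > best_dmin:
                    best, best_dmin = X, dmin
            return best, best_dmin

        X_lf, dmin = maximin_lhs(n=40, d=3)    # e.g. a low-fidelity design
        print("minimal pairwise distance:", dmin)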

  2. Least squares polynomial chaos expansion: A review of sampling strategies

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Doostan, Alireza

    2018-04-01

    As non-intrusive polynomial chaos expansion (PCE) techniques have gained growing popularity among researchers, we here provide a comprehensive review of major sampling strategies for least-squares-based PCE. Traditional sampling methods, such as Monte Carlo, Latin hypercube, quasi-Monte Carlo, optimal design of experiments (ODE), and Gaussian quadratures, as well as more recent techniques, such as coherence-optimal and randomized quadratures, are discussed. We also propose a hybrid sampling method, dubbed alphabetic-coherence-optimal, that employs the so-called alphabetic optimality criteria used in the context of ODE in conjunction with coherence-optimal samples. A comparison of the empirical performance of the selected sampling methods applied to three numerical examples, including high-order PCEs, high-dimensional problems, and low oversampling ratios, is presented to provide a road map for practitioners seeking the most suitable sampling technique for the problem at hand. We observed that the alphabetic-coherence-optimal technique outperforms the other sampling methods, especially when high-order PCEs are employed and/or the oversampling ratio is low.
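
    A one-dimensional least-squares PCE with a Legendre basis and an LHS sample, as sketched below in Python, shows the basic regression setup the review is concerned with; the target function and oversampling ratio are illustrative assumptions.

        import numpy as np
        from numpy.polynomial.legendre import legvander
        from scipy.stats import qmc

        def f(x):
            # Stand-in model on [-1, 1] (an assumption for illustration).
            return np.exp(x) * np.sin(2.0 * x)

        p, n = 5, 30      # polynomial order and sample size: oversampling ratio 5
        x = 2.0 * qmc.LatinHypercube(d=1, seed=0).random(n).ravel() - 1.0
        coef, *_ = np.linalg.lstsq(legvander(x, p), f(x), rcond=None)

        # Check the fitted surrogate on a dense grid.
        xg = np.linspace(-1.0, 1.0, 200)
        print("max abs error:", np.abs(legvander(xg, p) @ coef - f(xg)).max())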

  3. Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sigeti, David Edward; Williams, Brian J.; Parsons, D. Kent

    2016-10-18

    Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multivariate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-hypercube/orthogonal-array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm based on linear algebra that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.

  4. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal the statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) with only statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of the reduced PCEs is verified by comparison against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture the hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  5. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube, with 128 megabytes of memory for a 32-node configuration, with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed in terms of the computational speedup attained by the parallel architecture.

  6. Optimal cube-connected cube multiprocessors

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Wu, Jie

    1993-01-01

    Many CFD (computational fluid dynamics) and other scientific applications can be partitioned into subproblems. However, in general the partitioned subproblems are very large. They demand high performance computing power themselves, and the solutions of the subproblems have to be combined at each time step. The cube-connected cube (CCCube) architecture is studied. The CCCube architecture is an extended hypercube structure with each node represented as a cube. It requires fewer physical links between nodes than the hypercube, and provides the same communication support as the hypercube does on many applications. The reduced physical links can be used to enhance the bandwidth of the remaining links and, therefore, enhance the overall performance. The concept and the method to obtain optimal CCCubes, which are the CCCubes with a minimum number of links under a given total number of nodes, are proposed. The superiority of optimal CCCubes over standard hypercubes was also shown in terms of the link usage in the embedding of a binomial tree. A useful computation structure based on a semi-binomial tree for divide-and-conquer type parallel algorithms was identified. It was shown that this structure can be implemented in optimal CCCubes without performance degradation compared with regular hypercubes. The results presented should provide a useful approach to the design of scientific parallel computers.

  7. Assessing the Impact of Model Parameter Uncertainty in Simulating Grass Biomass Using a Hybrid Carbon Allocation Strategy

    NASA Astrophysics Data System (ADS)

    Reyes, J. J.; Adam, J. C.; Tague, C.

    2016-12-01

    Grasslands play an important role in agricultural production as forage for livestock; they also provide a diverse set of ecosystem services including soil carbon (C) storage. The partitioning of C between above and belowground plant compartments (i.e. allocation) is influenced by both plant characteristics and environmental conditions. The objectives of this study are to 1) develop and evaluate a hybrid C allocation strategy suitable for grasslands, and 2) apply this strategy to examine the importance of various parameters related to biogeochemical cycling, photosynthesis, allocation, and soil water drainage on above and belowground biomass. We include allocation as an important process in quantifying the model parameter uncertainty, which identifies the most influential parameters and what processes may require further refinement. For this, we use the Regional Hydro-ecologic Simulation System, a mechanistic model that simulates coupled water and biogeochemical processes. A Latin hypercube sampling scheme was used to develop parameter sets for calibration and evaluation of allocation strategies, as well as parameter uncertainty analysis. We developed the hybrid allocation strategy to integrate both growth-based and resource-limited allocation mechanisms. When evaluating the new strategy simultaneously for above and belowground biomass, it produced a larger number of less biased parameter sets: 16% more compared to resource-limited and 9% more compared to growth-based. This also demonstrates its flexible application across diverse plant types and environmental conditions. We found that higher parameter importance corresponded to sub- or supra-optimal resource availability (i.e. water, nutrients) and temperature ranges (i.e. too hot or cold). For example, photosynthesis-related parameters were more important at sites warmer than the theoretical optimal growth temperature. Therefore, larger values of parameter importance indicate greater relative sensitivity in adequately representing the relevant process to capture limiting resources or manage atypical environmental conditions. These results may inform future experimental work by focusing efforts on quantifying specific parameters under various environmental conditions or across diverse plant functional types.

  8. Retrospective and current risks of mercury to panthers in the Florida Everglades.

    PubMed

    Barron, Mace G; Duvall, Stephanie E; Barron, Kyle J

    2004-04-01

    Florida panthers are an endangered species inhabiting south Florida. Hg has been suggested as a causative factor for low populations and some reported panther deaths, but a quantitative assessment of risks has never been performed. This study quantitatively evaluated retrospective (pre-1992) and current (2002) risks of chronic dietary Hg exposures to panthers in the Florida Everglades. A probabilistic assessment of Hg risks was performed using a dietary exposure model and Latin hypercube sampling that incorporated the variability and uncertainty in ingestion rate, diet, body weight, and mercury exposure of panthers. Hazard quotients (HQs) for retrospective risks ranged from <0.1 to 20, with a 46% probability of exceeding chronic dietary thresholds for methylmercury. Retrospective risks of developing clinical symptoms, including ataxia and convulsions, had an HQ range of <0.1-5.4, with a 17% probability of exceeding an HQ of 1. Current risks were substantially lower (4% probability of exceedance; HQ range <0.1-3.5) because of an estimated 70-90% decline in Hg exposure to panthers over the last decade. Under worst-case conditions of panthers consuming only raccoons from the most contaminated area of the Everglades, the current risk of developing clinical symptoms that may lead to death was 4.6%. The current risk of mercury poisoning for panthers with a diversified diet was 0.1% (HQ range of <0.1-1.4). The results of this assessment indicate that past Hg exposures likely adversely affected panthers in the Everglades, but current risks of Hg are low.

  9. Critical Concentration Ratio for Solar Thermoelectric Generators

    NASA Astrophysics Data System (ADS)

    ur Rehman, Naveed; Siddiqui, Mubashir Ali

    2016-10-01

    A correlation for determining the critical concentration ratio (CCR) of solar concentrated thermoelectric generators (SCTEGs) has been established, and the significance of the contributing parameters is discussed in detail. For any SCTEG, higher concentration ratio leads to higher temperatures at the hot side of modules. However, the maximum value of this temperature for safe operation is limited by the material properties of the modules and should be considered as an important design constraint. Taking into account this limitation, the CCR can be defined as the maximum concentration ratio usable for a particular SCTEG. The established correlation is based on factors associated with the material and geometric properties of modules, thermal characteristics of the receiver, installation site attributes, and thermal and electrical operating conditions. To reduce the number of terms in the correlation, these factors are combined to form dimensionless groups by applying the Buckingham Pi theorem. A correlation model containing these groups is proposed and fit to a dataset obtained by simulating a thermodynamic (physical) model over sampled values acquired by applying the Latin hypercube sampling (LHS) technique over a realistic distribution of factors. The coefficient of determination and relative error are found to be 97% and ±20%, respectively. The correlation is validated by comparing the predicted results with literature values. In addition, the significance and effects of the Pi groups on the CCR are evaluated and thoroughly discussed. This study will lead to a wide range of opportunities regarding design and optimization of SCTEGs.

  10. Global sensitivity analysis of water age and temperature for informing salmonid disease management

    NASA Astrophysics Data System (ADS)

    Javaheri, Amir; Babbar-Sebens, Meghna; Alexander, Julie; Bartholomew, Jerri; Hallett, Sascha

    2018-06-01

    Many rivers in the Pacific Northwest region of North America are anthropogenically manipulated via dam operations, leading to system-wide impacts on hydrodynamic conditions and aquatic communities. Understanding how dam operations alter abiotic and biotic variables is important for designing management actions. For example, in the Klamath River, dam outflows could be manipulated to alter water age and temperature to reduce the risk of parasite infections in salmon by diluting or altering the viability of parasite spores. However, the sensitivity of water age and temperature to riverine conditions such as bathymetry can affect outcomes from dam operations. To examine this issue in detail, we conducted a global sensitivity analysis of water age and temperature to a comprehensive set of hydraulic and meteorological parameters in the Klamath River, California, where management of salmonid disease is a high priority. We applied an analysis technique that combined Latin-hypercube and one-at-a-time sampling methods, and included simulation runs with the hydrodynamic numerical model of the Lower Klamath. We found that flow rate and bottom roughness were the two most important parameters influencing water age. Water temperature was most sensitive to inflow temperature, air temperature, solar radiation, wind speed, flow rate, and wet-bulb temperature, in that order. Our results are relevant for managers because they provide a framework for predicting how water within 'high infection risk' sections of the river will respond to dam water (low infection risk) input. Moreover, these data will be useful for prioritizing the use of water age (dilution) versus temperature (spore viability) under certain contexts when considering flow manipulation as a method to reduce the risk of infection and disease in Klamath River salmon.
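
    A minimal sketch of a combined Latin-hypercube/one-at-a-time (LH-OAT) screening loop of the kind referenced above: each LHS base point is followed by one-at-a-time perturbations, and absolute partial effects are averaged. The stand-in `water_age` function and the parameter ranges are hypothetical.

```python
import numpy as np
from scipy.stats import qmc

def lh_oat(model, bounds, n_lhs=20, frac=0.05, seed=0):
    """Latin-hypercube one-at-a-time (LH-OAT) sensitivity screening.

    For each LHS base point, each parameter is perturbed by `frac` of its
    range and the mean absolute partial effect is accumulated.
    """
    bounds = np.asarray(bounds, float)           # shape (d, 2)
    d = len(bounds)
    lo, span = bounds[:, 0], bounds[:, 1] - bounds[:, 0]
    base = lo + span * qmc.LatinHypercube(d=d, seed=seed).random(n_lhs)
    effects = np.zeros(d)
    for x in base:
        y0 = model(x)
        for j in range(d):
            xp = x.copy()
            xp[j] += frac * span[j]
            effects[j] += abs(model(xp) - y0)
    return effects / n_lhs

# Hypothetical stand-in for a hydrodynamic model's water-age output
water_age = lambda p: 10 * p[0] + 2 * p[1] ** 2 + 0.1 * p[2]
print(lh_oat(water_age, [(0, 1), (0, 1), (0, 1)]))
```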

  11. Environmental drivers of spatial patterns of topsoil nitrogen and phosphorus under monsoon conditions in a complex terrain of South Korea

    PubMed Central

    Choi, Kwanghun; Spohn, Marie; Park, Soo Jin; Huwe, Bernd; Ließ, Mareike

    2017-01-01

    Nitrogen (N) and phosphorus (P) in topsoils are critical for plant nutrition. Relatively little is known about the spatial patterns of N and P in the organic layer of mountainous landscapes. Therefore, the spatial distributions of N and P in both the organic layer and the A horizon were analyzed using a light detection and ranging (LiDAR) digital elevation model and vegetation metrics. The objective of the study was to analyze the effect of vegetation and topography on the spatial patterns of N and P in a small forested watershed in South Korea. Soil samples were collected using the conditioned Latin hypercube method. LiDAR vegetation metrics, the normalized difference vegetation index (NDVI), and terrain parameters were derived as predictors. Spatially explicit predictions of N/P ratios were obtained using a random forest with uncertainty analysis. We tested different strategies of model validation (repeated 2-fold to 20-fold and leave-one-out cross-validation). Repeated 10-fold cross-validation was selected for model validation due to its comparatively high accuracy and low variance of prediction. Surface curvature was the best predictor of P contents in the organic layer and in the A horizon, while LiDAR vegetation metrics and NDVI were important predictors of N in the organic layer. N/P ratios increased with surface curvature and were higher on the convex upper slope than on the concave lower slope. This was due to P enrichment of the soil on the lower slope and a more even spatial distribution of N. Our digital soil maps showed that the topsoils on the upper slopes contained relatively little P. These findings are critical for understanding N and P dynamics in mountainous ecosystems. PMID:28837590
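
    For readers unfamiliar with the sampling scheme, the following is a simplified conditioned Latin hypercube sampler in the spirit of Minasny and McBratney: it selects n candidate locations so that each ancillary variable's quantile strata are occupied as evenly as possible, using greedy swaps with an occasional escape move. The correlation term of the full cLHS objective is omitted for brevity, and all names are ours.

```python
import numpy as np

def clhs(aux, n, iters=5000, seed=0):
    """Simplified conditioned Latin hypercube sampling.

    aux: (N, d) array of ancillary variables (e.g. terrain and NDVI
    covariates) at N candidate locations; returns indices of n sites.
    """
    rng = np.random.default_rng(seed)
    N, d = aux.shape
    # Quantile-based stratum edges: n strata per variable
    edges = np.quantile(aux, np.linspace(0, 1, n + 1), axis=0)

    def cost(idx):
        c = 0.0
        for j in range(d):
            counts, _ = np.histogram(aux[idx, j], bins=edges[:, j])
            c += np.abs(counts - 1).sum()     # ideal: one sample per stratum
        return c

    idx = rng.choice(N, n, replace=False)
    cur = cost(idx)
    best_idx, best = idx.copy(), cur
    for _ in range(iters):
        trial = idx.copy()
        trial[rng.integers(n)] = rng.integers(N)   # swap one site
        if len(np.unique(trial)) < n:
            continue
        c = cost(trial)
        if c < cur or rng.random() < 0.01:         # greedy + rare escape move
            idx, cur = trial, c
            if c < best:
                best_idx, best = trial.copy(), c
    return best_idx

rng_demo = np.random.default_rng(1)
aux = rng_demo.normal(size=(500, 3))   # e.g. elevation, slope, NDVI at 500 sites
print("selected site indices:", clhs(aux, n=20))
```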

  12. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

    A regional screening tool-which is useful in cases where few site-specific parameters are available for complex vadose zone models-assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., the retardation and attenuation factors), allowing the estimated leaching fraction of volatile chemicals to be updated based on recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and a non-leacher, determined under local conditions. A sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant, and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) in a manner consistent with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.

  13. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles

    2014-03-01

    A methodology for using the MELCOR code with the Latin hypercube sampling method was developed to estimate uncertainty in various predicted quantities, such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah were conducted that included 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when results were plotted against any of the uncertain parameters, with no parameter manifesting dominant effects on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, further reduction of the predicted hydrogen uncertainty would require reducing all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled, and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point value estimates. Probabilistic analyses such as the one demonstrated in this work are important for properly characterizing the response of complex systems such as severe accident progression in nuclear power plants.

  14. High-speed mid-infrared hyperspectral imaging using quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Kelley, David B.; Goyal, Anish K.; Zhu, Ninghui; Wood, Derek A.; Myers, Travis R.; Kotidis, Petros; Murphy, Cara; Georgan, Chelsea; Raz, Gil; Maulini, Richard; Müller, Antoine

    2017-05-01

    We report on a standoff chemical detection system using widely tunable external-cavity quantum cascade lasers (ECQCLs) to illuminate target surfaces in the mid infrared (λ = 7.4 - 10.5 μm). Hyperspectral images (hypercubes) are acquired by synchronously operating the ECQCLs with a LN2-cooled HgCdTe camera. The use of rapidly tunable lasers and a high-frame-rate camera enables the capture of hypercubes with 128 x 128 pixels and >100 wavelengths in <0.1 s. Furthermore, raster scanning of the laser illumination allowed imaging of a 100-cm2 area at 5-m standoff. Raw hypercubes are post-processed to generate a hypercube that represents the surface reflectance relative to that of a diffuse reflectance standard. Results will be shown for liquids (e.g., silicone oil) and solid particles (e.g., caffeine, acetaminophen) on a variety of surfaces (e.g., aluminum, plastic, glass). Signature spectra are obtained for particulate loadings of RDX on glass of <1 μg/cm2.

  15. Performance of a parallel code for the Euler equations on hypercube computers

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Chan, Tony F.; Jesperson, Dennis C.; Tuminaro, Raymond S.

    1990-01-01

    The performance of hypercubes was evaluated on a computational fluid dynamics problem, along with consideration of the parallel-environment issues that must be addressed, such as algorithm changes, implementation choices, programming effort, and programming environment. The evaluation focuses on a widely used fluid dynamics code, FLO52, which solves the two-dimensional steady Euler equations describing flow around an airfoil. The code development experience is described, including interacting with the operating system, utilizing the message-passing communication system, and code modifications necessary to increase parallel efficiency. Results from two hypercube parallel computers (a 16-node iPSC/2 and a 512-node NCUBE/ten) are discussed and compared. In addition, a mathematical model of the execution time was developed as a function of several machine and algorithm parameters. This model accurately predicts the actual run times obtained and is used to explore the performance of the code in interesting but physically realizable regions of the parameter space. Based on this model, predictions about future hypercubes are made.

  16. A method for simultaneously counterbalancing condition order and assignment of stimulus materials to conditions.

    PubMed

    Zeelenberg, René; Pecher, Diane

    2015-03-01

    Counterbalanced designs are frequently used in the behavioral sciences. Studies often counterbalance either the order in which conditions are presented in the experiment or the assignment of stimulus materials to conditions. Occasionally, researchers need to simultaneously counterbalance both condition order and stimulus assignment to conditions. Lewis (1989; Behavior Research Methods, Instruments, & Computers 25:414-415, 1993) presented a method for constructing Latin squares that fulfill these requirements. The resulting Latin squares counterbalance immediate sequential effects, but not remote sequential effects. Here, we present a new method for generating Latin squares that simultaneously counterbalance both immediate and remote sequential effects and assignment of stimuli to conditions. An Appendix is provided to facilitate implementation of these Latin square designs.
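
    The article's own construction is not reproduced here, but the classic balanced Latin square (for an even number of conditions), which counterbalances condition order and immediate sequential effects, gives the flavor of such designs:

```python
def balanced_latin_square(n):
    """Classic balanced Latin square for an even number of conditions n.

    Each condition appears once per position, and each condition immediately
    follows every other condition exactly once. This is the standard
    construction, not the extended scheme of the article, which also
    balances remote sequential effects.
    """
    if n % 2:
        raise ValueError("classic construction requires an even n")
    # First row: 0, 1, n-1, 2, n-2, ... ; later rows shift by 1 (mod n)
    first = [0]
    for k in range(1, n):
        first.append((n - k // 2) % n if k % 2 == 0 else (k + 1) // 2)
    return [[(c + r) % n for c in first] for r in range(n)]

for row in balanced_latin_square(4):
    print(row)   # [0, 1, 3, 2], [1, 2, 0, 3], [2, 3, 1, 0], [3, 0, 2, 1]
```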

  17. Experimental demonstration of the optical multi-mesh hypercube: scaleable interconnection network for multiprocessors and multicomputers.

    PubMed

    Louri, A; Furlonge, S; Neocleous, C

    1996-12-10

    A prototype of a novel topology for scaleable optical interconnection networks called the optical multi-mesh hypercube (OMMH) is experimentally demonstrated at data rates as high as 150 Mbit/s (2^7 - 1 nonreturn-to-zero pseudo-random data pattern) at a bit error rate of 10^-13 per link by the use of commercially available devices. OMMH is a scaleable network [Appl. Opt. 33, 7558 (1994); J. Lightwave Technol. 12, 704 (1994)] architecture that combines the positive features of the hypercube (small diameter, connectivity, symmetry, simple routing, and fault tolerance) and the mesh (constant node degree and size scaleability). The optical implementation method is divided into two levels: high-density local connections for the hypercube modules, and high-bit-rate, low-density, long connections for the mesh links connecting the hypercube modules. Free-space imaging systems utilizing vertical-cavity surface-emitting laser (VCSEL) arrays, lenslet arrays, space-invariant holographic techniques, and photodiode arrays are demonstrated for the local connections. Optobus fiber interconnects from Motorola are used for the long-distance connections. The OMMH was optimized to operate at the data rate of Motorola's Optobus (10-bit-wide, VCSEL-based bidirectional data interconnects at 150 Mbit/s). Difficulties encountered included the varying fan-out efficiencies of the different orders of the hologram, misalignment sensitivity of the free-space links, low power (1 mW) of the individual VCSELs, and noise.

  18. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314


  20. Parameter identification and optimization of slide guide joint of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Sun, B. B.

    2017-11-01

    The joint surface has an important influence on the performance of CNC machine tools. In order to identify the dynamic parameters of the slide guide joint, a parametric finite element model of the joint is established and an optimum design method is used based on finite element simulation and modal testing. The mode that has the most influence on the dynamics of the slip joint is then found through harmonic response analysis. Taking the frequency of this mode as the objective, a sensitivity analysis of the stiffness of each joint surface is carried out using Latin hypercube sampling and Monte Carlo simulation. The result shows that the vertical stiffness of the slip joint surface formed by the bed and the slide plate has the most obvious influence on the structure. Therefore, this stiffness is taken as the optimization variable, and the optimal value is obtained by studying the relationship between structural dynamic performance and stiffness. Substituting the stiffness values from before and after optimization into the FEM of the machine tool shows that the dynamic performance of the machine tool is improved.

  1. Agent-Based Modeling of China's Rural-Urban Migration and Social Network Structure.

    PubMed

    Fu, Zhaohao; Hao, Lingxin

    2018-01-15

    We analyze China's rural-urban migration and endogenous social network structures using agent-based modeling. The agents, drawn from census microdata, are located in their rural origin with an empirically estimated prior propensity to move. The population-scale social network is a hybrid one, combining observed family ties and locations of the origin with a parameter space calibrated from census, survey, and aggregate data and sampled using a stepwise Latin hypercube sampling method. At monthly intervals, some agents migrate, and these migratory acts change the social network by turning within-nonmigrant connections into between-migrant-nonmigrant connections, turning local connections into nonlocal connections, and adding among-migrant connections. In turn, the changing social network structure updates the migratory propensities of well-connected nonmigrants, who become more likely to move. These two processes iterate over time. Using a core-periphery method developed from the k-core decomposition method, we identify and quantify the network structural changes and map these changes onto the migration acceleration patterns. We conclude that network structural changes are essential for explaining the migration acceleration observed in China during the 1995-2000 period.


  3. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
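
    A toy illustration of the category-weighted idea: oversample a rare but high-rate category (for instance, the precipitating portion of a grid box) and reweight by the true category probabilities so the grid-box mean stays unbiased. The population, rates, and sample allocation below are invented for illustration, not taken from SILHS.

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid-box population of subcolumns in two categories (e.g. "precipitating
# cloud" vs "clear"); the rare category dominates the process rate.
rainy = rng.random(100_000) < 0.02
rate = np.where(rainy, 5.0, 0.01) * rng.gamma(2.0, size=100_000)
print("true grid-box mean:", rate.mean())

# Prescribed allocation: spend half the samples on the 2%-probability
# category, then recombine using the true category probabilities.
n, n_rain = 64, 32
rain_draw  = rng.choice(rate[rainy],  n_rain)
clear_draw = rng.choice(rate[~rainy], n - n_rain)
p_rain = rainy.mean()
est = p_rain * rain_draw.mean() + (1 - p_rain) * clear_draw.mean()
print("category-weighted estimate:", est)
```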

  4. Optimized Reduction of Unsteady Radial Forces in a Singlechannel Pump for Wastewater Treatment

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Hyuk; Cho, Bo-Min; Choi, Young-Seok; Lee, Kyoung-Yong; Peck, Jong-Hyeon; Kim, Seon-Chang

    2016-11-01

    A single-channel pump for wastewater treatment was optimized to reduce unsteady radial force sources caused by impeller-volute interactions. The steady and unsteady Reynolds-averaged Navier-Stokes equations using the shear-stress transport turbulence model were discretized by finite volume approximations and solved on tetrahedral grids to analyze the flow in the single-channel pump. The sweep area of radial force during one revolution and the distance of the sweep-area center of mass from the origin were selected as the objective functions; the two design variables were related to the internal flow cross-sectional area of the volute. These objective functions were integrated into one objective function by applying the weighting factor for optimization. Latin hypercube sampling was employed to generate twelve design points within the design space. A response-surface approximation model was constructed as a surrogate model for the objectives, based on the objective function values at the generated design points. The optimized results showed considerable reduction in the unsteady radial force sources in the optimum design, relative to those of the reference design.
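
    A schematic of the surrogate workflow described above, with a cheap analytic function standing in for the CFD objective: 12 LHS design points, a quadratic response surface fit by least squares, and a grid search on the surrogate. The design-variable scaling and the stand-in objective are assumptions for illustration.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical stand-in for the CFD objective: a weighted combination of
# radial-force sweep area and center-of-mass offset, as a function of two
# volute cross-sectional-area design variables scaled to [0, 1].
def cfd_objective(x):
    return (x[:, 0] - 0.6) ** 2 + 2 * (x[:, 1] - 0.3) ** 2 + 0.1 * x[:, 0] * x[:, 1]

# 12 LHS design points, as in the study
X = qmc.LatinHypercube(d=2, seed=3).random(12)
y = cfd_objective(X)

# Quadratic response-surface approximation fit by least squares
def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Evaluate the cheap surrogate on a dense grid to locate the optimum
g = np.stack(np.meshgrid(np.linspace(0, 1, 101),
                         np.linspace(0, 1, 101)), -1).reshape(-1, 2)
pred = quad_features(g) @ beta
print("surrogate optimum at", g[pred.argmin()])
```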

  5. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.

  6. A Novel Approach for Determining Source–Receptor Relationships in Model Simulations: A Case Study of Black Carbon Transport in Northern Hemisphere Winter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Po-Lun; Gattiker, J. R.; Liu, Xiaohong

    2013-06-27

    A Gaussian process (GP) emulator is applied to quantify the contribution of local and remote emissions of black carbon (BC) to the BC concentrations in different regions, using a Latin hypercube sampling strategy for emission perturbations in the offline version of the Community Atmosphere Model version 5.1 (CAM5). The source-receptor relationships are computed based on simulations constrained by a standard free-running CAM5 simulation and the ERA-Interim reanalysis product. The analysis demonstrates that the emulator is capable of retrieving the source-receptor relationships based on a small number of CAM5 simulations. Most regions are found to be susceptible to their local emissions. The emulator also finds that the source-receptor relationships retrieved from the model-driven and the reanalysis-driven simulations are very similar, suggesting that the simulated circulation in CAM5 resembles the assimilated meteorology in ERA-Interim. The robustness of the results provides confidence for applying the emulator to detect dose-response signals in the climate system.
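
    The following sketch shows the emulator pattern in miniature using scikit-learn's Gaussian process regressor: a handful of "model runs" at LHS-sampled emission scalings train the GP, which is then differenced cheaply to read off source-receptor sensitivities. The `receptor_bc` stand-in and its coefficients are invented, not CAM5 output.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical stand-in for the model: receptor BC burden as a function of
# three regional emission scaling factors, with a mild nonlinearity.
def receptor_bc(x):
    return 0.6 * x[:, 0] + 0.3 * x[:, 1] + 0.1 * x[:, 2] + 0.05 * x[:, 0] * x[:, 1]

X = qmc.LatinHypercube(d=3, seed=7).random(24)   # few "expensive" model runs
y = receptor_bc(X)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.5] * 3),
                              normalize_y=True).fit(X, y)

# Emulated source-receptor sensitivities: central differences on the emulator
base = np.full((1, 3), 0.5)
for j in range(3):
    hi, lo = base.copy(), base.copy()
    hi[0, j] += 0.05
    lo[0, j] -= 0.05
    sens = (gp.predict(hi) - gp.predict(lo)).item() / 0.1
    print(f"region {j}: d(burden)/d(emission) ~ {sens:.3f}")
```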

  7. The Influence of PV Module Materials and Design on Solder Joint Thermal Fatigue Durability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Silverman, Timothy J.; Kurtz, Sarah

    Finite element model (FEM) simulations have been performed to elucidate the effect of flat-plate photovoltaic (PV) module materials and design on PbSn eutectic solder joint thermal fatigue durability. The statistical method of Latin hypercube sampling was employed to investigate the sensitivity of simulated damage to each input variable. Variables of laminate material properties and their thicknesses were investigated. Using analysis of variance, we determined that the rate of solder fatigue was most sensitive to solder layer thickness, with copper ribbon and silicon thickness being the next two most sensitive variables. By simulating both accelerated thermal cycles (ATCs) and PV cell temperature histories through two characteristic days of service, we determined that the acceleration factor between the ATC and outdoor service was independent of the variables sampled in this study. This result implies that an ATC test will represent a similar time of outdoor exposure for a wide range of module designs. This is an encouraging result for the standard ATC that must be universally applied across all modules.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Jim Bouchard

    Over a 12-month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin hypercube sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: • development of time-dependent fire heat release rate profiles (required as input to CFAST), • calculation of fire severity factors based on CFAST detailed fire modeling, and • calculation of fire non-suppression probabilities.
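
    As a hedged sketch of the time-dependent non-suppression piece, assume the constant-rate exponential form used in NUREG/CR-6850-style analyses, P_ns = exp(-lambda * t_damage), propagated over LHS-sampled damage times and suppression rates. Both parameter ranges below are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy.stats import qmc

def p_non_suppression(t_damage_min, lam_per_min):
    """Probability the fire is NOT suppressed before component damage,
    assuming a constant suppression rate (exponential model)."""
    return np.exp(-lam_per_min * np.asarray(t_damage_min))

# LHS over uncertain damage time (e.g. from CFAST realizations) and rate
u = qmc.LatinHypercube(d=2, seed=5).random(1000)
t_dmg = 5.0 + 25.0 * u[:, 0]      # damage time: 5-30 min (assumed range)
lam   = 0.05 + 0.10 * u[:, 1]     # suppression rate: 0.05-0.15 /min (assumed)
pns = p_non_suppression(t_dmg, lam)
print("mean P_ns:", pns.mean())
```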

  9. Requirements analysis for a hardware, discrete-event, simulation engine accelerator

    NASA Astrophysics Data System (ADS)

    Taylor, Paul J., Jr.

    1991-12-01

    An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description and Design Language (VHDL), for a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.

  10. Concurrent Image Processing Executive (CIPE)

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1988-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube a data management method which distributes, redistributes, and tracks data set information was implemented.

  11. Multiphase complete exchange on Paragon, SP2 and CS-2

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1995-01-01

    The overhead of interprocessor communication is a major factor in limiting the performance of parallel computer systems. The complete exchange is the severest communication pattern in that it requires each processor to send a distinct message to every other processor. This pattern is at the heart of many important parallel applications. On hypercubes, multiphase complete exchange has been developed and shown to provide optimal performance over varying message sizes. Most commercial multicomputer systems do not have a hypercube interconnect. However, they use special purpose hardware and dedicated communication processors to achieve very high performance communication and can be made to emulate the hypercube quite well. Multiphase complete exchange has been implemented on three contemporary parallel architectures: the Intel Paragon, IBM SP2 and Meiko CS-2. The essential features of these machines are described and their basic interprocessor communication overheads are discussed. The performance of multiphase complete exchange is evaluated on each machine. It is shown that the theoretical ideas developed for hypercubes are also applicable in practice to these machines and that multiphase complete exchange can lead to major savings in execution time over traditional solutions.
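
    For concreteness, the standard single-phase pairwise-exchange schedule on a power-of-two node count is sketched below: in phase i, node r exchanges with node r XOR i, so every ordered pair communicates exactly once. The multiphase algorithm studied in the paper layers block-combining on top of this primitive to trade message count against message size.

```python
def complete_exchange_schedule(p):
    """Pairwise-exchange schedule for a complete exchange on p = 2^k nodes.

    In phase i (1 <= i < p), node r exchanges its block with node r XOR i;
    XOR pairing guarantees the phases partition all ordered node pairs.
    """
    assert p & (p - 1) == 0 and p > 1, "p must be a power of two"
    return [[(r, r ^ i) for r in range(p)] for i in range(1, p)]

for phase, pairs in enumerate(complete_exchange_schedule(8), start=1):
    print(f"phase {phase}: {pairs[:4]} ...")
```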

  12. Parallel spatial direct numerical simulations on the Intel iPSC/860 hypercube

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Zubair, Mohammad

    1993-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube is documented. The direct numerical simulation approach is used to compute spatially evolving disturbances associated with laminar-to-turbulent transition in boundary-layer flows. The feasibility of using the PSDNS on the hypercube to perform transition studies is examined. The results indicate that the direct numerical simulation approach can effectively be parallelized on a distributed-memory parallel machine. By increasing the number of processors, nearly ideal linear speedups are achieved with nonoptimized routines; slower-than-linear speedups are achieved with optimized (machine-dependent library) routines. This slower-than-linear speedup results because the fast Fourier transform (FFT) routine dominates the computational cost and exhibits less-than-ideal speedups. However, with the machine-dependent routines, the total computational cost decreases by a factor of 4 to 5 compared with standard FORTRAN routines. The computational cost increases linearly with spanwise, wall-normal, and streamwise grid refinements. The hypercube with 32 processors was estimated to require approximately twice the amount of single-processor Cray supercomputer time to complete a comparable simulation; however, it is estimated that a subgrid-scale model, which reduces the required number of grid points and turns the approach into a parallel spatial large-eddy simulation (PSLES), would reduce the computational cost and memory requirements by a factor of 10 relative to the PSDNS. This PSLES implementation would enable transition simulations on the hypercube at a reasonable computational cost.

  13. Parallel computation for biological sequence comparison: comparing a portable model to the native model for the Intel Hypercube.

    PubMed

    Nadkarni, P M; Miller, P L

    1991-01-01

    A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations.

  14. Indigenous People and Poverty in Latin America: An Empirical Analysis. World Bank Regional and Sectoral Studies.

    ERIC Educational Resources Information Center

    Psacharopoulos, George, Ed.; Patrinos, Harry Anthony, Ed.

    The indigenous peoples of Latin America live in conditions of extreme poverty. This book uses empirical data from national survey sources to determine the extent of poverty among Latin American indigenous populations; to compare indigenous and nonindigenous populations with regard to socioeconomic status, living conditions, educational attainment,…

  15. Sensitivity Analysis of the Sheet Metal Stamping Processes Based on Inverse Finite Element Modeling and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Yu, Maolin; Du, R.

    2005-08-01

    Sheet metal stamping is one of the most commonly used manufacturing processes, and hence, much research has been carried out on it for economic gain. Searching through the literature, however, it is found that many problems remain unsolved. For example, it is well known that for the same press, the same workpiece material, and the same set of dies, the product quality may vary owing to a number of factors, such as the inhomogeneity of the workpiece material, the loading error, the lubrication, etc. Presently, few seem able to predict the quality variation, not to mention what contributes to it. As a result, trial-and-error is still needed on the shop floor, causing additional cost and time delay. This paper introduces a new approach to predict the product quality variation and identify the sensitive design/process parameters. The new approach is based on a combination of inverse finite element modeling (FEM) and Monte Carlo simulation (more specifically, the Latin hypercube sampling (LHS) approach). With an acceptable accuracy, the inverse FEM (also called one-step FEM) requires much less computation than the usual incremental FEM and, hence, can be used to predict the quality variations under various conditions. LHS is a statistical method through which the sensitivity analysis can be carried out. The result of the sensitivity analysis has clear physical meaning and can be used to optimize the die design and/or the process design. Two simulation examples are presented: drawing a rectangular box and drawing a two-step rectangular box.

  16. The formation of continuous opinion dynamics based on a gambling mechanism and its sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Alexandre Wang, Qiuping; Li, Wei; Cai, Xu

    2017-09-01

    The formation of continuous opinion dynamics is investigated based on a virtual gambling mechanism in which agents fight for a limited resource. We propose a model with agents holding opinions between -1 and 1. Agents are segregated into two cliques according to the sign of their opinions. Local communication happens only when the opinion distance between the corresponding agents is no larger than a pre-defined confidence threshold. Theoretical analysis of special cases provides a deep understanding of the roles of both the resource allocation parameter and the confidence threshold in the formation of opinion dynamics. For a sparse network, the evolution of the opinion dynamics is negligible in the region of low confidence threshold when mindless agents are absent. Numerical results also imply that, in the presence of economic agents, a high confidence threshold is required for apparent clustering of agents in opinion. Moreover, a consensus state is generated only when the following three conditions are satisfied simultaneously: mindless agents are absent, the resource is concentrated in one clique, and the confidence threshold tends to a critical value (= 1.25 + 2/k_a, where k_a > 8/3 is the average number of friends of individual agents). For a fixed confidence threshold and resource allocation parameter, the most chaotic steady state of the dynamics occurs when the fraction of mindless agents is about 0.7. It is also demonstrated that economic agents are more likely to win at gambling, compared to mindless ones. Finally, the importance of the three parameters involved in establishing the uncertainty of the model response is quantified by means of Latin hypercube sampling-based sensitivity analysis.

  17. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    NASA Astrophysics Data System (ADS)

    He, F.

    2015-12-01

    This work studies the parameter uncertainty in tropical cyclone (TC) simulations in atmospheric general circulation models (AGCMs) using the Reed-Jablonowski TC test case, as implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation, and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP), and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin hypercube sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP, and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by the physical parameters is comparable in degree to the uncertainty caused by model structure (e.g., grid) and initial conditions (e.g., sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.

  18. Hypercube technology

    NASA Technical Reports Server (NTRS)

    Parker, Jay W.; Cwik, Tom; Ferraro, Robert D.; Liewer, Paulett C.; Patterson, Jean E.

    1991-01-01

    The JPL designed MARKIII hypercube supercomputer has been in application service since June 1988 and has had successful application to a broad problem set including electromagnetic scattering, discrete event simulation, plasma transport, matrix algorithms, neural network simulation, image processing, and graphics. Currently, problems that are not homogeneous are being attempted, and, through this involvement with real world applications, the software is evolving to handle the heterogeneous class problems efficiently.


  20. Communication overhead on the Intel iPSC-860 hypercube

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1990-01-01

    Experiments were conducted on the Intel iPSC-860 hypercube in order to evaluate the overhead of interprocessor communication. It is demonstrated that: (1) contrary to popular belief, the distance between two communicating processors has a significant impact on communication time, (2) edge contention can increase communication time by a factor of more than 7, and (3) node contention has no measurable impact.

  1. RISC-type microprocessors may revolutionize aerospace simulation

    NASA Astrophysics Data System (ADS)

    Jackson, Albert S.

    The author explores the application of RISC (reduced instruction set computer) processors in massively parallel computer (MPC) designs for aerospace simulation. The MPC approach is shown to be well adapted to the needs of aerospace simulation. It is shown that any of the three common types of interconnection schemes used with MPCs are effective for general-purpose simulation, although the bus- or switch-oriented machines are somewhat easier to use. For partial differential equation models, the hypercube approach at first glance appears more efficient because the nearest-neighbor connections required for three-dimensional models are hardwired in a hypercube machine. However, the data broadcast ability of a bus system, combined with the fact that data can be transmitted over a bus as soon as they have been updated, makes the bus approach very competitive with the hypercube approach even for these types of models.

  2. Generalized hypercube structures and hyperswitch communication network

    NASA Technical Reports Server (NTRS)

    Young, Steven D.

    1992-01-01

    This paper discusses an ongoing study that uses a recent development in communication control technology to implement hybrid hypercube structures. These architectures are similar to binary hypercubes, but they also provide added connectivity between the processors. This added connectivity increases communication reliability while decreasing the latency of interprocessor message passing. Because these factors directly determine the speed that can be obtained by multiprocessor systems, these architectures are attractive for applications such as remote exploration and experimentation, where high performance and ultrareliability are required. This paper describes and enumerates these architectures and discusses how they can be implemented with a modified version of the hyperswitch communication network (HCN). The HCN is analyzed because it has three attractive features that enable these architectures to be effective: speed, fault tolerance, and the ability to pass multiple messages simultaneously through the same hyperswitch controller.

  3. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  4. Non-intrusive reduced order modeling of nonlinear problems using neural networks

    NASA Astrophysics Data System (ADS)

    Hesthaven, J. S.; Ubbiali, S.

    2018-06-01

    We develop a non-intrusive reduced basis (RB) method for parametrized steady-state partial differential equations (PDEs). The method extracts a reduced basis from a collection of high-fidelity solutions via a proper orthogonal decomposition (POD) and employs artificial neural networks (ANNs), particularly multi-layer perceptrons (MLPs), to accurately approximate the coefficients of the reduced model. The search for the optimal number of neurons and the minimum amount of training samples to avoid overfitting is carried out in the offline phase through an automatic routine, relying upon a joint use of the Latin hypercube sampling (LHS) and the Levenberg-Marquardt (LM) training algorithm. This guarantees a complete offline-online decoupling, leading to an efficient RB method - referred to as POD-NN - suitable also for general nonlinear problems with a non-affine parametric dependence. Numerical studies are presented for the nonlinear Poisson equation and for driven cavity viscous flows, modeled through the steady incompressible Navier-Stokes equations. Both physical and geometrical parametrizations are considered. Several results confirm the accuracy of the POD-NN method and show the substantial speed-up enabled at the online stage as compared to a traditional RB strategy.
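
    A minimal sketch of the offline POD-NN pipeline described above: Latin hypercube samples in parameter space, a snapshot matrix reduced by SVD, and a small network mapping parameters to POD coefficients. The solver below is a toy placeholder, and scikit-learn's L-BFGS training stands in for the paper's Levenberg-Marquardt algorithm:

        import numpy as np
        from scipy.stats import qmc
        from sklearn.neural_network import MLPRegressor

        def solve_pde(mu):                  # hypothetical full-order solver
            x = np.linspace(0, 1, 200)
            return np.sin(mu[0] * np.pi * x) * np.exp(-mu[1] * x)

        # LHS in a 2-D parameter domain, then high-fidelity snapshots
        sampler = qmc.LatinHypercube(d=2, seed=0)
        P = qmc.scale(sampler.random(100), [1.0, 0.1], [5.0, 2.0])
        S = np.column_stack([solve_pde(mu) for mu in P])

        # POD basis from the snapshot matrix, and projection coefficients
        U, s, _ = np.linalg.svd(S, full_matrices=False)
        V = U[:, :10]                       # 10 reduced-basis modes
        coeffs = V.T @ S

        # Network: parameters -> POD coefficients
        net = MLPRegressor(hidden_layer_sizes=(30,), solver='lbfgs', max_iter=5000)
        net.fit(P, coeffs.T)
        # online evaluation: u(mu) ~ V @ net.predict([mu]).ravel()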

  5. Detailed design of a lattice composite fuselage structure by a mixed optimization method

    NASA Astrophysics Data System (ADS)

    Liu, D.; Lohse-Busch, H.; Toropov, V.; Hühne, C.; Armani, U.

    2016-10-01

    In this article, a procedure for designing a lattice fuselage barrel is developed. It comprises three stages: first, topology optimization of an aircraft fuselage barrel is performed with respect to weight and structural performance to obtain the conceptual design. The interpretation of the optimal result is given to demonstrate the development of this new lattice airframe concept for the fuselage barrel. Subsequently, parametric optimization of the lattice aircraft fuselage barrel is carried out using genetic algorithms on metamodels generated with genetic programming from a 101-point optimal Latin hypercube design of experiments. The optimal design is achieved in terms of weight savings subject to stability, global stiffness and strain requirements, and then verified by the fine mesh finite element simulation of the lattice fuselage barrel. Finally, a practical design of the composite skin complying with the aircraft industry lay-up rules is presented. It is concluded that the mixed optimization method, combining topology optimization with the global metamodel-based approach, allows the problem to be solved with sufficient accuracy and provides the designers with a wealth of information on the structural behaviour of the novel anisogrid composite fuselage design.

  6. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    DOE PAGES

    Strydom, Gerhard

    2013-01-01

    The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
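
    For context, tolerance limits of this 95%/95% type are commonly tied to Wilks' formula, which gives the minimum number of random runs needed; a one-line check of the classic one-sided case (the study's 800 runs comfortably exceed it):

        # Smallest n with 1 - 0.95**n >= 0.95, i.e. the classic 59-run rule
        # for a one-sided 95%/95% nonparametric tolerance limit.
        import math
        n = math.ceil(math.log(0.05) / math.log(0.95))
        print(n)  # 59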

  7. Economic consequences of aviation system disruptions: A reduced-form computable general equilibrium analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin

    The state-of-the-art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of economic impacts of various types of transportation system failures due to natural hazards, human-related attacks or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the “synthetic data” results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focuses on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
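
    The reduced-form idea can be sketched in a few lines: evaluate the expensive model at Latin hypercube points, then fit a linear response surface to the synthetic results. The model function and parameter ranges below are stand-ins, not the paper's CGE system:

        import numpy as np
        from scipy.stats import qmc

        def cge_model(shock, duration, resilience):   # hypothetical stand-in
            return -shock * duration * (1.0 - 0.6 * resilience)

        # LHS design over (shock, duration in days, resilience), 100 runs
        X = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(100),
                      [0.0, 1.0, 0.0], [1.0, 30.0, 1.0])
        y = np.array([cge_model(*row) for row in X])

        # OLS reduced form: intercept plus one slope per parameter
        A = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)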

  8. Photovoltaic frequency–watt curve design for frequency regulation and fast contingency reserves

    DOE PAGES

    Johnson, Jay; Neely, Jason C.; Delhotal, Jarod J.; ...

    2016-09-02

    When renewable energy resources are installed in electricity grids, they typically increase generation variability and displace thermal generator control action and inertia. Grid operators combat these emerging challenges with advanced distributed energy resource (DER) functions to support frequency and provide voltage regulation and protection mechanisms. This paper focuses on providing frequency reserves using autonomous IEC TR 61850-90-7 pointwise frequency-watt (FW) functions that adjust DER active power as a function of measured grid frequency. The importance of incorporating FW functions into a fleet of photovoltaic (PV) systems is demonstrated in simulation. Effects of FW curve design, including curtailment, deadband, and droop, were analyzed against performance metrics using Latin hypercube sampling for 20%, 70%, and 120% PV penetration scenarios on the Hawaiian island of Lanai. Finally, to understand the financial implications of FW functions to utilities, a performance function was defined based on monetary costs attributable to curtailed PV production, load shedding, and generator wear. An optimization wrapper was then created to find the best FW function curve for each penetration level. Lastly, it was found that in all cases, the utility would save money by implementing appropriate FW functions.

  9. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.

  10. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
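
    Of the methods surveyed, LHS-PRCC is straightforward to sketch: rank-transform the inputs and output, regress out the other parameters, and correlate the residuals. An assumed minimal implementation:

        import numpy as np
        from scipy.stats import rankdata

        def prcc(X, y, j):
            """Partial rank correlation of input column j with output y."""
            R = np.column_stack([rankdata(c) for c in X.T])   # rank-transform inputs
            ry = rankdata(y)                                  # rank-transform output
            others = np.delete(R, j, axis=1)
            Z = np.column_stack([np.ones(len(ry)), others])
            # residuals after regressing out all other (ranked) parameters
            res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
            res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
            return np.corrcoef(res_x, res_y)[0, 1]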

  11. Dynamics of hepatitis C under optimal therapy and sampling based analysis

    NASA Astrophysics Data System (ADS)

    Pachpute, Gaurav; Chakrabarty, Siddhartha P.

    2013-08-01

    We examine two models for hepatitis C viral (HCV) dynamics, one for monotherapy with interferon (IFN) and the other for combination therapy with IFN and ribavirin. Optimal therapy for both the models is determined using the steepest gradient method, by defining an objective functional which minimizes infected hepatocyte levels, virion population and side-effects of the drug(s). The optimal therapies for both the models show an initial period of high efficacy, followed by a gradual decline. The period of high efficacy coincides with a significant decrease in the viral load, whereas the efficacy drops after hepatocyte levels are restored. We use the Latin hypercube sampling technique to randomly generate a large number of patient scenarios and study the dynamics of each set under the optimal therapy already determined. Results show an increase in the percentage of responders (indicated by drop in viral load below detection levels) in case of combination therapy (72%) as compared to monotherapy (57%). Statistical tests performed to study correlations between sample parameters and time required for the viral load to fall below detection level, show a strong monotonic correlation with the death rate of infected hepatocytes, identifying it to be an important factor in deciding individual drug regimens.

  12. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  13. Concurrent electromagnetic scattering analysis

    NASA Technical Reports Server (NTRS)

    Patterson, Jean E.; Cwik, Tom; Ferraro, Robert D.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Parker, Jay

    1989-01-01

    The computational power of the hypercube parallel computing architecture is applied to the solution of large-scale electromagnetic scattering and radiation problems. Three analysis codes have been implemented. A Hypercube Electromagnetic Interactive Analysis Workstation was developed to aid in the design and analysis of metallic structures such as antennas and to facilitate the use of these analysis codes. The workstation provides a general user environment for specification of the structure to be analyzed and graphical representations of the results.

  14. Asynchronous Communication Scheme For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Madan, Herb S.

    1988-01-01

    Scheme devised for asynchronous-message communication system for Mark III hypercube concurrent-processor network. Network consists of up to 1,024 processing elements connected electrically as though they were at corners of 10-dimensional cube. Each node contains two Motorola 68020 processors along with Motorola 68881 floating-point processor utilizing up to 4 megabytes of shared dynamic random-access memory. Scheme intended to support applications requiring passage of both polled or solicited and unsolicited messages.

  15. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

  16. Virtual time and time warp on the JPL hypercube. [operating system implementation for distributed simulation

    NASA Technical Reports Server (NTRS)

    Jefferson, David; Beckman, Brian

    1986-01-01

    This paper describes the concept of virtual time and its implementation in the Time Warp Operating System at the Jet Propulsion Laboratory. Virtual time is a distributed synchronization paradigm that is appropriate for distributed simulation, database concurrency control, real time systems, and coordination of replicated processes. The Time Warp Operating System is targeted toward the distributed simulation application and runs on a 32-node JPL Mark II Hypercube.

  17. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  18. Novel snapshot hyperspectral imager for fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Chandler, Lynn; Chandler, Andrea; Periasamy, Ammasi

    2018-02-01

    Hyperspectral imaging has emerged as a new technique for the identification and classification of biological tissue. Benefitting from recent developments in sensor technology, the new class of hyperspectral imagers can capture entire hypercubes in single-shot operation, showing great potential for real-time imaging in the biomedical sciences. This paper explores the use of a SnapShot imager in fluorescence imaging via microscope for the first time. Utilizing the latest imaging sensor, the SnapShot imager is compact and attachable via C-mount to any commercially available light microscope. Using this setup, fluorescence hypercubes of several cells were generated, containing both spatial and spectral information. The fluorescence images were acquired in a single shot across the emission range from visible to near infrared (VIS-IR). The paper presents hypercubes obtained from example tissues (475-630 nm). This study demonstrates the technique's potential for real-time monitoring in cell biology and biomedical applications.

  19. Inferring patterns in mitochondrial DNA sequences through hypercube independent spanning trees.

    PubMed

    Silva, Eduardo Sant Ana da; Pedrini, Helio

    2016-03-01

    Given a graph G, a set of spanning trees rooted at a vertex r of G is said to be vertex/edge independent if, for each vertex v of G, v≠r, the paths from r to v in any pair of trees are vertex/edge disjoint. Independent spanning trees (ISTs) provide a number of advantages in data broadcasting due to their fault-tolerant properties. For this reason, some studies have addressed the issue by providing mechanisms for constructing independent spanning trees efficiently. In this work, we investigate how to construct independent spanning trees on hypercubes, which are generated based upon spanning binomial trees, and how to use them to predict mitochondrial DNA sequence parts through paths on the hypercube. The prediction works both for inferring mitochondrial DNA sequences composed of six bases and for inferring anomalies that probably should not belong to the mitochondrial DNA standard. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Sets of Mutually Orthogonal Sudoku Latin Squares

    ERIC Educational Resources Information Center

    Vis, Timothy; Petersen, Ryan M.

    2009-01-01

    A Latin square of order "n" is an "n" x "n" array using n symbols, such that each symbol appears exactly once in each row and column. A set of Latin squares is mutually orthogonal if, when any two squares of the set are superimposed, the ordered pairs of symbols appearing in the cells of the array are distinct. The popular puzzle Sudoku involves Latin squares with n = 9, along with the added condition that each of the 9…
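
    The orthogonality condition is easy to check mechanically: superimposing two order-n Latin squares must yield all n^2 distinct ordered pairs. A small sketch:

        # Two Latin squares are orthogonal when superimposing them yields
        # every ordered pair of symbols exactly once.
        def orthogonal(A, B):
            n = len(A)
            pairs = {(A[i][j], B[i][j]) for i in range(n) for j in range(n)}
            return len(pairs) == n * n

        A = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]
        B = [[0, 1, 2], [2, 0, 1], [1, 2, 0]]
        print(orthogonal(A, B))  # True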

  1. Modeling uncertainty and correlation in soil properties using Restricted Pairing and implications for ensemble-based hillslope-scale soil moisture and temperature estimation

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Entekhabi, D.; Bras, R. L.

    2007-12-01

    Soil hydraulic and thermal properties (SHTPs) affect both the rate of moisture redistribution in the soil column and the volumetric soil water capacity. Adequately constraining these properties through field and lab analysis to parameterize spatially-distributed hydrology models is often prohibitively expensive. Because SHTPs vary significantly at small spatial scales, individual soil samples are also only reliably indicative of local conditions, and these properties remain a significant source of uncertainty in soil moisture and temperature estimation. In ensemble-based soil moisture data assimilation, uncertainty in the model-produced prior estimate due to associated uncertainty in SHTPs must be taken into account to avoid under-dispersive ensembles. To treat SHTP uncertainty for purposes of supplying inputs to a distributed watershed model, we use the restricted pairing (RP) algorithm, an extension of Latin Hypercube (LH) sampling. The RP algorithm generates an arbitrary number of SHTP combinations by sampling the appropriate marginal distributions of the individual soil properties using the LH approach, while imposing a target rank correlation among the properties. A previously-published meta-database of 1309 soils representing 12 textural classes is used to fit appropriate marginal distributions to the properties and compute the target rank correlation structure, conditioned on soil texture. Given categorical soil textures, our implementation of the RP algorithm generates an arbitrarily-sized ensemble of realizations of the SHTPs required as input to the TIN-based Realtime Integrated Basin Simulator with vegetation dynamics (tRIBS+VEGGIE) distributed parameter ecohydrology model. Soil moisture ensembles simulated with RP-generated SHTPs exhibit less variance than ensembles simulated with SHTPs generated by a scheme that neglects correlation among properties. Neglecting correlation among SHTPs can lead to physically unrealistic combinations of parameters that exhibit implausible hydrologic behavior when input to the tRIBS+VEGGIE model.
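
    A simplified sketch of the restricted-pairing idea, in the spirit of Iman and Conover's method (not the authors' exact algorithm): draw LHS marginals, build normal scores carrying the target correlation, and reorder each column to share the scores' ranks:

        import numpy as np
        from scipy.stats import qmc, norm

        def restricted_pairing(n, C_target, seed=0):
            d = C_target.shape[0]
            X = qmc.LatinHypercube(d=d, seed=seed).random(n)  # uniform LHS marginals
            # van der Waerden scores, randomly permuted per column
            scores = norm.ppf(np.arange(1, n + 1) / (n + 1))
            rng = np.random.default_rng(seed)
            S = np.column_stack([rng.permutation(scores) for _ in range(d)])
            # rotate the scores so their sample correlation matches the target
            L_s = np.linalg.cholesky(np.corrcoef(S, rowvar=False))
            L_t = np.linalg.cholesky(C_target)
            T = S @ np.linalg.inv(L_s).T @ L_t.T
            # reorder each LHS column to have the same ranks as the scored column
            out = np.empty_like(X)
            for j in range(d):
                out[np.argsort(T[:, j]), j] = np.sort(X[:, j])
            return out

        C = np.array([[1.0, 0.6], [0.6, 1.0]])
        X = restricted_pairing(200, C)   # LHS marginals, ~0.6 rank correlation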

  2. Capturing spatial heterogeneity of soil organic carbon under changing climate

    NASA Astrophysics Data System (ADS)

    Mishra, U.; Fan, Z.; Jastrow, J. D.; Matamala, R.; Vitharana, U.

    2015-12-01

    The spatial heterogeneity of the land surface affects water, energy, and greenhouse gas exchanges with the atmosphere. Designing observation networks that capture land surface spatial heterogeneity is a critical scientific challenge. Here, we present a geospatial approach to capture the existing spatial heterogeneity of soil organic carbon (SOC) stocks across Alaska, USA. We used the standard deviation of 556 georeferenced SOC profiles previously compiled in Mishra and Riley (2015, Biogeosciences, 12:3993-4004) to calculate the number of observations that would be needed to reliably estimate Alaskan SOC stocks. This analysis indicated that 906 randomly distributed observation sites would be needed to quantify the mean value of SOC stocks across Alaska at a confidence interval of ±5 kg m-2. We then used soil-forming factors (climate, topography, land cover types, surficial geology) to identify the locations of appropriately distributed observation sites by using the conditioned Latin hypercube sampling approach. Spatial correlation and variogram analyses demonstrated that the spatial structures of soil-forming factors were adequately represented by these 906 sites. Using the spatial correlation length of existing SOC observations, we identified 484 new observation sites that would be needed to provide the best estimate of the present status of SOC stocks in Alaska. We then used average decadal projections (2020-2099) of precipitation, temperature, and length of growing season for three representative concentration pathway (RCP 4.5, 6.0, and 8.5) scenarios of the Intergovernmental Panel on Climate Change to investigate whether the locations of the identified observation sites will shift under future climate. Our results showed that 12-41 additional observation sites (depending on emission scenarios) will be required to capture the impact of projected climatic conditions by 2100 on the spatial heterogeneity of Alaskan SOC stocks. Our results represent an ideal distribution of observation sites across Alaska that captures the land surface spatial heterogeneity and can be used in efforts to quantify SOC stocks, monitor greenhouse gas emissions, and benchmark Earth System Model results.
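
    The 906-site figure follows from the standard sample-size formula for estimating a mean, n = (z·sd/E)^2. The standard deviation below is a hypothetical value back-solved to reproduce the published number, not the study's measured statistic:

        # Sample size for estimating a mean to within +/- E at 95% confidence.
        from scipy.stats import norm
        z = norm.ppf(0.975)     # two-sided 95% confidence
        sd = 76.8               # hypothetical sd of SOC stocks (kg m-2)
        E = 5.0                 # half-width of the confidence interval
        n = (z * sd / E) ** 2
        print(round(n))         # ~906 with this sd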

  3. Multiphase complete exchange on a circuit switched hypercube

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1991-01-01

    On a distributed memory parallel computer, the complete exchange (all-to-all personalized) communication pattern requires each of n processors to send a different block of data to each of the remaining n - 1 processors. This pattern is at the heart of many important algorithms, most notably the matrix transpose. For a circuit switched hypercube of dimension d (n = 2^d), two algorithms for achieving complete exchange are known. These are (1) the Standard Exchange approach that employs d transmissions of 2^(d-1) blocks each and is useful for small block sizes, and (2) the Optimal Circuit Switched algorithm that employs 2^d - 1 transmissions of 1 block each and is best for large block sizes. A unified multiphase algorithm is described that includes these two algorithms as special cases. The complete exchange on a hypercube of dimension d and block size m is achieved by carrying out k partial exchanges on subcubes of dimension d_i, where d_1 + d_2 + ... + d_k = d, with effective block size m_i = m*2^(d-d_i). When k = d and all d_i = 1, this corresponds to algorithm (1) above. For the case of k = 1 and d_1 = d, this becomes the circuit switched algorithm (2). Varying the subcube dimensions d_i changes the effective block size and permits a compromise between the data permutation and block transmission overhead of (1) and the startup overhead of (2). For a hypercube of dimension d, the number of possible combinations of subcubes is p(d), the number of partitions of the integer d. This is an exponential but very slowly growing function, and it is feasible to search over these partitions to discover the best combination for a given message size. The approach was analyzed for, and implemented on, the Intel iPSC-860 circuit switched hypercube. Measurements show good agreement with predictions and demonstrate that the multiphase approach can substantially improve performance for block sizes in the 0 to 160 byte range. This range, which corresponds to 0 to 40 floating point numbers per processor, is commonly encountered in practical numeric applications. The multiphase technique is applicable to all circuit-switched hypercubes that use the common e-cube routing strategy.
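
    Searching over the p(d) subcube combinations requires enumerating the integer partitions of d; a small sketch of that enumeration (each partition is one candidate multiphase schedule):

        # Yield the integer partitions of d in non-increasing order.
        def partitions(d, max_part=None):
            max_part = max_part or d
            if d == 0:
                yield ()
                return
            for p in range(min(d, max_part), 0, -1):
                for rest in partitions(d - p, p):
                    yield (p,) + rest

        print(list(partitions(4)))  # [(4,), (3,1), (2,2), (2,1,1), (1,1,1,1)]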

  4. Revolution or evolution? An analysis of E-health innovation and impact using a hypercube model.

    PubMed

    Wu, Jen-Her; Huang, An-Sheng; Hisa, Tzyh-Lih; Tsai, Hsien-Tang

    2006-01-01

    This study utilises a hypercube innovation model to analyse the changes in both healthcare informatics and medical related delivery models based on the innovations from Tele-healthcare, electronic healthcare (E-healthcare), to mobile healthcare (M-healthcare). Further, the critical impacts of these E-health innovations on the stakeholders: healthcare customers, hospitals, healthcare complementary providers and healthcare regulators are identified. Thereafter, the critical capabilities for adopting each innovation are discussed.

  5. Modeling bronchial circulation with application to soluble gas exchange: description and sensitivity analysis.

    PubMed

    Bui, T D; Dabdub, D; George, S C

    1998-06-01

    The steady-state exchange of inert gases across an in situ canine trachea has recently been shown to be limited equally by diffusion and perfusion over a wide range (0.01-350) of blood solubilities (beta_blood; ml·ml^-1·atm^-1). Hence, we hypothesize that the exchange of ethanol (beta_blood = 1,756 at 37 degrees C) in the airways depends on the blood flow rate from the bronchial circulation. To test this hypothesis, the dynamics of the bronchial circulation were incorporated into an existing model that describes the simultaneous exchange of heat, water, and a soluble gas in the airways. A detailed sensitivity analysis of key model parameters was performed by using the method of Latin hypercube sampling. The model accurately predicted a previously reported experimental exhalation profile of ethanol (R^2 = 0.991) as well as the end-exhalation airstream temperature (34.6 degrees C). The model predicts that 27, 29, and 44% of exhaled ethanol in a single exhalation are derived from the tissues of the mucosa and submucosa, the bronchial circulation, and the tissue exterior to the submucosa (which would include the pulmonary circulation), respectively. Although the concentration of ethanol in the bronchial capillary decreased during inspiration, the three key model outputs (end-exhaled ethanol concentration, the slope of phase III, and end-exhaled temperature) were all statistically insensitive (P > 0.05) to the parameters describing the bronchial circulation. In contrast, the model outputs were all sensitive (P < 0.05) to the thickness of tissue separating the core body conditions from the bronchial smooth muscle. We conclude that both the bronchial circulation and the pulmonary circulation impact soluble gas exchange when the entire conducting airway tree is considered.

  6. Development of a decision support tool for seasonal water supply management incorporating system uncertainties and operational constraints

    NASA Astrophysics Data System (ADS)

    Wang, H.; Asefa, T.

    2017-12-01

    A real-time decision support tool (DST) for a water supply system must consider system uncertainties, e.g., uncertain streamflow and demand, as well as operational constraints and infrastructure outages (e.g., pump station shutdown, an offline reservoir due to maintenance). Such a DST is often used by water managers for resource allocation and delivery to customers. Although most seasonal DSTs used by water managers recognize those system uncertainties and operational constraints, most use only historical information or assume a deterministic outlook of the water supply system. This study presents a seasonal DST that incorporates rainfall/streamflow uncertainties, a seasonal demand outlook, and system operational constraints. Large-scale climate information is captured through a rainfall simulator driven by a Bayesian non-homogeneous Markov Chain Monte Carlo model that allows non-stationary transition probabilities contingent on the Nino 3.4 index. An ad-hoc seasonal demand forecasting model considers weather conditions explicitly and socio-economic factors implicitly. Latin hypercube sampling is employed to effectively sample the probability density functions of flow and demand. Seasonal system operation is modelled as a mixed-integer optimization problem that aims at minimizing operational costs. It embeds the flexibility of modifying operational rules at different components, e.g., surface water treatment plants, desalination facilities, and groundwater pumping stations. The proposed framework is illustrated at a wholesale water supplier in the Southeastern United States, Tampa Bay Water. The use of the tool is demonstrated in providing operational guidance during a typical drawdown and refill cycle of a regional reservoir. The DST provided: 1) a probabilistic outlook of reservoir storage and the chance of a successful refill by the end of the rainy season; and 2) operational expectations for large infrastructure (e.g., high service pumps and booster stations) throughout the season. Other potential uses of such a DST are also discussed.

  7. Landscape patterns and soil organic carbon stocks in agricultural bocage landscapes

    NASA Astrophysics Data System (ADS)

    Viaud, Valérie; Lacoste, Marine; Michot, Didier; Walter, Christian

    2014-05-01

    Soil organic carbon (SOC) has a crucial impact on global carbon storage. SOC spatial variability is controlled by the landscape patterns resulting from the continuous interactions between the physical environment and society. Natural and anthropogenic processes occurring and interplaying at the landscape scale, such as soil redistribution in the lateral and vertical dimensions by tillage and water erosion or the spatial differentiation of land use and land management practices, strongly affect SOC dynamics. Inventories of SOC stocks, reflecting their spatial distribution, are thus key elements in developing relevant management strategies for improving carbon sequestration and mitigating climate change and soil degradation. This study aims to quantify SOC stocks and their spatial distribution in a 1,000-ha agricultural bocage landscape with dairy production as the dominant farming system (Zone Atelier Armorique, LTER Europe, NW France). The site is characterized by high heterogeneity over short distances due to a high diversity of soils with varying waterlogging, soil parent material, topography, land use and hedgerow density. SOC content and stocks were measured down to 105-cm depth at 200 sampling locations selected using conditioned Latin hypercube sampling. Additional sampling was designed to specifically explore SOC distribution near hedges: 112 points were sampled at fixed distances on 14 transects perpendicular to hedges. We illustrate the heterogeneity of the spatial and vertical distribution of SOC stocks at the landscape scale, and quantify SOC stocks in the various landscape components. Using multivariate statistics, we discuss the variability and co-variability of the existing spatial organization of cropping systems, environmental factors, and SOC stocks across the landscape. Ultimately, our results may contribute to improving regional or national digital soil mapping approaches, by considering the distribution of SOC stocks within each modeling unit and by accounting for the impact of sensitive ecosystems.

  8. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).

  9. Estimating the Non-Monetary Burden of Neurocysticercosis in Mexico

    PubMed Central

    Bhattarai, Rachana; Budke, Christine M.; Carabin, Hélène; Proaño, Jefferson V.; Flores-Rivera, Jose; Corona, Teresa; Ivanek, Renata; Snowden, Karen F.; Flisser, Ana

    2012-01-01

    Background Neurocysticercosis (NCC) is a major public health problem in many developing countries where health education, sanitation, and meat inspection infrastructure are insufficient. The condition occurs when humans ingest eggs of the pork tapeworm Taenia solium, which then develop into larvae in the central nervous system. Although NCC is endemic in many areas of the world and is associated with considerable socio-economic losses, the burden of NCC remains largely unknown. This study provides the first estimate of disability adjusted life years (DALYs) associated with NCC in Mexico. Methods DALYs lost for symptomatic cases of NCC in Mexico were estimated by incorporating morbidity and mortality due to NCC-associated epilepsy, and morbidity due to NCC-associated severe chronic headaches. Latin hypercube sampling methods were employed to sample the distributions of uncertain parameters and to estimate 95% credible regions (95% CRs). Findings In Mexico, 144,433 and 98,520 individuals are estimated to suffer from NCC-associated epilepsy and NCC-associated severe chronic headaches, respectively. A total of 25,341 (95% CR: 12,569–46,640) DALYs were estimated to be lost due to these clinical manifestations, with 0.25 (95% CR: 0.12–0.46) DALY lost per 1,000 person-years of which 90% was due to NCC-associated epilepsy. Conclusion This is the first estimate of DALYs associated with NCC in Mexico. However, this value is likely to be underestimated since only the clinical manifestations of epilepsy and severe chronic headaches were included. In addition, due to limited country specific data, some parameters used in the analysis were based on systematic reviews of the literature or primary research from other geographic locations. Even with these limitations, our estimates suggest that healthy years of life are being lost due to NCC in Mexico. PMID:22363827

  10. Changing patterns of migration in Latin America: how can research develop intelligence for public health?

    PubMed

    Cabieses, Baltica; Tunstall, Helena; Pickett, Kate E; Gideon, Jasmine

    2013-07-01

    Migration patterns in Latin America have changed significantly in recent decades, particularly since the onset of global recession in 2007. These recent economic changes have highlighted and exacerbated the weakness of evidence from Latin America regarding migration-a crucial determinant of health. Migration patterns are constantly evolving in Latin America, but research on migration has not developed at the same speed. This article focuses on the need for better understanding of the living conditions and health of migrant populations in Latin America within the context of the recent global recession. The authors explain how new data on migrant well-being could be obtained through improved evidence from censuses and ongoing research surveys to 1) better inform policy-makers about the needs of migrant populations in Latin America and 2) help determine better ways of reaching undocumented immigrants. Longitudinal studies on immigrants in Latin America are essential for generating a better representation of migrant living conditions and health needs during the initial stages of immigration and over time. To help meet this need, the authors support the promotion of sustainable sources of data and evidence on the complex relationship between migration and health.

  11. Implementation and Performance Analysis of Parallel Assignment Algorithms on a Hypercube Computer.

    DTIC Science & Technology

    1987-12-01

    coupled processors because of the degree of interaction between processors imposed by the global memory [HwB84]. Another sub-class of MIMD... interaction between the individual processors [MuA87]. Many of the commercial MIMD computers available today are loosely coupled [HwB84]. 2.1.3 The Hypercube... Alpha-beta is a method usually employed in the solution of two-person zero-sum games like chess and checkers [Qui87]. The basic approach of the alpha

  12. Living conditions and health. A population-based study of labour migrants and Latin American refugees in Sweden and those who were repatriated.

    PubMed

    Sundquist, J

    1995-06-01

    To examine whether there are differences in living conditions and self-rated health between South European labour migrants, Latin American refugees, and those who were repatriated to Latin America. Analysis of data from a survey (face-to-face interviews) in 1991 of 338 Latin American refugees and 60 repatriated refugees. A random sample of 161 South European and 396 Finnish labour migrants from the Swedish Annual Level-of-Living Surveys 1980-1981 and 1988-89 was analysed. A random sample of 1,159 age-, sex- and education-matched Swedes served as controls. Lund, a medium-sized town in southern Sweden, and Santiago and Montevideo, capitals of Chile and Uruguay, respectively. Labour migrants and refugees in particular lived in rented flats, while Swedes lived in privately-owned one-family homes. All immigrants, and in particular repatriated Latin Americans, had a low material standard and meagre economic resources compared with Swedes. Being a Latin American refugee or a South European or Finnish labour migrant was an independent risk indicator of self-rated poor health in logistic regression (multivariate analyses). Not feeling secure in everyday life and poor leisure opportunities were independent risk factors for poor health, with estimated odds ratios of 3.13 (2.09-4.45) and 1.57 (1.22-2.00), respectively. This study shows a clear ethnic segregation in housing and other living conditions between Swedes and immigrants, with Latin American refugees and repatriated Latin Americans the most vulnerable. All immigrants reported poorer self-rated health than Swedes. Being an immigrant was a risk factor for poor health of equal importance to more traditional risk factors such as lifestyle factors.

  13. A World of Hurt: Latin America

    ERIC Educational Resources Information Center

    Ramaswami, Rama

    2009-01-01

    Massive socioeconomic problems have left Latin American education in a dire condition, and decades behind the rest of the globe in integrating technology into teaching and learning. But a few spots in the region offer signs of hope. In this article, the author describes several efforts at tech-based educational reform in Latin America.

  14. Latin American Youth in a Time of Change and Crisis.

    ERIC Educational Resources Information Center

    United Nations Economic and Social Council, New York, NY.

    This 4-part study reaffirms the concepts of a previous study entitled "Situation and Prospects of Youth in Latin America" and approved in 1983, and on the basis of new knowledge explores more deeply national situations and their diversity. It offers new conceptual and theoretical contributions on the condition of youth in Latin America…

  15. Reducing Latin America’s Bumper Crop: Babies.

    DTIC Science & Technology

    birth control measures. Data was gathered using a literature search which relied heavily on periodicals and materials written as a result of on-site research in Latin America by the American Universities Field Staff. The high birth rate in Latin America is caused primarily neither by the prohibitions of the Catholic Church against artificial contraceptive methods nor by cultural attitudes towards large families. High birth rates are caused primarily by poverty, illiteracy, and underdevelopment. Where socioeconomic conditions have improved in Latin America birth rates have

  16. Treatment of Mentally Ill Offenders in Nine Developing Latin American Countries.

    PubMed

    Almanzar, Santiago; Katz, Craig L; Harry, Bruce

    2015-09-01

    The prevalence of psychiatric conditions among prisoners in Latin America is greatly underestimated, and because of the lack of awareness about mental illness among service providers in Latin American prisons, oftentimes these conditions go unrecognized or are not treated properly. In the worst-case scenarios, human rights violations occur. Despite the high levels of need, many prisoners have not received adequate or timely treatment. The sparse existing literature documents prison conditions throughout Latin American countries, ranging from poor to extremely harsh, overcrowded, and life threatening. Most prison systems do not meet international prison standards. The information on forensic mental health services and the treatment of offenders with mental illness have been less extensively studied and compared with forensic practices in developed American nations. This study analyzes the existing literature on forensic psychiatry, focusing on nine socioeconomically developing nations in Latin America, to improve understanding of treatment approaches for offenders with mental illness and identify emerging themes. A review was conducted and data were included in regression analyses to investigate information relative to the treatment of offenders with mental illness and its interaction with the mental health system. © 2015 American Academy of Psychiatry and the Law.

  17. Uncertainty and sensitivity analysis of the basic reproduction number of diphtheria: a case study of a Rohingya refugee camp in Bangladesh, November–December 2017

    PubMed Central

    Matsuyama, Ryota; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya

    2018-01-01

    Background A Rohingya refugee camp in Cox’s Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. Methods A renewal process model was devised to estimate the R0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. Results R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Discussion Estimated R0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified. PMID:29629244
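
    The LHS uncertainty scan can be sketched as follows; the relationship between the immune fraction and R0 below is a hypothetical placeholder for the paper's renewal-process estimate:

        import numpy as np
        from scipy.stats import qmc

        def estimate_r0(immune_frac):    # hypothetical fitted relationship
            return 3.0 / (1.0 - immune_frac)

        # Latin hypercube draw of the initially immune fraction over [0, 0.8]
        f = qmc.scale(qmc.LatinHypercube(d=1, seed=2).random(1000),
                      [0.0], [0.8]).ravel()
        r0 = np.array([estimate_r0(x) for x in f])
        print(np.percentile(r0, [2.5, 50, 97.5]))   # uncertainty summary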

  18. Optimization on the impeller of a low-specific-speed centrifugal pump for hydraulic performance improvement

    NASA Astrophysics Data System (ADS)

    Pei, Ji; Wang, Wenjie; Yuan, Shouqi; Zhang, Jinfeng

    2016-09-01

    In order to widen the high-efficiency operating range of a low-specific-speed centrifugal pump, an optimization process considering efficiencies under 1.0Q_d and 1.4Q_d is proposed. Three parameters, namely, the blade outlet width b_2, blade outlet angle β_2, and blade wrap angle φ, are selected as design variables. Impellers are generated using the optimal Latin hypercube sampling method. The pump efficiencies are calculated using the software CFX 14.5 at two operating points selected as objectives. Surrogate models are also constructed to analyze the relationship between the objectives and the design variables. Finally, the particle swarm optimization algorithm is applied to the surrogate model to determine the best combination of the impeller parameters. The results show that the performance curve predicted by numerical simulation is in good agreement with the experimental results. Compared with the efficiencies of the original impeller, the hydraulic efficiencies of the optimized impeller are increased by 4.18% and 0.62% under 1.0Q_d and 1.4Q_d, respectively. The comparison of the inner flow between the original pump and the optimized one illustrates the improvement in performance. The optimization process can provide a useful reference for performance improvement of other pumps, and even for the reduction of pressure fluctuations.
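
    The surrogate step can be sketched with a radial basis function model fitted to efficiencies sampled on a Latin hypercube; the efficiency function, bounds, and sample size below are illustrative stand-ins for the CFX runs:

        import numpy as np
        from scipy.stats import qmc
        from scipy.interpolate import RBFInterpolator

        def cfd_efficiency(x):           # hypothetical placeholder for a CFD run
            b2, beta2, phi = x
            return (0.8 - 0.01 * (b2 - 14)**2 - 0.002 * (beta2 - 25)**2
                    - 0.0005 * (phi - 120)**2)

        # LHS design over (b2, beta2, phi), then fit the surrogate
        X = qmc.scale(qmc.LatinHypercube(d=3, seed=3).random(40),
                      [10, 15, 90], [18, 35, 150])
        y = np.array([cfd_efficiency(x) for x in X])
        surrogate = RBFInterpolator(X, y)

        # cheap query in place of a new CFD run
        print(surrogate(np.array([[14.0, 25.0, 120.0]])))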

  19. Geometry control of long-span continuous girder concrete bridge during construction through finite element model updating

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Yan, Quan-sheng; Li, Jian; Hu, Min-yi

    2016-04-01

    In bridge construction, geometry control is critical to ensure that the final constructed bridge has a shape consistent with the design. A common method is to predict the deflections of the bridge during each construction phase through the associated finite element models, so that the cambers of the bridge during different construction phases can be determined beforehand. These finite element models are mostly based on the design drawings and nominal material properties. However, the error in these bridge models can be large due to significant uncertainties in the actual properties of the materials used in construction. Therefore, the predicted cambers may not be accurate enough to ensure agreement of the bridge geometry with the design, especially for long-span bridges. In this paper, an improved geometry control method is described, which incorporates finite element (FE) model updating during the construction process based on measured bridge deflections. A method based on the Kriging model and Latin hypercube sampling is proposed to perform the FE model updating due to its simplicity and efficiency. The proposed method has been applied to a long-span continuous girder concrete bridge during its construction. Results show that the method is effective in reducing construction error and ensuring the accuracy of the geometry of the final constructed bridge.

  20. Uncertainty and sensitivity analysis of the basic reproduction number of diphtheria: a case study of a Rohingya refugee camp in Bangladesh, November-December 2017.

    PubMed

    Matsuyama, Ryota; Akhmetzhanov, Andrei R; Endo, Akira; Lee, Hyojung; Yamaguchi, Takayuki; Tsuzuki, Shinya; Nishiura, Hiroshi

    2018-01-01

    A Rohingya refugee camp in Cox's Bazar, Bangladesh experienced a large-scale diphtheria epidemic in 2017. The background information of previously immune fraction among refugees cannot be explicitly estimated, and thus we conducted an uncertainty analysis of the basic reproduction number, R0. A renewal process model was devised to estimate the R0 and ascertainment rate of cases, and loss of susceptible individuals was modeled as one minus the sum of initially immune fraction and the fraction naturally infected during the epidemic. To account for the uncertainty of initially immune fraction, we employed a Latin Hypercube sampling (LHS) method. R0 ranged from 4.7 to 14.8 with the median estimate at 7.2. R0 was positively correlated with ascertainment rates. Sensitivity analysis indicated that R0 would become smaller with greater variance of the generation time. Estimated R0 was broadly consistent with published estimate from endemic data, indicating that the vaccination coverage of 86% has to be satisfied to prevent the epidemic by means of mass vaccination. LHS was particularly useful in the setting of a refugee camp in which the background health status is poorly quantified.

  1. A smart Monte Carlo procedure for production costing and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, C.; Stremel, J.

    1996-11-01

    Electric utilities using chronological production costing models to decide whether to buy or sell power over the next week or next few weeks need to determine potential profits or losses under a number of uncertainties. A large amount of money can be at stake--often $100,000 a day or more--and one party to the sale must always take on the risk. In the case of fixed price ($/MWh) contracts, the seller accepts the risk. In the case of cost plus contracts, the buyer must accept the risk. So, modeling uncertainty and understanding the risk accurately can improve the competitive edge of the user. This paper investigates an efficient procedure for representing risks and costs from capacity outages. Typically, production costing models use an algorithm based on some form of random number generator to select resources as available or on outage. These algorithms allow experiments to be repeated and gains and losses to be observed in a short time. The authors perform several experiments to examine the capabilities of three unit outage selection methods and measure their results. Specifically, a brute force Monte Carlo procedure, a Monte Carlo procedure with Latin Hypercube sampling, and a Smart Monte Carlo procedure with cost stratification and directed sampling are examined.

  2. Estimated Under-Five Deaths Associated with Poor-Quality Antimalarials in Sub-Saharan Africa

    PubMed Central

    Renschler, John P.; Walters, Kelsey M.; Newton, Paul N.; Laxminarayan, Ramanan

    2015-01-01

    Many antimalarials sold in sub-Saharan Africa are poor-quality (falsified, substandard, or degraded), and the burden of disease caused by this problem is inadequately quantified. In this article, we estimate the number of under-five deaths caused by ineffective treatment of malaria associated with consumption of poor-quality antimalarials in 39 sub-Saharan countries. Using Latin hypercube sampling, our estimates were calculated as the product of: the number of private sector antimalarials consumed by malaria-positive children in 2013; the proportion of private sector antimalarials consumed that were of poor quality; and the case fatality rate (CFR) of under-five malaria-positive children who did not receive appropriate treatment. An estimated 122,350 (interquartile range [IQR]: 91,577–154,736) under-five malaria deaths were associated with consumption of poor-quality antimalarials, representing 3.75% (IQR: 2.81–4.75%) of all under-five deaths in our sample of 39 countries. There is considerable uncertainty surrounding our results because of gaps in data on case fatality rates and prevalence of poor-quality antimalarials. Our analysis highlights the need for further investigation into the distribution of poor-quality antimalarials and the need for stronger surveillance and regulatory efforts to prevent the sale of poor-quality antimalarials. PMID:25897068
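
    The product-of-factors estimate with LHS-propagated uncertainty can be sketched as follows; all three input ranges are illustrative, not the study's data:

        import numpy as np
        from scipy.stats import qmc

        # one LHS column per uncertain factor
        U = qmc.LatinHypercube(d=3, seed=4).random(10000)
        consumed = qmc.scale(U[:, [0]], [20e6], [40e6]).ravel()  # doses taken
        poor_q = qmc.scale(U[:, [1]], [0.05], [0.25]).ravel()    # poor-quality share
        cfr = qmc.scale(U[:, [2]], [0.001], [0.01]).ravel()      # case fatality rate

        deaths = consumed * poor_q * cfr
        print(np.percentile(deaths, [25, 50, 75]))               # IQR-style summary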

  3. Data Analysis Approaches for the Risk-Informed Safety Margins Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Alfonsi, Andrea; Maljovec, Daniel P.

    2016-09-01

    In the past decades, several numerical simulation codes have been employed to simulate accident dynamics (e.g., RELAP5-3D, RELAP-7, MELCOR, MAAP). In order to evaluate the impact of uncertainties on accident dynamics, several stochastic methodologies have been coupled with these codes, ranging from classical Monte Carlo and Latin hypercube sampling to stochastic polynomial methods. Similar approaches have been introduced into the risk and safety community, where stochastic methods (such as RAVEN, ADAPT, MCDET, ADS) have been coupled with safety analysis codes in order to evaluate the safety impact of the timing and sequencing of events. These approaches are usually called dynamic PRA or simulation-based PRA methods. Both the uncertainty and the safety methods usually generate a large number of simulation runs (database storage may be on the order of gigabytes or higher). The scope of this paper is to present a broad overview of methods and algorithms that can be used to analyze and extract information from large data sets containing time-dependent data. In this context, "extracting information" means constructing input-output correlations, finding commonalities, and identifying outliers. Some of the algorithms presented here have been developed or are under development within the RAVEN statistical framework.

  4. Identification of robust adaptation gene regulatory network parameters using an improved particle swarm optimization algorithm.

    PubMed

    Huang, X N; Ren, H P

    2016-05-13

    Robust adaptation is a critical ability of a gene regulatory network (GRN) to survive in a fluctuating environment; it means that the system responds to an input stimulus rapidly and then returns to its pre-stimulus steady state in a timely fashion. In this paper, the GRN is modeled using the Michaelis-Menten rate equations, which are highly nonlinear differential equations containing 12 undetermined parameters. Robust adaptation is quantitatively described by two conflicting indices. Identifying the parameter sets that confer robust adaptation on the GRN is a multi-variable, multi-objective, multi-peak optimization problem for which it is difficult to acquire satisfactory solutions, especially high-quality ones. A new best-neighbor particle swarm optimization algorithm is proposed to perform this task. The proposed algorithm employs a Latin hypercube sampling method to generate the initial population, and also uses a particle crossover operation and an elitist preservation strategy. The simulation results revealed that the proposed algorithm could identify multiple solutions in a single run. Moreover, it demonstrated superior performance compared to previous methods in the sense of detecting more high-quality solutions within an acceptable time. The proposed methodology, owing to its universality and simplicity, is useful for guiding the design of GRNs with superior robust adaptation.
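
    The LHS initialization step can be sketched as below; the 12-parameter bounds, the log-scaling, and the objective are hypothetical stand-ins for the Michaelis-Menten model and the two adaptation indices.

```python
import numpy as np
from scipy.stats import qmc

n_particles, n_params = 40, 12
lo, hi = np.full(n_params, 1e-3), np.full(n_params, 1e3)  # hypothetical parameter bounds

# Space-filling initial swarm: LHS in [0,1]^12 scaled to the search box
# (log-scaling is often preferable for rate constants spanning decades).
u = qmc.LatinHypercube(d=n_params, seed=7).random(n_particles)
positions = np.exp(qmc.scale(u, np.log(lo), np.log(hi)))
velocities = np.zeros_like(positions)

def objective(theta):
    # Placeholder for the two conflicting adaptation indices.
    return np.sum((np.log(theta) - 1.0) ** 2)

scores = np.array([objective(p) for p in positions])
print("best initial particle (first 3 params):", positions[scores.argmin()][:3])
```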

  5. Efficient stochastic approaches for sensitivity studies of an Eulerian large-scale air pollution model

    NASA Astrophysics Data System (ADS)

    Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.

    2017-10-01

    Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers, and sensitivity analysis of model outputs to variation or natural uncertainties of model inputs is crucial for improving that reliability. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been carried out, with a comparison against Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model, and the sensitivity study has been done for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration, and especially for computing sensitivity indices that are small in value. This is a crucial point, since even small indices may need to be estimated accurately in order to achieve a more accurate attribution of input influences and a more reliable interpretation of the model results.
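
    A small numerical illustration of the comparison: plain Monte Carlo, LHS, and a scrambled Sobol sequence each estimate the integral of the Sobol' g-function (exact value 1) over [0,1]^6. The integrand is a standard test function chosen for illustration, not the Eulerian model.

```python
import numpy as np
from scipy.stats import qmc

d, n = 6, 1024
a = np.array([0.0, 0.5, 3.0, 9.0, 99.0, 99.0])  # g-function coefficients

def g(x):
    # Sobol' g-function; its exact integral over the unit cube is 1.
    return np.prod((np.abs(4 * x - 2) + a) / (1 + a), axis=1)

rng = np.random.default_rng(0)
mc = g(rng.random((n, d))).mean()
lhs = g(qmc.LatinHypercube(d=d, seed=0).random(n)).mean()
sob = g(qmc.Sobol(d=d, scramble=True, seed=0).random_base2(10)).mean()  # 2**10 points

for name, est in [("plain MC", mc), ("LHS", lhs), ("Sobol", sob)]:
    print(f"{name:8s} estimate {est:.4f}  |error| {abs(est - 1):.4f}")
```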

  6. Assessment of the transportation route of oversize and excessive loads in relation to the load-bearing capacity of existing bridges

    NASA Astrophysics Data System (ADS)

    Doležel, Jiří; Novák, Drahomír; Petrů, Jan

    2017-09-01

    Transportation routes of oversize and excessive loads are currently planned so as to ensure the transit of a vehicle through critical points on the road, such as level intersections and bridges. This article presents a comprehensive procedure for determining the reliability and load-bearing capacity of existing bridges on highways and roads, using advanced reliability analysis based on Monte Carlo-type simulation techniques combined with nonlinear finite element analysis. The safety index, as described in current structural design standards such as ISO and the Eurocodes, is taken as the main criterion of the reliability level of existing structures. As an example, the load-bearing capacity of a single-span slab bridge made of precast prestressed concrete girders, currently 60 years old, is determined for the ultimate and serviceability limit states. The structure's design load capacity was estimated by fully probabilistic nonlinear finite element analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the levels estimated by deterministic methods for a critical section of the most heavily loaded girders.

  7. Uncertainty quantification analysis of the dynamics of an electrostatically actuated microelectromechanical switch model

    NASA Astrophysics Data System (ADS)

    Snow, Michael G.; Bajaj, Anil K.

    2015-08-01

    This work presents an uncertainty quantification (UQ) analysis of a comprehensive model for an electrostatically actuated microelectromechanical system (MEMS) switch. The goal is to elucidate the effects of parameter variations on certain key performance characteristics of the switch. A sufficiently detailed model of the electrostatically actuated switch in the basic configuration of a clamped-clamped beam is developed. This multi-physics model accounts for various physical effects, including the electrostatic fringing field, the finite length of the electrodes, squeeze-film damping, and contact between the beam and the dielectric layer. The performance characteristics of immediate interest are the static and dynamic pull-in voltages for the switch, and numerical approaches for evaluating them are developed and described. Using Latin hypercube sampling and other sampling methods, the model is evaluated to find these performance characteristics when variability in the model's geometric and physical parameters is specified. Response surfaces of these results are constructed via the Multivariate Adaptive Regression Splines (MARS) technique. Applying a Direct Simulation Monte Carlo (DSMC) technique to these response surfaces gives smooth probability density functions (PDFs) of the output characteristics when the input probability characteristics are specified. The relative variation in the two pull-in voltages due to each of the input parameters is used to determine the critical parameters.

  8. Challenges for Scientists in Latin America.

    PubMed

    Kalergis, Alexis M; Lacerda, Marcus; Rabinovich, Gabriel A; Rosenstein, Yvonne

    2016-09-01

    Despite political turmoil and economic crises, research in Latin America has advanced considerably over recent decades. The present 'Point of View' outlines our perspectives on the working conditions, successes, difficulties, limitations, and challenges of biomedical scientific communities in four Latin American countries: Argentina (G.A.R.), Brazil (M.L.), Chile (A.K.), and Mexico (Y.R.). Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. [A framework for evaluating primary health care in Latin America].

    PubMed

    Haggerty, Jeannie L; Yavich, Natalia; Báscolo, Ernesto Pablo

    2009-11-01

    To determine the relevancy of applying the Canadian primary health care (PHC) assessment strategy to Latin America and to propose any modifications that might be needed for reaching a consensus in Latin America. The Delphi method was used to reach a consensus among 29 experts engaged in PHC development or evaluation in Latin America. Four virtual sessions and a face-to-face meeting were held to discuss the PHC evaluation logic model and the seven goals and six conditioning factors that make up the Canadian strategy, as well as any questions regarding the evaluation and indicators. The relevance of each concept was ranked according to the perspective of the Latin American countries. The experts considered the Canadian strategy's objectives and conditioning factors to be highly relevant to assessing PHC in Latin America, though they acknowledged that additional modification would increase relevance. The chief suggestions were to create a PHC vision and mission, to include additional objectives and conditioning factors, and to rework the original set. The objectives that concerned coordination and integrated comprehensive care did not achieve a high degree of consensus because of ambiguities in the original text and multiple interpretations of statements regarding certain aspects of the evaluation. Considerable progress was made on the road to building a PHC evaluation framework for the Region of the Americas. Indicators and information-gathering tools, which can be appropriately and practically applied in diverse contexts, need to be developed.

  10. Energies and radial distributions of Bs mesons - the effect of hypercubic blocking

    NASA Astrophysics Data System (ADS)

    Koponen, Jonna

    2006-12-01

    This is a follow-up to our earlier work on the energies and the charge (vector) and matter (scalar) distributions of S-wave states in a heavy-light meson, where the heavy quark is static and the light quark has a mass about that of the strange quark. We study the radial distributions of higher angular momentum states, namely P- and D-wave states, using a "fuzzy" static quark. A new improvement is the use of hypercubic blocking in the time direction, which effectively constrains the heavy quark to move within a 2a hypercube (a is the lattice spacing). The calculation is carried out with dynamical fermions on a 16³ × 32 lattice with a ≈ 0.10 fm, generated using the non-perturbatively improved clover action. The configurations were generated by the UKQCD Collaboration using lattice action parameters β = 5.2, c_SW = 2.0171 and κ = 0.1350. In nature the closest equivalent of this heavy-light system is the Bs meson. Attempts are now being made to understand these results in terms of the Dirac equation.

  11. Physically motivated correlation formalism in hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Roy, Ankita; Rafert, J. Bruce

    2004-05-01

    Most remote sensing data sets contain a limited number of independent spatial and spectral measurements, beyond which no effective increase in information is achieved. This paper presents a Physically Motivated Correlation Formalism (PMCF), which places both spatial and spectral data on an equivalent mathematical footing in the context of a specific kernel, such that optimal combinations of independent data can be selected from the entire hypercube via the method of "correlation moments". We present an experimental and computational analysis of hyperspectral data sets from the Michigan Tech VFTHSI [Visible Fourier Transform Hyperspectral Imager], which is based on a Sagnac interferometer adjusted to obtain high SNR levels. The captured signal interferograms of different targets - aerial snaps of Houghton and lab-based data (white light, He-Ne laser, and discharge tube sources), with provision for customized scans of targets at the same exposures - are processed using inverse imaging transformations and filtering techniques to obtain spectral profiles and to generate hypercubes from which spectral, spatial, and cross moments are computed. PMCF answers the question of how the entire hypercube should be sampled optimally and finds how many spatial-spectral pixels are required for a particular target recognition task.

  12. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.

    PubMed

    Barhen, J; Toomarian, N; Protopopescu, V

    1987-12-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the system's configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  13. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot

    NASA Technical Reports Server (NTRS)

    Barhen, Jacob; Toomarian, N.; Protopopescu, V.

    1987-01-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  14. Maximum Torque and Momentum Envelopes for Reaction Wheel Arrays

    NASA Technical Reports Server (NTRS)

    Reynolds, R. G.; Markley, F. Landis

    2001-01-01

    Spacecraft reaction wheel maneuvers are limited by the maximum torque and/or angular momentum which the wheels can provide. For an n-wheel configuration, the torque or momentum envelope can be obtained by projecting the n-dimensional hypercube, representing the domain boundary of individual wheel torques or momenta, into three dimensional space via the 3xn matrix of wheel axes. In this paper, the properties of the projected hypercube are discussed, and algorithms are proposed for determining this maximal torque or momentum envelope for general wheel configurations. Practical implementation strategies for specific wheel configurations are also considered.
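
    Since the envelope is the image of the n-cube under the 3xn wheel-axis matrix (a zonotope), a direct way to construct it for modest n is to project all 2^n vertex sign patterns and take their convex hull, as sketched below. The 4-wheel pyramid geometry and the normalized per-wheel limit are hypothetical examples, not a configuration from the paper.

```python
import itertools
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical 4-wheel pyramid configuration: columns are unit wheel axes.
beta = np.deg2rad(60.0)
W = np.column_stack([
    [np.sin(beta) * np.cos(k * np.pi / 2), np.sin(beta) * np.sin(k * np.pi / 2), np.cos(beta)]
    for k in range(4)
])

h_max = 1.0  # per-wheel momentum limit (normalized)
signs = np.array(list(itertools.product([-1.0, 1.0], repeat=4)))  # 2^4 cube vertices
envelope_pts = signs @ (h_max * W.T)  # project the 4-cube into 3-D momentum space

hull = ConvexHull(envelope_pts)
print("envelope volume:", hull.volume, "  facets:", len(hull.simplices))
```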

  15. [Basic questionnaire and methodological criteria for Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean].

    PubMed

    Benavides, Fernando G; Merino-Salazar, Pamela; Cornelio, Cecilia; Assunção, Ada Avila; Agudelo-Suárez, Andrés A; Amable, Marcelo; Artazcoz, Lucía; Astete, Jonh; Barraza, Douglas; Berhó, Fabián; Milián, Lino Carmenate; Delclòs, George; Funcasta, Lorena; Gerke, Johanna; Gimeno, David; Itatí-Iñiguez, María José; Lima, Eduardo de Paula; Martínez-Iñigo, David; Medeiros, Adriane Mesquita de; Orta, Lida; Pinilla, Javier; Rodrigo, Fernando; Rojas, Marianela; Sabastizagal, Iselle; Vallebuona, Clelia; Vermeylen, Greet; Villalobos, Gloria H; Vives, Alejandra

    2016-10-10

    This article aimed to present a basic questionnaire and minimum methodological criteria for consideration in future Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean. A virtual and face-to-face consensus process was conducted with participation by a group of international experts who used the surveys available up until 2013 as the point of departure for defining the proposal. The final questionnaire included 77 questions grouped in six dimensions: socio-demographic characteristics of workers and companies; employment conditions; working conditions; health status; resources and preventive activities; and family characteristics. The minimum methodological criteria feature the interviewee's home as the place for the interview, along with aspects related to the quality of the fieldwork. These results can help improve the comparability of future surveys in Latin America and the Caribbean, which would in turn help improve information on workers' health in the region.

  16. On the Local Equivalence Between the Canonical and the Microcanonical Ensembles for Quantum Spin Systems

    NASA Astrophysics Data System (ADS)

    Tasaki, Hal

    2018-06-01

    We study a quantum spin system on the d-dimensional hypercubic lattice Λ with N = L^d sites and periodic boundary conditions, taking an arbitrary translation-invariant short-ranged Hamiltonian. For this system, we consider both the canonical ensemble with inverse temperature β₀ and the microcanonical ensemble with the corresponding energy U_N(β₀). For an arbitrary self-adjoint operator Â whose support is contained in a hypercubic block B inside Λ, we prove that the expectation values of Â with respect to these two ensembles are close to each other for large N, provided that β₀ is sufficiently small and the number of sites in B is o(N^{1/2}). This establishes the equivalence of ensembles at the level of local states in a large but finite system. The result is essentially that of Brandao and Cramer (here restricted to the case of the canonical and microcanonical ensembles), but we prove improved estimates in an elementary manner. We also review and prove standard results on the thermodynamic limits of thermodynamic functions and the equivalence of ensembles in terms of thermodynamic functions. The present paper assumes only elementary knowledge of quantum statistical mechanics and quantum spin systems.

  17. Performance of a plasma fluid code on the Intel parallel computers

    NASA Technical Reports Server (NTRS)

    Lynch, V. E.; Carreras, B. A.; Drake, J. B.; Leboeuf, J. N.; Liewer, P.

    1992-01-01

    One approach to improving the real-time efficiency of plasma turbulence calculations is to use a parallel algorithm. A parallel algorithm for plasma turbulence calculations was tested on the Intel iPSC/860 hypercube and the Touchstone Delta machine. Using the 128 processors of the Intel iPSC/860 hypercube, a factor of 5 improvement over a single-processor CRAY-2 is obtained. For the Touchstone Delta machine, the corresponding improvement factor is 16. For plasma edge turbulence calculations, an extrapolation of the present results to the Intel Sigma machine gives an improvement factor close to 64 over the single-processor CRAY-2.

  18. A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Rao, Hariprasad Nannapaneni

    1989-01-01

    The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.

  19. Model Based Predictive Control of Multivariable Hammerstein Processes with Fuzzy Logic Hypercube Interpolated Models

    PubMed Central

    Coelho, Antonio Augusto Rodrigues

    2016-01-01

    This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system in which membership functions act as kernel functions of an interpolator. Conjunction of membership functions in a unitary hypercube space enables multivariable interpolation in N dimensions. Since membership functions act as interpolation kernels, the choice of membership functions determines the interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the modeling problem of static nonlinearities, since it is capable of modeling both a function and its inverse. Three study cases from the literature are presented: a single-input single-output (SISO) system, a MISO system, and a MIMO system. Good results are obtained with regard to performance metrics such as set-point tracking, control variation, and robustness. The results demonstrate the applicability of the proposed method for modeling Hammerstein nonlinearities and their inverse functions, for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC). PMID:27657723

  20. A Bayesian ensemble data assimilation to constrain model parameters and land-use carbon emissions

    NASA Astrophysics Data System (ADS)

    Lienert, Sebastian; Joos, Fortunat

    2018-05-01

    A dynamic global vegetation model (DGVM) is applied in a probabilistic framework and benchmarking system to constrain uncertain model parameters by observations and to quantify carbon emissions from land-use and land-cover change (LULCC). The processes featured in DGVMs depend on parameters that are prone to substantial uncertainty. To cope with these uncertainties, Latin hypercube sampling (LHS) is used to create a 1000-member perturbed-parameter ensemble, which is then evaluated with a diverse set of global and spatiotemporally resolved observational constraints. We discuss the performance of the constrained ensemble and use it to formulate a new best-guess version of the model (LPX-Bern v1.4). The observationally constrained ensemble is used to investigate historical emissions due to LULCC (ELUC) and their sensitivity to model parametrization. We find a global ELUC estimate of 158 (108, 211) PgC (median and 90 % confidence interval) between 1800 and 2016. We compare ELUC to other estimates both globally and regionally; spatial patterns are investigated, and estimates of ELUC for the 10 countries with the largest contribution to the flux over the historical period are reported. We consider model versions with and without additional land-use processes (shifting cultivation and wood harvest) and find that the difference in global ELUC is on the same order of magnitude as the parameter-induced uncertainty and in some cases could potentially even be offset with an appropriate parameter choice.

  1. Feasibility of Stochastic Voltage/VAr Optimization Considering Renewable Energy Resources for Smart Grid

    NASA Astrophysics Data System (ADS)

    Momoh, James A.; Salkuti, Surender Reddy

    2016-06-01

    This paper proposes a stochastic optimization technique for solving the Voltage/VAr control problem that includes variation in load demand and in Renewable Energy Resources (RERs), whose outputs are inherently stochastic. Voltage/VAr control is central to managing power system complexity and reliability, and hence a fundamental requirement for all utility companies; a robust and efficient Voltage/VAr optimization technique is needed to meet peak demand and reduce system losses. Voltages beyond their limits may damage costly substation devices as well as equipment at the consumer end. RERs introduce additional disturbances, and some RERs are not even capable of meeting the VAr demand, so there is a strong need for Voltage/VAr control in an RER environment. This paper therefore aims at developing an optimal scheme for Voltage/VAr control involving RERs. Latin Hypercube Sampling (LHS) is used to cover the full range of variables while maximally matching the marginal distributions, and a backward scenario reduction technique is used to reduce the number of scenarios effectively while maximally retaining the fitting accuracy of the samples. The developed optimization scheme is tested on the IEEE 24-bus Reliability Test System (RTS) considering load demand and RER variation.
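
    A minimal sketch of the backward reduction step, assuming Euclidean distance between scenarios and the common greedy heuristic (delete the scenario with the smallest probability-weighted distance to its nearest neighbour and reassign its probability); the wind scenarios here are synthetic placeholders.

```python
import numpy as np

def backward_reduction(scenarios, probs, n_keep):
    """Greedy backward reduction: repeatedly delete the scenario whose
    probability-weighted distance to its nearest neighbour is smallest,
    moving its probability onto that neighbour."""
    keep = list(range(len(scenarios)))
    probs = probs.astype(float).copy()
    dist = np.linalg.norm(scenarios[:, None, :] - scenarios[None, :, :], axis=2)
    while len(keep) > n_keep:
        sub = dist[np.ix_(keep, keep)]
        np.fill_diagonal(sub, np.inf)
        nearest = sub.argmin(axis=1)
        cost = probs[keep] * sub[np.arange(len(keep)), nearest]
        i = int(cost.argmin())        # cheapest scenario to delete
        j = keep[nearest[i]]          # its nearest kept neighbour
        probs[j] += probs[keep[i]]
        del keep[i]
    return keep, probs

rng = np.random.default_rng(1)
wind = rng.random((200, 24))          # hypothetical hourly RER scenarios
kept, p = backward_reduction(wind, np.full(200, 1 / 200), n_keep=10)
print("kept scenarios:", kept, " total prob:", p[kept].sum())
```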

  2. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by roughness coefficient values in hydraulic models used for flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial laser scanner data have been used to produce a high-quality DEM, minimising input data uncertainty and improving the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy in representing the estimated roughness values. Finally, Latin hypercube sampling is used to generate the different sets of Manning roughness values, and flood inundation probability maps are created from the Monte Carlo simulations. Historical flood extent data from an extreme flash flood event are used for validation of the method; the calibration process is based on binary wet-dry reasoning with the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
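
    The generation of roughness sets can be sketched by pushing a Latin hypercube sample through the inverse CDF of a fitted distribution, so the sample honours the non-uniform marginal. The lognormal parameters below are placeholders standing in for values fitted from pebble counts.

```python
import numpy as np
from scipy.stats import qmc, lognorm

# Hypothetical lognormal fit to field-estimated Manning's n (placeholder values):
# median 0.035, sigma of log(n) = 0.25.
median_n, sigma = 0.035, 0.25
dist = lognorm(s=sigma, scale=median_n)

# LHS in the uniform space, mapped through the inverse CDF so the sample
# follows the fitted (non-uniform) marginal distribution.
u = qmc.LatinHypercube(d=1, seed=3).random(500).ravel()
manning_n = dist.ppf(u)

print("sample quantiles (5%, 50%, 95%):", np.percentile(manning_n, [5, 50, 95]))
# Each manning_n value would parameterize one hydraulic-model run in the Monte Carlo loop.
```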

  3. iTOUGH2 V6.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan A.

    2010-11-01

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in geosciences, reservoir engineering, and other application areas, and supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files; this link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a nonlinear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin hypercube Monte Carlo simulation for uncertainty propagation analysis, and a detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and the publications cited there. Hardware Req.: Multi-platform; Related/auxiliary software: PVM (if running in parallel).

  4. Machine learning from computer simulations with applications in rail vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Taheri, Mehdi; Ahmadian, Mehdi

    2016-05-01

    The application of stochastic modelling for learning the behaviour of multibody dynamics (MBD) models is investigated. Post-processing data from a simulation run are used to train a stochastic model that estimates the relationship between model inputs (suspension relative displacement and velocity) and the output (sum of suspension forces). The stochastic model can be used to reduce the computational burden of the MBD model by replacing a computationally expensive subsystem (the suspension subsystem). With minor changes, the stochastic modelling technique is able to learn the behaviour of a physical system and integrate it within MBD models. The technique is highly advantageous for MBD models where real-time simulation is necessary, or for models with a large number of repeated substructures, e.g. a train with many railcars. Because the training data are acquired prior to the development of the stochastic model, conventional sampling-plan strategies such as Latin hypercube designs, in which simulations are performed at the inputs dictated by the plan, cannot be used. Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, a sampling plan suitable for this setting is developed, in which the most space-filling subset of the acquired data with ? sample points that best describes the dynamic behaviour of the system under study is selected as the training data.
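
    One common way to pick a space-filling subset of previously acquired data is a greedy maximin rule, sketched below under the assumption of Euclidean distance in the (displacement, velocity) plane; the paper's exact selection criterion may differ.

```python
import numpy as np

def maximin_subset(X, k):
    """Greedily pick k rows of X that are maximally space-filling:
    start nearest the centroid, then repeatedly add the point whose
    minimum distance to the current subset is largest."""
    chosen = [int(np.linalg.norm(X - X.mean(axis=0), axis=1).argmin())]
    min_dist = np.linalg.norm(X - X[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(min_dist.argmax())
        chosen.append(nxt)
        min_dist = np.minimum(min_dist, np.linalg.norm(X - X[nxt], axis=1))
    return chosen

rng = np.random.default_rng(0)
logged = rng.random((5000, 2))  # e.g. (relative displacement, velocity) pairs
train_idx = maximin_subset(logged, 100)
print("selected", len(train_idx), "training points")
```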

  5. A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel

    2016-04-01

    Uncertainty poses a significant challenge in environmental research, and its characterisation and quantification have become a research priority over the past decade. Studies of extreme events are particularly affected. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition: meteorological, hydrological, and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model choice, within-model structure, and model parameterisation. Latin hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high-performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria that include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources forms a cascade, and when presented as such, the relative importance of each source can be determined.

  6. iTOUGH2 v7.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL

    2016-09-15

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences and reservoir engineering and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files using the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.

  7. Deriving global parameter estimates for the Noah land surface model using FLUXNET and machine learning

    NASA Astrophysics Data System (ADS)

    Chaney, Nathaniel W.; Herman, Jonathan D.; Ek, Michael B.; Wood, Eric F.

    2016-11-01

    With their origins in numerical weather prediction and climate modeling, land surface models aim to accurately partition the surface energy balance. An overlooked challenge in these schemes is the role of model parameter uncertainty, particularly at unmonitored sites. This study provides global parameter estimates for the Noah land surface model using 85 eddy covariance sites in the global FLUXNET network. The at-site parameters are first calibrated using a Latin hypercube-based ensemble of the most sensitive parameters, determined by the Sobol method to be the minimum stomatal resistance (rs,min), the Zilitinkevich empirical constant (Czil), and the bare soil evaporation exponent (fxexp). Calibration leads to an increase in the mean Kling-Gupta Efficiency performance metric from 0.54 to 0.71. These calibrated parameter sets are then related to local environmental characteristics using the Extra-Trees machine learning algorithm. The fitted Extra-Trees model is used to map the optimal parameter sets over the globe at a 5 km spatial resolution. Leave-one-out cross validation of the mapped parameters using the Noah land surface model suggests that calibrated model parameter sets can be skillfully related to local environmental characteristics. The results demonstrate the potential to use FLUXNET to tune the parameterizations of surface fluxes in land surface models and to provide improved parameter estimates over the globe.
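
    The parameter-mapping step can be sketched with scikit-learn's ExtraTreesRegressor, which handles multi-output regression from site descriptors to the three calibrated parameters. The data below are synthetic placeholders for the FLUXNET calibration results, and the leave-one-out loop mirrors the validation described in the abstract only in outline.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_sites = 85
# Hypothetical site descriptors (e.g. climate, vegetation indices) and the three
# calibrated parameters per site (rs_min, Czil, fxexp) - synthetic stand-ins.
X = rng.random((n_sites, 6))
Y = np.column_stack([40 + 200 * X[:, 0], 0.05 + 0.5 * X[:, 1], 1 + 3 * X[:, 2]])
Y += 0.05 * rng.standard_normal(Y.shape)

model = ExtraTreesRegressor(n_estimators=300, random_state=0)
errors = []
for train, test in LeaveOneOut().split(X):
    model.fit(X[train], Y[train])
    errors.append(model.predict(X[test])[0] - Y[test][0])
print("LOO RMSE per parameter:", np.sqrt(np.mean(np.square(errors), axis=0)))

model.fit(X, Y)  # final fit, used to map parameters to unmonitored grid cells
```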

  8. [On research concerning abortion in Latin America and studies on women].

    PubMed

    Barroso, C

    1989-01-01

    "Research on abortion is important for the Latin American women's movements. Rates of illegal abortion seem quite high. Cuba is the only country where abortion is legal. Policies on abortion are closely related to attitudes towards sexuality and women. Contraception has, in addition to health and economic costs, social and psychological costs, therefore unwanted pregnancies are the normal results of behavior that follows a certain rationality. Consequences of abortion depend on a woman's integration in her social network. The Latin American scene has two main differences from industrialized countries: mass poverty and the influence of the Catholic Church. Conditions of poverty affect less the motivation for abortion and more the conditions of its use." (SUMMARY IN ENG) excerpt

  9. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin hypercube sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementing a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessing the modelling of the onset of transient and compression settling. Furthermore, the optimal level of model discretization in both 2-D and 1-D was determined. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solids distribution with high accuracy - at reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object down to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and the pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle; an accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.

  11. Drivers And Uncertainties Of Increasing Global Water Scarcity

    NASA Astrophysics Data System (ADS)

    Scherer, L.; Pfister, S.

    2015-12-01

    Water scarcity threatens ecosystems and human health and hampers economic development. It generally depends on the ratio of water consumption to availability. We calculated global, spatially explicit water stress indices (WSIs), which describe the vulnerability to additional water consumption on a scale from 0 (low) to 1 (high), and compared them for the decades 1981-1990 and 2001-2010. Input data are obtained from a multi-model ensemble at a resolution of 0.5 degrees. The variability among the models was used to run 1000 Monte Carlo simulations (Latin hypercube sampling) and subsequently to estimate the uncertainties of the WSIs. Globally, a trend of increasing water scarcity can be observed; however, the uncertainties are large, and the probability that this trend is actually occurring is as low as 53%. The increase in WSIs is driven by higher water use rather than lower water availability: water availability is only 40% likely to decrease, whereas water consumption is 67% likely to increase. Independent of the trend, we are already living under water-scarce conditions, reflected in a consumption-weighted average of monthly WSIs of 0.51 in the recent decade. Its coefficient of variation of 0.8 points to the high uncertainties entailed, which might still hide poor model performance where all models consistently over- or underestimate water availability or use; especially in arid areas, models generally overestimate availability. Although we do not traverse the planetary boundary of freshwater use, as global water availability is sufficient, local water scarcity can be high. A regionalized assessment of WSIs under uncertainty therefore helps to focus on specific regions in order to optimise water consumption. These global results can also help to raise awareness of water scarcity and to suggest relevant measures, such as more water-efficient technologies, to international companies that have to deal with complex and distributed supply chains (e.g. in food production).

  12. Modeling Soil Organic Carbon Variation Along Climatic and Topographic Trajectories in the Central Andes

    NASA Astrophysics Data System (ADS)

    Gavilan, C.; Grunwald, S.; Quiroz, R.; Zhu, L.

    2015-12-01

    The Andes represent the largest and highest mountain range in the tropics. Geological and climatic differentiation favored landscape and soil diversity, resulting in ecosystems adapted to very different climatic patterns. Although several studies support the fact that the Andes are a vast sink of soil organic carbon (SOC) only few have quantified this variable in situ. Estimating the spatial distribution of SOC stocks in data-poor and/or poorly accessible areas, like the Andean region, is challenging due to the lack of recent soil data at high spatial resolution and the wide range of coexistent ecosystems. Thus, the sampling strategy is vital in order to ensure the whole range of environmental covariates (EC) controlling SOC dynamics is represented. This approach allows grasping the variability of the area, which leads to more efficient statistical estimates and improves the modeling process. The objectives of this study were to i) characterize and model the spatial distribution of SOC stocks in the Central Andean region using soil-landscape modeling techniques, and to ii) validate and evaluate the model for predicting SOC content in the area. For that purpose, three representative study areas were identified and a suite of variables including elevation, mean annual temperature, annual precipitation and Normalized Difference Vegetation Index (NDVI), among others, was selected as EC. A stratified random sampling (namely conditioned Latin Hypercube) was implemented and a total of 400 sampling locations were identified. At all sites, four composite topsoil samples (0-30 cm) were collected within a 2 m radius. SOC content was measured using dry combustion and SOC stocks were estimated using bulk density measurements. Regression Kriging was used to map the spatial variation of SOC stocks. The accuracy, fit and bias of SOC models was assessed using a rigorous validation assessment. This study produced the first comprehensive, geospatial SOC stock assessment in this undersampled region that serves as a baseline reference to assess potential impacts of climate and land use change.

  13. Soil organic carbon content assessment in a heterogeneous landscape: comparison of digital soil mapping and visible and near Infrared spectroscopy approaches

    NASA Astrophysics Data System (ADS)

    Michot, Didier; Fouad, Youssef; Pascal, Pichelin; Viaud, Valérie; Soltani, Inès; Walter, Christian

    2017-04-01

    The aims of this study are: i) to assess the distribution of SOC content according to the global soil map (GSM) project recommendations in a heterogeneous landscape; and ii) to compare the prediction performance of digital soil mapping (DSM) and visible-near infrared (Vis-NIR) spectroscopy approaches. The study area of 140 ha, located at Plancoët, surrounds the only mineral spring water source of Brittany (western France). It is a hillock characterized by a heterogeneous landscape mosaic with different types of forest, permanent pastures, and wetlands along a small coastal river. We acquired two independent datasets: j) 50 points selected using conditioned Latin hypercube sampling (cLHS); jj) 254 points corresponding to the GSM grid. Soil samples were collected in three layers (0-5, 20-25 and 40-50 cm) for both sampling strategies. SOC content was measured only in the cLHS soil samples, while Vis-NIR spectra were measured on all the collected samples. For the DSM approach, a machine-learning algorithm (Cubist) was applied to the cLHS calibration data to build rule-based models linking soil carbon content in the different layers with environmental covariates derived from a digital elevation model, geological variables, land use data, and existing large-scale soil maps. For the spectroscopy approach, we used two calibration datasets: k) the local cLHS; kk) a subset selected from the regional spectral database of Brittany after a PCA with hierarchical clustering analysis, spiked with local cLHS spectra. The PLS regression algorithm with leave-one-out cross validation was performed for both calibration datasets. SOC contents for the three layers of the GSM grid were predicted using the different approaches and compared with each other, with prediction performance evaluated by R2, RMSE, and RPD. Both approaches led to satisfactory predictions of SOC content, with an advantage for the spectral approach, particularly as regards the pertinence of the variation range.

  14. A hypercube compact neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rostykus, P.L.; Somani, A.K.

    1988-09-01

    A major problem facing the implementation of neural networks is the connection problem, and one popular tradeoff is to remove connections. Random disconnection severely degrades a network's capabilities, but the hypercube-based Compact Neural Network (CNN) has a structured architecture which, combined with a rearrangement of the memory vectors, gives a larger input space and better degradation behavior than a cost-equivalent network with more connections. The CNN is based on a Hopfield network. The changes from the Hopfield net include states of -1 and +1; when a node evaluated to 0, it was not biased either positive or negative but instead resumed its previous state. Here L is the number of processing elements (PEs), N the number of memories, and t_ij the weight between nodes i and j.

  15. Mapping implicit spectral methods to distributed memory architectures

    NASA Technical Reports Server (NTRS)

    Overman, Andrea L.; Vanrosendale, John

    1991-01-01

    Spectral methods have proven invaluable in the numerical simulation of PDEs (partial differential equations), but the frequent global communication they require raises a fundamental barrier to their use on highly parallel architectures. To explore this issue, a 3-D implicit spectral method was implemented on an Intel hypercube. Utilization of about 50 percent was achieved on a 32-node iPSC/860 hypercube for a 64 x 64 x 64 Fourier-spectral grid; finer grids yield higher utilizations. Chebyshev-spectral grids are more problematic, since plane-relaxation-based multigrid is required. However, by using a semicoarsening multigrid algorithm, and by relaxing all multigrid levels concurrently, relatively high utilizations were also achieved in this harder case.

  16. Maximum Torque and Momentum Envelopes for Reaction Wheel Arrays

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Reynolds, Reid G.; Liu, Frank X.; Lebsock, Kenneth L.

    2009-01-01

    Spacecraft reaction wheel maneuvers are limited by the maximum torque and/or angular momentum that the wheels can provide. For an n-wheel configuration, the torque or momentum envelope can be obtained by projecting the n-dimensional hypercube, representing the domain boundary of individual wheel torques or momenta, into three dimensional space via the 3xn matrix of wheel axes. In this paper, the properties of the projected hypercube are discussed, and algorithms are proposed for determining this maximal torque or momentum envelope for general wheel configurations. Practical strategies for distributing a prescribed torque or momentum among the n wheels are presented, with special emphasis on configurations of four, five, and six wheels.

  17. Ordered fast Fourier transforms on a massively parallel hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Tong, Charles; Swarztrauber, Paul N.

    1991-01-01

    This evaluation of alternative designs for ordered radix-2 decimation-in-frequency FFT algorithms on massively parallel hypercube processors focuses on reducing communication, which is known to dominate computation time. Accordingly, the ordering and computational phases of the FFT are combined, in conjunction with sequence-to-processor maps that reduce communication. Two orderings, 'standard' and 'cyclic', in which the order of the transform is the same as that of the input sequence, can be implemented with ease on the Connection Machine (where orderings are determined by geometries and priorities). A parallel method for trigonometric coefficient computation is presented which does not employ trigonometric functions or interprocessor communication.

  18. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields, and can hence become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]; here, "representative" implies realizations that efficiently span the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternatives to classical LH sampling within the context of simulating flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploiting the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of an M-dimensional, unit-radius hypersphere, (ii) relocating the N points onto a representative set of N hyperspheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality-reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce statistics of the conductivity and concentration fields to a similar extent, yet with smaller sampling variability than SR sampling. References: [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316 (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69 (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
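
    A rough sketch of the sphere-based stratification idea from the abstract, assuming directions spread by plain random normalization (a true minimum-energy set would add a repulsion step between direction points) and radii stratified over the chi distribution with M degrees of freedom:

```python
import numpy as np
from scipy.stats import chi

def sphere_stratified_gaussian(n, m, seed=0):
    """Spread n directions on the unit (m-1)-sphere, then assign radii from n
    equal-probability strata of the chi distribution with m dof, so the
    products are representative draws of an m-dimensional standard normal."""
    rng = np.random.default_rng(seed)
    directions = rng.standard_normal((n, m))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    # Radii at the midpoints of n equal-probability bins of chi(m).
    radii = chi.ppf((np.arange(n) + 0.5) / n, df=m)
    rng.shuffle(radii)  # decouple the radius stratum from the direction
    return directions * radii[:, None]

Z = sphere_stratified_gaussian(n=64, m=100)
# Z would then be correlated/transformed (e.g. via a Cholesky factor of the
# conductivity covariance) to produce lognormal conductivity realizations.
print("mean radius:", np.linalg.norm(Z, axis=1).mean(), "~ E||N(0, I_100)||")
```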

  19. PC-CUBE: A Personal Computer Based Hypercube

    NASA Technical Reports Server (NTRS)

    Ho, Alex; Fox, Geoffrey; Walker, David; Snyder, Scott; Chang, Douglas; Chen, Stanley; Breaden, Matt; Cole, Terry

    1988-01-01

    PC-CUBE is an ensemble of IBM PCs or close compatibles connected in the hypercube topology with ordinary computer cables, communicating at 115.2 kbaud via RS-232 serial links. Available for PC-CUBE are the Crystalline Operating System III (CrOS III), the Mercury Operating System, and CUBIX and PLOTIX, which are parallel I/O and graphics libraries. A CrOS performance monitor was developed to facilitate measuring a program's communication and computation times and their effects on performance. Also available are CXLISP, a parallel version of the XLISP interpreter; GRAFIX, some graphics routines for the EGA and CGA; and a general execution profiler for determining the execution time spent in program subroutines. PC-CUBE provides a programming environment similar to all hypercube systems running CrOS III, Mercury, and CUBIX. In addition, every node (personal computer) has its own graphics display monitor and storage devices, which allows data to be displayed or stored at every processor; this has much instructional value and enables easier debugging of applications. Some application programs taken from the book Solving Problems on Concurrent Processors (Fox 88) were implemented with graphics enhancement on PC-CUBE. The applications range from solving the Mandelbrot set, the Laplace equation, the wave equation, and long-range force interactions, to WaTor, an ecological simulation.

  20. Ordered fast fourier transforms on a massively parallel hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Tong, Charles; Swarztrauber, Paul N.

    1989-01-01

    Design alternatives for ordered Fast Fourier Transform (FFT) algorithms were examined on massively parallel hypercube multiprocessors such as the Connection Machine. Particular emphasis was placed on reducing communication, which is known to dominate the overall computing time. To this end, the ordering and computational phases of the FFT were combined, and sequence-to-processor maps that reduce communication were used. The class of ordered transforms was expanded to include any FFT in which the order of the transform is the same as that of the input sequence. Two such orderings are examined, namely standard-order and A-order, which can be implemented with equal ease on the Connection Machine, where orderings are determined by geometries and priorities. If the sequence has N = 2^r elements and the hypercube has P = 2^d processors, then a standard-order FFT can be implemented with d + r/2 + 1 parallel transmissions, whereas an A-order sequence can be transformed with 2d - r/2 parallel transmissions, which is r - d + 1 fewer than the standard order. A parallel method for computing the trigonometric coefficients is presented that uses neither trigonometric functions nor interprocessor communication. A performance of 0.9 GFLOPS was obtained for an A-order transform on the Connection Machine.
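
    The quoted communication counts can be checked directly. The snippet below (with illustrative d and r, assuming even r so that r/2 is an integer) reproduces the stated saving of r - d + 1 transmissions for the A-order transform.

```python
# Communication counts for an FFT of N = 2**r points on a hypercube of
# P = 2**d processors, as quoted in the abstract above.
def standard_order(d, r):
    return d + r // 2 + 1      # parallel transmissions, standard-order FFT

def a_order(d, r):
    return 2 * d - r // 2      # parallel transmissions, A-order FFT

d, r = 10, 16                  # e.g. 1024 processors, 65536 points
print(standard_order(d, r) - a_order(d, r))  # r - d + 1 = 7 fewer for A-order
```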

  1. Self-Reflecting Skew Polygons and Polytopes in the 4-Dimensional Hypercube.

    DTIC Science & Technology

    1983-01-01

    …in a general position. Definition 1. We say that L_k is in general position provided that (1.7) the n x k matrix … has no vanishing minor of order k. … the monochromes (4.8) are parallel, a condition expressed by requiring that (4.9) the 4 x 2 matrix |A| has no vanishing minor of order 2. …

  2. Stochastic species abundance models involving special copulas

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry E.

    2018-01-01

    Copulas offer a very general tool to describe the dependence structure of random variables supported by the hypercube. Inspired by problems of species abundances in biology, we study three distinct toy models in which copulas play a key role. In the first, a Marshall-Olkin copula arises in a species extinction model with catastrophe. In the second, a quasi-copula problem arises in a flagged species abundance model. In the third, we study completely random species abundance models on the hypercube, namely those that are not of product type, have uniform margins, and are singular. These can be understood from a singular copula supported by an inflated simplex. An exchangeable singular Dirichlet copula is also introduced, together with its induced completely random species abundance vector.

  3. Performance analysis of parallel branch and bound search with the hypercube architecture

    NASA Technical Reports Server (NTRS)

    Mraz, Richard T.

    1987-01-01

    With the availability of commercial parallel computers, researchers are examining new classes of problems which might benefit from parallel computing. This paper presents the results of an investigation of the class of search-intensive problems. The specific problem discussed is the least-cost branch and bound search method for deadline job scheduling. An object-oriented design methodology was used to map the problem into a parallel solution. While the initial design was good for a prototype, the best performance resulted from fine-tuning the algorithm for a specific computer. The experiments analyze the computation time, the speedup over a VAX 11/785, and the load balance of the problem when using a loosely coupled multiprocessor system based on the hypercube architecture.

  4. A site oriented supercomputer for theoretical physics: The Fermilab Advanced Computer Program Multi Array Processor System (ACPMAPS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, T.; Atac, R.; Cook, A.

    1989-03-06

    The ACPMAPS multiprocessor is a highly cost-effective, local-memory parallel computer with a hypercube or compound hypercube architecture. Communication requires the attention of only the two communicating nodes. The design is aimed at floating-point-intensive, grid-like problems, particularly those with extreme computing requirements. The processing nodes of the system are single-board array processors, each with a peak power of 20 Mflops, supported by 8 Mbytes of data memory and 2 Mbytes of instruction memory. The system currently being assembled has a peak power of 5 Gflops. The nodes are based on the Weitek XL chip set. The system delivers performance at approximately $300/Mflop. 8 refs., 4 figs.

  5. The Social Condition of Higher Education: Globalisation and (beyond) Regionalisation in Latin America

    ERIC Educational Resources Information Center

    Gomes, Alfredo M.; Robertson, Susan L.; Dale, Roger

    2012-01-01

    This article aims to discuss the relationship between higher education (HE), globalisation and regionalism projects focusing on HE in Latin America and Brazil. It is claimed that HE has predominantly taken the diverse, yet concerted and co-ordinated routes of globalisation and regionalisation and, by doing so, been profoundly transformed. The…

  6. A snapshot of gene therapy in Latin America.

    PubMed

    Linden, Rafael; Matte, Ursula

    2014-03-01

    Gene therapy attempts the insertion and expression of exogenous genetic material in cells for therapeutic purposes. Conceived in the 1960s, gene therapy reached its first clinical trial at the end of the 1980s and by December 2013 around 600 genuine open clinical trials of gene therapy were registered at NIH Clinical Trials Database. Here, we summarize the current efforts towards the development of gene therapy in Latin America. Our survey shows that the number of scientists involved in the development of gene therapy and DNA vaccines in Latin America is still very low. Higher levels of investment in this technology are necessary to boost the advancement of innovation and intellectual property in this field in a way that would ease both the social and financial burden of various medical conditions in Latin America.

  7. Polyhedra and Higher Dimensions.

    ERIC Educational Resources Information Center

    Scott, Paul

    1988-01-01

    Describes the definition and characteristics of a regular polyhedron, tessellation, and pseudopolyhedra with diagrams. Discusses the nature of simplex, hypercube, and cross-polytope in the fourth dimension and beyond. (YP)
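
    As a small illustration of the hypercube geometry mentioned here (not drawn from the article itself), the k-dimensional faces of an n-cube are counted by the standard identity C(n, k) * 2^(n-k):

```python
from math import comb

# Face counts of the n-dimensional hypercube: an n-cube has
# C(n, k) * 2**(n - k) faces of dimension k.
def faces(n):
    return [comb(n, k) * 2 ** (n - k) for k in range(n + 1)]

print(faces(4))   # [16, 32, 24, 8, 1]: vertices, edges, squares, cubes, 4-cube
```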

  8. Access and utilisation of social and health services as a social determinant of health: the case of undocumented Latin American immigrant women working in Lleida (Catalonia, Spain).

    PubMed

    Gea-Sánchez, Montserrat; Gastaldo, Denise; Molina-Luque, Fidel; Otero-García, Laura

    2017-03-01

    Although Spain has social and healthcare systems based on universal coverage, little is known about how undocumented immigrant women access and utilise them. This is particularly true in the case of Latin Americans who are overrepresented in the informal labour market, taking on traditionally female roles of caregivers and cleaners in private homes. This study describes access and utilisation of social and healthcare services by undocumented Latin American women working and living in rural and urban areas, and the barriers these women may face. An exploratory qualitative study was designed with 12 in-depth interviews with Latin American women living and working in three different settings: an urban city, a rural city and rural villages in the Pyrenees. Interviews were recorded, transcribed and analysed, yielding four key themes: health is a tool for work which worsens due to precarious working conditions; lack of legal status traps Latin American women in precarious jobs; lack of access to and use of social services; and limited access to and use of healthcare services. While residing and working in different areas of the province impacted the utilisation of services, working conditions was the main barrier experienced by the participants. In conclusion, decent working conditions are the key to ensuring undocumented immigrant women's right to social and healthcare. To create a pathway to immigrant women's health promotion, the 'trap of illegality' should be challenged and the impact of being considered 'illegal' should be considered as a social determinant of health, even where the right to access services is legal. © 2016 John Wiley & Sons Ltd.

  9. Health and social security reforms in Latin America: the convergence of the World Health Organization, the World Bank, and transnational corporations.

    PubMed

    Armada, F; Muntaner, C; Navarro, V

    2001-01-01

    International financial institutions have played an increasing role in the formation of social policy in Latin American countries over the last two decades, particularly in health and pension programs. World Bank loans and their attached policy conditions have promoted several social security reforms within a neoliberal framework that privileges the role of the market in the provision of health and pensions. Moreover, by endorsing the privatization of health services in Latin America, the World Health Organization has converged with these policies. The privatization of social security has benefited international corporations that become partners with local business elites. Thus the World Health Organization, international financial institutions, and transnational corporations have converged in the neoliberal reforms of social security in Latin America. Overall, the process represents a mechanism of resource transfer from labor to capital and sheds light on one of the ways in which neoliberalism may affect the health of Latin American populations.

  10. Boundary conditions, dimensionality, topology and size dependence of the superconducting transition temperature

    NASA Astrophysics Data System (ADS)

    Fink, Herman J.; Haley, Stephen B.; Giuraniuc, Claudiu V.; Kozhevnikov, Vladimir F.; Indekeu, Joseph O.

    2005-11-01

    For various sample geometries (slabs, cylinders, spheres, hypercubes), de Gennes' boundary condition parameter b is used to study its effect upon the transition temperature Tc of a superconductor. For b > 0 the order parameter at the surface is decreased, and as a consequence Tc is reduced, while for b < 0 the order parameter at the surface is increased, thereby enhancing Tc of a specimen in zero magnetic field. Exact solutions, derived by Fink and Haley (Int. J. mod. Phys. B, 17, 2171 (2003)), of the order parameter of a slab of finite thickness as a function of temperature are presented, both for reduced and enhanced transition (nucleation) temperatures. At the nucleation temperature the order parameter approaches zero. This concise review closes with a link established between de Gennes' microscopic boundary condition and the Ginzburg-Landau phenomenological approach, and a discussion of some relevant experiments. For example, applying the boundary condition with b < 0 to tin whiskers elucidates the increase of Tc with strain.

  11. Tensor form factor for the D → π(K) transitions with Twisted Mass fermions.

    NASA Astrophysics Data System (ADS)

    Lubicz, Vittorio; Riggio, Lorenzo; Salerno, Giorgio; Simula, Silvano; Tarantino, Cecilia

    2018-03-01

    We present a preliminary lattice calculation of the D → π and D → K tensor form factors fT(q2) as a function of the squared 4-momentum transfer q2. ETMC recently computed the vector and scalar form factors f+(q2) and f0(q2) describing D → π(K)ℓν semileptonic decays by analyzing the vector current and the scalar density. The study of the weak tensor current, which is directly related to the tensor form factor, completes the set of hadronic matrix elements regulating the transition between these two pseudoscalar mesons within and beyond the Standard Model, where a non-zero tensor coupling is possible. Our analysis is based on the gauge configurations produced by the European Twisted Mass Collaboration with Nf = 2 + 1 + 1 flavors of dynamical quarks. We simulated at three different values of the lattice spacing and with pion masses as small as 210 MeV, with the valence heavy quark in the mass range from ≃0.7 mc to ≃1.2 mc. The matrix elements of the tensor current are determined for a plethora of kinematical conditions in which the parent and child mesons are either moving or at rest. As for the vector and scalar form factors, Lorentz symmetry breaking due to hypercubic effects is clearly observed in the data. We will present preliminary results on the removal of such hypercubic lattice effects.

  12. Using Response Surface Methods to Correlate the Modal Test of an Inflatable Test Article

    NASA Technical Reports Server (NTRS)

    Gupta, Anju

    2013-01-01

    This paper presents a practical application of response surface methods (RSM) to correlate a finite element model of a structural modal test. The test article is a quasi-cylindrical inflatable structure which primarily consists of a fabric weave, with an internal bladder and metallic bulkheads on either end. To mitigate model size, the fabric weave was simplified by representing it with shell elements. The task at hand is to represent the material behavior of the weave. The success of the model correlation is measured by comparing the four major modal frequencies of the analysis model to the four major modal frequencies of the test article. Given that only individual strap material properties were provided and material properties of the overall weave were not available, defining the material properties of the finite element model became very complex. First it was necessary to determine which material properties (modulus of elasticity in the hoop and longitudinal directions, shear modulus, Poisson's ratio, etc.) affected the modal frequencies. Then a Latin Hypercube of the parameter space was created to form an efficiently distributed finite case set. Each case was then analyzed with the results input into RSM. In the resulting response surface it was possible to see how each material parameter affected the modal frequencies of the analysis model. If the modal frequencies of the analysis model and its corresponding parameters match the test with acceptable accuracy, it can be said that the model correlation is successful.
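
    A minimal sketch of this workflow, with invented parameter names, bounds, and a stand-in for the finite element modal analysis, might look as follows: sample the weave material parameters with a Latin hypercube and fit a quadratic response surface by least squares.

```python
import numpy as np
from scipy.stats import qmc

# Illustrative sketch: Latin hypercube over assumed material parameters,
# a toy function standing in for one FE modal run, and a quadratic
# response surface fitted by least squares.
names = ["E_hoop", "E_long", "G12", "nu12"]           # assumed parameters
lower = np.array([0.5e9, 0.5e9, 0.05e9, 0.1])
upper = np.array([5.0e9, 5.0e9, 0.50e9, 0.4])

sampler = qmc.LatinHypercube(d=4, seed=0)
X = qmc.scale(sampler.random(n=60), lower, upper)

def modal_frequency(x):        # stand-in for one FE analysis
    return 10 + 1e-9 * (2 * x[0] + x[1]) + 5 * x[3] + 1e-19 * x[0] * x[2]

y = np.array([modal_frequency(x) for x in X])

# Quadratic response surface: y ~ b0 + sum_i bi*xi + sum_{i<=j} bij*xi*xj.
Z = np.hstack([np.ones((len(X), 1)), X,
               np.stack([X[:, i] * X[:, j]
                         for i in range(4) for j in range(i, 4)], axis=1)])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
print("fitted coefficients:", coef.round(3))
```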

  13. The simulation of a two-dimensional (2D) transport problem in a rectangular region with Lattice Boltzmann method with two-relaxation-time

    NASA Astrophysics Data System (ADS)

    Sugiyanto, S.; Hardyanto, W.; Marwoto, P.

    2018-03-01

    Transport phenomena are found in many problems in the engineering and industrial sectors. We analyzed a lattice Boltzmann method with two-relaxation-time (LTRT) collision operators for the simulation of a pollutant moving through a medium, as a two-dimensional (2D) transport problem in a rectangular region model. The model consists of a 2D rectangular region of length 54 (x) and width 27 (y) filled with an isotropic homogeneous medium. Initially, the concentration is zero throughout the region of interest. A concentration of 1 is maintained at 9 < y < 18, whereas a concentration of zero is maintained at 0 < y < 9 and 18 < y < 27. A specific discharge (Darcy velocity) of 1.006 is assumed. A diffusion coefficient of 0.8333 is distributed uniformly, with a uniform porosity of 0.35. A computer program written in MATLAB computes the concentration of pollutant at any specified place and time. The program shows that the LTRT solution with quadratic equilibrium distribution functions (EDFs) and relaxation time τa = 1.0 is in good agreement with other numerical solution methods, such as 3DLEWASTE (Hybrid Three-dimensional Lagrangian-Eulerian Finite Element Model of Waste Transport Through Saturated-Unsaturated Media) obtained by Yeh and 3DFEMWATER-LHS (Three-dimensional Finite Element Model of Water Flow Through Saturated-Unsaturated Media with Latin Hypercube Sampling) obtained by Hardyanto.
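
    A compact sketch of a D2Q5 two-relaxation-time scheme for the advection-diffusion of a concentration field is given below. It is written in Python rather than MATLAB, uses illustrative lattice-unit parameters and simplified (periodic plus fixed-band inlet) boundaries, and assumes the common ADE-TRT convention that the antisymmetric relaxation time sets the diffusivity; it is not the paper's program.

```python
import numpy as np

# Minimal D2Q5 lattice Boltzmann sketch for 2D advection-diffusion with a
# two-relaxation-time (TRT) collision. Parameters are in lattice units and
# are illustrative assumptions, not the paper's physical values.
nx, ny = 54, 27
ex = np.array([0, 1, -1, 0, 0]); ey = np.array([0, 0, 0, 1, -1])
opp = np.array([0, 2, 1, 4, 3])                 # opposite-velocity index
w = np.array([1/3, 1/6, 1/6, 1/6, 1/6]); cs2 = 1/3

u = np.array([0.1, 0.0])                        # lattice advection velocity
D = 0.05                                        # lattice diffusivity
tau_m = D / cs2 + 0.5                           # odd (antisymmetric) time,
                                                # assumed to set D (ADE-TRT)
tau_p = 0.5 + 0.25 / (tau_m - 0.5)              # even time via "magic" 1/4

def feq(C):                                     # linear-velocity equilibrium
    cu = (ex[:, None, None] * u[0] + ey[:, None, None] * u[1]) / cs2
    return w[:, None, None] * C * (1 + cu)

f = feq(np.zeros((nx, ny)))
band = np.zeros(ny); band[10:18] = 1.0          # inlet band, 9 < y < 18
for step in range(2000):
    C = f.sum(axis=0)
    fe = feq(C)
    fp, fep = (f + f[opp]) / 2, (fe + fe[opp]) / 2   # symmetric parts
    fm, fem = (f - f[opp]) / 2, (fe - fe[opp]) / 2   # antisymmetric parts
    f = f - (fp - fep) / tau_p - (fm - fem) / tau_m  # TRT collision
    for i in range(5):                               # streaming (periodic)
        f[i] = np.roll(np.roll(f[i], ex[i], axis=0), ey[i], axis=1)
    f[:, 0, :] = feq(band[None, :])[:, 0, :]         # crude Dirichlet inlet
```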

  14. Robust human body model injury prediction in simulated side impact crashes.

    PubMed

    Golman, Adam J; Danelson, Kerry A; Stitzel, Joel D

    2016-01-01

    This study developed a parametric methodology to robustly predict occupant injuries sustained in real-world crashes using a finite element (FE) human body model (HBM). One hundred and twenty near-side impact motor vehicle crashes were simulated over a range of parameters using Toyota RAV4 (bullet vehicle) and Ford Taurus (struck vehicle) FE models and a validated HBM, the Total HUman Model for Safety (THUMS). Three bullet-vehicle crash parameters (speed, location, and angle) and two occupant parameters (seat position and age) were varied using a Latin hypercube design of experiments. Four injury metrics (head injury criterion, half deflection, thoracic trauma index, and pelvic force) were used to calculate injury risk. Rib fracture prediction and lung strain metrics were also analysed. As hypothesized, bullet speed had the greatest effect on each injury measure. Injury risk was reduced when the bullet location was further from the B-pillar or when the bullet angle was more oblique. Age correlated strongly with rib fracture frequency and lung strain severity. The injuries from a real-world crash were predicted using two different methods: (1) subsampling the injury predictors from the 12 simulations best matching the crush profile and (2) using regression models. Both injury prediction methods successfully predicted the case occupant's low risk for pelvic injury, high risk for thoracic injury, rib fractures, and high lung strains with tight confidence intervals. This parametric methodology was successfully used to explore crash parameter interactions and to robustly predict real-world injuries.
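
    The design-of-experiments step described here can be sketched with a standard Latin hypercube sampler; the parameter bounds below are placeholders, not the study's values.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube over the three bullet-vehicle parameters and two
# occupant parameters; bounds are illustrative assumptions.
params = ["speed_kph", "impact_location_m", "impact_angle_deg",
          "seat_position", "occupant_age"]
lower = [20.0, -0.3, 45.0, 0.0, 20.0]
upper = [60.0,  0.3, 90.0, 1.0, 70.0]

sampler = qmc.LatinHypercube(d=len(params), seed=1)
design = qmc.scale(sampler.random(n=120), lower, upper)  # 120 crash cases
print(design[:3].round(2))
```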

  15. The practice of intensive care in Latin America: a survey of academic intensivists.

    PubMed

    Castro, Ricardo; Nin, Nicolas; Ríos, Fernando; Alegría, Leyla; Estenssoro, Elisa; Murias, Gastón; Friedman, Gilberto; Jibaja, Manuel; Ospina-Tascon, Gustavo; Hurtado, Javier; Marín, María Del Carmen; Machado, Flavia R; Cavalcanti, Alexandre Biasi; Dubin, Arnaldo; Azevedo, Luciano; Cecconi, Maurizio; Bakker, Jan; Hernandez, Glenn

    2018-02-21

    Intensive care medicine is a relatively young discipline that has rapidly grown into a full-fledged medical subspecialty. Intensivists are responsible for managing an ever-increasing number of patients with complex, life-threatening diseases. Several factors may influence their performance, including age, training, experience, workload, and socioeconomic context. The aim of this study was to examine individual- and work-related aspects of the Latin American intensivist workforce, mainly with academic appointments, which might influence the quality of care provided. In consequence, we conducted a cross-sectional study of intensivists at public and private academic and nonacademic Latin American intensive care units (ICUs) through a web-based electronic survey submitted by email. Questions about personal aspects, work-related topics, and general clinical workflow were incorporated. Our study comprised 735 survey respondents (53% return rate) with the following country-specific breakdown: Brazil (29%); Argentina (19%); Chile (17%); Uruguay (12%); Ecuador (9%); Mexico (7%); Colombia (5%); and Bolivia, Peru, Guatemala, and Paraguay combined (2%). Latin American intensivists were predominantly male (68%) young adults (median age, 40 [IQR, 35-48] years) with a median clinical ICU experience of 10 (IQR, 5-20) years. The median weekly workload was 60 (IQR, 47-70) h. ICU formal training was between 2 and 4 years. Only 63% of academic ICUs performed multidisciplinary rounds. Most intensivists (85%) reported adequate conditions to manage patients with septic shock in their units. Unsatisfactory conditions were attributed to insufficient technology (11%), laboratory support (5%), imaging resources (5%), and drug shortages (5%). Seventy percent of intensivists participated in research, and 54% read scientific studies regularly, whereas 32% read no more than one scientific study per month. Research grants and pharmaceutical sponsorship are unusual funding sources in Latin America. Although Latin American intensivists are mostly unsatisfied with their income (81%), only a minority (27%) considered changing to another specialty before retirement. Latin American intensivists constitute a predominantly young adult workforce, mostly formally trained, have a high workload, and most are interested in research. They are under important limitations owing to resource constraints and overt dissatisfaction. Latin America may be representative of other world areas with similar challenges for intensivists. Specific initiatives aimed at addressing these situations need to be devised to improve the quality of critical care delivery in Latin America.

  16. Latin American social medicine across borders: South-South cooperation and the making of health solidarity.

    PubMed

    Birn, Anne-Emanuelle; Muntaner, Carles

    2018-02-22

    Latin American social medicine efforts are typically understood as national endeavours, involving health workers, policymakers, academics, social movements, unions, and left-wing political parties, among other domestic actors. But Latin America's social medicine trajectory has also encompassed considerable between-country solidarity, building on early twentieth century interchanges among a range of players who shared approaches for improving living and working conditions and instituting protective social policies. Since the 1960s, Cuba's country-to-country solidarity has stood out, comprising medic exchanges, training, and other forms of support for the health and social struggles of oppressed peoples throughout Latin America and around the world, recently via Misión Barrio Adentro in Venezuela. These efforts strive for social justice-oriented health cooperation based on horizontal power relations, shared political values, a commitment to social and economic redistribution, bona fide equity, and an understanding of the societal determination of health that includes, but goes well beyond, public health and medical care. With Latin America's left-wing surge now receding, this article traces the provenance, dynamics, impact, challenges, and legacy of health solidarity across Latin American borders and its prospects for continuity.

  17. [Infant and child mortality in Latin America].

    PubMed

    Behm, H; Primante, D A

    1978-04-01

    High mortality rates persist in Latin America, and data collection is made very difficult by the lack of reliable statistics. A study was initiated in 1976 to measure the probability of mortality from birth to 2 years of age in 12 Latin American countries. The Brass method was applied to population censuses. The probability of mortality is extremely heterogeneous and regularly very high, varying between a maximum of 202/1000 in Bolivia and a minimum of 112/1000 in Uruguay. In comparison, the same probability is 21/1000 in the U.S. and 11/1000 in Sweden. Mortality in rural areas is much higher than in urban ones and varies with the degree of education of the mother, children born to mothers with 10 years of formal education having the lowest risk of death. Children born to the indigenous population, largely illiterate and living in the poorest of conditions, have the highest probability of death, a probability reaching 67% of all deaths under 2 years. National health services in Latin America, although vastly improved and improving, still do not meet the needs of the population, especially in rural areas, and structural and historical conditions hamper a wider application of existing medical knowledge.

  19. Immigration experience of Latin American working women in Alicante, Spain: an ethnographic study 1

    PubMed Central

    González-Juárez, Liliana; Noreña-Peña, Ana Lucía

    2014-01-01

    OBJECTIVE: to describe the experience of Latin American working women regarding immigration, taking into account the expectations and conditions in which this process takes place. METHOD: ethnographic qualitative study. Data collection was performed by means of semi-structured interviews with 24 Latin American immigrant women in Spain. The information collected was triangulated through two focal groups. RESULTS: the expectations of migrant women focus on improving family living conditions. Social support is essential for their settling and to perform daily life activities. They declare they have adapted to the settlement country, although they live with stress. They perceive they have greater sexual freedom and power with their partners but keep greater responsibility in childcare, combining that with the role of working woman. CONCLUSIONS: migrant women play a key role in the survival of households, they build and create new meanings about being a woman, their understanding of life, their social and couple relationships. Such importance is shaped by their expectations and the conditions in which the migration process takes place, as well as their work integration. PMID:25493683

  20. A non-invasive implementation of a mixed domain decomposition method for frictional contact problems

    NASA Astrophysics Data System (ADS)

    Oumaziz, Paul; Gosselet, Pierre; Boucard, Pierre-Alain; Guinard, Stéphane

    2017-11-01

    A non-invasive implementation of the Latin domain decomposition method for frictional contact problems is described. The formulation requires dealing with mixed (Robin) conditions on the faces of the subdomains, which is not a classical feature of commercial software. We therefore propose a new implementation of the linear stage of the Latin method with a non-local search direction built as the stiffness of a layer of elements on the interfaces. This choice enables us to implement the method within the open-source software Code_Aster and to derive 2D and 3D examples with performance similar to the standard Latin method.

  1. Group velocity of discrete-time quantum walks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kempf, A.; Portugal, R.

    2009-05-15

    We show that certain types of quantum walks can be modeled as waves that propagate in a medium with phase and group velocities that are explicitly calculable. Since the group and phase velocities indicate how fast wave packets can propagate causally, we propose the use of these wave velocities in our definition for the hitting time of quantum walks. Our definition of hitting time has the advantage that it requires neither the specification of a walker's initial condition nor of an arrival probability threshold. We give full details for the case of quantum walks on the Cayley graphs of Abelian groups. This includes the special cases of quantum walks on the line and on hypercubes.

  2. Experience of disused source management in Latin America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pimenta Mourao, R.

    2008-07-01

    The Centro de Desenvolvimento da Tecnologia Nuclear (Center for the Development of Nuclear Technology) - CDTN - has been actively engaged in cooperation programs for disused source management throughout the Latin American and Caribbean region since 1996. The CDTN source conditioning team participated in the preparation of the technical procedures established for the different tasks involved in radium source conditioning operations, such as preparation of the packaging for conditioning; source conditioning; capsule welding; leak testing of the radium-containing capsule; and radiation protection planning for the conditioning of disused radium sources. The team also carried out twelve radium source conditioning operations in the region, besides in-house operations, which resulted in a total conditioned activity of approximately 525 GBq, or 14,200 mg of radium. Additionally, one operation was carried out in Nicaragua to safely condition three cobalt teletherapy heads stored under very precarious conditions in the premises of an old hospital. More recently, the team started its participation in an IAEA- and US State Department-sponsored program for the repatriation of disused or excess transuranic sources presently stored at users' premises or under regulatory control in different countries in the region. In September 2007 the team attended theoretical and practical training in transuranic source management, including participation in the conditioning of different neutron sources in certified packages. It is expected that the trained team will carry out similar operations in other Latin American countries. Finally, the team is expected to be involved in the near future in the repatriation of US-origin teletherapy heads and industrial gauges. (authors)

  3. Education as Part of the Migratory Project of Latin American Migrant Women Traveling to the United States in Undocumented Conditions

    ERIC Educational Resources Information Center

    Cardenas-Rodríguez, Rocio; Terrón-Caro, Teresa; Vázquez Delgado, Blanca Delia; Cueva-Luna, Teresa Elizabeth

    2015-01-01

    Education is an indispensable element for the development of society. In Latin America, the point of origin of most of the undocumented immigrants to the United States, equal opportunity in access to education and educational achievement is still pending. The study presented here focuses on the analysis of the expectations of female migrants via…

  4. Outline of Education Systems and School Conditions in Latin America. Bulletin, 1923, No. 44

    ERIC Educational Resources Information Center

    Luckey, George W. A.

    1923-01-01

    This bulletin is divided into two parts: (1) South America; and (2) Mexico, Cuba, and Central America. The countries included under the term "Latin America" are so extensive and important, and the effects of the World War, direct and indirect, on all systems of education have been so disturbing, that one is at a loss to know how best to…

  5. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube.
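
    The construction is easy to reproduce. The sketch below builds the Hamming-distance matrix for all n-letter words under the stated Gray code and checks that its row sums are constant, consistent with the (scaled) doubly stochastic property; the word length is illustrative.

```python
import numpy as np
from itertools import product

# Gray code for the four bases, as given in the abstract.
code = {"C": "00", "U": "10", "G": "11", "A": "01"}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

n = 2                                   # word length (illustrative)
words = ["".join(p) for p in product("CUGA", repeat=n)]
bits = ["".join(code[c] for c in word) for word in words]
H = np.array([[hamming(p, q) for q in bits] for p in bits])

print(words[:4])
print(H.sum(axis=0))   # constant row/column sums (symmetric matrix)
```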

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, J.; Derrida, B.

    The problem of directed polymers on disordered hierarchical and hypercubic lattices is considered. For the hierarchical lattices the problem can be reduced to the study of the stable laws for combining random variables in a nonlinear way. The authors present the results of numerical simulations of two hierarchical lattices, finding evidence of a phase transition in one case. For a limiting case they extend the perturbation theory developed by Derrida and Griffiths to nonzero temperature and to higher order, and use this approach to calculate thermal and geometrical properties (overlaps) of the model. In this limit they obtain an interpolation formula, allowing one to obtain the noninteger moments of the partition function from the integer moments. They obtain bounds for the transition temperature for hierarchical and hypercubic lattices, and some similarities between the problem on the two different types of lattice are discussed.

  7. High-Accuracy Finite Element Method: Benchmark Calculations

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Vinitsky, Sergue; Chuluunbaatar, Ochbadrakh; Chuluunbaatar, Galmandakh; Gerdt, Vladimir; Derbov, Vladimir; Góźdź, Andrzej; Krassovitskiy, Pavel

    2018-02-01

    We describe a new high-accuracy finite element scheme with simplex elements for solving elliptic boundary-value problems and show its efficiency on benchmark solutions of the Helmholtz equation for the triangle membrane and the hypercube.

  8. Characterizing commercial oil palm expansion in Latin America: land use change and trade

    NASA Astrophysics Data System (ADS)

    Furumo, Paul Richard; Aide, T. Mitchell

    2017-02-01

    Commodity crop expansion has increased with the globalization of production systems and consumer demand, linking distant socio-ecological systems. Oil palm plantations are expanding in the tropics to satisfy growing oilseed and biofuel markets, and much of this expansion has caused extensive deforestation, especially in Asia. In Latin America, palm oil output has doubled since 2001, and the majority of expansion seems to be occurring on non-forested lands. We used MODIS satellite imagery (250 m resolution) to map current oil palm plantations in Latin America and determined prior land use and land cover (LULC) using high-resolution images in Google Earth. In addition, we compiled trade data to determine where Latin American palm oil flows, in order to better understand the underlying drivers of expansion in the region. Based on a sample of 342 032 ha of oil palm plantations across Latin America, we found that 79% replaced previously intervened lands (e.g. pastures, croplands, bananas), primarily cattle pastures (56%). The remaining 21% came from areas that were classified as woody vegetation (e.g. forests), most notably in the Amazon and the Petén region in northern Guatemala. Latin America is a net exporter of palm oil but the majority of palm oil exports (70%) stayed within the region, with Mexico importing about half. Growth of the oil palm sector may be driven by global factors, but environmental and economic outcomes vary between regions (i.e. Asia and Latin America), within regions (i.e. Colombia and Peru), and within single countries (i.e. Guatemala), suggesting that local conditions are influential. The present trend of oil palm expanding onto previously cleared lands, guided by roundtable certifications programs, provides an opportunity for more sustainable development of the oil palm sector in Latin America.

  9. Gaps In Primary Care And Health System Performance In Six Latin American And Caribbean Countries.

    PubMed

    Macinko, James; Guanais, Frederico C; Mullachery, Pricila; Jimenez, Geronimo

    2016-08-01

    The rapid demographic and epidemiological transitions occurring in Latin America and the Caribbean have led to high levels of noncommunicable diseases in the region. In addition to reduced risk factors for chronic conditions, a strong health system for managing chronic conditions is vital. This study assessed the extent to which populations in six Latin American and Caribbean countries receive high-quality primary care, and it examined the relationship between experiences with care and perceptions of health system performance. We applied a validated survey on access, use, and satisfaction with health care services to nationally representative samples of the populations of Brazil, Colombia, El Salvador, Jamaica, Mexico, and Panama. Respondents reported considerable gaps in the ways in which primary care is organized, financed, and delivered. Nearly half reported using the emergency department for a condition they considered treatable in a primary care setting. Reports of more primary care problems were associated with worse perceptions of health system performance and quality and less receipt of preventive care. Urgent attention to primary care performance is required as the region's population continues to age at an unprecedented rate. Project HOPE—The People-to-People Health Foundation, Inc.

  10. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    Artificial hydraulic fracturing is being used widely to stimulate production of oil, natural gas, and geothermal reservoirs with low natural permeability. Optimization of field design and operation is limited by the incomplete characterization of the reservoir, as well as the complexity of hydrological and geomechanical processes that control the fracturing. Thus, there are a variety of uncertainties associated with the pre-existing fracture distribution, rock mechanics, and hydraulic-fracture engineering that require evaluation of their impact on the optimized design. In this study, a multiple-stage scheme was employed to evaluate the uncertainty. We first define the ranges and distributions of 11 input parameters that characterize the natural fracture topology, in situ stress, geomechanical behavior of the rock matrix and joint interfaces, and pumping operation, to cover a wide spectrum of potential conditions expected for a natural reservoir. These parameters were then sampled 1,000 times in an 11-dimensional parameter space constrained by the specified ranges using the Latin-hypercube method. These 1,000 parameter sets were fed into the fracture simulators, and the outputs were used to construct three designed objective functions, i.e. fracture density, opened fracture length and area density. Using PSUADE, three response surfaces (11-dimensional) of the objective functions were developed and global sensitivity was analyzed to identify the most sensitive parameters for the objective functions representing fracture connectivity, which are critical for sweep efficiency of the recovery process. The second-stage high resolution response surfaces were constructed with dimension reduced to the number of the most sensitive parameters. An additional response surface with respect to the objective function of the fractal dimension for fracture distributions was constructed in this stage. Based on these response surfaces, comprehensive uncertainty analyses were conducted among input parameters and objective functions. In addition, reduced-order emulation models resulting from this analysis can be used for optimal control of hydraulic fracturing. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
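
    In the same spirit, a first-order variance-based sensitivity index can be estimated from a space-filling sample by comparing the variance of bin-conditional output means to the total output variance; the 11-parameter toy response below stands in for the fracture simulator.

```python
import numpy as np
from scipy.stats import qmc

# First-order sensitivity S_i ~ Var(E[y|x_i]) / Var(y), estimated by
# binning each input of a Latin hypercube sample. The response function
# is a stand-in for the fracture simulator, not the study's model.
rng = np.random.default_rng(3)
d, n = 11, 1000
X = qmc.LatinHypercube(d=d, seed=3).random(n)       # unit hypercube
y = 3*X[:, 0] + X[:, 1]**2 + 0.5*X[:, 2]*X[:, 3] + 0.1*rng.standard_normal(n)

def first_order_index(x, y, bins=20):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))   # equal-count bins
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

S = [first_order_index(X[:, i], y) for i in range(d)]
print(np.round(S, 2))    # parameter 0 should dominate
```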

  11. Regional prediction of soil organic carbon content over temperate croplands using visible near-infrared airborne hyperspectral imagery and synchronous field spectra

    NASA Astrophysics Data System (ADS)

    Vaudour, E.; Gilliot, J. M.; Bel, L.; Lefevre, J.; Chehdi, K.

    2016-07-01

    This study aimed at identifying the potential of Vis-NIR airborne hyperspectral AISA-Eagle data for predicting the topsoil organic carbon (SOC) content of bare cultivated soils over a large peri-urban area (221 km2) with both contrasted soils and SOC contents, located in the western region of Paris, France. Soil types comprised haplic luvisols, calcaric cambisols, and colluvic cambisols. Airborne AISA-Eagle data (400-1000 nm, 126 bands) with 1 m resolution were acquired on 17 April 2013 over 13 tracks. Tracks were atmospherically corrected and then mosaicked at a 2 m resolution using a set of 24 synchronous field spectra of bare soils, black and white targets, and impervious surfaces. The land use identification system layer (RPG) of 2012 was used to mask non-agricultural areas; calculation and thresholding of NDVI from an atmospherically corrected SPOT image acquired the same day then made it possible to map agricultural fields with bare soil. A total of 101 sites, sampled either in 2013, in the 3 previous years, or in 2015, were identified as bare by means of this map. Predictions were made from the mosaicked AISA spectra, which were related to topsoil SOC contents by means of partial least squares regression (PLSR). Regression robustness was evaluated through a series of 1000 bootstrap data sets of calibration-validation samples, considering only the 74 sites outside cloud shadows and different sampling strategies for selecting calibration samples. Validation root-mean-square errors (RMSE) were comprised between 3.73 and 4.49 g kg-1, with a median of about 4 g kg-1. The best-performing models in terms of coefficient of determination (R2) and Residual Prediction Deviation (RPD) were the calibration models derived from either Kennard-Stone or conditioned Latin hypercube sampling on smoothed spectra. The most generalizable model, leading to the lowest RMSE of 3.73 g kg-1 at the regional scale and 1.44 g kg-1 at the within-field scale with low bias, was the cross-validated leave-one-out PLSR model constructed with the 28 near-synchronous samples and raw spectra.
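
    A minimal sketch of the PLSR calibration-validation step, with synthetic stand-ins for the 126-band spectra and the measured SOC contents, might read:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: real inputs would be the AISA mosaic spectra and
# laboratory SOC measurements; sizes follow the abstract, values do not.
rng = np.random.default_rng(7)
n_samples, n_bands = 101, 126
spectra = rng.random((n_samples, n_bands))
soc = 20 + 30 * spectra[:, 60] - 15 * spectra[:, 30] + rng.normal(0, 2, n_samples)

X_cal, X_val, y_cal, y_val = train_test_split(
    spectra, soc, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
pred = pls.predict(X_val).ravel()
rmse = float(np.sqrt(np.mean((pred - y_val) ** 2)))
print(f"validation RMSE: {rmse:.2f} g/kg")
```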

  12. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2) a quadratic response surface (QUAD). The LHS techniques do not require derivatives of the response variable and are subsequently relatively insensitive to numerical noise. To compare the LIN and QUAD methods to the MV method, a direct LHS analysis (DLHS) was performed using the full grid and timestep resolved finite element model. The surrogate response models (LIN and QUAD) are shown to give acceptable values of the mean and standard deviation when compared to the fully converged DLHS model.
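
    The analytical mean value method amounts to first-order uncertainty propagation through numerical derivatives, sigma_y^2 = sum_i (df/dx_i)^2 sigma_i^2. The sketch below applies it to an invented stand-in for the decomposition-front-velocity model.

```python
import numpy as np

def front_velocity(x):                  # illustrative placeholder model
    return 0.1 * x[0] ** 0.8 * np.exp(-2.0 / x[1]) + 0.01 * x[2]

x0 = np.array([5.0, 1.5, 2.0])          # nominal parameter values (assumed)
sigma = np.array([0.5, 0.1, 0.4])       # input standard deviations (assumed)

grad = np.empty_like(x0)
for i in range(len(x0)):                # central differences
    h = 1e-5 * max(abs(x0[i]), 1.0)
    xp, xm = x0.copy(), x0.copy()
    xp[i] += h; xm[i] -= h
    grad[i] = (front_velocity(xp) - front_velocity(xm)) / (2 * h)

sigma_y = np.sqrt(np.sum((grad * sigma) ** 2))
print(f"MV estimate of output std dev: {sigma_y:.4f}")
```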

  13. Control of type 2 diabetes mellitus among general practitioners in private practice in nine countries of Latin America.

    PubMed

    Lopez Stewart, Gloria; Tambascia, Marcos; Rosas Guzmán, Juan; Etchegoyen, Federico; Ortega Carrión, Jorge; Artemenko, Sofia

    2007-07-01

    To better understand how diabetes care and control are being administered by general practitioners/nonspecialists in private practice in nine countries of Latin America, and to identify the most significant patient- and physician-related barriers to care. A multicenter, cross-sectional, epidemiological survey was conducted in nine countries in Latin America: Argentina, Brazil, Chile, Costa Rica, Ecuador, Guatemala, Mexico, Peru, and Venezuela. General practitioners in private practice were asked to provide care and control data for patients 18 to 75 years of age with type 2 diabetes mellitus (T2DM), including demographics, medical and medication history, laboratory exams, and information on the challenges of patient management. Of the 3 592 patient questionnaires returned by 377 physicians, 60% of the patients had a family history of diabetes, 58% followed a poor diet, 71% were sedentary, and 79% were obese or overweight. Poor glycemic control (fasting blood glucose ≥ 110 mg/dL) was observed in 78% of patients. The proportion of patients with HbA1c < 7.0% was 43.2%. Glycemic control decreased significantly with increased duration of T2DM. Comorbid conditions associated with T2DM were observed in 86% of patients; insulin use and comorbid conditions, especially those associated with microvascular complications, increased significantly with disease duration. Ensuring compliance with recommended diet and exercise plans was the most-cited patient management challenge. Blood glucose levels are undercontrolled in T2DM patients in the private health care system in Latin America, particularly among those who have had the disease the longest (>15 years). Considering the differences between private and public health care in Latin America, especially regarding the quality of care and access to medication, further studies are called for in the public setting. Overall, a more efficient and intensive program of T2DM control is required, including effective patient education programs adjusted to the realities of Latin America.

  14. Performance Evaluation of Parallel Branch and Bound Search with the Intel iPSC (Intel Personal SuperComputer) Hypercube Computer.

    DTIC Science & Technology

    1986-12-01

    Excerpt from the report's table of contents: III. Analysis of Parallel Design (Parallel Abstract Data Types; Abstract Data Type; Parallel ADT; Data-Structure Design; Object-Oriented Design).

  15. A 2D electrostatic PIC code for the Mark III Hypercube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferraro, R.D.; Liewer, P.C.; Decyk, V.K.

    We have implemented a 2D electrostatic plasma particle-in-cell (PIC) simulation code on the Caltech/JPL Mark IIIfp Hypercube. The code simulates plasma effects by evolving in time the trajectories of thousands to millions of charged particles subject to their self-consistent fields. Each particle's position and velocity is advanced in time using a leap-frog method for integrating Newton's equations of motion in electric and magnetic fields. The electric field due to these moving charged particles is calculated on a spatial grid at each time step by solving Poisson's equation in Fourier space. These two tasks represent the largest part of the computation. To obtain efficient operation on a distributed-memory parallel computer, we are using the General Concurrent PIC (GCPIC) algorithm previously developed for a 1D parallel PIC code.

  16. Discrete sensitivity derivatives of the Navier-Stokes equations with a parallel Krylov solver

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Taylor, Arthur C., III

    1994-01-01

    This paper solves an 'incremental' form of the sensitivity equations derived by differentiating the discretized thin-layer Navier Stokes equations with respect to certain design variables of interest. The equations are solved with a parallel, preconditioned Generalized Minimal RESidual (GMRES) solver on a distributed-memory architecture. The 'serial' sensitivity analysis code is parallelized by using the Single Program Multiple Data (SPMD) programming model, domain decomposition techniques, and message-passing tools. Sensitivity derivatives are computed for low and high Reynolds number flows over a NACA 1406 airfoil on a 32-processor Intel Hypercube, and found to be identical to those computed on a single-processor Cray Y-MP. It is estimated that the parallel sensitivity analysis code has to be run on 40-50 processors of the Intel Hypercube in order to match the single-processor processing time of a Cray Y-MP.

  17. SIAM Conference on Parallel Processing for Scientific Computing, 4th, Chicago, IL, Dec. 11-13, 1989, Proceedings

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack (Editor); Messina, Paul (Editor); Sorensen, Danny C. (Editor); Voigt, Robert G. (Editor)

    1990-01-01

    Attention is given to such topics as an evaluation of block algorithm variants in LAPACK, a large-grain parallel sparse system solver, a multiprocessor method for the solution of the generalized eigenvalue problem on an interval, and a parallel QR algorithm for iterative subspace methods on the CM2. A discussion of numerical methods includes the topics of asynchronous numerical solutions of PDEs on parallel computers, parallel homotopy curve tracking on a hypercube, and solving the Navier-Stokes equations on the Cedar Multi-Cluster system. A section on differential equations includes a discussion of a six-color procedure for the parallel solution of elliptic systems using the finite quadtree structure, data-parallel algorithms for the finite element method, and domain decomposition methods in aerodynamics. Topics dealing with massively parallel computing include hypercubes vs. 2-dimensional meshes and massively parallel computation of conservation laws. Performance and tools are also discussed.

  18. Developing a Screening Algorithm for Type II Diabetes Mellitus in the Resource-Limited Setting of Rural Tanzania.

    PubMed

    West, Caroline; Ploth, David; Fonner, Virginia; Mbwambo, Jessie; Fredrick, Francis; Sweat, Michael

    2016-04-01

    Noncommunicable diseases are on pace to outnumber infectious diseases as the leading cause of death in sub-Saharan Africa, yet many questions remain unanswered about effective methods of screening for type II diabetes mellitus (DM) in this resource-limited setting. We aim to design a screening algorithm for type II DM that optimizes the sensitivity and specificity of identifying individuals with undiagnosed DM, as well as affordability to health systems and individuals. Baseline demographic and clinical data, including hemoglobin A1c (HbA1c), were collected from 713 participants using probability sampling of the general population. We used these data, along with model parameters obtained from the literature, to mathematically model 8 proposed DM screening algorithms, while optimizing the sensitivity and specificity using Monte Carlo and Latin hypercube simulation. An algorithm that combines risk assessment and measurement of fasting blood glucose was found to be superior for the most resource-limited settings (sensitivity 68%, specificity 99%, and a cost of $2.94 per patient with DM identified). Incorporating HbA1c testing improves the sensitivity to 75.62% but raises the cost per DM case identified to $6.04. The preferred algorithms are heavily biased toward diagnosing those with more severe cases of DM. Using basic risk assessment tools and fasting blood sugar testing in lieu of HbA1c testing in resource-limited settings could allow for significantly more feasible DM screening programs with reasonable sensitivity and specificity. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
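
    The Monte Carlo evaluation of such a two-step algorithm can be sketched on a synthetic population; the prevalence, test characteristics, and unit costs below are illustrative assumptions, not the study's fitted values, though they reproduce a similar sensitivity.

```python
import numpy as np

# Two-step screening: a risk questionnaire flags people for a fasting
# blood glucose (FBG) test. All rates and costs are assumptions.
rng = np.random.default_rng(11)
n = 100_000
prevalence = 0.08
has_dm = rng.random(n) < prevalence

flag = np.where(has_dm, rng.random(n) < 0.85, rng.random(n) < 0.30)
fbg_pos = flag & np.where(has_dm, rng.random(n) < 0.80, rng.random(n) < 0.02)

sens = fbg_pos[has_dm].mean()
spec = 1 - fbg_pos[~has_dm].mean()
cost = 0.05 * n + 0.50 * flag.sum()        # per-person risk score + FBG tests
print(f"sensitivity {sens:.2f}, specificity {spec:.3f}, "
      f"cost per case found ${cost / fbg_pos[has_dm].sum():.2f}")
```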

  19. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    NASA Astrophysics Data System (ADS)

    Bennett, Katrina E.; Urrego Blanco, Jorge R.; Jonko, Alexandra; Bohn, Theodore J.; Atchley, Adam L.; Urban, Nathan M.; Middleton, Richard S.

    2018-01-01

    The Colorado River Basin is a fundamentally important river for society, ecology, and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent, and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. We combine global sensitivity analysis with a space-filling Latin Hypercube Sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach. We find that snow-dominated regions are much more sensitive to uncertainties in VIC parameters. Although baseflow and runoff changes respond to parameters used in previous sensitivity studies, we discover new key parameter sensitivities. For instance, changes in runoff and evapotranspiration are sensitive to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI) in the VIC model. It is critical for improved modeling to narrow uncertainty in these parameters through improved observations and field studies. This is important because LAI and albedo are anticipated to change under future climate and narrowing uncertainty is paramount to advance our application of models such as VIC for water resource management.
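
    The emulation step can be sketched with a Gaussian-process surrogate trained on a Latin hypercube sample; the two-parameter toy response below stands in for a VIC output, and all settings are illustrative.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Space-filling design over two normalized parameters (e.g. albedo, LAI),
# a toy response standing in for a VIC run, and a GP emulator.
X = qmc.LatinHypercube(d=2, seed=5).random(64)
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2        # stand-in model output

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                              normalize_y=True).fit(X, y)
X_new = np.array([[0.2, 0.7], [0.8, 0.1]])
mean, std = gp.predict(X_new, return_std=True)      # cheap emulator queries
print(mean.round(3), std.round(3))
```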

  20. An Elitist Multiobjective Tabu Search for Optimal Design of Groundwater Remediation Systems.

    PubMed

    Yang, Yun; Wu, Jianfeng; Wang, Jinguo; Zhou, Zhifang

    2017-11-01

    This study presents a new multiobjective evolutionary algorithm (MOEA), the elitist multiobjective tabu search (EMOTS), and couples it with MODFLOW/MT3DMS to develop a modular groundwater simulation-optimization (SO) framework for the optimal design of groundwater remediation systems using the pump-and-treat (PAT) technique. The most notable improvements of EMOTS over the original multiple objective tabu search (MOTS) lie in the elitist strategy, the selection strategy, and the neighborhood move rule. The elitist strategy maintains all nondominated solutions within the later search process for better convergence to the true Pareto front. The elitism-based selection operator is modified to choose the two most remote solutions from the current candidate list as seed solutions, increasing the diversity of the search space. Moreover, neighborhood solutions are generated uniformly using Latin hypercube sampling (LHS) in the bounded neighborhood space around each seed solution. To demonstrate the performance of the EMOTS, we consider a synthetic groundwater remediation example. The problem formulation consists of two objective functions with continuous decision variables (pumping rates) while meeting water quality requirements. In particular, a sensitivity analysis is carried out on the synthetic case to determine the optimal combination of the heuristic parameters. Furthermore, the EMOTS is successfully applied to evaluate remediation options at the field site of the Massachusetts Military Reservation (MMR) in Cape Cod, Massachusetts. For both the hypothetical and the large-scale field remediation sites, the EMOTS-based SO framework is demonstrated to outperform the original MOTS in the performance metrics of optimality and diversity of nondominated frontiers, with desirable stability and robustness. © 2017, National Ground Water Association.
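
    The LHS neighborhood move can be sketched as follows; the pumping-rate bounds and neighborhood radius are invented for illustration.

```python
import numpy as np
from scipy.stats import qmc

# Generate candidate pumping-rate vectors uniformly, via a Latin
# hypercube, inside a bounded box around a seed solution.
def lhs_neighborhood(seed, radius, lower, upper, n_neighbors, rng_seed=0):
    d = len(seed)
    unit = qmc.LatinHypercube(d=d, seed=rng_seed).random(n_neighbors)
    lo = np.maximum(lower, seed - radius)      # clip the box to the
    hi = np.minimum(upper, seed + radius)      # feasible pumping range
    return qmc.scale(unit, lo, hi)

seed = np.array([120.0, 80.0, 200.0])          # current pumping rates (m3/d)
neighbors = lhs_neighborhood(seed, radius=20.0,
                             lower=np.zeros(3), upper=np.full(3, 400.0),
                             n_neighbors=10)
print(neighbors.round(1))
```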

  1. Statistical uncertainty analysis applied to the DRAGONv4 code lattice calculations and based on JENDL-4 covariance data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez-Solis, A.; Demaziere, C.; Ekberg, C.

    2012-07-01

    In this paper, multi-group microscopic cross-section uncertainty is propagated through the DRAGON (Version 4) lattice code in order to perform uncertainty analysis on k-infinity and 2-group homogenized macroscopic cross-section predictions. A statistical methodology is employed, in which cross-sections of certain isotopes of various elements belonging to the 172-group DRAGLIB library format are treated as normal random variables. The library is based on JENDL-4 data, because JENDL-4 contains the largest number of isotopic covariance matrices among the major nuclear data libraries. The aim is to propagate multi-group nuclide uncertainty by running the DRAGONv4 code 500 times and to assess the output uncertainty of a test case corresponding to a 17 x 17 PWR fuel assembly segment without poison. The chosen sampling strategy for the current study is Latin hypercube sampling (LHS). The quasi-random LHS allows a much better coverage of the input uncertainties than simple random sampling (SRS) because it densely stratifies across the range of each input probability distribution. Output uncertainty assessment is based on the tolerance-limit concept, where the sample formed by the code calculations is inferred to cover 95% of the output population with at least 95% confidence. This analysis is the first attempt to propagate parameter uncertainties of modern multi-group libraries, which are used to feed advanced lattice codes that perform state-of-the-art resonant self-shielding calculations such as DRAGONv4. (authors)
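
    The 95%/95% tolerance-limit claim rests on Wilks' nonparametric formula; a minimal sketch of the arithmetic, assuming the one-sided first-order form, follows.

```python
import math

# Wilks' criterion, one-sided first order: the probability that the
# sample maximum bounds the 95th percentile of the output is 1 - 0.95**n.
n = 500
print(1.0 - 0.95 ** n)            # ~1.0: 500 runs far exceed the requirement

# Smallest n satisfying the one-sided 95/95 criterion (the classic n = 59).
n_min = math.ceil(math.log(0.05) / math.log(0.95))
print(n_min)                      # 59
```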

  2. Acute effect of Vagus nerve stimulation parameters on cardiac chronotropic, inotropic, and dromotropic responses

    NASA Astrophysics Data System (ADS)

    Ojeda, David; Le Rolle, Virginie; Romero-Ugalde, Hector M.; Gallet, Clément; Bonnet, Jean-Luc; Henry, Christine; Bel, Alain; Mabo, Philippe; Carrault, Guy; Hernández, Alfredo I.

    2017-11-01

    Vagus nerve stimulation (VNS) is an established therapy for drug-resistant epilepsy and depression, and is considered a potential therapy for other pathologies, including heart failure (HF) and inflammatory diseases. In the case of HF, several experimental studies on animals have shown an improvement in cardiac function and a reverse remodeling of the cardiac cavity when VNS is applied. However, recent clinical trials have not been able to reproduce the same response in humans. One of the hypotheses to explain this lack of response concerns the way in which stimulation parameters are defined. The combined effect of VNS parameters is still poorly known, especially in the case of VNS delivered synchronously with cardiac activity. In this paper, we propose a methodology to analyze the acute cardiovascular effects of VNS parameters individually, as well as their interactive effects. A Latin hypercube sampling method was applied to design a uniform experimental plan. Data gathered from this experimental plan were used to produce a Gaussian process regression (GPR) model to estimate unobserved VNS sequences. Finally, a Morris screening sensitivity analysis method was applied to each obtained GPR model. Results highlight dominant effects of pulse current, pulse width, and number of pulses over frequency and delay and, more importantly, the degree of interaction between these parameters on the most important acute cardiovascular responses. In particular, strongly interacting effects between current and pulse width were found. Similar sensitivity profiles were observed for chronotropic, dromotropic, and inotropic effects. These findings are of primary importance for the future development of closed-loop, personalized neuromodulation technologies.
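
    Morris screening itself is easy to sketch. The following minimal implementation computes the mean absolute elementary effect (mu*) per input on a toy response standing in for the GPR-predicted cardiac effect; the strong current-width interaction in the toy function is an assumption for illustration.

```python
import numpy as np

def morris_mu_star(f, lo, hi, r=20, delta=0.1, seed=0):
    """Mean absolute elementary effect (mu*) per input from r
    one-at-a-time perturbations of random base points."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    rng = np.random.default_rng(seed)
    d = lo.size
    ee = np.zeros((r, d))
    for k in range(r):
        x = lo + rng.random(d) * (1.0 - delta) * (hi - lo)  # room to step up
        fx = f(x)
        for i in range(d):
            x_step = x.copy()
            x_step[i] += delta * (hi[i] - lo[i])
            ee[k, i] = (f(x_step) - fx) / delta
    return np.abs(ee).mean(axis=0)

# Toy response: (current, width, frequency) with a dominant
# current * width interaction and a weak frequency effect (an assumption).
response = lambda p: p[0] * p[1] + 0.1 * p[2]
print(morris_mu_star(response, lo=[0, 0, 0], hi=[1, 1, 1]))
```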

  3. Analysis of the influence of handset phone position on RF exposure of brain tissue.

    PubMed

    Ghanmi, Amal; Varsier, Nadège; Hadjem, Abdelhamid; Conil, Emmanuelle; Picon, Odile; Wiart, Joe

    2014-12-01

    Exposure to mobile phone radio frequency (RF) electromagnetic fields depends on many different parameters. For epidemiological studies investigating the risk of brain cancer linked to RF exposure from mobile phones, it is of great interest to characterize brain tissue exposure and to know which parameters this exposure is sensitive to. One such parameter is the position of the phone during communication. In this article, we analyze the influence of the phone position on brain exposure by comparing the specific absorption rate (SAR) induced in the head by two different mobile phone models operating in Global System for Mobile Communications (GSM) frequency bands. To achieve this objective, 80 different phone positions were chosen using an experimental design based on Latin hypercube sampling (LHS) to select a representative set of positions. The averaged SAR over 10 g (SAR10g) in the head, the averaged SAR over 1 g (SAR1g) in the brain, and the averaged SAR in different anatomical brain structures were estimated at 900 and 1800 MHz for the 80 positions. The results illustrate that SAR distributions inside the brain area are sensitive to the position of the mobile phone relative to the head. The results also show that for 5-10% of the studied positions the SAR10g in the head and the SAR1g in the brain can be 20% higher than the SAR estimated for the standard cheek position, and that the Specific Anthropomorphic Mannequin (SAM) model is conservative for 95% of all the studied positions. © 2014 Wiley Periodicals, Inc.

  4. Optimization of Heat Exchangers with Dimpled Surfaces to Improve the Performance in Thermoelectric Generators Using a Kriging Model

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Wang, Yiping; Wang, Tao; Yang, Xue; Deng, Yadong; Su, Chuqi

    2017-05-01

    Thermoelectric generators (TEGs) have become a topic of interest for vehicle exhaust energy recovery. Electrical power generation is deeply influenced by temperature differences, temperature uniformity, and the topological structures of TEGs. When dimpled surfaces are adopted in heat exchangers, heat transfer rates can be augmented with a minimal pressure drop. However, the temperature distribution shows a large gradient along the flow direction, which has adverse effects on power generation. In the current study, the heat exchanger performance was studied with a computational fluid dynamics (CFD) model. The dimple depth, dimple print diameter, and channel height were chosen as design variables. The objective function was defined as a combination of average temperature, temperature uniformity, and pressure loss. The optimal Latin hypercube method was used as the design-of-experiments approach to determine the sample points and analyze the sensitivity of the design variables. A Kriging surrogate model was built and verified against the database resulting from the CFD simulations. A multi-island genetic algorithm was then used to optimize the channel structure of the heat exchanger based on the surrogate model. The results showed that the average temperature of the heat exchanger was most sensitive to the dimple depth, while the pressure loss and temperature uniformity were most sensitive to the channel rear height h2. With an optimal channel design, the temperature uniformity was greatly improved compared with the initial exchanger, although the additional pressure loss also increased.
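
    The surrogate-based loop (optimal LHS design, Kriging fit, optimization on the surrogate) can be sketched as below. The objective is a toy stand-in for the CFD-evaluated combination of temperature, uniformity, and pressure loss, and a simple bounded quasi-Newton search stands in for the multi-island genetic algorithm.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

# Toy objective; x columns: dimple depth, print diameter, channel height,
# all normalized to [0, 1]. A real workflow would call the CFD solver here.
def cfd_objective(x):
    return (x[:, 0] - 0.6) ** 2 + 0.5 * (x[:, 1] - 0.3) ** 2 + x[:, 2]

X = qmc.LatinHypercube(d=3, seed=9).random(40)   # design-of-experiments points
y = cfd_objective(X)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)  # Kriging-style fit

# Optimize the cheap surrogate rather than the expensive simulation.
res = minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
               x0=np.full(3, 0.5), bounds=[(0.0, 1.0)] * 3, method="L-BFGS-B")
print(res.x)   # surrogate optimum; confirm with one extra CFD run
```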

  5. Optimization design of energy deposition on single expansion ramp nozzle

    NASA Astrophysics Data System (ADS)

    Ju, Shengjun; Yan, Chao; Wang, Xiaoyong; Qin, Yupei; Ye, Zhifei

    2017-11-01

    Optimization design has been widely used in the aerodynamic design process of scramjets. The single expansion ramp nozzle is an important scramjet component, producing most of the thrust. A new concept for improving the aerodynamic performance of the scramjet nozzle with energy deposition is presented. The essence of the method is to create a heated region in the inner flow field of the nozzle. In the current study, the two-dimensional coupled implicit compressible Reynolds-averaged Navier-Stokes equations and Menter's shear stress transport turbulence model have been applied to numerically simulate the flow fields of the single expansion ramp nozzle with and without energy deposition. The numerical results show that energy deposition can be an effective method to improve the force characteristics of the scramjet nozzle: the thrust coefficient CT increases by 6.94% and the lift coefficient CN decreases by 26.89%. Further, the non-dominated sorting genetic algorithm coupled with a Radial Basis Function neural network surrogate model has been employed to determine the optimum location and density of the energy deposition. The thrust coefficient CT and lift coefficient CN are selected as objective functions, and the sampling points are obtained numerically by using a Latin hypercube design method. The optimized thrust coefficient CT further increases by 1.94%, while the optimized lift coefficient CN further decreases by 15.02%. At the same time, the optimized performances are in good and reasonable agreement with the numerical predictions. The findings suggest that scramjet nozzle design and performance can benefit from the application of energy deposition.
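
    A compact sketch of the surrogate step, with toy response surfaces in place of the CFD results and a brute-force nondominated filter in place of NSGA-II, might look like this (all coefficients are illustrative):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import qmc

# RBF surrogates of the two objectives over an LHS of the energy-deposition
# (location, density) pair, normalized to [0, 1]; the linear/bilinear
# responses below are invented stand-ins for CFD samples.
X = qmc.LatinHypercube(d=2, seed=10).random(30)
ct = 1.0 + 0.07 * X[:, 0] * X[:, 1]              # toy thrust coefficient CT
cn = 1.0 - 0.25 * X[:, 1] + 0.05 * X[:, 0]       # toy lift coefficient CN
f_ct, f_cn = RBFInterpolator(X, ct), RBFInterpolator(X, cn)

# Score candidate designs on the surrogates: maximize CT, minimize CN.
cand = qmc.LatinHypercube(d=2, seed=11).random(400)
F = np.column_stack([-f_ct(cand), f_cn(cand)])

# Brute-force nondominated filter (a crude stand-in for NSGA-II sorting).
pareto = [i for i, fi in enumerate(F)
          if not any(np.all(fj <= fi) and np.any(fj < fi)
                     for j, fj in enumerate(F) if j != i)]
```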

  6. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
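
    The core REPTool idea, LHS realizations of input and coefficient error propagated to a per-cell output distribution, can be sketched in a few lines. The model form and error bounds below are illustrative assumptions, not REPTool's actual interface.

```python
import numpy as np
from scipy.stats import qmc

# Toy raster model y = a * (raster + eps) + b, with LHS over the
# coefficient errors (a, b) and a spatially invariant raster error eps.
raster = np.random.default_rng(1).random((50, 50))
u = qmc.LatinHypercube(d=3, seed=2).random(100)
params = qmc.scale(u, [1.8, -0.1, -0.05], [2.2, 0.1, 0.05])

# One output realization per LHS row; the spread per cell is the
# "distribution of uncertainty for each raster cell".
stack = np.stack([a * (raster + e) + b for a, b, e in params])
cell_std = stack.std(axis=0)
```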

  7. Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5

    DOE PAGES

    Qian, Yun; Yan, Huiping; Hou, Zhangshuan; ...

    2015-04-10

    We investigate the sensitivity of precipitation characteristics (mean, extremes, and diurnal cycle) to a set of uncertain parameters that influence the qualitative and quantitative behavior of the cloud and aerosol processes in the Community Atmosphere Model (CAM5). We adopt both Latin hypercube and quasi-Monte Carlo sampling approaches to effectively explore the high-dimensional parameter space and then conduct two large sets of simulations. One set consists of 1100 simulations (cloud ensemble) perturbing 22 parameters related to cloud physics and convection, and the other consists of 256 simulations (aerosol ensemble) focusing on 16 parameters related to aerosols and cloud microphysics. Results show that, of the 22 parameters perturbed in the cloud ensemble, the six having the greatest influence on the global mean precipitation are identified, three of which (related to the deep convection scheme) are the primary contributors to the total variance of the phase and amplitude of the precipitation diurnal cycle over land. The extreme precipitation characteristics are sensitive to fewer parameters. The precipitation does not always respond monotonically to parameter changes. The influence of individual parameters does not depend on the sampling approach or the concomitant parameters selected. Generally, the generalized linear model (GLM) is able to explain more of the parametric sensitivity of global precipitation than of local or regional features. The total explained variance for precipitation is primarily due to contributions from the individual parameters (75-90% in total). The total variance shows significant seasonal variability in mid-latitude continental regions, but is very small in tropical continental regions.

  8. 3-D ballistic transport of ellipsoidal volcanic projectiles considering horizontal wind field and variable shape-dependent drag coefficients

    NASA Astrophysics Data System (ADS)

    Bertin, Daniel

    2017-02-01

    An innovative 3-D numerical model for the dynamics of volcanic ballistic projectiles is presented here. The model focuses on ellipsoidal particles and improves previous approaches by considering a horizontal wind field, virtual mass forces, and drag forces subject to variable shape-dependent drag coefficients. Modeling suggests that the projectile's launch velocity and ejection angle are first-order parameters influencing ballistic trajectories. The projectile's density and minor radius are second-order factors, whereas both the intermediate and major radii of the projectile are of third order. Comparing output parameters under different input data highlights the importance of considering a horizontal wind field and variable shape-dependent drag coefficients in ballistic modeling, which suggests that they should be included in every ballistic model. On the other hand, virtual mass forces can be discarded, since they contribute almost nothing to ballistic trajectories. Simulation results were used to constrain crucial input parameters (launch velocity, ejection angle, wind speed, and wind azimuth) of the block that formed the biggest and most distal ballistic impact crater during the 1984-1993 eruptive cycle of Lascar volcano, Northern Chile. Subsequently, up to 10^6 simulations were performed, with nine ejection parameters defined by a Latin hypercube sampling approach. Simulation results were summarized as a quantitative probabilistic hazard map for ballistic projectiles. Transects were also produced to depict aerial hazard zones based on the same probabilistic procedure. Both maps combined can be used as a hazard prevention tool for ground and aerial transit near restless volcanoes.

  9. A Novel Approach for Determining Source-Receptor Relationships of Aerosols in Model Simulations

    NASA Astrophysics Data System (ADS)

    Ma, P.; Gattiker, J.; Liu, X.; Rasch, P. J.

    2013-12-01

    The climate modeling community usually performs sensitivity studies in a 'one-factor-at-a-time' fashion. However, owing to the a priori unknown complexity and nonlinearity of the climate system and simulation response, it is computationally expensive to systematically identify the cause-and-effect of multiple factors in climate models. In this study, we use a Gaussian process emulator, based on a small number of Community Atmosphere Model Version 5.1 (CAM5) simulations (constrained by meteorological reanalyses) using a Latin hypercube experimental design, to demonstrate that it is possible to characterize model behavior accurately and very efficiently without any modifications to the model itself. We use the emulator to characterize the source-receptor relationships of black carbon (BC), focusing specifically on the constituent burden and surface deposition rates arising from emissions in various regions. Our results show that the emulator is capable of quantifying the contributions to aerosol burden and surface deposition from different source regions, finding that most of the current Arctic BC comes from remote sources. We also demonstrate that the sensitivity of the BC burdens to emission perturbations differs by source region. For example, emission growth in Africa, where dry convection is strong, results in a moderate increase of the BC burden over the globe, while the same emission growth in the Arctic leads to a significant increase of local BC burdens and surface deposition rates. These results provide insights into the dynamical, physical, and chemical processes of the climate model, and the conclusions may have policy implications for designing cost-effective global and regional pollution management strategies.

  10. Vertical overlap of probability density functions of cloud and precipitation hydrometeors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovchinnikov, Mikhail; Lim, Kyo-Sun Sunny; Larson, Vincent E.

    Coarse-resolution climate models increasingly rely on probability density functions (PDFs) to represent subgrid-scale variability of prognostic variables. While PDFs characterize the horizontal variability, a separate treatment is needed to account for the vertical structure of clouds and precipitation. When sub-columns are drawn from these PDFs for microphysics or radiation parameterizations, appropriate vertical correlations must be enforced via PDF overlap specifications. This study evaluates the representation of PDF overlap in the Subgrid Importance Latin Hypercube Sampler (SILHS) employed in the assumed-PDF turbulence and cloud scheme called Cloud Layers Unified By Binormals (CLUBB). PDF overlap in CLUBB-SILHS simulations of continental and tropical oceanic deep convection is compared with the overlap of PDFs of various microphysics variables in cloud-resolving model (CRM) simulations of the same cases that explicitly predict the 3D structure of cloud and precipitation fields. CRM results show that PDF overlap varies significantly between different hydrometeor types, as well as between PDFs of mass and number mixing ratios for each species, a distinction that the current SILHS implementation does not make. In CRM simulations that explicitly resolve cloud and precipitation structures, faster-falling species, such as rain and graupel, exhibit significantly higher coherence in their vertical distributions than slower-falling cloud liquid and ice. These results suggest that, to improve the overlap treatment in the sub-column generator, the PDF correlations need to depend on hydrometeor properties, such as fall speeds, in addition to the currently implemented dependence on the turbulent convective length scale.

  11. Evaluation of incremental reactivity and its uncertainty in Southern California.

    PubMed

    Martien, Philip T; Harley, Robert A; Milford, Jana B; Russell, Armistead G

    2003-04-15

    The incremental reactivity (IR) and relative incremental reactivity (RIR) of carbon monoxide and 30 individual volatile organic compounds (VOC) were estimated for the South Coast Air Basin using two photochemical air quality models: a 3-D, grid-based model and a vertically resolved trajectory model. Both models include an extended version of the SAPRC99 chemical mechanism. For the 3-D modeling, the decoupled direct method (DDM-3D) was used to assess reactivities. The trajectory model was applied to estimate uncertainties in reactivities due to uncertainties in chemical rate parameters, deposition parameters, and emission rates using Monte Carlo analysis with Latin hypercube sampling. For most VOC, RIRs were found to be consistent in rankings with those produced by Carter using a box model. However, 3-D simulations show that coastal regions, upwind of most of the emissions, have comparatively low IR but higher RIR than predicted by box models for C4-C5 alkenes and carbonyls that initiate the production of HOx radicals. Biogenic VOC emissions were found to have a lower RIR than predicted by box model estimates, because emissions of these VOC were mostly downwind of the areas of primary ozone production. Uncertainties in RIR of individual VOC were found to be dominated by uncertainties in the rate parameters of their primary oxidation reactions. The coefficient of variation (COV) of most RIR values ranged from 20% to 30%, whereas the COV of absolute incremental reactivity ranged from about 30% to 40%. In general, uncertainty and variability both decreased when relative rather than absolute reactivity metrics were used.
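
    The LHS Monte Carlo step that yields a COV can be sketched as follows; the two lognormal rate constants and the toy reactivity response are illustrative stand-ins for the SAPRC99 rate parameters and the trajectory model.

```python
import numpy as np
from scipy.stats import norm, qmc

# LHS in the unit hypercube, mapped through inverse CDFs to lognormal
# rate constants (medians 1.0 and 2.0, log-sd 0.3 -- invented values).
u = qmc.LatinHypercube(d=2, seed=3).random(1000)
u = np.clip(u, 1e-9, 1 - 1e-9)                       # guard the ppf endpoints
k = np.exp(norm.ppf(u) * 0.3 + np.log([1.0, 2.0]))

# Toy incremental-reactivity response and its coefficient of variation.
rir = k[:, 0] / (1.0 + 0.5 * k[:, 1])
print(f"COV = {rir.std() / rir.mean():.2f}")
```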

  12. Simulating maize yield and biomass with spatial variability of soil field capacity

    USGS Publications Warehouse

    Ma, Liwang; Ahuja, Lajpat; Trout, Thomas; Nolan, Bernard T.; Malone, Robert W.

    2015-01-01

    Spatial variability in field soil properties is a challenge for system modelers who use single representative values, such as means, for model inputs rather than their distributions. In this study, the Root Zone Water Quality Model (RZWQM2) was first calibrated against 4 yr of maize (Zea mays L.) data at six irrigation levels in northern Colorado and then used to study the effect of the spatial variability of soil field capacity (FC), estimated in 96 plots, on maize yield and biomass. The best results were obtained when the crop parameters were fitted along with the FCs, with a root mean squared error (RMSE) of 354 kg ha⁻¹ for yield and 1202 kg ha⁻¹ for biomass. When running the model using each of the 96 sets of field-estimated FC values, instead of calibrating the FCs, the average simulated yield and biomass from the 96 runs were close to measured values, with an RMSE of 376 kg ha⁻¹ for yield and 1504 kg ha⁻¹ for biomass. When an average of the 96 FC values for each soil layer was used, simulated yield and biomass were also acceptable, with an RMSE of 438 kg ha⁻¹ for yield and 1627 kg ha⁻¹ for biomass. Therefore, when there are large numbers of FC measurements, an average value might be sufficient for model inputs. However, when only the ranges of FC measurements are known for each soil layer, a sampled distribution of FCs using Latin hypercube sampling (LHS) might be used for model inputs, as sketched below.
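
    A minimal sketch of that sampling step, with made-up per-layer FC ranges rather than the study's measurements:

```python
import numpy as np
from scipy.stats import qmc

# LHS of field capacity per soil layer from measured ranges; the three
# layer bounds are illustrative, not the study's values.
fc_lo = np.array([0.18, 0.20, 0.22])    # per-layer minimum FC (cm3/cm3)
fc_hi = np.array([0.30, 0.32, 0.34])    # per-layer maximum FC (cm3/cm3)

u = qmc.LatinHypercube(d=3, seed=4).random(96)   # one row per model run
fc_samples = qmc.scale(u, fc_lo, fc_hi)          # 96 FC triplets as model inputs
```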

  13. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NASA Astrophysics Data System (ADS)

    Sperna Weiland, Frederiek C.; Vrugt, Jasper A.; van Beek, Rens (L.) P. H.; Weerts, Albrecht H.; Bierkens, Marc F. P.

    2015-10-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we focus on large-scale hydrologic modeling and analyze the effect of parameter and rainfall data uncertainty on simulated discharge dynamics with the global hydrologic model PCR-GLOBWB. We use three rainfall data products: the CFSR reanalysis, the ERA-Interim reanalysis, and a combined ERA-40 reanalysis and CRU dataset. Parameter uncertainty is derived from Latin hypercube sampling (LHS) using monthly discharge data from five of the largest river systems in the world. Our results demonstrate that the default parameterization of PCR-GLOBWB, derived from global datasets, can be improved by calibrating the model against monthly discharge observations. Yet, it is difficult to find a single parameterization of PCR-GLOBWB that works well for all five river basins considered herein and shows consistent performance during both the calibration and evaluation periods. Still, there may be possibilities for regionalization based on catchment similarities. Our simulations illustrate that parameter uncertainty constitutes only a minor part of predictive uncertainty. Thus, the apparent dichotomy between simulations of global-scale hydrologic behavior and actual data cannot be resolved by simply increasing the model complexity of PCR-GLOBWB and resolving sub-grid processes. Instead, it would be more productive to improve the characterization of global rainfall amounts at spatial resolutions of 0.5° and smaller.

  14. A framework for building hypercubes using MapReduce

    NASA Astrophysics Data System (ADS)

    Tapiador, D.; O'Mullane, W.; Brown, A. G. A.; Luri, X.; Huedo, E.; Osuna, P.

    2014-05-01

    The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalog will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows hypercube generation to be done easily within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e., row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis workshops.
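
    The map and reduce steps of hypercube generation are conceptually simple; here is a plain-Python sketch (the binning, records, and key scheme are invented for illustration, and the Hadoop plumbing is omitted):

```python
from collections import Counter
from functools import reduce

# Map: bin a record's dimensions into a hypercube cell key.
def map_record(rec, edges):
    key = tuple(sum(v >= e for e in dim_edges)
                for v, dim_edges in zip(rec, edges))
    return Counter({key: 1})

# Reduce: sum counts per cell across mapped records.
def reduce_counts(c1, c2):
    c1.update(c2)
    return c1

records = [(0.1, 5.0), (0.4, 7.5), (0.45, 7.7)]   # e.g. (parallax, magnitude)
edges = [(0.0, 0.25, 0.5), (0.0, 5.0, 10.0)]      # bin edges per dimension

hypercube = reduce(reduce_counts,
                   (map_record(r, edges) for r in records), Counter())
print(hypercube)   # Counter({(2, 2): 2, (1, 2): 1})
```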

  15. RTNN: The New Parallel Machine in Zaragoza

    NASA Astrophysics Data System (ADS)

    Sijs, A. J. V. D.

    I report on the development of RTNN, a parallel computer designed as a 4^4 hypercube of 256 T9000 transputer nodes, each with 8 MB memory. The peak performance of the machine is expected to be 2.5 Gflops.

  16. Fault tolerant hypercube computer system architecture

    NASA Technical Reports Server (NTRS)

    Madan, Herb S. (Inventor); Chow, Edward (Inventor)

    1989-01-01

    A fault-tolerant multiprocessor computer system of the hypercube type comprising a hierarchy of computers of like kind which can be functionally substituted for one another as necessary is disclosed. Communication between the working nodes is via one communications network while communications between the working nodes and watch dog nodes and load balancing nodes higher in the structure is via another communications network separate from the first. A typical branch of the hierarchy reporting to a master node or host computer comprises a plurality of first computing nodes; a first network of message conducting paths for interconnecting the first computing nodes as a hypercube. The first network provides a path for message transfer between the first computing nodes; a first watch dog node; and a second network of message connecting paths for connecting the first computing nodes to the first watch dog node independent from the first network, the second network provides an independent path for test message and reconfiguration affecting transfers between the first computing nodes and the first watch dog node. There are additionally a plurality of second computing nodes; a third network of message conducting paths for interconnecting the second computing nodes as a hypercube. The third network provides a path for message transfer between the second computing nodes; a fourth network of message conducting paths for connecting the second computing nodes to the first watch dog node independent from the third network. The fourth network provides an independent path for test message and reconfiguration affecting transfers between the second computing nodes and the first watch dog node; and a first multiplexer disposed between the first watch dog node and the second and fourth networks for allowing the first watch dog node to selectively communicate with individual ones of the computing nodes through the second and fourth networks; as well as a second watch dog node operably connected to the first multiplexer whereby the second watch dog node can selectively communicate with individual ones of the computing nodes through the second and fourth networks. The branch is completed by a first load balancing node; and a second multiplexer connected between the first load balancing node and the first and second watch dog nodes, allowing the first load balancing node to selectively communicate with the first and second watch dog nodes.

  17. Re-evaluation of the sorption behaviour of Bromide and Sulfamethazine under field conditions using leaching data and modelling methods

    NASA Astrophysics Data System (ADS)

    Gassmann, Matthias; Olsson, Oliver; Höper, Heinrich; Hamscher, Gerd; Kümmerer, Klaus

    2016-04-01

    The simulation of reactive transport in the aquatic environment is hampered by the ambiguity of environmental fate process conceptualizations for a specific substance in the literature. Concepts are usually identified by experimental studies and inverse modelling under controlled lab conditions in order to reduce environmental uncertainties such as uncertain boundary conditions and input data. However, since environmental conditions affect substance behaviour, a re-evaluation might be necessary under environmental conditions, which might, in turn, be affected by uncertainties. Using a combination of experimental data and simulations of the leaching behaviour of the veterinary antibiotic Sulfamethazine (SMZ; synonym: sulfadimidine) and the hydrological tracer Bromide (Br) in a field lysimeter, we re-evaluated the sorption concepts of both substances under uncertain field conditions. Sampling data from a field lysimeter experiment, in which both substances were applied twice a year with manure and sampled at the bottom of two lysimeters during three subsequent years, were used for model set-up and evaluation. The total amounts of leached SMZ and Br were 22 μg and 129 mg, respectively. A reactive transport model was parameterized to the conditions of the two lysimeters, filled with monoliths (depth 2 m, area 1 m²) of a sandy soil with a low pH value, under which Bromide is sorptive. We used different sorption concepts, such as constant and organic-carbon-dependent sorption coefficients, and instantaneous and kinetic sorption equilibrium. Combining the sorption concepts resulted in four scenarios per substance with different equations for sorption equilibrium and sorption kinetics. The GLUE (Generalized Likelihood Uncertainty Estimation) method was applied to each scenario using parameter ranges found in experimental and modelling studies; a sketch of the GLUE workflow follows below. The parameter space for each scenario was sampled using a Latin hypercube method, refined around local model efficiency maxima. Results of the cumulative SMZ leaching simulations suggest a best conceptualization combining instantaneous sorption with dependence on organic carbon, which is consistent with the literature. The best Nash-Sutcliffe efficiency (Neff) was 0.96, and the 5th and 95th percentiles of the uncertainty estimation were 18 and 27 μg. In contrast, both scenarios of kinetic Br sorption had similar results (Neff = 0.99, uncertainty bounds 110-176 mg and 112-176 mg) but were clearly better than the instantaneous sorption scenarios. Therefore, only the concept of sorption kinetics could be identified for Br modelling, whereas both tested sorption equilibrium coefficient concepts performed equally well. The reasons for this specific case of equifinality may be uncertainties in model input data under field conditions or an insensitivity of the sorption equilibrium method due to the relatively low adsorption of Br. Our results show that it may be possible to identify, or at least falsify, specific sorption concepts under uncertain field conditions using a long-term leaching experiment and modelling methods. Cases of environmental fate concept equifinality raise the possibility of future model structure uncertainty analysis using an ensemble of models with different environmental fate concepts.
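
    A toy version of that GLUE loop, with a hypothetical two-parameter leaching model and a top-decile behavioral rule in place of the study's scenario-specific thresholds:

```python
import numpy as np
from scipy.stats import qmc

def nse(obs, sim):
    """Nash-Sutcliffe efficiency of a simulated series against observations."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.arange(36.0)                                   # 36 monthly samples, say
obs = 0.5 * t + np.random.default_rng(7).normal(0, 0.3, t.size)  # toy record

def leach_model(kd, alpha):
    # Hypothetical two-parameter cumulative-leaching model, not the study's.
    return np.cumsum(alpha * np.exp(-kd * t / 36.0))

# LHS over the parameter ranges, one simulation per sample.
P = qmc.scale(qmc.LatinHypercube(d=2, seed=8).random(500), [0.0, 0.2], [5.0, 1.5])
sims = np.array([leach_model(kd, a) for kd, a in P])
eff = np.array([nse(obs, s) for s in sims])

# Keep "behavioral" runs and report 5th/95th percentile prediction bounds.
behavioral = sims[eff >= np.quantile(eff, 0.9)]       # top decile, for illustration
lower, upper = np.percentile(behavioral, [5, 95], axis=0)
```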

  18. Aquatic risk assessment of pesticides in Latin America.

    PubMed

    Carriquiriborde, Pedro; Mirabella, Paula; Waichman, Andrea; Solomon, Keith; Van den Brink, Paul J; Maund, Steve

    2014-10-01

    Latin America is anticipated to be a major growth market for agriculture and production is increasing with use of technologies such as pesticides. Reports of contamination of aquatic ecosystems by pesticides in Latin America have raised concerns about potential for adverse ecological effects. In the registration process of pesticides, all countries require significant data packages on aquatic toxicology and environmental fate. However, there are usually no specific requirements to conduct an aquatic risk assessment. To address this issue, the Society of Environmental Toxicology and Chemistry organized a workshop that brought together scientists from academia, government, and industry to review and elaborate on aquatic risk assessment frameworks that can be implemented into regulation of pesticides in Latin America. The workshop concluded that the international framework for risk assessments (protection goals, effects, and exposure assessments, risk characterization, and risk mitigation) is broadly applicable in Latin America but needs further refinement for the use in the region. Some of the challenges associated with these refinements are discussed in the article. It was recognized that there is potential for data sharing both within and outside of the region where conditions are similar. However, there is a need for research to compare local species and environmental conditions to those in other jurisdictions to be able to evaluate the applicability of data used in other countries. Development should also focus on human resources as there is a need to build local capacity and capability, and scientific collaboration and exchange between stakeholders in industry, government, and academia is also important. The meeting also emphasized that, although establishing a regionally relevant risk assessment framework is important, this also needs to be accompanied by enforcement of developed regulations and good management practices to help protect aquatic habitats. Education, training, and communication efforts are needed to achieve this. © 2014 SETAC.

  19. [Adaptive reactions of students from mountain and plain regions of Latin America to conditions of middle Russia].

    PubMed

    Ermakova, N V

    2003-01-01

    This article presents the results of a comparative study of the functional state of the respiratory and cardiovascular systems of practically healthy male students aged 19-22, natives of mountain and plain regions of Latin America, during their adaptation to the conditions of central Russia. We established reliable differences in the functional state of the cardiorespiratory system between students from mountain and plain regions. Natives of mountain regions typically showed higher vital capacity, large and medium bronchial patency, stroke volume, and myocardial efficiency coefficient, but lower heart rate, systolic arterial pressure, and myocardial tension index than natives of the plains. Considerable differences were also observed in the interrelations between indicators: for plains natives, small bronchial patency correlated strongly with cardiovascular indicators, whereas for mountain natives nearly every bronchial patency indicator correlated reliably with vital capacity but not with hemodynamic indicators.

  20. [Allergic rhinitis update and its impact on asthma (ARIA 2008). Latin American perspective].

    PubMed

    Cagnani, Carlos E Baena; Solé, Dirceu; Díaz, Sandra N González; Zernotti, Mario E; Sisul, Juan C; Borges, Mario Sánchez; Guzmán, María Antonieta; Ivancevich, Juan C; Cepeda, Alfonso; Pérez, Noel Rodríguez; Gereda, José; Cruz, Alvaro; Croce, Victor H; Khaltaev, Nikolai; Bousquet, Jean

    2009-01-01

    Rhinitis is the most frequent respiratory disease in most countries of the world; an estimated 600 million people suffer from this condition. Allergic rhinitis is a public health problem at the global level. Patients who suffer from allergic rhinitis have nasal symptoms ranging from mild to very bothersome, which affect quality of life and cause sleep disorders, school and workplace absenteeism, and health expenditure. Rhinitis is frequently associated with co-morbidities such as sinusitis, otitis media, and especially asthma. Rhinitis is under-diagnosed and under-treated worldwide, including in Latin American countries. ARIA, published in 2001, was the first evidence-based guideline for the diagnosis and treatment of rhinitis with a focus on its co-morbidities, especially asthma; an update was published in 2008. ARIA recommends an integrative approach to management, including second-generation antihistamines, intranasal corticosteroids, anti-leukotrienes, and immunotherapy. It also provides a questionnaire to evaluate asthma and its severity in patients suffering from rhinitis. The prevalence of allergic rhinitis is quite high in Latin American countries, and in recent years great insight into the burden of this condition has been gained.

  1. Integrated Pest Management of Coffee Berry Borer: Strategies from Latin America that Could Be Useful for Coffee Farmers in Hawaii.

    PubMed

    Aristizábal, Luis F; Bustillo, Alex E; Arthurs, Steven P

    2016-02-03

    The coffee berry borer (CBB), Hypothenemus hampei Ferrari (Coleoptera: Curculionidae: Scolytinae), is the primary arthropod pest of coffee plantations worldwide. Since its detection in Hawaii (September 2010), coffee growers have been facing financial losses due to the reduced quality of coffee yields. Several control strategies, including cultural practices, biological control agents (parasitoids), chemical and microbial insecticides (entomopathogenic fungi), and a range of post-harvest sanitation practices, have been used to manage CBB around the world. In addition, sampling methods, including the use of alcohol-based traps for monitoring CBB populations, have been implemented in some coffee-producing countries in Latin America. It is currently unclear which combination of CBB control strategies is optimal under the economic, environmental, and sociocultural conditions of Hawaii. This review discusses the components of an integrated pest management program for CBB. We focus on practical approaches to provide guidance to coffee farmers in Hawaii. Experience with integrated pest management (IPM) of CBB gained in Latin America over the past 25 years may be relevant for establishing control strategies that fit Hawaiian coffee farmers' conditions.

  2. Latin America: population and internal unrest.

    PubMed

    Wiarda, J H; Siqueira Wiarda, I

    1985-09-01

    This discussion of population and internal unrest in Latin America covers the following: pressures on land and agriculture; economic frustrations; youth and radicalism; rising social tensions; and political instability. At current growth rates, Latin America's population is projected to increase by 225 million people between 1981 and 2001. This staggering population growth is likely to have serious political, economic, social, strategic, and other implications. The strong opposition to family planning, which came principally from nationalists, the military, and the church during the 1960s, has changed to general support for voluntary family planning programs in much of Latin America. Too rapid population growth is now widely viewed as aggravating the problems of development and putting severe strains on services and facilities. The wish to limit family size is particularly strong among women. Most of Latin America's untapped land is unusable: so steeply mountainous, densely tropical, or barren of topsoil that it cannot support life at even the most meager level of subsistence. Food production in most of Latin America has not kept pace with population growth. Since most new agricultural production is oriented toward exports rather than home consumption, conditions for most rural populations are worsening. Economic dilemmas facing Latin America include widespread poverty, the world's highest per capita debt, unemployment and underemployment that may reach 40-50% of the workforce, negative economic growth rates over the past 5 years, immense income inequalities, declining terms of trade, extensive capital flight, little new investment or foreign assistance, increased protectionism on the part of those countries with whom Latin America must trade, rising prices for the goods Latin America must import, and (in some countries) devastation of the economic infrastructure by guerrilla forces. The unprecedented flow from the countryside has made Latin America the world's third most highly urbanized region. Over 65% of its population resides in cities, particularly capital cities. Social services are breaking down in all Latin American capitals, to the extent that over half the inhabitants lack water or sewage facilities. There is mixed evidence on the relationship between youth and radicalism, yet it is clear that in Latin America most of the generation of university-trained young people are Marxists. As many as 45% of college-aged people are activists. The implications of this disaffected group coming to power are enormous, particularly since the generation currently under 15 is likely to be even more embittered and radical. Tension, frustration, and violence in society are all likely to have a political impact. In sum, a close but indirect relationship exists between unchecked population growth, spiraling socioeconomic problems, and the potential for political breakdown, destabilization, and internal unrest in Latin America.

  3. On the relationship between parallel computation and graph embedding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, A.K.

    1989-01-01

    The problem of efficiently simulating an algorithm designed for an n-processor parallel machine G on an m-processor parallel machine H with n > m arises when parallel algorithms designed for an ideal-size machine are simulated on existing machines of a fixed size. The author studies this problem when every processor of H takes over the function of a number of processors in G, and he phrases the simulation problem as a graph embedding problem. New embeddings presented address relevant issues arising from the parallel computation environment. The main focus centers around embedding complete binary trees into smaller-sized binary trees, butterflies, and hypercubes. He also considers simultaneous embeddings of r source machines into a single hypercube. Constant factors play a crucial role in his embeddings since they are not only important in practice but also lead to interesting theoretical problems. All of his embeddings minimize dilation and load, which are the conventional cost measures in graph embeddings and determine the maximum amount of time required to simulate one step of G on H (a small computation of both measures is sketched below). His embeddings also optimize a new cost measure called (α,β)-utilization, which characterizes how evenly the processors of H are used by the processors of G. Ideally, the utilization should be balanced (i.e., every processor of H simulates at most n/m processors of G), and the (α,β)-utilization measures how far from a balanced utilization the embedding is. He presents embeddings for the situation when some processors of G have different capabilities (e.g., memory or I/O) than others and the processors with different capabilities are to be distributed uniformly among the processors of H. Placing such conditions on an embedding results in an increase in some of the cost measures.
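
    Dilation and load are mechanical to compute once an embedding is given; for a hypercube host, the distance between two nodes is the Hamming distance of their labels. A toy example:

```python
from collections import Counter

def dilation_and_load(guest_edges, embedding):
    """Dilation = max host distance over guest edges (Hamming distance,
    i.e. popcount of XOR, for a hypercube host); load = max number of
    guest nodes mapped to any one host node."""
    dilation = max(bin(embedding[u] ^ embedding[v]).count("1")
                   for u, v in guest_edges)
    load = max(Counter(embedding.values()).values())
    return dilation, load

# Toy guest: a 3-node path mapped into a 2-cube (host labels 0..3).
edges = [(0, 1), (1, 2)]
emb = {0: 0b00, 1: 0b01, 2: 0b11}
print(dilation_and_load(edges, emb))   # (1, 1): unit dilation, balanced load
```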

  4. Hypercube Expert System Shell - Applying Production Parallelism.

    DTIC Science & Technology

    1989-12-01

    possible processor organizations, or int( rconntction n thod,, for par- allel architetures . The following are examples of commonlv used interconnection...this timing analysis because match speed-up avaiiah& from production parallelism is proportional to the average number of affected produclions1 ( 11:5

  5. The human gut microbiome of Latin America populations: a landscape to be discovered.

    PubMed

    Magne, Fabien; O'Ryan, Miguel L; Vidal, Roberto; Farfan, Mauricio

    2016-10-01

    The gut microbiome is critical for human health, and its alteration is associated with intestinal, autoimmune, and metabolic diseases. Numerous studies have focused on prevention or treatment of a dysbiotic microbiome to reduce the risk or effect of these diseases. A key issue is to define the microbiome associated with a state of good health. The purpose of this review is to describe factors influencing the gut microbiome, with special emphasis on contributions from Latin America. In addition, we highlight opportunities for future studies of the gut microbiome in Latin America. A relevant factor influencing gut microbiome composition is geographical location, associated with specific genetic, dietary, and lifestyle factors. Geographical specificities suggest that a universal 'healthy microbiome' is unlikely. Several research programs, mostly from Europe and North America, are extensively sequencing the gut microbiome of healthy people, whereas data from Latin America remain scarce, though slowly increasing. A few studies have shown differences in gut microbiome composition between their local populations and those of other industrialized countries (North American populations). Latin America is composed of countries with a myriad of lifestyles, traditions, genetic backgrounds, and socioeconomic conditions, which may determine differences in the gut microbiome of individuals from different countries. This represents an opportunity to better understand the relationship between these factors and the gut microbiome.

  6. The cost of Latin American science Introduction for the second issue of CBP-Latin America.

    PubMed

    Zenteno-Savín, Tania; Beleboni, René Oliveira; Hermes-Lima, Marcelo

    2007-04-01

    Latin American researchers in science and engineering (S&E), including those in biology and biomedical sciences, are frequently exposed to unstable conditions of financial support, material and human resources, and a limited number of positions at public and private institutions. Such uncertainties impose continuous challenges for the scientific community which, in the best of cases, responds with careful planning and creativity, and in the worst scenario endures the migration of scientists to the USA or Europe. Still, the number of scientific publications from Latin American institutions in the last decade increased at a much faster rate than publications from the USA and Canada. A brief analysis per country of the gross domestic product (GDP) spent on research and development (R&D) and the S&E production reported by the Pascal bibliographic database suggests that the number and quality of S&E publications are directly proportional to the financial support for R&D. However, the investment in R&D in Latin America did not increase at the same rate (from 0.49 to 0.55% of GDP, from 1990 to 2003) at which S&E publications did in the same period (2.9-fold increase, from 1988 to 2001). In Latin America, the traditional financial support for scientific research continues to be from federal and state government funds, associated in some cases with institutional funds that are mostly directed towards administrative costs and infrastructure maintenance. The aim of this introduction is to briefly discuss the production cost of articles published in refereed S&E journals, including the cost of the scientific research behind them, and, at the same time, to increase the awareness of the high quality of scientific research in Latin American institutions despite the many challenges, especially financial constraints, faced by their scientists. The second issue of Comparative Biochemistry and Physiology dedicated to Latin America ("The Face of Latin American Comparative Biochemistry and Physiology") celebrates, by means of 26 manuscripts from five countries, the diversity and quality of biological science on the continent.

  7. Matched-filter algorithm for subpixel spectral detection in hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Borough, Howard C.

    1991-11-01

    Hyperspectral imagery, spatial imagery with associated wavelength data for every pixel, offers significant potential for improved detection and identification of certain classes of targets. The ability to make spectral identifications of objects which only partially fill a single pixel (due to range or small size) is of considerable interest. Multiband imagery such as Landsat's 5- and 7-band imagery has demonstrated significant utility in the past. Hyperspectral imaging systems with hundreds of spectral bands offer improved performance. To explore the application of different subpixel spectral detection algorithms, a synthesized set of hyperspectral image data (hypercubes) was generated utilizing NASA earth resources and other spectral data. The data were modified using LOWTRAN 7 to model the illumination, atmospheric contributions, attenuations, and viewing geometry representing a nadir view from 10,000 ft altitude. The base hypercube (HC) represented 16 by 21 spatial pixels with 101 wavelength samples from 0.5 to 2.5 micrometers for each pixel. Insertions were made into the base data with random location, random pixel percentage, and random material. Fifteen different hypercubes were generated for blind testing of candidate algorithms. An algorithm utilizing a matched filter in the spectral dimension proved surprisingly good, yielding 100% detection for pixels filled greater than 40% with a standard camouflage paint, and a 50% probability of detection for pixels filled 20% with the paint, with no false alarms. The false alarm rate as a function of the number of spectral bands, over the range from 101 to 12 bands, was measured and found to increase from zero to 50%, illustrating the value of a large number of spectral bands. This test was on imagery without system noise; the next step is to incorporate typical system noise sources.
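
    The spectral matched filter at the heart of such an algorithm weights each pixel spectrum by the background-whitened target signature. Below is a minimal sketch on synthetic data of the same dimensions (16 by 21 pixels, 101 bands); the regularization and detection threshold are assumptions, not the paper's values.

```python
import numpy as np

# Synthetic background-only scene and a library target signature.
rng = np.random.default_rng(5)
cube = rng.normal(size=(16, 21, 101))        # rows x cols x bands
target = rng.normal(size=101)                # e.g. camouflage-paint spectrum

pixels = cube.reshape(-1, 101)
mu = pixels.mean(axis=0)
C = np.cov(pixels, rowvar=False) + 1e-3 * np.eye(101)  # regularized covariance

# Matched-filter weights w = C^{-1}(t - mu), scaled so a pure target scores 1.
w = np.linalg.solve(C, target - mu)
w /= (target - mu) @ w

scores = ((pixels - mu) @ w).reshape(16, 21)
detections = scores > 0.2                    # roughly a 20% subpixel-fill threshold
```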

  8. The Latin American Social Medicine database

    PubMed Central

    Eldredge, Jonathan D; Waitzkin, Howard; Buchanan, Holly S; Teal, Janis; Iriart, Celia; Wiley, Kevin; Tregear, Jonathan

    2004-01-01

    Background Public health practitioners and researchers for many years have been attempting to understand more clearly the links between social conditions and the health of populations. Until recently, most public health professionals in English-speaking countries were unaware that their colleagues in Latin America had developed an entire field of inquiry and practice devoted to making these links more clearly understood. The Latin American Social Medicine (LASM) database bridges this gap. Description This public health informatics case study describes the key features of a unique information resource intended to improve access to LASM literature and to augment understanding about the social determinants of health. This case study includes both quantitative and qualitative evaluation data. Currently the LASM database at The University of New Mexico brings important information, originally known mostly within professional networks located in Latin American countries, to public health professionals worldwide via the Internet. The LASM database uses trilingual (Spanish, Portuguese, and English) structured abstracts to summarize classic and contemporary works. Conclusion This database provides helpful information for public health professionals on the social determinants of health and expands access to LASM. PMID:15627401

  9. [Latin American consensus on hypertension in patients with diabetes type 2 and metabolic syndrome].

    PubMed

    López-Jaramillo, Patricio; Sánchez, Ramiro A; Diaz, Margarita; Cobos, Leonardo; Bryce, Alfonso; Parra-Carrillo, Jose Z; Lizcano, Fernando; Lanas, Fernando; Sinay, Isaac; Sierra, Iván D; Peñaherrera, Ernesto; Bendersky, Mario; Schmid, Helena; Botero, Rodrigo; Urina, Manuel; Lara, Joffre; Foss, Milton C; Márquez, Gustavo; Harrap, Stephen; Ramírez, Agustín J; Zanchetti, Alberto

    2014-04-01

    The present document has been prepared by a group of experts, members of cardiology, endocrinology, internal medicine, nephrology, and diabetes societies of Latin American countries, to serve as a guide to physicians taking care of patients with diabetes, hypertension, and comorbidities or complications of both conditions. Although the concept of metabolic syndrome is currently disputed, the higher prevalence in Latin America of that cluster of metabolic alterations suggests that metabolic syndrome is a useful nosographic entity in the context of Latin American medicine. Therefore, in the present document, particular attention is paid to this syndrome in order to alert physicians to a particular high-risk population, usually underestimated and undertreated. These recommendations result from presentations and debates by discussion panels during a 2-day conference held in Bucaramanga in October 2012, and all the participants have approved the final conclusions. The authors acknowledge that the publication and diffusion of guidelines do not suffice to achieve the recommended changes in diagnostic or therapeutic strategies, and they plan suitable interventions to overcome the knowledge, attitude, and behavioural barriers that prevent both physicians and patients from effectively adhering to guideline recommendations.

  10. PCLIPS: Parallel CLIPS

    NASA Technical Reports Server (NTRS)

    Hall, Lawrence O.; Bennett, Bonnie H.; Tello, Ivan

    1994-01-01

    A parallel version of CLIPS 5.1 has been developed to run on Intel Hypercubes. The user interface is the same as that of CLIPS, with some added commands to allow for parallel calls. A complete version of CLIPS runs on each node of the hypercube. The system has been instrumented to display the time spent in the match, recognize, and act cycles on each node. Only rule-level parallelism is supported. Parallel commands enable the assertion and retraction of facts to and from remote nodes' working memory. Parallel CLIPS was used to implement a knowledge-based command, control, communications, and intelligence (C(sup 3)I) system to demonstrate the fusion of high-level, disparate sources. We discuss the nature of the information fusion problem, our approach, and the implementation. Parallel CLIPS has also been used to run several benchmark parallel knowledge bases, such as one to set up a cafeteria. Results from running Parallel CLIPS with parallel knowledge-base partitions indicate that significant speed increases, including superlinear in some cases, are possible.

  11. Optimisation and evaluation of hyperspectral imaging system using machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Suthar, Gajendra; Huang, Jung Y.; Chidangil, Santhosh

    2017-10-01

    Hyperspectral imaging (HSI), also called imaging spectrometry, originated in remote sensing. Hyperspectral imaging is an emerging imaging modality for medical applications, especially in disease diagnosis and image-guided surgery. HSI acquires a three-dimensional dataset called a hypercube, with two spatial dimensions and one spectral dimension. The spatially resolved spectral imaging obtained by HSI provides diagnostic information about an object's physiology, morphology, and composition. The present work involves testing and evaluating the performance of a hyperspectral imaging system. The methodology involved manually acquiring reflectance images or scans of the objects, which were cabbage and tomato. The data were then converted to the required format and analyzed using a machine learning algorithm. The machine learning algorithms applied were able to distinguish between the objects present in the hypercube obtained by the scan. It was concluded from the results that the system was working as expected, as evidenced by the distinct spectra obtained with the machine learning algorithm.

  12. Assessing Degree of Susceptibility to Landslide Hazard

    NASA Astrophysics Data System (ADS)

    Sheridan, M. F.; Cordoba, G. A.; Delgado, H.; Stefanescu, R.

    2013-05-01

    The modeling of hazardous mass flows, both dry and water-saturated, is currently an area of active research, and several stable models have now emerged that have differing degrees of physical and mathematical fidelity. Models based on the early work of Savage and Hutter (1989) assume that very large dense granular flows can be modeled as incompressible continua governed by a Coulomb failure criterion. Based on this concept, Patra et al. (2005) developed a code for dry avalanches, which proposes a thin-layer mathematical model similar to the shallow-water equations. This concept was implemented in the widely used TITAN2D program, which integrates the shock-capturing Godunov solution methodology for the equation system. We propose a method to assess the susceptibility of specific locations to landslides following heavy tephra fall, using the TITAN2D code. Successful application requires framing the range of several uncertainties in the selection of model input data: 1) initial conditions, such as the volume and location of origin of the landslide, 2) bed and internal friction parameters, and 3) digital elevation model (DEM) uncertainties. Among the possible ways of coping with these uncertainties, we chose Latin Hypercube Sampling (LHS). This statistical technique reduces a computationally intractable problem to such an extent that it is possible to apply it even with current personal computers. LHS requires that there is only one sample in each row and each column of the sampling matrix, where each (multi-dimensional) row corresponds to one uncertainty. LHS requires less than 10% of the sample runs needed by Monte Carlo approaches to achieve a stable solution. In our application, LHS provides model samples for 4 input parameters: initial random volumes, UTM location (x and y), and bed friction. We developed a simple Octave script to link the output of LHS with TITAN2D. In this way, TITAN2D can run several times with successively different initial conditions provided by the LHS routine. Finally, the set of results from TITAN2D is processed to obtain the distribution of maximum exceedance probability given that a landslide occurs at a place of interest. We apply this method to find sectors least prone to be affected by landslides in a region along the Panamerican Highway in the southern part of Colombia. The goal of such a study is to provide decision makers with information that improves their assessments regarding permissions for development along the highway.
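
    As a sketch of the sampling step described above (not the authors' Octave script), SciPy's quasi-Monte Carlo module can generate the four-parameter Latin hypercube design; the parameter bounds below are placeholders, not values from the study:

      import numpy as np
      from scipy.stats import qmc

      # One dimension per uncertain input: volume, UTM x, UTM y, bed friction.
      sampler = qmc.LatinHypercube(d=4, seed=42)
      unit = sampler.random(n=100)   # 100 runs; one sample per row/column stratum

      # Placeholder lower/upper bounds for the four inputs (not from the paper).
      lower = [1e4, 200000.0, 1000000.0, 10.0]   # m^3, m, m, degrees
      upper = [1e6, 210000.0, 1010000.0, 25.0]
      samples = qmc.scale(unit, lower, upper)    # each row = one TITAN2D input set

    Each row of samples would then be written into a TITAN2D input file, which is the role the authors' Octave script plays.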

  13. [Nosocomial myiasis in Latin America and the Caribbean: an overlooked reality?].

    PubMed

    Sánchez-Sánchez, Rocío; Calderón-Arguedas, Ólger; Mora-Brenes, Nury; Troyo, Adriana

    2014-09-01

    Nosocomial myiasis is an infestation by fly larvae that occurs while a patient is hospitalized. To analyze the available information on nosocomial myiasis in Latin America and the Caribbean, a search was done for cases published in the last 52 years. Nine clinical cases were found for Brazil, Costa Rica, French Guiana, Honduras, and Jamaica. Two other publications mention 139 cases in El Salvador and some 32 in Colombia, respectively. The patients and environments described presented conditions that predispose to this type of infestation. Notification of nosocomial myiasis is not usually compulsory in Latin America and the Caribbean, meaning that there is probably considerable underreporting. Awareness of myiasis in the region needs to be raised and its registration improved to aid the adoption of better prevention measures, which will benefit patient care during hospitalization.

  14. Relation between the global burden of disease and randomized clinical trials conducted in Latin America published in the five leading medical journals.

    PubMed

    Perel, Pablo; Miranda, J Jaime; Ortiz, Zulma; Casas, Juan Pablo

    2008-02-27

    Since 1990, non-communicable diseases and injuries have accounted for the majority of deaths and disability-adjusted life years in Latin America. We analyzed the relationship between the global burden of disease and Randomized Clinical Trials (RCTs) conducted in Latin America that were published in the five leading medical journals. We included all RCTs in humans, exclusively conducted in Latin American countries, and published in any of the following journals: Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine. We described the trials and reported the number of RCTs according to the main categories of the global burden of disease. Sixty-six RCTs were identified. Communicable diseases accounted for 38 (57%) reports. Maternal, perinatal, and nutritional conditions accounted for 19 (29%) trials. Non-communicable diseases represent 48% of the global burden of disease but only 14% of reported trials. No trial addressed injuries despite their 18% contribution to the burden of disease in 2000. A poor correlation between the burden of disease and RCT publications was found. Non-communicable diseases and injuries account for up to two thirds of the burden of disease in Latin America, but these topics are seldom addressed in published RCTs in the selected sample of journals. Funding bodies of health research and editors should be aware of the increasing burden of non-communicable diseases and injuries occurring in Latin America to ensure that this growing epidemic is not neglected in the research agenda and not affected by publication bias.

  15. The role of tramadol in pain management in Latin America: a report by the Change Pain Latin America Advisory Panel.

    PubMed

    Santos Garcia, Joäo Batista; Lech, Osvandré; Campos Kraychete, Durval; Rico, María Antonieta; Hernández-Castro, John Jairo; Colimon, Frantz; Guerrero, Carlos; Sempértegui Gallegos, Manuel; Lara-Solares, Argelia; Flores Cantisani, José Alberto; Amescua-Garcia, César; Guillén Núñez, María Del Rocío; Berenguel Cook, María Del Rosario; Jreige Iskandar, Aziza; Bonilla Sierra, Patricia

    2017-09-01

    Change Pain Latin America (CPLA) was created to improve the understanding of chronic pain and to develop strategies for better pain management in the region. During its seventh meeting (August 2016), the main objective was to discuss tramadol's role in treating pain in Latin America. Furthermore, the potential consequences for pain management were considered, should tramadol become more stringently controlled. Key topics discussed were: the main indications for prescribing tramadol, its pharmacological characteristics, safety and tolerability, the effects of restrictions on its availability and use, and the consequent impact on the quality of pain care. The experts agreed that tramadol is used to treat a wide spectrum of non-oncological pain conditions (e.g. post-surgical, musculoskeletal, post-traumatic, neuropathic, fibromyalgia), as well as cancer pain. Its relevance when treating special patient groups (e.g. the elderly) is recognized. The main reasons for tramadol's high significance as a treatment option are its broad efficacy, an unremarkable safety profile, and its availability, considering that access to strong analgesics - mainly controlled drugs (classical opioids) - is highly restricted in some countries. The CPLA also agreed that tramadol is well tolerated, without the safety issues associated with long-term nonsteroidal anti-inflammatory drug (NSAID) use, and with fewer opioid-like side effects and a lower abuse risk than classical opioids. In Latin America, tramadol is a valuable and frequently used medication for treating moderate to severe pain. More stringent regulations would have a significant impact on its availability, especially for outpatients. This could cause regression to older and frequently inadequate pain management methods, resulting in unnecessary suffering for many Latin American patients.

  16. Integrating satellite actual evapotranspiration patterns into distributed model parametrization and evaluation for a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.

    2016-12-01

    Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To that end, it is important to identify the model parameters that can change spatial patterns before calibrating the hydrologic model against satellite data. Our study is based on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET); second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model is selected because it allows the spatial distribution of key soil parameters to change through the calibration of pedo-transfer function (PTF) parameters, and it includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function employing remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of the 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas the streamflow dynamics are sensitive to the PTF parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow does not reduce the spatial errors in AET; it improves only the streamflow simulations. We will further examine the results of model calibration using only spatial objective functions measuring the association between observed and simulated AET maps, and another case including spatial and streamflow metrics together.

  17. Stepwise sensitivity analysis from qualitative to quantitative: Application to the terrestrial hydrological modeling of a Conjunctive Surface-Subsurface Process (CSSP) land surface model

    NASA Astrophysics Data System (ADS)

    Gan, Yanjun; Liang, Xin-Zhong; Duan, Qingyun; Choi, Hyun Il; Dai, Yongjiu; Wu, Huan

    2015-06-01

    An uncertainty quantification framework was employed to examine the sensitivities of 24 model parameters from a newly developed Conjunctive Surface-Subsurface Process (CSSP) land surface model (LSM). The sensitivity analysis (SA) was performed over 18 representative watersheds in the contiguous United States to examine the influence of model parameters on the simulation of terrestrial hydrological processes. Two normalized metrics, relative bias (RB) and Nash-Sutcliffe efficiency (NSE), were adopted to assess the fit between simulated and observed streamflow discharge (SD) and evapotranspiration (ET) for a 14-year period. SA was conducted using a multiobjective two-stage approach, in which the first stage was a qualitative SA using the Latin Hypercube-based One-At-a-Time (LH-OAT) screening, and the second stage was a quantitative SA using the Multivariate Adaptive Regression Splines (MARS)-based Sobol' sensitivity indices. This approach combines the merits of qualitative and quantitative global SA methods, and is effective and efficient for understanding and simplifying large, complex system models. Ten of the 24 parameters were identified as important across different watersheds. The contribution of each parameter to the total response variance was then quantified by Sobol' sensitivity indices. Generally, parameter interactions contribute the most to the response variance of the CSSP, and only 5 of the 24 parameters dominate model behavior. Four photosynthetic and respiratory parameters are shown to be influential for ET, whereas the reference depth for saturated hydraulic conductivity is the most influential parameter for SD in most watersheds. Parameter sensitivity patterns mainly depend on the hydroclimatic regime, as well as vegetation type and soil texture.
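
    For illustration, here is a simplified sketch of the LH-OAT screening idea used in the first stage: take Latin hypercube base points, perturb one parameter at a time, and average the absolute elementary effects. The model and parameter ranges are placeholders, and the effect measure is a simplified variant rather than the exact published formula:

      import numpy as np
      from scipy.stats import qmc

      def lh_oat(model, lower, upper, n_points=20, delta=0.05, seed=0):
          # Latin hypercube base points covering the parameter space.
          d = len(lower)
          base = qmc.scale(qmc.LatinHypercube(d=d, seed=seed).random(n_points),
                           lower, upper)
          effects = np.zeros(d)
          for x in base:
              y0 = model(x)
              for j in range(d):
                  xp = x.copy()
                  xp[j] *= 1.0 + delta        # one-at-a-time relative perturbation
                  # Simplified absolute elementary effect for parameter j.
                  effects[j] += abs(model(xp) - y0) / delta
          return effects / n_points           # mean effect = screening score

      # Toy usage with a hypothetical 3-parameter model.
      scores = lh_oat(lambda p: p[0] ** 2 + 0.1 * p[1], [0.0] * 3, [1.0] * 3)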

  18. Estimation of ice activation parameters within a particle tracking Lagrangian cloud model using the ensemble Kalman filter to match ISDAC golden case observations

    NASA Astrophysics Data System (ADS)

    Reisner, J. M.; Dubey, M. K.

    2010-12-01

    To both quantify and reduce uncertainty in ice activation parameterizations for stratus clouds occurring in the temperature range between -5 and -10 °C, ensemble simulations of an ISDAC golden case have been conducted. To formulate the ensemble, three parameters found within an ice activation model were sampled using a Latin hypercube technique over a parameter range that induces large variability in both the number and mass of ice. The ice activation model is contained within a Lagrangian cloud model that simulates particle number as a function of radius for cloud ice, snow, graupel, cloud, and rain particles. A unique aspect of this model is that it produces very low levels of numerical diffusion, which enable the model to accurately resolve the sharp cloud edges associated with the ISDAC stratus deck. Another important aspect of the model is that near the cloud edges the number of particles can be significantly increased to reduce sampling errors and accurately resolve physical processes, such as collision-coalescence, that occur in this region. Thus, given these relatively low numerical errors compared to traditional bin models, the sensitivity of a stratus deck to changes in parameters found within the activation model can be examined without fear of numerical contamination. Likewise, once the ensemble has been completed, ISDAC observations can be incorporated into a Kalman filter to optimally estimate the ice activation parameters and reduce overall model uncertainty. Hence, this work will highlight the ability of an ensemble Kalman filter system coupled to a highly accurate numerical model to estimate important parameters found within microphysical parameterizations containing high uncertainty.
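
    A minimal sketch of a stochastic (perturbed-observation) ensemble Kalman update for parameter estimation, in the spirit of the approach described; the ensemble, forward map, and error levels below are placeholders rather than the authors' setup:

      import numpy as np

      rng = np.random.default_rng(1)

      def enkf_update(params, pred, obs, obs_err):
          # params: (n_ens, n_par) ensemble; pred: (n_ens, n_obs) predicted obs.
          P = params - params.mean(axis=0)
          Y = pred - pred.mean(axis=0)
          n = params.shape[0]
          C_py = P.T @ Y / (n - 1)                        # param-obs covariance
          C_yy = Y.T @ Y / (n - 1) + np.diag(obs_err ** 2)
          K = C_py @ np.linalg.inv(C_yy)                  # Kalman gain
          perturbed = obs + obs_err * rng.standard_normal(pred.shape)
          return params + (perturbed - pred) @ K.T

      # Toy usage: 50-member ensemble of 3 parameters, 5 observations.
      params = rng.uniform(0.0, 1.0, size=(50, 3))
      pred = params @ rng.uniform(0.5, 1.5, size=(3, 5))  # placeholder forward map
      posterior = enkf_update(params, pred, np.full(5, 1.2), np.full(5, 0.1))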

  19. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    PubMed Central

    An, Yongkai; Lu, Wenxi; Cheng, Weiguo

    2015-01-01

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with western Jilin province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu county and Qian Gorlos county, respectively, so as to supply water to Daan county. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region of the input variables. A surrogate of the numerical groundwater flow model was then developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, indicating high approximation accuracy. A comparison between the surrogate-based simulation optimization model and the conventional simulation optimization model on the same optimization problem shows that the former needs only 5.5 hours whereas the latter needs 25 days. The above results indicate that the surrogate model developed in this study can not only considerably reduce the computational burden of the simulation optimization process, but also maintain high computational accuracy. This can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately. PMID:26264008
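
    As an illustrative sketch of the surrogate step, with ordinary Gaussian-process regression standing in for the paper's regression kriging, and with hypothetical pumping-rate inputs and drawdown outputs:

      import numpy as np
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      # Hypothetical design: 40 LHS samples of 4 pumping rates (m^3/day).
      X = qmc.scale(qmc.LatinHypercube(d=4, seed=3).random(40),
                    [500.0] * 4, [5000.0] * 4)

      # Stand-in for running the groundwater flow model at each sample.
      drawdown = (X ** 0.5).sum(axis=1) / 100.0

      # Fit the cheap surrogate, then evaluate it instead of the full model.
      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1000.0] * 4),
                                    normalize_y=True).fit(X, drawdown)
      mean, std = gp.predict(X[:5], return_std=True)

    The optimizer would then call gp.predict in place of the hours-long simulation, which is what produces the reported 5.5-hour versus 25-day difference.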

  20. Towards a consensus-based biokinetic model for green microalgae - The ASM-A.

    PubMed

    Wágner, Dorottya S; Valverde-Pérez, Borja; Sæbø, Mariann; Bregua de la Sotilla, Marta; Van Wagenen, Jonathan; Smets, Barth F; Plósz, Benedek Gy

    2016-10-15

    Cultivation of microalgae in open ponds and closed photobioreactors (PBRs) using wastewater resources offers an opportunity for biochemical nutrient recovery. Effective reactor system design and process control of PBRs require process models. Several models with different complexities have been developed to predict microalgal growth. However, none of these models can effectively describe all the relevant processes when microalgal growth is coupled with nutrient removal and recovery from wastewaters. Here, we present a mathematical model developed to simulate green microalgal growth (ASM-A) using the systematic approach of the activated sludge modelling (ASM) framework. The process model - identified based on a literature review and using new experimental data - accounts for factors influencing photoautotrophic and heterotrophic microalgal growth, nutrient uptake and storage (i.e. the Droop model) and decay of microalgae. Model parameters were estimated using laboratory-scale batch and sequenced batch experiments with the novel Latin Hypercube Sampling based Simplex (LHSS) method. The model was evaluated using independent data obtained in a 24-L PBR operated in sequenced batch mode, and its identifiability was assessed. The model can effectively describe microalgal biomass growth, ammonia and phosphate concentrations, as well as phosphorus storage, using a set of average parameter values estimated from the experimental data. A statistical analysis of simulated and measured data suggests that culture history and substrate availability can introduce significant variability in parameter values for predicting the reaction rates for the bulk nitrate and intracellularly stored nitrogen state variables, thereby requiring scenario-specific model calibration. ASM-A was identified using standard cultivation medium, and it can provide a platform for extensions accounting for factors influencing algal growth and nutrient storage using wastewater resources.
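
    A minimal sketch of the idea behind an LHS-seeded simplex search (multi-start Nelder-Mead from a Latin hypercube design); the objective function and bounds are placeholders, not the ASM-A calibration itself:

      import numpy as np
      from scipy.stats import qmc
      from scipy.optimize import minimize

      def sse(theta):
          # Placeholder objective: squared error of a toy 2-parameter decay model.
          t = np.linspace(0.0, 1.0, 20)
          obs = 2.0 * np.exp(-1.5 * t)
          return np.sum((theta[0] * np.exp(-theta[1] * t) - obs) ** 2)

      lower, upper = [0.1, 0.1], [5.0, 5.0]
      starts = qmc.scale(qmc.LatinHypercube(d=2, seed=7).random(10), lower, upper)

      # Run a local simplex search from each LHS start; keep the best fit.
      fits = [minimize(sse, x0, method="Nelder-Mead") for x0 in starts]
      best = min(fits, key=lambda r: r.fun)

    The LHS starts guard against the simplex stalling in a local minimum, which is the rationale for combining the two methods.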

  1. Cystic echinococcosis in marketed offal of sheep in Basrah, Iraq: Abattoir-based survey and a probabilistic model estimation of the direct economic losses due to hydatid cyst.

    PubMed

    Abdulhameed, Mohanad F; Habib, Ihab; Al-Azizz, Suzan A; Robertson, Ian

    2018-02-01

    Cystic echinococcosis (CE) is a highly endemic parasitic zoonosis in Iraq with substantial impacts on livestock productivity and human health. The objectives of this study were to determine the abattoir-based occurrence of CE in marketed offal of sheep in Basrah province, Iraq, and to estimate, using a probabilistic modelling approach, the direct economic losses due to hydatid cysts. Based on detailed visual meat inspection, an active abattoir survey revealed hydatid cysts in 7.3% (95% CI: 5.4; 9.6) of 631 examined sheep carcasses. Post-mortem lesions of hydatid cyst were concurrently present in the livers and lungs of more than half (54.3%; 25/46) of the positive sheep. Direct economic losses due to hydatid cysts in marketed offal were estimated using data from government reports, the abattoir survey completed in this study, and the expert opinions of local veterinarians and butchers. A Monte-Carlo simulation model was developed in a spreadsheet utilizing Latin Hypercube sampling to account for uncertainty in the input parameters. The model estimated the average annual economic losses associated with hydatid cysts in the liver and lungs of sheep marketed for human consumption in Basrah to be US$72,470 (90% Confidence Interval (CI); ±11,302). The mean proportion of annual losses in meat product value (carcasses and offal) due to hydatid cysts in the liver and lungs of sheep marketed in Basrah province was estimated as 0.42% (90% CI; ±0.21). These estimates suggest that CE is responsible for considerable livestock-associated monetary losses in the south of Iraq. These findings can be used to inform different regional CE control program options in Iraq.
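
    A sketch of this kind of spreadsheet-style Monte-Carlo loss model, using Latin hypercube draws to propagate the uncertain inputs; all distributions and values below are illustrative, not the study's estimates:

      import numpy as np
      from scipy.stats import qmc, norm, uniform

      n = 10000
      u = qmc.LatinHypercube(d=3, seed=11).random(n)

      # Illustrative input distributions (hypothetical, not the paper's).
      prevalence = uniform(0.054, 0.096 - 0.054).ppf(u[:, 0])  # cyst prevalence
      n_sheep = norm(200000, 20000).ppf(u[:, 1])               # sheep marketed/year
      loss_per = uniform(3.0, 5.0 - 3.0).ppf(u[:, 2])          # US$ lost per case

      annual_loss = prevalence * n_sheep * loss_per
      lo, hi = np.percentile(annual_loss, [5, 95])             # 90% interval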

  2. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    NASA Astrophysics Data System (ADS)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts, they also provide a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology), and the study site is the Des Anglais watershed, a 690 km2 river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were therefore also used in a calibration exercise to determine the standard deviations of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observations during rainfall events; assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observations at both the outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.

  3. Optimization of a simplified automobile finite element model using time varying injury metrics.

    PubMed

    Gaewsky, James P; Danelson, Kerry A; Weaver, Caitlin M; Stitzel, Joel D

    2014-01-01

    In 2011, frontal crashes resulted in 55% of passenger car injuries in the United States, with 10,277 fatalities and 866,000 injuries. To better understand frontal crash injury mechanisms, human body finite element models (FEMs) can be used to reconstruct Crash Injury Research and Engineering Network (CIREN) cases. A limitation of this method is the paucity of vehicle FEMs; therefore, we developed a functionally equivalent simplified vehicle model. The New Car Assessment Program (NCAP) data for our selected vehicle were from a frontal collision with a Hybrid III (H3) Anthropomorphic Test Device (ATD) occupant. From the NCAP test reports, the vehicle geometry was created and the H3 ATD was positioned. The material and component properties optimized using a variation study process were: steering column shear bolt fracture force and stroke resistance, seatbelt pretensioner force, frontal and knee bolster airbag stiffness, and belt friction through the D-ring. These parameters were varied using three successive Latin Hypercube Designs of Experiments with 130-200 simulations each. The H3 injury response was compared to the reported NCAP frontal test results for the head, chest and pelvis accelerations, and the seat belt and femur forces. The phase, magnitude, and comprehensive error factors from a Sprague and Geers analysis were calculated for each injury metric and then combined to determine the simulations with the best match to the crash test. Sprague and Geers analyses typically yield error factors ranging from 0 to 1, with lower scores indicating a better match. The total body injury response error factor for the most optimized simulation from each round of the variation study decreased from 0.466 to 0.395 to 0.360. This procedure for optimizing vehicle FEMs is a valuable tool for conducting future CIREN case reconstructions in a variety of vehicles.
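
    For reference, the Sprague and Geers magnitude (M), phase (P), and comprehensive (C) error factors between a measured signal m(t) and a computed signal c(t) follow the standard definitions; a sketch with discrete sums standing in for the time integrals:

      import numpy as np

      def sprague_geers(m, c):
          # m, c: measured and computed signals sampled at the same times.
          mm = np.dot(m, m)
          cc = np.dot(c, c)
          mc = np.dot(m, c)
          M = np.sqrt(cc / mm) - 1.0                               # magnitude error
          P = np.arccos(np.clip(mc / np.sqrt(mm * cc), -1.0, 1.0)) / np.pi
          return M, P, np.hypot(M, P)                              # C = sqrt(M^2+P^2)

      # Toy usage: computed signal slightly scaled and shifted in phase.
      t = np.linspace(0, 1, 200)
      M, P, C = sprague_geers(np.sin(2 * np.pi * t),
                              1.1 * np.sin(2 * np.pi * t + 0.1))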

  4. Uncertainty analysis for absorbed dose from a brain receptor imaging agent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aydogan, B.; Miller, L.F.; Sparks, R.B.

    Absorbed dose estimates are known to contain uncertainties. A literature search indicates that, prior to this study, no rigorous investigation of the uncertainty associated with absorbed dose had been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent ¹²³I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the Latin Hypercube Sampling method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population, the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+07 Gy/MBq and 4.65E+07 Gy/MBq with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving ¹²³I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.

  5. Constraining a complex biogeochemical model for CO2 and N2O emission simulations from various land uses by model-data fusion

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraus, David; Kiese, Ralf; Breuer, Lutz

    2017-07-01

    This study presents the results of a combined measurement and modelling strategy to analyse N2O and CO2 emissions from adjacent arable land, forest and grassland sites in Hesse, Germany. The measured emissions reveal seasonal patterns and management effects, including fertilizer application, tillage, harvest and grazing. The measured annual N2O fluxes are 4.5, 0.4 and 0.1 kg N ha-1 a-1, and the CO2 fluxes are 20.0, 12.2 and 3.0 t C ha-1 a-1 for the arable land, grassland and forest sites, respectively. An innovative model-data fusion concept based on a multicriteria evaluation (soil moisture at different depths, yield, CO2 and N2O emissions) is used to rigorously test the LandscapeDNDC biogeochemical model. The model is run in a Latin-hypercube-based uncertainty analysis framework to constrain model parameter uncertainty and derive behavioural model runs. The results indicate that the model is generally capable of predicting trace gas emissions, as evaluated with RMSE as the objective function. The model shows a reasonable performance in simulating the ecosystem C and N balances. The model-data fusion concept helps to detect remaining model errors, such as missing (e.g. freeze-thaw cycling) or incomplete (e.g. respiration rates after harvest) model processes. It also helps to identify missing model inputs (e.g. the uptake of N through shallow groundwater on grassland during the vegetation period) and uncertainty in the measured validation data (e.g. forest N2O emissions in winter months). Guidance is provided to improve the model structure and field measurements to further advance landscape-scale model predictions.

  6. Simplifying the complexity of a coupled carbon turnover and pesticide degradation model

    NASA Astrophysics Data System (ADS)

    Marschmann, Gianna; Erhardt, André H.; Pagel, Holger; Kügler, Philipp; Streck, Thilo

    2016-04-01

    The mechanistic one-dimensional model PECCAD (PEsticide degradation Coupled to CArbon turnover in the Detritusphere; Pagel et al. 2014, Biogeochemistry 117, 185-204) has been developed as a tool to elucidate the regulation mechanisms of pesticide degradation in soil. A feature of this model is that it integrates functional traits of microorganisms, identifiable by molecular tools, with physicochemical processes such as transport and sorption that control substrate availability. Predicting the behavior of microbially active interfaces demands a fundamental understanding of the factors controlling their dynamics. Concepts from dynamical systems theory allow us to study general properties of the model, such as its qualitative behavior, intrinsic timescales and dynamic stability. Using a Latin hypercube method, we sampled the parameter space for physically realistic steady states of the PECCAD ODE system and set up a numerical continuation and bifurcation problem with the open-source toolbox MatCont in order to obtain a complete classification of the dynamical system's behaviour. Bifurcation analysis reveals an equilibrium state of the system entirely controlled by fungal kinetic parameters. The equilibrium is generally unstable in response to small perturbations, except for a small band in parameter space where the pesticide pool is stable. Time-scale separation is a phenomenon that occurs in almost every complex open physical system. Motivated by the notion of "initial-stage" and "late-stage" decomposers and the concept of r-, K- or L-selected microbial life strategies, we test the applicability of geometric singular perturbation theory to identify fast and slow time scales of PECCAD. Revealing a generic fast-slow structure would greatly simplify the analysis of complex models of organic matter turnover by reducing the number of unknowns and parameters and providing a systematic mathematical framework for studying their properties.
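
    A sketch of the screening idea (sample parameters with a Latin hypercube, solve for a steady state, and classify its linear stability from the Jacobian eigenvalues); the two-state substrate/biomass toy system below is illustrative, not PECCAD:

      import numpy as np
      from scipy.stats import qmc
      from scipy.optimize import fsolve

      def rhs(x, p):
          # Toy 2-state substrate/biomass system (illustrative only).
          s, b = x
          growth = p[0] * s / (p[1] + s) * b
          return np.array([p[2] - growth, growth - p[3] * b])

      def jacobian(x, p, h=1e-6):
          # Central finite-difference Jacobian of rhs at x.
          J = np.zeros((2, 2))
          for j in range(2):
              dx = np.zeros(2); dx[j] = h
              J[:, j] = (rhs(x + dx, p) - rhs(x - dx, p)) / (2 * h)
          return J

      params = qmc.scale(qmc.LatinHypercube(d=4, seed=5).random(200),
                         [0.1, 0.01, 0.01, 0.01], [2.0, 1.0, 1.0, 1.0])
      stable = 0
      for p in params:
          x_eq, info, ok, _ = fsolve(rhs, [1.0, 1.0], args=(p,), full_output=True)
          if ok == 1 and (x_eq >= 0).all():
              eig = np.linalg.eigvals(jacobian(x_eq, p))
              stable += (eig.real < 0).all()   # count asymptotically stable states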

  7. Effects of surface area and inflow on the performance of stormwater best management practices with uncertainty analysis.

    PubMed

    Park, Daeryong; Roesner, Larry A

    2013-09-01

    The performance of stormwater best management practices (BMPs) is affected by BMP geometric and hydrologic factors. The objective of this study was to investigate the effect of BMP surface area and inflow on BMP performance using the k-C* model with uncertainty analysis. Observed total suspended solids (TSS) data sets from detention basins and retention ponds in the International Stormwater BMP Database were used to build and evaluate the model. Detention basins are regarded as dry ponds because they do not always hold water, whereas retention ponds have a permanent pool and are considered wet ponds. In this study, Latin hypercube sampling (LHS) was applied to consider uncertainty in both the influent event mean concentration (EMC), C(in), and the areal removal constant, k. The latter was estimated from the hydraulic loading rate, q, through a power function relationship. Results show that the effluent EMC, C(out), decreased as inflow decreased and as BMP surface area increased in both detention basins and retention ponds. However, the dependence of C(out) on inflow and BMP surface area for detention basins differed from that for retention ponds. Specifically, C(in) was more strongly associated with the performance of the k-C* model for detention basins than were BMP surface area and inflow. For retention ponds, however, the results suggest that BMP surface area and inflow both influenced changes in C(out), as did C(in). These results suggest that the sensitive factors in the performance of the k-C* model are limited to C(in) for detention basins, whereas BMP surface area, inflow, and C(in) are all important for retention ponds.
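
    The k-C* model referenced above takes the standard first-order form C_out = C* + (C_in - C*) exp(-k/q), where q is the hydraulic loading rate (inflow divided by surface area). A sketch with a hypothetical power law for k and Latin hypercube draws of the uncertain inputs:

      import numpy as np
      from scipy.stats import qmc

      def k_c_star(c_in, q, c_star=10.0, a=0.5, b=0.7):
          # Standard k-C* effluent EMC; k = a * q**b is a hypothetical power law.
          k = a * q ** b
          return c_star + (c_in - c_star) * np.exp(-k / q)

      u = qmc.LatinHypercube(d=2, seed=9).random(5000)
      c_in = 50.0 + 150.0 * u[:, 0]   # influent TSS EMC, mg/L (illustrative)
      q = 0.5 + 4.5 * u[:, 1]         # hydraulic loading rate, m/d (illustrative)
      c_out = k_c_star(c_in, q)       # distribution of effluent EMCs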

  8. Using palaeoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    Phipps, Steven; King, Matt; Roberts, Jason; White, Duanne

    2017-04-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthermore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modelling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how palaeoclimate data can improve our ability to predict the future evolution of the AIS. A 50-member perturbed-physics ensemble is generated, spanning uncertainty in the parameterisations of three key physical processes within the model: (i) the stress balance within the ice sheet, (ii) basal sliding and (iii) calving of ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum (approximately 21,000 years ago) to present. Palaeoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.

  9. Using paleoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    King, M. A.; Phipps, S. J.; Roberts, J. L.; White, D.

    2016-12-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthermore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modeling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how paleoclimate data can improve our ability to predict the future evolution of the AIS. A large, perturbed-physics ensemble is generated, spanning uncertainty in the parameterizations of four key physical processes within ice sheet models: ice rheology, ice shelf calving, and the stress balances within ice sheets and ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum (approximately 21,000 years ago) to present. Paleoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.

  10. Constructing Rigorous and Broad Biosurveillance Networks for Detecting Emerging Zoonotic Outbreaks

    PubMed Central

    Brown, Mac; Moore, Leslie; McMahon, Benjamin; Powell, Dennis; LaBute, Montiago; Hyman, James M.; Rivas, Ariel; Jankowski, Mark; Berendzen, Joel; Loeppky, Jason; Manore, Carrie; Fair, Jeanne

    2015-01-01

    Determining optimal surveillance networks for an emerging pathogen is difficult, since it is not known beforehand what the characteristics of the pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk of infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short- and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results against a known outbreak, using spatial analysis of all the simulation runs to show that the progression matched closely with the observed locations of the farms infected in the 2006-2007 epidemic. PMID:25946164
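
    As a much-simplified sketch of the experimental design, a deterministic single-population SEIR model stands in below for the paper's multi-host stochastic network model, with Latin hypercube draws over illustrative parameter ranges:

      import numpy as np
      from scipy.stats import qmc
      from scipy.integrate import solve_ivp

      def seir(t, y, beta, sigma, gamma):
          # Deterministic SEIR with normalized population (S + E + I + R = 1).
          s, e, i, r = y
          return [-beta * s * i, beta * s * i - sigma * e,
                  sigma * e - gamma * i, gamma * i]

      # LHS over (beta, sigma, gamma); ranges chosen for illustration only.
      params = qmc.scale(qmc.LatinHypercube(d=3, seed=2).random(1000),
                         [0.2, 0.1, 0.05], [1.5, 0.5, 0.5])
      risk = 0
      for beta, sigma, gamma in params:
          sol = solve_ivp(seir, (0, 180), [0.999, 0.0, 0.001, 0.0],
                          args=(beta, sigma, gamma), rtol=1e-6)
          risk += sol.y[3, -1] > 0.10   # runs where >10% were ever infected

    In the paper, the analogous count is kept per region of the network rather than for one aggregate population, which is what yields the risk map.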

  11. Soil hydraulic material properties and layered architecture from time-lapse GPR

    NASA Astrophysics Data System (ADS)

    Jaumann, Stefan; Roth, Kurt

    2018-04-01

    Quantitative knowledge of the subsurface material distribution and its effective soil hydraulic material properties is essential to predict soil water movement. Ground-penetrating radar (GPR) is a noninvasive and nondestructive geophysical measurement method that is suitable for monitoring hydraulic processes. Previous studies showed that the GPR signal from a fluctuating groundwater table is sensitive to the soil water characteristic and the hydraulic conductivity function. In this work, we show that the GPR signal originating from both the subsurface architecture and the fluctuating groundwater table is suitable for estimating the position of layers within the subsurface architecture, together with the associated effective soil hydraulic material properties, using inversion methods. To that end, we parameterize the subsurface architecture, solve the Richards equation, convert the resulting water content to relative permittivity with the complex refractive index model (CRIM), and solve Maxwell's equations numerically. In order to analyze the GPR signal, we implemented a new heuristic algorithm that detects relevant signals in the radargram (events) and extracts the corresponding signal travel time and amplitude. This algorithm is applied to simulated as well as measured radargrams, and the detected events are associated automatically. Using events instead of the full wave regularizes the inversion, focusing it on the relevant measurement signal. For optimization, we use a global-local approach with preconditioning: starting from an ensemble of initial parameter sets drawn with a Latin hypercube algorithm, we sequentially couple a simulated annealing algorithm with a Levenberg-Marquardt algorithm. The method is applied to synthetic as well as measured data from the ASSESS test site. We show that the method yields reasonable estimates for the positions of the layers as well as for the soil hydraulic material properties, by comparing the results to references derived from ground truth data as well as from time domain reflectometry (TDR).
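
    The CRIM step mentioned above mixes the permittivities of the soil phases through their volume fractions; a sketch of the standard three-phase form, with typical placeholder values for porosity and the phase permittivities:

      import numpy as np

      def crim_permittivity(theta, porosity=0.35, eps_w=81.0, eps_s=4.5,
                            eps_a=1.0, alpha=0.5):
          # CRIM: eps_eff**alpha is the volume-weighted mean of the phase
          # permittivities raised to alpha (alpha = 0.5 for CRIM).
          air = porosity - theta
          mix = (theta * eps_w ** alpha + (1 - porosity) * eps_s ** alpha
                 + air * eps_a ** alpha)
          return mix ** (1.0 / alpha)

      # Two-way travel time through a 0.5 m layer at several water contents.
      c0 = 0.299792458  # speed of light in vacuum, m/ns
      theta = np.linspace(0.05, 0.35, 7)
      t2w = 2 * 0.5 * np.sqrt(crim_permittivity(theta)) / c0   # in ns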

  12. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales, the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL also includes two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties; the second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA with increasing sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than those of alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.

  13. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) it is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance; parameter uncertainties are also reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), the average relative band-width (RB) and the average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis within the GLUE framework, and it could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
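
    A sketch of the GLUE behavioural-selection step that both samplers feed (draw parameter sets, score each with an informal likelihood, and retain those above a threshold); the toy model, likelihood measure, and threshold are placeholders:

      import numpy as np
      from scipy.stats import qmc

      def nse(sim, obs):
          # Nash-Sutcliffe efficiency as an informal GLUE likelihood measure.
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      obs = np.sin(np.linspace(0, 6, 50)) + 1.5           # toy "observed" flows

      def model(theta):
          t = np.linspace(0, 6, 50)
          return theta[0] * np.sin(t) + theta[1]          # toy 2-parameter model

      samples = qmc.scale(qmc.LatinHypercube(d=2, seed=4).random(2000),
                          [0.5, 0.5], [1.5, 2.5])
      scores = np.array([nse(model(th), obs) for th in samples])
      behavioural = samples[scores > 0.7]                 # retained parameter sets

    The paper's contribution is essentially to replace the LHS draw above with ɛ-NSGAII, so that more of the draws land in the behavioural region.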

  14. Using an ensemble smoother to evaluate parameter uncertainty of an integrated hydrological model of Yanqi basin

    NASA Astrophysics Data System (ADS)

    Li, Ning; McLaughlin, Dennis; Kinzelbach, Wolfgang; Li, WenPeng; Dong, XinGuang

    2015-10-01

    Model uncertainty needs to be quantified to provide objective assessments of the reliability of model predictions and of the risk associated with management decisions that rely on these predictions. This is particularly true in water resource studies that depend on model-based assessments of alternative management strategies. In recent decades, Bayesian data assimilation methods have been widely used in hydrology to assess uncertain model parameters and predictions. In this case study, a particular data assimilation algorithm, the Ensemble Smoother with Multiple Data Assimilation (ESMDA) (Emerick and Reynolds, 2012), is used to derive posterior samples of uncertain model parameters and forecasts for a distributed hydrological model of Yanqi basin, China. This model is constructed using MIKE SHE/MIKE 11 software, which provides for coupling between surface and subsurface processes (DHI, 2011a-d). The random samples in the posterior parameter ensemble are obtained by using measurements to update 50 prior parameter samples generated with a Latin Hypercube Sampling (LHS) procedure. The posterior forecast samples are obtained from model runs that use the corresponding posterior parameter samples. Two iterative sample update methods are considered: one based on a perturbed-observation Kalman filter update and one based on a square root Kalman filter update. These alternatives give nearly the same results and converge in only two iterations. The uncertain parameters considered include hydraulic conductivities, drainage and river leakage factors, van Genuchten soil property parameters, and dispersion coefficients. The results show that the uncertainty in many of the parameters is reduced during the smoother updating process, reflecting information obtained from the observations. Some of the parameters are insensitive and do not benefit from measurement information. The correlation coefficients among certain parameters increase in each iteration, although they generally stay below 0.50.
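
    A compact sketch of the ESMDA iteration in the spirit of Emerick and Reynolds (2012): a fixed number of perturbed-observation updates with inflated observation-error covariance, where the inflation coefficients alpha_i satisfy sum(1/alpha_i) = 1. The forward model here is left as a placeholder:

      import numpy as np

      rng = np.random.default_rng(8)

      def esmda(theta, forward, obs, obs_std, alphas=(4.0, 4.0, 4.0, 4.0)):
          # theta: (n_ens, n_par) prior ensemble; forward: theta -> (n_ens, n_obs).
          assert abs(sum(1.0 / a for a in alphas) - 1.0) < 1e-9
          for a in alphas:
              d = forward(theta)
              A = theta - theta.mean(axis=0)
              Y = d - d.mean(axis=0)
              n = theta.shape[0]
              C_ad = A.T @ Y / (n - 1)                     # param-data covariance
              C_dd = Y.T @ Y / (n - 1) + a * np.diag(obs_std ** 2)
              noise = np.sqrt(a) * obs_std * rng.standard_normal(d.shape)
              theta = theta + (obs + noise - d) @ np.linalg.inv(C_dd) @ C_ad.T
          return theta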

  15. Determining Reduced Order Models for Optimal Stochastic Reduced Order Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonney, Matthew S.; Brake, Matthew R.W.

    2015-08-01

    The use of parameterized reduced order models (PROMs) within the stochastic reduced order model (SROM) framework is a logical progression for both methods. In this report, five different parameterized reduced order models are selected and critiqued against each other and against the truth model for the example of the Brake-Reuss beam. The models are: a Taylor series using finite differences, a proper orthogonal decomposition of the output, a Craig-Bampton representation of the model, a method that uses Hyper-Dual numbers to determine the sensitivities, and a Meta-Model method that uses the Hyper-Dual results and constructs a polynomial curve to better represent the output data. The methods are compared against a parameter sweep and a distribution propagation, where the first four statistical moments are used as a comparison. Each method produces very accurate results, with the Craig-Bampton reduction having the least accurate results. The models are also compared based on the time required for the evaluation of each model, where the Meta-Model requires the least amount of computation time by a significant amount. Each of the five models provided accurate results in a reasonable time frame. The determination of which model to use depends on the availability of the high-fidelity model and how many evaluations can be performed. The output distribution is examined by using a large Monte-Carlo simulation along with a reduced simulation using Latin Hypercube and the stochastic reduced order model sampling technique. Both techniques produced accurate results. The stochastic reduced order modeling technique produced less error when compared to exhaustive sampling for the majority of methods.

  16. Educational Inequalities among Latin American Adolescents: Continuities and Changes over the 1980s, 1990s and 2000s.

    PubMed

    Marteleto, Letícia; Gelber, Denisse; Hubert, Celia; Salinas, Viviana

    2012-09-01

    The goal of this paper is to examine recent trends in educational stratification for Latin American adolescents growing up in three distinct periods: the 1980s, during severe recession; the 1990s, a period of structural adjustments imposed by international organizations; and the late 2000s, when most countries in the region experienced positive and stable growth. In addition to school enrollment and educational transitions, we examine the quality of education through enrollment in private schools, an important aspect of inequality in education that most studies have neglected. We use nationally representative household survey data for the 1980s, 1990s and 2000s in Brazil, Chile, Mexico and Uruguay. Our overall findings confirm the importance of macroeconomic conditions for inequalities in educational opportunity, suggesting important benefits brought about by the favorable conditions of the 2000s. However, our findings also call attention to increasing disadvantages associated with the quality of the education adolescents receive, suggesting the significance of the Effectively Maintained Inequality (EMI) framework and highlighting the value of examining the quality in addition to the quantity of education in order to fully understand educational stratification in the Latin American context.

  17. Group Violence and Migration Experience among Latin American Youths in Justice Enforcement Centers (Madrid, Spain).

    PubMed

    Martínez García, José Manuel; Martín López, María Jesús

    2015-10-30

    Group violence among Latin American immigrant youth has led to ongoing debates in political, legal, and media circles, yet none of those many perspectives has arrived at a solid, empirically supported definition of the phenomenon. This study aims to explore the relationship between the immigrant experience and violent group behavior in youths from Latin America serving prison sentences in Justice Enforcement Centers in the Community of Madrid. Semi-structured interviews were conducted with 19 juveniles, and content analysis was applied to the resulting transcripts, employing Grounded Theory to create an axial codification of intra- and inter-categorical contents, and Delphi panels for quality control. The research team delved into 62 topics, addressing participants' perceptions of the immigrant experience and its effects on five socialization settings (neighborhood, school, family, peer group, and significant other), and each setting's relationship to violent behavior. The results led us to conclude that the young people's immigration experiences had been examined systematically. Their personal and social development was influenced by negative socioeconomic conditions, ineffective parental supervision, maladjustment and conflict at school, and experiences of marginalization and xenophobia. All of those conditions favored affiliation with violent groups that provided the youths instrumental (economic and material), expressive, or affective support.

  18. Integrated Pest Management of Coffee Berry Borer: Strategies from Latin America that Could Be Useful for Coffee Farmers in Hawaii

    PubMed Central

    Aristizábal, Luis F.; Bustillo, Alex E.; Arthurs, Steven P.

    2016-01-01

    The coffee berry borer (CBB), Hypothenemus hampei Ferrari (Coleoptera: Curculionidae: Scolytinae), is the primary arthropod pest of coffee plantations worldwide. Since its detection in Hawaii (September 2010), coffee growers have been facing financial losses due to the reduced quality of coffee yields. Several control strategies, including cultural practices, biological control agents (parasitoids), chemical and microbial insecticides (entomopathogenic fungi), and a range of post-harvest sanitation practices, have been used to manage CBB around the world. In addition, sampling methods including the use of alcohol-based traps for monitoring CBB populations have been implemented in some coffee-producing countries in Latin America. It is currently unclear which combination of CBB control strategies is optimal under the economic, environmental, and sociocultural conditions of Hawaii. This review discusses the components of an integrated pest management program for CBB. We focus on practical approaches to provide guidance to coffee farmers in Hawaii. Experiences of integrated pest management (IPM) of CBB learned in Latin America over the past 25 years may be relevant for establishing control strategies suited to Hawaiian coffee farmers' conditions. PMID:26848690

  19. Educational Inequalities among Latin American Adolescents: Continuities and Changes over the 1980s, 1990s and 2000s

    PubMed Central

    Marteleto, Letícia; Gelber, Denisse; Hubert, Celia; Salinas, Viviana

    2012-01-01

    The goal of this paper is to examine recent trends in educational stratification for Latin American adolescents growing up in three distinct periods: the 1980s, during severe recession; the 1990s, a period of structural adjustments imposed by international organizations; and the late 2000s, when most countries in the region experienced positive and stable growth. In addition to school enrollment and educational transitions, we examine the quality of education through enrollment in private schools, an important aspect of inequality in education that most studies have neglected. We use nationally representative household survey data for the 1980s, 1990s and 2000s in Brazil, Chile, Mexico and Uruguay. Our overall findings confirm the importance of macroeconomic conditions for inequalities in educational opportunity, suggesting important benefits brought about by the favorable conditions of the 2000s. However, our findings also call attention to increasing disadvantages associated with the quality of the education adolescents receive, suggesting the significance of the Effectively Maintained Inequality (EMI) framework and highlighting the value of examining the quality in addition to the quantity of education in order to fully understand educational stratification in the Latin American context. PMID:22962512

  20. [Epidemiological transition in Latin America: a comparison of four countries].

    PubMed

    Albala, C; Vio, F; Yáñez, M

    1997-06-01

    In the last decade, Latin America has experienced important transformations in its health conditions due to demographic changes and a rapid urbanization process. Our aim was to analyze socioeconomic, demographic and epidemiological changes in Chile, Guatemala, Mexico and Uruguay and to relate them to the different stages in the demographic and epidemiological transition of these countries. Data were obtained from the official information of local and international organizations such as the Pan-American Health Organization, the United Nations, the Latin American Center for Demography (CELADE) and the World Bank. Guatemala is in a pre-transition stage, with a high proportion of communicable diseases as causes of death (61%), as compared with Mexico (22%), Chile (13%) and Uruguay (7%). Mexico is in a prolonged transition, and Chile is close to Uruguay in a post-transitional stage. Despite decreasing mortality rates, the proportion of deaths represented by chronic diseases and injuries has increased to over 30% in all countries except Uruguay. Adjusted mortality rates for cardiovascular diseases are lower in Latin American countries than in Canada. However, excepting Guatemala, there are differences in the pattern of cardiovascular disease, with higher mortality due to cerebrovascular disease and lower mortality due to coronary artery disease. An increase in non-communicable diseases is expected in Latin America over the next decades. Analysis of the demographic and epidemiological transition is crucial to define health policies and to adapt health systems to the new situation.

  1. Application of positive-real functions in hyperstable discrete model-reference adaptive system design.

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.

    1972-01-01

    Proposal of an algorithmic procedure, based on mathematical programming methods, to design compensators for hyperstable discrete model-reference adaptive systems (MRAS). The objective of the compensator is to render the MRAS insensitive to initial parameter estimates within a maximized hypercube in the model parameter space.

  2. Parallel Algorithm Solves Coupled Differential Equations

    NASA Technical Reports Server (NTRS)

    Hayashi, A.

    1987-01-01

    Numerical methods adapted to concurrent processing. Algorithm solves set of coupled partial differential equations by numerical integration. Adapted to run on hypercube computer, algorithm separates problem into smaller problems solved concurrently. Increase in computing speed with concurrent processing over that achievable with conventional sequential processing appreciable, especially for large problems.
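
    On a hypercube computer like the one mentioned above, each of the 2^d nodes is wired to the d nodes whose addresses differ in exactly one bit, and a binary-reflected Gray code maps a chain of subproblems onto physically adjacent nodes. The sketch below illustrates that addressing scheme; it is generic background, not the article's algorithm.

    ```python
    # Sketch: node addressing on a d-dimensional hypercube computer.
    # Node i is wired to the d nodes whose binary addresses differ
    # from i in exactly one bit (i XOR 2^k).

    def hypercube_neighbors(node: int, d: int) -> list[int]:
        """Return the d neighbors of `node` in a d-dimensional hypercube."""
        return [node ^ (1 << k) for k in range(d)]

    def gray(i: int) -> int:
        """Binary-reflected Gray code: consecutive values differ in one
        bit, so a chain of subdomains maps onto adjacent hypercube nodes."""
        return i ^ (i >> 1)

    d = 3
    print(hypercube_neighbors(0b101, d))      # [4, 7, 1]
    print([gray(i) for i in range(2 ** d)])   # [0, 1, 3, 2, 6, 7, 5, 4]
    ```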

  3. NASA Tech Briefs, November/December 1986, Special Edition

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.

  4. Effective implementation of wavelet Galerkin method

    NASA Astrophysics Data System (ADS)

    Finěk, Václav; Šimunková, Martina

    2012-11-01

    It was proved by W. Dahmen et al. that an adaptive wavelet scheme is asymptotically optimal for a wide class of elliptic equations. This scheme approximates the solution u by a linear combination of N wavelets, and a benchmark for its performance is the best N-term approximation, which is obtained by retaining the N largest wavelet coefficients of the unknown solution. Moreover, the number of arithmetic operations needed to compute the approximate solution is proportional to N. The most time-consuming part of this scheme is the approximate matrix-vector multiplication. In this contribution, we introduce our implementation of the wavelet Galerkin method for the Poisson equation -Δu = f on a hypercube with homogeneous Dirichlet boundary conditions. In our implementation, we identified the nonzero elements of the stiffness matrix corresponding to the above problem and perform the matrix-vector multiplication only with these nonzero elements.
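
    The storage scheme described above is, in effect, a sparse matrix-vector product: only the identified nonzeros are stored and touched. As a minimal sketch of the idea, assuming nothing about the authors' actual wavelet basis, the following uses a 1-D finite-difference Laplacian as a stand-in for the stiffness matrix.

    ```python
    # Sketch of the storage idea above: keep only the nonzero stiffness
    # entries and multiply with those alone. The tridiagonal Laplacian
    # below is a stand-in, not the authors' wavelet stiffness matrix.
    import numpy as np
    from scipy.sparse import diags

    n = 8
    # Stiffness matrix of -u'' on a uniform grid with Dirichlet BCs.
    K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")

    u = np.ones(n)
    f = K @ u          # CSR matvec touches only the stored nonzeros
    print(f)           # interior entries 0, boundary entries 1
    ```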

  5. How much do physicians in Latin America know about behavioral variant frontotemporal dementia?

    PubMed

    Gleichgerrcht, Ezequiel; Flichtentrei, Daniel; Manes, Facundo

    2011-11-01

    Diagnosis of behavioral variant frontotemporal dementia (bvFTD) can be especially challenging during the early stages for several reasons, including the fact that (a) behavioral disturbances in bvFTD can mimic the symptomatic profile of psychiatric disorders; (b) neuropsychological performance may be relatively spared; and (c) changes in structural neuroimaging may go undetected. Most frequently, bvFTD is not included as part of medical or residency training outside the field of cognitive neurology. The present study aimed at examining bvFTD-related practices concerning academic and professional training, diagnosis, and treatment across Latin America. We surveyed the academic and professional aspects of clinical practice related to bvFTD of 596 physicians from different fields throughout the continent. We discuss several aspects concerning Latin American physicians' training on dementia and bvFTD, the way in which they approach the differential diagnosis of bvFTD, and their most frequent strategies for the treatment of this condition. We conclude that information about bvFTD deserves more attention in both undergraduate and postgraduate medical education in Latin America, and that understanding clinical practices related to FTD can help design more efficient training programs for physicians in this and other world regions.

  6. [Latin American consensus on hypertension in patients with diabetes type 2 and metabolic syndrome].

    PubMed

    López-Jaramillo, Patricio; Sánchez, Ramiro A; Díaz, Margarita; Cobos, Leonardo; Bryce, Alfonso; Parra-Carrillo, José Z; Lizcano, Fernando; Lanas, Fernando; Sinay, Isaac; Sierra, Iván D; Peñaherrera, Ernesto; Benderky, Mario; Schmid, Helena; Botero, Rodrigo; Urina, Manuel; Lara, Joffre; Foos, Milton C; Márquez, Gustavo; Harrap, Stephen; Ramírez, Agustín J; Zanchetti, Alberto

    2014-01-01

    The present document has been prepared by a group of experts, members of Cardiology, Endocrinology, Internal Medicine, Nephrology and Diabetes societies of Latin American countries, to serve as a guide to physicians taking care of patients with diabetes, hypertension and comorbidities or complications of both conditions. Although the concept of metabolic syndrome is currently disputed, the higher prevalence in Latin America of that cluster of metabolic alterations has suggested that metabolic syndrome is a useful nosographic entity in the context of Latin American medicine. Therefore, in the present document, particular attention is paid to this syndrome in order to alert physicians to a particular high-risk population, usually underestimated and undertreated. These recommendations result from presentations and debates by discussion panels during a 2-day conference held in Bucaramanga in October 2012, and all the participants have approved the final conclusions. The authors acknowledge that the publication and diffusion of guidelines do not suffice to achieve the recommended changes in diagnostic or therapeutic strategies, and plan suitable interventions to overcome the barriers that prevent both physicians and patients from effectively adhering to guideline recommendations. Copyright © 2013 Elsevier España, S.L. y SEA. All rights reserved.

  7. Relation between the Global Burden of Disease and Randomized Clinical Trials Conducted in Latin America Published in the Five Leading Medical Journals

    PubMed Central

    Perel, Pablo; Miranda, J. Jaime; Ortiz, Zulma; Casas, Juan Pablo

    2008-01-01

    Background Since 1990, non-communicable diseases and injuries have accounted for the majority of deaths and disability-adjusted life years in Latin America. We analyzed the relationship between the global burden of disease and Randomized Clinical Trials (RCTs) conducted in Latin America that were published in the five leading medical journals. Methodology/Principal Findings We included all RCTs in humans, exclusively conducted in Latin American countries, and published in any of the following journals: Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine. We described the trials and reported the number of RCTs according to the main categories of the global burden of disease. Sixty-six RCTs were identified. Communicable diseases accounted for 38 (57%) reports. Maternal, perinatal, and nutritional conditions accounted for 19 (29%) trials. Non-communicable diseases represent 48% of the global burden of disease but only 14% of reported trials. No trial addressed injuries, despite their 18% contribution to the burden of disease in 2000. Conclusions/Significance A poor correlation between the burden of disease and RCT publications was found. Non-communicable diseases and injuries account for up to two thirds of the burden of disease in Latin America, but these topics are seldom addressed in published RCTs in the selected sample of journals. Funding bodies of health research and editors should be aware of the increasing burden of non-communicable diseases and injuries occurring in Latin America to ensure that this growing epidemic is not neglected in the research agenda and not affected by publication bias. PMID:18301772

  8. [The politics of gender and of women's health in Latin America: Peru. 1].

    PubMed

    de Martos, M V; da Fonseca, R M

    1996-04-01

    This study presents and analyzes the problems facing women. It considers women's social condition in light of the androcentric ideology of society and briefly traces the historical evolution of women from a gender-analysis perspective. It also presents an analysis of health and public policies concerning women in Latin America, mainly in Peru, from the Inca period onward, explaining the motives behind the formulation and implementation of the Family Planning Program, the Responsible Paternity Program and the Child and Mother Care Program.

  9. [Conditions for universal access to health in Latin America: social rights, social protection and financial and political constraints].

    PubMed

    Sojo, Ana

    2011-06-01

    After a brief review of the concept of health equity and its social and sectoral determinants, some macroeconomic aspects of health expenditure in Latin America are considered. Given the significant contemporary tensions with regard to social rights and the definition of health benefits, three emblematic experiences are analyzed in very different health systems, namely those of Chile, Colombia and Mexico. They cover different aspects, such as the guarantee of health benefits, the reduction of forms of implicit rationing and/or barriers to admission, and also aspects related to the quality of services.

  10. RAVEN Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    2016-06-01

    RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via Python interfaces. RAVEN is capable of investigating system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities, such as constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g., peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented with RELAP-7 as the principal focus, are easily deployable to other system codes. For this reason, several side activities have been carried out (e.g., for RELAP5-3D and MOOSE-based applications) or are currently ongoing to couple RAVEN with several other codes. The aim of this document is to provide a set of commented examples that can help the user become familiar with the RAVEN code usage.
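
    For reference, a plain Latin hypercube sample of a two-parameter input space can be drawn with scipy as below; this is a generic illustration, not RAVEN's own API, and the parameter names and ranges are hypothetical.

    ```python
    # Sketch of Latin hypercube sampling of a code's input space, in the
    # spirit of the sampling schemes listed above. Uses scipy, not RAVEN;
    # the two parameter ranges are hypothetical.
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit = sampler.random(n=10)                  # 10 points in [0, 1)^2,
                                                 # one per stratum per axis
    lo, hi = [280.0, 0.1], [320.0, 0.5]          # e.g. inlet T [K], valve fraction
    samples = qmc.scale(unit, lo, hi)            # map to physical ranges

    for run_id, x in enumerate(samples):
        print(f"run {run_id}: T={x[0]:.1f} K, frac={x[1]:.3f}")
    ```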

  11. Coach simplified structure modeling and optimization study based on the PBM method

    NASA Astrophysics Data System (ADS)

    Zhang, Miaoli; Ren, Jindong; Yin, Ying; Du, Jian

    2016-09-01

    For the coach industry, rapid modeling and efficient optimization methods are desirable for structure modeling and optimization based on simplified structures, especially for use early in the concept phase, with capabilities of accurately expressing the mechanical properties of the structure and with flexible section forms. However, the present dimension-based methods cannot easily meet these requirements. To achieve these goals, the property-based modeling (PBM) beam modeling method is studied based on PBM theory and in conjunction with the characteristics of coach structures, which take the beam as the main component. For a beam component of given length, its mechanical characteristics are primarily affected by the section properties. Four section parameters are adopted to describe the mechanical properties of a beam: the section area, the principal moments of inertia about the two principal axes, and the torsion constant of the section. Based on the equivalent-stiffness strategy, expressions for the above section parameters are derived, and the PBM beam element is implemented in HyperMesh software. A case is realized using this method, in which the structure of a passenger coach is simplified. The model precision is validated by comparing the basic performance of the total structure with that of the original structure, including the bending and torsion stiffness and the first-order bending and torsional modal frequencies. Sensitivity analysis is conducted to choose design variables. The optimal Latin hypercube experiment design is adopted to sample the test points, and polynomial response surfaces are used to fit these points. To improve the bending and torsion stiffness and the first-order torsional frequency, and taking the allowable maximum stresses of the braking and left-turning conditions as constraints, the multi-objective optimization of the structure is conducted using the NSGA-II genetic algorithm on the ISIGHT platform. The result of the Pareto solution set is acquired, and the selection strategy of the final solution is discussed. The case study demonstrates that the mechanical performance of the structure can be well modeled and simulated by PBM beams. Because of the merits of fewer parameters and convenience of use, this method is suitable for application in the concept stage. Another merit is that the optimization results are requirements for the mechanical performance of the beam section rather than for its shape and dimensions, bringing flexibility to the succeeding design.
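
    The surrogate step described above (Latin hypercube samples fitted with polynomial response surfaces) is easy to illustrate in miniature. The sketch below fits a quadratic response surface by least squares to samples of a cheap stand-in function; the stand-in is hypothetical, not the coach stiffness analysis.

    ```python
    # Sketch of the response-surface step: fit a quadratic polynomial to
    # Latin hypercube samples of an expensive model. The "model" here is
    # a cheap stand-in for the structural analysis.
    import numpy as np
    from scipy.stats import qmc

    def model(x):                      # hypothetical expensive simulation
        return 3.0 + 2.0 * x[:, 0] - 1.5 * x[:, 1] + 0.8 * x[:, 0] * x[:, 1]

    X = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(20), [0, 0], [1, 1])
    y = model(X)

    # Quadratic basis: 1, x1, x2, x1^2, x1*x2, x2^2
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0]**2, X[:, 0] * X[:, 1], X[:, 1]**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(np.round(coef, 3))   # recovers 3, 2, -1.5, 0, 0.8, 0 exactly
    ```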

  12. Lumbar vertebrae fracture injury risk in finite element reconstruction of CIREN and NASS frontal motor vehicle crashes.

    PubMed

    Jones, Derek A; Gaewsky, James P; Kelley, Mireille E; Weaver, Ashley A; Miller, Anna N; Stitzel, Joel D

    2016-09-01

    The objective of this study was to reconstruct 4 real-world motor vehicle crashes (MVCs), 2 with lumbar vertebral fractures and 2 without vertebral fractures in order to elucidate the MVC and/or restraint variables that increase this injury risk. A finite element (FE) simplified vehicle model (SVM) was used in conjunction with a previously developed semi-automated tuning method to arrive at 4 SVMs that were tuned to mimic frontal crash responses of a 2006 Chevrolet Cobalt, 2012 Ford Escape, 2007 Hummer H3, and 2002 Chevrolet Cavalier. Real-world crashes in the first 2 vehicles resulted in lumbar vertebrae fractures, whereas the latter 2 did not. Once each SVM was tuned to its corresponding vehicle, the Total HUman Model for Safety (THUMS) v4.01 was positioned in 120 precrash configurations in each SVM by varying 5 parameters using a Latin hypercube design (LHD) of experiments: seat track position, seatback angle, steering column angle, steering column telescoping position, and d-ring height. For each case, the event data recorder (EDR) crash pulse was used to apply kinematic boundary conditions to the model. By analyzing cross-sectional vertebral loads, vertebral bending moments, and maximum principal strain and stress in both cortical and trabecular bone, injury metric response as a function of posture and restraint parameters was computed. Tuning the SVM to specific vehicle models produced close matches between the simulated and experimental crash test responses for head, T6, and pelvis resultant acceleration; left and right femur loads; and shoulder and lap belt loads. Though vertebral load in the THUMS simulations was highly similar between injury cases and noninjury cases, the amount of bending moment was much higher for the injury cases. Seatback angle had a large effect on the maximum compressive load and bending moment in the lumbar spine, indicating the upward tilt of the seat pan in conjunction with precrash positioning may increase the likelihood of suffering lumbar injury even in frontal, planar MVCs. In conclusion, precrash positioning has a large effect on lumbar injury metrics. The lack of lumbar injury criteria in regulatory crash tests may have led to inadvertent design of seat pans that work to apply axial force to the spinal column during frontal crashes.
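
    For concreteness, a design like the one described (120 precrash configurations over 5 positioning parameters) can be generated with an off-the-shelf Latin hypercube sampler; the parameter names and ranges below are illustrative placeholders, not the values used in the study.

    ```python
    # Sketch of a 120-point, 5-parameter Latin hypercube design in the
    # spirit of the abstract. Ranges are placeholders, not study values.
    from scipy.stats import qmc

    params = ["seat_track_mm", "seatback_deg", "column_angle_deg",
              "column_telescope_mm", "dring_height_mm"]
    lo = [0.0,   15.0, 20.0, 0.0,  0.0]
    hi = [200.0, 35.0, 35.0, 50.0, 80.0]

    design = qmc.scale(qmc.LatinHypercube(d=5, seed=42).random(120), lo, hi)
    print(design.shape)       # (120, 5): one precrash configuration per row
    print(dict(zip(params, design[0].round(1))))
    ```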

  13. Super and parallel computers and their impact on civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamat, M.P.

    1986-01-01

    This book presents the papers given at a conference on the use of supercomputers in civil engineering. Topics considered at the conference included solving nonlinear equations on a hypercube, a custom architectured parallel processing system, distributed data processing, algorithms, computer architecture, parallel processing, vector processing, computerized simulation, and cost benefit analysis.

  14. Assignment Of Finite Elements To Parallel Processors

    NASA Technical Reports Server (NTRS)

    Salama, Moktar A.; Flower, Jon W.; Otto, Steve W.

    1990-01-01

    Elements assigned approximately optimally to subdomains. Mapping algorithm based on simulated-annealing concept used to minimize approximate time required to perform finite-element computation on hypercube computer or other network of parallel data processors. Mapping algorithm needed when shape of domain complicated or otherwise not obvious what allocation of elements to subdomains minimizes cost of computation.

  15. 3D Navier-Stokes Flow Analysis for a Large-Array Multiprocessor

    DTIC Science & Technology

    1989-04-17

    computer, Alliant's FX/8, Intel's Hypercube, and Encore's Multimax. Unfortunately, the current algorithms have been developed primarily for SISD machines... Reversing and Thrust-Vectoring Nozzle Flows," Ph.D. Dissertation in the Dept. of Aero. and Astro., Univ. of Wash., Washington, 1986. [11] Anderson

  16. Distributed Memory Compiler Methods for Irregular Problems - Data Copy Reuse and Runtime Partitioning

    DTIC Science & Technology

    1991-09-01

    In addition, support for Saltz was provided by NSF Grant ASC-8819374. ... Over the past few years, we have developed methods needed to... network. In Third Conf. on Hypercube Concurrent Computers and Applications, pages 241-27278, 1988. [17] G. Fox, S. Hiranandani, K. Kennedy, C. Koelbel

  17. Determinants of Subjective Social Status and Health Among Latin American Women Immigrants in Spain: A Qualitative Approach.

    PubMed

    Sanchón-Macias, Ma Visitación; Bover-Bover, Andreu; Prieto-Salceda, Dolores; Paz-Zulueta, María; Torres, Blanca; Gastaldo, Denise

    2016-04-01

    This qualitative study was carried out to better understand the factors that determine the subjective social status of Latin Americans in Spain. The study was conducted following a theoretical framework and forms part of a broader study on subjective social status and health. Ten immigrant participants engaged in semi-structured interviews, from which data were collected. The study results show that the socioeconomic aspects of the crisis and of the policies adopted have shaped immigrant living conditions in Spain. The four major themes that emerged from the analysis were related to non-recognition of educational credentials, precarious working conditions, unemployment and loneliness. These results illustrate the outcomes of current policies on health and suggest a need for health professionals to orient practices toward social determinants, thus utilizing evaluations of subjective social status to reduce inequalities in health.

  18. Most Relevant Neuropathic Pain Treatment and Chronic Low Back Pain Management Guidelines: A Change Pain Latin America Advisory Panel Consensus.

    PubMed

    Amescua-Garcia, Cesar; Colimon, Frantz; Guerrero, Carlos; Jreige Iskandar, Aziza; Berenguel Cook, Maria; Bonilla, Patricia; Campos Kraychete, Durval; Delgado Barrera, William; Alberto Flores Cantisani, Jose; Hernandez-Castro, John Jairo; Lara-Solares, Argelia; Perez Hernandez, Concepcion; Rico, Maria Antonieta; Del Rocio Guillen Nunez, Maria; Sempertegui Gallegos, Manuel; Garcia, Joao Batista Santos

    2018-03-01

    Chronic pain conditions profoundly affect the daily living of a significant number of people and are a major economic and social burden, particularly in developing countries. The Change Pain Latin America (CPLA) advisory panel aimed to identify the most appropriate guidelines for the treatment of neuropathic pain (NP) and chronic low back pain (CLBP) for use across Latin America. Published systematic reviews or practice guidelines were identified by a systematic search of PubMed, the Guidelines Clearinghouse, and Google. Articles were screened by an independent reviewer, and potential candidate guidelines were selected for more in-depth review. A shortlist of suitable guidelines was selected and critically evaluated by the CPLA advisory panel. Searches identified 674 and 604 guideline articles for NP and CLBP, respectively. Of these, 14 guidelines were shortlisted for consensus consideration, with the following final selections made: "Recommendations for the pharmacological management of neuropathic pain from the Neuropathic Pain Special Interest Group in 2015 - pharmacotherapy for neuropathic pain in adults: A systematic review and meta-analysis" and "Diagnosis and treatment of low back pain: A joint clinical practice guideline from the American College of Physicians and the American Pain Society" (2007). The selected guidelines were endorsed by all members of the CPLA advisory board as the best fit for use across Latin America. In addition, regional considerations were discussed and recorded. We have included this expert local insight and advice to enhance the implementation of each guideline across all Latin American countries.

  19. Taste symmetry breaking with hypercubic-smeared staggered fermions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bae, Taegil; Adams, David H.; Kim, Hyung-Jin

    2008-05-01

    We study the impact of hypercubic (HYP) smearing on the size of taste-breaking for staggered fermions, comparing to unimproved and to asqtad-improved staggered fermions. As in previous studies, we find a substantial reduction in taste-breaking compared to unimproved staggered fermions (by a factor of 4-7 on lattices with spacing a ≈ 0.1 fm). In addition, we observe that discretization effects of next-to-leading order in the chiral expansion (O(a²p²)) are markedly reduced by HYP smearing. Compared to asqtad valence fermions, we find that taste-breaking in the pion spectrum is reduced by a factor of 2.5-3, down to a level comparable to the expected size of generic O(a²) effects. Our results suggest that, once one reaches a lattice spacing of a ≈ 0.09 fm, taste-breaking will be small enough after HYP smearing that one can use a modified power counting in which O(a²)<

  20. A parallel expert system for the control of a robotic air vehicle

    NASA Technical Reports Server (NTRS)

    Shakley, Donald; Lamont, Gary B.

    1988-01-01

    Expert systems can be used to govern the intelligent control of vehicles, for example the Robotic Air Vehicle (RAV). Due to the nature of the RAV system, the associated expert system needs to perform in a demanding real-time environment. The use of a parallel processing capability to support the associated expert system's computational requirement is critical in this application. Thus, algorithms for parallel real-time expert systems must be designed, analyzed, and synthesized. The design process incorporates a consideration of the rule-set/fact-set size along with representation issues. These issues are looked at in reference to information movement and various inference mechanisms. Also examined is the process involved in transporting the RAV expert system functions from the TI Explorer, where they are implemented in the Automated Reasoning Tool (ART), to the iPSC Hypercube, where the system is synthesized using Concurrent Common LISP (CCLISP). The transformation process for the ART to CCLISP conversion is described. The performance characteristics of the parallel implementation of these expert systems on the iPSC Hypercube are compared to the TI Explorer implementation.

  1. High-throughput Raman chemical imaging for evaluating food safety and quality

    NASA Astrophysics Data System (ADS)

    Qin, Jianwei; Chao, Kuanglin; Kim, Moon S.

    2014-05-01

    A line-scan hyperspectral system was developed to enable Raman chemical imaging for large sample areas. A custom-designed 785 nm line-laser based on a scanning mirror serves as an excitation source. A 45° dichroic beamsplitter reflects the laser light to form a 24 cm x 1 mm excitation line normally incident on the sample surface. Raman signals along the laser line are collected by a detection module consisting of a dispersive imaging spectrograph and a CCD camera. A hypercube is accumulated line by line as a motorized table moves the samples transversely through the laser line. The system covers a Raman shift range of -648.7 to 2889.0 cm⁻¹ and a 23 cm wide area. An example application, for authenticating milk powder, was presented to demonstrate the system performance. In four minutes, the system acquired a 512x110x1024 hypercube (56,320 spectra) from four 47-mm-diameter Petri dishes containing four powder samples. Chemical images were created for detecting two adulterants (melamine and dicyandiamide) that had been mixed into the milk powder.
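
    Conceptually, the acquisition loop preallocates a (lines x pixels x bands) array and fills one spatial line per motor step. The sketch below mirrors that flow; `read_line` is a hypothetical stand-in for the CCD readout, and the chosen band index is likewise invented.

    ```python
    # Sketch of line-scan hypercube assembly: the cube grows one spatial
    # line per motor step. `read_line` is a placeholder detector readout
    # (one spatial line x all Raman shifts).
    import numpy as np

    lines, pixels, bands = 110, 512, 1024   # dimensions quoted in the abstract

    def read_line(i):                       # hypothetical CCD readout
        return np.random.rand(pixels, bands)

    cube = np.empty((lines, pixels, bands), dtype=np.float32)
    for i in range(lines):                  # sample table advances between reads
        cube[i] = read_line(i)

    # Single-band "chemical image" at one Raman shift index (index invented).
    image = cube[:, :, 400]
    print(image.shape)                      # (110, 512)
    ```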

  2. Hypercat - Hypercube of AGN tori

    NASA Astrophysics Data System (ADS)

    Nikutta, Robert; Lopez-Rodriguez, Enrique; Ichikawa, Kohei; Levenson, Nancy A.; Packham, Christopher C.

    2018-06-01

    AGN unification and observations hold that a dusty torus obscures the central accretion engine along some lines of sight. SEDs of dust tori have been modeled for a long time, but resolved emission morphologies have not been studied in much detail, because resolved observations have only recently become possible (VLTI, ALMA), with more to come in the near future (TMT, ELT, GMT). Some observations challenge a simple torus model, because in several objects most of the MIR emission appears to emanate from polar regions high above the equatorial plane, i.e., not where the dust supposedly resides. We introduce our software framework and hypercube of AGN tori (Hypercat), made with CLUMPY (www.clumpy.org), a large set of images (6 model parameters + wavelength) to facilitate studies of emission and dust morphologies. We make use of Hypercat to study the morphological properties of the emission and dust distributions as a function of model parameters. We find that a simple clumpy torus can indeed produce 10-micron emission patterns extended in polar directions, with extension ratios compatible with those found in observations. We are able to constrain the range of parameters that produce such morphologies.

  3. Immigration, Work, and Health: A Literature Review of Immigration Between Mexico and the United States

    PubMed Central

    Flynn, Michael A.; Carreón, Tania; Eggerth, Donald E.; Johnson, Antoinette I.

    2015-01-01

    Understanding the influence someone's job or career has on their health goes beyond the physical, emotional and social hazards, risks and conditions that they face at work. One's job or career also exerts a significant influence over other aspects of life that contribute to or detract from one's health and that of one's family. Work is the major incentive for Latin American migration to the United States. Latino immigrants experience increasingly poorer outcomes for physical health and chronic diseases the longer they remain in the U.S. The strong link between work and immigration suggests that, for many Latin Americans, immigration can be understood as a career path which puts them, and their family members, in situations that can change their physical, emotional, and social health as a condition of their employment. Given the large number of Latin Americans who emigrate for work, it is essential that the unique physical, mental and social impacts of emigration are accounted for when working with clients impacted by emigration at the individual, family and community level, as well as by those social workers practicing at the system level. This paper is a literature review that explores the impact that emigrating for work has on the health of those that emigrate and their family members that stay behind. PMID:28260831

  4. Scalability of Parallel Spatial Direct Numerical Simulations on Intel Hypercube and IBM SP1 and SP2

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Hanebutte, Ulf R.; Zubair, Mohammad

    1995-01-01

    The implementation and performance of a parallel spatial direct numerical simulation (PSDNS) approach on the Intel iPSC/860 hypercube and IBM SP1 and SP2 parallel computers is documented. Spatially evolving disturbances associated with the laminar-to-turbulent transition in boundary-layer flows are computed with the PSDNS code. The feasibility of using the PSDNS to perform transition studies on these computers is examined. The results indicate that the PSDNS approach can effectively be parallelized on a distributed-memory parallel machine by remapping the distributed data structure during the course of the calculation. Scalability information is provided to estimate computational costs to match the actual costs relative to changes in the number of grid points. By increasing the number of processors, slower-than-linear speedups are achieved with optimized (machine-dependent library) routines. This slower-than-linear speedup results because the computational cost is dominated by the FFT routine, which yields less than ideal speedups. By using appropriate compile options and optimized library routines on the SP1, the serial code achieves 52-56 Mflops on a single node of the SP1 (45 percent of theoretical peak performance). The actual performance of the PSDNS code on the SP1 is evaluated with a "real world" simulation that consists of 1.7 million grid points. One time step of this simulation is calculated on eight nodes of the SP1 in the same time as required by a Cray Y/MP supercomputer. For the same simulation, 32 nodes of the SP1 and SP2 are required to reach the performance of a Cray C-90. A 32-node SP1 (SP2) configuration is 2.9 (4.6) times faster than a Cray Y/MP for this simulation, while the hypercube is roughly 2 times slower than the Y/MP for this application. KEY WORDS: Spatial direct numerical simulations; incompressible viscous flows; spectral methods; finite differences; parallel computing.

  5. A Shared Heritage: Afro-Latin@s and Black History

    ERIC Educational Resources Information Center

    Busey, Christopher L.; Cruz, Bárbara C.

    2015-01-01

    As the Latin@ population continues to grow in the United States, it is imperative that social studies teachers are aware of the rich history and sociocultural complexities of Latin@ identity. In particular, there is a large population of Latin@s of African descent throughout Latin America, the Caribbean, and North America. However, Afro-Latin@s…

  6. The Wavelet Element Method. Part 2; Realization and Additional Features in 2D and 3D

    NASA Technical Reports Server (NTRS)

    Canuto, Claudio; Tabacco, Anita; Urban, Karsten

    1998-01-01

    The Wavelet Element Method (WEM) provides a construction of multiresolution systems and biorthogonal wavelets on fairly general domains. These are split into subdomains that are mapped to a single reference hypercube. Tensor products of scaling functions and wavelets defined on the unit interval are used on the reference domain. By introducing appropriate matching conditions across the interelement boundaries, a globally continuous biorthogonal wavelet basis on the general domain is obtained. This construction does not uniquely define the basis functions but rather leaves some freedom for fulfilling additional features. In this paper we detail the general construction principle of the WEM in the 1D, 2D and 3D cases. We address additional features such as symmetry, vanishing moments and minimal support of the wavelet functions in each particular dimension. The construction is illustrated by using biorthogonal spline wavelets on the interval.

  7. Multinode reconfigurable pipeline computer

    NASA Technical Reports Server (NTRS)

    Nosenchuck, Daniel M. (Inventor); Littman, Michael G. (Inventor)

    1989-01-01

    A multinode parallel-processing computer is made up of a plurality of interconnected, large-capacity nodes, each including a reconfigurable pipeline of functional units such as Integer Arithmetic Logic Processors, Floating Point Arithmetic Processors, Special Purpose Processors, etc. The reconfigurable pipeline of each node is connected to a multiplane memory by a Memory-ALU switch NETwork (MASNET). The reconfigurable pipeline includes three (3) basic substructures formed from functional units which have been found to be sufficient to perform the bulk of all calculations. The MASNET controls the flow of signals from the memory planes to the reconfigurable pipeline and vice versa. The nodes are connectable together by an internode data router (hyperspace router) so as to form a hypercube configuration. The capability of the nodes to conditionally configure the pipeline at each tick of the clock, without requiring a pipeline flush, permits many powerful algorithms to be implemented directly.

  8. The Reconstruction of the Historical Building of the Latin School in Malbork

    NASA Astrophysics Data System (ADS)

    Duda, Zenon; Kryzia, Katarzyna

    2013-03-01

    The paper summarizes the reconstruction of the historical building erected in the 14th century, during the times of the residence of Grand Master of the Teutonic Order Winrich von Kniprode, currently referred to as the Latin School. It characterizes the location of the Latin School in the urban conservation area of the town of Malbork. The building is situated in the stretch of the buttressed brick escarpment on the Nogat River in the line of the historic defensive walls of Malbork. The paper also outlines the history of this building, constructed and managed by the municipal authorities of Malbork, which for a long time was the seat of the Patronage of Saint George and the Merchant Guild; then, from the 16th century until 1864, the building housed a school where basic Latin was taught. Next, the situation of this historical monument in the 20th century is discussed, followed by the geological conditions of the site where the building was erected. The archeological and architectural exploratory research on the historical building, with a particular emphasis on historic preservation and restoration works focusing on the building and its surroundings, is presented and analyzed. Currently ongoing design, construction and adaptation works allowing new functions to be embedded into this building are also discussed. The paper shows the benefits arising from the realization of the reconstruction program of the degraded building of the Latin School in the historic quarter of the town. These activities are aimed at the conversion of the currently derelict building by means of embedding new functions into it. Planned facilities include, among others, an interactive educational center, a modern library, an astronomical observatory, a craft museum and a multifunctional hall, allowing proper conditions to be created for the development of educational, artistic and tourism-related activities in the reconstructed building. The reconstruction of the historical building is a positive response to its deterioration resulting from former activities, and it will contribute to the improvement of the quality of cultural life of both local inhabitants and visitors.

  9. Research into the Architecture of CAD Based Robot Vision Systems

    DTIC Science & Technology

    1988-02-09

    Vision" and "Automatic Generation of Recognition Features for Computer Vision," Mudge, Turney and Volz, published in Robotica (1987). All of the... "Occluded Parts," (T.N. Mudge, J.L. Turney, and R.A. Volz), Robotica, vol. 5, 1987, pp. 117-127. 5. "Vision Algorithms for Hypercube Machines," (T.N. Mudge

  10. A graph with fractional revival

    NASA Astrophysics Data System (ADS)

    Bernard, Pierre-Antoine; Chan, Ada; Loranger, Érika; Tamon, Christino; Vinet, Luc

    2018-02-01

    An example of a graph that admits balanced fractional revival between antipodes is presented. It is obtained by establishing the correspondence between the quantum walk on a hypercube where the opposite vertices across the diagonals of each face are connected, and the coherent transport of single excitations in the extension of the Krawtchouk spin chain with next-to-nearest-neighbour interactions.
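
    For orientation, the transfer amplitudes behind such results can be sampled numerically from the continuous-time quantum walk. The sketch below computes the antipodal transfer probability on the plain 3-cube, omitting the extra face-diagonal edges the paper adds; on the unmodified hypercube the probability reaches 1 at t = π/2.

    ```python
    # Sketch of a continuous-time quantum walk between antipodal vertices.
    # Uses the plain 3-cube adjacency matrix; the paper's graph additionally
    # joins opposite vertices across each face.
    import numpy as np
    from scipy.linalg import expm

    d = 3
    n = 2 ** d
    A = np.zeros((n, n))
    for u in range(n):
        for k in range(d):
            A[u, u ^ (1 << k)] = 1.0   # hypercube edges: flip one bit

    t = np.pi / 2
    U = expm(-1j * t * A)              # walk operator exp(-iAt)
    u, v = 0, n - 1                    # antipodal pair (000 and 111)
    print(abs(U[v, u]) ** 2)           # transfer probability: 1.0 at t = pi/2
    ```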

  11. Nutrition status of children in Latin America

    PubMed Central

    Garmendia, M. L.; Jones‐Smith, J.; Lutter, C. K.; Miranda, J. J.; Pedraza, L. S.; Popkin, B. M.; Ramirez‐Zea, M.; Salvo, D.; Stein, A. D.

    2017-01-01

    Summary The prevalence of overweight and obesity is rapidly increasing among Latin American children, posing challenges for current healthcare systems and increasing the risk for a wide range of diseases. To understand the factors contributing to childhood obesity in Latin America, this paper reviews the current nutrition status and physical activity situation, the disparities between and within countries and the potential challenges for ensuring adequate nutrition and physical activity. Across the region, children face a dual burden of undernutrition and excess weight. While efforts to address undernutrition have made marked improvements, childhood obesity is on the rise as a result of diets that favour energy‐dense, nutrient‐poor foods and the adoption of a sedentary lifestyle. Over the last decade, changes in socioeconomic conditions, urbanization, retail foods and public transportation have all contributed to childhood obesity in the region. Additional research and research capacity are needed to address this growing epidemic, particularly with respect to designing, implementing and evaluating the impact of evidence‐based obesity prevention interventions. PMID:28741907

  12. [Workplace violence in Latin America: A review of the scientific evidence].

    PubMed

    Ansoleaga, Elisa; Gómez-Rubio, Constanza; Mauro, Amalia

    2015-01-01

    Workplace violence has acquired social relevance given the evidence regarding its health consequences. Various effects have been identified, such as mood disorders and sleep disorders, hostility, isolation, and insecurity, among others. To describe and analyze the scientific evidence published on workplace violence in studies in Latin American countries between 2009 and 2014. A descriptive and quantitative study. A search was made of the Academic Search Complete (EBSCOhost), Academic Search Premier, PSICODOC, Scielo.org, JSTOR and SCOPUS databases, and indexed empirical studies were considered. We worked with 46 selected articles. The studies showed a higher amount of psychological violence at work, with a potential risk for women and health professionals. The analysis categories were the most frequently reported behaviors that express violence, health implications, and facilitators. The literature on the study of workplace violence in Latin America is recent. The articles are descriptive or interpretative studies, with insufficient work of an analytical nature. Health personnel, particularly women, face conditions of vulnerability, particularly with regard to sexual harassment, wage inequality and bullying.

  13. Report on the First PANLAR Rheumatology Review Course Rheumatoid Arthritis: Challenges and Solutions in Latin America.

    PubMed

    Pineda, Carlos; Caballero-Uribe, Carlo V; Gutiérrez, Marwin; Cazenave, Tomás; Cardiel, Mario H; Levy, Roger; Espada, Graciela; Rose, Carlos; Santos-Moreno, Pedro; Pons-Estel, Bernardo A; Muñoz-Louis, Roberto; Soriano, Enrique R; Reveille, John D

    2015-12-01

    The First PANLAR Rheumatology Review Course was held in Barranquilla, Colombia, in April 2015. Researchers, rheumatologists, epidemiologists, and a variety of allied professionals and patients attended the meeting. The scientific program included plenary sessions and symposia delivered by renowned experts in the field, followed by an interactive discussion forum over 2 days. A broad spectrum of topics was discussed, reflecting the current challenges and opportunities for diagnosis and treatment of rheumatoid arthritis (RA) in Latin America. The scientific program included not only traditional disease aspects but also social implications, research projects, educational characteristics, patient perspectives, and novel care models, emphasizing the need for training human resources and proposing unique approaches to RA health care in Latin America, therefore helping us to increase and improve the knowledge and understanding of the characteristics of this health condition in the region, thus promoting and encouraging equity, quality, and efficiency of RA health care.

  14. [Satellite symposium: Asthma in the World. Asthma among children in Latin America].

    PubMed

    Mallol, J

    2004-01-01

    The prevalence of respiratory symptoms related to asthma in children from Latin America has been largely ignored. This region participated in phases I and III of the International Study of Asthma and Allergies in Childhood (ISAAC), with 17 participating centers in phase I and 78 centers in phase III. Data were obtained on asthma, rhinitis and eczema from countries and centers with markedly different climatic, cultural and environmental conditions and socioeconomic development. The results for phase I are presented herein because data from phase III are currently being revised at the ISAAC international data control center and will be officially available in the second half of 2004. Phase I provided important information on the prevalence of asthma in the participating countries and demonstrated wide variation among centers in the same country and among countries. The participating Latin American countries are all developing countries and share more or less the same problems related to low socioeconomic status. Therefore, the results and figures should be analyzed within that context. The ranges for cumulative and current asthma symptoms in children from the Latin American countries that participated in phase I (89,000 children) were as follows: the prevalence of asthma ranged from 5.5% to 28% in children aged 13-14 years and from 4.1% to 26.9% in children aged 6-7 years. The prevalence of wheezing in the previous 12 months ranged from 6.6% to 27% in children aged 13-14 years and from 8.6% to 32.1% in children aged 6-7 years. The high figures for asthma in a region with a high level of gastrointestinal parasite infestation, a high burden of acute respiratory and gastrointestinal infections occurring early in life, and severe environmental and hygiene problems suggest that these factors, considered protective in other (developed) regions of the world, do not have the same effect in this region. Furthermore, those aggressive environmental conditions acting together from very early in life might condition different asthmatic phenotypes with more severe clinical presentation in infancy (first 2 years of life), lower atopy and enhanced airway reactivity. The present study indicates that the prevalence of asthma and related symptoms in Latin America is as high and variable as described previously for industrialized or developed regions of the world and that environmental risk factors, mainly related to poverty, could be responsible for the different clinical and functional presentations of asthma in children from developing regions.

  15. Health-related quality of life of latin-american immigrants and spanish-born attended in spanish primary health care: socio-demographic and psychosocial factors.

    PubMed

    Salinero-Fort, Miguel Ángel; Gómez-Campelo, Paloma; Bragado-Alvárez, Carmen; Abánades-Herranz, Juan Carlos; Jiménez-García, Rodrigo; de Burgos-Lunar, Carmen

    2015-01-01

    This study compares the health-related quality of life of Spanish-born and Latin American-born individuals settled in Spain. Socio-demographic and psychosocial factors associated with health-related quality of life are analyzed. A cross-sectional, multicenter Primary Health Care-based study of Latin American-born (n = 691) and Spanish-born (n = 903) outpatients from 15 Primary Health Care Centers (Madrid, Spain). The Medical Outcomes Study 36-Item Short Form Health Survey (SF-36) was used to assess health-related quality of life. Socio-demographic, psychosocial, and specific migration data were also collected. Compared to Spanish-born participants, Latin American-born participants reported higher health-related quality of life in the physical functioning and vitality dimensions. Across the entire sample, Latin American-born participants, younger participants, men, and those with high social support reported significantly higher levels of physical health. Men with higher social support and a higher income reported significantly higher mental health. When stratified by gender, the data show that for men physical health was only positively associated with younger age. For women, in addition to age, social support and marital status were significantly related. Both men and women with higher social support and income had significantly better mental health. Finally, for immigrants, the physical and mental health components of health-related quality of life were not found to be significantly associated with any of the pre-migration factors or conditions of migration. Only the variable "exposure to political violence" was significantly associated with the mental health component (p = 0.014). The key factors to understanding HRQoL among Latin American-born immigrants settled in Spain are age, sex and social support. Therefore, strategies to maintain optimal health outcomes in these immigrant communities should include public policies on social inclusion in the host society and focus on improving social support networks in order to foster and maintain the health and HRQoL of this group.

  16. Use of m-Health Technology for Preventive Interventions to Tackle Cardiometabolic Conditions and Other Non-Communicable Diseases in Latin America- Challenges and Opportunities.

    PubMed

    Beratarrechea, Andrea; Diez-Canseco, Francisco; Irazola, Vilma; Miranda, Jaime; Ramirez-Zea, Manuel; Rubinstein, Adolfo

    2016-01-01

    In Latin America, cardiovascular disease (CVD) mortality rates will increase by an estimated 145% from 1990 to 2020. Several challenges related to social strains, inadequate public health infrastructure, and underfinanced healthcare systems make cardiometabolic conditions and non-communicable diseases (NCDs) difficult to prevent and control. On the other hand, the region has high mobile phone coverage, making mobile health (mHealth) particularly attractive to complement and improve strategies toward prevention and control of these conditions in low- and middle-income countries. In this article, we describe the experiences of three Centers of Excellence for prevention and control of NCDs sponsored by the National Heart, Lung, and Blood Institute with mHealth interventions to address cardiometabolic conditions and other NCDs in Argentina, Guatemala, and Peru. The nine studies described involved the design and implementation of complex interventions targeting providers, patients and the public. The rationale, design of the interventions, and evaluation of processes and outcomes of each of these studies are described, together with barriers and enabling factors associated with their implementation. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Driver Injury Risk Variability in Finite Element Reconstructions of Crash Injury Research and Engineering Network (CIREN) Frontal Motor Vehicle Crashes.

    PubMed

    Gaewsky, James P; Weaver, Ashley A; Koya, Bharath; Stitzel, Joel D

    2015-01-01

    A 3-phase real-world motor vehicle crash (MVC) reconstruction method was developed to analyze injury variability as a function of precrash occupant position for 2 full-frontal Crash Injury Research and Engineering Network (CIREN) cases. Phase I: A finite element (FE) simplified vehicle model (SVM) was developed and tuned to mimic the frontal crash characteristics of the CIREN case vehicle (Camry or Cobalt) using frontal New Car Assessment Program (NCAP) crash test data. Phase II: The Toyota HUman Model for Safety (THUMS) v4.01 was positioned in 120 precrash configurations per case within the SVM. Five occupant positioning variables were varied using a Latin hypercube design of experiments: seat track position, seat back angle, D-ring height, steering column angle, and steering column telescoping position. An additional baseline simulation was performed that aimed to match the precrash occupant position documented in CIREN for each case. Phase III: FE simulations were then performed using kinematic boundary conditions from each vehicle's event data recorder (EDR). HIC15, combined thoracic index (CTI), femur forces, and strain-based injury metrics in the lung and lumbar vertebrae were evaluated to predict injury. Tuning the SVM to specific vehicle models resulted in close matches between simulated and test injury metric data, allowing the tuned SVM to be used in each case reconstruction with EDR-derived boundary conditions. Simulations with the most rearward seats and reclined seat backs had the greatest HIC15, head injury risk, CTI, and chest injury risk. Calculated injury risks for the head, chest, and femur closely correlated to the CIREN occupant injury patterns. CTI in the Camry case yielded a 54% probability of Abbreviated Injury Scale (AIS) 2+ chest injury in the baseline case simulation and ranged from 34 to 88% (mean = 61%) risk in the least and most dangerous occupant positions. The greater than 50% probability was consistent with the case occupant's AIS 2 hemomediastinum. Stress-based metrics were used to predict injury to the lower leg of the Camry case occupant. The regional-level injury metrics evaluated for the Cobalt case occupant indicated a low risk of injury; however, strain-based injury metrics better predicted pulmonary contusion. Approximately 49% of the Cobalt occupant's left lung was contused, though the baseline simulation predicted 40.5% of the lung to be injured. A method to compute injury metrics and risks as functions of precrash occupant position was developed and applied to 2 CIREN MVC FE reconstructions. The reconstruction process allows for quantification of the sensitivity and uncertainty of the injury risk predictions based on occupant position to further understand important factors that lead to more severe MVC injuries.
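
    The last step in reconstructions like these is mapping an injury metric to a risk probability. As a hedged illustration only, the sketch below evaluates an assumed logistic risk curve for AIS 2+ chest injury as a function of CTI; the coefficients are invented for the example and are not the published injury-risk values.

    ```python
    # Sketch: map an injury metric (here CTI) to an AIS 2+ chest-injury
    # probability via a logistic risk function. Coefficients are
    # illustrative only, not published regulatory values.
    import numpy as np

    def p_ais2_chest(cti, a=-4.0, b=5.0):   # hypothetical coefficients
        return 1.0 / (1.0 + np.exp(-(a + b * cti)))

    cti_values = np.array([0.6, 0.8, 1.0])  # e.g. from the 120 positioning runs
    for cti, p in zip(cti_values, p_ais2_chest(cti_values)):
        print(f"CTI={cti:.2f} -> P(AIS2+)={p:.0%}")
    ```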

  18. Comparative Analysis of Reconstructed Image Quality in a Simulated Chromotomographic Imager

    DTIC Science & Technology

    2014-03-01

    quality. This example uses five basic images: a backlit bar chart with random intensity, 100 nm separation. A total of 54 initial target... compared for a variety of scenes. Reconstructed image quality is highly dependent on the initial target hypercube, so a total of 54 initial target...

  19. Different Formulations of the Orthogonal Array Problem and Their Symmetries

    DTIC Science & Technology

    2014-06-19

    t). Note that OAP(k, s, t, λ) is a polytope, not an unbounded polyhedron, because it can be embedded inside the hypercube [0, λ]^m. Theorem 5. The OAP(k... 3.1) is a facet, since otherwise OAP(k, s, t, λ) would be an unbounded polyhedron. Then there exists N_y ∈ R^m satisfying all but the 28 facet-defining

  20. Hypercube Solutions for Conjugate Directions

    DTIC Science & Technology

    1991-12-01

    1800s, Charles Babbage had designed his Difference Engine and proceeded to the more advanced Analytical Engine. These machines were never completed... Consider his motivation. The following example was frequently cited by Charles Babbage (1792-1871) to justify the construction of his first computing... LIST OF REFERENCES [1] P. Morrison and E. Morrison, editors. Charles Babbage and His Calculating Engines. Dover Publications, Inc., New

  1. Mold: "tsara'at," Leviticus, and the history of a confusion.

    PubMed

    Heller, Richard M; Heller, Toni W; Sasson, Jack M

    2003-01-01

    The noun tsara'at appears about two dozen times in the Hebrew Bible, almost exclusively in Leviticus, where it is used to describe a state of ritual defilement manifested as a scaly condition of the skin, or as a condition of cloth, leather, and the walls of houses. In the Septuagint, the Greek translation of the Hebrew Bible, nega tsara'at was translated as aphe lepras; in the Latin Vulgate, this became plaga leprae. These words in Greek and Latin implied a condition that spread over the body, not a term of ritual impurity. Tsara'at has continued to be translated as "leprosy," even though this term is not appropriate, as there was no leprosy as we know it in the Middle East during the time period when the Hebrew Bible was written. Others have suggested that the proper translation of tsara'at is "mold." The recent identification of a specific mold (Stachybotrys sp.) that contaminates buildings and causes respiratory distress, memory loss, and rash, and the fact that mold has been present for millennia, lend support to the translation of tsara'at as "mold."

  2. HERMIES-I: a mobile robot for navigation and manipulation experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbin, C.R.; Barhen, J.; de Saussure, G.

    1985-01-01

    The purpose of this paper is to report the current status of investigations ongoing at the Center for Engineering Systems Advanced Research (CESAR) in the areas of navigation and manipulation in unstructured environments. The HERMIES-I mobile robot, a prototype of a series which contains many of the major features needed for remote work in hazardous environments, is discussed. Initial experimental work at CESAR has begun in the area of navigation. The paper briefly reviews some of the ongoing research in autonomous navigation and describes initial research with HERMIES-I and associated graphic simulation. Since the HERMIES robots will generally be composed of a variety of asynchronously controlled hardware components (such as manipulator arms, digital image sensors, sonars, etc.), it seems appropriate to consider future development of the HERMIES brain as a hypercube ensemble machine with concurrent computation and associated message passing. The basic properties of such a hypercube architecture are presented. Decision-making under uncertainty eventually permeates all of our work. Following a survey of existing analytical approaches, it was decided that a stronger theoretical basis is required. As such, this paper presents the framework for a recently developed hybrid uncertainty theory. 21 refs., 2 figs.

  3. Critical and Griffiths-McCoy singularities in quantum Ising spin glasses on d-dimensional hypercubic lattices: A series expansion study.

    PubMed

    Singh, R R P; Young, A P

    2017-08-01

    We study the ±J transverse-field Ising spin-glass model at zero temperature on d-dimensional hypercubic lattices and in the Sherrington-Kirkpatrick (SK) model, by series expansions around the strong-field limit. In the SK model and in high dimensions our calculated critical properties are in excellent agreement with the exact mean-field results, surprisingly even down to dimension d=6, which is below the upper critical dimension of d=8. In contrast, at lower dimensions we find a rich singular behavior consisting of critical and Griffiths-McCoy singularities. The divergence of the equal-time structure factor allows us to locate the critical coupling where the correlation length diverges, implying the onset of a thermodynamic phase transition. We find that the spin-glass susceptibility as well as various power moments of the local susceptibility become singular in the paramagnetic phase before the critical point. Griffiths-McCoy singularities are very strong in two dimensions but decrease rapidly as the dimension increases. We present evidence that high enough powers of the local susceptibility may become singular at the pure-system critical point.
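
    For reference, the model studied here, the ±J transverse-field Ising spin glass, is conventionally written as follows (notation assumed, not quoted from the paper):

        H = -\sum_{\langle i j \rangle} J_{ij}\, \sigma_i^z \sigma_j^z \;-\; \Gamma \sum_i \sigma_i^x,
        \qquad J_{ij} = \pm J \ \text{with equal probability},

    with the series expansions of the abstract taken around the strong-field limit \Gamma \to \infty, where the ground state is a trivial product of x-polarized spins.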

  4. A parallel strategy for implementing real-time expert systems using CLIPS

    NASA Technical Reports Server (NTRS)

    Ilyes, Laszlo A.; Villaseca, F. Eugenio; Delaat, John

    1994-01-01

    As evidenced by current literature, there appears to be a continued interest in the study of real-time expert systems. It is generally recognized that speed of execution is only one consideration when designing an effective real-time expert system. Some other features one must consider are the expert system's ability to perform temporal reasoning, handle interrupts, prioritize data, contend with data uncertainty, and perform context focusing as dictated by the incoming data to the expert system. This paper presents a strategy for implementing a real-time expert system on the iPSC/860 hypercube parallel computer using CLIPS. The strategy takes into consideration not only the execution time of the software, but also those features which define a true real-time expert system. The methodology is then demonstrated using a practical implementation of an expert system which performs diagnostics on the Space Shuttle Main Engine (SSME). This particular implementation uses an eight-node hypercube to process ten sensor measurements in order to simultaneously diagnose five different failure modes within the SSME. The main program is written in ANSI C and embeds CLIPS to better facilitate and debug the rule-based expert system.
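
    The specific embedding (ANSI C driving CLIPS across an eight-node iPSC/860) is tied to that hardware; as a language-neutral illustration of the partitioning idea, the sketch below fans sensor readings out to parallel workers, one rule set per failure mode. All names and thresholds are hypothetical:

        # Illustrative only: partition diagnostic rules across parallel workers,
        # mirroring the paper's idea of concurrent failure-mode evaluation.
        from multiprocessing import Pool

        SENSORS = {f"s{i}": 1.0 + 0.1 * i for i in range(10)}   # 10 readings (dummy)

        def check_failure_mode(mode: int) -> tuple[int, bool]:
            # Stand-in for a CLIPS rule set: each mode inspects a few sensors.
            window = [SENSORS[f"s{(mode + k) % 10}"] for k in range(3)]
            return mode, max(window) > 1.7                      # hypothetical threshold

        if __name__ == "__main__":
            with Pool(5) as pool:                               # five failure modes in parallel
                for mode, tripped in pool.map(check_failure_mode, range(5)):
                    print(f"failure mode {mode}: {'TRIPPED' if tripped else 'ok'}")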

  5. Parallel algorithms for quantum chemistry. I. Integral transformations on a hypercube multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, R.A.; Binkley, J.S.; Colvin, M.E.

    1987-02-15

    For many years it has been recognized that fundamental physical constraints such as the speed of light will limit the ultimate speed of single-processor computers to less than about three billion floating point operations per second (3 GFLOPS). This limitation is becoming increasingly restrictive as commercially available machines are now within an order of magnitude of this asymptotic limit. A natural way to avoid this limit is to harness together many processors to work on a single computational problem. In principle, these parallel processing computers have speeds limited only by the number of processors one chooses to acquire. The usefulness of potentially unlimited processing speed to a computationally intensive field such as quantum chemistry is obvious. If these methods are to be applied to significantly larger chemical systems, parallel schemes will have to be employed. For this reason we have developed distributed-memory algorithms for a number of standard quantum chemical methods. We are currently implementing these on a 32-processor Intel hypercube. In this paper we present our algorithm and benchmark results for one of the bottleneck steps in quantum chemical calculations: the four-index integral transformation.
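
    The bottleneck step named here, the four-index integral transformation, is normally factored into four quarter-transformations so the cost drops from O(N^8) to O(N^5); each quarter-step distributes naturally over nodes. A serial NumPy sketch of the factorization (not the authors' distributed algorithm; dummy data):

        # Serial sketch of the four-index transform (mu nu|lam sig) -> (p q|r t),
        # factored into four O(N^5) quarter-transformations.
        import numpy as np

        n = 8
        rng = np.random.default_rng(0)
        C = rng.standard_normal((n, n))              # AO->MO coefficients (dummy)
        eri_ao = rng.standard_normal((n, n, n, n))   # AO-basis integrals (dummy)

        half = np.einsum('mp,mnls->pnls', C, eri_ao)   # transform 1st index
        half = np.einsum('nq,pnls->pqls', C, half)     # 2nd
        half = np.einsum('lr,pqls->pqrs', C, half)     # 3rd
        eri_mo = np.einsum('st,pqrs->pqrt', C, half)   # 4th
        print(eri_mo.shape)                            # (n, n, n, n) in the MO basis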

  6. Critical and Griffiths-McCoy singularities in quantum Ising spin glasses on d-dimensional hypercubic lattices: A series expansion study

    NASA Astrophysics Data System (ADS)

    Singh, R. R. P.; Young, A. P.

    2017-08-01

    We study the ±J transverse-field Ising spin-glass model at zero temperature on d-dimensional hypercubic lattices and in the Sherrington-Kirkpatrick (SK) model, by series expansions around the strong-field limit. In the SK model and in high dimensions our calculated critical properties are in excellent agreement with the exact mean-field results, surprisingly even down to dimension d = 6, which is below the upper critical dimension of d = 8. In contrast, at lower dimensions we find a rich singular behavior consisting of critical and Griffiths-McCoy singularities. The divergence of the equal-time structure factor allows us to locate the critical coupling where the correlation length diverges, implying the onset of a thermodynamic phase transition. We find that the spin-glass susceptibility as well as various power moments of the local susceptibility become singular in the paramagnetic phase before the critical point. Griffiths-McCoy singularities are very strong in two dimensions but decrease rapidly as the dimension increases. We present evidence that high enough powers of the local susceptibility may become singular at the pure-system critical point.

  7. Spacecraft On-Board Information Extraction Computer (SOBIEC)

    NASA Technical Reports Server (NTRS)

    Eisenman, David; Decaro, Robert E.; Jurasek, David W.

    1994-01-01

    The Jet Propulsion Laboratory is the Technical Monitor on an SBIR Program issued for Irvine Sensors Corporation to develop a highly compact, dual-use massively parallel processing node known as SOBIEC. SOBIEC couples 3D memory stacking technology provided by nCUBE. The node contains sufficient network Input/Output to implement up to an order-13 binary hypercube. The benefit of this network is that it scales linearly as more processors are added, and it is a superset of other commonly used interconnect topologies such as meshes, rings, toroids, and trees. In this manner, a distributed processing network can be easily devised and supported. The SOBIEC node has sufficient memory for most multi-computer applications, and also supports external memory expansion and DMA interfaces. The SOBIEC node is supported by a mature set of software development tools from nCUBE. The nCUBE operating system (OS) provides configuration and operational support for up to 8000 SOBIEC processors in an order-13 binary hypercube or any subset or partition(s) thereof. The OS is UNIX (USL SVR4) compatible, with C, C++, and FORTRAN compilers readily available. A stand-alone development system is also available to support SOBIEC test and integration.

  8. Spent sealed radium sources conditioning in Latin America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mourao, R.P.

    1999-06-01

    The management of spent sealed sources is considered by the International Atomic Energy Agency (IAEA) to be one of the greatest challenges faced by nuclear authorities today, especially in developing countries. One of the Agency's initiatives to tackle this problem is the Spent Radium Sources Conditioning Project, a worldwide project relying on the regional co-operation between countries. A team from the Brazilian nuclear research institute Centro de Desenvolvimento da Tecnologia Nuclear (CDTN) was chosen as the expert team to carry out the operations in Latin America; since December 1996 radium sources have been safely conditioned in Uruguay, Nicaragua, Guatemala, Ecuador and Paraguay. A Quality Assurance Program was established, encompassing the qualification of the capsule welding process, written operational procedures referring to all major steps of the operation, calibration of monitors and information retrievability. A 200-L carbon steel drum-based packaging concept was used to condition the sources, its cavity being designed to receive the lead shield device containing stainless steel capsules with the radium sources. As a result of these operations, a total amount of 2,897 mg of needles, tubes, medical applicators, standard sources for calibration, lightning rods, secondary wastes and contaminated objects were stored in proper conditions and are now under control of the nuclear authorities of the visited countries.

  9. Increasing access to Latin American social medicine resources: a preliminary report.

    PubMed

    Buchanan, Holly Shipp; Waitzkin, Howard; Eldredge, Jonathan; Davidson, Russ; Iriart, Celia; Teal, Janis

    2003-10-01

    This preliminary report describes the development and implementation of a project to improve access to literature in Latin American social medicine (LASM). The University of New Mexico project team collaborated with participants from Argentina, Brazil, Chile, and Ecuador to identify approximately 400 articles and books in Latin American social medicine. Structured abstracts were prepared, translated into English, Spanish, and Portuguese, assigned Medical Subject Headings (MeSH), and loaded into a Web-based database for public searching. The project has initiated Web-based publication for two LASM journals. Evaluation included measures of use and content. The LASM Website (http://hsc.unm.edu/lasm) and database create access to formerly little-known literature that addresses problems relevant to current medicine and public health. This Website offers a unique resource for researchers, practitioners, and teachers who seek to understand the links between socioeconomic conditions and health. The project provides a model for collaboration between librarians and health care providers. Challenges included procurement of primary material; preparation of concise abstracts; working with trilingual translations of abstracts, metadata, and indexing; and the work processes of the multidisciplinary team. The literature of Latin American social medicine has become more readily available to researchers worldwide. The LASM project serves as a collaborative model for the creation of sustainable solutions for disseminating information that is difficult to access through traditional methods.

  10. Vascular Dysfunction in Mother and Offspring During Preeclampsia: Contributions from Latin-American Countries.

    PubMed

    Giachini, Fernanda Regina; Galaviz-Hernandez, Carlos; Damiano, Alicia E; Viana, Marta; Cadavid, Angela; Asturizaga, Patricia; Teran, Enrique; Clapes, Sonia; Alcala, Martin; Bueno, Julio; Calderón-Domínguez, María; Ramos, María P; Lima, Victor Vitorino; Sosa-Macias, Martha; Martinez, Nora; Roberts, James M; Escudero, Carlos

    2017-10-06

    Pregnancy is a physiologically stressful condition that generates a series of functional adaptations by the cardiovascular system. The impact of pregnancy on this system persists from conception beyond birth. Recent evidence suggests that vascular changes associated with pregnancy complications, such as preeclampsia, affect the function of the maternal and offspring vascular systems, after delivery and into adult life. Since the vascular system contributes to systemic homeostasis, defective development or function of blood vessels predisposes both mother and infant to future risk for chronic disease. These alterations in later life range from fertility problems to alterations in the central nervous system or immune system, among others. It is important to note that morbidity and mortality due to pregnancy complications, including preeclampsia, as well as cardiovascular diseases, are higher in Latin-American countries than in more developed countries. Nonetheless, research conducted in Latin America is lacking in both quantity and impact. A similar, though smaller, gap is evident in research on vascular disorders related to pregnancy complications. Therefore, in this review, information about preeclampsia and endothelial dysfunction generated by research groups based in Latin-American countries will be highlighted. We also note the need, shared with many other countries in the world, for more effective regional and international collaboration to generate new data specific to our region on this topic.

  11. Increasing access to Latin American social medicine resources: a preliminary report*

    PubMed Central

    Buchanan, Holly Shipp; Waitzkin, Howard; Eldredge, Jonathan; Davidson, Russ; Iriart, Celia; Teal, Janis

    2003-01-01

    Purpose: This preliminary report describes the development and implementation of a project to improve access to literature in Latin American social medicine (LASM). Methods: The University of New Mexico project team collaborated with participants from Argentina, Brazil, Chile, and Ecuador to identify approximately 400 articles and books in Latin American social medicine. Structured abstracts were prepared, translated into English, Spanish, and Portuguese, assigned Medical Subject Headings (MeSH), and loaded into a Web-based database for public searching. The project has initiated Web-based publication for two LASM journals. Evaluation included measures of use and content. Results: The LASM Website (http://hsc.unm.edu/lasm) and database create access to formerly little-known literature that addresses problems relevant to current medicine and public health. This Website offers a unique resource for researchers, practitioners, and teachers who seek to understand the links between socioeconomic conditions and health. The project provides a model for collaboration between librarians and health care providers. Challenges included procurement of primary material; preparation of concise abstracts; working with trilingual translations of abstracts, metadata, and indexing; and the work processes of the multidisciplinary team. Conclusions: The literature of Latin American social medicine has become more readily available to researchers worldwide. The LASM project serves as a collaborative model for the creation of sustainable solutions for disseminating information that is difficult to access through traditional methods. PMID:14566372

  12. Prevalence of Celiac Disease in Latin America: A Systematic Review and Meta-Regression

    PubMed Central

    Parra-Medina, Rafael; Molano-Gonzalez, Nicolás; Rojas-Villarraga, Adriana; Agmon-Levin, Nancy; Arango, Maria-Teresa; Shoenfeld, Yehuda; Anaya, Juan-Manuel

    2015-01-01

    Background Celiac disease (CD) is an immune-mediated enteropathy triggered by the ingestion of gluten in susceptible individuals, and its prevalence varies depending on the studied population. Given that information on CD in Latin America is scarce, we aimed to investigate the prevalence of CD in this region of the world through a systematic review and meta-analysis. Methods and Findings This was a two-phase study. First, a cross-sectional analysis from 981 individuals of the Colombian population was made. Second, a systematic review and meta-regression analysis were performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Our results disclosed a lack of celiac autoimmunity in the studied Colombian population (i.e., anti-tissue transglutaminase (tTG) and IgA anti-endomysium (EMA)). In the systematic review, 72 studies were considered. The estimated prevalence of CD in Latin Americans ranged between 0.46% and 0.64%. The prevalence of CD in first-degree relatives of CD probands was 5.5%. The coexistence of CD and type 1 diabetes mellitus varied from 4.6% to 8.7%, depending on the diagnosis methods (i.e., autoantibodies and/or biopsies). Conclusions Although CD seems to be a rare condition in Colombians, the general prevalence of the disease in Latin Americans seemingly corresponds to a similar scenario observed in Europeans. PMID:25942408

  13. Asthma mortality in Latin America.

    PubMed

    Neffen, H; Baena-Cagnani, C E; Malka, S; Solé, D; Sepúlveda, R; Caraballo, L; Caravajal, E; Rodríguez Gavaldá, R; González Díaz, S; Guggiari Chase, J; Díez, C; Baluga, J; Capriles Hulett, A

    1997-01-01

    There are not enough data concerning asthma mortality in Latin America. The Latin American Society of Allergy and Immunology coordinated this project to provide reliable data for gaining knowledge about our present situation, which is a condition indispensable to changing it. The following countries participated in this study: Argentina, Brazil, Chile, Colombia, Costa Rica, Cuba, Mexico, Paraguay, Peru, Uruguay and Venezuela. A uniform protocol was designed in Santa Fe, Argentina. Asthma mortality rates were analyzed in accordance with two variables: age-adjusted rates (5-34) and total death rates. The total population studied was 107,122,529 inhabitants. The highest death rates were found in Uruguay and Mexico (5.63), and the lowest in Paraguay (0.8) and Colombia (1.35). Age-adjusted (5-34) rates were higher in Costa Rica (1.38) and lower in Chile (0.28). Regarding sex, the analysis of the information provided by seven countries showed a predominance of females (51.8%) over males (48.18%). In the southern Latin American countries such as Chile, Uruguay, Paraguay and Argentina, which have marked climatic differences, deaths occurred mainly in the winter. It is important to emphasize that, in most countries, deaths from asthma occurred at home: Chile (60.7%), Argentina (63.4%) and Paraguay (88%). However, in Uruguay, 58.6% occurred during hospitalization. Mortality rates from bronchial asthma are high in most of the Latin American countries studied, even though further studies are needed. Asthma is a serious global health problem. People of all ages in countries throughout the world are affected by this chronic airway disorder that can be severe and sometimes fatal. The health ministries of each country do not believe asthma is a significant issue. Therefore, we should provide them with sound epidemiological studies to convince them to change their attitude toward this disease.

  14. Light and Dark Waves.

    ERIC Educational Resources Information Center

    Parker, Henry H.; Sturdivant, Ann

    1966-01-01

    Several Latin textbooks are described and evaluated in terms of their effectiveness in teaching language mastery. These are: (1) "Using Latin," Scott, Foresman, (2) "Latin for Americans," Macmillan, (3) "Lingua Latina," by Burns, Medicus, and Sherburne, and (4) "A Basic Course in Latin," "An Intermediate Course in Latin--Reading," and "An…

  15. Latin American scientific contribution to ecology.

    PubMed

    Wojciechowski, Juliana; Ceschin, Fernanda; Pereto, Suelen C A S; Ribas, Luiz G S; Bezerra, Luis A V; Dittrich, Jaqueline; Siqueira, Tadeu; Padial, André A

    2017-01-01

    Latin America embodies countries of special interest for ecological studies, given that areas with great value for biodiversity are located within their territories. This highlights the importance of an evaluation of ecological research in the Latin American region. We assessed the scientific participation of Latin American researchers in ecological journals and patterns of international collaboration, and defined the main characteristics of the articles. Although Latin American publications have increased over fourteen years, they accounted for up to 9% of publications in ecology. Brazil led the scientific production in Latin America, followed by Argentina and Mexico. In general, Latin American articles represented a low percentage of most journals' total publications, with particularly low representation in high-impact-factor journals. Half of the Latin American publications involved international collaboration. Articles with more than five authors and with international collaboration were the most cited. Descriptive studies, mainly based on older theories, still form the majority, suggesting that ecology is at a developing stage in Latin America.

  16. Nutrition status of children in Latin America.

    PubMed

    Corvalán, C; Garmendia, M L; Jones-Smith, J; Lutter, C K; Miranda, J J; Pedraza, L S; Popkin, B M; Ramirez-Zea, M; Salvo, D; Stein, A D

    2017-07-01

    The prevalence of overweight and obesity is rapidly increasing among Latin American children, posing challenges for current healthcare systems and increasing the risk for a wide range of diseases. To understand the factors contributing to childhood obesity in Latin America, this paper reviews the current nutrition status and physical activity situation, the disparities between and within countries and the potential challenges for ensuring adequate nutrition and physical activity. Across the region, children face a dual burden of undernutrition and excess weight. While efforts to address undernutrition have made marked improvements, childhood obesity is on the rise as a result of diets that favour energy-dense, nutrient-poor foods and the adoption of a sedentary lifestyle. Over the last decade, changes in socioeconomic conditions, urbanization, retail foods and public transportation have all contributed to childhood obesity in the region. Additional research and research capacity are needed to address this growing epidemic, particularly with respect to designing, implementing and evaluating the impact of evidence-based obesity prevention interventions. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity.

  17. Vision-Based Navigation and Parallel Computing

    DTIC Science & Technology

    1990-08-01

    5.8. Behzad Kamgar-Parsi and Behrooz Kamgar-Parsi, "On Problem Solving with Hopfield Neural Networks", CAR-TR-462, CS-TR...Second, the hypercube connections support logarithmic implementations of fundamental parallel algorithms, such as grid permutations and scan...the pose space. It also uses a set of virtual processors to represent an orthogonal projection grid, and projections of the six-dimensional pose space

  18. Hypercat - Hypercube of Clumpy AGN Tori

    NASA Astrophysics Data System (ADS)

    Nikutta, Robert; Lopez-Rodriguez, Enrique; Ichikawa, Kohei; Levenson, Nancy; Packham, Christopher C.

    2017-06-01

    Dusty tori surrounding the central engines of Active Galactic Nuclei (AGN) are required by the Unification Paradigm, and are supported by many observations, e.g. a variable nuclear absorber (sometimes Compton-thick) in X-rays, reverberation mapping in optical/UV, hot dust emission and SED shapes in NIR/MIR, and molecular and cool-dust tori observed with ALMA in the sub-mm. While models of AGN torus SEDs have been developed and utilized for a long time, the study of the resolved emission morphology (brightness maps) has so far been under-appreciated, presumably because resolved observations of the central parsec in AGN have become possible only very recently. Currently, only NIR+MIR interferometry is capable of resolving the nuclear dust emission (but not of producing images, until MATISSE comes online). Furthermore, MIR interferometry has also delivered puzzling results, e.g. that in some resolved sources the light emanates preferentially from polar directions above the "torus" system, and not from the equatorial plane, where most of the dust is located. We are preparing the release of a panchromatic, fully interpolable hypercube of brightness maps and projected dust images for a large number of CLUMPY torus models (Nenkova+2008), that will help facilitate studies of resolved AGN emission and dust morphologies. Together with the cube we will release a comprehensive set of open-source tools (Python) that will enable researchers to work efficiently with this large hypercube:
    * easy sub-cube selection + memory-mapping (mitigating the too-big-for-RAM problem)
    * multi-dim image interpolation (get an image at any wavelength & model parameter combination)
    * simulation of observations with telescopes (compute/provide + apply a PSF) and interferometers (get visibilities)
    * analysis of images with respect to the power contained at all scales and orientations (via 2D steerable wavelets), addressing the seemingly puzzling results mentioned above
    A series of papers is in preparation, aiming at solving the puzzles, and at making predictions about the resolvability of all nearby AGN tori with any combination of current and future instruments (e.g. VLTI+MATISSE, TMT+MICHI, GMT, ELT, JWST, ALMA).
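
    The two access patterns promised for the tools, memory-mapping a too-big-for-RAM cube and interpolating an image at an arbitrary wavelength, can be sketched minimally; the file name, array shape, and wavelength grid below are hypothetical placeholders, not the actual Hypercat format:

        # Illustrative access pattern for a brightness-map hypercube:
        # memory-map the array, then linearly blend the two bracketing
        # wavelength slices to get an image at an arbitrary wavelength.
        import numpy as np

        wavelengths = np.linspace(1.0, 30.0, 60)       # microns (hypothetical grid)
        shape = (60, 241, 241)                         # (n_wave, ny, nx), hypothetical
        # assumes hypercube.dat already exists on disk
        cube = np.memmap('hypercube.dat', dtype=np.float32, mode='r', shape=shape)

        def image_at(lam: float) -> np.ndarray:
            i = np.clip(np.searchsorted(wavelengths, lam), 1, len(wavelengths) - 1)
            w = (lam - wavelengths[i - 1]) / (wavelengths[i] - wavelengths[i - 1])
            return (1.0 - w) * cube[i - 1] + w * cube[i]   # only 2 slices hit the disk

        img = image_at(9.7)    # e.g. near the silicate feature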

  19. Latin and the World of Work: Career Education and Foreign Languages.

    ERIC Educational Resources Information Center

    Kennedy, Dora F.; And Others

    This curriculum guide for high school Latin courses emphasizes the usefulness of a knowledge of Latin for career preparation. As a supplement to standard Latin textbooks, a variety of classroom material is offered. The instructional approach revolves around the relation of English and Latin vocabulary acquisition and etymological knowledge on the…

  20. How Do You Say "MOO" in Latin? Assessing Student Learning and Motivation in Beginning Latin.

    ERIC Educational Resources Information Center

    Gruber-Miller, John; Benton, Cindy

    2001-01-01

    Assesses the value of VRoma for Latin language learning. In particular, discusses three exercises that were developed that combine Latin language and Roman culture in order to help students reinforce their Latin skills and gain a more in-depth understanding of ancient Roman society. (Author/VWL)

  1. Latin and Magic Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2005-01-01

    Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…
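
    One classical construction this note alludes to is the cyclic method: put (i + j) mod n in cell (i, j). The sketch below also shows the well-known variant (2i + j) mod n, which yields a diagonal Latin square (both diagonals distinct) whenever n is coprime to both 2 and 3:

        def cyclic_latin_square(n: int, row_step: int = 1) -> list[list[int]]:
            """Cell (i, j) gets (row_step*i + j) mod n; row_step=1 is the classic
            cyclic Latin square, row_step=2 yields a diagonal Latin square when
            n is coprime to both 2 and 3 (e.g. n = 5, 7, 11, ...)."""
            return [[(row_step * i + j) % n for j in range(n)] for i in range(n)]

        for row in cyclic_latin_square(5, row_step=2):
            print(row)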

  2. Military Medical Care: Questions and Answers

    DTIC Science & Technology

    2009-01-29

    Tricare Latin America and Canada Area covering Central and South America, the Caribbean Basin, Canada, Puerto Rico and the Virgin Islands. • Tricare...mail order program is designed to fill long-term prescriptions to treat conditions such as high blood pressure, asthma, or diabetes; it does not

  3. Work, malaise, and well-being in Spanish and Latin-American doctors

    PubMed Central

    Ochoa, Paola; Blanch, Josep M

    2016-01-01

    OBJECTIVE To analyze the relations between the meanings of working and the levels of doctors' work well-being in the context of their working conditions. METHOD The research combined the qualitative methodology of textual analysis and the quantitative one of correspondence factor analysis. A convenience, intentional, and stratified sample composed of 305 Spanish and Latin American doctors completed an extensive questionnaire on the topics of the research. RESULTS The general meaning of working for the group located in the quartile of malaise included perceptions of discomfort, frustration, and exhaustion. However, those showing higher levels of well-being, located on the opposite quartile, associated their working experience with good conditions and the development of their professional and personal competences. CONCLUSIONS The study provides empirical evidence of the relationship between contextual factors and the meanings of working for participants with higher levels of malaise, and of the importance granted both to intrinsic and extrinsic factors by those who scored highest on well-being. PMID:27191157

  4. [Child survival: magnitude of the problem in Latin America].

    PubMed

    Behm-Rosas, H

    1988-01-01

    This document summarizes the most relevant epidemiologic characteristics of infant and child mortality in Latin America. The gap in infant mortality rates between Latin America and the developed countries is wide and appears to be increasing. In the developed countries, 980 of each 1000 infants survive to the age of 5, but only 900 did so in Latin America in 1975-80. Infant mortality declined in Latin America between 1950-55 and 1980-85 from 128 to 63/1000 live births, with a slight increase in the rate of decline over the past decade. The great differences in social and economic development within Latin America are reflected in mortality rates before the age of 5 that also vary widely, from 34/1000 in Cuba to 221/1000 in Bolivia in 1975-80. Latin American countries with moderate risk of early childhood mortality are led by Cuba and Costa Rica, with rates of 34-35/1000. The 2 countries are very different politically but both have implemented vigorous social policies that benefitted their entire populations. Both had sustained mortality declines between 1955-80. Argentina, Chile, Uruguay, Venezuela, and Panama had mortality rates of 46-56/1000. Within the region, 16.4% of births and 8% of deaths in children under 5 are estimated to occur in these 7 countries. The countries of very high mortality include the least developed Caribbean, Central American, and Andean countries: Haiti, Guatemala, Honduras, Nicaragua, Bolivia, and Peru. 3 of these countries contain large indigenous populations that have largely remained outside the development process. Their average rate of infant mortality is 162/1000. 14.7% of births and 27.0% of deaths in children under 5 in Latin America occur in these 6 countries. The intermediate group contains the 2 most populated countries of the region, Brazil and Mexico. The risk of death under age 5 ranges from 74 to 114/1000 and averages 99/1000. The 7 countries account for 68.9% of births and 68% of deaths in children under 5. The rate of decline in infant mortality in Latin America is on the whole moderate, with no sign of acceleration. Progress is slowest in the countries with the highest rates. Available data clearly demonstrate excess mortality in rural areas, especially when compared to capital cities, but the degree of disparity varies among countries. In countries with high mortality and a large rural population, sustained decline in national mortality rates will require rural populations to be incorporated in the decline. In 1985, about 40% of Latin American children under 5 were believed to be in rural areas, but the proportion rural was 57% in the countries with highest mortality. Statistical information on causes of death in children under 5 is most deficient in exactly the areas where it is most needed. Most deaths are clearly due to infectious diseases and conditions preventable by vaccination. Social inequalities in survival of young children have been extensively described as a function of paternal occupational status, maternal education, and geographic factors. More effective policies are needed to ensure a more equitable distribution of wealth that will make possible a major improvement in child survival.

  5. Star Trek with Latin. Teacher's Guide. Tentative Edition.

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph; And Others

    The purpose of this guide is to assist Latin and English teachers with some background in Latin to expand the English vocabulary and reading skills of students through the study of Latin roots, prefixes, and suffixes. The introductory material in the guide provides general notes on the teaching of Latin in the Philadelphia School District,…

  6. On the Construction of Latin Squares Counterbalanced for Immediate Sequential Effects.

    ERIC Educational Resources Information Center

    Houston, Tom R., Jr.

    This report is one of a series describing new developments in the area of research methodology. It deals with Latin squares as a control for progressive and adjacency effects in experimental designs. The history of Latin squares is also reviewed, and several algorithms for the construction of Latin and Greco-Latin squares are proposed. The report…
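
    A well-known algorithm of the kind the report surveys is the Williams construction: for even n it produces a Latin square in which every condition immediately follows every other condition exactly once, counterbalancing immediate sequential effects. A sketch:

        def williams_square(n: int) -> list[list[int]]:
            """Williams design: row 0 interleaves 0, 1, n-1, 2, n-2, ...;
            each later row adds 1 (mod n). For even n, every ordered pair of
            adjacent conditions occurs exactly once across the square."""
            first, k = [0], 1
            while len(first) < n:
                first.append(k)
                if len(first) < n:
                    first.append(n - k)
                k += 1
            return [[(c + i) % n for c in first] for i in range(n)]

        for row in williams_square(4):
            print(row)      # [0, 1, 3, 2], [1, 2, 0, 3], [2, 3, 1, 0], [3, 0, 2, 1]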

  7. Optimal design of a piezoelectric transducer for exciting guided wave ultrasound in rails

    NASA Astrophysics Data System (ADS)

    Ramatlo, Dineo A.; Wilke, Daniel N.; Loveday, Philip W.

    2017-02-01

    An existing Ultrasonic Broken Rail Detection System installed in South Africa on a heavy duty railway line is currently being upgraded to include defect detection and location. To accomplish this, an ultrasonic piezoelectric transducer to strongly excite a guided wave mode with energy concentrated in the web (web mode) of a rail is required. A previous study demonstrated that the recently developed SAFE-3D (Semi-Analytical Finite Element - 3 Dimensional) method can effectively predict the guided waves excited by a resonant piezoelectric transducer. In this study, the SAFE-3D model is used in the design optimization of a rail web transducer. A bound-constrained optimization problem was formulated to maximize the energy transmitted by the transducer in the web mode when driven by a pre-defined excitation signal. Dimensions of the transducer components were selected as the three design variables. A Latin hypercube sampled design of experiments that required a total of 500 SAFE-3D analyses in the design space was employed in a response surface-based optimization approach. The Nelder-Mead optimization algorithm was then used to find an optimal transducer design on the constructed response surface. The radial basis function response surface was first verified by comparing a number of predicted responses against the computed SAFE-3D responses. The performance of the optimal transducer predicted by the optimization algorithm on the response surface was also verified to be sufficiently accurate using SAFE-3D. The computational advantages of SAFE-3D in optimal transducer design are noteworthy as more than 500 analyses were performed. The optimal design was then manufactured and experimental measurements were used to validate the predicted performance. The adopted design method has demonstrated the capability to automate the design of transducers for a particular rail cross-section and frequency range.
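
    The workflow described (Latin hypercube DoE, radial basis response surface, Nelder-Mead on the surrogate) can be sketched with SciPy; a cheap analytic function stands in for a SAFE-3D analysis, and the three unit-scaled design variables are placeholders:

        # Sketch of the paper's workflow: 500-point Latin hypercube DoE,
        # RBF response surface, then Nelder-Mead on the surrogate.
        import numpy as np
        from scipy.stats import qmc
        from scipy.interpolate import RBFInterpolator
        from scipy.optimize import minimize

        def expensive_model(x):                  # stand-in for one SAFE-3D run
            return np.sum((x - 0.3) ** 2, axis=-1) - np.prod(np.sin(3 * x), axis=-1)

        d = 3                                    # three transducer dimensions
        X = qmc.LatinHypercube(d=d, seed=1).random(n=500)   # DoE in [0, 1]^3
        y = expensive_model(X)

        surrogate = RBFInterpolator(X, y)        # radial basis response surface

        res = minimize(lambda x: surrogate(x[None, :])[0],
                       x0=np.full(d, 0.5),
                       method='Nelder-Mead',
                       bounds=[(0.0, 1.0)] * d)  # bounded N-M needs SciPy >= 1.7
        print(res.x, res.fun)    # candidate optimum, to be re-verified with SAFE-3D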

  8. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality, schedule and cost-driven strategies demonstrates that the higher cost and effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning about schedule-driven engineers cutting corners on inspections and unit testing is grounded in the case-study evidence and in Austin's agency model of the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality for schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that the use of a multi-player dynamic Nash bargaining game provides a solution for our observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. A closing note argues that this multi-player dynamic Nash bargaining game also solves Freeman Dyson's problem of finding a way to label systems as good or bad.

  9. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.

  10. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
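
    The propagation recipe itself is compact. The package is written in R, so the following Python sketch is only a schematic of the idea: draw a Latin hypercube sample, transform it to the input distributions, push each realization through the model, and summarize the output. All distributions and the toy model are hypothetical:

        # Schematic of Monte Carlo uncertainty propagation with LHS.
        import numpy as np
        from scipy.stats import qmc, norm

        def model(rain, k):                 # toy environmental model
            return k * np.sqrt(np.maximum(rain, 0.0))

        # Uncertain inputs: rain ~ N(80, 15), k ~ N(0.6, 0.05)  (hypothetical)
        u = qmc.LatinHypercube(d=2, seed=42).random(n=2000)   # uniforms in (0, 1)
        rain = norm.ppf(u[:, 0], loc=80.0, scale=15.0)        # transform marginals
        k = norm.ppf(u[:, 1], loc=0.6, scale=0.05)

        runoff = model(rain, k)                               # propagate
        print(np.percentile(runoff, [2.5, 50, 97.5]))         # summarize output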

  11. Development and Implementation of the DTOPLATS-MP land surface model over the Continental US at 30 meters

    NASA Astrophysics Data System (ADS)

    Chaney, N.; Wood, E. F.

    2014-12-01

    The increasing accessibility of high-resolution land data (< 100 m) and high-performance computing allow improved parameterizations of subgrid hydrologic processes in macroscale land surface models. Continental-scale, fully distributed modeling at these spatial scales is possible; however, its practicality for operational use is still unknown due to uncertainties in input data, model parameters, and storage requirements. To address these concerns, we propose a modeling framework that provides the spatial detail of a fully distributed model yet maintains the benefits of a semi-distributed model. In this presentation we will introduce DTOPLATS-MP, a coupling between the NOAH-MP land surface model and the Dynamic TOPMODEL hydrologic model. This new model captures a catchment's spatial heterogeneity by clustering high-resolution land datasets (soil, topography, and land cover) into hundreds of hydrologically similar units (HSUs). A prior DEM analysis defines the connections between each HSU. At each time step, the 1D land surface model updates each HSU; the HSUs then interact laterally via the subsurface and surface. When compared to the fully distributed form of the model, this framework allows a significant decrease in computation and storage while providing most of the same information and enabling parameter transferability. As a proof of concept, we will show how this new modeling framework can be run over CONUS at a 30-meter spatial resolution. For each catchment in the WBD HUC-12 dataset, the model is run between 2002 and 2012 using available high-resolution continental-scale land and meteorological datasets over CONUS (dSSURGO, NLCD, NED, and NCEP Stage IV). For each catchment, the model is run with 1000 model parameter sets obtained from a Latin hypercube sample. This exercise will illustrate the feasibility of running the model operationally at continental scales while accounting for model parameter uncertainty.
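
    The clustering step behind the HSUs can be illustrated with a toy example; scikit-learn's KMeans stands in for the actual clustering method, and random rasters stand in for dSSURGO, NLCD, and NED:

        # Toy sketch of deriving hydrologically similar units (HSUs):
        # cluster per-pixel land attributes, then model one column per cluster.
        import numpy as np
        from sklearn.cluster import KMeans

        ny, nx = 200, 200
        rng = np.random.default_rng(7)
        slope = rng.random((ny, nx))        # stand-ins for NED-derived slope,
        soil_perm = rng.random((ny, nx))    # dSSURGO permeability,
        veg_frac = rng.random((ny, nx))     # NLCD vegetation fraction

        features = np.column_stack([a.ravel() for a in (slope, soil_perm, veg_frac)])
        hsu_id = KMeans(n_clusters=300, n_init=10, random_state=0) \
                     .fit_predict(features).reshape(ny, nx)

        # 40,000 pixels now map onto 300 HSUs; the land surface model runs
        # once per HSU instead of once per pixel.
        print(len(np.unique(hsu_id)))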

  12. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five year period. The goal of this collaboration endeavor is to assist KAERI to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. This report was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined in a non-perfect correlation manner. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 Tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. The results are summarized in Section 5 and Appendices G through I.
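
    Fragility results of this kind are conventionally summarized as a lognormal CDF in the ground-motion parameter. A minimal sketch of fitting one to simulated capacities (dummy numbers, not the report's values):

        # Minimal lognormal fragility fit: given simulated seismic capacities
        # (e.g. from Latin hypercube samples over degradation states), estimate
        # the median Am and log-standard deviation beta, then P(failure | a).
        import numpy as np
        from scipy.stats import norm

        capacities = np.array([0.62, 0.55, 0.70, 0.48, 0.66, 0.59, 0.51, 0.64])  # g, dummy
        ln_c = np.log(capacities)
        Am, beta = np.exp(ln_c.mean()), ln_c.std(ddof=1)

        def fragility(a):
            """P(capacity < demand a) for a lognormal capacity model."""
            return norm.cdf(np.log(a / Am) / beta)

        for a in (0.3, 0.5, 0.7):
            print(f"P(failure | PGA={a:.1f} g) = {fragility(a):.3f}")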

  13. Cystic Echinococcosis in the Province of Álava, North Spain: The Monetary Burden of a Disease No Longer under Surveillance

    PubMed Central

    Carabin, Hélène; Balsera-Rodríguez, Francisco J.; Rebollar-Sáenz, José; Benner, Christine T.; Benito, Aitziber; Fernández-Crespo, Juan C.; Carmena, David

    2014-01-01

    Cystic echinococcosis (CE) is endemic in Spain but has been considered non-endemic in the province of Álava, Northern Spain, since 1997. However, Álava is surrounded by autonomous regions with some of the highest CE prevalence proportions in the nation, casting doubt on the current classification. The purpose of this study is to estimate the frequency of CE in humans and animals and to use this data to determine the societal cost incurred due to CE in the Álava population in 2005. We have identified epidemiological and clinical data from surveillance and hospital records, prevalence data in intermediate (sheep and cattle) host species from abattoir records, and economic data from national and regional official institutions. Direct costs (diagnosis, treatment, medical care in humans and condemnation of offal in livestock species) and indirect costs (productivity losses in humans and reduction in growth, fecundity and milk production in livestock) were modelled using the Latin hypercube method under five different scenarios reflecting different assumptions regarding the prevalence of asymptomatic cases and associated productivity losses in humans. A total of 13 human CE cases were reported in 2005. The median total cost (95% credible interval) of CE in humans and animals in Álava in 2005 was estimated to range between €61,864 (95%CI%: €47,304–€76,590) and €360,466 (95%CI: €76,424–€752,469), with human-associated losses ranging from 57% to 93% of the total losses, depending on the scenario used. Our data provide evidence that CE is still very much present in Álava and imposes a substantial cost on the province every year. We expect this information to prove valuable for public health agencies and policy-makers, as it seems advisable to reinstate appropriate surveillance and monitoring systems and to implement effective control measures that avoid the spread and recrudescence of the disease. PMID:25102173

  14. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequence identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
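
    The reliability-physics model reduces to a single probability: the HEP is the chance that the operators' performance time exceeds the phenomenological time available. A Monte Carlo sketch with assumed lognormal fits (illustrative only; the thesis derived its distributions from simulation and interviews):

        # Reliability-physics HEP = P(performance time > phenomenological time),
        # estimated by Monte Carlo with assumed (illustrative) lognormal fits.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        t_phen = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)  # minutes available
        t_perf = rng.lognormal(mean=np.log(18.0), sigma=0.6, size=n)  # minutes needed

        hep = np.mean(t_perf > t_phen)
        print(f"estimated HEP for core relocation: {hep:.4f}")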

  15. Constructing rigorous and broad biosurveillance networks for detecting emerging zoonotic outbreaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Mac; Moore, Leslie; McMahon, Benjamin

    Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios, and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk for infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short- and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results with a known outbreak, using spatial analysis of all the simulation runs to show the progression matched closely with the observed location of the farms infected in the 2006-2007 epidemic.
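
    The ensemble-ranking idea can be sketched compactly: simulate many outbreaks on a contact network and rank regions by how often they become infected. The chain-binomial SEIR below is a toy stand-in for the paper's multi-scale, multi-host model; every parameter is a dummy:

        # Compact chain-binomial SEIR on a random contact network, run as an
        # ensemble; regions are ranked by how often they become infected.
        import numpy as np

        rng = np.random.default_rng(0)
        N = 50                                    # regions/nodes
        A = rng.random((N, N)) < 0.08             # random contact network
        A = np.triu(A, 1)
        A = (A | A.T).astype(int)                 # symmetric 0/1 adjacency

        def run_once(beta=0.06, sigma=0.3, gamma=0.2, steps=200):
            S = np.ones(N, bool); E = np.zeros(N, bool); I = np.zeros(N, bool)
            seed = rng.integers(N)
            S[seed] = False; I[seed] = True
            ever = I.copy()
            for _ in range(steps):
                pressure = A @ I                          # infectious neighbours
                p_inf = 1.0 - (1.0 - beta) ** pressure
                new_E = S & (rng.random(N) < p_inf)
                new_I = E & (rng.random(N) < sigma)
                rec = I & (rng.random(N) < gamma)
                S &= ~new_E
                E = (E | new_E) & ~new_I
                I = (I | new_I) & ~rec
                ever |= I
            return ever

        hits = sum(run_once() for _ in range(500))        # ensemble of outbreaks
        rank = np.argsort(-hits)                          # highest-risk regions first
        print(rank[:10], hits[rank[:10]])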

  16. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect, and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum samples needed are 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
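
    The Morris-style screening singled out as most efficient rests on "elementary effects": perturb one factor at a time from a set of base points and compare the mean magnitude and spread of the finite differences. A simplified one-at-a-time sketch (not the PSUADE implementation; the toy model is hypothetical):

        # Simplified Morris-style (one-at-a-time) screening: elementary effects
        # from r base points; mu* ranks importance, sigma flags interactions.
        import numpy as np

        def model(x):                      # toy stand-in for SAC-SMA
            return 4 * x[0] + x[1] ** 2 + 0.1 * x[2] + 2 * x[0] * x[1]

        d, r, delta = 3, 40, 0.1
        rng = np.random.default_rng(5)
        ee = np.zeros((r, d))
        for t in range(r):
            x0 = rng.uniform(0, 1 - delta, size=d)   # keep x0 + delta inside [0, 1]
            y0 = model(x0)
            for i in range(d):
                x1 = x0.copy()
                x1[i] += delta
                ee[t, i] = (model(x1) - y0) / delta

        mu_star, sigma = np.abs(ee).mean(axis=0), ee.std(axis=0, ddof=1)
        for i in range(d):
            print(f"x{i}: mu* = {mu_star[i]:.2f}, sigma = {sigma[i]:.2f}")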

  17. Probabilistic seismic history matching using binary images

    NASA Astrophysics Data System (ADS)

    Davolio, Alessandra; Schiozer, Denis Jose

    2018-02-01

    Currently, the goal of history-matching procedures is not only to provide a model matching any observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin Hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included in history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude or pressure, and saturation. In any case, seismic inversions and/or modeling are required, which can be time consuming. An alternative to avoid these procedures is using binary images in SHM as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS. The methodology is very flexible and successfully utilized the addition of binary images for seismic objective functions. Results proved the good convergence of the method in a few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well matching quality. The second case provided models presenting pore pressure changes according to the expected dynamic behavior (pressurized zones) observed on 4DS data. The use of binary images in SHM is relatively new with few examples in the literature. This work enriches this discussion by presenting a new application to match pressure in a reservoir segment with complex pressure behavior.
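
    The binary-image objective is easy to state: threshold the observed and simulated anomaly maps and score the overlap of the resulting shapes. A sketch using Jaccard (intersection-over-union) dissimilarity; the paper's exact objective function may differ:

        # Binary-image mismatch for seismic history matching: threshold the
        # 4D anomaly maps, then compare shapes rather than physical values.
        import numpy as np

        def binarize(anomaly_map, threshold):
            return anomaly_map > threshold

        def jaccard_mismatch(obs_bin, sim_bin):
            """1 - |intersection| / |union|; 0 means identical anomaly shapes."""
            inter = np.logical_and(obs_bin, sim_bin).sum()
            union = np.logical_or(obs_bin, sim_bin).sum()
            return 1.0 - inter / union if union else 0.0

        rng = np.random.default_rng(1)
        obs = binarize(rng.random((100, 100)), 0.8)   # observed pressurized zone (dummy)
        sim = binarize(rng.random((100, 100)), 0.8)   # simulated counterpart (dummy)
        print(jaccard_mismatch(obs, sim))             # objective to minimize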

  18. Assessing the Impact of Climate Change on Extreme Streamflow and Reservoir Operation for Nuuanu Watershed, Oahu, Hawaii

    NASA Astrophysics Data System (ADS)

    Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.

    2016-12-01

    Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide as climate change influences the water cycle. This is particularly critical for tropical islands, where local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume, and outflow for the Nuuanu watershed using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated against daily streamflow with the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified with Nash-Sutcliffe Efficiency values in the acceptable range of 0.58 to 0.88, and more than 90% of observations were bracketed within the 95% model prediction uncertainty interval for both the calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume, and outflow was assessed under the Representative Concentration Pathway 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows, ranging from -44% to 20% and from -50% to -2%, respectively, compared to the baseline. Consequently, the amount of water stored in Nuuanu reservoir is projected to decrease by up to 27%, while the corresponding outflow rates are expected to decrease by up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to rainfall change compared to temperature and solar radiation changes. It is concluded that the decrease in extreme low and peak flows can have serious consequences, such as flooding and drought, with detrimental effects on riparian ecological functioning. This study's results are expected to aid in reservoir operation as well as in identifying appropriate climate change adaptation strategies.
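
    As a rough sketch of the LH-OAT idea referenced above (a Latin hypercube base sample with one-at-a-time perturbations around each point), assuming a toy response in place of SWAT and a simplified relative partial effect rather than van Griensven's exact formula:

```python
import numpy as np

def lhs(n, d, rng):
    """Plain Latin hypercube sample on the unit cube."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])   # decouple the columns, keep stratification
    return u

def lh_oat(model, d, n_points=20, frac=0.05, seed=0):
    """LH-OAT screening: average relative partial effect per parameter."""
    rng = np.random.default_rng(seed)
    X = lhs(n_points, d, rng)
    effects = np.zeros(d)
    for x in X:
        y0 = model(x)
        for j in range(d):
            xp = x.copy()
            xp[j] = min(x[j] * (1 + frac), 1.0)   # one-at-a-time perturbation
            effects[j] += abs((model(xp) - y0) / (y0 + 1e-12))
    return effects / n_points

# Toy response standing in for a SWAT streamflow statistic.
model = lambda x: 5 * x[0] + x[1] ** 2 + 0.01 * x[2]
print(lh_oat(model, d=3).round(4))   # large values flag sensitive parameters
```

    The LHS base points spread the one-at-a-time tests across the parameter space, which is what distinguishes LH-OAT from screening around a single nominal point.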

  19. Assessing the impact of intervention strategies against Taenia solium cysticercosis using the EPICYST transmission model.

    PubMed

    Winskill, Peter; Harrison, Wendy E; French, Michael D; Dixon, Matthew A; Abela-Ridder, Bernadette; Basáñez, María-Gloria

    2017-02-09

    The pork tapeworm, Taenia solium, and the associated human infections taeniasis, cysticercosis, and neurocysticercosis are serious public health problems, especially in developing countries. The World Health Organization (WHO) has set goals of having a validated strategy for the control and elimination of T. solium taeniasis/cysticercosis by 2015 and interventions scaled up in selected countries by 2020. Timely achievement of these internationally endorsed targets requires that the relative benefits and effectiveness of potential interventions be explored rigorously within a quantitative framework. A deterministic, compartmental transmission model (EPICYST) was developed to capture the dynamics of the taeniasis/cysticercosis disease system in the human and pig hosts. Cysticercosis prevalence in humans, an outcome of high epidemiological and clinical importance, was explicitly modelled. A next-generation matrix approach was used to derive an expression for the basic reproduction number, R0. A full sensitivity analysis was performed using a methodology based on Latin hypercube sampling and the partial rank correlation coefficient (LHS-PRCC) index. EPICYST outputs indicate that chemotherapeutic intervention targeted at humans or pigs would be highly effective at reducing taeniasis and cysticercosis prevalence when applied singly, with annual chemotherapy of humans and pigs resulting in 94% and 74% of human cysticercosis cases averted, respectively. Improved sanitation, meat inspection, and animal husbandry are less effective but are still able to reduce prevalence singly or in combination. The value of R0 for taeniasis was estimated at 1.4 (95% credible interval: 0.5-3.6). Human- and pig-targeted drug-focussed interventions appear to be the most efficacious approach among the options currently available. The model presented is a forward step towards developing an informed control and elimination strategy for cysticercosis. Together with its validation against field data, EPICYST will be a valuable tool in helping to reach the WHO goals and in conducting economic evaluations of interventions in varying epidemiological settings.
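
    A minimal sketch of the LHS-PRCC methodology named above, with a toy linear model standing in for EPICYST: inputs and output are rank-transformed, the other parameters are regressed out, and the residuals are correlated.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y."""
    R = np.column_stack([rankdata(c) for c in X.T])
    ry = rankdata(y)
    d = X.shape[1]
    out = np.empty(d)
    for i in range(d):
        others = np.column_stack([np.ones(len(y)), np.delete(R, i, axis=1)])
        # residuals of X_i and y after removing the other parameters
        res_x = R[:, i] - others @ np.linalg.lstsq(others, R[:, i], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out[i] = np.corrcoef(res_x, res_y)[0, 1]
    return out

# LHS design over 3 hypothetical transmission parameters on (0, 1).
rng = np.random.default_rng(0)
n, d = 200, 3
X = (rng.permuted(np.tile(np.arange(n), (d, 1)).T, axis=0) + rng.random((n, d))) / n
y = 2 * X[:, 0] - 3 * X[:, 1] + 0.1 * rng.standard_normal(n)   # toy prevalence
print(prcc(X, y).round(2))   # expect roughly [+1, -1, 0]
```

    Because PRCC works on ranks, it captures monotonic but nonlinear parameter-output relationships, which is why it is a standard companion to LHS in epidemiological models.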

  20. Performance Improvement of a Return Channel in a Multistage Centrifugal Compressor Using Multiobjective Optimization.

    PubMed

    Nishida, Yoshifumi; Kobayashi, Hiromi; Nishida, Hideo; Sugimura, Kazuyuki

    2013-05-01

    The effect of the design parameters of a return channel on the performance of a multistage centrifugal compressor was numerically investigated, and the shape of the return channel was optimized using a multiobjective optimization method based on a genetic algorithm to improve the performance of the centrifugal compressor. The results of a sensitivity analysis using Latin hypercube sampling suggested that the inlet-to-outlet area ratio of the return vane affected the total pressure loss in the return channel, and that the inlet-to-outlet radius ratio of the return vane affected the outlet flow angle from the return vane. Moreover, this analysis suggested that the number of return vanes affected both the loss and the flow angle at the outlet. As a result of the optimization, the number of return vanes was increased from 14 to 22, the area ratio was decreased from 0.71 to 0.66, and the radius ratio was decreased from 2.1 to 2.0. Performance tests on a centrifugal compressor with the two return channels (the original design and the optimized design) were carried out using a two-stage test apparatus. The measured flow distribution exhibited a swirl flow in the center region and a reversed swirl flow near the hub and shroud sides. The exit flow of the optimized design was more uniform than that of the original design. For the optimized design, the overall two-stage efficiency and pressure coefficient were increased by 0.7% and 1.5%, respectively; the second-stage efficiency and pressure coefficient were increased by 1.0% and 3.2%, respectively. It is considered that the increase in second-stage efficiency was caused by the increased uniformity of the flow, and the rise in the pressure coefficient by a decrease in the residual swirl flow. It was thus concluded from the numerical and experimental results that the optimized return channel improved the performance of the multistage centrifugal compressor.

  1. A clinically parameterized mathematical model of Shigella immunity to inform vaccine design

    PubMed Central

    Wahid, Rezwanul; Toapanta, Franklin R.; Simon, Jakub K.; Sztein, Marcelo B.

    2018-01-01

    We refine and clinically parameterize a mathematical model of the humoral immune response against Shigella, a diarrheal bacterium that infects 80-165 million people and kills an estimated 600,000 people worldwide each year. Using Latin hypercube sampling and Monte Carlo simulations for parameter estimation, we fit our model to human immune data from two Shigella EcSf2a-2 vaccine trials and a rechallenge study in which antibody and B-cell responses against Shigella's lipopolysaccharide (LPS) and O-membrane proteins (OMP) were recorded. The clinically grounded model is used to mathematically investigate which key immune mechanisms and bacterial targets confer immunity against Shigella and to predict which humoral immune components should be elicited to create a protective vaccine against Shigella. The model offers insight into why the EcSf2a-2 vaccine had low efficacy and demonstrates that, at a group level, a humoral immune response induced by EcSf2a-2 vaccine or wild-type challenge against Shigella's LPS or OMP does not appear sufficient for protection. That is, the model predicts an uncontrolled infection of gut epithelial cells that is present across all best-fit model parameterizations when fit to EcSf2a-2 vaccine or wild-type challenge data. Using sensitivity analysis, we explore which model parameter values must be altered to prevent the destructive epithelial invasion by Shigella bacteria and identify four key parameter groups as potential vaccine targets or immune correlates: 1) the rate at which Shigella migrates into the lamina propria or epithelium, 2) the rate at which memory B cells (BM) differentiate into antibody-secreting cells (ASC), 3) the rate at which antibodies are produced by activated ASC, and 4) the Shigella-specific BM carrying capacity. This paper underscores the need for a multifaceted approach in ongoing efforts to design an effective Shigella vaccine. PMID:29304144

  2. A clinically parameterized mathematical model of Shigella immunity to inform vaccine design.

    PubMed

    Davis, Courtney L; Wahid, Rezwanul; Toapanta, Franklin R; Simon, Jakub K; Sztein, Marcelo B

    2018-01-01

    We refine and clinically parameterize a mathematical model of the humoral immune response against Shigella, a diarrheal bacterium that infects 80-165 million people and kills an estimated 600,000 people worldwide each year. Using Latin hypercube sampling and Monte Carlo simulations for parameter estimation, we fit our model to human immune data from two Shigella EcSf2a-2 vaccine trials and a rechallenge study in which antibody and B-cell responses against Shigella's lipopolysaccharide (LPS) and O-membrane proteins (OMP) were recorded. The clinically grounded model is used to mathematically investigate which key immune mechanisms and bacterial targets confer immunity against Shigella and to predict which humoral immune components should be elicited to create a protective vaccine against Shigella. The model offers insight into why the EcSf2a-2 vaccine had low efficacy and demonstrates that, at a group level, a humoral immune response induced by EcSf2a-2 vaccine or wild-type challenge against Shigella's LPS or OMP does not appear sufficient for protection. That is, the model predicts an uncontrolled infection of gut epithelial cells that is present across all best-fit model parameterizations when fit to EcSf2a-2 vaccine or wild-type challenge data. Using sensitivity analysis, we explore which model parameter values must be altered to prevent the destructive epithelial invasion by Shigella bacteria and identify four key parameter groups as potential vaccine targets or immune correlates: 1) the rate at which Shigella migrates into the lamina propria or epithelium, 2) the rate at which memory B cells (BM) differentiate into antibody-secreting cells (ASC), 3) the rate at which antibodies are produced by activated ASC, and 4) the Shigella-specific BM carrying capacity. This paper underscores the need for a multifaceted approach in ongoing efforts to design an effective Shigella vaccine.
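
    The published model and trial data are not reproduced here; as a generic sketch of the LHS-plus-Monte-Carlo parameter-estimation approach named above, the snippet below draws Latin hypercube parameter sets for a hypothetical two-compartment antibody model, scores each against synthetic observations, and keeps the best-fitting sets.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical stand-in dynamics: antibody A produced by an ASC pool S
# that wanes over time (not the published Shigella model).
def simulate(params, t_obs):
    k_prod, k_decay, s_decay = params
    def rhs(t, y):
        S, A = y
        return [-s_decay * S, k_prod * S - k_decay * A]
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), [1.0, 0.0], t_eval=t_obs)
    return sol.y[1]

def lhs(n, bounds, rng):
    """Latin hypercube sample scaled to the given parameter bounds."""
    d = len(bounds)
    u = (rng.permuted(np.tile(np.arange(n), (d, 1)).T, axis=0)
         + rng.random((n, d))) / n
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

rng = np.random.default_rng(0)
t_obs = np.array([0., 7., 14., 28., 56.])                 # days post-challenge
truth = np.array([2.0, 0.1, 0.05])
data = simulate(truth, t_obs) + 0.02 * rng.standard_normal(len(t_obs))

samples = lhs(500, [(0.1, 5.0), (0.01, 0.5), (0.01, 0.2)], rng)
sse = np.array([np.sum((simulate(p, t_obs) - data) ** 2) for p in samples])
best = samples[np.argsort(sse)[:10]]          # retain the best-fit sets
print("mean of top-10 best-fit parameter sets:", best.mean(axis=0).round(3))
```

    Keeping an ensemble of best-fit parameterizations, rather than a single optimum, is what lets the authors make statements that hold "across all best-fit model parameterizations".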

  3. Sensitivity analysis of some critical factors affecting simulated intrusion volumes during a low pressure transient event in a full-scale water distribution system.

    PubMed

    Ebacher, G; Besner, M C; Clément, B; Prévost, M

    2012-09-01

    Intrusion events caused by transient low pressures may result in the contamination of a water distribution system (DS). This work aims at estimating the range of potential intrusion volumes that could result from a real downsurge event caused by a momentary pump shutdown. A model calibrated with transient low-pressure recordings was used to simulate total intrusion volumes through leakage orifices and submerged air vacuum valves (AVVs). Four critical factors influencing intrusion volumes were varied: the external head of (untreated) water on leakage orifices, the external head of (untreated) water on submerged air vacuum valves, the leakage rate, and the diameter of the AVVs' outlet orifice (represented by a multiplicative factor). Leakage orifice head and AVV orifice head levels were assessed through fieldwork. Two sets of runs were generated as part of two statistically designed experiments. A first set of 81 runs was based on a complete factorial design in which each factor was varied over 3 levels. A second set of 40 runs was based on a Latin hypercube design, better suited to experimental runs on a computer model. The simulations were conducted using commercially available transient analysis software. Responses, measured by total intrusion volumes, ranged from 10 to 366 L. A second-degree polynomial was used to analyze the total intrusion volumes. Sensitivity analyses of both designs revealed that the relationship between the total intrusion volume and the four contributing factors is not monotonic, with the AVVs' orifice head being the most influential factor. When intrusion through both pathways occurs concurrently, interactions between the intrusion flows through leakage orifices and submerged AVVs influence intrusion volumes. When only intrusion through leakage orifices is considered, the total intrusion volume is influenced more by the leakage rate than by the leakage orifices' head; the latter mainly affects the extent of the area affected by intrusion. Copyright © 2012 Elsevier Ltd. All rights reserved.
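
    As an illustration of the second set of runs described above (a space-filling design analyzed with a second-degree polynomial), here is a sketch that builds a 40-run Latin hypercube over four unit-scaled factors, generates responses from a hypothetical stand-in for the transient model, and fits the quadratic surface by least squares.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, d = 40, 4          # 40-run Latin hypercube over 4 factors (unit-scaled)
X = (rng.permuted(np.tile(np.arange(n), (d, 1)).T, axis=0)
     + rng.random((n, d))) / n

# Hypothetical stand-in for the transient-intrusion simulator (liters).
y = (300 * X[:, 3] ** 2 + 50 * X[:, 2] + 40 * X[:, 0] * X[:, 2]
     + 5 * rng.standard_normal(n))

# Second-degree polynomial: intercept, linear, two-way interactions, squares.
cols = [np.ones(n)] + [X[:, j] for j in range(d)]
cols += [X[:, i] * X[:, j] for i, j in combinations(range(d), 2)]
cols += [X[:, j] ** 2 for j in range(d)]
A = np.column_stack(cols)          # 15 terms, comfortably fewer than 40 runs
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1 - np.sum((y - A @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
print("fitted R^2:", round(r2, 3))
```

    The square terms in the fitted polynomial are what allow such an analysis to detect the non-monotonic factor effects reported in the study.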

  4. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake. These distributions indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) using fuzzy set theory and (0.00016, 0.88) using variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms is low. Each method has its own advantages and limitations and is based on a different theory; the appropriate method should therefore be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probability theory was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.
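
    The paper's input distributions are not given in the record; assuming illustrative lognormal exposure and toxicity distributions, this sketch contrasts plain Monte Carlo sampling with Latin hypercube (inverse-CDF stratified) sampling for the hazard quotient HQ = exposure / toxicity threshold, reporting the 90% interval and exceedance probability.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10_000
# Assumed lognormal exposure and toxicity distributions (illustrative only).
exp_dist = stats.lognorm(s=1.2, scale=0.05)   # exposure concentration, ug/L
tox_dist = stats.lognorm(s=0.8, scale=1.0)    # effect threshold, ug/L

def hq_mcs(n):
    """Plain Monte Carlo: independent random draws from each distribution."""
    return exp_dist.rvs(n, random_state=rng) / tox_dist.rvs(n, random_state=rng)

def hq_lhs(n):
    """LHS: one stratified probability per stratum, mapped through the
    inverse CDF of each input distribution."""
    u1 = (rng.permutation(n) + rng.random(n)) / n
    u2 = (rng.permutation(n) + rng.random(n)) / n
    return exp_dist.ppf(u1) / tox_dist.ppf(u2)

for name, hq in [("MCS", hq_mcs(n)), ("LHS", hq_lhs(n))]:
    lo, hi = np.percentile(hq, [5, 95])
    print(f"{name}: 90% interval [{lo:.5f}, {hi:.2f}], "
          f"P(HQ > 1) = {(hq > 1).mean():.3%}")
```

    With enough samples the two estimators converge to very similar intervals and exceedance probabilities, mirroring the near-identical MCS and LHS results reported above.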

  5. Constructing rigorous and broad biosurveillance networks for detecting emerging zoonotic outbreaks

    DOE PAGES

    Brown, Mac; Moore, Leslie; McMahon, Benjamin; ...

    2015-05-06

    Determining optimal surveillance networks for an emerging pathogen is difficult, since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk of infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short- and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results against a known outbreak, using spatial analysis of all the simulation runs to show that the predicted progression matched closely the observed locations of the farms infected in the 2006-2007 epidemic.
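
    A much-simplified schematic of that experimental design: Latin hypercube draws over plausible (not strain-specific) SEIR parameter ranges drive a small deterministic multi-patch model, and patches are ranked by how often they exceed an outbreak threshold across the ensemble. The patch coupling matrix, populations, ranges, and threshold below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
P = 6                                  # patches, stand-ins for local regions
M = rng.random((P, P)) * 0.02          # hypothetical mobility/coupling matrix
np.fill_diagonal(M, 0.0)
N = rng.integers(10_000, 100_000, P).astype(float)

def seir_attack_rate(beta, sigma, gamma, seed_patch, days=200, dt=0.5):
    """Euler-stepped deterministic multi-patch SEIR; returns attack rates."""
    S, E, I, R = N.copy(), np.zeros(P), np.zeros(P), np.zeros(P)
    E[seed_patch] = 10.0
    for _ in range(int(days / dt)):
        force = beta * (I + M @ I) / N          # local + imported infection
        new_e, new_i, new_r = force * S, sigma * E, gamma * I
        S += dt * (-new_e)
        E += dt * (new_e - new_i)
        I += dt * (new_i - new_r)
        R += dt * new_r
    return R / N

# LHS over plausible (beta, 1/incubation, 1/infectious period) ranges.
n, d = 200, 3
u = (rng.permuted(np.tile(np.arange(n), (d, 1)).T, axis=0)
     + rng.random((n, d))) / n
lo, hi = np.array([0.2, 1/5, 1/10]), np.array([0.8, 1/2, 1/3])
params = lo + u * (hi - lo)

hits = np.zeros(P)
for beta, sigma, gamma in params:
    ar = seir_attack_rate(beta, sigma, gamma, seed_patch=rng.integers(P))
    hits += ar > 0.05                  # count patches with notable outbreaks
print("risk ranking (patch: hit count):", dict(enumerate(hits.astype(int))))
```

    Ranking regions by their hit counts across the ensemble, rather than by any single run, is what makes the resulting risk map robust to not knowing the pathogen's parameters in advance.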

  6. Between the national and the universal: natural history networks in Latin America in the nineteenth and twentieth centuries.

    PubMed

    Duarte, Regina Horta

    2013-12-01

    This essay examines contemporary Latin American historical writing about natural history from the nineteenth through the twentieth centuries. Natural history is a "network science," woven out of connections and communications between diverse people and centers of scholarship, all against a backdrop of complex political and economic changes. Latin American naturalists navigated a tension between promoting national science and participating in "universal" science. These tensions between the national and the universal have also been reflected in historical writing on Latin America. Since the 1980s, narratives that recognize Latin Americans' active role have become more notable within the renewal of the history of Latin American science. However, the nationalist slant of these approaches has kept Latin American historiography on the margins. The networked nature of natural history and Latin America's active role in it afford an opportunity to end the historiographic isolation of Latin America and situate it within world history.

  7. Latino College Students.

    ERIC Educational Resources Information Center

    Olivas, Michael A., Ed.

    The condition of higher education for Hispanic Americans and Latin Americans is addressed in 12 papers from the 1983 Conference on Latino College Students. Attention is directed to the transition from high school to college, Hispanic student achievement, and economics and stratification. In addition to forewords by Gregory R. Anrig and Arturo…

  8. Health-Related Quality of Life of Latin-American Immigrants and Spanish-Born Attended in Spanish Primary Health Care: Socio-Demographic and Psychosocial Factors

    PubMed Central

    Salinero-Fort, Miguel Ángel; Gómez-Campelo, Paloma; Bragado-Alvárez, Carmen; Abánades-Herranz, Juan Carlos; Jiménez-García, Rodrigo; de Burgos-Lunar, Carmen

    2015-01-01

    Background This study compares the health-related quality of life of Spanish-born and Latin American-born individuals settled in Spain. Socio-demographic and psychosocial factors associated with health-related quality of life are analyzed. Methods A cross-sectional Primary Health Care multi center-based study of Latin American-born (n = 691) and Spanish-born (n = 903) outpatients from 15 Primary Health Care Centers (Madrid, Spain). The Medical Outcomes Study 36-Item Short Form Health Survey (SF-36) was used to assess health-related quality of life. Socio-demographic, psychosocial, and specific migration data were also collected. Results Compared to Spanish-born participants, Latin American-born participants reported higher health-related quality of life in the physical functioning and vitality dimensions. Across the entire sample, Latin American-born participants, younger participants, men and those with high social support reported significantly higher levels of physical health. Men with higher social support and a higher income reported significantly higher mental health. When stratified by gender, data show that for men physical health was only positively associated with younger age. For women, in addition to age, social support and marital status were significantly related. Both men and women with higher social support and income had significantly better mental health. Finally, for immigrants, the physical and mental health components of health-related quality of life were not found to be significantly associated with any of the pre-migration factors or conditions of migration. Only the variable “exposure to political violence” was significantly associated with the mental health component (p = 0.014). Conclusions The key factors to understanding HRQoL among Latin American-born immigrants settled in Spain are age, sex and social support. Therefore, strategies to maintain optimal health outcomes in these immigrant communities should include public policies on social inclusion in the host society and focus on improving social support networks in order to foster and maintain the health and HRQoL of this group. PMID:25835714

  9. Characterization of microflora in Latin-style cheeses by next-generation sequencing technology.

    PubMed

    Lusk, Tina S; Ottesen, Andrea R; White, James R; Allard, Marc W; Brown, Eric W; Kase, Julie A

    2012-11-07

    Cheese contamination can occur at numerous stages in the manufacturing process, including the use of improperly pasteurized or raw milk. Of concern is the potential contamination by Listeria monocytogenes and other pathogenic bacteria that find the high moisture levels and moderate pH of popular Latin-style cheeses like queso fresco a hospitable environment. In the investigation of a foodborne outbreak, samples typically undergo enrichment in broth for 24 hours followed by selective agar plating to isolate bacterial colonies for confirmatory testing. The broth enrichment step may also enable background microflora to proliferate, which can confound subsequent analysis if not inhibited by effective broth or agar additives. We used 16S rRNA gene sequencing to provide a preliminary survey of bacterial species associated with three brands of Latin-style cheeses after 24-hour broth enrichment. Brand A showed a greater diversity than the other two cheese brands (Brands B and C) at nearly every taxonomic level except phylum. Brand B showed the least diversity and was dominated by a single bacterial taxon, Exiguobacterium, not previously reported in cheese. This genus was also found in Brand C, although Lactococcus was prominent, an expected finding since this bacterium belongs to the group of lactic acid bacteria (LAB) commonly found in fermented foods. The contrasting diversity observed in Latin-style cheese was surprising, demonstrating that despite similarity of cheese type, raw materials and cheese making conditions appear to play a critical role in the microflora composition of the final product. The high bacterial diversity associated with Brand A suggests it may have been prepared with raw materials of high bacterial diversity or influenced by the ecology of the processing environment. Additionally, the presence of Exiguobacterium in high proportions (96%) in Brand B and, to a lesser extent, Brand C (46%), may have been influenced by the enrichment process. This study is the first to define Latin-style cheese microflora using Next-Generation Sequencing. These valuable preliminary data will direct selective tailoring of agar formulations to improve culture-based detection of pathogens in Latin-style cheese.

  10. Characterization of microflora in Latin-style cheeses by next-generation sequencing technology

    PubMed Central

    2012-01-01

    Background Cheese contamination can occur at numerous stages in the manufacturing process, including the use of improperly pasteurized or raw milk. Of concern is the potential contamination by Listeria monocytogenes and other pathogenic bacteria that find the high moisture levels and moderate pH of popular Latin-style cheeses like queso fresco a hospitable environment. In the investigation of a foodborne outbreak, samples typically undergo enrichment in broth for 24 hours followed by selective agar plating to isolate bacterial colonies for confirmatory testing. The broth enrichment step may also enable background microflora to proliferate, which can confound subsequent analysis if not inhibited by effective broth or agar additives. We used 16S rRNA gene sequencing to provide a preliminary survey of bacterial species associated with three brands of Latin-style cheeses after 24-hour broth enrichment. Results Brand A showed a greater diversity than the other two cheese brands (Brands B and C) at nearly every taxonomic level except phylum. Brand B showed the least diversity and was dominated by a single bacterial taxon, Exiguobacterium, not previously reported in cheese. This genus was also found in Brand C, although Lactococcus was prominent, an expected finding since this bacterium belongs to the group of lactic acid bacteria (LAB) commonly found in fermented foods. Conclusions The contrasting diversity observed in Latin-style cheese was surprising, demonstrating that despite similarity of cheese type, raw materials and cheese making conditions appear to play a critical role in the microflora composition of the final product. The high bacterial diversity associated with Brand A suggests it may have been prepared with raw materials of high bacterial diversity or influenced by the ecology of the processing environment. Additionally, the presence of Exiguobacterium in high proportions (96%) in Brand B and, to a lesser extent, Brand C (46%), may have been influenced by the enrichment process. This study is the first to define Latin-style cheese microflora using Next-Generation Sequencing. These valuable preliminary data will direct selective tailoring of agar formulations to improve culture-based detection of pathogens in Latin-style cheese. PMID:23134566

  11. Role and working conditions of nurses in public health in Mexico and Peru: a binational qualitative study.

    PubMed

    De Córdova, Maria Isabel Peñarrietade; Mier, Nelda; Quirarte, Nora Hilda Gonzales; Gómez, Tranquilina Gutiérrez; Piñones, Socorro; Borda, Alejandro

    2013-11-01

    This exploratory study conducted in Mexico and Peru investigated nurses' perceptions about their role in public health and working conditions. Health reform efforts in many countries are redefining the role of health professionals in public health. Little is known about the role of nurses working in public health contexts in Latin America. Fourteen focus groups were conducted in Mexico and Peru with 82 nurses working in government-sponsored community health centres. Data were analysed using a content analysis technique. Themes identified were: nurses' job descriptions in public health settings; organisational factors influencing the nurses' work, and influence of academic and social image factors. Management barriers and limited training influences the role and working conditions of public health nurses in Mexico and Peru. The professional role of nurses working in public health in Latin America is not well defined because of the health-care system infrastructure and the lack of a clear public health nurse job description. Further research is needed to better understand the role of public health nurses and strengthen their training, particularly in relation to nursing management encompassing abilities for decision-making processes and public health program planning and evaluation. © 2012 John Wiley & Sons Ltd.

  12. "The South American Way": Hollywood Looks at Latins and at Latin America.

    ERIC Educational Resources Information Center

    Aiex, Nola Kortner

    Latin elements and themes have been used in American films made for the North American market, but at the same time these films have been playing in the Latin American market, making it useful to examine how Latin America has been portrayed in them. The taste for exotic locales and themes is an element that has been present since the…

  13. Testing the Latino paradox in Latin America: a population-based study of Intra-regional immigrants in Chile.

    PubMed

    Cabieses, Baltica; Tunstall, Helena; Pickett, Kate

    2013-10-01

    Several studies in high-income countries report better health status of immigrants compared to the local population ("healthy migrant" effect), regardless of their socioeconomic deprivation. This is known as the Latino paradox. To test the Latino paradox within Latin America by assessing the health of international immigrants to Chile, most of them from Latin American countries, and comparing them to the Chilean-born. Secondary data analysis of the population-based CASEN survey-2006. Three health outcomes were included: disability, illness/accident, and cancer/chronic condition (dichotomous). Demographics (age, sex, marital status, urban/rural, ethnicity), socioeconomic-status (SES: educational level, employment status and household income per-capita), and material standards (overcrowding, sanitation, housing quality). Crude and adjusted weighted regression models were performed. One percent of Chile's population were immigrants, mainly from other Latin American countries. A "healthy migrant" effect appeared within the total immigrant population: this group had a significantly lower crude prevalence of almost all health indicators than the Chilean-born, which remained after adjusting for various demographic characteristics. However, this effect lost significance when adjusting by SES for most outcomes. The Latino paradox was not observed for international immigrants compared to the local population in Chile. Also, health of immigrants with the longest time of residency showed similar health rates to the Chilean-born. The Latino paradox was not observed in Chile. Protecting low SES immigrants in Chile could have large positive effects in their health at arrival and over time.

  14. STAKEHOLDER INVOLVEMENT IN THE HEALTH TECHNOLOGY ASSESSMENT PROCESS IN LATIN AMERICA.

    PubMed

    Pichon-Riviere, Andres; Soto, Natalie; Augustovski, Federico; Sampietro-Colom, Laura

    2018-06-11

    Latin American countries are taking important steps to expand and strengthen universal health coverage, and health technology assessment (HTA) has an increasingly prominent role in this process. Participation of all relevant stakeholders has become a priority in this effort. Key issues in this area were discussed during the 2017 Latin American Health Technology Assessment International (HTAi) Policy Forum. The Forum included forty-one participants from Latin American HTA agencies; public, social security, and private insurance sectors; and the pharmaceutical and medical device industry. A background paper and presentations by invited experts and Forum members supported discussions. This study presents a summary of these discussions. Stakeholder involvement in HTA remains inconsistently implemented in the region and few countries have established formal processes. Participants agreed that stakeholder involvement is key to improve the HTA process, but the form and timing of such improvements must be adapted to local contexts. The legitimization of both HTA and decision-making processes was identified as one of the main reasons to promote stakeholder involvement; but to be successful, the entire system of assessment and decision making must be properly staffed and organized, and certain basic conditions must be met, including transparency in the HTA process and a clear link between HTA and decision making. Participants suggested a need for establishing clear rules of participation in HTA that would protect HTA producers and decision makers from potentially distorting external influences. Such rules and mechanisms could help foster trust and credibility among stakeholders, supporting actual involvement in HTA processes.

  15. Computer program to minimize prediction error in models from experiments with 16 hypercube points and 0 to 6 center points

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1982-01-01

    A previous report described a backward deletion procedure of model selection that was optimized for minimum prediction error and used a multiparameter combination of the F-distribution and an order-statistics distribution of Cochran's. A computer program is described that applies the previously optimized procedure to real data. The use of the program is illustrated by examples.
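
    The report's exact multiparameter criterion is not reproduced here; as a minimal sketch of backward deletion on a 2^4 hypercube design with center points, the snippet below drops, at each step, the term whose removal least inflates the residual sum of squares, stopping when an F-like statistic exceeds an assumed cutoff.

```python
import numpy as np
from itertools import combinations, product

rng = np.random.default_rng(0)
# 16 hypercube (2^4 factorial) points plus 4 center points.
X = np.array(list(product([-1.0, 1.0], repeat=4)) + [[0.0] * 4] * 4)
y = (3 + 2 * X[:, 0] - X[:, 1] + 1.5 * X[:, 0] * X[:, 1]
     + 0.3 * rng.standard_normal(len(X)))

# Candidate model: intercept, main effects, two-factor interactions.
names = (["1"] + [f"x{i+1}" for i in range(4)]
         + [f"x{i+1}x{j+1}" for i, j in combinations(range(4), 2)])
terms = ([np.ones(len(X))] + [X[:, i] for i in range(4)]
         + [X[:, i] * X[:, j] for i, j in combinations(range(4), 2)])
A = np.column_stack(terms)

def sse(cols):
    b, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    return np.sum((y - A[:, cols] @ b) ** 2)

keep = list(range(A.shape[1]))
while len(keep) > 1:
    base = sse(keep)
    # F-like statistic for deleting each non-intercept term
    cand = [(i, (sse([c for c in keep if c != i]) - base)
                / (base / (len(X) - len(keep))))
            for i in keep[1:]]
    i, f = min(cand, key=lambda t: t[1])
    if f > 4.0:            # assumed cutoff; the report tunes this jointly
        break
    keep.remove(i)
print("retained terms:", [names[i] for i in keep])
```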

  16. Software Techniques for Non-Von Neumann Architectures

    DTIC Science & Technology

    1990-01-01

    [Fragmentary record; only machine characteristics and scattered text survive extraction.] Communication topology: programmable Benes network; hypercubic lattice for QCD. Control: centralized. Assignment: static. Memory: shared. Synchronization: universal. Maximum CPUs: 566 processor boards (each = 4 floating-point units, 2 multipliers). CPU size: 32-bit floating-point chips. Performance: 11.4 Gflops. Market: quantum chromodynamics (QCD). A surviving text fragment notes that, for complex functions, there should exist a capability to define hierarchies and lattices of complex objects, where a complex object can be made up of a set of simple objects.

  17. Growing a hypercubical output space in a self-organizing feature map.

    PubMed

    Bauer, H U; Villmann, T

    1997-01-01

    Neural maps project data from an input space onto a neuron position in an (often lower-dimensional) output space grid in a neighborhood-preserving way, with neighboring neurons in the output space responding to neighboring data points in the input space. A map-learning algorithm can achieve optimal neighborhood preservation only if the output space topology roughly matches the effective structure of the data in the input space. We present a growth algorithm, called the GSOM or growing self-organizing map, which enhances a widespread map self-organization process, Kohonen's self-organizing feature map (SOFM), by adapting the output space grid during learning. The GSOM restricts the output space structure to a general hypercubical shape, with the overall dimensionality of the grid and its extensions along the different directions being subject to adaptation. This constraint meets the demands of many larger information processing systems, of which the neural map can be a part. We apply the GSOM algorithm to three examples, two of which involve real-world data. Using recently developed methods for measuring the degree of neighborhood preservation in neural maps, we find the GSOM algorithm to produce maps which preserve neighborhoods in a nearly optimal fashion.
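
    The GSOM's grid-growing step is not sketched here, but the SOFM update it builds on is compact. A minimal Kohonen map on a fixed 2D grid, with a shrinking learning rate and a Gaussian neighborhood on the output grid (all sizes and schedules are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, dim = 8, 8, 3
W = rng.random((grid_h, grid_w, dim))        # codebook vectors, one per node
pos = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                           indexing="ij"), axis=-1)

def train(W, data, epochs=20, lr0=0.5, sigma0=3.0):
    """Classic SOFM training; W is updated in place."""
    T = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - t / T)
            sigma = sigma0 * (1 - t / T) + 0.5
            # best-matching unit in the input space
            bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)),
                                   (grid_h, grid_w))
            # Gaussian neighborhood measured on the output grid
            h = np.exp(-((pos - np.array(bmu)) ** 2).sum(-1)
                       / (2 * sigma ** 2))
            W += lr * h[..., None] * (x - W)
            t += 1

train(W, rng.random((500, dim)))
print("trained codebook shape:", W.shape)
```

    The GSOM replaces the fixed (grid_h, grid_w) above with a hypercubical grid whose dimensionality and extents are themselves adapted during learning, which is what improves neighborhood preservation when the data's effective dimension is unknown.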

  18. Exploring the spatial variability of soil properties in an Alfisol Catena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosemary, F.; Vitharana, U. W. A.; Indraratne, S. P.

    Detailed digital soil maps showing the spatial heterogeneity of soil properties consistent with the landscape are required for site-specific management of plant nutrients, land-use planning, and process-based environmental modeling. We characterized the short-scale spatial heterogeneity of soil properties in an Alfisol catena in a tropical landscape of Sri Lanka. The impact of different land uses (paddy, vegetable, and uncultivated) was examined to assess the effect of anthropogenic activities on the variability of soil properties at the catenary level. Conditioned Latin hypercube sampling was used to collect 58 geo-referenced topsoil samples (0-30 cm) from the study area. Soil samples were analyzed for pH, electrical conductivity (EC), organic carbon (OC), cation exchange capacity (CEC), and texture. The spatial correlation between soil properties was analyzed by computing cross-variograms and subsequently fitting theoretical models. Spatial distribution maps were developed using ordinary kriging. The ranges of the soil properties (pH: 4.3-7.9; EC: 0.01-0.18 dS m⁻¹; OC: 0.1-1.37%; CEC: 0.44-11.51 cmol(+) kg⁻¹; clay: 1.5-25%; sand: 59.1-84.4%) and their coefficients of variation indicated a large variability in the study area. Electrical conductivity and pH showed a strong spatial correlation, reflected by a cross-variogram close to the hull of perfect correlation. Moreover, cross-variograms calculated for EC and clay, CEC and OC, CEC and clay, and CEC and pH indicated weak positive spatial correlations between these properties. The relative nugget effect (RNE) calculated from the variograms showed strongly structured spatial variability for pH, EC, and sand content (RNE < 25%), while CEC, organic carbon, and clay content showed moderately structured spatial variability (25% < RNE < 75%). Spatial dependencies for the examined soil properties ranged from 48 to 984 m. Mixed-effects model fitting followed by Tukey's post-hoc test showed a significant effect of land use on the spatial variability of EC. Our study revealed a structured variability of topsoil properties in the selected tropical Alfisol catena; except for EC, the observed variability was not modified by land use. The investigated soil properties showed distinct spatial structures at different scales and magnitudes of strength. Our results will be useful for digital soil mapping, site-specific management of soil properties, developing appropriate land-use plans, and quantifying anthropogenic impacts on the soil system.
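
    A much-simplified sketch of conditioned Latin hypercube sampling in the spirit used above: choose 58 of the candidate locations so that each ancillary covariate has roughly one sample per equal-probability stratum, improving the selection with annealing-style swaps. The correlation-matching term of the full Minasny and McBratney objective is omitted, and the covariates are synthetic.

```python
import numpy as np

def clhs(covariates, n, iters=5000, seed=0):
    """Simplified conditioned Latin hypercube sampling: choose n rows of
    `covariates` so each variable has ~1 sample per equal-probability
    stratum (assumes the quantile bin edges are strictly increasing)."""
    rng = np.random.default_rng(seed)
    N, d = covariates.shape
    edges = np.quantile(covariates, np.linspace(0, 1, n + 1), axis=0)

    def cost(idx):
        c = 0.0
        for j in range(d):
            counts, _ = np.histogram(covariates[idx, j], bins=edges[:, j])
            c += np.abs(counts - 1).sum()     # want one sample per stratum
        return c

    idx = rng.choice(N, n, replace=False)
    best = cost(idx)
    for k in range(iters):
        temp = 1.0 - k / iters
        trial = idx.copy()
        trial[rng.integers(n)] = rng.integers(N)   # swap one site at random
        if len(set(trial)) < n:
            continue                               # reject duplicate sites
        c = cost(trial)
        if c < best or rng.random() < 0.1 * temp:  # annealing-style accept
            idx, best = trial, c
    return idx

# Hypothetical ancillary data: e.g., elevation, slope, NDVI at 2000 sites.
cov = np.random.default_rng(1).random((2000, 3))
sites = clhs(cov, n=58)
print("selected site indices:", np.sort(sites)[:8], "...")
```

    Conditioning on the ancillary covariates in this way is what lets a modest 58-sample campaign still span the full range of landscape conditions in the catena.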

  19. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization frameworks for solving complex groundwater management problems. Artificial-intelligence-based models are most often used for this purpose, trained with predictor-predictand data obtained from a numerical simulation model. Most often this is implemented under the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known; however, in most practical situations these values are uncertain, which limits the application of such approximation surrogates. In our study we develop a surrogate-model-based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is applied to a multi-objective coastal aquifer management problem, treating hydraulic conductivity and aquifer recharge as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, generating input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets that belong to different regions of the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem with two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells for hydraulic control of saltwater intrusion. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution. Reliability is incorporated as the percentage of surrogate models in the ensemble that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. All optimal solutions corresponding to a reliability level of 0.99 satisfied all the constraints, and constraint violation increased as the reliability level was reduced. Ensemble-surrogate-based simulation-optimization was thus found to be useful for deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
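
    As a sketch of the ensemble-surrogate reliability idea, with quadratic ridge regressors standing in for the paper's genetic-programming surrogates: bootstrap resamples of simulator input-output data train an ensemble, and a candidate pumping plan's reliability is the fraction of ensemble predictions that keep salinity below a (hypothetical) limit across sampled realizations of the uncertain parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data from LHS runs of a simulator stand-in:
# 2 pumping rates + 2 uncertain parameters -> salinity at a monitor well.
n, d = 300, 4
X = rng.random((n, d))
y = (2.0 * X[:, 0] + 1.2 * X[:, 1] - 0.8 * X[:, 2] - 0.5 * X[:, 3]
     + 0.05 * rng.standard_normal(n))

def features(X):
    # quadratic feature map as a simple surrogate basis
    quads = np.column_stack([X[:, i] * X[:, j]
                             for i in range(d) for j in range(i, d)])
    return np.column_stack([np.ones(len(X)), X, quads])

def fit(Xb, yb, lam=1e-6):
    F = features(Xb)
    return np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ yb)

# Ensemble of surrogates from bootstrap resamples of the training data.
ensemble = []
for _ in range(50):
    b = rng.integers(0, n, n)
    ensemble.append(fit(X[b], y[b]))

def reliability(candidate, limit=1.5):
    """Fraction of (surrogate, realization) pairs predicting salinity
    below the limit, over sampled uncertain parameters."""
    theta = rng.random((200, 2))                   # uncertain K, recharge
    Xc = np.column_stack([np.tile(candidate, (200, 1)), theta])
    preds = np.stack([features(Xc) @ w for w in ensemble])   # (50, 200)
    return (preds < limit).mean()

plan = np.array([0.3, 0.4])
print("reliability of pumping plan:", round(reliability(plan), 3))
```

    Because each bootstrap surrogate sees a different slice of the decision and parameter space, disagreement within the ensemble acts as a proxy for predictive uncertainty, which is what the reliability constraint in the optimization exploits.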

  20. Within-field and regional-scale accuracies of topsoil organic carbon content prediction from an airborne visible near-infrared hyperspectral image combined with synchronous field spectra for temperate croplands

    NASA Astrophysics Data System (ADS)

    Vaudour, Emmanuelle; Gilliot, Jean-Marc; Bel, Liliane; Lefevre, Josias; Chehdi, Kacem

    2016-04-01

    This study was carried out in the framework of the TOSCA-PLEIADES-CO project of the French Space Agency and benefited from data from the earlier PROSTOCK-Gessol3 project supported by the French Environment and Energy Management Agency (ADEME). It aimed at identifying the potential of airborne hyperspectral visible near-infrared AISA-Eagle data for predicting the topsoil organic carbon (SOC) content of bare cultivated soils over a large peri-urban area (221 km²) with intensive annual crop cultivation and both contrasted soils and SOC contents, located in the region west of Paris, France. Soils comprise hortic or glossic luvisols, calcaric or rendzic cambisols, and colluvic cambisols. Airborne AISA-Eagle images (400-1000 nm, 126 bands) with 1 m resolution were acquired on 17 April 2013 over 13 tracks. Tracks were atmospherically corrected and then mosaicked at a 2 m resolution using a set of 24 synchronous field spectra of bare soils, black and white targets, and impervious surfaces. The land-use identification system layer (RPG) of 2012 was used to mask non-agricultural areas; calculation and thresholding of the NDVI from an atmospherically corrected SPOT4 image acquired the same day then made it possible to map agricultural fields with bare soil. A total of 101 sites, sampled either at the regional scale or within one field, were identified as bare by means of this map. Predictions were made from the mosaicked AISA spectra, which were related to SOC contents by means of partial least squares regression (PLSR). Regression robustness was evaluated through a series of 1000 bootstrap data sets of calibration-validation samples, considering only the 75 sites outside cloud shadows, and different sampling strategies for selecting calibration samples. Validation root-mean-square errors (RMSE) ranged from 3.73 to 4.49 g kg⁻¹, with a median of about 4 g kg⁻¹. The best-performing models in terms of coefficient of determination (R²) and residual prediction deviation (RPD) were the calibration models derived from either Kennard-Stone or conditioned Latin hypercube sampling on smoothed spectra. However, the most generalizable model, with the lowest RMSE (3.73 g kg⁻¹ at the regional scale and 1.44 g kg⁻¹ at the within-field scale) and low validation bias, was the leave-one-out cross-validated PLSR model constructed with the 28 near-synchronous samples and raw spectra.
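
    A compact sketch of the PLSR workflow on synthetic stand-ins (28 spectra with 126 bands, as in the near-synchronous model above), using scikit-learn and leave-one-out cross-validation; the Kennard-Stone and conditioned Latin hypercube calibration-sampling variants are not reproduced here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
# Synthetic stand-ins: 28 soil spectra (126 bands) and SOC contents (g/kg).
n_samples, n_bands = 28, 126
soc = rng.uniform(8, 30, n_samples)
feature = np.exp(-((np.arange(n_bands) - 60) / 15) ** 2)  # fake absorption
spectra = (0.5 - 0.005 * soc[:, None] * feature
           + 0.01 * rng.standard_normal((n_samples, n_bands)))

# Leave-one-out cross-validated PLSR, as in the 28-sample model.
pls = PLSRegression(n_components=5)
pred = np.empty(n_samples)
for train, test in LeaveOneOut().split(spectra):
    pls.fit(spectra[train], soc[train])
    pred[test] = pls.predict(spectra[test]).ravel()

rmse = np.sqrt(np.mean((pred - soc) ** 2))
rpd = soc.std(ddof=1) / rmse      # residual prediction deviation
print(f"LOO RMSE = {rmse:.2f} g/kg, RPD = {rpd:.2f}")
```

    PLSR is the usual choice here because the number of bands far exceeds the number of samples; the latent components compress the collinear spectra before regression, and RPD expresses the error relative to the natural spread of SOC.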

  1. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in several design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the design variables most crucial to the objective function are identified. With these variables, the Latin hypercube sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, an optimum configuration of the design that is more stable than the initial design is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.

  2. Tailoring Information Strategies for Developing Countries: Some Latin American Experiences.

    ERIC Educational Resources Information Center

    Crowther, Warren

    This article addresses the conditions of developing countries which must be taken into account in developing information strategies for their public and educational institutions or projects. Its central argument is that newer information science concepts, although they demand technological and conceptual sophistication, can be useful in the…

  3. Indigenous Affairs = Asuntos Indigenas, 2000.

    ERIC Educational Resources Information Center

    Indigenous Affairs, 2000

    2000-01-01

    This document contains the four English-language issues of Indigenous Affairs published in 2000 and four corresponding issues in Spanish. The Spanish issues contain all or some of the articles contained in the English issues plus additional articles on Latin America. These periodicals provide a resource on the history, current conditions, and…

  4. Conditions for the Application of Science and Technology for Human Welfare

    ERIC Educational Resources Information Center

    Sabin, Albert B.

    1972-01-01

    Maintains that overpopulation without adequate resources in Asia, Latin America, and other parts of the world can be catastrophic for all the countries in the world. Efforts to provide aid to developing nations have been inadequate, and international cooperation on aid is very much needed now. (PS)

  5. Concurrent Verbalizations, Pedagogical Conditions, and Reactivity: Two CALL Studies

    ERIC Educational Resources Information Center

    Sanz, Cristina; Lin, Hui-Ju; Lado, Beatriz; Bowden, Harriet Wood; Stafford, Catherine A.

    2009-01-01

    The article summarizes results from two experimental studies on reactivity. In the first experiment, 24 college-age participants received a computerized treatment that delivered a grammar lesson, practice, and feedback on assignment of semantic functions in Latin. Verbalizations did not induce reactivity on accuracy, but they slowed down posttest…

  6. French Latin Association Demands Full Restoration of Secondary Classics Curriculum

    ERIC Educational Resources Information Center

    Denegri, Raymond

    1978-01-01

    Discusses the demands of the Association pour la Defense du Latin that Latin be returned to the French secondary school curriculum and that the Classics-Science baccalaureat be restored as one of the French secondary programs. Benefits of studying Latin are enumerated. (KM)

  7. Tobacco industry success in preventing regulation of secondhand smoke in Latin America: the "Latin Project"

    PubMed Central

    Barnoya, J; Glantz, S

    2002-01-01

    Objective: To examine the tobacco industry's strategy to avoid regulations on secondhand smoke exposure in Latin America. Methods: Systematic search of tobacco industry documents available through the internet. All available materials, including confidential reports regarding research, lobbying, and internal memoranda exchanged between the tobacco industry representatives, tobacco industry lawyers, and key players in Latin America. Results: In Latin America, Philip Morris International and British American Tobacco, working through the law firm Covington & Burling, developed a network of well placed physicians and scientists through their "Latin Project" to generate scientific arguments minimising secondhand smoke as a health hazard, produce low estimates of exposure, and to lobby against smoke-free workplaces and public places. The tobacco industry's role was not disclosed. Conclusions: The strategies used by the industry have been successful in hindering development of public health programmes on secondhand smoke. Latin American health professionals need to be aware of this industry involvement and must take steps to counter it to halt the tobacco epidemic in Latin America. PMID:12432156

  8. Latin Curriculum Standards. Revised.

    ERIC Educational Resources Information Center

    Delaware State Dept. of Public Instruction, Dover.

    Delaware's state standards for the Latin curriculum in the public schools are presented. An introductory section outlines the goals of the Latin program for reading, cultural awareness, grammar, writing, and oral language and briefly discusses the philosophy of and approaches to Latin instruction in elementary and middle schools. Three subsequent…

  9. Baculovirus Insecticides in Latin America: Historical Overview, Current Status and Future Perspectives

    PubMed Central

    Haase, Santiago; Sciocco-Cap, Alicia; Romanowski, Víctor

    2015-01-01

    Baculoviruses are known to regulate many insect populations in nature. Their host-specificity is very high, usually restricted to a single or a few closely related insect species. They are amongst the safest pesticides, with no or negligible effects on non-target organisms, including beneficial insects, vertebrates and plants. Baculovirus-based pesticides are compatible with integrated pest management strategies and the expansion of their application will significantly reduce the risks associated with the use of synthetic chemical insecticides. Several successful baculovirus-based pest control programs have taken place in Latin American countries. Sustainable agriculture (a trend promoted by state authorities in most Latin American countries) will benefit from the wider use of registered viral pesticides and new viral products that are in the process of registration and others in the applied research pipeline. The success of baculovirus-based control programs depends upon collaborative efforts among government and research institutions, growers associations, and private companies, which realize the importance of using strategies that protect human health and the environment at large. Initiatives to develop new regulations that promote the use of this type of ecological alternatives tailored to different local conditions and farming systems are underway. PMID:25941826

  10. Cuadernos Médico Sociales from Rosario, Argentina, and other Latin American social medicine journals from the 1970s and 1980s.

    PubMed

    Spinelli, Hugo; Librandi, Juan Martín; Zabala, Juan Pablo

    2017-01-01

    In the 1970s and 1980s, a series of journals were founded to disseminate ideas from Latin American social medicine in various countries across the continent, during the early stage of a movement that would later become institutionalized in Brazil under the name "collective medicine." In this article, we look at the principal characteristics of those endeavors: Revista Centroamericana de Ciencias de la Salud, Saúde em Debate, Salud Problema, Revista Latinoamericana de Salud and Cuadernos Médico Sociales. We focus in particular on Cuadernos Médico Sociales, published in Rosario, Argentina. We analyze the conditions under which this publication emerged, the editorial processes it followed, and the central role played by Carlos Bloch, its founder and managing editor.

  11. The Bologna Process from a Latin American Perspective

    ERIC Educational Resources Information Center

    Brunner, Jose Joaquin

    2009-01-01

    Although Latin America's geography, history, and languages might seem a suitable foundation for a Bologna-type process, the development of a common Latin American higher education and research area meets predictable difficulties.The reasons are to be found in the continent's historic and modern institutional patterns. Latin American governments…

  12. Latin America and the United States: What Do United States History Textbooks Tell Us?

    ERIC Educational Resources Information Center

    Fleming, Dan B.

    1982-01-01

    Evaluates how U.S.-Latin American relations are presented in high school U.S. history textbooks. An examination of 10 textbooks published between 1977 and 1981 revealed inadequate coverage of Latin American cultural diversity and of United States foreign policy from the Latin American perspective. (AM)

  13. Atlanta Public Schools Latin Guide.

    ERIC Educational Resources Information Center

    Atlanta Public Schools, GA.

    This teacher's guide outlines the basic objectives and the content of the Atlanta Public Schools Latin program and suggests resources and methods to achieve the stated goals. The philosophy and general objectives of the program are presented. Course outlines include: (1) Beginning Latin, (2) Intermediate Latin, (3) Vergil's "Aeneid," (4) Ovid:…

  14. HRD in Latin America. Symposium 5. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This document contains three papers on human resource development (HRD) in Latin America. "Looking at the Literature on Workplace Democracy in Latin America: Factors in Favor and Against It" (Max U. Montesino) discusses selected issues related to workplace democracy in Latin America and identifies salient issues for further research,…

  15. Apuntes sobre Latinismos en Espanol (Notes on Latinisms in Spanish)

    ERIC Educational Resources Information Center

    Perez B., L. A.

    1977-01-01

    Several Latinisms appear in Latin American Spanish, which would logically be farther from its Latin roots than Spanish in Spain. The existence of these elements and their importance as linguistic facts is analyzed here. Four words are treated: "Cliente,""cuadrar,""cuarto" and "rabula." (Text is in Spanish.) (CHK)

  16. Textbooks in Greek and Latin: 1975 List

    ERIC Educational Resources Information Center

    McCarty, Thomas G.

    1975-01-01

    List of textbooks in Greek and Latin for 1975. Subject, title, publisher and price are noted. Greek and Latin works are listed separately under the eight categories of texts, beginner's books, grammars, books about the language, readers and anthologies, composition, dictionaries, and New Testament Greek and Later Latin. (RM)

  17. Internationalizing Business Education in Latin America: Issues and Challenges

    ERIC Educational Resources Information Center

    Elahee, Mohammad; Norbis, Mario

    2009-01-01

    This article examines the extent of internationalization of business education in Latin America and identifies the key challenges facing the Latin American business schools. Based on a survey of the business schools that are members of CLADEA (Consejo Latinoamericano de Escuelas de Administracion--Latin American Council of Management Schools), and…

  18. Parallel Processing and Scientific Applications

    DTIC Science & Technology

    1992-11-30

    Reference and text fragments recovered from the report: Lattice QCD Calculations on the Connection Machine, SIAM News 24, 1 (May 1991); C. F. Baillie and D. A. Johnston, Crumpling Dynamically Triangulated… Three simulation variants are described: in the first, the surface is a fixed hypercubic lattice; in the second, the surface is randomly triangulated once at the beginning of the simulation; and in the third, the triangulation itself is dynamical. Also cited: Sharpe, QCD with Dynamical Wilson Fermions II, Phys. Rev. D44, 3272 (1991); R. Gupta and C. F. Baillie, Critical Behavior of the 2D XY Model, Phys…

  19. Mapping Flows onto Networks to Optimize Organizational Processes

    DTIC Science & Technology

    2005-01-01

    Reference fragments recovered from the report: G. Porter, "Assessments of Simulated Performance of Alternative Architectures for Command and Control: The Role of Coordination," Proceedings of the 1999 Command & Control Research & Technology Symposium, NWC, Newport, RI, June 1999, pp. 123-143; [Iverson95] M. Iverson, F. Ozguner, G. Follen… Technology Symposium, NPS, Monterey, CA, June 2002; [Wu88] Min-You Wu and D. Gajski, "A Programming Aid for Hypercube Architectures," The Journal of Supercomputing, 2 (1988), pp. 349-372.

  20. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
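
    The scheme lends itself to a short illustration. The following is a minimal sketch in Python, not the NASA implementation: it assumes the challenge is simply an index tuple into a shared n-dimensional matrix of random alphanumeric codes, and all names in it are hypothetical.

        import secrets
        import string
        from itertools import product

        ALPHABET = string.ascii_uppercase + string.digits

        def make_code_matrix(dims, code_len=4):
            # An n-dimensional matrix of random alphanumeric codes, stored as a
            # dict keyed by index tuples; dims=(4, 4, 4, 4) gives a 4-D "hypercube"
            # of codes, analogous to the higher-dimensional matrices described above.
            return {idx: ''.join(secrets.choice(ALPHABET) for _ in range(code_len))
                    for idx in product(*(range(d) for d in dims))}

        def countersign(matrix, challenge):
            # The verifier issues a challenge (an index tuple); the user answers
            # with the code stored at that cell of the shared matrix.
            return matrix.get(challenge)

        matrix = make_code_matrix((4, 4, 4, 4))   # both parties hold identical copies
        print(countersign(matrix, (2, 0, 3, 1)))  # the expected countersign

    Regenerating the matrix frequently, as the record's floppy disk or plug-in card suggests, limits the value of any single code an eavesdropper captures, since each challenge maps to an independent random cell.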

  1. Look for the Latin Word: A Gamebook on English Derivatives and Cognates to Accompany "How the Romans Lived and Spoke (Romani Viventes et Dicentes): A Humanistic Approach to Latin for Children in the Fifth Grade".

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph; And Others

    This gamebook is intended to assist the elementary school Latin teacher in introducing the reading and writing of English derivatives and cognates after these have been mastered audiolingually in the Latin course "How the Romans Lived and Spoke (Romani Viventes et Dicentes): A Humanistic Approach to Latin for Children in the Fifth Grade." (For the…

  2. [Scientific journals of medical students in Latin-America].

    PubMed

    Cabrera-Samith, Ignacio; Oróstegui-Pinilla, Diana; Angulo-Bazán, Yolanda; Mayta-Tristán, Percy; Rodríguez-Morales, Alfonso J

    2010-11-01

    This article deals with the history and evolution of medical students' scientific journals in Latin America: their beginnings, how many still exist, and their future prospects. Relevant events show the growth of students' scientific journals in Latin America and how they are working together to improve their quality. The article is addressed not only to Latin American readers but to readers worldwide. Latin American medical students are consistently working together to publish scientific research whose quality is constantly improving.

  3. Sickle cell in Latin America and the United States [corrected].

    PubMed

    Huttle, Alexandra; Maestre, Gladys E; Lantigua, Rafael; Green, Nancy S

    2015-07-01

    Latin Americans are an underappreciated population affected by sickle cell disease (SCD). Sickle trait and SCD exist throughout Latin America and U.S. Latino communities. We describe the epidemiology and genetic heterogeneity of SCD among Latin Americans, and fetal hemoglobin expression. National population-based newborn screening for SCD is limited to Brazil, Costa Rica, and the U.S. Available and extrapolated data suggest that over 6,000 annual births and 100,000-150,000 Latin Americans are affected by SCD. This comprehensive review highlights the substantial numbers and population distribution of SCD and sickle trait in Latin America, and where national newborn screening programs for SCD exist. © 2015 Wiley Periodicals, Inc.

  4. Harvest of Empire: A History of Latinos in America.

    ERIC Educational Resources Information Center

    Gonzalez, Juan

    This book presents an integrated historical look at Latin America and Latinos in the United States, offering portraits of real-life Latino pioneers and sketches of the political events and social conditions that compelled them to leave their homeland and examining how they have transformed the nation's cultural landscape. Part 1,…

  5. Promoting Ethical and Environmental Awareness in Vulnerable Communities: A Research Action Plan

    ERIC Educational Resources Information Center

    Araujo, Ulisses

    2012-01-01

    Urban populations that live in the outskirts of major Latin American cities usually face conditions of vulnerability attached to complex environmental issues, such as the lack of sewerage, floods, pollution and soil and water contamination. This article reports an intervention research programme in Sao Paulo, Brazil that combines a moral education…

  6. Latin American Universities and the Bologna Process: From Commercialisation to the "Tuning" Competencies Project

    ERIC Educational Resources Information Center

    Aboites, Hugo

    2010-01-01

    Through the "Tuning-Latin America" competencies project, Latin American universities have been incorporated into the Bologna Process. In 2003 the European Commission approved an initiative of this project for Latin America and began to promote it among ministries, university presidents' organisations and other institutions in Latin…

  7. Considerations for Integrating Technology in Developing Communities in Latin America

    ERIC Educational Resources Information Center

    Ponte, Daniela Núñez; Cullen, Theresa A.

    2013-01-01

    This article discusses issues related to introducing new information and communication technologies (ICT) into Latin American countries. Latin American countries are gaining world focus with political changes such as the death of Hugo Chavez in Venezuela and the election of the first Latin American Pope. This region will host the World Cup,…

  8. Latin American Immigrant Women and Intergenerational Sex Education

    ERIC Educational Resources Information Center

    Alcalde, Maria Cristina; Quelopana, Ana Maria

    2013-01-01

    People of Latin American descent make up the largest and fastest-growing minority group in the USA. Rates of pregnancy, childbirth, and sexually transmitted infections among people of Latin American descent are higher than among other ethnic groups. This paper builds on research that suggests that among families of Latin American descent, mothers…

  9. Latin and Cross Latin Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2008-01-01

    Latin squares were first introduced and studied by the famous mathematician Leonhard Euler in the 1700s. Through the years, Latin squares have been used in areas such as statistics, graph theory, coding theory, the generation of random numbers as well as in the design and analysis of experiments. Recently, with the international popularity of…
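
    As a concrete illustration of the object this record discusses, the sketch below builds a Latin square by the classical cyclic-shift construction; this is one standard construction, not a method from the article itself.

        def cyclic_latin_square(n):
            # Row i, column j holds (i + j) mod n, so each of the n symbols
            # appears exactly once in every row and every column.
            return [[(i + j) % n for j in range(n)] for i in range(n)]

        for row in cyclic_latin_square(4):
            print(row)
        # [0, 1, 2, 3]
        # [1, 2, 3, 0]
        # [2, 3, 0, 1]
        # [3, 0, 1, 2]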

  10. Colloquamur Latine cum Pueris Puellesque: Latin in the Middle School.

    ERIC Educational Resources Information Center

    Graber, Charles F.; Norton, Harriet S.

    Guidelines for the development of a Latin curriculum for the middle school, with specific suggestions as to content and methodology, are presented in this manual. The material, oriented toward new approaches in the teaching of the Latin and Graeco-Roman cultures, strives to develop proficiency in the skills of listening, speaking, reading, and…

  11. International Clinical Trials in Latin American and Caribbean Countries: Research and Development to Meet Local Health Needs

    PubMed Central

    da Silva, Ricardo E.; Amato, Angélica A.; Guilhem, Dirce B.; de Carvalho, Marta R.; Novaes, Maria R. C. G.

    2018-01-01

    Introduction: Although international health research involves some benefits for the host countries, such as access to innovative treatments, the research itself may not be aligned with their communities' actual health needs. Objective: To map the global landscape of clinical trials run in Latin American and Caribbean countries and discuss how local health needs are addressed in the agenda of international clinical trials. Methods: The present study is a cross-sectional overview and used data on studies registered between 01/01/2014 and 12/31/2014 in the World Health Organization's (WHO) International Clinical Trials Registry Platform (ICTRP). Results: Non-communicable diseases such as diabetes, cancer, and asthma—studies financed mainly by industry—were the conditions investigated most in the region of Latin America and the Caribbean. Neglected diseases, on the other hand, such as Chagas disease and dengue, made up 1% of the total number of studies. Hospitals and nonprofit nongovernmental organizations prioritize resources for investigating new drugs for neglected diseases such as Chagas disease and dengue. Conclusion: The international multicenter clinical trials investigating new drugs are aligned with the health needs of the region of Latin America and the Caribbean when one considers the burden resulting from non-communicable diseases in this region. However, transmissible diseases such as tuberculosis and AIDS, and neglected diseases such as Chagas disease and dengue, which have an important impact on public health in this region, continue to arouse little interest among the institutions that finance clinical trials. PMID:29354059

  12. Hospital malnutrition in Latin America: A systematic review.

    PubMed

    Correia, Maria Isabel T D; Perman, Mario Ignacio; Waitzberg, Dan Linetzky

    2017-08-01

    Disease-related malnutrition is a major public health issue in both industrialised and emerging countries. The reported prevalence in hospitalised adults ranges from 20% to 50%. Initial reports from emerging countries suggested a higher prevalence compared with other regions, with limited data on outcomes and costs. We performed a systematic literature search for articles on disease-related malnutrition in Latin American countries published between January 1995 and September 2014. Studies reporting data on the prevalence, clinical outcomes, or economic costs of malnutrition in an adult (≥18 years) inpatient population with a sample size of ≥30 subjects were eligible for inclusion. Methodological quality of the studies was assessed by two independent reviewers using published criteria. We identified 1467 citations; of these, 66 studies including 29,474 patients in 12 Latin American countries met the criteria for inclusion. There was considerable variability in methodology and in the reported prevalence of disease-related malnutrition; however, prevalence was consistently in the range of 40%-60% at the time of admission, with several studies reporting an increase in prevalence with increasing duration of hospitalisation. Disease-related malnutrition was associated with an increase in infectious and non-infectious clinical complications, length of hospital stay, and costs. Disease-related malnutrition is a highly prevalent condition that imposes a substantial health and economic burden on the countries of Latin America. Further research is necessary to characterise screening/assessment practices and identify evidence-based solutions to this persistent and costly public health issue. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  13. Las Migraciones en el Proceso de Integracion de las Americas: Seminario Internacional (Migration in the Integration Process in the Americas: International Seminar). Conference report.

    PubMed

    Berglund, S

    1993-01-01

    This conference report of the Centro de Estudios Migratorios Latinoamericanos and the Center for Migration Studies pertains to meetings held in August 1992. Summary information on migration movements in Latin America is presented by region and subject. The topic of integration in the Americas was presented by Mr. Lelio Marmora. Other topics and presenters include the new Colombian Migration Policy (Mr. Carlos Adolfo Arenas), the integration policies in Central America (Mr. Pilar Norza of Costa Rica, Raimundo Alvarado of El Salvador, and Luis Armando Gusman of Nicaragua), the Andean Pact agreements (representatives of each country), US immigration policy (Charles B. Keely), the Mexican integration with Latin America and immigration to the US (Jorge Bustamante), migration to Bolivia, Argentina, and Chile, and transnationalism in the Caribbean (Professor Andre Corten). Migration policy needs to be tailored specifically to the situation in Latin America, and greater attention needs to be devoted to labor migrants' rights and working conditions. There are still fundamental differences among countries in policies regarding the free circulation of persons across borders. There is a division between those who support migration and those who are realists. National sovereignty issues are solvable because of a common national past and a relatively homogeneous population. Another opinion is that Latin America is more diverse than commonly recognized. Capital is protected more in international agreements than is migrant labor. Regional integration for the US does mean immigration from Mexico. The US sees Mexican migration as a policy problem, and Mexico sees migration as a labor opportunity.

  14. The use of Latin terminology in medical case reports: quantitative, structural, and thematic analysis.

    PubMed

    Lysanets, Yuliia V; Bieliaieva, Olena M

    2018-02-23

    This paper focuses on the prevalence of Latin terms and terminological collocations in the issues of Journal of Medical Case Reports (February 2007-August 2017) and discusses the role of Latin terminology in the contemporary process of writing medical case reports. The objective of the research is to study the frequency of using Latin terminology in English-language medical case reports, thus providing relevant guidelines for medical professionals who deal with this genre and drawing their attention to the peculiarities of using Latin in case reports. The selected medical case reports are analyzed using quantitative methods together with structural, narrative, and contextual analyses. We developed structural and thematic typologies of Latin terms and expressions, and we conducted a quantitative analysis that enabled us to observe the tendencies in using these lexical units in medical case reports. The research revealed that the use of Latin fully complies with the communicative strategies of medical case reports as a genre. Because Latin medical lexis is internationally adopted and understood worldwide, it promotes the conciseness of medical case reports and contributes to their narrative style and educational intentions. The adequate use of Latin terms in medical case reports is an essential prerequisite of effectively sharing one's clinical findings with fellow researchers from all over the world. Therefore, it is highly important to draw students' attention to the Latin terms and expressions that are used in medical case reports most frequently. Hence, the analysis of structural, thematic, and contextual features of Latin terms in case reports should be an integral part of curricula at medical universities.
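
    A minimal sketch of the kind of frequency count such a study implies, assuming a small hand-made lexicon of Latin terms; the study's actual term list was derived from the journal corpus itself, and the terms and sample text below are illustrative only.

        import re
        from collections import Counter

        LATIN_TERMS = ["in situ", "in vivo", "per os", "post mortem"]  # hypothetical lexicon

        def term_frequencies(text, terms=LATIN_TERMS):
            # Case-insensitive, non-overlapping counts of each Latin term.
            lowered = text.lower()
            return Counter({t: len(re.findall(re.escape(t), lowered)) for t in terms})

        report = "Biopsy in situ confirmed the lesion; in vivo imaging followed. In situ findings were stable."
        print(term_frequencies(report).most_common(2))  # [('in situ', 2), ('in vivo', 1)]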

  15. From magic to science: a journey throughout Latin American medical mycology.

    PubMed

    San-Blas, G

    2000-01-01

    The start of Latin America's love story with fungi may be placed in pre-Hispanic times, when the use of fungi in both ritual ceremonies and daily life was common to the native civilizations. But the medical mycology discipline in Latin America started at the end of the 19th century. At that time, scholars such as A. Posadas, R. Seeber, A. Lutz and P. Almeida discovered agents of fungal diseases, the study of which has influenced the regional research ever since. Heirs to them are the researchers who today thrive in regional universities and research institutes. Two current initiatives improve cooperation among Latin American medical mycologists: first, the periodic organization of International Paracoccidioidomycosis Meetings (seven so far, from 1979 to 1999); second, the creation of the Latin American Association for Mycology in 1991 (three congresses, from 1993 to 1999). Latin American publications have increased in international specialized journals such as that of our Society (ISHAM) (from 8% in 1967 to 19% in 1999) and the Iberoamerican Journal of Mycology (Revista Iberoamericana de Micologia; > 40% from 1997 to 1999). In addition, Latin American participation at ISHAM International Congresses has risen from 6.9% in 1975 to 21.3% in 1997, and 43.2% at the 14th ISHAM Congress, held for the first time in a Latin American country, Argentina. A significant contribution of women to the scientific establishment of Latin American medical mycology (e.g., 45% of Latin American papers published in the Journal of Medical and Veterinary Mycology in 1987 had women as authors or coauthors, vs. 18% for other regions) suggests a better academic standing of Latin American women compared with their counterparts in the developed world. Taken together, all these figures reflect the enthusiasm of our Latin American colleagues in the field, despite the difficulties that afflict our region and affect our work.

  16. Education, Adjustment, and Democracy in Latin America. Development Discussion Paper No. 363.

    ERIC Educational Resources Information Center

    Reimers, Fernando M.

    This document examines changes in Latin American economies and educational systems during the 1980s and the responses of the Latin American democracies. Following a description of changes in Latin American public expenditures in the 1980s and subsequent adjustments in education expenditures, the dynamics of the adjustment in Costa Rica and…

  17. "Dance for Your Health": Exploring Social Latin Dancing for Community Health Promotion

    ERIC Educational Resources Information Center

    Iuliano, Joseph E.; Lutrick, Karen; Maez, Paula; Nacim, Erika; Reinschmidt, Kerstin

    2017-01-01

    The goal of "Dance for Your Health" was to explore the relationship between social Latin dance and health as described by members of the Tucson social Latin dance community. Social Latin dance was selected because of the variety of dances, cultural relevance and popularity in Tucson, and the low-key, relaxed atmosphere. Dance has been…

  18. Latin America: A Filmic Approach. Latin American Studies Program, Film Series No. 1.

    ERIC Educational Resources Information Center

    Campbell, Leon G.; And Others

    This document describes a university course designed to provide an historical understanding of Latin America through feature films. The booklet contains an introductory essay on the teaching of a film course on Latin America, a general discussion of strengths and weaknesses of student analyses of films, and nine analyses written by students during…

  19. She Doesn't Even Act Mexican: Smartness Trespassing in the New South

    ERIC Educational Resources Information Center

    Carrillo, Juan F.; Rodriguez, Esmeralda

    2016-01-01

    Historically, Latin@s have experienced deficit thinking as it relates to their perceived academic abilities. In North Carolina's emerging Latin@ population, the first waves of K-12 students are writing a new chapter in the state's history. Yet, there is a dearth of research that focuses on the identities of academically successful Latin@ students…

  20. LATIN FOR SECONDARY SCHOOLS (A GUIDE TO MINIMUM ESSENTIALS).

    ERIC Educational Resources Information Center

    LEAMON, M. PHILLIP; AND OTHERS

    A SET OF MINIMUM ESSENTIALS FOR EACH LEVEL OF A 4-YEAR SEQUENCE OF LATIN IN SECONDARY SCHOOLS IS PRESENTED IN THIS CURRICULUM GUIDE. FOLLOWING STATEMENTS OF THE OBJECTIVES OF LATIN STUDY--READING THE LATIN OF THE GREAT ROMAN AUTHORS, ATTAINING A LINGUISTIC PROFICIENCY, AND ACQUIRING A WIDER HISTORICAL AND CULTURAL AWARENESS--THE GUIDE OUTLINES FOR…

  1. Word Power through Latin; A Curriculum Resource.

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph

    This curriculum guide is intended to assist Latin teachers in the School District of Philadelphia in achieving one of the goals of Latin instruction: the development of word power in English through a structured study of Latin roots and affixes. The guide may be used in two different ways. First, it may form the basis for a separate…

  2. Machismo and Virginidad: Sex Roles in Latin America. Discussion Paper 79-10.

    ERIC Educational Resources Information Center

    Quinones, Julio

    The purpose of this paper is to present a view of Latin American males and females that describes the situation in Latin America more accurately than the current stereotypical view accepted in the United States. The author discusses the roots of the North American misconception, citing differences between Latin American and North American cultures…

  3. Effective promotion of breastfeeding among Latin American women newly immigrated to the United States.

    PubMed

    Denman-Vitale, S; Murillo, E K

    1999-07-01

    Across the United States, advanced practice nurses (APNs) are increasingly encountering recently immigrated Latin American populations. This article provides an overview of the situation of Latin Americans in the United States and discusses aspects of Latin American culture such as respeto (respect), confianza (confidence), the importance of family, and the value of a personal connection. Strategies are described that will assist practitioners to incorporate culturally holistic principles in the promotion of breastfeeding among Latin American women who are new arrivals in the United States. If practitioners are to respond to the increasing numbers of Latin American women who need health care services and to provide thorough, holistic health care, then health care activities must be integrated with cultural competence.

  4. A unified model of the standard genetic code.

    PubMed

    José, Marco V; Zamudio, Gabriel S; Morgado, Eberto R

    2017-03-01

    The Rodin-Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but they are biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N any of them). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether these results cannot be attained, neither in two nor in three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model.
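
    The 6D-hypercube statement becomes concrete once each nucleotide is encoded by two bits, so that a codon is a 6-bit string, i.e., a vertex of the 6-dimensional hypercube. The sketch below uses one hypothetical 2-bit assignment (the paper's specific assignment may differ); hypercube edges then connect codons differing by a single bit.

        # Hypothetical 2-bit nucleotide encoding; the paper's assignment may differ.
        BITS = {'C': (0, 0), 'U': (0, 1), 'G': (1, 0), 'A': (1, 1)}

        def codon_to_vertex(codon):
            # A 3-letter RNA codon becomes a 6-bit vertex of the 6-D hypercube.
            return tuple(bit for nt in codon for bit in BITS[nt])

        def hamming(u, v):
            return sum(a != b for a, b in zip(u, v))

        print(codon_to_vertex('AUG'))  # (1, 1, 0, 1, 1, 0)
        # Codons adjacent on the hypercube differ in exactly one bit:
        print(hamming(codon_to_vertex('AUG'), codon_to_vertex('AUA')))  # 1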

  5. Optimizing Radiometric Processing and Feature Extraction of Drone Based Hyperspectral Frame Format Imagery for Estimation of Yield Quantity and Quality of a Grass Sward

    NASA Astrophysics Data System (ADS)

    Näsi, R.; Viljanen, N.; Oliveira, R.; Kaivosoja, J.; Niemeläinen, O.; Hakala, T.; Markelin, L.; Nezami, S.; Suomalainen, J.; Honkavaara, E.

    2018-04-01

    Light-weight 2D-format hyperspectral imagers operable from unmanned aerial vehicles (UAV) have become common in various remote sensing tasks in recent years. Using these technologies, the area of interest is covered by multiple overlapping hypercubes (in other words, multiview hyperspectral photogrammetric imagery), and each object point appears in many, even tens of, individual hypercubes. The common practice is to calculate hyperspectral orthomosaics utilizing only the most nadir areas of the images. However, the redundancy of the data offers potential for much more versatile and thorough feature extraction. We investigated various options for extracting spectral features in the task of evaluating grass sward quantity. In addition to the various sets of spectral features, we used photogrammetry-based ultra-high-density point clouds to extract features describing the canopy 3D structure. A machine learning technique based on the Random Forest algorithm was used to estimate the fresh biomass. Results showed high accuracies for all investigated feature sets. Estimates based on multiview data were approximately 10% better than those based on the most nadir orthophotos. Utilizing the photogrammetric 3D features improved estimation accuracy by approximately 40% compared with approaches where only spectral features were applied. The best estimation RMSE of 239 kg/ha (6.0%) was obtained with the multiview anisotropy-corrected data set and the 3D features.
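
    The estimation step as described (a Random Forest on combined spectral and 3D features, with RMSE reported in kg/ha and percent) can be sketched as follows; the data are synthetic stand-ins and the feature counts are assumptions, not the study's.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        n_plots = 96
        spectral = rng.random((n_plots, 36))   # per-plot spectral features (count assumed)
        structure = rng.random((n_plots, 4))   # photogrammetric 3D canopy features
        X = np.hstack([spectral, structure])
        y = 2000 + 1500 * structure[:, 0] + 100 * rng.standard_normal(n_plots)  # biomass, kg/ha

        model = RandomForestRegressor(n_estimators=300, random_state=0)
        y_hat = cross_val_predict(model, X, y, cv=5)
        rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
        print(f"RMSE: {rmse:.0f} kg/ha ({100 * rmse / y.mean():.1f} %)")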

  6. A multi-satellite orbit determination problem in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Deakyne, M. S.; Anderle, R. J.

    1988-01-01

    The Engineering Orbit Analysis Unit at GE Valley Forge used an Intel Hypercube parallel processor to investigate performance and to gain experience with parallel processors on a multi-satellite orbit determination problem. A general study was selected in which major blocks of computation for the multi-satellite orbit computations were treated as units to be assigned to the various processors on the Hypercube, so that problems encountered or successes achieved in addressing the orbit determination problem would be more likely to transfer to other parallel processors. The prime objective was to study the algorithm so as to allow processing of observations later in time than those employed in the state update. Expertise in ephemeris determination was exploited in addressing these problems, and the facility was used to bring a realism to the study that would highlight problems which might not otherwise be anticipated. Secondary objectives were to gain experience with a non-trivial problem in a parallel processor environment; to explore the necessary interplay of serial and parallel sections of the algorithm in terms of timing studies; and to explore granularity (coarse versus fine grain), that is, to find the granularity limit above which there is a risk of starvation, with the majority of nodes idle, and the limit below which the overhead of splitting the problem may require more work and communication time than is useful.
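
    The granularity trade-off the study explored can be demonstrated on any multiprocessor; the toy sketch below (not the Hypercube code itself) uses the block size as the granularity knob.

        import time
        from multiprocessing import Pool

        def process_block(observations):
            # Stand-in for one block of the orbit computation, e.g., accumulating
            # partial results for a batch of observations.
            return sum(x * x for x in observations)

        def run(observations, n_workers, block_size):
            # Coarse grain (large blocks): workers may sit idle near the end.
            # Fine grain (tiny blocks): per-block scheduling and communication
            # overhead can exceed the useful work, the two limits noted above.
            blocks = [observations[i:i + block_size]
                      for i in range(0, len(observations), block_size)]
            with Pool(n_workers) as pool:
                return sum(pool.map(process_block, blocks))

        if __name__ == '__main__':
            obs = list(range(1_000_000))
            for block_size in (250_000, 10_000, 100):  # coarse -> fine
                t0 = time.perf_counter()
                run(obs, n_workers=4, block_size=block_size)
                print(f"block_size={block_size}: {time.perf_counter() - t0:.3f} s")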

  7. Public Policies and Interventions for Diabetes in Latin America: a Scoping Review.

    PubMed

    Kaselitz, Elizabeth; Rana, Gurpreet K; Heisler, Michele

    2017-08-01

    Successful interventions are needed to diagnose and manage type 2 diabetes (T2DM) in Latin America, a region that is experiencing a significant rise in rates of T2DM. Complementing an earlier review exploring diabetes prevention efforts in Latin America, this scoping review examines the literature on (1) policies and governmental programs intended to improve diabetes diagnosis and treatment in Latin America and (2) interventions to improve diabetes management in Latin America. It concludes with a brief discussion of promising directions for future research. Governmental policies and programs for the diagnosis and treatment of diabetes in different Latin American countries have been implemented, but their efficacy to date has not been rigorously evaluated. There are some promising intervention approaches in Latin America to manage diabetes that have been evaluated. Some of these utilize multidisciplinary teams, a relatively resource-intensive approach difficult to replicate in low-resource settings. Other evaluated interventions in Latin America have successfully leveraged mobile health tools, trained peer volunteers, and community health workers (CHWs) to improve diabetes management and outcomes. There are some promising approaches and large-scale governmental efforts underway to curb the growing burden of type 2 diabetes in Latin America. While some of these interventions have been rigorously evaluated, further research is warranted to determine their effectiveness, cost, and scalability in this region.

  8. "Should We Carry Our Passports with Us?": Resisting Violence against Latin@s in Light of Political Rhetoric

    ERIC Educational Resources Information Center

    Bondy, Jennifer M.

    2017-01-01

    In this article, Jennifer Bondy argues that educators should teach students about the emotions that drive racialized violence. In order to do this, she focuses on political rhetoric that locates Latin@s as illegal aliens, criminals, and questionably American. She begins with her professional experiences as a social studies teacher who has worked…

  9. From Latin Americans to Latinos: Latin American Immigration in US: The Unwanted Children

    ERIC Educational Resources Information Center

    Moraña, Ana

    2007-01-01

    It is my understanding that Latin American immigrants in the United States, during the contested process of becoming Latinos (US citizens or the offspring of Latin Americans born in US) are for the most part socially portrayed as unwanted, messy children who need to be educated before they can become American citizens. Whether they can be called…

  10. Latin American Culture Studies: Information and Materials for Teaching About Latin America. Revised Edition.

    ERIC Educational Resources Information Center

    Glab, Edward, Jr., Ed.

    This resource manual provides ideas, lesson plans, course outlines, arts and crafts projects, games, and other materials for teaching K-12 students about Latin America. A major objective is to help students understand and appreciate the diverse Latin American culture. There are six chapters in this volume. Chapter one discusses key ideas that can…

  11. History of primary vasculitis in Latin America.

    PubMed

    Iglesias Gammara, Antonio; Coral, Paola; Quintana, Gerardo; Toro, Carlos E; Flores, Luis Felipe; Matteson, Eric L; Restrepo, José Félix

    2010-03-01

    A literature review utilizing Fepafem, Bireme, LiLacs, Scielo Colombia, Scielo Internacional, former MedLine, Pubmed, and BVS Colombia as well as manual searches in the libraries of major Latin American universities was performed to study vasculitis in Latin America. Since 1945, a total of 752 articles have been published by Latin American authors. However, only a minority are devoted to primary vasculitides, and even fewer have been published in indexed journals. Approximately 126 are in OLD, Medline, Pubmed, Bireme, and Scielo. Most publications are from Mexico, followed by Brazil and Colombia. Systematic studies of the epidemiology of primary idiopathic vasculitis are available for a few countries, i.e. Brazil, Mexico, Colombia, Chile, and Peru. Takayasu arteritis and ANCA-associated vasculitis are the best studied forms of vasculitis in Latin America. Interest and expertise in vasculitis is growing in Latin America, as reflected in the increased number of published articles from this region of the world in the last decade. Racial and environmental factors are possibly responsible for the differential expression of various types of primary vasculitis observed in Latin America. With time, the unique features, epidemiology, and better treatment strategies for idiopathic vasculitides in Latin America will emerge.

  12. The history of Latin teeth names.

    PubMed

    Šimon, František

    2015-01-01

    This paper aims to give an account of the Latin naming of the different types of teeth by reviewing relevant historical and contemporary literature. The paper presents the etymologies of Latin or Greek teeth names, their development, variants, and synonyms, and sometimes the names of their authors. The Greek names did not have the status of official terms, but the Latin terms for particular types of teeth gradually established themselves. The names for the incisors, canines, and molars are Latin calques of the Greek ones (tomeis, kynodontes, mylai); dens serotinus is an indirect calque of the Greek name (odús) opsigonos; and the term pre-molar is created in the way now common in modern anatomical terminology, using the prefix prae- ('pre-') and the adjective molaris. The Latin terms dentes canini and dentes molares occur in Classical Latin literature, the term (dentes) incisivi is first found in medieval literature, and the terms dentes premolares and dens serotinus are modern-age coinages.

  13. "Manana Is Soon Enough for Me": Latin America through Tin Pan Alley's Prism.

    ERIC Educational Resources Information Center

    Aiex, Nola Kortner

    In order to examine the vision of Latin America transmitted to the American public in Tin Pan Alley's popular songs in the first half of the twentieth century, a study analyzed nearly 50 songs. The songs were grouped into five categories: (1) songs which describe Latin locales; (2) songs which are constructed around a Latin woman's name; (3) songs…

  14. The Power of Tradition: Methods for Teaching Latin in the Context of History of Educational Thought

    ERIC Educational Resources Information Center

    Fomin, Andriy

    2005-01-01

    Many authors note that the history of teaching Latin would be a fruitful topic for a comprehensive treatise. Although intense debates about the quality and necessity of teaching Latin date back as early as in the eighteenth century, Latin courses have persisted into the present and, notably, with few changes in content. The author supports the…

  15. China and Latin America: The Other Option

    DTIC Science & Technology

    2017-06-09

    Report-form fragments recovered from the document: subject terms include China, Latin America, national… The central research question: given the strategic environment, should the U.S. government adjust its priorities to mitigate the growing Chinese influence in Latin America.

  16. A Critical Co-Constructed Autoethnography of a Gendered Cross-Cultural Mentoring between Two Early Career Latin@ Scholars Working in the Deep South

    ERIC Educational Resources Information Center

    Suriel, Regina L.; Martinez, James; Evans-Winters, Venus

    2018-01-01

    Multicultural mentoring has been suggested to support Latin@ faculty success in their careers, yet current literature on effective mentorships of Latin@ faculty is limited. This critical co-constructed autoethnography draws on critical race theory (CRT) and latin@ critical race theory (LatCrit) frameworks to highlight the lived experiences and key…

  17. Latin, the Key to English Vocabulary. A Gamebook on English Derivatives and Cognates to Accompany "Voces de Olympo" (Echoes from Mt. Olympus).

    ERIC Educational Resources Information Center

    Masciantonio, Rudolph

    This gamebook is intended to assist the grade school (FLES) Latin teacher to introduce the reading and writing of English derivatives and cognates after these have been mastered audiolingually in Philadelphia's Latin course "Voces de Olympo (Echoes from Mt. Olympus): A Humanistic Approach to Latin for Children in the Sixth Grade." Games are…

  18. Characterization of individuals at high risk of developing melanoma in Latin America: bases for genetic counseling in melanoma

    PubMed Central

    Puig, Susana; Potrony, Miriam; Cuellar, Francisco; Puig-Butille, Joan Anton; Carrera, Cristina; Aguilera, Paula; Nagore, Eduardo; Garcia-Casado, Zaida; Requena, Celia; Kumar, Rajiv; Landman, Gilles; Costa Soares de Sá, Bianca; Gargantini Rezze, Gisele; Facure, Luciana; de Avila, Alexandre Leon Ribeiro; Achatz, Maria Isabel; Carraro, Dirce Maria; Duprat Neto, João Pedreira; Grazziotin, Thais C.; Bonamigo, Renan R.; Rey, Maria Carolina W.; Balestrini, Claudia; Morales, Enrique; Molgo, Montserrat; Bakos, Renato Marchiori; Ashton-Prolla, Patricia; Giugliani, Roberto; Larre Borges, Alejandra; Barquet, Virginia; Pérez, Javiera; Martínez, Miguel; Cabo, Horacio; Cohen Sabban, Emilia; Latorre, Clara; Carlos-Ortega, Blanca; Salas-Alanis, Julio C; Gonzalez, Roger; Olazaran, Zulema; Malvehy, Josep; Badenas, Celia

    2016-01-01

    Purpose: CDKN2A is the main high-risk melanoma-susceptibility gene, but it has been poorly assessed in Latin America. We sought to analyze CDKN2A and MC1R in patients from Latin America with familial and sporadic multiple primary melanoma (SMP) and compare the data with those for patients from Spain to establish bases for melanoma genetic counseling in Latin America. Methods: CDKN2A and MC1R were sequenced in 186 Latin American patients from Argentina, Brazil, Chile, Mexico, and Uruguay, and in 904 Spanish patients. Clinical and phenotypic data were obtained. Results: Overall, 24 and 14% of melanoma-prone families in Latin America and Spain, respectively, had mutations in CDKN2A. Latin American families had CDKN2A mutations more frequently (P = 0.014) than Spanish ones. Of patients with SMP, 10% of those from Latin America and 8.5% of those from Spain had mutations in CDKN2A (P = 0.623). The most recurrent CDKN2A mutations were c.-34G>T and p.G101W. Latin American patients had fairer hair (P = 0.016) and skin (P < 0.001) and a higher prevalence of MC1R variants (P = 0.003) compared with Spanish patients. Conclusion: The inclusion criteria for genetic counseling of melanoma in Latin America may be the same criteria used in Spain, as suggested in areas with low to medium incidence, SMP with at least two melanomas, or families with at least two cases among first- or second-degree relatives. PMID:26681309

  19. Environmental conditions, immunologic phenotypes, atopy, and asthma: New evidence of how the hygiene hypothesis operates in Latin America

    PubMed Central

    Alcantara-Neves, Neuza M.; Matos, Sheila M. A.; Cooper, Philip J.; Rodrigues, Laura C.; Barreto, Mauricio L.

    2016-01-01

    Background It has been proposed that improved hygiene and reduced experience of infections in childhood influences the development of allergic diseases. The mechanisms by which the hygiene hypothesis operates are not well established but are underpinned by two apparently incompatible immunologic paradigms, the balance of TH1 versus TH2 cytokines and IL-10–mediated regulation of TH2 cytokines. Objective This study defined immunologic phenotypes with the use of latent class analysis and investigated their associations with environmental factors, markers of allergy and asthma, in a Latin American population. Methods We studied 1127 children living in urban Brazil. Data on wheeze and environmental exposures were collected with standardized questionnaires. Atopy was measured by specific IgE in serum and skin prick test reactivity to aeroallergens. Cytokines were measured in culture after the stimulation of peripheral blood leukocytes with mitogen. Infections with pathogens were assessed by serology and stool examinations. Children were classified as having high or low burden of infection. Latent class analysis was used to identify immune phenotypes on the basis of cytokine production. Logistic regression was used to evaluate the adjusted effects of environment and burden of infection on the immunologic phenotypes and the effect of the phenotypes on atopy and asthma. Results Three phenotypes were identified, labeled underresponsive, intermediate, and responsive. Children of more educated mothers, living in improved environmental conditions, and with a low burden of infection were significantly more likely to have the responsive phenotype. The responsive phenotype was significantly associated with an increased prevalence of atopy but not asthma. Conclusion Our findings contribute to a better understanding of the immune mechanisms by which the hygiene hypothesis operates in urban Latin America. PMID:23414599

  20. Environmental conditions, immunologic phenotypes, atopy, and asthma: new evidence of how the hygiene hypothesis operates in Latin America.

    PubMed

    Figueiredo, Camila Alexandrina; Amorim, Leila D; Alcantara-Neves, Neuza M; Matos, Sheila M A; Cooper, Philip J; Rodrigues, Laura C; Barreto, Mauricio L

    2013-04-01

    It has been proposed that improved hygiene and reduced experience of infections in childhood influences the development of allergic diseases. The mechanisms by which the hygiene hypothesis operates are not well established but are underpinned by two apparently incompatible immunologic paradigms, the balance of TH1 versus TH2 cytokines and IL-10-mediated regulation of TH2 cytokines. This study defined immunologic phenotypes with the use of latent class analysis and investigated their associations with environmental factors, markers of allergy and asthma, in a Latin American population. We studied 1127 children living in urban Brazil. Data on wheeze and environmental exposures were collected with standardized questionnaires. Atopy was measured by specific IgE in serum and skin prick test reactivity to aeroallergens. Cytokines were measured in culture after the stimulation of peripheral blood leukocytes with mitogen. Infections with pathogens were assessed by serology and stool examinations. Children were classified as having high or low burden of infection. Latent class analysis was used to identify immune phenotypes on the basis of cytokine production. Logistic regression was used to evaluate the adjusted effects of environment and burden of infection on the immunologic phenotypes and the effect of the phenotypes on atopy and asthma. Three phenotypes were identified, labeled underresponsive, intermediate, and responsive. Children of more educated mothers, living in improved environmental conditions, and with a low burden of infection were significantly more likely to have the responsive phenotype. The responsive phenotype was significantly associated with an increased prevalence of atopy but not asthma. Our findings contribute to a better understanding of the immune mechanisms by which the hygiene hypothesis operates in urban Latin America. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
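
    The analysis pipeline in the two records above (latent classes fitted to cytokine responses, then adjusted regression of outcomes on class membership) can be sketched as follows; a Gaussian mixture on log responses stands in for latent class analysis proper, and all data are synthetic.

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        cytokines = rng.lognormal(mean=0.0, sigma=1.0, size=(1127, 4))  # 1127 children, 4 cytokines

        # Three classes ("underresponsive", "intermediate", "responsive"); a
        # Gaussian mixture on log responses approximates LCA for continuous data.
        classes = GaussianMixture(n_components=3, random_state=0).fit_predict(np.log(cytokines))

        # Regress an outcome (e.g., atopy) on phenotype-class dummies (class 0 as
        # reference); a real analysis would also adjust for environmental covariates.
        atopy = rng.integers(0, 2, size=1127)
        X = np.column_stack([classes == 1, classes == 2]).astype(float)
        print(LogisticRegression().fit(X, atopy).coef_)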

  1. Latin American Newspapers.

    ERIC Educational Resources Information Center

    Gardner, Mary A.

    1980-01-01

    Reviews the historical development of the press in Latin America from the sixteenth century to the present. Discusses the various pressures that Latin American newspapers are subject to, including political censorship, economic restrictions, and cultural conflicts. (AEA)

  2. Cardiovascular Disease in Rheumatoid Arthritis: A Systematic Literature Review in Latin America

    PubMed Central

    Sarmiento-Monroy, Juan Camilo; Amaya-Amaya, Jenny; Espinosa-Serna, Juan Sebastián; Herrera-Díaz, Catalina; Anaya, Juan-Manuel; Rojas-Villarraga, Adriana

    2012-01-01

    Background. Cardiovascular disease (CVD) is the major predictor of poor prognosis in rheumatoid arthritis (RA) patients. There is an increasing interest in identifying "nontraditional" risk factors for this condition. Latin Americans (LA) are considered a minority subpopulation and ethnically different due to admixture characteristics. To date, there are no systematic reviews of the literature published in LA and the Caribbean about CVD in RA patients. Methods. The systematic literature review was done by two blinded reviewers who independently assessed studies for eligibility. The search was completed through the PubMed, LILACS, SciELO, and Virtual Health Library scientific databases. Results. The search retrieved 10,083 potential studies. A total of 16 articles concerning cardiovascular risk factors and measurement of any cardiovascular outcome in LA were included. The prevalence of CVD in LA patients with RA was 35.3%. Nontraditional risk factors associated with CVD in this population were HLA-DRB1 shared epitope alleles, rheumatoid factor, markers of chronic inflammation, long duration of RA, steroids, familial autoimmunity, and thrombogenic factors. Conclusions. There is limited data about CVD and RA in LA. We propose to evaluate cardiovascular risk factors comprehensively in Latin American RA patients and to generate specific public health policies in order to diminish morbidity and mortality rates. PMID:23193471

  3. Obesity and disability: relation among older adults living in Latin America and the Caribbean.

    PubMed

    Al Snih, Soham; Graham, James E; Kuo, Yong-Fang; Goodwin, James S; Markides, Kyriakos S; Ottenbacher, Kenneth J

    2010-06-15

    The prevalence and incidence of both obesity and disability are projected to increase in the coming decades. The authors examined the relation between obesity and disability in older adults from 6 Latin American cities participating in the Health, Well-Being and Aging in Latin America and the Caribbean (SABE) Study (1999-2000). The sample included 6,166 participants aged 65 years or more. Data on sociodemographic factors, smoking status, medical conditions, body mass index (BMI; weight (kg)/height (m)²), and self-reported activities of daily living (ADL) were obtained. The prevalence of obesity (BMI ≥30) ranged from 13.3% in Havana, Cuba, to 37.6% in Montevideo, Uruguay. Using a BMI of 18.5-<25 as the reference category and controlling for all covariates, the lowest odds ratio for ADL limitation was for a BMI of 25-<30 (odds ratio = 1.10, 95% confidence interval: 0.93, 1.30), and the highest odds ratio for ADL limitation was for a BMI of 35 or higher (odds ratio = 1.63, 95% confidence interval: 1.26, 2.11). The results indicated that obesity is an independent factor contributing to ADL disability in these populations and should be included in future planning to reduce the impact of disability on global health.
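
    The BMI arithmetic behind the study's categories, as a worked sketch; the cut-points follow the abstract, and the example person is hypothetical.

        def bmi(weight_kg, height_m):
            # Body mass index as defined in the record: weight (kg) / height (m)^2.
            return weight_kg / height_m ** 2

        def category(b):
            if b < 18.5: return "underweight"
            if b < 25:   return "18.5-<25 (reference)"
            if b < 30:   return "25-<30 (lowest OR for ADL limitation, 1.10)"
            if b < 35:   return "30-<35 (obese)"
            return ">=35 (highest OR for ADL limitation, 1.63)"

        b = bmi(92, 1.60)
        print(f"BMI {b:.1f}: {category(b)}")  # BMI 35.9: >=35 (highest OR for ADL limitation, 1.63)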

  4. At the crossroads of Greek and Roman medicine: the contribution of Latin papyri. 1. Medical texts; 2. Iatromagical papyri.

    PubMed

    Marganne, Marie-Hélène; de Haro Sanchez, Magali

    2014-01-01

    1. Far fewer Latin medical papyri, whether paraliterary, documentary or magical, have survived compared to Greek medical papyri, but they nonetheless provide interesting information about medical practices in the Graeco-Roman world, the relationship between the Greek and Latin medical languages, and the choices made to use one rather than the other, a subject that has never been exhaustively studied. As part of the update undertaken by CEDOPAL since 2008 of the Corpus papyrorum Latinarum, published fifty years ago by the late Robert Cavenaile, we have inventoried Latin papyri containing medical references, classifying them by type or nature of content, provenance, form, layout, and writing. We finally analyse their content and what it reveals about the reception of Greek medicine by Latin or Latin-speaking writers. 2. The second section presents the only iatromagical papyrus in Latin known at the present time, P. Heid. inv. lat. 5 (Suppl. Mag. 1.36, ca. fifth/sixth centuries, Fustat [?]), and compares its content, on the one hand, with that of the Greek iatromagical papyri (dating from the first century B.C. to the seventh century A.D.) and, on the other, with iatromagical formulae in Latin preserved on metal leaves from Italy, Hungary, France, and England.

  5. A Prospective Cohort Multicenter Study of Molecular Epidemiology and Phylogenomics of Staphylococcus aureus Bacteremia in Nine Latin American Countries

    PubMed Central

    Reyes, Jinnethe; Carvajal, Lina Paola; Rincon, Sandra; Diaz, Lorena; Panesso, Diana; Ibarra, Gabriel; Rios, Rafael; Munita, Jose M.; Salles, Mauro J.; Alvarez-Moreno, Carlos; Labarca, Jaime; Garcia, Coralith; Luna, Carlos M.; Mejia-Villatoro, Carlos; Zurita, Jeannete; Guzman-Blanco, Manuel; Rodriguez-Noriega, Eduardo; Narechania, Apurva; Rojas, Laura J.; Planet, Paul J.; Weinstock, George M.; Gotuzzo, Eduardo; Seas, Carlos

    2017-01-01

    Staphylococcus aureus is an important pathogen causing a spectrum of diseases ranging from mild skin and soft tissue infections to life-threatening conditions. Bloodstream infections are particularly important, and the treatment approach is complicated by the presence of methicillin-resistant S. aureus (MRSA) isolates. The emergence of new genetic lineages of MRSA has occurred in Latin America (LA) with the rise and dissemination of the community-associated USA300 Latin American variant (USA300-LV). Here, we prospectively characterized bloodstream MRSA recovered from selected hospitals in 9 Latin American countries. All isolates were typed by pulsed-field gel electrophoresis (PFGE) and subjected to antibiotic susceptibility testing. Whole-genome sequencing was performed on 96 MRSA representatives. MRSA represented 45% of all (1,185 S. aureus) isolates. The majority of MRSA isolates belonged to clonal cluster (CC) 5. In Colombia and Ecuador, most isolates (≥72%) belonged to the USA300-LV lineage (CC8). Phylogenetic reconstructions indicated that MRSA isolates from participating hospitals belonged to three major clades. Clade A grouped isolates with sequence type 5 (ST5), ST105, and ST1011 (mostly staphylococcal chromosomal cassette mec [SCCmec] I and II). Clade B included ST8, ST88, ST97, and ST72 strains (SCCmec IV, subtypes a, b, and c/E), and clade C grouped mostly Argentinian MRSA belonging to ST30. In summary, CC5 MRSA was prevalent in bloodstream infections in LA with the exception of Colombia and Ecuador, where USA300-LV is now the dominant lineage. Clonal replacement appears to be a common phenomenon, and continuous surveillance is crucial to identify changes in the molecular epidemiology of MRSA. PMID:28760895

  6. Current practice in Latin America of flexible ureterorenoscopy with laser for treating kidney stones.

    PubMed

    Manzo, B O; Bertacchi, M; Lozada, E; Rasguido, A; Aleman, E; Cabrera, M; Rodríguez, A; Manzo, G; Sánchez, H; Blasco, J

    2016-05-01

    The use of flexible ureterorenoscopy for treating kidney stones has increased in recent years, with considerable worldwide variation in the surgical technique and indications. To determine the current practice, technique variations, use, and indications of flexible ureterorenoscopy for treating kidney stones in Latin America, we sent (by email and web link) an anonymous questionnaire with 30 questions on flexible ureterorenoscopy for treating kidney stones to Latin American urologists from January 2015 to July 2015. We collected the responses through the Survey Monkey system. A total of 283 urologists in 15 Latin American countries participated (response rate, 10.8%); 254 answered the questionnaire completely; 52.8% were urologists from Mexico and 11% were from Argentina; 11.8% of the responders stated that they performed >100 cases per year; 15.2% considered ureterorenoscopy the treatment of choice for stones >2cm, and 19.6% performed ureterorenoscopy in single stages for calculi measuring >2.5cm. Some 78.4% use fluoroscopy, 69.1% use a ureteral sheath in all cases, 55.8% place double-J catheters at the end of surgery, 37.3% considered a stone-free state to be 0 fragments, and 41.2% use plain radiography to assess the stone-free condition. Most participating urologists consider flexible ureterorenoscopy the first-choice treatment for stones <2cm; a small percentage of these urologists perform >100 ureterorenoscopies per year. More than half of the urologists routinely used fluoroscopy and a ureteral access sheath; the most common method for determining the stone-free state is plain abdominal radiography. Copyright © 2015 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  7. The historical setting of Latin American bioethics.

    PubMed

    Gracia, D

    1996-12-01

    The historical stages through which Latin American society has passed are at least four: the first, dominated by a particular sort of ethic I have termed the "ethic of the gift;" then the period of conquest, in which the prevalent ethic was one of war and subjection by force, which I call the "ethic of despotism;" followed by the colonial age, in which a new ethical model of "paternalism" emerged; and finally the stage of the "ethic of autonomy," which began with the independence movements of the 18th and 19th centuries and is far from ended. Independence was won by the criollos from European domination with very little participation by the Indian population. The latter was left out of the democratic process and saw itself relegated to a worse situation than in the centuries of colonial rule, for it was no longer protected by the paternalism of the Laws of the Indies of 1542. This is the reason for the division of the Latin American society of the last century into two quite different social strata: one bourgeois, which has assimilated the liberal revolution, enjoys a health care quite similar to that available in any other Western country, and hence faces the same bioethical problems as any developed Western society; the other a very poor stratum, without any economic or social power and hence unable to exercise its civil rights, such as the rights to life and to humane treatment. In this population sector, which is numerically the larger, the major bioethical problems are those of justice and the distribution of scarce resources. The study of Latin American medical ethics can earn for these subjects, whose deplorable condition has been essentially ignored in the bioethics of the first-world countries, the importance they merit.

  8. A Prospective Cohort Multicenter Study of Molecular Epidemiology and Phylogenomics of Staphylococcus aureus Bacteremia in Nine Latin American Countries.

    PubMed

    Arias, Cesar A; Reyes, Jinnethe; Carvajal, Lina Paola; Rincon, Sandra; Diaz, Lorena; Panesso, Diana; Ibarra, Gabriel; Rios, Rafael; Munita, Jose M; Salles, Mauro J; Alvarez-Moreno, Carlos; Labarca, Jaime; Garcia, Coralith; Luna, Carlos M; Mejia-Villatoro, Carlos; Zurita, Jeannete; Guzman-Blanco, Manuel; Rodriguez-Noriega, Eduardo; Narechania, Apurva; Rojas, Laura J; Planet, Paul J; Weinstock, George M; Gotuzzo, Eduardo; Seas, Carlos

    2017-10-01

    Staphylococcus aureus is an important pathogen causing a spectrum of diseases ranging from mild skin and soft tissue infections to life-threatening conditions. Bloodstream infections are particularly important, and the treatment approach is complicated by the presence of methicillin-resistant S. aureus (MRSA) isolates. The emergence of new genetic lineages of MRSA has occurred in Latin America (LA) with the rise and dissemination of the community-associated USA300 Latin American variant (USA300-LV). Here, we prospectively characterized bloodstream MRSA recovered from selected hospitals in 9 Latin American countries. All isolates were typed by pulsed-field gel electrophoresis (PFGE) and subjected to antibiotic susceptibility testing. Whole-genome sequencing was performed on 96 MRSA representatives. MRSA represented 45% of all (1,185 S. aureus) isolates. The majority of MRSA isolates belonged to clonal cluster (CC) 5. In Colombia and Ecuador, most isolates (≥72%) belonged to the USA300-LV lineage (CC8). Phylogenetic reconstructions indicated that MRSA isolates from participating hospitals belonged to three major clades. Clade A grouped isolates with sequence type 5 (ST5), ST105, and ST1011 (mostly staphylococcal chromosomal cassette mec [SCCmec] I and II). Clade B included ST8, ST88, ST97, and ST72 strains (SCCmec IV, subtypes a, b, and c/E), and clade C grouped mostly Argentinian MRSA belonging to ST30. In summary, CC5 MRSA was prevalent in bloodstream infections in LA with the exception of Colombia and Ecuador, where USA300-LV is now the dominant lineage. Clonal replacement appears to be a common phenomenon, and continuous surveillance is crucial to identify changes in the molecular epidemiology of MRSA. Copyright © 2017 American Society for Microbiology.

  9. Long-term abatement potential and current policy trajectories in Latin American countries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, Leon; McFarland, James; Octaviano, Claudia

    2016-05-01

    This paper provides perspectives on the role of Latin America and Latin American countries in meeting global abatement goals, based on the scenarios developed through the CLIMACAP-LAMP modeling study.

  10. Educational Building in Latin America.

    ERIC Educational Resources Information Center

    Baza, Jadille; Vaz, Rita de Cassia Alves; Millan, Eduardo; Almeida, Rodolfo

    2002-01-01

    Presents articles describing recent developments in three Latin American countries (Chile, Brazil, and Venezuela) to expand public education facilities, along with a report on UNESCO's recent seminar in Latin America on architecture for an inclusive education. (EV)

  11. Gene therapy coming of age in Latin America.

    PubMed

    Podhajcer, Osvaldo; Pitossi, Fernando; Agilar-Cordova, Estuardo

    2002-08-01

    "Gene Therapy in Latin America: From the Bench to the Clinic," a meeting sponsored by the Wellcome Trust and the United Nations University through the Biotechnology Program for Latin America and the Caribbean, took place in Buenos Aires, Argentina from May 20 to 22. This symposium, which was hosted by Osvaldo Podhajcer and Fernando Pitossi,had more than 150 basic scientists and physician-scientists from academia, government and industry in Latin America, similar to the first meeting of the Asociacion Iberoamericana de Terapia Génica (Iberoamerican Society of Gene Therapy, AITG) held in Guadalajara, México, two years ago. Participants represented Argentina, Mexico, Brazil, Chile, Uruguay, Costa Rica, Colombia, Venezuela, and Guatemala, with guests from the United States and Europe. All came together to discuss the latest developments in this field in the region. A primary objective of this gathering was to bring together Latin American scientists involved in gene therapy to strengthen continental collaborations and to further disseminate the scientific expertise available in Latin America. The symposium was followed by a 10-day practical course for 25 students from all over Latin America.

  12. Radiology practice in Latin America: a literature review.

    PubMed

    Teague, Jordan

    2013-01-01

    To discover the status and structure of radiology in Latin America with respect to the health care systems it is part of, the effects of socioeconomics, the equipment and technology used, technologists and their training, accreditation, and professional organizations. Health-related databases and Google Scholar were searched for articles concerning radiology practice in Latin America. Articles were selected based on relevance to the research scope. Many regions in Latin America offer little to no access to radiology. Where there is access, the equipment often is old or not functioning, with limited and costly service and maintenance. Most trained technologists live in urban areas. There are no standardized accreditation practices in Latin America. However, forming professional organizations would help promote the practice of radiology and accreditation standards. International cooperative organizations enhance radiology by providing resources and opportunities for cooperation between countries. The current status of radiology in Latin America must be determined. This knowledge will help us discover opportunities for cooperation and ways to improve radiology practice. The main need in Latin America is to extend coverage to the underserved population.

  13. Implementation of an Object-Oriented Flight Simulator D.C. Electrical System on a Hypercube Architecture

    DTIC Science & Technology

    1991-12-01

    abstract data type is, what an object-oriented design is and how to apply "software engineering" principles to the design of both of them. I owe a great... Program (ASVP), a research and development effort by two aerospace contractors to redesign and implement subsets of two existing flight simulators in...effort addresses how to implement a simulator designed using the SEI OOD Paradigm on a distributed, parallel, multiple instruction, multiple data (MIMD

  14. A Portable Parallel Implementation of the U.S. Navy Layered Ocean Model

    DTIC Science & Technology

    1995-01-01

    Wallcraft, PhD (I.C. 1981) Planning Systems Inc. & P. R. Moore, PhD (Camb. 1971) IC Dept. Math. DR Moore 1° Encontro de Métodos Numéricos...Kendall Square, Hypercube, D R Moore 1° Encontro de Métodos Numéricos para Equações de Derivadas Parciais A. J. Wallcraft IC Mathematics...chips: Chips Machine DEC Alpha CrayT3D/E SUN Sparc Fujitsu AP1000 Intel 860 Paragon D R Moore 1° Encontro de Métodos Numéricos para Equações

  15. On k-ary n-cubes: Theory and applications

    NASA Technical Reports Server (NTRS)

    Mao, Weizhen; Nicol, David M.

    1994-01-01

    Many parallel processing networks can be viewed as graphs called k-ary n-cubes, whose special cases include rings, hypercubes and toruses. In this paper, combinatorial properties of k-ary n-cubes are explored. In particular, the problem of characterizing the subgraph of a given number of nodes with the maximum edge count is studied. These theoretical results are then used to compute a lower bounding function in branch-and-bound partitioning algorithms and to establish the optimality of some irregular partitions.
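
    The construction described above is easy to make concrete. The following Python sketch (an illustration, not code from the paper) treats nodes as length-n vectors over Z_k, joins two nodes when they differ by ±1 (mod k) in exactly one coordinate, and counts edges by summing degrees; setting k = 2 recovers the hypercube, n = 1 a ring, and n = 2 a torus.

        from itertools import product

        def kary_ncube_neighbors(node, k):
            # Neighbors differ by +/-1 (mod k) in exactly one coordinate.
            nbrs = set()
            for i, coord in enumerate(node):
                for step in (1, -1):
                    nbr = list(node)
                    nbr[i] = (coord + step) % k
                    if tuple(nbr) != node:  # for k = 2, +1 and -1 coincide
                        nbrs.add(tuple(nbr))
            return nbrs

        def edge_count(k, n):
            # Sum degrees over all k**n nodes; each edge is counted twice.
            deg_sum = sum(len(kary_ncube_neighbors(v, k))
                          for v in product(range(k), repeat=n))
            return deg_sum // 2

        print(edge_count(2, 3))  # ordinary 3-cube: 12 edges
        print(edge_count(4, 2))  # 4 x 4 torus: 32 edges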

  16. Synchronizability of random rectangular graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estrada, Ernesto, E-mail: ernesto.estrada@strath.ac.uk; Chen, Guanrong

    2015-08-15

    Random rectangular graphs (RRGs) represent a generalization of random geometric graphs in which the nodes are embedded in hyperrectangles instead of hypercubes. The synchronizability of the RRG model is studied. Both upper and lower bounds of the eigenratio of the network Laplacian matrix are determined analytically. It is proven that as the rectangular network becomes more elongated, the network becomes harder to synchronize. The synchronization behavior of an RRG network of chaotic Lorenz system nodes is numerically investigated, showing complete consistency with the theoretical results.
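
    The eigenratio studied in the paper is straightforward to estimate numerically. The sketch below is a hedged illustration, not the authors' code, and the parameter values are arbitrary choices: it places n nodes uniformly in a unit-area a × b rectangle, connects pairs closer than a radius r, and returns the Laplacian eigenratio λ_N/λ_2, where larger values indicate a network that is harder to synchronize.

        import numpy as np

        def rrg_eigenratio(n=200, a=2.0, r=0.3, seed=0):
            rng = np.random.default_rng(seed)
            b = 1.0 / a                                  # unit-area a x b rectangle
            pts = rng.uniform((0.0, 0.0), (a, b), size=(n, 2))
            dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
            adj = (dist < r) & ~np.eye(n, dtype=bool)    # random geometric connection rule
            lap = np.diag(adj.sum(axis=1)) - adj.astype(float)
            eig = np.sort(np.linalg.eigvalsh(lap))
            if eig[1] < 1e-9:                            # graph is disconnected
                return float("inf")
            return eig[-1] / eig[1]                      # eigenratio lambda_N / lambda_2

        # Elongating the rectangle tends to raise the eigenratio,
        # i.e., the network becomes harder to synchronize.
        for a in (1.0, 2.0, 4.0):
            print(a, rrg_eigenratio(a=a))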

  17. Proceedings of the Image Understanding Workshop (18th) Held in Cambridge, Massachusetts on 6-8 April 1988. Volume 1

    DTIC Science & Technology

    1988-04-01

    hypercube; methods, which work best in structured scenes, and parent/child operations run in a smaller fixed time independent spatiotemporal energy...rectangle defines a new coordinate system for it, image is constructed under orthographic projection. The child links that is relative to its own...that the child rectangle can shift in the X-Y plane, rectangle in the image. At first, a noiseless image is created relative to the nominal

  18. Latin America’s New Security Reality: Irregular Asymmetric Conflict and Hugo Chavez

    DTIC Science & Technology

    2007-08-01

    Bolivarian dream (the idea of a Latin American Liberation Movement against U.S. economic and political imperialism). Yet, most North Americans dismiss...imperialism of the North American “Colossus” (the United States). Chavez argues that liberation, New Socialism, and Bolivarianismo (the dream of a Latin...plan for gaining revolutionary power. In pursuit of this Bolivarian dream , Chavez has stirred the imaginations of many Latin Americans—especially the

  19. Psychotherapy research in developing countries: the case of Latin America.

    PubMed

    de la Parra, Guillermo

    2013-01-01

    This article describes the state of psychotherapy research in Latin America, a developing region, by means of its scientific production as reflected in international and Latin American publications, as well as through a survey and in-depth interviews with clinicians and researchers from the region. The Latin American publication rate is still low in international journals, which stands in contrast with the high level of publication within the region. The survey reveals an interest in research as well as a limited use of research in clinical practice, while exposing the difficulties of researching and publishing. The in-depth interviews, which cover most of the region, specify Latin America's shortcomings and obstacles to researching, as well as its facilitating factors and comparative advantages. The results are discussed within the framework of the colonizer-colonized relationship, Latin American identity, and the creative integration of developed and developing regions.

  20. Review: Malaria Chemoprophylaxis for Travelers to Latin America

    PubMed Central

    Steinhardt, Laura C.; Magill, Alan J.; Arguin, Paul M.

    2011-01-01

    Because of recent declining malaria transmission in Latin America, some authorities have recommended against chemoprophylaxis for most travelers to this region. However, the predominant parasite species in Latin America, Plasmodium vivax, can form hypnozoites sequestered in the liver, causing malaria relapses. Additionally, new evidence shows the potential severity of vivax infections, warranting continued consideration of prophylaxis for travel to Latin America. Individualized travel risk assessments are recommended and should consider travel locations, type, length, and season, as well as probability of itinerary changes. Travel recommendations might include no precautions, mosquito avoidance only, or mosquito avoidance and chemoprophylaxis. There are a range of good options for chemoprophylaxis in Latin America, including atovaquone-proguanil, doxycycline, mefloquine, and—in selected areas—chloroquine. Primaquine should be strongly considered for nonpregnant, G6PD-nondeficient patients traveling to vivax-endemic areas of Latin America, and it has the added benefit of being the only drug to protect against malaria relapses. PMID:22144437

  1. Two-dimensional nonsteady viscous flow simulation on the Navier-Stokes computer miniNode

    NASA Technical Reports Server (NTRS)

    Nosenchuck, Daniel M.; Littman, Michael G.; Flannery, William

    1986-01-01

    The needs of large-scale scientific computation are outpacing the growth in performance of mainframe supercomputers. In particular, problems in fluid mechanics involving complex flow simulations require far more speed and capacity than that provided by current and proposed Class VI supercomputers. To address this concern, the Navier-Stokes Computer (NSC) was developed. The NSC is a parallel-processing machine, comprised of individual Nodes, each comparable in performance to current supercomputers. The global architecture is that of a hypercube, and a 128-Node NSC has been designed. New architectural features, such as a reconfigurable many-function ALU pipeline and a multifunction memory-ALU switch, have provided the capability to efficiently implement a wide range of algorithms. Efficient algorithms typically involve numerically intensive tasks, which often include conditional operations. These operations may be efficiently implemented on the NSC without, in general, sacrificing vector-processing speed. To illustrate the architecture, programming, and several of the capabilities of the NSC, the simulation of two-dimensional, nonsteady viscous flows on a prototype Node, called the miniNode, is presented.
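
    The hypercube interconnect mentioned in the abstract admits a simple binary addressing scheme, sketched below in Python as a generic illustration (this is not NSC software): node IDs in a d-dimensional hypercube are d-bit integers, each node links to the d IDs obtained by flipping a single bit, and a 128-Node machine corresponds to d = 7.

        def hypercube_neighbors(node_id, dim):
            # In a dim-dimensional hypercube, neighbors differ in exactly one bit.
            return [node_id ^ (1 << bit) for bit in range(dim)]

        # A 128-node hypercube is 7-dimensional; node 0 links to 1, 2, 4, ..., 64.
        print(hypercube_neighbors(0, 7))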

  2. Language and style: A barrier to neurosurgical research and advancement in Latin America

    PubMed Central

    Ashfaq, Adeel; Lazareff, Jorge

    2017-01-01

    Background: The neurosurgical burden in Latin America is understudied and likely underestimated; thus, it is imperative to improve the quality, training, and delivery of neurosurgical care. A significant aspect of this endeavor is for Latin America to become an integral part of the global neurosurgical community; however, there is a paucity of ideology and literature coming from Central and South America. We sought to explore neurosurgical dialogue originating from Latin America as well as barriers to the advancement of neurosurgery in this region. Methods: We conducted a systematic literature review exploring research originating in Latin America in three international neurosurgical journals – Journal of Neurosurgery, Surgical Neurology International, and World Neurosurgery. We utilized PubMed search algorithms to identify articles. Inclusion criteria included publication within the three aforementioned journals, author affiliation with Latin American institutions, and publication within the specified time frame of January 2014 to July 2017. Results: There were 7469 articles identified that met the search criteria. Of these 7469 articles, 326 (4.4%) were from Latin American nations. Conclusion: Our data suggest a relatively low percentage of neurosurgical research originating from Latin America, indicating a significant lack of participation in the global neurosurgical community. Barriers to global scientific communication include language, rhetorical style, culture, history, biases, funding, and governmental support. Despite these challenges, Latin America is making strides toward improvement, including the development of neurosurgical societies as well as international collaborative training and research programs. We consider our report a valid initiation of discussion of the broader issue of neurosurgical communication. PMID:29404195

  3. Evolutionary Developmental Biology (Evo-Devo) Research in Latin America.

    PubMed

    Marcellini, Sylvain; González, Favio; Sarrazin, Andres F; Pabón-Mora, Natalia; Benítez, Mariana; Piñeyro-Nelson, Alma; Rezende, Gustavo L; Maldonado, Ernesto; Schneider, Patricia Neiva; Grizante, Mariana B; Da Fonseca, Rodrigo Nunes; Vergara-Silva, Francisco; Suaza-Gaviria, Vanessa; Zumajo-Cardona, Cecilia; Zattara, Eduardo E; Casasa, Sofia; Suárez-Baron, Harold; Brown, Federico D

    2017-01-01

    Famous for its blind cavefish and Darwin's finches, Latin America is home to some of the richest biodiversity hotspots of our planet. The Latin American fauna and flora inspired and captivated naturalists of the nineteenth and twentieth centuries, including such notable pioneers as Fritz Müller, Florentino Ameghino, and Léon Croizat, who made significant contributions to the study of embryology and evolutionary thinking. But what are the historical and present contributions of the Latin American scientific community to Evo-Devo? Here, we provide the first comprehensive overview of the Evo-Devo laboratories based in Latin America and describe current lines of research based on endemic species, focusing on body plans and patterning, systematics, physiology, computational modeling approaches, ecology, and domestication. Literature searches reveal that Evo-Devo in Latin America is still in its early days; while showing encouraging indicators of productivity, it has not yet stabilized, because it relies on few and sparsely distributed laboratories. Coping with rapid changes in national scientific policies and contributing to solving social and health issues specific to each region are among the main challenges faced by Latin American researchers. The 2015 inaugural meeting of the Pan-American Society for Evolutionary Developmental Biology played a pivotal role in bringing together Latin American researchers eager to initiate and consolidate regional and worldwide collaborative networks. Such networks will undoubtedly advance research on the extremely high genetic and phenotypic biodiversity of Latin America, bound to be an almost infinite source of amazement and fascinating findings for the Evo-Devo community. © 2016 Wiley Periodicals, Inc.

  4. Detection and follow-up of chronic obstructive pulmonary disease (COPD) and risk factors in the Southern Cone of Latin America: the pulmonary risk in South America (PRISA) study.

    PubMed

    Rubinstein, Adolfo L; Irazola, Vilma E; Bazzano, Lydia A; Sobrino, Edgardo; Calandrelli, Matías; Lanas, Fernando; Lee, Alison G; Manfredi, Jose A; Olivera, Héctor; Ponzo, Jacqueline; Seron, Pamela; He, Jiang

    2011-06-01

    The World Health Organization has estimated that by 2030, chronic obstructive pulmonary disease will be the third leading cause of death worldwide. Most knowledge of chronic obstructive pulmonary disease is based on studies performed in Europe or North America, and little is known about the prevalence, patient characteristics, and change in lung function over time in patients in developing countries, such as those of Latin America. This lack of knowledge stands in sharp contrast to the high levels of tobacco consumption and exposure to biomass fuels in Latin America, both major risk factors for the development of chronic obstructive pulmonary disease. Studies have also demonstrated that most Latin American physicians frequently do not follow international chronic obstructive pulmonary disease diagnostic and treatment guidelines. The PRISA Study will expand current knowledge regarding chronic obstructive pulmonary disease and its risk factors in Argentina, Chile, and Uruguay to inform policy makers and health professionals on the best policies and practices to address this condition. PRISA is an observational, prospective cohort study with at least four years of follow-up. In the first year, PRISA employed a randomized three-stage stratified cluster sampling strategy to identify 6,000 subjects from Marcos Paz and Bariloche, Argentina; Temuco, Chile; and Canelones, Uruguay. Information such as comorbidities, socioeconomic status, and tobacco and biomass exposure will be collected, and spirometry, anthropometric measurements, blood sampling, and electrocardiogram will be performed. In year four, subjects will have repeat measurements taken. There are no longitudinal data on chronic obstructive pulmonary disease incidence and risk factors in the Southern Cone of Latin America; therefore, this population-based prospective cohort study will fill knowledge gaps in the prevalence and incidence of chronic obstructive pulmonary disease, patient characteristics, and changes in lung function over time, as well as quality of life and health care resource utilization. Information gathered during the PRISA Study will inform public health interventions and prevention practices to reduce the risk of COPD in the region.

  5. Women and development in Latin America and the Caribbean. Lessons from the seventies and hopes for the future.

    PubMed

    Arizpe, L

    1982-01-01

    The early implicit assumptions that industrialization or, generally, modernization should automatically improve the condition of women have been challenged more and more by research and statistical data. In Latin America and the Caribbean, the theory which held that the cultural assimilation of ethnic groups of Indian and African descent into the national Hispanic or Portuguese cultures implied an improvement in the condition of women has been challenged through ethnographic and historical research. Women in closed corporate communities may have higher status, greater participation in authority, and more support from their children than those in open mestizo communities, where excessive alcohol consumption and abusive sexual relations form an integral part of the psychosocial complex of "machismo." New research has dealt with the forced integration of black women and Indian women, as concubines of the dominant white men, as a mechanism of "mestizaje," i.e., mixing of the population, against which women had no legal or "de facto" defense. Such abuse of women, masked by racial and cultural prejudice, continues in many backward rural areas in Latin America. In discussions of the peasantry and of rural development in Latin America and the Caribbean, women had been largely ignored because agriculture was conceptualized as an exclusively male activity. This androcentric view is reflected in census categories that make the component of women's labor in agriculture invisible or unimportant. Consequently, the statistical percentages have always been unrealistically low in most countries. Detailed observations and surveys conducted during the last decade have shown, to the contrary, that peasant women work longer hours than men and are more liable to increase their time and work load to offset pauperization. The research of Deere and Leon (Colombia) as well as that of other women in different countries of the region confirms that women's subordination precedes capitalism and is further used by this system of production for its own ends. Priorities in the Western feminist movements in the 1970s have been equal pay for equal work and sexual and psychological autonomy. In the 3rd world the priorities have been the right to adequate employment and to primary services such as schools, drinking water, housing, and medical services. The main strategy for women in Latin America and the Caribbean has been to participate alongside men in political movements seeking to attain national sovereignty or to challenge economic inequalities, both internally and internationally, as a precondition to the setting up of women's demands as a gender group. The research makes it clear that dependent capitalist development brings an added burden of poverty and subordination to women. Strategies to advance women must be assessed within their particular context.

  6. Thermal equilibrium of Nellore cattle in tropical conditions: an investigation of circadian pattern

    USDA-ARS?s Scientific Manuscript database

    The aim of this work was to evaluate the diurnal patterns of physiological responses and the thermal regulation of adult Nellore bulls. Six 30-mo-old Nellore bulls (669 ± 65 kg BW) were randomly assigned to four 6-h periods in a Latin Square design such that measurements of each animal cover a 24-h ...

  7. Compulsory Education Laws or Incentives from Conditional Cash Transfer Programs? Explaining the Rise in Secondary School Attendance Rate in Argentina

    ERIC Educational Resources Information Center

    Edo, María; Marchionni, Mariana; Garganta, Santiago

    2017-01-01

    Argentina has traditionally stood out in terms of educational outcomes among its Latin American counterparts. Schooling of older children, however, still shows room for improvement especially among the more vulnerable. Fortunately, during the last years a sizeable improvement in attendance rates for children aged 15 through 17 took place. This…

  8. Policy Shocks: On the Legal Auspices of Latin American Migration to the United States

    PubMed Central

    Riosmena, Fernando

    2011-01-01

    In this paper, I compare the transition into legal permanent residence (LPR) of Mexicans, Dominicans, and Nicaraguans. Dominicans had the highest likelihood of obtaining residence, mostly sponsored by parents and spouses. Mexicans had the lowest LPR transition rates and presented sharp gender differentials in modes: women mostly legalized through husbands, while men were sponsored through IRCA or parents. Nicaraguans stood in between, presenting few gender differences in rates and modes of transition and a heavy dependence on asylum and special provisions such as IRCA and NACARA. I argue these patterns stem from the interplay of the conditions favoring the emigration of, and the specific immigration policy context faced by, migrant pioneers; the influence of social networks in reproducing the legal character of flows; and differences in the actual use of kinship ties as sponsors. I discuss the implications of these trends for the observed gendered patterns of migration from Latin America. PMID:21921965

  9. Children's environment and health in Latin America: the Ecuadorian case.

    PubMed

    Harari, Raul; Harari, Homero

    2006-09-01

    Environmental health problems of children in Latin America and Ecuador are complex due to the close relationship that exists between social and environmental factors. Extended poverty and basic problems, such as the lack of drinking water and sanitation, are common. Infectious diseases are the greatest cause of morbidity and mortality among children. Development in industry and the introduction of chemical substances in agriculture add new risks including pesticide use, heavy metal exposure, and air pollution. Major problems can be divided into (a) lack of basic infrastructure, (b) poor living conditions, (c) specific environmental problems, and (d) child labor. Reproductive health disorders are frequent in developing countries like Ecuador. Issues related to children's health should consider new approaches, creative methodologies, and the search for independent predictors to separate environmental from social problems. Only with knowledge of the specific contribution of each factor, can it be possible to develop a strategy for prevention.

  10. [Quality of life in Latin American immigrant caregivers in Spain].

    PubMed

    Bover, Andreu; Taltavull, Joana Maria; Gastaldo, Denise; Luengo, Raquel; Izquierdo, María Dolores; Juando-Prats, Clara; Sáenz de Ormijana, Amaia; Robledo, Juana

    2015-01-01

    To describe perceived quality of life among Latin American caregivers working in Spain and how it varies in relation to certain variables shared by this group. We used the SF-36 to measure perceived quality of life in 517 women residing in five Spanish regions: the Balearic Islands, Catalonia, the Basque Country, the Canary Islands, and Madrid. Several variables related to the socio-demographic profile and migration process were studied using Student's t test, ANOVA, and linear regression models. The participants scored very low on the dimensions of physical and emotional roles. The factors associated with lower quality of life scores within the group were working as a live-in caregiver, lack of a contract, multitasking, irregular status, and younger age. The vulnerability of these women can be explained by poor working conditions and other factors related to the migratory process. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.

  11. Easing the burden of rural women: a 16-hour workday.

    PubMed

    Fagley, R M

    1976-01-01

    Women are the second-class citizens of the developing countries, especially in rural areas. Not until the status of women is upgraded in these areas will the struggle for better nutrition, for smaller families, and for general social development be successful. The reasons why women have been neglected so far are discussed. Women in developing countries suffer from a lack of power. They can be helped by women in affluent societies. Information on the status of women in various Asian, African, and Latin American countries was solicited and is presented. Obstacles to improvement in the condition of women include: 1) continual childbearing, 2) traditional values, 3) social pressures, and 4) the machismo philosophy. Recommendations are made for ways to aid the situation of women in Africa, Asia, and Latin America. Some beginning efforts in this direction are mentioned.

  12. Respiratory health in Latin America: number of specialists and human resources training.

    PubMed

    Vázquez-García, Juan-Carlos; Salas-Hernández, Jorge; Pérez Padilla, Rogelio; Montes de Oca, María

    2014-01-01

    Latin America is made up of a number of developing countries. Demographic changes are occurring among its nearly 600 million inhabitants, combining significant population growth with progressive ageing. This part of the world poses great challenges for general and respiratory health. Most of the countries have significant, or even greater, rates of chronic respiratory diseases or exposure to risk. Human resources in healthcare are not readily available, particularly respiratory disease specialists. Academic training centers are few, or even non-existent, in the majority of the countries. The detailed analysis of these conditions provides a basis for reflection on the main challenges, and for proposals for the management and training of better human resources in this specialist area. Copyright © 2013 SEPAR. Published by Elsevier España. All rights reserved.

  13. LC-MS determination of steroidal glycosides from Dioscorea deltoidea Wall cell suspension culture: Optimization of pre-LC-MS procedure parameters by Latin Square design.

    PubMed

    Sarvin, Boris; Fedorova, Elizaveta; Shpigun, Oleg; Titova, Maria; Nikitin, Mikhail; Kochkin, Dmitry; Rodin, Igor; Stavrianidi, Andrey

    2018-03-30

    In this paper, an ultrasound-assisted extraction method for the isolation of steroidal glycosides from D. deltoidea plant cell suspension culture, with subsequent HPLC-MS determination, was developed. After the organic solvent was selected via a two-factor experiment, the optimization was carried out via a Latin Square 4 × 4 experimental design for the following parameters: extraction time, organic solvent concentration in the extraction solution, and the ratio of solvent to sample. It was also shown that the ultrasound-assisted extraction method is not suitable for the isolation of steroidal glycosides from D. deltoidea plant material. The results were double-checked using the multiple successive extraction method and refluxing extraction. Optimal conditions for the extraction of steroidal glycosides by the ultrasound-assisted extraction method were: extraction time, 60 min; acetonitrile (in water) concentration in the extraction solution, 50%; ratio of solvent to sample, 400 mL/g. The developed method was also tested on D. deltoidea cell suspension cultures of different terms and conditions of cultivation. The completeness of the extraction was confirmed using the multiple successive extraction method. Copyright © 2018 Elsevier B.V. All rights reserved.
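
    As a concrete illustration of how a 4 × 4 Latin Square design screens three four-level factors in only 16 runs, the Python sketch below builds a cyclic Latin square in which rows index extraction time, columns index solvent concentration, and the square's symbol selects the solvent-to-sample ratio, so each ratio level appears exactly once per row and per column. The level grids are hypothetical choices for illustration, picked only so that they include the reported optimum (60 min, 50%, 400 mL/g); they are not taken from the paper.

        time_min = [15, 30, 45, 60]    # extraction time levels, min (illustrative)
        acn_pct = [25, 50, 75, 100]    # solvent concentration levels, % (illustrative)
        ratio = [100, 200, 300, 400]   # solvent-to-sample ratio levels, mL/g (illustrative)

        # Cyclic 4 x 4 Latin square: symbol (i + j) % 4 appears once per row and column.
        runs = [(time_min[i], acn_pct[j], ratio[(i + j) % 4])
                for i in range(4) for j in range(4)]

        for t, c, q in runs:
            print(f"time={t:3d} min  solvent={c:3d}%  ratio={q:3d} mL/g")

    Sixteen runs stand in for the 64 of a full 4 × 4 × 4 factorial, at the cost of assuming that interactions between the three factors are negligible.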

  14. [Key points for the management of dermatitis in Latin America. The SLAAI Consensus].

    PubMed

    Sánchez, Jorge; Páez, Bruno; Macías-Weinmann, Alejandra; De Falco, Alicia

    2015-01-01

    The incidence of atopic dermatitis in Latin America, as in other regions, has been increasing in recent years. The SLAAI consensus is based on a systematic search for articles related to dermatitis, with a focus on pathophysiology and treatment and their impact in Latin America, reviewed using the Delphi methodology (Revista Alergia Mexico 2014;61:178-211). In this article we highlight the key points of the consensus and particular considerations for Latin America.

  15. Latin American Network of students in Atmospheric Sciences and Meteorology

    NASA Astrophysics Data System (ADS)

    Cuellar-Ramirez, P.

    2017-12-01

    The Latin American Network of Students in Atmospheric Sciences and Meteorology (RedLAtM) is a nonprofit civil organization run by students from Mexico and other Latin American countries. As a growing organization providing human resources in the field of meteorology at the regional level, the RedLAtM seeks to support the development of education and research in Atmospheric Sciences and Meteorology, and to engage young people and promote their integration toward a common and imminent future: confronting the still largely unstudied weather and climate events occurring in Latin America. The RedLAtM emerged from the observation that connections among Latin American countries around research in Atmospheric Sciences and Meteorology are limited. The case for its creation rests on cooperation, networking, research, and development in Latin America and Mexico; in other words, on joining efforts and establishing a regional scientific integration that leads to technological progress in the Atmospheric Sciences and Meteorology. As its ultimate goal, the RedLAtM pursues the development of climatic and meteorological services for countries unable to run their own programs, as well as projects linked with the governments of Latin American countries and private companies to improve prevention strategies, research, and decision making, all with the aim of enhancing the quality of life of the region's inhabitants in the face of problems such as poverty and inequality.

  16. The reality of scientific research in Latin America; an insider's perspective.

    PubMed

    Ciocca, Daniel R; Delgado, Gabriela

    2017-11-01

    There is tremendous disparity in scientific productivity among nations, particularly in Latin America. At first sight, this could be linked to the relative economic health of the different countries of the region, but even large and relatively rich Latin American countries do not produce a good level of science. Although Latin America has increased the number of its scientists and research institutions in recent years, the gap between developed countries and Latin American countries is startling. The prime importance of science and technology to the development of a nation remains unacknowledged. The major factors contributing to low scientific productivity are the limited access to grant opportunities, inadequate budgets, substandard levels of laboratory infrastructure and equipment, the high cost and limited supply of reagents, and inadequate salaries and personal insecurity of scientists. The political and economic instability in several Latin America countries results in a lack of long-term goals that are essential to the development of science. In Latin America, science is not an engine of the economy. Most equipment and supplies are imported, and national industries are not given the incentives to produce these goods at home. It is a pity that Latin American society has become accustomed to expecting new science and technological developments to come from developed countries rather than from their own scientists. In this article, we present a critical view of the Latin American investigator's daily life, particularly in the area of biomedicine. Too many bright young minds continue to leave Latin America for developed countries, where they are very successful. However, we still have many enthusiastic young graduates who want to make a career in science and contribute to society. Governments need to improve the status of science for the sake of these young graduates who represent the intellectual and economic future of their countries.

  17. Recommendations for the diagnosis of candidemia in Latin America. Latin America Invasive Mycosis Network.

    PubMed

    Colombo, Arnaldo Lopes; Cortes, Jorge Alberto; Zurita, Jeannete; Guzman-Blanco, Manuel; Alvarado Matute, Tito; de Queiroz Telles, Flavio; Santolaya, María E; Tiraboschi, Iris Nora; Echevarría, Juan; Sifuentes, Jose; Thompson-Moya, Luis; Nucci, Marcio

    2013-01-01

    Candidemia is one of the most frequent opportunistic mycoses worldwide. Limited epidemiological studies in Latin America indicate that incidence rates are higher in this region than in the Northern Hemisphere. Diagnosis is often made late in the infection, affecting the initiation of antifungal therapy. A more scientific approach, based on specific parameters, for diagnosis and management of candidemia in Latin America is warranted. 'Recommendations for the diagnosis and management of candidemia' are a series of manuscripts that have been developed by members of the Latin America Invasive Mycosis Network. They aim to provide a set of best-evidence recommendations for the diagnosis and management of candidemia. This publication, 'Recommendations for the diagnosis of candidemia in Latin America', was written to provide guidance to healthcare professionals on the diagnosis of candidemia, as well as on the usefulness and application of susceptibility testing in patients who have a confirmed diagnosis of candidemia. Computerized searches of existing literature were performed using PubMed. The data were extensively reviewed and analyzed by members of the group. The group also met on two occasions to pose questions, discuss conflicting views, and deliberate on a series of management recommendations. 'Recommendations for the diagnosis of candidemia in Latin America' includes diagnostic methods used to detect candidemia, Candida species identification, and susceptibility testing. The availability of methods, their costs and treatment settings are considered. This manuscript is the first of this series that deals with diagnosis and treatment of invasive candidiasis. Other publications in this series include: 'Recommendations for the management of candidemia in adults in Latin America', 'Recommendations for the management of candidemia in children in Latin America', and 'Recommendations for the management of candidemia in neonates in Latin America'. Copyright © 2013 Revista Iberoamericana de Micología. Published by Elsevier España. All rights reserved.

  18. Science in Latin America.

    ERIC Educational Resources Information Center

    Ayala, Francisco J.

    1995-01-01

    A brief history of science and technology in Latin America that begins with the Mayan civilization and progresses through the colonial period to the present. Compares increased scientific productivity in the Latin American and Caribbean regions to productivity in the United States and European Union. (LZ)

  19. History of rehabilitation in Latin America.

    PubMed

    Sotelano, Fernando

    2012-04-01

    Rehabilitation in Latin America was pioneered in the 1940s by orthopedists who envisioned the need for the integration of people with disabilities into society. The objective of this review is to discuss the evolution of rehabilitation in Latin America during the last few decades. This review is divided into the following sections: (1) prehistory, (2) the beginning, (3) common features in different countries, (4) the beginning and consolidation of the specialty, (5) the Latin American Medical Association of Rehabilitation, and (6) journals published by different countries.

  20. Perspectives of intellectual disability in Latin American countries: epidemiology, policy, and services for children and adults.

    PubMed

    Mercadante, Marcos T; Evans-Lacko, Sara; Paula, Cristiane S

    2009-09-01

    The prevalence of intellectual disability is an estimated 1-4% worldwide. Etiological factors such as malnutrition, lack of perinatal care, and exposure to toxic and infectious agents, which are more common in low-income and middle-income (LAMI) countries, may contribute to a higher prevalence of intellectual disability in Latin America. This review summarizes the data on intellectual disability coming from Latin America, which is published in scientific journals and is available from official websites and discusses potential health policy and services implications of these studies. Methodologically rigorous studies on intellectual disability in Latin America are lacking. This paucity of basic epidemiological information is a barrier to policy and services development and evaluation around intellectual disability. Only two studies, one from Chile and another from Jamaica, allow for adequate population estimates of intellectual disability. Interestingly, the countries with the highest scientific production in Latin America, Brazil and Mexico, did not produce the most informative research in epidemiology, policy or services related to intellectual disability. The main conclusion of this review is that a lack of scientific evidence makes it difficult to properly characterize the context of intellectual disability in Latin America. Insufficient data is also a barrier to policy and services development for governments in Latin America. Although recently there have been efforts to develop government programs to meet the needs of the intellectual disability population in Latin America, the effectiveness of these programs is questionable without proper evaluation. There is a need for studies that characterize the needs of people with intellectual disability specifically in Latin America, and future research in this area should emphasize how it can inform current and future policies and services for people with intellectual disability.
