NASA Astrophysics Data System (ADS)
Cannon, Alex J.
2018-01-01
Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin, particularly for annual maxima of the FWI distribution and spatiotemporal autocorrelation of precipitation fields.
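As a rough illustration of the core N-dimensional pdf transform that MBCn builds on, the following numpy sketch repeatedly rotates the data with a random orthogonal matrix, quantile-maps each rotated coordinate of the model sample onto the observations, and rotates back. It assumes observed and model samples of equal length with shape (time, variables); the published MBCn additionally uses quantile delta mapping so that model-projected changes in quantiles are preserved, which is omitted here, and the function names are illustrative.

```python
import numpy as np

def quantile_map(model, obs):
    """Empirical quantile mapping: give each model value the obs value of the same rank.
    Assumes model and obs have equal length (sufficient for this sketch)."""
    ranks = np.argsort(np.argsort(model))
    return np.sort(obs)[ranks]

def npdft(obs, mod, n_iter=30, seed=0):
    """Transfer the multivariate distribution of obs onto mod (both (time, variables))."""
    rng = np.random.default_rng(seed)
    d = obs.shape[1]
    mod = mod.astype(float).copy()
    for _ in range(n_iter):
        # random orthogonal (Haar-distributed) rotation
        q, r = np.linalg.qr(rng.standard_normal((d, d)))
        rot = q * np.sign(np.diag(r))
        obs_r, mod_r = obs @ rot, mod @ rot
        for j in range(d):                      # univariate quantile mapping per rotated coordinate
            mod_r[:, j] = quantile_map(mod_r[:, j], obs_r[:, j])
        mod = mod_r @ rot.T                     # rotate back
    return mod
```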
Wen, Xiaotong; Rangarajan, Govindan; Ding, Mingzhou
2013-01-01
Granger causality is increasingly being applied to multi-electrode neurophysiological and functional imaging data to characterize directional interactions between neurons and brain regions. For a multivariate dataset, one might be interested in different subsets of the recorded neurons or brain regions. According to the current estimation framework, for each subset, one conducts a separate autoregressive model fitting process, introducing the potential for unwanted variability and uncertainty. In this paper, we propose a multivariate framework for estimating Granger causality. It is based on spectral density matrix factorization and offers the advantage that the estimation of such a matrix needs to be done only once for the entire multivariate dataset. For any subset of recorded data, Granger causality can be calculated through factorizing the appropriate submatrix of the overall spectral density matrix. PMID:23858479
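For a two-channel subset, once the appropriate submatrix of the spectral density matrix has been factorized (e.g. by Wilson's algorithm, on which this framework relies) into a transfer function H(f) and noise covariance Σ, Geweke's frequency-domain Granger causality follows directly. The sketch below shows only that final step; the factorization itself, the function name, and the two-channel restriction are assumptions of the illustration.

```python
import numpy as np

def granger_y_to_x(S, H, Sigma):
    """Geweke's spectral Granger causality from channel 1 (y) to channel 0 (x),
    given S(f) = H(f) Sigma H(f)^* from spectral matrix factorization.
    S, H: arrays of shape (n_freq, 2, 2); Sigma: (2, 2) noise covariance."""
    # noise power of y after removing its instantaneous correlation with x
    sigma_cond = Sigma[1, 1] - Sigma[0, 1] ** 2 / Sigma[0, 0]
    s_xx = np.real(S[:, 0, 0])
    intrinsic = s_xx - sigma_cond * np.abs(H[:, 0, 1]) ** 2
    return np.log(s_xx / intrinsic)      # causality spectrum over frequency
```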
An Individualized Student Term Project for Multivariate Calculus
ERIC Educational Resources Information Center
Gordon, Sheldon P.
2004-01-01
In this article, the author describes an individualized term project that is designed to increase student understanding of some of the major concepts and methods in multivariate calculus. The project involves having each student conduct a complete max-min analysis of a third degree polynomial in x and y that is based on his or her social security…
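A complete max-min analysis of such a cubic can be scripted; the sketch below uses sympy with an arbitrary illustrative polynomial (in the project, each student's coefficients would instead come from his or her own digits).

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x**3 - 3*x*y + y**3          # illustrative third-degree polynomial in x and y

crit = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True)
crit = [p for p in crit if all(v.is_real for v in p.values())]   # keep real critical points
H = sp.hessian(f, (x, y))
for pt in crit:
    Hp = H.subs(pt)
    det, fxx = Hp.det(), Hp[0, 0]
    if det < 0:
        kind = "saddle point"
    elif det > 0:
        kind = "local minimum" if fxx > 0 else "local maximum"
    else:
        kind = "degenerate (second-derivative test inconclusive)"
    print(pt, kind)
```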
Early experiences building a software quality prediction model
NASA Technical Reports Server (NTRS)
Agresti, W. W.; Evanco, W. M.; Smith, M. C.
1990-01-01
Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
Hair product use, age at menarche and mammographic breast density in multiethnic urban women.
McDonald, Jasmine A; Tehranifar, Parisa; Flom, Julie D; Terry, Mary Beth; James-Todd, Tamarra
2018-01-04
Select hair products contain endocrine disrupting chemicals (EDCs) that may affect breast cancer risk. We hypothesize that, if EDCs are related to breast cancer risk, then they may also affect two important breast cancer risk factors: age at menarche and mammographic breast density. In two urban female cohorts (N = 248): 1) the New York site of the National Collaborative Perinatal Project and 2) the New York City Multiethnic Breast Cancer Project, we measured childhood and adult use of hair oils, lotions, leave-in conditioners, root stimulators, perms/relaxers, and hair dyes using the same validated questionnaire. We used multivariable relative risk regression models to examine the association between childhood hair product use and early age at menarche (defined as <11 years of age) and multivariable linear regression models to examine the association between childhood and adult hair product use and adult mammographic breast density. Early menarche was associated with ever use of childhood hair products (RR 2.3, 95% CI 1.1, 4.8) and hair oil use (RR 2.5, 95% CI 1.2, 5.2); however, additional adjustment for race/ethnicity attenuated the associations (hair products RR 1.8, 95% CI 0.8, 4.1; hair oil use RR 2.3, 95% CI 1.0, 5.5). Breast density was not associated with adult or childhood hair product or hair oil use. If confirmed in larger prospective studies, these data suggest that exposure to EDCs through hair products in early life may affect breast cancer risk by altering the timing of menarche, and may operate through a mechanism distinct from breast density.
NASA Astrophysics Data System (ADS)
Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.
1995-06-01
A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms and their performance models which are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). The applicability of these algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms which convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics, under the Box-Cox transform, is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm which has an associated Gaussian performance model.
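As a simplified illustration (a marginal, per-band Box-Cox fit rather than the joint estimate derived in the paper), one can Gaussianize strictly positive multispectral data and then compute the multivariate Gaussian statistics of the transformed bands; the names below are placeholders.

```python
import numpy as np
from scipy import stats

def boxcox_gaussian_stats(X):
    """X: (n_samples, n_bands), strictly positive. Returns per-band Box-Cox
    lambdas plus the mean vector and covariance of the transformed data."""
    Xt = np.empty_like(X, dtype=float)
    lambdas = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xt[:, j], lambdas[j] = stats.boxcox(X[:, j])   # maximum-likelihood lambda per band
    return lambdas, Xt.mean(axis=0), np.cov(Xt, rowvar=False)
```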
Lundberg, Frida E; Johansson, Anna L V; Rodriguez-Wallberg, Kenny; Brand, Judith S; Czene, Kamila; Hall, Per; Iliadou, Anastasia N
2016-04-13
Ovarian stimulation drugs, in particular hormonal agents used for controlled ovarian stimulation (COS) required to perform in vitro fertilization, increase estrogen and progesterone levels and have therefore been suspected to influence breast cancer risk. This study aims to investigate whether infertility and hormonal fertility treatment influences mammographic density, a strong hormone-responsive risk factor for breast cancer. Cross-sectional study including 43,313 women recruited to the Karolinska Mammography Project between 2010 and 2013. Among women who reported having had infertility, 1576 had gone through COS, 1429 had had hormonal stimulation without COS and 5958 had not received any hormonal fertility treatment. Percent and absolute mammographic densities were obtained using the volumetric method Volpara™. Associations with mammographic density were assessed using multivariable generalized linear models, estimating mean differences (MD) with 95 % confidence intervals (CI). After multivariable adjustment, women with a history of infertility had 1.53 cm(3) higher absolute dense volume compared to non-infertile women (95 % CI: 0.70 to 2.35). Among infertile women, only those who had gone through COS treatment had a higher absolute dense volume than those who had not received any hormone treatment (adjusted MD 3.22, 95 % CI: 1.10 to 5.33). No clear associations were observed between infertility, fertility treatment and percent volumetric density. Overall, women reporting infertility had more dense tissue in the breast. The higher absolute dense volume in women treated with COS may indicate a treatment effect, although part of the association might also be due to the underlying infertility. Continued monitoring of cancer risk in infertile women, especially those who undergo COS, is warranted.
Density correlators in a self-similar cascade
NASA Astrophysics Data System (ADS)
Bialas, A.; Czyżewski, J.
1999-09-01
Multivariate density moments (correlators) of arbitrary order are obtained for the multiplicative self-similar cascade. This result is based on the calculation by Greiner, Eggers and Lipa where the correlators of the logarithms of the particle densities have been obtained. The density correlators, more suitable for comparison with multiparticle data, appear to have a simple factorizable form.
Balabin, Roman M; Lomakina, Ekaterina I
2011-04-21
In this study, we make a general comparison of the accuracy and robustness of five multivariate calibration models: partial least squares (PLS) regression or projection to latent structures, polynomial partial least squares (Poly-PLS) regression, artificial neural networks (ANNs), and two novel techniques based on support vector machines (SVMs) for multivariate data analysis: support vector regression (SVR) and least-squares support vector machines (LS-SVMs). The comparison is based on fourteen (14) different datasets: seven sets of gasoline data (density, benzene content, and fractional composition/boiling points), two sets of ethanol gasoline fuel data (density and ethanol content), one set of diesel fuel data (total sulfur content), three sets of petroleum (crude oil) macromolecules data (weight percentages of asphaltenes, resins, and paraffins), and one set of petroleum resins data (resins content). Vibrational (near-infrared, NIR) spectroscopic data are used to predict the properties and quality coefficients of gasoline, biofuel/biodiesel, diesel fuel, and other samples of interest. The four systems presented here range greatly in composition, properties, strength of intermolecular interactions (e.g., van der Waals forces, H-bonds), colloid structure, and phase behavior. Due to the high diversity of chemical systems studied, general conclusions about SVM regression methods can be made. We try to answer the following question: to what extent can SVM-based techniques replace ANN-based approaches in real-world (industrial/scientific) applications? The results show that both SVR and LS-SVM methods are comparable to ANNs in accuracy. Due to the much higher robustness of the former, the SVM-based approaches are recommended for practical (industrial) application. This has been shown to be especially true for complicated, highly nonlinear objects.
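A comparison of this kind can be set up in a few lines with scikit-learn; this is not the authors' code, and the component count, kernel and C value are placeholder hyperparameters that would normally be tuned per dataset.

```python
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def compare_calibrations(X, y):
    """X: NIR spectra (n_samples, n_wavelengths); y: property such as density."""
    models = {
        "PLS": PLSRegression(n_components=10),
        "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    }
    for name, model in models.items():
        rmse = -cross_val_score(model, X, y, cv=5,
                                scoring="neg_root_mean_squared_error").mean()
        print(f"{name}: cross-validated RMSE = {rmse:.3f}")
```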
PYCHEM: a multivariate analysis package for python.
Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston
2006-10-15
We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
Multivariate Density Estimation and Remote Sensing
NASA Technical Reports Server (NTRS)
Scott, D. W.
1983-01-01
Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. The approach is illustrated with an analysis of thunderstorm data.
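The paper predates today's libraries, but the basic idea — a nonparametric multivariate density estimate used for visualization and classification — can be sketched with scipy's Gaussian kernel density estimator, which incidentally defaults to Scott's bandwidth rule.

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_at(X, query):
    """X: (n_samples, n_variables) data; query: (m, n_variables) evaluation points.
    Returns the estimated multivariate density at the query points."""
    kde = gaussian_kde(X.T)        # scipy expects shape (n_dims, n_samples)
    return kde(query.T)
```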
Use of collateral information to improve LANDSAT classification accuracies
NASA Technical Reports Server (NTRS)
Strahler, A. H. (Principal Investigator)
1981-01-01
Methods to improve LANDSAT classification accuracies were investigated, including: (1) the use of prior probabilities in maximum likelihood classification as a methodology to integrate discrete collateral data with continuously measured image density variables; (2) the use of the logit classifier as an alternative to multivariate normal classification, which permits mixing both continuous and categorical variables in a single model and fits empirical distributions of observations more closely than the multivariate normal density function; and (3) the use of collateral data in a geographic information system, exercised to model a desired output information layer as a function of input layers of raster-format collateral and image database layers.
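Item (1) amounts to adding log prior probabilities, taken from collateral data, to the Gaussian log-likelihood of each class before choosing the maximum; a minimal sketch with hypothetical inputs follows.

```python
import numpy as np
from scipy.stats import multivariate_normal

def ml_classify(pixel, class_stats, priors):
    """pixel: spectral vector; class_stats: list of (mean, cov) per class;
    priors: class prior probabilities derived from collateral data (e.g. terrain type)."""
    scores = [np.log(p) + multivariate_normal.logpdf(pixel, mean=mu, cov=cov)
              for (mu, cov), p in zip(class_stats, priors)]
    return int(np.argmax(scores))
```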
Dose-dependent effect of mammographic breast density on the risk of contralateral breast cancer.
Chowdhury, Marzana; Euhus, David; O'Donnell, Maureen; Onega, Tracy; Choudhary, Pankaj K; Biswas, Swati
2018-07-01
Increased mammographic breast density is a significant risk factor for breast cancer. It is not clear if it is also a risk factor for the development of contralateral breast cancer. The data were obtained from Breast Cancer Surveillance Consortium and included women diagnosed with invasive breast cancer or ductal carcinoma in situ between ages 18 and 88 and years 1995 and 2009. Each case of contralateral breast cancer was matched with three controls based on year of first breast cancer diagnosis, race, and length of follow-up. A total of 847 cases and 2541 controls were included. The risk factors included in the study were mammographic breast density, age of first breast cancer diagnosis, family history of breast cancer, anti-estrogen treatment, hormone replacement therapy, menopausal status, and estrogen receptor status, all from the time of first breast cancer diagnosis. Both univariate analysis and multivariate conditional logistic regression analysis were performed. In the final multivariate model, breast density, family history of breast cancer, and anti-estrogen treatment remained significant with p values less than 0.01. Increasing breast density had a dose-dependent effect on the risk of contralateral breast cancer. Relative to 'almost entirely fat' category of breast density, the adjusted odds ratios (and p values) in the multivariate analysis for 'scattered density,' 'heterogeneously dense,' and 'extremely dense' categories were 1.65 (0.036), 2.10 (0.002), and 2.32 (0.001), respectively. Breast density is an independent and significant risk factor for development of contralateral breast cancer. This risk factor should contribute to clinical decision making.
The Media and Suicide: Evidence Based on Population Data over 9 Years in Taiwan
ERIC Educational Resources Information Center
Tsai, Jui-Feng
2010-01-01
The relationship between the regional distribution densities of different media and the suicide death rate was explored by analyzing the annual total, male, and female suicide rates and media densities from 23 cities/counties in Taiwan during 1998-2006 by univariate and multivariate regression adjusted for five socioeconomic factors. The regional…
Ortega, Ileana; Martín, Alberto; Díaz, Yusbelly
2011-03-01
Astropecten marginatus is a sea star widely distributed in Northern and Eastern South America, found on sandy and muddy bottoms, in shallow and deep waters. To describe some of its ecological characteristics, we determined its spatio-temporal distribution, population parameters (based on size and weight) and diet in the Orinoco Delta ecoregion (Venezuela). The ecoregion was divided into three sections: Golfo de Paria, Boca de Serpiente and Plataforma Deltana. Samples for the rainy and dry seasons came from megabenthos surveys of the "Línea Base Ambiental Plataforma Deltana (LBAPD)" and "Corocoro Fase I (CFI)" projects. The collected sea stars were measured, weighed and dissected from the oral side to extract their stomachs and identify the prey consumed. A total of 570 sea stars were collected in the LBAPD project and 306 in the CFI project. The highest densities were found during the dry season in almost all sections. In the LBAPD project the highest density was in the "Plataforma Deltana" section (0.007 +/- 0.022 ind/m2 in the dry season and 0.014 +/- 0.06 ind/m2 in the rainy season), and in the CFI project the densities in the "Golfo de Paria" section were 0.705 +/- 0.829 ind/m2 in the rainy season and 1.027 +/- 1.107 ind/m2 in the dry season. The most frequent size range was 3.1-4.6 cm. The highest biomass was found in the "Golfo de Paria" section (7.581 +/- 0.018 mg/m2 in the dry season and 0.005 +/- 6.542 x 10(-06) mg/m2 in the rainy season for 2004-2005; and 3.979 +/- 4.024 mg/m2 in the dry season and 3.117 +/- 3.137 mg/m2 in the rainy season for 2006). A linear relationship was found between sea star size and weight, but no relationship was observed between size and the depth at which specimens were collected. Mollusks dominated the sea star diet (47.4% in abundance). Multivariate ordination (MDS) and SIMPER analyses showed that the diet was heterogeneous across sections, seasons, projects and size classes, and there was no difference in the number of prey or food items that a sea star can eat. Although A. marginatus has been described as a predator, scavenger and detritivore habits were also inferred in this study.
ERIC Educational Resources Information Center
Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.
2008-01-01
Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
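The procedure described — fitting finite mixtures of multivariate normals and comparing non-nested models with an information criterion — can be sketched with scikit-learn; the original work used dedicated model-based clustering software, so this is only an illustration with placeholder settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def best_mixture_by_bic(X, max_components=6):
    """Fit Gaussian mixtures with 1..max_components clusters and keep the
    model with the lowest Bayesian information criterion."""
    best, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gm = GaussianMixture(n_components=k, covariance_type="full",
                             n_init=5, random_state=0).fit(X)
        bic = gm.bic(X)
        if bic < best_bic:
            best, best_bic = gm, bic
    return best, best_bic
```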
Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F
2015-01-01
Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification is substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually result in diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, the research within the present paper seeks (1) to determine minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based, community ecology research. Furthermore, we seek (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.
Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P
2016-04-13
An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that the APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD for the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in a notoriously noise-dominated cooperative brain-computer interface (BCI) based on steady-state visual evoked potentials and P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. © 2016 The Author(s).
National Space Biomedical Research Institute (NSBRI) JSC Summer Projects
NASA Technical Reports Server (NTRS)
Dowdy, Forrest Ryan
2014-01-01
This project optimized the calorie content in a breakfast meal replacement bar for the Advanced Food Technology group. Use of multivariable optimization yielded the highest weight savings possible while simultaneously matching NASA Human Standards nutritional guidelines. The scope of this research included the study of shelf-life indicators such as water activity, moisture content, and texture analysis. Key metrics indicate higher protein content, higher caloric density, and greater mass savings as a result of the reformulation process. The optimization performed for this study demonstrated wide application to other food bars in the Advanced Food Technology portfolio. Recommendations for future work include shelf life studies on bar hardening and overall acceptability data over increased time frames and temperature fluctuation scenarios.
Developing population models with data from marked individuals
Ryu, Hae Yeong; Shoemaker, Kevin T.; Kneip, Eva; Pidgeon, Anna; Heglund, Patricia; Bateman, Brooke; Thogmartin, Wayne E.; Akçakaya, Reşit
2016-01-01
Population viability analysis (PVA) is a powerful tool for biodiversity assessments, but its use has been limited because of the requirements for fully specified population models such as demographic structure, density-dependence, environmental stochasticity, and specification of uncertainties. Developing a fully specified population model from commonly available data sources – notably, mark–recapture studies – remains complicated due to lack of practical methods for estimating fecundity, true survival (as opposed to apparent survival), natural temporal variability in both survival and fecundity, density-dependence in the demographic parameters, and uncertainty in model parameters. We present a general method that estimates all the key parameters required to specify a stochastic, matrix-based population model, constructed using a long-term mark–recapture dataset. Unlike standard mark–recapture analyses, our approach provides estimates of true survival rates and fecundities, their respective natural temporal variabilities, and density-dependence functions, making it possible to construct a population model for long-term projection of population dynamics. Furthermore, our method includes a formal quantification of parameter uncertainty for global (multivariate) sensitivity analysis. We apply this approach to 9 bird species and demonstrate the feasibility of using data from the Monitoring Avian Productivity and Survivorship (MAPS) program. Bias-correction factors for raw estimates of survival and fecundity derived from mark–recapture data (apparent survival and juvenile:adult ratio, respectively) were non-negligible, and corrected parameters were generally more biologically reasonable than their uncorrected counterparts. Our method allows the development of fully specified stochastic population models using a single, widely available data source, substantially reducing the barriers that have until now limited the widespread application of PVA. This method is expected to greatly enhance our understanding of the processes underlying population dynamics and our ability to analyze viability and project trends for species of conservation concern.
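Once survival, fecundity, their temporal variances and a density-dependence function have been estimated, population projection itself is straightforward; the two-stage sketch below is a generic illustration with placeholder parameter names, not the authors' model structure.

```python
import numpy as np

def project_population(n0, surv, fec, sd_surv, sd_fec, K, years=50, seed=0):
    """Stochastic juvenile/adult matrix projection with environmental variability
    and simple density dependence acting on fecundity.
    n0: [juveniles, adults]; surv: [juvenile, adult] mean survival; K: ceiling."""
    rng = np.random.default_rng(seed)
    n = np.asarray(n0, dtype=float)
    traj = [n.copy()]
    for _ in range(years):
        s = np.clip(rng.normal(surv, sd_surv), 0.0, 1.0)   # yearly survival draws
        f = max(rng.normal(fec, sd_fec), 0.0)
        f *= max(0.0, 1.0 - n.sum() / K)                   # density dependence
        A = np.array([[0.0,  f * s[1]],                    # recruits per surviving adult
                      [s[0], s[1]]])                       # stage transitions
        n = A @ n
        traj.append(n.copy())
    return np.array(traj)
```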
A Multivariate Analysis of Secondary Students' Experience of Web-Based Language Acquisition
ERIC Educational Resources Information Center
Felix, Uschi
2004-01-01
This paper reports on a large-scale project designed to replicate an earlier investigation of tertiary students (Felix, 2001) in a secondary school environment. The new project was carried out in five settings, again investigating the potential of the Web as a medium of language instruction. Data was collected by questionnaires and observational…
Multivariable PID controller design tuning using bat algorithm for activated sludge process
NASA Astrophysics Data System (ADS)
Atikah Nor’Azlan, Nur; Asmiza Selamat, Nur; Mat Yahya, Nafrizuan
2018-04-01
This project concerns the design of a multivariable PID (MPID) controller for a multi-input multi-output activated sludge process, applying four multivariable PID tuning methods: Davison, Penttinen-Koivo, Maciejowski and a proposed combined method. The aim of this study is to investigate the performance of a selected optimization technique, the Bat Algorithm (BA), in tuning the parameters of the MPID controller. All MPID-BA tuning results are compared and analyzed, and the best MPID-BA is then chosen in order to determine which technique performs better in terms of the transient response of the system.
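For reference, a minimal version of the Bat Algorithm is sketched below; it is a generic implementation of Yang's scheme, not the authors' code, and in this context `cost` would simulate the closed-loop activated sludge model and return an error criterion (e.g. ITAE) for a candidate vector of MPID gains.

```python
import numpy as np

def bat_algorithm(cost, bounds, n_bats=20, n_iter=100,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, seed=0):
    """Minimize `cost` over box `bounds` (list of (low, high) per parameter)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(bounds)
    x = rng.uniform(lo, hi, (n_bats, d))          # candidate solutions (e.g. PID gains)
    v = np.zeros((n_bats, d))                     # velocities
    A, r = np.ones(n_bats), np.zeros(n_bats)      # loudness, pulse emission rate
    fit = np.array([cost(p) for p in x])
    best = x[fit.argmin()].copy()
    for t in range(n_iter):
        freq = f_min + (f_max - f_min) * rng.random(n_bats)
        v += (x - best) * freq[:, None]
        cand = np.clip(x + v, lo, hi)
        walk = rng.random(n_bats) > r             # local random walk around the best bat
        cand[walk] = np.clip(best + 0.01 * A.mean()
                             * rng.standard_normal((int(walk.sum()), d)), lo, hi)
        new_fit = np.array([cost(p) for p in cand])
        accept = (new_fit < fit) & (rng.random(n_bats) < A)
        x[accept], fit[accept] = cand[accept], new_fit[accept]
        A[accept] *= alpha                          # accepted bats get quieter ...
        r[accept] = 1.0 - np.exp(-gamma * (t + 1))  # ... and pulse more often
        best = x[fit.argmin()].copy()
    return best, float(fit.min())
```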
Cardesa-Salzmann, Teresa M.; Colomo, Luis; Gutierrez, Gonzalo; Chan, Wing C.; Weisenburger, Dennis; Climent, Fina; González-Barca, Eva; Mercadal, Santiago; Arenillas, Leonor; Serrano, Sergio; Tubbs, Ray; Delabie, Jan; Gascoyne, Randy D.; Connors, Joseph M; Mate, Jose L.; Rimsza, Lisa; Braziel, Rita; Rosenwald, Andreas; Lenz, Georg; Wright, George; Jaffe, Elaine S.; Staudt, Louis; Jares, Pedro; López-Guillermo, Armando; Campo, Elias
2011-01-01
Background Diffuse large B-cell lymphoma is a clinically and molecularly heterogeneous disease. Gene expression profiling studies have shown that the tumor microenvironment affects survival and that the angiogenesis-related signature is prognostically unfavorable. The contribution of histopathological microvessel density to survival in diffuse large B-cell lymphomas treated with immunochemotherapy remains unknown. The purpose of this study is to assess the prognostic impact of histopathological microvessel density in two independent series of patients with diffuse large B-cell lymphoma treated with immunochemotherapy. Design and Methods One hundred and forty-seven patients from the Leukemia Lymphoma Molecular Profiling Project (training series) and 118 patients from the Catalan Lymphoma-Study group-GELCAB (validation cohort) were included in the study. Microvessels were immunostained with CD31 and quantified with a computerized image analysis system. The stromal scores previously defined in 110 Leukemia Lymphoma Molecular Profiling Project cases were used to analyze correlations with microvessel density data. Results Microvessel density significantly correlated with the stromal score (r=0.3209; P<0.001). Patients with high microvessel density showed significantly poorer overall survival than those with low microvessel density both in the training series (4-year OS 54% vs. 78%; P=0.004) and in the validation cohort (57% vs. 81%; P=0.006). In multivariate analysis, in both groups high microvessel density was a statistically significant unfavorable prognostic factor independent of international prognostic index [training series: international prognostic index (relative risk 2.7; P=0.003); microvessel density (relative risk 1.96; P=0.002); validation cohort: international prognostic index (relative risk 4.74; P<0.001); microvessel density (relative risk 2.4; P=0.016)]. Conclusions These findings highlight the impact of angiogenesis in the outcome of patients with diffuse large B-cell lymphoma and the interest of evaluating antiangiogenic drugs in clinical trials. PMID:21546504
Low Bone Density and Bisphosphonate Use and the Risk of Kidney Stones.
Prochaska, Megan; Taylor, Eric; Vaidya, Anand; Curhan, Gary
2017-08-07
Previous studies have demonstrated lower bone density in patients with kidney stones, but no longitudinal studies have evaluated kidney stone risk in individuals with low bone density. Small studies with short follow-up reported reduced 24-hour urine calcium excretion with bisphosphonate use. We examined history of low bone density and bisphosphonate use and the risk of incident kidney stone as well as the association with 24-hour calcium excretion. We conducted a prospective analysis of 96,092 women in the Nurses' Health Study II. We used Cox proportional hazards models to adjust for age, body mass index, thiazide use, fluid intake, supplemental calcium use, and dietary factors. We also conducted a cross-sectional analysis of 2294 participants using multivariable linear regression to compare 24-hour urinary calcium excretion between participants with and without a history of low bone density, and among 458 participants with low bone density, with and without bisphosphonate use. We identified 2564 incident stones during 1,179,860 person-years of follow-up. The multivariable adjusted relative risk for an incident kidney stone for participants with history of low bone density compared with participants without was 1.39 (95% confidence interval [95% CI], 1.20 to 1.62). Among participants with low bone density, the multivariable adjusted relative risk for an incident kidney stone for bisphosphonate users was 0.68 (95% CI, 0.48 to 0.98). In the cross-sectional analysis of 24-hour urine calcium excretion, the multivariable adjusted mean difference in 24-hour calcium was 10 mg/d (95% CI, 1 to 19) higher for participants with history of low bone density. However, among participants with history of low bone density, there was no association between bisphosphonate use and 24-hour calcium with multivariable adjusted mean difference in 24-hour calcium of -2 mg/d (95% CI, -25 to 20). Low bone density is an independent risk factor for incident kidney stone and is associated with higher 24-hour urine calcium excretion. Among participants with low bone density, bisphosphonate use was associated with lower risk of incident kidney stone but was not independently associated with 24-hour urine calcium excretion. Copyright © 2017 by the American Society of Nephrology.
Gaussian windows: A tool for exploring multivariate data
NASA Technical Reports Server (NTRS)
Jaeckel, Louis A.
1990-01-01
Presented here is a method for interactively exploring a large set of quantitative multivariate data, in order to estimate the shape of the underlying density function. It is assumed that the density function is more or less smooth, but no other specific assumptions are made concerning its structure. The local structure of the data in a given region may be examined by viewing the data through a Gaussian window, whose location and shape are chosen by the user. A Gaussian window is defined by giving each data point a weight based on a multivariate Gaussian function. The weighted sample mean and sample covariance matrix are then computed, using the weights attached to the data points. These quantities are used to compute an estimate of the shape of the density function in the window region. The local structure of the data is described by a method similar to the method of principal components. By taking many such local views of the data, we can form an idea of the structure of the data set. The method is applicable in any number of dimensions. The method can be used to find and describe simple structural features such as peaks, valleys, and saddle points in the density function, and also extended structures in higher dimensions. With some practice, we can apply our geometrical intuition to these structural features in any number of dimensions, so that we can think about and describe the structure of the data. Since the computations involved are relatively simple, the method can easily be implemented on a small computer.
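The core computation — weighting each observation by a multivariate Gaussian window and summarizing the weighted cloud by its mean, covariance and principal axes — is compact; the sketch below follows the description above with illustrative names.

```python
import numpy as np

def gaussian_window_view(X, center, window_cov):
    """X: (n, d) data; center, window_cov: location and shape of the Gaussian window.
    Returns the weighted mean, weighted covariance and its principal axes."""
    diff = X - center
    prec = np.linalg.inv(window_cov)
    w = np.exp(-0.5 * np.einsum("ij,jk,ik->i", diff, prec, diff))   # window weights
    w /= w.sum()
    mean = w @ X
    centered = X - mean
    cov = (centered * w[:, None]).T @ centered                      # weighted covariance
    eigvals, eigvecs = np.linalg.eigh(cov)                          # local principal axes
    return mean, cov, eigvals, eigvecs
```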
Sex steroid metabolism polymorphisms and mammographic density in pre- and early perimenopausal women
Crandall, Carolyn J; Sehl, Mary E; Crawford, Sybil L; Gold, Ellen B; Habel, Laurel A; Butler, Lesley M; Sowers, MaryFran R; Greendale, Gail A; Sinsheimer, Janet S
2009-01-01
Introduction We examined the association between mammographic density and single-nucleotide polymorphisms (SNPs) in genes encoding CYP1A1, CYP1B1, aromatase, 17β-HSD, ESR1, and ESR2 in pre- and early perimenopausal white, African-American, Chinese, and Japanese women. Methods The Study of Women's Health Across the Nation is a longitudinal community-based cohort study. We analyzed data from 451 pre- and early perimenopausal participants of the ancillary SWAN Mammographic Density study for whom we had complete information regarding mammographic density, genotypes, and covariates. With multivariate linear regression, we examined the relation between percentage mammographic breast density (outcome) and each SNP (primary predictor), adjusting for age, race/ethnicity, parity, cigarette smoking, and body mass index (BMI). Results After multivariate adjustment, the CYP1B1 rs162555 CC genotype was associated with a 9.4% higher mammographic density than the TC/TT genotype (P = 0.04). The CYP19A1 rs936306 TT genotype was associated with 6.2% lower mammographic density than the TC/CC genotype (P = 0.02). The positive association between CYP1A1 rs2606345 and mammographic density was significantly stronger among participants with BMI greater than 30 kg/m2 than among those with BMI less than 25 kg/m2 (P for interaction = 0.05). Among white participants, the ESR1 rs2234693 CC genotype was associated with a 7.0% higher mammographic density than the CT/TT genotype (P = 0.01). Conclusions SNPs in certain genes encoding sex steroid metabolism enzymes and ESRs were associated with mammographic density. Because the encoded enzymes and ESR1 are expressed in breast tissue, these SNPs may influence breast cancer risk by altering mammographic density. PMID:19630952
Post-processing of multi-hydrologic model simulations for improved streamflow projections
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid
2016-04-01
Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period of 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
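A much-reduced illustration of copula-based post-processing is given below: both historical series are mapped to normal scores, a Gaussian copula (a single correlation) links them, and the conditional median given a new simulation is mapped back through the observed marginal. The actual study uses a full Bayesian formulation; the function names and the choice of a Gaussian copula are assumptions of the sketch.

```python
import numpy as np
from scipy import stats

def copula_postprocess(obs_hist, sim_hist, sim_new):
    """Map new simulations onto the observed marginal via a Gaussian copula
    fitted between paired historical observations and simulations."""
    def normal_score(values, reference):
        # empirical CDF of `reference` evaluated at `values`, mapped to N(0, 1)
        u = np.searchsorted(np.sort(reference), values, side="right") / (len(reference) + 1.0)
        return stats.norm.ppf(np.clip(u, 1e-6, 1.0 - 1e-6))
    z_obs = normal_score(obs_hist, obs_hist)
    z_sim = normal_score(sim_hist, sim_hist)
    rho = np.corrcoef(z_obs, z_sim)[0, 1]                 # Gaussian-copula dependence
    z_new = rho * normal_score(sim_new, sim_hist)         # conditional median given simulation
    return np.quantile(obs_hist, stats.norm.cdf(z_new))   # back through observed marginal
```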
Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, James H.; Cox, Philip; Harrington, William J
2013-09-03
ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE's R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically, this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, refining them further to miniaturize and integrate their functionality and to increase the system power density and energy density. Benefits of UNF's novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel containment. PROJECT OVERVIEW The University of North Florida (UNF), with project partner the University of Florida, recently completed the Department of Energy (DOE) project entitled "Advanced Direct Methanol Fuel Cell for Mobile Computing". The primary objective of the project was to advance portable fuel cell system technology towards the commercial targets laid out in the DOE R&D roadmap by developing a 20-watt, direct methanol fuel cell (DMFC) portable power supply based on UNF's innovative "passive water recovery" MEA. Extensive component, sub-system, and system development and testing were undertaken to meet the rigorous demands of the consumer electronic application. Numerous brassboard (non-packaged) systems were developed to optimize the integration process and facilitate control algorithm development. The culmination of the development effort was a fully integrated DMFC power supply (referred to as DP4). The project goals were 40 W/kg for specific power, 55 W/l for power density, and 575 Whr/l for energy density. It should be noted that the specific power and power density were for the power section only, and did not include the hybrid battery. The energy density is based on three 200 ml fuel cartridges, and also does not include the hybrid battery. The results show that the DP4 system configured without the methanol concentration sensor exceeded all performance goals, achieving 41.5 W/kg for specific power, 55.3 W/l for power density, and 623 Whr/l for energy density.
During the project, the DOE revised its technical targets, and the definition of many of these targets, for the portable power application. With this revision, specific power, power density, specific energy (Whr/kg), and energy density are based on the total system, including fuel tank, fuel, and hybridization battery. Fuel capacity is not defined, but the same value is required for all calculations. Test data showed that the DP4 exceeded all 2011 Technical Status values; for example, the DP4 energy density was 373 Whr/l versus the DOE 2011 status of 200 Whr/l. For the DOE 2013 Technical Goals, the operation time was increased from 10 hours to 14.3 hours. Under these conditions, the DP4 closely approached or surpassed the technical targets; for example, the DP4 achieved 468 Whr/l versus the goal of 500 Whr/l. Thus, UNF has successfully met the project goals. A fully operational, 20-watt DMFC power supply was developed based on the UNF passive water recovery MEA. The power supply meets the project performance goals and advances portable power technology towards the commercialization targets set by the DOE.
Takayama, Yuki; Inui, Yayoi; Sekiguchi, Yuki; Kobayashi, Amane; Oroguchi, Tomotaka; Yamamoto, Masaki; Matsunaga, Sachihiro; Nakasako, Masayoshi
2015-07-01
Coherent X-ray diffraction imaging (CXDI) is a lens-less technique for visualizing the structures of non-crystalline particles with dimensions from submicrometer to micrometer at a resolution of several tens of nanometers. We conducted cryogenic CXDI experiments at 66 K to visualize the internal structures of frozen-hydrated chloroplasts of Cyanidioschyzon merolae using an X-ray free-electron laser (XFEL) as a coherent X-ray source. Chloroplasts dispersed on specimen disks at a number density of 7 per 10×10 µm² were flash-cooled with liquid ethane without staining, sectioning or chemical labeling. Chloroplasts are destroyed at the atomic level immediately after diffraction by XFEL pulses. Thus, diffraction patterns with a good signal-to-noise ratio from single chloroplasts were selected from many diffraction patterns collected by scanning the specimen disks to provide fresh specimens to the irradiation area. The electron density maps of single chloroplasts projected along the direction of the incident X-ray beam were reconstructed using the iterative phase-retrieval method and multivariate analyses. The electron density map at a resolution of 70 nm appeared as a C-shape. In addition, the fluorescence image of proteins stained with Flamingo™ dye also appeared as a C-shape, as did the autofluorescence from Chl. These similar images suggest that the thylakoid membranes, with an abundance of proteins, are distributed along the outer membranes of chloroplasts. To confirm the present results statistically, a number of projection structures must be accumulated through high-throughput data collection in the near future. Based on the results, we discuss the feasibility of XFEL-CXDI experiments in the structural analyses of cellular organelles. © The Author 2015. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Multivariate Time Series Decomposition into Oscillation Components.
Matsuda, Takeru; Komaki, Fumiyasu
2017-08-01
Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.
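The generative model is a set of damped, rotating two-dimensional oscillator states whose projections mix into the observed channels; given the oscillator frequencies, damping factors and mixing matrix (which the paper estimates by empirical Bayes and selects by AIC), extraction reduces to Kalman filtering. The sketch below assumes those parameters are already known and uses illustrative names.

```python
import numpy as np

def oscillator_ssm(freqs, damps, mix, obs_var, fs):
    """Build state-space matrices for a superposition of damped oscillators.
    freqs (Hz) and damps in (0, 1): one per oscillator; mix: (n_channels, 2*n_osc)."""
    F = np.zeros((2 * len(freqs), 2 * len(freqs)))
    for k, (f, a) in enumerate(zip(freqs, damps)):
        th = 2 * np.pi * f / fs
        F[2*k:2*k+2, 2*k:2*k+2] = a * np.array([[np.cos(th), -np.sin(th)],
                                                [np.sin(th),  np.cos(th)]])
    Q = np.eye(F.shape[0])                      # state-noise covariance (placeholder)
    R = obs_var * np.eye(mix.shape[0])          # observation-noise covariance
    return F, Q, mix, R

def kalman_filter(y, F, Q, H, R):
    """y: (n_times, n_channels). Returns filtered oscillator states;
    the phase of oscillator k is arctan2(state[2k+1], state[2k])."""
    d = F.shape[0]
    x, P = np.zeros(d), np.eye(d)
    states = np.zeros((len(y), d))
    for t in range(len(y)):
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
        x = x + K @ (y[t] - H @ x)                    # update
        P = (np.eye(d) - K @ H) @ P
        states[t] = x
    return states
```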
Automatic contouring of geologic fabric and finite strain data on the unit hyperboloid
NASA Astrophysics Data System (ADS)
Vollmer, Frederick W.
2018-06-01
Fabric and finite strain analysis, an integral part of studies of geologic structures and orogenic belts, is commonly done by the analysis of particles whose shapes can be approximated as ellipses. Given a sample of such particles, the mean and confidence intervals of particular parameters can be calculated; however, taking the extra step of plotting and contouring the density distribution can identify asymmetries or modes related to sedimentary fabrics or other factors. A common graphical strain analysis technique is to plot final ellipse ratios, Rf, versus orientations, ϕf, on polar Elliott or Rf/ϕ plots to examine the density distribution. The plot may be contoured; however, it is desirable to have a contouring method that is rapid, reproducible, and based on the underlying geometry of the data. The unit hyperboloid, H², gives a natural parameter space for two-dimensional strain, and various projections, including equal-area and stereographic, have useful properties for examining density distributions for anisotropy. An index, Ia, is given to quantify the magnitude and direction of anisotropy. Elliott and Rf/ϕ plots can be understood by applying hyperbolic geometry and recognizing them as projections of H². These both distort area, however, so the equal-area projection is preferred for examining density distributions. The algorithm presented here gives fast, accurate, and reproducible contours of density distributions calculated directly on H². The algorithm back-projects the data onto H², where the density calculation is done at regular nodes using a weighting value based on the hyperboloid distribution, which is then contoured. It is implemented as an Octave-compatible MATLAB function that plots ellipse data using a variety of projections, and calculates and displays contours of their density distribution on H².
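As an illustration of the projection step, the sketch below maps ellipse data onto the equal-area azimuthal projection of H², assuming the common parameterization in which an ellipse of ratio R and orientation φ corresponds to hyperbolic distance s = ln R and longitude 2φ (conventions in the literature differ by a factor of two in s, and this is not the paper's MATLAB code).

```python
import numpy as np

def hyperboloid_equal_area(R, phi):
    """R: ellipse axial ratios; phi: orientations in radians.
    Returns planar (x, y) coordinates of the equal-area projection of H^2,
    using the azimuthal equal-area radius r = 2*sinh(s/2) with s = ln(R)."""
    s = np.log(np.asarray(R, dtype=float))
    r = 2.0 * np.sinh(s / 2.0)
    return r * np.cos(2.0 * np.asarray(phi)), r * np.sin(2.0 * np.asarray(phi))
```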
Teodoro, P E; Rodrigues, E V; Peixoto, L A; Silva, L A; Laviola, B G; Bhering, L L
2017-03-22
Jatropha is a worldwide research target aimed at large-scale oil production for biodiesel and bio-kerosene. Its production potential is between 1200 and 1500 kg/ha of oil after the 4th year. This study aimed to estimate the combining ability of Jatropha genotypes by multivariate diallel analysis in order to select parents and crosses that allow gains in important agronomic traits. We performed crosses in a complete diallel genetic design (3 x 3) arranged in blocks with five replications and three plants per plot. The following traits were evaluated: plant height, stem diameter, canopy projection between rows, canopy projection along the row, number of branches, mass of one hundred grains, and grain yield. Data were submitted to univariate and multivariate diallel analysis. Genotypes 107 and 190 can be used in crosses for establishing a base population of Jatropha, since they have favorable alleles for increasing the mass of one hundred grains and grain yield and for reducing plant height. The cross 190 x 107 is the most promising for selecting superior genotypes for the simultaneous breeding of these traits.
Memon, Aftab Hameed; Rahman, Ismail Abdul
2014-01-01
This study uncovered factors inhibiting cost performance in large construction projects in Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped into 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed; only 144 completed sets were received and analysed using the advanced multivariate statistical software for structural equation modelling, SmartPLS v2. The analysis involved three iteration processes in which several of the factors were deleted in order to make the model acceptable. The analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in exhibiting an effect on the cost performance of large construction projects. This finding was validated using advanced techniques of power analysis. This rigorous multivariate analysis explicitly identified the significant category, which consists of several causative factors of poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun. PMID:24693227
The effects of low environmental cadmium exposure on bone density
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trzcinka-Ochocka, M., E-mail: ochocka@imp.lodz.pl; Jakubowski, M.; Szymczak, W.
2010-04-15
Recent epidemiological data indicate that low environmental exposure to cadmium, as shown by cadmium body burden (Cd-U), is associated with renal dysfunction as well as an increased risk of cadmium-induced bone disorders. The present study was designed to assess the effects of low environmental cadmium exposure, at a level sufficient to induce kidney damage, on bone metabolism and mineral density (BMD). The project was conducted in an area contaminated with cadmium, near a zinc smelter located in the region of Poland where heavy industry prevails. The study population comprised 170 women (mean age=39.7; 18-70 years) and 100 men (mean age=31.9; 18-76 years). Urinary and blood cadmium, the markers of renal tubular dysfunction (β2M-U, RBP, NAG), glomerular dysfunction (Alb-U and β2M-S) and bone metabolism markers (BAP-S, CTX-S), as well as forearm BMD, were measured. The results of this study based on simple dose-effect analysis showed a relationship between increasing cadmium concentrations, increased excretion of renal dysfunction markers and decreasing bone density. However, the results of the multivariate analysis did not indicate an association between exposure to cadmium and a decrease in bone density. They showed that the most important factors that have an impact on bone density are body weight and age in the female subjects, and body weight and calcium excretion in males. Our investigation revealed that the excretion of low molecular weight proteins occurred at a lower level of cadmium exposure than the possible loss of bone mass. It seems that renal tubular markers are the most sensitive and significant indicators of early health effects of cadmium intoxication in the general population. The correlation of urinary cadmium concentration with markers of kidney dysfunction was observed in the absence of significant correlations with bone effects. Our findings did not indicate any effects of environmental cadmium exposure on bone density.
Reconstruction of the ionospheric electron density by geostatistical inversion
NASA Astrophysics Data System (ADS)
Minkwitz, David; van den Boogaart, Karl Gerald; Hoque, Mainul; Gerzen, Tatjana
2015-04-01
The ionosphere is the upper part of the atmosphere where sufficient free electrons exist to affect the propagation of radio waves. Typically, the ionosphere extends from about 50 to 1000 km and its morphology is mainly driven by solar radiation, particle precipitation and charge exchange. Due to the strong ionospheric impact on many applications dealing with trans-ionospheric signals, such as Global Navigation Satellite Systems (GNSS) positioning, navigation and remote sensing, the demand for a highly accurate reconstruction of the electron density is ever increasing. Within the Helmholtz Alliance project "Remote Sensing and Earth System Dynamics" (EDA), the utilization of the upcoming radar mission TanDEM-L and its related products is being prepared. The TanDEM-L mission will operate in L-band with a wavelength of approximately 24 cm and aims at an improved understanding of environmental processes and ecosystem change, e.g. earthquakes, volcanoes, glaciers, soil moisture and the carbon cycle. Because of its lower frequency compared to the X-band (3 cm) and C-band (5 cm) radar missions, the influence of the ionosphere will increase and might lead to a significant degradation of radar image quality if no correction is applied. Consequently, our interest is the reconstruction of the ionospheric electron density in order to mitigate the ionospheric delay. Following the ionosphere's behaviour, we establish a non-stationary and anisotropic spatial covariance model of the electron density separated into a vertical and a horizontal component. In order to estimate the model's parameters we chose a maximum likelihood approach. This approach incorporates GNSS total electron content measurements, representing integral measurements of the electron density along satellite-to-receiver ray paths, and the NeQuick model as a non-stationary trend. Based on a multivariate normal distribution, the spatial covariance model parameters are optimized, and afterwards the 3D electron density can be calculated by kriging for arbitrary points or grids of interest.
Deterministic annealing for density estimation by multivariate normal mixtures
NASA Astrophysics Data System (ADS)
Kloppenburg, Martin; Tavan, Paul
1997-03-01
An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods that are derived from statistical physics analogs are special cases of EM procedures.
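To make the idea of annealing an EM fit concrete, the sketch below shows a generic deterministic-annealing variant of EM for a Gaussian mixture, in which the E-step responsibilities are computed from component densities raised to an inverse-temperature power that is gradually increased toward one. This is an illustrative toy, not the authors' soft-constraint formulation, and all data and parameter values are placeholders.

```python
import numpy as np
from scipy.stats import multivariate_normal

def annealed_em_gmm(X, k=3, betas=(0.1, 0.3, 0.6, 1.0), iters=20, seed=0):
    """Toy deterministic-annealing EM for a Gaussian mixture (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]
    cov = np.array([np.cov(X.T) + 1e-3 * np.eye(d)] * k)
    pi = np.full(k, 1.0 / k)
    for beta in betas:                      # slowly "cool" toward the true likelihood
        for _ in range(iters):
            # E-step: responsibilities from tempered component densities
            logp = np.stack([np.log(pi[j]) + multivariate_normal.logpdf(X, mu[j], cov[j])
                             for j in range(k)], axis=1)
            logp *= beta
            logp -= logp.max(axis=1, keepdims=True)
            r = np.exp(logp)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: weighted parameter updates
            nk = r.sum(axis=0)
            pi = nk / n
            mu = (r.T @ X) / nk[:, None]
            for j in range(k):
                xc = X - mu[j]
                cov[j] = (r[:, j, None] * xc).T @ xc / nk[j] + 1e-6 * np.eye(d)
    return pi, mu, cov

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (200, 2)),
               rng.normal(2, 0.5, (200, 2)),
               rng.normal((0, 3), 0.5, (200, 2))])
pi, mu, cov = annealed_em_gmm(X, k=3)
print(np.round(mu, 2))
```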
Pedersen, Kristine Bondo; Kirkelund, Gunvor M; Ottosen, Lisbeth M; Jensen, Pernille E; Lejon, Tore
2015-01-01
Chemometrics was used to develop a multivariate model based on 46 previously reported electrodialytic remediation experiments (EDR) of five different harbour sediments. The model predicted final concentrations of Cd, Cu, Pb and Zn as a function of current density, remediation time, stirring rate, dry/wet sediment, cell set-up as well as sediment properties. Evaluation of the model showed that remediation time and current density had the highest comparative influence on the clean-up levels. Individual models for each heavy metal showed variance in the variable importance, indicating that the targeted heavy metals were bound to different sediment fractions. Based on the results, a PLS model was used to design five new EDR experiments of a sixth sediment to achieve specified clean-up levels of Cu and Pb. The removal efficiencies were up to 82% for Cu and 87% for Pb and the targeted clean-up levels were met in four out of five experiments. The clean-up levels were better than predicted by the model, which could hence be used for predicting an approximate remediation strategy; the modelling power will however improve with more data included. Copyright © 2014 Elsevier B.V. All rights reserved.
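As a minimal sketch of the kind of PLS modelling described above, the code below fits a PLS regression relating a hypothetical design matrix of operating conditions to final metal concentrations and uses it to predict a new experiment. The variable layout and values are placeholders, not the reported experimental design.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
# Hypothetical design matrix: current density, time, stirring rate, dry/wet flag, etc.
X = rng.uniform(size=(46, 6))
# Hypothetical responses: final Cd, Cu, Pb, Zn concentrations
Y = X @ rng.uniform(size=(6, 4)) + 0.05 * rng.normal(size=(46, 4))

pls = PLSRegression(n_components=3).fit(X, Y)
# Variable influence can be screened through the regression coefficients
print(np.round(pls.coef_, 2))
# Predict clean-up levels for a new candidate experiment
x_new = rng.uniform(size=(1, 6))
print(pls.predict(x_new))
```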
Comparison of connectivity analyses for resting state EEG data
NASA Astrophysics Data System (ADS)
Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo
2017-06-01
Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by phase synchronization value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections with respect to the bivariate estimates. The multivariate TE differentiated better between eyes closed and eyes open conditions compared to DTF. Moreover, the multivariate TE evidenced non-linear phenomena in information transfer, which are not evidenced by the use of DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods, and indicated a relationship existing between the flow of information and the level of synchronization of the brain.
Multivariate η-μ fading distribution with arbitrary correlation model
NASA Astrophysics Data System (ADS)
Ghareeb, Ibrahim; Atiani, Amani
2018-03-01
An extensive analysis of the multivariate η-μ distribution with arbitrary correlation is presented, where novel analytical expressions for the multivariate probability density function, cumulative distribution function and moment generating function (MGF) of arbitrarily correlated and not necessarily identically distributed η-μ power random variables are derived. This paper also provides an exact-form expression for the MGF of the instantaneous signal-to-noise ratio at the combiner output in a diversity reception system with maximal-ratio combining and post-detection equal-gain combining operating in slow, frequency-nonselective, arbitrarily correlated and not necessarily identically distributed η-μ fading channels. The average bit error probability of differentially detected quadrature phase shift keying signals with a post-detection diversity reception system over arbitrarily correlated η-μ fading channels with not necessarily identical fading parameters is determined using the MGF-based approach. The effect of fading correlation between diversity branches, fading severity parameters and diversity level is studied.
The Multivariate Largest Lyapunov Exponent as an Age-Related Metric of Quiet Standing Balance
Liu, Kun; Wang, Hongrui; Xiao, Jinzhuang
2015-01-01
The largest Lyapunov exponent has been researched as a metric of the balance ability during human quiet standing. However, the sensitivity and accuracy of this measurement method are not good enough for clinical use. The present research proposes a metric of the human body's standing balance ability based on the multivariate largest Lyapunov exponent which can quantify the human standing balance. The dynamic multivariate time series of ankle, knee, and hip were measured by multiple electrical goniometers. Thirty-six normal people of different ages participated in the test. With acquired data, the multivariate largest Lyapunov exponent was calculated. Finally, the results of the proposed approach were analysed and compared with the traditional method, for which the largest Lyapunov exponent and power spectral density from the centre of pressure were also calculated. The following conclusions can be obtained. The multivariate largest Lyapunov exponent has a higher degree of differentiation in differentiating balance in eyes-closed conditions. The MLLE value reflects the overall coordination between multisegment movements. Individuals of different ages can be distinguished by their MLLE values. The standing stability of human is reduced with the increment of age. PMID:26064182
Modeling of turbulent supersonic H2-air combustion with a multivariate beta PDF
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Hassan, H. A.
1993-01-01
Recent calculations of turbulent supersonic reacting shear flows using an assumed multivariate beta PDF (probability density function) resulted in reduced production rates and a delay in the onset of combustion. This result is not consistent with available measurements. The present research explores two possible reasons for this behavior: use of PDF's that do not yield Favre averaged quantities, and the gradient diffusion assumption. A new multivariate beta PDF involving species densities is introduced which makes it possible to compute Favre averaged mass fractions. However, using this PDF did not improve comparisons with experiment. A countergradient diffusion model is then introduced. Preliminary calculations suggest this to be the cause of the discrepancy.
Deeper Insights into the Circumgalactic Medium using Multivariate Analysis Methods
NASA Astrophysics Data System (ADS)
Lewis, James; Churchill, Christopher W.; Nielsen, Nikole M.; Kacprzak, Glenn
2017-01-01
Drawing from a database of galaxies whose surrounding gas has absorption from MgII, called the MgII-Absorbing Galaxy Catalog (MAGIICAT, Nielsen et al. 2013), we studied the circumgalactic medium (CGM) for a sample of 47 galaxies. Using multivariate analysis, in particular the k-means clustering algorithm, we determined that simultaneously examining column density (N), rest-frame B-K color, virial mass, and azimuthal angle (the projected angle between the galaxy major axis and the quasar line of sight) yields two distinct populations: (1) bluer, lower mass galaxies with higher column density along the minor axis, and (2) redder, higher mass galaxies with lower column density along the major axis. We support this grouping by running (i) two-sample, two-dimensional Kolmogorov-Smirnov (KS) tests on each of the six bivariate planes and (ii) two-sample KS tests on each of the four variables to show that the galaxies significantly cluster into two independent populations. To account for the fact that 16 of our 47 galaxies have upper limits on N, we performed Monte-Carlo tests whereby we replaced upper limits with random deviates drawn from a Schechter distribution fit, f(N). These tests strengthen the results of the KS tests. We examined the behavior of the MgII λ2796 absorption line equivalent width and velocity width for each galaxy population. We find that equivalent width and velocity width do not show similar characteristic distinctions between the two galaxy populations. We discuss the k-means clustering algorithm for optimizing the analysis of populations within datasets as opposed to using arbitrary bivariate subsample cuts. We also discuss the power of the k-means clustering algorithm in extracting deeper physical insight into the CGM in relationship to host galaxies.
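A compact sketch of the two analysis ingredients named above follows: k-means with two clusters on standardized galaxy properties, then per-variable two-sample KS tests between the resulting groups. The column names and random values are stand-ins for the catalogue measurements, and the 2D KS tests and the Monte Carlo treatment of upper limits are omitted.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Placeholder table: log column density, B-K color, log virial mass, azimuthal angle
X = np.column_stack([rng.normal(13.5, 1.0, 47),
                     rng.normal(2.5, 0.8, 47),
                     rng.normal(11.5, 0.6, 47),
                     rng.uniform(0, 90, 47)])

Z = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)

# Per-variable two-sample KS tests between the two k-means populations
for j, name in enumerate(["logN", "B-K", "logMvir", "azimuth"]):
    stat, p = ks_2samp(X[labels == 0, j], X[labels == 1, j])
    print(f"{name}: KS={stat:.2f}, p={p:.3f}")
```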
TANK SPACE ALTERNATIVES ANALYSIS REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
TURNER DA; KIRCH NW; WASHENFELDER DJ
2010-04-27
This report addresses the projected shortfall of double-shell tank (DST) space starting in 2018. Using a multi-variant methodology, a total of eight near-term options and 17 long-term options for recovering DST space were evaluated. These include 11 options that were previously evaluated in RPP-7702, Tank Space Options Report (Rev. 1). Based on the results of this evaluation, two near-term and three long-term options have been identified as being sufficient to overcome the shortfall of DST space projected to occur between 2018 and 2025.
Integration of vessel traits, wood density, and height in angiosperm shrubs and trees.
Martínez-Cabrera, Hugo I; Schenk, H Jochen; Cevallos-Ferriz, Sergio R S; Jones, Cynthia S
2011-05-01
Trees and shrubs tend to occupy different niches within and across ecosystems; therefore, traits related to their resource use and life history are expected to differ. Here we analyzed how growth form is related to variation in integration among vessel traits, wood density, and height. We also considered the ecological and evolutionary consequences of such differences. In a sample of 200 woody plant species (65 shrubs and 135 trees) from Argentina, Mexico, and the United States, standardized major axis (SMA) regression, correlation analyses, and ANOVA were used to determine whether relationships among traits differed between growth forms. The influence of phylogenetic relationships was examined with a phylogenetic ANOVA and phylogenetically independent contrasts (PICs). A principal component analysis was conducted to determine whether trees and shrubs occupy different portions of multivariate trait space. Wood density did not differ between shrubs and trees, but there were significant differences in vessel diameter, vessel density, theoretical conductivity, and as expected, height. In addition, relationships between vessel traits and wood density differed between growth forms. Trees showed coordination among vessel traits, wood density, and height, but in shrubs, wood density and vessel traits were independent. These results hold when phylogenetic relationships were considered. In the multivariate analyses, these differences translated as significantly different positions in multivariate trait space occupied by shrubs and trees. Differences in trait integration between growth forms suggest that evolution of growth form in some lineages might be associated with the degree of trait interrelation.
NASA Astrophysics Data System (ADS)
Oh, Seok-Geun; Suh, Myoung-Seok
2017-07-01
The projection skills of five ensemble methods were analyzed according to simulation skills, training period, and ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were as follows: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods improved generally as compared with the best member for each category. However, their projection skills are significantly affected by the simulation skills of the ensemble member. The weighted ensemble methods showed better projection skills than non-weighted methods, in particular, for the PSD categories having systematic biases and various correlation coefficients. The EWA_NBC showed considerably lower projection skills than the other methods, in particular, for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skills, it showed strong sensitivity to the PSD categories, training periods, and number of members. On the other hand, the WEA_Tay and WEA_RAC showed relatively superior skills in both the accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
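One plausible form of the skill-weighted averaging described above is sketched below: member weights are derived from inverse RMSE and correlation against a training-period truth series and then applied to the projection period. The weighting formula, data and member characteristics are assumptions for illustration and may differ from the exact definitions used in the study.

```python
import numpy as np

def wea_rac(members_train, truth_train, members_proj, eps=1e-9):
    """Skill-weighted ensemble mean: weights from inverse RMSE and correlation
    over a training period (one plausible variant, not necessarily the paper's)."""
    weights = []
    for m in members_train:
        rmse = np.sqrt(np.mean((m - truth_train) ** 2))
        corr = np.corrcoef(m, truth_train)[0, 1]
        weights.append(max(corr, 0.0) / (rmse + eps))
    weights = np.asarray(weights)
    weights /= weights.sum()
    return weights, np.tensordot(weights, members_proj, axes=1)

rng = np.random.default_rng(3)
truth = np.sin(np.linspace(0, 6, 120))
# Three synthetic members with different bias and noise levels
members = np.array([truth + rng.normal(b, s, truth.size)
                    for b, s in [(0.1, 0.2), (0.5, 0.3), (-0.2, 0.1)]])
w, proj = wea_rac(members[:, :60], truth[:60], members[:, 60:])
print(np.round(w, 2))
```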
Bias-Free Chemically Diverse Test Sets from Machine Learning.
Swann, Ellen T; Fernandez, Michael; Coote, Michelle L; Barnard, Amanda S
2017-08-14
Current benchmarking methods in quantum chemistry rely on databases that are built using a chemist's intuition. It is not fully understood how diverse or representative these databases truly are. Multivariate statistical techniques like archetypal analysis and K-means clustering have previously been used to summarize large sets of nanoparticles however molecules are more diverse and not as easily characterized by descriptors. In this work, we compare three sets of descriptors based on the one-, two-, and three-dimensional structure of a molecule. Using data from the NIST Computational Chemistry Comparison and Benchmark Database and machine learning techniques, we demonstrate the functional relationship between these structural descriptors and the electronic energy of molecules. Archetypes and prototypes found with topological or Coulomb matrix descriptors can be used to identify smaller, statistically significant test sets that better capture the diversity of chemical space. We apply this same method to find a diverse subset of organic molecules to demonstrate how the methods can easily be reapplied to individual research projects. Finally, we use our bias-free test sets to assess the performance of density functional theory and quantum Monte Carlo methods.
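The prototype idea can be sketched in a few lines: cluster molecules in descriptor space and keep the member nearest each cluster centre as a reduced, diversity-preserving test set. The descriptor matrix here is a random placeholder rather than Coulomb-matrix or topological features, and archetypal analysis is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin_min

rng = np.random.default_rng(4)
descriptors = rng.normal(size=(500, 20))   # placeholder molecular descriptors

k = 25                                     # size of the reduced test set
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(descriptors)

# Prototype = the real molecule closest to each cluster centre
proto_idx, _ = pairwise_distances_argmin_min(km.cluster_centers_, descriptors)
test_set = sorted(set(proto_idx))
print(len(test_set), test_set[:10])
```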
Demidenko, Eugene
2017-09-01
The exact density distribution of the nonlinear least squares estimator in the one-parameter regression model is derived in closed form and expressed through the cumulative distribution function of the standard normal variable. Several proposals to generalize this result are discussed. The exact density is extended to the estimating equation (EE) approach and the nonlinear regression with an arbitrary number of linear parameters and one intrinsically nonlinear parameter. For a very special nonlinear regression model, the derived density coincides with the distribution of the ratio of two normally distributed random variables previously obtained by Fieller (1932), unlike other approximations previously suggested by other authors. Approximations to the density of the EE estimators are discussed in the multivariate case. Numerical complications associated with the nonlinear least squares are illustrated, such as nonexistence and/or multiple solutions, as major factors contributing to poor density approximation. The nonlinear Markov-Gauss theorem is formulated based on the near exact EE density approximation.
NASA Astrophysics Data System (ADS)
Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar
2015-06-01
In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and also to provide guidelines for determining the design periods of flood control structures. Traditional FFA was extensively performed by considering a univariate scenario for both at-site and regional estimation of return periods. However, due to the inherent mutual dependence of the flood variables or characteristics [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], the analysis has been further extended to a multivariate scenario, with some restrictive assumptions. To overcome the assumption of the same family of marginal density functions for all flood variables, the concept of the copula has been introduced. Although the advancement from univariate to multivariate analyses drew formidable attention from the FFA research community, the basic limitation was that the analyses were performed with the implementation of only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in the field of multivariate FFA; however, a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and multivariate copula-based applications. Nevertheless, nonparametric distributions can improve the chance of obtaining a best fit because they reproduce the sample's characteristics, resulting in more accurate estimations of the multivariate return period. Hence, the current study shows the importance of combining the multivariate nonparametric approach with multivariate parametric and copula-based approaches, resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, this approach can also be applied to regional FFA because regional estimations ideally include at-site estimations. The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution from a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework for frequency analysis is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both the univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers over the continental USA, which shows that for seven rivers all the flood variables followed the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provided the best fit for one or two flood variables. In summary, the results show that the nonparametric method cannot substitute for the parametric and copula-based approaches, but should be considered during any at-site FFA to provide the broadest choices for best estimation of the flood return periods.
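As a sketch of the copula step alone, the code below fits parametric marginals to synthetic annual flood peaks and volumes, couples them with a Gaussian copula, and converts the joint exceedance probability into an AND-type joint return period. The distribution choices, data and copula family are illustrative assumptions, not the study's selections.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Placeholder annual flood peaks and volumes with built-in correlation
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=110)
peak = stats.gumbel_r.ppf(stats.norm.cdf(z[:, 0]), loc=800, scale=250)
vol = stats.gamma.ppf(stats.norm.cdf(z[:, 1]), a=3.0, scale=150)

# Fit marginals (parametric here; a kernel density could be substituted)
p_par = stats.gumbel_r.fit(peak)
v_par = stats.gamma.fit(vol, floc=0)

# Gaussian copula parameter from the normal scores of the fitted marginals
u = stats.norm.ppf(np.column_stack([stats.gumbel_r.cdf(peak, *p_par),
                                    stats.gamma.cdf(vol, *v_par)]))
rho = np.corrcoef(u.T)[0, 1]

def joint_return_period_and(x, y):
    """AND-type joint return period: both peak > x and volume > y in the same year."""
    Fx = stats.gumbel_r.cdf(x, *p_par)
    Fy = stats.gamma.cdf(y, *v_par)
    C = stats.multivariate_normal([0, 0], [[1, rho], [rho, 1]]).cdf(
        [stats.norm.ppf(Fx), stats.norm.ppf(Fy)])
    return 1.0 / (1.0 - Fx - Fy + C)

print(round(joint_return_period_and(1500.0, 800.0), 1))
```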
Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas
2017-06-07
A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.
ERIC Educational Resources Information Center
Polly, Drew; Wang, Chuang; Martin, Christie; Lambert, Richard; Pugalee, David; Middleton, Catherina
2018-01-01
This study examined the influence of a professional development project about an internet-based mathematics formative assessment tool and related pedagogies on primary teachers' instruction and student achievement. Teachers participated in 72 h of professional development during the year. Descriptive statistics and multivariate analyses of…
Techniques for deriving tissue structure from multiple projection dual-energy x-ray absorptiometry
NASA Technical Reports Server (NTRS)
Feldmesser, Howard S. (Inventor); Charles, Jr., Harry K. (Inventor); Beck, Thomas J. (Inventor); Magee, Thomas C. (Inventor)
2004-01-01
Techniques for deriving bone properties from images generated by a dual-energy x-ray absorptiometry apparatus include receiving first image data having pixels indicating bone mineral density projected at a first angle of a plurality of projection angles. Second image data and third image data are also received. The second image data indicates bone mineral density projected at a different second angle. The third image data indicates bone mineral density projected at a third angle. The third angle is different from the first angle and the second angle. Principal moments of inertia for a bone in the subject are computed based on the first image data, the second image data and the third image data. The techniques allow high-precision, high-resolution dual-energy x-ray attenuation images to be used for computing principal moments of inertia and strength moduli of individual bones, plus risk of injury and changes in risk of injury to a patient.
Yuan, Rui; Lv, Yong; Song, Gangbing
2018-04-16
Rolling bearings are important components in rotary machinery systems. In the field of multi-fault diagnosis of rolling bearings, the vibration signal collected from a single channel tends to miss some fault characteristic information. Using multiple sensors to collect signals at different locations on the machine to obtain a multivariate signal can remedy this problem. The adverse effect of a power imbalance between the various channels is inevitable and unfavorable for multivariate signal processing. As a useful multivariate signal processing method, adaptive-projection intrinsically transformed multivariate empirical mode decomposition (APIT-MEMD) exhibits better performance than MEMD by adopting an adaptive projection strategy to alleviate power imbalances. The filter bank properties of APIT-MEMD are also adopted to enable more accurate and stable intrinsic mode functions (IMFs) and to ease mode mixing problems in multi-fault frequency extraction. By aligning the IMF sets into a third-order tensor, high-order singular value decomposition (HOSVD) can be employed to estimate the number of faults. Fault correlation factor (FCF) analysis is then used to determine the effective IMFs, from which the characteristic frequencies of the multiple faults can be extracted. Numerical simulations and application to a multi-fault situation demonstrate that the proposed method is promising for multi-fault diagnosis of multivariate rolling bearing signals.
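One downstream step, stacking channel-wise IMFs into a third-order tensor and inspecting it with HOSVD, can be sketched via SVDs of the mode unfoldings; the decay of the mode singular values is used here as a crude proxy for the number of significant components. The MEMD/APIT-MEMD decomposition itself and the FCF analysis are not implemented, and the tensor is a synthetic placeholder.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a third-order tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

rng = np.random.default_rng(10)
n_channels, n_imfs, n_samples = 4, 6, 2048
# Placeholder IMF tensor (channels x IMFs x time); real IMFs would come from APIT-MEMD
tensor = rng.normal(size=(n_channels, n_imfs, n_samples))
tensor[:, 1, :] += np.sin(2 * np.pi * 37 * np.arange(n_samples) / 1000.0)  # shared fault tone

# HOSVD: singular values of each mode unfolding
for mode in range(3):
    s = np.linalg.svd(unfold(tensor, mode), compute_uv=False)
    energy = s ** 2 / np.sum(s ** 2)
    print(f"mode {mode}: leading energy fractions {np.round(energy[:3], 2)}")
```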
Heuristic-driven graph wavelet modeling of complex terrain
NASA Astrophysics Data System (ADS)
Cioacǎ, Teodor; Dumitrescu, Bogdan; Stupariu, Mihai-Sorin; Pǎtru-Stupariu, Ileana; Nǎpǎrus, Magdalena; Stoicescu, Ioana; Peringer, Alexander; Buttler, Alexandre; Golay, François
2015-03-01
We present a novel method for building a multi-resolution representation of large digital surface models. The surface points coincide with the nodes of a planar graph which can be processed using a critically sampled, invertible lifting scheme. To drive the lazy wavelet node partitioning, we employ an attribute aware cost function based on the generalized quadric error metric. The resulting algorithm can be applied to multivariate data by storing additional attributes at the graph's nodes. We discuss how the cost computation mechanism can be coupled with the lifting scheme and examine the results by evaluating the root mean square error. The algorithm is experimentally tested using two multivariate LiDAR sets representing terrain surface and vegetation structure with different sampling densities.
Li, Jinshan
2010-02-15
The ZPE-corrected N-NO2 bond dissociation energies (BDEs(ZPE)) of a series of model N-nitro compounds and typical energetic N-nitro compounds have been calculated using density functional theory methods. Computed results show that, with the 6-31G** basis set, the UB3LYP-calculated BDE(ZPE) is similar to the B3PW91 value but less than the UB3P86 value, and that for both the UB3P86 and UB3PW91 methods the 6-31G**-calculated BDE(ZPE) is close to the 6-31++G** value. For the series of model N-nitro compounds, the NBO analysis at the UB3LYP/6-31G** level shows that the order of BDE(ZPE) is in line not only with the order of bond order but also with that of the energy gap between the N-NO2 bond and antibond orbitals. For the typical energetic N-nitro compounds the impact sensitivity is indeed strongly related to the BDE(ZPE), and based on the BDEs(ZPE) calculated at different density functional theory levels this work has established a good multivariate correlation of impact sensitivity with molecular parameters, which provides a method to address the sensitivity problem.
Hot spots of multivariate extreme anomalies in Earth observations
NASA Astrophysics Data System (ADS)
Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.
2016-12-01
Anomalies in Earth observations might indicate data quality issues, extremes or the change of underlying processes within a highly multivariate system. Thus, considering the multivariate constellation of variables for extreme detection yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the Biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these coincide with well known high impact extremes. Technically speaking, we account for multivariate extremes by using three sophisticated algorithms adapted from computer science applications. Namely, an ensemble of the k-nearest-neighbours mean distance, kernel density estimation and a recurrence-based approach is used. However, the impact of atmosphere extremes on the Biosphere might largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate in each region both the 'normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, those anomalies in the atmosphere time series are of particular interest which can only be detected by a multivariate approach and not by a simple univariate approach. Such an anomalous constellation of atmosphere variables is of interest if it impacts the Biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
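A small sketch of two of the three detector families mentioned above (k-nearest-neighbour mean distance and kernel density estimation) applied to a generic multivariate feature matrix is given below; the recurrence-based detector, the seasonal normalization and the regionalization steps are omitted, and the thresholds are naive percentiles chosen for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors, KernelDensity

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 4))                 # placeholder multivariate atmosphere features
X[::200] += rng.normal(4.0, 0.5, size=(5, 4))  # inject a few joint anomalies

# Detector 1: mean distance to the k nearest neighbours
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
dist, _ = nn.kneighbors(X)
knn_score = dist[:, 1:].mean(axis=1)           # drop the zero self-distance

# Detector 2: negative log-density from a Gaussian kernel density estimate
kde = KernelDensity(bandwidth=0.5).fit(X)
kde_score = -kde.score_samples(X)

# Flag the most extreme 1% under each detector and take the consensus
thr1, thr2 = np.quantile(knn_score, 0.99), np.quantile(kde_score, 0.99)
flags = (knn_score > thr1) & (kde_score > thr2)
print(np.where(flags)[0])
```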
Rudi, Knut; Zimonja, Monika; Kvenshagen, Bente; Rugtveit, Jarle; Midtvedt, Tore; Eggesbø, Merete
2007-01-01
We present a novel approach for comparing 16S rRNA gene clone libraries that is independent of both DNA sequence alignment and definition of bacterial phylogroups. These steps are the major bottlenecks in current microbial comparative analyses. We used direct comparisons of taxon density distributions in an absolute evolutionary coordinate space. The coordinate space was generated by using alignment-independent bilinear multivariate modeling. Statistical analyses for clone library comparisons were based on multivariate analysis of variance, partial least-squares regression, and permutations. Clone libraries from both adult and infant gastrointestinal tract microbial communities were used as biological models. We reanalyzed a library consisting of 11,831 clones covering complete colons from three healthy adults in addition to a smaller 390-clone library from infant feces. We show that it is possible to extract detailed information about microbial community structures using our alignment-independent method. Our density distribution analysis is also very efficient with respect to computer operation time, meeting the future requirements of large-scale screenings to understand the diversity and dynamics of microbial communities. PMID:17337554
Dorsal motor nucleus of the vagus neurons: a multivariate taxonomy.
Jarvinen, M K; Powley, T L
1999-01-18
The dorsal motor nucleus of the vagus (DMNX) contains neurons with different projections and discrete functions, but little success has been achieved in distinguishing the cells cytoarchitectonically. The present experiment employed multivariate analytical techniques to evaluate DMNX neuronal morphology. Male Sprague-Dawley rats (n = 77) were perfused, and the brainstems were stained en bloc with a Golgi-Cox protocol. DMNX neurons in each of three planes (coronal, sagittal, and horizontal; total sample = 607) were digitized. Three-dimensional features quantified included dendritic length, number of segments, spine density, number of primary dendrites, dendritic orientation, and soma form factor. Cluster analyses of six independent samples of 100+ neurons and of three composite replicate pools of 200+ neurons consistently identified similar sets of four distinct neuronal profiles. One profile (spinous, limited dendrites, small somata) appears to correspond to the interneuron population of the DMNX. In contrast, the other three distinctive profiles (e.g., one is multipolar, with large dendritic fields and large somata) are different types of preganglionic neurons. Each of the four types of neurons is found throughout the DMNX, suggesting that the individual columnar subnuclei and other postulated vagal motorneuron pools are composed of all types of neurons. Within individual motor pools, ensembles of the different neuronal types must cooperatively organize different functions and project to different effectors within a target organ. By extension, specializations of the preganglionic motor pools are more likely to result from their afferent inputs, peripheral target tissues, neurochemistry, or physiological features rather than from any unique morphological profiles.
Fast clustering using adaptive density peak detection.
Wang, Xiao-Feng; Xu, Yifan
2017-12-01
Common limitations of clustering methods include slow algorithm convergence, instability with respect to the pre-specification of several intrinsic parameters, and lack of robustness to outliers. A recent clustering approach proposed a fast search algorithm for cluster centers based on their local densities. However, the selection of the key intrinsic parameters of the algorithm was not systematically investigated. It is relatively difficult to estimate the "optimal" parameters since the original definition of the local density in the algorithm is based on a truncated counting measure. In this paper, we propose a clustering procedure with adaptive density peak detection, where the local density is estimated through nonparametric multivariate kernel estimation. The model parameter can then be calculated from equations with statistical theoretical justification. We also develop an automatic cluster centroid selection method by maximizing an average silhouette index. The advantage and flexibility of the proposed method are demonstrated through simulation studies and the analysis of a few benchmark gene expression data sets. The method runs in a single step without any iteration and is therefore fast, with great potential for big data analysis. A user-friendly R package, ADPclust, is developed for public use.
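The density-peak idea with a nonparametric kernel density in place of the truncated counting measure can be sketched compactly: estimate each point's density, compute its distance to the nearest higher-density point, and pick centres where both are large. The automatic centre selection by silhouette maximization in the R package is not reproduced; the two-centre choice below is hard-coded for the toy data.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.spatial.distance import cdist

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 0.4, (100, 2)), rng.normal(3, 0.4, (100, 2))])

rho = gaussian_kde(X.T)(X.T)                  # multivariate kernel density at each point
D = cdist(X, X)

# delta: distance to the nearest point of higher density (max distance for the global peak)
delta = np.empty(len(X))
for i in range(len(X)):
    higher = np.where(rho > rho[i])[0]
    delta[i] = D[i].max() if higher.size == 0 else D[i, higher].min()

# Candidate centres: large density and large delta (here simply the top two of rho*delta)
centres = np.argsort(rho * delta)[-2:]
labels = np.argmin(D[:, centres], axis=1)     # simplified assignment to the nearest centre
print(centres, np.bincount(labels))
```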
Murakami, Nozomu; Tanabe, Kouichi; Morita, Tatsuya; Fujikawa, Yasunaga; Koseki, Shiro; Kajiura, Shinya; Nakajima, Kazunori; Hayashi, Ryuji
2018-05-03
To examine the clinical outcomes of a project to enhance the awareness of community-based palliative care (awareness-enhancing project), focusing on home death and care rates in communities. A single-center study on community-based intervention was conducted. The awareness-enhancing project, consisting of three intervention approaches (outreach, palliative care education for community-based medical professionals, and information-sharing tool use), was executed, and changes in the home death rate in the community were examined. The home death rate markedly exceeded the national mean from 2010. In 2012-2013, it was as high as 19.9%, greater than the previous 5.9% (p = 0.001). Through multivariate analysis, the participation of home care physicians and visiting nurses in a palliative care education program, and patients' Palliative Prognostic Index values were identified as factors significantly influencing the home death rate. The three intervention approaches time dependently increased the home death rate as a clinical outcome in the community, although they targeted limited areas. These approaches may aid in increasing the number of individuals who die in their homes.
Blended learning in situated contexts: 3-year evaluation of an online peer review project.
Bridges, S; Chang, J W W; Chu, C H; Gardner, K
2014-08-01
Situated and sociocultural perspectives on learning indicate that the design of complex tasks supported by educational technologies holds potential for dental education in moving novices towards closer approximation of the clinical outcomes of their expert mentors. A cross-faculty, student-centred, web-based project in operative dentistry was established within the Universitas 21 (U21) network of higher education institutions to support university goals for internationalisation in clinical learning by enabling distributed interactions across sites and institutions. This paper aims to present an evaluation of one dental faculty's project experience of curriculum redesign for deeper student learning. A mixed-method case study approach was utilised. Three cohorts of second-year students from a 5-year bachelor of dental surgery (BDS) programme were invited to participate in annual surveys and focus group interviews on project completion. Survey data were analysed for differences between years using multivariate logistic regression analysis. Thematic analysis of questionnaire open responses and interview transcripts was conducted. Multivariate logistic regression analysis noted significant differences across items over time, indicating learning improvements, attainment of university aims and the positive influence of redesign. Students perceived the enquiry-based project as stimulating and motivating, and as building confidence in operative techniques. Institutional goals for greater understanding of others and lifelong learning showed improvement over time. Despite positive scores, students indicated that global citizenship and intercultural understanding were conceptually challenging. The establishment of online student learning communities through a blended approach to learning stimulated motivation and intellectual engagement, thereby supporting a situated approach to cognition. Sociocultural perspectives indicate that novice-expert interactions supported student development of professional identities. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Wind Turbine Load Mitigation based on Multivariable Robust Control and Blade Root Sensors
NASA Astrophysics Data System (ADS)
Díaz de Corcuera, A.; Pujana-Arrese, A.; Ezquerra, J. M.; Segurola, E.; Landaluze, J.
2014-12-01
This paper presents two H∞ multivariable robust controllers based on blade root sensor information for individual pitch angle control. The 5 MW wind turbine defined in the Upwind European project is the reference non-linear model used in this research work, modelled in the GH Bladed 4.0 software package. The main objective of these controllers is load mitigation in different components of wind turbines during power production in the above-rated control zone. The first proposed multi-input multi-output (MIMO) individual pitch H∞ controller mitigates the wind effect on the tower side-to-side acceleration and reduces the asymmetrical loads which appear in the rotor due to its misalignment. The second individual pitch H∞ multivariable controller mitigates the loads on the three blades, reducing the wind effect on the flapwise and edgewise bending moments in the blades. The designed H∞ controllers have been validated in GH Bladed, and an exhaustive analysis has been carried out to calculate the fatigue load reduction on wind turbine components, as well as to analyze load mitigation in some extreme cases.
NASA Astrophysics Data System (ADS)
Svoboda, Aaron A.; Forbes, Jeffrey M.; Miyahara, Saburo
2005-11-01
A self-consistent global tidal climatology, useful for comparing and interpreting radar observations from different locations around the globe, is created from space-based Upper Atmosphere Research Satellite (UARS) horizontal wind measurements. The climatology created includes tidal structures for horizontal winds, temperature and relative density, and is constructed by fitting local (in latitude and height) UARS wind data at 95 km to a set of basis functions called Hough mode extensions (HMEs). These basis functions are numerically computed modifications to Hough modes and are globally self-consistent in wind, temperature, and density. We first demonstrate this self-consistency with a proxy data set from the Kyushu University General Circulation Model, and then use a linear weighted superposition of the HMEs obtained from monthly fits to the UARS data to extrapolate the global, multi-variable tidal structure. A brief explanation of the HMEs’ origin is provided as well as information about a public website that has been set up to make the full extrapolated data sets available.
Analysis of polarization in hydrogen bonded complexes: An asymptotic projection approach
NASA Astrophysics Data System (ADS)
Drici, Nedjoua
2018-03-01
The asymptotic projection technique is used to investigate the polarization effect that arises from the interaction between the relaxed, and frozen monomeric charge densities of a set of neutral and charged hydrogen bonded complexes. The AP technique based on the resolution of the original Kohn-Sham equations can give an acceptable qualitative description of the polarization effect in neutral complexes. The significant overlap of the electron densities, in charged and π-conjugated complexes, impose further development of a new functional, describing the coupling between constrained and non-constrained electron densities within the AP technique to provide an accurate representation of the polarization effect.
Crandall, Carolyn J; Zheng, Yan; Karlamangla, Arun; Sternfeld, Barbara; Habel, Laurel A; Oestreicher, Nina; Johnston, Janet; Cauley, Jane A; Greendale, Gail A
2007-08-01
Bone mineral density and mammographic breast density are each associated with markers of lifetime estrogen exposure. The association between mammographic breast density and bone mineral density in early perimenopausal women is unknown. We analyzed data from a cohort (n = 501) of premenopausal (no change in menstrual regularity) and early perimenopausal (decreased menstrual regularity in past 3 months) participants of African-American, Caucasian, Chinese, and Japanese ethnicity in the Study of Women's Health Across the Nation. Using multivariable linear regression, we examined the cross-sectional association between percent mammographic density and bone mineral density (BMD). Percent mammographic density was statistically significantly inversely associated with hip BMD and lumbar spine BMD after adjustment (body mass index, ethnicity, age, study site, parity, alcohol intake, cigarette smoking, physical activity, age at first childbirth) in early perimenopausal, but not premenopausal, women. In early perimenopausal women, every 0.1g/cm(2) greater hip BMD predicted a 2% lower percent mammographic density (95% confidence interval -37.0 to -0.6%, p = 0.04). Mammographic breast density is inversely associated with BMD in the perimenopausal participants of this community-based cohort. The biological underpinnings of these findings may reflect differential responsiveness of breast and bone mineral density to the steroid milieu.
Parametric Cost Models for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney
2010-01-01
Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescopes cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
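As a minimal sketch of a single-variable cost estimating relationship of the form cost ≈ a·D^b, the code below fits the coefficients by ordinary least squares in log-log space. The diameter-cost pairs are synthetic and the fitted values are not the published CER coefficients.

```python
import numpy as np

# Synthetic (aperture diameter [m], cost) pairs standing in for a historical telescope database
D = np.array([0.3, 0.5, 0.85, 1.2, 2.4, 3.5, 6.5])
cost = 120.0 * D ** 1.6 * np.exp(np.random.default_rng(8).normal(0, 0.15, D.size))

# Fit log(cost) = log(a) + b*log(D)
A = np.column_stack([np.ones_like(D), np.log(D)])
(loga, b), *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)
print(f"cost ~ {np.exp(loga):.1f} * D^{b:.2f}")
```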
Kim, Won Hwa; Cho, Nariya; Kim, Young-Seon; Yi, Ann
2018-04-06
OBJECTIVES: To evaluate the changes in mammographic density after tamoxifen discontinuation in premenopausal women with oestrogen receptor-positive breast cancers and the underlying factors. METHODS: A total of 213 consecutive premenopausal women with breast cancer who received tamoxifen treatment after curative surgery and underwent three mammograms (baseline, after tamoxifen treatment, after tamoxifen discontinuation) were included. Changes in mammographic density after tamoxifen discontinuation were assessed qualitatively (decrease, no change, or increase) by two readers and measured quantitatively by semi-automated software. The association between % density change and clinicopathological factors was evaluated using univariate and multivariate regression analyses. RESULTS: After tamoxifen discontinuation, a mammographic density increase was observed in 31.9% (68/213, reader 1) to 22.1% (47/213, reader 2) by qualitative assessment, with a mean density increase of 1.8% by quantitative assessment compared to density before tamoxifen discontinuation. In multivariate analysis, younger age (≤ 39 years) and greater % density decline after tamoxifen treatment (≥ 17.0%) were independent factors associated with density change after tamoxifen discontinuation (p < .001 and p = .003, respectively). CONCLUSIONS: Tamoxifen discontinuation was associated with mammographic density change, with a mean density increase of 1.8%, which was associated with younger age and greater density change after tamoxifen treatment. KEY POINTS: • Increased mammographic density after tamoxifen discontinuation can occur in premenopausal women. • Mean density increase after tamoxifen discontinuation was 1.8%. • Density increase is associated with age and density decrease after tamoxifen.
Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.
1980-01-01
Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, silt, as well as density, sonic travel time, resistivity, and γ-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences for the variables measured, differences that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. © 1980 Plenum Publishing Corporation.
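A small sketch of the ranked MANOVA idea follows, using the statsmodels MANOVA interface: each measured variable is rank-transformed and the multivariate means are tested across lithologic type. The column names, group labels and data are placeholders, not the Devonian shale measurements.

```python
import numpy as np
import pandas as pd
from scipy.stats import rankdata
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(11)
n = 120
df = pd.DataFrame({
    "litho": rng.choice(["banded", "laminated", "lenticular", "nonbanded"], n),
    "quartz": rng.normal(30, 5, n),
    "organic": rng.normal(4, 1, n),
    "bulk_density": rng.normal(2.5, 0.1, n),
})

# Rank-transform the responses (parametric MANOVA applied to ranks)
for col in ["quartz", "organic", "bulk_density"]:
    df[col + "_r"] = rankdata(df[col])

m = MANOVA.from_formula("quartz_r + organic_r + bulk_density_r ~ litho", data=df)
print(m.mv_test())
```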
Density of Indoor Tanning Facilities in 116 Large U.S. Cities
Hoerster, Katherine D.; Garrow, Rebecca L.; Mayer, Joni A.; Clapp, Elizabeth J.; Weeks, John R.; Woodruff, Susan I.; Sallis, James F.; Slymen, Donald J.; Patel, Minal R.; Sybert, Stephanie A.
2009-01-01
Background: U.S. adolescents and young adults are using indoor tanning at high rates, even though it has been linked to both melanoma and squamous cell cancer. Because the availability of commercial indoor tanning facilities may influence use, data are needed on the number and density of such facilities. Methods: In March 2006, commercial indoor tanning facilities in 116 large U.S. cities were identified, and the number and density (per 100,000 population) were computed for each city. Bivariate and multivariate analyses conducted in 2008 tested the association between tanning-facility density and selected geographic, climatologic, demographic, and legislative variables. Results: Mean facility number and density across cities were 41.8 (SD=30.8) and 11.8 (SD=6.0), respectively. In multivariate analysis, cities with higher percentages of whites and lower ultraviolet (UV) index scores had significantly higher facility densities than those with lower percentages of whites and higher UV index scores. Conclusions: These data indicate that commercial indoor tanning is widely available in the urban U.S., and this availability may help explain the high usage of indoor tanning. PMID:19215849
Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten
2015-10-01
This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at 10th, 20th, and 30th treatment fraction significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters were established. The correlation coefficient for the models was 0.35, 0.35, and 0.39, when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early into their treatment course may have minimal radiation induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients with low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Analysis of speckle patterns in phase-contrast images of lung tissue
NASA Astrophysics Data System (ADS)
Kitchen, M. J.; Paganin, D.; Lewis, R. A.; Yagi, N.; Uesugi, K.
2005-08-01
Propagation-based phase-contrast images of mouse lungs have been obtained at the SPring-8 synchrotron research facility. Such images exhibit a speckled intensity pattern that bears a superficial resemblance to alveolar structures. This speckle results from focussing effects as projected air-filled alveoli form aberrated compound refractive lenses. An appropriate phase-retrieval algorithm has been utilized to reconstruct the approximate projected lung tissue thickness from single phase-contrast mouse chest radiographs. The results show projected density variations across the lung, highlighting regions of low density corresponding to air-filled regions. Potentially, this offers a better method than conventional radiography for detecting lung diseases such as fibrosis, emphysema and cancer, though this has yet to be demonstrated. As such, the approach can assist in continuing studies of lung function utilizing propagation-based phase-contrast imaging.
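A sketch of single-material propagation-based phase retrieval in the spirit of the Paganin filter is given below; it recovers a projected-thickness map from one propagation-distance image and a flat field. The refractive index decrement, absorption index, wavelength, pixel size and propagation distance are assumed values, and the algorithm actually used in the study may differ.

```python
import numpy as np

def paganin_thickness(I, I0, pixel=5e-6, dist=1.0, delta=5e-7, beta=2e-10, wavelen=1e-10):
    """Single-material phase retrieval (Paganin-type low-pass filter); parameters are assumed."""
    mu = 4.0 * np.pi * beta / wavelen                     # linear attenuation coefficient
    ky = np.fft.fftfreq(I.shape[0], d=pixel) * 2 * np.pi
    kx = np.fft.fftfreq(I.shape[1], d=pixel) * 2 * np.pi
    k2 = ky[:, None] ** 2 + kx[None, :] ** 2
    filt = 1.0 / (1.0 + dist * delta * k2 / mu)
    retrieved = np.real(np.fft.ifft2(np.fft.fft2(I / I0) * filt))
    return -np.log(np.clip(retrieved, 1e-12, None)) / mu  # projected thickness [m]

# Toy usage with a synthetic flat field and a darker square mimicking attenuating tissue
I0 = np.ones((128, 128))
I = np.ones((128, 128)); I[40:90, 40:90] = 0.7
print(paganin_thickness(I, I0).max())
```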
A pilot study of NMR-based sensory prediction of roasted coffee bean extracts.
Wei, Feifei; Furihata, Kazuo; Miyakawa, Takuya; Tanokura, Masaru
2014-01-01
Nuclear magnetic resonance (NMR) spectroscopy can be considered a kind of "magnetic tongue" for the characterisation and prediction of the tastes of foods, since it provides a wealth of information in a nondestructive and nontargeted manner. In the present study, the chemical substances in roasted coffee bean extracts that could distinguish and predict the different sensations of coffee taste were identified by the combination of NMR-based metabolomics and human sensory test and the application of the multivariate projection method of orthogonal projection to latent structures (OPLS). In addition, the tastes of commercial coffee beans were successfully predicted based on their NMR metabolite profiles using our OPLS model, suggesting that NMR-based metabolomics accompanied with multiple statistical models is convenient, fast and accurate for the sensory evaluation of coffee. Copyright © 2013 Elsevier Ltd. All rights reserved.
Two-dimensional imaging of two types of radicals by the CW-EPR method
NASA Astrophysics Data System (ADS)
Czechowski, Tomasz; Krzyminiewski, Ryszard; Jurga, Jan; Chlewicki, Wojciech
2008-01-01
The CW-EPR method of image reconstruction is based on sample rotation in a magnetic field with a constant gradient (50 G/cm). In order to obtain a projection (radical density distribution) along a given direction, the EPR spectra are recorded with and without the gradient. Deconvolution then gives the distribution of the spin density. Projections at 36 different angles give the information that is necessary for reconstruction of the radical distribution. The problem becomes more complex when there are at least two types of radicals in the sample, because the deconvolution procedure does not give satisfactory results. We propose a method to calculate the projections for each radical, based on iterative procedures. The images of density distribution for each radical obtained by our procedure have proved that the method of deconvolution, in combination with iterative fitting, provides correct results. The test was performed on a sample of polymer PPS Br 111 (p-phenylene sulphide) with glass fibres and minerals. The results indicated a heterogeneous distribution of radicals in the sample volume. The images obtained were in agreement with the known shape of the sample.
Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic
2017-02-01
Multivariate statistical process control (MSPC) is increasingly popular as a way to meet the challenge posed by the large multivariate datasets generated by analytical instruments such as Raman spectroscopy for the monitoring of complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Moreover, unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating way, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analyses such as multi-way principal component analysis (MPCA) for MSPC monitoring. Correlation optimized warping (COW) is a popular method for data alignment with satisfactory performance; however, it had never before been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches with varying durations analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operation condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T2 and Q. As the results illustrate, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition, with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
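For readers unfamiliar with the two control charts mentioned above, the following minimal sketch shows how Hotelling's T2 and Q (squared prediction error) statistics can be computed from a PCA model fitted on normal-operation-condition data; it is an illustration in Python with scikit-learn, not the authors' MPCA implementation, and the data and component counts are placeholders.

```python
# Hedged sketch: Hotelling's T^2 and Q (SPE) statistics from a PCA model
# fitted on normal-operation-condition (NOC) data. Variable names and the
# use of scikit-learn PCA are illustrative, not the authors' implementation.
import numpy as np
from sklearn.decomposition import PCA

def t2_and_q(X_noc, X_new, n_components=3):
    # Mean-center using the NOC training data only.
    mu = X_noc.mean(axis=0)
    pca = PCA(n_components=n_components).fit(X_noc - mu)

    scores = pca.transform(X_new - mu)            # projections onto the PCs
    lam = pca.explained_variance_                 # score variances from NOC data

    # Hotelling's T^2: scaled squared distance within the model plane.
    t2 = np.sum(scores**2 / lam, axis=1)

    # Q (squared prediction error): residual distance from the model plane.
    X_hat = pca.inverse_transform(scores) + mu
    q = np.sum((X_new - X_hat)**2, axis=1)
    return t2, q

# Example with synthetic "spectra": 50 NOC observations, 5 new ones.
rng = np.random.default_rng(0)
X_noc = rng.normal(size=(50, 20))
X_new = rng.normal(size=(5, 20))
print(t2_and_q(X_noc, X_new))
```

New batches whose T2 or Q values exceed control limits derived from the NOC data would be flagged as deviating from normal operation.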
The factors controlling species density in herbaceous plant communities: An assessment
Grace, J.B.
1999-01-01
This paper evaluates both the ideas and empirical evidence pertaining to the control of species density in herbaceous plant communities. While most theoretical discussions of species density have emphasized the importance of habitat productivity and disturbance regimes, many other factors (e.g. species pools, plant litter accumulation, plant morphology) have been proposed to be important. A review of literature presenting observations on the density of species in small plots (in the vicinity of a few square meters or less), as well as experimental studies, suggests several generalizations: (1) Available data are consistent with an underlying unimodal relationship between species density and total community biomass. While variance in species density is often poorly explained by predictor variables, there is strong evidence that high levels of community biomass are antagonistic to high species density. (2) Community biomass is just one of several factors affecting variations in species density. Multivariate analyses typically explain more than twice as much variance in species density as can be explained by community biomass alone. (3) Disturbance has important and sometimes complex effects on species density. In general, the evidence is consistent with the intermediate disturbance hypothesis but exceptions exist and effects can be complex. (4) Gradients in the species pool can have important influences on patterns of species density. Evidence is mounting that a considerable amount of the observed variability in species density within a landscape or region may result from environmental effects on the species pool. (5) Several additional factors deserve greater consideration, including time lags, species composition, plant morphology, plant density and soil microbial effects. Based on the available evidence, a conceptual model of the primary factors controlling species density is presented here. This model suggests that species density is controlled by the effects of disturbance, total community biomass, colonization, the species pool and spatial heterogeneity. The structure of the model leads to two main expectations: (1) while community biomass is important, multivariate approaches will be required to understand patterns of variation in species density, and (2) species density will be more highly correlated with light penetration to the soil surface, than with above-ground biomass, and even less well correlated with plant growth rates (productivity) or habitat fertility. At present, data are insufficient to evaluate the relative importance of the processes controlling species density. Much more work is needed if we are to adequately predict the effects of environmental changes on plant communities and species diversity.
Forward model with space-variant of source size for reconstruction on X-ray radiographic image
NASA Astrophysics Data System (ADS)
Liu, Jin; Liu, Jun; Jing, Yue-feng; Xiao, Bo; Wei, Cai-hua; Guan, Yong-hong; Zhang, Xuan
2018-03-01
The Forward Imaging Technique is a method to solve the inverse problem of density reconstruction in radiographic imaging. In this paper, we introduce the forward projection equation (IFP model) for a radiographic system with areal source blur and detector blur. Our forward projection equation, based on X-ray tracing, is combined with the Constrained Conjugate Gradient method to form a new method for density reconstruction. We demonstrate the effectiveness of the new technique by reconstructing density distributions from simulated and experimental images. We show that, for radiographic systems with source sizes larger than the pixel size, the effect of blur on the density reconstruction is reduced through our method and can be controlled within one or two pixels. The method is also suitable for the reconstruction of non-homogeneous objects.
Tang, Rongnian; Chen, Xupeng; Li, Chuang
2018-05-01
Near-infrared spectroscopy is an efficient, low-cost technology that has potential as an accurate method for detecting the nitrogen content of natural rubber leaves. The successive projections algorithm (SPA) is a widely used variable selection method for multivariate calibration, which uses projection operations to select a variable subset with minimal multi-collinearity. However, due to the fluctuation of the correlation between variables, high collinearity may still exist among non-adjacent variables of the subset obtained by basic SPA. Based on an analysis of the correlation matrix of the spectral data, this paper proposes a correlation-based SPA (CB-SPA) that applies the successive projections algorithm within regions of consistent correlation. The results show that CB-SPA can select variable subsets with more valuable variables and less multi-collinearity. Meanwhile, models established on the CB-SPA subset outperform those based on the basic SPA subset in predicting nitrogen content in terms of both cross-validation and external prediction. Moreover, CB-SPA is more efficient, as the time cost of its selection procedure is one-twelfth that of basic SPA.
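As an orientation to the projection step underlying SPA, here is a minimal sketch of the basic algorithm: starting from one wavelength, it repeatedly selects the column with the largest residual norm after projecting out the already-selected columns. It is illustrative only; the CB-SPA variant described above additionally restricts the search to correlation-consistent regions, which is not implemented here.

```python
# Hedged sketch of the basic successive projections algorithm (SPA):
# iteratively pick the variable (column) with the largest norm after
# projecting out the columns already selected. Data are synthetic.
import numpy as np

def spa_select(X, k, start=0):
    """Select k column indices from X (n_samples x n_variables)."""
    selected = [start]
    P = X[:, [start]].astype(float)            # matrix of selected columns
    for _ in range(k - 1):
        # Project every column onto the orthogonal complement of span(P).
        Q, _ = np.linalg.qr(P)
        residual = X - Q @ (Q.T @ X)
        norms = np.linalg.norm(residual, axis=0)
        norms[selected] = -np.inf               # do not reselect a variable
        nxt = int(np.argmax(norms))
        selected.append(nxt)
        P = X[:, selected].astype(float)
    return selected

rng = np.random.default_rng(1)
spectra = rng.normal(size=(60, 200))            # 60 samples, 200 wavelengths
print(spa_select(spectra, k=5))
```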
KMgene: a unified R package for gene-based association analysis for complex traits.
Yan, Qi; Fang, Zhou; Chen, Wei; Stegle, Oliver
2018-02-09
In this report, we introduce an R package, KMgene, for performing gene-based association tests for familial, multivariate or longitudinal traits using kernel machine (KM) regression under a generalized linear mixed model (GLMM) framework. Extensive simulations were performed to evaluate the validity of the approaches implemented in KMgene. Availability: http://cran.r-project.org/web/packages/KMgene. Contact: qi.yan@chp.edu or wei.chen@chp.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.
High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel A. Mosher; Xia Tang; Ronald J. Brown
2007-07-27
This final report describes the motivations, activities and results of the hydrogen storage independent project "High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides" performed by the United Technologies Research Center under the Department of Energy Hydrogen Program, contract # DE-FC36-02AL67610. The objectives of the project were to identify and address the key systems technologies associated with applying complex hydride materials, particularly ones which differ from those for conventional metal hydride based storage. This involved the design, fabrication and testing of two prototype systems based on the hydrogen storage material NaAlH4. Safety testing, catalysis studies, heat exchanger optimization, reaction kinetics modeling, thermochemical finite element analysis, powder densification development and material neutralization were elements included in the effort.
Flow experience in teams: The role of shared leadership.
Aubé, Caroline; Rousseau, Vincent; Brunelle, Eric
2018-04-01
The present study tests a multilevel mediation model concerning the effect of shared leadership on team members' flow experience. Specifically, we investigate the mediating role of teamwork behaviors in the relationships between 2 complementary indicators of shared leadership (i.e., density and centralization) and flow. Based on a multisource approach, we collected data through observation and survey of 111 project teams (521 individuals) made up of university students participating in a project management simulation. The results show that density and centralization have both an additive effect and an interaction effect on teamwork behaviors, such that the relationship between density and teamwork behaviors is stronger when centralization is low. In addition, teamwork behaviors play a mediating role in the relationship between shared leadership and flow. Overall, the findings highlight the importance of promoting team-based shared leadership in organizations to favor the flow experience. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
BrainPort(Registered trademark) Technology Tongue Interface Characterization
2010-03-01
[Extraction residue from report figures and abbreviation list; recoverable items: Experiment Control Workstation; high-density array (HDA) with 2,000 to 20,000 electrodes; TCP/IP 10/100 Linux-based control. Abbreviations: DARPA, Defense Advanced Research Projects Agency; HD, High Density; HDA, High Density Array; IOD, Intra-Oral Device; IRB, Institutional Review Board.]
NASA Astrophysics Data System (ADS)
Wahl, Thomas; Jensen, Jürgen; Mudersbach, Christoph
2010-05-01
Storm surges along the German North Sea coastline led to major damages in the past, and the risk of inundation is expected to increase in the course of ongoing climate change. Knowledge of the characteristics of possible storm surges is essential for the performance of integrated risk analyses, e.g. based on the source-pathway-receptor concept. The latter includes storm surge simulation/analyses (source), modelling of dike/dune breach scenarios (pathway) and the quantification of potential losses (receptor). In subproject 1b of the German joint research project XtremRisK (www.xtremrisk.de), a stochastic storm surge generator for the south-eastern North Sea area is developed. The input data for the multivariate model are high-resolution sea level observations from tide gauges during extreme events. Based on 25 parameters (19 sea level parameters and 6 time parameters), observed storm surge hydrographs consisting of three tides are parameterised. After fitting common parametric probability distributions and performing a large number of Monte Carlo simulations, the final reconstruction yields a set of 100,000 (default) synthetic storm surge events with a one-minute resolution. Such a data set can potentially serve as the basis for a large number of applications. For risk analyses, storm surges with peak water levels exceeding the design water levels are of special interest. The occurrence probabilities of the simulated extreme events are estimated based on multivariate statistics, considering the parameters "peak water level" and "fullness/intensity". In the past, most studies considered only the peak water levels during extreme events, which may not be the most important parameter in all cases. Here, a 2D Archimedean copula model is used for the estimation of the joint probabilities of the selected parameters, accounting for the dependence structure separately from the marginal distributions. In coordination with subproject 1a, the results will be used as input for the XtremRisK subprojects 2 to 4. The project is funded by the German Federal Ministry of Education and Research (BMBF) (Project No. 03 F 0483 B).
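For orientation, the joint exceedance probability of two storm-surge parameters under a 2D Archimedean copula can be sketched as follows; the Gumbel family, the copula parameter, and the marginal probabilities are illustrative assumptions, not values from the XtremRisK project.

```python
# Hedged sketch: joint exceedance probability of two storm-surge parameters
# (e.g. peak water level and fullness/intensity) using a Gumbel copula, one
# common 2D Archimedean family. theta and the marginal non-exceedance
# probabilities below are placeholders.
import numpy as np

def gumbel_copula_cdf(u, v, theta):
    # C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1
    return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1.0 / theta))

def joint_exceedance(u, v, theta):
    # P(U > u, V > v) = 1 - u - v + C(u, v)
    return 1.0 - u - v + gumbel_copula_cdf(u, v, theta)

# Assumed marginal non-exceedance probabilities of the two parameters.
u_peak, v_fullness, theta = 0.99, 0.95, 2.0
print(f"joint exceedance probability: {joint_exceedance(u_peak, v_fullness, theta):.4e}")
```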
SPReM: Sparse Projection Regression Model For High-dimensional Linear Regression *
Sun, Qiang; Zhu, Hongtu; Liu, Yufeng; Ibrahim, Joseph G.
2014-01-01
The aim of this paper is to develop a sparse projection regression modeling (SPReM) framework to perform multivariate regression modeling with a large number of responses and a multivariate covariate of interest. We propose two novel heritability ratios to simultaneously perform dimension reduction, response selection, estimation, and testing, while explicitly accounting for correlations among multivariate responses. Our SPReM is devised to specifically address the low statistical power issue of many standard statistical approaches, such as Hotelling's T2 test statistic or a mass univariate analysis, for high-dimensional data. We formulate the estimation problem of SPReM as a novel sparse unit rank projection (SURP) problem and propose a fast optimization algorithm for SURP. Furthermore, we extend SURP to the sparse multi-rank projection (SMURP) by adopting a sequential SURP approximation. Theoretically, we have systematically investigated the convergence properties of SURP and the convergence rate of SURP estimates. Our simulation results and real data analysis have shown that SPReM outperforms other state-of-the-art methods. PMID:26527844
Monograph on the use of the multivariate Gram Charlier series Type A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hatayodom, T.; Heydt, G.
1978-01-01
The Gram-Charlier series is an infinite series expansion of a probability density function (pdf) in which the terms of the series involve Hermite polynomials. There are several Gram-Charlier series; the best known is Type A. The Gram-Charlier series, Type A (GCA) exists for both univariate and multivariate random variables. This monograph introduces the multivariate GCA and illustrates its use through several examples. A brief bibliography and discussion of Hermite polynomials is also included. 9 figures, 2 tables.
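For orientation, the commonly cited univariate form of the Type A expansion for a standardized variable (zero mean, unit variance), with cumulants kappa_n and the probabilists' Hermite polynomials He_n, is given below; the monograph itself develops the multivariate generalization.

```latex
% Commonly cited univariate Gram-Charlier Type A expansion for a standardized
% random variable; \kappa_n are cumulants and \mathrm{He}_n the probabilists'
% Hermite polynomials.
f(x) \approx \phi(x)\left[1 + \frac{\kappa_3}{3!}\,\mathrm{He}_3(x)
                            + \frac{\kappa_4}{4!}\,\mathrm{He}_4(x) + \cdots\right],
\qquad \phi(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}.
```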
NASA Astrophysics Data System (ADS)
Nagai, Keiji; Yang, H.; Norimatsu, T.; Azechi, H.; Belkada, F.; Fujimoto, Y.; Fujimura, T.; Fujioka, K.; Fujioka, S.; Homma, H.; Ito, F.; Iwamoto, A.; Jitsuno, T.; Kaneyasu, Y.; Nakai, M.; Nemoto, N.; Saika, H.; Shimoyama, T.; Suzuki, Y.; Yamanaka, K.; Mima, K.
2009-09-01
The development of target fabrication for the Fast Ignition Realization EXperiment (FIREX) Project is described in this paper. For the first stage of the FIREX Project (FIREX-I), the previously designed target has been modified by using a bromine-doped ablator and coating the inner gold cone with a low-density material. A high-quality bromine-doped capsule without vacuoles was fabricated from bromine-doped deuterated polystyrene. The gold surface was coated with a low-density material by electrochemical plating. For the cryogenic fuel target, a brand new type of aerogel material, phloroglucinol/formaldehyde (PF), was investigated and encapsulated to meet the specifications of 500 µm diameter and 20 µm thickness, with 30 nm nanopores. Polystyrene-based low-density materials were investigated and the relationship between the crosslinker content and the nanopore structure was observed.
Monochromatic-beam-based dynamic X-ray microtomography based on OSEM-TV algorithm.
Xu, Liang; Chen, Rongchang; Yang, Yiming; Deng, Biao; Du, Guohao; Xie, Honglan; Xiao, Tiqiao
2017-01-01
Monochromatic-beam-based dynamic X-ray computed microtomography (CT) was developed to observe the evolution of microstructure inside samples. However, the low flux density results in low efficiency in data collection. To increase efficiency, reducing the number of projections is a practical solution; however, it degrades image reconstruction quality when the traditional filtered back projection (FBP) algorithm is used. In this study, an iterative reconstruction method using an ordered subset expectation maximization-total variation (OSEM-TV) algorithm was employed to address this problem. The simulated results demonstrated that the normalized mean square error of the image slices reconstructed by the OSEM-TV algorithm was about 1/4 of that by FBP. Experimental results also demonstrated that the density resolution of OSEM-TV was high enough to resolve different materials with fewer than 100 projections. As a result, with the introduction of OSEM-TV, monochromatic-beam-based dynamic X-ray microtomography is potentially practicable for quantitative and non-destructive analysis of the evolution of microstructure with acceptable efficiency in data collection and reconstructed image quality.
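A minimal sketch of the ordered-subset expectation-maximization (OSEM) update is given below for a dense toy system matrix; the total-variation regularization step that distinguishes OSEM-TV is omitted, and the matrix sizes and subset counts are illustrative assumptions.

```python
# Hedged sketch of an OSEM update for tomographic reconstruction with a toy
# dense system matrix A (projections y = A x). The TV regularization used in
# OSEM-TV is not included.
import numpy as np

def osem(A, y, n_subsets=4, n_iters=10, eps=1e-12):
    n_proj, n_vox = A.shape
    x = np.ones(n_vox)                                   # positive initial image
    subsets = np.array_split(np.arange(n_proj), n_subsets)
    for _ in range(n_iters):
        for idx in subsets:
            A_s, y_s = A[idx], y[idx]
            ratio = y_s / (A_s @ x + eps)                # measured / estimated projections
            x *= (A_s.T @ ratio) / (A_s.T @ np.ones(len(idx)) + eps)
    return x

rng = np.random.default_rng(2)
A = rng.uniform(size=(80, 50))                           # toy system matrix
x_true = rng.uniform(size=50)
y = A @ x_true
print(np.linalg.norm(osem(A, y) - x_true) / np.linalg.norm(x_true))
```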
Breeding population density and habitat use of Swainson's warblers in a Georgia floodplain forest
Wright, E.A.
2002-01-01
I examined density and habitat use of a Swainson's Warbler (Limnothlypis swainsonii) breeding population in Georgia. This songbird species is inadequately monitored, and may be declining due to anthropogenic alteration of floodplain forest breeding habitats. I used distance sampling methods to estimate density, finding 9.4 singing males/ha (CV = 0.298). Individuals were encountered too infrequently to produce a low-variance estimate, and distance sampling thus may be impracticable for monitoring this relatively rare species. I developed a set of multivariate habitat models using binary logistic regression techniques, based on measurement of 22 variables in 56 plots occupied by Swainson's Warblers and 110 unoccupied plots. Occupied areas were characterized by high stem density of cane (Arundinaria gigantea) and other shrub layer vegetation, and by the presence of abundant and accessible leaf litter. I recommend two habitat models, which correctly classified 87-89% of plots in cross-validation runs, for potential use in habitat assessment at other locations.
Lupton, Joshua R; Faridi, Kamil F; Martin, Seth S; Sharma, Sristi; Kulkarni, Krishnaji; Jones, Steven R; Michos, Erin D
2016-01-01
Cross-sectional studies have found an association between deficiencies in serum vitamin D, as measured by 25-hydroxyvitamin D (25[OH]D), and an atherogenic lipid profile. These studies have focused on a limited panel of lipid values including low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), and triglycerides (TG). Our study examines the relationship between serum 25(OH)D and an extended lipid panel (Vertical Auto Profile) while controlling for age, gender, glycemic status, and kidney function. We used the Very Large Database of Lipids, which includes US adults clinically referred for analysis of their lipid profile from 2009 to 2011. Our study focused on 20,360 subjects who had data for lipids, 25(OH)D, age, gender, hemoglobin A1c, insulin, creatinine, and blood urea nitrogen. Subjects were split into groups based on serum 25(OH)D: deficient (<20 ng/mL), intermediate (≥20 to <30 ng/mL), and optimal (≥30 ng/mL). The deficient group was compared to the optimal group using multivariable linear regression. In multivariable-adjusted linear regression, deficient serum 25(OH)D was associated with significantly lower serum HDL-C (-5.1%) and higher total cholesterol (+9.4%), non-HDL-C (+15.4%), directly measured LDL-C (+13.5%), intermediate-density lipoprotein cholesterol (+23.7%), very low-density lipoprotein cholesterol (+19.0%), remnant lipoprotein cholesterol (+18.4%), and TG (+26.4%) when compared with the optimal group. Deficient serum 25(OH)D is associated with significantly lower HDL-C and higher directly measured LDL-C, intermediate-density lipoprotein cholesterol, very low-density lipoprotein cholesterol, remnant lipoprotein cholesterol, and TG. Future trials examining vitamin D supplementation and cardiovascular disease risk should consider using changes in an extended lipid panel as an additional outcome measurement. Copyright © 2016 National Lipid Association. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rupšys, P.
A system of stochastic differential equations (SDEs) with mixed-effects parameters and a multivariate normal copula density function was used to develop a tree height model for Scots pine trees in Lithuania. A two-step maximum likelihood parameter estimation method is used and computational guidelines are given. After fitting the conditional probability density functions to outside-bark diameter at breast height and total tree height, a bivariate normal copula distribution model was constructed. Predictions from the mixed-effects SDE tree height model calculated during this research were compared to regression tree height equations. The results are implemented in the symbolic computational language MAPLE.
Performance Assessment of Kernel Density Clustering for Gene Expression Profile Data
Zeng, Beiyan; Chen, Yiping P.; Smith, Oscar H.
2003-01-01
Kernel density smoothing techniques have been used in classification or supervised learning of gene expression profile (GEP) data, but their application to clustering or unsupervised learning of those data has not been explored and assessed. Here we report a kernel density clustering method for analysing GEP data and compare its performance with the three most widely used clustering methods: hierarchical clustering, K-means clustering, and multivariate mixture model-based clustering. Using several methods to measure agreement, between-cluster isolation, and within-cluster coherence, such as the Adjusted Rand Index, the Pseudo F test, the r2 test, and the profile plot, we have assessed the effectiveness of kernel density clustering for recovering clusters, and its robustness against noise, on both simulated and real GEP data. Our results show that the kernel density clustering method has excellent performance in recovering clusters from simulated data and in grouping large real expression profile data sets into compact and well-isolated clusters, and that it is the most robust clustering method for analysing noisy expression profile data compared to the other three methods assessed. PMID:18629292
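As one concrete example of kernel-density-driven clustering, the mean-shift algorithm climbs a kernel density estimate to its modes and groups points by the mode they reach; the sketch below uses scikit-learn on synthetic data and is a stand-in for, not a reproduction of, the authors' method.

```python
# Hedged illustration of kernel-density-based clustering via mean shift:
# points are assigned to the density mode reached by iteratively shifting
# toward the kernel-weighted local mean. Data are synthetic "profiles".
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(3)
# Toy expression profiles: two groups of samples in 10 dimensions.
X = np.vstack([rng.normal(0.0, 0.3, size=(40, 10)),
               rng.normal(2.0, 0.3, size=(40, 10))])

bw = estimate_bandwidth(X, quantile=0.3)      # kernel bandwidth from the data
labels = MeanShift(bandwidth=bw).fit_predict(X)
print("clusters found:", len(np.unique(labels)))
```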
Fabric pilling measurement using three-dimensional image
NASA Astrophysics Data System (ADS)
Ouyang, Wenbin; Wang, Rongwu; Xu, Bugao
2013-10-01
We introduce a stereovision system and three-dimensional (3-D) image analysis algorithms for fabric pilling measurement. Based on the depth information available in the 3-D image, the pilling detection process proceeds from seed searching at local depth maxima to region growing around the selected seeds using both depth and distance criteria. After pilling detection, the density, height, and area of individual pills in the image can be extracted to describe the pilling appearance. According to a multivariate regression analysis on the 3-D images of 30 cotton fabrics treated by random-tumble and home-laundering machines, the pilling grade is highly correlated with the pilling density (R = 0.923) but does not change consistently with the pilling height and area. The pilling densities measured from the 3-D images also correlate well with those counted manually from the samples (R = 0.985).
A MATLAB implementation of the minimum relative entropy method for linear inverse problems
NASA Astrophysics Data System (ADS)
Neupauer, Roseanna M.; Borchers, Brian
2001-08-01
The minimum relative entropy (MRE) method can be used to solve linear inverse problems of the form Gm = d, where m is a vector of unknown model parameters and d is a vector of measured data. The MRE method treats the elements of m as random variables, and obtains a multivariate probability density function for m. The probability density function is constrained by prior information about the upper and lower bounds of m, a prior expected value of m, and the measured data. The solution of the inverse problem is the expected value of m, based on the derived probability density function. We present a MATLAB implementation of the MRE method. Several numerical issues arise in the implementation of the MRE method and are discussed here. We present the source history reconstruction problem from groundwater hydrology as an example of the MRE implementation.
Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G.; Shah, Arvind K.; Lin, Jianxin
2013-01-01
In this paper, we propose a class of Box-Cox transformation regression models with multidimensional random effects for analyzing multivariate responses for individual patient data (IPD) in meta-analysis. Our modeling formulation uses a multivariate normal response meta-analysis model with multivariate random effects, in which each response is allowed to have its own Box-Cox transformation. Prior distributions are specified for the Box-Cox transformation parameters as well as the regression coefficients in this complex model, and the Deviance Information Criterion (DIC) is used to select the best transformation model. Since the model is quite complex, a novel Monte Carlo Markov chain (MCMC) sampling scheme is developed to sample from the joint posterior of the parameters. This model is motivated by a very rich dataset comprising 26 clinical trials involving cholesterol-lowering drugs where the goal is to jointly model the three-dimensional response consisting of Low Density Lipoprotein Cholesterol (LDL-C), High Density Lipoprotein Cholesterol (HDL-C), and Triglycerides (TG) (LDL-C, HDL-C, TG). Since the joint distribution of (LDL-C, HDL-C, TG) is not multivariate normal and in fact quite skewed, a Box-Cox transformation is needed to achieve normality. In the clinical literature, these three variables are usually analyzed univariately; however, a multivariate approach would be more appropriate since these variables are correlated with each other. A detailed analysis of these data is carried out using the proposed methodology. PMID:23580436
Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G; Shah, Arvind K; Lin, Jianxin
2013-10-15
In this paper, we propose a class of Box-Cox transformation regression models with multidimensional random effects for analyzing multivariate responses for individual patient data in meta-analysis. Our modeling formulation uses a multivariate normal response meta-analysis model with multivariate random effects, in which each response is allowed to have its own Box-Cox transformation. Prior distributions are specified for the Box-Cox transformation parameters as well as the regression coefficients in this complex model, and the deviance information criterion is used to select the best transformation model. Because the model is quite complex, we develop a novel Monte Carlo Markov chain sampling scheme to sample from the joint posterior of the parameters. This model is motivated by a very rich dataset comprising 26 clinical trials involving cholesterol-lowering drugs where the goal is to jointly model the three-dimensional response consisting of low density lipoprotein cholesterol (LDL-C), high density lipoprotein cholesterol (HDL-C), and triglycerides (TG) (LDL-C, HDL-C, TG). Because the joint distribution of (LDL-C, HDL-C, TG) is not multivariate normal and in fact quite skewed, a Box-Cox transformation is needed to achieve normality. In the clinical literature, these three variables are usually analyzed univariately; however, a multivariate approach would be more appropriate because these variables are correlated with each other. We carry out a detailed analysis of these data by using the proposed methodology. Copyright © 2013 John Wiley & Sons, Ltd.
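As a simple univariate illustration of the transformation these two papers embed in their Bayesian multivariate meta-analysis (which is not reproduced here), a Box-Cox transformation toward normality can be sketched as follows with synthetic, triglyceride-like data.

```python
# Hedged sketch: per-response Box-Cox transformation toward normality.
# The data are synthetic; the papers above estimate transformation parameters
# jointly within a Bayesian multivariate model, which this does not attempt.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
tg = rng.lognormal(mean=5.0, sigma=0.5, size=500)   # skewed triglyceride-like values

tg_bc, lam = stats.boxcox(tg)                       # maximum-likelihood lambda
print(f"estimated lambda: {lam:.3f}")
print("skewness before/after:",
      round(stats.skew(tg), 3), round(stats.skew(tg_bc), 3))
```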
Measuring firm size distribution with semi-nonparametric densities
NASA Astrophysics Data System (ADS)
Cortés, Lina M.; Mora-Valencia, Andrés; Perote, Javier
2017-11-01
In this article, we propose a new methodology based on a (log) semi-nonparametric (log-SNP) distribution that nests the lognormal and enables better fits in the upper tail of the distribution through the introduction of new parameters. We test the performance of the lognormal and log-SNP distributions capturing firm size, measured through a sample of US firms in 2004-2015. Taking different levels of aggregation by type of economic activity, our study shows that the log-SNP provides a better fit of the firm size distribution. We also formally introduce the multivariate log-SNP distribution, which encompasses the multivariate lognormal, to analyze the estimation of the joint distribution of the value of the firm's assets and sales. The results suggest that sales are a better firm size measure, as indicated by other studies in the literature.
NASA Astrophysics Data System (ADS)
Wang, Chaolin; Zhong, Shaobo; Zhang, Fushen; Huang, Quanyi
2016-11-01
Precipitation interpolation has been an active area of research for many years and is closely related to meteorological factors. In this paper, precipitation from 91 meteorological stations located in and around Yunnan, Guizhou and Guangxi Zhuang provinces (or autonomous region), Mainland China, was used for spatial interpolation. A multivariate Bayesian maximum entropy (BME) method with auxiliary variables, including mean relative humidity, water vapour pressure, mean temperature, mean wind speed and terrain elevation, was used to obtain a more accurate regional distribution of annual precipitation. The means, standard deviations, skewness and kurtosis of the meteorological factors were calculated. Variograms and cross-variograms were fitted between precipitation and the auxiliary variables. The results showed that the multivariate BME method, which combines hard data with soft data expressed as probability density functions, was precise. Annual mean precipitation was positively correlated with mean relative humidity, mean water vapour pressure, mean temperature and mean wind speed, and negatively correlated with terrain elevation. The results are expected to provide a substantial reference for research on drought and waterlogging in the region.
NASA Astrophysics Data System (ADS)
Wu, W.; Chen, G. Y.; Kang, R.; Xia, J. C.; Huang, Y. P.; Chen, K. J.
2017-07-01
During slaughtering and further processing, chicken carcasses are inevitably contaminated by microbial pathogen contaminants. Due to food safety concerns, many countries implement a zero-tolerance policy that forbids the placement of visibly contaminated carcasses in ice-water chiller tanks during processing. Manual detection of contaminants is labor-intensive and imprecise. Here, a successive projections algorithm (SPA)-multivariable linear regression (MLR) classifier based on an optimal performance threshold was developed for automatic detection of contaminants on chicken carcasses. Hyperspectral images were obtained using a hyperspectral imaging system. A regression model of the classifier was established by MLR based on twelve characteristic wavelengths (505, 537, 561, 562, 564, 575, 604, 627, 656, 665, 670, and 689 nm) selected by SPA, and the optimal threshold T = 1 was obtained from receiver operating characteristic (ROC) analysis. The SPA-MLR classifier provided the best detection results when compared with the SPA-partial least squares (PLS) regression classifier and the SPA-least squares support vector machine (LS-SVM) classifier. The true positive rate (TPR) of 100% and the false positive rate (FPR) of 0.392% indicate that the SPA-MLR classifier can utilize spatial and spectral information to effectively detect contaminants on chicken carcasses.
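In the spirit of the SPA-MLR classifier described above, the following sketch thresholds the continuous output of a multivariable linear regression using a cutoff chosen from ROC analysis; the Youden-J criterion and the synthetic data are assumptions for illustration (the paper reports an optimal threshold T = 1 from its own ROC analysis).

```python
# Hedged sketch: MLR score thresholded with an ROC-derived cutoff.
# The synthetic features stand in for reflectance at selected wavelengths.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import roc_curve

rng = np.random.default_rng(5)
n, k = 300, 12                                   # samples x selected wavelengths
X = rng.normal(size=(n, k))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

score = LinearRegression().fit(X, y).predict(X)  # continuous regression output
fpr, tpr, thr = roc_curve(y, score)
best = np.argmax(tpr - fpr)                      # maximize Youden's J = TPR - FPR
print(f"optimal threshold: {thr[best]:.3f}, TPR={tpr[best]:.3f}, FPR={fpr[best]:.3f}")
```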
Ventilation-Perfusion Relationships Following Experimental Pulmonary Contusion
2007-06-14
[Extraction residue from the report's results and abbreviation list; recoverable items: mean gray-scale density (MGSD) changed from 696.7 ± 6.1 to 565.0 ± 24.3 Hounsfield units (values as extracted), as did VOL (4.3 ± 0.5 to 33.5 ± 3.2%); multivariate linear regression of MGSD, VOL, VD/VT, and QS vs. PaO2 was performed; lung parenchyma was separated into four regions based on the Hounsfield unit (HU) ranges reported by Gattinoni et al. via a segmentation process; significance was determined by repeated-measures ANOVA. Abbreviations: CT, computed tomography; MGSD, mean gray-scale density of the entire lung by CT scan; HU, Hounsfield units.]
Sofaer, Helen R.; Skagen, Susan K.; Barsugli, Joseph J.; Rashford, Benjamin S.; Reese, Gordon C.; Hoeting, Jennifer A.; Wood, Andrew W.; Noon, Barry R.
2016-01-01
Climate change poses major challenges for conservation and management because it alters the area, quality, and spatial distribution of habitat for natural populations. To assess species’ vulnerability to climate change and target ongoing conservation investments, researchers and managers often consider the effects of projected changes in climate and land use on future habitat availability and quality and the uncertainty associated with these projections. Here, we draw on tools from hydrology and climate science to project the impact of climate change on the density of wetlands in the Prairie Pothole Region of the USA, a critical area for breeding waterfowl and other wetland-dependent species. We evaluate the potential for a trade-off in the value of conservation investments under current and future climatic conditions and consider the joint effects of climate and land use. We use an integrated set of hydrological and climatological projections that provide physically based measures of water balance under historical and projected future climatic conditions. In addition, we use historical projections derived from ten general circulation models (GCMs) as a baseline from which to assess climate change impacts, rather than historical climate data. This method isolates the impact of greenhouse gas emissions and ensures that modeling errors are incorporated into the baseline rather than attributed to climate change. Our work shows that, on average, densities of wetlands (here defined as wetland basins holding water) are projected to decline across the U.S. Prairie Pothole Region, but that GCMs differ in both the magnitude and the direction of projected impacts. However, we found little evidence for a shift in the locations expected to provide the highest wetland densities under current vs. projected climatic conditions. This result was robust to the inclusion of projected changes in land use under climate change. We suggest that targeting conservation towards wetland complexes containing both small and relatively large wetland basins, which is an ongoing conservation strategy, may also act to hedge against uncertainty in the effects of climate change.
NASA Astrophysics Data System (ADS)
Rossi, M.; Apuani, T.; Felletti, F.
2009-04-01
The aim of this paper is to compare the results of two statistical methods for landslide susceptibility analysis: 1) a univariate probabilistic method based on a landslide susceptibility index, and 2) a multivariate method (logistic regression). The study area is the Febbraro valley, located in the central Italian Alps, where different types of metamorphic rocks crop out. On the eastern part of the studied basin, a Quaternary cover of colluvial and, secondarily, glacial deposits is dominant. In this study 110 earth flows, mainly located toward the NE portion of the catchment, were analyzed. They involve only the colluvial deposits, and their extent mainly ranges from 36 to 3173 m2. Both statistical methods require the construction of a spatial database, built in a Geographical Information System (GIS), in which each landslide is described by several parameters assigned at the central point of its main scarp. Based on a bibliographic review, a total of 15 predisposing factors were utilized. The width of the intervals into which the maps of the predisposing factors were reclassified was defined using constant intervals for elevation (100 m), slope (5°), solar radiation (0.1 MJ/cm2/year), profile curvature (1.2 1/m), tangential curvature (2.2 1/m), drainage density (0.5), and lineament density (0.00126). For the other parameters, the results of probability-probability plot analysis and the statistical indices of the landslide sites were used: slope length (0 ÷ 2, 2 ÷ 5, 5 ÷ 10, 10 ÷ 20, 20 ÷ 35, 35 ÷ 260), accumulation flow (0 ÷ 1, 1 ÷ 2, 2 ÷ 5, 5 ÷ 12, 12 ÷ 60, 60 ÷ 27265), Topographic Wetness Index (0 ÷ 0.74, 0.74 ÷ 1.94, 1.94 ÷ 2.62, 2.62 ÷ 3.48, 3.48 ÷ 6.00, 6.00 ÷ 9.44), and Stream Power Index (0 ÷ 0.64, 0.64 ÷ 1.28, 1.28 ÷ 1.81, 1.81 ÷ 4.20, 4.20 ÷ 9.40). A geological map and a land use map were also used, with geological and land use properties treated as categorical variables. Applying the univariate probabilistic method, the Landslide Susceptibility Index (LSI) is defined as the sum of the ratios Ra/Rb calculated for each predisposing factor, where Ra is the ratio between the number of pixels in a class and the total number of pixels in the study area, and Rb is the ratio of the number of landslides to the number of pixels in the interval area. From the analysis of the Ra/Rb ratio, the relationships between landslide occurrence and the predisposing factors were defined. The LSI equation was then used in the GIS to produce the landslide susceptibility maps. The multivariate method for landslide susceptibility analysis, based on logistic regression, was performed starting from density maps of the predisposing factors, calculated with the intervals defined above using the equation Rb/Rbtot, where Rbtot is the sum of all Rb values. Using stepwise forward algorithms, the logistic regression was performed in two successive steps: first, a univariate logistic regression was used to choose the most significant predisposing factors; then, the multivariate logistic regression was performed. The univariate regression highlighted the importance of the following factors: elevation, accumulation flow, drainage density, lineament density, geology and land use. When the multivariate regression was applied, the number of controlling factors was reduced, with the geological properties dropped.
The resulting final susceptibility equation is P = 1 / (1 + exp(-(6.46 - 22.34*elevation - 5.33*accumulation flow - 7.99*drainage density - 4.47*lineament density - 17.31*land use))), and using this equation the susceptibility maps were obtained. To allow an easy comparison of the two methodologies, the susceptibility maps were reclassified into five susceptibility intervals (very high, high, moderate, low and very low) using natural breaks. The maps were then validated using two cumulative distribution curves, one related to the landslides (number of landslides in each susceptibility class) and one to the basin (number of pixels covering each class). Comparing the curves for each method shows that the two approaches (univariate and multivariate) are appropriate, providing acceptable results. In both maps the high susceptibility conditions are mainly localized on the left slope of the catchment, in agreement with the field evidence. The comparison between the methods was obtained by subtracting the two maps. This operation shows that about 40% of the basin is assigned to the same susceptibility class by both methods. In general, the univariate probabilistic method tends to overestimate the areal extent of the high susceptibility class with respect to the maps obtained by the logistic regression method.
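For illustration, the quoted logistic equation can be evaluated over gridded factor maps to produce a susceptibility surface; the coefficients below are taken from the equation above, while the factor grids and the quantile-based reclassification are placeholders for the density-transformed maps and natural-breaks classes described in the text.

```python
# Hedged sketch: evaluating the reported logistic susceptibility equation on
# gridded predisposing-factor maps. Factor grids are random placeholders.
import numpy as np

def susceptibility(elev, accflow, drainage, lineament, landuse):
    z = (6.46 - 22.34 * elev - 5.33 * accflow - 7.99 * drainage
         - 4.47 * lineament - 17.31 * landuse)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(6)
shape = (100, 100)                                     # toy raster grid
grids = [rng.uniform(0, 0.2, size=shape) for _ in range(5)]
p_map = susceptibility(*grids)

# Reclassify into five classes (very low ... very high) with equal quantiles,
# a simple stand-in for the natural-breaks classification used in the text.
classes = np.digitize(p_map, np.quantile(p_map, [0.2, 0.4, 0.6, 0.8]))
print(p_map.mean(), np.bincount(classes.ravel()))
```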
NASA Astrophysics Data System (ADS)
Yu, H.; Gu, H.
2017-12-01
A novel multivariate seismic formation pressure prediction methodology is presented, which incorporates high-resolution seismic velocity data from prestack AVO inversion and petrophysical data (porosity and shale volume) derived from poststack seismic motion inversion. In contrast to traditional seismic formation pressure prediction methods, the proposed methodology is based on a multivariate pressure prediction model and utilizes a trace-by-trace multivariate regression analysis on seismic-derived petrophysical properties to calibrate model parameters, in order to make accurate predictions with higher resolution in both the vertical and lateral directions. With the prestack time migration velocity as the initial velocity model, an AVO inversion was first applied to the prestack dataset to obtain high-resolution, higher-frequency seismic velocity to be used as the velocity input for seismic pressure prediction, together with the density dataset used to calculate an accurate overburden pressure (OBP). Seismic Motion Inversion (SMI) is an inversion technique based on Markov Chain Monte Carlo simulation. Both structural variability and the similarity of seismic waveforms are used to incorporate well log data and characterize the variability of the property to be obtained. In this research, porosity and shale volume are first interpreted on well logs and then combined with poststack seismic data using SMI to build porosity and shale volume datasets for seismic pressure prediction. A multivariate effective stress model is used to convert the velocity, porosity and shale volume datasets to effective stress. After a thorough study of the regional stratigraphic and sedimentary characteristics, a regional normally compacted interval model is built, and the coefficients in the multivariate prediction model are then determined in a trace-by-trace multivariate regression analysis on the petrophysical data. The coefficients are used to convert the velocity, porosity and shale volume datasets to effective stress and then to calculate formation pressure with the OBP. Application of the proposed methodology to a research area in the East China Sea has shown that the method can bridge the gap between seismic and well-log pressure prediction and gives predicted pressure values close to pressure measurements from well testing.
Baker, R.; Koopman, R. J.; Saxena, S.; Diaz, V. A.; Everett, C. J.; Majeed, A.
2006-01-01
Aims/hypothesis: The aim of this study was to make projections of the future diabetes burden for the adult US population based in part on the prevalence of individuals at high risk of developing diabetes. Materials and methods: Models were created from data in the nationally representative National Health and Nutrition Examination Survey (NHANES) II mortality survey (1976–1992), the NHANES III (1988–1994) and the NHANES 1999–2002. Population models for adults (>20 years of age) from NHANES III data were fitted to known diabetes prevalence in the NHANES 1999–2002 before making future projections. We used a multivariable diabetes risk score to estimate the likelihood of diabetes incidence in 10 years. Estimates of future diabetes (diagnosed and undiagnosed) prevalence in 2011, 2021, and 2031 were made under several assumptions. Results: Based on the multivariable diabetes risk score, the number of adults at high risk of diabetes was 38.4 million in 1991 and 49.9 million in 2001. The total diabetes burden is anticipated to be 11.5% (25.4 million) in 2011, 13.5% (32.6 million) in 2021, and 14.5% (37.7 million) in 2031. Among individuals aged 30 to 39 years old who are not currently targeted for screening according to age, the prevalence of diabetes is expected to rise from 3.7% in 2001 to 5.2% in 2031. By 2031, 20.2% of adult Hispanic individuals are expected to have diabetes. Conclusions/interpretation: The prevalence of diabetes is projected to rise to substantially greater levels than previously estimated. Diabetes prevalence within the Hispanic community is projected to be potentially overwhelming. Electronic supplementary material: Supplementary material is available in the online version of this article at http://dx.doi.org/10.1007/s00125-006-0528-5 and is accessible to authorized users. PMID:17119914
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Hoan, E-mail: hoan.ho@wdc.com; Department of Materials Science and Engineering, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213; Zhu, Jingxi, E-mail: jingxiz@andrew.cmu.edu
2014-11-21
We present a study on atomic ordering within individual grains in granular L1₀-FePt thin films using transmission electron microscopy techniques. The film, used as a medium for heat assisted magnetic recording, consists of a single layer of FePt grains separated by non-magnetic grain boundaries and is grown on an MgO underlayer. Using convergent-beam techniques, diffraction patterns of individual grains are obtained for a large number of crystallites. The study found that although the majority of grains are ordered in the perpendicular direction, more than 15% of them are multi-variant, or of in-plane c-axis orientation, or disordered fcc. It was also found that these multi-variant and in-plane grains have always grown across MgO grain boundaries separating two or more MgO grains of the underlayer. The in-plane ordered portion within a multi-variant L1₀-FePt grain always lacks atomic coherence with the MgO directly underneath it, whereas the perpendicularly ordered portion is always coherent with the underlying MgO grain. Since the existence of multi-variant and in-plane ordered grains is severely detrimental to high density data storage capability, the understanding of their formation mechanism obtained here should make a significant impact on the future development of hard disk drive technology.
A generalized system of models forecasting Central States tree growth.
Stephen R. Shifley
1987-01-01
Describes the development and testing of a system of individual tree-based growth projection models applicable to species in Indiana, Missouri, and Ohio. Annual tree basal area growth is estimated as a function of tree size, crown ratio, stand density, and site index. Models are compatible with the STEMS and TWIGS Projection System.
NASA Astrophysics Data System (ADS)
Ghale, Purnima; Johnson, Harley T.
2018-06-01
We present an efficient sparse matrix-vector (SpMV) based method to compute the density matrix P from a given Hamiltonian in electronic structure computations. Our method is a hybrid approach based on Chebyshev-Jackson approximation theory and matrix purification methods like the second order spectral projection purification (SP2). Recent methods to compute the density matrix scale as O(N) in the number of floating point operations but are accompanied by large memory and communication overhead, and they are based on iterative use of the sparse matrix-matrix multiplication kernel (SpGEMM), which is known to be computationally irregular. In addition to irregularity in the sparse Hamiltonian H, the nonzero structure of intermediate estimates of P depends on products of H and evolves over the course of computation. On the other hand, an expansion of the density matrix P in terms of Chebyshev polynomials is straightforward and SpMV based; however, the resulting density matrix may not satisfy the required constraints exactly. In this paper, we analyze the strengths and weaknesses of the Chebyshev-Jackson polynomials and the second order spectral projection purification (SP2) method, and propose to combine them so that the accurate density matrix can be computed using the SpMV computational kernel only, and without having to store the density matrix P. Our method accomplishes these objectives by using the Chebyshev polynomial estimate as the initial guess for SP2, which is followed by using sparse matrix-vector multiplications (SpMVs) to replicate the behavior of the SP2 algorithm for purification. We demonstrate the method on a tight-binding model system of an oxide material containing more than 3 million atoms. In addition, we also present the predicted behavior of our method when applied to near-metallic Hamiltonians with a wide energy spectrum.
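To make the Chebyshev stage concrete, the following sketch applies a Chebyshev approximation of the zero-temperature Fermi projector θ(μ − H) to a vector using only sparse matrix-vector products; Jackson damping and the subsequent SP2 refinement are omitted, and the toy Hamiltonian, chemical potential, and spectral bounds are assumptions rather than the authors' test system.

```python
# Hedged sketch: Chebyshev expansion of the step function theta(mu - H)
# applied to a vector with SpMV only. The SP2 purification stage and Jackson
# damping described above are not included.
import numpy as np
import scipy.sparse as sp

def chebyshev_step_coeffs(mu_scaled, order):
    # Chebyshev interpolation coefficients of f(x) = 1 if x < mu_scaled else 0.
    theta = np.pi * (np.arange(order) + 0.5) / order
    fvals = (np.cos(theta) < mu_scaled).astype(float)
    k = np.arange(order)[:, None]
    c = 2.0 / order * (fvals * np.cos(k * theta)).sum(axis=1)
    c[0] *= 0.5
    return c

def apply_density_matrix(H, v, mu, emin, emax, order=200):
    # Rescale the spectrum of H to [-1, 1], then run the Chebyshev recurrence.
    a, b = (emax - emin) / 2.0, (emax + emin) / 2.0
    Hs = (H - b * sp.identity(H.shape[0])) / a
    c = chebyshev_step_coeffs((mu - b) / a, order)
    t_prev, t_cur = v, Hs @ v
    y = c[0] * t_prev + c[1] * t_cur
    for k in range(2, order):
        t_prev, t_cur = t_cur, 2.0 * (Hs @ t_cur) - t_prev
        y += c[k] * t_cur
    return y                                           # approximates P @ v

# Toy tridiagonal "Hamiltonian" with spectrum inside [-2, 2].
n = 500
H = sp.diags([np.full(n - 1, -1.0), np.zeros(n), np.full(n - 1, -1.0)],
             [-1, 0, 1], format="csr")
v = np.random.default_rng(7).normal(size=n)
print(apply_density_matrix(H, v, mu=0.0, emin=-2.0, emax=2.0)[:5])
```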
Considerations in cross-validation type density smoothing with a look at some data
NASA Technical Reports Server (NTRS)
Schuster, E. F.
1982-01-01
Experience gained in applying nonparametric maximum likelihood techniques of density estimation to judge the comparative quality of various estimators is reported. Two univariate data sets of one hundred samples (one Cauchy, one natural normal) are considered, as well as studies in the multivariate case.
Belote, R Travis; Carroll, Carlos; Martinuzzi, Sebastián; Michalak, Julia; Williams, John W; Williamson, Matthew A; Aplet, Gregory H
2018-06-21
Addressing uncertainties in climate vulnerability remains a challenge for conservation planning. We evaluate how confidence in conservation recommendations may change with agreement among alternative climate projections and metrics of climate exposure. We assessed agreement among three multivariate estimates of climate exposure (forward velocity, backward velocity, and climate dissimilarity) using 18 alternative climate projections for the contiguous United States. For each metric, we classified each projection's map into quartiles and calculated, for each gridded location, the frequency with which each quartile was assigned (high quartile frequency = more agreement among climate projections). We evaluated recommendations using a recent climate adaptation heuristic framework that recommends emphasizing various conservation strategies for land based on current conservation value and expected climate exposure. We found that areas where conservation strategies would be confidently assigned based on high agreement among climate projections varied substantially across regions. In general, there was more agreement among alternative projections in forward and backward velocity estimates than in estimates of local dissimilarity. Consensus among climate projections resulted in the same conservation recommendation assignments in a few areas, but patterns varied by climate exposure metric. This work demonstrates an approach for explicitly evaluating alternative predictions of geographic patterns of climate change.
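A minimal sketch of the quartile-agreement calculation described above (per grid cell, the frequency of the most common quartile assignment across projections) might look as follows; array shapes and the random exposure fields are placeholders for the study's 18 projections.

```python
# Hedged sketch: per-cell agreement of quartile assignments across alternative
# climate projections. Exposure maps are random placeholders.
import numpy as np

rng = np.random.default_rng(8)
n_proj, ny, nx = 18, 50, 80
exposure = rng.normal(size=(n_proj, ny, nx))          # one exposure map per projection

# Classify each projection's map into quartiles (0-3).
quartiles = np.stack([
    np.digitize(m, np.quantile(m, [0.25, 0.5, 0.75])) for m in exposure
])

# Agreement: frequency of the most common quartile assignment per cell.
counts = np.stack([(quartiles == q).sum(axis=0) for q in range(4)])
agreement = counts.max(axis=0) / n_proj
print("mean agreement:", round(float(agreement.mean()), 3))
```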
Peikert, Tobias; Duan, Fenghai; Rajagopalan, Srinivasan; Karwoski, Ronald A; Clay, Ryan; Robb, Richard A; Qin, Ziling; Sicks, JoRean; Bartholmai, Brian J; Maldonado, Fabien
2018-01-01
Optimization of the clinical management of screen-detected lung nodules is needed to avoid unnecessary diagnostic interventions. Herein we demonstrate the potential value of a novel radiomics-based approach for the classification of screen-detected indeterminate nodules. Independent quantitative variables assessing various radiologic nodule features such as sphericity, flatness, elongation, spiculation, lobulation and curvature were developed from the NLST dataset using 726 indeterminate nodules (all ≥7 mm; benign, n = 318; malignant, n = 408). Multivariate analysis was performed using the least absolute shrinkage and selection operator (LASSO) method for variable selection and regularization in order to enhance the prediction accuracy and interpretability of the multivariate model. The bootstrapping method was then applied for internal validation, and the optimism-corrected AUC was reported for the final model. Eight of the originally considered 57 quantitative radiologic features were selected by LASSO multivariate modeling. These 8 features include variables capturing Location: vertical location (Offset carina centroid z); Size: volume estimate (Minimum enclosing brick); Shape: flatness; Density: texture analysis (Score Indicative of Lesion/Lung Aggression/Abnormality (SILA) texture); and surface characteristics: surface complexity (Maximum shape index and Average shape index) and estimates of surface curvature (Average positive mean curvature and Minimum mean curvature), all with P < 0.01. The optimism-corrected AUC for these 8 features is 0.939. Our novel radiomic LDCT-based approach for indeterminate screen-detected nodule characterization appears extremely promising; however, independent external validation is needed.
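A hedged sketch of the analysis strategy, LASSO-penalized logistic regression with a bootstrap optimism correction of the apparent AUC, is shown below; the synthetic features, penalty strength, and number of bootstrap replicates are assumptions, not the study's settings.

```python
# Hedged sketch: L1-penalized logistic regression with bootstrap optimism
# correction of the apparent AUC, on synthetic nodule features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n, p = 726, 57
X = rng.normal(size=(n, p))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=1.5, size=n) > 0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
apparent = roc_auc_score(y, model.fit(X, y).decision_function(X))

optimism = []
for _ in range(100):                                   # bootstrap replicates
    idx = rng.integers(0, n, size=n)
    m = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.decision_function(X[idx]))  # on the bootstrap sample
    auc_orig = roc_auc_score(y, m.decision_function(X))            # on the original sample
    optimism.append(auc_boot - auc_orig)

print("optimism-corrected AUC:", round(apparent - float(np.mean(optimism)), 3))
```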
Models for Experimental High Density Housing
NASA Astrophysics Data System (ADS)
Bradecki, Tomasz; Swoboda, Julia; Nowak, Katarzyna; Dziechciarz, Klaudia
2017-10-01
The article presents the effects of research on models of high density housing. The authors present urban projects for experimental high density housing estates. The design was based on research performed on 38 examples of similar housing in Poland built after 2003. Some of the case studies show extreme density, and that inspired the researchers to test individual virtual solutions that would answer the question: how far can we push the limits? The experimental housing projects show the strengths and weaknesses of design driven only by such indexes as FAR (floor area ratio, a measure of housing density) and DPH (dwellings per hectare). Although such projects are implemented, the authors believe that there are reasons for limits, since high index values may conflict with an optimal housing environment. The virtual models on virtual plots presented by the authors were oriented toward maximising the DPH index and the DAI (dwellings area index), which is very often the main driver for developers. The authors also raise the question of the sustainability of such solutions. The research was carried out in the URBAN model research group (Gliwice, Poland), which consists of academic researchers and architecture students. The models reflect the architectural and urban regulations that are valid in Poland. Conclusions might be helpful for urban planners, urban designers, developers, architects and architecture students.
Molecular dynamics study on splitting of hydrogen-implanted silicon in Smart-Cut® technology
NASA Astrophysics Data System (ADS)
Bing, Wang; Bin, Gu; Rongying, Pan; Sijia, Zhang; Jianhua, Shen
2015-03-01
Defect evolution in single-crystal silicon implanted with hydrogen atoms and then annealed is investigated in the present paper by means of molecular dynamics simulation. By introducing defect density based on statistical average, this work aims to quantitatively examine defect nucleation and growth at nanoscale during annealing in Smart-Cut® technology. Research focus is put on the effects of the implantation energy, hydrogen implantation dose and annealing temperature on defect density in the statistical region. It is found that most defects nucleate and grow at the annealing stage, and that defect density increases with the increase of the annealing temperature and the decrease of the hydrogen implantation dose. In addition, the enhancement and the impediment effects of stress field on defect density in the annealing process are discussed. Project supported by the National Natural Science Foundation of China (No. 11372261), the Excellent Young Scientists Supporting Project of Science and Technology Department of Sichuan Province (No. 2013JQ0030), the Supporting Project of Department of Education of Sichuan Province (No. 2014zd3132), the Opening Project of Key Laboratory of Testing Technology for Manufacturing Process, Southwest University of Science and Technology-Ministry of Education (No. 12zxzk02), the Fund of Doctoral Research of Southwest University of Science and Technology (No. 12zx7106), and the Postgraduate Innovation Fund Project of Southwest University of Science and Technology (No. 14ycxjj0121).
NASA Astrophysics Data System (ADS)
Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.
2014-12-01
Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
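A minimal sketch of the kind of multivariate analysis described above, assuming the current-imaging tunneling spectra are arranged as an (ny, nx, n_energy) data cube; the use of PCA followed by k-means, the number of components and the two-cluster choice (Te-like vs Se-like regions) are illustrative assumptions rather than the authors' exact workflow.

```python
# Hedged sketch: PCA dimensionality reduction followed by k-means clustering of per-pixel spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_spectra(cube, n_components=4, n_clusters=2):
    ny, nx, ne = cube.shape
    spectra = cube.reshape(ny * nx, ne)                 # one dI/dV curve per pixel
    scores = PCA(n_components=n_components).fit_transform(spectra)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(scores)
    return labels.reshape(ny, nx)                       # spatial map of dissimilar electronic regions
```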
Non-Gaussian and Multivariate Noise Models for Signal Detection.
1982-09-01
… follow, some of the basic results of asymptotic theory are presented, both to make the notation clear and to give some background for the … densities are considered within a detection framework. The discussions include specific examples and also some general methods of density generation … densities generated by a memoryless, nonlinear transformation of a correlated, Gaussian source is discussed in some detail. A member of this class has the …
Extracting galactic structure parameters from multivariate density estimation
NASA Technical Reports Server (NTRS)
Chen, B.; Creze, M.; Robin, A.; Bienayme, O.
1992-01-01
Multivariate statistical analysis, including cluster analysis (unsupervised classification), discriminant analysis (supervised classification) and principal component analysis (a dimensionality reduction method), together with nonparametric density estimation, has been successfully used to search for meaningful associations in the 5-dimensional space of observables between observed points and sets of simulated points generated from a synthetic approach to galaxy modelling. These methodologies can be applied as new tools to obtain information about hidden structure that is otherwise unrecognizable, and to place important constraints on the space distribution of various stellar populations in the Milky Way. In this paper, we concentrate on illustrating how nonparametric density estimation can substitute for the true densities of both the simulated sample and the real sample in the five-dimensional space. In order to fit the model-predicted densities to reality, we derive a set of n equations (where n is the total number of observed points) in m unknown parameters (where m is the number of predefined groups). A least-squares estimation then allows us to determine the density law of the different groups and components in the Galaxy. The output from our software, which can be used in many research fields, also gives the systematic error between the model and the observation by a Bayes rule.
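A minimal sketch of the density-substitution and least-squares step described above, assuming the observed points and the simulated points for each model group are available as arrays; the Gaussian kernel density estimator, the non-negativity constraint on the group weights and all names are illustrative assumptions.

```python
# Hedged sketch: kernel density estimates stand in for the true densities, and non-negative
# least squares solves for the m group weights so that the weighted sum of simulated-group
# densities matches the observed density at the n observed points.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import nnls

def fit_group_weights(observed, simulated_groups):
    # observed: (n, 5) array of observed points; simulated_groups: list of (n_i, 5) arrays
    obs_density = gaussian_kde(observed.T)(observed.T)                                # n observed densities
    A = np.column_stack([gaussian_kde(g.T)(observed.T) for g in simulated_groups])    # n x m design matrix
    weights, _ = nnls(A, obs_density)                                                 # least-squares group weights
    return weights
```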
Cawthon, Peggy Mannen; Fox, Kathleen M; Gandra, Shravanthi R; Delmonico, Matthew J; Chiou, Chiun-Fang; Anthony, Mary S; Sewall, Ase; Goodpaster, Bret; Satterfield, Suzanne; Cummings, Steven R; Harris, Tamara B
2009-08-01
To examine the association between strength, function, lean mass, muscle density, and risk of hospitalization. Prospective cohort study. Two U.S. clinical centers. Adults aged 70 to 80 (N=3,011) from the Health, Aging and Body Composition Study. Measurements were of grip strength, knee extension strength, lean mass, walking speed, and chair stand pace. Thigh computed tomography scans assessed muscle area and density (a proxy for muscle fat infiltration). Hospitalizations were confirmed by local review of medical records. Negative binomial regression models estimated incident rate ratios (IRRs) of hospitalization for race- and sex-specific quartiles of each muscle and function parameter separately. Multivariate models adjusted for age, body mass index, health status, and coexisting medical conditions. During an average 4.7 years of follow-up, 1,678 (55.7%) participants experienced one or more hospitalizations. Participants in the lowest quartile of muscle density were more likely to be subsequently hospitalized (multivariate IRR=1.47, 95% confidence interval (CI)=1.24-1.73) than those in the highest quartile. Similarly, participants with the weakest grip strength were at greater risk of hospitalization (multivariate IRR=1.52, 95% CI=1.30-1.78, Q1 vs. Q4). Comparable results were seen for knee strength, walking pace, and chair stands pace. Lean mass and muscle area were not associated with risk of hospitalization. Weak strength, poor function, and low muscle density, but not muscle size or lean mass, were associated with greater risk of hospitalization. Interventions to reduce the disease burden associated with sarcopenia should focus on increasing muscle strength and improving physical function rather than simply increasing lean mass.
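A minimal sketch of the modelling approach named above, a negative binomial regression of hospitalization counts on quartiles of muscle density with covariate adjustment; the column names, the covariate set and the use of follow-up time as an offset are assumptions for illustration, not the study's actual specification.

```python
# Hedged sketch: negative binomial regression producing incident rate ratios (IRRs).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_irr(df: pd.DataFrame):
    # Assumed columns: n_hosp (hospitalization count), density_q (quartile), age, bmi, followup_yrs
    model = smf.glm(
        "n_hosp ~ C(density_q) + age + bmi",
        data=df,
        family=sm.families.NegativeBinomial(),
        offset=np.log(df["followup_yrs"]),            # exposure time as offset
    ).fit()
    return np.exp(model.params), np.exp(model.conf_int())  # IRRs and 95% confidence intervals
```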
Multivariate Quantitative Chemical Analysis
NASA Technical Reports Server (NTRS)
Kinchen, David G.; Capezza, Mary
1995-01-01
Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
A "Model" Multivariable Calculus Course.
ERIC Educational Resources Information Center
Beckmann, Charlene E.; Schlicker, Steven J.
1999-01-01
Describes a rich, investigative approach to multivariable calculus. Introduces a project in which students construct physical models of surfaces that represent real-life applications of their choice. The models, along with student-selected datasets, serve as vehicles to study most of the concepts of the course from both continuous and discrete…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betin, A Yu; Bobrinev, V I; Verenikina, N M
A multiplex method of recording computer-synthesised one-dimensional Fourier holograms intended for holographic memory devices is proposed. The method potentially allows increasing the recording density in the previously proposed holographic memory system based on the computer synthesis and projection recording of data page holograms. (holographic memory)
Multivariate methods to visualise colour-space and colour discrimination data.
Hastings, Gareth D; Rubin, Alan
2015-01-01
Despite most modern colour spaces treating colour as three-dimensional (3-D), colour data is usually not visualised in 3-D (and two-dimensional (2-D) projection-plane segments and multiple 2-D perspective views are used instead). The objectives of this article are firstly, to introduce a truly 3-D percept of colour space using stereo-pairs, secondly to view colour discrimination data using that platform, and thirdly to apply formal statistics and multivariate methods to analyse the data in 3-D. This is the first demonstration of the software that generated stereo-pairs of RGB colour space, as well as of a new computerised procedure that investigated colour discrimination by measuring colour just noticeable differences (JND). An initial pilot study and thorough investigation of instrument repeatability were performed. Thereafter, to demonstrate the capabilities of the software, five colour-normal and one colour-deficient subject were examined using the JND procedure and multivariate methods of data analysis. Scatter plots of responses were meaningfully examined in 3-D and were useful in evaluating multivariate normality as well as identifying outliers. The extent and direction of the difference between each JND response and the stimulus colour point was calculated and appreciated in 3-D. Ellipsoidal surfaces of constant probability density (distribution ellipsoids) were fitted to response data; the volumes of these ellipsoids appeared useful in differentiating the colour-deficient subject from the colour-normals. Hypothesis tests of variances and covariances showed many statistically significant differences between the results of the colour-deficient subject and those of the colour-normals, while far fewer differences were found when comparing within colour-normals. The 3-D visualisation of colour data using stereo-pairs, as well as the statistics and multivariate methods of analysis employed, were found to be unique and useful tools in the representation and study of colour. Many additional studies using these methods along with the JND and other procedures have been identified and will be reported in future publications. © 2014 The Authors Ophthalmic & Physiological Optics © 2014 The College of Optometrists.
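As an illustration of the "distribution ellipsoid" idea mentioned above, the short sketch below fits an ellipsoid of constant probability density to a cloud of JND responses via their covariance matrix and returns its volume for a chosen coverage level; the input format, the 95% coverage and the Gaussian assumption are illustrative, not the authors' exact procedure.

```python
# Hedged sketch: covariance-based constant-density ellipsoid and its volume.
import numpy as np
from scipy.stats import chi2

def ellipsoid_volume(responses, coverage=0.95):
    # responses: (n, 3) array of colour-space coordinates around one stimulus point
    cov = np.cov(responses, rowvar=False)              # 3x3 covariance matrix of the responses
    k = chi2.ppf(coverage, df=3)                       # chi-square scaling for the chosen coverage
    # volume of the ellipsoid {x : x' cov^{-1} x <= k} is (4/3)*pi*sqrt(det(k*cov))
    return (4.0 / 3.0) * np.pi * np.sqrt(np.linalg.det(k * cov))
```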
Evaluation of Embedded System Component Utilized in Delivery Integrated Design Project Course
NASA Astrophysics Data System (ADS)
Junid, Syed Abdul Mutalib Al; Hussaini, Yusnira; Nazmie Osman, Fairul; Razak, Abdul Hadi Abdul; Idros, Mohd Faizul Md; Karimi Halim, Abdul
2018-03-01
This paper reports the evaluation of the embedded system components utilized in delivering the integrated electronic engineering design project course. The evaluation is based on the project reports submitted to fulfil the assessment criteria for the integrated electronic engineering design project course, named Engineering System Design. Six projects were assessed in this evaluation. The evaluation covers the type of controller, the programming language, and the number of embedded components utilized. From the evaluation, C-based programming is the solution preferred by the students, as it provides them with flexibility in programming. Moreover, the analog-to-digital converter is used intensively in projects that include sensors in their proposed designs. In conclusion, in delivering the integrated design project course, knowledge of embedded system solutions is very important, given the density of knowledge that must be acquired to accomplish the assigned project.
Newell, John D; Fuld, Matthew K; Allmendinger, Thomas; Sieren, Jered P; Chan, Kung-Sik; Guo, Junfeng; Hoffman, Eric A
2015-01-01
The purpose of this study was to evaluate the impact of ultralow radiation dose single-energy computed tomographic (CT) acquisitions with Sn prefiltration and third-generation iterative reconstruction on density-based quantitative measures of growing interest in phenotyping pulmonary disease. The effects of both decreasing dose and different body habitus on the accuracy of the mean CT attenuation measurements and the level of image noise (SD) were evaluated using the COPDGene 2 test object, containing 8 different materials of interest ranging from air to acrylic and including various density foams. A third-generation dual-source multidetector CT scanner (Siemens SOMATOM FORCE; Siemens Healthcare AG, Erlangen, Germany) running advanced modeled iterative reconstruction (ADMIRE) software (Siemens Healthcare AG) was used. We used normal and very large body habitus rings at dose levels varying from 1.5 to 0.15 mGy using a spectral-shaped (0.6-mm Sn) tube output of 100 kV(p). Three CT scans were obtained at each dose level using both rings. Regions of interest for each material in the test object scans were automatically extracted. The Hounsfield unit values of each material using weighted filtered back projection (WFBP) at 1.5 mGy were used as the reference values to evaluate shifts in CT attenuation at lower dose levels using either WFBP or ADMIRE. Statistical analysis included basic statistics, Welch t tests, and a multivariable covariate model using the F test to assess the significance of the explanatory (independent) variables on the response (dependent) variable, mean CT attenuation; the multivariable model included the reconstruction method. Multivariable regression analysis of the mean CT attenuation values showed a significant difference with decreasing dose between ADMIRE and WFBP. ADMIRE showed reduced noise and more stable CT attenuation compared with WFBP. There was a strong effect on the mean CT attenuation values of the scanned materials for ring size (P < 0.0001) and dose level (P < 0.0001). The number of voxels in the region of interest for the particular material studied did not demonstrate a significant effect (P > 0.05). The SD was lower with ADMIRE compared with WFBP at all dose levels and ring sizes (P < 0.05). Third-generation dual-source CT scanners using third-generation iterative reconstruction methods can acquire accurate quantitative CT images with acceptable image noise at very low dose levels (0.15 mGy). This opens up new diagnostic and research opportunities in CT phenotyping of the lung for developing new treatments and increasing understanding of pulmonary disease.
Agha, Sohail
2011-11-30
Demand-side financing projects are now being implemented in many developing countries, yet evidence showing that they reach the poor is scanty. A maternal health voucher scheme provided voucher-paid services in Jhang, a predominantly rural district of Pakistan, during 2010. A pre-test/post-test quasi-experimental design was used to assess the changes in the proportion of facility-based deliveries and related maternal health services among the poor. Household interviews were conducted with randomly selected women in the intervention and control union councils, before and after the intervention. A strong outreach model was used. Voucher promoters were given basic training in identification of poor women using the Poverty Scorecard for Pakistan, in the types of problems women could face during delivery, and in the promotion of antenatal care (ANC), institutional delivery and postnatal care (PNC). Voucher booklets valued at Rs. 4,000 ($48), including three ANC visits, a PNC visit, an institutional delivery, and a postnatal family planning visit, were sold for Rs. 100 ($1.2) to low-income women targeted by project outreach workers. Women suffering from complications were referred to emergency obstetric care services. Analysis was conducted at the bivariate and the multivariate levels. At the multivariate level, logistic regression analysis was conducted to determine whether the increase in institutional delivery was greater among poor women (defined for this study as women in the fourth or fifth quintiles) relative to non-poor women (defined for this study as women in the first quintile) in the intervention union councils compared to the control union councils. Bivariate analysis showed significant increases in the institutional delivery rate among women in the fourth or fifth wealth quintiles in the intervention union councils but no significant changes in this indicator among women in the same wealth quintiles in the control union councils. Multivariate analysis showed that the increase in institutional delivery among poor women relative to non-poor women was significantly greater in the intervention compared to the control union councils. Demand-side financing projects using vouchers can be an effective way of reducing inequities in institutional delivery.
Ritota, Mena; Casciani, Lorena; Valentini, Massimiliano
2013-05-01
Analytical traceability of PGI and PDO foods (Protected Geographical Indication and Protected Designation of Origin, respectively) is one of the most challenging tasks of current applied research. Here we propose a metabolomic approach based on the combination of ¹H high-resolution magic angle spinning nuclear magnetic resonance (HRMAS-NMR) spectroscopy with multivariate analysis, i.e. PLS-DA, as a reliable tool for the traceability of Italian PGI chicories (Cichorium intybus L.), i.e. Radicchio Rosso di Treviso and Radicchio Variegato di Castelfranco, also known as red and red-spotted, respectively. The metabolic profile was obtained by means of HRMAS-NMR, and multivariate data analysis allowed us to build statistical models capable of providing clear discrimination between the two varieties and classification according to geographical origin. Based on Variable Importance in Projection values, molecular markers for classifying the different types of red chicory analysed were identified, accounting for both the cultivar and the place of origin. © 2012 Society of Chemical Industry.
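The sketch below illustrates the PLS-DA and Variable Importance in Projection (VIP) steps named above, assuming X is an (n_samples, n_variables) matrix of binned NMR intensities and y a 0/1 class label; the standard VIP formula is used, and all names and the number of latent variables are illustrative assumptions.

```python
# Hedged sketch: PLS-DA via PLS regression on class labels, plus VIP scores per variable.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def plsda_vip(X, y, n_components=2):
    pls = PLSRegression(n_components=n_components).fit(X, y)
    T = pls.transform(X)                                # X scores
    W, Q = pls.x_weights_, pls.y_loadings_              # weights and y-loadings
    p = X.shape[1]
    # explained sum of squares of y per component
    ssy = np.array([(T[:, a] ** 2).sum() * (Q[0, a] ** 2) for a in range(n_components)])
    wnorm = W / np.linalg.norm(W, axis=0)               # normalised weight vectors
    vip = np.sqrt(p * (wnorm ** 2 @ ssy) / ssy.sum())   # one VIP score per NMR variable
    return pls, vip
```

Variables with VIP above a chosen cut-off (often 1) would be the candidate molecular markers.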
Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland
2017-03-01
Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.
Brooks, Jeremy S; Waylen, Kerry A; Borgerhoff Mulder, Monique
2012-12-26
Community-based conservation (CBC) promotes the idea that conservation success requires engaging with, and providing benefits for, local communities. However, CBC projects are neither consistently successful nor free of controversy. Innovative recent studies evaluating the factors associated with success and failure typically examine only a single resource domain, have limited geographic scope, consider only one outcome, or ignore the nested nature of socioecological systems. To remedy these issues, we use a global comparative database of CBC projects identified by systematic review to evaluate success in four outcome domains (attitudes, behaviors, ecological, economic) and explore synergies and trade-offs among these outcomes. We test hypotheses about how features of the national context, project design, and local community characteristics affect these measures of success. Using bivariate analyses and multivariate proportional odds logistic regressions within a multilevel analysis and model-fitting framework, we show that project design, particularly capacity-building in local communities, is associated with success across all outcomes. In addition, some characteristics of the local community in which projects are conducted, such as tenure regimes and supportive cultural beliefs and institutions, are important for project success. Surprisingly, there is little evidence that national context systematically influences project outcomes. We also find evidence of synergies between pairs of outcomes, particularly between ecological and economic success. We suggest that well-designed and implemented projects can overcome many of the obstacles imposed by local and national conditions to succeed in multiple domains.
NASA Astrophysics Data System (ADS)
Guerra, Solange Maria; Silva, Thiago Christiano; Tabak, Benjamin Miranda; de Souza Penaloza, Rodrigo Andrés; de Castro Miranda, Rodrigo César
2016-01-01
In this paper we present systemic risk measures based on contingent claims approach and banking sector multivariate density. We also apply network measures to analyze bank common risk exposure. The proposed measures aim to capture credit risk stress and its potential to become systemic. These indicators capture not only individual bank vulnerability, but also the stress dependency structure between them. Furthermore, these measures can be quite useful for identifying systemically important banks. The empirical results show that these indicators capture with considerable fidelity the moments of increasing systemic risk in the Brazilian banking sector in recent years.
Boiret, Mathieu; de Juan, Anna; Gorretta, Nathalie; Ginot, Yves-Michel; Roger, Jean-Michel
2015-09-10
Raman chemical imaging provides chemical and spatial information about a pharmaceutical drug product. By applying resolution methods to the acquired spectra, the objective is to calculate pure spectra and distribution maps of the image compounds. With multivariate curve resolution-alternating least squares, constraints are used to improve the performance of the resolution and to decrease the ambiguity linked to the final solution. Non-negativity and spatial local rank constraints have been identified as the most powerful constraints to be used. In this work, an alternative method to set local rank constraints is proposed. The method is based on an orthogonal-projection pretreatment. For each drug product compound, the raw Raman spectra are orthogonally projected onto a basis including all the variability from the formulation compounds other than the compound of interest. The presence or absence of the compound of interest is determined by observing the correlations between the orthogonally projected spectra and a pure spectrum orthogonally projected onto the same basis. By selecting an appropriate threshold, maps of presence/absence can be set up for all the product compounds. This method appears to be a powerful approach to identify a low-dose compound within a pharmaceutical drug product. The maps of presence/absence of compounds can be used as local rank constraints in resolution methods, such as the multivariate curve resolution-alternating least squares process, in order to improve the resolution of the system. The proposed method is particularly suited to pharmaceutical systems, where the identity of all compounds in the formulation is known and, therefore, the space of interferences can be well defined. Copyright © 2015 Elsevier B.V. All rights reserved.
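A minimal sketch of the orthogonal-projection pretreatment described above: each pixel spectrum is projected onto the orthogonal complement of the interference subspace (spectra of the other formulation compounds) and then correlated with the equally projected pure spectrum of the compound of interest. The threshold value and all names are illustrative assumptions.

```python
# Hedged sketch: presence/absence map from orthogonal projection plus correlation thresholding.
import numpy as np

def presence_map(pixel_spectra, pure_target, interferences, threshold=0.5):
    # pixel_spectra: (n_pixels, n_wavenumbers); interferences: (n_interferents, n_wavenumbers)
    B = interferences.T                                     # columns span the interference subspace
    P = np.eye(B.shape[0]) - B @ np.linalg.pinv(B)          # projector onto its orthogonal complement
    proj_pixels = pixel_spectra @ P                         # projected pixel spectra (P is symmetric)
    proj_target = P @ pure_target                           # projected pure spectrum of the target
    corr = np.array([np.corrcoef(s, proj_target)[0, 1] for s in proj_pixels])
    return corr > threshold                                 # True where the compound is deemed present
```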
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis with intelligent models is put forward. Firstly, through the process of photosynthesis, the main factors that affect the reproduction of algae are analyzed. A compensation prediction method for multivariate time series analysis, based on a neural network and a Support Vector Machine, is put forward and combined with Kernel Principal Component Analysis to reduce the dimension of the factors influencing blooms. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method better compensates the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
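A minimal sketch of the core pipeline described above, with kernel PCA reducing the multivariate influence factors and a support vector regressor standing in for the GA-tuned networks of the paper; the feature layout, kernel choices and hyperparameters are illustrative assumptions.

```python
# Hedged sketch: Kernel PCA dimension reduction followed by support vector regression.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def fit_bloom_model(factors, algal_density):
    # factors: (n_times, n_factors) lagged water-quality variables; algal_density: (n_times,)
    model = make_pipeline(
        StandardScaler(),
        KernelPCA(n_components=3, kernel="rbf"),   # nonlinear dimension reduction of influence factors
        SVR(kernel="rbf", C=10.0),                 # one-step-ahead prediction of algal density
    )
    return model.fit(factors, algal_density)
```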
Sofaer, Helen R; Skagen, Susan K; Barsugli, Joseph J; Rashford, Benjamin S; Reese, Gordon C; Hoeting, Jennifer A; Wood, Andrew W; Noon, Barry R
2016-09-01
Climate change poses major challenges for conservation and management because it alters the area, quality, and spatial distribution of habitat for natural populations. To assess species' vulnerability to climate change and target ongoing conservation investments, researchers and managers often consider the effects of projected changes in climate and land use on future habitat availability and quality and the uncertainty associated with these projections. Here, we draw on tools from hydrology and climate science to project the impact of climate change on the density of wetlands in the Prairie Pothole Region of the USA, a critical area for breeding waterfowl and other wetland-dependent species. We evaluate the potential for a trade-off in the value of conservation investments under current and future climatic conditions and consider the joint effects of climate and land use. We use an integrated set of hydrological and climatological projections that provide physically based measures of water balance under historical and projected future climatic conditions. In addition, we use historical projections derived from ten general circulation models (GCMs) as a baseline from which to assess climate change impacts, rather than historical climate data. This method isolates the impact of greenhouse gas emissions and ensures that modeling errors are incorporated into the baseline rather than attributed to climate change. Our work shows that, on average, densities of wetlands (here defined as wetland basins holding water) are projected to decline across the U.S. Prairie Pothole Region, but that GCMs differ in both the magnitude and the direction of projected impacts. However, we found little evidence for a shift in the locations expected to provide the highest wetland densities under current vs. projected climatic conditions. This result was robust to the inclusion of projected changes in land use under climate change. We suggest that targeting conservation towards wetland complexes containing both small and relatively large wetland basins, which is an ongoing conservation strategy, may also act to hedge against uncertainty in the effects of climate change. © 2016 by the Ecological Society of America.
Meeting the challenges of developing LED-based projection displays
NASA Astrophysics Data System (ADS)
Geißler, Enrico
2006-04-01
The main challenge in developing an LED-based projection system is to meet the brightness requirements of the market. Therefore a balanced combination of optical, electrical and thermal parameters must be reached to achieve the performance and cost targets. This paper describes the system design methodology for a digital micromirror device (DMD) based optical engine using LEDs as the light source, starting from the basic physical and geometrical parameters of the DMD and other optical elements, through characterization of the LEDs, to optimizing the system performance by determining optimal driving conditions. LEDs have a luminous flux density that is just at the threshold of acceptance in projection systems, and thus only a fully optimized optical system with a matched set of LEDs can be used. This work resulted in two projection engines, one for a compact pocket projector and the other for a rear-projection television, both of which are currently in commercialization.
Emilie B. Henderson; Janet L. Ohmann; Matthew J. Gregory; Heather M. Roberts; Harold S.J. Zald
2014-01-01
Landscape management and conservation planning require maps of vegetation composition and structure over large regions. Species distribution models (SDMs) are often used for individual species, but projects mapping multiple species are rarer. We compare maps of plant community composition assembled by stacking results from many SDMs with multivariate maps constructed...
TU-F-18A-06: Dual Energy CT Using One Full Scan and a Second Scan with Very Few Projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, T; Zhu, L
Purpose: The conventional dual energy CT (DECT) requires two full CT scans at different energy levels, resulting in a dose increase as well as imaging errors from patient motion between the two scans. To shorten the scan time of DECT and thus overcome these drawbacks, we propose a new DECT algorithm using one full scan and a second scan with very few projections by preserving structural information. Methods: We first reconstruct a CT image from the full scan using a standard filtered-backprojection (FBP) algorithm. We then use a compressed sensing (CS) based iterative algorithm on the second scan for reconstruction from very few projections. The edges extracted from the first scan are used as weights in the objective function of the CS-based reconstruction to substantially improve the image quality of the CT reconstruction. The basis material images are then obtained by an iterative image-domain decomposition method, and an electron density map is finally calculated. The proposed method is evaluated on phantoms. Results: On the Catphan 600 phantom, the CT reconstruction mean errors using the proposed method on 20 and 5 projections are 4.76% and 5.02%, respectively. Compared with conventional iterative reconstruction, the proposed edge weighting preserves object structures and achieves a better spatial resolution. With basis materials of iodine and Teflon, our method on 20 projections obtains similar quality of decomposed material images compared with FBP on a full scan, and the mean error of electron density in the selected regions of interest is 0.29%. Conclusion: We propose an effective method for reducing projections and therefore scan time in DECT. We show that a full scan plus a 20-projection scan are sufficient to provide DECT images and electron density with quality similar to that of two full scans. Our future work includes more phantom studies to validate the performance of our method.
Optimizing 4DCBCT projection allocation to respiratory bins.
O'Brien, Ricky T; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J
2014-10-07
4D cone beam computed tomography (4DCBCT) is an emerging image guidance strategy used in radiotherapy where projections acquired during a scan are sorted into respiratory bins based on the respiratory phase or displacement. 4DCBCT reduces the motion blur caused by respiratory motion but increases streaking artefacts due to projection under-sampling as a result of the irregular nature of patient breathing and the binning algorithms used. For displacement binning the streak artefacts are so severe that displacement binning is rarely used clinically. The purpose of this study is to investigate if sharing projections between respiratory bins and adjusting the location of respiratory bins in an optimal manner can reduce or eliminate streak artefacts in 4DCBCT images. We introduce a mathematical optimization framework and a heuristic solution method, which we will call the optimized projection allocation algorithm, to determine where to position the respiratory bins and which projections to source from neighbouring respiratory bins. Five 4DCBCT datasets from three patients were used to reconstruct 4DCBCT images. Projections were sorted into respiratory bins using equispaced, equal density and optimized projection allocation. The standard deviation of the angular separation between projections was used to assess streaking and the consistency of the segmented volume of a fiducial gold marker was used to assess motion blur. The standard deviation of the angular separation between projections using displacement binning and optimized projection allocation was 30%-50% smaller than conventional phase based binning and 59%-76% smaller than conventional displacement binning indicating more uniformly spaced projections and fewer streaking artefacts. The standard deviation in the marker volume was 20%-90% smaller when using optimized projection allocation than using conventional phase based binning suggesting more uniform marker segmentation and less motion blur. Images reconstructed using displacement binning and the optimized projection allocation algorithm were clearer, contained visibly fewer streak artefacts and produced more consistent marker segmentation than those reconstructed with either equispaced or equal-density binning. The optimized projection allocation algorithm significantly improves image quality in 4DCBCT images and provides, for the first time, a method to consistently generate high quality displacement binned 4DCBCT images in clinical applications.
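The streaking metric used in the study above is the standard deviation of the angular separation between the projections assigned to a respiratory bin; the short sketch below computes it for one bin, assuming gantry angles in degrees over a full rotation (the full-rotation assumption and names are illustrative).

```python
# Hedged sketch: standard deviation of angular gaps for the projections in one respiratory bin.
import numpy as np

def angular_gap_std(bin_angles):
    angles = np.sort(np.asarray(bin_angles) % 360.0)
    gaps = np.diff(np.concatenate([angles, angles[:1] + 360.0]))   # include the wrap-around gap
    return gaps.std()                                              # smaller = more uniform spacing, fewer streaks
```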
Groups That Work: Student Achievement in Group Research Projects and Effects on Individual Learning
ERIC Educational Resources Information Center
Monson, Renee
2017-01-01
Group research projects frequently are used to teach undergraduate research methods. This study uses multivariate analyses to examine the characteristics of higher-achieving groups (those that earn higher grades on group research projects) and to estimate the effects of participating in higher-achieving groups on subsequent individual learning…
Gupta, Abhay; Peck, Garnet E; Miller, Ronald W; Morris, Kenneth R
2005-10-01
This study evaluates the effect of variation in the ambient moisture on the compaction behavior of microcrystalline cellulose (MCC) powder. The study was conducted by comparing the physico-mechanical properties of, and the near infrared (NIR) spectra collected on, compacts prepared by roller compaction with those collected on simulated ribbons, that is, compacts prepared under uni-axial compression. Relative density, moisture content, tensile strength (TS), and Young's modulus were used as key sample attributes for comparison. Samples prepared at constant roller compactor settings and feed mass showed constant density and a decrease in TS with increasing moisture content. Compacts prepared under uni-axial compression at constant pressure and compact mass showed the opposite effect, that is, density increased while TS remained almost constant with increasing moisture content. This suggests a difference in the influence of moisture on the material under roller compaction, in which the roll gap (i.e., thickness and therefore density) remains almost constant, vs. under uni-axial compression, in which the thickness is free to change in response to the applied pressure. Key sample attributes were also related to the NIR spectra using multivariate data analysis by partial least squares projection to latent structures (PLS). Good agreement was observed between the measured and the NIR-PLS predicted values for all key attributes for both the roller-compacted samples and the simulated ribbons. Copyright © 2005 Wiley-Liss, Inc. and the American Pharmacists Association
WSN-Based Space Charge Density Measurement System
Deng, Dawei; Yuan, Haiwen; Lv, Jianxun; Ju, Yong
2017-01-01
High voltage direct current (HVDC) transmission lines extend over large areas, which makes the use of cables for a space charge density monitoring system inconvenient. Compared with a traditional communication network, a wireless sensor network (WSN) offers advantages in small volume, high flexibility and strong self-organization, thereby presenting great potential for solving this problem. Additionally, a WSN is more suitable for the construction of a distributed space charge density monitoring system as it offers longer range and higher mobility. A distributed wireless system is designed for collecting and monitoring the space charge density under HVDC transmission lines, and it has been widely applied in both the Chinese State Grid HVDC test base and power transmission projects. Experimental results from the measuring system demonstrated its adaptability to the complex electromagnetic environment under the transmission lines and its ability to provide accurate, flexible and stable measurements of space charge density. PMID:28052105
Muon tomography of rock density using Micromegas-TPC telescope
NASA Astrophysics Data System (ADS)
Hivert, Fanny; Busto, José; Gaffet, Stéphane; Ernenwein, Jean-Pierre; Brunner, Jurgen; Salin, Pierre; Decitre, Jean-Baptiste; Lázaro Roche, Ignacio; Martin, Xavier
2014-05-01
Knowledge of subsurface properties is essentially obtained by geophysical methods, e.g., seismic imaging, electric prospection or gravimetry. The current work is based on a recently developed method to investigate in situ the density of rocks using a measurement of the muon flux, whose attenuation depends on the quantity of matter the particles travel through and hence on the rock density and thickness. The present project (T2DM2) aims at performing underground muon flux measurements in order to characterize spatial and temporal rock massif density variations above the LSBB underground research facility in Rustrel (France). The muon flux will be measured with a new muon telescope device using Micromegas-Time Projection Chamber (TPC) detectors. The first step of the work presented covers the muon flux simulation based on the Gaisser model (Gaisser T., 1990) for the muon flux at ground level, and on the MUSIC code (Kudryavtsev V. A., 2008) for the propagation of muons through the rock. The results show that the muon flux distortion caused by density variations is significant enough to be observed at 500 m depth for measurement times of about one month. This time scale is compatible with the duration of the water transfer processes within the unsaturated karst zone where LSBB is located. The work now focuses on the optimization of the detector layout along the LSBB galleries in order to achieve the best sensitivity.
Big data driven cycle time parallel prediction for production planning in wafer manufacturing
NASA Astrophysics Data System (ADS)
Wang, Junliang; Yang, Jungang; Zhang, Jie; Wang, Xiaoxi; Zhang, Wenjun Chris
2018-07-01
Cycle time forecasting (CTF) is one of the most crucial issues for production planning to keep high delivery reliability in semiconductor wafer fabrication systems (SWFS). This paper proposes a novel data-intensive cycle time (CT) prediction system with parallel computing to rapidly forecast the CT of wafer lots with large datasets. First, a density peak based radial basis function network (DP-RBFN) is designed to forecast the CT with the diverse and agglomerative CT data. Second, the network learning method based on a clustering technique is proposed to determine the density peak. Third, a parallel computing approach for network training is proposed in order to speed up the training process with large scaled CT data. Finally, an experiment with respect to SWFS is presented, which demonstrates that the proposed CTF system can not only speed up the training process of the model but also outperform the radial basis function network, the back-propagation-network and multivariate regression methodology based CTF methods in terms of the mean absolute deviation and standard deviation.
Text extraction method for historical Tibetan document images based on block projections
NASA Astrophysics Data System (ADS)
Duan, Li-juan; Zhang, Xi-qun; Ma, Long-long; Wu, Jian
2017-11-01
Text extraction is an important initial step in digitizing historical documents. In this paper, we present a text extraction method for historical Tibetan document images based on block projections. The task of text extraction is treated as a text-area detection and location problem. The images are divided equally into blocks, and the blocks are filtered using information about the categories of connected components and the corner point density. By analyzing the projections of the filtered blocks, the approximate text areas can be located and the text regions extracted. Experiments on a dataset of historical Tibetan documents demonstrate the effectiveness of the proposed method.
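A minimal sketch of the block-projection idea: a binarised page is divided into equal blocks, sparse blocks are discarded, and projection profiles of the remaining blocks localise the text area. The block size, the ink-density threshold and all names are illustrative assumptions, not the paper's actual filtering rules (which also use connected-component categories and corner-point density).

```python
# Hedged sketch: locate the text bounding box from projections of retained blocks.
import numpy as np

def text_bounding_box(binary, block=64, min_ink=0.02):
    # binary: 2-D array with 1 for foreground (ink) pixels, 0 for background
    h, w = binary.shape
    keep = np.zeros_like(binary, dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = binary[y:y + block, x:x + block]
            if tile.mean() > min_ink:                      # keep blocks with enough foreground pixels
                keep[y:y + block, x:x + block] = True
    rows = np.flatnonzero(keep.any(axis=1))                # vertical projection of kept blocks
    cols = np.flatnonzero(keep.any(axis=0))                # horizontal projection of kept blocks
    return rows[0], rows[-1], cols[0], cols[-1]            # top, bottom, left, right of the text area
```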
Population density and mortality among individuals in motor vehicle crashes.
Gedeborg, Rolf; Thiblin, Ingemar; Byberg, Liisa; Melhus, Håkan; Lindbäck, Johan; Michaelsson, Karl
2010-10-01
To assess whether higher mortality rates among individuals in motor vehicle crashes in areas with low population density depend on injury type and severity or are related to the performance of emergency medical services (EMS). Prehospital and hospital deaths were studied in a population-based cohort of 41,243 motor vehicle crashes that occurred in Sweden between 1998 and 2004. The final multivariable analysis was restricted to 6884 individuals in motor vehicle crashes, to minimise the effects of confounding factors. Crude mortality rates following motor vehicle crashes were inversely related to regional population density. In regions with low population density, the unadjusted rate ratio for prehospital death was 2.2 (95% CI 1.9 to 2.5) and for hospital death 1.5 (95% CI 1.1 to 1.9), compared with a high-density population. However, after controlling for regional differences in age, gender and the type/severity of injuries among 6884 individuals in motor vehicle crashes, low population density was no longer associated with increased mortality. At 25 years of age, predicted prehospital mortality was 9% lower (95% CI 5% to 12%) in regions with low population density compared with high population density. This difference decreased with increasing age, but was still 3% lower (95% CI 0.5% to 5%) at 65 years of age. The inverse relationship between population density and mortality among individuals in motor vehicle crashes is related to pre-crash factors that influence the type and severity of injuries and not to differences in EMS.
Ihmsen, Markus; Cornelis, Jens; Solenthaler, Barbara; Horvath, Christopher; Teschner, Matthias
2013-07-25
We propose a novel formulation of the projection method for Smoothed Particle Hydrodynamics (SPH). We combine a symmetric SPH pressure force and an SPH discretization of the continuity equation to obtain a discretized form of the pressure Poisson equation (PPE). In contrast to previous projection schemes, our system does consider the actual computation of the pressure force. This incorporation improves the convergence rate of the solver. Furthermore, we propose to compute the density deviation based on velocities instead of positions as this formulation improves the robustness of the time-integration scheme. We show that our novel formulation outperforms previous projection schemes and state-of-the-art SPH methods. Large time steps and small density deviations of down to 0.01% can be handled in typical scenarios. The practical relevance of the approach is illustrated by scenarios with up to 40 million SPH particles.
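A minimal sketch of the velocity-based density deviation mentioned above, in which the SPH continuity equation predicts the density change from particle velocities and the deviation from the rest density serves as the source term of the pressure Poisson equation; neighbour search and the kernel gradient are assumed to be provided elsewhere, and all names are illustrative.

```python
# Hedged sketch: rho_err_i = rho_i + dt * sum_j m_j (v_i - v_j) . grad W_ij - rho0
import numpy as np

def density_deviation(rho, vel, mass, neighbours, grad_w, dt, rho0):
    # neighbours[i]: indices of particles near i; grad_w(i, j): kernel gradient as a 3-vector
    err = np.empty(len(rho))
    for i in range(len(rho)):
        drho = 0.0
        for j in neighbours[i]:
            drho += mass[j] * np.dot(vel[i] - vel[j], grad_w(i, j))   # SPH continuity equation term
        err[i] = rho[i] + dt * drho - rho0                            # deviation from the rest density
    return err
```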
Travison, T G; Chiu, G R; McKinlay, J B; Araujo, A B
2011-10-01
The relative importance of various contributors to racial/ethnic variation in BMC/BMD is not established. Using population-based data, we determined that body composition differences (specifically skeletal muscle and fat mass) are among the strongest contributors to these variations. Racial/ethnic variation in fracture risk is well documented, but the mechanisms by which such heterogeneity arises are poorly understood. We analyzed data from black, Hispanic, and white men enrolled in the Boston Area Community Health/Bone (BACH/Bone) Survey to determine the contributions of risk factors to racial/ethnic differences in bone mineral content (BMC) and density (BMD). In a population-based study, BMC, BMD, and body composition were ascertained by DXA. Socioeconomic status, health history, and dietary intake were obtained via interview. Hormones and markers of bone turnover were obtained from non-fasting blood samples. Multivariate analyses measured percentage reductions in estimated racial/ethnic differences in BMC/BMD, accompanying the successive removal of covariates from linear regression models. Black men demonstrated greater BMC than their Hispanic and white counterparts. At the femoral neck, adjustment for covariables was sufficient to reduce these differences by 46% and 35%, respectively. While absolute differences in BMC were smaller at the distal radius than femoral neck, the proportionate reductions in racial/ethnic differences after covariable adjustment were comparable or greater. Multivariate models provided evidence that lean and fat mass, serum 25(OH)D, osteocalcin, estradiol, and aspects of socioeconomic status influence the magnitude of racial/ethnic differences in BMC, with lean and fat mass providing the strongest effects. Results for BMD were similar, but typically of lesser magnitude and statistical significance. These cross-sectional analyses demonstrate that much of the racial/ethnic heterogeneity in measures of bone mass and density can be accounted for through variation in body composition, diet, and socio-demographic factors.
An assessment of long term ecosystem research activities across European socio-ecological gradients.
Metzger, M J; Bunce, R G H; van Eupen, M; Mirtl, M
2010-06-01
Integration of European long term ecosystem research (LTER) would provide important support for the management of the pan-European environment and ecosystems, as well as international policy commitments. This does require appropriate coverage of Europe and standardised frameworks and research methods between countries. Emerging interest in socio-ecological systems prompted the present assessment of the distribution of LTER activities across European socio-ecological gradients. This paper presents a European stratification with a 1 km² resolution, delineating 48 broad socio-ecological regions. The dataset is based on an existing biogeophysical stratification constructed using multivariate clustering of mainly climatic variables and a newly developed socio-economic stratification based on an economic density indicator. The coverage of European LTER facilities across the socio-ecological gradients is tested using this dataset. The analysis shows two strong biases in the present LTER effort. Firstly, urban and disturbed regions are consistently under-represented, illustrating a bias for traditional ecological research away from human activity. Secondly, the Mediterranean, for which some of the most extreme global change impacts are projected, is receiving comparatively little attention. Both findings can help guide future investment in the European LTER network - and especially in a Long Term Socio-Ecological Research (LTSER) component - to provide a more balanced coverage. This will provide better scientific understanding of pan-European environmental concerns and support the management of natural resources and international policy commitments in the European Union. © 2010 Elsevier Ltd. All rights reserved.
A Multivariate Analysis of Galaxy Cluster Properties
NASA Astrophysics Data System (ADS)
Ogle, P. M.; Djorgovski, S.
1993-05-01
We have assembled from the literature a data base on 394 clusters of galaxies, with up to 16 parameters per cluster. They include optical and x-ray luminosities, x-ray temperatures, galaxy velocity dispersions, central galaxy and particle densities, optical and x-ray core radii and ellipticities, etc. In addition, derived quantities, such as the mass-to-light ratios and x-ray gas masses are included. Doubtful measurements have been identified, and deleted from the data base. Our goal is to explore the correlations between these parameters, and interpret them in the framework of our understanding of evolution of clusters and large-scale structure, such as the Gott-Rees scaling hierarchy. Among the simple, monovariate correlations we found, the most significant include those between the optical and x-ray luminosities, x-ray temperatures, cluster velocity dispersions, and central galaxy densities, in various mutual combinations. While some of these correlations have been discussed previously in the literature, generally smaller samples of objects have been used. We will also present the results of a multivariate statistical analysis of the data, including a principal component analysis (PCA). Such an approach has not been used previously for studies of cluster properties, even though it is much more powerful and complete than the simple monovariate techniques which are commonly employed. The observed correlations may lead to powerful constraints for theoretical models of formation and evolution of galaxy clusters. P.M.O. was supported by a Caltech graduate fellowship. S.D. acknowledges a partial support from the NASA contract NAS5-31348 and the NSF PYI award AST-9157412.
Some Supporting Evidence for Accurate Multivariate Perceptions with Chernoff Faces, Project 547.
ERIC Educational Resources Information Center
Wainer, Howard
A scheme, using features in a cartoon-like human face to represent variables, is tested as to its ability to graphically depict multivariate data. A factor analysis of Harman's "24 Psychological Tests" was performed and yielded four orthogonal factors. Nose width represented the loading on Factor 1; eye size on Factor 2; curve of mouth…
All-Possible-Subsets for MANOVA and Factorial MANOVAs: Less than a Weekend Project
ERIC Educational Resources Information Center
Nimon, Kim; Zientek, Linda Reichwein; Kraha, Amanda
2016-01-01
Multivariate techniques are increasingly popular as researchers attempt to accurately model a complex world. MANOVA is a multivariate technique used to investigate the dimensions along which groups differ, and how these dimensions may be used to predict group membership. A concern in a MANOVA analysis is to determine if a smaller subset of…
Remote Multivariable Control Design Using a Competition Game
ERIC Educational Resources Information Center
Atanasijevic-Kunc, M.; Logar, V.; Karba, R.; Papic, M.; Kos, A.
2011-01-01
In this paper, some approaches to teaching multivariable control design are discussed, with special attention being devoted to a step-by-step transition to e-learning. The approach put into practice and presented here is developed through design projects, from which one is chosen as a competition game and is realized using the E-CHO system,…
Materials Data on BaSe (SG:225) by Materials Project
Kristin Persson
2014-11-02
Computed materials data using density functional theory calculations. These calculations determine the electronic structure of bulk materials by solving approximations to the Schrodinger equation. For more information, see https://materialsproject.org/docs/calculations
Materials Data on BaSe (SG:221) by Materials Project
Kristin Persson
2014-11-02
Computed materials data using density functional theory calculations. These calculations determine the electronic structure of bulk materials by solving approximations to the Schrodinger equation. For more information, see https://materialsproject.org/docs/calculations
GNSS-ISR data fusion: General framework with application to the high-latitude ionosphere
NASA Astrophysics Data System (ADS)
Semeter, Joshua; Hirsch, Michael; Lind, Frank; Coster, Anthea; Erickson, Philip; Pankratius, Victor
2016-03-01
A mathematical framework is presented for the fusion of electron density measured by incoherent scatter radar (ISR) and total electron content (TEC) measured using global navigation satellite systems (GNSS). Both measurements are treated as projections of an unknown density field (for GNSS-TEC the projection is tomographic; for ISR the projection is a weighted average over a local spatial region) and discrete inverse theory is applied to obtain a higher fidelity representation of the field than could be obtained from either modality individually. The specific implementation explored herein uses the interpolated ISR density field as initial guess to the combined inverse problem, which is subsequently solved using maximum entropy regularization. Simulations involving a dense meridional network of GNSS receivers near the Poker Flat ISR demonstrate the potential of this approach to resolve sub-beam structure in ISR measurements. Several future directions are outlined, including (1) data fusion using lower level (lag product) ISR data, (2) consideration of the different temporal sampling rates, (3) application of physics-based regularization, (4) consideration of nonoptimal observing geometries, and (5) use of an ISR simulation framework for optimal experiment design.
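As a simplified illustration of the data-fusion inverse problem described above, the sketch below stacks both measurement types as rows of one linear operator acting on the discretised density field and solves a regularised least-squares problem anchored to the interpolated ISR field; Tikhonov regularisation is used here as a stand-in for the maximum-entropy regularisation of the paper, and all names and the regularisation weight are illustrative assumptions.

```python
# Hedged sketch: Tikhonov-regularised inversion, minimising ||A n - y||^2 + lam ||n - n0||^2.
import numpy as np

def fuse(A, y, n0, lam=1e-2):
    # A: (n_meas, n_vox) projection matrix (TEC line integrals and ISR beam averages stacked)
    # y: (n_meas,) measurements; n0: (n_vox,) initial guess from the interpolated ISR density field
    n_vox = A.shape[1]
    lhs = A.T @ A + lam * np.eye(n_vox)
    rhs = A.T @ y + lam * n0                 # penalise departure from the ISR-based prior
    return np.linalg.solve(lhs, rhs)         # fused, higher-fidelity density field
```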
Climate Change Impacts to North Pacific Pelagic Habitat Are Projected to Lower Carrying Capacity
NASA Astrophysics Data System (ADS)
Woodworth-Jefcoats, P. A.; Polovina, J. J.; Drazen, J.
2016-02-01
We use output from a suite of CMIP5 earth system models to explore the impacts of climate change on marine fisheries over the 21st century. Ocean temperatures from both the historical and RCP 8.5 projections are integrated over the upper 200 m of the water column to characterize thermal habitat in the epipelagic realm. We find that across all models the projected temperature increases lead to a redistribution of thermal habitat: temperatures that currently represent the majority of North Pacific pelagic habitat are replaced by temperatures several degrees warmer. Additionally, all models project the emergence of new thermal habitat that exceeds present-day maximum temperatures. Spatially, present-day thermal habitat retreats northward and contracts eastward as warmer habitat in the southern and western North Pacific expands. In addition to these changes in thermal habitat, zooplankton densities are projected to decline across much of the North Pacific. Taken together, warming temperatures and declining zooplankton densities create the potential for mismatches in metabolic demand and supply through the 21st century. We find that carrying capacity for tropical tunas and other commercially valuable pelagic fish may be especially vulnerable to the impacts of climate change. The waters projected to see the greatest redistribution of thermal habitat and greatest declines in zooplankton densities are primarily those targeted by the Hawaii-based and international longline fleets. Fishery managers around the North Pacific will need to incorporate these impacts of climate change into future management strategies.
Anastasiadis, Anastasios; Onal, Bulent; Modi, Pranjal; Turna, Burak; Duvdevani, Mordechai; Timoney, Anthony; Wolf, J Stuart; De La Rosette, Jean
2013-12-01
This study aimed to explore the relationship between stone density and outcomes of percutaneous nephrolithotomy (PCNL) using the Clinical Research Office of the Endourological Society (CROES) PCNL Global Study database. Patients undergoing PCNL treatment were assigned to a low stone density [LSD, ≤ 1000 Hounsfield units (HU)] or high stone density (HSD, > 1000 HU) group based on the radiological density of the primary renal stone. Preoperative characteristics and outcomes were compared between the two groups. Retreatment for residual stones was more frequent in the LSD group. The overall stone-free rate achieved was higher in the HSD group (79.3% vs 74.8%, p = 0.113). By univariate regression analysis, the probability of achieving a stone-free outcome peaked at approximately 1250 HU. Densities below or above this value were associated with lower treatment success, particularly at very low HU values. With increasing radiological stone density, operating time decreased to a minimum at approximately 1000 HU, then increased with further increase in stone density. Multivariate non-linear regression analysis showed a similar relationship between the probability of a stone-free outcome and stone density. Higher treatment success rates were found with low stone burden, pelvic stone location and use of pneumatic lithotripsy. Very low and high stone densities are associated with lower rates of treatment success and longer operating time in PCNL. Preoperative assessment of stone density may help in the selection of treatment modality for patients with renal stones.
The local interstellar helium density - Corrected
NASA Technical Reports Server (NTRS)
Freeman, J.; Paresce, F.; Bowyer, S.
1979-01-01
An upper bound for the number density of neutral helium in the local interstellar medium of 0.004 ± 0.0022 per cu cm was previously reported, based on extreme-ultraviolet telescope observations at 584 Å made during the 1975 Apollo-Soyuz Test Project. A variety of evidence is found which indicates that the 584-Å sensitivity of the instrument declined by a factor of 2 between the last laboratory calibration and the time of the measurements. The upper bound on the helium density is therefore revised to 0.0089 ± 0.005 per cu cm.
Patterns of brain structural connectivity differentiate normal weight from overweight subjects
Gupta, Arpana; Mayer, Emeran A.; Sanmiguel, Claudia P.; Van Horn, John D.; Woodworth, Davis; Ellingson, Benjamin M.; Fling, Connor; Love, Aubrey; Tillisch, Kirsten; Labus, Jennifer S.
2015-01-01
Background Alterations in the hedonic component of ingestive behaviors have been implicated as a possible risk factor in the pathophysiology of overweight and obese individuals. Neuroimaging evidence from individuals with increasing body mass index suggests structural, functional, and neurochemical alterations in the extended reward network and associated networks. Aim To apply a multivariate pattern analysis to distinguish normal weight and overweight subjects based on gray and white-matter measurements. Methods Structural images (N = 120, overweight N = 63) and diffusion tensor images (DTI) (N = 60, overweight N = 30) were obtained from healthy control subjects. For the total sample the mean age for the overweight group (females = 32, males = 31) was 28.77 years (SD = 9.76) and for the normal weight group (females = 32, males = 25) was 27.13 years (SD = 9.62). Regional segmentation and parcellation of the brain images was performed using Freesurfer. Deterministic tractography was performed to measure the normalized fiber density between regions. A multivariate pattern analysis approach was used to examine whether brain measures can distinguish overweight from normal weight individuals. Results 1. White-matter classification: The classification algorithm, based on 2 signatures with 17 regional connections, achieved 97% accuracy in discriminating overweight individuals from normal weight individuals. For both brain signatures, greater connectivity as indexed by increased fiber density was observed in overweight compared to normal weight between the reward network regions and regions of the executive control, emotional arousal, and somatosensory networks. In contrast, the opposite pattern (decreased fiber density) was found between ventromedial prefrontal cortex and the anterior insula, and between thalamus and executive control network regions. 2. Gray-matter classification: The classification algorithm, based on 2 signatures with 42 morphological features, achieved 69% accuracy in discriminating overweight from normal weight. In both brain signatures regions of the reward, salience, executive control and emotional arousal networks were associated with lower morphological values in overweight individuals compared to normal weight individuals, while the opposite pattern was seen for regions of the somatosensory network. Conclusions 1. An increased BMI (i.e., overweight subjects) is associated with distinct changes in gray-matter and fiber density of the brain. 2. Classification algorithms based on white-matter connectivity involving regions of the reward and associated networks can identify specific targets for mechanistic studies and future drug development aimed at abnormal ingestive behavior and in overweight/obesity. PMID:25737959
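As an illustration of the multivariate pattern analysis step, the sketch below cross-validates a linear classifier on a synthetic matrix of connectivity features. The feature matrix, labels, and injected group difference are placeholders; the study's actual brain signatures, regions, and classifier are not reproduced here.

```python
# Hypothetical connectivity features: 60 subjects x 17 regional connections.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_connections = 60, 17
X = rng.normal(size=(n_subjects, n_connections))        # e.g. normalized fiber densities
y = np.repeat([0, 1], n_subjects // 2)                  # 0 = normal weight, 1 = overweight
X[y == 1, :5] += 0.8                                    # inject a weak group difference

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
acc = cross_val_score(clf, X, y, cv=5).mean()
print("cross-validated accuracy: %.2f" % acc)
```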
Prakash Nepal; Peter J. Ince; Kenneth E. Skog; Sun J. Chang
2012-01-01
This paper describes a set of empirical net forest growth models based on forest growing-stock density relationships for three U.S. regions (North, South, and West) and two species groups (softwoods and hardwoods) at the regional aggregate level. The growth models accurately predict historical U.S. timber inventory trends when we incorporate historical timber harvests...
Spatial fuel data products of the LANDFIRE Project
Reeves, M.C.; Ryan, K.C.; Rollins, M.G.; Thompson, T.G.
2009-01-01
The Landscape Fire and Resource Management Planning Tools (LANDFIRE) Project is mapping wildland fuels, vegetation, and fire regime characteristics across the United States. The LANDFIRE project is unique because of its national scope, creating an integrated product suite at 30-m spatial resolution and complete spatial coverage of all lands within the 50 states. Here we describe development of the LANDFIRE wildland fuels data layers for the conterminous 48 states: surface fire behavior fuel models, canopy bulk density, canopy base height, canopy cover, and canopy height. Surface fire behavior fuel models are mapped by developing crosswalks to vegetation structure and composition created by LANDFIRE. Canopy fuels are mapped using regression trees relating field-referenced estimates of canopy base height and canopy bulk density to satellite imagery, biophysical gradients and vegetation structure and composition data. Here we focus on the methods and data used to create the fuel data products, discuss problems encountered with the data, provide an accuracy assessment, demonstrate recent use of the data during the 2007 fire season, and discuss ideas for updating, maintaining and improving LANDFIRE fuel data products.
Overview of computational control research at UT Austin
NASA Technical Reports Server (NTRS)
Bong, Wie
1989-01-01
An overview of current research activities at UT Austin is presented to discuss certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as a nonlinear limit cycle analysis capability in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and with significant multibody dynamic interactions.
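For the LQR-based design mentioned in item (3), a state-feedback gain can be obtained by solving the continuous-time algebraic Riccati equation. The sketch below uses a double-integrator plant and arbitrary weights purely for illustration; it is not the space-station model from the project.

```python
# Hypothetical LQR design: solve the Riccati equation and form K = R^-1 B^T P.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [0.0, 0.0]])    # double-integrator plant
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                   # state weighting
R = np.array([[1.0]])                      # control weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # optimal state-feedback gain, u = -K x
print("LQR gain:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```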
Analysis of recoverable current from one component of magnetic flux density in MREIT and MRCDI.
Park, Chunjae; Lee, Byung Il; Kwon, Oh In
2007-06-07
Magnetic resonance current density imaging (MRCDI) provides a current density image by measuring the induced magnetic flux density within the subject with a magnetic resonance imaging (MRI) scanner. Magnetic resonance electrical impedance tomography (MREIT) has been focused on extracting some useful information of the current density and conductivity distribution in the subject Ω using measured B_z, one component of the magnetic flux density B. In this paper, we analyze the map τ from current density vector field J to one component of magnetic flux density B_z without any assumption on the conductivity. The map τ provides an orthogonal decomposition J = J_P + J_N of the current J where J_N belongs to the null space of the map τ. We explicitly describe the projected current density J_P from measured B_z. Based on the decomposition, we prove that B_z data due to one injection current guarantee a unique determination of the isotropic conductivity under assumptions that the current is two-dimensional and the conductivity value on the surface is known. For a two-dimensional dominating current case, the projected current density J_P provides a good approximation of the true current J without accumulating noise effects. Numerical simulations show that J_P from measured B_z is quite similar to the target J. Biological tissue phantom experiments compare J_P with the reconstructed J via the reconstructed isotropic conductivity using the harmonic B_z algorithm.
NASA Astrophysics Data System (ADS)
Chan, C. H.; Brown, G.; Rikvold, P. A.
2017-05-01
A generalized approach to Wang-Landau simulations, macroscopically constrained Wang-Landau, is proposed to simulate the density of states of a system with multiple macroscopic order parameters. The method breaks a multidimensional random-walk process in phase space into many separate, one-dimensional random-walk processes in well-defined subspaces. Each of these random walks is constrained to a different set of values of the macroscopic order parameters. When the multivariable density of states is obtained for one set of values of fieldlike model parameters, the density of states for any other values of these parameters can be obtained by a simple transformation of the total system energy. All thermodynamic quantities of the system can then be rapidly calculated at any point in the phase diagram. We demonstrate how to use the multivariable density of states to draw the phase diagram, as well as order-parameter probability distributions at specific phase points, for a model spin-crossover material: an antiferromagnetic Ising model with ferromagnetic long-range interactions. The fieldlike parameters in this model are an effective magnetic field and the strength of the long-range interaction.
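The following is a minimal sketch of a standard (unconstrained) Wang-Landau random walk estimating the density of states of a small periodic Ising chain; the macroscopically constrained variant described above repeats this kind of one-dimensional walk within each fixed-order-parameter subspace. The system size, flatness criterion, and stopping threshold are illustrative assumptions.

```python
# Hypothetical example: Wang-Landau estimate of ln g(E) for a 12-spin periodic Ising chain (J = 1).
import numpy as np

rng = np.random.default_rng(0)
N = 12
spins = rng.choice([-1, 1], size=N)

def energy(s):
    return int(-np.sum(s * np.roll(s, 1)))      # periodic nearest-neighbour energy

levels = np.arange(-N, N + 1, 4)                # allowed energies of the periodic chain
idx = {int(E): i for i, E in enumerate(levels)}

log_g = np.zeros(len(levels))                   # running estimate of ln g(E)
hist = np.zeros(len(levels))
ln_f = 1.0                                      # modification factor

E = energy(spins)
while ln_f > 1e-4:
    for _ in range(10000):
        i = rng.integers(N)
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])   # single-flip energy change
        E_new = E + int(dE)
        if rng.random() < np.exp(log_g[idx[E]] - log_g[idx[E_new]]):  # accept w.p. min(1, g(E)/g(E_new))
            spins[i] *= -1
            E = E_new
        log_g[idx[E]] += ln_f
        hist[idx[E]] += 1
    if hist.min() > 0.8 * hist.mean():          # histogram flatness check
        ln_f /= 2.0
        hist[:] = 0.0

print({int(E): round(v - log_g.min(), 2) for E, v in zip(levels, log_g)})
```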
Linking the Weather Generator with Regional Climate Model
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin; Farda, Ales; Skalak, Petr; Huth, Radan
2013-04-01
One of the downscaling approaches, which transform the raw outputs from the climate models (GCMs or RCMs) into data with more realistic structure, is based on linking the stochastic weather generator with the climate model output. The present contribution, in which the parametric daily surface weather generator (WG) M&Rfi is linked to the RCM output, follows two aims: (1) Validation of the new simulations of the present climate (1961-1990) made by the ALADIN-Climate Regional Climate Model at 25 km resolution. The WG parameters are derived from the RCM-simulated surface weather series and compared to those derived from weather series observed in 125 Czech meteorological stations. The set of WG parameters will include statistics of the surface temperature and precipitation series (including probability of wet day occurrence). (2) Presenting a methodology for linking the WG with RCM output. This methodology, which is based on merging information from observations and RCM, may be interpreted as a downscaling procedure, whose product is a gridded WG capable of producing realistic synthetic multivariate weather series for weather-ungauged locations. In this procedure, WG is calibrated with RCM-simulated multi-variate weather series in the first step, and the grid specific WG parameters are then de-biased by spatially interpolated correction factors based on comparison of WG parameters calibrated with gridded RCM weather series and spatially scarcer observations. The quality of the weather series produced by the resultant gridded WG will be assessed in terms of selected climatic characteristics (focusing on characteristics related to variability and extremes of surface temperature and precipitation). Acknowledgements: The present experiment is made within the frame of projects ALARO-Climate (project P209/11/2405 sponsored by the Czech Science Foundation), WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).
Mass density images from the diffraction enhanced imaging technique.
Hasnah, M O; Parham, C; Pisano, E D; Zhong, Z; Oltulu, O; Chapman, D
2005-02-01
Conventional x-ray radiography measures the projected x-ray attenuation of an object. It requires attenuation differences to obtain contrast of embedded features. In general, the best absorption contrast is obtained at x-ray energies where the absorption is high, meaning a high absorbed dose. Diffraction-enhanced imaging (DEI) derives contrast from absorption, refraction, and extinction. The refraction angle image of DEI visualizes the spatial gradient of the projected electron density of the object. The projected electron density often correlates well with the projected mass density and projected absorption in soft-tissue imaging, yet the mass density is not an "energy"-dependent property of the object, as is the case of absorption. This simple difference can lead to imaging with less x-ray exposure or dose. In addition, the mass density image can be directly compared (i.e., a signal-to-noise comparison) with conventional radiography. We present the method of obtaining the mass density image, the results of experiments in which comparisons are made with radiography, and an application of the method to breast cancer imaging.
Analysis of interstellar fragmentation structure based on IRAS images
NASA Technical Reports Server (NTRS)
Scalo, John M.
1989-01-01
The goal of this project was to develop new tools for the analysis of the structure of densely sampled maps of interstellar star-forming regions. A particular emphasis was on the recognition and characterization of nested hierarchical structure and fractal irregularity, and their relation to the level of star formation activity. The panoramic IRAS images provided data with the required range in spatial scale, greater than a factor of 100, and in column density, greater than a factor of 50. In order to construct a densely sampled column density map of a cloud complex which is both self-gravitating and not (yet?) stirred up much by star formation, a column density image of the Taurus region has been constructed from IRAS data. The primary drawback to using the IRAS data for this purpose is that it contains no velocity information, and the possible importance of projection effects must be kept in mind.
Modelling Truck Camper Production
ERIC Educational Resources Information Center
Kramlich, G. R., II; Kobylski, G.; Ahner, D.
2008-01-01
This note describes an interdisciplinary project designed to enhance students' knowledge of the basic techniques taught in a multivariable calculus course. The note discusses the four main requirements of the project and then the solutions for each requirement. Concepts covered include differentials, gradients, Lagrange multipliers, constrained…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
HEATS Project: The University of Utah is developing a compact hot-and-cold thermal battery using advanced metal hydrides that could offer an efficient climate control system for EVs. The team’s innovative designs of heating and cooling systems for EVs, with high-energy-density, low-cost thermal batteries, could significantly reduce the weight and eliminate the space constraint in automobiles. The thermal battery can be charged by plugging it into an electrical outlet while charging the electric battery, and it produces heat and cold through a heat exchanger when discharging. The ultimate goal of the project is a climate-controlling thermal battery that can last up to 5,000 charge and discharge cycles while substantially increasing the driving range of EVs, thus reducing the drain on electric batteries.
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1984-01-01
Multivariate statistical analysis and image processing techniques are being applied to the study of arid/semiarid environments, with emphasis on desertification. Field-level indicators of land-soil biota degradation are being sifted out, with staging up to the low aircraft reconnaissance level, to LANDSAT TM & MSS, and even to the AVHRR level. Three completed projects are reviewed: riparian habitat on the Humboldt River floodplain, Salt Lake County urban expansion detection, and salinization/desertification detection in the delta area. Beginning projects summarized include: comparative condition of rangeland in Rush Valley; modeling a GIS/remote sensing data base for Cache County; universal soil loss equation applied to Pinyon-Juniper; relating MSS to ground radiometry near Battle Mountain; and riparian habitat mapping on Mary's River, Nevada.
Salas, Desirée; Le Gall, Antoine; Fiche, Jean-Bernard; Valeri, Alessandro; Ke, Yonggang; Bron, Patrick; Bellot, Gaetan
2017-01-01
Superresolution light microscopy allows the imaging of labeled supramolecular assemblies at a resolution surpassing the classical diffraction limit. A serious limitation of the superresolution approach is sample heterogeneity and the stochastic character of the labeling procedure. To increase the reproducibility and the resolution of the superresolution results, we apply multivariate statistical analysis methods and 3D reconstruction approaches originally developed for cryogenic electron microscopy of single particles. These methods allow for the reference-free 3D reconstruction of nanomolecular structures from two-dimensional superresolution projection images. Since these 2D projection images all show the structure in high-resolution directions of the optical microscope, the resulting 3D reconstructions have the best possible isotropic resolution in all directions. PMID:28811371
Clustering analysis for muon tomography data elaboration in the Muon Portal project
NASA Astrophysics Data System (ADS)
Bandieramonte, M.; Antonuccio-Delogu, V.; Becciani, U.; Costa, A.; La Rocca, P.; Massimino, P.; Petta, C.; Pistagna, C.; Riggi, F.; Riggi, S.; Sciacca, E.; Vitello, F.
2015-05-01
Clustering analysis is one of the multivariate data analysis techniques that gathers statistical data units into groups, in order to minimize the logical distance within each group and to maximize the distance between different groups. In these proceedings, the authors present a novel approach to muon tomography data analysis based on clustering algorithms. As a case study we present the Muon Portal project, which aims to build and operate a dedicated particle detector for the inspection of harbor containers to hinder the smuggling of nuclear materials. Clustering techniques, working directly on scattering points, help to detect the presence of suspicious items inside the container, acting, as will be shown, as a filter for a preliminary analysis of the data.
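As a rough illustration of the filtering idea, the sketch below applies DBSCAN directly to a synthetic cloud of scattering points and reports any dense clusters, which would flag candidate high-Z regions inside a container. The point cloud and clustering parameters are assumptions, not the Muon Portal reconstruction chain.

```python
# Hypothetical point cloud: diffuse background scattering plus one dense block.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
background = rng.uniform(0, 100, size=(500, 3))                              # container volume (cm)
dense_block = rng.normal(loc=[40.0, 60.0, 50.0], scale=2.0, size=(150, 3))   # candidate high-Z item
points = np.vstack([background, dense_block])

labels = DBSCAN(eps=3.0, min_samples=15).fit_predict(points)
for lab in sorted(set(labels) - {-1}):                                       # -1 marks noise points
    cluster = points[labels == lab]
    print(f"cluster {lab}: {len(cluster)} points, centroid {cluster.mean(axis=0).round(1)}")
```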
Feasibility study of new energy projects on three-level indicator system
NASA Astrophysics Data System (ADS)
Zhan, Zhigang
2018-06-01
With the rapid development of the new energy industry, many new energy development projects are being carried out all over the world. To analyze project feasibility, we build a feasibility assessment model for new energy projects based on abundant gathered data about progress in new energy projects. Twelve indicators are selected by principal component analysis (PCA). We then construct a new three-level indicator system, where the first level has 1 indicator, the second level has 5 indicators and the third level has 12 indicators to evaluate. Moreover, we use the entropy weight method (EWM) to obtain the weight vector of the indicators in the third level and multivariate statistical analysis (MVA) to obtain the weight vector of the indicators in the second level. We use this evaluation model to evaluate the feasibility of new energy projects and provide a reference for subsequent new energy investment. This could contribute to the world's low-carbon and green development by encouraging investment in sustainable new energy projects. In the future we will introduce new variables and improve the weighting model. We also conduct a sensitivity analysis of the model and illustrate its strengths and weaknesses.
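A minimal sketch of the entropy weight method for the third-level indicators is given below: the decision matrix is column-normalized, each indicator's information entropy is computed, and weights are taken proportional to one minus the entropy. The 30-by-12 matrix is a synthetic placeholder.

```python
# Hypothetical decision matrix: 30 candidate projects scored on 12 indicators.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((30, 12))

P = X / X.sum(axis=0)                                        # share of each project per indicator
n = X.shape[0]
plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
entropy = -plogp.sum(axis=0) / np.log(n)                     # e_j in [0, 1]
weights = (1.0 - entropy) / (1.0 - entropy).sum()            # degree of diversification, normalized
print(np.round(weights, 3))
```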
Chen, Pengxiang; Han, Lihui; Wang, Cong; Jia, Yibin; Song, Qingxu; Wang, Jianbo; Guan, Shanghui; Tan, Bingxu; Liu, Bowen; Jia, Wenqiao; Cui, Jianfeng; Zhou, Wei; Cheng, Yufeng
2017-06-20
This study aimed to evaluate the prognostic significance of serum lipids in esophageal squamous cell carcinoma patients who underwent esophagectomy. Preoperative serum lipids were collected from 214 patients who were diagnosed with esophageal squamous cell carcinoma. All of the patients received esophagectomy in Qilu Hospital of Shandong University from January 2007 to December 2008. The records and data were analyzed retrospectively. We found that low total cholesterol (for T stage, p = 0.006; for TNM stage, p = 0.039) and low-density lipoprotein cholesterol (for T stage, p = 0.031; for TNM stage, p = 0.035) were associated with advanced T stage and TNM stage. Kaplan-Meier survival analysis indicated that low total cholesterol and low-density lipoprotein cholesterol were associated with shorter disease-free survival (for total cholesterol, p = 0.045; for low-density lipoprotein cholesterol, p < 0.001) and overall survival (for total cholesterol, p = 0.043; for low-density lipoprotein cholesterol, p < 0.001). Lower low-density lipoprotein cholesterol/high-density lipoprotein cholesterol ratio (LHR) indicated poorer disease-free survival and overall survival (both p < 0.001). In the multivariate analysis, low-density lipoprotein cholesterol and LHR were independent prognostic factors for disease-free survival and overall survival. In conclusion, our study indicated that preoperative serum total cholesterol and low-density lipoprotein cholesterol are prognostic factors for esophageal squamous cell carcinoma patients who underwent esophagectomy. LHR can serve as a promising serum lipids-based prognostic indicator.
Brekke, L.D.; Dettinger, M.D.; Maurer, E.P.; Anderson, M.
2008-01-01
Ensembles of historical climate simulations and climate projections from the World Climate Research Programme's (WCRP's) Coupled Model Intercomparison Project phase 3 (CMIP3) multi-model dataset were investigated to determine how model credibility affects apparent relative scenario likelihoods in regional risk assessments. Methods were developed and applied in a Northern California case study. An ensemble of 59 twentieth century climate simulations from 17 WCRP CMIP3 models was analyzed to evaluate relative model credibility associated with a 75-member projection ensemble from the same 17 models. Credibility was assessed based on how realistically models reproduced selected statistics of historical climate relevant to California climatology. Metrics of this credibility were used to derive relative model weights leading to weight-threshold culling of models contributing to the projection ensemble. Density functions were then estimated for two projected quantities (temperature and precipitation), with and without considering credibility-based ensemble reductions. An analysis for Northern California showed that, while some models seem more capable of recreating limited aspects of twentieth century climate, the overall tendency is for comparable model performance when several credibility measures are combined. Use of these metrics to decide which models to include in density function development led to local adjustments to function shapes, but had limited effect on breadth and central tendency, which were found to be more influenced by 'completeness' of the original ensemble in terms of models and emissions pathways. © 2007 Springer Science+Business Media B.V.
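The sketch below illustrates the general idea of credibility-based weighting: weights inversely proportional to a historical-skill error are fed to a weighted kernel density estimate of a projected quantity. The error metric, weighting rule, and projection values are assumptions, not the study's actual metrics or culling procedure.

```python
# Hypothetical skill metrics and projections for 17 models.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
hist_rmse = rng.uniform(0.5, 2.0, size=17)          # error vs. observed historical climate
proj_dT = rng.normal(3.0, 1.0, size=17)             # projected warming (deg C)

w = 1.0 / hist_rmse                                 # credibility weight: inverse error
w /= w.sum()
kde_weighted = gaussian_kde(proj_dT, weights=w)
kde_equal = gaussian_kde(proj_dT)

grid = np.linspace(0.0, 6.0, 121)
print("weighted mode:   %.2f" % grid[np.argmax(kde_weighted(grid))])
print("unweighted mode: %.2f" % grid[np.argmax(kde_equal(grid))])
```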
Multispectral x-ray CT: multivariate statistical analysis for efficient reconstruction
NASA Astrophysics Data System (ADS)
Kheirabadi, Mina; Mustafa, Wail; Lyksborg, Mark; Lund Olsen, Ulrik; Bjorholm Dahl, Anders
2017-10-01
Recent developments in multispectral X-ray detectors allow for an efficient identification of materials based on their chemical composition. This has a range of applications including security inspection, which is our motivation. In this paper, we analyze data from a tomographic setup employing the MultiX detector, that records projection data in 128 energy bins covering the range from 20 to 160 keV. Obtaining all information from this data requires reconstructing 128 tomograms, which is computationally expensive. Instead, we propose to reduce the dimensionality of projection data prior to reconstruction and reconstruct from the reduced data. We analyze three linear methods for dimensionality reduction using a dataset with 37 equally-spaced projection angles. Four bottles with different materials are recorded for which we are able to obtain similar discrimination of their content using a very reduced subset of tomograms compared to the 128 tomograms that would otherwise be needed without dimensionality reduction.
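A minimal sketch of the dimensionality-reduction step is shown below: PCA fitted across the 128 energy bins of a synthetic sinogram stack yields a handful of component sinograms, each of which could then be reconstructed individually. The data and the choice of four components are illustrative assumptions, not the paper's specific linear methods.

```python
# Hypothetical multispectral sinogram: 37 angles x 64 detector pixels x 128 energy bins.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_angles, n_detectors, n_bins = 37, 64, 128
sino = rng.random((n_angles, n_detectors, n_bins))

flat = sino.reshape(-1, n_bins)                     # one row per ray measurement
pca = PCA(n_components=4).fit(flat)
reduced = pca.transform(flat).reshape(n_angles, n_detectors, 4)

print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
# Each reduced[..., k] is a component sinogram; only these few need to be
# reconstructed (e.g. by filtered back projection) instead of 128 tomograms.
```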
Goldenberg, Shira M; Deering, Kathleen; Amram, Ofer; Guillemi, Silvia; Nguyen, Paul; Montaner, Julio; Shannon, Kate
2017-09-01
Despite the high HIV burden faced by sex workers, data on access and retention in antiretroviral therapy (ART) are limited. Using an innovative spatial epidemiological approach, we explored how the social geography of sex work criminalization and violence impacts HIV treatment interruptions among sex workers living with HIV in Vancouver over a 3.5-year period. Drawing upon data from a community-based cohort (AESHA, 2010-2013) and linked external administrative data on ART dispensation, we used GIS mapping and multivariable logistic regression with generalized estimating equations to prospectively examine the effects of spatial criminalization and violence near women's places of residence on 2-day ART interruptions. Analyses were restricted to 66 ART-exposed women who contributed 208 observations and 83 ART interruption events. In adjusted multivariable models, heightened density of displacement due to policing independently correlated with HIV treatment interruptions (AOR: 1.02, 95%CI: 1.00-1.04); density of legal restrictions (AOR: 1.30, 95%CI: 0.97-1.76) and a combined measure of criminalization/violence (AOR: 1.00, 95%CI: 1.00-1.01) were marginally correlated. The social geography of sex work criminalization may undermine access to essential medicines, including HIV treatment. Interventions to promote 'enabling environments' (e.g. peer-led models, safer living/working spaces) should be explored, alongside policy reforms to ensure uninterrupted treatment access.
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2018-02-01
To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable to detect changes in a structure via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses the shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.; Henson, Robert
2012-01-01
A measure of "clusterability" serves as the basis of a new methodology designed to preserve cluster structure in a reduced dimensional space. Similar to principal component analysis, which finds the direction of maximal variance in multivariate space, principal cluster axes find the direction of maximum clusterability in multivariate space.…
Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De
2016-05-01
The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA). To discuss the factors related to the cement leakage in PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures. Cement leakage is a major complication of this procedure. The risk factors for cement leakage were controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and cement leakage ratio, bivariate regression analysis was employed to perform univariate analysis, whereas multivariate linear regression analysis was employed to perform multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). The vertebrae with preoperative cortical bone defects generally exhibited higher cement leakage ratio, and the leakage is typically type C. Vertebrae with intact cortical bones before the procedure tend to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone were associated with cement leakage ratio (P<0.05). Multivariate analysis showed that the main factors influencing bone cement leakage are bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. High bone density and vertebral cortical bone defect are independent risk factors associated with bone cement leakage.
Park, Sung Hee; Lee, Ji Young; Kim, Sangsoo
2011-01-01
Current Genome-Wide Association Studies (GWAS) are performed in a single trait framework without considering genetic correlations between important disease traits. Hence, the GWAS have limitations in discovering genetic risk factors affecting pleiotropic effects. This work reports a novel data mining approach to discover patterns of multiple phenotypic associations over 52 anthropometric and biochemical traits in KARE and a new analytical scheme for GWAS of multivariate phenotypes defined by the discovered patterns. This methodology was applied to the GWAS for the multivariate phenotype highLDLhighTG derived from the predicted patterns of the phenotypic associations. The patterns of the phenotypic associations were informative for drawing relations between plasma lipid levels, bone mineral density, and a cluster of common traits (obesity, hypertension, insulin resistance) related to Metabolic Syndrome (MS). A total of 15 SNPs in six genes (PAK7, C20orf103, NRIP1, BCL2, TRPM3, and NAV1) were identified for significant associations with highLDLhighTG. Noteworthy findings were that the significant associations included a missense mutation (PAK7:R335P), a frameshift mutation (C20orf103) and SNPs in splicing sites (TRPM3). The six genes corresponded to rat and mouse quantitative trait loci (QTLs) that had shown associations with the common traits such as the well characterized MS and even tumor susceptibility. Our findings suggest that the six genes may play important roles in the pleiotropic effects on lipid metabolism and the MS, which increase the risk of Type 2 Diabetes and cardiovascular disease. The use of the multivariate phenotypes can be advantageous in identifying genetic risk factors, accounting for the pleiotropic effects when the multivariate phenotypes have a common etiological pathway.
NASA Astrophysics Data System (ADS)
Othman, Khairulnizam; Ahmad, Afandi
2016-11-01
In this research we explore the application of new normalization techniques within an advanced fast c-means algorithm to the problem of segmenting different breast tissue regions in mammograms. The goal of the segmentation algorithm is to determine whether the new fuzzy c-means variant can separate the different densities of the different breast patterns. The density segmentation is applied with multi-selection of seed labels to provide the hard constraint, where the seed labels are user defined. The new fuzzy c-means variants have been explored on images of various imaging modalities but not yet on large-format digital mammograms. Therefore, this project is mainly focused on using the new normalization techniques employed in fuzzy c-means to perform segmentation and increase the visibility of different breast densities in mammography images. Segmentation of the mammogram into different mammographic densities is useful for risk assessment and quantitative evaluation of density changes. Our proposed methodology for segmenting mammograms into different density-based categories has been tested on the MIAS database and the Trueta Database.
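For reference, the sketch below implements the standard fuzzy c-means update equations on a synthetic one-dimensional intensity sample with three density classes; it is a generic illustration, not the normalized variant or seed-label constraints proposed in the study.

```python
# Hypothetical mammogram-like intensity sample clustered into three density classes.
import numpy as np

rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(0.20, 0.05, 2000),       # fatty tissue intensities
                      rng.normal(0.60, 0.05, 1500),       # glandular
                      rng.normal(0.85, 0.04, 500)])       # dense
x = img.reshape(-1, 1)

c, m = 3, 2.0                                             # clusters, fuzzifier
u = rng.random((len(x), c))
u /= u.sum(axis=1, keepdims=True)
for _ in range(100):
    um = u ** m
    centers = (um.T @ x) / um.sum(axis=0)[:, None]        # membership-weighted means
    d = np.abs(x - centers.T) + 1e-12                     # distances to each center
    inv = d ** (-2.0 / (m - 1.0))
    u = inv / inv.sum(axis=1, keepdims=True)              # membership update
print("class centers:", np.round(np.sort(centers.ravel()), 3))
```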
A k-omega multivariate beta PDF for supersonic turbulent combustion
NASA Technical Reports Server (NTRS)
Alexopoulos, G. A.; Baurle, R. A.; Hassan, H. A.
1993-01-01
In a recent attempt by the authors at predicting measurements in coaxial supersonic turbulent reacting mixing layers involving H2 and air, a number of discrepancies involving the concentrations and their variances were noted. The turbulence model employed was a one-equation model based on the turbulent kinetic energy. This required the specification of a length scale. In an attempt at detecting the cause of the discrepancy, a coupled k-omega joint probability density function (PDF) is employed in conjunction with a Navier-Stokes solver. The results show that improvements resulting from a k-omega model are quite modest.
Winterer, G; Androsova, G; Bender, O; Boraschi, D; Borchers, F; Dschietzig, T B; Feinkohl, I; Fletcher, P; Gallinat, J; Hadzidiakos, D; Haynes, J D; Heppner, F; Hetzer, S; Hendrikse, J; Ittermann, B; Kant, I M J; Kraft, A; Krannich, A; Krause, R; Kühn, S; Lachmann, G; van Montfort, S J T; Müller, A; Nürnberg, P; Ofosu, K; Pietsch, M; Pischon, T; Preller, J; Renzulli, E; Scheurer, K; Schneider, R; Slooter, A J C; Spies, C; Stamatakis, E; Volk, H D; Weber, S; Wolf, A; Yürek, F; Zacharias, N
2018-04-01
Postoperative cognitive impairment is among the most common medical complications associated with surgical interventions - particularly in elderly patients. In our aging society, it is an urgent medical need to determine preoperative individual risk prediction to allow more accurate cost-benefit decisions prior to elective surgeries. So far, risk prediction is mainly based on clinical parameters. However, these parameters only give a rough estimate of the individual risk. At present, there are no molecular or neuroimaging biomarkers available to improve risk prediction and little is known about the etiology and pathophysiology of this clinical condition. In this short review, we summarize the current state of knowledge and briefly present the recently started BioCog project (Biomarker Development for Postoperative Cognitive Impairment in the Elderly), which is funded by the European Union. It is the goal of this research and development (R&D) project, which involves academic and industry partners throughout Europe, to deliver a multivariate algorithm based on clinical assessments as well as molecular and neuroimaging biomarkers to overcome the currently unsatisfying situation. Copyright © 2017. Published by Elsevier Masson SAS.
Aneurysm permeability following coil embolization: packing density and coil distribution
Chueh, Ju-Yu; Vedantham, Srinivasan; Wakhloo, Ajay K; Carniato, Sarena L; Puri, Ajit S; Bzura, Conrad; Coffin, Spencer; Bogdanov, Alexei A; Gounis, Matthew J
2015-01-01
Background Rates of durable aneurysm occlusion following coil embolization vary widely, and a better understanding of coil mass mechanics is desired. The goal of this study is to evaluate the impact of packing density and coil uniformity on aneurysm permeability. Methods Aneurysm models were coiled using either Guglielmi detachable coils or Target coils. The permeability was assessed by taking the ratio of microspheres passing through the coil mass to those in the working fluid. Aneurysms containing coil masses were sectioned for image analysis to determine surface area fraction and coil uniformity. Results All aneurysms were coiled to a packing density of at least 27%. Packing density, surface area fraction of the dome and neck, and uniformity of the dome were significantly correlated (p<0.05). Hence, multivariate principal components-based partial least squares regression models were used to predict permeability. Similar loading vectors were obtained for packing and uniformity measures. Coil mass permeability was modeled better with the inclusion of packing and uniformity measures of the dome (r2=0.73) than with packing density alone (r2=0.45). The analysis indicates the importance of including a uniformity measure for coil distribution in the dome along with packing measures. Conclusions A densely packed aneurysm with a high degree of coil mass uniformity will reduce permeability. PMID:25031179
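To illustrate the modelling step, the sketch below fits a partial least squares regression relating packing and uniformity measures to permeability and reports a cross-validated R-squared. All predictor and response values are synthetic placeholders rather than the experimental data.

```python
# Hypothetical coil-mass dataset: packing and uniformity predictors, permeability response.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 40
X = np.column_stack([
    rng.uniform(0.27, 0.40, n),          # packing density
    rng.uniform(0.10, 0.35, n),          # surface area fraction, dome
    rng.uniform(0.10, 0.35, n),          # surface area fraction, neck
    rng.uniform(0.30, 0.90, n),          # coil uniformity, dome
])
y = 0.8 - 1.2 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.02, n)   # toy permeability

pls = PLSRegression(n_components=2)
r2 = cross_val_score(pls, X, y, cv=5).mean()        # default scorer is R^2 for regressors
print("cross-validated R^2: %.2f" % r2)
```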
Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...
2014-12-02
Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanemoto, S.; Andoh, Y.; Sandoz, S.A.
1984-10-01
A method for evaluating reactor stability in boiling water reactors has been developed. The method is based on multivariate autoregressive (M-AR) modeling of steady-state neutron and process noise signals. In this method, two kinds of power spectral densities (PSDs) for the measured neutron signal and the corresponding noise source signal are separately identified by the M-AR modeling. The closed- and open-loop stability parameters are evaluated from these PSDs. The method is applied to actual plant noise data that were measured together with artificial perturbation test data. Stability parameters identified from noise data are compared to those from perturbation test data, and it is shown that both results are in good agreement. In addition to these stability estimations, driving noise sources for the neutron signal are evaluated by the M-AR modeling. Contributions from void, core flow, and pressure noise sources are quantitatively evaluated, and the void noise source is shown to be the most dominant.
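A minimal sketch of the M-AR idea is given below: a vector autoregressive model is fitted to two synthetic noise channels, and the parametric spectral density matrix implied by the fitted coefficients is evaluated as S(f) = H(f) Sigma H(f)*, with H(f) = (I - sum_k A_k exp(-i 2 pi f k))^-1. The toy coupling and lag selection are assumptions, not the plant data or the study's stability-parameter extraction.

```python
# Hypothetical two-channel noise record: a "neutron" signal driven by a "flow" signal.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 4000
flow = rng.normal(size=n)
neutron = np.zeros(n)
for t in range(2, n):
    neutron[t] = 0.6 * neutron[t - 1] - 0.3 * neutron[t - 2] + 0.4 * flow[t - 1] + rng.normal()
data = np.column_stack([neutron, flow])

res = VAR(data).fit(maxlags=8, ic="aic")
A, sigma = res.coefs, res.sigma_u                   # (p, k, k) AR matrices, innovation covariance

def psd_matrix(f):
    """Parametric spectral density matrix at frequency f (cycles per sample)."""
    k = data.shape[1]
    H = np.eye(k, dtype=complex)
    for lag, Ak in enumerate(A, start=1):
        H -= Ak * np.exp(-2j * np.pi * f * lag)
    H = np.linalg.inv(H)                            # transfer function H(f)
    return H @ sigma @ H.conj().T

print(np.round(np.abs(psd_matrix(0.1)), 2))
```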
An assessment of twilight airglow inversion procedures using atmosphere explorer observations
NASA Technical Reports Server (NTRS)
Mcdade, I. C.; Sharp, W. E.
1993-01-01
The aim of this research project was to test and truth some recently developed methods for recovering thermospheric oxygen atom densities and thermospheric temperatures from ground-based observations of the 7320 Å O⁺(²D-²P) twilight airglow emission. The research plan was to use twilight observations made by the Visible Airglow Experiment (VAE) on the Atmosphere Explorer 'E' satellite as proxy ground-based twilight observations. These observations were to be processed using the twilight inversion procedures, and the recovered oxygen atom densities and thermospheric temperatures were then to be examined to see how they compared with the densities and temperatures that were measured by the Open Source Mass Spectrometer and the Neutral Atmosphere Temperature Experiment on the satellite.
Outcome-based and Participation-based Wellness Incentives
Barleen, Nathan A.; Marzec, Mary L.; Boerger, Nicholas L.; Moloney, Daniel P.; Zimmerman, Eric M.; Dobro, Jeff
2017-01-01
Objective: This study examined whether worksite wellness program participation or achievement of health improvement targets differed according to four incentive types (participation-based, hybrid, outcome-based, and no incentive). Methods: The study included individuals who completed biometric health screenings in both 2013 and 2014 and had elevated metrics in 2013 (baseline year). Multivariate logistic regression modeling tested for differences in odds of participation and achievement of health improvement targets between incentive groups, controlling for demographics, employer characteristics, incentive amounts, and other factors. Results: No statistically significant differences between incentive groups occurred for odds of participation or achievement of health improvement targets related to body mass index, blood pressure, or non-high-density lipoprotein cholesterol. Conclusions: Given the null findings of this study, employers cannot assume that outcome-based incentives will result in either increased program participation or greater achievement of health improvement targets than participation-based incentives.
NASA Astrophysics Data System (ADS)
Milner-Bolotin, Marina
2001-07-01
Motivating nonscience majors in science and mathematics studies became one of the most interesting and important challenges in contemporary science and mathematics education. Therefore, designing and studying a learning environment, which enhances students' motivation, is an important task. This experimental study sought to explore the implications of student autonomy in topic choice in a project-based Physical Science Course for nonscience majors' on students' motivational orientation. It also suggested and tested a model explaining motivational outcomes of project-based learning environment through increased student ownership of science projects. A project, How Things Work, was designed and implemented in this study. The focus of the project was application of physical science concepts learned in the classroom to everyday life situations. Participants of the study (N = 59) were students enrolled in three selected sections of a Physical Science Course, designed to fulfill science requirements for nonscience majors. These sections were taught by the same instructor over a period of an entire 16-week semester at a large public research university. The study focused on four main variables: student autonomy in choosing a project topic, their motivational orientation, student ownership of the project, and the interest in the project topic. Achievement Goal Orientation theory became the theoretical framework for the study. Student motivational orientation, defined as mastery or performance goal orientation, was measured by an Achievement Goal Orientation Questionnaire. Student ownership was measured using an original instrument, Ownership Measurement Questionnaire, designed and tested by the researchers. Repeated measures yoked design, ANOVA, ANCOVA, and multivariate regression analysis were implemented in the study. Qualitative analysis was used to complement and verify quantitative results. It has been found that student autonomy in the project choice did not make a significant impact on their motivational orientation, while their initial interest in the project topic did. The latter was found to be related to students' ownership of the project, which was found to lead to improved mastery goal orientation. These findings indicate that incorporating project-based learning in science teaching may lead to increased student mastery goal orientation, and may result in improved science learning.
Helzner, E P.; Scarmeas, N; Cosentino, S; Tang, M X.; Schupf, N; Stern, Y
2008-01-01
Objective: To describe factors associated with survival in Alzheimer disease (AD) in a multiethnic, population-based longitudinal study. Methods: AD cases were identified in the Washington Heights Inwood Columbia Aging Project, a longitudinal, community-based study of cognitive aging in Northern Manhattan. The sample comprised 323 participants who were initially dementia-free but developed AD during study follow-up (incident cases). Participants were followed for an average of 4.1 (up to 12.6) years. Possible factors associated with shorter lifespan were assessed using Cox proportional hazards models with attained age as the time to event (time from birth to death or last follow-up). In subanalyses, median postdiagnosis survival durations were estimated using postdiagnosis study follow-up as the timescale. Results: The mortality rate was 10.7 per 100 person-years. Mortality rates were higher among those diagnosed at older ages, and among Hispanics compared to non-Hispanic whites. The median lifespan of the entire sample was 92.2 years (95% CI: 90.3, 94.1). In a multivariable-adjusted Cox model, history of diabetes and history of hypertension were independently associated with a shorter lifespan. No differences in lifespan were seen by race/ethnicity after multivariable adjustment. The median postdiagnosis survival duration was 3.7 years among non-Hispanic whites, 4.8 years among African Americans, and 7.6 years among Hispanics. Conclusion: Factors influencing survival in Alzheimer disease include race/ethnicity and comorbid diabetes and hypertension. GLOSSARY AD = Alzheimer disease; NDI = National Death Index; WHICAP = Washington Heights Inwood Columbia Aging Project. PMID:18981370
ERIC Educational Resources Information Center
de Oliveira, Rodrigo R.; das Neves, Luiz S.; de Lima, Kassio M. G.
2012-01-01
A chemometrics course is offered to students in their fifth semester of the chemistry undergraduate program that includes an in-depth project. Students carry out the project over five weeks (three 8-h sessions per week) and conduct it in parallel to other courses or other practical work. The students conduct a literature search, carry out…
DOT National Transportation Integrated Search
2016-11-28
Intelligent Compaction (IC) is considered to be an innovative technology intended to address some of the problems associated with conventional compaction methods for earthwork (e.g. stiffness-based measurements instead of density-based measurements). I...
NASA Astrophysics Data System (ADS)
Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo
2006-08-01
Among the many GIS based multivariate statistical methods for landslide susceptibility zonation, the so called “Conditional Analysis method” holds a special place for its conceptual simplicity. In fact, in this method landslide susceptibility is simply expressed as landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows to assess the reliability of the resulting model, while the simple mean deviation of the density values in the factor class combinations, helps to evaluate the goodness of landslide density distribution. The procedure was applied to a relatively small basin (167 km2) in the Italian Northern Apennines considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation and slope/bedding relations. The analysis of the resulting 31 different models obtained combining the five factors, confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
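A minimal sketch of the density-computation core of the Conditional Analysis method is shown below: landslide density for every combination of factor classes via a groupby, followed by a quantile split into five susceptibility classes. The synthetic cell table stands in for the stacked GRASS factor maps and the landslide inventory.

```python
# Hypothetical raster-cell table: three factor maps plus a landslide inventory mask.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100000
cells = pd.DataFrame({
    "lithology": rng.integers(1, 6, n),
    "slope_class": rng.integers(1, 5, n),
    "aspect_class": rng.integers(1, 9, n),
    "landslide": rng.random(n) < 0.03,
})

density = (cells.groupby(["lithology", "slope_class", "aspect_class"])["landslide"]
                .mean()
                .rename("landslide_density"))                # density per factor-class combination
susceptibility = pd.qcut(density, 5, labels=False, duplicates="drop") + 1   # five classes
print(density.describe().round(4))
print(susceptibility.value_counts().sort_index())
```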
Step styles of pedestrians at different densities
NASA Astrophysics Data System (ADS)
Wang, Jiayue; Weng, Wenguo; Boltes, Maik; Zhang, Jun; Tordeux, Antoine; Ziemer, Verena
2018-02-01
Stepping locomotion is the basis of human movement. The investigation of stepping locomotion and its affecting factors is necessary for a more realistic knowledge of human movement, which is usually referred to as walking with equal step lengths for the right and left leg. To study pedestrians’ stepping locomotion, a set of single-file movement experiments involving 39 participants of the same age walking on a highly curved oval course is conducted. The microscopic characteristics of the pedestrians including 1D Voronoi density, speed, and step length are calculated based on a projected coordinate. The influence of the projection lines with different radii on the measurement of these quantities is investigated. The step lengths from the straight and curved parts are compared using the Kolmogorov-Smirnov test. During the experiments, six different step styles are observed and the proportions of different step styles change with the density. At low density, the main step style is the stable-large step style and the step lengths of one pedestrian are almost constant. At high density, some pedestrians adjust and decrease their step lengths. Some pedestrians take relatively smaller and larger steps alternately to adapt to limited space.
NASA Astrophysics Data System (ADS)
Nagayama, Yoshio; Yamaguchi, Soichiro; Tsuchiya, Hayato; Kuwahara, Daisuke; LHD Experimental Team
2016-10-01
Visualization of local electron density fluctuations will be very useful to study the physics of confinement and instabilities in fusion plasma. In the Large Helical Device (LHD), the O-mode microwave imaging reflectometry (O-MIR) has been intensively developed in order to visualize the electron density fluctuations. The frequency is 26 - 34 GHz. This corresponds to the electron density of 0.8 - 1.5 × 10¹⁹ m⁻³. The plasma is illuminated by the Gaussian beam with four frequencies. The imaging optics make a plasma image onto the newly developed 2D (8 × 8) Horn-antenna Millimeter-wave Imaging Device (HMID). In HMID, the signal wave that is accumulated by the horn antenna is transduced to the micro-strip line by using the finline transducer. The signal wave is mixed by the double balanced mixer with the local wave that is delivered by cables. By using O-MIR, electron density fluctuations are measured at the H-mode edge and the ITB layer in LHD. This work is supported by NIFS/NINS under the project of Formation of International Scientific Base and Network, by the NIFS LHD project, by KAKENHI, and by IMS.
Intelligence and EEG current density using low-resolution electromagnetic tomography (LORETA).
Thatcher, R W; North, D; Biver, C
2007-02-01
The purpose of this study was to compare EEG current source densities in high IQ subjects vs. low IQ subjects. Resting eyes-closed EEG was recorded from 19 scalp locations with a linked-ears reference from 442 subjects ages 5 to 52 years. The Wechsler Intelligence Test was administered and subjects were divided into low IQ (≤90), middle IQ (>90 to <120) and high IQ (≥120) groups. Low-resolution electromagnetic tomographic current densities (LORETA) from 2,394 cortical gray matter voxels were computed from 1-30 Hz based on each subject's EEG. t tests, multivariate analyses of covariance, and regression analyses of differences in current densities were used to evaluate the relationships between IQ and current density in Brodmann area groupings of cortical gray matter voxels. Frontal, temporal, parietal, and occipital regions of interest (ROIs) consistently exhibited a direct relationship between LORETA current density and IQ. Maximal t test differences were present at 4 Hz, 9 Hz, 13 Hz, 18 Hz, and 30 Hz, with different anatomical regions showing different maxima. Linear regression fits from low to high IQ groups were statistically significant (P < 0.0001). Intelligence is directly related to a general level of arousal and to the synchrony of neural populations driven by thalamo-cortical resonances. A traveling frame model of sequential microstates is hypothesized to explain the results.
Zhang, Yongsheng; Wei, Heng; Zheng, Kangning
2017-01-01
Considering that metro network expansion provides more alternative routes, it is attractive to integrate the impacts of the route set and the interdependency among alternative routes on route choice probability into route choice modeling. Therefore, the formulation, estimation, and application of a constrained multinomial probit (CMNP) route choice model for the metro network are carried out in this paper. The utility function is formulated with three components: the compensatory component is a function of influencing factors; the non-compensatory component measures the impact of the route set on utility; and the error component follows a multivariate normal distribution whose covariance is structured into three parts, representing the correlation among routes, the transfer variance of the route, and the unobserved variance, respectively. Because the multivariate normal probability density function involves multidimensional integrals, the CMNP model is rewritten in a hierarchical Bayes form and a Markov chain Monte Carlo approach based on the Metropolis-Hastings (M-H) sampling algorithm is constructed to estimate all parameters. Reliable estimation results are obtained from Guangzhou Metro data. Furthermore, the proposed CMNP model also shows good forecasting performance for route choice probabilities and good application performance for transfer flow volume prediction. PMID:28591188
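To make the estimation step concrete, the following is a minimal random-walk Metropolis-Hastings sketch of the kind of sampler the abstract describes; the log-posterior used here is a stand-in (a correlated bivariate normal), not the CMNP route-choice posterior itself.

```python
# Minimal random-walk Metropolis-Hastings sketch; the target is a placeholder.
import numpy as np

def log_post(theta):
    # placeholder target: zero-mean bivariate normal with correlation 0.6
    cov = np.array([[1.0, 0.6], [0.6, 1.0]])
    return -0.5 * theta @ np.linalg.solve(cov, theta)

def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)   # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:                 # accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = metropolis_hastings(log_post, theta0=[0.0, 0.0])
print(chain[1000:].mean(axis=0), np.corrcoef(chain[1000:].T)[0, 1])
```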
Matos, Larissa A.; Bandyopadhyay, Dipankar; Castro, Luis M.; Lachos, Victor H.
2015-01-01
In biomedical studies on HIV RNA dynamics, viral loads generate repeated measures that are often subject to upper and lower detection limits, and hence these responses are either left- or right-censored. Linear and non-linear mixed-effects censored (LMEC/NLMEC) models are routinely used to analyse these longitudinal data, with normality assumptions for the random effects and residual errors. However, the derived inference may not be robust when these underlying normality assumptions are questionable, especially in the presence of outliers and thick tails. Motivated by this, Matos et al. (2013b) recently proposed an exact EM-type algorithm for LMEC/NLMEC models using a multivariate Student's-t distribution, with closed-form expressions at the E-step. In this paper, we develop influence diagnostics for LMEC/NLMEC models using the multivariate Student's-t density, based on the conditional expectation of the complete-data log-likelihood. This partially eliminates the complexity associated with the approach of Cook (1977, 1986) for censored mixed-effects models. The new methodology is illustrated via an application to a longitudinal HIV dataset. In addition, a simulation study explores the accuracy of the proposed measures in detecting possible influential observations for heavy-tailed censored data under different perturbation and censoring schemes. PMID:26190871
Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan
2017-12-01
Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m or 0-0.20 m below the surface, and the sampling densities ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging, and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to be lithogenic origin, roadway and transportation sources, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
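For reference, inverse distance weighted interpolation (one of the two interpolators the review identifies as most common) can be sketched in a few lines; the sample coordinates, the lead concentrations, and the power parameter p = 2 below are illustrative assumptions.

```python
# Minimal inverse-distance-weighted (IDW) interpolation sketch for point soil samples.
import numpy as np

def idw(xy_samples, values, xy_query, p=2.0, eps=1e-12):
    d = np.linalg.norm(xy_query[:, None, :] - xy_samples[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** p                     # inverse-distance weights
    return (w * values).sum(axis=1) / w.sum(axis=1)

samples = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # sample locations (km)
pb_mg_kg = np.array([35.0, 80.0, 22.0, 60.0])                          # e.g. soil Pb concentrations
grid = np.array([[0.5, 0.5], [0.25, 0.75]])                            # prediction locations
print(idw(samples, pb_mg_kg, grid))
```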
Bakhsh, Hanadi; Dei, Metella; Bucciantini, Sandra; Balzi, Daniela; Bruni, Vincenzina
2015-01-01
To evaluate biological differences among young subjects with premature ovarian insufficiency (POI) commencing at different stages of life. Retrospective observational study conducted at Careggi University Hospital. Participants: one hundred sixty-two females aged between 15 and 29 years with premature ovarian insufficiency. Data were collected as a retrospective chart review of the baseline evaluation at diagnosis of POI. The 162 participants were divided into four groups based on gynecological age. Two primary outcome variables, uterine development and bone mineral density (BMD), were analyzed in terms of differences among groups and in a multivariate logistic regression analysis. Uterine development was clearly jeopardized when estrogen insufficiency started at a very young age. Total body BMD showed significant differences among the four groups studied, clearly corresponding to the duration of ovarian function. The data are discussed in relation to the choice of hormone replacement therapy regimens.
Diet and the role of lipoproteins, lipases, and thyroid hormones in coronary lesion growth
NASA Technical Reports Server (NTRS)
Barth, Jacques D.; Jansen, Hans; Reiber, Johan H. C.; Birkenhager, Jan C.; Kromhout, Daan
1987-01-01
The relationships between coronary lesion growth and the blood contents of lipoprotein fractions, thyroid hormones, and lipoprotein lipase activity were investigated in male patients with severe coronary atherosclerosis who participated in a lipid-lowering dietary intervention program. A quantitative computer-assisted image-processing technique was used to assess the severity of coronary obstructions at the beginning of the program and at its termination two years later. Based on absolute coronary scores, patients were divided into a no-lesion-growth group (14 patients) and a progression group (21 patients). At the end of the trial, very-low-density lipoprotein cholesterol and triglycerides were found to be significantly higher, while high-density lipoprotein cholesterol and hepatic lipase (HL) were lower, in the progression group. Multivariate regression analysis showed HL to be the most important determinant of changes in coronary atherosclerotic lesions.
ICLUS v1.3 Population Projections
Climate and land-use change are major components of global environmental change, with feedbacks between them; their interactions show that land use may exacerbate or alleviate climate change effects. It is therefore important to use land-use scenarios that are consistent with the specific assumptions underlying climate-change scenarios. The Integrated Climate and Land-Use Scenarios (ICLUS) project developed land-use outputs based on a downscaled version of the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines. ICLUS outputs are derived from a pair of models: a demographic model generates county-level population estimates that are distributed by a spatial allocation model (SERGoM v3) as housing density across the landscape. Land-use outputs were developed for the four main SRES storylines and a baseline (base case). The model is run for the conterminous USA and output is available for each scenario by decade to 2100. In addition to housing density at a 1 hectare spatial resolution, the project also generated estimates of impervious surface at a resolution of 1 square kilometer. This shapefile holds population data for all counties of the conterminous USA for all decades (2010-2100) and SRES population growth scenarios (A1, A2, B1, B2), as well as a 'base case' (BC) scenario, for use in the Integrated Climate and Land Use Scenarios (ICLUS) project.
Ider, Yusuf Ziya; Birgul, Ozlem; Oran, Omer Faruk; Arikan, Orhan; Hamamura, Mark J; Muftuler, L Tugan
2010-06-07
Fourier transform (FT)-based algorithms for magnetic resonance current density imaging (MRCDI) from one component of magnetic flux density have been developed for 2D and 3D problems. For 2D problems, where current is confined to the xy-plane and z-component of the magnetic flux density is measured also on the xy-plane inside the object, an iterative FT-MRCDI algorithm is developed by which both the current distribution inside the object and the z-component of the magnetic flux density on the xy-plane outside the object are reconstructed. The method is applied to simulated as well as actual data from phantoms. The effect of measurement error on the spatial resolution of the current density reconstruction is also investigated. For 3D objects an iterative FT-based algorithm is developed whereby the projected current is reconstructed on any slice using as data the Laplacian of the z-component of magnetic flux density measured for that slice. In an injected current MRCDI scenario, the current is not divergence free on the boundary of the object. The method developed in this study also handles this situation.
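One ingredient of the 3D approach above, the Laplacian of the measured z-component of the magnetic flux density, can be computed spectrally; the sketch below does this on a 2-D slice with a synthetic Bz field and an illustrative grid spacing (not the study's phantom data).

```python
# Hedged sketch: Laplacian of a Bz slice computed in the Fourier domain,
# i.e. multiplication of the 2-D FFT by -(kx^2 + ky^2).
import numpy as np

def fft_laplacian(bz, dx, dy):
    ny, nx = bz.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    return np.real(np.fft.ifft2(-(KX**2 + KY**2) * np.fft.fft2(bz)))

x = np.linspace(-0.1, 0.1, 128)
X, Y = np.meshgrid(x, x)
bz = np.exp(-(X**2 + Y**2) / (2 * 0.02**2))      # synthetic flux-density slice (T)
lap = fft_laplacian(bz, dx=x[1] - x[0], dy=x[1] - x[0])
print(lap.shape, lap.max())
```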
Three-Way Analysis of Spectrospatial Electromyography Data: Classification and Interpretation
Kauppi, Jukka-Pekka; Hahne, Janne; Müller, Klaus-Robert; Hyvärinen, Aapo
2015-01-01
Classifying multivariate electromyography (EMG) data is an important problem in prosthesis control as well as in neurophysiological studies and diagnosis. With modern high-density EMG sensor technology, it is possible to capture the rich spectrospatial structure of the myoelectric activity. We hypothesize that multi-way machine learning methods can efficiently utilize this structure in classification as well as reveal interesting patterns in it. To this end, we investigate the suitability of existing three-way classification methods to EMG-based hand movement classification in spectrospatial domain, as well as extend these methods by sparsification and regularization. We propose to use Fourier-domain independent component analysis as preprocessing to improve classification and interpretability of the results. In high-density EMG experiments on hand movements across 10 subjects, three-way classification yielded higher average performance compared with state-of-the art classification based on temporal features, suggesting that the three-way analysis approach can efficiently utilize detailed spectrospatial information of high-density EMG. Phase and amplitude patterns of features selected by the classifier in finger-movement data were found to be consistent with known physiology. Thus, our approach can accurately resolve hand and finger movements on the basis of detailed spectrospatial information, and at the same time allows for physiological interpretation of the results. PMID:26039100
LIMEPY: Lowered Isothermal Model Explorer in PYthon
NASA Astrophysics Data System (ADS)
Gieles, Mark; Zocchi, Alice
2017-10-01
LIMEPY solves distribution-function (DF) based lowered isothermal models. It solves Poisson's equation for the given input parameters and offers fast solutions for isotropic/anisotropic and single/multi-mass models, normalized DF values, density and velocity moments, and projected properties, and it generates discrete samples.
Semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2017-06-01
This project focuses on the use of tools from a combination of computer vision and localization based navigation schemes to aid the process of efficient and safe parking of vehicles in high density parking spaces. The principles of collision avoidanc...
NASA Astrophysics Data System (ADS)
Zakariyah, N.; Pathy, N. B.; Taib, N. A. M.; Rahmat, K.; Judy, C. W.; Fadzil, F.; Lau, S.; Ng, K. H.
2016-03-01
It has been shown that breast density and obesity are related to breast cancer risk. The aim of this study was to investigate the relationships of breast volume, breast dense volume, and volumetric breast density (VBD) with body mass index (BMI) and body fat mass (BFM) for the three main ethnic groups (Chinese, Malay, and Indian) in Malaysia. We collected raw digital mammograms from 2,450 women acquired on three digital mammography systems. The mammograms were analysed using Volpara software to obtain breast volume, breast dense volume, and VBD. Body weight, BMI, and BFM of the women were measured using a body composition analyser. Multivariable logistic regression was used to determine the independent predictors of increased overall breast volume, breast dense volume, and VBD. Indians had the highest breast volume and breast dense volume, followed by Malays and Chinese, whereas Chinese had the highest VBD, followed by Malays and Indians. Multivariable analysis showed that increasing BMI and BFM were independent predictors of increased overall breast volume and dense volume, and that BMI and BFM were independently and inversely related to VBD.
Molodecky, Natalie A; Blake, Isobel M; O'Reilly, Kathleen M; Wadood, Mufti Zubair; Safdar, Rana M; Wesolowski, Amy; Buckee, Caroline O; Bandyopadhyay, Ananda S; Okayasu, Hiromasa; Grassly, Nicholas C
2017-06-01
Pakistan currently provides a substantial challenge to global polio eradication, having contributed 73% of reported poliomyelitis in 2015 and 54% in 2016. A better understanding of the risk factors and movement patterns that contribute to poliovirus transmission across Pakistan would support evidence-based planning for mass vaccination campaigns. We fit mixed-effects logistic regression models to routine surveillance data recording the presence of poliomyelitis associated with wild-type 1 poliovirus in districts of Pakistan over 6-month intervals between 2010 and 2016. To accurately capture the force of infection (FOI) between districts, we compared 6 models of population movement (adjacency, gravity, radiation, radiation based on population density, radiation based on travel times, and mobile-phone based). We used the best-fitting model (based on the Akaike Information Criterion [AIC]) to produce 6-month forecasts of poliomyelitis incidence. The odds of observing poliomyelitis decreased with improved routine or supplementary (campaign) immunisation coverage (multivariable odds ratio [OR] = 0.75, 95% confidence interval [CI] 0.67-0.84; and OR = 0.75, 95% CI 0.66-0.85, respectively, for each 10% increase in coverage) and increased with a higher rate of reporting of non-polio acute flaccid paralysis (AFP) (OR = 1.13, 95% CI 1.02-1.26 for a 1-unit increase in non-polio AFP per 100,000 persons aged <15 years). Estimated movement of poliovirus-infected individuals was associated with the incidence of poliomyelitis, with the radiation model of movement providing the best fit to the data. Six-month forecasts of poliomyelitis incidence by district for 2013-2016 showed good predictive ability (area under the curve range: 0.76-0.98). However, although the best-fitting movement model (radiation) was a significant determinant of poliomyelitis incidence, it did not improve the predictive ability of the multivariable model. Overall, the risk of polio cases in Pakistan was predicted to decline between July-December 2016 and January-June 2017. The accuracy of the model may be limited by the small number of AFP cases in some districts. Spatiotemporal variation in immunization performance and population movement patterns are important determinants of historical poliomyelitis incidence in Pakistan; however, movement dynamics were less influential in predicting future cases, at a time when the polio map is shrinking. Results from the regression models we present are being used to help plan vaccination campaigns and transit vaccination strategies in Pakistan.
LFSPMC: Linear feature selection program using the probability of misclassification
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Marion, B. P.
1975-01-01
The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
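The following is a hedged two-class sketch of the idea behind the technique: find a single linear combination w'x whose projected (univariate normal) class densities give the smallest Bayes probability of misclassification. The class means, covariances, priors, and the numerical optimizer are illustrative assumptions, not the LFSPMC program itself.

```python
# Hedged sketch: search for the projection direction minimizing the 1-D Bayes error.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

mu = [np.array([0.0, 0.0, 0.0]), np.array([1.5, 1.0, 0.5])]
cov = [np.eye(3), np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.2], [0.0, 0.2, 0.5]])]
prior = [0.5, 0.5]

def misclassification(w):
    w = w / np.linalg.norm(w)
    m = [w @ mu_k for mu_k in mu]                       # projected class means
    s = [np.sqrt(w @ c_k @ w) for c_k in cov]           # projected class std devs
    x = np.linspace(min(m) - 6 * max(s), max(m) + 6 * max(s), 4001)
    dens = [p * norm.pdf(x, m_k, s_k) for p, m_k, s_k in zip(prior, m, s)]
    # Bayes error of the 1-D problem: integral of the smaller weighted density
    return np.minimum(*dens).sum() * (x[1] - x[0])

res = minimize(misclassification, x0=np.ones(3), method="Nelder-Mead")
w_opt = res.x / np.linalg.norm(res.x)
print(w_opt, misclassification(res.x))
```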
2012-03-28
Comberiate, Joseph M.
... bubble climatology. A tomographic reconstruction technique was modified and applied to SSUSI data to reconstruct three-dimensional cubes of ionospheric electron density. These data cubes allowed for 3-D imaging of
"Take the Volume Pledge" may result in disparity in access to care.
Blanco, Barbara A; Kothari, Anai N; Blackwell, Robert H; Brownlee, Sarah A; Yau, Ryan M; Attisha, John P; Ezure, Yoshiki; Pappas, Sam; Kuo, Paul C; Abood, Gerard J
2017-03-01
"Take the Volume Pledge" proposes restricting pancreatectomies to hospitals that perform ≥20 per year. Our purpose was to identify those factors that characterize patients at risk for loss of access to pancreatic cancer care with enforcement of volume standards. Using the Healthcare Cost and Utilization Project State Inpatient Database from Florida, we identified patients who underwent pancreatectomy for pancreatic malignancy from 2007-2011. American Hospital Association and United States Census Bureau data were linked to patient-level data. High-volume hospitals were defined as performing ≥20 pancreatic resections per year. Univariable and multivariable statistics compared patient characteristics and utilization of high-volume hospitals. Classification and Regression Tree modeling was used to predict patients at risk for losing access to care. Our study included 1,663 patients. Five high-volume hospitals were identified, and they treated 1,056 (63.5%) patients. Patients residing far from high-volume hospitals, in areas with the highest population density, non-Caucasian ethnicity, and greater income had decreased odds of obtaining care at high-volume hospitals. Using these factors, we developed a Classification and Regression Tree-based predictive tool to identify these patients. Implementation of "Take the Volume Pledge" is an important step toward improving pancreatectomy outcomes; however, policymakers must consider the potential impact on limiting access and possible health disparities that may arise. Copyright © 2016 Elsevier Inc. All rights reserved.
Shelton, Rachel C; Dunston, Sheba King; Leoce, Nicole; Jandorf, Lina; Thompson, Hayley S; Crookes, Danielle M; Erwin, Deborah O
2016-03-22
Lay health advisor (LHA) programs are increasingly being implemented in the USA and globally in the context of health promotion and disease prevention. LHAs are effective in addressing health disparities when used to reach medically underserved populations, with strong evidence among African American and Hispanic women. Despite their success and the evidence supporting implementation of LHA programs in community settings, there are tremendous barriers to sustaining LHA programs and little is understood about their implementation and sustainability in "real-world" settings. The purpose of this study was to (1) propose a conceptual framework to investigate factors at individual, social, and organizational levels that impact LHA activity and retention; and (2) use prospective data to investigate the individual, social, and organizational factors that predict activity level and retention among a community-based sample of African American LHAs participating in an effective, evidence-based LHA program (National Witness Project; NWP). Seventy-six LHAs were recruited from eight NWP sites across the USA. Baseline predictor data was collected from LHAs during a telephone questionnaire administered between 2010 and 2011. Outcome data on LHA participation and program activity levels were collected in the fall of 2012 from NWP program directors. Chi-square and ANOVA tests were used to identify differences between retained and completely inactive LHAs, and LHAs with high/moderate vs. low/no activity levels. Multivariable logistic regression models were conducted to identify variables that predicted LHA retention and activity levels. In multivariable models, LHAs based at sites with academic partnerships had increased odds of retention and high/moderate activity levels, even after adjusting for baseline LHA activity level. Higher religiosity among LHAs was associated with decreased odds of being highly/moderately active. LHA role clarity and self-efficacy were associated with retention and high/moderate activity in multivariable models unadjusted for baseline LHA activity level. Organizational and role-related factors are critical in influencing the retention and activity levels of LHAs. Developing and fostering partnerships with academic institutions will be important strategies to promote successful implementation and sustainability of LHA programs. Clarifying role expectations and building self-efficacy during LHA recruitment and training should be further explored to promote LHA retention and participation.
Nonparametric analysis of Minnesota spruce and aspen tree data and LANDSAT data
NASA Technical Reports Server (NTRS)
Scott, D. W.; Jee, R.
1984-01-01
The application of nonparametric methods to data-intensive problems faced by NASA is described. The theoretical development of efficient multivariate density estimators and the novel use of color graphics workstations are reviewed. The use of nonparametric density estimates for data representation and for Bayesian classification is described and illustrated. Progress in building a data analysis system in a workstation environment is reviewed and preliminary runs are presented.
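A minimal sketch of the density-estimation-plus-Bayes-classification idea, using kernel density estimates in place of the report's specific estimators; the two-band "spruce/aspen" data and equal priors are synthetic assumptions.

```python
# Hedged sketch: nonparametric class-conditional densities feeding a Bayes classifier.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# training pixels: rows are bands, columns are observations
spruce = rng.multivariate_normal([2.0, 5.0], [[0.5, 0.1], [0.1, 0.4]], 300).T
aspen = rng.multivariate_normal([3.5, 6.5], [[0.6, -0.1], [-0.1, 0.5]], 300).T

kde = {"spruce": gaussian_kde(spruce), "aspen": gaussian_kde(aspen)}
prior = {"spruce": 0.5, "aspen": 0.5}

def classify(points):
    # Bayes rule with nonparametric class-conditional densities;
    # points: rows are bands, columns are pixels to classify
    post = {k: prior[k] * kde[k](points) for k in kde}
    return np.where(post["spruce"] >= post["aspen"], "spruce", "aspen")

query = np.array([[2.1, 3.4],     # band-1 values of two pixels
                  [5.2, 6.6]])    # band-2 values of the same pixels
print(classify(query))
```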
Multivariate spline methods in surface fitting
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr. (Principal Investigator); Schumaker, L. L.
1984-01-01
The use of spline functions in the development of classification algorithms is examined. In particular, a method is formulated for producing spline approximations to bivariate density functions, where the density function is described by a histogram of measurements. The resulting approximations are then incorporated into a Bayesian classification procedure for which the Bayes decision regions and the probability of misclassification are readily computed. Some preliminary numerical results are presented to illustrate the method.
Vorontsov, Mikhail; Weyrauch, Thomas; Lachinova, Svetlana; Gatz, Micah; Carhart, Gary
2012-07-15
Maximization of a projected laser beam's power density at a remotely located extended object (speckle target) can be achieved by using an adaptive optics (AO) technique based on sensing and optimization of the target-return speckle field's statistical characteristics, referred to here as speckle metrics (SM). SM AO was demonstrated in a target-in-the-loop coherent beam combining experiment using a bistatic laser beam projection system composed of a coherent fiber-array transmitter and a power-in-the-bucket receiver. SM sensing utilized a 50 MHz rate dithering of the projected beam that provided a stair-mode approximation of the outgoing combined beam's wavefront tip and tilt with subaperture piston phases. Fiber-integrated phase shifters were used for both the dithering and SM optimization with stochastic parallel gradient descent control.
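The phase control loop described above relies on stochastic parallel gradient descent (SPGD); the following is a hedged sketch of the basic SPGD update, with a surrogate coherent-combining metric standing in for the experiment's speckle metric, and toy piston errors for a hypothetical 7-element fiber array.

```python
# Hedged SPGD sketch: perturb all piston phases in parallel, measure the metric
# for +delta and -delta, and step the controls along the measured difference.
import numpy as np

rng = np.random.default_rng(0)
true_phase = rng.uniform(-np.pi, np.pi, 7)       # unknown piston errors (rad)

def J(u):
    # surrogate metric: coherent combining efficiency, |sum of unit phasors|^2 / N^2
    return np.abs(np.exp(1j * (u - true_phase)).sum()) ** 2 / u.size ** 2

u = np.zeros(7)
gain, delta = 1.5, 0.1
for _ in range(3000):
    d = delta * rng.choice([-1.0, 1.0], size=u.size)   # parallel random perturbation
    dJ = J(u + d) - J(u - d)
    u = u + gain * dJ * d                              # gradient-ascent style update
print(f"metric after SPGD: {J(u):.3f} (1.0 = perfect phasing)")
```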
de Almeida, Valber Elias; de Araújo Gomes, Adriano; de Sousa Fernandes, David Douglas; Goicoechea, Héctor Casimiro; Galvão, Roberto Kawakami Harrop; Araújo, Mario Cesar Ugulino
2018-05-01
This paper proposes a new variable selection method for nonlinear multivariate calibration, combining the Successive Projections Algorithm for interval selection (iSPA) with the Kernel Partial Least Squares (Kernel-PLS) modelling technique. The proposed iSPA-Kernel-PLS algorithm is employed in a case study involving a Vis-NIR spectrometric dataset with complex nonlinear features. The analytical problem consists of determining Brix and sucrose content in samples from a sugar production system, on the basis of transflectance spectra. As compared to full-spectrum Kernel-PLS, the iSPA-Kernel-PLS models involve a smaller number of variables and display statistically significant superiority in terms of accuracy and/or bias in the predictions. Published by Elsevier B.V.
Intercohort density dependence drives brown trout habitat selection
NASA Astrophysics Data System (ADS)
Ayllón, Daniel; Nicola, Graciela G.; Parra, Irene; Elvira, Benigno; Almodóvar, Ana
2013-01-01
Habitat selection can be viewed as an emergent property of the quality and availability of habitat, but also of the number of individuals and the way they compete for its use. Consequently, habitat selection can change across years due to fluctuating resources or to changes in population numbers. However, predictive models of habitat selection often do not account for ecological dynamics, especially density-dependent processes. In stage-structured populations, the strength of density-dependent interactions between individuals of different age classes can exert a profound influence on population trajectories and evolutionary processes. In this study, we aimed to assess the effects of fluctuating densities of both older and younger competing life stages on the habitat selection patterns (described as univariate and multivariate resource selection functions) of young-of-the-year, juvenile, and adult brown trout Salmo trutta. We observed that all age classes were selective in habitat choice but changed their selection patterns across years consistently with variations in the densities of older, but not of younger, age classes. Trout of a given age increased selectivity for positions highly selected by older individuals when the density of those older individuals decreased, but this pattern did not hold when the density of younger age classes varied. This suggests that younger individuals are dominated by older ones but can expand their range of selected habitats when the density of competitors decreases, while older trout do not seem to consider the density of younger individuals when distributing themselves, even though younger individuals can negatively affect their final performance. Since these results may entail critical implications for conservation and management practices based on habitat selection models, further research should involve a wider range of river typologies and/or longer time frames to fully understand the patterns of, and the mechanisms underlying, the operation of density dependence on brown trout habitat selection.
Serum osteoprotegerin levels and mammographic density among high-risk women.
Moran, Olivia; Zaman, Tasnim; Eisen, Andrea; Demsky, Rochelle; Blackmore, Kristina; Knight, Julia A; Elser, Christine; Ginsburg, Ophira; Zbuk, Kevin; Yaffe, Martin; Narod, Steven A; Salmena, Leonardo; Kotsopoulos, Joanne
2018-06-01
Mammographic density is a risk factor for breast cancer but the mechanism behind this association is unclear. The receptor activator of nuclear factor κB (RANK)/RANK ligand (RANKL) pathway has been implicated in the development of breast cancer. Given the role of RANK signaling in mammary epithelial cell proliferation, we hypothesized this pathway may also be associated with mammographic density. Osteoprotegerin (OPG), a decoy receptor for RANKL, is known to inhibit RANK signaling. Thus, it is of interest to evaluate whether OPG levels modify breast cancer risk through mammographic density. We quantified serum OPG levels in 57 premenopausal and 43 postmenopausal women using an enzyme-linked immunosorbent assay (ELISA). Cumulus was used to measure percent density, dense area, and non-dense area for each mammographic image. Subjects were classified into high versus low OPG levels based on the median serum OPG level in the entire cohort (115.1 pg/mL). Multivariate models were used to assess the relationship between serum OPG levels and the measures of mammographic density. Serum OPG levels were not associated with mammographic density among premenopausal women (P ≥ 0.42). Among postmenopausal women, those with low serum OPG levels had higher mean percent mammographic density (20.9% vs. 13.7%; P = 0.04) and mean dense area (23.4 cm² vs. 15.2 cm²; P = 0.02) compared to those with high serum OPG levels after covariate adjustment. These findings suggest that low OPG levels may be associated with high mammographic density, particularly in postmenopausal women. Targeting RANK signaling may represent a plausible, non-surgical prevention option for high-risk women with high mammographic density, especially those with low circulating OPG levels.
Kuselman, Ilya; Pennecchi, Francesca R; da Silva, Ricardo J N B; Hibbert, D Brynn
2017-11-01
The probability of a false decision on conformity of a multicomponent material due to measurement uncertainty is discussed for the case of correlated test results. Specification limits on the components' content of such a material generate a multivariate specification interval/domain. When true values of the components' content and the corresponding test results are modelled by multivariate distributions (e.g. multivariate normal distributions), a total global risk of a false decision on the material's conformity can be evaluated by integrating their joint probability density function. No transformation of the raw data is required for that. A total specific risk can be evaluated as the joint posterior cumulative probability that the true values of a specific batch or lot lie outside the multivariate specification domain when the vector of test results obtained for the lot is inside this domain. It was shown, using a case study of four components under control in a drug, that the influence of correlation on the risk value is not easily predictable. To assess this influence, the evaluated total risk values were compared with those calculated for independent test results and also with those assuming much stronger correlation than that observed. While the observed statistically significant correlation did not lead to a visible difference in the total risk values compared with independent test results, stronger correlation among the variables caused the total risk either to decrease or to increase, depending on the actual values of the test results. Copyright © 2017 Elsevier B.V. All rights reserved.
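As a concrete illustration of the kind of integral involved, the following hedged Monte Carlo sketch estimates a global consumer-type risk, the probability that the true contents of two correlated components fall outside a specification domain while the (also correlated) test results fall inside it. All means, covariances, and limits are toy values, not the drug case study of the paper.

```python
# Hedged Monte Carlo sketch of a global consumer-type conformity risk.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
spec_lo, spec_hi = np.array([95.0, 4.5]), np.array([105.0, 5.5])   # illustrative limits

proc_mean = np.array([100.0, 5.0])
proc_cov = np.array([[4.0, 0.8], [0.8, 0.09]])     # correlated true contents
meas_cov = np.array([[1.0, 0.15], [0.15, 0.04]])   # correlated measurement errors

true = rng.multivariate_normal(proc_mean, proc_cov, n)
measured = true + rng.multivariate_normal(np.zeros(2), meas_cov, n)

inside = lambda x: np.all((x >= spec_lo) & (x <= spec_hi), axis=1)
risk = np.mean(inside(measured) & ~inside(true))   # accepted although truly nonconforming
print(f"global consumer-type risk ~= {risk:.4f}")
```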
Wang, Yingfeng; Man, Hongxue; Gao, Jian; Liu, Xinfeng; Ren, Xiaolei; Chen, Jianxin; Zhang, Jiayu; Gao, Kuo; Li, Zhongfeng; Zhao, Baosheng
2016-09-01
Lang-du (LD) has traditionally been used to treat human diseases in China. In this study, plasma metabolic profiling based on LC-MS was applied to elucidate the toxicity induced in rats by an injected ethanol extract of LD. The LD extract was given by intraperitoneal injection at doses of 0.1, 0.05, 0.025, and 0 g kg⁻¹ body weight per day. The blood biochemical levels of alanine aminotransferase, direct bilirubin, creatinine, serum β2-microglobulin, and low-density lipoprotein increased in LD-injected rats, and the levels of total protein and albumin decreased in these groups. The metabolic profiles of the samples were analyzed by multivariate statistical analysis, including principal component analysis, partial least squares discriminant analysis, and orthogonal projection to latent structures discriminant analysis (OPLS-DA). The metabolic characteristics of rats injected with LD were perturbed in a dose-dependent manner. By OPLS-DA, 18 metabolites served as potential toxicity biomarkers. Moreover, LD treatment resulted in an increase in p-cresol, p-cresol sulfate, lysophosphatidylethanolamine (LPE) (18:0), LPE (16:0), lysophosphatidylcholine (16:0), and 12-HETE concentrations, and a decrease in hippuric acid, cholic acid, and N-acetyl-l-phenylalanine. These results suggest that chronic exposure to LD can disturb lipid and amino acid metabolism, among other pathways. An analysis of the metabolic profiles can therefore contribute to a better understanding of the adverse effects of LD. Copyright © 2016 John Wiley & Sons, Ltd.
Westerholm, R; Egebäck, K E
1994-01-01
This paper presents results from the characterization of vehicle exhaust that were obtained primarily within the Swedish Urban Air Project, "Tätortsprojektet." Exhaust emissions from both gasoline- and diesel-fueled vehicles have been investigated with respect to regulated pollutants (carbon monoxide [CO], hydrocarbon [HC], nitrogen oxides [NOx], and particulate), unregulated pollutants, and in bioassay tests (Ames test, TCDD receptor affinity tests). Unregulated pollutants present in both the particle- and the semi-volatile phases were characterized. Special interest was focused on the impact of fuel composition on heavy-duty diesel vehicle emissions. It was confirmed that there exists a quantifiable relationship between diesel-fuel variables of the fuel blends, the chemical composition of the emissions, and their biological effects. According to the results from the multivariate analysis, the most important fuel parameters are: polycyclic aromatic hydrocarbons (PAH) content, 90% distillation point, final boiling point, specific heat, aromatic content, density, and sulfur content. PMID:7529699
Rice, Megan S; Tworoger, Shelley S; Bertrand, Kimberly A; Hankinson, Susan E; Rosner, Bernard A; Feeney, Yvonne B; Clevenger, Charles V; Tamimi, Rulla M
2015-01-01
Higher circulating prolactin levels have been associated with higher percent mammographic density among postmenopausal women in some, but not all, studies. However, few studies have examined associations with dense area and non-dense area of the breast, or considered associations with prolactin levels measured by the Nb2 lymphoma cell bioassay. We conducted a cross-sectional study among 1,124 premenopausal and 890 postmenopausal women who were controls in breast cancer case-control studies nested in the Nurses' Health Study (NHS) and NHSII. Participants provided blood samples in 1989-1990 (NHS) or 1996-1999 (NHSII), and mammograms were obtained from around the time of blood draw. Multivariable linear models were used to assess the associations of prolactin levels (measured by immunoassay or bioassay) with percent density, dense area, and non-dense area. Among 1,124 premenopausal women, percent density, dense area, and non-dense area were not associated with prolactin immunoassay levels in multivariable models (p trends = 0.10, 0.18, and 0.69, respectively). Among 890 postmenopausal women, those with prolactin immunoassay levels in the highest versus lowest quartile had modestly, though significantly, higher percent density (difference = 3.01 percentage points, 95% CI 0.22, 5.80) as well as lower non-dense area (p trend = 0.02). Among women with both immunoassay and bioassay levels, there were no consistent differences in the associations with percent density between bioassay and immunoassay levels. Postmenopausal women with prolactin immunoassay levels in the highest quartile had significantly higher percent density as well as lower non-dense area compared to those in the lowest quartile. Future studies should examine the underlying biologic mechanisms, particularly for non-dense area.
Assessment of benthic changes during 20 years of monitoring the Mexican Salina Cruz Bay.
González-Macías, C; Schifter, I; Lluch-Cota, D B; Méndez-Rodríguez, L; Hernández-Vázquez, S
2009-02-01
In this work, a non-parametric multivariate analysis was used to assess the impact of metals and organic compounds on the macro-infaunal component of the benthic mollusk community, using surface sediment data from several monitoring programs collected over 20 years in Salina Cruz Bay, Mexico. The data on benthic mollusk community characteristics (richness, abundance, and diversity) were linked to multivariate environmental patterns using the Alternating Conditional Expectations method to correlate the biological measurements of the mollusk community with the physicochemical properties of water and sediments. Mollusk community variation is related to environmental characteristics as well as to lead content. Surface deposit feeders are increasing in relative density, while subsurface deposit feeders are decreasing over time; the latter are expected to be more closely associated with the sediment and therefore more affected by its quality. However, predatory carnivorous gastropods as well as chemosymbiotic deposit-feeding bivalves have maintained their relative densities over time.
ERIC Educational Resources Information Center
Polanin, Joshua R.; Wilson, Sandra Jo
2014-01-01
The purpose of this project is to demonstrate the practical methods developed to utilize a dataset consisting of both multivariate and multilevel effect size data. The context for this project is a large-scale meta-analytic review of the predictors of academic achievement. This project is guided by three primary research questions: (1) How do we…
Geometrical Representations in the Learning of Two-Variable Functions
ERIC Educational Resources Information Center
Trigueros, Maria; Martinez-Planell, Rafael
2010-01-01
This study is part of a project concerned with the analysis of how students work with two-variable functions. This is of fundamental importance given the role of multivariable functions in mathematics and its applications. The portion of the project we report here concentrates on investigating the relationship between students' notion of subsets…
Enhancements to the Tonge-Ramesh Ceramic Failure Model for Use in Eulerian Simulations
2016-09-14
ability to project an arbitrary trial stress (σtr) onto the quasi-static yield surface (providing the value for σqs). Once the projection onto the quasi ... Model Evaluation Methods: Geometry from Prior Experiments. There are experimental data from 2 research groups on penetration of confined boron carbide by high-density, long-rod projectiles [21, 22]. Based on these prior experiments, the following 3 experimental geometries were identified to test the
Psychocentricity and participant profiles: implications for lexical processing among multilinguals
Libben, Gary; Curtiss, Kaitlin; Weber, Silke
2014-01-01
Lexical processing among bilinguals is often affected by complex patterns of individual experience. In this paper we discuss the psychocentric perspective on language representation and processing, which highlights the centrality of individual experience in psycholinguistic experimentation. We discuss applications to the investigation of lexical processing among multilinguals and explore the advantages of using high-density experiments with multilinguals. High-density experiments are designed to co-index measures of lexical perception and production, as well as participant profiles. We discuss the challenges associated with the characterization of participant profiles and present a new data visualization technique that we term Facial Profiles. This technique is based on Chernoff faces, developed over 40 years ago. The Facial Profile technique seeks to overcome some of the challenges associated with the use of Chernoff faces, while maintaining the core insight that recoding multivariate data as facial features can engage the human face recognition system and thus enhance our ability to detect and interpret patterns within multivariate datasets. We demonstrate that Facial Profiles can code participant characteristics in lexical processing studies by recoding variables such as reading ability, speaking ability, and listening ability into iconically related relative sizes of the eye, mouth, and ear, respectively. The balance of ability in bilinguals can be captured by creating composite facial profiles, or Janus Facial Profiles. We demonstrate the use of Facial Profiles and Janus Facial Profiles in the characterization of participant effects in the study of lexical perception and production. PMID:25071614
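A minimal sketch of the Facial Profile idea: recode three ability scores (0-1) as the sizes of the eye, mouth, and ear of a schematic face. The drawing geometry, score names, and single eye/ear are illustrative assumptions, not the authors' plotting code.

```python
# Hedged sketch: Chernoff-style "facial profiles" with matplotlib patches.
import matplotlib.pyplot as plt
from matplotlib.patches import Circle, Ellipse

def facial_profile(ax, reading, speaking, listening, title=""):
    ax.add_patch(Circle((0.5, 0.5), 0.4, fill=False, lw=2))                       # head outline
    ax.add_patch(Circle((0.38, 0.62), 0.03 + 0.07 * reading, color="k"))           # eye size ~ reading
    ax.add_patch(Ellipse((0.5, 0.35), 0.1 + 0.2 * speaking, 0.05, color="k"))      # mouth width ~ speaking
    ax.add_patch(Ellipse((0.12, 0.5), 0.05, 0.1 + 0.2 * listening, color="k"))     # ear height ~ listening
    ax.set(xlim=(0, 1), ylim=(0, 1), aspect="equal", title=title)
    ax.axis("off")

fig, axes = plt.subplots(1, 2, figsize=(6, 3))
facial_profile(axes[0], reading=0.9, speaking=0.4, listening=0.6, title="Language 1")
facial_profile(axes[1], reading=0.5, speaking=0.8, listening=0.7, title="Language 2")
plt.show()
```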
Michalareas, George; Schoffelen, Jan-Mathijs; Paterson, Gavin; Gross, Joachim
2013-01-01
In this work, we investigate the feasibility of estimating causal interactions between brain regions based on multivariate autoregressive (MAR) models fitted to magnetoencephalographic (MEG) sensor measurements. We first demonstrate the theoretical feasibility of estimating source-level causal interactions after projection of the sensor-level model coefficients onto the locations of the neural sources. Next, we show with simulated MEG data that causality, as measured by partial directed coherence (PDC), can be correctly reconstructed if the locations of the interacting brain areas are known. We further demonstrate that, if a very large number of brain voxels is considered as potential activation sources, PDC as a measure to reconstruct causal interactions is less accurate; in such cases the MAR model coefficients alone contain meaningful causality information. The proposed method overcomes the problems of model non-robustness and long computation times encountered during causality analysis by existing methods, which first project MEG sensor time series onto a large number of brain locations and then build the MAR model on this large number of source-level time series. Instead, we demonstrate that by building the MAR model at the sensor level and then projecting only the MAR coefficients into source space, the true causal pathways are recovered even when a very large number of locations are considered as sources. The main contribution of this work is that, with this methodology, entire-brain causality maps can be derived efficiently without any a priori selection of regions of interest. Hum Brain Mapp, 2013. © 2012 Wiley Periodicals, Inc. PMID:22328419
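For readers unfamiliar with the metric, partial directed coherence can be computed directly from the MAR coefficient matrices; the hedged sketch below does so for a toy bivariate VAR(2) in which channel 0 drives channel 1 but not vice versa (coefficients and frequency grid are illustrative).

```python
# Hedged sketch: PDC from VAR coefficient matrices A_1..A_p.
import numpy as np

A = [np.array([[0.5, 0.0],          # lag-1 coefficients
               [0.4, 0.5]]),
     np.array([[-0.2, 0.0],         # lag-2 coefficients
               [0.3, -0.1]])]

def pdc(A, n_freqs=128):
    d = A[0].shape[0]
    freqs = np.linspace(0, 0.5, n_freqs)                  # normalized frequency
    out = np.empty((n_freqs, d, d))
    for fi, f in enumerate(freqs):
        Abar = np.eye(d, dtype=complex)                   # Abar(f) = I - sum_r A_r e^{-i 2 pi f r}
        for r, Ar in enumerate(A, start=1):
            Abar -= Ar * np.exp(-2j * np.pi * f * r)
        denom = np.sqrt((np.abs(Abar) ** 2).sum(axis=0))  # column-wise normalization
        out[fi] = np.abs(Abar) / denom                    # PDC from column j to row i
    return freqs, out

freqs, P = pdc(A)
print("mean PDC 0->1:", P[:, 1, 0].mean(), " mean PDC 1->0:", P[:, 0, 1].mean())
```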
NASA Technical Reports Server (NTRS)
Bar-Cohen, Y.; Bhattacharya, K.
2003-01-01
The objective of the project was to develop a versatile electroactuator based on a specific class of EAP, conductive polymer, that is capable of developing high forces and displacements in both bending and linear contraction/expansion movements.
NASA Technical Reports Server (NTRS)
Nisbet, John S.; Barnard, Theresa A.; Forbes, Gregory S.; Krider, E. Philip; Lhermitte, Roger
1990-01-01
The data obtained at the time of the Thunderstorm Research International Project storm at the Kennedy Space Center on July 11, 1978 are analyzed in a model-independent manner. The data base included data from three Doppler radars, a lightning detection and ranging system and a network of 25 electric field mills, and rain gages. Electric field measurements were used to analyze the charge moments transferred by lightning flashes, and the data were fitted to Weibull distributions; these were used to estimate statistical parameters of the lightning for both intracloud and cloud-to-ground flashes and to estimate the fraction of the flashes which were below the observation threshold. The displacement and the conduction current densities were calculated from electric field measurements between flashes. These values were used to derive the magnitudes and the locations of dipole and monopole generators by least squares fitting the measured Maxwell current densities to the displacement-dominated equations.
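The Weibull-fitting step described above can be illustrated with a short sketch: fit a Weibull distribution to charge-moment magnitudes and estimate the fraction of flashes below a detection threshold. The synthetic data, parameter values, and threshold are illustrative assumptions, not the 1978 storm measurements.

```python
# Hedged sketch: Weibull fit to charge moments and fraction below an observation threshold.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
charge_moment = weibull_min.rvs(c=1.4, scale=60.0, size=500, random_state=rng)  # C*km, toy data

# fit shape (c) and scale with location fixed at zero
c_hat, loc, scale_hat = weibull_min.fit(charge_moment, floc=0.0)
threshold = 10.0                                  # assumed observation threshold (C*km)
frac_missed = weibull_min.cdf(threshold, c_hat, loc=0.0, scale=scale_hat)
print(f"shape={c_hat:.2f}, scale={scale_hat:.1f}, fraction below threshold={frac_missed:.3f}")
```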
Effective model hierarchies for dynamic and static classical density functional theories
NASA Astrophysics Data System (ADS)
Majaniemi, S.; Provatas, N.; Nonomura, M.
2010-09-01
The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.
Energy Navigation: Simulation Evaluation and Benefit Analysis
NASA Technical Reports Server (NTRS)
Williams, David H.; Oseguera-Lohr, Rosa M.; Lewis, Elliot T.
2011-01-01
This paper presents results from two simulation studies investigating the use of advanced flight-deck-based energy navigation (ENAV) and conventional transport-category vertical navigation (VNAV) for conducting a descent through a busy terminal area, using Continuous Descent Arrival (CDA) procedures. This research was part of the Low Noise Flight Procedures (LNFP) element within the Quiet Aircraft Technology (QAT) Project, and the subsequent Airspace Super Density Operations (ASDO) research focus area of the Airspace Project. A piloted simulation study addressed development of flight guidance, and supporting pilot and Air Traffic Control (ATC) procedures for high density terminal operations. The procedures and charts were designed to be easy to understand, and to make it easy for the crew to make changes via the Flight Management Computer Control-Display Unit (FMC-CDU) to accommodate changes from ATC.
Understanding global climate change scenarios through bioclimate stratification
NASA Astrophysics Data System (ADS)
Soteriades, A. D.; Murray-Rust, D.; Trabucco, A.; Metzger, M. J.
2017-08-01
Despite progress in impact modelling, communicating and understanding the implications of climatic change projections is challenging due to inherent complexity and a cascade of uncertainty. In this letter, we present an alternative representation of global climate change projections based on shifts in 125 multivariate strata characterized by relatively homogeneous climate. These strata form climate analogues that help in the interpretation of climate change impacts. A Random Forests classifier was calculated and applied to 63 Coupled Model Intercomparison Project Phase 5 climate scenarios at 5 arcmin resolution. Results demonstrate how shifting bioclimate strata can summarize future environmental changes and form a middle ground, conveniently integrating current knowledge of climate change impact with the interpretation advantages of categorical data but with a level of detail that resembles a continuous surface at global and regional scales. Both the agreement in major change and differences between climate change projections are visually combined, facilitating the interpretation of complex uncertainty. By making the data and the classifier available we provide a climate service that helps facilitate communication and provide new insight into the consequences of climate change.
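The classification step can be sketched as follows: train a Random Forests classifier to assign grid cells to bioclimate strata from climatic predictors, then apply it to shifted "future" values. The strata labels, predictor fields, and warming offsets below are synthetic stand-ins, not the CMIP5 data or the 125-stratum stratification used in the study.

```python
# Hedged sketch: Random Forests assignment of grid cells to bioclimate strata.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000
X_now = np.column_stack([
    rng.normal(10, 8, n),      # mean annual temperature (degC)
    rng.normal(900, 400, n),   # annual precipitation (mm)
    rng.normal(20, 10, n),     # seasonality index
])
strata = np.digitize(X_now[:, 0], bins=[0, 8, 16, 24])    # 5 crude "strata" from temperature

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_now, strata)

X_future = X_now + np.array([2.5, -50.0, 1.0])            # a simple warming scenario
shifted = (clf.predict(X_future) != strata).mean()
print(f"fraction of cells changing stratum under the scenario: {shifted:.2f}")
```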
Charles Essien; Brian K. Via; Qingzheng Cheng; Thomas Gallagher; Timothy McDonald; Xiping Wang; Lori G. Eckhardt
2017-01-01
The polymeric angle and concentration within the S2 layer of the softwood fiber cell wall are very critical for molecular and microscopic properties that influence strength, stiffness and acoustic velocity of wood at the macroscopic level. The main objective of this study was to elucidate the effect of cellulose, hemicellulose, lignin, microfibril angle and density on...
Lee, Joseph G L; Sun, Dennis L; Schleicher, Nina M; Ribisl, Kurt M; Luke, Douglas A; Henriksen, Lisa
2017-05-01
Evidence of racial/ethnic inequalities in tobacco outlet density is limited by: (1) reliance on studies from single counties or states, (2) limited attention to spatial dependence, and (3) an unclear theory-based relationship between neighbourhood composition and tobacco outlet density. In 97 counties from the contiguous USA, we calculated the 2012 density of likely tobacco outlets (N=90,407), defined as tobacco outlets per 1000 population in census tracts (n=17,667). We used 2 spatial regression techniques: (1) a spatial errors approach in GeoDa software, and (2) fitting a covariance function to the errors using a distance matrix of all tract centroids. We examined density as a function of race, ethnicity, income and 2 indicators identified from the city planning literature to indicate neighbourhood stability (vacant housing, renter-occupied housing). The average density was 1.3 tobacco outlets per 1000 persons. Both spatial regression approaches yielded similar results. In unadjusted models, tobacco outlet density was positively associated with the proportion of black residents and negatively associated with the proportion of Asian residents, white residents and median household income. There was no association with the proportion of Hispanic residents. Indicators of neighbourhood stability explained the disproportionate density associated with black residential composition, but inequalities by income persisted in multivariable models. Data from a large sample of US counties and results from 2 techniques to address spatial dependence strengthen evidence of inequalities in tobacco outlet density by race and income. Further research is needed to understand the underlying mechanisms in order to strengthen interventions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
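The second spatial approach mentioned above (fitting a covariance function to the errors from a distance matrix of centroids) can be sketched as generalized least squares with a distance-based covariance; coordinates, covariates, and the exponential range/sill/nugget parameters below are synthetic assumptions, not the tract data used in the study.

```python
# Hedged sketch: GLS with an exponential error covariance built from centroid distances.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
coords = rng.uniform(0, 50, size=(n, 2))                    # tract centroids (km)
X = sm.add_constant(rng.normal(size=(n, 2)))                # two illustrative tract covariates
beta = np.array([1.3, 0.4, -0.3])

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
cov = 0.5 * np.exp(-d / 10.0) + 0.1 * np.eye(n)             # exponential covariance + nugget
y = X @ beta + rng.multivariate_normal(np.zeros(n), cov)    # spatially correlated errors

gls = sm.GLS(y, X, sigma=cov).fit()                         # outlet density ~ covariates
print(gls.params, gls.bse)
```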
Bone mineral density in subjects using central nervous system-active medications.
Kinjo, Mitsuyo; Setoguchi, Soko; Schneeweiss, Sebastian; Solomon, Daniel H
2005-12-01
Decreased bone mineral density defines osteoporosis according to the World Health Organization and is an important predictor of future fractures. The use of several types of central nervous system-active drugs, including benzodiazepines, anticonvulsants, antidepressants, and opioids, has been associated with an increased risk of fracture. However, it is unclear whether this increase in risk is related to an effect on bone mineral density or to other factors, such as an increased risk of falls. We sought to examine the relationship between bone mineral density and the use of benzodiazepines, anticonvulsants, antidepressants, and opioids in a representative US population-based sample. We analyzed data on adults aged 17 years and older from the Third National Health and Nutrition Examination Survey (NHANES III, 1988-1994). Total femoral bone mineral density of 7,114 male and 7,532 female participants was measured by dual-energy x-ray absorptiometry. Multivariable linear regression models were used to quantify the relation between central nervous system medication exposure and total femoral bone mineral density. Models controlled for relevant covariates, including age, sex, and body mass index. In linear regression models, significantly reduced bone mineral density was found in subjects taking anticonvulsants (0.92 g/cm²; 95% confidence interval [CI]: 0.89 to 0.94) and opioids (0.92 g/cm²; 95% CI: 0.88 to 0.95) compared with nonusers (0.95 g/cm²; 95% CI: 0.95 to 0.95) after adjusting for several potential confounders. The other central nervous system-active drugs, benzodiazepines and antidepressants, were not associated with significantly reduced bone mineral density. In this cross-sectional analysis of NHANES III, anticonvulsants and opioids (but not benzodiazepines or antidepressants) were associated with significantly reduced bone mineral density. These findings have implications for fracture-prevention strategies.
Insausti, Matías; Gomes, Adriano A; Cruz, Fernanda V; Pistonesi, Marcelo F; Araujo, Mario C U; Galvão, Roberto K H; Pereira, Claudete F; Band, Beatriz S F
2012-08-15
This paper investigates the use of UV-vis, near infrared (NIR) and synchronous fluorescence (SF) spectrometries coupled with multivariate classification methods to discriminate biodiesel samples with respect to the base oil employed in their production. More specifically, the present work extends previous studies by investigating the discrimination of corn-based biodiesel from two other biodiesel types (sunflower and soybean). Two classification methods are compared, namely full-spectrum SIMCA (soft independent modelling of class analogies) and SPA-LDA (linear discriminant analysis with variables selected by the successive projections algorithm). Regardless of the spectrometric technique employed, full-spectrum SIMCA did not provide an appropriate discrimination of the three biodiesel types. In contrast, all samples were correctly classified on the basis of a reduced number of wavelengths selected by SPA-LDA. It can be concluded that UV-vis, NIR and SF spectrometries can be successfully employed to discriminate corn-based biodiesel from the two other biodiesel types, but wavelength selection by SPA-LDA is key to the proper separation of the classes. Copyright © 2012 Elsevier B.V. All rights reserved.
Riess, Helene; Clowes, Petra; Kroidl, Inge; Kowuor, Dickens O.; Nsojo, Anthony; Mangu, Chacha; Schüle, Steffen A.; Mansmann, Ulrich; Geldmacher, Christof; Mhina, Seif; Maboko, Leonard; Hoelscher, Michael; Saathoff, Elmar
2013-01-01
Background Hookworm disease is one of the most common infections and a cause of high disease burden in the tropics and subtropics. Remotely sensed ecological data and model-based geostatistics have been used recently to identify areas in need of hookworm control. Methodology Cross-sectional interview data and stool samples from 6,375 participants from nine different sites in Mbeya region, south-western Tanzania, were collected as part of a cohort study. Hookworm infection was assessed by microscopy of duplicate Kato-Katz thick smears from one stool sample from each participant. A geographic information system was used to obtain remotely sensed environmental data such as land surface temperature (LST), vegetation cover, rainfall, and elevation, and combine them with hookworm infection data and with socio-demographic and behavioral data. Uni- and multivariable logistic regression was performed on sites separately and on the pooled dataset. Principal Findings Univariable analyses yielded significant associations for all ecological variables. Five ecological variables stayed significant in the final multivariable model: population density (odds ratio (OR) = 0.68; 95% confidence interval (CI) = 0.63–0.73), mean annual vegetation density (OR = 0.11; 95% CI = 0.06–0.18), mean annual LST during the day (OR = 0.81; 95% CI = 0.75–0.88), mean annual LST during the night (OR = 1.54; 95% CI = 1.44–1.64), and latrine coverage in household surroundings (OR = 1.02; 95% CI = 1.01–1.04). Interaction terms revealed substantial differences in associations of hookworm infection with population density, mean annual enhanced vegetation index, and latrine coverage between the two sites with the highest prevalence of infection. Conclusion/Significance This study supports previous findings that remotely sensed data such as vegetation indices, LST, and elevation are strongly associated with hookworm prevalence. However, the results indicate that the influence of environmental conditions can differ substantially within a relatively small geographic area. The use of large-scale associations as a predictive tool on smaller scales is therefore problematic and should be handled with care. PMID:24040430
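The uni-/multivariable logistic regression step, with odds ratios obtained by exponentiating the fitted coefficients, can be sketched as follows; the covariate names and simulated data are hypothetical stand-ins for the study variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000

# Hypothetical participant-level data mimicking the covariates in the abstract.
df = pd.DataFrame({
    "pop_density": rng.normal(0, 1, n),       # standardized population density
    "vegetation": rng.normal(0, 1, n),        # mean annual vegetation density
    "lst_day": rng.normal(0, 1, n),           # land surface temperature, day
    "lst_night": rng.normal(0, 1, n),         # land surface temperature, night
    "latrine_cov": rng.normal(0, 1, n),       # latrine coverage in surroundings
})
logit_p = -1.0 - 0.4 * df["pop_density"] - 0.6 * df["vegetation"] + 0.4 * df["lst_night"]
df["hookworm"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariable logistic regression; odds ratios are the exponentiated coefficients.
fit = smf.logit("hookworm ~ pop_density + vegetation + lst_day + lst_night + latrine_cov",
                data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)
or_ci = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), or_ci], axis=1))
```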
Shen, P; Zhao, J; Sun, G; Chen, N; Zhang, X; Gui, H; Yang, Y; Liu, J; Shu, K; Wang, Z; Zeng, H
2017-05-01
The aim of this study was to develop nomograms for predicting prostate cancer and its zonal location using prostate-specific antigen density, prostate volume, and their zone-adjusted derivatives. A total of 928 consecutive patients with prostate-specific antigen (PSA) less than 20.0 ng/mL, who underwent transrectal ultrasound-guided transperineal 12-core prostate biopsy at West China Hospital between 2011 and 2014, were retrospectively enrolled. The patients were randomly split into a training cohort (70%, n = 650) and a validation cohort (30%, n = 278). Prediction models and the associated nomograms were built using the training cohort, while validation of the models was conducted using the validation cohort. Univariate and multivariate logistic regression was performed. Then, new nomograms were generated based on multivariate regression coefficients. The discrimination power and calibration of these nomograms were validated using the area under the ROC curve (AUC) and the calibration curve. The potential clinical effects of these models were also tested using decision curve analysis. In total, 285 (30.7%) patients were diagnosed with prostate cancer. Among them, 131 (14.1%) and 269 (29.0%) had transition zone prostate cancer and peripheral zone prostate cancer, respectively. Each of the zone-adjusted derivative-based nomograms had an AUC greater than 0.75. All nomograms showed good calibration and much better net benefit than the alternative scenarios in predicting patients with or without prostate cancer in different zones. Prostate-specific antigen density, prostate volume, and their zone-adjusted derivatives have important roles in detecting prostate cancer and its zonal location for patients with PSA 2.5-20.0 ng/mL. To the best of our knowledge, this is the first nomogram using these parameters to predict outcomes of 12-core prostate biopsy. These instruments can help clinicians to increase the accuracy of prostate cancer screening and to avoid unnecessary prostate biopsy. © 2017 American Society of Andrology and European Academy of Andrology.
2014-09-01
Statistical multivariate analysis was used to define the current and projected future range probability for species of interest to Army land managers. Cited reference: Buhlmann, Kurt A., Thomas S. B. Akre, John B. Iverson, Deno Karapatakis, Russell A., et al., Ecological Modelling, Volume 200, Issues 1–2, pp. 1–19. Figure 4 (caption only): RCW omission rate and predicted area as a function of the cumulative threshold.
Sousa, Eunice; Quintino, Victor; Palhas, Jael; Rodrigues, Ana Maria; Teixeira, José
2016-01-01
Ponds provide vital ecological services. They are biodiversity hotspots and important breeding sites for rare and endangered species, including amphibians and dragonflies. Nevertheless, their number is decreasing due to habitat degradation caused by human activities. The “Ponds with Life” environmental education project was developed to raise public awareness and engagement in the study of ponds by promoting direct contact between the public and nature, researchers and pedagogical hands-on exploration activities. A pre-/post-project survey was set up to assess the effects of the project on the environmental consciousness, knowledge and attitude changes towards ponds and the associated biodiversity of school students aged 15 to 18. The survey questions were based on Likert scales and their pre-post project comparisons used an innovative multivariate hypothesis testing approach. The results showed that the project improved the students’ knowledge and attitudes towards ponds and associated biodiversity, especially the amphibians. Ponds can be found or constructed in urban areas and, despite their small size, they proved to be interesting model habitats and living laboratories to foster environmental education, by encompassing a high number of species and a fast ecological succession. PMID:27148879
ARIANNA: A research environment for neuroimaging studies in autism spectrum disorders.
Retico, Alessandra; Arezzini, Silvia; Bosco, Paolo; Calderoni, Sara; Ciampa, Alberto; Coscetti, Simone; Cuomo, Stefano; De Santis, Luca; Fabiani, Dario; Fantacci, Maria Evelina; Giuliano, Alessia; Mazzoni, Enrico; Mercatali, Pietro; Miscali, Giovanni; Pardini, Massimiliano; Prosperi, Margherita; Romano, Francesco; Tamburini, Elena; Tosetti, Michela; Muratori, Filippo
2017-08-01
The complexity and heterogeneity of Autism Spectrum Disorders (ASD) require the implementation of dedicated analysis techniques to obtain the maximum from the interrelationships among the many variables that describe affected individuals, spanning from clinical phenotypic characterization and genetic profile to structural and functional brain images. The ARIANNA project has developed a collaborative interdisciplinary research environment that is easily accessible to the community of researchers working on ASD (https://arianna.pi.infn.it). The main goals of the project are: to analyze neuroimaging data acquired in multiple sites with multivariate approaches based on machine learning; to detect structural and functional brain characteristics that allow individuals with ASD to be distinguished from control subjects; to identify neuroimaging-based criteria to stratify the population with ASD to support the future development of personalized treatments. Secure data handling and storage are guaranteed within the project, as well as access to fast grid/cloud-based computational resources. This paper outlines the web-based architecture, the computing infrastructure and the collaborative analysis workflows at the basis of the ARIANNA interdisciplinary working environment. It also demonstrates the full functionality of the research platform. The availability of this innovative working environment for analyzing clinical and neuroimaging information of individuals with ASD is expected to support researchers in disentangling complex data, thus facilitating their interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sociodemographic Factors, Population Density, and Bicycling for Transportation in the United States.
Nehme, Eileen K; Pérez, Adriana; Ranjit, Nalini; Amick, Benjamin C; Kohl, Harold W
2016-01-01
Transportation bicycling is a behavior with demonstrated health benefits. Population-representative studies of transportation bicycling in the United States are lacking. This study examined associations between sociodemographic factors, population density, and transportation bicycling and described transportation bicyclists by trip purposes, using a US-representative sample. This cross-sectional study used 2009 National Household Travel Survey datasets. Associations among study variables were assessed using weighted multivariable logistic regression. On a typical day in 2009, 1% of Americans older than 5 years of age reported a transportation bicycling trip. Transportation cycling was inversely associated with age and directly with being male, with being white, and with population density (≥ 10,000 vs < 500 people/square mile: odds ratio, 2.78, 95% confidence interval, 1.54-5.05). Those whose highest level of education was a high school diploma or some college were least likely to bicycle for transportation. Twenty-one percent of transportation bicyclists reported trips to work, whereas 67% reported trips to social or other activities. Transportation bicycling in the United States is associated with sociodemographic characteristics and population density. Bicycles are used for a variety of trip purposes, which has implications for transportation bicycling research based on commuter data and for developing interventions to promote this behavior.
Another project used multivariate statistics to develop a novel device to non-invasively measure hydrogen. Cited reference (title fragment): "...Cellulosic Ethanol Production due to Experimental Measurement Uncertainty," Biotechnology for Biofuels.
The use of recycled concrete aggregate in a dense graded aggregate base course.
DOT National Transportation Integrated Search
2008-03-01
The research project was broken up into 2 different parts. The first part involved evaluating the potential use of Time Domain Reflectometry, TDR (ASTM D6780), as a non-nuclear means for determining the dry density and moisture content of gra...
Intharathirat, Rotchana; Abdul Salam, P; Kumar, S; Untong, Akarapong
2015-05-01
In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to produce reliable estimates using existing models because of the limited data available in developing countries. This study aims to forecast MSW collected in Thailand with prediction intervals over the long term by using an optimized multivariate grey model, a mathematical approach. For multivariate models, the representative factors of residential and commercial sectors affecting waste collected are identified, classified and quantified based on statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year from 43,435-44,994 tonnes per day in 2013 to 55,177-56,735 tonnes per day in 2030. This model also illustrates that population density is the most important factor affecting MSW collected, followed by urbanization, employment proportion and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collection more than those of the residential sector. Results can help decision makers develop long-term waste management measures and policies. Copyright © 2015 Elsevier Ltd. All rights reserved.
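The paper's model is the multivariate GMC(1, 5) with convolution integral; the sketch below shows only the univariate GM(1,1) building block of grey system theory, applied to an illustrative toy series rather than the study data.

```python
import numpy as np

def gm11_forecast(x0, n_ahead):
    """Univariate grey model GM(1,1): fit on series x0 and forecast n_ahead steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # mean sequence of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey development/control coefficients
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time response function
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])  # inverse accumulation
    return x0_hat

# Toy series of annual MSW collected (tonnes per day); values are illustrative only.
msw = [38000, 39200, 40100, 41500, 42600, 43435]
print(gm11_forecast(msw, n_ahead=3))
```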
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian-maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and their sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian Sequential Stochastic Simulation and the Bayesian method. The differences between single CPT samplings modeled as a normal distribution and as a simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization in a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.
MicroRNA let-7, T cells, and patient survival in colorectal cancer
Dou, Ruoxu; Nishihara, Reiko; Cao, Yin; Hamada, Tsuyoshi; Mima, Kosuke; Masuda, Atsuhiro; Masugi, Yohei; Shi, Yan; Gu, Mancang; Li, Wanwan; da Silva, Annacarolina; Nosho, Katsuhiko; Zhang, Xuehong; Meyerhardt, Jeffrey A.; Giovannucci, Edward L.; Chan, Andrew T.; Fuchs, Charles S.; Qian, Zhi Rong; Ogino, Shuji
2016-01-01
Experimental evidence suggests that the let-7 family of noncoding RNAs suppresses adaptive immune responses, contributing to immune evasion by the tumor. We hypothesized that the amount of let-7a and let-7b expression in colorectal carcinoma might be associated with limited T-lymphocyte infiltrates in the tumor microenvironment and worse clinical outcome. Utilizing the molecular pathological epidemiology resources of 795 rectal and colon cancers in two U.S.-nationwide prospective cohort studies, we measured tumor-associated let-7a and let-7b expression levels by quantitative reverse-transcription PCR, and CD3+, CD8+, CD45RO (PTPRC)+, and FOXP3+ cell densities by tumor tissue microarray immunohistochemistry and computer-assisted image analysis. Logistic regression analysis and Cox proportional hazards regression were used to assess associations of let-7a (and let-7b) expression (quartile predictor variables) with T-cell densities (binary outcome variables) and mortality, respectively, controlling for tumor molecular features, including microsatellite instability, CpG island methylator phenotype, LINE-1 methylation, and KRAS, BRAF, and PIK3CA mutations. Compared with cases in the lowest quartile of let-7a expression, those in the highest quartile were associated with lower densities of CD3+ [multivariate odds ratio (OR), 0.40; 95% confidence interval (CI), 0.23 to 0.67; Ptrend = 0.003] and CD45RO+ cells (multivariate OR, 0.31; 95% CI, 0.17 to 0.58; Ptrend = 0.0004), and higher colorectal cancer-specific mortality (multivariate hazard ratio, 1.82; 95% CI, 1.42 to 3.13; Ptrend = 0.001). In contrast, let-7b expression was not significantly associated with T-cell density or colorectal cancer prognosis. Our data support the role of let-7a in suppressing antitumor immunity in colorectal cancer, and suggest let-7a as a potential target of immunotherapy. PMID:27737877
Spatial assessment of air quality patterns in Malaysia using multivariate analysis
NASA Astrophysics Data System (ADS)
Dominick, Doreena; Juahir, Hafizan; Latif, Mohd Talib; Zain, Sharifuddin M.; Aris, Ahmad Zaharin
2012-12-01
This study aims to investigate possible sources of air pollutants and the spatial patterns within the eight selected Malaysian air monitoring stations based on a two-year database (2008-2009). Multivariate analysis was applied to the dataset. It incorporated Hierarchical Agglomerative Cluster Analysis (HACA) to assess the spatial patterns, Principal Component Analysis (PCA) to determine the major sources of air pollution and Multiple Linear Regression (MLR) to assess the percentage contribution of each air pollutant. The HACA results grouped the eight monitoring stations into three different clusters, based on the characteristics of the air pollutants and meteorological parameters. The PCA analysis showed that the major sources of air pollution were emissions from motor vehicles, aircraft, industries and areas of high population density. The MLR analysis demonstrated that the main pollutant contributing to variability in the Air Pollutant Index (API) at all stations was particulate matter with a diameter of less than 10 μm (PM10). Further MLR analysis showed that the main air pollutant influencing the high concentration of PM10 was carbon monoxide (CO). This was due to combustion processes, particularly originating from motor vehicles. Meteorological factors such as ambient temperature, wind speed and humidity were also noted to influence the concentration of PM10.
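The HACA/PCA/MLR sequence can be sketched as follows on synthetic station data; the station labels, pollutant set, and toy Air Pollutant Index are assumptions for illustration, not the study's dataset.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# Synthetic station-by-variable matrix (8 stations x 6 variables), standardized.
stations = [f"S{i}" for i in range(1, 9)]
X = rng.normal(0, 1, (8, 6))            # e.g. PM10, CO, NO2, SO2, O3, temperature

# Hierarchical agglomerative cluster analysis on station profiles (Ward linkage).
Z = linkage(X, method="ward")
clusters = fcluster(Z, t=3, criterion="maxclust")
print(dict(zip(stations, clusters)))

# Principal component analysis to summarize dominant pollution sources.
pca = PCA(n_components=2).fit(X)
print("Explained variance ratios:", pca.explained_variance_ratio_)

# Multiple linear regression of a toy Air Pollutant Index on the variables.
api = X @ np.array([0.7, 0.4, 0.2, 0.1, 0.1, 0.0]) + rng.normal(0, 0.1, 8)
mlr = LinearRegression().fit(X, api)
print("MLR coefficients:", mlr.coef_)
```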
NASA Astrophysics Data System (ADS)
Khajehei, S.; Madadgar, S.; Moradkhani, H.
2014-12-01
The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing by using statistical methods based on conditional probability density functions (pdfs). However, one of the requirements for current methods is to assume a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on Copula functions to develop the conditional distribution of the precipitation forecast needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginal distributions, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to monthly CPC forecasts at 0.25x0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing observed climatology during a ten-year verification period (2000-2010).
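A bivariate Gaussian copula with empirical margins is one simple way to sketch the idea of a copula-based conditional forecast distribution; the paper's choice of copula family and data are not reproduced here, and the simulated forecast/observation pairs below are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Toy paired monthly precipitation: forecast vs. "observation" (mm).
n = 360
forecast = rng.gamma(shape=2.0, scale=40, size=n)
observed = 0.8 * forecast + rng.gamma(shape=2.0, scale=10, size=n)

# Transform each margin to standard normal via its empirical CDF (rank transform).
def to_normal_scores(x):
    ranks = stats.rankdata(x) / (len(x) + 1.0)
    return stats.norm.ppf(ranks)

z_f, z_o = to_normal_scores(forecast), to_normal_scores(observed)
rho = np.corrcoef(z_f, z_o)[0, 1]        # Gaussian copula dependence parameter

# Conditional distribution of observed given a new forecast value, in normal space:
# z_o | z_f ~ N(rho * z_f, 1 - rho^2); map back through the empirical quantiles.
new_forecast = 120.0
p = np.clip(stats.percentileofscore(forecast, new_forecast) / 100.0, 0.01, 0.99)
z_new = stats.norm.ppf(p)
cond_draws = rng.normal(rho * z_new, np.sqrt(1 - rho**2), size=1000)
cond_precip = np.quantile(observed, stats.norm.cdf(cond_draws))
print("Conditional median and 90% interval (mm):",
      np.percentile(cond_precip, [50, 5, 95]))
```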
Buried landmine detection using multivariate normal clustering
NASA Astrophysics Data System (ADS)
Duston, Brian M.
2001-10-01
A Bayesian classification algorithm is presented for discriminating buried land mines from buried and surface clutter in Ground Penetrating Radar (GPR) signals. This algorithm is based on multivariate normal (MVN) clustering, where feature vectors are used to identify populations (clusters) of mines and clutter objects. The features are extracted from two-dimensional images created from ground penetrating radar scans. MVN clustering is used to determine the number of clusters in the data and to create probability density models for target and clutter populations, producing the MVN clustering classifier (MVNCC). The Bayesian Information Criteria (BIC) is used to evaluate each model to determine the number of clusters in the data. An extension of the MVNCC allows the model to adapt to local clutter distributions by treating each of the MVN cluster components as a Poisson process and adaptively estimating the intensity parameters. The algorithm is developed using data collected by the Mine Hunter/Killer Close-In Detector (MH/K CID) at prepared mine lanes. The Mine Hunter/Killer is a prototype mine detecting and neutralizing vehicle developed for the U.S. Army to clear roads of anti-tank mines.
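Multivariate normal clustering with BIC-based selection of the number of clusters can be sketched with a Gaussian mixture model, as below; the two-dimensional synthetic features and cluster locations are assumptions standing in for the GPR-derived features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)

# Synthetic feature vectors: two clutter populations and one mine population.
clutter_a = rng.multivariate_normal([0, 0], [[1.0, 0.2], [0.2, 1.0]], 150)
clutter_b = rng.multivariate_normal([4, 1], [[0.5, 0.0], [0.0, 0.5]], 100)
mines     = rng.multivariate_normal([2, 5], [[0.3, 0.1], [0.1, 0.3]], 60)
X = np.vstack([clutter_a, clutter_b, mines])

# Fit multivariate normal mixtures with 1..6 components; pick the BIC-minimizing model.
models = [GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(X)
          for k in range(1, 7)]
bics = [m.bic(X) for m in models]
best = models[int(np.argmin(bics))]
print("BIC-selected number of clusters:", best.n_components)

# Posterior class probabilities give a Bayesian assignment of new feature vectors.
print(best.predict_proba([[2.1, 4.8]]))
```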
Alter, Andrea; Huong, Nguyen Thu; Singh, Meenakshi; Orlova, Marianna; Van Thuc, Nguyen; Katoch, Kiran; Gao, Xiaojiang; Thai, Vu Hong; Ba, Nguyen Ngoc; Carrington, Mary; Abel, Laurent; Mehra, Narinder; Alcaïs, Alexandre; Schurr, Erwin
2011-05-01
Experimental evidence suggested the existence of unidentified leprosy susceptibility loci in the human leukocyte antigen (HLA) complex. To identify such genetic risk factors, a high-density association scan of a 1.9-mega-base (Mb) region in the HLA complex was performed. Among 682 single-nucleotide polymorphisms (SNPs), 59 were associated with leprosy (P <.01) in 198 Vietnamese single-case leprosy families. Genotyping of these SNPs in an independent sample of 292 Vietnamese single-case leprosy families replicated the association of 12 SNPs (P <.01). Multivariate analysis of these 12 SNPs showed that the association information could be captured by 2 intergenic HLA class I region SNPs (P = 9.4 × 10⁻⁹)-rs2394885 and rs2922997 (marginal multivariate P = 2.1 × 10⁻⁷ and P = .0016, respectively). SNP rs2394885 tagged the HLA-C*15:05 allele in the Vietnamese population. The identical associations were validated in a third sample of 364 patients with leprosy and 371 control subjects from North India. These results implicated class I alleles in leprosy pathogenesis.
Development of a Multivariable Parametric Cost Analysis for Space-Based Telescopes
NASA Technical Reports Server (NTRS)
Dollinger, Courtnay
2011-01-01
Over the past 400 years, the telescope has proven to be a valuable tool in helping humankind understand the Universe around us. The images and data produced by telescopes have revolutionized planetary, solar, stellar, and galactic astronomy and have inspired a wide range of people, from the child who dreams about the images seen on NASA websites to the most highly trained scientist. Like all scientific endeavors, astronomical research must operate within the constraints imposed by budget limitations. Hence the importance of understanding cost: to find the balance between the dreams of scientists and the restrictions of the available budget. By analyzing the data we have collected for over thirty different telescopes from more than 200 different sources, statistical methods such as regression and residual analysis can be used to determine what drives telescope cost and to build and use a cost model for space-based telescopes. Previous cost models have focused their attention on ground-based telescopes due to limited data for space telescopes and the larger number and longer history of ground-based astronomy. Due to the increased availability of cost data from recent space-telescope construction, we have been able to produce and begin testing a comprehensive cost model for space telescopes, with guidance from the cost models for ground-based telescopes. By separating the variables that affect cost, such as diameter, mass, wavelength, density, data rate, and number of instruments, we advance the goal of better understanding the cost drivers of space telescopes. The use of sophisticated mathematical techniques to improve the accuracy of cost models has the potential to help society make informed decisions about proposed scientific projects. An improved knowledge of cost will allow scientists to get the maximum value returned for the money given and create a harmony between the visions of scientists and the reality of a budget.
Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong
2015-09-01
This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.
Planetary atmosphere models: A research and instructional web-based resource
NASA Astrophysics Data System (ADS)
Gray, Samuel Augustine
The effects of altitude change on the temperature, pressure, density, and speed of sound were investigated. These effects have been documented in Global Reference Atmospheric Models (GRAMs) to be used in calculating the conditions in various parts of the atmosphere for several planets. Besides GRAMs, there are several websites that provide online calculators for the 1976 US Standard Atmosphere. This thesis presents the creation of an online calculator of the atmospheres of Earth, Mars, Venus, Titan, and Neptune. The websites consist of input forms for altitude and temperature adjustment followed by a results table for the calculated data. The first phase involved creating a spreadsheet reference based on the 1976 US Standard Atmosphere and other planetary GRAMs available. Microsoft Excel was used to input the equations and make a graphical representation of the temperature, pressure, density, and speed of sound change as altitude changed using equations obtained from the GRAMs. These spreadsheets were used later as a reference for the JavaScript code in both the design and comparison of the data output of the calculators. The websites were created using HTML, CSS, and JavaScript coding languages. The calculators could accurately display the temperature, pressure, density, and speed of sound of these planets from surface values to various stages within the atmosphere. These websites provide a resource for students involved in projects and classes that require knowledge of these changes in these atmospheres. This project also created a chance for new project topics to arise for future students involved in aeronautics and astronautics.
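The thesis implemented its calculators in JavaScript; the sketch below writes the same 1976 US Standard Atmosphere relations in Python for the troposphere (0-11 km) only, using the standard published constants.

```python
import math

# 1976 US Standard Atmosphere, troposphere only (0-11 km geopotential altitude).
T0, P0 = 288.15, 101325.0      # sea-level temperature (K) and pressure (Pa)
L = 0.0065                     # temperature lapse rate (K/m)
G0, R_AIR, GAMMA = 9.80665, 287.058, 1.4

def troposphere(h_m):
    """Return temperature (K), pressure (Pa), density (kg/m^3), speed of sound (m/s)."""
    if not 0.0 <= h_m <= 11000.0:
        raise ValueError("this sketch covers only 0-11 km")
    T = T0 - L * h_m
    P = P0 * (T / T0) ** (G0 / (R_AIR * L))
    rho = P / (R_AIR * T)
    a = math.sqrt(GAMMA * R_AIR * T)
    return T, P, rho, a

for h in (0, 5000, 11000):
    T, P, rho, a = troposphere(h)
    print(f"h={h:6d} m  T={T:6.2f} K  P={P:9.1f} Pa  rho={rho:5.3f} kg/m^3  a={a:6.2f} m/s")
```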
Galagan, Sean R; Paul, Proma; Menezes, Lysander; LaMontagne, D Scott
2013-06-26
This study investigates the effect of communication strategies on human papillomavirus (HPV) vaccine uptake in HPV vaccine demonstration projects in Uganda and Vietnam. Secondary analysis was conducted on data from surveys of a representative sample of parents and guardians of girls eligible for HPV vaccine, measuring three-dose coverage achieved in demonstration projects in 2008-2010. Univariate and multivariate logistic regression analysis calculated the unadjusted and adjusted odds of receiving at least one dose of HPV vaccine depending on exposure to community influencers; information, education, and communication (IEC) channels; and demographic factors. This study found that exposure to community influencers was associated with HPV vaccine uptake in a multivariate model controlling for other factors. Exposure to non-interactive IEC channels was only marginally associated with HPV vaccine uptake. These results underscore the need of HPV vaccine programs in low- and middle-income countries to involve and utilize key community influencers and stakeholders to maximize HPV vaccine uptake. Copyright © 2013 Elsevier Ltd. All rights reserved.
A new subgrid-scale representation of hydrometeor fields using a multivariate PDF
Griffin, Brian M.; Larson, Vincent E.
2016-06-03
The subgrid-scale representation of hydrometeor fields is important for calculating microphysical process rates. In order to represent subgrid-scale variability, the Cloud Layers Unified By Binormals (CLUBB) parameterization uses a multivariate probability density function (PDF). In addition to vertical velocity, temperature, and moisture fields, the PDF includes hydrometeor fields. Previously, hydrometeor fields were assumed to follow a multivariate single lognormal distribution. Now, in order to better represent the distribution of hydrometeors, two new multivariate PDFs are formulated and introduced. The new PDFs represent hydrometeors using either a delta-lognormal or a delta-double-lognormal shape. The two new PDF distributions, plus the previous single lognormal shape, are compared to histograms of data taken from large-eddy simulations (LESs) of a precipitating cumulus case, a drizzling stratocumulus case, and a deep convective case. In conclusion, the warm microphysical process rates produced by the different hydrometeor PDFs are compared to the same process rates produced by the LES.
Gaussianization for fast and accurate inference from cosmological data
NASA Astrophysics Data System (ADS)
Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.
2016-06-01
We present a method to transform multivariate unimodal non-Gaussian posterior probability densities into approximately Gaussian ones via non-linear mappings, such as Box-Cox transformations and generalizations thereof. This permits an analytical reconstruction of the posterior from a point sample, like a Markov chain, and simplifies the subsequent joint analysis with other experiments. This way, a multivariate posterior density can be reported efficiently, by compressing the information contained in Markov Chain Monte Carlo samples. Further, the model evidence integral (i.e. the marginal likelihood) can be computed analytically. This method is analogous to the search for normal parameters in the cosmic microwave background, but is more general. The search for the optimally Gaussianizing transformation is performed computationally through a maximum-likelihood formalism; its quality can be judged by how well the credible regions of the posterior are reproduced. We demonstrate that our method outperforms kernel density estimates in this objective. Further, we select marginal posterior samples from Planck data with several distinct strongly non-Gaussian features, and verify the reproduction of the marginal contours. To demonstrate evidence computation, we Gaussianize the joint distribution of data from weak lensing and baryon acoustic oscillations, for different cosmological models, and find a preference for flat Λ cold dark matter. Comparing to values computed with the Savage-Dickey density ratio, and Population Monte Carlo, we find good agreement of our method within the spread of the other two.
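A one-dimensional Box-Cox Gaussianization of each marginal, followed by a Gaussian summary of the transformed sample, is sketched below; the paper's method generalizes this to joint non-linear mappings, and the toy posterior sample here is an assumption for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy 2-D posterior sample with strongly non-Gaussian (skewed) marginals.
sample = np.column_stack([
    rng.lognormal(mean=0.0, sigma=0.5, size=5000),   # e.g. an amplitude-like parameter
    rng.gamma(shape=2.0, scale=1.5, size=5000),      # e.g. a positive-definite parameter
])

# Box-Cox transform each (positive) marginal toward Gaussianity;
# lambdas are chosen by maximum likelihood inside scipy.stats.boxcox.
transformed = np.empty_like(sample)
lambdas = []
for j in range(sample.shape[1]):
    transformed[:, j], lam = stats.boxcox(sample[:, j])
    lambdas.append(lam)

# In the Gaussianized space the posterior is summarized analytically by mean/covariance.
mu = transformed.mean(axis=0)
cov = np.cov(transformed, rowvar=False)
print("Box-Cox lambdas:", np.round(lambdas, 3))
print("Gaussianized mean:", mu, "\ncovariance:\n", cov)

# Skewness before/after as a rough check of how well the mapping Gaussianizes.
print("skewness before:", stats.skew(sample), "after:", stats.skew(transformed))
```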
Tile-based parallel coordinates and its application in financial visualization
NASA Astrophysics Data System (ADS)
Alsakran, Jamal; Zhao, Ye; Zhao, Xinlei
2010-01-01
The parallel coordinates technique has been widely used in information visualization applications and it has achieved great success in visualizing multivariate data and perceiving their trends. Nevertheless, visual clutter usually weakens or even diminishes its ability when the data size increases. In this paper, we first propose a tile-based parallel coordinates technique, where the plotting area is divided into rectangular tiles. Each tile stores an intersection density that counts the total number of polylines intersecting with that tile. Consequently, the intersection density is mapped to optical attributes, such as color and opacity, by interactive transfer functions. The method visualizes the polylines efficiently and informatively in accordance with the density distribution, and thus reduces visual clutter and promotes knowledge discovery. The interactivity of our method allows the user to instantaneously manipulate the tile distribution and the transfer functions. Specifically, the classic parallel coordinates rendering is a special case of our method when each tile represents only one pixel. A case study on a real world data set, U.S. stock mutual fund data of year 2006, is presented to show the capability of our method in visually analyzing financial data. The presented visual analysis is conducted by an expert in the domain of finance. Our method has gained support from professionals in the finance field, who embrace it as a potential investment analysis tool for mutual fund managers, financial planners, and investors.
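The per-tile intersection-density accumulation can be sketched as follows; the grid resolution, the sampling-based rasterization of polyline segments, and the synthetic records are simplifying assumptions, and the mapping of counts to color/opacity via transfer functions is omitted.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy multivariate data: 5000 records, 4 normalized variables (the parallel axes).
n_records, n_axes = 5000, 4
data = rng.random((n_records, n_axes))

# Tile grid covering the plotting area: n_tiles_x columns between each pair of axes,
# n_tiles_y rows spanning the normalized [0, 1] value range.
n_tiles_x, n_tiles_y = 32, 32
density = np.zeros((n_axes - 1, n_tiles_y, n_tiles_x), dtype=np.int64)

# Rasterize each polyline segment by sampling it at the horizontal tile centers
# and incrementing the intersection count of every tile it passes through.
xs = (np.arange(n_tiles_x) + 0.5) / n_tiles_x          # positions between two axes
for seg in range(n_axes - 1):
    y_left, y_right = data[:, seg], data[:, seg + 1]
    # Linear interpolation of all polylines at once: shape (n_records, n_tiles_x).
    y_at_x = y_left[:, None] * (1 - xs) + y_right[:, None] * xs
    rows = np.clip((y_at_x * n_tiles_y).astype(int), 0, n_tiles_y - 1)
    for col in range(n_tiles_x):
        counts = np.bincount(rows[:, col], minlength=n_tiles_y)
        density[seg, :, col] += counts

# Each tile now stores how many polylines intersect it; a transfer function would
# map these counts to color and opacity for rendering.
print("max intersection density per axis pair:", density.max(axis=(1, 2)))
```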
Preisser, J. S.; Hammett-Stabler, C. A.; Renner, J. B.; Rubin, J.
2011-01-01
Summary The association between follicle-stimulating hormone (FSH) and bone density was tested in 111 postmenopausal women aged 50–64 years. In the multivariable analysis, weight and race were important determinants of bone mineral density. FSH, bioavailable estradiol, and other hormonal variables did not show statistically significant associations with bone density at any site. Introduction FSH has been associated with bone density loss in animal models and longitudinal studies of women. Most of these analyses have not considered the effect of weight or race. Methods We tested the association between FSH and bone density in younger postmenopausal women, adjusting for patient-related factors. In 111 postmenopausal women aged 50–64 years, areal bone mineral density (BMD) was measured at the lumbar spine, femoral neck, total hip, and distal radius using dual-energy X-ray absorptiometry, and volumetric BMD was measured at the distal radius using peripheral quantitative computed tomography (pQCT). Height, weight, osteoporosis risk factors, and serum hormonal factors were assessed. Results FSH inversely correlated with weight, bioavailable estradiol, areal BMD at the lumbar spine and hip, and volumetric BMD at the ultradistal radius. In the multivariable analysis, no hormonal variable showed a statistically significant association with areal BMD at any site. Weight was independently associated with BMD at all central sites (p<0.001), but not with BMD or pQCT measures at the distal radius. Race was independently associated with areal BMD at all sites (p≤0.008) and with cortical area at the 33% distal radius (p=0.004). Conclusions Correlations between FSH and bioavailable estradiol and BMD did not persist after adjustment for weight and race in younger postmenopausal women. Weight and race were more important determinants of bone density and should be included in analyses of hormonal influences on bone. PMID:21125395
Dependence of Some Properties of Groups on Group Local Number Density
NASA Astrophysics Data System (ADS)
Deng, Xin-Fa; Wu, Ping
2014-09-01
In this study we investigate the dependence of the projected size Sizesky, the rms deviation σR of projected distance in the sky from the group center, the rms velocity σV, and the virial radius RVir of groups on group local number density. In the volume-limited group samples, it is found that groups in high density regions preferentially have larger Sizesky, σR, σV, and RVir than ones in low density regions.
VizieR Online Data Catalog: Structure of young stellar clusters. II. (Kuhn+, 2015)
NASA Astrophysics Data System (ADS)
Kuhn, M. A.; Getman, K. V.; Feigelson, E. D.
2015-07-01
We investigate the intrinsic stellar populations (estimated total numbers of OB and pre-main-sequence stars down to 0.1 M⊙) that are present in 17 massive star-forming regions (MSFRs) surveyed by the MYStIX project. The study is based on the catalog of >31000 MYStIX Probable Complex Members with both disk-bearing and disk-free populations, compensating for extinction, nebulosity, and crowding effects. Correction for observational sensitivities is made using the X-ray luminosity function and the near-infrared initial mass function, a correction that is often not made by infrared surveys of young stars. The resulting maps of the projected structure of the young stellar populations, in units of intrinsic stellar surface density, allow direct comparison between different regions. Several regions have multiple dense clumps, similar in size and density to the Orion Nebula Cluster. The highest projected density of ~34000 stars/pc² is found in the core of the RCW 38 cluster. Histograms of surface density show different ranges of values in different regions, supporting the conclusion of Bressert et al. (B10; 2010MNRAS.409L..54B) that no universal surface-density threshold can distinguish between clustered and distributed star formation. However, a large component of the young stellar population of MSFRs resides in dense environments of 200-10000 stars/pc² (including within the nearby Orion molecular clouds), and we find that there is no evidence for the B10 conclusion that such dense regions form an extreme "tail" of the distribution. Tables of intrinsic populations for these regions are used in our companion study of young cluster properties and evolution. (3 data files).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, A.S.; Cauble, R.; Da Silva, L.B.
1996-02-01
This report summarizes the major accomplishments of this three-year Laboratory Directed Research and Development (LDRD) Exploratory Research Project (ERP) entitled "X-ray Laser Propagation and Coherence: Diagnosing Fast-evolving, High-density Laser Plasmas Using X-ray Lasers," tracking code 93-ERP-075. The most significant accomplishment of this project is the demonstration of a new laser plasma diagnostic: a soft x-ray Mach-Zehnder interferometer using a neonlike yttrium x-ray laser at 155 Å as the probe source. Detailed comparisons of absolute two-dimensional electron density profiles obtained from soft x-ray laser interferograms and profiles obtained from radiation hydrodynamics codes, such as LASNEX, will allow us to validate and benchmark complex numerical models used to study the physics of laser-plasma interactions. Thus the development of the soft x-ray interferometry technique provides a mechanism to probe the deficiencies of the numerical models and is an important tool for the high-energy-density physics and science-based stockpile stewardship programs. The authors have used the soft x-ray interferometer to study a number of high-density, fast-evolving, laser-produced plasmas, such as the dynamics of exploding foils and colliding plasmas. They are pursuing the application of the soft x-ray interferometer to study ICF-relevant plasmas, such as capsules and hohlraums, on the Nova 10-beam facility. They have also studied the development of enhanced-coherence, shorter-pulse-duration, and high-brightness x-ray lasers. The utilization of improved x-ray laser sources can ultimately enable them to obtain three-dimensional holographic images of laser-produced plasmas.
Cluster-based exposure variation analysis
2013-01-01
Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders, and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between “low” and “high” exposure levels in a “near” or “far” range, and with “low” or “high” velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a “small” or “large” standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA, and a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied on the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The smallest number of principal components describing more than 90% of variability in each case was selected and the projection of marginal distributions along the selected principal component was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of classified realizations was determined. Results C-EVA classified exposures more correctly than univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for EVA (univariate and multivariate), and C-EVA, respectively (p < 0.001). All three methods performed poorly in discriminating exposure patterns differing with respect to the variability in cycle time duration. Conclusion While C-EVA had a higher accuracy than conventional EVA, both failed to detect differences in temporal similarity. The data-driven optimality of data reduction and the capability of handling multiple exposure time lines in a single analysis are the advantages of the C-EVA. PMID:23557439
Morgan, Elise F.; Mason, Zachary D.; Chien, Karen B.; Pfeiffer, Anthony J.; Barnes, George L.; Einhorn, Thomas A.; Gerstenfeld, Louis C.
2009-01-01
Non-invasive characterization of fracture callus structure and composition may facilitate development of surrogate measures of the regain of mechanical function. As such, quantitative computed tomography- (CT-) based analyses of fracture calluses could enable more reliable clinical assessments of bone healing. Although previous studies have used CT to quantify and predict fracture healing, it is unclear which of the many CT-derived metrics of callus structure and composition are the most predictive of callus mechanical properties. The goal of this study was to identify the changes in fracture callus structure and composition that occur over time and that are most closely related to the regain of mechanical function. Micro-computed tomography (μCT) imaging and torsion testing were performed on murine fracture calluses (n=188) at multiple post-fracture timepoints and under different experimental conditions that alter fracture healing. Total callus volume (TV), mineralized callus volume (BV), callus mineralized volume fraction (BV/TV), bone mineral content (BMC), tissue mineral density (TMD), standard deviation of mineral density (σTMD), effective polar moment of inertia (Jeff), torsional strength, and torsional rigidity were quantified. Multivariate statistical analyses, including multivariate analysis of variance, principal components analysis, and stepwise regression were used to identify differences in callus structure and composition among experimental groups and to determine which of the μCT outcome measures were the strongest predictors of mechanical properties. Although calluses varied greatly in the absolute and relative amounts of mineralized tissue (BV, BMC, and BV/TV), differences among timepoints were most strongly associated with changes in tissue mineral density. Torsional strength and rigidity were dependent on mineral density as well as the amount of mineralized tissue: TMD, BV, and σTMD explained 62% of the variation in torsional strength (p<0.001); and TMD, BMC, BV/TV, and σTMD explained 70% of the variation in torsional rigidity (p<0.001). These results indicate that fracture callus mechanical properties can be predicted by several μCT-derived measures of callus structure and composition. These findings form the basis for developing non-invasive assessments of fracture healing and for identifying biological and biomechanical mechanisms that lead to impaired or enhanced healing. PMID:19013264
MSeq-CNV: accurate detection of Copy Number Variation from Sequencing of Multiple samples.
Malekpour, Seyed Amir; Pezeshk, Hamid; Sadeghi, Mehdi
2018-03-05
Currently a few tools are capable of detecting genome-wide Copy Number Variations (CNVs) based on sequencing of multiple samples. Although aberrations in mate pair insertion sizes provide additional hints for CNV detection based on multiple samples, the majority of the current tools rely only on the depth of coverage. Here, we propose a new algorithm (MSeq-CNV) which allows detecting common CNVs across multiple samples. MSeq-CNV applies a mixture density for modeling aberrations in depth of coverage and abnormalities in the mate pair insertion sizes. Each component in this mixture density applies a Binomial distribution for modeling the number of mate pairs with aberration in the insertion size and also a Poisson distribution for emitting the read counts, in each genomic position. MSeq-CNV is applied on simulated data and also on real data of six HapMap individuals with high-coverage sequencing, in the 1000 Genomes Project. These individuals include a CEU trio of European ancestry and a YRI trio of Nigerian ethnicity. Ancestry of these individuals is studied by clustering the identified CNVs. MSeq-CNV is also applied for detecting CNVs in two samples with low-coverage sequencing in the 1000 Genomes Project and six samples from the Simons Genome Diversity Project.
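The flavor of the per-position mixture, a Poisson term for read depth combined with a Binomial term for aberrant insert sizes, can be sketched as below; the copy-number states, rates, and priors are illustrative assumptions, not MSeq-CNV's estimated parameters.

```python
from scipy.stats import poisson, binom

# Per-position observations: read depth and (aberrant, total) mate-pair counts.
read_count = 18
aberrant_pairs, total_pairs = 7, 20

# Assumed copy-number states and their expected effects (illustrative values only):
# expected depth scales with copy number; deletions/duplications raise the chance
# of insert-size aberrations.
states = {          # copy number: (expected depth, P(aberrant insert size))
    1: (10.0, 0.40),   # heterozygous deletion
    2: (20.0, 0.05),   # normal diploid state
    3: (30.0, 0.30),   # duplication
}
priors = {1: 0.1, 2: 0.8, 3: 0.1}

# Posterior over states from the Poisson (depth) x Binomial (insert-size) mixture.
joint = {cn: priors[cn]
             * poisson.pmf(read_count, mu)
             * binom.pmf(aberrant_pairs, total_pairs, p_ab)
         for cn, (mu, p_ab) in states.items()}
total = sum(joint.values())
posterior = {cn: v / total for cn, v in joint.items()}
print(posterior)
```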
Interpretation of hip fracture patterns using areal bone mineral density in the proximal femur.
Hey, Hwee Weng Dennis; Sng, Weizhong Jonathan; Lim, Joel Louis Zongwei; Tan, Chuen Seng; Gan, Alfred Tau Liang; Ng, Jun Han Charles; Kagda, Fareed H Y
2015-12-01
Bone mineral density scans are currently interpreted based on an average score of the entire proximal femur. Improvements in technology now allow us to measure bone density in specific regions of the proximal femur. The study attempts to explain the pathophysiology of neck of femur (NOF) and intertrochanteric/basi-cervical (IT) fractures by correlating areal BMD (aBMD) scores with fracture patterns, and explore possible predictors for these fracture patterns. This is a single institution retrospective study on all patients who underwent hip surgeries from June 2010 to August 2012. A total of 106 patients (44 IT/basi-cervical, 62 NOF fractures) were studied. The data retrieved include patient characteristics and aBMD scores measured at different regions of the contralateral hip within 1 month of the injury. Demographic and clinical characteristic differences between IT and NOF fractures were analyzed using Fisher's Exact test and two-sample t test. Relationship between aBMD scores and fracture patterns was assessed using multivariable regression modeling. After adjusted multivariable analysis, T-Troc and T-inter scores were significantly lower in intertrochanteric/basi-cervical fractures compared to neck of femur fractures (P = 0.022 and P = 0.026, respectively). Both intertrochanteric/basi-cervical fractures (mean T.Tot -1.99) and neck of femur fractures (mean T.Tot -1.64) were not found to be associated with a mean T.tot less than -2.5. However, the mean aBMD scores were consistently less than -2.5 for both intertrochanteric/basi-cervical fractures and neck of femur fractures. Gender and calcium intake at the time of injury were associated with specific hip fracture patterns (P = 0.002 and P = 0.011, respectively). Hip fracture patterns following low energy trauma may be influenced by the pattern of reduced bone density in different areas of the hip. Intertrochanteric/basi-cervical fractures were associated with significantly lower T-Troc and T-Inter scores compared to neck of femur fractures, suggesting that the fracture traversed through the areas with the lowest bone density in the proximal femur. In the absence of reduced T.Troc and T.Inter, neck of femur fractures occurred more commonly. T-Total scores may underestimate the severity of osteoporosis/osteopenia and measuring T-score at the neck of femur may better reflect the severity of osteoporosis and likelihood of a fragility fracture.
Provost, Alden M.; Payne, Dorothy F.; Voss, Clifford I.
2006-01-01
A digital model was developed to simulate ground-water flow and solute transport for the Upper Floridan aquifer in the Savannah, Georgia-Hilton Head Island, South Carolina, area. The model was used to (1) simulate trends of saltwater intrusion from predevelopment to the present day (1885-2004), (2) project these trends from the present day into the future, and (3) evaluate the relative influence of different assumptions regarding initial and boundary conditions and physical properties. The model is based on a regional, single-density ground-water flow model of coastal Georgia and adjacent parts of South Carolina and Florida. Variable-density ground-water flow and solute transport were simulated using the U.S. Geological Survey finite-element, variable-density solute-transport simulator SUTRA, 1885-2004. The model comprises seven layers: the surficial aquifer system, the Brunswick aquifer system, the Upper Floridan aquifer, the Lower Floridan aquifer, and the intervening confining units. The model was calibrated to September 1998 water levels, for single-density freshwater conditions, then refined using variable density and chloride concentration to give a reasonable match to the trend in the chloride distribution in the Upper Floridan aquifer inferred from field measurements of specific conductance made during 2000, 2002, 2003, and 2004. The model was modified to simulate solute transport by allowing saltwater to enter the system through localized areas near the northern end of Hilton Head Island, at Pinckney Island, and near the Colleton River, and was calibrated to match chloride concentrations inferred from field measurements of specific conductance. This simulation is called the 'Base Case.'
Brugger, Katharina; Rubel, Franz
2013-01-01
Bluetongue is an arboviral disease of ruminants causing significant economic losses. Our risk assessment is based on the epidemiological key parameter, the basic reproduction number. It is defined as the number of secondary cases caused by one primary case in a fully susceptible host population, in which values greater than one indicate the possibility, i.e., the risk, for a major disease outbreak. In the course of the Bluetongue virus serotype 8 (BTV-8) outbreak in Europe in 2006 we developed such a risk assessment for the University of Veterinary Medicine Vienna, Austria. Basic reproduction numbers were calculated using a well-known formula for vector-borne diseases considering the population densities of hosts (cattle and small ruminants) and vectors (biting midges of the Culicoides obsoletus spp.) as well as temperature dependent rates. The latter comprise the biting and mortality rate of midges as well as the reciprocal of the extrinsic incubation period. Most important, but generally unknown, is the spatio-temporal distribution of the vector density. Therefore, we established a continuously operating daily monitoring to quantify the seasonal cycle of the vector population by a statistical model. We used cross-correlation maps and Poisson regression to describe vector densities by environmental temperature and precipitation. Our results comprise time series of observed and simulated Culicoides obsoletus spp. counts as well as basic reproduction numbers for the period 2009–2011. For a spatio-temporal risk assessment we projected our results from the location of Vienna to the entire region of Austria. We compiled both daily maps of vector densities and the basic reproduction numbers, respectively. Basic reproduction numbers above one were generally found between June and August except in the mountainous regions of the Alps. The highest values coincide with the locations of confirmed BTV cases. PMID:23560090
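The abstract refers to a well-known formula for vector-borne diseases; one standard Ross-Macdonald-type form is sketched below, with toy temperature dependencies for the midge rates. All parameter values and functional forms are assumptions for illustration, not the fitted model of the study.

```python
import numpy as np

def r0_vector_borne(m, a, b_vh, b_hv, mu_v, eip, r_h):
    """
    Ross-Macdonald-type basic reproduction number for a vector-borne infection.

    m     vector-to-host ratio (vector density / host density)
    a     vector biting rate (bites per vector per day)
    b_vh  probability of vector-to-host transmission per bite
    b_hv  probability of host-to-vector transmission per bite
    mu_v  vector mortality rate (1/day)
    eip   extrinsic incubation period (days)
    r_h   host recovery rate (1/day)
    """
    survival_through_eip = np.exp(-mu_v * eip)
    return (m * a**2 * b_vh * b_hv * survival_through_eip) / (mu_v * r_h)

# Illustrative temperature dependence of the midge rates (assumed simple forms).
def biting_rate(T):    return max(0.0, 0.017 * (T - 3.7))          # per day
def mortality_rate(T): return max(0.02, 0.009 * np.exp(0.16 * (T - 10)))
def eip_days(T):       return max(4.0, 111.0 / max(T - 12.0, 0.1))

for T in (15, 20, 25):
    R0 = r0_vector_borne(m=50, a=biting_rate(T), b_vh=0.9, b_hv=0.1,
                         mu_v=mortality_rate(T), eip=eip_days(T), r_h=1 / 20)
    print(f"T={T} C  R0={R0:.2f}")
```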
The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test
Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; ...
2016-12-20
Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.
Forecasting of municipal solid waste quantity in a developing country using multivariate grey models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intharathirat, Rotchana, E-mail: rotchana.in@gmail.com; Abdul Salam, P., E-mail: salam@ait.ac.th; Kumar, S., E-mail: kumar@ait.ac.th
Highlights: • A grey model can be used to forecast MSW quantity accurately with limited data. • A prediction interval effectively addresses the uncertainty of the MSW forecast. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment and household size play a role in MSW quantity. - Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to obtain reliable estimates using existing models because of the limited data available in developing countries. This study aims to forecast MSW collected in Thailand, with a prediction interval, over the long term by using an optimized multivariate grey model, a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC(1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year, from 43,435–44,994 tonnes per day in 2013 to 55,177–56,735 tonnes per day in 2030. The model also indicates that population density is the most important factor affecting MSW collected, followed by urbanization, employment proportion and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. The results can help decision makers develop long-term waste management measures and policies.
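The study uses the multivariate GMC(1, 5) grey model with convolution integral; as a simpler illustration of grey forecasting, the sketch below implements the classical univariate GM(1,1) model. The function name and input series are hypothetical, and the multivariate convolution extension is not shown.

```python
import numpy as np

def gm11_forecast(x0, steps=3):
    """Minimal univariate GM(1,1) grey forecast (illustrative only; the study
    above uses the multivariate GMC(1,5) variant with a convolution integral)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated series (1-AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # develop coeff. a, grey input b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.r_[x1_hat[0], np.diff(x1_hat)]        # back to original scale (1-IAGO)
    return x0_hat[len(x0):]

# Hypothetical annual waste-collected figures (tonnes/day), for illustration only:
print(gm11_forecast([39500, 40400, 41300, 42400, 43400], steps=5))
```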
Weichenthal, Scott; Ryswyk, Keith Van; Goldstein, Alon; Bagg, Scott; Shekkarizfard, Maryam; Hatzopoulou, Marianne
2016-04-01
Existing evidence suggests that ambient ultrafine particles (UFPs) (<0.1 µm) may contribute to acute cardiorespiratory morbidity. However, few studies have examined the long-term health effects of these pollutants owing in part to a need for exposure surfaces that can be applied in large population-based studies. To address this need, we developed a land use regression model for UFPs in Montreal, Canada using mobile monitoring data collected from 414 road segments during the summer and winter months between 2011 and 2012. Two different approaches were examined for model development including standard multivariable linear regression and a machine learning approach (kernel-based regularized least squares, KRLS) that learns the functional form of covariate impacts on ambient UFP concentrations from the data. The final models included parameters for population density, ambient temperature and wind speed, land use parameters (park space and open space), length of local roads and rail, and estimated annual average NOx emissions from traffic. The final multivariable linear regression model explained 62% of the spatial variation in ambient UFP concentrations whereas the KRLS model explained 79% of the variance. The KRLS model performed slightly better than the linear regression model when evaluated using an external dataset (R² = 0.58 vs. 0.55) or a cross-validation procedure (R² = 0.67 vs. 0.60). In general, our findings suggest that the KRLS approach may offer modest improvements in predictive performance compared to standard multivariable linear regression models used to estimate spatial variations in ambient UFPs. However, differences in predictive performance were not statistically significant when evaluated using the cross-validation procedure. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
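KRLS is essentially ridge regression in a kernel feature space. A minimal sketch of that idea, assuming scikit-learn's KernelRidge as a stand-in for the authors' KRLS implementation and synthetic covariates in place of the land use variables named above, is shown below.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical design matrix: population density, temperature, wind speed,
# park/open space, local road + rail length, traffic NOx emissions.
rng = np.random.default_rng(0)
X = rng.normal(size=(414, 6))                      # 414 road segments, as in the study
y = 20 + X @ rng.normal(size=6) + np.sin(X[:, 1]) + rng.normal(scale=0.5, size=414)

linear = LinearRegression()
krls = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)   # Gaussian-kernel regularized LS

for name, model in [("OLS", linear), ("KRLS-style", krls)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
```

The kernel model can pick up the nonlinear term that the linear model misses, which mirrors the modest cross-validated improvement reported above.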
NASA Astrophysics Data System (ADS)
Reuter, Bryan; Oliver, Todd; Lee, M. K.; Moser, Robert
2017-11-01
We present an algorithm for a Direct Numerical Simulation of the variable-density Navier-Stokes equations based on the velocity-vorticity approach introduced by Kim, Moin, and Moser (1987). In the current work, a Helmholtz decomposition of the momentum is performed. Evolution equations for the curl and the Laplacian of the divergence-free portion are formulated by manipulation of the momentum equations and the curl-free portion is reconstructed by enforcing continuity. The solution is expanded in Fourier bases in the homogeneous directions and B-Spline bases in the inhomogeneous directions. Discrete equations are obtained through a mixed Fourier-Galerkin and collocation weighted residual method. The scheme is designed such that the numerical solution conserves mass locally and globally by ensuring the discrete divergence projection is exact through the use of higher order splines in the inhomogeneous directions. The formulation is tested on multiple variable-density flow problems.
Non-Gaussian probabilistic MEG source localisation based on kernel density estimation
Mohseni, Hamid R.; Kringelbach, Morten L.; Woolrich, Mark W.; Baker, Adam; Aziz, Tipu Z.; Probert-Smith, Penny
2014-01-01
There is strong evidence to suggest that data recorded from magnetoencephalography (MEG) follows a non-Gaussian distribution. However, existing standard methods for source localisation model the data using only second order statistics, and therefore use the inherent assumption of a Gaussian distribution. In this paper, we present a new general method for non-Gaussian source estimation of stationary signals for localising brain activity from MEG data. By providing a Bayesian formulation for MEG source localisation, we show that the source probability density function (pdf), which is not necessarily Gaussian, can be estimated using multivariate kernel density estimators. In the case of Gaussian data, the solution of the method is equivalent to that of widely used linearly constrained minimum variance (LCMV) beamformer. The method is also extended to handle data with highly correlated sources using the marginal distribution of the estimated joint distribution, which, in the case of Gaussian measurements, corresponds to the null-beamformer. The proposed non-Gaussian source localisation approach is shown to give better spatial estimates than the LCMV beamformer, both in simulations incorporating non-Gaussian signals, and in real MEG measurements of auditory and visual evoked responses, where the highly correlated sources are known to be difficult to estimate. PMID:24055702
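A minimal sketch of the core ingredient, a multivariate kernel density estimate of a non-Gaussian distribution, is shown below using scipy.stats.gaussian_kde. The two-dimensional mixture is a synthetic stand-in rather than MEG data, and the Bayesian beamforming machinery of the paper is not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative stand-in for a non-Gaussian source amplitude distribution:
# a two-component mixture in 2-D (the paper estimates the source pdf itself).
rng = np.random.default_rng(1)
samples = np.vstack([rng.normal(-1.0, 0.3, size=(500, 2)),
                     rng.normal(+1.5, 0.6, size=(500, 2))]).T   # shape (n_dims, n_samples)

kde = gaussian_kde(samples)          # multivariate Gaussian-kernel density estimate
grid = np.array([[0.0, 1.5],
                 [0.0, 1.5]])        # evaluation points (0, 0) and (1.5, 1.5)
print(kde(grid))                     # estimated pdf values at those points
```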
Bromuri, Stefano; Zufferey, Damien; Hennebert, Jean; Schumacher, Michael
2014-10-01
This research is motivated by the issue of classifying illnesses of chronically ill patients for decision support in clinical settings. Our main objective is to propose multi-label classification of multivariate time series contained in medical records of chronically ill patients, by means of quantization methods, such as bag of words (BoW), and multi-label classification algorithms. Our second objective is to compare supervised dimensionality reduction techniques to state-of-the-art multi-label classification algorithms. The hypothesis is that kernel methods and locality preserving projections make such algorithms good candidates to study multi-label medical time series. We combine BoW and supervised dimensionality reduction algorithms to perform multi-label classification on health records of chronically ill patients. The considered algorithms are compared with state-of-the-art multi-label classifiers in two real-world datasets. The Portavita dataset contains 525 diabetes type 2 (DT2) patients, with co-morbidities of DT2 such as hypertension, dyslipidemia, and microvascular or macrovascular issues. The MIMIC II dataset contains 2635 patients affected by thyroid disease, diabetes mellitus, lipoid metabolism disease, fluid electrolyte disease, hypertensive disease, thrombosis, hypotension, chronic obstructive pulmonary disease (COPD), liver disease and kidney disease. The algorithms are evaluated using multi-label evaluation metrics such as Hamming loss, one error, coverage, ranking loss, and average precision. Non-linear dimensionality reduction approaches behave well on medical time series quantized using the BoW algorithm, with results comparable to state-of-the-art multi-label classification algorithms. Chaining the projected features has a positive impact on the performance of the algorithm with respect to pure binary relevance approaches. The evaluation highlights the feasibility of representing medical health records using the BoW for multi-label classification tasks. The study also highlights that dimensionality reduction algorithms based on kernel methods, locality preserving projections or both are good candidates to deal with multi-label classification tasks in medical time series with many missing values and high label density. Copyright © 2014 Elsevier Inc. All rights reserved.
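A minimal sketch of the bag-of-words quantization followed by a binary-relevance multi-label classifier is given below. The window length, codebook size, label set and k-means codebook are illustrative assumptions; the supervised dimensionality reduction step applied in the paper is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(2)
n_patients, n_windows, win_len, n_words, n_labels = 200, 30, 24, 16, 4

# Hypothetical records: each patient is a set of fixed-length windows cut from
# vital-sign time series; labels are co-morbidities (multi-label, not exclusive).
windows = rng.normal(size=(n_patients, n_windows, win_len))
labels = (rng.random((n_patients, n_labels)) < 0.3).astype(int)

codebook = KMeans(n_clusters=n_words, n_init=10, random_state=0)
codebook.fit(windows.reshape(-1, win_len))                 # learn the "words"

def bow_histogram(patient_windows):
    words = codebook.predict(patient_windows)              # assign each window to a word
    return np.bincount(words, minlength=n_words) / len(words)

X = np.array([bow_histogram(w) for w in windows])          # one BoW histogram per patient
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))   # binary-relevance baseline
clf.fit(X, labels)
print(clf.predict(X[:3]))                                  # predicted label sets
```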
Dietary energy density and body weight changes after 3 years in the PREDIMED study.
Razquin, Cristina; Sanchez-Tainta, Ana; Salas-Salvadó, Jordi; Buil-Cosiales, Pilar; Corella, Dolores; Fito, Montserrat; Ros, Emilio; Estruch, Ramón; Arós, Fernando; Gómez-Gracia, Enrique; Fiol, Miquel; Lapetra, José; Serra-Majem, Luis; Pinto, Xavier; Schröder, Helmut; Tur, Josep; Sorlí, José V; Lamuela-Raventós, Rosa M; Bulló, Mónica; Bes-Rastrollo, Maira; Martinez-Gonzalez, Miguel A
2017-11-01
The association between dietary energy density (ED) and overweight is not clear in the literature. Our aim was to study, in 4259 participants of the PREDIMED trial, whether an increase in dietary ED based on a higher adherence to a Mediterranean dietary pattern was associated with 3-year weight gain. A validated 137-item food-frequency questionnaire was administered. Multivariable-adjusted models were used to analyze the association between 3-year ED change and the subsequent 3-year body weight change. The largest weight reduction after the 3-year follow-up was observed in the two lowest quintiles and the highest quintile of ED change. The highest ED increase was characterized by an increased intake of extra virgin olive oil (EVOO) and nuts and a decreased intake of other oils, vegetables and fruit (p < .001). In conclusion, increased 3-year ED in the PREDIMED study, associated with higher EVOO and nut consumption, was not associated with weight gain.
Magheli, Ahmed; Hinz, Stefan; Hege, Claudia; Stephan, Carsten; Jung, Klaus; Miller, Kurt; Lein, Michael
2010-01-01
We investigated the value of pretreatment prostate specific antigen density to predict Gleason score upgrading in light of significant changes in grading routine in the last 2 decades. Of 1,061 consecutive men who underwent radical prostatectomy between 1999 and 2004, 843 were eligible for study. Prostate specific antigen density was calculated and a cutoff for highest accuracy to predict Gleason upgrading was determined using ROC curve analysis. The predictive accuracy of prostate specific antigen and prostate specific antigen density to predict Gleason upgrading was evaluated using ROC curve analysis based on predicted probabilities from logistic regression models. Prostate specific antigen and prostate specific antigen density predicted Gleason upgrading on univariate analysis (as continuous variables OR 1.07 and 7.21, each p <0.001) and on multivariate analysis (as continuous variables with prostate specific antigen density adjusted for prostate specific antigen OR 1.07, p <0.001 and OR 4.89, p = 0.037, respectively). When prostate specific antigen density was added to the model including prostate specific antigen and other Gleason upgrading predictors, prostate specific antigen lost its predictive value (OR 1.02, p = 0.423), while prostate specific antigen density remained an independent predictor (OR 4.89, p = 0.037). Prostate specific antigen density was more accurate than prostate specific antigen to predict Gleason upgrading (AUC 0.61 vs 0.57, p = 0.030). Prostate specific antigen density is a significant independent predictor of Gleason upgrading even when accounting for prostate specific antigen. This could be especially important in patients with low risk prostate cancer who seek less invasive therapy such as active surveillance since potentially life threatening disease may be underestimated. Further studies are warranted to help evaluate the role of prostate specific antigen density in Gleason upgrading and its significance for biochemical outcome.
Mapping Tree Density at the Global Scale
NASA Astrophysics Data System (ADS)
Covey, K. R.; Crowther, T. W.; Glick, H.; Bettigole, C.; Bradford, M.
2015-12-01
The global extent and distribution of forest trees is central to our understanding of the terrestrial biosphere. We provide the first spatially continuous map of forest tree density at a global scale. This map reveals that the global number of trees is approximately 3.04 trillion, an order of magnitude higher than the previous estimate. Of these trees, approximately 1.39 trillion exist in tropical and subtropical regions, with 0.74 and 0.61 trillion in boreal and temperate regions, respectively. Biome-level trends in tree density demonstrate the importance of climate and topography in controlling local tree densities at finer scales, as well as the overwhelming impact of humans across most of the world. Based on our projected tree densities, we estimate that deforestation is currently responsible for removing over 15 billion trees each year, and the global number of trees has fallen by approximately 46% since the start of human civilization.
Mapping tree density at a global scale
NASA Astrophysics Data System (ADS)
Crowther, T. W.; Glick, H. B.; Covey, K. R.; Bettigole, C.; Maynard, D. S.; Thomas, S. M.; Smith, J. R.; Hintler, G.; Duguid, M. C.; Amatulli, G.; Tuanmu, M.-N.; Jetz, W.; Salas, C.; Stam, C.; Piotto, D.; Tavani, R.; Green, S.; Bruce, G.; Williams, S. J.; Wiser, S. K.; Huber, M. O.; Hengeveld, G. M.; Nabuurs, G.-J.; Tikhonova, E.; Borchardt, P.; Li, C.-F.; Powrie, L. W.; Fischer, M.; Hemp, A.; Homeier, J.; Cho, P.; Vibrans, A. C.; Umunay, P. M.; Piao, S. L.; Rowe, C. W.; Ashton, M. S.; Crane, P. R.; Bradford, M. A.
2015-09-01
The global extent and distribution of forest trees is central to our understanding of the terrestrial biosphere. We provide the first spatially continuous map of forest tree density at a global scale. This map reveals that the global number of trees is approximately 3.04 trillion, an order of magnitude higher than the previous estimate. Of these trees, approximately 1.39 trillion exist in tropical and subtropical forests, with 0.74 trillion in boreal regions and 0.61 trillion in temperate regions. Biome-level trends in tree density demonstrate the importance of climate and topography in controlling local tree densities at finer scales, as well as the overwhelming effect of humans across most of the world. Based on our projected tree densities, we estimate that over 15 billion trees are cut down each year, and the global number of trees has fallen by approximately 46% since the start of human civilization.
No need for external orthogonality in subsystem density-functional theory.
Unsleber, Jan P; Neugebauer, Johannes; Jacob, Christoph R
2016-08-03
Recent reports on the necessity of using externally orthogonal orbitals in subsystem density-functional theory (SDFT) [Annu. Rep. Comput. Chem., 8, 2012, 53; J. Phys. Chem. A, 118, 2014, 9182] are re-investigated. We show that in the basis-set limit, supermolecular Kohn-Sham-DFT (KS-DFT) densities can exactly be represented as a sum of subsystem densities, even if the subsystem orbitals are not externally orthogonal. This is illustrated using both an analytical example and in basis-set free numerical calculations for an atomic test case. We further show that even with finite basis sets, SDFT calculations using accurate reconstructed potentials can closely approach the supermolecular KS-DFT density, and that the deviations between SDFT and KS-DFT decrease as the basis-set limit is approached. Our results demonstrate that formally, there is no need to enforce external orthogonality in SDFT, even though this might be a useful strategy when developing projection-based DFT embedding schemes.
QRS complex detection based on continuous density hidden Markov models using univariate observations
NASA Astrophysics Data System (ADS)
Sotelo, S.; Arenas, W.; Altuve, M.
2018-04-01
In the electrocardiogram (ECG), the detection of QRS complexes is a fundamental step in the ECG signal processing chain since it allows the determination of other characteristic waves of the ECG and provides information about heart rate variability. In this work, an automatic QRS complex detector based on continuous density hidden Markov models (HMMs) is proposed. HMMs were trained using univariate observation sequences taken either from QRS complexes or their derivatives. The detection approach is based on comparing the log-likelihood of the observation sequence with a fixed threshold. A sliding window was used to obtain the observation sequence to be evaluated by the model. The threshold was optimized using receiver operating characteristic curves. Sensitivity (Sen), specificity (Spc) and F1 score were used to evaluate the detection performance. The approach was validated using ECG recordings from the MIT-BIH Arrhythmia database. A 6-fold cross-validation shows that the best detection performance was achieved with a 2-state HMM trained on QRS-complex sequences (Sen = 0.668, Spc = 0.360 and F1 = 0.309). We conclude that these univariate sequences provide enough information to characterize the QRS complex dynamics with an HMM. Future work will be directed to the use of multivariate observations to increase detection performance.
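A minimal sketch of this detection scheme, assuming the third-party hmmlearn package, is shown below: a two-state Gaussian HMM is trained on (synthetic) QRS segments, and a sliding window is flagged when its log-likelihood exceeds a threshold. The window length, threshold and placeholder data are illustrative assumptions, not the paper's settings (the paper tunes the threshold via ROC analysis).

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumes the hmmlearn package is installed

# Train a 2-state continuous-density HMM on concatenated QRS-complex segments
# (univariate observations, as in the paper); the segments below are synthetic.
rng = np.random.default_rng(3)
qrs_segments = [rng.normal(size=(40, 1)) for _ in range(100)]   # placeholder beats
X_train = np.vstack(qrs_segments)
lengths = [len(s) for s in qrs_segments]

hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50, random_state=0)
hmm.fit(X_train, lengths)

def detect(ecg, win=40, threshold=-60.0):
    """Slide a window over the ECG and flag windows whose log-likelihood under
    the trained QRS model exceeds a fixed threshold."""
    hits = []
    for start in range(0, len(ecg) - win):
        if hmm.score(ecg[start:start + win].reshape(-1, 1)) > threshold:
            hits.append(start)
    return hits

print(len(detect(rng.normal(size=2000))))   # number of flagged window positions
```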
The Ecology and Acoustic Behavior of Minke Whales in the Hawaiian and other Pacific Islands
2012-09-30
the SECR density estimation methods (developed by project partners, Len Thomas, from St. Andrews, and Steve Martin from SPAWAR Systems San Diego... PROJECTS: Related projects were conducted by Len Thomas, Vincent Janik, and Steve Martin. These projects are using density estimates derived from... Martin, D.K. Mellinger, S. Jarvis, R.P. Morrissey, C. Ciminello, and N. DiMarzio, 2010. Spatially explicit capture recapture methods to estimate minke
Airfoil Design in Multivariable Calculus: Tying It All Together
ERIC Educational Resources Information Center
Laverty, Rich; Povich, Timothy; Williams, Tasha
2005-01-01
Near the conclusion of their final term in the calculus sequence at The United States Military Academy, cadets are given a week long group project. At the end of the week, the project is briefed to their instructors, classmates, and superior officers. From a teaching perspective, the goal is to encapsulate as much of the course as possible in one…
NASA Astrophysics Data System (ADS)
Hu, Mengli; Yang, Zhixiong; Zhou, Wenzhe; Li, Aolin; Pan, Jiangling; Ouyang, Fangping
2018-04-01
By using density functional theory (DFT) and the nonequilibrium Green's function (NEGF) method, field effect transistors (FETs) based on zigzag-shaped phosphorene nanoribbons (ZPNRs) are investigated. The FETs are constructed with bare-edged ZPNRs as electrodes and H-, Cl- or OH-adsorbed ZPNRs as the channel. It is found that FETs with the three kinds of channel show similar transport properties. The FET is p-type with a maximum current on/off ratio of 10⁴ and a minimum off-current of 1 nA. The working mode of the FETs depends on the parity of the channel length: it can be either enhancement mode or depletion mode, and the off-state current shows an even-odd oscillation. The current oscillations are interpreted with density of states (DOS) analysis and with evolution-operator and tight-binding Hamiltonian methods. The operating mechanism of the designed FETs is also presented with projected local density of states and band diagrams.
Ground penetrating radar evaluation of new pavement density.
DOT National Transportation Integrated Search
2015-02-01
The objective of this project was to map pavement surface density variations using dielectric measurements from ground penetrating radar (GPR). The work was carried out as part of an Asphalt Intelligent Compaction demonstration project on SR 539 ...
NASA Technical Reports Server (NTRS)
Otugen, M. Volkan; Popovic, Svetozar
1996-01-01
Ongoing research in Rayleigh scattering diagnostics for variable density low speed flow applications and for supersonic flow measurements are described. During the past several years, the focus has been on the development and use of a Nd:YAG-based Rayleigh scattering system with improved signal-to-noise characteristics and with applicability to complex, confined flows. This activity serves other research projects in the Aerodynamics Laboratory which require the non-contact, accurate, time-frozen measurement of gas density, pressure, and temperature (each separately), in a fairly wide dynamic range of each parameter. Recently, with the acquisition of a new seed-injected Nd:YAG laser, effort also has been directed to the development of a high-speed velocity probe based on a spectrally resolved Rayleigh scattering technique.
A climate-based multivariate extreme emulator of met-ocean-hydrological events for coastal flooding
NASA Astrophysics Data System (ADS)
Camus, Paula; Rueda, Ana; Mendez, Fernando J.; Tomas, Antonio; Del Jesus, Manuel; Losada, Iñigo J.
2015-04-01
Atmosphere-ocean general circulation models (AOGCMs) are useful to analyze large-scale climate variability (long-term historical periods, future climate projections). However, applications such as coastal flood modeling require climate information at finer scale. Moreover, flooding events depend on multiple climate conditions: waves, surge levels from the open ocean and river discharge caused by precipitation. Therefore, a multivariate statistical downscaling approach is adopted, both to reproduce the relationships between variables and because of its low computational cost. The proposed method can be considered a hybrid approach which combines a probabilistic weather-type downscaling model with a stochastic weather generator component. Predictand distributions are reproduced by modeling their relationship with AOGCM predictors based on a physical division into weather types (Camus et al., 2012). The multivariate dependence structure of the predictand (extreme events) is introduced by linking the independent marginal distributions of the variables through a probabilistic copula regression (Ben Alaya et al., 2014). This hybrid approach is applied for the downscaling of AOGCM data to daily precipitation, maximum significant wave height and storm surge at different locations along the Spanish coast. Reanalysis data are used to assess the proposed method. A common predictor for the three variables involved is classified using a regression-guided clustering algorithm. The most appropriate statistical model (generalized extreme value distribution, Pareto distribution) for daily conditions is fitted. Stochastic simulation of the present climate is performed, obtaining the set of hydraulic boundary conditions needed for high-resolution coastal flood modeling. References: Camus, P., Menéndez, M., Méndez, F.J., Izaguirre, C., Espejo, A., Cánovas, V., Pérez, J., Rueda, A., Losada, I.J., Medina, R. (2014b). A weather-type statistical downscaling framework for ocean wave climate. Journal of Geophysical Research, doi:10.1002/2014JC010141. Ben Alaya, M.A., Chebana, F., Ouarda, T.B.M.J. (2014). Probabilistic Gaussian Copula Regression Model for Multisite and Multivariable Downscaling. Journal of Climate, 27, 3331-3347.
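As a rough illustration of the two statistical ingredients named above, extreme-value marginals and a copula linking them, the sketch below fits GEV marginals to three synthetic predictands and couples them with an unconditional Gaussian copula. The copula regression on weather-type predictors used in the paper is not reproduced, and all data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical daily maxima of Hs (m), surge (m) and precipitation (mm),
# standing in for the downscaled predictands.
data = np.column_stack([
    stats.genextreme.rvs(-0.1, loc=3.0, scale=0.8, size=2000, random_state=1),
    stats.genextreme.rvs(0.0, loc=0.4, scale=0.1, size=2000, random_state=2),
    stats.genextreme.rvs(-0.2, loc=20.0, scale=8.0, size=2000, random_state=3)])

# 1) Fit a GEV marginal to each variable.
margins = [stats.genextreme.fit(col) for col in data.T]

# 2) Gaussian copula: transform to normal scores and estimate their correlation.
z = np.column_stack([stats.norm.ppf(stats.genextreme.cdf(col, *p))
                     for col, p in zip(data.T, margins)])
corr = np.corrcoef(z, rowvar=False)

# 3) Stochastic simulation: draw correlated normals, map back through the marginals.
sim_z = rng.multivariate_normal(np.zeros(3), corr, size=5000)
sim = np.column_stack([stats.genextreme.ppf(stats.norm.cdf(sim_z[:, i]), *margins[i])
                       for i in range(3)])
print(sim.mean(axis=0))   # simulated multivariate events with the fitted dependence
```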
Van Zeeland, M A; Boivin, R L; Brower, D L; Carlstrom, T N; Chavez, J A; Ding, W X; Feder, R; Johnson, D; Lin, L; O'Neill, R C; Watts, C
2013-04-01
One of the systems planned for the measurement of electron density in ITER is a multi-channel tangentially viewing combined interferometer-polarimeter (TIP). This work discusses the current status of the design, including a preliminary optical table layout, calibration options, error sources, and performance projections based on a CO2/CO laser system. In the current design, two-color interferometry is carried out at 10.59 μm and 5.42 μm and a separate polarimetry measurement of the plasma induced Faraday effect, utilizing the rotating wave technique, is made at 10.59 μm. The inclusion of polarimetry provides an independent measure of the electron density and can also be used to correct the conventional two-color interferometer for fringe skips at all densities, up to and beyond the Greenwald limit. The system features five chords with independent first mirrors to reduce risks associated with deposition, erosion, etc., and a common first wall hole to minimize penetration sizes. Simulations of performance for a projected ITER baseline discharge show the diagnostic will function as well as, or better than, comparable existing systems for feedback density control. Calculations also show that finite temperature effects will be significant in ITER even for moderate temperature plasmas and can lead to a significant underestimate of electron density. A secondary role TIP will fulfill is that of a density fluctuation diagnostic; using a toroidal Alfvén eigenmode as an example, simulations show TIP will be extremely robust in this capacity and potentially able to resolve coherent mode fluctuations with perturbed densities as low as δn/n ≈ 10⁻⁵.
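For illustration, the sketch below applies the textbook two-colour, vibration-compensated phase-to-density relation at the two wavelengths quoted above. The phase values are invented, and the finite-temperature correction discussed in the abstract is deliberately omitted.

```python
import numpy as np

R_E = 2.8179403262e-15   # classical electron radius (m)

def line_integrated_density(phi1, phi2, lam1=10.59e-6, lam2=5.42e-6):
    """Textbook two-colour vibration-compensated estimate of the line-integrated
    electron density (m^-2); phi_i are the measured phase shifts (rad).

    Model: phi_i = r_e * lam_i * N + (2*pi/lam_i) * dL   (plasma + vibration),
    so     N = (phi1*lam1 - phi2*lam2) / (r_e * (lam1**2 - lam2**2)).
    The ITER-specific finite-temperature correction is not included.
    """
    return (phi1 * lam1 - phi2 * lam2) / (R_E * (lam1**2 - lam2**2))

# Illustrative phases (rad) roughly consistent with a ~10^20 m^-3, ~2 m chord:
print(f"{line_integrated_density(phi1=6.0, phi2=3.1):.3e}  m^-2")
```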
Relationship between Team Performance and Joint Attention with Longitudinal Multivariate Mixed Models
2016-09-23
...as prior work has demonstrated that friendship can facilitate performance in decision-making and motor tasks (e.g., Shah & Jehn, 1993). However, a...
Marine species in ambient low-oxygen regions subject to double jeopardy impacts of climate change.
Stortini, Christine H; Chabot, Denis; Shackell, Nancy L
2017-06-01
We have learned much about the impacts of warming on the productivity and distribution of marine organisms, but less about the impact of warming combined with other environmental stressors, including oxygen depletion. Also, the combined impact of multiple environmental stressors requires evaluation at the scales most relevant to resource managers. We use the Gulf of St. Lawrence, Canada, characterized by a large permanently hypoxic zone, as a case study. Species distribution models were used to predict the impact of multiple scenarios of warming and oxygen depletion on the local density of three commercially and ecologically important species. Substantial changes are projected within 20-40 years. A depleted eurythermal species already limited to shallow, oxygen-rich refuge habitat (Atlantic cod) may be relatively uninfluenced by oxygen depletion but increase in density within refuge areas with warming. A more stenothermal, deep-dwelling species (Greenland halibut) is projected to lose ~55% of its high-density areas under the combined impacts of warming and oxygen depletion. Another deep-dwelling, more eurythermal species (Northern shrimp) would lose ~4% of its high-density areas due to oxygen depletion alone, but these impacts may be buffered by warming, which may increase density by 8% in less hypoxic areas, but decrease density by ~20% in the warmest parts of the region. Because of local climate variability and extreme events, and because our models cannot project changes in species sensitivity to hypoxia with warming, our results should be considered conservative. We present an approach to effectively evaluate the individual and cumulative impacts of multiple environmental stressors on a species-by-species basis at the scales most relevant to managers. Our study may provide a basis for work in other low-oxygen regions and should contribute to a growing literature base in climate science, which will continue to support resource managers as climate change accelerates. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Jeyaram, A.; Kesari, S.; Bajpai, A.; Bhunia, G. S.; Krishna Murthy, Y. V. N.
2012-07-01
Visceral Leishmaniasis (VL), commonly known as Kala-azar, is one of the most neglected tropical diseases, affecting approximately 200 million of the poorest populations at risk in 109 districts of three endemic countries, namely Bangladesh, India and Nepal, at different levels. This tropical disease is caused by the protozoan parasite Leishmania donovani and transmitted by female Phlebotomus argentipes sand flies. The analysis of disease dynamics indicates periodicity at seasonal and inter-annual temporal scales, which forms the basis for the development of an advanced early warning system. The highly endemic Vaishali district, Bihar, India has been taken as the study area for model development. A systematic study of geo-environmental parameters derived from satellite data in conjunction with ground intelligence enabled modelling of the infectious disease and risk villages. High-resolution Indian satellite data from IRS LISS IV (multi-spectral) and Cartosat-1 (Pan) have been used for studying environmental risk parameters, viz. peri-domestic vegetation, dwelling condition, wetland ecosystem, cropping pattern, Normalised Difference Vegetation Index (NDVI), detailed land use, etc., towards risk assessment. Univariate analysis of the relationship between vector density and various land cover categories and climatic variables suggested that all the variables are significantly correlated. Using the variables significantly correlated with vector density, a seasonal multivariate regression model has been developed incorporating geo-environmental parameters, climate variables and seasonal time-series disease parameters. Linear and non-linear models have been applied at seasonal and inter-annual temporal scales to predict man-hour density (MHD), and an 'out-of-fit' data set was used to validate the model with reasonable accuracy. To improve the MHD predictive approach, a fuzzy model has also been incorporated in a GIS environment, combining spatial geo-environmental and climatic variables using fuzzy membership logic. Based on the perceived importance of the geo-environmental parameters assigned by an epidemiology expert, a combined fuzzy membership has been calculated. The combined fuzzy membership indicates the predictive measure of vector density in each village. A γ factor has been introduced to have an increasing effect on the higher side and a decreasing effect on the lower side, which facilitated prioritisation of the villages. This approach not only predicts vector density but also prioritises the villages for effective control measures. A software package for modelling the risk villages, integrating multivariate regression and fuzzy membership analysis models, has been developed to estimate MHD (vector density) as part of the early warning system.
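The fuzzy combination with a γ factor described above corresponds to the standard fuzzy gamma operator used in GIS overlay analysis. A minimal sketch of that generic operator is given below; the membership values, variable set and γ = 0.9 are illustrative assumptions, not the study's calibrated inputs.

```python
import numpy as np

def fuzzy_gamma(memberships, gamma=0.9):
    """Generic fuzzy gamma operator for GIS overlay analysis: a compromise
    between the fuzzy algebraic product (AND-like) and sum (OR-like) terms."""
    mu = np.asarray(memberships, dtype=float)
    fuzzy_product = np.prod(mu, axis=-1)             # pessimistic term
    fuzzy_sum = 1.0 - np.prod(1.0 - mu, axis=-1)     # optimistic term
    return fuzzy_sum**gamma * fuzzy_product**(1.0 - gamma)

# Hypothetical village memberships for peri-domestic vegetation, dwelling
# condition, wetland proximity and NDVI-based cropping, each scaled to [0, 1]:
villages = np.array([[0.8, 0.6, 0.9, 0.7],
                     [0.2, 0.3, 0.1, 0.4]])
print(fuzzy_gamma(villages, gamma=0.9))   # higher value => higher predicted vector risk
```

A γ near 1 leans toward the optimistic (increasing) side and a γ near 0 toward the pessimistic (decreasing) side, which is the behaviour used above to prioritise villages.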
Analysis of interstellar cloud structure based on IRAS images
NASA Technical Reports Server (NTRS)
Scalo, John M.
1992-01-01
The goal of this project was to develop new tools for the analysis of the structure of densely sampled maps of interstellar star-forming regions. A particular emphasis was on the recognition and characterization of nested hierarchical structure and fractal irregularity, and their relation to the level of star formation activity. The panoramic IRAS images provided data with the required range in spatial scale, greater than a factor of 100, and in column density, greater than a factor of 50. In order to construct densely sampled column density maps of star-forming clouds, column density images of four nearby cloud complexes were constructed from IRAS data. The regions have various degrees of star formation activity, and most of them have probably not been affected much by the disruptive effects of young massive stars. The largest region, the Scorpius-Ophiuchus cloud complex, covers about 1000 square degrees (it was subdivided into a few smaller regions for analysis). Much of the work during the early part of the project focused on an 80 square degree region in the core of the Taurus complex, a well-studied region of low-mass star formation.
The prognostic value of tumor-infiltrating neutrophils in gastric adenocarcinoma after resection.
Zhao, Jing-jing; Pan, Ke; Wang, Wei; Chen, Ju-gao; Wu, Yan-heng; Lv, Lin; Li, Jian-jun; Chen, Yi-bing; Wang, Dan-dan; Pan, Qiu-zhong; Li, Xiao-dong; Xia, Jian-chuan
2012-01-01
Several pieces of evidence indicate that tumor-infiltrating neutrophils (TINs) are correlated with tumor progression. In the current study, we explore the relationship between TINs and clinicopathological features of gastric adenocarcinoma patients. Furthermore, we investigated the prognostic value of TINs. The study comprised two groups: a training group (115 patients) and a test group (97 patients). Biomarkers (intratumoral CD15+ neutrophils) were assessed by immunohistochemistry. The relationship between clinicopathological features and patient outcome was evaluated using Cox regression and Kaplan-Meier analysis. Immunohistochemical detection showed that the tumor-infiltrating neutrophils (TINs) in the training group ranged from 0.00 to 115.70 cells/high-power microscopic field (HPF) and the median number was 21.60 cells/HPF. Based on the median number, the patients were divided into high and low TINs groups. Chi-square test analysis revealed that the density of CD15+ TINs was positively associated with lymph node metastasis (p = 0.024), distant metastasis (p = 0.004) and UICC (International Union Against Cancer) staging (p = 0.028). Kaplan-Meier analysis showed that patients with a lower density of TINs had a better prognosis than patients with a higher density of TINs (p = 0.002). Multivariate Cox analysis showed that the density of CD15+ TINs was an independent prognostic factor for overall survival of gastric adenocarcinoma patients. Using another 97 patients as a test group and based on the median number of TINs (21.60 cells/HPF) derived from the training group, Kaplan-Meier analysis also showed that patients with a lower density of TINs had a better prognosis than patients with a higher density of TINs (p = 0.032). The results verify that the number of CD15+ TINs can predict the survival of gastric adenocarcinoma surgical patients. The presence of CD15+ TINs is an independent and unfavorable factor in the prognosis of gastric adenocarcinoma patients. Targeting CD15+ TINs may be a potential interventional therapy in the future.
Saito, Norio; Cordier, Stéphane; Lemoine, Pierric; Ohsawa, Takeo; Wada, Yoshiki; Grasset, Fabien; Cross, Jeffrey S; Ohashi, Naoki
2017-06-05
The electronic and crystal structures of Cs2[Mo6X14] (X = Cl, Br, I) cluster-based compounds were investigated by density functional theory (DFT) simulations and experimental methods such as powder X-ray diffraction, ultraviolet-visible spectroscopy, and X-ray photoemission spectroscopy (XPS). The experimentally determined lattice parameters were in good agreement with theoretically optimized ones, indicating the usefulness of DFT calculations for the structural investigation of these clusters. The calculated band gaps of these compounds reproduced those experimentally determined by UV-vis reflectance within an error of a few tenths of an eV. Core-level XPS and effective charge analyses indicated bonding states of the halogens changed according to their sites. The XPS valence spectra were fairly well reproduced by simulations based on the projected electron density of states weighted with cross sections of Al Kα, suggesting that DFT calculations can predict the electronic properties of metal-cluster-based crystals with good accuracy.
2012-01-01
Background Chronic low back pain (CLBP) experienced in middle age may have important implications for vertebral bone health, although this issue has not been investigated as a primary aim previously. This study investigated the associations between CLBP and dual energy X-ray absorptiometry (DXA)-derived vertebral bone mineral measures acquired from postero-anterior and lateral projections, among community-dwelling, middle-aged adults. Methods Twenty-nine adults with CLBP (11 male, 18 female) and 42 adults with no history of LBP in the preceding year (17 male, 25 female) were evaluated. Self-reported demographic and clinical data were collected via questionnaires. Areal bone mineral density (aBMD) was measured in the lumbar spine by DXA. Apparent volumetric (ap.v) BMD in the lumbar spine was also calculated. Multiple linear regression models were used to examine associations between study group (CLBP and control) and vertebral DXA variables by gender, adjusting for height, mass and age. Results There was no difference between groups by gender in anthropometrics or clinical characteristics. In the CLBP group, the mean (SD) duration of CLBP was 13.3 (10.4) years in males and 11.6 (9.9) years in females, with Oswestry Disability Index scores of 16.2 (8.7)% and 15.4 (9.1)%, respectively. Males with CLBP had significantly lower adjusted lateral-projection aBMD and lateral-projection ap.vBMD than controls at L3, with mean differences (standard error) of 0.09 (0.04) g/cm² (p = 0.03) and 0.02 (0.01) g/cm³ (p = 0.04). These multivariate models accounted for 55% and 53% of the variance in lateral-projection L3 aBMD and lateral-projection L3 ap.vBMD. Conclusions CLBP in males is associated with some lumbar vertebral BMD measures, raising important questions about the mechanism and potential clinical impact of this association. PMID:22458361
Flinn, Paul W; Hagstrum, David W; Reed, Carl; Phillips, Tom W
2003-01-01
The USDA Agricultural Research Service (ARS) funded a demonstration project (1998-2002) for areawide IPM for stored wheat in Kansas and Oklahoma. This project was a collaboration of researchers at the ARS Grain Marketing and Production Research Center in Manhattan, Kansas, Kansas State University, and Oklahoma State University. The project utilized two elevator networks, one in each state, for a total of 28 grain elevators. These elevators stored approximately 31 million bushels of wheat, which is approximately 1.2% of the annual national production. Stored wheat was followed as it moved from farm to the country elevator and finally to the terminal elevator. During this study, thousands of grain samples were taken in concrete elevator silos. Wheat stored at elevators was frequently infested by several insect species, which sometimes reached high numbers and damaged the grain. Fumigation using aluminum phosphide pellets was the main method for managing these insect pests in elevators in the USA. Fumigation decisions tended to be based on past experience with controlling stored-grain insects, or were calendar based. Integrated pest management (IPM) requires sampling and risk benefit analysis. We found that the best sampling method for estimating insect density, without turning the grain from one bin to another, was the vacuum probe sampler. Decision support software, Stored Grain Advisor Pro (SGA Pro) was developed that interprets insect sampling data, and provides grain managers with a risk analysis report detailing which bins are at low, moderate or high risk for insect-caused economic losses. Insect density was predicted up to three months in the future based on current insect density, grain temperature and moisture. Because sampling costs money, there is a trade-off between frequency of sampling and the cost of fumigation. The insect growth model in SGA Pro reduces the need to sample as often, thereby making the program more cost-effective. SGA Pro was validated during the final year of the areawide program. Based on data from 533 bins, SGA Pro accurately predicted which bins were at low, moderate or high risk. Only in two out of 533 bins did SGA Pro incorrectly predict bins as being low risk and, in both cases, insect density was only high (> two insects kg⁻¹) at the surface, which suggested recent immigration. SGA Pro is superior to calendar-based management because it ensures that grain is only treated when insect densities exceed economic thresholds (two insects kg⁻¹). This approach will reduce the frequency of fumigation while maintaining high grain quality. Minimizing the use of fumigant improves worker safety and reduces both control costs and harm to the environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Beran; John Christenson; Dragos Nica
2002-12-15
The goal of the project is to enable plant operators to detect with high sensitivity and reliability the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by ANL. The extension was implemented during the second phase of the project and fully achieved the project goal.
Woven Thermal Protection System Based Heat-shield for Extreme Entry Environments Technology (HEEET)
NASA Technical Reports Server (NTRS)
Ellerby, Donald; Venkatapathy, Ethiraj; Stackpoole, Margaret; Chinnapongse, Ronald; Munk, Michelle; Dillman, Robert; Feldman, Jay; Prabhu, Dinesh; Beerman, Adam
2013-01-01
NASA's future robotic missions utilizing an entry system into Venus and the outer planets, namely Saturn, Uranus, and Neptune, result in extremely high entry conditions that exceed the capabilities of state-of-the-art low- to mid-density ablators such as PICA or Avcoat. Therefore mission planners typically assume the use of a fully dense carbon phenolic (CP) heat shield similar to what was flown on Pioneer Venus and Galileo. Carbon phenolic is a robust TPS material; however, its high density and relatively high thermal conductivity constrain mission planners to steep entries, with high heat fluxes and pressures and short entry durations, in order for CP to be feasible from a mass perspective. The high entry conditions pose challenges for certification in existing ground-based test facilities, and the longer-term sustainability of CP will continue to pose challenges. In 2012 the Game Changing Development Program (GCDP) in NASA's Space Technology Mission Directorate funded NASA ARC to investigate the feasibility of a Woven Thermal Protection System (WTPS) to meet the needs of NASA's most challenging entry missions. This project was highly successful, demonstrating that a Woven TPS solution compares favorably to CP in performance in simulated reentry environments and provides the opportunity to manufacture graded materials that should result in overall reduced-mass solutions and enable a much broader set of missions than does CP. Building on the success of the WTPS project, GCDP has funded a follow-on project to further mature and scale up the WTPS concept for insertion into future NASA robotic missions. The matured WTPS will address the CP concerns associated with ground-based test limitations and sustainability. This presentation will briefly discuss results from the WTPS Project and the plans for WTPS maturation into a heat shield for extreme entry environments.
Aneurysm permeability following coil embolization: packing density and coil distribution.
Chueh, Ju-Yu; Vedantham, Srinivasan; Wakhloo, Ajay K; Carniato, Sarena L; Puri, Ajit S; Bzura, Conrad; Coffin, Spencer; Bogdanov, Alexei A; Gounis, Matthew J
2015-09-01
Rates of durable aneurysm occlusion following coil embolization vary widely, and a better understanding of coil mass mechanics is desired. The goal of this study is to evaluate the impact of packing density and coil uniformity on aneurysm permeability. Aneurysm models were coiled using either Guglielmi detachable coils or Target coils. The permeability was assessed by taking the ratio of microspheres passing through the coil mass to those in the working fluid. Aneurysms containing coil masses were sectioned for image analysis to determine surface area fraction and coil uniformity. All aneurysms were coiled to a packing density of at least 27%. Packing density, surface area fraction of the dome and neck, and uniformity of the dome were significantly correlated (p<0.05). Hence, multivariate principal components-based partial least squares regression models were used to predict permeability. Similar loading vectors were obtained for packing and uniformity measures. Coil mass permeability was modeled better with the inclusion of packing and uniformity measures of the dome (r² = 0.73) than with packing density alone (r² = 0.45). The analysis indicates the importance of including a uniformity measure for coil distribution in the dome along with packing measures. A densely packed aneurysm with a high degree of coil mass uniformity will reduce permeability. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
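A minimal sketch of a partial least squares regression of permeability on packing and uniformity measures, assuming scikit-learn's PLSRegression and synthetic data in place of the study's measurements, is shown below.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# Hypothetical coil-mass descriptors: packing density, neck and dome surface-area
# fractions, and a dome uniformity index, with a synthetic permeability response.
X = rng.uniform(size=(60, 4))
permeability = 0.8 - 0.5 * X[:, 0] - 0.3 * X[:, 3] + rng.normal(scale=0.05, size=60)

pls = PLSRegression(n_components=2)    # principal-components-based partial least squares
r2 = cross_val_score(pls, X, permeability, cv=5, scoring="r2").mean()
print(f"cross-validated R^2 = {r2:.2f}")
```

Projecting correlated predictors onto a few latent components is what makes PLS suitable here, since packing density and the uniformity measures are strongly correlated with one another.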
Bone mineral density across a range of physical activity volumes: NHANES 2007-2010.
Whitfield, Geoffrey P; Kohrt, Wendy M; Pettee Gabriel, Kelley K; Rahbar, Mohammad H; Kohl, Harold W
2015-02-01
The association between aerobic physical activity volume and bone mineral density (BMD) is not completely understood. The purpose of this study was to clarify the association between BMD and aerobic activity across a broad range of activity volumes, particularly volumes between those recommended in the 2008 Physical Activity Guidelines for Americans and those of trained endurance athletes. Data from the 2007-2010 National Health and Nutrition Examination Survey were used to quantify the association between reported physical activity and BMD at the lumbar spine and proximal femur across the entire range of activity volumes reported by US adults. Participants were categorized into multiples of the minimum guideline-recommended volume based on reported moderate- and vigorous-intensity leisure activity. Lumbar and proximal femur BMD were assessed with dual-energy x-ray absorptiometry. Among women, multivariable-adjusted linear regression analyses revealed no significant differences in lumbar BMD across activity categories, whereas proximal femur BMD was significantly higher among those who exceeded the guidelines by 2-4 times than those who reported no activity. Among men, multivariable-adjusted BMD at both sites neared its highest values among those who exceeded the guidelines by at least 4 times and was not progressively higher with additional activity. Logistic regression estimating the odds of low BMD generally echoed the linear regression results. The association between physical activity volume and BMD is complex. Among women, exceeding guidelines by 2-4 times may be important for maximizing BMD at the proximal femur, whereas among men, exceeding guidelines by ≥4 times may be beneficial for lumbar and proximal femur BMD.
A PDF projection method: A pressure algorithm for stand-alone transported PDFs
NASA Astrophysics Data System (ADS)
Ghorbani, Asghar; Steinhilber, Gerd; Markus, Detlev; Maas, Ulrich
2015-03-01
In this paper, a new formulation of the projection approach is introduced for stand-alone probability density function (PDF) methods. The method is suitable for applications in low-Mach number transient turbulent reacting flows. The method is based on a fractional step method in which first the advection-diffusion-reaction equations are modelled and solved within a particle-based PDF method to predict an intermediate velocity field. Then the mean velocity field is projected onto a space where the continuity for the mean velocity is satisfied. In this approach, a Poisson equation is solved on the Eulerian grid to obtain the mean pressure field. Then the mean pressure is interpolated at the location of each stochastic Lagrangian particle. The formulation of the Poisson equation avoids the time derivatives of the density (due to convection) as well as second-order spatial derivatives. This in turn eliminates the major sources of instability in the presence of stochastic noise that are inherent in particle-based PDF methods. The convergence of the algorithm (in the non-turbulent case) is investigated first by the method of manufactured solutions. Then the algorithm is applied to a one-dimensional turbulent premixed flame in order to assess the accuracy and convergence of the method in the case of turbulent combustion. As a part of this work, we also apply the algorithm to a more realistic flow, namely a transient turbulent reacting jet, in order to assess the performance of the method.
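For illustration, the sketch below shows a pressure-projection step in its simplest setting: a constant-density, fully periodic, FFT-based solve of the pressure Poisson equation followed by the velocity correction. It is only a toy analogue; the paper's scheme is variable-density, uses B-spline bases in inhomogeneous directions, and operates on Lagrangian particle data.

```python
import numpy as np

def project(u, v, dx, dt, rho=1.0):
    """Toy constant-density, fully periodic pressure-projection step: solve
    lap(p) = (rho/dt) * div(u*) spectrally, then correct u* so the discrete
    divergence vanishes (a simplified analogue of the scheme described above)."""
    n = u.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    div_hat = 1j * kx * np.fft.fft2(u) + 1j * ky * np.fft.fft2(v)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                                   # avoid 0/0 for the mean mode
    p_hat = -(rho / dt) * div_hat / k2
    p_hat[0, 0] = 0.0                                # pressure defined up to a constant
    u_new = u - (dt / rho) * np.real(np.fft.ifft2(1j * kx * p_hat))
    v_new = v - (dt / rho) * np.real(np.fft.ifft2(1j * ky * p_hat))
    return u_new, v_new

def divergence(u, v, dx):
    n = u.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    return np.real(np.fft.ifft2(1j * kx * np.fft.fft2(u) + 1j * ky * np.fft.fft2(v)))

rng = np.random.default_rng(6)
u, v = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
u2, v2 = project(u, v, dx=1.0 / 64, dt=1e-3)
print(np.abs(divergence(u2, v2, 1.0 / 64)).max())    # ~ machine precision after projection
```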
360-degrees profilometry using strip-light projection coupled to Fourier phase-demodulation.
Servin, Manuel; Padilla, Moises; Garnica, Guillermo
2016-01-11
360-degree (360°) digitization of three-dimensional (3D) solids using a projected light-strip is a well-established technique in academic and commercial profilometers. These profilometers project a light-strip over the solid being digitized while the solid is rotated through a full revolution (360°). A computer program then typically extracts the centroid of this light-strip, and the shape of the solid is obtained by triangulation. Here, instead of using intensity-based light-strip centroid estimation, we propose to use Fourier phase-demodulation for 360° solid digitization. The advantage of Fourier demodulation over strip-centroid estimation is that the accuracy of phase demodulation increases linearly with the fringe density, whereas the centroid-estimation errors of the strip-light approach are independent of it. We propose first to construct a carrier-frequency fringe pattern by closely adding the individual light-strip images recorded while the solid is rotated. Next, this high-density fringe pattern is phase-demodulated using the standard Fourier technique. To test the feasibility of this Fourier demodulation approach, we digitized two solids of increasing topographic complexity: a Rubik's cube and a plastic model of a human skull. According to our results, phase demodulation based on the Fourier technique is less noisy than triangulation based on light-strip centroid estimation. Moreover, Fourier demodulation also provides the amplitude of the analytic signal, which is valuable information for visualizing surface details.
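Editor's note: as a rough illustration of the Fourier (Takeda-style) demodulation step described above, the NumPy sketch below recovers a modulating phase from a single row of a carrier-frequency fringe signal. The carrier frequency f0 and the band-pass limits are illustrative choices, not values from the paper.

```python
import numpy as np

def fourier_demodulate(fringes, f0):
    """Recover the modulating phase of a 1D carrier-frequency fringe signal
    (Takeda-style Fourier demodulation; single-row sketch)."""
    n = fringes.size
    spectrum = np.fft.fft(fringes)
    freqs = np.fft.fftfreq(n)                       # cycles per sample
    band = (freqs > f0 / 2) & (freqs < 3 * f0 / 2)  # keep the positive carrier lobe
    analytic = np.fft.ifft(np.where(band, spectrum, 0.0))
    baseband = analytic * np.exp(-2j * np.pi * f0 * np.arange(n))  # remove carrier
    return np.unwrap(np.angle(baseband)), np.abs(analytic)

# toy check: a known smooth phase is recovered up to a constant offset
x = np.arange(1024)
true_phase = 1e-5 * (x - 512) ** 2
pattern = 128 + 100 * np.cos(2 * np.pi * 0.1 * x + true_phase)
est_phase, amplitude = fourier_demodulate(pattern, f0=0.1)
```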
NASA Astrophysics Data System (ADS)
KIM, Y.; Lim, Y. J.; Kim, Y. H.; Kim, B. J.
2015-12-01
The impacts of climate change on wind speed, wind energy density (WED), and potential electric production (PEP) over the Korean peninsula have been investigated using ensemble projection data from five regional climate models (HadGEM3-RA, RegCM, WRF, GRIMs and MM5). HadGEM2-AO data for two RCP scenarios (RCP4.5/8.5) provided the initial and boundary conditions for all RCMs. Wind energy density and its annual and seasonal variability were estimated from monthly near-surface wind speeds, and the potential electric production and its change were also analyzed. A comparison of the ensemble annual mean wind speed for the 25-yr historical period (1981-2005) with ERA-Interim shows that all RCMs overestimate near-surface wind speed relative to the reanalysis, with HadGEM3-RA the most comparable. Changes in the annual and seasonal means of WED and PEP for the historical period, and comparisons with the future projection period (2021-2050), will be presented in this poster session. We also examine changes in mean sea level pressure and its gradient in the driving GCM/RCMs as factors inducing the variations. Our results can serve as background data for planning the development and operation of wind farms over the Korean Peninsula.
NASA Technical Reports Server (NTRS)
Bose, Deepak; Wright, Henry
2015-01-01
Aerothermal & TPS: a) Determine Forebody Aerothermal Heating. b) Determine In-depth TPS Temperature. c) Determine Backshell Aerothermal Environment. Aerodynamics and Atmosphere: a) Reconstruct Atmospheric Density, Winds, and Wind-Relative Attitude. b) Determine Hypersonic & Supersonic Aerodynamics Forces. c) Base Pressure Contribution to Drag.
ERIC Educational Resources Information Center
Black, Sally; Washington, Ericka
2008-01-01
The Olweus Bullying Prevention Program (BPP) is an internationally recognized school-based bullying prevention program. This project sought to evaluate pilot implementation of the program in one urban district using fidelity of implementation, bullying incident density (BID), student surveys, and serious incident reports as process and outcome…
Education toward a More Economic Life Style.
ERIC Educational Resources Information Center
Paynton, Naomi
1979-01-01
A community-based project was carried out in two low-income, high density areas in Tel Aviv, Israel, to help mothers of large families gain more from the existing family budget. Areas covered included clothing, nutrition, household maintenance, and savings and insurance. Behavioral changes were greatest in the area of clothing. (Author/JH)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agustsson, Ronald
In this project, RadiaBeam Technologies was tasked with developing a novel solution for a cost effective quench protection based on fast expansion of the normal zone. By inductively coupling a strong electromagnetic pulse via a resonant LC circuit, we attempted to demonstrate accelerated normal zone propagation. The AC field induces currents in the superconducting layer with the current density exceeding that of the critical current density, Jc. This creates a large normal zone, uniformly distributing the dissipation through the magnet body. The method does not rely on thermal heating of the conductor, thus enabling nearly instantaneous protection. Through the course of the Phase II project, RadiaBeam Technologies continued extensive numerical modeling of the inductive quench system, re-designed and built several iterations of the POC system for testing and observed evidence of a transient partial quench being induced. However the final device was not fabricated. This was a consequence of the fundamentally complex nature of the energy extraction process and the challenges associated even with demonstrating the proof of concept in a bench top device.
Pompa-García, Marín; Venegas-González, Alejandro
2016-01-01
Forest ecosystems play an important role in the global carbon cycle. Therefore, understanding the dynamics of carbon uptake in forest ecosystems is much needed. Pinus cooperi is a widely distributed species in the Sierra Madre Occidental in northern Mexico and future climatic variations could impact these ecosystems. Here, we analyze the variations of trunk carbon in two populations of P. cooperi situated at different elevational gradients, combining dendrochronological techniques and allometry. Carbon sequestration (50% biomass) was estimated from a specific allometric equation for this species based on: (i) variation of intra-annual wood density and (ii) diameter reconstruction. The results show that the population at a higher elevation had greater wood density, basal area, and hence, carbon accumulation. This finding can be explained by an ecological response of trees to adverse weather conditions, which would cause a change in the cellular structure affecting the within-ring wood density profile. The influence of variations in climate on the maximum density of chronologies showed a positive correlation with precipitation and the Multivariate El Niño Southern Oscillation Index during the winter season, and a negative correlation with maximum temperature during the spring season. Monitoring previous conditions to growth is crucial due to the increased vulnerability to extreme climatic variations on higher elevational sites. We concluded that temporal variability of wood density contributes to a better understanding of environmental historical changes and forest carbon dynamics in Northern Mexico, representing a significant improvement over previous studies on carbon sequestration. Assuming a uniform density according to tree age is incorrect, so this method can be used for environmental mitigation strategies, such as for managing P. cooperi, a dominant species of great ecological amplitude and widely used in forest industries. PMID:27272519
Jang, Nuri; Kwon, Hee Jung; Park, Min Hui; Kang, Su Hwan; Bae, Young Kyung
2018-04-01
This study investigated the prognostic value of tumor-infiltrating lymphocyte (TIL) density as determined by molecular subtype and receipt of adjuvant chemotherapy in invasive breast cancer (IBC). Stromal TIL densities were evaluated in 1489 IBC samples using recommendations proposed by the International TILs Working Group. Cases were allocated to high- and low-TIL density groups using a cutoff of 10%. Of the 1489 IBC patients, 427 (28.7%) were assigned to the high-TIL group and 1062 (71.3%) to the low-TIL group. High TIL density was found to be significantly associated with large tumor size (p = 0.001), high histologic grade (p < 0.001), and high Ki-67 labeling index (p < 0.001). Triple-negative and human epidermal growth factor receptor 2 (HER2)-positive subtypes had significantly higher TIL densities than luminal A or B (HER2-negative) subtypes (p < 0.001). High TIL density was significantly associated with prolonged disease-free survival (DFS) by univariate (p < 0.001) and multivariate (p < 0.001) analyses. In the low-TIL-density group, the patients who did not receive adjuvant chemotherapy showed better DFS (p < 0.001), but no such survival difference was observed in the high-TIL group (p = 0.222). For the patients who received adjuvant anthracycline, high-TIL density was found to be an independent prognostic factor of favorable DFS in the luminal B (HER2-negative; p = 0.003), HER2-positive (p = 0.019), and triple-negative (p = 0.017) subtypes. Measurements of TIL density in routine clinical practice could give useful prognostic information for the triple-negative, HER2-positive, and luminal B (HER2-negative) IBC subtypes, especially for patients administered adjuvant anthracycline.
NASA Astrophysics Data System (ADS)
Gastounioti, Aimilia; Hsieh, Meng-Kang; Pantalone, Lauren; Conant, Emily F.; Kontos, Despina
2018-03-01
Mammographic density is an established risk factor for breast cancer. However, area-based density (ABD) measured in 2D mammograms reflects the projection, rather than the actual volume, of dense tissue, which may be an important limitation. With the increasing utilization of digital breast tomosynthesis (DBT) in screening, there is an opportunity to routinely estimate volumetric breast density (VBD). In this study, we investigate associations between DBT-VBD and ABD extracted from standard-dose mammography (DM) and from synthetic 2D digital mammography (sDM), which is increasingly replacing DM. We retrospectively analyzed bilateral imaging data from a random sample of 1000 women, acquired over a transitional period at our institution when all women had DBT, sDM and DM acquired as part of their routine breast screening. For each exam, ABD was measured in DM and sDM images with the publicly available "LIBRA" software, while DBT-VBD was measured using a previously validated, fully automated computer algorithm. Spearman correlation (r) was used to compare VBD to ABD measurements. For each density measure, we also estimated the within-woman intraclass correlation (ICC), and finally, to compare with clinical assessments, we performed analysis of variance (ANOVA) to evaluate how each measure varied across the assigned clinical BI-RADS breast density categories. DBT-VBD was moderately correlated with ABD from DM (r=0.70) and sDM (r=0.66). All density measures had strong bilateral symmetry (ICC = 0.85-0.95) but differed significantly across BI-RADS density categories (ANOVA, p<0.001). Our results contribute to further elaborating the clinical implications of breast density measures estimated with DBT, which may better capture the volumetric amount of dense tissue within the breast than area-based measures and visual assessment.
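Editor's note: the two statistics named above are simple to compute; the sketch below shows a Spearman correlation between volumetric and area-based measures (via SciPy) and a one-way random-effects ICC for left/right bilateral agreement. The ICC variant and the variable names (vbd, abd_dm, left, right) are illustrative assumptions; the abstract does not state the exact ICC form used.

```python
import numpy as np
from scipy import stats

def bilateral_icc(left, right):
    """One-way random-effects ICC(1,1) for two measurements per woman (sketch)."""
    data = np.column_stack([left, right])
    n, k = data.shape
    ms_between = k * np.var(data.mean(axis=1), ddof=1)
    ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# correlation between volumetric and area-based density (hypothetical arrays):
# rho, p_value = stats.spearmanr(vbd, abd_dm)
```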
Automated Processing of ISIS Topside Ionograms into Electron Density Profiles
NASA Technical Reports Server (NTRS)
Reinisch, bodo W.; Huang, Xueqin; Bilitza, Dieter; Hills, H. Kent
2004-01-01
Modeling of the topside ionosphere has for the most part relied on just a few years of data from topside sounder satellites. The widely used Bent et al. (1972) model, for example, is based on only 50,000 Alouette 1 profiles. The International Reference Ionosphere (IRI) (Bilitza, 1990, 2001) uses an analytical description of the graphs and tables provided by Bent et al. (1972). The Alouette 1, 2 and ISIS 1, 2 topside sounder satellites of the sixties and seventies were ahead of their times in terms of the sheer volume of data obtained and in terms of the computer and software requirements for data analysis. As a result, only a small percentage of the collected topside ionograms was converted into electron density profiles. Recently, a NASA-funded data restoration project has undertaken and is continuing the process of digitizing the Alouette/ISIS ionograms from the analog 7-track tapes. Our project involves the automated processing of these digital ionograms into electron density profiles. The project accomplished a set of important goals that will have a major impact on understanding and modeling of the topside ionosphere: (1) The TOPside Ionogram Scaling and True height inversion (TOPIST) software was developed for the automated scaling and inversion of topside ionograms. (2) The TOPIST software was applied to the over 300,000 ISIS-2 topside ionograms that had been digitized in the framework of a separate AISRP project (PI: R.F. Benson). (3) The new TOPIST-produced database of global electron density profiles for the topside ionosphere was made publicly available through NASA's National Space Science Data Center (NSSDC) ftp archive at
Low-enriched uranium high-density target project. Compendium report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vandegrift, George; Brown, M. Alex; Jerden, James L.
2016-09-01
At present, most 99Mo is produced in research, test, or isotope production reactors by irradiation of highly enriched uranium targets. To achieve the denser form of uranium needed for switching from high to low enriched uranium (LEU), targets in the form of a metal foil (~125-150 µm thick) are being developed. The LEU High Density Target Project successfully demonstrated several iterations of an LEU-fission-based Mo-99 technology that has the potential to provide the world’s supply of Mo-99, should major producers choose to utilize the technology. Over 50 annular high density targets have been successfully tested, and the assembly and disassembly of targets have been improved and optimized. Two target front-end processes (acidic and electrochemical) have been scaled up and demonstrated to allow for the high-density target technology to mate up to the existing producer technology for target processing. In the event that a new target processing line is started, the chemical processing of the targets is greatly simplified. Extensive modeling and safety analysis has been conducted, and the target has been qualified to be inserted into the High Flux Isotope Reactor, which is considered above and beyond the requirements for the typical use of this target due to high fluence and irradiation duration.
The Grand Tour via Geodesic Interpolation of 2-frames
NASA Technical Reports Server (NTRS)
Asimov, Daniel; Buja, Andreas
1994-01-01
Grand tours are a class of methods for visualizing multivariate data, or any finite set of points in n-space. The idea is to create an animation of data projections by moving a 2-dimensional projection plane through n-space. The path of planes used in the animation is chosen so that it becomes dense, that is, it comes arbitrarily close to any plane. One of the original inspirations for the grand tour was the experience of trying to comprehend an abstract sculpture in a museum. One tends to walk around the sculpture, viewing it from many different angles. A useful class of grand tours is based on the idea of continuously interpolating an infinite sequence of randomly chosen planes. Visiting randomly (more precisely: uniformly) distributed planes guarantees denseness of the interpolating path. In computer implementations, 2-dimensional orthogonal projections are specified by two 1-dimensional projections which map to the horizontal and vertical screen dimensions, respectively. Hence, a grand tour is specified by a path of pairs of orthonormal projection vectors. This paper describes an interpolation scheme for smoothly connecting two pairs of orthonormal vectors, and thus for constructing interpolating grand tours. The scheme is optimal in the sense that connecting paths are geodesics in a natural Riemannian geometry.
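Editor's note: a minimal version of the plane-to-plane part of such an interpolation can be written with the principal-angle (SVD) construction, as sketched below in NumPy; it moves along a Grassmannian geodesic between the spans of two orthonormal n x 2 frames. The paper's full scheme also controls the within-plane rotation of the frame, which this sketch ignores.

```python
import numpy as np

def frame_geodesic(F0, F1, t):
    """Interpolate between the planes spanned by two orthonormal n x 2 frames
    along a Grassmannian geodesic (principal-angle construction; sketch)."""
    U, cosines, Vt = np.linalg.svd(F0.T @ F1)
    cosines = np.clip(cosines, -1.0, 1.0)
    theta = np.arccos(cosines)                      # principal angles
    P0, P1 = F0 @ U, F1 @ Vt.T                      # paired principal vectors
    G = np.zeros_like(P0)
    ok = theta > 1e-12
    G[:, ok] = (P1[:, ok] - P0[:, ok] * cosines[ok]) / np.sin(theta[ok])
    return P0 * np.cos(t * theta) + G * np.sin(t * theta)

# toy usage: the halfway frame between two random planes in R^6 stays orthonormal
rng = np.random.default_rng(0)
F0, _ = np.linalg.qr(rng.normal(size=(6, 2)))
F1, _ = np.linalg.qr(rng.normal(size=(6, 2)))
F_half = frame_geodesic(F0, F1, 0.5)
assert np.allclose(F_half.T @ F_half, np.eye(2), atol=1e-10)
```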
A projection operator method for the analysis of magnetic neutron form factors
NASA Astrophysics Data System (ADS)
Kaprzyk, S.; Van Laar, B.; Maniawski, F.
1981-03-01
A set of projection operators in matrix form has been derived on the basis of decomposition of the spin density into a series of fully symmetrized cubic harmonics. This set of projection operators allows a formulation of the Fourier analysis of magnetic form factors in a convenient way. The presented method is capable of checking the validity of various theoretical models used for spin density analysis up to now. The general formalism is worked out in explicit form for the fcc and bcc structures and deals with that part of spin density which is contained within the sphere inscribed in the Wigner-Seitz cell. This projection operator method has been tested on the magnetic form factors of nickel and iron.
NASA Astrophysics Data System (ADS)
Masud, M. B.; Khaliq, M. N.; Wheater, H. S.
2017-04-01
This study assesses projected changes to drought characteristics in Alberta, Saskatchewan and Manitoba, the prairie provinces of Canada, using a multi-regional climate model (RCM) ensemble available through the North American Regional Climate Change Assessment Program. Simulations considered include those performed with six RCMs driven by National Center for Environmental Prediction reanalysis II for the 1981-2003 period and those driven by four Atmosphere-Ocean General Circulation Models for the 1970-1999 and 2041-2070 periods (i.e. eleven current and the same number of corresponding future period simulations). Drought characteristics are extracted using two drought indices, namely the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI). Regional frequency analysis is used to project changes to selected 20- and 50-year regional return levels of drought characteristics for fifteen homogeneous regions, covering the study area. In addition, multivariate analyses of drought characteristics, derived on the basis of 6-month SPI and SPEI values, are developed using the copula approach for each region. Analysis of multi-RCM ensemble-averaged projected changes to mean and selected return levels of drought characteristics show increases over the southern and south-western parts of the study area. Based on bi- and trivariate joint occurrence probabilities of drought characteristics, the southern regions along with the central regions are found highly drought vulnerable, followed by the southwestern and southeastern regions. Compared to the SPI-based analysis, the results based on SPEI suggest drier conditions over many regions in the future, indicating potential effects of rising temperatures on drought risks. These projections will be useful in the development of appropriate adaptation strategies for the water and agricultural sectors, which play an important role in the economy of the study area.
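Editor's note: as context for the drought indices mentioned above, a common way to compute the SPI for one station and accumulation period is sketched below with SciPy: fit a gamma distribution to the non-zero precipitation totals, mix in the zero-precipitation probability, and map the resulting cumulative probabilities to standard normal quantiles. This is a generic SPI sketch, not the study's exact implementation; SPEI is computed analogously from the climatic water balance.

```python
import numpy as np
from scipy import stats

def spi(precip_totals):
    """Standardized Precipitation Index for one accumulation series (sketch).

    Fits a gamma distribution to the non-zero totals, mixes in the probability
    of zero precipitation, and maps cumulative probabilities to standard
    normal quantiles."""
    p = np.asarray(precip_totals, dtype=float)
    q_zero = (p == 0).mean()
    shape, _, scale = stats.gamma.fit(p[p > 0], floc=0)
    cdf = q_zero + (1 - q_zero) * stats.gamma.cdf(p, shape, loc=0, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))
```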
Density-functional theory based on the electron distribution on the energy coordinate
NASA Astrophysics Data System (ADS)
Takahashi, Hideaki
2018-03-01
We developed an electronic density functional theory utilizing a novel electron distribution n(ɛ) as a basic variable to compute the ground state energy of a system. n(ɛ) is obtained by projecting the electron density n(r) defined on the space coordinate r onto the energy coordinate ɛ specified with the external potential υ_ext(r) of interest. It was demonstrated that the Kohn-Sham equation can also be formulated with the exchange-correlation functional E_xc[n(ɛ)] that employs the density n(ɛ) as an argument. It turned out that an exchange functional proposed in our preliminary development suffices to describe properly the potential energies of several types of chemical bonds, with accuracies comparable to the corresponding functional based on the local density approximation. As a remarkable feature, the distribution n(ɛ) inherently involves the spatially non-local information of the exchange hole at the bond dissociation limit, in contrast to conventional approximate functionals. By taking advantage of this property, we also developed a prototype of the static correlation functional E_sc including no empirical parameters, which showed marked improvements in describing the dissociation of covalent bonds in H2, C2H4, and CH4 molecules.
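Editor's note: the projection that defines n(ɛ) can be discretized directly on a real-space grid, as in the NumPy sketch below, where each grid point contributes its electron density to the energy bin given by the external potential at that point. The bin count and the uniform volume element are illustrative assumptions, not details from the paper.

```python
import numpy as np

def density_on_energy_coordinate(n_r, v_ext, volume_element, n_bins=200):
    """Project a real-space electron density n(r) onto the energy coordinate
    defined by the external potential (sketch):
        n(eps) = integral dr n(r) delta(eps - v_ext(r)),
    approximated by a weighted histogram over the real-space grid."""
    weights = n_r.ravel() * volume_element
    hist, edges = np.histogram(v_ext.ravel(), bins=n_bins, weights=weights)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist / np.diff(edges)           # electrons per unit energy
```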
Correction of scatter in megavoltage cone-beam CT
NASA Astrophysics Data System (ADS)
Spies, L.; Ebert, M.; Groh, B. A.; Hesse, B. M.; Bortfeld, T.
2001-03-01
The role of scatter in a cone-beam computed tomography system using the therapeutic beam of a medical linear accelerator and a commercial electronic portal imaging device (EPID) is investigated. A scatter correction method is presented which is based on a superposition of Monte Carlo generated scatter kernels. The kernels are adapted to both the spectral response of the EPID and the dimensions of the phantom being scanned. The method is part of a calibration procedure which converts the measured transmission data acquired for each projection angle into water-equivalent thicknesses. Tomographic reconstruction of the projections then yields an estimate of the electron density distribution of the phantom. It is found that scatter produces cupping artefacts in the reconstructed tomograms. Furthermore, reconstructed electron densities deviate greatly (by about 30%) from their expected values. The scatter correction method removes the cupping artefacts and decreases the deviations from 30% down to about 8%.
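Editor's note: the kernel-superposition idea can be illustrated with a simple iterative estimate-and-subtract loop, sketched below using SciPy's FFT convolution. In the paper the kernels are Monte Carlo generated and adapted to the EPID spectral response and phantom dimensions; here a single fixed kernel stands in for that machinery.

```python
import numpy as np
from scipy.signal import fftconvolve

def scatter_correct(measured, kernel, n_iter=5):
    """Iterative scatter-kernel-superposition correction of one projection
    image (sketch): model scatter as the current primary estimate convolved
    with a scatter kernel and subtract it from the measurement."""
    primary = measured.copy()
    for _ in range(n_iter):
        scatter = fftconvolve(primary, kernel, mode="same")
        primary = np.clip(measured - scatter, 0.0, None)
    return primary
```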
The Exomet Project: EU/ESA Research on High-Performance Light-Metal Alloys and Nanocomposites
NASA Astrophysics Data System (ADS)
Sillekens, W. H.
The performance of structural materials is commonly associated with such design parameters as strength and stiffness relative to their density; a recognized means to further enhance the weight-saving potential of low-density materials is thus to improve on their mechanical attributes. The European Community research project ExoMet that started in mid-2012 targets such high-performance aluminum- and magnesium-based materials by exploring novel grain-refining and nanoparticle additions in conjunction with melt treatment by means of external fields (electromagnetic, ultrasonic, mechanical). These external fields are to provide for an effective and efficient dispersion of the additions in the melt and their uniform distribution in the as-cast material. The consortium of 27 companies, universities and research organizations from eleven countries integrates various scientific and technological disciplines as well as application areas — including automotive and (aero)-space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hickner, Michael A.; Matranga, Christopher S.
This project will use bipolar membranes to produce efficient vapor-phase electrolysis cells for splitting CO2 to CO and oxygen. CO is a valuable chemical feedstock that can be combined catalytically with hydrogen in the Fischer-Tropsch process to make liquid fuels. CO is arguably the best target for CO2 reduction since, as a gaseous product, it is easily collected and is relatively immune to membrane crossover losses. The keys to success in this project are to design and synthesize hydrophilic, low resistance bipolar membranes and to create optimized electrode/catalyst/electrolyte architectures based on these new membranes and advanced catalysts in order to achieve high current density at low overpotentials for CO2 conversion. High current density is key to achieving industrially-relevant throughput for the process and low overpotentials maintain high overall efficiency for the process.
Effects of Long Term Supplementation of Anabolic Androgen Steroids on Human Skeletal Muscle
Yu, Ji-Guo; Bonnerud, Patrik; Eriksson, Anders; Stål, Per S.; Tegner, Yelverton; Malm, Christer
2014-01-01
The effects of long-term (over several years) anabolic androgen steroids (AAS) administration on human skeletal muscle are still unclear. In this study, seventeen strength training athletes were recruited and individually interviewed regarding self-administration of banned substances. Ten subjects admitted having taken AAS or AAS derivatives for the past 5 to 15 years (Doped) and the dosage and type of banned substances were recorded. The remaining seven subjects testified to having never used any banned substances (Clean). For all subjects, maximal muscle strength and body composition were tested, and biopsies from the vastus lateralis muscle were obtained. Using histochemistry and immunohistochemistry (IHC), muscle biopsies were evaluated for morphology including fiber type composition, fiber size, capillary variables and myonuclei. Compared with the Clean athletes, the Doped athletes had significantly higher lean leg mass and significantly more capillaries and myonuclei per fiber. In contrast, the Doped athletes had significantly lower absolute maximal squat force and lower maximal squat force relative to lean body mass, lean leg mass, and muscle fiber area. Using multivariate statistics, an orthogonal projection of latent structure discriminant analysis (OPLS-DA) model was established, in which the maximal squat force relative to muscle mass and the maximal squat force relative to fiber area, together with capillary density and nuclei density, were the most important variables for separating Doped from Clean athletes (regression = 0.93 and prediction = 0.92, p<0.0001). In Doped athletes, AAS dose-dependent increases were observed in lean body mass, muscle fiber area, capillary density and myonuclei density. In conclusion, long term AAS supplementation led to increases in lean leg mass, muscle fiber size and a parallel improvement in muscle strength, and all were dose-dependent. Administration of AAS may induce sustained morphological changes in human skeletal muscle, leading to physical performance enhancement. PMID:25207812
Larkin, J D; Publicover, N G; Sutko, J L
2011-01-01
In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
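Editor's note: the image-formation step described above amounts to summing one Gaussian-like probability density per detected photon instead of incrementing a pixel counter; a brute-force NumPy sketch follows. The per-photon sigma would in practice come from the intensity-related uncertainty model; here it is simply passed in, and the loop is not optimized.

```python
import numpy as np

def peds_image(positions, sigmas, shape):
    """Photon-event-distribution-sampling style image formation (sketch):
    each detected photon contributes a normalized 2D Gaussian centred at its
    estimated position of origin instead of incrementing one pixel."""
    ny, nx = shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    image = np.zeros(shape)
    for (y0, x0), s in zip(positions, sigmas):
        g = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * s ** 2))
        image += g / (2 * np.pi * s ** 2)
    return image
```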
Meyer, Ana-Claire L.; Dua, Tarun; Boscardin, John; Escarce, José J.; Saxena, Shekhar; Birbeck, Gretchen L.
2013-01-01
Purpose Epilepsy is one of the most common serious neurological disorders worldwide. Our objective was to determine which economic, healthcare, neurology and epilepsy specific resources were associated with untreated epilepsy in resource-constrained settings. Methods A systematic review of the literature identified community-based studies in resource-constrained settings that calculated the epilepsy treatment gap, the proportion with untreated epilepsy, from prevalent active epilepsy cases. Economic, healthcare, neurology and epilepsy specific resources were taken from existing datasets. Poisson regression models with jackknifed standard errors were used to create bivariate and multivariate models comparing the association between treatment status and economic and health resource indicators. Relative risks were reported. Key Findings Forty-seven studies of 8285 individuals from 24 countries met inclusion criteria. Bivariate analysis demonstrated that individuals residing in rural locations had significantly higher risks of untreated epilepsy [Relative Risk(RR)=1.63; 95% confidence interval(CI):1.26,2.11]. Significantly lower risks of untreated epilepsy were observed for higher physician density [RR=0.65, 95% CI:0.55,0.78], presence of a lay [RR=0.74, 95%CI:0.60,0.91] or professional association for epilepsy [RR=0.73, 95%CI:0.59,0.91], or post-graduate neurology training program [RR=0.67, 95%CI:0.55, 0.82]. In multivariate models, higher physician density maintained significant effects [RR=0.67; 95%CI:0.52,0.88]. Significance Even among resource-limited regions, people with epilepsy in countries with fewer economic, healthcare, neurology and epilepsy specific resources are more likely to have untreated epilepsy. Community-based epilepsy care programs have improved access to treatment but in order to decrease the epilepsy treatment gap, poverty and inequalities of healthcare, neurological and epilepsy resources must be dealt with at the local, national, and global levels. PMID:23106784
Han, M H; Ryu, J I; Kim, C H; Kim, J M; Cheong, J H; Bak, K H; Chun, H J
2017-06-01
Osteopenia and osteoporosis were independent predictive factors for higher atlantoaxial subluxation occurrence in patients with lower body mass index. Our findings suggest that patients with rheumatoid arthritis with osteopenia or osteoporosis, particularly those with lower body mass index (BMI), should be screened regularly to determine the status of their cervical spines. Cervical spine involvement in rheumatoid arthritis (RA) patients may cause serious adverse effects on quality of life and overall health. This study aimed to evaluate the association between atlantodental interval (ADI), atlantoaxial subluxation (AAS), and systemic bone mineral density (BMD) based on BMI variations among established patients with RA. The ADI was transformed to the natural log scale to normalize distributions for all analyses. Multivariable linear regression analyses were used to identify independent predictive factors for ADI based on each BMD classification. Multivariate Cox regression analyses were also performed to identify independent predictive factors for the risk of AAS, which were classified by tertile groups of BMI. A total of 1220 patients with RA who had undergone at least one or more cervical radiography and BMD assessments were identified and enrolled. We found that the association between BMD and ADI (β, -0.029; 95% CI, -0.059 to 0.002; p = 0.070) fell short of achieving statistical significance. However, the ADI showed a 3.6% decrease per 1 BMI increase in the osteoporosis group (β, -0.036; 95% CI, -0.061 to -0.011; p = 0.004). The osteopenia and osteoporosis groups showed about a 1.5-fold and a 1.8-fold increased risk of AAS occurrence among the first tertile of the BMI group. Our study showed a possible association between lower BMD and AAS occurrence in patients with RA with lower BMI. Further studies are needed to confirm our findings.
Models and analysis for multivariate failure time data
NASA Astrophysics Data System (ADS)
Shih, Joanna Huang
The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al, and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood. At stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood. At stage 2, we estimate the dependency structure by fixing the margins at the estimated ones. The third procedure is two-stage partially parametric maximum likelihood. It is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte-Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness of fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the performance of these two methods using actual and computer generated data.
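Editor's note: to make the second (two-stage parametric) procedure concrete, the SciPy sketch below fits exponential margins by maximum likelihood and then maximizes a Clayton copula likelihood with the margins held fixed. It ignores censoring, which the dissertation handles, and the choice of the Clayton family is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def clayton_loglik(theta, u, v):
    """Log-likelihood of the Clayton copula density at uniform margins u, v."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return np.sum(np.log1p(theta)
                  - (theta + 1.0) * (np.log(u) + np.log(v))
                  - (2.0 + 1.0 / theta) * np.log(s))

def two_stage_clayton(x, y):
    """Two-stage parametric estimation (sketch, no censoring): exponential
    margins by maximum likelihood, then the Clayton dependence parameter by
    maximizing the copula likelihood with the margins held fixed."""
    u = 1.0 - np.exp(-x / x.mean())        # stage 1: plug-in marginal CDFs
    v = 1.0 - np.exp(-y / y.mean())
    res = minimize_scalar(lambda t: -clayton_loglik(t, u, v),
                          bounds=(1e-3, 20.0), method="bounded")
    return x.mean(), y.mean(), res.x       # marginal means and dependence theta
```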
Problems in achieving density in asphaltic concrete : final report.
DOT National Transportation Integrated Search
1979-01-01
This investigation was undertaken in an attempt to identify the causes of low densities in asphaltic concrete placed on 9 projects. The most prevalent cause, which accounted for the difficulties on 5 of the 9 projects, appeared to be improper mix des...
Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More
NASA Technical Reports Server (NTRS)
Kou, Yu; Lin, Shu; Fossorier, Marc
1999-01-01
Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. Cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. These codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
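Editor's note: a bare-bones variant of the hard-decision bit-flipping decoder mentioned above is sketched below in NumPy. For the codes in the paper, H would come from the lines and points of a Euclidean or projective geometry; any binary parity-check matrix works for the sketch, and the flipping rule shown is one simple choice rather than the paper's exact algorithm.

```python
import numpy as np

def bit_flip_decode(H, received, max_iter=50):
    """Hard-decision bit-flipping decoding (sketch).

    H is a binary parity-check matrix and `received` a hard-decision word.
    Bits taking part in the largest number of failed checks are flipped
    until every parity check is satisfied or the iteration limit is hit."""
    x = np.array(received, dtype=int)
    for _ in range(max_iter):
        syndrome = (H @ x) % 2
        if not syndrome.any():
            break                           # all checks satisfied
        failures = H.T @ syndrome           # failed-check count per bit
        x[failures == failures.max()] ^= 1  # flip the worst offenders
    return x
```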
IMPACT: Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking
NASA Astrophysics Data System (ADS)
Koller, J.; Brennan, S.; Godinez, H. C.; Higdon, D. M.; Klimenko, A.; Larsen, B.; Lawrence, E.; Linares, R.; McLaughlin, C. A.; Mehta, P. M.; Palmer, D.; Ridley, A. J.; Shoemaker, M.; Sutton, E.; Thompson, D.; Walker, A.; Wohlberg, B.
2013-12-01
Low-Earth orbiting satellites suffer from atmospheric drag due to thermospheric density, which changes by several orders of magnitude, especially during space weather events. Solar flares, precipitating particles and ionospheric currents cause the upper atmosphere to heat up, redistribute, and cool again. These processes are intrinsically included in empirical models, e.g. MSIS and Jacchia-Bowman type models. However, sensitivity analysis has shown that atmospheric drag has the highest influence on satellite conjunction analysis, and empirical models still do not provide the desired accuracy. Space debris and collision avoidance have become an increasingly operational reality. It is paramount to accurately predict satellite orbits and include drag effects driven by space weather. The IMPACT project (Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking), funded with over $5 million by the Los Alamos Laboratory Directed Research and Development office, has the goal of developing an integrated system of atmospheric drag modeling, orbit propagation, and conjunction analysis with detailed uncertainty quantification to address the space debris and collision avoidance problem. Now, over two years into the project, we have developed an integrated solution combining physics-based density modeling of the upper atmosphere between 120 and 700 km altitude, satellite drag forecasting for quiet and disturbed geomagnetic conditions, and conjunction analysis with non-Gaussian uncertainty quantification. We are employing several novel approaches, including a unique observational sensor developed at Los Alamos; machine learning of the coupling between solar drivers of the upper atmosphere and satellite drag using a support-vector machine approach; rigorous data-assimilative modeling using a physics-based approach instead of empirical modeling of the thermosphere; and a computed-tomography method for extracting temporal maps of thermospheric densities from ground-based observations. The IMPACT framework is an open research framework enabling the exchange and testing of a variety of atmospheric density models, orbital propagators, drag coefficient models, ground-based observations, etc., and the study of their effects on conjunctions and uncertainty predictions. The framework is based on a modern service-oriented architecture controlled by a web interface and providing 3D visualizations. The goal of this project is to revolutionize the ability to monitor and track space objects during highly disturbed space weather conditions, provide suitable forecasts for satellite drag conditions and conjunction analysis, and enable the exchange of models, codes, and data in an open research environment. We will present capabilities and results of the IMPACT framework, including a demo of the control interface and visualizations.
Holmes, John; Green, Mark; Strong, Mark; Pearson, Tim; Meier, Petra
2016-01-01
Background Availability of alcohol is a major policy issue for governments, and one of the availability factors is the density of alcohol outlets within geographic areas. Objective The aim of this study is to investigate the association between alcohol outlet density and hospital admissions for alcohol-related conditions in a national (English) small area level ecological study. Methods This project will employ ecological correlation and cross-sectional time series study designs to examine spatial and temporal relationships between alcohol outlet density and hospital admissions. Census units to be used in the analysis will include all Lower and Middle Super-Output Areas (LSOAs and MSOAs) in England (53 million total population; 32,482 LSOAs and 6781 MSOAs). LSOAs (approximately 1500 people per LSOA) will support investigation at a fine spatial resolution. Spatio-temporal associations will be investigated using MSOAs (approximately 7500 people per MSOA). The project will use comprehensive coverage data on alcohol outlets in England (from 2003, 2007, 2010, and 2013) from a commercial source, which has estimated that the database includes 98% of all alcohol outlets in England. Alcohol outlets may be classified into two broad groups: on-trade outlets, comprising outlets from which alcohol can be purchased and consumed on the premises (eg, pubs); and off-trade outlets, in which alcohol can be purchased but not consumed on the premises (eg, off-licenses). In the 2010 dataset, there are 132,989 on-trade and 51,975 off-trade outlets. The longitudinal data series will allow us to examine associations between changes in outlet density and changes in hospital admission rates. The project will use anonymized data on alcohol-related hospital admissions in England from 2003 to 2013 and investigate associations with acute (eg, admissions for injuries) and chronic (eg, admissions for alcoholic liver disease) harms. The investigation will include the examination of conditions that are wholly and partially attributable to alcohol, using internationally standardized alcohol-attributable fractions. Results The project is currently in progress. Results are expected in 2017. Conclusions The results of this study will provide a national evidence base to inform policy decisions regarding the licensing of alcohol sales outlets. PMID:27986646
Role of Blood Lipids in the Development of Ischemic Stroke and its Subtypes
Engström, Gunnar; Larsson, Susanna C.; Traylor, Matthew; Markus, Hugh S.; Melander, Olle; Orho-Melander, Marju
2018-01-01
Background and Purpose— Statin therapy is associated with a lower risk of ischemic stroke supporting a causal role of low-density lipoprotein (LDL) cholesterol. However, more evidence is needed to answer the question whether LDL cholesterol plays a causal role in ischemic stroke subtypes. In addition, it is unknown whether high-density lipoprotein cholesterol and triglycerides have a causal relationship to ischemic stroke and its subtypes. Our aim was to investigate the causal role of LDL cholesterol, high-density lipoprotein cholesterol, and triglycerides in ischemic stroke and its subtypes through Mendelian randomization (MR). Methods— Summary data on 185 genome-wide lipids-associated single nucleotide polymorphisms were obtained from the Global Lipids Genetics Consortium and the Stroke Genetics Network for their association with ischemic stroke (n=16 851 cases and 32 473 controls) and its subtypes, including large artery atherosclerosis (n=2410), small artery occlusion (n=3186), and cardioembolic (n=3427) stroke. Inverse-variance–weighted MR was used to obtain the causal estimates. Inverse-variance–weighted multivariable MR, MR-Egger, and sensitivity exclusion of pleiotropic single nucleotide polymorphisms after Steiger filtering and MR-Pleiotropy Residual Sum and Outlier test were used to adjust for pleiotropic bias. Results— A 1-SD genetically elevated LDL cholesterol was associated with an increased risk of ischemic stroke (odds ratio: 1.12; 95% confidence interval: 1.04–1.20) and large artery atherosclerosis stroke (odds ratio: 1.28; 95% confidence interval: 1.10–1.49) but not with small artery occlusion or cardioembolic stroke in multivariable MR. A 1-SD genetically elevated high-density lipoprotein cholesterol was associated with a decreased risk of small artery occlusion stroke (odds ratio: 0.79; 95% confidence interval: 0.67–0.90) in multivariable MR. MR-Egger indicated no pleiotropic bias, and results did not markedly change after sensitivity exclusion of pleiotropic single nucleotide polymorphisms. Genetically elevated triglycerides did not associate with ischemic stroke or its subtypes. Conclusions— LDL cholesterol lowering is likely to prevent large artery atherosclerosis but may not prevent small artery occlusion nor cardioembolic strokes. High-density lipoprotein cholesterol elevation may lead to benefits in small artery disease prevention. Finally, triglyceride lowering may not yield benefits in ischemic stroke and its subtypes. PMID:29535274
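Editor's note: for reference, the fixed-effect inverse-variance-weighted estimator used as the primary MR analysis reduces to a weighted regression through the origin of SNP-outcome on SNP-exposure associations; a minimal NumPy sketch follows. Multivariable MR, MR-Egger, and the pleiotropy-filtering steps reported above are not shown.

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance-weighted MR estimate (sketch): a weighted
    regression through the origin of SNP-outcome on SNP-exposure effects."""
    w = 1.0 / se_outcome ** 2
    beta = np.sum(w * beta_exposure * beta_outcome) / np.sum(w * beta_exposure ** 2)
    se = np.sqrt(1.0 / np.sum(w * beta_exposure ** 2))
    return beta, se
```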
2012-02-17
tool should be combined with a user-friendly Windows-based software interface that utilizes the best practices for process planning developed by us and...best practices developed through this project, resulting in the commercial availability of machines for the Navy and others. These machines will...research 2011 Outstanding Paper Award, VRAP 2011, for paper "Some Studies on Dislocation Density based Finite Element Modeling of Ultrasonic
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
NASA Astrophysics Data System (ADS)
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss-influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss-influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.
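Editor's note: as an illustration of the first (variable-screening) step, the sketch below ranks candidate predictors of building loss with a random forest, one of the tree-based learners the abstract mentions. The use of scikit-learn, the data-frame layout, and the column name "building_loss" are assumptions for the example, not the study's actual setup.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def rank_predictors(survey: pd.DataFrame, target: str = "building_loss"):
    """Step-1 sketch: rank candidate loss-influencing variables with one of
    the tree-based learners mentioned above (random forest here)."""
    X = survey.drop(columns=[target])
    y = survey[target]
    model = RandomForestRegressor(n_estimators=500, random_state=1).fit(X, y)
    importances = pd.Series(model.feature_importances_, index=X.columns)
    return importances.sort_values(ascending=False)
```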
Whist, A C; Liland, K H; Jonsson, M E; Sæbø, S; Sviland, S; Østerås, O; Norström, M; Hopp, P
2014-11-01
Surveillance programs for animal diseases are critical to early disease detection and risk estimation and to documenting a population's disease status at a given time. The aim of this study was to describe a risk-based surveillance program for detecting Mycobacterium avium ssp. paratuberculosis (MAP) infection in Norwegian dairy cattle. The included risk factors for detecting MAP were purchase of cattle, combined cattle and goat farming, and location of the cattle farm in counties containing goats with MAP. The risk indicators included production data [culling of animals >3 yr of age, carcass conformation of animals >3 yr of age, milk production decrease in older lactating cows (lactations 3, 4, and 5)], and clinical data (diarrhea, enteritis, or both, in animals >3 yr of age). Except for combined cattle and goat farming and cattle farm location, all data were collected at the cow level and summarized at the herd level. Predefined risk factors and risk indicators were extracted from different national databases and combined in a multivariate statistical process control to obtain a risk assessment for each herd. The ordinary Hotelling's T² statistic was applied as a multivariate, standardized measure of difference between the current observed state and the average state of the risk factors for a given herd. To make the analysis more robust and adapt it to the slowly developing nature of MAP, monthly risk calculations were based on data accumulated during a 24-mo period. Monitoring of these variables was performed to identify outliers that may indicate deviance in one or more of the underlying processes. The highest-ranked herds were scattered all over Norway and clustered in high-density dairy cattle farm areas. The resulting rankings of herds are being used in the national surveillance program for MAP in 2014 to increase the sensitivity of the ongoing surveillance program in which 5 fecal samples for bacteriological examination are collected from 25 dairy herds. The use of multivariate statistical process control for selection of herds will be beneficial when a diagnostic test suitable for mass screening is available and validated on the Norwegian cattle population, thus making it possible to increase the number of sampled herds. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
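Editor's note: the herd-level control statistic described above is the classical Hotelling's T²; a minimal NumPy version is sketched below, where each row of `history` is one 24-month accumulation of a herd's risk factors and indicators. The pseudo-inverse is a pragmatic choice for nearly singular covariance matrices, not necessarily what the authors used.

```python
import numpy as np

def hotelling_t2(history, current):
    """Hotelling's T-squared for one herd (sketch).

    `history` is an (n_periods x n_indicators) array of accumulated risk
    factors and indicators; `current` is the latest accumulated vector.
    A large value flags a herd whose current state deviates from its average."""
    mean = history.mean(axis=0)
    cov = np.cov(history, rowvar=False)
    diff = current - mean
    return float(diff @ np.linalg.pinv(cov) @ diff)
```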
Measures of dependence for multivariate Lévy distributions
NASA Astrophysics Data System (ADS)
Boland, J.; Hurd, T. R.; Pivato, M.; Seco, L.
2001-02-01
Recent statistical analysis of a number of financial databases is summarized. Increasing agreement is found that logarithmic equity returns show a certain type of asymptotic behavior of the largest events, namely that the probability density functions have power law tails with an exponent α≈3.0. This behavior does not vary much over different stock exchanges or over time, despite large variations in trading environments. The present paper proposes a class of multivariate distributions which generalizes the observed qualities of univariate time series. A new consequence of the proposed class is the "spectral measure" which completely characterizes the multivariate dependences of the extreme tails of the distribution. This measure on the unit sphere in M-dimensions, in principle completely general, can be determined empirically by looking at extreme events. If it can be observed and determined, it will prove to be of importance for scenario generation in portfolio risk management.
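Editor's note: the spectral measure can be estimated crudely from data by keeping only the observations with the largest radial parts and recording their directions on the unit sphere, as in the NumPy sketch below; the 95% radial quantile is an arbitrary illustrative threshold, not a value from the paper.

```python
import numpy as np

def empirical_spectral_measure(returns, radial_quantile=0.95):
    """Crude empirical estimate of the spectral (angular) measure (sketch):
    keep observations whose radial part exceeds a high quantile and return
    their projections onto the unit sphere."""
    r = np.linalg.norm(returns, axis=1)
    extremes = returns[r > np.quantile(r, radial_quantile)]
    return extremes / np.linalg.norm(extremes, axis=1, keepdims=True)
```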
Multidisciplinary optimization of a controlled space structure using 150 design variables
NASA Technical Reports Server (NTRS)
James, Benjamin B.
1992-01-01
A general optimization-based method for the design of large space platforms through integration of the disciplines of structural dynamics and control is presented. The method uses the global sensitivity equations approach and is especially appropriate for preliminary design problems in which the structural and control analyses are tightly coupled. The method is capable of coordinating general purpose structural analysis, multivariable control, and optimization codes, and thus, can be adapted to a variety of controls-structures integrated design projects. The method is used to minimize the total weight of a space platform while maintaining a specified vibration decay rate after slewing maneuvers.
Relationship between alcohol intake, body fat, and physical activity – a population-based study
Liangpunsakul, Suthat; Crabb, David W.; Qi, Rong
2010-01-01
Objectives Aside from fat, ethanol is the macronutrient with the highest energy density. Whether the energy derived from ethanol affects the body composition and fat mass is debatable. We investigated the relationship between alcohol intake, body composition, and physical activity in the US population using the third National Health and Nutrition Examination Survey (NHANES III). Methods Ten thousand five hundred and fifty subjects met eligibility criteria and constituted our study cohort. Estimated percent body fat and resting metabolic rate were calculated based on the sum of the skinfolds. Multivariate regression analyses were performed accounting for the study sampling weight. Results In both genders, moderate and hazardous alcohol drinkers were younger (p<0.05) and had significantly lower BMI (p<0.01) and body weight (p<0.01) than controls (non-drinkers). Those with hazardous alcohol consumption had significantly less physical activity compared to those with no alcohol use and moderate drinkers in both genders. Females had significantly higher percent body fat than males. In the multivariate linear regression analyses, the level of alcohol consumption was found to be an independent predictor of lower percent body fat only in male subjects. Conclusions Our results showed that alcoholics are habitually less active and that alcohol drinking is an independent predictor of lower percent body fat, especially in male alcoholics. PMID:20696406
Brooks, R.A.; Bell, S.S.
2005-01-01
A descriptive study of the architecture of the red mangrove, Rhizophora mangle L., habitat of Tampa Bay, FL, was conducted to assess if plant architecture could be used to discriminate overwash from fringing forest type. Seven above-water (e.g., tree height, diameter at breast height, and leaf area) and 10 below-water (e.g., root density, root complexity, and maximum root order) architectural features were measured in eight mangrove stands. A multivariate technique (discriminant analysis) was used to test the ability of different models comprising above-water, below-water, or whole tree architecture to classify forest type. Root architectural features appear to be better than classical forestry measurements at discriminating between fringing and overwash forests but, regardless of the features loaded into the model, misclassification rates were high: forest type was correctly classified in only 66% of the cases. Based upon habitat architecture, the results of this study do not support a sharp distinction between overwash and fringing red mangrove forests in Tampa Bay but rather indicate that the two are architecturally indistinguishable. Therefore, within this northern portion of the geographic range of red mangroves, a more appropriate classification system based upon architecture may be one in which overwash and fringing forest types are combined into a single, "tide dominated" category. © 2005 Elsevier Ltd. All rights reserved.
Craddock, Travis J. A.; Fletcher, Mary Ann; Klimas, Nancy G.
2015-01-01
There is a growing appreciation for the network biology that regulates the coordinated expression of molecular and cellular markers however questions persist regarding the identifiability of these networks. Here we explore some of the issues relevant to recovering directed regulatory networks from time course data collected under experimental constraints typical of in vivo studies. NetSim simulations of sparsely connected biological networks were used to evaluate two simple feature selection techniques used in the construction of linear Ordinary Differential Equation (ODE) models, namely truncation of terms versus latent vector projection. Performance was compared with ODE-based Time Series Network Identification (TSNI) integral, and the information-theoretic Time-Delay ARACNE (TD-ARACNE). Projection-based techniques and TSNI integral outperformed truncation-based selection and TD-ARACNE on aggregate networks with edge densities of 10-30%, i.e. transcription factor, protein-protein cliques and immune signaling networks. All were more robust to noise than truncation-based feature selection. Performance was comparable on the in silico 10-node DREAM 3 network, a 5-node Yeast synthetic network designed for In vivo Reverse-engineering and Modeling Assessment (IRMA) and a 9-node human HeLa cell cycle network of similar size and edge density. Performance was more sensitive to the number of time courses than to sample frequency and extrapolated better to larger networks by grouping experiments. In all cases performance declined rapidly in larger networks with lower edge density. Limited recovery and high false positive rates obtained overall bring into question our ability to generate informative time course data rather than the design of any particular reverse engineering algorithm. PMID:25984725
Phlebotomus argentipes seasonal patterns in India and Nepal.
Picado, Albert; Das, Murari Lal; Kumar, Vijay; Dinesh, Diwakar S; Rijal, Suman; Singh, Shri P; Das, Pradeep; Coosemans, Marc; Boelaert, Marleen; Davies, Clive
2010-03-01
The current control of Phlebotomus argentipes (Annandale and Brunetti), the vector of Leishmania donovani (Laveran and Mesnil), on the Indian subcontinent is based on indoor residual spraying. The efficacy of this method depends, among other factors, on the timing and number of spraying rounds, which in turn depend on P. argentipes seasonality. To describe P. argentipes' seasonal patterns, six visceral leishmaniasis (VL) endemic villages, three in Muzaffarpur district (India) and three in Sunsari district (Nepal), were selected based on accessibility and VL incidence. Ten houses per cluster with the highest P. argentipes density were monitored monthly for 15-16 mo using Centers for Disease Control and Prevention light traps. Minimum and maximum temperature and rainfall data for the months January 2006 through December 2007 were collected from the nearest available weather stations. Backwards stepwise regression was used to generate the minimal adequate model for explaining the monthly variation in P. argentipes populations. The seasonality of P. argentipes is similar in India and Nepal, with two annual density peaks around May and October. Monthly P. argentipes density is positively associated with temperature and negatively associated with rainfall in both study sites. The multivariate climate model explained 57% of the monthly variation in vector abundance. Vector control programs against P. argentipes (i.e., indoor residual spraying) should take into account the seasonal patterns described here when implementing and monitoring interventions. Monitoring simple meteorological variables (i.e., temperature, rainfall) may allow prediction of VL epidemics on the Indian subcontinent.
León-Latre, Montserrat; Moreno-Franco, Belén; Andrés-Esteban, Eva M; Ledesma, Marta; Laclaustra, Martín; Alcalde, Víctor; Peñalvo, José L; Ordovás, José M; Casasnovas, José A
2014-06-01
To analyze the association between sitting time and biomarkers of insulin resistance and inflammation in a sample of healthy male workers. Cross-sectional study carried out in a sample of 929 volunteers belonging to the Aragon Workers' Health Study cohort. Sociodemographic, anthropometric, pharmacological and laboratory data were collected: lipids (total cholesterol, high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, apolipoproteins A-1 and B-100, lipoprotein (a)), insulin resistance (glucose, glycated hemoglobin, homeostasis model assessment of insulin resistance, insulin, and triglyceride/high-density lipoprotein cholesterol ratio), and inflammatory profile (C-reactive protein and leukocytes). Information on sitting time and physical activity was assessed using a questionnaire. Sedentary behavior was analyzed in terms of prevalences and medians, according to tertiles, using a multivariate model (crude and adjusted linear regression) with biomarkers of inflammation and insulin resistance. The most sedentary individuals had higher body mass index, greater waist circumference, and higher systolic blood pressure, with a significant upward trend in each tertile. Likewise, they had a worse lipid profile with a higher C-reactive protein level, homeostasis model assessment of insulin resistance index, triglyceride/high-density lipoprotein cholesterol ratio, and insulin concentration. In the multivariate analysis, we observed a significant association between the latter parameters and sitting time in hours (log C-reactive protein [β = 0.07], log homeostasis model assessment of insulin resistance index [β = 0.05], triglyceride/high-density lipoprotein cholesterol ratio [β = 0.23], and insulin [β = 0.44]), which remained after adjustment for metabolic equivalents-h/week. Workers who spend more time sitting show a worse inflammatory and insulin resistance profile independently of the physical activity performed. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.
Krause, Peter J.; Niccolai, Linda; Steeves, Tanner; O’Keefe, Corrine Folsom; Diuk-Wasser, Maria A.
2014-01-01
Peridomestic exposure to Borrelia burgdorferi-infected Ixodes scapularis nymphs is considered the dominant means of infection with black-legged tick-borne pathogens in the eastern United States. Population level studies have detected a positive association between the density of infected nymphs and Lyme disease incidence. At a finer spatial scale within endemic communities, studies have focused on individual level risk behaviors, without accounting for differences in peridomestic nymphal density. This study simultaneously assessed the influence of peridomestic tick exposure risk and human behavior risk factors for Lyme disease infection on Block Island, Rhode Island. Tick exposure risk on Block Island properties was estimated using remotely sensed landscape metrics that strongly correlated with tick density at the individual property level. Behavioral risk factors and Lyme disease serology were assessed using a longitudinal serosurvey study. Significant factors associated with Lyme disease positive serology included one or more self-reported previous Lyme disease episodes, wearing protective clothing during outdoor activities, the average number of hours spent daily in tick habitat, the subject’s age and the density of shrub edges on the subject’s property. The best fit multivariate model included previous Lyme diagnoses and age. The strength of this association with previous Lyme disease suggests that the same sector of the population tends to be repeatedly infected. The second best multivariate model included a combination of environmental and behavioral factors, namely hours spent in vegetation, subject’s age, shrub edge density (increase risk) and wearing protective clothing (decrease risk). Our findings highlight the importance of concurrent evaluation of both environmental and behavioral factors to design interventions to reduce the risk of tick-borne infections. PMID:24416278
Can texture analysis of tooth microwear detect within guild niche partitioning in extinct species?
NASA Astrophysics Data System (ADS)
Purnell, Mark; Nedza, Christopher; Rychlik, Leszek
2017-04-01
Recent work shows that tooth microwear analysis can be applied further back in time and deeper into the phylogenetic history of vertebrate clades than previously thought (e.g. niche partitioning in early Jurassic insectivorous mammals; Gill et al., 2014, Nature). Furthermore, quantitative approaches to analysis based on parameterization of surface roughness are increasing the robustness and repeatability of this widely used dietary proxy. Discriminating between taxa within dietary guilds has the potential to significantly increase our ability to determine resource use and partitioning in fossil vertebrates, but how sensitive is the technique? To address this question we analysed tooth microwear texture in sympatric populations of shrew species (Neomys fodiens, Neomys anomalus, Sorex araneus, Sorex minutus) from Białowieża Forest, Poland. These populations are known to exhibit varying degrees of niche partitioning (Churchfield & Rychlik, 2006, J. Zool.) with greatest overlap between the Neomys species. Sorex araneus also exhibits some niche overlap with N. anomalus, while S. minutus is the most specialised. Multivariate analysis based only on tooth microwear textures recovers the same pattern of niche partitioning. Our results also suggest that tooth textures track seasonal differences in diet. Projecting data from fossils into the multivariate dietary space defined using microwear from extant taxa demonstrates that the technique is capable of subtle dietary discrimination in extinct insectivores.
Yang, Yan-Qin; Yin, Hong-Xu; Yuan, Hai-Bo; Jiang, Yong-Wen; Dong, Chun-Wang; Deng, Yu-Liang
2018-01-01
In the present work, a novel infrared-assisted extraction coupled to headspace solid-phase microextraction (IRAE-HS-SPME) followed by gas chromatography-mass spectrometry (GC-MS) was developed for rapid determination of the volatile components in green tea. The extraction parameters such as fiber type, sample amount, infrared power, extraction time, and infrared lamp distance were optimized by orthogonal experimental design. Under optimum conditions, a total of 82 volatile compounds in 21 green tea samples from different geographical origins were identified. Compared with classical water-bath heating, the proposed technique has remarkable advantages of considerably reducing the analytical time and high efficiency. In addition, an effective classification of green teas based on their volatile profiles was achieved by partial least square-discriminant analysis (PLS-DA) and hierarchical clustering analysis (HCA). Furthermore, the application of a dual criterion based on the variable importance in the projection (VIP) values of the PLS-DA models and on the category from one-way univariate analysis (ANOVA) allowed the identification of 12 potential volatile markers, which were considered to make the most important contribution to the discrimination of the samples. The results suggest that IRAE-HS-SPME/GC-MS technique combined with multivariate analysis offers a valuable tool to assess geographical traceability of different tea varieties.
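As a rough sketch of how VIP scores can be obtained from a PLS-DA model of the kind described above, the following fragment fits a PLS model to one-hot class labels and computes variable importance in projection; the random data, the shapes (21 samples x 82 compounds, 3 origins) and the VIP > 1 cut-off are assumptions for illustration only.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Illustrative stand-in data: 21 tea samples x 82 volatile compounds, 3 origins
rng = np.random.default_rng(0)
X = rng.normal(size=(21, 82))
origins = np.repeat([0, 1, 2], 7)
Y = np.eye(3)[origins]                      # one-hot class matrix for PLS-DA

pls = PLSRegression(n_components=3, scale=True).fit(X, Y)

def vip_scores(pls_model):
    """Variable importance in projection (VIP) for a fitted PLS model."""
    t, w, q = pls_model.x_scores_, pls_model.x_weights_, pls_model.y_loadings_
    p, a = w.shape
    ssy = np.sum(t ** 2, axis=0) * np.sum(q ** 2, axis=0)   # Y-variance explained per component
    w_norm = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(p * (w_norm @ ssy) / ssy.sum())

vip = vip_scores(pls)
print("compounds with VIP > 1:", int(np.sum(vip > 1)))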
Happel, Max F K; Jeschke, Marcus; Ohl, Frank W
2010-08-18
Primary sensory cortex integrates sensory information from afferent feedforward thalamocortical projection systems and convergent intracortical microcircuits. Both input systems have been demonstrated to provide different aspects of sensory information. Here we have used high-density recordings of laminar current source density (CSD) distributions in primary auditory cortex of Mongolian gerbils in combination with pharmacological silencing of cortical activity and analysis of the residual CSD, to dissociate the feedforward thalamocortical contribution and the intracortical contribution to spectral integration. We found a temporally highly precise integration of both types of inputs when the stimulation frequency was in close spectral neighborhood of the best frequency of the measurement site, in which the overlap between both inputs is maximal. Local intracortical connections provide both directly feedforward excitatory and modulatory input from adjacent cortical sites, which determine how concurrent afferent inputs are integrated. Through separate excitatory horizontal projections, terminating in cortical layers II/III, information about stimulus energy in greater spectral distance is provided even over long cortical distances. These projections effectively broaden spectral tuning width. Based on these data, we suggest a mechanism of spectral integration in primary auditory cortex that is based on temporally precise interactions of afferent thalamocortical inputs and different short- and long-range intracortical networks. The proposed conceptual framework allows integration of different and partly controversial anatomical and physiological models of spectral integration in the literature.
Ning-Bo, Huang; Peng, Huang; Zong-Ti, Shao; Xi-Guang, Feng; Yi, Dong; Guang-Huai, Yang; Jin-Song, Li; Yan-Hong, Zhang; Shao-Yun, Chen; Shou-Ju, Nie; Wen, Li
2016-03-11
To evaluate the effect of a hydraulic schistosomiasis control project with ditch management on Oncomelania hupensis snail control. From 2009 to 2011, snail investigations and schistosomiasis surveillance were carried out in Dali City and Yongsheng County, two national schistosomiasis surveillance sites. Historical schistosomiasis control data were collected and analyzed. In the hardened sections of the water conservancy project with managed ditches in Shajing Village of Dali City, only one snail was found in 2010, with a living snail density of 0.004 snails/0.1 m2, while the densities of living snails were 0.080, 0.002 and 0.007 snails/0.1 m2 in unhardened sections of the project from 2009 to 2011. No snails were found in the hardened sections of the water conservancy project with managed ditches in Gaojiacun Village of Yongsheng County, while the densities of living snails were 0.040, 0.030 and 0.040 snails/0.1 m2 in unhardened sections of the project from 2009 to 2011. After the ditches were hardened, no infected snails were found from 2009 to 2011, and both the occurrence rate of frames with snails and the density of living snails decreased, while both remained higher in unhardened ditches. The hydraulic schistosomiasis control project has an obvious effect on snail control, but maintenance work should be strengthened after the project is completed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maitra, Neepa
2016-07-14
This project investigates the accuracy of currently-used functionals in time-dependent density functional theory, which is today routinely used to predict and design materials and computationally model processes in solar energy conversion. The rigorously-based electron-ion dynamics method developed here sheds light on traditional methods and overcomes challenges those methods have. The fundamental research undertaken here is important for building reliable and practical methods for materials discovery. The ultimate goal is to use these tools for the computational design of new materials for solar cell devices of high efficiency.
A log-linear model approach to estimation of population size using the line-transect sampling method
Anderson, D.R.; Burnham, K.P.; Crain, B.R.
1978-01-01
The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
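For orientation, the sketch below shows the conventional line-transect density estimator that such model-based extensions build on, assuming a half-normal detection function fitted to perpendicular distances; the distances and transect length are invented for illustration.

import numpy as np

# Illustrative perpendicular detection distances (km) and total transect length (km)
distances = np.array([0.01, 0.03, 0.02, 0.08, 0.05, 0.12, 0.04, 0.02, 0.06, 0.09])
L = 25.0

# Half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)); its MLE of sigma^2
# is the mean squared perpendicular distance
sigma2 = np.mean(distances ** 2)
esw = np.sqrt(np.pi * sigma2 / 2.0)          # effective strip half-width = integral of g

density = distances.size / (2.0 * L * esw)   # detections per unit area
print(f"effective strip half-width: {esw:.3f} km, estimated density: {density:.1f} per km^2")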
Serenity: A subsystem quantum chemistry program.
Unsleber, Jan P; Dresselhaus, Thomas; Klahr, Kevin; Schnieders, David; Böckers, Michael; Barton, Dennis; Neugebauer, Johannes
2018-05-15
We present the new quantum chemistry program Serenity. It implements a wide variety of functionalities with a focus on subsystem methodology. The modular code structure in combination with publicly available external tools and particular design concepts ensures extensibility and robustness with a focus on the needs of a subsystem program. Several important features of the program are exemplified with sample calculations with subsystem density-functional theory, potential reconstruction techniques, a projection-based embedding approach and combinations thereof with geometry optimization, semi-numerical frequency calculations and linear-response time-dependent density-functional theory. © 2018 Wiley Periodicals, Inc.
Bhattacharya, Abhishek; Dunson, David B.
2012-01-01
This article considers a broad class of kernel mixture density models on compact metric spaces and manifolds. Following a Bayesian approach with a nonparametric prior on the location mixing distribution, sufficient conditions are obtained on the kernel, prior and the underlying space for strong posterior consistency at any continuous density. The prior is also allowed to depend on the sample size n and sufficient conditions are obtained for weak and strong consistency. These conditions are verified on compact Euclidean spaces using multivariate Gaussian kernels, on the hypersphere using a von Mises-Fisher kernel and on the planar shape space using complex Watson kernels. PMID:22984295
Drunk driving detection based on classification of multivariate time series.
Li, Zhenlong; Jin, Xue; Zhao, Xiaohua
2015-09-01
This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
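A minimal sketch of the pipeline described above (bottom-up piecewise-linear segmentation, slope/interval features, and an SVM classifier) is given below; the surrogate lateral-position traces, the segment count and the simplified merge criterion are assumptions for illustration, not the authors' implementation.

import numpy as np
from sklearn.svm import SVC

def bottom_up_segments(y, max_segments=6):
    """Greedy bottom-up merging of short initial segments into piecewise-linear pieces.
    Returns a list of (start, end) index pairs (end exclusive)."""
    bounds = [(i, min(i + 2, len(y))) for i in range(0, len(y), 2)]

    def fit_cost(s, e):
        t = np.arange(s, e)
        coef = np.polyfit(t, y[s:e], 1)
        return np.sum((np.polyval(coef, t) - y[s:e]) ** 2)

    while len(bounds) > max_segments:
        costs = [fit_cost(bounds[i][0], bounds[i + 1][1]) for i in range(len(bounds) - 1)]
        i = int(np.argmin(costs))                 # merge the cheapest adjacent pair
        bounds[i] = (bounds[i][0], bounds[i + 1][1])
        del bounds[i + 1]
    return bounds

def features(y, max_segments=6):
    """Slope and duration of each piecewise-linear segment, concatenated."""
    feats = []
    for s, e in bottom_up_segments(y, max_segments):
        slope = np.polyfit(np.arange(s, e), y[s:e], 1)[0]
        feats.extend([slope, e - s])
    return feats

# Illustrative surrogate data: lateral-position traces for "normal" vs "impaired" driving
rng = np.random.default_rng(1)
normal = [np.cumsum(rng.normal(0, 0.05, 120)) for _ in range(20)]
drunk = [np.cumsum(rng.normal(0, 0.15, 120)) for _ in range(20)]
X = np.array([features(y) for y in normal + drunk])
labels = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))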
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... acres of pine stands using a variety of methods to treat MPB infested stands, reduce the overall density... actions proposed are in direct response to management direction provided by the Black Hills National Forest Land and Resource Management Plan (Forest Plan). The site specific actions are designed, based on...
21st Century Locomotive Technology: Quarterly Technical Status Report 28
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lembit Salasoo; Ramu Chandra
2010-02-19
Thermal testing of a subscale locomotive sodium battery module was initiated to validate thermal models. The hybrid trip optimizer problem was formulated. As outcomes of this project, GE has proceeded to commercialize trip optimizer technology and has initiated work on a state-of-the-art battery manufacturing plant for high energy density, sodium-based batteries.
Panazzolo, Diogo G; Sicuro, Fernando L; Clapauch, Ruth; Maranhão, Priscila A; Bouskela, Eliete; Kraemer-Aguiar, Luiz G
2012-11-13
We aimed to evaluate the multivariate association between functional microvascular variables and clinical-laboratorial-anthropometrical measurements. Data from 189 female subjects (34.0 ± 15.5 years, 30.5 ± 7.1 kg/m2), who were non-smokers, non-regular drug users, without a history of diabetes and/or hypertension, were analyzed by principal component analysis (PCA). PCA is a classical multivariate exploratory tool because it highlights common variation between variables allowing inferences about possible biological meaning of associations between them, without pre-establishing cause-effect relationships. In total, 15 variables were used for PCA: body mass index (BMI), waist circumference, systolic and diastolic blood pressure (BP), fasting plasma glucose, levels of total cholesterol, high-density lipoprotein cholesterol (HDL-c), low-density lipoprotein cholesterol (LDL-c), triglycerides (TG), insulin, C-reactive protein (CRP), and functional microvascular variables measured by nailfold videocapillaroscopy. Nailfold videocapillaroscopy was used for direct visualization of nutritive capillaries, assessing functional capillary density, red blood cell velocity (RBCV) at rest and peak after 1 min of arterial occlusion (RBCV(max)), and the time taken to reach RBCV(max) (TRBCV(max)). A total of 35% of subjects had metabolic syndrome, 77% were overweight/obese, and 9.5% had impaired fasting glucose. PCA was able to recognize that functional microvascular variables and clinical-laboratorial-anthropometrical measurements had a similar variation. The first five principal components explained most of the intrinsic variation of the data. For example, principal component 1 was associated with BMI, waist circumference, systolic BP, diastolic BP, insulin, TG, CRP, and TRBCV(max) varying in the same way. Principal component 1 also showed a strong association among HDL-c, RBCV, and RBCV(max), but in the opposite way. Principal component 3 was associated only with microvascular variables in the same way (functional capillary density, RBCV and RBCV(max)). Fasting plasma glucose appeared to be related to principal component 4 and did not show any association with microvascular reactivity. In non-diabetic female subjects, a multivariate scenario of associations between classic clinical variables strictly related to obesity and metabolic syndrome suggests a significant relationship between these diseases and microvascular reactivity.
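A minimal sketch of the exploratory PCA step described above, using standardized variables so that components reflect correlations; the random 189 x 15 matrix stands in for the clinical, laboratory, anthropometric and microvascular data and is not the study dataset.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Illustrative stand-in for the 189 x 15 data matrix described above
rng = np.random.default_rng(0)
X = rng.normal(size=(189, 15))

Z = StandardScaler().fit_transform(X)          # PCA on correlations rather than covariances
pca = PCA().fit(Z)

print("variance explained by the first five components:",
      pca.explained_variance_ratio_[:5].round(3))
# Loadings indicate which variables vary together (or in opposite ways) on each component
print("component 1 loadings:", pca.components_[0].round(2))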
NASA Astrophysics Data System (ADS)
Chen, B.; Harp, D. R.; Lin, Y.; Keating, E. H.; Pawar, R.
2017-12-01
Monitoring is a crucial aspect of geologic carbon sequestration (GCS) risk management. It has gained importance as a means to ensure CO2 is safely and permanently stored underground throughout the lifecycle of a GCS project. Three issues are often involved in a monitoring project: (i) where is the optimal location to place the monitoring well(s), (ii) what type of data (pressure, rate and/or CO2 concentration) should be measured, and (iii) what is the optimal frequency at which to collect the data. In order to address these important issues, a filtering-based data assimilation procedure is developed to perform the monitoring optimization. The optimal monitoring strategy is selected based on the uncertainty reduction of the objective of interest (e.g., cumulative CO2 leak) for all potential monitoring strategies. To reduce the computational cost of the filtering-based data assimilation process, two machine-learning algorithms, Support Vector Regression (SVR) and Multivariate Adaptive Regression Splines (MARS), are used to develop computationally efficient reduced-order models (ROMs) from full numerical simulations of CO2 and brine flow. The proposed framework for GCS monitoring optimization is demonstrated with two examples: a simple 3D synthetic case and a real field case, the Rock Spring Uplift carbon storage site in southwestern Wyoming.
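A minimal sketch of the reduced-order-model idea, training an SVR surrogate on input-output pairs that would normally come from full CO2/brine flow simulations; the synthetic inputs, the leak response and the hyperparameters are assumptions for illustration.

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Illustrative surrogate training set: rows are simulation input parameters,
# y mimics a simulated cumulative CO2 leak response
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 6))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rom = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)).fit(X_tr, y_tr)
print("held-out R^2 of the SVR reduced-order model:", round(rom.score(X_te, y_te), 3))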
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yakovleva, Marina
2012-12-31
FMC Lithium Division has successfully completed the project “Establishing Sustainable US PHEV/EV Manufacturing Base: Stabilized Lithium Metal Powder, Enabling Material and Revolutionary Technology for High Energy Li-ion Batteries”. The project included design, acquisition and process development for the production scale units to 1) produce stabilized lithium dispersions in oil medium, 2) produce dry stabilized lithium metal powders, 3) evaluate, design and acquire a pilot-scale unit for an alternative production technology to further decrease the cost, and 4) demonstrate concepts for integrating SLMP technology into Li-ion batteries to increase energy density. It is very difficult to satisfy safety, cost and performance requirements for the PHEV and EV applications. As the initial step in SLMP Technology introduction, industry can use commercially available LiMn2O4 or LiFePO4, for example, which are the only proven safer and cheaper lithium-providing cathodes available on the market. Unfortunately, these cathodes alone are inferior to the energy density of the conventional LiCoO2 cathode and, even when paired with advanced anode materials such as silicon composite material, the resulting cell will still not meet the energy density requirements. We have demonstrated, however, that if SLMP Technology is used to compensate for the irreversible capacity in the anode, the efficiency of the cathode utilization will be improved and the materials-based cost of the cell will decrease.
Snow multivariable data assimilation for hydrological predictions in Alpine sites
NASA Astrophysics Data System (ADS)
Piazzi, Gaia; Thirel, Guillaume; Campo, Lorenzo; Gabellani, Simone; Stevenin, Hervè
2017-04-01
Snowpack dynamics (snow accumulation and ablation) strongly impact hydrological processes in Alpine areas. During the winter season the presence of snow cover (snow accumulation) reduces the drainage in the basin, with a resulting lower watershed time of concentration in case of possible rainfall events. Moreover, the release of the significant water volume stored in winter (snowmelt) considerably contributes to the total discharge during the melting period. Therefore, when modeling hydrological processes in snow-dominated catchments, the quality of predictions depends strongly on how well the model captures snowpack dynamics. The integration of a hydrological model with a snow module improves predictions of river discharges. Besides the well-known modeling limitations (uncertainty in parameterizations; possible errors affecting both meteorological forcing data and initial conditions; approximations in boundary conditions), there are physical factors that make an exhaustive reconstruction of snow dynamics complicated: snow intermittence in space and time, stratification and slow phenomena like metamorphism processes, uncertainty in snowfall evaluation, wind transportation, etc. Data Assimilation (DA) techniques provide an objective methodology to combine several independent snow-related data sources (model simulations, ground-based measurements and remotely sensed observations) in order to obtain the most likely estimate of snowpack state. This study presents SMASH (Snow Multidata Assimilation System for Hydrology), a multi-layer snow dynamic model strengthened by a multivariable DA framework for hydrological purposes. The model is physically based on mass and energy balances and can be used to reproduce the main physical processes occurring within the snowpack: accumulation, density dynamics, melting, sublimation, radiative balance, heat and mass exchanges. The model is driven by observed forcing meteorological data (air temperature, wind velocity, relative air humidity, precipitation and incident solar radiation) to provide a complete estimate of snowpack state. The implementation of a DA scheme enables simultaneous assimilation of ground-based observations of different snow-related variables (snow depth, snow density, surface temperature and albedo). SMASH performances are evaluated using observed data supplied by meteorological stations at three experimental Alpine sites: Col de Porte (1325 m, France); Torgnon (2160 m, Italy); Weissfluhjoch (2540 m, Switzerland). A comparative analysis of the resulting performances of the Particle Filter and Ensemble Kalman Filter schemes is shown.
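For context, the sketch below shows a single Ensemble Kalman Filter analysis step of the kind such a multivariable DA framework can use, here updating a toy three-variable snowpack state with one snow-depth observation; the state variables, ensemble size and error variances are assumptions, not SMASH settings.

import numpy as np

def enkf_update(ensemble, obs, obs_err_var, H):
    """One Ensemble Kalman Filter analysis step.
    ensemble: (n_state, n_members) forecast states
    obs: (n_obs,) observed snow variables
    obs_err_var: (n_obs,) observation error variances
    H: (n_obs, n_state) linear observation operator"""
    n_obs, n_members = obs.size, ensemble.shape[1]
    P = np.cov(ensemble)                               # forecast error covariance
    R = np.diag(obs_err_var)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    rng = np.random.default_rng(0)
    perturbed = obs[:, None] + rng.normal(0, np.sqrt(obs_err_var)[:, None], (n_obs, n_members))
    return ensemble + K @ (perturbed - H @ ensemble)

# Toy 3-variable snowpack state (depth m, density kg/m3, surface temperature C), 50 members
rng = np.random.default_rng(1)
ens = rng.normal([1.2, 250.0, -5.0], [0.3, 30.0, 2.0], size=(50, 3)).T
H = np.array([[1.0, 0.0, 0.0]])                        # only snow depth is observed
analysis = enkf_update(ens, np.array([1.0]), np.array([0.02 ** 2]), H)
print("ensemble-mean depth before/after:", ens[0].mean().round(3), analysis[0].mean().round(3))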
Projected quasiparticle theory for molecular electronic structure
NASA Astrophysics Data System (ADS)
Scuseria, Gustavo E.; Jiménez-Hoyos, Carlos A.; Henderson, Thomas M.; Samanta, Kousik; Ellis, Jason K.
2011-09-01
We derive and implement symmetry-projected Hartree-Fock-Bogoliubov (HFB) equations and apply them to the molecular electronic structure problem. All symmetries (particle number, spin, spatial, and complex conjugation) are deliberately broken and restored in a self-consistent variation-after-projection approach. We show that the resulting method yields a comprehensive black-box treatment of static correlations with effective one-electron (mean-field) computational cost. The ensuing wave function is of multireference character and permeates the entire Hilbert space of the problem. The energy expression is different from regular HFB theory but remains a functional of an independent quasiparticle density matrix. All reduced density matrices are expressible as an integration of transition density matrices over a gauge grid. We present several proof-of-principle examples demonstrating the compelling power of projected quasiparticle theory for quantum chemistry.
NASA Technical Reports Server (NTRS)
Hargrove, A.
1982-01-01
Optimal digital control of nonlinear multivariable constrained systems was studied. The optimal controller in the form of an algorithm was improved and refined by reducing running time and storage requirements. A particularly difficult system of nine nonlinear state variable equations was chosen as a test problem for analyzing and improving the controller. Lengthy analysis, modeling, computing and optimization were accomplished. A remote interactive teletype terminal was installed. Analysis requiring computer usage of short duration was accomplished using Tuskegee's VAX 11/750 system.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission in both bivariate and multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require the use of conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emission in both the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
Integrated control-system design via generalized LQG (GLQG) theory
NASA Technical Reports Server (NTRS)
Bernstein, Dennis S.; Hyland, David C.; Richter, Stephen; Haddad, Wassim M.
1989-01-01
Thirty years of control systems research has produced an enormous body of theoretical results in feedback synthesis. Yet such results see relatively little practical application, and there remains an unsettling gap between classical single-loop techniques (Nyquist, Bode, root locus, pole placement) and modern multivariable approaches (LQG and H infinity theory). Large scale, complex systems, such as high performance aircraft and flexible space structures, now demand efficient, reliable design of multivariable feedback controllers which optimally trade off performance against modeling accuracy, bandwidth, sensor noise, actuator power, and control law complexity. A methodology is described which encompasses numerous practical design constraints within a single unified formulation. The approach, which is based upon coupled systems of modified Riccati and Lyapunov equations, encompasses time-domain linear-quadratic-Gaussian theory and frequency-domain H infinity theory, as well as classical objectives such as gain and phase margin via the Nyquist circle criterion. In addition, this approach encompasses the optimal projection approach to reduced-order controller design. The current status of the overall theory will be reviewed including both continuous-time and discrete-time (sampled-data) formulations.
Marini, Federico; de Beer, Dalene; Walters, Nico A; de Villiers, André; Joubert, Elizabeth; Walczak, Beata
2017-03-17
An ultimate goal of investigations of rooibos plant material subjected to different stages of fermentation is to identify the chemical changes taking place in the phenolic composition, using an untargeted approach and chromatographic fingerprints. Realization of this goal requires, among other things, identification of the main components of the plant material involved in chemical reactions during the fermentation process. Quantitative chromatographic data for the compounds in extracts of green, semi-fermented and fermented rooibos form the basis of a preliminary study following a targeted approach. The aim is to estimate whether treatment has a significant effect based on all quantified compounds and to identify the compounds that contribute significantly to it. Analysis of variance is performed using modern multivariate methods such as ANOVA-Simultaneous Component Analysis, ANOVA - Target Projection and regularized MANOVA. This study is the first one in which all three approaches are compared and evaluated. For the data studied, all three methods reveal the same significance of the fermentation effect on the extract compositions, but they lead to different interpretations of it. Copyright © 2017 Elsevier B.V. All rights reserved.
Market interdependence among commodity prices based on information transmission on the Internet
NASA Astrophysics Data System (ADS)
Ji, Qiang; Guo, Jian-Feng
2015-05-01
Human behaviour on the Internet has become a synchro-projection of real society. In this paper, we introduce the public concern derived from query volumes on the Web to empirically analyse the influence of information on commodity markets (e.g., crude oil, heating oil, corn and gold) using multivariate GARCH models based on dynamic conditional correlations. The analysis found that the changes of public concern on the Internet can well depict the changes of market prices, as the former has significant Granger causality effects on market prices. The findings indicate that the information of external shocks to commodity markets could be transmitted quickly, and commodity markets easily absorb the public concern of the information-sensitive traders. Finally, the conditional correlation among commodity prices varies dramatically over time.
Multivariate analysis of light scattering spectra of liquid dairy products
NASA Astrophysics Data System (ADS)
Khodasevich, M. A.
2010-05-01
Visible light scattering spectra from the surface layer of samples of commercial liquid dairy products are recorded with a colorimeter. The principal component method is used to analyze these spectra. Vectors representing the samples of dairy products in a multidimensional space of spectral counts are projected onto a three-dimensional subspace of principal components. The magnitudes of these projections are found to depend on the type of dairy product.
Discordance between net analyte signal theory and practical multivariate calibration.
Brown, Christopher D
2004-08-01
Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
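A minimal numerical sketch of Lorber-style net analyte signal in the classical (pure-spectra) setting discussed above; the Gaussian pure-component spectra are invented for illustration.

import numpy as np

def net_analyte_signal(s_k, S_interferents):
    """Lorber-style net analyte signal: the part of the analyte's pure-component
    spectrum s_k orthogonal to the space spanned by the interferent spectra."""
    P_interf = S_interferents @ np.linalg.pinv(S_interferents)   # projector onto interferent space
    return (np.eye(len(s_k)) - P_interf) @ s_k

# Illustrative pure spectra on a 100-channel grid (assumed, not from the paper)
wl = np.linspace(0, 1, 100)
analyte = np.exp(-((wl - 0.4) ** 2) / 0.005)
interferent = np.exp(-((wl - 0.5) ** 2) / 0.01)

nas = net_analyte_signal(analyte, interferent[:, None])
# A common multivariate figure of merit is the norm of the net analyte signal
print("analyte norm:", np.linalg.norm(analyte).round(3),
      "net analyte signal norm:", np.linalg.norm(nas).round(3))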
Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery.
Liu, Han; Wang, Lie; Zhao, Tuo
2015-08-01
We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level so that it simultaneously attains improved finite-sample performance and tuning insensitiveness. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence O(1/ε), where ε is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network http://cran.r-project.org/web/packages/camel/.
Magnetic exchange couplings from noncollinear perturbation theory: dinuclear CuII complexes.
Phillips, Jordan J; Peralta, Juan E
2014-08-07
To benchmark the performance of a new method based on noncollinear coupled-perturbed density functional theory [J. Chem. Phys. 138, 174115 (2013)], we calculate the magnetic exchange couplings in a series of triply bridged ferromagnetic dinuclear Cu(II) complexes that have been recently synthesized [Phys. Chem. Chem. Phys. 15, 1966 (2013)]. We find that for any basis-set the couplings from our noncollinear coupled-perturbed methodology are practically identical to those of spin-projected energy-differences when a hybrid density functional approximation is employed. This demonstrates that our methodology properly recovers a Heisenberg description for these systems, and is robust in its predictive power of magnetic couplings. Furthermore, this indicates that the failure of density functional theory to capture the subtle variation of the exchange couplings in these complexes is not simply an artifact of broken-symmetry methods, but rather a fundamental weakness of current approximate density functionals for the description of magnetic couplings.
A stereological analysis of ductile fracture by microvoid coalescence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steele, J.H., Jr.
A stereological analysis for ductile fracture by microvoid coalescence is presented based upon the model of Widgery and Knott, which postulates that microvoids link with a propagating crack if they lie within a certain interaction distance of its plane. A 3-dimensional analytical expression for dimple density and shape is developed from this model using projected image relationships for a thin slab. Void nucleation and growth are incorporated into the analysis using numerical integration of the Rice-Tracey growth equation over the appropriate strain range. An evaluation of the stereological approach is given using tensile data from a spheroidized 1045 steel to predict the effect of hydrostatic pressure upon the dimple density. The analysis, which is consistent with observed correlations between dimple density and second phase particle density, is shown to provide an estimate of dimple size and the microroughness parameter used in local strain models for microvoid coalescence. 24 refs., 10 figs.
Hydrogen Research for Spaceport and Space-Based Applications: Fuel Cell Projects
NASA Technical Reports Server (NTRS)
Anderson, Tim; Balaban, Canan
2008-01-01
The activities presented are a broad-based approach to advancing key hydrogen-related technologies in areas such as fuel cells, hydrogen production, distributed sensors for hydrogen-leak detection, laser instrumentation for hydrogen-leak detection, and cryogenic transport and storage. Presented are the results from research projects, education and outreach activities, and system and trade studies. The work will aid in advancing the state of the art for several critical technologies related to the implementation of a hydrogen infrastructure. Activities conducted are relevant to a number of propulsion and power systems for terrestrial, aeronautics and aerospace applications. Fuel cell research focused on proton exchange membrane (PEM) and solid oxide fuel cell (SOFC) technologies. Specific topics included aircraft fuel cell reformers; new and improved electrodes, electrolytes, interconnects, and seals; and modeling of fuel cells including CFD coupled with impedance spectroscopy. Research was conducted on new materials and designs for fuel cells, along with using embedded sensors with power management electronics to improve the power density delivered by fuel cells. Fuel cell applications considered were in-space operations, aviation, and ground-based uses such as powering auxiliary power units (APUs) in aircraft; high power density, long duration power supplies for interplanetary missions (space science probes and planetary rovers); regenerative capabilities for high altitude aircraft; and power supplies for reusable launch vehicles.
NASA Astrophysics Data System (ADS)
Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.
2016-08-01
Drought is among the costliest natural hazards worldwide and extreme drought events in recent years have caused huge losses to various sectors. Drought prediction is therefore critically important for providing early warning information to aid decision making to cope with drought. Due to the complicated nature of drought, it has been recognized that the univariate drought indicator may not be sufficient for drought characterization and hence multivariate drought indices have been developed for drought monitoring. Alongside the substantial effort in drought monitoring with multivariate drought indices, it is of equal importance to develop a drought prediction method with multivariate drought indices to integrate drought information from various sources. This study proposes a general framework for multivariate multi-index drought prediction that is capable of integrating complementary prediction skills from multiple drought indices. The Multivariate Ensemble Streamflow Prediction (MESP) is employed to sample from historical records for obtaining statistical prediction of multiple variables, which is then used as inputs to achieve multivariate prediction. The framework is illustrated with a linearly combined drought index (LDI), which is a commonly used multivariate drought index, based on climate division data in California and New York in the United States with different seasonality of precipitation. The predictive skill of LDI (represented with persistence) is assessed by comparison with the univariate drought index and results show that the LDI prediction skill is less affected by seasonality than the meteorological drought prediction based on SPI. Prediction results from the case study show that the proposed multivariate drought prediction outperforms the persistence prediction, implying a satisfactory performance of multivariate drought prediction. The proposed method would be useful for drought prediction to integrate drought information from various sources for early drought warning.
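As a rough illustration of a linearly combined drought index and the persistence baseline mentioned above, the sketch below builds an LDI as the leading principal component of standardized indicators; the synthetic indicator series and the lag-1 persistence measure are assumptions, not the study's data or skill metric.

import numpy as np

# Illustrative standardized monthly drought indicators (e.g. precipitation-, soil-moisture-
# and runoff-based indices); values are correlated random stand-ins
rng = np.random.default_rng(0)
base = rng.normal(size=(600, 1))
indicators = base + 0.5 * rng.normal(size=(600, 3))

# A linearly combined drought index (LDI) as the leading principal component
Z = (indicators - indicators.mean(0)) / indicators.std(0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z.T))
ldi = Z @ eigvecs[:, -1]                       # direction of largest shared variance
ldi = (ldi - ldi.mean()) / ldi.std()           # re-standardize the combined index

# Simple persistence baseline: correlation between the index and its one-month lag
persistence_skill = np.corrcoef(ldi[:-1], ldi[1:])[0, 1]
print("lag-1 persistence correlation of the LDI:", round(persistence_skill, 2))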
Acceleration of Monte Carlo SPECT simulation using convolution-based forced detection
NASA Astrophysics Data System (ADS)
de Jong, H. W. A. M.; Slijpen, E. T. P.; Beekman, F. J.
2001-02-01
Monte Carlo (MC) simulation is an established tool to calculate photon transport through tissue in Emission Computed Tomography (ECT). Since the first appearance of MC a large variety of variance reduction techniques (VRT) have been introduced to speed up these notoriously slow simulations. One example of a very effective and established VRT is known as forced detection (FD). In standard FD the path from the photon's scatter position to the camera is chosen stochastically from the appropriate probability density function (PDF), modeling the distance-dependent detector response. In order to speed up MC the authors propose a convolution-based FD (CFD) which involves replacing the sampling of the PDF by a convolution with a kernel which depends on the position of the scatter event. The authors validated CFD for parallel-hole Single Photon Emission Computed Tomography (SPECT) using a digital thorax phantom. Comparison of projections estimated with CFD and standard FD shows that both estimates converge to practically identical projections (maximum bias 0.9% of peak projection value), despite the slightly different photon paths used in CFD and standard FD. Projections generated with CFD converge, however, to a noise-free projection up to one or two orders of magnitude faster, which is extremely useful in many applications such as model-based image reconstruction.
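A schematic of the convolution-based idea: rather than stochastically sampling the distance-dependent response, each activity slice is convolved with a kernel whose width depends on its distance from the camera; the Gaussian response model, its parameters and the toy phantom are assumptions for illustration.

import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative 3D activity distribution (z = distance from the collimator face)
activity = np.zeros((64, 128, 128))
activity[20, 60:68, 60:68] = 1.0      # a small hot source at moderate depth

def cfd_projection(act, sigma0=0.5, slope=0.05):
    """Convolution-based forced detection sketch: each slice is convolved with a Gaussian
    whose width grows linearly with distance to the camera (assumed response model)."""
    proj = np.zeros(act.shape[1:])
    for z in range(act.shape[0]):
        sigma = sigma0 + slope * z
        proj += gaussian_filter(act[z], sigma)
    return proj

projection = cfd_projection(activity)
print("projection total counts:", projection.sum().round(2))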
High Energy Density Li-ion Cells for EV’s Based on Novel, High Voltage Cathode Material Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kepler, Keith D.; Slater, Michael
This Li-ion cell technology development project had three objectives: to develop advanced electrode materials and cell components to enable stable high-voltage operation; to design and demonstrate a Li-ion cell using these materials that meets the PHEV40 performance targets; and to design and demonstrate a Li-ion cell using these materials that meets the EV performance targets. The major challenge to creating stable high energy cells with long cycle life is system integration. Although materials that can give high energy cells are known, stabilizing them towards long-term cycling in the presence of other novel cell components is a major challenge. The major technical barriers addressed by this work include low cathode specific energy, poor electrolyte stability during high voltage operation, and insufficient capacity retention during deep discharge for Si-containing anodes. Through the course of this project, Farasis was able to improve capacity retention of NCM materials for 4.4+ V operation through both surface treatment and bulk-doping approaches. Other material advances include increased rate capability of HE-NCM materials through a novel synthesis approach, doubling the relative capacity at 1C over materials synthesized using standard methods. Silicon active materials proved challenging throughout the project and ultimately were the limiting factor in the energy density vs. cycle life trade-off. By avoiding silicon anodes for the lower energy PHEV design, we manufactured cells with intermediate energy density and long cycle life under high voltage operation for PHEV applications. Cells with high energy density for EV applications were manufactured targeting a 300 Wh/kg design and were able to achieve > 200 cycles.
Optimizing care in osteoporosis: The Canadian quality circle project
Ioannidis, George; Thabane, Lehana; Gafni, Amiram; Hodsman, Anthony; Kvern, Brent; Johnstone, Dan; Plumley, Nathalie; Salach, Lena; Jiwa, Famida; Adachi, Jonathan D; Papaioannou, Alexandra
2008-01-01
Background While the Osteoporosis Canada 2002 Canadian guidelines provided evidence-based strategies for preventing, diagnosing, and managing this condition, publication and distribution of guidelines have not, in and of themselves, been shown to alter physicians' clinical approaches. We hypothesize that primary care physicians enrolled in the Quality Circle project would change their patient management of osteoporosis in terms of awareness of osteoporosis risk factors and bone mineral density testing in accordance with the guidelines. Methods The project consisted of five Quality Circle phases that included: 1) Training & Baseline Data Collection, 2) First Educational Intervention & First Follow-Up Data Collection, 3) First Strategy Implementation Session, 4) Final Educational Intervention & Final Follow-up Data Collection, and 5) Final Strategy Implementation Session. A total of 340 circle members formed 34 quality circles and participated in the study. The generalized estimating equations approach was used to model physician awareness of risk factors for osteoporosis and appropriate utilization of bone mineral density testing pre and post educational intervention (first year of the study). Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated. Results After the 1st year of the study, physicians' certainty of their patients' risk factor status increased. Certainty varied from an OR of 1.4 (95% CI: 1.1, 1.8) for prior vertebral fracture status to 6.3 (95% CI: 2.3, 17.9) for prior hip fracture status. Furthermore, bone mineral density testing increased in high risk as compared with low risk patients (OR: 1.4; 95% CI: 1.2, 1.7). Conclusion Quality Circle methodology was successful in increasing both physicians' awareness of osteoporosis risk factors and appropriate bone mineral density testing in accordance with the 2002 Canadian guidelines. PMID:18828906
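For readers unfamiliar with the modelling step, the sketch below fits a GEE logistic model with physicians clustered in quality circles and reports odds ratios; the simulated data, predictors and effect sizes are invented and do not reproduce the project's analysis.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative data: physicians clustered within 34 quality circles (assumed structure)
rng = np.random.default_rng(0)
n = 340
df = pd.DataFrame({
    "circle": rng.integers(0, 34, n),              # quality-circle membership
    "post_intervention": rng.integers(0, 2, n),    # before vs after the educational intervention
    "high_risk": rng.integers(0, 2, n),            # patient risk status
})
logit = -1.0 + 0.4 * df.post_intervention + 0.3 * df.high_risk
df["bmd_test"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE logistic regression with exchangeable correlation within each quality circle
model = smf.gee("bmd_test ~ post_intervention + high_risk", groups="circle", data=df,
                family=sm.families.Binomial(), cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(np.exp(result.params))   # odds ratios, the scale reported in the abstract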
Harff, Jan; Bohling, Geoffrey C.; Endler, R.; Davis, J.C.; Olea, R.A.
1999-01-01
The Holocene sediment sequence of a core taken in the centre of the Eastern Gotland Basin was subdivided into 12 lithostratigraphic units based on MSCL data (sound velocity, wet bulk density, magnetic susceptibility) using a multivariate classification method. The lower 6 units comprise the sediments deposited up to the Litorina transgression, and the upper 6 units subdivide the brackish-marine Litorina and post-Litorina sediments. The upper lithostratigraphic units reflect an alternation of anoxic (laminated) and oxic (non-laminated) sediments. By application of a numerical stratigraphic correlation method, the zonation was extended laterally onto contiguous sediment cores within the central basin. Consequently, the alternation of anoxic and oxic sediments can be used for a general lithostratigraphic subdivision of sediments of the Gotland Basin. A quantitative criterion based on the sediment-physical lithofacies is thereby added to existing subdivisions of the Holocene in the Baltic Sea.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loveday, D.L.; Craggs, C.
Box-Jenkins-based multivariate stochastic modeling is carried out using data recorded from a domestic heating system. The system comprises an air-source heat pump sited in the roof space of a house, solar assistance being provided by the conventional tile roof acting as a radiation absorber. Multivariate models are presented which illustrate the time-dependent relationships between three air temperatures: external ambient, at entry to, and at exit from, the heat pump evaporator. Using a deterministic modeling approach, physical interpretations are placed on the results of the multivariate technique. It is concluded that the multivariate Box-Jenkins approach is a suitable technique for building thermal analysis. Application to multivariate model-based control is discussed, with particular reference to building energy management systems. It is further concluded that stochastic modeling of data drawn from a short monitoring period offers a means of retrofitting an advanced model-based control system in existing buildings, which could be used to optimize energy savings. An approach to system simulation is suggested.
Multivariate Statistical Modelling of Drought and Heat Wave Events
NASA Astrophysics Data System (ADS)
Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele
2016-04-01
Multivariate Statistical Modelling of Drought and Heat Wave Events C. Manning1,2, M. Widmann1, M. Vrac2, D. Maraun3, E. Bevaqua2,3 1. School of Geography, Earth and Environmental Sciences, University of Birmingham, Edgbaston, Birmingham, UK 2. Laboratoire des Sciences du Climat et de l'Environnement, (LSCE-IPSL), Centre d'Etudes de Saclay, Gif-sur-Yvette, France 3. Wegener Center for Climate and Global Change, University of Graz, Brandhofgasse 5, 8010 Graz, Austria Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is on the interaction between drought and heat wave events. Soil moisture has both a local and non-local effect on the occurrence of heat waves, as it strongly controls the latent heat flux affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning, and vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in principle, for the formulation of multivariate dependence structures in any dimension: a PCC is a decomposition of a multivariate distribution into a product of bivariate components, each modelled using copulas. A copula is a multivariate distribution function which allows one to model the dependence structure of given variables separately from their marginal behaviour. We first examine the structure of soil moisture drought over the whole of France using the SAFRAN dataset between 1959 and 2009. Soil moisture is represented using the Standardised Precipitation Evapotranspiration Index (SPEI). Drought characteristics are computed at grid point scale, where drought conditions are identified as those with an SPEI value below -1.0. We model the multivariate dependence structure of drought events defined by certain characteristics and compute return levels of these events. We initially find that drought characteristics such as duration, mean SPEI and the maximum contiguous area to a grid point all have positive correlations, though the degree to which they are correlated can vary considerably spatially. A spatial representation of return levels may then provide insight into the areas most prone to drought conditions. As a next step, we analyse the dependence structure between soil moisture conditions preceding the onset of a heat wave and the heat wave itself.
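The copula idea underlying the pair copula constructions mentioned above can be illustrated with a deliberately simplified two-variable sketch: transform each drought characteristic to pseudo-observations by ranking, fit a single Gaussian copula (the study itself uses PCCs, possibly with other copula families), and evaluate a joint exceedance probability. All data below are synthetic.

```python
# Simplified sketch (synthetic data; a single Gaussian copula instead of a PCC):
# separating the dependence structure of two drought characteristics from their margins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
duration = np.exp(z[:, 0])              # synthetic event durations
severity = np.exp(0.5 * z[:, 1])        # synthetic mean-SPEI magnitudes

def pseudo_obs(x):                      # empirical CDF transform to (0, 1)
    return stats.rankdata(x) / (len(x) + 1.0)

u, v = pseudo_obs(duration), pseudo_obs(severity)

# Gaussian copula parameter: correlation of the normal scores
rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]
cop = stats.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])

# probability that both characteristics exceed their 90th percentiles (a compound event)
q = stats.norm.ppf(0.9)
p_joint = 1 - 2 * 0.9 + cop.cdf([q, q])
print(rho, p_joint)
```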
Quantitative Analysis of Cotton Canopy Size in Field Conditions Using a Consumer-Grade RGB-D Camera.
Jiang, Yu; Li, Changying; Paterson, Andrew H; Sun, Shangpeng; Xu, Rui; Robertson, Jon
2017-01-01
Plant canopy structure can strongly affect crop functions such as yield and stress tolerance, and canopy size is an important aspect of canopy structure. Manual assessment of canopy size is laborious and imprecise, and cannot measure multi-dimensional traits such as projected leaf area and canopy volume. Field-based high throughput phenotyping systems with imaging capabilities can rapidly acquire data about plants in field conditions, making it possible to quantify and monitor plant canopy development. The goal of this study was to develop a 3D imaging approach to quantitatively analyze cotton canopy development in field conditions. A cotton field was planted with 128 plots, including four genotypes of 32 plots each. The field was scanned by GPhenoVision (a customized field-based high throughput phenotyping system) to acquire color and depth images with GPS information in 2016 covering two growth stages: canopy development, and flowering and boll development. A data processing pipeline was developed, consisting of three steps: plot point cloud reconstruction, plant canopy segmentation, and trait extraction. Plot point clouds were reconstructed using color and depth images with GPS information. In colorized point clouds, vegetation was segmented from the background using an excess-green (ExG) color filter, and cotton canopies were further separated from weeds based on height, size, and position information. Static morphological traits were extracted on each day, including univariate traits (maximum and mean canopy height and width, projected canopy area, and concave and convex volumes) and a multivariate trait (cumulative height profile). Growth rates were calculated for univariate static traits, quantifying canopy growth and development. Linear regressions were performed between the traits and fiber yield to identify the best traits and measurement time for yield prediction. The results showed that fiber yield was correlated with static traits after the canopy development stage (R² = 0.35-0.71) and growth rates in early canopy development stages (R² = 0.29-0.52). Multi-dimensional traits (e.g., projected canopy area and volume) outperformed one-dimensional traits, and the multivariate trait (cumulative height profile) outperformed univariate traits. The proposed approach would be useful for identification of quantitative trait loci (QTLs) controlling canopy size in genetics/genomics studies or for fiber yield prediction in breeding programs and production environments.
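For reference, the excess-green (ExG) filter named in the segmentation step is a simple chromatic index; the sketch below shows the standard formulation (2g − r − b on normalized colour coordinates) with an illustrative threshold, not the study's tuned value.

```python
# Hedged sketch of an excess-green (ExG) vegetation filter; the 0.1 threshold is
# illustrative, not the value used in the GPhenoVision pipeline.
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """rgb: float array of shape (N, 3) with values in [0, 1]."""
    total = rgb.sum(axis=1, keepdims=True) + 1e-9
    r, g, b = (rgb / total).T          # chromatic (normalized) coordinates
    exg = 2.0 * g - r - b              # excess-green index
    return exg > threshold             # True where a point looks like vegetation

points_rgb = np.random.rand(1000, 3)   # stand-in for colorized point-cloud colours
print(excess_green_mask(points_rgb).mean())
```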
Cinner, Joshua E; Bodin, Orjan
2010-08-11
Diverse livelihood portfolios are frequently viewed as a critical component of household economies in developing countries. Within the context of natural resources governance in particular, the capacity of individual households to engage in multiple occupations has been shown to influence important issues such as whether fishers would exit a declining fishery, how people react to policy, the types of resource management systems that may be applicable, and other decisions about natural resource use. This paper uses network analysis to provide a novel methodological framework for detailed systemic analysis of household livelihood portfolios. Paying particular attention to the role of natural resource-based occupations such as fisheries, we use network analyses to map occupations and their interrelationships - what we refer to as 'livelihood landscapes'. This network approach allows for the visualization of complex information about dependence on natural resources that can be aggregated at different scales. We then examine how the role of natural resource-based occupations changes along spectra of socioeconomic development and population density in 27 communities in 5 western Indian Ocean countries. Network statistics, including in- and out-degree centrality, the density of the network, and the level of network centralization are compared along a multivariate index of community-level socioeconomic development and a gradient of human population density. The combination of network analyses suggests an increase in household-level specialization with development for most occupational sectors, including fishing and farming, but that at the community level, economies remained diversified. The novel modeling approach introduced here provides for various types of livelihood portfolio analyses at different scales of social aggregation. Our livelihood landscapes approach provides insights into communities' dependencies on and usages of natural resources, and shows how patterns of occupational interrelationships relate to socioeconomic development and population density. A key question for future analysis is how the reduction of household occupational diversity, together with the maintenance of community-level diversity, that we see with increasing socioeconomic development influences key aspects of societies' vulnerability to environmental change or disasters.
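The network statistics referred to above (in- and out-degree centrality, density, and centralization) can be computed with standard tools; the sketch below uses networkx on an invented toy "livelihood landscape", and the centralization measure shown is a simple Freeman-style summary rather than the exact statistic used in the paper.

```python
# Toy sketch (hypothetical occupations and links, not the survey data): network
# density, in-/out-degree centrality, and a simple degree-centralization summary.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("fishing", "farming"), ("fishing", "tourism"),
    ("farming", "salaried_work"), ("tourism", "fishing"),
])

print("density:", nx.density(G))
print("in-degree centrality:", nx.in_degree_centrality(G))
print("out-degree centrality:", nx.out_degree_centrality(G))

cent = nx.degree_centrality(G)
centralization = sum(max(cent.values()) - c for c in cent.values()) / (len(G) - 1)
print("degree centralization (Freeman-style):", centralization)
```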
Quantification of micro stickies
Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr
1997-01-01
The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...
Aledo, Rosa; Padró, Teresa; Mata, Pedro; Alonso, Rodrigo; Badimon, Lina
2015-04-01
Recent genome-wide association studies have identified a locus on chromosome 12q13.3 associated with plasma levels of triglyceride and high-density lipoprotein cholesterol, with rs11613352 being the lead single nucleotide polymorphism in this genome-wide association study locus. The aim of the study was to investigate the involvement of rs11613352 in a population with high cardiovascular risk due to familial hypercholesterolemia. The single nucleotide polymorphism was genotyped by TaqMan® assay in a cohort of 601 unrelated familial hypercholesterolemia patients and its association with plasma triglyceride and high-density lipoprotein cholesterol levels was analyzed by multivariate methods based on linear regression. Minor allele frequency was 0.17 and genotype frequencies were 0.69, 0.27, and 0.04 for CC, CT, and TT genotypes, respectively. The polymorphism is associated in a recessive manner (TT genotype) with a decrease in triglyceride levels (P=.002) and with an increase in high-density lipoprotein cholesterol levels (P=.021) after adjusting for age and sex. The polymorphism rs11613352 may contribute to modulating the cardiovascular risk by modifying plasma lipid levels in familial hypercholesterolemia patients. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
Reconstruction of internal density distributions in porous bodies from laser ultrasonic data
NASA Technical Reports Server (NTRS)
Lu, Yichi; Goldman, Jeffrey A.; Wadley, Haydn N. G.
1992-01-01
It is presently shown that, for density-reconstruction problems in which information about the inhomogeneity is known a priori, the nonlinear least-squares algorithm yields satisfactory results on the basis of limited projection data. The back-projection algorithm, which obviates assumptions about the objective function to be reconstructed, does not recover the boundary of the inhomogeneity when the number of projections is limited and ray-bending is ignored.
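As a toy illustration of the a-priori-constrained strategy described above, when the inhomogeneity is known to be, say, a single circular low-density inclusion, its centre, radius and density contrast can be fitted to a handful of straight-ray projection measurements by nonlinear least squares; the geometry and data below are entirely synthetic and ray bending is ignored.

```python
# Synthetic sketch only: fitting the parameters of an assumed circular inclusion
# to limited projection (line-integral) data with nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def chord_length(cx, r, offsets):
    """Length of a vertical ray at x = offset through a circle of radius r centred at x = cx."""
    d2 = r**2 - (offsets - cx) ** 2
    return 2.0 * np.sqrt(np.maximum(d2, 0.0))

def projections(params, offsets, background=1.0, path_length=2.0):
    cx, r, drho = params
    # line integral of density along each ray: background body plus the inclusion's contribution
    return background * path_length + drho * chord_length(cx, r, offsets)

offsets = np.linspace(-1.0, 1.0, 8)                  # only 8 rays: "limited projection data"
true = np.array([0.2, 0.3, -0.4])                    # centre, radius, density deficit (a pore)
rng = np.random.default_rng(0)
data = projections(true, offsets) + 0.01 * rng.standard_normal(offsets.size)

fit = least_squares(lambda p: projections(p, offsets) - data, x0=[0.0, 0.2, -0.1])
print(fit.x)                                         # recovered centre, radius, density contrast
```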
Model-based Optimization and Feedback Control of the Current Density Profile Evolution in NSTX-U
NASA Astrophysics Data System (ADS)
Ilhan, Zeki Okan
Nuclear fusion research is a highly challenging, multidisciplinary field seeking contributions from both plasma physics and multiple engineering areas. As an application of plasma control engineering, this dissertation mainly explores methods to control the current density profile evolution within the National Spherical Torus eXperiment-Upgrade (NSTX-U), a substantial upgrade of the NSTX device located at the Princeton Plasma Physics Laboratory (PPPL), Princeton, NJ. Active control of the toroidal current density profile is among those plasma control milestones that the NSTX-U program must achieve to realize its next-step operational goals, which are characterized by high-performance, long-pulse, MHD-stable plasma operation with neutral beam heating. Therefore, the aim of this work is to develop model-based, feedforward and feedback controllers that can enable regulation of the current density profile evolution in NSTX-U by actuating the total plasma current, electron density, and the powers of the individual neutral beam injectors. Motivated by the coupled, nonlinear, multivariable, distributed-parameter plasma dynamics, the first step towards control design is the development of a physics-based, control-oriented model for the current profile evolution in NSTX-U in response to non-inductive current drives and heating systems. Numerical simulations of the proposed control-oriented model show qualitative agreement with the high-fidelity physics code TRANSP. The next step is to utilize the proposed control-oriented model to design an open-loop actuator trajectory optimizer. Given a desired operating state, the optimizer produces the actuator trajectories that can steer the plasma to such a state. The objective of the feedforward control design is to provide a more systematic approach to advanced scenario planning in NSTX-U since the development of such scenarios is conventionally carried out experimentally by modifying the tokamak's actuator trajectories and analyzing the resulting plasma evolution. Finally, the proposed control-oriented model is embedded in feedback control schemes based on optimal control and Model Predictive Control (MPC) approaches. Integrators are added to the standard Linear Quadratic Gaussian (LQG) and MPC formulations to provide robustness against various modeling uncertainties and external disturbances. The effectiveness of the proposed feedback controllers in regulating the current density profile in NSTX-U is demonstrated in closed-loop nonlinear simulations. Moreover, the optimal feedback control algorithm has been implemented successfully in closed-loop control simulations within TRANSP through the recently developed Expert routine. (Abstract shortened by ProQuest.).
Jiang, Hua; Peng, Jin; Zhou, Zhi-yuan; Duan, Yu; Chen, Wei; Cai, Bin; Yang, Hao; Zhang, Wei
2010-09-01
Spinal cord injury (SCI) is a complex trauma that involves multiple pathological mechanisms, including cytotoxic, oxidative stress and immune-endocrine processes. This study aimed to establish a plasma metabonomics fingerprinting atlas for SCI using (1)H nuclear magnetic resonance (NMR) based metabonomics methodology and principal component analysis techniques. Nine Sprague-Dawley (SD) male rats were randomly divided into SCI, normal and sham-operation control groups. Plasma samples were collected for (1)H NMR spectroscopy 3 days after operation. The NMR data were analyzed using principal component analysis technique with Matlab software. Metabonomics analysis was able to distinguish the three groups (SCI, normal control, sham-operation). The fingerprinting atlas indicated that, compared with those without SCI, the SCI group demonstrated the following characteristics with regard to the second principal component: it is made up of fatty acids, myo-inositol, arginine, very low-density lipoprotein (VLDL), low-density lipoprotein (LDL), triglyceride (TG), glucose, and 3-methyl-histamine. The data indicated that SCI results in several significant changes in plasma metabolism early on and that a metabonomics approach based on (1)H NMR spectroscopy can provide a metabolic profile comprising several metabolite classes and allow for relative quantification of such changes. The results also provided support for further development and application of metabonomics technologies for studying SCI and for the utilization of multivariate models for classifying the extent of trauma within an individual.
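A minimal sketch of the analysis style described above: principal component analysis of binned 1H NMR spectra followed by inspection of group separation in score space. The spectra below are synthetic and the group shift is invented; this is not the authors' Matlab pipeline.

```python
# Hedged sketch (synthetic "spectra"): PCA-based separation of SCI, sham and
# normal groups in principal-component score space.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_per_group, n_bins = 5, 200
base = rng.normal(0, 1, n_bins)
spectra, groups = [], []
for g, shift in enumerate([0.0, 0.1, 1.0]):          # normal, sham-operation, SCI
    for _ in range(n_per_group):
        s = base + rng.normal(0, 0.1, n_bins)
        s[:40] += shift                               # pretend the first bins carry lipid/glucose signals
        spectra.append(s)
        groups.append(g)
spectra, groups = np.array(spectra), np.array(groups)

scores = PCA(n_components=2).fit_transform(spectra)
for g, name in enumerate(["normal", "sham", "SCI"]):
    print(name, scores[groups == g].mean(axis=0))     # group centroids in PC space
```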
Yap, Natalie; Wong, Phillip; McGinn, Stella; Nery, Maria-Liza; Doyle, Jean; Wells, Lynda; Clifton-Bligh, Phillip; Clifton-Bligh, Roderick J
2017-01-01
Low bone mineral density (BMD) is a known independent predictor of mortality in the general elderly population. However, studies in patients with end-stage renal disease (ESRD) are limited. The present study evaluated mortality during long-term follow-up in a population of patients having dialysis for ESRD, in whom BMD was also measured. Fifty-eight patients with ESRD were recruited consecutively from a dialysis clinic and followed prospectively for 6 years. Baseline BMD of the lumbar spine and femoral neck (FN) were measured by X-ray absorptiometry and by peripheral quantitative CT at the radius and tibia. Serum calcium, phosphate, parathyroid hormone (PTH), and albumin were measured at baseline. During follow-up, 25 patients died. Univariate analysis showed that mortality was significantly associated with FN-BMD: hazard ratio (HR) per 0.1 g/cm2 decrease 1.50 (95% CI 1.07-2.10), p = 0.019; FN-T score: HR per 1-SD decrease 1.84 (95% CI 1.16-2.92), p = 0.009; and tibial cortical density: HR per 10 mg/cm3 decrease 1.08 (95% CI 1.02-1.14), p = 0.010. In multivariate analysis with stepwise adjustment for age, sex, transplant status, albumin, PTH, phosphate, dialysis duration, diabetes, and smoking, FN-T score remained significantly associated with mortality: HR per 1-SD decrease 1.82 (95% CI 1.02-3.24), p = 0.044, whereas the HRs for FN-BMD and tibial cortical density were no longer significant. When 4 patients who had peritoneal dialysis were excluded, the HRs relating FN-BMD, FN-T score, and tibial cortical density to mortality remained significant but became insignificant when albumin was included in the multivariate analysis. Reduced FN-BMD, FN-T score, and tibial cortical density were significantly associated with an increased risk of death in patients with ESRD. © 2017 S. Karger AG, Basel.
Brand, Judith S; Humphreys, Keith; Thompson, Deborah J; Li, Jingmei; Eriksson, Mikael; Hall, Per; Czene, Kamila
2014-12-01
Mammographic density is a strongly heritable trait, but data on its genetic component are limited to area-based and qualitative measures. We studied the heritability of volumetric mammographic density ascertained by a fully-automated method and the association with breast cancer susceptibility loci. Heritability of volumetric mammographic density was estimated with a variance component model in a sib-pair sample (N pairs = 955) of a Swedish screening-based cohort. Associations with 82 established breast cancer loci were assessed in an independent sample of the same cohort (N = 4025 unrelated women) using linear models, adjusting for age, body mass index, and menopausal status. All tests were two-sided, except for heritability analyses where one-sided tests were used. After multivariable adjustment, heritability estimates (standard error) for percent dense volume, absolute dense volume, and absolute nondense volume were 0.63 (0.06), 0.43 (0.06), and 0.61 (0.06), respectively (all P < .001). Percent and absolute dense volume were associated with rs10995190 (ZNF365; P = 9.0 × 10(-6) and 8.9 × 10(-7), respectively) and rs9485372 (TAB2; P = 1.8 × 10(-5) and 1.8 × 10(-3), respectively). We also observed associations of rs9383938 (ESR1) and rs2046210 (ESR1) with the absolute dense volume (P = 2.6 × 10(-4) and 4.6 × 10(-4), respectively), and rs6001930 (MLK1) and rs17356907 (NTN4) with the absolute nondense volume (P = 6.7 × 10(-6) and 8.4 × 10(-5), respectively). Our results support the high heritability of mammographic density, though estimates are weaker for absolute than percent dense volume. We also demonstrate that the shared genetic component with breast cancer is not restricted to dense tissues only. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
An Improved Method of Heterogeneity Compensation for the Convolution / Superposition Algorithm
NASA Astrophysics Data System (ADS)
Jacques, Robert; McNutt, Todd
2014-03-01
Purpose: To improve the accuracy of convolution/superposition (C/S) in heterogeneous material by developing a new algorithm: heterogeneity compensated superposition (HCS). Methods: C/S has proven to be a good estimator of the dose deposited in a homogeneous volume. However, near heterogeneities electron disequilibrium occurs, leading to the faster fall-off and re-buildup of dose. We propose to filter the actual patient density in a position and direction sensitive manner, allowing the dose deposited near interfaces to be increased or decreased relative to C/S. We implemented the effective density function as a multivariate first-order recursive filter and incorporated it into a GPU-accelerated, multi-energetic C/S implementation. We compared HCS against C/S using the ICCR 2000 Monte-Carlo accuracy benchmark, 23 similar accuracy benchmarks and 5 patient cases. Results: Multi-energetic HCS increased the dosimetric accuracy for the vast majority of voxels; in many cases, near-Monte-Carlo accuracy was achieved. We defined the per-voxel error, %|mm, as the minimum of the distance to agreement in mm and the dosimetric percentage error relative to the maximum MC dose. HCS improved the average mean error by 0.79 %|mm for the patient volumes, reducing the average mean error from 1.93 %|mm to 1.14 %|mm. Very low densities (i.e. < 0.1 g/cm3) remained problematic, but may be solvable with a better filter function. Conclusions: HCS improved upon C/S's density-scaled heterogeneity correction with a position and direction sensitive density filter. This method significantly improved the accuracy of the GPU-based algorithm, reaching the accuracy levels of Monte Carlo based methods with run times of a few tenths of a second per beam. Acknowledgement: Funding for this research was provided by the NSF Cooperative Agreement EEC9731748, Elekta / IMPAC Medical Systems, Inc. and the Johns Hopkins University. James Satterthwaite provided the Monte Carlo benchmark simulations.
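The "multivariate first-order recursive filter" mentioned above is not specified in detail here; the toy sketch below only illustrates what a first-order recursive (IIR) filtering of density samples along a single ray looks like, with an invented smoothing constant, to show how an "effective density" can lag behind sharp interfaces.

```python
# Toy sketch only (invented smoothing constant; the actual HCS filter is
# position- and direction-dependent): a first-order recursive filter applied to
# the densities along one ray, yielding a smoothed effective density that lags
# behind interfaces, mimicking dose build-up and fall-off near heterogeneities.
import numpy as np

def effective_density(rho, alpha=0.3):
    eff = np.empty_like(rho)
    eff[0] = rho[0]
    for i in range(1, rho.size):
        eff[i] = alpha * rho[i] + (1 - alpha) * eff[i - 1]   # first-order IIR filter
    return eff

ray = np.r_[np.ones(20), 0.1 * np.ones(20), np.ones(20)]     # tissue | lung-like | tissue
print(effective_density(ray)[18:26])                          # lagged response across the interface
```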
Update of the DTM thermosphere model in the framework of the H2020 project `SWAMI'
NASA Astrophysics Data System (ADS)
Bruinsma, S.; Jackson, D.; Stolle, C.; Negrin, S.
2017-12-01
In the framework of the H2020 project SWAMI (Space Weather Atmosphere Model and Indices), which is expected to start in January 2018, the CIRA thermosphere specification model DTM2013 will be improved through the combination of assimilating more density data to drive down remaining biases and a new high cadence kp geomagnetic index in order to improve storm-time performance. Five more years of GRACE high-resolution densities from 2012-2016, densities from the last year of the GOCE mission, Swarm mean densities, and mean densities from 2010-2017 inferred from the geodetic satellites at about 800 km are available now. The DTM2013 model will be compared with the new density data in order to detect possible systematic errors or other kinds of deficiencies and a first analysis will be presented. Also, a more detailed analysis of model performance under storm conditions will be provided, which will then be the benchmark to quantify model improvement expected with the higher cadence kp indices. In the SWAMI project, the DTM model will be coupled in the 120-160 km altitude region to the Met Office Unified Model in order to create a whole atmosphere model. It can be used for launch operations, re-entry computations, orbit prediction, and aeronomy and space weather studies. The project objectives and time line will be given.
Understanding lithospheric stresses in the Arctic: constraints and models
NASA Astrophysics Data System (ADS)
Medvedev, Sergei; Minakov, Alexander; Lebedeva-Ivanova, Nina; Gaina, Carmen
2016-04-01
This pilot project aims to model stress patterns and analyze factors controlling lithospheric stresses in the Arctic. The project aims to understand the modern stresses in the Arctic as well as to define ways to test recent hypotheses about the Cenozoic evolution of the region. The regions around the Lomonosov Ridge and the Barents Sea are of particular interest, driven by the recent acquisition of high-resolution potential field and seismic data. Naturally, the major contributor to the lithospheric stress distribution is the gravitational potential energy (GPE). The study tries to incorporate available geological and geophysical data to build a reliable GPE model. In particular, we use the recently developed integrated gravity inversion for crustal thickness, which incorporates up-to-date compilations of gravity anomalies, bathymetry, and sedimentary thickness. The modelled lithosphere thermal structure assumes pure shear extension and an ocean age model constrained by global plate kinematics for the last ca. 120 Ma. The results of this approach are juxtaposed with estimates of the density variation inferred from upper mantle S-wave velocity models based on previous surface wave tomography studies. Although new data and interpretations of the Arctic lithosphere structure are now becoming available, there are areas of low accuracy or even a lack of data. To compensate for this, we compare two approaches to constrain GPE: (1) one that directly integrates the density of the modelled lithosphere and (2) one that uses geoid anomalies filtered to account for density variations down to the base of the lithosphere only. The two versions of GPE are compared with each other, and the numerically calculated stresses are compared with observations. That allows us to optimize GPE and understand the density structure, stress pattern, and factors controlling the stresses in the Arctic.
Doan, Nhat Trung; Engvig, Andreas; Zaske, Krystal; Persson, Karin; Lund, Martina Jonette; Kaufmann, Tobias; Cordova-Palomera, Aldo; Alnæs, Dag; Moberget, Torgeir; Brækhus, Anne; Barca, Maria Lage; Nordvik, Jan Egil; Engedal, Knut; Agartz, Ingrid; Selbæk, Geir; Andreassen, Ole A; Westlye, Lars T
2017-09-01
Alzheimer's disease (AD) is a debilitating age-related neurodegenerative disorder. Accurate identification of individuals at risk is complicated as AD shares cognitive and brain features with aging. We applied linked independent component analysis (LICA) on three complementary measures of gray matter structure: cortical thickness, area and gray matter density of 137 AD, 78 mild (MCI) and 38 subjective cognitive impairment patients, and 355 healthy adults aged 18-78 years to identify dissociable multivariate morphological patterns sensitive to age and diagnosis. Using the lasso classifier, we performed group classification and prediction of cognition and age at different age ranges to assess the sensitivity and diagnostic accuracy of the LICA patterns in relation to AD, as well as early and late healthy aging. Three components showed high sensitivity to the diagnosis and cognitive status of AD, with different relationships with age: one reflected an anterior-posterior gradient in thickness and gray matter density and was uniquely related to diagnosis, whereas the other two, reflecting widespread cortical thickness and medial temporal lobe volume, respectively, also correlated significantly with age. Repeating the LICA decomposition and between-subject analysis on ADNI data, including 186 AD, 395 MCI and 220 age-matched healthy controls, revealed largely consistent brain patterns and clinical associations across samples. Classification results showed that multivariate LICA-derived brain characteristics could be used to predict AD and age with high accuracy (area under ROC curve up to 0.93 for classification of AD from controls). Comparison between classifiers based on feature ranking and feature selection suggests both common and unique feature sets implicated in AD and aging, and provides evidence of distinct age-related differences in early compared to late aging. Copyright © 2017 Elsevier Inc. All rights reserved.
Chronic hepatitis C infection is associated with insulin resistance and lipid profiles.
Dai, Chia-Yen; Yeh, Ming-Lun; Huang, Chung-Feng; Hou, Chen-Hsiu; Hsieh, Ming-Yen; Huang, Jee-Fu; Lin, I-Ling; Lin, Zu-Yau; Chen, Shinn-Chern; Wang, Liang-Yen; Chuang, Wan-Long; Yu, Ming-Lung; Tung, Hung-Da
2015-05-01
Chronic hepatitis C virus (HCV) infection has been suggested to be associated with non-insulin-dependent diabetes mellitus and lipid profiles. This study aimed to investigate the possible relationships of insulin resistance (IR) and lipid profiles with chronic hepatitis C (CHC) in patients in Taiwan. We enrolled 160 hospital-based CHC patients with liver biopsy and 480 control individuals from the community without CHC or chronic hepatitis B and with no known history of non-insulin-dependent diabetes mellitus. Fasting plasma glucose, total cholesterol, high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C), triglycerides (TGs), alanine aminotransferase, and serum insulin levels, and homeostasis model assessment (HOMA-IR) were tested. When comparing factors between CHC patients and sex- and age-matched controls who had no HCV infection, patients with HCV infection had a significantly higher alanine aminotransferase level, fasting plasma glucose level, insulin level, and HOMA-IR (P < 0.001, P = 0.023, P = 0.017, and P = 0.011, respectively), and significantly lower TG level (P = 0.023), total cholesterol, and HDL-C and LDL-C levels (all P < 0.001) than the 480 controls. In multivariate logistic regression analyses, low total cholesterol, low TG, and high HOMA-IR were independent factors significantly associated with chronic HCV infection. In the 160 CHC patients (41 patients with high HOMA-IR [> 2.5]), high body mass index, TG, and HCV RNA levels were independent factors significantly associated with high HOMA-IR in multivariate logistic analyses. Chronic HCV infection was associated with metabolic characteristics including IR and lipid profile. IR was also associated with virological characteristics. © 2013 Journal of Gastroenterology and Hepatology Foundation and Wiley Publishing Asia Pty Ltd.
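For reference, the HOMA-IR index used above is conventionally computed from fasting insulin and fasting glucose; the snippet below shows the standard formula (not data from this study), with the units it assumes noted in the comments.

```python
# Standard HOMA-IR formula (illustrative values, not study data):
#   HOMA-IR = fasting insulin [uU/mL] * fasting glucose [mmol/L] / 22.5
# With glucose in mg/dL, divide the product by 405 instead.
def homa_ir(insulin_uU_per_mL, glucose_mmol_per_L):
    return insulin_uU_per_mL * glucose_mmol_per_L / 22.5

print(homa_ir(10.0, 5.6))   # about 2.49, just below the > 2.5 cut-off used above
```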
Eating patterns and energy and nutrient intakes of US women.
Haines, P S; Hungerford, D W; Popkin, B M; Guilkey, D K
1992-06-01
A longitudinal multivariate analysis was used to determine whether differences in energy and nutrient intakes were present for women classified into different eating patterns. Ten multidimensional eating patterns were created based on the proportion of energy consumed at home and at seven away-from-home locations. Data were from 1,120 women aged 19 through 50 years who were surveyed up to six times over a 1-year period as part of the 1985 Continuing Survey of Food Intake by Individuals, US Department of Agriculture. Data from 5,993 days were analyzed. To examine differences in energy and nutrient intakes, longitudinal multivariate analyses were used to control for eating pattern and factors such as demographics, season, and day of week. Younger women in the Fast Food eating pattern consumed the greatest intakes of energy, total fat, saturated fat, cholesterol, and sodium. Well-educated, higher-income women in the Restaurant pattern consumed diets with the highest overall fat density. Nutrient densities for dietary fiber, calcium, vitamin C, and folacin were particularly low in away-from-home eating patterns. In contrast, moderately educated, middle-aged and middle-income women in the Home Mixed eating pattern (70% at home, 30% away from home) consumed the most healthful diets. We conclude that knowledge of demographics such as income and education is not enough to target dietary interventions. Rather, educational efforts must consider both demographics and the location of away-from-home eating. This will allow development of behavioral change strategies that consider food choices dictated by the eating environment as well as personal knowledge and attitude factors related to adoption of healthful food choices.
Li, Jia; Zhang, Haibo; Chen, Yongshan; Luo, Yongming; Zhang, Hua
2016-07-01
To quantify the extent of antibiotic contamination and to identify the dominant pollutant sources in the Tiaoxi River Watershed, surface water samples were collected at eight locations and analyzed for four tetracyclines and three sulfonamides using ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS). The observed maximum concentrations of tetracycline (623 ng L(-1)), oxytetracycline (19,810 ng L(-1)), and sulfamethoxazole (112 ng L(-1)) exceeded their corresponding Predicted No Effect Concentration (PNEC) values. In particular, high concentrations of antibiotics were observed in the wet summer with heavy rainfall. The maximum concentrations of antibiotics appeared in the vicinity of intensive aquaculture areas. High-resolution land use data were used for identifying diffuse sources of antibiotic pollution in the watershed. Significant correlations between tetracycline and developed land (r = 0.93), tetracycline and barren land (r = 0.87), oxytetracycline and barren land (r = 0.82), and sulfadiazine and agricultural facilities (r = 0.71) were observed. In addition, the density of aquaculture significantly correlated with doxycycline (r = 0.74) and oxytetracycline (r = 0.76), while the density of livestock significantly correlated with sulfadiazine (r = 0.71). Principal Component Analysis (PCA) indicated that doxycycline, tetracycline, oxytetracycline, and sulfamethoxazole were from aquaculture and domestic sources, whereas sulfadiazine and sulfamethazine were from livestock wastewater. Flood or drainage from aquaculture ponds was identified as a major source of antibiotics in the Tiaoxi watershed. A hot-spot map was created based on the results of land use analysis and multivariate statistics, which provided an effective management tool for source identification in watersheds with multiple diffuse sources of antibiotic pollution.
Ya, Gao; Qiu, Zhang; Tianrong, Pan
2018-06-01
Atherosclerotic cardiovascular disease is the leading cause of mortality in patients with type 2 diabetes mellitus, and both coronary artery disease (CAD) and diabetes mellitus are associated with inflammation. Emerging evidence suggests a relationship of the monocyte to high-density lipoprotein cholesterol ratio (MHR) with the incidence and severity of CAD. The aim of the present study was to examine the association of MHR with CAD in patients with type 2 diabetes mellitus. A total of 458 consecutive individuals were enrolled, comprising 178 type 2 diabetic patients without CAD, 124 type 2 diabetic patients with CAD, and 156 healthy volunteers as controls. A multivariable logistic regression model was used to evaluate the relationship between the MHR and CAD in type 2 diabetes, and the receiver operating characteristic (ROC) curve of MHR was used for predicting the presence of CAD in type 2 diabetic patients. Values of MHR were significantly higher in type 2 diabetic patients with CAD compared with those without CAD and the control group. Moreover, multivariate logistic regression analysis showed that MHR was an independent predictor of the presence of CAD in type 2 diabetic patients (OR = 1.361, 95% CI 1.245 - 1.487, p < 0.0001). Based on the receiver operating characteristic (ROC) curve, the cutoff value of MHR (> 8.2) in predicting the presence of CAD in type 2 diabetic patients yielded a sensitivity and specificity of 83.74% and 62.15%, respectively, with an area under the curve of 0.795 (95% CI: 0.745 - 0.840). The MHR is strongly associated with CAD in type 2 diabetes and might be a potential biomarker to predict the presence of CAD in type 2 diabetic patients.
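An ROC-derived cut-off such as the MHR > 8.2 threshold above is typically chosen by maximizing Youden's J (sensitivity + specificity − 1); the sketch below shows that procedure on synthetic data with scikit-learn, not the study's dataset.

```python
# Hedged sketch (synthetic MHR values): deriving an ROC cut-off by Youden's J.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
mhr = np.r_[rng.normal(7, 2, 200), rng.normal(10, 2, 120)]   # no-CAD group, CAD group
has_cad = np.r_[np.zeros(200), np.ones(120)]

fpr, tpr, thresholds = roc_curve(has_cad, mhr)
best = np.argmax(tpr - fpr)                                  # Youden's J = sens + spec - 1
print("AUC:", roc_auc_score(has_cad, mhr))
print("cut-off:", thresholds[best], "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```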
Bone Mineral Density across a Range of Physical Activity Volumes: NHANES 2007–2010
Whitfield, Geoffrey P.; Kohrt, Wendy M.; Pettee Gabriel, Kelley K.; Rahbar, Mohammad H.; Kohl, Harold W.
2014-01-01
Introduction The association between aerobic physical activity volume and bone mineral density (BMD) is not completely understood. The purpose of this study was to clarify the association between BMD and aerobic activity across a broad range of activity volumes, in particular volumes between those recommended in the 2008 Physical Activity Guidelines for Americans and those of trained endurance athletes. Methods Data from the 2007–2010 National Health and Nutrition Examination Survey were used to quantify the association between reported physical activity and BMD at the lumbar spine and proximal femur across the entire range of activity volumes reported by US adults. Participants were categorized into multiples of the minimum guideline-recommended volume based on reported moderate and vigorous intensity leisure activity. Lumbar and proximal femur BMD was assessed with dual-energy x-ray absorptiometry. Results Among women, multivariable-adjusted linear regression analyses revealed no significant differences in lumbar BMD across activity categories, while proximal femur BMD was significantly higher among those who exceeded guidelines by 2–4 times than those who reported no activity. Among men, multivariable-adjusted BMD at both sites neared its highest values among those who exceeded guidelines by at least 4 times and was not progressively higher with additional activity. Logistic regression estimating the odds of low BMD generally echoed the linear regression results. Conclusion The association between physical activity volume and BMD is complex. Among women, exceeding guidelines by 2–4 times may be important for maximizing BMD at the proximal femur, while among men, exceeding guidelines by 4+ times may be beneficial for lumbar and proximal femur BMD. PMID:24870584
Modeling Compound Flood Hazards in Coastal Embayments
NASA Astrophysics Data System (ADS)
Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.
2017-12-01
Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors (e.g. sea level rise and river flooding) are driving increasing flood hazards. Quantitative risk assessment is required for administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels such as 100 and 500 year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), a preferred marginal scenario, and reproduced time series of ensembles based on Monte Carlo sampling of the bivariate hazard domain. The comparison between the resulting extreme water dynamics under the compound hazard scenarios explained above provides insight into the strengths and weaknesses of each approach and helps modelers choose the scenario that best fits the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk, by cautiously reducing the dimensionality of the hazard analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N; White, Devin A; Urban, Marie L
2013-01-01
The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life based largely on information available in open source. Currently, activity based density estimates are based on simple summary data statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort, which considers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty was well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
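The abstract's elicitation algorithm itself is not reproduced here; the snippet below only shows the simpler moment-matching step often used when turning an elicited best guess and spread for a proportion into Beta distribution parameters, as a hedged illustration of how informative Beta priors can be encoded.

```python
# Moment-matching sketch (not the PDT elicitation algorithm): convert an elicited
# mean and standard deviation for a proportion into Beta(alpha, beta) parameters.
def beta_from_mean_sd(mean, sd):
    var = sd ** 2
    assert 0.0 < mean < 1.0 and var < mean * (1 - mean), "infeasible mean/sd pair"
    common = mean * (1 - mean) / var - 1.0
    return mean * common, (1 - mean) * common      # alpha, beta

# e.g. "roughly 30% of residents, give or take 10 percentage points"
alpha, beta = beta_from_mean_sd(0.30, 0.10)
print(alpha, beta)   # about 6.0 and 14.0
```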
Human Activity Recognition in AAL Environments Using Random Projections
Damaševičius, Robertas; Vasiljevas, Mindaugas; Šalkevičius, Justas; Woźniak, Marcin
2016-01-01
Automatic human activity recognition systems aim to capture the state of the user and its environment by exploiting heterogeneous sensors attached to the subject's body and permit continuous monitoring of numerous physiological signals reflecting the state of human actions. Successful identification of human activities can be immensely useful in healthcare applications for Ambient Assisted Living (AAL), for automatic and intelligent activity monitoring systems developed for elderly and disabled people. In this paper, we propose the method for activity recognition and subject identification based on random projections from high-dimensional feature space to low-dimensional projection space, where the classes are separated using the Jaccard distance between probability density functions of projected data. Two HAR domain tasks are considered: activity identification and subject identification. The experimental results using the proposed method with Human Activity Dataset (HAD) data are presented. PMID:27413392
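The first stage of the pipeline described above, projecting high-dimensional sensor features into a low-dimensional space, can be sketched with scikit-learn's Gaussian random projections on synthetic features; the subsequent Jaccard-distance classification between probability density functions is not reproduced here.

```python
# Hedged sketch (synthetic features): random projection from a high-dimensional
# feature space to a 10-dimensional projection space.
import numpy as np
from sklearn.random_projection import GaussianRandomProjection

features = np.random.rand(500, 561)            # e.g. 561 inertial-sensor features per window
proj = GaussianRandomProjection(n_components=10, random_state=0)
low_dim = proj.fit_transform(features)
print(low_dim.shape)                           # (500, 10)
```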
Hafeez, Sidra; Bujanda, Zoila Lopez; Chatterton, Robert T.; Jacobs, Lisa K.; Khouri, Nagi F.; Ivancic, David; Kenney, Kara; Shehata, Christina; Jeter, Stacie C.; Wolfman, Judith A.; Zalles, Carola M.; Huang, Peng
2016-01-01
Methods to determine individualized breast cancer risk lack sufficient sensitivity to select women most likely to benefit from preventive strategies. Alterations in DNA methylation occur early in breast cancer. We hypothesized that cancer-specific methylation markers could enhance breast cancer risk assessment. We evaluated 380 women without a history of breast cancer. We determined their menopausal status or menstrual cycle phase, risk of developing breast cancer (Gail model), and breast density, and obtained random fine needle aspiration (rFNA) samples for assessment of cytopathology and cumulative methylation index (CMI). Eight methylated gene markers were identified through whole genome methylation analysis and included novel and previously established breast cancer detection genes. We performed correlative and multivariate linear regression analyses to evaluate DNA methylation of a gene panel as a function of clinical factors associated with breast cancer risk. CMI and individual gene methylation were independent of age, menopausal status or menstrual phase, lifetime Gail risk score, and breast density. CMI and individual gene methylation for the eight genes increased significantly (p<0.001) with increasing cytological atypia. The findings were verified with multivariate analyses correcting for age, log (Gail), log (percent density), rFNA cell number and BMI. Our results demonstrate a significant association between cytological atypia and high CMI, which does not vary with menstrual phase or menopause and is independent of Gail risk and mammographic density. Thus CMI is an excellent candidate breast cancer risk biomarker, warranting larger prospective studies to establish its utility for cancer risk assessment. PMID:27261491
Physics-based Control-oriented Modeling of the Current Profile Evolution in NSTX-Upgrade
NASA Astrophysics Data System (ADS)
Ilhan, Zeki; Barton, Justin; Shi, Wenyu; Schuster, Eugenio; Gates, David; Gerhardt, Stefan; Kolemen, Egemen; Menard, Jonathan
2013-10-01
The operational goals for the NSTX-Upgrade device include non-inductive sustainment of high-β plasmas, realization of high-performance equilibrium scenarios with neutral beam heating, and achievement of longer pulse durations. Active feedback control of the current profile is proposed to enable these goals. Motivated by the coupled, nonlinear, multivariable, distributed-parameter plasma dynamics, the first step towards feedback control design is the development of a physics-based, control-oriented model for the current profile evolution in response to non-inductive current drives and heating systems. For this purpose, the nonlinear magnetic-diffusion equation is coupled with empirical models for the electron density, electron temperature, and non-inductive current drives (neutral beams). The resulting first-principles-driven, control-oriented model is tailored for NSTX-U based on the PTRANSP predictions. Main objectives and possible challenges associated with the use of the developed model for control design are discussed. This work was supported by PPPL.
Yan, En-Rong; Yang, Xiao-Dong; Chang, Scott X; Wang, Xi-Hua
2013-01-01
Understanding how plant trait-species abundance relationships change with a range of single and multivariate environmental properties is crucial for explaining species abundance and rarity. In this study, the abundance of 94 woody plant species was examined and related to 15 plant leaf and wood traits at both local and landscape scales involving 31 plots in subtropical forests in eastern China. Further, plant trait-species abundance relationships were related to a range of single and multivariate (PCA axes) environmental properties such as air humidity, soil moisture content, soil temperature, soil pH, and soil organic matter, nitrogen (N) and phosphorus (P) contents. At the landscape scale, plant maximum height, and twig and stem wood densities were positively correlated, whereas mean leaf area (MLA), leaf N concentration (LN), and total leaf area per twig size (TLA) were negatively correlated with species abundance. At the plot scale, plant maximum height, leaf and twig dry matter contents, twig and stem wood densities were positively correlated, but MLA, specific leaf area, LN, leaf P concentration and TLA were negatively correlated with species abundance. Plant trait-species abundance relationships shifted over the range of seven single environmental properties and along multivariate environmental axes in a similar way. In conclusion, strong relationships between plant traits and species abundance existed among and within communities. Significant shifts in plant trait-species abundance relationships in a range of environmental properties suggest strong environmental filtering processes that influence species abundance and rarity in the studied subtropical forests.
Empirical prediction intervals improve energy forecasting
Kaack, Lynn H.; Apt, Jay; Morgan, M. Granger; McSharry, Patrick
2017-01-01
Hundreds of organizations and analysts use energy projections, such as those contained in the US Energy Information Administration (EIA)’s Annual Energy Outlook (AEO), for investment and policy decisions. Retrospective analyses of past AEO projections have shown that observed values can differ from the projection by several hundred percent, and thus a thorough treatment of uncertainty is essential. We evaluate the out-of-sample forecasting performance of several empirical density forecasting methods, using the continuous ranked probability score (CRPS). The analysis confirms that a Gaussian density, estimated on past forecasting errors, gives comparatively accurate uncertainty estimates over a variety of energy quantities in the AEO, in particular outperforming scenario projections provided in the AEO. We report probabilistic uncertainties for 18 core quantities of the AEO 2016 projections. Our work frames how to produce, evaluate, and rank probabilistic forecasts in this setting. We propose a log transformation of forecast errors for price projections and a modified nonparametric empirical density forecasting method. Our findings give guidance on how to evaluate and communicate uncertainty in future energy outlooks. PMID:28760997
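The continuous ranked probability score used above has a closed form for a Gaussian density forecast; the snippet below evaluates that standard formula for an invented projection and outcome, purely as an illustration of the scoring rule.

```python
# Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) for observation y
# (standard formula; the numbers below are invented):
#   CRPS = sigma * ( z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ),  z = (y - mu)/sigma
import numpy as np
from scipy.stats import norm

def crps_gaussian(y, mu, sigma):
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

print(crps_gaussian(101.0, 95.0, 8.0))   # score of a projection of 95 (s.d. 8) against an outcome of 101
```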
John M. Kabrick; David R. Larsen; Stephen R. Shifley
1997-01-01
We conducted a study to identify pre-treatment trends in woody species density, diameter, and basal area among MOFEP sites, blocks, and treatment areas; relate woody species differences among sites, blocks, and treatment areas to differences in environmental conditions; and identify potential treatment response differences based upon our findings. Sites 2 through 5 had...
Estimating topological properties of weighted networks from limited information
NASA Astrophysics Data System (ADS)
Gabrielli, Andrea; Cimini, Giulio; Garlaschelli, Diego; Squartini, Angelo
A typical problem met when studying complex systems is the limited information available on their topology, which hinders our understanding of their structural and dynamical properties. A paramount example is provided by financial networks, whose data are privacy protected. Yet, the estimation of systemic risk strongly depends on the detailed structure of the interbank network. The resulting challenge is that of using aggregate information to statistically reconstruct a network and correctly predict its higher-order properties. Standard approaches either generate unrealistically dense networks, or fail to reproduce the observed topology by assigning homogeneous link weights. Here we develop a reconstruction method, based on statistical mechanics concepts, that exploits the empirical link density in a highly non-trivial way. Technically, our approach consists in the preliminary estimation of node degrees from empirical node strengths and link density, followed by a maximum-entropy inference based on a combination of empirical strengths and estimated degrees. Our method is successfully tested on the international trade network and the interbank money market, and represents a valuable tool for gaining insights on privacy-protected or partially accessible systems. Acknowledgement to the "Growthcom" ICT-EC project (Grant No: 611272) and the "Crisislab" Italian project.
NASA Astrophysics Data System (ADS)
Palenčár, Rudolf; Sopkuliak, Peter; Palenčár, Jakub; Ďuriš, Stanislav; Suroviak, Emil; Halaj, Martin
2017-06-01
Evaluation of the uncertainties of temperature measurement by a standard platinum resistance thermometer calibrated at the defining fixed points according to ITS-90 is a problem that can be solved in different ways. The paper presents a procedure based on the propagation of distributions using the Monte Carlo method. The procedure employs the generation of pseudo-random numbers for the input variables of resistances at the defining fixed points, assuming a multivariate Gaussian distribution for the input quantities. This allows the correlations among resistances at the defining fixed points to be taken into account. The assumption of a Gaussian probability density function is acceptable, given the several sources of uncertainty in the resistances. In the case of uncorrelated resistances at the defining fixed points, the method is applicable to any probability density function. Validation of the law of propagation of uncertainty using the Monte Carlo method is presented on the example of specific data for a 25 Ω standard platinum resistance thermometer in the temperature range from 0 to 660 °C. Using this example, we demonstrate the suitability of the method by validating its results.
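A heavily simplified sketch of the propagation-of-distributions step described above: draw correlated fixed-point resistances from a multivariate Gaussian and propagate them through the resistance ratio W = R(T)/R(TPW). The numerical values are invented and the ITS-90 reference and deviation functions are omitted.

```python
# Hedged sketch (invented values; ITS-90 deviation functions omitted): Monte Carlo
# propagation of correlated fixed-point resistances through the ratio W = R(T)/R(TPW).
import numpy as np

mean = np.array([25.000, 27.500])           # ohm: R at the triple point of water, R at temperature T
cov = np.array([[2.5e-9, 1.0e-9],           # variances and covariance of the two resistances
                [1.0e-9, 3.0e-9]])

rng = np.random.default_rng(1)
samples = rng.multivariate_normal(mean, cov, size=200_000)

W = samples[:, 1] / samples[:, 0]           # resistance ratio for each Monte Carlo trial
print(W.mean(), W.std(ddof=1))              # estimate and standard uncertainty of W
```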
Bécares, Laia; Nazroo, James; Jackson, James
2014-12-01
We examined the association between Black ethnic density and depressive symptoms among African Americans. We sought to ascertain whether a threshold exists in the association between Black ethnic density and an important mental health outcome, and to identify differential effects of this association across social, economic, and demographic subpopulations. We analyzed the African American sample (n = 3570) from the National Survey of American Life, which we geocoded to the 2000 US Census. We determined the threshold with a multivariable regression spline model. We examined differential effects of ethnic density with random-effects multilevel linear regressions stratified by sociodemographic characteristics. The protective association between Black ethnic density and depressive symptoms changed direction, becoming a detrimental effect, when ethnic density reached 85%. Black ethnic density was protective for lower socioeconomic positions and detrimental for the better-off categories. The masking effects of area deprivation were stronger in the highest levels of Black ethnic density. Addressing racism, racial discrimination, economic deprivation, and poor services-the main drivers differentiating ethnic density from residential segregation-will help to ensure that the racial/ethnic composition of a neighborhood is not a risk factor for poor mental health.
NASA Astrophysics Data System (ADS)
Dehghani, H.; Ataee-Pour, M.
2012-12-01
The block economic value (EV) is one of the most important parameters in mine evaluation. It affects significant factors such as the mining sequence, the final pit limit and the net present value. The aim of open pit mine planning is to define optimum pit limits and an optimum life-of-mine production schedule that maximize the pit value under technical and operational constraints. It is therefore necessary to calculate the block economic value correctly at the first stage of the mine planning process. An unrealistic estimate of the block economic value may lead project managers to wrong decisions and thus impose irreparable losses on the project. In conventional methods of EV calculation, parameters such as metal price, operating cost and grade are assumed to be certain, although these parameters are obviously uncertain in nature, so the results of conventional methods are usually far from reality. To address this problem, a multivariate binomial tree developed in this research is used to calculate the EV and the project present value (PV) under economic uncertainty. In this paper, the EV and project PV were first determined using the Whittle formula based on certain economic parameters, and then using the multivariate binomial tree based on metal price and cost uncertainties; finally the results were compared. It is concluded that accounting for metal price and cost uncertainties makes the calculated block economic value and net present value more realistic than under the assumption of certainty.
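A minimal sketch of a price binomial tree feeding a block economic value calculation, with hypothetical price, volatility, cost, grade and recovery parameters (these numbers, and the simplified EV formula, are invented for illustration and are not the values or the exact Whittle formulation used in the paper; discounting is omitted):

```python
import numpy as np

# Hypothetical parameters for an illustrative one-variable (metal price) binomial tree
S0, sigma, dt, steps = 7000.0, 0.25, 1.0, 4      # initial price ($/t), volatility, step (yr), periods
u = np.exp(sigma * np.sqrt(dt))                  # up factor (Cox-Ross-Rubinstein style)
d = 1.0 / u                                      # down factor
p = 0.5                                          # assumed probability of an up move

grade, recovery, tonnage = 0.012, 0.85, 1.0e5    # ore grade, recovery, block tonnage (assumed)
mining_cost, processing_cost = 2.5, 12.0         # $/t of ore (assumed)

def block_ev(price):
    """Simplified block economic value: revenue minus mining and processing cost."""
    revenue = tonnage * grade * recovery * price
    cost = tonnage * (mining_cost + processing_cost)
    return revenue - cost

# Terminal prices of the tree, then walk the expected EV back to the root node
prices = S0 * u ** np.arange(steps, -1, -1) * d ** np.arange(0, steps + 1)
values = block_ev(prices)
for _ in range(steps):
    values = p * values[:-1] + (1 - p) * values[1:]   # expectation over up/down moves
print(f"Expected block EV under price uncertainty: {values[0]:,.0f} $")
```

Extending this sketch to a multivariate tree amounts to branching jointly over price and cost states at each step and taking the expectation over the joint moves.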
Project management techniques for highly integrated programs
NASA Technical Reports Server (NTRS)
Stewart, J. F.; Bauer, C. A.
1983-01-01
The management and control of a representative, highly integrated high-technology project, the X-29A aircraft flight test project, is addressed. The X-29A research aircraft required the development and integration of eight distinct technologies in one aircraft. The project management system developed for the X-29A flight test program focuses on the dynamic interactions and the intercommunication among components of the system. The insights gained from the new conceptual framework permitted subordination of departments to more functional units of decisionmaking, information processing, and communication networks. These processes were used to develop a project management system for the X-29A around the information flows that minimized the effects inherent in sampled-data systems and exploited the closed-loop multivariable nature of highly integrated projects.
Multivariate Cryptography Based on Clipped Hopfield Neural Network.
Wang, Jia; Cheng, Lee-Ming; Su, Tong
2018-02-01
Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in both conventional and quantum computational environments has been a challenging research topic in recent years. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them within the MVC framework (CHNN-MVC). The Diffie-Hellman key exchange algorithm is extended to the matrix field, which illustrates the feasibility of new applications in both classic and post-quantum cryptography. Simulations show the efficiency of the proposed public key cryptosystem CHNN-MVC, and its security rests on an NP-hard problem. The proposed algorithm strengthens multivariate public key cryptosystems and allows practical hardware realization.
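As a rough illustration of extending Diffie-Hellman to a matrix setting (an illustrative toy construction only, not the CHNN-MVC scheme itself; the base matrix, prime and exponents are invented), both parties can exponentiate a shared public matrix modulo a prime; powers of the same matrix commute, so both sides reach the same shared key:

```python
import numpy as np

P = 1_000_003                                     # a small prime modulus (illustrative only)

def mat_pow_mod(M, e, p):
    """Square-and-multiply exponentiation of an integer matrix modulo p."""
    R = np.eye(M.shape[0], dtype=np.int64)
    B = M % p
    while e:
        if e & 1:
            R = (R @ B) % p
        B = (B @ B) % p
        e >>= 1
    return R

G = np.array([[3, 5], [7, 11]], dtype=np.int64)   # public base matrix (toy example)
a, b = 123_457, 987_643                           # secret exponents held by Alice and Bob

A = mat_pow_mod(G, a, P)                          # Alice's public value G**a mod P
B = mat_pow_mod(G, b, P)                          # Bob's public value G**b mod P
K_alice = mat_pow_mod(B, a, P)                    # Alice computes (G**b)**a
K_bob = mat_pow_mod(A, b, P)                      # Bob computes (G**a)**b
assert np.array_equal(K_alice, K_bob)             # powers of the same matrix commute
print(K_alice)
```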
Chan, Emily Y Y; Kim, Jean H; Griffiths, Sian M; Lau, Joseph T F; Yu, Ignatius
2009-11-01
Injury is a major global disease burden for the twenty-first century. There are, however, few studies of unintentional household injury in Asian urban settings where living environments are characterized by extremely compact, high-living-density, multistory apartments. This study investigated the association of nonfatal unintentional household injuries with the resident's sociodemographic attributes and household characteristics in Hong Kong, the city with the world's highest population density. A cross-sectional retrospective recall study was conducted in May 2007 using a random telephone survey with a modified Chinese version of the World Health Organization Injury and Violence instrument. The study sample included 1,001 noninstitutionalized Cantonese-speaking Hong Kong residents of all ages, including foreign live-in domestic helpers. Multivariate regression was conducted to identify risk factors for nonfatal unintentional injuries in Hong Kong. Among a predominantly adult sample, household size and time spent at home were not associated with nonfatal unintentional household injuries in the general population in Hong Kong. The multivariate analyses indicated that female gender, ownership of a private home, lower square footage of living space per person, and the presence of slip prevention devices in the bathroom were significantly associated with household injuries. Injured and noninjured groups were found to have adopted different injury prevention strategies toward household injuries. The results identified potential target groups for household injury prevention programs.
Global asymptotic stability of density dependent integral population projection models.
Rebarber, Richard; Tenhumberg, Brigitte; Townley, Stuart
2012-02-01
Many stage-structured density dependent populations with a continuum of stages can be naturally modeled using nonlinear integral projection models. In this paper, we establish a trichotomy of global stability results for a class of density dependent systems which includes a Platte thistle model. Specifically, we identify those system parameters for which zero is globally asymptotically stable, parameters for which there is a positive asymptotically stable equilibrium, and parameters for which there is no asymptotically stable equilibrium. Copyright © 2011 Elsevier Inc. All rights reserved.
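A minimal sketch of iterating a density dependent integral projection model on a discretized stage grid (the kernel, its parameters and the form of density dependence below are invented for illustration and are not the Platte thistle model analysed in the paper):

```python
import numpy as np

# Discretize the continuous stage variable (e.g. plant size) on a mesh
n, lo, hi = 100, 0.0, 10.0
z = np.linspace(lo, hi, n)
dz = z[1] - z[0]

def kernel(z_next, z_now, total_density):
    """Illustrative survival/growth/fecundity kernel with density dependent recruitment."""
    growth = np.exp(-0.5 * ((z_next - (0.8 * z_now + 1.0)) / 0.8) ** 2) / np.sqrt(2 * np.pi * 0.8**2)
    survival = 0.8 / (1.0 + np.exp(-(z_now - 3.0)))
    recruits = 0.5 * z_now / (1.0 + 0.01 * total_density)      # recruitment declines with density
    offspring_size = np.exp(-0.5 * ((z_next - 1.0) / 0.5) ** 2) / np.sqrt(2 * np.pi * 0.5**2)
    return survival * growth + recruits * offspring_size

# Iterate n(z', t+1) = integral K(z', z; N_t) n(z, t) dz
density = np.full(n, 0.1)
for t in range(200):
    N = density.sum() * dz
    K = kernel(z[:, None], z[None, :], N)
    density = K @ density * dz

print(f"Total density after 200 steps: {density.sum() * dz:.3f}")  # settles to an equilibrium or to 0
```

Varying the kernel parameters in such a sketch is one way to see the three regimes (extinction, a positive stable equilibrium, no stable equilibrium) that the paper characterizes analytically.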
MULTIVARIATE RECEPTOR MODELING BY N-DIMENSIONAL EDGE DETECTION. (R826238)
PSEUDOLIKELIHOOD MODELING OF MULTIVARIATE OUTCOMES IN DEVELOPMENTAL TOXICOLOGY. (R824757)
Septic system density and infectious diarrhea in a defined population of children.
Borchardt, Mark A; Chyou, Po-Huang; DeVries, Edna O; Belongia, Edward A
2003-05-01
One-quarter of U.S. households use a septic system for wastewater disposal. In this study we investigated whether septic system density was associated with endemic diarrheal illness in children. Cases--children 1 to < 19 years old seeking medical care for acute diarrhea--and controls resided in the Marshfield Epidemiologic Study Area, a population-based cohort in central Wisconsin. Enrollment was from February 1997 through September 1998. Study participants completed a structured interview, and septic system density was determined from county sanitary permits. Household wells were sampled for bacterial pathogens and indicators of water sanitary quality. Risk factors were assessed for cases grouped by diarrhea etiology. In multivariate analyses, viral diarrhea was associated with the number of holding tank septic systems in the 640-acre section surrounding the case residence [adjusted odds ratio (AOR), 1.08; 95% confidence interval (CI), 1.02-1.15; p = 0.008], and bacterial diarrhea was associated with the number of holding tanks per 40-acre quarter-quarter section (AOR, 1.22; 95% CI, 1.02-1.46; p = 0.026). Diarrhea of unknown etiology was independently associated with drinking from a household well contaminated with fecal enterococci (AOR, 6.18; 95% CI, 1.22-31.46; p = 0.028). Septic system densities were associated with endemic diarrheal illness in central Wisconsin. The association should be investigated in other regions, and standards for septic systems should be evaluated to ensure that the public health is protected.
NASA Astrophysics Data System (ADS)
Vanfleteren, Diederik; Van Neck, Dimitri; Bultinck, Patrick; Ayers, Paul W.; Waroquier, Michel
2012-01-01
A previously introduced partitioning of the molecular one-electron density matrix over atoms and bonds [D. Vanfleteren et al., J. Chem. Phys. 133, 231103 (2010)] is investigated in detail. Orthogonal projection operators are used to define atomic subspaces, as in Natural Population Analysis. The orthogonal projection operators are constructed with a recursive scheme. These operators are chemically relevant and obey a stockholder principle, familiar from the Hirshfeld-I partitioning of the electron density. The stockholder principle is extended to density matrices, where the orthogonal projectors are considered to be atomic fractions of the summed contributions. All calculations are performed as matrix manipulations in one-electron Hilbert space. Mathematical proofs and numerical evidence concerning this recursive scheme are provided in the present paper. The advantages associated with the use of these stockholder projection operators are examined with respect to covalent bond orders, bond polarization, and transferability.
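A generic sketch of partitioning a one-electron density matrix with orthogonal atomic projectors (the projectors below are simple disjoint basis-function blocks chosen purely for illustration, not the recursive stockholder operators of the paper, and the bond index is only a Wiberg-like measure built from the resulting blocks):

```python
import numpy as np

nbf = 6                                          # number of basis functions (toy example)
rng = np.random.default_rng(0)
C = np.linalg.qr(rng.normal(size=(nbf, nbf)))[0][:, :3]   # three orthonormal occupied orbitals
D = 2.0 * C @ C.T                                # closed-shell one-electron density matrix

# Illustrative atomic subspaces: basis functions 0-2 on atom A, 3-5 on atom B
P = {}
for atom, idx in {"A": [0, 1, 2], "B": [3, 4, 5]}.items():
    P[atom] = np.zeros((nbf, nbf))
    P[atom][idx, idx] = 1.0                      # orthogonal projector onto the atomic block

N_A = np.trace(P["A"] @ D)                       # population assigned to atom A
N_B = np.trace(P["B"] @ D)
D_AB = P["A"] @ D @ P["B"]                       # bond block of the density matrix between A and B
bond_index = float(np.sum(D_AB ** 2))            # Wiberg-like measure built from the AB block
print(f"N_A = {N_A:.3f}, N_B = {N_B:.3f}, N_A + N_B = {N_A + N_B:.3f} (trace of D = 6 electrons)")
print(f"Illustrative AB bond index: {bond_index:.3f}")
```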
Cathodes for molten-salt batteries
NASA Technical Reports Server (NTRS)
Argade, Shyam D.
1993-01-01
Viewgraphs of the discussion on cathodes for molten-salt batteries are presented. For the cathode reactions in molten-salt cells, chlorine-based and sulfur-based cathode reactants have relatively high exchange current densities. Sulfur-based cathodes, metal sulfides, and disulfides have been extensively investigated. Primary thermal batteries of the Li-alloy/FeS2 variety have been available for a number of years. Chlorine-based rechargeable cathodes were investigated for the pulse power application. A brief introduction is followed by the experimental aspects of the research and the results obtained. Performance projections to the battery system level are discussed and the presentation is summarized with conclusions.
Luan, Xiaoli; Chen, Qiang; Liu, Fei
2014-09-01
This article presents a new scheme to design a full matrix controller for high dimensional multivariable processes based on an equivalent transfer function (ETF). Differing from existing ETF methods, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of the open-loop transfer function. Based on the obtained ETF, the full matrix controller is designed utilizing existing PI tuning rules. The newly proposed ETF model can more accurately represent the original processes. Furthermore, the full matrix centralized controller design method proposed in this paper is applicable to high dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF based controller is superior with respect to design complexity and obtained performance. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Evaluation of Non-Oxide Fuel for Fission-based Nuclear Reactors on Spacecraft
The goal of this project was to study the performance of atypical uranium-based fuels in a nuclear reactor capable of producing 1 megawatt of thermal ... uranium nitride (UN) or uranium carbide (UC), and compared their performance to uranium oxide (UO2), which is the fuel form used in the vast majority of commercial ... a smaller and potentially lighter core, which is a significant advantage. The results of this study indicate that use of both UC and UN may result in significant weight savings due to higher uranium loading density.
2007-07-21
... "the spin coherent states P-representation", Conference on Quantum Computations and Many-Body Systems, February 2006, Key West, FL. 9. B. N. Harmon ... solid-state spin-based qubit systems was the focus of our project. Since decoherence is a complex many-body non-equilibrium process, and its ... representation of the density matrix (see Sec. 3 below). This work prompted J. Taylor from the experimental group of C. Marcus and M. Lukin (funded by ...
Casanova, Ramon; Espeland, Mark A; Goveas, Joseph S; Davatzikos, Christos; Gaussoin, Sarah A; Maldjian, Joseph A; Brunner, Robert L; Kuller, Lewis H; Johnson, Karen C; Mysiw, W Jerry; Wagner, Benjamin; Resnick, Susan M
2011-05-01
Use of conjugated equine estrogens (CEE) has been linked to smaller regional brain volumes in women aged ≥65 years; however, it is unknown whether this results in a broad-based characteristic pattern of effects. Structural magnetic resonance imaging was used to assess regional volumes of normal tissue and ischemic lesions among 513 women who had been enrolled in a randomized clinical trial of CEE therapy for an average of 6.6 years, beginning at ages 65-80 years. A multivariate pattern analysis, based on a machine learning technique that combined Random Forest and logistic regression with L(1) penalty, was applied to identify patterns among regional volumes associated with therapy and whether patterns discriminate between treatment groups. The multivariate pattern analysis detected smaller regional volumes of normal tissue within the limbic and temporal lobes among women that had been assigned to CEE therapy. Mean decrements ranged as high as 7% in the left entorhinal cortex and 5% in the left perirhinal cortex, which exceeded the effect sizes reported previously in frontal lobe and hippocampus. Overall accuracy of classification based on these patterns, however, was projected to be only 54.5%. Prescription of CEE therapy for an average of 6.6 years is associated with lower regional brain volumes, but it does not induce a characteristic spatial pattern of changes in brain volumes of sufficient magnitude to discriminate users and nonusers. Copyright © 2011 Elsevier Inc. All rights reserved.
Patterson, Leslie; McGinley, Emily; Ertl, Kristyn; Morzinski, Jeffrey; Fyfe, Robert; Whittle, Jeff
2012-01-01
Research shows that community-based membership organizations are effective partners in health promotion activities; however, most community organizations do not participate in such partnerships. There is little research regarding the geographical and organizational characteristics associated with participation. We examined the factors associated with community-based veterans service organization (VSO) units' decision to participate in a health promotion project. We collected location and organizational characteristics regarding 218 VSO units asked to participate in POWER, a partnership to improve hypertension self-management skills between the Medical College of Wisconsin, the Milwaukee Veterans Affairs Medical Center (VAMC) and Wisconsin branches of the American Legion, Veterans of Foreign Wars (VFW), Vietnam Veterans of America, and National Association of Black Veterans. We tested the association of these characteristics with participation using chi-square and Fisher's exact tests for categorical variables, and analysis of variance and the Kruskal-Wallis test for continuous variables. We used multivariable logistic regression to identify factors independently associated with participation. In bivariable analyses, likelihood of participation was positively associated with increasing membership (p < .001), meeting attendance (p < .001), publication of an organizational newsletter (p < .001), presence of a women's auxiliary (p = .02), and location within 44 miles of the VAMC (p = .047). On multivariable analysis, only meeting attendance and census tract-level educational attainment predicted participation. Greater membership sizes, meeting attendance, and more group resources might be important factors for researchers to consider when initiating community-based health and wellness programs.
Casanova, Ramon; Espeland, Mark A.; Goveas, Joseph S.; Davatzikos, Christos; Gaussoin, Sarah A.; Maldjian, Joseph A.; Brunner, Robert L.; Kuller, Lewis H.; Johnson, Karen C.; Mysiw, W. Jerry; Wagner, Benjamin; Resnick, Susan M.
2011-01-01
Use of conjugated equine estrogens (CEE) has been linked to smaller regional brain volumes in women aged ≥65 years, however it is unknown whether this results in a broad-based characteristic pattern of effects. Structural MRI was used to assess regional volumes of normal tissue and ischemic lesions among 513 women who had been enrolled in a randomized clinical trial of CEE therapy for an average of 6.6 years, beginning at ages 65-80 years. A multivariate pattern analysis, based on a machine learning technique that combined Random Forest and logistic regression with L1 penalty, was applied to identify patterns among regional volumes associated with therapy and whether patterns discriminate between treatment groups. The multivariate pattern analysis detected smaller regional volumes of normal tissue within the limbic and temporal lobes among women that had been assigned to CEE therapy. Mean decrements ranged as high as 7% in the left entorhinal cortex and 5% in the left perirhinal cortex, which exceeded the effect sizes reported previously in frontal lobe and hippocampus. Overall accuracy of classification based on these patterns, however, was projected to be only 54.5%. Prescription of CEE therapy for an average of 6.6 years is associated with lower regional brain volumes, but it does not induce a characteristic spatial pattern of changes in brain volumes of sufficient magnitude to discriminate users and non-users. PMID:21292420
Ali, H Raza; Dariush, Aliakbar; Provenzano, Elena; Bardwell, Helen; Abraham, Jean E; Iddawela, Mahesh; Vallier, Anne-Laure; Hiller, Louise; Dunn, Janet A; Bowden, Sarah J; Hickish, Tamas; McAdam, Karen; Houston, Stephen; Irwin, Mike J; Pharoah, Paul D P; Brenton, James D; Walton, Nicholas A; Earl, Helena M; Caldas, Carlos
2016-02-16
There is a need to improve prediction of response to chemotherapy in breast cancer in order to improve clinical management and this may be achieved by harnessing computational metrics of tissue pathology. We investigated the association between quantitative image metrics derived from computational analysis of digital pathology slides and response to chemotherapy in women with breast cancer who received neoadjuvant chemotherapy. We digitised tissue sections of both diagnostic and surgical samples of breast tumours from 768 patients enrolled in the Neo-tAnGo randomized controlled trial. We subjected digital images to systematic analysis optimised for detection of single cells. Machine-learning methods were used to classify cells as cancer, stromal or lymphocyte and we computed estimates of absolute numbers, relative fractions and cell densities using these data. Pathological complete response (pCR), a histological indicator of chemotherapy response, was the primary endpoint. Fifteen image metrics were tested for their association with pCR using univariate and multivariate logistic regression. Median lymphocyte density proved most strongly associated with pCR on univariate analysis (OR 4.46, 95 % CI 2.34-8.50, p < 0.0001; observations = 614) and on multivariate analysis (OR 2.42, 95 % CI 1.08-5.40, p = 0.03; observations = 406) after adjustment for clinical factors. Further exploratory analyses revealed that in approximately one quarter of cases there was an increase in lymphocyte density in the tumour removed at surgery compared to diagnostic biopsies. A reduction in lymphocyte density at surgery was strongly associated with pCR (OR 0.28, 95 % CI 0.17-0.47, p < 0.0001; observations = 553). A data-driven analysis of computational pathology reveals lymphocyte density as an independent predictor of pCR. Paradoxically an increase in lymphocyte density, following exposure to chemotherapy, is associated with a lack of pCR. Computational pathology can provide objective, quantitative and reproducible tissue metrics and represents a viable means of outcome prediction in breast cancer. ClinicalTrials.gov NCT00070278 ; 03/10/2003.
Multivariate Lipschitz optimization: Survey and computational comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, P.; Gourdin, E.; Jaumard, B.
1994-12-31
Many methods have been proposed to minimize a multivariate Lipschitz function on a box. They pertain to three approaches: (i) reduction to the univariate case by projection (Pijavskii) or by using a space-filling curve (Strongin); (ii) construction and refinement of a single upper bounding function (Pijavskii, Mladineo, Mayne and Polak, Jaumard, Hermann and Ribault, Wood, ...); (iii) branch and bound with local upper bounding functions (Galperin, Pintér, Meewella and Mayne, the present authors). A survey is made, stressing similarities of algorithms and expressing them when possible within a unified framework. Moreover, an extensive computational comparison is reported.
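A minimal sketch of the univariate Piyavskii-Shubert idea that underlies approach (ii), assuming a known (or safely overestimated) Lipschitz constant L; the objective function and the constant below are illustrative, not taken from the survey:

```python
import numpy as np

def piyavskii(f, a, b, L, n_iter=50):
    """Minimize a univariate Lipschitz function on [a, b] using the saw-tooth lower bound."""
    xs = [a, b]
    fs = [f(a), f(b)]
    for _ in range(n_iter):
        order = np.argsort(xs)
        x_sorted = np.array(xs)[order]
        f_sorted = np.array(fs)[order]
        x_left, x_right = x_sorted[:-1], x_sorted[1:]
        f_left, f_right = f_sorted[:-1], f_sorted[1:]
        # Minimizer and value of the piecewise-linear lower bound on each interval
        x_new = 0.5 * (x_left + x_right) + (f_left - f_right) / (2.0 * L)
        bound = 0.5 * (f_left + f_right) - 0.5 * L * (x_right - x_left)
        k = int(np.argmin(bound))            # refine the interval with the lowest lower bound
        xs.append(float(x_new[k]))
        fs.append(f(x_new[k]))
    best = int(np.argmin(fs))
    return xs[best], fs[best]

# Illustrative test function; L = 14 is a deliberate overestimate of its Lipschitz constant on [0, 10]
f = lambda x: np.sin(3 * x) + 0.1 * (x - 4.0) ** 2
print(piyavskii(f, 0.0, 10.0, L=14.0))
```

The multivariate methods surveyed either apply this idea along a projection or space-filling curve, or replace the saw-tooth with cone-shaped bounding functions in several dimensions.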
Selecting minimum dataset soil variables using PLSR as a regressive multivariate method
NASA Astrophysics Data System (ADS)
Stellacci, Anna Maria; Armenise, Elena; Castellini, Mirko; Rossi, Roberta; Vitti, Carolina; Leogrande, Rita; De Benedetto, Daniela; Ferrara, Rossana M.; Vivaldi, Gaetano A.
2017-04-01
Long-term field experiments and science-based tools that characterize soil status (namely the soil quality indices, SQIs) assume a strategic role in assessing the effect of agronomic techniques and thus in improving soil management especially in marginal environments. Selecting key soil variables able to best represent soil status is a critical step for the calculation of SQIs. Current studies show the effectiveness of statistical methods for variable selection to extract relevant information deriving from multivariate datasets. Principal component analysis (PCA) has been mainly used, however supervised multivariate methods and regressive techniques are progressively being evaluated (Armenise et al., 2013; de Paul Obade et al., 2016; Pulido Moncada et al., 2014). The present study explores the effectiveness of partial least square regression (PLSR) in selecting critical soil variables, using a dataset comparing conventional tillage and sod-seeding on durum wheat. The results were compared to those obtained using PCA and stepwise discriminant analysis (SDA). The soil data derived from a long-term field experiment in Southern Italy. On samples collected in April 2015, the following set of variables was quantified: (i) chemical: total organic carbon and nitrogen (TOC and TN), alkali-extractable C (TEC and humic substances - HA-FA), water extractable N and organic C (WEN and WEOC), Olsen extractable P, exchangeable cations, pH and EC; (ii) physical: texture, dry bulk density (BD), macroporosity (Pmac), air capacity (AC), and relative field capacity (RFC); (iii) biological: carbon of the microbial biomass quantified with the fumigation-extraction method. PCA and SDA were previously applied to the multivariate dataset (Stellacci et al., 2016). PLSR was carried out on mean centered and variance scaled data of predictors (soil variables) and response (wheat yield) variables using the PLS procedure of SAS/STAT. In addition, variable importance for projection (VIP) statistics was used to quantitatively assess the predictors most relevant for response variable estimation and then for variable selection (Andersen and Bro, 2010). PCA and SDA returned TOC and RFC as influential variables both on the set of chemical and physical data analyzed separately as well as on the whole dataset (Stellacci et al., 2016). Highly weighted variables in PCA were also TEC, followed by K, and AC, followed by Pmac and BD, in the first PC (41.2% of total variance); Olsen P and HA-FA in the second PC (12.6%), Ca in the third (10.6%) component. Variables enabling maximum discrimination among treatments for SDA were WEOC, on the whole dataset, humic substances, followed by Olsen P, EC and clay, in the separate data analyses. The highest PLS-VIP statistics were recorded for Olsen P and Pmac, followed by TOC, TEC, pH and Mg for chemical variables and clay, RFC and AC for the physical variables. Results show that different methods may provide different ranking of the selected variables and the presence of a response variable, in regressive techniques, may affect variable selection. Further investigation with different response variables and with multi-year datasets would allow to better define advantages and limits of single or combined approaches. 
Acknowledgment: The work was supported by the projects "BIOTILLAGE, approcci innovative per il miglioramento delle performances ambientali e produttive dei sistemi cerealicoli no-tillage", financed by PSR-Basilicata 2007-2013, and "DESERT, Low-cost water desalination and sensor technology compact module", financed by ERANET-WATERWORKS 2014.
References:
Andersen C.M. and Bro R., 2010. Variable selection in regression - a tutorial. Journal of Chemometrics, 24:728-737.
Armenise E. et al., 2013. Developing a soil quality index to compare soil fitness for agricultural use under different managements in the Mediterranean environment. Soil and Tillage Research, 130:91-98.
de Paul Obade et al., 2016. A standardized soil quality index for diverse field conditions. Science of the Total Environment, 541:424-434.
Pulido Moncada et al., 2014. Data-driven analysis of soil quality indicators using limited data. Geoderma, 235:271-278.
Stellacci A.M. et al., 2016. Comparison of different multivariate methods to select key soil variables for soil quality indices computation. XLV Congress of the Italian Society of Agronomy (SIA), Sassari, 20-22 September 2016.
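A minimal sketch of computing PLS-VIP scores for variable selection, assuming scikit-learn is available; the synthetic matrix below stands in for the soil variables and wheat yield, and is not the dataset of the study:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_vars = 60, 12                       # synthetic stand-ins for soil variables
X = rng.normal(size=(n_samples, n_vars))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n_samples)  # yield proxy

pls = PLSRegression(n_components=3, scale=True)  # mean centering and variance scaling
pls.fit(X, y)

def vip_scores(pls_model, X):
    """Variable importance in projection (VIP) for a fitted PLS model."""
    T = pls_model.transform(X)                   # X scores
    W = pls_model.x_weights_                     # X weights
    Q = pls_model.y_loadings_                    # y loadings
    p, h = W.shape
    ssy = np.sum(T ** 2, axis=0) * (Q ** 2).ravel()[:h]  # y-variance explained per component
    wnorm = (W / np.linalg.norm(W, axis=0)) ** 2
    return np.sqrt(p * (wnorm @ ssy) / ssy.sum())

vip = vip_scores(pls, X)
print("Variables with VIP > 1 (candidate minimum dataset):", np.where(vip > 1.0)[0])
```

The common VIP > 1 rule of thumb then yields the candidate minimum dataset, which can be cross-checked against the variables retained by PCA or SDA.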
An ultra-compact processor module based on the R3000
NASA Astrophysics Data System (ADS)
Mullenhoff, D. J.; Kaschmitter, J. L.; Lyke, J. C.; Forman, G. A.
1992-08-01
Viable high density packaging is of critical importance for future military systems, particularly spaceborne systems which require minimum weight and size and high mechanical integrity. A leading, emerging technology for high density packaging is multi-chip modules (MCM). During the 1980s, a number of different MCM technologies emerged. In support of Strategic Defense Initiative Organization (SDIO) programs, Lawrence Livermore National Laboratory (LLNL) has developed, utilized, and evaluated several different MCM technologies. Prior LLNL efforts include modules developed in 1986, using hybrid wafer scale packaging, which are still operational in an Air Force satellite mission. More recent efforts have included very high density cache memory modules, developed using laser pantography. As part of the demonstration effort, LLNL and Phillips Laboratory began collaborating in 1990 on the Phase 3 Multi-Chip Module (MCM) technology demonstration project. The goal of this program was to demonstrate the feasibility of General Electric's (GE) High Density Interconnect (HDI) MCM technology. The design chosen for this demonstration was the processor core for a MIPS R3000 based reduced instruction set computer (RISC), which has been described previously. It consists of the R3000 microprocessor, R3010 floating point coprocessor and 128 Kbytes of cache memory.
Geurts, Brigitte P; Neerincx, Anne H; Bertrand, Samuel; Leemans, Manja A A P; Postma, Geert J; Wolfender, Jean-Luc; Cristescu, Simona M; Buydens, Lutgarde M C; Jansen, Jeroen J
2017-04-22
Revealing the biochemistry associated with micro-organismal interspecies interactions is highly relevant for many purposes. Each pathogen has a characteristic metabolic fingerprint that allows identification based on their unique multivariate biochemistry. When pathogen species come into mutual contact, their co-culture will display a chemistry that may be attributed both to mixing of the characteristic chemistries of the mono-cultures and to competition between the pathogens. Therefore, investigating pathogen development in a polymicrobial environment requires dedicated chemometric methods to untangle and focus upon these sources of variation. The multivariate data analysis method Projected Orthogonalised Chemical Encounter Monitoring (POCHEMON) is dedicated to highlighting metabolites characteristic for the interaction of two micro-organisms in co-culture. However, this approach is currently limited to a single time-point, while the development of polymicrobial interactions may be highly dynamic. A well-known multivariate implementation of Analysis of Variance (ANOVA) uses Principal Component Analysis (ANOVA-PCA). This allows the overall dynamics to be separated from the pathogen-specific chemistry, so the contributions of both aspects can be analysed separately. For this reason, we propose to integrate ANOVA-PCA with the POCHEMON approach to disentangle the pathogen dynamics and the specific biochemistry in interspecies interactions. Two complementary case studies show great potential for both liquid and gas chromatography-mass spectrometry to reveal novel information on chemistry specific to interspecies interaction during pathogen development. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
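A minimal sketch of the ANOVA-PCA step of splitting a data matrix into factor effect matrices before PCA (synthetic data; the factor here is a hypothetical sampling time point, and this is only the generic ANOVA-PCA idea, not the integrated POCHEMON workflow of the paper):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_per_group, n_vars = 10, 50
time_points = np.repeat([0, 1, 2], n_per_group)            # hypothetical sampling times
X = rng.normal(size=(30, n_vars)) + time_points[:, None] * rng.normal(size=n_vars)

# ANOVA step: split X into grand mean, time-effect, and residual matrices
grand_mean = X.mean(axis=0)
X_centered = X - grand_mean
time_effect = np.zeros_like(X)
for t in np.unique(time_points):
    mask = time_points == t
    time_effect[mask] = X_centered[mask].mean(axis=0)       # group mean profile per time point
residual = X_centered - time_effect

# PCA step: PCA on the effect matrix plus residuals isolates the time-related variation
pca = PCA(n_components=2)
scores = pca.fit_transform(time_effect + residual)
print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```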
NASA Astrophysics Data System (ADS)
Gomben, Peter; Lilieholm, Robert; Gonzalez-Guillen, Manuel
2012-02-01
During the post-World War II era, the Mojave Desert Region of San Bernardino County, California, has experienced rapid levels of population growth. Over the past several decades, growth has accelerated, accompanied by significant shifts in ethnic composition, most notably from predominantly White non-Hispanic to Hispanic. This study explores the impacts of changing ethnicity on future development and the loss of open space by modeling ethnic propensities regarding family size and settlement preferences reflected by U.S. Census Bureau data. Demographic trends and land conversion data were obtained for seven Mojave Desert communities for the period between 1990 and 2001. Using a spatially explicit, logistic regression-based urban growth model, these data and trends were used to project community-specific future growth patterns from 2000 to 2020 under three future settlement scenarios: (1) an "historic" scenario reported in earlier research that uses a Mojave-wide average settlement density of 3.76 persons/ha; (2) an "existing" scenario based on community-specific settlement densities as of 2001; and (3) a "demographic futures" scenario based on community-specific settlement densities that explicitly model the Region's changing ethnicity. Results found that under the demographic futures scenario, by 2020 roughly 53% of within-community open space would remain, under the existing scenario only 40% would remain, and under the historic scenario model the communities would have what amounts to a deficit of open space. Differences in the loss of open space across the scenarios demonstrate the importance of considering demographic trends that are reflective of the residential needs and preferences of projected future populations.
DARPA Advanced High Current Density Cathodes for Defense Applications: Development Phase
1993-03-01
Final report for contract N00014-90-C-2118, project 01-0624-07-0857, report SAIC-93/1018, Science Applications International Corporation, March 1, 1993. Figure caption fragment: "... of a typical Si-TaSi2 boule used for the eutectic advanced cathode materials in this project. The seed for the boule is at right in the photograph."
Watari, Yuya; Nishijima, Shota; Fukasawa, Marina; Yamada, Fumio; Abe, Shintaro; Miyashita, Tadashi
2013-11-01
For maintaining social and financial support for eradication programs of invasive species, quantitative assessment of the recovery of native species or ecosystems is important because it provides a measurable parameter of success. However, setting a concrete goal for recovery is often difficult owing to lack of information prior to the introduction of invaders. Here, we present a novel approach to evaluate the achievement level of invasive predator management based on the carrying capacity of endangered species estimated using long-term monitoring data. In Amami-Oshima Island, Japan, where an eradication project for the introduced small Indian mongoose has been ongoing since 2000, we surveyed the population densities of four endangered species threatened by the mongoose (the Amami rabbit, the Otton frog, the Amami tip-nosed frog, and Amami Ishikawa's frog) at four time points between 2003 and 2011. We estimated the carrying capacities of these species using the logistic growth model combined with the effects of mongoose predation and environmental heterogeneity. All species showed clear tendencies toward increasing density as mongoose density decreased, and they exhibited density-dependent population growth. The estimated carrying capacities of three endangered species had confidence intervals small enough to measure recovery levels under mongoose management. The population density of each endangered species has recovered to the level of the carrying capacity at about 20-40% of all sites, whereas no individuals were observed at more than 25% of all sites. We propose that the present approach, involving appropriate monitoring data of native organism populations, will be widely applicable to various eradication projects and provide unambiguous goals for the management of invasive species.
Assessment of lesser prairie-chicken lek density relative to landscape characteristics in Texas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timmer, Jennifer; Butler, Matthew; Ballard, Warren
My 2.5-yr Master's project accomplished the objectives of estimating lesser prairie-chicken (LPC) lek density and abundance in the Texas occupied range and modeling anthropogenic and landscape features associated with lek density by flying helicopter lek surveys for 2 field seasons and employing a line-transect distance sampling method. This project was important for several reasons. Firstly, wildlife managers and biologists have traditionally monitored LPC populations with road-based surveys that may result in biased estimates and do not provide access to privately-owned or remote property. From my aerial surveys and distance sampling, I was able to provide accurate density and abundance estimates, as well as new leks, and I detected LPCs outside the occupied range. Secondly, recent research has indicated that energy development has the potential to impact LPCs through avoidance of tall structures, increased mortality from raptors perching on transmission lines, disturbance to nesting hens, and habitat loss/fragmentation. Given the potential wind energy development in the Texas Panhandle, spatial models of current anthropogenic and vegetative features (such as transmission lines, roads, and percent native grassland) influencing lek density were needed. This information provided wildlife managers and wind energy developers in Texas with guidelines for how change in landscape features could impact LPCs. Lastly, LPC populations have faced range-wide declines over the last century and they are currently listed as a candidate species under the Endangered Species Act. I was able to provide timely information on LPC populations in Texas that will be used during the listing process.
The Prognostic Value of Tumor-Infiltrating Neutrophils in Gastric Adenocarcinoma after Resection
Wang, Wei; Chen, Ju-gao; Wu, Yan-heng; Lv, Lin; Li, Jian-jun; Chen, Yi-bing; Wang, Dan-dan; Pan, Qiu-zhong; Li, Xiao-dong; Xia, Jian-chuan
2012-01-01
Background Several pieces of evidence indicate that tumor-infiltrating neutrophils (TINs) are correlated to tumor progression. In the current study, we explore the relationship between TINs and clinicopathological features of gastric adenocarcinoma patients. Furthermore, we investigated the prognostic value of TINs. Patients and Methods The study was comprised of two groups, training group (115 patients) and test group (97 patients). Biomarkers (intratumoral CD15+ neutrophils) were assessed by immunohistochemistry. The relationship between clinicopathological features and patient outcome were evaluated using Cox regression and Kaplan-Meier analysis. Results Immunohistochemical detection showed that the tumor-infiltrating neutrophils (TINs) in the training group ranged from 0.00–115.70 cells/high-power microscopic field (HPF) and the median number was 21.60 cells/HPF. Based on the median number, the patients were divided into high and low TINs groups. Chi-square test analysis revealed that the density of CD15+ TINs was positively associated with lymph node metastasis (p = 0.024), distance metastasis (p = 0.004) and UICC (International Union Against Cancer) staging (p = 0.028). Kaplan-Meier analysis showed that patients with a lower density of TINs had a better prognosis than patients with a higher density of TINs (p = 0.002). Multivariate Cox's analysis showed that the density of CD15+ TINs was an independent prognostic factor for overall survival of gastric adenocarcinoma patients. Using another 97 patients as a test group and basing on the median number of TINs (21.60 cells/HPF) coming from the training group, Kaplan-Meier analysis also showed that patients with a lower density of TINs had a better prognosis than patients with a higher density of TINs (p = 0.032). The results verify that the number of CD15+ TINs can predict the survival of gastric adenocarcinoma surgical patients. Conclusions The presence of CD15+ TINs is an independent and unfavorable factor in the prognosis of gastric adenocarcinoma patients. Targeting CD15+ TINs may be a potential intervenient therapy in the future. PMID:22442706
FROM FINANCE TO COSMOLOGY: THE COPULA OF LARGE-SCALE STRUCTURE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherrer, Robert J.; Berlind, Andreas A.; Mao, Qingqing
2010-01-01
Any multivariate distribution can be uniquely decomposed into marginal (one-point) distributions, and a function called the copula, which contains all of the information on correlations between the distributions. The copula provides an important new methodology for analyzing the density field in large-scale structure. We derive the empirical two-point copula for the evolved dark matter density field. We find that this empirical copula is well approximated by a Gaussian copula. We consider the possibility that the full n-point copula is also Gaussian and describe some of the consequences of this hypothesis. Future directions for investigation are discussed.
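A minimal sketch of building an empirical two-point copula and comparing it with a Gaussian copula; the correlated lognormal draws below are a synthetic stand-in for the density measured at pairs of points separated by a fixed lag, not the evolved dark matter field of the paper:

```python
import numpy as np
from scipy.stats import norm, rankdata, pearsonr

rng = np.random.default_rng(2)
n = 20_000
# Synthetic stand-in for the density field at pairs of points with a fixed separation
g = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
delta = np.exp(g)                                   # lognormal "density" values

# Empirical copula: rank-transform each margin to (0, 1), discarding the one-point information
u = (rankdata(delta[:, 0]) - 0.5) / n
v = (rankdata(delta[:, 1]) - 0.5) / n

# If the copula is Gaussian, the normal scores of (u, v) should be bivariate normal
zu, zv = norm.ppf(u), norm.ppf(v)
rho, _ = pearsonr(zu, zv)
print(f"Gaussian-copula correlation parameter estimated from normal scores: {rho:.3f}")
```

Because the synthetic field is a monotone transform of a bivariate Gaussian, its copula is exactly Gaussian and the estimated parameter recovers the underlying correlation of 0.6; for a real density field the same normal-scores diagnostic quantifies how far the two-point copula departs from Gaussianity.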
Epidemiology of uveitis among the Chinese population in Taiwan: a population-based study.
Hwang, De-Kuang; Chou, Yiing-Jeng; Pu, Cheng-Yun; Chou, Pesus
2012-11-01
This study aimed to investigate the incidence and prevalence of uveitis in Taiwan, and then analyzed the risk factors related to uveitis using multivariate regression. Population-based cohort study using medical claims data. We randomly selected 1 000 000 residents from the Taiwan National Health Insurance Research Database. All participants with correct registry data (96%) were included in the study. The study period was from 2000 to 2008. All types of uveitis were identified using the International Classification of Diseases, 9th revision, Clinical Modification diagnostic codes. The annual incidence and cumulative prevalence of uveitis were calculated. A univariate and a multivariate Poisson regression were used to determine the risk factors associated with uveitis. The first diagnosis of uveitis noted during the study period. The annual cumulative incidence rate of uveitis ranged from 102.2 to 122.0 cases per 100 000 persons over the study period, and the average incidence density was 111.3 cases per 100 000 person-years (95% confidence interval, 108.4-114.1). The cumulative prevalence was found to have increased from 318.8 cases per 100 000 persons in 2003 to 622.7 cases per 100 000 persons in 2008. Anterior uveitis was the most common location and accounted for 77.7% of all incident cases, which was followed by panuveitis, posterior uveitis, and intermediate uveitis. Multivariate regression analysis showed that males, the elderly, and individuals who lived in an urban area had higher incidence rates for uveitis. The epidemiology of uveitis in Taiwan differs from most previous studies in other countries. The incidence of uveitis in Taiwan has increased significantly recently. The elderly and individuals living in urban areas are the populations that are most commonly affected by uveitis. These findings are consistent with suggestions found in several recent studies. Copyright © 2012 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Fast Genome-Wide QTL Association Mapping on Pedigree and Population Data.
Zhou, Hua; Blangero, John; Dyer, Thomas D; Chan, Kei-Hang K; Lange, Kenneth; Sobel, Eric M
2017-04-01
Since most analysis software for genome-wide association studies (GWAS) currently exploit only unrelated individuals, there is a need for efficient applications that can handle general pedigree data or mixtures of both population and pedigree data. Even datasets thought to consist of only unrelated individuals may include cryptic relationships that can lead to false positives if not discovered and controlled for. In addition, family designs possess compelling advantages. They are better equipped to detect rare variants, control for population stratification, and facilitate the study of parent-of-origin effects. Pedigrees selected for extreme trait values often segregate a single gene with strong effect. Finally, many pedigrees are available as an important legacy from the era of linkage analysis. Unfortunately, pedigree likelihoods are notoriously hard to compute. In this paper, we reexamine the computational bottlenecks and implement ultra-fast pedigree-based GWAS analysis. Kinship coefficients can either be based on explicitly provided pedigrees or automatically estimated from dense markers. Our strategy (a) works for random sample data, pedigree data, or a mix of both; (b) entails no loss of power; (c) allows for any number of covariate adjustments, including correction for population stratification; (d) allows for testing SNPs under additive, dominant, and recessive models; and (e) accommodates both univariate and multivariate quantitative traits. On a typical personal computer (six CPU cores at 2.67 GHz), analyzing a univariate HDL (high-density lipoprotein) trait from the San Antonio Family Heart Study (935,392 SNPs on 1,388 individuals in 124 pedigrees) takes less than 2 min and 1.5 GB of memory. Complete multivariate QTL analysis of the three time-points of the longitudinal HDL multivariate trait takes less than 5 min and 1.5 GB of memory. The algorithm is implemented as the Ped-GWAS Analysis (Option 29) in the Mendel statistical genetics package, which is freely available for Macintosh, Linux, and Windows platforms from http://genetics.ucla.edu/software/mendel. © 2016 WILEY PERIODICALS, INC.
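A minimal sketch of estimating a kinship (genetic relationship) matrix from dense markers, the quantity that lets the variance-component model handle a mix of pedigree and population samples; the genotypes are synthetic and this is not Mendel's Option 29 itself:

```python
import numpy as np

rng = np.random.default_rng(3)
n_ind, n_snp = 200, 5000
maf = rng.uniform(0.05, 0.5, size=n_snp)                       # minor allele frequencies
G = rng.binomial(2, maf, size=(n_ind, n_snp)).astype(float)    # 0/1/2 genotype matrix

# Standardize each SNP and form the genetic relationship matrix (GRM)
p_hat = G.mean(axis=0) / 2.0
Z = (G - 2.0 * p_hat) / np.sqrt(2.0 * p_hat * (1.0 - p_hat))
K = Z @ Z.T / n_snp                                            # empirical relationship matrix

# K then enters the mixed model y = X*beta + g + e with cov(g) = sigma_g^2 * K
print("Mean diagonal (near 1 for unrelated individuals):", K.diagonal().mean().round(3))
```

With K in hand, each SNP can be tested against additive, dominant, or recessive codings in the same mixed model, which is essentially what makes the pedigree and population cases interchangeable.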
Simprini, Lauren A; Villines, Todd C; Rich, Michael; Taylor, Allen J
2012-01-01
Non-high-density lipoprotein (HDL) cholesterol is recommended as a secondary lipid goal treated initially with lifestyle modification. However, the relationship between non-HDL and subclinical atherosclerosis is unknown. We examined the independent relationships between coronary artery calcium (CAC), lipids including non-HDL, exercise, and diet among healthy male participants of the Prospective Army Coronary Calcium (PACC) Project. Male participants from the PACC Project (n = 1637, mean age 42.8 years; no history of coronary heart disease) were studied. We used validated surveys to measure dietary quality and habitual physical exercise. Fasting lipid concentrations and other cardiovascular risk variables were measured. Subclinical atherosclerosis was detected with the use of electron beam computed tomography for CAC. Factors independently associated with the presence of any detectable CAC (CAC score > 0), including standard CV risk variables, non-HDL, exercise, and diet, were evaluated with the use of logistic regression. The mean Framingham risk score was 4.6 ± 2.6%; CAC was present in 22.4%. Fasting lipid concentrations showed mean LDL-C 128 ± 32 mg/dL, HDL-C 50 ± 13 mg/dL, TG-C 130 ± 86 mg/dL, and non-HDL-C 154 ± 37 mg/dL. Men with CAC had significantly greater levels of LDL-C (135 vs 127 mg/dL), TG (148 vs 124 mg/dL), and non-HDL-C (164 vs 151 mg/dL) and less habitual physical activity (P = 0.006). There were nonsignificant trends between prevalent CAC, greater amounts of dietary fat intake, and lower HDL-C. In successive multivariable logistic regression models for the dependent variable CAC, only non-HDL-C (odds ratio [OR] 1.012 per mg/dL; 95% CI 1.002-1.023; P = .019) and age (OR 1.119 per year; 95% CI 1.063-1.178; P < .001) were independently associated with the presence of CAC, and exercise (OR 0.808; 95% CI 0.703-0.928; P = 0.003) was associated with the absence of CAC. Non-HDL-C and exercise are independently predictive of the presence of subclinical CAC among healthy lower-risk middle-aged men. Copyright © 2012 National Lipid Association. All rights reserved.
MULTIVARIATE RECEPTOR MODELING FOR TEMPORALLY CORRELATED DATA BY USING MCMC. (R826238)
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
An error bound for a discrete reduced order model of a linear multivariable system
NASA Technical Reports Server (NTRS)
Al-Saggaf, Ubaid M.; Franklin, Gene F.
1987-01-01
The design of feasible controllers for high dimension multivariable systems can be greatly aided by a method of model reduction. In order for a design based on the reduced order model to include a guarantee of stability, it is sufficient to have a bound on the model error. Previous work has provided such a bound for continuous-time systems for algorithms based on balancing. In this note an L-infinity bound is derived for the model error of a method of order reduction of discrete linear multivariable systems based on balancing.
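A minimal sketch of balanced truncation for a stable discrete-time system together with the familiar twice-the-sum-of-neglected-Hankel-singular-values bound; this is the standard textbook bound for balancing-based reduction, not necessarily the exact bound derived in the note, and the system matrices are random illustrative data:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov, cholesky, svd

rng = np.random.default_rng(4)
n, m, p = 6, 2, 2
M = rng.normal(size=(n, n))
A = 0.8 * M / np.max(np.abs(np.linalg.eigvals(M)))     # stable discrete-time dynamics (illustrative)
B = rng.normal(size=(n, m))
C = rng.normal(size=(p, n))

# Gramians: Wc = A Wc A' + B B',  Wo = A' Wo A + C' C
Wc = solve_discrete_lyapunov(A, B @ B.T)
Wo = solve_discrete_lyapunov(A.T, C.T @ C)

# Balancing transformation from the Cholesky/SVD construction
Lc = cholesky(Wc, lower=True)
U, s, _ = svd(Lc.T @ Wo @ Lc)
hankel_sv = np.sqrt(s)                                  # Hankel singular values
T = Lc @ U / np.sqrt(hankel_sv)                         # both Gramians become diag(hankel_sv)
Tinv = np.linalg.inv(T)

r = 3                                                   # keep the r dominant Hankel singular values
Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
Ar, Br, Cr = Ab[:r, :r], Bb[:r], Cb[:, :r]
bound = 2.0 * hankel_sv[r:].sum()                       # balanced-truncation type L-infinity bound

# Quick frequency-grid check that the actual error stays below the bound
def freq_resp(Ax, Bx, Cx, z):
    return Cx @ np.linalg.solve(z * np.eye(Ax.shape[0]) - Ax, Bx)

worst = max(np.linalg.norm(freq_resp(A, B, C, np.exp(1j * w)) - freq_resp(Ar, Br, Cr, np.exp(1j * w)), 2)
            for w in np.linspace(0.0, np.pi, 200))
print(f"Hankel singular values: {np.round(hankel_sv, 4)}")
print(f"Observed max error on grid: {worst:.4f} <= guaranteed bound {bound:.4f}")
```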
Wang, Binwu; Li, Hong; Sun, Danfeng
2014-01-01
The regional management of trace elements in soils requires understanding the interaction between the natural system and human socio-economic activities. In this study, a social-ecological patterns of heavy metals (SEPHM) approach was proposed to identify heavy metal concentration patterns and processes in different ecoregions of Beijing (China) based on a self-organizing map (SOM). Potential ecological risk index (RI) values of Cr, Ni, Zn, Hg, Cu, As, Cd and Pb were calculated for 1,018 surface soil samples. These data were averaged over 253 communities and/or towns and compared with demographic, agricultural structure, geomorphology, climate, land use/cover, and soil-forming parent material data to discover the SEPHM. Multivariate statistical techniques were further applied to interpret the controlling factors of each SEPHM. The SOM clustered the 253 towns into nine groups on a 12 × 7 map (quantization error, 1.809; topographic error, 0.0079). The distribution characteristics and Spearman rank correlation coefficients of the RIs were strongly associated with population density, vegetation index, the percentage of industrial and mining land, and road density. The RIs were relatively high in towns located in highly urbanized areas with large population densities, while low RIs occurred in mountainous areas with high vegetation cover. The resulting dataset identifies the SEPHM of Beijing and links the RI results to driving factors, thus serving as an excellent data source to inform policy makers for legislative and land management actions. PMID:24690947
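A minimal sketch of clustering town-level indicator profiles with a SOM, assuming the third-party MiniSom package; the profiles are synthetic stand-ins for the RIs and socio-ecological drivers, with the map size chosen only to echo the 12 × 7 grid of the study:

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(5)
n_towns, n_indicators = 253, 10                      # e.g. RIs plus socio-ecological drivers
X = rng.normal(size=(n_towns, n_indicators))
X = (X - X.mean(axis=0)) / X.std(axis=0)             # standardize the indicators

som = MiniSom(12, 7, n_indicators, sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(X)
som.train_random(X, num_iteration=5000)

# Assign each town to its best-matching unit and report errors analogous to those in the study
bmus = np.array([som.winner(x) for x in X])
print("Quantization error:", round(som.quantization_error(X), 3))
print("Topographic error:", round(som.topographic_error(X), 4))
```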
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-08
... Climate and Land Use Scenarios, a project which is described in the 2009 EPA Report, ``Land-Use Scenarios: National-Scale Housing- Density Scenarios Consistent with Climate Change Storylines.'' These scenarios are... economic development, which are used by climate change modelers to develop projections of future climate...
Assembly-history dynamics of a pitcher-plant protozoan community in experimental microcosms.
Kadowaki, Kohmei; Inouye, Brian D; Miller, Thomas E
2012-01-01
History drives community assembly through differences both in density (density effects) and in the sequence in which species arrive (sequence effects). Density effects arise from predictable population dynamics, which are free of history, but sequence effects are due to a density-free mechanism, arising solely from the order and timing of immigration events. Few studies have determined how components of immigration history (timing, number of individuals, frequency) alter local dynamics to determine community assembly, beyond addressing when immigration history produces historically contingent assembly. We varied density and sequence effects independently in a two-way factorial design to follow community assembly in a three-species aquatic protozoan community. A superior competitor, Colpoda steinii, mediated alternative community states; early arrival or high introduction density allowed this species to outcompete or suppress the other competitors (Poterioochromonas malhamensis and Eimeriidae gen. sp.). Multivariate analysis showed that density effects caused greater variation in community states, whereas sequence effects altered the mean community composition. A significant interaction between density and sequence effects suggests that we should refine our understanding of priority effects. These results highlight a practical need to understand not only the "ingredients" (species) in ecological communities but their "recipes" as well.
NASA Astrophysics Data System (ADS)
Trieu, Phuong Dung (Yun); Mello-Thoms, Claudia; Peat, Jenny; Do, Thuan Doan; Brennan, Patrick C.
2017-03-01
This study aims to investigate patterns of breast density among women in Vietnam and their association with demographic, reproductive and lifestyle features. Mammographic densities of 1,651 women were collected from the two largest breast cancer screening and treatment centers in Ha Noi and Ho Chi Minh city. Putative factors associated with breast density were obtained from self-administered questionnaires covering demographic, reproductive and lifestyle elements, completed by women who attended mammography examinations. Results show that a large proportion of Vietnamese women (78.4%) had a high breast density. With multivariable logistic regression, significant associations with high breast density were evident for women younger than 55 years (OR=3.0), with a BMI less than 23 (OR=2.2), who were pre-menopausal (OR=2.9), who had fewer than three children (OR=1.7), and who were younger than 32 years at the birth of their last child (OR=1.8). Participants who consumed more than two vegetable servings per day also had an increased likelihood of higher density (OR=2.6). The findings suggest some unique features of mammographic density among Vietnamese women compared with Western women.
Alam, Khurshid; Oliveras, Elizabeth
2014-05-20
Volunteer community health workers (CHWs) are a key approach to improving community-based maternal and child health services in developing countries. BRAC, a large Bangladeshi non-governmental organization (NGO), has employed female volunteer CHWs in its community-based health programs since 1977, recently including its Manoshi project, a community-based maternal and child health intervention in the urban slums of Bangladesh. A case-control study conducted in response to high dropout rates in the first year of the project showed that financial incentives, social prestige, community approval and household responsibilities were related to early retention in the project. In our present prospective cohort study, we aimed to better understand the factors associated with retention of volunteer CHWs once the project was more mature. We used a prospective cohort study design to examine the factors affecting retention of volunteer CHWs who remained in the project after the initial start-up period. We surveyed a random sample of 542 CHWs who were working for BRAC Manoshi in December 2008. In December 2009, we revisited this cohort of CHWs and interviewed those who had dropped out about the main reasons for their dropping out. We used a multivariable generalized linear model regression analysis with a log link to estimate the relative risk (RR) of independent factors on retention. Of the 542 CHWs originally enrolled, 120 had dropped out by the end of one year, mainly because they left the slums. CHWs who received positive community appraisal (adjusted RR = 1.45, 95% confidence interval (CI) = 1.10 to 1.91) or were associated with other NGOs (adjusted RR = 1.13, 95% CI = 1.04 to 1.23) were more likely to have been retained in the project. Although refresher training was also associated with increased retention (adjusted RR = 2.25, 95% CI = 1.08 to 4.71) in this study, too few CHWs had not attended refresher training regularly to make it a meaningful predictor of retention that could be applied in the project setting. Factors that affect retention of CHWs may change over time, with some factors that are important in the early years of a project losing importance as the project matures. Community health programs operating in fragile urban slums should consider changing factors over program duration for better retention of volunteer CHWs.
Comprehensive drought characteristics analysis based on a nonlinear multivariate drought index
NASA Astrophysics Data System (ADS)
Yang, Jie; Chang, Jianxia; Wang, Yimin; Li, Yunyun; Hu, Hui; Chen, Yutong; Huang, Qiang; Yao, Jun
2018-02-01
It is vital to identify drought events and to evaluate multivariate drought characteristics based on a composite drought index for better drought risk assessment and sustainable development of water resources. However, most composite drought indices are constructed by linear combination, principal component analysis or the entropy weight method, assuming a linear relationship among the different drought indices. In this study, a multidimensional copula function was applied to construct a nonlinear multivariate drought index (NMDI), exploiting the flexibility of copulas in representing complicated, nonlinear dependence structures. The NMDI combines meteorological, hydrological, and agricultural variables (precipitation, runoff, and soil moisture) so that the multivariate variables are reflected simultaneously. Based on the constructed NMDI and runs theory, drought events for a particular area were identified in terms of three drought characteristics: duration, peak, and severity. Finally, multivariate drought risk was analyzed as a tool for providing reliable support in drought decision-making. The results indicate that: (1) multidimensional copulas can effectively capture the complicated, nonlinear relationships among multivariate variables; (2) compared with single and other composite drought indices, the NMDI is slightly more sensitive in capturing recorded drought events; and (3) drought risk shows spatial variation; out of the five partitions studied, the Jing River Basin as well as the upstream and midstream of the Wei River Basin are characterized by a higher multivariate drought risk. In general, multidimensional copulas provide a reliable way to handle nonlinear relationships when constructing a comprehensive drought index and evaluating multivariate drought characteristics.
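A simplified sketch of the runs-theory step described above: given any composite index series, consecutive time steps below a truncation level form a drought event whose duration, severity and peak can be read off directly. The threshold, data and function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def drought_events(index, threshold=-0.5):
    """Runs theory: consecutive steps with index < threshold form one event.
    Duration = run length, severity = accumulated deficit, peak = max deficit."""
    events, start = [], None
    for t, x in enumerate(index):
        if x < threshold and start is None:
            start = t
        elif x >= threshold and start is not None:
            run = index[start:t]
            events.append({"duration": t - start,
                           "severity": float(np.sum(threshold - run)),
                           "peak": float(threshold - run.min())})
            start = None
    if start is not None:  # series ends inside a drought
        run = index[start:]
        events.append({"duration": len(run),
                       "severity": float(np.sum(threshold - run)),
                       "peak": float(threshold - run.min())})
    return events

rng = np.random.default_rng(2)
print(drought_events(rng.normal(size=120)))  # synthetic monthly index
```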
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
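To make the idea concrete, here is a stripped-down sketch in the spirit of t-CWT (not the published MATLAB/Octave implementation): transform each single trial into the time-scale plane with a simple Morlet-like CWT, then compute Student's t between two conditions at every time-scale point; extrema of the resulting t map serve as candidate features.

```python
import numpy as np
from scipy.stats import ttest_ind

def morlet(scale, w=5.0):
    """Real Morlet-like wavelet sampled at unit steps (illustrative only)."""
    n = int(10 * scale)
    t = np.arange(-(n // 2), n // 2 + 1) / scale
    return np.cos(w * t) * np.exp(-t ** 2 / 2) / np.sqrt(scale)

def cwt(signal, scales):
    """Continuous wavelet transform by convolution, one row per scale."""
    return np.array([np.convolve(signal, morlet(s), mode="same") for s in scales])

def t_cwt_map(trials_a, trials_b, scales):
    """Pointwise Student's t between two conditions in the time-scale plane."""
    ca = np.array([cwt(tr, scales) for tr in trials_a])
    cb = np.array([cwt(tr, scales) for tr in trials_b])
    t_map, _ = ttest_ind(ca, cb, axis=0)
    return t_map  # its extrema are candidate ERP features

rng = np.random.default_rng(3)
a = rng.normal(size=(30, 256))          # 30 single trials, condition A
b = rng.normal(size=(30, 256)) + 0.1    # condition B with a small offset
print(t_cwt_map(a, b, scales=[2, 4, 8, 16]).shape)  # (4, 256)
```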
Use of uninformative priors to initialize state estimation for dynamical systems
NASA Astrophysics Data System (ADS)
Worthy, Johnny L.; Holzinger, Marcus J.
2017-10-01
The admissible region must be expressed probabilistically in order to be used in Bayesian estimation schemes. When treated as a probability density function (PDF), a uniform admissible region can be shown to have non-uniform probability density after a transformation. An alternative approach can be used to express the admissible region probabilistically according to the Principle of Transformation Groups. This paper uses a fundamental multivariate probability transformation theorem to show that regardless of which state space an admissible region is expressed in, the probability density must remain the same under the Principle of Transformation Groups. The admissible region can be shown to be analogous to an uninformative prior with a probability density that remains constant under reparameterization. This paper introduces requirements on how these uninformative priors may be transformed and used for state estimation and the difference in results when initializing an estimation scheme via a traditional transformation versus the alternative approach.
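The transformation theorem in question is the standard multivariate change-of-variables result; written out (notation mine, not quoted from the paper), it makes clear why a density that is uniform in one state space is generally non-uniform after an invertible reparameterization, unless the prior is constructed to be invariant under the transformation group.

```latex
% Change of variables for a multivariate PDF under an invertible map y = g(x):
p_Y(\mathbf{y}) = p_X\!\left(g^{-1}(\mathbf{y})\right)
                  \left| \det \frac{\partial g^{-1}(\mathbf{y})}{\partial \mathbf{y}} \right|
```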
Hogerwerf, Lenny; Holstege, Manon M C; Benincà, Elisa; Dijkstra, Frederika; van der Hoek, Wim
2017-07-26
Human psittacosis is a highly underdiagnosed zoonotic disease, commonly linked to psittacine birds. Psittacosis in birds, also known as avian chlamydiosis, is endemic in poultry, but the risk for people living close to poultry farms is unknown. Therefore, our study aimed to explore the temporal and spatial patterns of human psittacosis infections and identify possible associations with poultry farming in the Netherlands. We analysed data on 700 human cases of psittacosis notified between 01-01-2000 and 01-09-2015. First, we studied the temporal behaviour of psittacosis notifications by applying wavelet analysis. Then, to identify possible spatial patterns, we applied spatial cluster analysis. Finally, we investigated the possible spatial association between psittacosis notifications and data on the Dutch poultry sector at municipality level using a multivariable model. We found a large spatial cluster that covered a highly poultry-dense area, but additional clusters were found in areas that had a low poultry density. There were marked geographical differences in the awareness of psittacosis and in the amount and type of laboratory diagnostics used for psittacosis, making it difficult to draw conclusions about the correlation between the large cluster and poultry density. The multivariable model showed that the presence of chicken processing plants and slaughter duck farms in a municipality was associated with a higher rate of human psittacosis notifications. The significance of the associations was influenced by the inclusion or exclusion of farm density in the model. Our temporal and spatial analyses showed weak associations between poultry-related variables and psittacosis notifications. Because of the low number of psittacosis notifications available for analysis, the power of our analysis was relatively low. Because of the exploratory nature of this research, the associations found cannot be interpreted as evidence for airborne transmission of psittacosis from poultry to the general population. Further research is needed to determine the prevalence of C. psittaci in Dutch poultry. Also, efforts to promote PCR-based testing for C. psittaci and genotyping for source tracing are important to reduce the diagnostic deficit, and to provide better estimates of the human psittacosis burden and the possible role of poultry.
Multivariate Epi-splines and Evolving Function Identification Problems
2015-04-15
such extrinsic information as well as observed function and subgradient values often evolve in applications, we establish conditions under which the... previous study [30] dealt with compact intervals of IR. Splines are intimately tied to optimization problems through their variational theory pioneered... approximation. Motivated by applications in curve fitting, regression, probability density estimation, variogram computation, financial curve construction
Shim, Heejung; Chasman, Daniel I.; Smith, Joshua D.; Mora, Samia; Ridker, Paul M.; Nickerson, Deborah A.; Krauss, Ronald M.; Stephens, Matthew
2015-01-01
We conducted a genome-wide association analysis of 7 subfractions of low density lipoproteins (LDLs) and 3 subfractions of intermediate density lipoproteins (IDLs) measured by gradient gel electrophoresis, and their response to statin treatment, in 1868 individuals of European ancestry from the Pharmacogenomics and Risk of Cardiovascular Disease study. Our analyses identified four previously-implicated loci (SORT1, APOE, LPA, and CETP) as containing variants that are very strongly associated with lipoprotein subfractions (log10 Bayes factor > 15). Subsequent conditional analyses suggest that three of these (APOE, LPA and CETP) likely harbor multiple independently associated SNPs. Further, while different variants typically showed different characteristic patterns of association with combinations of subfractions, the two SNPs in CETP show strikingly similar patterns - both in our original data and in a replication cohort - consistent with a common underlying molecular mechanism. Notably, the CETP variants are very strongly associated with LDL subfractions, despite showing no association with total LDLs in our study, illustrating the potential value of the more detailed phenotypic measurements. In contrast with these strong subfraction associations, genetic association analysis of subfraction response to statins showed much weaker signals (none exceeding a log10 Bayes factor of 6). However, two SNPs (in APOE and LPA) previously reported to be associated with LDL statin response do show some modest evidence for association in our data, and the subfraction response profiles at the LPA SNP are consistent with the LPA association, with response likely being due primarily to resistance of Lp(a) particles to statin therapy. An additional important feature of our analysis is that, unlike most previous analyses of multiple related phenotypes, we analyzed the subfractions jointly, rather than one at a time. Comparisons of our multivariate analyses with standard univariate analyses demonstrate that multivariate analyses can substantially increase power to detect associations. Software implementing our multivariate analysis methods is available at http://stephenslab.uchicago.edu/software.html. PMID:25898129
OGLE II Eclipsing Binaries In The LMC: Analysis With Class
NASA Astrophysics Data System (ADS)
Devinney, Edward J.; Prsa, A.; Guinan, E. F.; DeGeorge, M.
2011-01-01
The Eclipsing Binaries (EBs) via Artificial Intelligence (EBAI) Project is applying machine learning techniques to elucidate the nature of EBs. Previously, Prsa et al. applied artificial neural networks (ANNs) trained on physically-realistic Wilson-Devinney models to solve the light curves of the 1882 detached EBs in the LMC discovered by the OGLE II Project (Wyrzykowski et al.) fully automatically, bypassing the need for manually-derived starting solutions. A curious result is the non-monotonic distribution of the temperature ratio parameter T2/T1, featuring a subsidiary peak noted previously by Mazeh et al. in an independent analysis using the EBOP EB solution code (Tamuz et al.). To explore this and to gain a fuller understanding of the multivariate EBAI LMC observational plus solutions data, we have employed automatic clustering and advanced visualization (CAV) techniques. Clustering the OGLE II data aggregates objects that are similar with respect to many parameter dimensions. Measures of similarity, for example, could include the multidimensional Euclidean distance between data objects, although other measures may be appropriate. Applying clustering, we find good evidence that the T2/T1 subsidiary peak is due to evolved binaries, in support of Mazeh et al.'s speculation. Further, clustering suggests that the LMC detached EBs occupying the main sequence region belong to two distinct classes. Also identified as a separate cluster in the multivariate data are stars having a Period-I band relation. Derekas et al. had previously found a Period-K band relation for LMC EBs discovered by the MACHO Project (Alcock et al.). We suggest such CAV techniques will prove increasingly useful for understanding the large, multivariate datasets now being produced in astronomy. We are grateful for the support of this research from NSF/RUI Grant AST-05-75042 f.
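As a generic illustration of clustering on multidimensional Euclidean distance (synthetic parameters and scikit-learn assumed; this is not the EBAI/CAV pipeline itself), one might standardize the solution parameters and group the binaries as follows.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Hypothetical per-binary parameters: T2/T1, sum of fractional radii,
# eccentricity, and orbital period (days) for 1882 detached EBs.
X = np.column_stack([
    rng.normal(0.9, 0.1, 1882),
    rng.uniform(0.05, 0.4, 1882),
    rng.uniform(0.0, 0.3, 1882),
    rng.lognormal(1.0, 0.8, 1882),
])

# Standardize so Euclidean distance is not dominated by any one parameter,
# then group objects that are similar across all dimensions simultaneously.
Xs = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xs)
print(np.bincount(labels))  # cluster sizes
```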
A compressed sensing based approach on Discrete Algebraic Reconstruction Technique.
Demircan-Tureyen, Ezgi; Kamasak, Mustafa E
2015-01-01
Discrete tomography (DT) techniques are capable of computing better results than continuous tomography techniques, even when using fewer projections. The Discrete Algebraic Reconstruction Technique (DART) is an iterative reconstruction method proposed to achieve this goal by exploiting prior knowledge of the gray levels and assuming that the scanned object is composed of only a few different densities. In this paper, the DART method is combined with an initial total variation minimization (TvMin) phase to ensure a better initial guess, and extended with a segmentation procedure in which the threshold values are estimated from a finite set of candidates so as to minimize both the projection error and the total variation (TV) simultaneously. The accuracy and robustness of the algorithm are compared with the original DART in simulation experiments performed under (1) a limited number of projections, (2) limited-view, and (3) noisy-projection conditions.
The Human Engineering Eye Movement Measurement Research Facility.
1985-04-01
tracked reliably. When tracking is disrupted (e.g., by gross and sudden head movements, gross change in the head position, sneezing, prolonged eye... these are density and "busyness" of the slides (stimulus material), as well as consistency between successive... change the material being projected based on the subject's previous performance. The minicomputer relays the calibrated data to one of the magnetic
Growth model for uneven-aged loblolly pine stands : simulations and management implications
C.-R. Lin; J. Buongiorno; Jeffrey P. Prestemon; K. E. Skog
1998-01-01
A density-dependent matrix growth model of uneven-aged loblolly pine stands was developed with data from 991 permanent plots in the southern United States. The model predicts the number of pine, soft hardwood, and hard hardwood trees in 13 diameter classes, based on equations for ingrowth, upgrowth, and mortality. Projections of 6 to 10 years agreed with the growth...
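As a schematic sketch of how such a density-dependent matrix growth model projects stand tables forward (the coefficients here are invented, not the values fitted to the 991 plots), each time step moves a fraction of trees up one diameter class, removes mortality, and adds density-dependent ingrowth to the smallest class.

```python
import numpy as np

def project(n, years, step=5):
    """Project trees per diameter class with a toy matrix growth model."""
    n = n.astype(float).copy()
    for _ in range(years // step):
        upgrow = 0.15 * n                             # move up one class
        mortality = 0.05 * n                          # die
        ingrowth = max(0.0, 40.0 - 0.02 * n.sum())    # density-dependent recruitment
        n = n - upgrow - mortality
        n[1:] += upgrow[:-1]
        n[0] += ingrowth
    return n

n0 = np.linspace(120, 5, 13)  # initial trees/ha in 13 diameter classes
print(np.round(project(n0, years=10), 1))
```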
Forests on the edge: housing development on America’s private forests.
Ronald E. McRoberts; Ralph J. Alig; Mark D. Nelson; David M. Theobald; Mike Eley; Mike Dechter; Mary. Carr
2005-01-01
The private working land base of America's forests is being converted to developed uses, with implications for the condition and management of affected private forests and the watersheds in which they occur. The Forests on the Edge project seeks to improve understanding of the processes and thresholds associated with increases in housing density in private forests and...
Re-Shuffling of Species with Climate Disruption: A No-Analog Future for California Birds?
Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.; Alexander, John D.; Wiens, John A.; Root, Terry L.
2009-01-01
By facilitating independent shifts in species' distributions, climate disruption may result in the rapid development of novel species assemblages that challenge the capacity of species to co-exist and adapt. We used a multivariate approach borrowed from paleoecology to quantify the potential change in California terrestrial breeding bird communities based on current and future species-distribution models for 60 focal species. Projections of future no-analog communities based on two climate models and two species-distribution-model algorithms indicate that by 2070 over half of California could be occupied by novel assemblages of bird species, implying the potential for dramatic community reshuffling and altered patterns of species interactions. The expected percentage of no-analog bird communities was dependent on the community scale examined, but consistent geographic patterns indicated several locations that are particularly likely to host novel bird communities in the future. These no-analog areas did not always coincide with areas of greatest projected species turnover. Efforts to conserve and manage biodiversity could be substantially improved by considering not just future changes in the distribution of individual species, but including the potential for unprecedented changes in community composition and unanticipated consequences of novel species assemblages. PMID:19724641
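One common way to operationalize "no-analog" detection (a generic sketch, not the authors' code; the species probabilities and the dissimilarity calibration below are invented) is to compare each projected future assemblage with its nearest present-day analog and flag cells whose nearest-analog dissimilarity exceeds the spread observed among present-day communities.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(5)
current = rng.random((400, 60))  # present-day cells x 60 species probabilities
future = rng.random((400, 60))   # projected 2070 assemblages for the same cells

# Calibrate a baseline from nearest-neighbour dissimilarities among
# present-day communities (95th percentile as a crude cut-off).
d_cur = cdist(current, current, metric="cityblock")
np.fill_diagonal(d_cur, np.inf)
threshold = np.percentile(d_cur.min(axis=1), 95)

# A future cell with no sufficiently similar present-day community
# is a candidate no-analog assemblage.
d_min = cdist(future, current, metric="cityblock").min(axis=1)
print((d_min > threshold).mean())  # fraction of cells with no modern analog
```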
Ditah, Chobufo; Otvos, James; Nassar, Hisham; Shaham, Dorith; Sinnreich, Ronit; Kark, Jeremy D
2016-08-01
Failure of trials to observe benefits from elevating plasma high-density lipoprotein cholesterol (HDL-C) has raised serious doubts about HDL-C's atheroprotective properties. We aimed to identify protective HDL biomarkers by examining the association of nuclear magnetic resonance (NMR) measures of total HDL-particle (HDL-P), large HDL-particle, and small and medium-sized HDL-particle (MS-HDL-P) concentrations and average HDL-particle size with coronary artery calcification (CAC), which reflects the burden of coronary atherosclerosis, and to compare it with that of HDL-C. Using a cross-sectional design, 504 Jerusalem residents (274 Arabs and 230 Jews), recruited by population-based probability sampling, had HDL measured by NMR spectroscopy. CAC was determined by multidetector helical CT-scanning using Agatston scoring. Independent associations between the NMR measures and CAC (comparing scores ≥100 vs. <100) were assessed with multivariable binary logistic models. Comparing tertile 3 vs. tertile 1, we observed protective associations of HDL-P (multivariable-adjusted OR 0.42, 95% CI 0.22-0.79, p for linear trend = 0.002) and MS-HDL-P (OR 0.36, 95% CI 0.19-0.69, p for linear trend = 0.006) with CAC, which persisted after further adjustment for HDL-C. HDL-C was not significantly associated with CAC (multivariable-adjusted OR 0.59, 95% CI 0.27-1.29 for tertiles 3 vs. 1, p for linear trend = 0.49). Large HDL-P and average particle size (which are highly correlated; r = 0.83) were not associated with CAC: large HDL-P (OR 0.77, 95% CI 0.33-1.83, p for linear trend = 0.29) and average HDL-P size (OR 0.72, 95% CI 0.35-1.48, p for linear trend = 0.58). MS-HDL-P represents a protective subpopulation of HDL particles. HDL-P and MS-HDL-P were more strongly associated with CAC than HDL-C. Based on the accumulating evidence, incorporation of MS-HDL-P or HDL-P into the routine prediction of CHD risk should be evaluated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Anero, Jesús G; Español, Pep; Tarazona, Pedro
2013-07-21
We present a generalization of Density Functional Theory (DFT) to non-equilibrium non-isothermal situations. By using the original approach set forth by Gibbs in his consideration of Macroscopic Thermodynamics (MT), we consider a Functional Thermo-Dynamics (FTD) description based on the density field and the energy density field. A crucial ingredient of the theory is an entropy functional, which is a concave functional. Therefore, there is a one-to-one connection between the density and energy fields and the conjugate thermodynamic fields. The connection between the three levels of description (MT, DFT, FTD) is clarified through a bridge theorem that relates the entropy of different levels of description and that constitutes a generalization of Mermin's theorem to arbitrary levels of description whose relevant variables are connected linearly. Although the FTD level of description does not provide any new information about averages and correlations at equilibrium, it is a crucial ingredient for the dynamics in non-equilibrium states. Using the projection operator technique, we obtain the set of dynamic equations that describe the evolution of the density and energy density fields from an initial non-equilibrium state towards equilibrium. These equations generalize time dependent density functional theory to non-isothermal situations. We also present an explicit model for the entropy functional for hard spheres.
Probing consciousness in a sensory-disconnected paralyzed patient.
Rohaut, Benjamin; Raimondo, Federico; Galanaud, Damien; Valente, Mélanie; Sitt, Jacobo Diego; Naccache, Lionel
2017-01-01
Diagnosis of consciousness can be very challenging in some clinical situations such as severe sensory-motor impairments. We report the case study of a patient who presented a total "locked-in syndrome" associated with multi-sensory deafferentation (visual, auditory and tactile modalities) following a protuberantial infarction. In spite of this severe and extreme disconnection from the external world, we could detect reliable evidence of consciousness using a multivariate analysis of his high-density resting state electroencephalogram. This EEG-based diagnosis was eventually confirmed by the clinical evolution of the patient. This approach illustrates the potential importance of functional brain-imaging data to improve diagnosis of consciousness and of cognitive abilities in critical situations in which the behavioral channel is compromised, such as deafferented locked-in syndrome.
Urban Impact at the Urban-Agricultural Interface in Madison, WI: an Ecosystem Modeling Approach
NASA Astrophysics Data System (ADS)
Logan, K. E.; Kucharik, C. J.; Schneider, A.
2009-12-01
Global population and the proportion of people living in urban areas both continue to grow while average urban density is decreasing worldwide. Because urban areas are often located in the most agriculturally productive lands, expansion of the built environment can cause sharp reductions in land available for cultivation. Conversion of land to urban use also significantly alters climate variables. Urban materials differ from natural land covers in terms of albedo, thermal properties, and permeability, altering energy and water cycles. Anthropogenic heat emissions also alter the energy balance in and around a city. Preliminary analysis of urban impacts around Madison, WI, a small city located in a thriving agricultural region, was performed using the National Land Cover Database (NLCD), MODIS albedo products, ground-based observations, and a simulation of urban expansion, within a geographic information system (GIS). Population of the county is expected to increase by 58% while urban density is projected to decrease by 49% between 1992 and 2030, reflecting projected worldwide patterns. Carbon stored in the top 25cm of soil was found to be over 2.5 times greater in remnant prairies than in croplands and was calculated to be even less in urban areas; projected urban development may thus lead to large losses in carbon storage. Albedo measurements also show a significant decrease with urban development. Projected urban expansion between 2001 and 2030 is expected to convert enough agricultural lands to urban areas to result in a loss of 247,000 tons of crop yield in Dane County alone, based on current yields. For a more complete analysis of these impacts, urban parameters are incorporated into a terrestrial ecosystem model known as Agro-IBIS. This approach allows for detailed comparison of energy balance and biogeochemical cycles between local crop systems, lawns, and impervious city surfaces. Changes in these important cycles, in soil carbon storage, and in crop productivity/yield for 1992 - 2001 and projected 2030 development around Madison, WI will be shown.
Progress in extremely high brightness LED-based light sources
NASA Astrophysics Data System (ADS)
Hoelen, Christoph; Antonis, Piet; de Boer, Dick; Koole, Rolf; Kadijk, Simon; Li, Yun; Vanbroekhoven, Vincent; Van De Voorde, Patrick
2017-09-01
Although the maximum brightness of LEDs has been increasing continuously during the past decade, their luminance is still far from what is required for multiple applications that still rely on the high brightness of discharge lamps. In particular for high brightness applications with limited étendue, e.g. front projection, only very modest luminance values in the beam can be achieved with LEDs compared to systems based on discharge lamps or lasers. With dedicated architectures, phosphor-converted green LEDs for projection may achieve luminance values up to 200-300 Mnit. In this paper we report on the progress made in the development of light engines based on an elongated luminescent concentrator pumped by blue LEDs. This concept has recently been introduced to the market as ColorSpark High Lumen Density LED technology. These sources outperform the maximum brightness of LEDs by multiple factors. In LED front projection, green LEDs are the main limiting factor. With our green modules, we now have achieved peak luminance values of 2 Gnit, enabling LED-based projection systems with over 4000 ANSI lm. Extension of this concept to yellow and red light sources is presented. The light source efficiency has been increased considerably, reaching 45-60 lm/W for green under practical application conditions. The module architecture, beam shaping, and performance characteristics are reviewed, as well as system aspects. The performance increase, spectral range extensions, beam-shaping flexibility, and cost reductions realized with the new module architecture enable a breakthrough in LED-based projection systems and in a wide variety of other high brightness applications.
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi; Clark, S.; Fitz-Coy, N.; Huynh, T.; Opiela, J.; Polk, M.; Roebuck, B.; Rushing, R.; Sorge, M.; Werremeyer, M.
2013-01-01
The goal of the DebriSat project is to characterize fragments generated by a hypervelocity collision involving a modern satellite in low Earth orbit (LEO). The DebriSat project will update and expand upon the information obtained in the 1992 Satellite Orbital Debris Characterization Impact Test (SOCIT), which characterized the breakup of a 1960s US Navy Transit satellite. There are three phases to this project: the design and fabrication of DebriSat - an engineering model representing a modern, 60-cm/50-kg class LEO satellite; conduction of a laboratory-based hypervelocity impact to catastrophically break up the satellite; and characterization of the properties of breakup fragments down to 2 mm in size. The data obtained, including fragment size, area-to-mass ratio, density, shape, material composition, optical properties, and radar cross-section distributions, will be used to supplement the DoD's and NASA's satellite breakup models to better describe the breakup outcome of a modern satellite.
Technical and investigative support for high density digital satellite recording systems
NASA Technical Reports Server (NTRS)
Schultz, R. A.
1983-01-01
Recent results of dropout measurements and defect analysis conducted on one reel of Ampex 721 which was submitted for evaluation by the manufacturer are described. The results or status of other tape evaluation activities are also reviewed. Several changes in test interpretations and applications are recommended. In some cases, deficiencies in test methods or equipment became apparent during continued work on this project and other IITRI tape evaluation projects. Techniques and equipment for future tasks such as tape qualification are also recommended and discussed. Project effort and expenditures were kept at a relatively low level. This rate provided added development time and experience with the IITRI Dropout Measurement System, which is approaching its potential as a computer based dropout analysis tool. Another benefit is the expanded data base on critical parameters that can be achieved from tests on different tape types and lots as they become available. More consideration and effort was directed toward identification of critical parameters, development of meaningful repeatable test procedures, and tape procurement strategy.
Li, Wenqing; Walther, Christian F J; Kuc, Agnieszka; Heine, Thomas
2013-07-09
The performance of a wide variety of commonly used density functionals, as well as two screened hybrid functionals (HSE06 and TB-mBJ), on predicting electronic structures of a large class of en vogue materials, such as metal oxides, chalcogenides, and nitrides, is discussed in terms of band gaps, band structures, and projected electronic densities of states. Contrary to GGA, hybrid functionals and GGA+U, both HSE06 and TB-mBJ are able to predict band gaps with an appreciable accuracy of 25% and thus allow the screening of various classes of transition-metal-based compounds, i.e., mixed or doped materials, at modest computational cost. The calculated electronic structures are largely unaffected by the choice of basis functions and software implementation, however, might be subject to the treatment of the core electrons.
Multivariate Cluster Analysis.
ERIC Educational Resources Information Center
McRae, Douglas J.
Procedures for grouping students into homogeneous subsets have long interested educational researchers. The research reported in this paper is an investigation of a set of objective grouping procedures based on multivariate analysis considerations. Four multivariate functions that might serve as criteria for adequate grouping are given and…
A multivariate model and statistical method for validating tree grade lumber yield equations
Donald W. Seegrist
1975-01-01
Lumber yields within lumber grades can be described by a multivariate linear model. A method for validating lumber yield prediction equations when there are several tree grades is presented. The method is based on multivariate simultaneous test procedures.
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The design process proceeded through sketch generation, concept selection, and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact category. The sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.
MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.
Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin
2015-04-01
Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power, and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
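For orientation, the Simes-type combination at the heart of GATES-style gene-based tests can be sketched in a few lines; MGAS itself additionally replaces the raw SNP counts with effective numbers of tests derived from the SNP and trait correlation structure, which is not shown here.

```python
import numpy as np

def simes_gene_p(p_values):
    """Simes combination of per-SNP p-values into a single gene-based p-value:
    p_gene = min_j ( m * p_(j) / j ) over ordered p-values p_(1) <= ... <= p_(m)."""
    p = np.sort(np.asarray(p_values, dtype=float))
    m = len(p)
    return float(np.min(m * p / np.arange(1, m + 1)))

print(simes_gene_p([0.002, 0.03, 0.2, 0.5, 0.7]))  # gene-level p-value
```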
NASA Astrophysics Data System (ADS)
Ma, Cheng-Jiun; Ebeling, Harald; Donovan, David; Barrett, Elizabeth
2008-09-01
We present the results of a wide-field spectroscopic analysis of the galaxy population of the massive cluster MACS J0717.5+3745 and the surrounding filamentary structure (z = 0.55), as part of our systematic study of the 12 most distant clusters in the MACS sample. Of 1368 galaxies spectroscopically observed in this field, 563 are identified as cluster members; of those, 203 are classified as emission-line galaxies, 260 as absorption-line galaxies, and 17 as E+A galaxies (defined by (H δ + H γ )/2 > 6 Å and no detection of [O II] and Hβ in emission). The variation of the fraction of emission- and absorption-line galaxies as a function of local projected galaxy density confirms the well-known morphology-density relation, and becomes flat at projected galaxy densities less than ~20 Mpc-2. Interestingly, 16 out of 17 E+A galaxies lie (in projection) within the ram-pressure stripping radius around the cluster core, which we take to be direct evidence that ram-pressure stripping is the primary mechanism that terminates star formation in the E+A population of galaxy clusters. This conclusion is supported by the rarity of E+A galaxies in the filament, which rules out galaxy mergers as the dominant driver of evolution for E+A galaxies in clusters. In addition, we find that the 42 e(a) and 27 e(b) member galaxies, i.e., the dusty-starburst and starburst galaxies respectively, are spread out across almost the entire study area. Their spatial distribution, which shows a strong preference for the filament region, suggests that starbursts are triggered in relatively low-density environments as galaxies are accreted from the field population. Based in part on data collected at Subaru Telescope, which is operated by the National Astronomical Observatory of Japan. Based also in part on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council (NRC) of Canada, the Institute National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii. The spectroscopic data presented herein were obtained at the W.M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The Observatory was made possible by the generous financial support of the W.M. Keck Foundation.
ERIC Educational Resources Information Center
Haberman, Shelby J.; von Davier, Matthias; Lee, Yi-Hsuan
2008-01-01
Multidimensional item response models can be based on multivariate normal ability distributions or on multivariate polytomous ability distributions. For the case of simple structure in which each item corresponds to a unique dimension of the ability vector, some applications of the two-parameter logistic model to empirical data are employed to…
A RUTCOR Project on Discrete Applied Mathematics
1989-01-30
the more important results of this work is the possibility that Groebner basis methods of computational commutative algebra might lead to effective...Billera, L.J., "Groebner Basis Methods for Multivariate Splines," prepared for the Proceedings of the Oslo Conference on Computer-aided Geometric Design
imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel
Grapov, Dmitry; Newman, John W.
2012-01-01
Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Availability and implementation: Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010). Contact: John.Newman@ars.usda.gov Supplementary Information: Installation instructions, tutorials and users manual are available at http://sourceforge.net/projects/imdev/. PMID:22815358
A PDF closure model for compressible turbulent chemically reacting flows
NASA Technical Reports Server (NTRS)
Kollmann, W.
1992-01-01
The objective of the proposed research project was the analysis of single point closures based on probability density function (pdf) and characteristic functions and the development of a prediction method for the joint velocity-scalar pdf in turbulent reacting flows. Turbulent flows of boundary layer type and stagnation point flows with and without chemical reactions were calculated as principal applications. Pdf methods for compressible reacting flows were developed and tested in comparison with available experimental data. The research work carried out in this project concentrated on the closure of pdf equations for incompressible and compressible turbulent flows with and without chemical reactions.
Nukazawa, Kei; Arai, Ryosuke; Kazama, So; Takemon, Yasuhiro
2018-06-14
Climate change places considerable stress on riverine ecosystems by altering flow regimes and increasing water temperature. This study evaluated how water temperature increases under climate change scenarios will affect stream invertebrates in pristine headwater streams. The studied headwater-stream sites were distributed within a temperate catchment of Japan and had similar hydraulic-geographical conditions, but were subject to varying temperature conditions due to altitudinal differences (100 to 850 m). We adopted eight general circulation models (GCMs) to project air temperature under conservative (RCP2.6), intermediate (RCP4.5), and extreme climate scenarios (RCP8.5) during the near (2031-2050) and far (2081-2100) future. Using the water temperature of headwater streams computed by a distributed hydrological-thermal model as a predictor variable, we projected the population density of stream invertebrates in the future scenarios based on generalized linear models. The mean decrease in the temporally averaged population density of Plecoptera was 61.3% among the GCMs, even under RCP2.6 in the near future, whereas density deteriorated even further (90.7%) under RCP8.5 in the far future. Trichoptera density was also projected to deteriorate greatly under RCP8.5 in the far future. We defined taxa that exhibited temperature-sensitive declines under climate change as cold stenotherms and found that most Plecoptera taxa were cold stenotherms in comparison to other orders. Specifically, the taxonomic families that are distributed only in the Palearctic realm (e.g., Megarcys ochracea and Scopura longa) were selectively assigned, suggesting that Plecoptera families with restricted Palearctic distributions might be sensitive indicators of climate change. Plecoptera and Trichoptera populations in the headwaters are expected to decrease over a considerable geographical range of the catchment, even under RCP2.6 in the near future. Given that headwater invertebrates play important roles in streams, such as contributing to watershed productivity, our results provide useful information for managing streams at the catchment level. Copyright © 2018 Elsevier B.V. All rights reserved.
Cournède, Paul-Henry; Mathieu, Amélie; Houllier, François; Barthélémy, Daniel; de Reffye, Philippe
2008-01-01
Background and Aims The dynamical system of plant growth GREENLAB was originally developed for individual plants, without explicitly taking into account interplant competition for light. Inspired by the competition models developed in the context of forest science for mono-specific stands, we propose to adapt the method of crown projection onto the x–y plane to GREENLAB, in order to study the effects of density on resource acquisition and on architectural development. Methods The empirical production equation of GREENLAB is extrapolated to stands by computing the exposed photosynthetic foliage area of each plant. The computation is based on the combination of Poisson models of leaf distribution for all the neighbouring plants whose crown projection surfaces overlap. To study the effects of density on architectural development, we link the proposed competition model to the model of interaction between functional growth and structural development introduced by Mathieu (2006, PhD Thesis, Ecole Centrale de Paris, France). Key Results and Conclusions The model is applied to mono-specific field crops and forest stands. For high-density crops at full cover, the model is shown to be equivalent to the classical equation of field crop production ( Howell and Musick, 1985, in Les besoins en eau des cultures; Paris: INRA Editions). However, our method is more accurate at the early stages of growth (before cover) or in the case of intermediate densities. It may potentially account for local effects, such as uneven spacing, variation in the time of plant emergence or variation in seed biomass. The application of the model to trees illustrates the expression of plant plasticity in response to competition for light. Density strongly impacts on tree architectural development through interactions with the source–sink balances during growth. The effects of density on tree height and radial growth that are commonly observed in real stands appear as emerging properties of the model. PMID:18037666
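A toy version of the Poisson (Beer-Lambert-type) overlap idea described above, with made-up numbers and a deliberately simplified sharing rule rather than GREENLAB's actual production equation: leaves of all plants whose crown projections overlap are treated as randomly distributed over the shared surface, and each plant's exposed (light-capturing) foliage is its proportional share of the intercepted area.

```python
import numpy as np

def exposed_foliage_area(own_leaf_area, neighbour_leaf_area, shared_area):
    """Poisson gap-fraction sketch of exposed foliage on a shared projection area."""
    total = own_leaf_area + neighbour_leaf_area
    intercepted = shared_area * (1.0 - np.exp(-total / shared_area))  # light captured
    return intercepted * own_leaf_area / total  # proportional share for this plant

# The same plant in isolation vs. in a dense stand (illustrative values, m^2)
print(exposed_foliage_area(0.8, 0.0, 1.0))  # no competition
print(exposed_foliage_area(0.8, 2.4, 1.0))  # strong competition for light
```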
Kwak, Yoonjin; Koh, Jiwon; Kim, Duck-Woo; Kang, Sung-Bum; Kim, Woo Ho; Lee, Hye Seung
2016-01-01
Background The immunoscore (IS), an index based on the density of CD3+ and CD8+ tumor-infiltrating lymphocytes (TILs) in the tumor center (CT) and invasive margin (IM), has gained considerable attention as a prognostic marker. Tumor-associated macrophages (TAMs) have also been reported to have prognostic value. However, the clinical significance of these markers has not been fully clarified in patients with advanced CRC who present with distant metastases. Methods The density of CD3+, CD4+, CD8+, FOXP3+, CD68+, and CD163+ immune cells within CRC tissue procured from three sites, namely the primary CT, the IM, and distant metastasis (DM), was determined using immunohistochemistry and a digital image analyzer (n=196). The IS was obtained by quantifying the densities of CD3+ and CD8+ TILs in the CT and IM. IS-metastatic and IS-macrophage, additional IS models designed in this study, were obtained by adding the CD3 and CD8 scores in DM and the CD163 score in primary tumors (CT and IM), respectively, to the IS. Results Higher IS, IS-metastatic, and IS-macrophage values were significantly correlated with better prognosis (p=0.020, p≤0.001, and p=0.005, respectively). Multivariate analysis revealed that only IS-metastatic was an independent prognostic marker (p=0.012). No significant correlation was observed between KRAS mutation and the three IS models. However, in the subgroup analysis, IS-metastatic showed a prognostic association regardless of KRAS mutational status. Conclusion The IS is a reproducible method for predicting the survival of patients with advanced CRC. Additionally, an IS including the CD3+ and CD8+ TIL densities at DM could be a strong prognostic marker for advanced CRC. PMID:27835889
Klukkert, Marten; Wu, Jian X; Rantanen, Jukka; Carstensen, Jens M; Rades, Thomas; Leopold, Claudia S
2016-07-30
Monitoring of tablet quality attributes in direct vicinity of the production process requires analytical techniques that allow fast, non-destructive, and accurate tablet characterization. The overall objective of this study was to investigate the applicability of multispectral UV imaging as a reliable, rapid technique for estimation of the tablet API content and tablet hardness, as well as determination of tablet intactness and the tablet surface density profile. One of the aims was to establish an image analysis approach based on multivariate image analysis and pattern recognition to evaluate the potential of UV imaging for automated quality control of tablets with respect to their intactness and surface density profile. Various tablets of different composition and different quality regarding their API content, radial tensile strength, intactness, and surface density profile were prepared using an eccentric as well as a rotary tablet press at compression pressures from 20 MPa up to 410 MPa. It was found that UV imaging can provide relevant information on both chemical and physical tablet attributes. The tablet API content and radial tensile strength could be estimated by UV imaging combined with partial least squares analysis. Furthermore, an image analysis routine was developed and successfully applied to the UV images that provided qualitative information on physical tablet surface properties such as intactness and surface density profiles, as well as quantitative information on variations in the surface density. In conclusion, this study demonstrates that UV imaging combined with image analysis is an effective and non-destructive method to determine chemical and physical quality attributes of tablets and is a promising approach for (near) real-time monitoring of the tablet compaction process and formulation optimization purposes. Copyright © 2015 Elsevier B.V. All rights reserved.
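A minimal sketch of the partial least squares step mentioned above (synthetic features and scikit-learn assumed; the published work's imaging features and calibration are not reproduced here): regress a reference API content on per-tablet spectral/texture features and predict the content of held-out tablets.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
# Hypothetical UV-image features per tablet (mean band reflectances, texture stats)
X = rng.normal(size=(120, 40))
api_content = X[:, :5].sum(axis=1) * 2.0 + rng.normal(scale=0.5, size=120)

pls = PLSRegression(n_components=5).fit(X[:100], api_content[:100])
pred = pls.predict(X[100:]).ravel()
print(np.corrcoef(pred, api_content[100:])[0, 1])  # prediction vs. reference
```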
Salisbury, Margaret L; Xia, Meng; Murray, Susan; Bartholmai, Brian J; Kazerooni, Ella A; Meldrum, Catherine A; Martinez, Fernando J; Flaherty, Kevin R
2016-09-01
Idiopathic pulmonary fibrosis (IPF) can be diagnosed confidently and non-invasively when clinical and computed tomography (CT) criteria are met. Many do not meet these criteria due to absence of CT honeycombing. We investigated predictors of IPF and combinations allowing accurate diagnosis in individuals without honeycombing. We utilized prospectively collected clinical and CT data from patients enrolled in the Lung Tissue Research Consortium. Included patients had no honeycombing, no connective tissue disease, underwent diagnostic lung biopsy, and had CT pattern consistent with fibrosing ILD (n = 200). Logistic regression identified clinical and CT variables predictive of IPF. The probability of IPF was assessed at various cut-points of important clinical and CT variables. A multivariable model adjusted for age and gender found increasingly extensive reticular densities (OR 2.93, CI 95% 1.55-5.56, p = 0.001) predicted IPF, while increasing ground glass densities predicted a diagnosis other than IPF (OR 0.55, CI 95% 0.34-0.89, p = 0.02). The model-based probability of IPF was 80% or greater in patients with age at least 60 years and extent of reticular density one-third or more of total lung volume; for patients meeting or exceeding these clinical thresholds the specificity for IPF is 96% (CI 95% 91-100%) with 21 of 134 (16%) biopsies avoided. In patients with suspected fibrotic ILD and absence of CT honeycombing, extent of reticular and ground glass densities predict a diagnosis of IPF. The probability of IPF exceeds 80% in subjects over age 60 years with one-third of total lung having reticular densities. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salamon, Todd
2012-12-13
Faster, more powerful and dense computing hardware generates significant heat and imposes considerable data center cooling requirements. Traditional computer room air conditioning (CRAC) cooling methods are proving increasingly cost-ineffective and inefficient. Studies show that using the volume of room air as a heat exchange medium is wasteful and allows for substantial mixing of hot and cold air. Further, it limits cabinet/frame/rack density because it cannot effectively cool high heat density equipment that is spaced closely together. A more cost-effective, efficient solution for maximizing heat transfer and enabling higher heat density equipment frames can be accomplished by utilizing properly positioned phase change or two-phase pumped refrigerant cooling methods. Pumping low pressure, oil-free phase changing refrigerant through microchannel heat exchangers can provide up to 90% less energy consumption for the primary cooling loop within the room. The primary benefits of such a solution include reduced energy requirements, optimized utilization of data center space, and lower OPEX and CAPEX. Alcatel-Lucent recently developed a modular cooling technology based on a pumped two-phase refrigerant that removes heat directly at the shelf level of equipment racks. The key elements that comprise the modular cooling technology consist of the following. A pump delivers liquid refrigerant to finned microchannel heat exchangers mounted on the back of equipment racks. Fans drive air through the equipment shelf, where the air gains heat dissipated by the electronic components therein. Prior to exiting the rack, the heated air passes through the heat exchangers, where it is cooled back down to the temperature level of the air entering the frame by vaporization of the refrigerant, which is subsequently returned to a condenser where it is liquefied and recirculated by the pump. All the cooling air enters and leaves the shelves/racks at nominally the same temperature. Results of a 100 kW prototype data center installation of the refrigerant-based modular cooling technology were dramatic in terms of energy efficiency and the ability to cool high-heat-density equipment. The prototype data center installation consisted of 10 racks each loaded with 10 kW of high-heat-density IT equipment with the racks arranged in a standard hot-aisle/cold-aisle configuration with standard cabinet spacing. A typical chilled-water CRAC unit would require approximately 16 kW to cool such a heat load. In contrast, the refrigerant-based modular cooling technology required only 2.3 kW of power for the refrigerant pump and shelf-level fans, a reduction of 85 percent. Differences in hot-aisle and cold-aisle temperature were also substantially reduced, mitigating many issues that arise in purely air-based cooling systems, such as mixing of hot and cold air streams, or from placing high-heat-density equipment in close proximity. The technology can also be retrofitted to live equipment without service interruption, which is particularly important to the large installed ICT customer base, thereby providing a means of mitigating reliability and performance concerns during the installation, training and validation phases of product integration.
Moreover, the refrigerant used in our approach, R134a, is a widely-used, non-toxic dielectric liquid which, unlike water, is non-conducting and non-corrosive and will not damage electronics in the case of a leak: a triple-play win over alternative water-based liquid coolant technologies. Finally, through use of a pumped refrigerant, pressures are modest (~60 psi), and toxic lubricants and oils are not required, in contrast to compressorized refrigerant systems: another environmental win. Project Activities - The ARCTIC project goal was to further develop and dramatically accelerate the commercialization of this game-changing, refrigerant-based, liquid-cooling technology and achieve a revolutionary increase in energy efficiency and carbon footprint reduction for our nation's Information and Communications Technology (ICT) infrastructure. The specific objectives of the ARCTIC project focused on the following three areas: i) advanced research innovations that dramatically enhance the ability to deal with ever-increasing device heat densities and footprint reduction by bringing the liquid cooling much closer to the actual heat sources; ii) manufacturing optimization of key components; and iii) ensuring rapid market acceptance by reducing cost, thoroughly understanding system-level performance, and developing viable commercialization strategies. The project involved participants with expertise in all aspects of commercialization, including research & development, manufacturing, sales & marketing and end users. The team was led by Alcatel-Lucent, and included subcontractors Modine and USHose.
FORGE Newberry 3D Gravity Density Model for Newberry Volcano
Alain Bonneville
2016-03-11
These data are Pacific Northwest National Lab inversions of an amalgamation of two surface gravity datasets: Davenport-Newberry gravity collected prior to 2012 stimulations and Zonge International gravity collected in 2012 for the project "Novel use of 4D Monitoring Techniques to Improve Reservoir Longevity and Productivity in Enhanced Geothermal Systems". Inversions of surface gravity recover a 3D distribution of density contrast from which intrusive igneous bodies are identified. The data list a body name, body type, point type, UTM X and Y coordinates, Z (specified as meters below sea level; negative values therefore indicate elevations above sea level), thickness of the body in meters, suscept, density anomaly in g/cc, background density in g/cc, and density in g/cc. The model was created using commercial gravity inversion software called ModelVision 12.0 (http://www.tensor-research.com.au/Geophysical-Products/ModelVision). The initial model is based on the seismic tomography interpretation (Beachly et al., 2012). All the gravity data used to constrain this model are on the GDR: https://gdr.openei.org/submissions/760.
Stearns, Vered; Fackler, Mary Jo; Hafeez, Sidra; Bujanda, Zoila Lopez; Chatterton, Robert T; Jacobs, Lisa K; Khouri, Nagi F; Ivancic, David; Kenney, Kara; Shehata, Christina; Jeter, Stacie C; Wolfman, Judith A; Zalles, Carola M; Huang, Peng; Khan, Seema A; Sukumar, Saraswati
2016-08-01
Methods to determine individualized breast cancer risk lack sufficient sensitivity to select women most likely to benefit from preventive strategies. Alterations in DNA methylation occur early in breast cancer. We hypothesized that cancer-specific methylation markers could enhance breast cancer risk assessment. We evaluated 380 women without a history of breast cancer. We determined their menopausal status or menstrual cycle phase, risk of developing breast cancer (Gail model), and breast density and obtained random fine-needle aspiration (rFNA) samples for assessment of cytopathology and cumulative methylation index (CMI). Eight methylated gene markers were identified through whole-genome methylation analysis and included novel and previously established breast cancer detection genes. We performed correlative and multivariate linear regression analyses to evaluate DNA methylation of a gene panel as a function of clinical factors associated with breast cancer risk. CMI and individual gene methylation were independent of age, menopausal status or menstrual phase, lifetime Gail risk score, and breast density. CMI and individual gene methylation for the eight genes increased significantly (P < 0.001) with increasing cytological atypia. The findings were verified with multivariate analyses correcting for age, log (Gail), log (percent density), rFNA cell number, and body mass index. Our results demonstrate a significant association between cytological atypia and high CMI, which does not vary with menstrual phase or menopause and is independent of Gail risk and mammographic density. Thus, CMI is an excellent candidate breast cancer risk biomarker, warranting larger prospective studies to establish its utility for cancer risk assessment. Cancer Prev Res; 9(8); 673-82. ©2016 AACR.
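A hedged sketch of the kind of multivariate linear regression described above, adjusting CMI for the listed covariates (column names, file name, and the use of statsmodels are illustrative assumptions, not the authors' code):

```python
# Illustrative multivariate linear regression of cumulative methylation index
# (CMI) on cytological atypia, adjusting for the covariates named in the text.
# Data source and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rfna_cohort.csv")  # hypothetical file, one row per participant

# Log-transform Gail risk and percent density, as described above
df["log_gail"] = np.log(df["gail_risk"])
df["log_density"] = np.log(df["percent_density"])

model = smf.ols(
    "CMI ~ atypia_grade + age + log_gail + log_density + cell_number + bmi",
    data=df,
).fit()
print(model.summary())  # association of CMI with atypia, adjusted for covariates
```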
Analysis of percent density estimates from digital breast tomosynthesis projection images
NASA Astrophysics Data System (ADS)
Bakic, Predrag R.; Kontos, Despina; Zhang, Cuiping; Yaffe, Martin J.; Maidment, Andrew D. A.
2007-03-01
Women with dense breasts have an increased risk of breast cancer. Breast density is typically measured as the percent density (PD), the percentage of non-fatty (i.e., dense) tissue in breast images. Mammographic PD estimates vary, in part, due to the projective nature of mammograms. Digital breast tomosynthesis (DBT) is a novel radiographic method in which 3D images of the breast are reconstructed from a small number of projection (source) images, acquired at different positions of the x-ray focus. DBT provides superior visualization of breast tissue and has improved sensitivity and specificity as compared to mammography. Our long-term goal is to test the hypothesis that PD obtained from DBT is superior in estimating cancer risk compared with other modalities. As a first step, we have analyzed the PD estimates from DBT source projections since the results would be independent of the reconstruction method. We estimated PD from MLO mammograms (PD_M) and from individual DBT projections (PD_T). We observed good agreement between PD_M and PD_T from the central projection images of 40 women. This suggests that variations in breast positioning, dose, and scatter between mammography and DBT do not negatively affect PD estimation. The PD_T estimated from individual DBT projections of nine women varied with the angle between the projections. This variation is caused by the 3D arrangement of the breast dense tissue and the acquisition geometry.
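Percent density as defined above is simply the fraction of segmented dense tissue within the breast region of an image. A minimal sketch, assuming dense-tissue and breast-region masks are provided by an upstream segmentation step not described here:

```python
# Sketch of the percent-density (PD) definition used above: the percentage of
# dense (non-fatty) tissue within the breast region of an image. The masks are
# assumed to come from a segmentation step outside this snippet.
import numpy as np

def percent_density(dense_mask: np.ndarray, breast_mask: np.ndarray) -> float:
    """Return PD in percent, given boolean masks of equal shape."""
    dense_pixels = np.count_nonzero(dense_mask & breast_mask)
    breast_pixels = np.count_nonzero(breast_mask)
    return 100.0 * dense_pixels / breast_pixels if breast_pixels else 0.0

# PD can be computed per DBT projection angle, mirroring the per-projection
# estimates (PD_T) discussed above.
```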
NASA Astrophysics Data System (ADS)
Zhang, Zhiyuan; Jiang, Wanrun; Wang, Bo; Wang, Zhigang
2017-06-01
We introduce the orbital-resolved electron density projected integral (EDPI) along the H-bond in real space to quantitatively investigate the specific contributions of individual molecular orbitals (MOs) in (H2O)2. Calculation results show that a single occupied orbital (HOMO-4) of (H2O)2 accounts for a surprising ~40% of the electron density at the bond critical point. Moreover, an electron density difference analysis visualizes the electron-accumulating effect of the orbital interaction within the H-bond between water molecules, supporting its covalent-like character. Our work expands the understanding of the H-bond in terms of the specific contributions of certain MOs.
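One way to read an orbital-resolved "density projected integral along the H-bond" numerically is as a line integral of an orbital's density sampled along the donor-acceptor axis; the sketch below assumes the orbital density is available as a callable on 3D points, and the paper's exact EDPI definition may differ:

```python
# Illustrative numerical sketch of a projected integral of orbital-resolved
# electron density along the H-bond axis. The paper's precise EDPI definition
# is not reproduced here; this simply samples an orbital density rho_i(r) on
# points along the donor-acceptor axis and integrates with the trapezoid rule.
import numpy as np

def edpi_along_axis(rho_orbital, r_donor, r_acceptor, n_points=200):
    """rho_orbital: callable returning the orbital density at a 3D point (assumed)."""
    t = np.linspace(0.0, 1.0, n_points)
    points = r_donor[None, :] + t[:, None] * (r_acceptor - r_donor)[None, :]
    values = np.array([rho_orbital(p) for p in points])
    length = np.linalg.norm(r_acceptor - r_donor)
    return np.trapz(values, dx=length / (n_points - 1))
```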
Multivariate classification of small order watersheds in the Quabbin Reservoir Basin, Massachusetts
Lent, R.M.; Waldron, M.C.; Rader, J.C.
1998-01-01
A multivariate approach was used to analyze hydrologic, geologic, geographic, and water-chemistry data from small order watersheds in the Quabbin Reservoir Basin in central Massachusetts. Eighty three small order watersheds were delineated and landscape attributes defining hydrologic, geologic, and geographic features of the watersheds were compiled from geographic information system data layers. Principal components analysis was used to evaluate 11 chemical constituents collected bi-weekly for 1 year at 15 surface-water stations in order to subdivide the basin into subbasins comprised of watersheds with similar water quality characteristics. Three principal components accounted for about 90 percent of the variance in water chemistry data. The principal components were defined as a biogeochemical variable related to wetland density, an acid-neutralization variable, and a road-salt variable related to density of primary roads. Three subbasins were identified. Analysis of variance and multiple comparisons of means were used to identify significant differences in stream water chemistry and landscape attributes among subbasins. All stream water constituents were significantly different among subbasins. Multiple regression techniques were used to relate stream water chemistry to landscape attributes. Important differences in landscape attributes were related to wetlands, slope, and soil type.
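A minimal sketch of the principal components step described above, assuming a table of the 11 constituents per station (file and column handling, standardization, and the use of scikit-learn are assumptions):

```python
# Illustrative PCA along the lines described above: reduce 11 water-chemistry
# constituents measured at the surface-water stations to three components.
# File name and column layout are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

chem = pd.read_csv("station_chemistry.csv", index_col="station")  # 11 constituent columns

X = StandardScaler().fit_transform(chem)    # standardize constituents before PCA
pca = PCA(n_components=3).fit(X)

print(pca.explained_variance_ratio_.sum())  # ~0.90 of the variance, as reported above
# The loadings (pca.components_) are then inspected to interpret each axis,
# e.g., a wetland-related biogeochemical variable, acid neutralization, and road salt.
```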
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berdichevsky, Gene
Commercial Li-ion batteries typically use Ni- and Co-based intercalation cathodes. As the demand for improved performance from batteries increases, these cathode materials will no longer be able to provide the desired energy storage characteristics since they are currently approaching their theoretical limits. Conversion cathode materials are prime candidates for improvement of Li-ion batteries. On both a volumetric and gravimetric basis they have higher theoretical capacity than intercalation cathode materials. Metal fluoride (MFx) cathodes offer higher specific energy density and dramatically higher volumetric energy density. Challenges associated with metal fluoride cathodes were addressed through nanostructured material design and synthesis. A major goal of this project was to develop and demonstrate Li-ion cells based on Si-comprising anodes and metal fluoride (MFx) comprising cathodes. Pairing the high-capacity MFx cathode with a high-capacity anode, such as an alloying Si anode, allows for the highest possible energy density on a cell level. After facing and overcoming multiple material synthesis and electrochemical instability challenges, we succeeded in fabrication of MFx half cells with cycle stability in excess of 500 cycles (to 20% or smaller degradation) and full cells with MFx-based cathodes and Si-based anodes with cycle stability in excess of 200 cycles (to 20% or smaller degradation).
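As an illustration of why conversion cathodes such as metal fluorides have higher theoretical capacity than intercalation cathodes, gravimetric capacity follows from Faraday's law, Q = nF/(3.6 M) in mAh/g; the sketch below uses FeF3 and LiCoO2 as representative examples (the specific metal fluoride used in the project is not stated above):

```python
# Theoretical gravimetric capacity Q = n * F / (3.6 * M) in mAh/g, where n is
# the number of electrons transferred per formula unit, F the Faraday constant
# (C/mol), and M the molar mass (g/mol). FeF3 and LiCoO2 are illustrative
# examples only; the project text does not name the specific metal fluoride.
FARADAY = 96485.0  # C/mol

def theoretical_capacity_mah_per_g(n_electrons: float, molar_mass: float) -> float:
    return n_electrons * FARADAY / (3.6 * molar_mass)

# Conversion cathode: FeF3 + 3 Li -> Fe + 3 LiF (three-electron conversion)
print(theoretical_capacity_mah_per_g(3, 112.84))  # ~712 mAh/g

# Intercalation cathode: LiCoO2 (at most one Li per formula unit)
print(theoretical_capacity_mah_per_g(1, 97.87))   # ~274 mAh/g theoretical
```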