Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces all indices of multicollinearity diagnoses, the basic principle of principal component regression and determination of 'best' equation method. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0: including all calculating processes of the principal component regression and all operations of linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. The principal component regression analysis can be used to overcome disturbance of the multicollinearity. The simplified, speeded up and accurate statistical effect is reached through the principal component regression analysis with SPSS.
Elkhoudary, Mahmoud M; Naguib, Ibrahim A; Abdel Salam, Randa A; Hadad, Ghada M
2017-05-01
Four accurate, sensitive and reliable stability indicating chemometric methods were developed for the quantitative determination of Agomelatine (AGM) whether in pure form or in pharmaceutical formulations. Two supervised learning machines' methods; linear artificial neural networks (PC-linANN) preceded by principle component analysis and linear support vector regression (linSVR), were compared with two principle component based methods; principle component regression (PCR) as well as partial least squares (PLS) for the spectrofluorimetric determination of AGM and its degradants. The results showed the benefits behind using linear learning machines' methods and the inherent merits of their algorithms in handling overlapped noisy spectral data especially during the challenging determination of AGM alkaline and acidic degradants (DG1 and DG2). Relative mean squared error of prediction (RMSEP) for the proposed models in the determination of AGM were 1.68, 1.72, 0.68 and 0.22 for PCR, PLS, SVR and PC-linANN; respectively. The results showed the superiority of supervised learning machines' methods over principle component based methods. Besides, the results suggested that linANN is the method of choice for determination of components in low amounts with similar overlapped spectra and narrow linearity range. Comparison between the proposed chemometric models and a reported HPLC method revealed the comparable performance and quantification power of the proposed models.
NASA Astrophysics Data System (ADS)
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With the development of industrialization, the phenomenon of water pollution is becoming more and more frequent, which directly affects the survival and development of human. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, which partial least squares regression (PLSR) analysis method is becoming predominant technology, however, in some special cases, PLSR's analysis produce considerable errors. In order to solve this problem, the traditional principal component regression (PCR) analysis method was improved by using the principle of PLSR in this paper. The experimental results show that for some special experimental data set, improved PCR analysis method performance is better than PLSR. The PCR and PLSR is the focus of this paper. Firstly, the principal component analysis (PCA) is performed by MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal component is extracted by using the principle of PLSR, which carries most of the original data information. Secondly, the linear regression analysis of the principal component is carried out with statistic package for social science (SPSS), which the coefficients and relations of principal components can be obtained. Finally, calculating a same water spectral data set by PLSR and improved PCR, analyzing and comparing two results, improved PCR and PLSR is similar for most data, but improved PCR is better than PLSR for data near the detection limit. Both PLSR and improved PCR can be used in Ultraviolet spectral analysis of water, but for data near the detection limit, improved PCR's result better than PLSR.
Plant, soil, and shadow reflectance components of row crops
NASA Technical Reports Server (NTRS)
Richardson, A. J.; Wiegand, C. L.; Gausman, H. W.; Cuellar, J. A.; Gerbermann, A. H.
1975-01-01
Data from the first Earth Resource Technology Satellite (LANDSAT-1) multispectral scanner (MSS) were used to develop three plant canopy models (Kubelka-Munk (K-M), regression, and combined K-M and regression models) for extracting plant, soil, and shadow reflectance components of cropped fields. The combined model gave the best correlation between MSS data and ground truth, by accounting for essentially all of the reflectance of plants, soil, and shadow between crop rows. The principles presented can be used to better forecast crop yield and to estimate acreage.
Diversity of soil yeasts isolated from South Victoria Land, Antarctica
Connell, L.; Redman, R.; Craig, S.; Scorzetti, G.; Iszard, M.; Rodriguez, R.
2008-01-01
Unicellular fungi, commonly referred to as yeasts, were found to be components of the culturable soil fungal population in Taylor Valley, Mt. Discovery, Wright Valley, and two mountain peaks of South Victoria Land, Antarctica. Samples were taken from sites spanning a diversity of soil habitats that were not directly associated with vertebrate activity. A large proportion of yeasts isolated in this study were basidiomycetous species (89%), of which 43% may represent undescribed species, demonstrating that culturable yeasts remain incompletely described in these polar desert soils. Cryptococcus species represented the most often isolated genus (33%) followed by Leucosporidium (22%). Principle component analysis and multiple linear regression using stepwise selection was used to model the relation between abiotic variables (principle component 1 and principle component 2 scores) and yeast biodiversity (the number of species present at a given site). These analyses identified soil pH and electrical conductivity as significant predictors of yeast biodiversity. Species-specific PCR primers were designed to rapidly discriminate among the Dioszegia and Leucosporidium species collected in this study. ?? 2008 Springer Science+Business Media, LLC.
A framework for evaluating student perceptions of health policy training in medical school.
Patel, Mitesh S; Lypson, Monica L; Miller, D Douglas; Davis, Matthew M
2014-10-01
Nearly half of graduating medical students in the United States report that medical school provides inadequate instruction in topics related to health policy. Although most medical schools report some form of policy education, there lacks a standard for teaching core concepts and evaluating student satisfaction. Responses to the Association of American Medical College's Medical School Graduation Questionnaire were obtained for the years 2007-2008 and 2011-2012 and mapped to domains of training in health policy curricula for four domains: systems and principles; value and equity; quality and safety; and politics and law. Chi-square tests were used to test differences among unadjusted temporal trends. Multiple logistic regression models were fit to the outcome variables and adjusted for student characteristics, student preferences, and medical school characteristics. Compared with 2007-2008, students' perceptions of training in 2011-2012 increased on a relative basis by 11.7% for components within systems and principles, 2.8% for quality and safety, and 6.8% for value and equity. Components within politics and law had a composite decline of 4.8%. Multiple logistic regression models found higher odds of reporting satisfaction with training over time for all components within the domains of systems and principles, quality and safety, and value and equity (P < .01), with the exception of medical economics. Medical student perceptions of training in health policy improved over time. Causal factors for these trends require further study. Despite improvement, nearly 40% of graduating medical students still report inadequate instruction in health policy.
Liu, Hui-lin; Wan, Xia; Yang, Gong-huan
2013-02-01
To explore the relationship between the strength of tobacco control and the effectiveness of creating smoke-free hospital, and summarize the main factors that affect the program of creating smoke-free hospitals. A total of 210 hospitals from 7 provinces/municipalities directly under the central government were enrolled in this study using stratified random sampling method. Principle component analysis and regression analysis were conducted to analyze the strength of tobacco control and the effectiveness of creating smoke-free hospitals. Two principal components were extracted in the strength of tobacco control index, which respectively reflected the tobacco control policies and efforts, and the willingness and leadership of hospital managers regarding tobacco control. The regression analysis indicated that only the first principal component was significantly correlated with the progression in creating smoke-free hospital (P<0.001), i.e. hospitals with higher scores on the first principal component had better achievements in smoke-free environment creation. Tobacco control policies and efforts are critical in creating smoke-free hospitals. The principal component analysis provides a comprehensive and objective tool for evaluating the creation of smoke-free hospitals.
NASA Astrophysics Data System (ADS)
Ying, Yibin; Liu, Yande; Fu, Xiaping; Lu, Huishan
2005-11-01
The artificial neural networks (ANNs) have been used successfully in applications such as pattern recognition, image processing, automation and control. However, majority of today's applications of ANNs is back-propagate feed-forward ANN (BP-ANN). In this paper, back-propagation artificial neural networks (BP-ANN) were applied for modeling soluble solid content (SSC) of intact pear from their Fourier transform near infrared (FT-NIR) spectra. One hundred and sixty-four pear samples were used to build the calibration models and evaluate the models predictive ability. The results are compared to the classical calibration approaches, i.e. principal component regression (PCR), partial least squares (PLS) and non-linear PLS (NPLS). The effects of the optimal methods of training parameters on the prediction model were also investigated. BP-ANN combine with principle component regression (PCR) resulted always better than the classical PCR, PLS and Weight-PLS methods, from the point of view of the predictive ability. Based on the results, it can be concluded that FT-NIR spectroscopy and BP-ANN models can be properly employed for rapid and nondestructive determination of fruit internal quality.
1983-12-01
analysis; such work is not reported here. It seems pos- sible that a robust principle component analysis may he informa- tive (see Gnanadesikan (1977...Statistics in Atmospheric Sciences, American Meteorological Soc., Boston, Mass. (1979) pp. 46-48. a Gnanadesikan , R., Methods for Statistical Data...North Carolina Chapel Hill, NC 20742 Dr. R. Gnanadesikan Bell Telephone Lab Murray Hill, NJ 07733 -%.. *5%a: *1 *15 I ,, - . . , ,, ... . . . . . . NO
Face Alignment via Regressing Local Binary Features.
Ren, Shaoqing; Cao, Xudong; Wei, Yichen; Sun, Jian
2016-03-01
This paper presents a highly efficient and accurate regression approach for face alignment. Our approach has two novel components: 1) a set of local binary features and 2) a locality principle for learning those features. The locality principle guides us to learn a set of highly discriminative local binary features for each facial landmark independently. The obtained local binary features are used to jointly learn a linear regression for the final output. This approach achieves the state-of-the-art results when tested on the most challenging benchmarks to date. Furthermore, because extracting and regressing local binary features are computationally very cheap, our system is much faster than previous methods. It achieves over 3000 frames per second (FPS) on a desktop or 300 FPS on a mobile phone for locating a few dozens of landmarks. We also study a key issue that is important but has received little attention in the previous research, which is the face detector used to initialize alignment. We investigate several face detectors and perform quantitative evaluation on how they affect alignment accuracy. We find that an alignment friendly detector can further greatly boost the accuracy of our alignment method, reducing the error up to 16% relatively. To facilitate practical usage of face detection/alignment methods, we also propose a convenient metric to measure how good a detector is for alignment initialization.
The measure and significance of Bateman's principles
Collet, Julie M.; Dean, Rebecca F.; Worley, Kirsty; Richardson, David S.; Pizzari, Tommaso
2014-01-01
Bateman's principles explain sex roles and sexual dimorphism through sex-specific variance in mating success, reproductive success and their relationships within sexes (Bateman gradients). Empirical tests of these principles, however, have come under intense scrutiny. Here, we experimentally show that in replicate groups of red junglefowl, Gallus gallus, mating and reproductive successes were more variable in males than in females, resulting in a steeper male Bateman gradient, consistent with Bateman's principles. However, we use novel quantitative techniques to reveal that current methods typically overestimate Bateman's principles because they (i) infer mating success indirectly from offspring parentage, and thus miss matings that fail to result in fertilization, and (ii) measure Bateman gradients through the univariate regression of reproductive over mating success, without considering the substantial influence of other components of male reproductive success, namely female fecundity and paternity share. We also find a significant female Bateman gradient but show that this likely emerges as spurious consequences of male preference for fecund females, emphasizing the need for experimental approaches to establish the causal relationship between reproductive and mating success. While providing qualitative support for Bateman's principles, our study demonstrates how current approaches can generate a misleading view of sex differences and roles. PMID:24648220
Liu, Weijian; Wang, Yilong; Chen, Yuanchen; Tao, Shu; Liu, Wenxin
2017-07-01
The total concentrations and component profiles of polycyclic aromatic hydrocarbons (PAHs) in ambient air, surface soil and wheat grain collected from wheat fields near a large steel-smelting manufacturer in Northern China were determined. Based on the specific isomeric ratios of paired species in ambient air, principle component analysis and multivariate linear regression, the main emission source of local PAHs was identified as a mixture of industrial and domestic coal combustion, biomass burning and traffic exhaust. The total organic carbon (TOC) fraction was considerably correlated with the total and individual PAH concentrations in surface soil. The total concentrations of PAHs in wheat grain were relatively low, with dominant low molecular weight constituents, and the compositional profile was more similar to that in ambient air than in topsoil. Combined with more significant results from partial correlation and linear regression models, the contribution from air PAHs to grain PAHs may be greater than that from soil PAHs. Copyright © 2016. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Chen, Hua-cai; Chen, Xing-dan; Lu, Yong-jun; Cao, Zhi-qiang
2006-01-01
Near infrared (NIR) reflectance spectroscopy was used to develop a fast determination method for total ginsenosides in Ginseng (Panax Ginseng) powder. The spectra were analyzed with multiplicative signal correction (MSC) correlation method. The best correlative spectra region with the total ginsenosides content was 1660 nm~1880 nm and 2230nm~2380 nm. The NIR calibration models of ginsenosides were built with multiple linear regression (MLR), principle component regression (PCR) and partial least squares (PLS) regression respectively. The results showed that the calibration model built with PLS combined with MSC and the optimal spectrum region was the best one. The correlation coefficient and the root mean square error of correction validation (RMSEC) of the best calibration model were 0.98 and 0.15% respectively. The optimal spectrum region for calibration was 1204nm~2014nm. The result suggested that using NIR to rapidly determinate the total ginsenosides content in ginseng powder were feasible.
Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa
2013-01-01
Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles was approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325
Pathak, Ekta; Campos-Herrera, Raquel; El-Borai, Fahiem E; Duncan, Larry W
2017-03-01
Relationships between entomopathogenic nematodes (EPNs), nematophagous fungi (NF) and soil physical and chemical properties were studied in a survey of 53 citrus orchards in central ridge and flatwoods ecoregions of Florida. Seven species of NF associated with nematodes were quantified directly using a real time qPCR assay. All nematophagous fungi studied except Arthrobotrys musiformis and Hirsutella rhossiliensis were frequently detected (24-56%) in both regions. Paecilomyces lilacinus and Gamsylella gephyropagumwere encountered more frequently in the flatwoods (P=0.03) and on the ridge (P=0.02), respectively. Redundancy analysis revealed seven abiotic and biotic factors as significantly related to the NF occurrence. Multiple regression of fungi on these variables explained 78%, 66%, 48%, 36%, 23% and 4% of the variation in Catenaria sp., A. musiformis, A. dactyloides, P. lilacinus, A. oligospora and G. gepharopagum, respectively. When the data from citrus were pooled with those reported previously from natural areas and subjected to principle component analysis, the first two principle components explained 43% of the variation in NF communities. The surveys (citrus vs natural areas) were discriminated by PC2 (P<0.001) and the ecoregion by PC1 (P<0.002), and all but one NF species were related (P<0.01) to one or both components. NF communities tended to have more species and greater diversity in the flatwoods, where EPN richness and diversity were the least. However, the strength of associations between individual EPN and NF species as measured by SADIE reflected the associations between each species and ground water depth, suggesting that ecoregion preferences affected the species associations. Within each ecoregion, significant relationships between the individual NF and EPN species measured by stepwise regression tended to be positive. The results did not support the hypothesis that NF modulate the spatial patterns of EPN species between or within these two ecoregions. Copyright © 2017 Elsevier Inc. All rights reserved.
State-space decoding of primary afferent neuron firing rates
NASA Astrophysics Data System (ADS)
Wagenaar, J. B.; Ventura, V.; Weber, D. J.
2011-02-01
Kinematic state feedback is important for neuroprostheses to generate stable and adaptive movements of an extremity. State information, represented in the firing rates of populations of primary afferent (PA) neurons, can be recorded at the level of the dorsal root ganglia (DRG). Previous work in cats showed the feasibility of using DRG recordings to predict the kinematic state of the hind limb using reverse regression. Although accurate decoding results were attained, reverse regression does not make efficient use of the information embedded in the firing rates of the neural population. In this paper, we present decoding results based on state-space modeling, and show that it is a more principled and more efficient method for decoding the firing rates in an ensemble of PA neurons. In particular, we show that we can extract confounded information from neurons that respond to multiple kinematic parameters, and that including velocity components in the firing rate models significantly increases the accuracy of the decoded trajectory. We show that, on average, state-space decoding is twice as efficient as reverse regression for decoding joint and endpoint kinematics.
“Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data
Zhang, Min; Davidian, Marie
2008-01-01
Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the forestry component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three components, with…
Feuerhahn, Nicolas; Bellingrath, Silja; Kudielka, Brigitte M
2013-07-01
We investigated how matching and non-matching demands and resources are related to emotional exhaustion (EE) in teachers. Theoretically, we draw on the Demand-Induced Strain Compensation (DISC) model that proposes that demands, resources, and strains are multidimensional and comprise emotional, cognitive, and physical components. We first tested whether resources compensate aversive effects of demands. Second, as proposed by the triple-match principle, we tested whether interaction effects between job demands and resources are most likely if demands, resources, and outcomes relate to the same dimension. We retrieved data from 177 school teachers; a subsample was re-examined after a time lag of about 21 month (N = 56). Linear regression analyses reveal concurrent and longitudinal main and interaction effects of teacher-specific emotional and cognitive job demands and resources on EE. Results support the compensation principle and triple-match principle. Therefore, the DISC model seems to provide a valuable framework for the study of interaction effects in job stress research and, in particular, for interventions to reduce job strain in teachers. © 2013 The Authors. Applied Psychology: Health and Well-Being © 2013 The International Association of Applied Psychology.
ERIC Educational Resources Information Center
Mugrage, Beverly; And Others
Three ridge regression solutions are compared with ordinary least squares regression and with principal components regression using all components. Ridge regression, particularly the Lawless-Wang solution, out-performed ordinary least squares regression and the principal components solution on the criteria of stability of coefficient and closeness…
NASA Astrophysics Data System (ADS)
Müller, Benjamin; Bernhardt, Matthias; Jackisch, Conrad; Schulz, Karsten
2016-09-01
For understanding water and solute transport processes, knowledge about the respective hydraulic properties is necessary. Commonly, hydraulic parameters are estimated via pedo-transfer functions using soil texture data to avoid cost-intensive measurements of hydraulic parameters in the laboratory. Therefore, current soil texture information is only available at a coarse spatial resolution of 250 to 1000 m. Here, a method is presented to derive high-resolution (15 m) spatial topsoil texture patterns for the meso-scale Attert catchment (Luxembourg, 288 km2) from 28 images of ASTER (advanced spaceborne thermal emission and reflection radiometer) thermal remote sensing. A principle component analysis of the images reveals the most dominant thermal patterns (principle components, PCs) that are related to 212 fractional soil texture samples. Within a multiple linear regression framework, distributed soil texture information is estimated and related uncertainties are assessed. An overall root mean squared error (RMSE) of 12.7 percentage points (pp) lies well within and even below the range of recent studies on soil texture estimation, while requiring sparser sample setups and a less diverse set of basic spatial input. This approach will improve the generation of spatially distributed topsoil maps, particularly for hydrologic modeling purposes, and will expand the usage of thermal remote sensing products.
Ma, Wan-Li; Sun, De-Zhi; Shen, Wei-Guo; Yang, Meng; Qi, Hong; Liu, Li-Yan; Shen, Ji-Min; Li, Yi-Fan
2011-07-01
A comprehensive sampling campaign was carried out to study atmospheric concentration of polycyclic aromatic hydrocarbons (PAHs) in Beijing and to evaluate the effectiveness of source control strategies in reducing PAHs pollution after the 29th Olympic Games. The sub-cooled liquid vapor pressure (logP(L)(o))-based model and octanol-air partition coefficient (K(oa))-based model were applied based on each seasonal dateset. Regression analysis among log K(P), logP(L)(o) and log K(oa) exhibited high significant correlations for four seasons. Source factors were identified by principle component analysis and contributions were further estimated by multiple linear regression. Pyrogenic sources and coke oven emission were identified as major sources for both the non-heating and heating seasons. As compared with literatures, the mean PAH concentrations before and after the 29th Olympic Games were reduced by more than 60%, indicating that the source control measures were effective for reducing PAHs pollution in Beijing. Copyright © 2011 Elsevier Ltd. All rights reserved.
[Discrimination of Red Tide algae by fluorescence spectra and principle component analysis].
Su, Rong-guo; Hu, Xu-peng; Zhang, Chuan-song; Wang, Xiu-lin
2007-07-01
Fluorescence discrimination technology for 11 species of the Red Tide algae at genus level was constructed by principle component analysis and non-negative least squares. Rayleigh and Raman scattering peaks of 3D fluorescence spectra were eliminated by Delaunay triangulation method. According to the results of Fisher linear discrimination, the first principle component score and the second component score of 3D fluorescence spectra were chosen as discriminant feature and the feature base was established. The 11 algae species were tested, and more than 85% samples were accurately determinated, especially for Prorocentrum donghaiense, Skeletonema costatum, Gymnodinium sp., which have frequently brought Red tide in the East China Sea. More than 95% samples were right discriminated. The results showed that the genus discriminant feature of 3D fluorescence spectra of Red Tide algae given by principle component analysis could work well.
USDA-ARS?s Scientific Manuscript database
Selective principal component regression analysis (SPCR) uses a subset of the original image bands for principal component transformation and regression. For optimal band selection before the transformation, this paper used genetic algorithms (GA). In this case, the GA process used the regression co...
NASA Astrophysics Data System (ADS)
Dai, Xiaoqian; Tian, Jie; Chen, Zhe
2010-03-01
Parametric images can represent both spatial distribution and quantification of the biological and physiological parameters of tracer kinetics. The linear least square (LLS) method is a well-estimated linear regression method for generating parametric images by fitting compartment models with good computational efficiency. However, bias exists in LLS-based parameter estimates, owing to the noise present in tissue time activity curves (TTACs) that propagates as correlated error in the LLS linearized equations. To address this problem, a volume-wise principal component analysis (PCA) based method is proposed. In this method, firstly dynamic PET data are properly pre-transformed to standardize noise variance as PCA is a data driven technique and can not itself separate signals from noise. Secondly, the volume-wise PCA is applied on PET data. The signals can be mostly represented by the first few principle components (PC) and the noise is left in the subsequent PCs. Then the noise-reduced data are obtained using the first few PCs by applying 'inverse PCA'. It should also be transformed back according to the pre-transformation method used in the first step to maintain the scale of the original data set. Finally, the obtained new data set is used to generate parametric images using the linear least squares (LLS) estimation method. Compared with other noise-removal method, the proposed method can achieve high statistical reliability in the generated parametric images. The effectiveness of the method is demonstrated both with computer simulation and with clinical dynamic FDG PET study.
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural mechanics component of the Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the common core component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural resources component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural production component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist…
Applying Regression Analysis to Problems in Institutional Research.
ERIC Educational Resources Information Center
Bohannon, Tom R.
1988-01-01
Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
E-Enterprise for the Environment Conceptual Blueprint: Principles and Components
The State-EPA E-Enterprise Working Group commissioned a Conceptual Blueprint document to define the principles and primary components of E-Enterprise. This Blueprint is the first step in defining E-Enterprise.
21 CFR 814.39 - PMA supplements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... sterilization procedures. (5) Changes in packaging. (6) Changes in the performance or design specifications, circuits, components, ingredients, principle of operation, or physical layout of the device. (7) Extension... the performance or design specifications, circuits, components, ingredients, principles of operation...
A New Multifunctional Sensor for Measuring Concentrations of Ternary Solution
NASA Astrophysics Data System (ADS)
Wei, Guo; Shida, Katsunori
This paper presents a multifunctional sensor with novel structure, which is capable of directly sensing temperature and two physical parameters of solutions, namely ultrasonic velocity and conductivity. By combined measurement of these three measurable parameters, the concentrations of various components in a ternary solution can be simultaneously determined. The structure and operation principle of the sensor are described, and a regression algorithm based on natural cubic spline interpolation and the least square method is adopted to estimate the concentrations. The performances of the proposed sensor are experimentally tested by the use of ternary aqueous solution of sodium chloride and sucrose, which is widely involved in food and beverage industries. This sensor could prove valuable as a process control sensor in industry fields.
Evaluation of driver fatigue on two channels of EEG data.
Li, Wei; He, Qi-chang; Fan, Xiu-min; Fei, Zhi-min
2012-01-11
Electroencephalogram (EEG) data is an effective indicator to evaluate driver fatigue. The 16 channels of EEG data are collected and transformed into three bands (θ, α, and β) in the current paper. First, 12 types of energy parameters are computed based on the EEG data. Then, Grey Relational Analysis (GRA) is introduced to identify the optimal indicator of driver fatigue, after which, the number of significant electrodes is reduced using Kernel Principle Component Analysis (KPCA). Finally, the evaluation model for driver fatigue is established with the regression equation based on the EEG data from two significant electrodes (Fp1 and O1). The experimental results verify that the model is effective in evaluating driver fatigue. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Judo principles and practices: applications to conflict-solving strategies in psychotherapy.
Gleser, J; Brown, P
1988-07-01
Jigoro Kano created judo from ju-jitsu techniques. He realized that the Ju principle of both judo and ju-jitsu as the art of yielding, was that of living and changing. The principle of yielding has been applied in dynamic and directive psychotherapies for many years and was recently linked to the Ju principle in martial arts. After several years of using a modified judo practice as a therapeutic tool, and applying the principle of yielding as a dynamic conflict-solving strategy, the authors discovered judo principles applicable to conflict solving, particularly for regressed and violent psychotic patients.
Manufacturing Methods and Technology Project Summary Reports
1981-06-01
a tough urethane film. The basic principle is to pump two components to a spinning disc, mixing the components just prior to depositing in a well...and check out an electronic target scoring device using developed scientific principles without drastically modifying existing commercial...equipment. The scoring device selected and installed was an Accubar Model ATS-16D using the underlying physics principle of acoustic shock wave propagation
Principles of Quantile Regression and an Application
ERIC Educational Resources Information Center
Chen, Fang; Chalhoub-Deville, Micheline
2014-01-01
Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…
Giesen, E B W; Ding, M; Dalstra, M; van Eijden, T M G J
2003-09-01
As several morphological parameters of cancellous bone express more or less the same architectural measure, we applied principal components analysis to group these measures and correlated these to the mechanical properties. Cylindrical specimens (n = 24) were obtained in different orientations from embalmed mandibular condyles; the angle of the first principal direction and the axis of the specimen, expressing the orientation of the trabeculae, ranged from 10 degrees to 87 degrees. Morphological parameters were determined by a method based on Archimedes' principle and by micro-CT scanning, and the mechanical properties were obtained by mechanical testing. The principal components analysis was used to obtain a set of independent components to describe the morphology. This set was entered into linear regression analyses for explaining the variance in mechanical properties. The principal components analysis revealed four components: amount of bone, number of trabeculae, trabecular orientation, and miscellaneous. They accounted for about 90% of the variance in the morphological variables. The component loadings indicated that a higher amount of bone was primarily associated with more plate-like trabeculae, and not with more or thicker trabeculae. The trabecular orientation was most determinative (about 50%) in explaining stiffness, strength, and failure energy. The amount of bone was second most determinative and increased the explained variance to about 72%. These results suggest that trabecular orientation and amount of bone are important in explaining the anisotropic mechanical properties of the cancellous bone of the mandibular condyle.
Elastohydrodynamic principles applied to the design of helicopter components.
NASA Technical Reports Server (NTRS)
Townsend, D. P.
1973-01-01
Elastohydrodynamic principles affecting the lubrication of transmission components are presented and discussed. Surface temperatures of the transmission bearings and gears affect elastohydrodynamic film thickness. Traction forces and sliding as well as the inlet temperature determine surface temperatures. High contact ratio gears cause increased sliding and may run at higher surface temperatures. Component life is a function of the ratio of elastohydrodynamic film thickness to composite surface roughness. Lubricant starvation reduces elastohydrodynamic film thickness and increases surface temperatures. Methods are presented which allow for the application of elastohydrodynamic principles to transmission design in order to increase system life and reliability.
Elastohydrodynamic principles applied to the design of helicopter components
NASA Technical Reports Server (NTRS)
Townsend, D. P.
1973-01-01
Elastohydrodynamic principles affecting the lubrication of transmission components are presented and discussed. Surface temperature of the transmission bearings and gears affect elastohydrodynamic film thickness. Traction forces and sliding as well as the inlet temperature determine surface temperatures. High contact ratio gears cause increased sliding and may run at higher surface temperatures. Component life is a function of the ratio of elastohydrodynamic film thickness to composite surface roughness. Lubricant starvation reduces elastrohydrodynamic film thickness and increases surface temperatures. Methods are presented which allow for the application of elastohydrodynamic principles to transmission design in order to increase system life and reliability.
Shan, Si-Ming; Luo, Jian-Guang; Huang, Fang; Kong, Ling-Yi
2014-02-01
Panax ginseng C.A. Meyer has been known as a valuable traditional Chinese medicines for thousands years of history. Ginsenosides, the main active constituents, exhibit prominent immunoregulation effect. The present study first describes a holistic method based on chemical characteristic and lymphocyte proliferative capacity to evaluate systematically the quality of P. ginseng in thirty samples from different seasons during 2-6 years. The HPLC fingerprints were evaluated using principle component analysis (PCA) and hierarchical clustering analysis (HCA). The spectrum-efficacy model between HPLC fingerprints and T-lymphocyte proliferative activities was investigated by principal component regression (PCR) and partial least squares (PLS). The results indicated that the growth of the ginsenosides could be grouped into three periods and from August of the fifth year, P. ginseng appeared significant lymphocyte proliferative capacity. Close correlation existed between the spectrum-efficacy relationship and ginsenosides Rb1, Ro, Rc, Rb2 and Re were the main contributive components to the lymphocyte proliferative capacity. This comprehensive strategy, providing reliable and adequate scientific evidence, could be applied to other TCMs to ameliorate their quality control. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gurjanov, A. V.; Zakoldaev, D. A.; Shukalov, A. V.; Zharinov, I. O.
2018-03-01
The task of developing principles of cyber-physical system constitution at the Industry 4.0 company of the item designing components of mechanical assembly production is being studied. The task has been solved by analyzing the components and technologies, which have some practical application in the digital production organization. The list of components has been defined and the authors proposed the scheme of the components and technologies interconnection in the Industry 4.0 of mechanical assembly production to make an uninterrupted manufacturing route of the item designing components with application of some cyber-physical systems.
Cao, Yingjie; Tang, Changyuan; Song, Xianfang; Liu, Changming; Zhang, Yinghua
2013-04-01
In this study, an approach is put forward to study the relationship between changing land use and groundwater nitrate contamination in the Sanjiang Plain. This approach emphasizes the importance of groundwater residence time when relating the nitrates to the changing land use. The principles underlying the approach involve the assessment of groundwater residence time by CFCs and the Vogel age model and the reconstruction of the land use at the groundwater recharge time by interpolation. Nitrate trend analysis shows that nitrates have begun to leach into the aquifers since agricultural activities boomed after the 1950s. Hydrochemical analysis implies that the possible process relating to the nitrate reduction in the groundwater is the oxidation of Fe(ii)-silicates. However, the chemical kinetics of the oxidation of Fe(ii)-silicates is slow, so this denitrification process contributes little to the nitrate variations. Stepwise regression shows that the nitrate concentrations of samples had no direct relationship with the land use at the groundwater sampling time, but had a relatively strong relationship with the land use at the groundwater recharge time. Dry land is recognized as the dominant factor contributing to the elevated concentration of nitrates. The nitrogen isotope for nitrate (δ(15)N-NO3) gives a more direct result of the identification of nitrate sources: the use of manure in agricultural activities. Principle component (PC) regression shows that the process of the dry land exploitation is the major process that controls the nitrate contamination in the Sanjiang Plain.
Evaluating the Quality of Transfer versus Nontransfer Accounting Principles Grades.
ERIC Educational Resources Information Center
Colley, J. R.; And Others
1996-01-01
Using 1989-92 student records from three colleges accepting large numbers of transfers from junior schools into accounting, regression analyses compared grades of transfer and nontransfer students. Quality of accounting principle grades of transfer students was not equivalent to that of nontransfer students. (SK)
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.
2017-11-01
This paper uses matrix calculus techniques to obtain Nonlinear Least Squares Estimator (NLSE), Maximum Likelihood Estimator (MLE) and Linear Pseudo model for nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE. However the present research paper introduces an innovative method to compute the NLSE using principles in multivariate calculus. This study is concerned with very new optimization techniques used to compute MLE and NLSE. Anh [2] derived NLSE and MLE of a heteroscedatistic regression model. Lemcoff [3] discussed a procedure to get linear pseudo model for nonlinear regression model. In this research article a new technique is developed to get the linear pseudo model for nonlinear regression model using multivariate calculus. The linear pseudo model of Edmond Malinvaud [4] has been explained in a very different way in this paper. David Pollard et.al used empirical process techniques to study the asymptotic of the LSE (Least-squares estimation) for the fitting of nonlinear regression function in 2006. In Jae Myung [13] provided a go conceptual for Maximum likelihood estimation in his work “Tutorial on maximum likelihood estimation
Regression Analysis: Instructional Resource for Cost/Managerial Accounting
ERIC Educational Resources Information Center
Stout, David E.
2015-01-01
This paper describes a classroom-tested instructional resource, grounded in principles of active learning and a constructivism, that embraces two primary objectives: "demystify" for accounting students technical material from statistics regarding ordinary least-squares (OLS) regression analysis--material that students may find obscure or…
Hemmateenejad, Bahram; Yazdani, Mahdieh
2009-02-16
Steroids are widely distributed in nature and are found in plants, animals, and fungi in abundance. A data set consists of a diverse set of steroids have been used to develop quantitative structure-electrochemistry relationship (QSER) models for their half-wave reduction potential. Modeling was established by means of multiple linear regression (MLR) and principle component regression (PCR) analyses. In MLR analysis, the QSPR models were constructed by first grouping descriptors and then stepwise selection of variables from each group (MLR1) and stepwise selection of predictor variables from the pool of all calculated descriptors (MLR2). Similar procedure was used in PCR analysis so that the principal components (or features) were extracted from different group of descriptors (PCR1) and from entire set of descriptors (PCR2). The resulted models were evaluated using cross-validation, chance correlation, application to prediction reduction potential of some test samples and accessing applicability domain. Both MLR approaches represented accurate results however the QSPR model found by MLR1 was statistically more significant. PCR1 approach produced a model as accurate as MLR approaches whereas less accurate results were obtained by PCR2 approach. In overall, the correlation coefficients of cross-validation and prediction of the QSPR models resulted from MLR1, MLR2 and PCR1 approaches were higher than 90%, which show the high ability of the models to predict reduction potential of the studied steroids.
Utilizing Virtual Teams in a Management Principles Course
ERIC Educational Resources Information Center
Olson-Buchanan, Julie B.; Rechner, Paula L.; Sanchez, Rudolph J.; Schmidtke, James M.
2007-01-01
Purpose: The purpose of this paper is to describe development of a component in a management principles course to develop university students' virtual team skills. There were several challenges in creating and implementing this new component. The paper aims to describe how these challenges were addressed and discusses outcomes associated with this…
Vulnerable users: deceptive robotics
NASA Astrophysics Data System (ADS)
Collins, Emily C.
2017-07-01
The Principles of Robotics were outlined by the EPSRC in 2010. They are aimed at regulating robots in the real world. This paper represents a response to principle number four which reads: "Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent". The following critique questions the principle's validity by asking whether it is correct as a statement about the nature of robots, and the relationship between robots and people. To achieve this, the principle is broken down into the following two main component statements: (1) "Robots should not be designed in a deceptive way to exploit vulnerable users", and, (2) "Machine nature should be transparent". It is argued that both of the component statements that make up this principle are fundamentally flawed because of the undefined nature of the critical terms: "deceptive", "vulnerable", and "machine nature", and that as such the principle as a whole is misleading.
NASA Astrophysics Data System (ADS)
Metwally, Fadia H.
2008-02-01
The quantitative predictive abilities of the new and simple bivariate spectrophotometric method are compared with the results obtained by the use of multivariate calibration methods [the classical least squares (CLS), principle component regression (PCR) and partial least squares (PLS)], using the information contained in the absorption spectra of the appropriate solutions. Mixtures of the two drugs Nifuroxazide (NIF) and Drotaverine hydrochloride (DRO) were resolved by application of the bivariate method. The different chemometric approaches were applied also with previous optimization of the calibration matrix, as they are useful in simultaneous inclusion of many spectral wavelengths. The results found by application of the bivariate, CLS, PCR and PLS methods for the simultaneous determinations of mixtures of both components containing 2-12 μg ml -1 of NIF and 2-8 μg ml -1 of DRO are reported. Both approaches were satisfactorily applied to the simultaneous determination of NIF and DRO in pure form and in pharmaceutical formulation. The results were in accordance with those given by the EVA Pharma reference spectrophotometric method.
[Research on fast classification based on LIBS technology and principle component analyses].
Yu, Qi; Ma, Xiao-Hong; Wang, Rui; Zhao, Hua-Feng
2014-11-01
Laser-induced breakdown spectroscopy (LIBS) and the principle component analysis (PCA) were combined to study aluminum alloy classification in the present article. Classification experiments were done on thirteen different kinds of standard samples of aluminum alloy which belong to 4 different types, and the results suggested that the LIBS-PCA method can be used to aluminum alloy fast classification. PCA was used to analyze the spectrum data from LIBS experiments, three principle components were figured out that contribute the most, the principle component scores of the spectrums were calculated, and the scores of the spectrums data in three-dimensional coordinates were plotted. It was found that the spectrum sample points show clear convergence phenomenon according to the type of aluminum alloy they belong to. This result ensured the three principle components and the preliminary aluminum alloy type zoning. In order to verify its accuracy, 20 different aluminum alloy samples were used to do the same experiments to verify the aluminum alloy type zoning. The experimental result showed that the spectrum sample points all located in their corresponding area of the aluminum alloy type, and this proved the correctness of the earlier aluminum alloy standard sample type zoning method. Based on this, the identification of unknown type of aluminum alloy can be done. All the experimental results showed that the accuracy of principle component analyses method based on laser-induced breakdown spectroscopy is more than 97.14%, and it can classify the different type effectively. Compared to commonly used chemical methods, laser-induced breakdown spectroscopy can do the detection of the sample in situ and fast with little sample preparation, therefore, using the method of the combination of LIBS and PCA in the areas such as quality testing and on-line industrial controlling can save a lot of time and cost, and improve the efficiency of detection greatly.
Revisiting the Principle of Relative Constancy: Consumer Mass Media Expenditures in Belgium.
ERIC Educational Resources Information Center
Dupagne, Michel; Green, R. Jeffery
1996-01-01
Proposes two new econometric models for testing the principle of relative constancy (PRC). Reports on regression and cointegration analyses conducted with Belgian mass media expenditure data from 1953-91. Suggests that alternative mass media expenditure models should be developed because PRC lacks of economic foundation and sound empirical…
Modeling Success: Using Preenrollment Data to Identify Academically At-Risk Students
ERIC Educational Resources Information Center
Gansemer-Topf, Ann M.; Compton, Jonathan; Wohlgemuth, Darin; Forbes, Greg; Ralston, Ekaterina
2015-01-01
Improving student success and degree completion is one of the core principles of strategic enrollment management. To address this principle, institutional data were used to develop a statistical model to identify academically at-risk students. The model employs multiple linear regression techniques to predict students at risk of earning below a…
The Relative Performance of Female and Male Students in Accounting Principles Classes.
ERIC Educational Resources Information Center
Bouillon, Marvin L.; Doran, B. Michael
1992-01-01
The performance of female and male students in Accounting Principles (AP) I and II was compared by using multiple regression techniques to assess the incremental explanatory effects of gender. Males significantly outperformed females in AP I, contradicting earlier studies. Similar gender of instructor and student was insignificant. (JOW)
Robust regression on noisy data for fusion scaling laws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verdoolaege, Geert, E-mail: geert.verdoolaege@ugent.be; Laboratoire de Physique des Plasmas de l'ERM - Laboratorium voor Plasmafysica van de KMS
2014-11-15
We introduce the method of geodesic least squares (GLS) regression for estimating fusion scaling laws. Based on straightforward principles, the method is easily implemented, yet it clearly outperforms established regression techniques, particularly in cases of significant uncertainty on both the response and predictor variables. We apply GLS for estimating the scaling of the L-H power threshold, resulting in estimates for ITER that are somewhat higher than predicted earlier.
Pethica, Brian A
2007-12-21
As indicated by Gibbs and made explicit by Guggenheim, the electrical potential difference between two regions of different chemical composition cannot be measured. The Gibbs-Guggenheim Principle restricts the use of classical electrostatics in electrochemical theories as thermodynamically unsound with some few approximate exceptions, notably for dilute electrolyte solutions and concomitant low potentials where the linear limit for the exponential of the relevant Boltzmann distribution applies. The Principle invalidates the widespread use of forms of the Poisson-Boltzmann equation which do not include the non-electrostatic components of the chemical potentials of the ions. From a thermodynamic analysis of the parallel plate electrical condenser, employing only measurable electrical quantities and taking into account the chemical potentials of the components of the dielectric and their adsorption at the surfaces of the condenser plates, an experimental procedure to provide exceptions to the Principle has been proposed. This procedure is now reconsidered and rejected. No other related experimental procedures circumvent the Principle. Widely-used theoretical descriptions of electrolyte solutions, charged surfaces and colloid dispersions which neglect the Principle are briefly discussed. MD methods avoid the limitations of the Poisson-Bolzmann equation. Theoretical models which include the non-electrostatic components of the inter-ion and ion-surface interactions in solutions and colloid systems assume the additivity of dispersion and electrostatic forces. An experimental procedure to test this assumption is identified from the thermodynamics of condensers at microscopic plate separations. The available experimental data from Kelvin probe studies are preliminary, but tend against additivity. A corollary to the Gibbs-Guggenheim Principle is enunciated, and the Principle is restated that for any charged species, neither the difference in electrostatic potential nor the sum of the differences in the non-electrostatic components of the thermodynamic potential difference between regions of different chemical compositions can be measured.
NASA Astrophysics Data System (ADS)
Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.
2015-06-01
Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR) with an objective to evaluate the analytical performance of this multivariate approach. The optimization of the number of principle components for minimizing error in PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical (standard error of prediction, percentage relative error of prediction etc.) parameters. The pre-treatment with "NORM" parameter gave the optimum statistical results. The analytical performance of PLSR model improved by increasing the number of laser pulses accumulated per spectrum as well as by truncating the spectrum to appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present in hundreds of ppm were determined with relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~ 2%(2σ).
NASA Technical Reports Server (NTRS)
Murphy, M. R.; Awe, C. A.
1986-01-01
Six professionally active, retired captains rated the coordination and decisionmaking performances of sixteen aircrews while viewing videotapes of a simulated commercial air transport operation. The scenario featured a required diversion and a probable minimum fuel situation. Seven point Likert-type scales were used in rating variables on the basis of a model of crew coordination and decisionmaking. The variables were based on concepts of, for example, decision difficulty, efficiency, and outcome quality; and leader-subordin ate concepts such as person and task-oriented leader behavior, and competency motivation of subordinate crewmembers. Five-front-end variables of the model were in turn dependent variables for a hierarchical regression procedure. The variance in safety performance was explained 46%, by decision efficiency, command reversal, and decision quality. The variance of decision quality, an alternative substantive dependent variable to safety performance, was explained 60% by decision efficiency and the captain's quality of within-crew communications. The variance of decision efficiency, crew coordination, and command reversal were in turn explained 78%, 80%, and 60% by small numbers of preceding independent variables. A principle component, varimax factor analysis supported the model structure suggested by regression analyses.
System principles, mathematical models and methods to ensure high reliability of safety systems
NASA Astrophysics Data System (ADS)
Zaslavskyi, V.
2017-04-01
Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collecting, and processing of information from the systems of monitoring, telemetry, control, etc. They are required to be highly reliable in a view to correctly perform data aggregation, processing and analysis for subsequent decision making support. On design and construction phases of the manufacturing of such systems a various types of components (elements, devices, and subsystems) are considered and used to ensure high reliability of signals detection, noise isolation, and erroneous commands reduction. When generating design solutions for highly reliable systems a number of restrictions and conditions such as types of components and various constrains on resources should be considered. Various types of components perform identical functions; however, they are implemented using diverse principles, approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of tasks performing and eliminates the common cause failure. We consider type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems of highly reliable safety and security systems design. Mathematical models are formalized in a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, algorithms can be used for problem solving of optimal redundancy on the basis of a variety of methods and control devices for fault and defects detection in technical systems, telecommunication networks, and energy systems.
Baschung Pfister, Pierrette; de Bruin, Eling D; Tobler-Ammann, Bernadette C; Maurer, Britta; Knols, Ruud H
2015-10-01
Physical exercise seems to be a safe and effective intervention in patients with inflammatory myopathy (IM). However, the optimal training intervention is not clear. To achieve an optimum training effect, physical exercise training principles must be considered and to replicate research findings, FITT components (frequency, intensity, time, and type) of exercise training should be reported. This review aims to evaluate exercise interventions in studies with IM patients in relation to (1) the application of principles of exercise training, (2) the reporting of FITT components, (3) the adherence of participants to the intervention, and (4) to assess the methodological quality of the included studies. The literature was searched for exercise studies in IM patients. Data were extracted to evaluate the application of the training principles, the reporting of and the adherence to the exercise prescription. The Downs and Black checklist was used to assess methodological quality of the included studies. From the 14 included studies, four focused on resistance, two on endurance, and eight on combined training. In terms of principles of exercise training, 93 % reported specificity, 50 % progression and overload, and 79 % initial values. Reversibility and diminishing returns were never reported. Six articles reported all FITT components in the prescription of the training though no study described adherence to all of these components. Incomplete application of the exercise training principles and insufficient reporting of the exercise intervention prescribed and completed hamper the reproducibility of the intervention and the ability to determine the optimal dose of exercise.
Confidence Intervals for Squared Semipartial Correlation Coefficients: The Effect of Nonnormality
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2010-01-01
The increase in the squared multiple correlation coefficient ([delta]R[superscript 2]) associated with a variable in a regression equation is a commonly used measure of importance in regression analysis. Algina, Keselman, and Penfield found that intervals based on asymptotic principles were typically very inaccurate, even though the sample size…
Grades, Gender, and Encouragement: A Regression Discontinuity Analysis
ERIC Educational Resources Information Center
Owen, Ann L.
2010-01-01
The author employs a regression discontinuity design to provide direct evidence on the effects of grades earned in economics principles classes on the decision to major in economics and finds a differential effect for male and female students. Specifically, for female students, receiving an A for a final grade in the first economics class is…
Deriving the Regression Line with Algebra
ERIC Educational Resources Information Center
Quintanilla, John A.
2017-01-01
Exploration with spreadsheets and reliance on previous skills can lead students to determine the line of best fit. To perform linear regression on a set of data, students in Algebra 2 (or, in principle, Algebra 1) do not have to settle for using the mysterious "black box" of their graphing calculators (or other classroom technologies).…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
The central purpose of this study was to develop and field test written criterion-referenced tests for the ornamental horticulture component of applied principles of agribusiness and natural resources occupations programs. The test items were to be used by secondary agricultural education students in Florida. Based upon the objectives identified…
A quantitative model for designing keyboard layout.
Shieh, K K; Lin, C C
1999-02-01
This study analyzed the quantitative relationship between keytapping times and ergonomic principles in typewriting skills. Keytapping times and key-operating characteristics of a female subject typing on the Qwerty and Dvorak keyboards for six weeks each were collected and analyzed. The results showed that characteristics of the typed material and the movements of hands and fingers were significantly related to keytapping times. The most significant factors affecting keytapping times were association frequency between letters, consecutive use of the same hand or finger, and the finger used. A regression equation for relating keytapping times to ergonomic principles was fitted to the data. Finally, a protocol for design of computerized keyboard layout based on the regression equation was proposed.
Accounting Principles 30G. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This curriculum guide was developed for a senior-level introductory accounting course for students in high schools in Manitoba. The course introduces Canadian accounting principles and practices; it applies Generally Accepted Accounting Principles (GAAP) to introductory financial accounting. The guide includes the following components: (1) an…
Default Bayes Factors for Model Selection in Regression
ERIC Educational Resources Information Center
Rouder, Jeffrey N.; Morey, Richard D.
2012-01-01
In this article, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible…
NASA Astrophysics Data System (ADS)
Polat, Esra; Gunay, Suleyman
2013-10-01
One of the problems encountered in Multiple Linear Regression (MLR) is multicollinearity, which causes the overestimation of the regression parameters and increase of the variance of these parameters. Hence, in case of multicollinearity presents, biased estimation procedures such as classical Principal Component Regression (CPCR) and Partial Least Squares Regression (PLSR) are then performed. SIMPLS algorithm is the leading PLSR algorithm because of its speed, efficiency and results are easier to interpret. However, both of the CPCR and SIMPLS yield very unreliable results when the data set contains outlying observations. Therefore, Hubert and Vanden Branden (2003) have been presented a robust PCR (RPCR) method and a robust PLSR (RPLSR) method called RSIMPLS. In RPCR, firstly, a robust Principal Component Analysis (PCA) method for high-dimensional data on the independent variables is applied, then, the dependent variables are regressed on the scores using a robust regression method. RSIMPLS has been constructed from a robust covariance matrix for high-dimensional data and robust linear regression. The purpose of this study is to show the usage of RPCR and RSIMPLS methods on an econometric data set, hence, making a comparison of two methods on an inflation model of Turkey. The considered methods have been compared in terms of predictive ability and goodness of fit by using a robust Root Mean Squared Error of Cross-validation (R-RMSECV), a robust R2 value and Robust Component Selection (RCS) statistic.
Brown, C. Erwin
1993-01-01
Correlation analysis in conjunction with principal-component and multiple-regression analyses were applied to laboratory chemical and petrographic data to assess the usefulness of these techniques in evaluating selected physical and hydraulic properties of carbonate-rock aquifers in central Pennsylvania. Correlation and principal-component analyses were used to establish relations and associations among variables, to determine dimensions of property variation of samples, and to filter the variables containing similar information. Principal-component and correlation analyses showed that porosity is related to other measured variables and that permeability is most related to porosity and grain size. Four principal components are found to be significant in explaining the variance of data. Stepwise multiple-regression analysis was used to see how well the measured variables could predict porosity and (or) permeability for this suite of rocks. The variation in permeability and porosity is not totally predicted by the other variables, but the regression is significant at the 5% significance level. ?? 1993.
Principles of recruitment and retention in clinical trials.
Aitken, Leanne; Gallagher, Robyn; Madronio, Christine
2003-12-01
Efficient and effective recruitment and retention of participants is the largest single component of the study workload and forms an essential component in the conduct of clinical trials. In this paper, we present five principles to guide the processes of both recruitment and retention. These principles include the selection of an appropriate population to adequately answer the research question, followed by the establishment of a sampling process that accurately represents that population. Creation of systematic and effective recruitment mechanisms should be supported by implementation of follow-up mechanisms that promote participant retention. Finally, all activities related to recruitment and retention must be conducted within the framework of ethics and privacy regulations. Adherence to these principles will assist the researcher in achieving the goals of the study within the available resources.
A Nonlinear Model for Gene-Based Gene-Environment Interaction.
Sa, Jian; Liu, Xu; He, Tao; Liu, Guifen; Cui, Yuehua
2016-06-04
A vast amount of literature has confirmed the role of gene-environment (G×E) interaction in the etiology of complex human diseases. Traditional methods are predominantly focused on the analysis of interaction between a single nucleotide polymorphism (SNP) and an environmental variable. Given that genes are the functional units, it is crucial to understand how gene effects (rather than single SNP effects) are influenced by an environmental variable to affect disease risk. Motivated by the increasing awareness of the power of gene-based association analysis over single variant based approach, in this work, we proposed a sparse principle component regression (sPCR) model to understand the gene-based G×E interaction effect on complex disease. We first extracted the sparse principal components for SNPs in a gene, then the effect of each principal component was modeled by a varying-coefficient (VC) model. The model can jointly model variants in a gene in which their effects are nonlinearly influenced by an environmental variable. In addition, the varying-coefficient sPCR (VC-sPCR) model has nice interpretation property since the sparsity on the principal component loadings can tell the relative importance of the corresponding SNPs in each component. We applied our method to a human birth weight dataset in Thai population. We analyzed 12,005 genes across 22 chromosomes and found one significant interaction effect using the Bonferroni correction method and one suggestive interaction. The model performance was further evaluated through simulation studies. Our model provides a system approach to evaluate gene-based G×E interaction.
Side Effect Perceptions and their Impact on Treatment Decisions in Women
Waters, Erika A.; Pachur, Thorsten; Colditz, Graham A.
2016-01-01
Background Side effects prompt some patients to forego otherwise-beneficial therapies. This study explored which characteristics make side effects particularly aversive. Methods We used a psychometric approach, originating from research on risk perception, to identify the factors (or components) underlying side effect perceptions. Women (N=149) aged 40–74 were recruited from a patient registry to complete an online experiment. Participants were presented with hypothetical scenarios in which an effective and necessary medication conferred a small risk of a single side effect (e.g., nausea, dizziness). They rated a broad range of side effects on several characteristics (e.g., embarrassing, treatable). In addition, we collected four measures of aversiveness for each side effect: choosing to take the medication, willingness to pay to avoid the side effect (WTP), negative affective attitude associated with the side effect, and how each side effect ranks among others in terms of undesirability. A principle-components analysis (PCA) was used to identify the components underlying side effect perceptions. Then, for each aversiveness measure separately, regression analyses were used to determine which components predicted differences in aversiveness among the side effects. Results The PCA revealed four components underlying side effect perceptions: affective challenge (e.g., frightening), social challenge (e.g., disfiguring), physical challenge (e.g., painful), and familiarity (e.g., common). Side effects perceived as affectively and physically challenging elicited the highest levels of aversiveness across all four measures. Conclusions Understanding what side effect characteristics are most aversive may inform interventions to improve medical decisions and facilitate the translation of novel biomedical therapies into clinical practice. PMID:27216581
Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul
2016-01-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES.
Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D
2008-05-12
This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization requires dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable like severity of the disorder. This information is utilized as a reference for measuring a component's discriminant power after principle component decomposition. After sorting each component according to its discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies.
2015-03-01
wine warfare NCC naval component commander NFC numbered fleet commander NM nautical mile NMP Navy mission planner NOP Navy...principles for naval component commanders ( NCCs ), numbered fleet commanders (NFCs) or joint force maritime component commanders (JFMCCs) and their
On state-of-charge determination for lithium-ion batteries
NASA Astrophysics Data System (ADS)
Li, Zhe; Huang, Jun; Liaw, Bor Yann; Zhang, Jianbo
2017-04-01
Accurate estimation of state-of-charge (SOC) of a battery through its life remains challenging in battery research. Although improved precisions continue to be reported at times, almost all are based on regression methods empirically, while the accuracy is often not properly addressed. Here, a comprehensive review is set to address such issues, from fundamental principles that are supposed to define SOC to methodologies to estimate SOC for practical use. It covers topics from calibration, regression (including modeling methods) to validation in terms of precision and accuracy. At the end, we intend to answer the following questions: 1) can SOC estimation be self-adaptive without bias? 2) Why Ah-counting is a necessity in almost all battery-model-assisted regression methods? 3) How to establish a consistent framework of coupling in multi-physics battery models? 4) To assess the accuracy in SOC estimation, statistical methods should be employed to analyze factors that contribute to the uncertainty. We hope, through this proper discussion of the principles, accurate SOC estimation can be widely achieved.
Basal area increment and growth efficiency as functions of canopy dynamics and stem mechanics
Thomas J. Dean
2004-01-01
Crown and canopy structurecorrelate with growth efficiency and also determine stem size and taper as described by the uniform stress principle of stem formation. A regression model was derived from this principle that expresses basal area increment in terms of the amount and vertical distribution of leaf area and change in these variables during a growth period. This...
Liu, Yu; Xi, Du-Gang; Li, Zhao-Liang
2015-01-01
Predicting the levels of chlorophyll-a (Chl-a) is a vital component of water quality management, which ensures that urban drinking water is safe from harmful algal blooms. This study developed a model to predict Chl-a levels in the Yuqiao Reservoir (Tianjin, China) biweekly using water quality and meteorological data from 1999-2012. First, six artificial neural networks (ANNs) and two non-ANN methods (principal component analysis and the support vector regression model) were compared to determine the appropriate training principle. Subsequently, three predictors with different input variables were developed to examine the feasibility of incorporating meteorological factors into Chl-a prediction, which usually only uses water quality data. Finally, a sensitivity analysis was performed to examine how the Chl-a predictor reacts to changes in input variables. The results were as follows: first, ANN is a powerful predictive alternative to the traditional modeling techniques used for Chl-a prediction. The back program (BP) model yields slightly better results than all other ANNs, with the normalized mean square error (NMSE), the correlation coefficient (Corr), and the Nash-Sutcliffe coefficient of efficiency (NSE) at 0.003 mg/l, 0.880 and 0.754, respectively, in the testing period. Second, the incorporation of meteorological data greatly improved Chl-a prediction compared to models solely using water quality factors or meteorological data; the correlation coefficient increased from 0.574-0.686 to 0.880 when meteorological data were included. Finally, the Chl-a predictor is more sensitive to air pressure and pH compared to other water quality and meteorological variables.
Plöchl, Michael; Ossandón, José P.; König, Peter
2012-01-01
Eye movements introduce large artifacts to electroencephalographic recordings (EEG) and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded, hence in standard EEG-paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods including regression and blind source separation have been proposed. Yet, there is no automated standard procedure established. By simultaneously recording eye movements and 64-channel-EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials and eyelid artifacts, and study their interrelations during different types of eye- and eyelid movements. In concordance with earlier studies our results confirm that these artifacts arise from different independent sources and that depending on electrode site, gaze direction, and choice of reference these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm, which uses eye tracker information to objectively identify eye-artifact related ICA-components (ICs) in an automated manner. In the data presented here, the algorithm performed very similar to human experts when those were given both, the topographies of the ICs and their respective activations in a large amount of trials. Moreover it performed more reliable and almost twice as effective than human experts when those had to base their decision on IC topographies only. Furthermore, a receiver operating characteristic (ROC) analysis demonstrated an optimal balance of false positive and false negative at an area under curve (AUC) of more than 0.99. Removing the automatically detected ICs from the data resulted in removal or substantial suppression of ocular artifacts including microsaccadic spike potentials, while the relevant neural signal remained unaffected. In conclusion the present work aims at a better understanding of individual eye movement artifacts, their interrelations and the respective implications for eye artifact correction. Additionally, the proposed ICA-procedure provides a tool for optimized detection and correction of eye movement-related artifact components. PMID:23087632
Enhancement of multispectral thermal infrared images - Decorrelation contrast stretching
NASA Technical Reports Server (NTRS)
Gillespie, Alan R.
1992-01-01
Decorrelation contrast stretching is an effective method for displaying information from multispectral thermal infrared (TIR) images. The technique involves transformation of the data to principle components ('decorrelation'), independent contrast 'stretching' of data from the new 'decorrelated' image bands, and retransformation of the stretched data back to the approximate original axes, based on the inverse of the principle component rotation. The enhancement is robust in that colors of the same scene components are similar in enhanced images of similar scenes, or the same scene imaged at different times. Decorrelation contrast stretching is reviewed in the context of other enhancements applied to TIR images.
Just a fad? Gamification in health and fitness apps.
Lister, Cameron; West, Joshua H; Cannon, Ben; Sax, Tyler; Brodegard, David
2014-08-04
Gamification has been a predominant focus of the health app industry in recent years. However, to our knowledge, there has yet to be a review of gamification elements in relation to health behavior constructs, or insight into the true proliferation of gamification in health apps. The objective of this study was to identify the extent to which gamification is used in health apps, and analyze gamification of health and fitness apps as a potential component of influence on a consumer's health behavior. An analysis of health and fitness apps related to physical activity and diet was conducted among apps in the Apple App Store in the winter of 2014. This analysis reviewed a sample of 132 apps for the 10 effective game elements, the 6 core components of health gamification, and 13 core health behavior constructs. A regression analysis was conducted in order to measure the correlation between health behavior constructs, gamification components, and effective game elements. This review of the most popular apps showed widespread use of gamification principles, but low adherence to any professional guidelines or industry standard. Regression analysis showed that game elements were associated with gamification (P<.001). Behavioral theory was associated with gamification (P<.05), but not game elements, and upon further analysis gamification was only associated with composite motivational behavior scores (P<.001), and not capacity or opportunity/trigger. This research, to our knowledge, represents the first comprehensive review of gamification use in health and fitness apps, and the potential to impact health behavior. The results show that use of gamification in health and fitness apps has become immensely popular, as evidenced by the number of apps found in the Apple App Store containing at least some components of gamification. This shows a lack of integrating important elements of behavioral theory from the app industry, which can potentially impact the efficacy of gamification apps to change behavior. Apps represent a very promising, burgeoning market and landscape in which to disseminate health behavior change interventions. Initial results show an abundant use of gamification in health and fitness apps, which necessitates the in-depth study and evaluation of the potential of gamification to change health behaviors.
Demonstration of Human-Autonomy Teaming Principles
NASA Technical Reports Server (NTRS)
Shively, Robert Jay
2016-01-01
Known problems with automation include lack of mode awareness, automation brittleness, and risk of miscalibrated trust. Human-Autonomy Teaming (HAT) is essential for improving these problems. We have identified some critical components of HAT and ran a part-task study to introduce these components to a ground station that supports flight following of multiple aircraft. Our goal was to demonstrate, evaluate, and refine HAT principles. This presentation provides a brief summary of the study and initial findings.
Gamal El-Dien, Omnia; Ratcliffe, Blaise; Klápště, Jaroslav; Chen, Charles; Porth, Ilga; El-Kassaby, Yousry A
2015-05-09
Genomic selection (GS) in forestry can substantially reduce the length of breeding cycle and increase gain per unit time through early selection and greater selection intensity, particularly for traits of low heritability and late expression. Affordable next-generation sequencing technologies made it possible to genotype large numbers of trees at a reasonable cost. Genotyping-by-sequencing was used to genotype 1,126 Interior spruce trees representing 25 open-pollinated families planted over three sites in British Columbia, Canada. Four imputation algorithms were compared (mean value (MI), singular value decomposition (SVD), expectation maximization (EM), and a newly derived, family-based k-nearest neighbor (kNN-Fam)). Trees were phenotyped for several yield and wood attributes. Single- and multi-site GS prediction models were developed using the Ridge Regression Best Linear Unbiased Predictor (RR-BLUP) and the Generalized Ridge Regression (GRR) to test different assumption about trait architecture. Finally, using PCA, multi-trait GS prediction models were developed. The EM and kNN-Fam imputation methods were superior for 30 and 60% missing data, respectively. The RR-BLUP GS prediction model produced better accuracies than the GRR indicating that the genetic architecture for these traits is complex. GS prediction accuracies for multi-site were high and better than those of single-sites while multi-site predictability produced the lowest accuracies reflecting type-b genetic correlations and deemed unreliable. The incorporation of genomic information in quantitative genetics analyses produced more realistic heritability estimates as half-sib pedigree tended to inflate the additive genetic variance and subsequently both heritability and gain estimates. Principle component scores as representatives of multi-trait GS prediction models produced surprising results where negatively correlated traits could be concurrently selected for using PCA2 and PCA3. The application of GS to open-pollinated family testing, the simplest form of tree improvement evaluation methods, was proven to be effective. Prediction accuracies obtained for all traits greatly support the integration of GS in tree breeding. While the within-site GS prediction accuracies were high, the results clearly indicate that single-site GS models ability to predict other sites are unreliable supporting the utilization of multi-site approach. Principle component scores provided an opportunity for the concurrent selection of traits with different phenotypic optima.
EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES
Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D.
2009-01-01
This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization requires dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable like severity of the disorder. This information is utilized as a reference for measuring a component’s discriminant power after principle component decomposition. After sorting each component according to its discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies. PMID:20582334
Rezaei-Hachesu, Peyman; Pesianian, Esmaeil; Mohammadian, Mohsen
2016-02-01
Radiology information system (RIS) in order to reduce workload and improve the quality of services must be well-designed. Heuristic evaluation is one of the methods that understand usability problems with the least time, cost and resources. The aim of present study is to evaluate the usability of RISs in hospitals. This is a cross-sectional descriptive study (2015) that uses heuristic evaluation method to evaluate the usability of RIS used in 3 hospitals of Tabriz city. The data are collected using a standard checklist based on 13 principles of Nielsen Heuristic evaluation method. Usability of RISs was investigated based on the number of components observed from Nielsen principles and problems of usability based on the number of non-observed components as well as non-existent or unrecognizable components. by evaluation of RISs in each of the hospitals 1, 2 and 3, total numbers of observed components were obtained as 173, 202 and 196, respectively. It was concluded that the usability of RISs in the studied population, on average and with observing 190 components of the 291 components related to the 13 principles of Nielsen is 65.41 %. Furthermore, problems of usability were obtained as 26.35%. The established and visible nature of some components such as response time of application, visual feedbacks, colors, view and design and arrangement of software objects cause more attention to these components as principal components in designing UI software. Also, incorrect analysis before system design leads to a lack of attention to secondary needs like Help software and security issues.
Support vector regression to predict porosity and permeability: Effect of sample size
NASA Astrophysics Data System (ADS)
Al-Anazi, A. F.; Gates, I. D.
2012-02-01
Porosity and permeability are key petrophysical parameters obtained from laboratory core analysis. Cores, obtained from drilled wells, are often few in number for most oil and gas fields. Porosity and permeability correlations based on conventional techniques such as linear regression or neural networks trained with core and geophysical logs suffer poor generalization to wells with only geophysical logs. The generalization problem of correlation models often becomes pronounced when the training sample size is small. This is attributed to the underlying assumption that conventional techniques employing the empirical risk minimization (ERM) inductive principle converge asymptotically to the true risk values as the number of samples increases. In small sample size estimation problems, the available training samples must span the complexity of the parameter space so that the model is able both to match the available training samples reasonably well and to generalize to new data. This is achieved using the structural risk minimization (SRM) inductive principle by matching the capability of the model to the available training data. One method that uses SRM is support vector regression (SVR) network. In this research, the capability of SVR to predict porosity and permeability in a heterogeneous sandstone reservoir under the effect of small sample size is evaluated. Particularly, the impact of Vapnik's ɛ-insensitivity loss function and least-modulus loss function on generalization performance was empirically investigated. The results are compared to the multilayer perception (MLP) neural network, a widely used regression method, which operates under the ERM principle. The mean square error and correlation coefficients were used to measure the quality of predictions. The results demonstrate that SVR yields consistently better predictions of the porosity and permeability with small sample size than the MLP method. Also, the performance of SVR depends on both kernel function type and loss functions used.
Sun, Yi; Arning, Martin; Bochmann, Frank; Börger, Jutta; Heitmann, Thomas
2018-06-01
The Occupational Safety and Health Monitoring and Assessment Tool (OSH-MAT) is a practical instrument that is currently used in the German woodworking and metalworking industries to monitor safety conditions at workplaces. The 12-item scoring system has three subscales rating technical, organizational, and personnel-related conditions in a company. Each item has a rating value ranging from 1 to 9, with higher values indicating higher standard of safety conditions. The reliability of this instrument was evaluated in a cross-sectional survey among 128 companies and its validity among 30,514 companies. The inter-rater reliability of the instrument was examined independently and simultaneously by two well-trained safety engineers. Agreement between the double ratings was quantified by the intraclass correlation coefficient and absolute agreement of the rating values. The content validity of the OSH-MAT was evaluated by quantifying the association between OSH-MAT values and 5-year average injury rates by Poisson regression analysis adjusted for the size of the companies and industrial sectors. The construct validity of OSH-MAT was examined by principle component factor analysis. Our analysis indicated good to very good inter-rater reliability (intraclass correlation coefficient = 0.64-0.74) of OSH-MAT values with an absolute agreement of between 72% and 81%. Factor analysis identified three component subscales that met exactly the structure theory of this instrument. The Poisson regression analysis demonstrated a statistically significant exposure-response relationship between OSH-MAT values and the 5-year average injury rates. These analyses indicate that OSH-MAT is a valid and reliable instrument that can be used effectively to monitor safety conditions at workplaces.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC.
Four aspects of preassembled building components are discussed--(1) attitudes on preassembled components, (2) principles of preassembled components construction, (3) structural component case studies, and (4) mechanical component case studies. In section 1, various views on preassembled components are discussed including--(1) the architect's view,…
Analysis of exogenous components of mortality risks.
Blinkin, V L
1998-04-01
A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate Dij (where i corresponds to calendar time (year or interval of years) and j denotes the number of corresponding age group) was represented as Dij = Aj + BiCj, and unknown quantities Aj, Bi, and Cj were estimated by a special procedure using the least-squares principle. The coefficients of variation do not exceed 10%. It is shown that the term Aj can be interpreted as the endogenous and the second term BiCj as the exogenous components of the death rate. The aggregate of endogenous components Aj can be described by a regression function, corresponding to the Gompertz-Makeham law, A(tau) = gamma + beta x e alpha tau, where gamma, beta, and alpha are constants, tau is age, A(tau) [symbol: see text] tau = tau j identical to A(tau j) identical to Aj and tau j is the value of age tau in jth age group. The coefficients of variation for such a representation does not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980-1995 shows that since 1992 all components of exogenous risk in the Moscow population had been increasing up to 1994. The greatest contribution to the total level of exogenous risk was lethal diseases, and their death rate was 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of exogenous mortality risk change during 1990-1994 in the Moscow population and in the Russian population without Moscow had been identical: the risk had been increasing and its value in the Russian population had been higher than that in the Moscow population.
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Ma, Jun; Li, Zhao; Zhou, Xiaohui; Zhou, Boye
2018-05-01
The analysis of the correlations between the noise in different components of GPS stations has positive significance to those trying to obtain more accurate uncertainty of velocity with respect to station motion. Previous research into noise in GPS position time series focused mainly on single component evaluation, which affects the acquisition of precise station positions, the velocity field, and its uncertainty. In this study, before and after removing the common-mode error (CME), we performed one-dimensional linear regression analysis of the noise amplitude vectors in different components of 126 GPS stations with a combination of white noise, flicker noise, and random walking noise in Southern California. The results show that, on the one hand, there are above-moderate degrees of correlation between the white noise amplitude vectors in all components of the stations before and after removal of the CME, while the correlations between flicker noise amplitude vectors in horizontal and vertical components are enhanced from un-correlated to moderately correlated by removing the CME. On the other hand, the significance tests show that, all of the obtained linear regression equations, which represent a unique function of the noise amplitude in any two components, are of practical value after removing the CME. According to the noise amplitude estimates in two components and the linear regression equations, more accurate noise amplitudes can be acquired in the two components.
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
Design and Implementation of a REST API for the ?Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
NASA Astrophysics Data System (ADS)
Zharinov, I. O.; Zharinov, O. O.
2017-12-01
The problem of the research is concerned with quantitative analysis of influence of technological variation of the screen color profile parameters on chromaticity coordinates of the displayed image. Some mathematical expressions which approximate the two-dimensional distribution of chromaticity coordinates of an image, which is displayed on the screen with a three-component color formation principle were proposed. Proposed mathematical expressions show the way to development of correction techniques to improve reproducibility of the colorimetric features of displays.
Recent advances in Ni-H2 technology at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Gonzalezsanabria, O. D.; Britton, D. L.; Smithrick, J. J.; Reid, M. A.
1986-01-01
The NASA Lewis Research Center has concentrated its efforts on advancing the Ni-H2 system technology for low Earth orbit applications. Component technology as well as the design principles were studied in an effort to understand the system behavior and failure mechanisms in order to increase performance and extend cycle life. The design principles were previously addressed. The component development is discussed, in particular the separator and nickel electrode and how these efforts will advance the Ni-H2 system technology.
Effectiveness of a worksite mindfulness-based multi-component intervention on lifestyle behaviors
2014-01-01
Introduction Overweight and obesity are associated with an increased risk of morbidity. Mindfulness training could be an effective strategy to optimize lifestyle behaviors related to body weight gain. The aim of this study was to evaluate the effectiveness of a worksite mindfulness-based multi-component intervention on vigorous physical activity in leisure time, sedentary behavior at work, fruit intake and determinants of these behaviors. The control group received information on existing lifestyle behavior- related facilities that were already available at the worksite. Methods In a randomized controlled trial design (n = 257), 129 workers received a mindfulness training, followed by e-coaching, lunch walking routes and fruit. Outcome measures were assessed at baseline and after 6 and 12 months using questionnaires. Physical activity was also measured using accelerometers. Effects were analyzed using linear mixed effect models according to the intention-to-treat principle. Linear regression models (complete case analyses) were used as sensitivity analyses. Results There were no significant differences in lifestyle behaviors and determinants of these behaviors between the intervention and control group after 6 or 12 months. The sensitivity analyses showed effect modification for gender in sedentary behavior at work at 6-month follow-up, although the main analyses did not. Conclusions This study did not show an effect of a worksite mindfulness-based multi-component intervention on lifestyle behaviors and behavioral determinants after 6 and 12 months. The effectiveness of a worksite mindfulness-based multi-component intervention as a health promotion intervention for all workers could not be established. PMID:24467802
The principle of low frictional torque in the Charnley total hip replacement.
Wroblewski, B M; Siney, P D; Fleming, P A
2009-07-01
The design of the Charnley total hip replacement follows the principle of low frictional torque. It is based on the largest possible difference between the radius of the femoral head and that of the outer aspect of the acetabular component. The aim is to protect the bone-cement interface by movement taking place at the smaller radius, the articulation. This is achieved in clinical practice by a 22.225 mm diameter head articulating with a 40 mm or 43 mm diameter acetabular component of ultra-high molecular weight polyethylene. We compared the incidence of aseptic loosening of acetabular components with an outer diameter of 40 mm and 43 mm at comparable depths of penetration with a mean follow-up of 17 years (1 to 40). In cases with no measurable wear none of the acetabular components were loose. With increasing acetabular penetration there was an increased incidence of aseptic loosening which reflected the difference in the external radii, with 1.5% at 1 mm, 8.8% at 2 mm, 9.7% at 3 mm and 9.6% at 4 mm of penetration in favour of the larger 43 mm acetabular component. Our findings support the Charnley principle of low frictional torque. The level of the benefit is in keeping with the predicted values.
Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R
2016-12-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how 3 common SLT algorithms-supervised principal components, regularization, and boosting-can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach-or perhaps because of them-SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Wang, Lu; Qu, Haibin
2016-03-01
A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space of SPE-HPLC-UV/ELSD was then constructed by calculated Monte Carlo probability. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight terms of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were accomplished at a selected working point. These results revealed that the QbD principles were suitable in the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
Common evolutionary trends underlie the four-bar linkage systems of sunfish and mantis shrimp.
Hu, Yinan; Nelson-Maney, Nathan; Anderson, Philip S L
2017-05-01
Comparative biomechanics offers an opportunity to explore the evolution of disparate biological systems that share common underlying mechanics. Four-bar linkage modeling has been applied to various biological systems such as fish jaws and crustacean appendages to explore the relationship between biomechanics and evolutionary diversification. Mechanical sensitivity states that the functional output of a mechanical system will show differential sensitivity to changes in specific morphological components. We document similar patterns of mechanical sensitivity in two disparate four-bar systems from different phyla: the opercular four-bar system in centrarchid fishes and the raptorial appendage of stomatopods. We built dynamic linkage models of 19 centrarchid and 36 stomatopod species and used phylogenetic generalized least squares regression (PGLS) to compare evolutionary shifts in linkage morphology and mechanical outputs derived from the models. In both systems, the kinematics of the four-bar mechanism show significant evolutionary correlation with the output link, while travel distance of the output arm is correlated with the coupler link. This common evolutionary pattern seen in both fish and crustacean taxa is a potential consequence of the mechanical principles underlying four-bar systems. Our results illustrate the potential influence of physical principles on morphological evolution across biological systems with different structures, behaviors, and ecologies. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.
Butler, Rebecca A.
2014-01-01
Stroke aphasia is a multidimensional disorder in which patient profiles reflect variation along multiple behavioural continua. We present a novel approach to separating the principal aspects of chronic aphasic performance and isolating their neural bases. Principal components analysis was used to extract core factors underlying performance of 31 participants with chronic stroke aphasia on a large, detailed battery of behavioural assessments. The rotated principle components analysis revealed three key factors, which we labelled as phonology, semantic and executive/cognition on the basis of the common elements in the tests that loaded most strongly on each component. The phonology factor explained the most variance, followed by the semantic factor and then the executive-cognition factor. The use of principle components analysis rendered participants’ scores on these three factors orthogonal and therefore ideal for use as simultaneous continuous predictors in a voxel-based correlational methodology analysis of high resolution structural scans. Phonological processing ability was uniquely related to left posterior perisylvian regions including Heschl’s gyrus, posterior middle and superior temporal gyri and superior temporal sulcus, as well as the white matter underlying the posterior superior temporal gyrus. The semantic factor was uniquely related to left anterior middle temporal gyrus and the underlying temporal stem. The executive-cognition factor was not correlated selectively with the structural integrity of any particular region, as might be expected in light of the widely-distributed and multi-functional nature of the regions that support executive functions. The identified phonological and semantic areas align well with those highlighted by other methodologies such as functional neuroimaging and neurostimulation. The use of principle components analysis allowed us to characterize the neural bases of participants’ behavioural performance more robustly and selectively than the use of raw assessment scores or diagnostic classifications because principle components analysis extracts statistically unique, orthogonal behavioural components of interest. As such, in addition to improving our understanding of lesion–symptom mapping in stroke aphasia, the same approach could be used to clarify brain–behaviour relationships in other neurological disorders. PMID:25348632
Study on fast discrimination of varieties of yogurt using Vis/NIR-spectroscopy
NASA Astrophysics Data System (ADS)
He, Yong; Feng, Shuijuan; Deng, Xunfei; Li, Xiaoli
2006-09-01
A new approach for discrimination of varieties of yogurt by means of VisINTR-spectroscopy was present in this paper. Firstly, through the principal component analysis (PCA) of spectroscopy curves of 5 typical kinds of yogurt, the clustering of yogurt varieties was processed. The analysis results showed that the cumulate reliabilities of PC1 and PC2 (the first two principle components) were more than 98.956%, and the cumulate reliabilities from PC1 to PC7 (the first seven principle components) was 99.97%. Secondly, a discrimination model of Artificial Neural Network (ANN-BP) was set up. The first seven principles components of the samples were applied as ANN-BP inputs, and the value of type of yogurt were applied as outputs, then the three-layer ANN-BP model was build. In this model, every variety yogurt includes 27 samples, the total number of sample is 135, and the rest 25 samples were used as prediction set. The results showed the distinguishing rate of the five yogurt varieties was 100%. It presented that this model was reliable and practicable. So a new approach for the rapid and lossless discrimination of varieties of yogurt was put forward.
Developing a patient-led electronic feedback system for quality and safety within Renal PatientView.
Giles, Sally J; Reynolds, Caroline; Heyhoe, Jane; Armitage, Gerry
2017-03-01
It is increasingly acknowledged that patients can provide direct feedback about the quality and safety of their care through patient reporting systems. The aim of this study was to explore the feasibility of patients, healthcare professionals and researchers working in partnership to develop a patient-led quality and safety feedback system within an existing electronic health record (EHR), known as Renal PatientView (RPV). Phase 1 (inception) involved focus groups (n = 9) and phase 2 (requirements) involved cognitive walkthroughs (n = 34) and 1:1 qualitative interviews (n = 34) with patients and healthcare professionals. A Joint Services Expert Panel (JSP) was convened to review the findings from phase 1 and agree the core principles and components of the system prototype. Phase 1 data were analysed using a thematic approach. Data from phase 1 were used to inform the design of the initial system prototype. Phase 2 data were analysed using the components of heuristic evaluation, resulting in a list of core principles and components for the final system prototype. Phase 1 identified four main barriers and facilitators to patients feeding back on quality and safety concerns. In phase 2, the JSP agreed that the system should be based on seven core principles and components. Stakeholders were able to work together to identify core principles and components for an electronic patient quality and safety feedback system in renal services. Tensions arose due to competing priorities, particularly around anonymity and feedback. Careful consideration should be given to the feasibility of integrating a novel element with differing priorities into an established system with existing functions and objectives. © 2016 European Dialysis and Transplant Nurses Association/European Renal Care Association.
ERIC Educational Resources Information Center
Brown, William M.; Hamburger, Michael W.
2012-01-01
A successful campus sustainability effort catalyzes broad engagement of the campus community and integration of sustainability principles into the academic and operational components of campus life. Although many universities have embraced sustainability as a new core value, others have been more sluggish in adopting sustainability principles to…
An Analysis of Bid Evaluation Procedures of Contemporary Models for Procurement in Pakistan
2016-12-01
21 E. COMPONENTS FOR BID EVALUATION .........................................21 1. Market Intelligence...PROCUREMENT SYSTEM ...............................................23 1. Basic Principles ...43 1. Principles of Procurement and Prequalification .......................43 2. Contract Types
Regression Models for Identifying Noise Sources in Magnetic Resonance Images
Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.
2009-01-01
Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
The accentuation principle of figure-ground segregation and the downbeat illusion.
Pinna, Baingio; Sirigu, Luca
2016-10-01
Pinna and Sirigu (2011) demonstrated a new principle of grouping, called the accentuation principle, stating that, all else being equal, elements tend to group in the same oriented direction of the discontinuous element placed within a whole set of continuous/homogeneous components. The discontinuous element behaves like an accent, i.e. a visual emphasis within the wholeness of components as shown in the next section. In this work, the accentuation principle has been extended to new visual domains. In particular, it is shown how this principle affects shape perception. Moreover several visual object attributes are also highlighted, among which orientation, spatial position, inner dynamics and apparent motion that determine the so-called organic segmentation and furthermore tend to induce figure-ground segregation. On the basis of the results of experimental phenomenology, the accentuation can be considered as a complex principle ruling grouping, figure-ground segregation, shape and meaning formation. Through a new musical illusion of downbeat, it is also demonstrated that this principle influences perceptual organization not only in space but also in time and, thus, in both visual and musical domains. This illusion can be heard in eight measures of Pagodes, a solo piano music by Claude Debussy (1862-1918), where a strong physical-perceptual discrepancy in terms of upbeats and downbeats inversion is strongly perceived in both staves. Copyright © 2016 Elsevier B.V. All rights reserved.
Variable selection and model choice in geoadditive regression models.
Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard
2009-06-01
Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.
A regression-adjusted approach can estimate competing biomass
James H. Miller
1983-01-01
A method is presented for estimating above-ground herbaceous and woody biomass on competition research plots. On a set of destructively-sampled plots, an ocular estimate of biomass by vegetative component is first made, after which vegetation is clipped, dried, and weighed. Linear regressions are then calculated for each component between estimated and actual weights...
Peng, Ying; Li, Su-Ning; Pei, Xuexue; Hao, Kun
2018-03-01
Amultivariate regression statisticstrategy was developed to clarify multi-components content-effect correlation ofpanaxginseng saponins extract and predict the pharmacological effect by components content. In example 1, firstly, we compared pharmacological effects between panax ginseng saponins extract and individual saponin combinations. Secondly, we examined the anti-platelet aggregation effect in seven different saponin combinations of ginsenoside Rb1, Rg1, Rh, Rd, Ra3 and notoginsenoside R1. Finally, the correlation between anti-platelet aggregation and the content of multiple components was analyzed by a partial least squares algorithm. In example 2, firstly, 18 common peaks were identified in ten different batches of panax ginseng saponins extracts from different origins. Then, we investigated the anti-myocardial ischemia reperfusion injury effects of the ten different panax ginseng saponins extracts. Finally, the correlation between the fingerprints and the cardioprotective effects was analyzed by a partial least squares algorithm. Both in example 1 and 2, the relationship between the components content and pharmacological effect was modeled well by the partial least squares regression equations. Importantly, the predicted effect curve was close to the observed data of dot marked on the partial least squares regression model. This study has given evidences that themulti-component content is a promising information for predicting the pharmacological effects of traditional Chinese medicine.
Han, Sheng-Nan
2014-07-01
Chemometrics is a new branch of chemistry which is widely applied to various fields of analytical chemistry. Chemometrics can use theories and methods of mathematics, statistics, computer science and other related disciplines to optimize the chemical measurement process and maximize access to acquire chemical information and other information on material systems by analyzing chemical measurement data. In recent years, traditional Chinese medicine has attracted widespread attention. In the research of traditional Chinese medicine, it has been a key problem that how to interpret the relationship between various chemical components and its efficacy, which seriously restricts the modernization of Chinese medicine. As chemometrics brings the multivariate analysis methods into the chemical research, it has been applied as an effective research tool in the composition-activity relationship research of Chinese medicine. This article reviews the applications of chemometrics methods in the composition-activity relationship research in recent years. The applications of multivariate statistical analysis methods (such as regression analysis, correlation analysis, principal component analysis, etc. ) and artificial neural network (such as back propagation artificial neural network, radical basis function neural network, support vector machine, etc. ) are summarized, including the brief fundamental principles, the research contents and the advantages and disadvantages. Finally, the existing main problems and prospects of its future researches are proposed.
Thermal stress characterization using the electro-mechanical impedance method
NASA Astrophysics Data System (ADS)
Zhu, Xuan; Lanza di Scalea, Francesco; Fateh, Mahmood
2017-04-01
This study examines the potential of the Electro-Mechanical Impedance (EMI) method to provide an estimation of the developed thermal stress in constrained bar-like structures. This non-invasive method features the easiness of implementation and interpretation, while it is notoriously known for being vulnerable to environmental variability. A comprehensive analytical model is proposed to relate the measured electric admittance signatures of the PZT element to temperature and uniaxial stress applied to the underlying structure. The model results compare favorably to the experimental ones, where the sensitivities of features extracted from the admittance signatures to the varying stress levels and temperatures are determined. Two temperature compensation frameworks are proposed to characterize the thermal stress states: (a) a regression model is established based on temperature-only tests, and the residuals from the thermal stress tests are then used to isolate the stress measurand; (b) the temperature-only tests are decomposed by Principle Components Analysis (PCA) and the feature vectors of the thermal stress tests are reconstructed after removal of the temperaturesensitive components. For both methods, the features were selected based on their performance in Receiver Operating Characteristic (ROC) curves. Experimental results on the Continuous Welded Rails (CWR) are shown to demonstrate the effectiveness of these temperature compensation methods.
A New Principle of Sound Frequency Analysis
NASA Technical Reports Server (NTRS)
Theodorsen, Theodore
1932-01-01
In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.
Determination of whey adulteration in milk powder by using laser induced breakdown spectroscopy.
Bilge, Gonca; Sezer, Banu; Eseller, Kemal Efe; Berberoglu, Halil; Topcu, Ali; Boyaci, Ismail Hakki
2016-12-01
A rapid and in situ method has been developed to detect and quantify adulterated milk powder through adding whey powder by using laser induced breakdown spectroscopy (LIBS). The methodology is based on elemental composition differences between milk and whey products. Milk powder, sweet and acid whey powders were produced as standard samples, and milk powder was adulterated with whey powders. Based on LIBS spectra of standard samples and commercial products, species was identified using principle component analysis (PCA) method, and discrimination rate of milk and whey powders was found as 80.5%. Calibration curves were obtained with partial least squares regression (PLS). Correlation coefficient (R(2)) and limit of detection (LOD) values were 0.981 and 1.55% for adulteration with sweet whey powder, and 0.985 and 0.55% for adulteration with acid whey powder, respectively. The results were found to be consistent with the data from inductively coupled plasma - mass spectrometer (ICP-MS) method. Copyright © 2016 Elsevier Ltd. All rights reserved.
A new comprehensive index for discriminating adulteration in bovine raw milk.
Liu, Jing; Ren, Jing; Liu, Zhen-Min; Guo, Ben-Heng
2015-04-01
This paper proposes a new comprehensive index, called Q, which can effectively discriminate artificial adulterated milk from unadulterated milk. Both normal and adulterated samples of bovine raw milk were analysed by Fourier transform infrared spectroscopic instrument to measure the traditional indices of quality, including fat (FAT), protein (PRO), lactose (LAC), total solids (TS), non-fat solid (NFS), freezing point (FP) and somatic cell counts (SCC). From these traditional indices, this paper elaborates a method to build the index Q. First, correlated analysis and principle component analysis were used to select parameter pairs TS-FAT and FP-LAC as predominant variables. Second, linear-regression analysis and residual analysis are applied to determine the index Q and its discriminating ranges. The verification and two-blind trial results suggested that index Q could accurately detect milk adulteration with maltodextrin and water (as low as 1.0% of adulteration proportions), and with other nine kinds of synthetic adulterants (as low as 0.5% of adulteration proportions). Copyright © 2014 Elsevier Ltd. All rights reserved.
Limitations of inclusive fitness.
Allen, Benjamin; Nowak, Martin A; Wilson, Edward O
2013-12-10
Until recently, inclusive fitness has been widely accepted as a general method to explain the evolution of social behavior. Affirming and expanding earlier criticism, we demonstrate that inclusive fitness is instead a limited concept, which exists only for a small subset of evolutionary processes. Inclusive fitness assumes that personal fitness is the sum of additive components caused by individual actions. This assumption does not hold for the majority of evolutionary processes or scenarios. To sidestep this limitation, inclusive fitness theorists have proposed a method using linear regression. On the basis of this method, it is claimed that inclusive fitness theory (i) predicts the direction of allele frequency changes, (ii) reveals the reasons for these changes, (iii) is as general as natural selection, and (iv) provides a universal design principle for evolution. In this paper we evaluate these claims, and show that all of them are unfounded. If the objective is to analyze whether mutations that modify social behavior are favored or opposed by natural selection, then no aspect of inclusive fitness theory is needed.
Retnam, Ananthy; Zakaria, Mohamad Pauzi; Juahir, Hafizan; Aris, Ahmad Zaharin; Zali, Munirah Abdul; Kasim, Mohd Fadhil
2013-04-15
This study investigated polycyclic aromatic hydrocarbons (PAHs) pollution in surface sediments within aquaculture areas in Peninsular Malaysia using chemometric techniques, forensics and univariate methods. The samples were analysed using soxhlet extraction, silica gel column clean-up and gas chromatography mass spectrometry. The total PAH concentrations ranged from 20 to 1841 ng/g with a mean of 363 ng/g dw. The application of chemometric techniques enabled clustering and discrimination of the aquaculture sediments into four groups according to the contamination levels. A combination of chemometric and molecular indices was used to identify the sources of PAHs, which could be attributed to vehicle emissions, oil combustion and biomass combustion. Source apportionment using absolute principle component scores-multiple linear regression showed that the main sources of PAHs are vehicle emissions 54%, oil 37% and biomass combustion 9%. Land-based pollution from vehicle emissions is the predominant contributor of PAHs in the aquaculture sediments of Peninsular Malaysia. Copyright © 2013 Elsevier Ltd. All rights reserved.
Visual perception of landscape: sex and personality differences
A. Macia
1979-01-01
The present study established relationships between individual differences and subjective evaluation of different kinds of landscapes. These were the first three principle components of the five components obtained from a matrix of coincidences. The three components used were: 1) natural versus humanized landscapes; 2) pleasant versus rough landscapes; 3) straight and...
ERIC Educational Resources Information Center
Beauducel, Andre
2007-01-01
It was investigated whether commonly used factor score estimates lead to the same reproduced covariance matrix of observed variables. This was achieved by means of Schonemann and Steiger's (1976) regression component analysis, since it is possible to compute the reproduced covariance matrices of the regression components corresponding to different…
ERIC Educational Resources Information Center
Uline, Mark J.; Corti, David S.
2006-01-01
Le Chatelier's principle states that the further addition of a particular component will cause the reaction to shift in the direction that reduces the total number of moles of the system. However, the addition of one reactant [N[subscript 2
Principles and Techniques of Radiation Chemistry.
ERIC Educational Resources Information Center
Dorfman, Leon M.
1981-01-01
Discusses the physical processes involved in the deposition of energy from ionizing radiation in the absorber system. Identifies principles relevant to these processes which are responsible for ionization and excitation of the components of the absorber system. Briefly describes some experimental techniques in use in radiation chemical studies.…
ERIC Educational Resources Information Center
Dantley, Michael E.
2003-01-01
Reimagines educational leadership using Cornel West's notions of prophetic spirituality. Proposes three categories of leadership: principled, pragmatic, and purposive, all of which are grounded in components of West's prophetic spirituality. Argues that transformation of educational leadership necessitates searching for unique way to alter its…
Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring
ERIC Educational Resources Information Center
Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri
2017-01-01
Analysis of principles knowledge representation in information systems led to the necessity of improving the structuring knowledge. It is caused by the development of software component and new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…
Integrating Leadership Processes: Redefining the Principles Course.
ERIC Educational Resources Information Center
Neff, Bonita Dostal
2002-01-01
Revamps the principles of a public relations course, the first professional course in the public relations sequence, by integrating a leadership process and a service-learning component. Finds that more students are reflecting the interpersonal and team skills desired in the 1998 national study on public relations. (SG)
Rhee, Chang-Hoon; Shin, Sang Min; Choi, Yong-Seok; Yamaguchi, Tetsutaro; Maki, Koutaro; Kim, Yong-Il; Kim, Seong-Sik; Park, Soo-Byung; Son, Woo-Sung
2015-12-01
From computed tomographic images, the dentocentral synchondrosis can be identified in the second cervical vertebra. This can demarcate the border between the odontoid process and the body of the 2nd cervical vertebra and serve as a good model for the prediction of bone and forensic age. Nevertheless, until now, there has been no application of the 2nd cervical vertebra based on the dentocentral synchondrosis. In this study, statistical shape analysis was used to build bone and forensic age estimation regression models. Following the principles of statistical shape analysis and principal components analysis, we used cone-beam computed tomography (CBCT) to evaluate a Japanese population (35 males and 45 females, from 5 to 19 years old). The narrowest prediction intervals among the multivariate regression models were 19.63 for bone age and 2.99 for forensic age. There was no significant difference between form space and shape space in the bone and forensic age estimation models. However, for gender comparison, the bone and forensic age estimation models for males had the higher explanatory power. This study derived an improved objective and quantitative method for bone and forensic age estimation based on only the 2nd, 3rd and 4th cervical vertebral shapes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Westerlund, Emma E; Tovar, Marco A; Lönnermark, Elisabet; Montoya, Rosario; Evans, Carlton A
2015-09-01
Tuberculosis is frequent among poor and marginalized people whose limited tuberculosis-related knowledge may impair healthcare access. We characterised tuberculosis-related knowledge and associations with delayed treatment and treatment outcome. Tuberculosis patients (n = 943), people being tested for suspected tuberculosis (n = 2020), and randomly selected healthy controls (n = 476) in 16 periurban shantytowns were interviewed characterizing: socio-demographic factors; tuberculosis risk-factors; and patients' treatment delay. Principle component analysis was used to generate a tuberculosis-related knowledge score. Patients were followed-up for median 7.7 years. Factors associated with tuberculosis treatment delay, treatment outcome and tuberculosis recurrence were assessed using linear, logistic and Cox regression. Tuberculosis-related knowledge was poor, especially in older people who had not completed schooling and had never been diagnosed with tuberculosis. Tuberculosis treatment delay was median 60 days and was more delayed for patients who were poorer, older, had more severe tuberculosis and in only unadjusted analysis with incomplete schooling and low tuberculosis-related knowledge (all p ≤ 0.03). Lower than median tuberculosis-related knowledge was associated with tuberculosis recurrence (unadjusted hazard ratio = 2.1, p = 0.008), and this association was independent of co-morbidities, disease severity and demographic factors (multiple regression adjusted hazard ratio = 2.6, p = 0.008). Low tuberculosis-related knowledge independently predicted tuberculosis recurrence. Thus health education may improve tuberculosis prognosis. Copyright © 2015. Published by Elsevier Ltd.
On self-propagating methodological flaws in performance normalization for strength and power sports.
Arandjelović, Ognjen
2013-06-01
Performance in strength and power sports is greatly affected by a variety of anthropometric factors. The goal of performance normalization is to factor out the effects of confounding factors and compute a canonical (normalized) performance measure from the observed absolute performance. Performance normalization is applied in the ranking of elite athletes, as well as in the early stages of youth talent selection. Consequently, it is crucial that the process is principled and fair. The corpus of previous work on this topic, which is significant, is uniform in the methodology adopted. Performance normalization is universally reduced to a regression task: the collected performance data are used to fit a regression function that is then used to scale future performances. The present article demonstrates that this approach is fundamentally flawed. It inherently creates a bias that unfairly penalizes athletes with certain allometric characteristics, and, by virtue of its adoption in the ranking and selection of elite athletes, propagates and strengthens this bias over time. The main flaws are shown to originate in the criteria for selecting the data used for regression, as well as in the manner in which the regression model is applied in normalization. This analysis brings into light the aforesaid methodological flaws and motivates further work on the development of principled methods, the foundations of which are also laid out in this work.
Just a Fad? Gamification in Health and Fitness Apps
2014-01-01
Background Gamification has been a predominant focus of the health app industry in recent years. However, to our knowledge, there has yet to be a review of gamification elements in relation to health behavior constructs, or insight into the true proliferation of gamification in health apps. Objective The objective of this study was to identify the extent to which gamification is used in health apps, and analyze gamification of health and fitness apps as a potential component of influence on a consumer’s health behavior. Methods An analysis of health and fitness apps related to physical activity and diet was conducted among apps in the Apple App Store in the winter of 2014. This analysis reviewed a sample of 132 apps for the 10 effective game elements, the 6 core components of health gamification, and 13 core health behavior constructs. A regression analysis was conducted in order to measure the correlation between health behavior constructs, gamification components, and effective game elements. Results This review of the most popular apps showed widespread use of gamification principles, but low adherence to any professional guidelines or industry standard. Regression analysis showed that game elements were associated with gamification (P<.001). Behavioral theory was associated with gamification (P<.05), but not game elements, and upon further analysis gamification was only associated with composite motivational behavior scores (P<.001), and not capacity or opportunity/trigger. Conclusions This research, to our knowledge, represents the first comprehensive review of gamification use in health and fitness apps, and the potential to impact health behavior. The results show that use of gamification in health and fitness apps has become immensely popular, as evidenced by the number of apps found in the Apple App Store containing at least some components of gamification. This shows a lack of integrating important elements of behavioral theory from the app industry, which can potentially impact the efficacy of gamification apps to change behavior. Apps represent a very promising, burgeoning market and landscape in which to disseminate health behavior change interventions. Initial results show an abundant use of gamification in health and fitness apps, which necessitates the in-depth study and evaluation of the potential of gamification to change health behaviors. PMID:25654660
Lee, Ji Eun; Kim, Hyun Woong; Lee, Sang Joon; Lee, Joo Eun
2015-05-01
To investigate vascular structural changes of choroidal neovascularization (CNV) followed by intravitreal ranibizumab injections using indocyanine green angiography. A total of 31 patients with exudative age-related macular degeneration and CNV whose structures were identifiable in indocyanine green angiography were included. Ranibizumab was injected into the vitreous cavity once a month for 3 months and then as needed for the next 3 months prospectively. Indocyanine green angiography was performed at baseline, 3, and 6 months. Early to midphase images of the indocyanine green angiography in the details of vascular structure of the CNV were discerned the best were used in the image analysis. Vascular structures of CNV were described as arteriovenular and capillary components, and structural changes were assessed. Arteriovenular components were observed in 29 eyes (94%). Regression of the capillary components was observed in most cases. Although regression of arteriovenular component was noted in 14 eyes (48%), complete resolution was not observed. The eyes were categorized into 3 groups according to CNV structural changes: the regressed (Group R, 10 eyes, 31%), the matured (Group M, 7 eyes, 23%), and the growing (Group G, 14 eyes, 45%). In Group R, there was no regrowth of CNV found at 6 months. In Group M, distinct vascular structures were observed at 3 months and persisted without apparent changes at 6 months. In Group G, growth or reperfusion of capillary components from the persisting arteriovenular components was noted at 6 months. Both capillary and arteriovenular components were regressed during monthly ranibizumab injections. However, CNV regrowth was observed in a group of patients during the as-needed treatment phase.
Principle of maximum entropy for reliability analysis in the design of machine components
NASA Astrophysics Data System (ADS)
Zhang, Yimin
2018-03-01
We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
Rahman, Md. Jahanur; Shamim, Abu Ahmed; Klemm, Rolf D. W.; Labrique, Alain B.; Rashid, Mahbubur; Christian, Parul; West, Keith P.
2017-01-01
Birth weight, length and circumferences of the head, chest and arm are key measures of newborn size and health in developing countries. We assessed maternal socio-demographic factors associated with multiple measures of newborn size in a large rural population in Bangladesh using partial least squares (PLS) regression method. PLS regression, combining features from principal component analysis and multiple linear regression, is a multivariate technique with an ability to handle multicollinearity while simultaneously handling multiple dependent variables. We analyzed maternal and infant data from singletons (n = 14,506) born during a double-masked, cluster-randomized, placebo-controlled maternal vitamin A or β-carotene supplementation trial in rural northwest Bangladesh. PLS regression results identified numerous maternal factors (parity, age, early pregnancy MUAC, living standard index, years of education, number of antenatal care visits, preterm delivery and infant sex) significantly (p<0.001) associated with newborn size. Among them, preterm delivery had the largest negative influence on newborn size (Standardized β = -0.29 − -0.19; p<0.001). Scatter plots of the scores of first two PLS components also revealed an interaction between newborn sex and preterm delivery on birth size. PLS regression was found to be more parsimonious than both ordinary least squares regression and principal component regression. It also provided more stable estimates than the ordinary least squares regression and provided the effect measure of the covariates with greater accuracy as it accounts for the correlation among the covariates and outcomes. Therefore, PLS regression is recommended when either there are multiple outcome measurements in the same study, or the covariates are correlated, or both situations exist in a dataset. PMID:29261760
Kabir, Alamgir; Rahman, Md Jahanur; Shamim, Abu Ahmed; Klemm, Rolf D W; Labrique, Alain B; Rashid, Mahbubur; Christian, Parul; West, Keith P
2017-01-01
Birth weight, length and circumferences of the head, chest and arm are key measures of newborn size and health in developing countries. We assessed maternal socio-demographic factors associated with multiple measures of newborn size in a large rural population in Bangladesh using partial least squares (PLS) regression method. PLS regression, combining features from principal component analysis and multiple linear regression, is a multivariate technique with an ability to handle multicollinearity while simultaneously handling multiple dependent variables. We analyzed maternal and infant data from singletons (n = 14,506) born during a double-masked, cluster-randomized, placebo-controlled maternal vitamin A or β-carotene supplementation trial in rural northwest Bangladesh. PLS regression results identified numerous maternal factors (parity, age, early pregnancy MUAC, living standard index, years of education, number of antenatal care visits, preterm delivery and infant sex) significantly (p<0.001) associated with newborn size. Among them, preterm delivery had the largest negative influence on newborn size (Standardized β = -0.29 - -0.19; p<0.001). Scatter plots of the scores of first two PLS components also revealed an interaction between newborn sex and preterm delivery on birth size. PLS regression was found to be more parsimonious than both ordinary least squares regression and principal component regression. It also provided more stable estimates than the ordinary least squares regression and provided the effect measure of the covariates with greater accuracy as it accounts for the correlation among the covariates and outcomes. Therefore, PLS regression is recommended when either there are multiple outcome measurements in the same study, or the covariates are correlated, or both situations exist in a dataset.
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Analysis of School Food Safety Programs Based on HACCP Principles
ERIC Educational Resources Information Center
Roberts, Kevin R.; Sauer, Kevin; Sneed, Jeannie; Kwon, Junehee; Olds, David; Cole, Kerri; Shanklin, Carol
2014-01-01
Purpose/Objectives: The purpose of this study was to determine how school districts have implemented food safety programs based on HACCP principles. Specific objectives included: (1) Evaluate how schools are implementing components of food safety programs; and (2) Determine foodservice employees food-handling practices related to food safety.…
Thomas Gordon's Communicative Pedagogy in Modern Educational Realities
ERIC Educational Resources Information Center
Leshchenko, Maria; Isaieva, Svitlana
2014-01-01
In the article the principles, strategies, methods, techniques of communicative pedagogy of American scientist Thomas Gordon and system components of effective communication training for parents, teachers and administrators are enlightened. It has been determined that the main principle of Thomas Gordon's pedagogy is an interactive way of knowing…
26 CFR 1.460-5 - Cost allocation rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... must be allocated to a long-term contract when dedicated to the contract under principles similar to... component is dedicated, under principles similar to those in § 1.263A-11(b)(2). A taxpayer maintaining... exempt construction contract reported using the CCM— (A) Marketing and selling expenses, including...
77 FR 27534 - Department of Transportation Updated Environmental Justice Order 5610.2(a)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-10
... component of the Department's strategy to promote the principles of environmental justice in all... environmental justice principles in all (DOT) programs, policies, and activities. It describes how the... the Office of Management and Budget's (OMB) Revisions to the Standards for the Classification of...
NASA Astrophysics Data System (ADS)
Omar, M. A.; Parvataneni, R.; Zhou, Y.
2010-09-01
Proposed manuscript describes the implementation of a two step processing procedure, composed of the self-referencing and the Principle Component Thermography (PCT). The combined approach enables the processing of thermograms from transient (flash), steady (halogen) and selective (induction) thermal perturbations. Firstly, the research discusses the three basic processing schemes typically applied for thermography; namely mathematical transformation based processing, curve-fitting processing, and direct contrast based calculations. Proposed algorithm utilizes the self-referencing scheme to create a sub-sequence that contains the maximum contrast information and also compute the anomalies' depth values. While, the Principle Component Thermography operates on the sub-sequence frames by re-arranging its data content (pixel values) spatially and temporally then it highlights the data variance. The PCT is mainly used as a mathematical mean to enhance the defects' contrast thus enabling its shape and size retrieval. The results show that the proposed combined scheme is effective in processing multiple size defects in sandwich steel structure in real-time (<30 Hz) and with full spatial coverage, without the need for a priori defect-free area.
Lehmann, A; Scheffler, Ch; Hermanussen, M
2010-02-01
Recent progress in modelling individual growth has been achieved by combining the principal component analysis and the maximum likelihood principle. This combination models growth even in incomplete sets of data and in data obtained at irregular intervals. We re-analysed late 18th century longitudinal growth of German boys from the boarding school Carlsschule in Stuttgart. The boys, aged 6-23 years, were measured at irregular 3-12 monthly intervals during the period 1771-1793. At the age of 18 years, mean height was 1652 mm, but height variation was large. The shortest boy reached 1474 mm, the tallest 1826 mm. Measured height closely paralleled modelled height, with mean difference of 4 mm, SD 7 mm. Seasonal height variation was found. Low growth rates occurred in spring and high growth rates in summer and autumn. The present study demonstrates that combining the principal component analysis and the maximum likelihood principle enables growth modelling in historic height data also. Copyright (c) 2009 Elsevier GmbH. All rights reserved.
Noel, Sabrina E.; Newby, P. K.; Ordovas, Jose M.; Tucker, Katherine L.
2010-01-01
Combinations of fatty acids may affect risk of metabolic syndrome. Puerto Ricans have a disproportionate number of chronic conditions compared with other Hispanic groups. We aimed to characterize fatty acid intake patterns of Puerto Rican adults aged 45–75 y and living in the Greater Boston area (n = 1207) and to examine associations between these patterns and metabolic syndrome. Dietary fatty acids, as a percentage of total fat, were entered into principle components analysis. Spearman correlation coefficients were used to examine associations between fatty acid intake patterns, nutrients, and food groups. Associations with metabolic syndrome were analyzed by using logistic regression and general linear models with quintiles of principal component scores. Four principal components (factors) emerged: factor 1, short- and medium-chain SFA/dairy; factor 2, (n-3) fatty acid/fish; factor 3, very long-chain (VLC) SFA and PUFA/oils; and factor 4, monounsaturated fatty acid/trans fat. The SFA/dairy factor was inversely associated with fasting serum glucose concentrations (P = 0.02) and the VLC SFA/oils factor was negatively related to waist circumference (P = 0.008). However, these associations were no longer significant after additional adjustment for BMI. The (n-3) fatty acid/fish factor was associated with a lower likelihood of metabolic syndrome (Q5 vs. Q1: odds ratio: 0.54, 95% CI: 0.34, 0.86). In summary, principal components analysis of fatty acid intakes revealed 4 dietary fatty acid patterns in this population. Identifying optimal combinations of fatty acids may be beneficial for understanding relationships with health outcomes given their diverse effects on metabolism. PMID:20702744
Sanford, Ward E.; Nelms, David L.; Pope, Jason P.; Selnick, David L.
2012-01-01
This study by the U.S. Geological Survey, prepared in cooperation with the Virginia Department of Environmental Quality, quantifies the components of the hydrologic cycle across the Commonwealth of Virginia. Long-term, mean fluxes were calculated for precipitation, surface runoff, infiltration, total evapotranspiration (ET), riparian ET, recharge, base flow (or groundwater discharge) and net total outflow. Fluxes of these components were first estimated on a number of real-time-gaged watersheds across Virginia. Specific conductance was used to distinguish and separate surface runoff from base flow. Specific-conductance data were collected every 15 minutes at 75 real-time gages for approximately 18 months between March 2007 and August 2008. Precipitation was estimated for 1971–2000 using PRISM climate data. Precipitation and temperature from the PRISM data were used to develop a regression-based relation to estimate total ET. The proportion of watershed precipitation that becomes surface runoff was related to physiographic province and rock type in a runoff regression equation. Component flux estimates from the watersheds were transferred to flux estimates for counties and independent cities using the ET and runoff regression equations. Only 48 of the 75 watersheds yielded sufficient data, and data from these 48 were used in the final runoff regression equation. The base-flow proportion for the 48 watersheds averaged 72 percent using specific conductance, a value that was substantially higher than the 61 percent average calculated using a graphical-separation technique (the USGS program PART). Final results for the study are presented as component flux estimates for all counties and independent cities in Virginia.
Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie
2018-04-01
A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheri-tance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency study. Then, we proposed a new concept of hydrological genes origined from biological genes to describe the inconsistent hydrolocal processes. The hydrologi-cal genes were constructed using moments methods, such as general moments, weight function moments, probability weight moments and L-moments. Meanwhile, the five components, including jump, trend, periodic, dependence and pure random components, of a stochastic hydrological process were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were synthetically considered and the inheritance, variability and evolution principles were fully described. Our study would contribute to reveal the inheritance, variability and evolution principles in probability distribution of hydrological elements.
NASA Astrophysics Data System (ADS)
Tian, Jialin; Smith, William L.; Gazarik, Michael J.
2008-10-01
The ultimate remote sensing benefits of the high resolution Infrared radiance spectrometers will be realized with their geostationary satellite implementation in the form of imaging spectrometers. This will enable dynamic features of the atmosphere's thermodynamic fields and pollutant and greenhouse gas constituents to be observed for revolutionary improvements in weather forecasts and more accurate air quality and climate predictions. As an important step toward realizing this application objective, the Geostationary Imaging Fourier Transform Spectrometer (GIFTS) Engineering Demonstration Unit (EDU) was successfully developed under the NASA New Millennium Program, 2000-2006. The GIFTS-EDU instrument employs three focal plane arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw GIFTS interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. The radiometric calibration is achieved using internal blackbody calibration references at ambient (260 K) and hot (286 K) temperatures. The absolute radiometric performance of the instrument is affected by several factors including the FPA off-axis effect, detector/readout electronics induced nonlinearity distortions, and fore-optics offsets. The GIFTS-EDU, being the very first imaging spectrometer to use ultra-high speed electronics to readout its large area format focal plane array detectors, operating at wavelengths as large as 15 microns, possessed non-linearity's not easily removable in the initial calibration process. In this paper, we introduce a refined calibration technique that utilizes Principle Component (PC) analysis to compensate for instrument distortions and artifacts remaining after the initial radiometric calibration process, thus, further enhance the absolute calibration accuracy. This method is applied to data collected during an atmospheric measurement experiment with the GIFTS, together with simultaneous observations by the accurately calibrated AERI (Atmospheric Emitted Radiance Interferometer), both simultaneously zenith viewing the sky through the same external scene mirror at ten-minute intervals throughout a cloudless day at Logan Utah on September 13, 2006. The PC vectors of the calibrated radiance spectra are defined from the AERI observations and regression matrices relating the initial GIFTS radiance PC scores to the AERI radiance PC scores are calculated using the least squares inverse method. A new set of accurately calibrated GIFTS radiances are produced using the first four PC scores in the regression model. Temperature and moisture profiles retrieved from the PC-calibrated GIFTS radiances are verified against radiosonde measurements collected throughout the GIFTS sky measurement period.
B. Desta Fekedulegn; J.J. Colbert; R.R., Jr. Hicks; Michael E. Schuckers
2002-01-01
The theory and application of principal components regression, a method for coping with multicollinearity among independent variables in analyzing ecological data, is exhibited in detail. A concrete example of the complex procedures that must be carried out in developing a diagnostic growth-climate model is provided. We use tree radial increment data taken from breast...
What Makes the Foucault Pendulum Move among the Stars?
NASA Astrophysics Data System (ADS)
Phillips, Norman
2004-11-01
Foucault's pendulum exhibition in 1851 occurred in an era now known by development of the theorems of Coriolis and the formulation of dynamical meteorology by Ferrel. Yet today the behavior of the pendulum is often misunderstood. The existence of a horizontal component of Newtonian gravitation is essential for understanding the behavior with respect to the stars. Two simple mechanical principles describe why the path of oscillation is fixed only at the poles; the principle of centripetal acceleration and the principle of conservation of angular momentum. A sky map is used to describe the elegant path among the stars produced by these principles.
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using custom experimental design utilizing statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...
Multi-functional optical signal processing using optical spectrum control circuit
NASA Astrophysics Data System (ADS)
Hayashi, Shuhei; Ikeda, Tatsuhiko; Mizuno, Takayuki; Takahashi, Hiroshi; Tsuda, Hiroyuki
2015-02-01
Processing ultra-fast optical signals without optical/electronic conversion is in demand and time-to-space conversion has been proposed as an effective solution. We have designed and fabricated an arrayed-waveguide grating (AWG) based optical spectrum control circuit (OSCC) using silica planar lightwave circuit (PLC) technology. This device is composed of an AWG, tunable phase shifters and a mirror. The principle of signal processing is to spatially decompose the signal's frequency components by using the AWG. Then, the phase of each frequency component is controlled by the tunable phase shifters. Finally, the light is reflected back to the AWG by the mirror and synthesized. Amplitude of each frequency component can be controlled by distributing the power to high diffraction order light. The spectral controlling range of the OSCC is 100 GHz and its resolution is 1.67 GHz. This paper describes equipping the OSCC with optical coded division multiplex (OCDM) encoder/decoder functionality. The encoding principle is to apply certain phase patterns to the signal's frequency components and intentionally disperse the signal. The decoding principle is also to apply certain phase patterns to the frequency components at the receiving side. If the applied phase pattern compensates the intentional dispersion, the waveform is regenerated, but if the pattern is not appropriate, the waveform remains dispersed. We also propose an arbitrary filter function by exploiting the OSCC's amplitude and phase control attributes. For example, a filtered optical signal transmitted through multiple optical nodes that use the wavelength multiplexer/demultiplexer can be equalized.
Quality in Higher Education: The Contribution of Edward Demings Principles
ERIC Educational Resources Information Center
Redmond, Richard; Curtis, Elizabeth; Noone, Tom; Keenan, Paul
2008-01-01
Purpose--There can be little doubt about the importance and relevance of quality for any service industry. One of the most influential contributors to service quality developments was W. Edwards Deming (1900-1993). An important component of Demings philosophy is reflected in his 14-principles for transforming a service as they indicate what…
Principles of Cancer Screening.
Pinsky, Paul F
2015-10-01
Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike. Published by Elsevier Inc.
What's in a Grade? Grading Policies and Practices in Principles of Economics
ERIC Educational Resources Information Center
Walstad, William B.; Miller, Laurie A.
2016-01-01
Survey results from a national sample of economics instructors describe the grading policies and practices in principles of economics courses. The survey results provide insights about absolute and relative grading systems used by instructors, the course components and their weights that determine grades, and the type of assessment items used for…
Designing Serious Game Interventions for Individuals with Autism
ERIC Educational Resources Information Center
Whyte, Elisabeth M.; Smyth, Joshua M.; Scherf, K. Suzanne
2015-01-01
The design of "Serious games" that use game components (e.g., storyline, long-term goals, rewards) to create engaging learning experiences has increased in recent years. We examine of the core principles of serious game design and examine the current use of these principles in computer-based interventions for individuals with autism.…
Devising Principles of Design for Numeracy Tasks
ERIC Educational Resources Information Center
Geiger, Vince; Forgasz, Helen; Goos, Merrilyn; Bennison, Anne
2014-01-01
Numeracy is a fundamental component of the Australian National Curriculum as a General Capability identified in each F-10 subject. In this paper, we consider the principles of design necessary for the development of numeracy tasks specific to subjects other than mathematics--in this case, the subject of English. We explore the nature of potential…
ERIC Educational Resources Information Center
Friedel, Curtis R.; Kirland, Kelsey Church; Grimes, Matthew W.
2016-01-01
Principles of Peer Leadership is an undergraduate course developed through the collaboration of leadership educators with colleagues from residence life and fraternity/sorority life to provide instruction to undergraduate students serving in peer leadership positions across campus. The course comprises online and recitation components to connect…
Regression to fuzziness method for estimation of remaining useful life in power plant components
NASA Astrophysics Data System (ADS)
Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.
2014-10-01
Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data is not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify the effectiveness of the methodology, it was benchmarked against the data-based simple linear regression model used for predictions which was shown to perform equal or worse than the presented methodology. Furthermore, methodology comparison highlighted the improvement in estimation offered by the adoption of appropriate of fuzzy sets for parameter representation.
Tu, Yu-Kang; Krämer, Nicole; Lee, Wen-Chung
2012-07-01
In the analysis of trends in health outcomes, an ongoing issue is how to separate and estimate the effects of age, period, and cohort. As these 3 variables are perfectly collinear by definition, regression coefficients in a general linear model are not unique. In this tutorial, we review why identification is a problem, and how this problem may be tackled using partial least squares and principal components regression analyses. Both methods produce regression coefficients that fulfill the same collinearity constraint as the variables age, period, and cohort. We show that, because the constraint imposed by partial least squares and principal components regression is inherent in the mathematical relation among the 3 variables, this leads to more interpretable results. We use one dataset from a Taiwanese health-screening program to illustrate how to use partial least squares regression to analyze the trends in body heights with 3 continuous variables for age, period, and cohort. We then use another dataset of hepatocellular carcinoma mortality rates for Taiwanese men to illustrate how to use partial least squares regression to analyze tables with aggregated data. We use the second dataset to show the relation between the intrinsic estimator, a recently proposed method for the age-period-cohort analysis, and partial least squares regression. We also show that the inclusion of all indicator variables provides a more consistent approach. R code for our analyses is provided in the eAppendix.
Protein structure similarity from Principle Component Correlation analysis.
Zhou, Xiaobo; Chou, James; Wong, Stephen T C
2006-01-25
Owing to rapid expansion of protein structure databases in recent years, methods of structure comparison are becoming increasingly effective and important in revealing novel information on functional properties of proteins and their roles in the grand scheme of evolutionary biology. Currently, the structural similarity between two proteins is measured by the root-mean-square-deviation (RMSD) in their best-superimposed atomic coordinates. RMSD is the golden rule of measuring structural similarity when the structures are nearly identical; it, however, fails to detect the higher order topological similarities in proteins evolved into different shapes. We propose new algorithms for extracting geometrical invariants of proteins that can be effectively used to identify homologous protein structures or topologies in order to quantify both close and remote structural similarities. We measure structural similarity between proteins by correlating the principle components of their secondary structure interaction matrix. In our approach, the Principle Component Correlation (PCC) analysis, a symmetric interaction matrix for a protein structure is constructed with relationship parameters between secondary elements that can take the form of distance, orientation, or other relevant structural invariants. When using a distance-based construction in the presence or absence of encoded N to C terminal sense, there are strong correlations between the principle components of interaction matrices of structurally or topologically similar proteins. The PCC method is extensively tested for protein structures that belong to the same topological class but are significantly different by RMSD measure. The PCC analysis can also differentiate proteins having similar shapes but different topological arrangements. Additionally, we demonstrate that when using two independently defined interaction matrices, comparison of their maximum eigenvalues can be highly effective in clustering structurally or topologically similar proteins. We believe that the PCC analysis of interaction matrix is highly flexible in adopting various structural parameters for protein structure comparison.
HT-FRTC: a fast radiative transfer code using kernel regression
NASA Astrophysics Data System (ADS)
Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren
2016-09-01
The HT-FRTC is a principal component based fast radiative transfer code that can be used across the electromagnetic spectrum from the microwave through to the ultraviolet to calculate transmittance, radiance and flux spectra. The principal components cover the spectrum at a very high spectral resolution, which allows very fast line-by-line, hyperspectral and broadband simulations for satellite-based, airborne and ground-based sensors. The principal components are derived during a code training phase from line-by-line simulations for a diverse set of atmosphere and surface conditions. The derived principal components are sensor independent, i.e. no extra training is required to include additional sensors. During the training phase we also derive the predictors which are required by the fast radiative transfer code to determine the principal component scores from the monochromatic radiances (or fluxes, transmittances). These predictors are calculated for each training profile at a small number of frequencies, which are selected by a k-means cluster algorithm during the training phase. Until recently the predictors were calculated using a linear regression. However, during a recent rewrite of the code the linear regression was replaced by a Gaussian Process (GP) regression which resulted in a significant increase in accuracy when compared to the linear regression. The HT-FRTC has been trained with a large variety of gases, surface properties and scatterers. Rayleigh scattering as well as scattering by frozen/liquid clouds, hydrometeors and aerosols have all been included. The scattering phase function can be fully accounted for by an integrated line-by-line version of the Edwards-Slingo spherical harmonics radiation code or approximately by a modification to the extinction (Chou scaling).
NASA Astrophysics Data System (ADS)
Oguntunde, Philip G.; Lischeid, Gunnar; Dietrich, Ottfried
2018-03-01
This study examines the variations of climate variables and rice yield and quantifies the relationships among them using multiple linear regression, principal component analysis, and support vector machine (SVM) analysis in southwest Nigeria. The climate and yield data used was for a period of 36 years between 1980 and 2015. Similar to the observed decrease ( P < 0.001) in rice yield, pan evaporation, solar radiation, and wind speed declined significantly. Eight principal components exhibited an eigenvalue > 1 and explained 83.1% of the total variance of predictor variables. The SVM regression function using the scores of the first principal component explained about 75% of the variance in rice yield data and linear regression about 64%. SVM regression between annual solar radiation values and yield explained 67% of the variance. Only the first component of the principal component analysis (PCA) exhibited a clear long-term trend and sometimes short-term variance similar to that of rice yield. Short-term fluctuations of the scores of the PC1 are closely coupled to those of rice yield during the 1986-1993 and the 2006-2013 periods thereby revealing the inter-annual sensitivity of rice production to climate variability. Solar radiation stands out as the climate variable of highest influence on rice yield, and the influence was especially strong during monsoon and post-monsoon periods, which correspond to the vegetative, booting, flowering, and grain filling stages in the study area. The outcome is expected to provide more in-depth regional-specific climate-rice linkage for screening of better cultivars that can positively respond to future climate fluctuations as well as providing information that may help optimized planting dates for improved radiation use efficiency in the study area.
Accounting for measurement error in log regression models with applications to accelerated testing.
Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M
2018-01-01
In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
Golani, Ilan
2012-06-01
In this review I focus on how three methodological principles advocated by Philip Teitelbaum influenced my work to this day: that similar principles of organization should be looked for in ontogeny and recovery of function; that the order of emergence of behavioral components provides a view on the organization of that behavior; and that the components of behavior should be exhibited by the animal itself in relatively pure form. I start by showing how these principles influenced our common work on the developmental dynamics of rodent egocentric space, and then proceed to describe how these principles affected my work with Yoav Benjamini and others on the developmental dynamics of rodent allocentric space. We analyze issues traditionally addressed by physiological psychologists with methods borrowed from ethology, EW (Eshkol-Wachman) movement notation, dynamical systems and exploratory data analysis. Then we show how the natural origins of axes embodied by the behavior of the organism itself, are used by us as the origins of axes for the measurement of the developmental moment-by-moment dynamics of behavior. Using this methodology we expose similar principles of organization across situations, species and preparations, provide a developmental view on the organization of behavior, expose the natural components of behavior in relatively pure form, and reveal how low level primitives generate higher level constructs. Advances in tracking technology should allow us to study how movements in egocentric and allocentric spaces interlace. Tracking of multi-limb coordination, progress in online recording of neural activity in freely moving animals, and the unprecedented accumulation of genetically engineered mouse preparations makes the behavioral ground plan exposed in this review essential for a systematic study of the brain/behavior interface. Copyright © 2012 Elsevier B.V. All rights reserved.
Olson, Scott A.
2003-01-01
The stream-gaging network in New Hampshire was analyzed for its effectiveness in providing regional information on peak-flood flow, mean-flow, and low-flow frequency. The data available for analysis were from stream-gaging stations in New Hampshire and selected stations in adjacent States. The principles of generalized-least-squares regression analysis were applied to develop regional regression equations that relate streamflow-frequency characteristics to watershed characteristics. Regression equations were developed for (1) the instantaneous peak flow with a 100-year recurrence interval, (2) the mean-annual flow, and (3) the 7-day, 10-year low flow. Active and discontinued stream-gaging stations with 10 or more years of flow data were used to develop the regression equations. Each stream-gaging station in the network was evaluated and ranked on the basis of how much the data from that station contributed to the cost-weighted sampling-error component of the regression equation. The potential effect of data from proposed and new stream-gaging stations on the sampling error also was evaluated. The stream-gaging network was evaluated for conditions in water year 2000 and for estimated conditions under various network strategies if an additional 5 years and 20 years of streamflow data were collected. The effectiveness of the stream-gaging network in providing regional streamflow information could be improved for all three flow characteristics with the collection of additional flow data, both temporally and spatially. With additional years of data collection, the greatest reduction in the average sampling error of the regional regression equations was found for the peak- and low-flow characteristics. In general, additional data collection at stream-gaging stations with unregulated flow, relatively short-term record (less than 20 years), and drainage areas smaller than 45 square miles contributed the largest cost-weighted reduction to the average sampling error of the regional estimating equations. The results of the network analyses can be used to prioritize the continued operation of active stations, the reactivation of discontinued stations, or the activation of new stations to maximize the regional information content provided by the stream-gaging network. Final decisions regarding altering the New Hampshire stream-gaging network would require the consideration of the many uses of the streamflow data serving local, State, and Federal interests.
Practical aspects of estimating energy components in rodents
van Klinken, Jan B.; van den Berg, Sjoerd A. A.; van Dijk, Ko Willems
2013-01-01
Recently there has been an increasing interest in exploiting computational and statistical techniques for the purpose of component analysis of indirect calorimetry data. Using these methods it becomes possible to dissect daily energy expenditure into its components and to assess the dynamic response of the resting metabolic rate (RMR) to nutritional and pharmacological manipulations. To perform robust component analysis, however, is not straightforward and typically requires the tuning of parameters and the preprocessing of data. Moreover the degree of accuracy that can be attained by these methods depends on the configuration of the system, which must be properly taken into account when setting up experimental studies. Here, we review the methods of Kalman filtering, linear, and penalized spline regression, and minimal energy expenditure estimation in the context of component analysis and discuss their results on high resolution datasets from mice and rats. In addition, we investigate the effect of the sample time, the accuracy of the activity sensor, and the washout time of the chamber on the estimation accuracy. We found that on the high resolution data there was a strong correlation between the results of Kalman filtering and penalized spline (P-spline) regression, except for the activity respiratory quotient (RQ). For low resolution data the basal metabolic rate (BMR) and resting RQ could still be estimated accurately with P-spline regression, having a strong correlation with the high resolution estimate (R2 > 0.997; sample time of 9 min). In contrast, the thermic effect of food (TEF) and activity related energy expenditure (AEE) were more sensitive to a reduction in the sample rate (R2 > 0.97). In conclusion, for component analysis on data generated by single channel systems with continuous data acquisition both Kalman filtering and P-spline regression can be used, while for low resolution data from multichannel systems P-spline regression gives more robust results. PMID:23641217
ERIC Educational Resources Information Center
Sartori, Leo
1983-01-01
Fundamental principles governing nuclear explosions and their effects are discussed, including three components of a nuclear explosion (thermal radiation, shock wave, nuclear radiation). Describes how effects of these components depend on the weapon's yield, its height of burst, and distance of detonation point. Includes effects of three…
A Simulation Investigation of Principal Component Regression.
ERIC Educational Resources Information Center
Allen, David E.
Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
James R. Wallis
1965-01-01
Written in Fortran IV and MAP, this computer program can handle up to 120 variables, and retain 40 principal components. It can perform simultaneous regression of up to 40 criterion variables upon the varimax rotated factor weight matrix. The columns and rows of all output matrices are labeled by six-character alphanumeric names. Data input can be from punch cards or...
Technological integration and hyperconnectivity: Tools for promoting extreme human lifespans
NASA Astrophysics Data System (ADS)
Kyriazis, Marios
2015-07-01
Artificial, neurobiological, and social networks are three distinct complex adaptive systems (CAS), each containing discrete processing units (nodes, neurons, and humans respectively). Despite the apparent differences, these three networks are bound by common underlying principles which describe the behaviour of the system in terms of the connections of its components, and its emergent properties. The longevity (long-term retention and functionality) of the components of each of these systems is also defined by common principles. Here, I will examine some properties of the longevity and function of the components of artificial and neurobiological systems, and generalise these to the longevity and function of the components of social CAS. In other words, I will show that principles governing the long-term functionality of computer nodes and of neurons, may be extrapolated to the study of the long-term functionality of humans (or more precisely, of the noemes, an abstract combination of existence and digital fame). The study of these phenomena can provide useful insights regarding practical ways that can be used in order to maximize human longevity. The basic law governing these behaviours is the Law of Requisite Usefulness, which states that the length of retention of an agent within a CAS is proportional to the contribution of the agent to the overall adaptability of the system. Key Words: Complex Adaptive Systems, Hyper-connectivity, Human Longevity, Adaptability and Evolution, Noeme
Abuse of disabled parking: Reforming public's attitude through persuasive multimedia strategy
NASA Astrophysics Data System (ADS)
Yahaya, W. A. J. W.; Zain, M. Z. M.
2014-02-01
Attitude is one of the factors that contribute to the abuse of disabled parking. The attitude's components are affective, cognitive and behavioral and may be formed in various ways including learning and persuasion. Using learning and persuasion approach, this study has produced a persuasive multimedia aiming to form a positive attitude toward disabled persons in order to minimize the rate of disabled parking abuse. The persuasive multimedia was developed using Principle of Social Learning draws from Persuasive Technology as learning strategy at macro persuasion level, and modality and redundancy principles draw from Multimedia Learning Principles as design strategy at micro persuasion level. In order to measure the effectiveness of the persuasive multimedia, 93 respondents were selected in a 2 × 2 quasi experimental research design for experiment. Attitude components of affective, cognitive and behavioral were measured using adapted instrument from the Multi Dimensional Attitudes Scale toward Persons With Disabilities (MAS). Result of the study shows that the persuasive multimedia which designed based on Social Learning Theory at macro persuasion level is capable of forming positive attitude toward disabled person. The cognitive component of the attitude found to be the most responsive component. In term of design strategy at the micro persuasion level, modality found to be the most significant strategy compare to redundancy. While males are more responsive to the persuasive multimedia compare to females.
Steiner, Genevieve Z.; Barry, Robert J.; Gonsalvez, Craig J.
2016-01-01
In oddball tasks, increasing the time between stimuli within a particular condition (target-to-target interval, TTI; nontarget-to-nontarget interval, NNI) systematically enhances N1, P2, and P300 event-related potential (ERP) component amplitudes. This study examined the mechanism underpinning these effects in ERP components recorded from 28 adults who completed a conventional three-tone oddball task. Bivariate correlations, partial correlations and multiple regression explored component changes due to preceding ERP component amplitudes and intervals found within the stimulus series, rather than constraining the task with experimentally constructed intervals, which has been adequately explored in prior studies. Multiple regression showed that for targets, N1 and TTI predicted N2, TTI predicted P3a and P3b, and Processing Negativity (PN), P3b, and TTI predicted reaction time. For rare nontargets, P1 predicted N1, NNI predicted N2, and N1 predicted Slow Wave (SW). Findings show that the mechanism is operating on separate stages of stimulus-processing, suggestive of either increased activation within a number of stimulus-specific pathways, or very long component generator recovery cycles. These results demonstrate the extent to which matching-stimulus intervals influence ERP component amplitudes and behavior in a three-tone oddball task, and should be taken into account when designing similar studies. PMID:27445774
Steiner, Genevieve Z; Barry, Robert J; Gonsalvez, Craig J
2016-01-01
In oddball tasks, increasing the time between stimuli within a particular condition (target-to-target interval, TTI; nontarget-to-nontarget interval, NNI) systematically enhances N1, P2, and P300 event-related potential (ERP) component amplitudes. This study examined the mechanism underpinning these effects in ERP components recorded from 28 adults who completed a conventional three-tone oddball task. Bivariate correlations, partial correlations and multiple regression explored component changes due to preceding ERP component amplitudes and intervals found within the stimulus series, rather than constraining the task with experimentally constructed intervals, which has been adequately explored in prior studies. Multiple regression showed that for targets, N1 and TTI predicted N2, TTI predicted P3a and P3b, and Processing Negativity (PN), P3b, and TTI predicted reaction time. For rare nontargets, P1 predicted N1, NNI predicted N2, and N1 predicted Slow Wave (SW). Findings show that the mechanism is operating on separate stages of stimulus-processing, suggestive of either increased activation within a number of stimulus-specific pathways, or very long component generator recovery cycles. These results demonstrate the extent to which matching-stimulus intervals influence ERP component amplitudes and behavior in a three-tone oddball task, and should be taken into account when designing similar studies.
Preferences for Key Ethical Principles that Guide Business School Students
ERIC Educational Resources Information Center
Guyette, Roger; Piotrowski, Chris
2010-01-01
Business ethics is presently a major component of the business school curriculum. Although there has been much attention focused on the impact of such coursework on instilling ethical decision-making (Nguyen et al., 2008), there is sparse research on how business students view the major ethical principles that serve as the foundation of business…
Hufnagel, S; Harbison, K; Silva, J; Mettala, E
1994-01-01
This paper describes a new method for the evolutionary determination of user requirements and system specifications called scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal components' behavioral specifications. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is to have a threefold benefit. (i) Scenario view abstractions provide consistent interdisciplinary communications. (ii) Hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution. (iii) SEP and health care DSSA integration into computer aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems, from reuse libraries.
Fusciardi, J; Remérand, F; Landais, A; Brodeur, J; Journois, D; Laffon, M
2010-03-01
To know: (1) how French public services of anaesthesia and critical care (ACC) have applied the new principles of hospital management and (2) whether or not it has impacted the different components of ACC. National questionnaire at the end of 2008, i.e., after 2 years of new hospital management. Heads of ACC services in general (GH) and university hospitals (UH). Eighteen closed questions and open opinions analyzed. Comparisons of percentages (Chi(2) - Yates): linear correlation. Percentages of responses were 70% (n=51) for UH and 37% (n=146) for GH. The new management principles were mainly applied. The different clinical and academic components of the ACC specialty (ACC, emergency medicine, pain management) mainly remained associated in UH. In GH, the new management induced constant and various changes. They were mainly judged as defeating the object of the ACC speciality in GH, especially in those of lower and mild sizes. The general tendency is that the ACC specialty was able to maintain the family ties of its different components in the UH. However, this principle was not a cornerstone of the new management in the GH. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
A Study on Components of Internal Control-Based Administrative System in Secondary Schools
ERIC Educational Resources Information Center
Montri, Paitoon; Sirisuth, Chaiyuth; Lammana, Preeda
2015-01-01
The aim of this study was to study the components of the internal control-based administrative system in secondary schools, and make a Confirmatory Factor Analysis (CFA) to confirm the goodness of fit of empirical data and component model that resulted from the CFA. The study consisted of three steps: 1) studying of principles, ideas, and theories…
ERIC Educational Resources Information Center
Science Teacher, 1989
1989-01-01
Describes classroom activities and models for migration, mutation, and isolation; a diffusion model; Bernoulli's principle; sound in a vacuum; time regression mystery of DNA; seating chart lesson plan; algae mystery laboratory; water as mass; science fair; flipped book; making a cloud; wet mount slide; timer adaptation; thread slide model; and…
Dirichlet Component Regression and its Applications to Psychiatric Data.
Gueorguieva, Ralitza; Rosenheck, Robert; Zelterman, Daniel
2008-08-15
We describe a Dirichlet multivariable regression method useful for modeling data representing components as a percentage of a total. This model is motivated by the unmet need in psychiatry and other areas to simultaneously assess the effects of covariates on the relative contributions of different components of a measure. The model is illustrated using the Positive and Negative Syndrome Scale (PANSS) for assessment of schizophrenia symptoms which, like many other metrics in psychiatry, is composed of a sum of scores on several components, each in turn, made up of sums of evaluations on several questions. We simultaneously examine the effects of baseline socio-demographic and co-morbid correlates on all of the components of the total PANSS score of patients from a schizophrenia clinical trial and identify variables associated with increasing or decreasing relative contributions of each component. Several definitions of residuals are provided. Diagnostics include measures of overdispersion, Cook's distance, and a local jackknife influence metric.
NASA Astrophysics Data System (ADS)
Li, Tao
2018-06-01
The complexity of aluminum electrolysis process leads the temperature for aluminum reduction cells hard to measure directly. However, temperature is the control center of aluminum production. To solve this problem, combining some aluminum plant's practice data, this paper presents a Soft-sensing model of temperature for aluminum electrolysis process on Improved Twin Support Vector Regression (ITSVR). ITSVR eliminates the slow learning speed of Support Vector Regression (SVR) and the over-fit risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which ensures the structural risk minimization principle and lower computational complexity. Finally, the model with some other parameters as auxiliary variable, predicts the temperature by ITSVR. The simulation result shows Soft-sensing model based on ITSVR has short time-consuming and better generalization.
Fuzzy multinomial logistic regression analysis: A multi-objective programming approach
NASA Astrophysics Data System (ADS)
Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan
2017-05-01
Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large well-balanced datasets, Maximum Likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, specially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate parameters of the multinomial logistic regression. The study assumes that the parameters of multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach versus Maximum likelihood (ML) approach. Results show that the new proposed model outperforms ML in cases of small datasets.
The Unintentional Procrastination Scale.
Fernie, Bruce A; Bharucha, Zinnia; Nikčević, Ana V; Spada, Marcantonio M
2017-01-01
Procrastination refers to the delay or postponement of a task or decision and is often conceptualised as a failure of self-regulation. Recent research has suggested that procrastination could be delineated into two domains: intentional and unintentional. In this two-study paper, we aimed to develop a measure of unintentional procrastination (named the Unintentional Procrastination Scale or the 'UPS') and test whether this would be a stronger marker of psychopathology than intentional and general procrastination. In Study 1, a community sample of 139 participants completed a questionnaire that consisted of several items pertaining to unintentional procrastination that had been derived from theory, previous research, and clinical experience. Responses were subjected to a principle components analysis and assessment of internal consistency. In Study 2, a community sample of 155 participants completed the newly developed scale, along with measures of general and intentional procrastination, metacognitions about procrastination, and negative affect. Data from the UPS were subjected to confirmatory factor analysis and revised accordingly. The UPS was then validated using correlation and regression analyses. The six-item UPS possesses construct and divergent validity and good internal consistency. The UPS appears to be a stronger marker of psychopathology than the pre-existing measures of procrastination used in this study. Results from the regression models suggest that both negative affect and metacognitions about procrastination differentiate between general, intentional, and unintentional procrastination. The UPS is brief, has good psychometric properties, and has strong associations with negative affect, suggesting it has value as a research and clinical tool.
Classical Testing in Functional Linear Models.
Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab
2016-01-01
We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis, that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications.
Classical Testing in Functional Linear Models
Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab
2016-01-01
We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis, that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications. PMID:28955155
Shin, Jeong-Hun; Jun, Seung-lyul; Hwang, Sung-Yeoun; Ahn, Seong-Hun
2012-01-01
Objectives: This study used the basic principle of Oriental medicine, the sovereign, minister, assistant and courier principle (君臣佐使論) to investigate the effects of the component of ONGABO, which is composed of Ginseng Radix (Red Ginseng), Angelica Gigantis Radix, Schisandrae Fructus, Cuscuta Semen and Curcumae tuber on the viability of HepG2 cells. Methods: Single and mixed extracts of the component of ONGABO were prepared by lypohilizing powder of Red Ginseng (6-year root from Kanghwa), Angelica Gigantis Radix, Schisandrae Fructus, Cuscuta Semen, Curcumae Tuber (from Omniherb Co., Ltd., Korea) at the laboratory of herbal medicine in Woosuk University and were eluted after being macerated with 100% ethanol for three days. The cell viability of HepG2 was determined by using an absorptiometric analysis with PrestoBlue (Invitrogen) reagent after the plate had been incubated for 48 hours. All of the experiments were repeated three times to obtain the average value and standard deviation. The statistical analysis was done and the correlation factor was obtained by using Microsoft Office Excel 2007 and Origin 6.0 software. Results: Although Ginseng Radix (Red Ginseng) and Schisandrae Fructus did not enhance the viability of HepG2 cells, they were shown to provide protection of those cells. On the other hand, Angelica Gigantis Radix decreased the viability of HepG2 cells significantly, Cuscuta Semen and Curcumae Tuber had a small or no effect on the viability of HepG2 cells. Conclusions: In the sovereign, minister, assistant and courier principle (君臣佐使論), Ginseng Radix (Red Ginseng) corresponds to the sovereign component because it provides cell protection effects, Angelica Gigantis Radix corresponds to minister medicinal because it kills cells, Schisandrae Fructus corresponds to the assistant medicinal to help red ginseng having cell protect effects. Cuscuta Semen and Curcumae Tuber correspond to the courier medicinal having no effect in cell viability in HepG2. We hope this study provides motivation for advanced research on the sovereign, minister, assistant and courier principle. PMID:25780653
Shin, Jeong-Hun; Jun, Seung-Lyul; Hwang, Sung-Yeoun; Ahn, Seong-Hun
2012-12-01
This study used the basic principle of Oriental medicine, the sovereign, minister, assistant and courier principle () to investigate the effects of the component of ONGABO, which is composed of Ginseng Radix (Red Ginseng), Angelica Gigantis Radix, Schisandrae Fructus, Cuscuta Semen and Curcumae tuber on the viability of HepG2 cells. Single and mixed extracts of the component of ONGABO were prepared by lypohilizing powder of Red Ginseng (6-year root from Kanghwa), Angelica Gigantis Radix, Schisandrae Fructus, Cuscuta Semen, Curcumae Tuber (from Omniherb Co., Ltd., Korea) at the laboratory of herbal medicine in Woosuk University and were eluted after being macerated with 100% ethanol for three days. The cell viability of HepG2 was determined by using an absorptiometric analysis with PrestoBlue (Invitrogen) reagent after the plate had been incubated for 48 hours. All of the experiments were repeated three times to obtain the average value and standard deviation. The statistical analysis was done and the correlation factor was obtained by using Microsoft Office Excel 2007 and Origin 6.0 software. Although Ginseng Radix (Red Ginseng) and Schisandrae Fructus did not enhance the viability of HepG2 cells, they were shown to provide protection of those cells. On the other hand, Angelica Gigantis Radix decreased the viability of HepG2 cells significantly, Cuscuta Semen and Curcumae Tuber had a small or no effect on the viability of HepG2 cells. In the sovereign, minister, assistant and courier principle (), Ginseng Radix (Red Ginseng) corresponds to the sovereign component because it provides cell protection effects, Angelica Gigantis Radix corresponds to minister medicinal because it kills cells, Schisandrae Fructus corresponds to the assistant medicinal to help red ginseng having cell protect effects. Cuscuta Semen and Curcumae Tuber correspond to the courier medicinal having no effect in cell viability in HepG2. We hope this study provides motivation for advanced research on the sovereign, minister, assistant and courier principle.
Molecular Models Candy Components
ERIC Educational Resources Information Center
Coleman, William F.
2007-01-01
An explanation of various principles of chemistry in a paper by Fanny Ennever by the use of candy is described. The paper explains components of sucrose and the invert sugar that results from the hydrolysis of sucrose and will help students in determining whether the products are indeed hydrates of carbon.
Enhancing a cancer prevention and control curriculum through interactive group discussions.
Forsythe, L P; Gadalla, S M; Hamilton, J G; Heckman-Stoddard, B M; Kent, E E; Lai, G Y; Lin, S W; Luhn, P; Faupel-Badger, J M
2012-06-01
The Principles and Practice of Cancer Prevention and Control course (Principles course) is offered annually by the National Cancer Institute Cancer Prevention Fellowship Program. This 4-week postgraduate course covers the spectrum of cancer prevention and control research (e.g., epidemiology, laboratory, clinical, social, and behavioral sciences) and is open to attendees from medical, academic, government, and related institutions across the world. In this report, we describe a new addition to the Principles course syllabus, which was exclusively a lecture-based format for over 20 years. In 2011, cancer prevention fellows and staff designed and implemented small group discussion sessions as part of the curriculum. The goals of these sessions were to foster an interactive environment, discuss concepts presented during the Principles course, exchange ideas, and enhance networking among the course participants and provide a teaching and leadership opportunity to current cancer prevention fellows. Overall, both the participants and facilitators who returned the evaluation forms (n=61/87 and 8/10, respectively) reported a high satisfaction with the experience for providing both an opportunity to explore course concepts in a greater detail and to network with colleagues. Participants (93%) and facilitators (100%) stated that they would like to see this component remain a part of the Principles course curriculum, and both groups provided recommendations for the 2012 program. The design, implementation, and evaluation of this initial discussion group component of the Principles course are described herein. The findings in this report will not only inform future discussion group sessions in the Principles course but may also be useful to others planning to incorporate group learning into large primarily lecture-based courses.
Enhancing a Cancer Prevention and Control Curriculum through Interactive Group Discussions
Forsythe, L.P.; Gadalla, S.M.; Hamilton, J.G.; Heckman-Stoddard, B.M.; Kent, E.E.; Lai, G.Y.; Lin, S.W.; Luhn, P.; Faupel-Badger, J.M.
2012-01-01
The Principles and Practice of Cancer Prevention and Control course (Principles course) is offered annually by the National Cancer Institute Cancer Prevention Fellowship Program. This four-week post-graduate course covers the spectrum of cancer prevention and control research (e.g. epidemiology, laboratory, clinical, social, and behavioral sciences) and is open to attendees from medical, academic, government, and related institutions across the world. In this report, we describe a new addition to the Principles course syllabus, which was exclusively a lecture-based format for over 20 years. In 2011, Cancer Prevention Fellows and staff designed and implemented small group discussion sessions as part of the curriculum. The goals of these sessions were to foster an interactive environment, discuss concepts presented during the Principles course, exchange ideas, and enhance networking amongst the course participants, and provide a teaching and leadership opportunity to current Cancer Prevention Fellows. Overall, both the participants and facilitators who returned the evaluation forms (n=61/87, and 8/10, respectively), reported high satisfaction with the experience for providing both an opportunity to explore course concepts in greater detail and to network with colleagues. Participants (93%) and facilitators (100%) stated they would like to see this component remain a part of the Principles course curriculum, and both groups provided recommendations for the 2012 program. The design, implementation, and evaluation of this initial discussion group component of the Principles course are described herein. The findings in this report will not only inform future discussion group sessions in the Principles course but may also be useful to others planning to incorporate group learning into large primarily lecture-based courses. PMID:22661264
ERIC Educational Resources Information Center
Minnesota State Dept. of Education, St. Paul. Div. of Vocational and Technical Education.
THIS MODULE OF A 30-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE FUNCTIONS OF DIESEL ENGINE LUBRICATION SYSTEMS AND COMPONENTS AND THE PRINCIPLES OF OPERATION OF BRAKE SYSTEMS USED ON DIESEL POWERED VEHICLES. TOPICS ARE (1) THE NEED FOR OIL, (2) SERVICE CLASSIFICATION OF OILS, (3) CATERPILLAR LUBRICATION SYSTEM COMPONENTS (4)…
Determining the Requisite Components of Visual Threat Detection to Improve Operational Performance
2014-04-01
cognitive processes, and may be enhanced by focusing training development on the principle components such as causal reasoning. The second report will...discuss the development and evaluation of a research-based training exemplar. Visual threat detection pervades many military contexts, but is also... developing computer-controlled exercises to study the primary components of visual threat detection. Similarly, civilian law enforcement officers were
[New method of mixed gas infrared spectrum analysis based on SVM].
Bai, Peng; Xie, Wen-Jun; Liu, Jun-Hua
2007-07-01
A new method of infrared spectrum analysis based on support vector machine (SVM) for mixture gas was proposed. The kernel function in SVM was used to map the seriously overlapping absorption spectrum into high-dimensional space, and after transformation, the high-dimensional data could be processed in the original space, so the regression calibration model was established, then the regression calibration model with was applied to analyze the concentration of component gas. Meanwhile it was proved that the regression calibration model with SVM also could be used for component recognition of mixture gas. The method was applied to the analysis of different data samples. Some factors such as scan interval, range of the wavelength, kernel function and penalty coefficient C that affect the model were discussed. Experimental results show that the component concentration maximal Mean AE is 0.132%, and the component recognition accuracy is higher than 94%. The problems of overlapping absorption spectrum, using the same method for qualitative and quantitative analysis, and limit number of training sample, were solved. The method could be used in other mixture gas infrared spectrum analyses, promising theoretic and application values.
ERIC Educational Resources Information Center
Marshall, Miguel G.; Allegrante, John P.
2017-01-01
Background: Although the 8 components of the coordinated school health (CSH) framework have been implemented to various degrees in the nation's public schools, principles of good practice (PGPs) to guide health promotion efforts in independent schools do not exist. The purpose of this study was to generate PGPs and rate their feasibility of…
NASA Astrophysics Data System (ADS)
Wibowo, Wahyu; Wene, Chatrien; Budiantara, I. Nyoman; Permatasari, Erma Oktania
2017-03-01
Multiresponse semiparametric regression is simultaneous equation regression model and fusion of parametric and nonparametric model. The regression model comprise several models and each model has two components, parametric and nonparametric. The used model has linear function as parametric and polynomial truncated spline as nonparametric component. The model can handle both linearity and nonlinearity relationship between response and the sets of predictor variables. The aim of this paper is to demonstrate the application of the regression model for modeling of effect of regional socio-economic on use of information technology. More specific, the response variables are percentage of households has access to internet and percentage of households has personal computer. Then, predictor variables are percentage of literacy people, percentage of electrification and percentage of economic growth. Based on identification of the relationship between response and predictor variable, economic growth is treated as nonparametric predictor and the others are parametric predictors. The result shows that the multiresponse semiparametric regression can be applied well as indicate by the high coefficient determination, 90 percent.
Allegrini, Franco; Braga, Jez W B; Moreira, Alessandro C O; Olivieri, Alejandro C
2018-06-29
A new multivariate regression model, named Error Covariance Penalized Regression (ECPR) is presented. Following a penalized regression strategy, the proposed model incorporates information about the measurement error structure of the system, using the error covariance matrix (ECM) as a penalization term. Results are reported from both simulations and experimental data based on replicate mid and near infrared (MIR and NIR) spectral measurements. The results for ECPR are better under non-iid conditions when compared with traditional first-order multivariate methods such as ridge regression (RR), principal component regression (PCR) and partial least-squares regression (PLS). Copyright © 2018 Elsevier B.V. All rights reserved.
Spaceborne receivers: Basic principles
NASA Technical Reports Server (NTRS)
Stacey, J. M.
1984-01-01
The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practicle and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.
A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.
Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain
2015-10-01
Clustering is a set of techniques of the statistical learning aimed at finding structures of heterogeneous partitions grouping homogenous data called clusters. There are several fields in which clustering was successfully applied, such as medicine, biology, finance, economics, etc. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the data set dimensionality, we base our approach on the Principal Component Analysis (PCA), which is the statistical tool most commonly used in factorial analysis. However, the problems in nature, especially in medicine, are often based on heterogeneous-type qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing to simultaneously handle quantitative and qualitative data. The principle of this approach is to perform a projection of the qualitative variables on the subspaces spanned by quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
Prediction of warmed-over flavour development in cooked chicken by colorimetric sensor array.
Kim, Su-Yeon; Li, Jinglei; Lim, Na-Ri; Kang, Bo-Sik; Park, Hyun-Jin
2016-11-15
The aim of this study was to develop a simple and rapid method based on colorimetric sensor array (CSA) for evaluation of warmed-over flavour (WOF) in cooked chicken. All samples were classified according to storage time by CSA coupled with principle component analysis (PCA) or hierarchical cluster analysis (HCA). The CSA data were used to establish prediction models with thiobarbituric acid reactive substances (TBARS), pentanal, hexanal, or heptanal associated with WOF by partial least square regression (PLSR). For the TBARS model, the coefficient of determination (rp(2)) was 0.9997 in the prediction range of 0.28-0.69mg/kg. In each of the models for pentanal, hexanal, and heptanal, all rp(2) were higher than 0.960 in the range of 0.58-2.10mg/kg, 5.50-11.69mg/kg, and 0.09-0.16mg/kg, respectively. These results demonstrate that the CSA was able to predict WOF development and to distinguish between each storage time. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modeling the Components of an Economy as a Complex Adaptive System
principles of constrained optimization and fails to see economic variables as part of an interconnected network. While tools for forecasting economic...data sets such as the stock market . This research portrays the stock market as one component of a networked system of economic variables, with the
21 CFR 102.5 - General principles.
Code of Federal Regulations, 2011 CFR
2011-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... words “containing (or contains) _ percent (or %) ___” or “_ percent (or %) ___” with the first blank... create an erroneous impression that such ingredient(s) or component(s) is present when it is not, and...
21 CFR 102.5 - General principles.
Code of Federal Regulations, 2010 CFR
2010-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... words “containing (or contains) _ percent (or %) ___” or “_ percent (or %) ___” with the first blank... create an erroneous impression that such ingredient(s) or component(s) is present when it is not, and...
21 CFR 102.5 - General principles.
Code of Federal Regulations, 2012 CFR
2012-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... words “containing (or contains) _ percent (or %) ___” or “_ percent (or %) ___” with the first blank... create an erroneous impression that such ingredient(s) or component(s) is present when it is not, and...
21 CFR 102.5 - General principles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... words “containing (or contains) _ percent (or %) ___” or “_ percent (or %) ___” with the first blank... create an erroneous impression that such ingredient(s) or component(s) is present when it is not, and...
21 CFR 102.5 - General principles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... words “containing (or contains) _ percent (or %) ___” or “_ percent (or %) ___” with the first blank... create an erroneous impression that such ingredient(s) or component(s) is present when it is not, and...
Accelerated Reader/Reading Renaissance. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2007
2007-01-01
The Accelerated Reader/Reading Renaissance program (now called Accelerated Reader Best Classroom Practices) is a guided reading intervention in which teachers direct student reading of text. It involves two components. Reading Renaissance, the first component, is a set of recommended principles on guided reading (or teachers' direction of…
Enhancing the Impact of Quality Points in Interteaching
ERIC Educational Resources Information Center
Rosales, Rocío; Soldner, James L.; Crimando, William
2014-01-01
Interteaching is a classroom instruction approach based on behavioral principles that offers increased flexibility to instructors. There are several components of interteaching that may contribute to its demonstrated efficacy. In a prior analysis of one of these components, the quality points contingency, no significant difference was reported in…
First-Principle Characterization for Singlet Fission Couplings.
Yang, Chou-Hsun; Hsu, Chao-Ping
2015-05-21
The electronic coupling for singlet fission, an important parameter for determining the rate, has been found to be too small unless charge-transfer (CT) components were introduced in the diabatic states, mostly through perturbation or a model Hamiltonian. In the present work, the fragment spin difference (FSD) scheme was generalized to calculate the singlet fission coupling. The largest coupling strength obtained was 14.8 meV for two pentacenes in a crystal structure, or 33.7 meV for a transition-state structure, which yielded a singlet fission lifetime of 239 or 37 fs, generally consistent with experimental results (80 fs). Test results with other polyacene molecules are similar. We found that the charge on one fragment in the S1 diabatic state correlates well with FSD coupling, indicating the importance of the CT component. The FSD approach is a useful first-principle method for singlet fission coupling, without the need to include the CT component explicitly.
Hainsworth, S V; Fitzpatrick, M E
2007-06-01
Forensic engineering is the application of engineering principles or techniques to the investigation of materials, products, structures or components that fail or do not perform as intended. In particular, forensic engineering can involve providing solutions to forensic problems by the application of engineering science. A criminal aspect may be involved in the investigation but often the problems are related to negligence, breach of contract, or providing information needed in the redesign of a product to eliminate future failures. Forensic engineering may include the investigation of the physical causes of accidents or other sources of claims and litigation (for example, patent disputes). It involves the preparation of technical engineering reports, and may require giving testimony and providing advice to assist in the resolution of disputes affecting life or property.This paper reviews the principal methods available for the analysis of failed components and then gives examples of different component failure modes through selected case studies.
Introduction to COFFE: The Next-Generation HPCMP CREATE-AV CFD Solver
NASA Technical Reports Server (NTRS)
Glasby, Ryan S.; Erwin, J. Taylor; Stefanski, Douglas L.; Allmaras, Steven R.; Galbraith, Marshall C.; Anderson, W. Kyle; Nichols, Robert H.
2016-01-01
HPCMP CREATE-AV Conservative Field Finite Element (COFFE) is a modular, extensible, robust numerical solver for the Navier-Stokes equations that invokes modularity and extensibility from its first principles. COFFE implores a flexible, class-based hierarchy that provides a modular approach consisting of discretization, physics, parallelization, and linear algebra components. These components are developed with modern software engineering principles to ensure ease of uptake from a user's or developer's perspective. The Streamwise Upwind/Petrov-Galerkin (SU/PG) method is utilized to discretize the compressible Reynolds-Averaged Navier-Stokes (RANS) equations tightly coupled with a variety of turbulence models. The mathematics and the philosophy of the methodology that makes up COFFE are presented.
System identification principles in studies of forest dynamics.
Rolfe A. Leary
1970-01-01
Shows how it is possible to obtain governing equation parameter estimates on the basis of observed system states. The approach used represents a constructive alternative to regression techniques for models expressed as differential equations. This approach allows scientists to more completely quantify knowledge of forest development processes, to express theories in...
Fairman, C M; Hyde, P N; Focht, B C
2017-04-01
The primary purpose of this systematic review is to examine the extant resistance training (RT) cancer research to evaluate the proportion of RT interventions that: (1) implemented key RT training principles (specificity, progression, overload) and (2) explicitly reported relevant RT prescription components (frequency, intensity, sets, reps). A qualitative systematic review was performed by two reviewers (CMF and PNH) who inspected the titles and abstracts to determine eligibility for this systematic review. Identified papers were obtained in full and further reviewed. Data were extracted to evaluate the application of principles of training, along with specific RT components. Electronic databases (PubMed, EMBASE, CINAHL, Cochrane, PEDro, PsychInfo, Cancer Lit, Sport Discus, AMED, Cochrane Central Register of Controlled Trials) and reference lists of included articles from inception to May 2016. 37 studies were included. The principle of specificity was used appropriately in all of the studies, progression in 65% and overload in 76% of the studies. The most common exercise prescription (∼50%) implemented in the studies included in this review were 2-3 days/week, focusing on large muscle groups, 60-70% 1 repetition maximum (RM), 1-3 sets of 8-12 repetitions. Reporting of RT principles in an oncology setting varies greatly, with often vague or non-existent references to the principles of training and how the RT prescription was designed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Components for the Global Digital Object Cloud
NASA Astrophysics Data System (ADS)
Glaves, Helen; Hanahoe, Hilary; Weigel, Tobias; Lannom, Larry; Wittenburg, Peter; Koureas, Dimitris; Almas, Bridget
2017-04-01
We are at a tipping point in the development of a common conceptual framework and set of tools and components which will revolutionize the management of scientific data. It is widely acknowledged that the current volumes and complexity of data now being collected, and the inevitable and enormous increase in that volume and complexity, have reached the point where action is required. Around 80% of the data generated is being lost after short time periods and a corresponding amount of time is being wasted by reseachers on routine data management tasks. At the same time, and largely in response to this perceived crisis, a number of principles (G8, RDA DFT, FAIR) for the management of scientific data have arisen and been widely endorsed. The danger now is that agreement will stop at the level of principles and that multiple non-interoperable domain and technology specific silos will continue to arise, all based on the abstract principles. If this happens, we will lose the opportunity to create a common set of low-level tools and components based on an agreed conceptual approach. The Research Data Alliance (RDA) is now combining recommendations from its individual working and interest groups, such as suggestions for proper citation of dynamic data or how to assess the quality of repositories, to design configurations of core components (as specified by RDA and other initiatives such as W3C) and stimulate their implementation. Together with a few global communities such as climate modeling, biodiversity and material science, experts involved in RDA are developing a concept called Global Digital Object Cloud (GDOC) which has the potential to overcome the huge fragmentation which hampers efficient data management and re-use. It is compliant with the FAIR principles in so far as a) it puts Digital Objects (DOs) in its center, b) has all DOs assigned PIDs which are resolvable to useful state information, c) has all DOs associated with metadata, and d) has all DO bit sequences stored in trustworthy repositories. The presentation will give an overview of the types of components involved, the corresponding specifications of RDA, and the concept of the GDOC.
Alexander, Lorraine K; Horney, Jennifer A; Markiewicz, Milissa; MacDonald, Pia D M
2010-01-01
Distance learning is an effective strategy to address the many barriers to continuing education faced by the public health workforce. With the proliferation of online learning programs focused on public health, there is a need to develop and adopt a common set of principles and practices for distance learning. In this article, we discuss the 10 principles that guide the development, design, and delivery of the various training modules and courses offered by the North Carolina Center for Public Health Preparedness (NCCPHP). These principles are the result of 10 years of experience in Internet-based public health preparedness educational programming. In this article, we focus on three representative components of NCCPHP's overall training and education program to illustrate how the principles are implemented and help others in the field plan and develop similar programs.
NASA Astrophysics Data System (ADS)
Syuhada Mangsor, Aneez; Haider Rizvi, Zuhaib; Chaudhary, Kashif; Safwan Aziz, Muhammad
2018-05-01
The study of atomic spectroscopy has contributed to a wide range of scientific applications. In principle, laser induced breakdown spectroscopy (LIBS) method has been used to analyse various types of matter regardless of its physical state, either it is solid, liquid or gas because all elements emit light of characteristic frequencies when it is excited to sufficiently high energy. The aim of this work was to analyse the signature spectrums of each element contained in three different types of samples. Metal alloys of Aluminium, Titanium and Brass with the purities of 75%, 80%, 85%, 90% and 95% were used as the manipulated variable and their LIBS spectra were recorded. The characteristic emission lines of main elements were identified from the spectra as well as its corresponding contents. Principal component analysis (PCA) was carried out using the data from LIBS spectra. Three obvious clusters were observed in 3-dimensional PCA plot which corresponding to the different group of alloys. Findings from this study showed that LIBS technology with the help of principle component analysis could conduct the variety discrimination of alloys demonstrating the capability of LIBS-PCA method in field of spectro-analysis. Thus, LIBS-PCA method is believed to be an effective method for classifying alloys with different percentage of purifications, which was high-cost and time-consuming before.
Modeling vertebrate diversity in Oregon using satellite imagery
NASA Astrophysics Data System (ADS)
Cablk, Mary Elizabeth
Vertebrate diversity was modeled for the state of Oregon using a parametric approach to regression tree analysis. This exploratory data analysis effectively modeled the non-linear relationships between vertebrate richness and phenology, terrain, and climate. Phenology was derived from time-series NOAA-AVHRR satellite imagery for the year 1992 using two methods: principal component analysis and derivation of EROS data center greenness metrics. These two measures of spatial and temporal vegetation condition incorporated the critical temporal element in this analysis. The first three principal components were shown to contain spatial and temporal information about the landscape and discriminated phenologically distinct regions in Oregon. Principal components 2 and 3, 6 greenness metrics, elevation, slope, aspect, annual precipitation, and annual seasonal temperature difference were investigated as correlates to amphibians, birds, all vertebrates, reptiles, and mammals. Variation explained for each regression tree by taxa were: amphibians (91%), birds (67%), all vertebrates (66%), reptiles (57%), and mammals (55%). Spatial statistics were used to quantify the pattern of each taxa and assess validity of resulting predictions from regression tree models. Regression tree analysis was relatively robust against spatial autocorrelation in the response data and graphical results indicated models were well fit to the data.
Dirichlet Component Regression and its Applications to Psychiatric Data
Gueorguieva, Ralitza; Rosenheck, Robert; Zelterman, Daniel
2011-01-01
Summary We describe a Dirichlet multivariable regression method useful for modeling data representing components as a percentage of a total. This model is motivated by the unmet need in psychiatry and other areas to simultaneously assess the effects of covariates on the relative contributions of different components of a measure. The model is illustrated using the Positive and Negative Syndrome Scale (PANSS) for assessment of schizophrenia symptoms which, like many other metrics in psychiatry, is composed of a sum of scores on several components, each in turn, made up of sums of evaluations on several questions. We simultaneously examine the effects of baseline socio-demographic and co-morbid correlates on all of the components of the total PANSS score of patients from a schizophrenia clinical trial and identify variables associated with increasing or decreasing relative contributions of each component. Several definitions of residuals are provided. Diagnostics include measures of overdispersion, Cook’s distance, and a local jackknife influence metric. PMID:22058582
Hanley, James A
2008-01-01
Most survival analysis textbooks explain how the hazard ratio parameters in Cox's life table regression model are estimated. Fewer explain how the components of the nonparametric baseline survivor function are derived. Those that do often relegate the explanation to an "advanced" section and merely present the components as algebraic or iterative solutions to estimating equations. None comment on the structure of these estimators. This note brings out a heuristic representation that may help to de-mystify the structure.
Murphy, Matthew; MacCarthy, M Jayne; McAllister, Lynda; Gilbert, Robert
2014-12-05
Competency profiles for occupational clusters within Canada's substance abuse workforce (SAW) define the need for skill and knowledge in evidence-based practice (EBP) across all its members. Members of the Senior Management occupational cluster hold ultimate responsibility for decisions made within addiction services agencies and therefore must possess the highest level of proficiency in EBP. The objective of this study was to assess the knowledge of the principles of EBP, and use of the components of the evidence-based decision making (EBDM) process in members of this occupational cluster from selected addiction services agencies in Nova Scotia. A convenience sampling method was used to recruit participants from addiction services agencies. Semi-structured qualitative interviews were conducted with eighteen Senior Management. The interviews were audio-recorded, transcribed verbatim and checked by the participants. Interview transcripts were coded and analyzed for themes using content analysis and assisted by qualitative data analysis software (NVivo 9.0). Data analysis revealed four main themes: 1) Senior Management believe that addictions services agencies are evidence-based; 2) Consensus-based decision making is the norm; 3) Senior Management understand the principles of EBP and; 4) Senior Management do not themselves use all components of the EBDM process when making decisions, oftentimes delegating components of this process to decision support staff. Senior Management possess an understanding of the principles of EBP, however, when making decisions they often delegate components of the EBDM process to decision support staff. Decision support staff are not defined as an occupational cluster in Canada's SAW and have not been ascribed a competency profile. As such, there is no guarantee that this group possesses competency in EBDM. There is a need to advocate for the development of a defined occupational cluster and associated competency profile for this critical group.
Evaluation of fire weather forecasts using PM2.5 sensitivity analysis
NASA Astrophysics Data System (ADS)
Balachandran, Sivaraman; Baumann, Karsten; Pachon, Jorge E.; Mulholland, James A.; Russell, Armistead G.
2017-01-01
Fire weather forecasts are used by land and wildlife managers to determine when meteorological and fuel conditions are suitable to conduct prescribed burning. In this work, we investigate the sensitivity of ambient PM2.5 to various fire and meteorological variables in a spatial setting that is typical for the southeastern US, where prescribed fires are the single largest source of fine particulate matter. We use the method of principle components regression to estimate sensitivity of PM2.5, measured at a monitoring site in Jacksonville, NC (JVL), to fire data and observed and forecast meteorological variables. Fire data were gathered from prescribed fire activity used for ecological management at Marine Corps Base Camp Lejeune, extending 10-50 km south from the PM2.5 monitor. Principal components analysis (PCA) was run on 10 data sets that included acres of prescribed burning activity (PB) along with meteorological forecast data alone or in combination with observations. For each data set, observed PM2.5 (unitless) was regressed against PCA scores from the first seven principal components (explaining at least 80% of total variance). PM2.5 showed significant sensitivity to PB: 3.6 ± 2.2 μg m-3 per 1000 acres burned at the investigated distance scale of ∼10-50 km. Applying this sensitivity to the available activity data revealed a prescribed burning source contribution to measured PM2.5 of up to 25% on a given day. PM2.5 showed a positive sensitivity to relative humidity and temperature, and was also sensitive to wind direction, indicating the capture of more regional aerosol processing and transport effects. As expected, PM2.5 had a negative sensitivity to dispersive variables but only showed a statistically significant negative sensitivity to ventilation rate, highlighting the importance of this parameter to fire managers. A positive sensitivity to forecast precipitation was found, consistent with the practice of conducting prescribed burning on days when rain can naturally extinguish fires. Perhaps most importantly for land managers, our analysis suggests that instead of relying on the forecasts from a day before, prescribed burning decisions should be based on the forecasts released the morning of the burn when possible, since these data were more stable and yielded more statistically robust results.
NASA Astrophysics Data System (ADS)
Torun, Ahmet R.; Mountasir, Adil; Hoffmann, Gerald; Cherif, Chokri
2013-06-01
3D textile preforms offer a high potential to increase mechanical properties of composites and/or decrease manufacturing costs. Within the scope of this study, production principles were developed for complex spacer preforms and integrated stiffeners. These principles were applied through technological further development of the well-known face-to-face and terry weaving techniques. Various woven preforms were produced with Glass fibre/Polypropylene (GF/PP) Commingled yarns, however, the technology is suitable for any type of reinforcement yarns. U-shaped woven spacer preform was consolidated into a sandwich composite component for lightweight applications.
ERIC Educational Resources Information Center
Wilson, Mark V.; Wilson, Erin
2017-01-01
In this work we describe an authentic performance project for Instrumental Analysis in which students designed, built, and tested spectrophotometers made from simple components. The project addressed basic course content such as instrument design principles, UV-vis spectroscopy, and spectroscopic instrument components as well as skills such as…
Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
A 5 year (2002-2006) simulation of CMAQ covering the eastern United States is evaluated using principle component analysis in order to identify and characterize statistically significant patterns of model bias. Such analysis is useful in that in can identify areas of poor model ...
NASA Technical Reports Server (NTRS)
Duong, T. A.
2004-01-01
In this paper, we present a new, simple, and optimized hardware architecture sequential learning technique for adaptive Principle Component Analysis (PCA) which will help optimize the hardware implementation in VLSI and to overcome the difficulties of the traditional gradient descent in learning convergence and hardware implementation.
26 CFR 1.382-8 - Controlled groups.
Code of Federal Regulations, 2010 CFR
2010-04-01
... corporation by the value of the stock of each component member of the controlled group that the loss... market value of the asset on that date. The principles of this paragraph (b)(2) also apply to items... section 382 with respect to each controlled group loss, the value of the stock of each component member of...
21 CFR 502.5 - General principles.
Code of Federal Regulations, 2011 CFR
2011-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... (or contains) __ percent (or %) __” or “__ percent (or %) __” with the first blank filled in with the... impression that such ingredient(s) or component(s) is present when it is not, and consumers may otherwise be...
21 CFR 502.5 - General principles.
Code of Federal Regulations, 2010 CFR
2010-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... (or contains) __ percent (or %) __” or “__ percent (or %) __” with the first blank filled in with the... impression that such ingredient(s) or component(s) is present when it is not, and consumers may otherwise be...
21 CFR 502.5 - General principles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... (or contains) __ percent (or %) __” or “__ percent (or %) __” with the first blank filled in with the... impression that such ingredient(s) or component(s) is present when it is not, and consumers may otherwise be...
21 CFR 502.5 - General principles.
Code of Federal Regulations, 2012 CFR
2012-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... (or contains) __ percent (or %) __” or “__ percent (or %) __” with the first blank filled in with the... impression that such ingredient(s) or component(s) is present when it is not, and consumers may otherwise be...
21 CFR 502.5 - General principles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... acceptance or when the labeling or the appearance of the food may otherwise create an erroneous impression... (or contains) __ percent (or %) __” or “__ percent (or %) __” with the first blank filled in with the... impression that such ingredient(s) or component(s) is present when it is not, and consumers may otherwise be...
Biomass relations for components of five Minnesota shrubs.
Richard R. Buech; David J. Rugg
1995-01-01
Presents equations for estimating biomass of six components on five species of shrubs common to northeastern Minnesota. Regression analysis is used to compare the performance of three estimators of biomass.
NASA Astrophysics Data System (ADS)
Qu, Yonghua; Jiao, Siong; Lin, Xudong
2008-10-01
Hetao Irrigation District located in Inner Mongolia, is one of the three largest irrigated area in China. In the irrigational agriculture region, for the reasons that many efforts have been put on irrigation rather than on drainage, as a result much sedimentary salt that usually is solved in water has been deposited in surface soil. So there has arisen a problem in such irrigation district that soil salinity has become a chief fact which causes land degrading. Remote sensing technology is an efficiency way to map the salinity in regional scale. In the principle of remote sensing, soil spectrum is one of the most important indications which can be used to reflect the status of soil salinity. In the past decades, many efforts have been made to reveal the spectrum characteristics of the salinized soil, such as the traditional statistic regression method. But it also has been found that when the hyper-spectral reflectance data are considered, the traditional regression method can't be treat the large dimension data, because the hyper-spectral data usually have too higher spectral band number. In this paper, a partial least squares regression (PLSR) model was established based on the statistical analysis on the soil salinity and the reflectance of hyper-spectral. Dataset were collect through the field soil samples were collected in the region of Hetao irrigation from the end of July to the beginning of August. The independent validation using data which are not included in the calibration model reveals that the proposed model can predicate the main soil components such as the content of total ions(S%), PH with higher determination coefficients(R2) of 0.728 and 0.715 respectively. And the rate of prediction to deviation(RPD) of the above predicted value are larger than 1.6, which indicates that the calibrated PLSR model can be used as a tool to retrieve soil salinity with accurate results. When the PLSR model's regression coefficients were aggregated according to the wavelength of visual (blue, green, red) and near infrared bands of LandSat Thematic Mapper(TM) sensor, some significant response values were observed, which indicates that the proposed method in this paper can be used to analysis the remotely sensed data from the space-boarded platform.
Kajbafnezhad, H; Ahadi, H; Heidarie, A; Askari, P; Enayati, M
2012-10-01
The aim of this study was to predict athletic success motivation by mental skills, emotional intelligence and its components. The research sample consisted of 153 male athletes who were selected through random multistage sampling. The subjects completed the Mental Skills Questionnaire, Bar-On Emotional Intelligence questionnaire and the perception of sport success questionnaire. Data were analyzed using Pearson correlation coefficient and multiple regressions. Regression analysis shows that between the two variables of mental skill and emotional intelligence, mental skill is the best predictor for athletic success motivation and has a better ability to predict the success rate of the participants. Regression analysis results showed that among all the components of emotional intelligence, self-respect had a significantly higher ability to predict athletic success motivation. The use of psychological skills and emotional intelligence as an mediating and regulating factor and organizer cause leads to improved performance and can not only can to help athletes in making suitable and effective decisions for reaching a desired goal.
NASA Astrophysics Data System (ADS)
Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.
2008-11-01
We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to represent a useful improvement for the non-stationary time series periodicity quantitative analysis. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is to obtain the set of sine functions embedded in the series analyzed in decreasing order of significance, from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need of a deeper knowledge of the Sun's past history and its implication to global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered as being significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
ERIC Educational Resources Information Center
Black, Aime
2012-01-01
Student achievement to reward or sanction schools. These unadjusted accountability indicators do not account for differences in student or school characteristics that contribute to variations in assessment results. Since the "Coleman Report" (1966), a guiding principle in accountability design has been that educational outcomes data…
1987-08-01
of economics (e.g., present/future value, costing, cost-benefit analysis, regression analysis, etc.) are scattered throughout the core. Unfortunately...in the subject, largely because most of the problems and examples used are not related to military operations research. Some fundamental -j principles
Least Square Regression Method for Estimating Gas Concentration in an Electronic Nose System
Khalaf, Walaa; Pace, Calogero; Gaudioso, Manlio
2009-01-01
We describe an Electronic Nose (ENose) system which is able to identify the type of analyte and to estimate its concentration. The system consists of seven sensors, five of them being gas sensors (supplied with different heater voltage values), the remainder being a temperature and a humidity sensor, respectively. To identify a new analyte sample and then to estimate its concentration, we use both some machine learning techniques and the least square regression principle. In fact, we apply two different training models; the first one is based on the Support Vector Machine (SVM) approach and is aimed at teaching the system how to discriminate among different gases, while the second one uses the least squares regression approach to predict the concentration of each type of analyte. PMID:22573980
Mechanisms of developmental neurite pruning.
Schuldiner, Oren; Yaron, Avraham
2015-01-01
The precise wiring of the nervous system is a combined outcome of progressive and regressive events during development. Axon guidance and synapse formation intertwined with cell death and neurite pruning sculpt the mature circuitry. It is now well recognized that pruning of dendrites and axons as means to refine neuronal networks, is a wide spread phenomena required for the normal development of vertebrate and invertebrate nervous systems. Here we will review the arising principles of cellular and molecular mechanisms of neurite pruning. We will discuss these principles in light of studies in multiple neuronal systems, and speculate on potential explanations for the emergence of neurite pruning as a mechanism to sculpt the nervous system.
Pure field theories and MACSYMA algorithms
NASA Technical Reports Server (NTRS)
Ament, W. S.
1977-01-01
A pure field theory attempts to describe physical phenomena through singularity-free solutions of field equations resulting from an action principle. The physics goes into forming the action principle and interpreting specific results. Algorithms for the intervening mathematical steps are sketched. Vacuum general relativity is a pure field theory, serving as model and providing checks for generalizations. The fields of general relativity are the 10 components of a symmetric Riemannian metric tensor; those of the Einstein-Straus generalization are the 16 components of a nonsymmetric. Algebraic properties are exploited in top level MACSYMA commands toward performing some of the algorithms of that generalization. The light cone for the theory as left by Einstein and Straus is found and simplifications of that theory are discussed.
Gillon, R
2003-10-01
It is hypothesised and argued that "the four principles of medical ethics" can explain and justify, alone or in combination, all the substantive and universalisable claims of medical ethics and probably of ethics more generally. A request is renewed for falsification of this hypothesis showing reason to reject any one of the principles or to require any additional principle(s) that can't be explained by one or some combination of the four principles. This approach is argued to be compatible with a wide variety of moral theories that are often themselves mutually incompatible. It affords a way forward in the context of intercultural ethics, that treads the delicate path between moral relativism and moral imperialism. Reasons are given for regarding the principle of respect for autonomy as "first among equals", not least because it is a necessary component of aspects of the other three. A plea is made for bioethicists to celebrate the approach as a basis for global moral ecumenism rather than mistakenly perceiving and denigrating it as an attempt at global moral imperialism.
Advances in modeling the pressure correlation terms in the second moment equations
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Shabbir, Aamir; Lumley, John L.
1991-01-01
In developing turbulence models, various model constraints were proposed in an attempt to make the model equations more general (or universal). The most recent of these are the realizability principle, the linearity principle, the rapid distortion theory, and the material indifference principle. Several issues are discussed concerning these principles and special attention is payed to the realizability principle. Realizability (defined as the requirement of non-negative energy and Schwarz' inequality between any fluctuating quantities) is the basic physical and mathematical principle that any modeled equation should obey. Hence, it is the most universal, important and also the minimal requirement for a model equation to prevent it from producing unphysical results. The principle of realizability is described in detail, the realizability conditions are derived for various turbulence models, and the model forms are proposed for the pressure correlation terms in the second moment equations. Detailed comparisons of various turbulence models with experiments and direct numerical simulations are presented. As a special case of turbulence, the two dimensional two-component turbulence modeling is also discussed.
2016-06-01
regulations are in accordance with UNCITRAL Model Law and are based on principles of “ accountability , transparency, fairness, efficiency and value for... account certain factors about the firm(s) for pre-qualification. These factors include past performance and experience; financial health; managerial...internal control components, along with associated principles , were discussed in detail to develop a suitable internal control system for the financial
The Principle of Mass in Relation to Transformation and the Contemporary Operational Environment
2003-06-06
specifically show how the legacy forces used mass. For the Korean War , the thesis will analyze the Battle for the Imjin River, April 1951. The Battle...critical components and has existed as a United States Army principle of war since 1921. The United States Army military is currently undergoing vast changes...30 3. Gloster’s Battle for the Imjin .................................................................... 37 4. Battle for Hue Areas of
PDS4 - Some Principles for Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.
2015-12-01
PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using less resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project level. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
Pisa, Pedro T.; Pedro, Titilola M.; Kahn, Kathleen; Tollman, Stephen M.; Pettifor, John M.; Norris, Shane A.
2015-01-01
The aim of this study was to identify and describe the diversity of nutrient patterns and how they associate with socio-demographic and lifestyle factors including body mass index in rural black South African adolescents. Nutrient patterns were identified from quantified food frequency questionnaires (QFFQ) in 388 rural South African adolescents between the ages of 11–15 years from the Agincourt Health and Socio-demographic Surveillance System (AHDSS). Principle Component Analysis (PCA) was applied to 25 nutrients derived from QFFQs. Multiple linear regression and partial R2 models were fitted and computed respectively for each of the retained principal component (PC) scores on socio-demographic and lifestyle characteristics including body mass index (BMI) for age Z scores. Four nutrient patterns explaining 79% of the total variance were identified: PCI (26%) was characterized by animal derived nutrients; PC2 (21%) by vitamins, fibre and vegetable oil nutrients; PC3 (19%) by both animal and plant derived nutrients (mixed diet driven nutrients); and PC4 (13%) by starch and folate. A positive and significant association was observed with BMI for age Z scores per 1 standard deviation (SD) increase in PC1 (0.13 (0.02; 0.24); p = 0.02) and PC4 (0.10 (−0.01; 0.21); p = 0.05) scores only. We confirmed variability in nutrient patterns that were significantly associated with various lifestyle factors including obesity. PMID:25984738
Single-Trial Event-Related Potential Correlates of Belief Updating
Murawski, Carsten; Bode, Stefan
2015-01-01
Abstract Belief updating—the process by which an agent alters an internal model of its environment—is a core function of the CNS. Recent theory has proposed broad principles by which belief updating might operate, but more precise details of its implementation in the human brain remain unclear. In order to address this question, we studied how two components of the human event-related potential encoded different aspects of belief updating. Participants completed a novel perceptual learning task while electroencephalography was recorded. Participants learned the mapping between the contrast of a dynamic visual stimulus and a monetary reward and updated their beliefs about a target contrast on each trial. A Bayesian computational model was formulated to estimate belief states at each trial and was used to quantify the following two variables: belief update size and belief uncertainty. Robust single-trial regression was used to assess how these model-derived variables were related to the amplitudes of the P3 and the stimulus-preceding negativity (SPN), respectively. Results showed a positive relationship between belief update size and P3 amplitude at one fronto-central electrode, and a negative relationship between SPN amplitude and belief uncertainty at a left central and a right parietal electrode. These results provide evidence that belief update size and belief uncertainty have distinct neural signatures that can be tracked in single trials in specific ERP components. This, in turn, provides evidence that the cognitive mechanisms underlying belief updating in humans can be described well within a Bayesian framework. PMID:26473170
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Y; Shirato, H; Song, J
2015-06-15
Purpose: This study aims to identify novel prognostic imaging biomarkers in locally advanced pancreatic cancer (LAPC) using quantitative, high-throughput image analysis. Methods: 86 patients with LAPC receiving chemotherapy followed by SBRT were retrospectively studied. All patients had a baseline FDG-PET scan prior to SBRT. For each patient, we extracted 435 PET imaging features of five types: statistical, morphological, textural, histogram, and wavelet. These features went through redundancy checks, robustness analysis, as well as a prescreening process based on their concordance indices with respect to the relevant outcomes. We then performed principle component analysis on the remaining features (number ranged frommore » 10 to 16), and fitted a Cox proportional hazard regression model using the first 3 principle components. Kaplan-Meier analysis was used to assess the ability to distinguish high versus low-risk patients separated by median predicted survival. To avoid overfitting, all evaluations were based on leave-one-out cross validation (LOOCV), in which each holdout patient was assigned to a risk group according to the model obtained from a separate training set. Results: For predicting overall survival (OS), the most dominant imaging features were wavelet coefficients. There was a statistically significant difference in OS between patients with predicted high and low-risk based on LOOCV (hazard ratio: 2.26, p<0.001). Similar imaging features were also strongly associated with local progression-free survival (LPFS) (hazard ratio: 1.53, p=0.026) on LOOCV. In comparison, neither SUVmax nor TLG was associated with LPFS (p=0.103, p=0.433) (Table 1). Results for progression-free survival and distant progression-free survival showed similar trends. Conclusion: Radiomic analysis identified novel imaging features that showed improved prognostic value over conventional methods. These features characterize the degree of intra-tumor heterogeneity reflected on FDG-PET images, and their biological underpinnings warrant further investigation. If validated in large, prospective cohorts, this method could be used to stratify patients based on individualized risk.« less
Boosted Regression Tree Models to Explain Watershed Nutrient Concentrations and Biological Condition
Boosted regression tree (BRT) models were developed to quantify the nonlinear relationships between landscape variables and nutrient concentrations in a mesoscale mixed land cover watershed during base-flow conditions. Factors that affect instream biological components, based on ...
Predicting phase equilibria in one-component systems
NASA Astrophysics Data System (ADS)
Korchuganova, M. R.; Esina, Z. N.
2015-07-01
It is shown that Simon equation coefficients for n-alkanes and n-alcohols can be modeled using critical and triple point parameters. Predictions of the phase liquid-vapor, solid-vapor, and liquid-solid equilibria in one-component systems are based on the Clausius-Clapeyron relation, Van der Waals and Simon equations, and the principle of thermodynamic similarity.
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2011 CFR
2011-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2013 CFR
2013-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2012 CFR
2012-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2014 CFR
2014-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
Jeunet, Camille; Albert, Louis; Argelaguet, Ferran; Lecuyer, Anatole
2018-04-01
While the Sense of Agency (SoA) has so far been predominantly characterised in VR as a component of the Sense of Embodiment, other communities (e.g., in psychology or neurosciences) have investigated the SoA from a different perspective proposing complementary theories. Yet, despite the acknowledged potential benefits of catching up with these theories a gap remains. This paper first aims to contribute to fill this gap by introducing a theory according to which the SoA can be divided into two components, the feeling and the judgment of agency, and relies on three principles, namely the principles of priority, exclusivity and consistency. We argue that this theory could provide insights on the factors influencing the SoA in VR systems. Second, we propose novel approaches to manipulate the SoA in controlled VR experiments (based on these three principles) as well as to measure the SoA, and more specifically its two components based on neurophysiological markers, using ElectroEncephaloGraphy (EEG). We claim that these approaches would enable us to deepen our understanding of the SoA in VR contexts. Finally, we validate these approaches in an experiment. Our results (N=24) suggest that our approach was successful in manipulating the SoA as the modulation of each of the three principles induced significant decreases of the SoA (measured using questionnaires). In addition, we recorded participants' EEG signals during the VR experiment, and neurophysiological markers of the SoA, potentially reflecting the feeling and judgment of agency specifically, were revealed. Our results also suggest that users' profile, more precisely their Locus of Control (LoC), influences their level of immersion and SoA.
Prediction of human core body temperature using non-invasive measurement methods.
Niedermann, Reto; Wyss, Eva; Annaheim, Simon; Psikuta, Agnes; Davey, Sarah; Rossi, René Michel
2014-01-01
The measurement of core body temperature is an efficient method for monitoring heat stress amongst workers in hot conditions. However, invasive measurement of core body temperature (e.g. rectal, intestinal, oesophageal temperature) is impractical for such applications. Therefore, the aim of this study was to define relevant non-invasive measures to predict core body temperature under various conditions. We conducted two human subject studies with different experimental protocols, different environmental temperatures (10 °C, 30 °C) and different subjects. In both studies the same non-invasive measurement methods (skin temperature, skin heat flux, heart rate) were applied. A principle component analysis was conducted to extract independent factors, which were then used in a linear regression model. We identified six parameters (three skin temperatures, two skin heat fluxes and heart rate), which were included for the calculation of two factors. The predictive value of these factors for core body temperature was evaluated by a multiple regression analysis. The calculated root mean square deviation (rmsd) was in the range from 0.28 °C to 0.34 °C for all environmental conditions. These errors are similar to previous models using non-invasive measures to predict core body temperature. The results from this study illustrate that multiple physiological parameters (e.g. skin temperature and skin heat fluxes) are needed to predict core body temperature. In addition, the physiological measurements chosen in this study and the algorithm defined in this work are potentially applicable as real-time core body temperature monitoring to assess health risk in broad range of working conditions.
Folded concave penalized learning in identifying multimodal MRI marker for Parkinson’s disease
Liu, Hongcheng; Du, Guangwei; Zhang, Lijun; Lewis, Mechelle M.; Wang, Xue; Yao, Tao; Li, Runze; Huang, Xuemei
2016-01-01
Background Brain MRI holds promise to gauge different aspects of Parkinson’s disease (PD)-related pathological changes. Its analysis, however, is hindered by the high-dimensional nature of the data. New method This study introduces folded concave penalized (FCP) sparse logistic regression to identify biomarkers for PD from a large number of potential factors. The proposed statistical procedures target the challenges of high-dimensionality with limited data samples acquired. The maximization problem associated with the sparse logistic regression model is solved by local linear approximation. The proposed procedures then are applied to the empirical analysis of multimodal MRI data. Results From 45 features, the proposed approach identified 15 MRI markers and the UPSIT, which are known to be clinically relevant to PD. By combining the MRI and clinical markers, we can enhance substantially the specificity and sensitivity of the model, as indicated by the ROC curves. Comparison to existing methods We compare the folded concave penalized learning scheme with both the Lasso penalized scheme and the principle component analysis-based feature selection (PCA) in the Parkinson’s biomarker identification problem that takes into account both the clinical features and MRI markers. The folded concave penalty method demonstrates a substantially better clinical potential than both the Lasso and PCA in terms of specificity and sensitivity. Conclusions For the first time, we applied the FCP learning method to MRI biomarker discovery in PD. The proposed approach successfully identified MRI markers that are clinically relevant. Combining these biomarkers with clinical features can substantially enhance performance. PMID:27102045
Prediction of dynamical systems by symbolic regression
NASA Astrophysics Data System (ADS)
Quade, Markus; Abel, Markus; Shafi, Kamran; Niven, Robert K.; Noack, Bernd R.
2016-07-01
We study the modeling and prediction of dynamical systems based on conventional models derived from measurements. Such algorithms are highly desirable in situations where the underlying dynamics are hard to model from physical principles or simplified models need to be found. We focus on symbolic regression methods as a part of machine learning. These algorithms are capable of learning an analytically tractable model from data, a highly valuable property. Symbolic regression methods can be considered as generalized regression methods. We investigate two particular algorithms, the so-called fast function extraction which is a generalized linear regression algorithm, and genetic programming which is a very general method. Both are able to combine functions in a certain way such that a good model for the prediction of the temporal evolution of a dynamical system can be identified. We illustrate the algorithms by finding a prediction for the evolution of a harmonic oscillator based on measurements, by detecting an arriving front in an excitable system, and as a real-world application, the prediction of solar power production based on energy production observations at a given site together with the weather forecast.
Xu, Yun; Muhamadali, Howbeer; Sayqal, Ali; Dixon, Neil; Goodacre, Royston
2016-10-28
Partial least squares (PLS) is one of the most commonly used supervised modelling approaches for analysing multivariate metabolomics data. PLS is typically employed as either a regression model (PLS-R) or a classification model (PLS-DA). However, in metabolomics studies it is common to investigate multiple, potentially interacting, factors simultaneously following a specific experimental design. Such data often cannot be considered as a "pure" regression or a classification problem. Nevertheless, these data have often still been treated as a regression or classification problem and this could lead to ambiguous results. In this study, we investigated the feasibility of designing a hybrid target matrix Y that better reflects the experimental design than simple regression or binary class membership coding commonly used in PLS modelling. The new design of Y coding was based on the same principle used by structural modelling in machine learning techniques. Two real metabolomics datasets were used as examples to illustrate how the new Y coding can improve the interpretability of the PLS model compared to classic regression/classification coding.
Validity and validation of expert (Q)SAR systems.
Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L
2005-08-01
At a recent workshop in Setubal (Portugal) principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments the number of chemicals in the training set is >4. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in predictivity of > or = 64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction only a limited number of chemicals in the training set is presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.
Flying the ST-5 Constellation with "Plug and Play" Autonomy Components and the GMSEC Bus
NASA Technical Reports Server (NTRS)
Shendock, Bob; Witt, Ken; Stanley, Jason; Mandl, Dan; Coyle, Steve
2006-01-01
The Space Technology 5 (ST5) Project, part of NASA's New Millennium Program, will consist of a constellation of three micro-satellites. This viewgraph document presents the components that will allow it to operate in an autonomous mode. The ST-5 constellation will use the GSFC Mission Services Evolution Center (GMSEC) architecture to enable cost effective model based operations. The ST-5 mission will demonstrate several principles of self managing software components.
Kwok, Sylvia Lai Yuk Ching; Shek, Daniel Tan Lei
2010-03-05
Utilizing Daniel Goleman's theory of emotional competence, Beck's cognitive theory, and Rudd's cognitive-behavioral theory of suicidality, the relationships between hopelessness (cognitive component), social problem solving (cognitive-behavioral component), emotional competence (emotive component), and adolescent suicidal ideation were examined. Based on the responses of 5,557 Secondary 1 to Secondary 4 students from 42 secondary schools in Hong Kong, results showed that suicidal ideation was positively related to adolescent hopelessness, but negatively related to emotional competence and social problem solving. While standard regression analyses showed that all the above variables were significant predictors of suicidal ideation, hierarchical regression analyses showed that hopelessness was the most important predictor of suicidal ideation, followed by social problem solving and emotional competence. Further regression analyses found that all four subscales of emotional competence, i.e., empathy, social skills, self-management of emotions, and utilization of emotions, were important predictors of male adolescent suicidal ideation. However, the subscale of social skills was not a significant predictor of female adolescent suicidal ideation. Standard regression analysis also revealed that all three subscales of social problem solving, i.e., negative problem orientation, rational problem solving, and impulsiveness/carelessness style, were important predictors of suicidal ideation. Theoretical and practice implications of the findings are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, Saleem Ayaz, E-mail: sayaz_usb@yahoo.com; Azam, Sikander
The electronic band structure, valence electron charge density and optical susceptibilities of tetrabarium gallium trinitride (TGT) were calculated via first principle study. The electronic band structure calculation describes TGT as semiconductor having direct band gap of 1.38 eV. The valence electronic charge density contour verified the non-polar covalent nature of the bond. The absorption edge and first peak of dielectric tensor components showed electrons transition from N-p state to Ba-d state. The calculated uniaxial anisotropy (0.4842) and birefringence (−0.0061) of present paper is prearranged as follow the spectral components of the dielectric tensor. The first peak in energy loss functionmore » (ELOS) shows the energy loss of fast traveling electrons in the material. The first sharp peak produced in ELOS around 10.5 eV show plasmon loss having plasma frequencies 0.1536, 0.004 and 0.066 of dielectric tensor components. This plasmon loss also cause decrease in reflectivity spectra.« less
2007-11-30
The Food and Drug Administration (FDA) is reclassifying from class III to class II the automated blood cell separator device operating by centrifugal separation principle and intended for the routine collection of blood and blood components. FDA is taking this action on its own initiative based on new information. Elsewhere in this issue of the Federal Register, FDA is announcing the availability of a guidance document that will serve as the special controls for this device, as well as the special controls for the device with the same intended use but operating on a filtration separation principle.
Life on the arc: principle-centered comprehensive care.
Fohey, T; Cassidy, J L
1998-01-01
Today's dental practice is experiencing an evolution in the manner through which new materials and techniques are marketed and introduced. An increasing concern among the patient population regarding aesthetics contributes to the acceptance of a commodity dental philosophy, without questioning the reliability of the technique or new material. A principle-centered practice differentiates the product marketing from the viability of a restorative material in vivo. This article discusses the concept of a principle-centered practice and describes how to place quality products in a balanced system in which harmony exits between all components of the masticatory system: the teeth, the muscles, and the temporomandibular joints.
Peleato, Nicolás M; Andrews, Robert C
2015-01-01
This work investigated the application of several fluorescence excitation-emission matrix analysis methods as natural organic matter (NOM) indicators for use in predicting the formation of trihalomethanes (THMs) and haloacetic acids (HAAs). Waters from four different sources (two rivers and two lakes) were subjected to jar testing followed by 24hr disinfection by-product formation tests using chlorine. NOM was quantified using three common measures: dissolved organic carbon, ultraviolet absorbance at 254 nm, and specific ultraviolet absorbance as well as by principal component analysis, peak picking, and parallel factor analysis of fluorescence spectra. Based on multi-linear modeling of THMs and HAAs, principle component (PC) scores resulted in the lowest mean squared prediction error of cross-folded test sets (THMs: 43.7 (μg/L)(2), HAAs: 233.3 (μg/L)(2)). Inclusion of principle components representative of protein-like material significantly decreased prediction error for both THMs and HAAs. Parallel factor analysis did not identify a protein-like component and resulted in prediction errors similar to traditional NOM surrogates as well as fluorescence peak picking. These results support the value of fluorescence excitation-emission matrix-principal component analysis as a suitable NOM indicator in predicting the formation of THMs and HAAs for the water sources studied. Copyright © 2014. Published by Elsevier B.V.
Mohd Yusof, Mohd Yusmiaidil Putera; Cauwels, Rita; Deschepper, Ellen; Martens, Luc
2015-08-01
The third molar development (TMD) has been widely utilized as one of the radiographic method for dental age estimation. By using the same radiograph of the same individual, third molar eruption (TME) information can be incorporated to the TMD regression model. This study aims to evaluate the performance of dental age estimation in individual method models and the combined model (TMD and TME) based on the classic regressions of multiple linear and principal component analysis. A sample of 705 digital panoramic radiographs of Malay sub-adults aged between 14.1 and 23.8 years was collected. The techniques described by Gleiser and Hunt (modified by Kohler) and Olze were employed to stage the TMD and TME, respectively. The data was divided to develop three respective models based on the two regressions of multiple linear and principal component analysis. The trained models were then validated on the test sample and the accuracy of age prediction was compared between each model. The coefficient of determination (R²) and root mean square error (RMSE) were calculated. In both genders, adjusted R² yielded an increment in the linear regressions of combined model as compared to the individual models. The overall decrease in RMSE was detected in combined model as compared to TMD (0.03-0.06) and TME (0.2-0.8). In principal component regression, low value of adjusted R(2) and high RMSE except in male were exhibited in combined model. Dental age estimation is better predicted using combined model in multiple linear regression models. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Shipboard Elevator Magnetic Sensor Development. Phase I, Laboratory Investigations.
1981-08-19
greater detail. The principles studied were those of the flux-meter and the flux-gate magnetometer . Of these two, the flux-gate magnetometer principle was...Abstract (Continued) Flux-gate magnetometers continuously sense the component of a stationary or slowly varying magnetic field along a chosen axis. The...distance of the sensor from the target’s line of travel, while precisely indicating displacements along the line. The modes of detection include level
Military Dissent: What are the Ethical Implications of Tensions in U.S. Civil-Military Relations?
2013-06-14
they have social, political, economic, and moral components or dimensions that drive their design, formulation, and implementation. 2 According to...principles held by the Army profession and embedded in its culture (CAPE 2012). Army professional: A member of the Army profession who meets the Army’s...government. Ethics: A form of philosophy that deals with principles and concepts that guide right and wrong behavior (Mattox 2012). Ethos: The
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1992-01-01
This presentation is designed to relate some of the experiences of the Scientific Computing Division at NCAR dealing with the 'data problem'. A brief history and a development of some basic Mass Storage System (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. There is discussion of future MSS needs for future computing environments.
Logistic regression for risk factor modelling in stuttering research.
Reed, Phil; Wu, Yaqionq
2013-06-01
To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed are demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
Improving Accreditor's Evaluation of Experiential Learning Programs.
ERIC Educational Resources Information Center
Keeton, Morris T.
1980-01-01
Principles of good practice in assessing experiential learning include better self-evaluation of the learning outcomes of experiential components, systematic program auditing, and training of external evaluators. (SK)
Alvarez de Lorenzana, J M; Ward, L M
1987-01-01
This paper develops a metatheoretical framework for understanding evolutionary systems (systems that develop in ways that increase their own variety). The framework addresses shortcomings seen in other popular systems theories. It concerns both living and nonliving systems, and proposes a metahierarchy of hierarchical systems. Thus, it potentially addresses systems at all descriptive levels. We restrict our definition of system to that of a core system whose parts have a different ontological status than the system, and characterize the core system in terms of five global properties: minimal length interval, minimal time interval, system cycle, total receptive capacity, and system potential. We propose two principles through the interaction of which evolutionary systems develop. The Principle of Combinatorial Expansion describes how a core system realizes its developmental potential through a process of progressive differentiation of the single primal state up to a limit stage. The Principle of Generative Condensation describes how the components of the last stage of combinatorial expansion condense and become the environment for and components of new, enriched systems. The early evolution of the Universe after the "big bang" is discussed in light of these ideas as an example of the application of the framework.
Xian, George Z.; Homer, Collin G.; Rigge, Matthew B.; Shi, Hua; Meyer, Debbie
2015-01-01
Accurate and consistent estimates of shrubland ecosystem components are crucial to a better understanding of ecosystem conditions in arid and semiarid lands. An innovative approach was developed by integrating multiple sources of information to quantify shrubland components as continuous field products within the National Land Cover Database (NLCD). The approach consists of several procedures including field sample collections, high-resolution mapping of shrubland components using WorldView-2 imagery and regression tree models, Landsat 8 radiometric balancing and phenological mosaicking, medium resolution estimates of shrubland components following different climate zones using Landsat 8 phenological mosaics and regression tree models, and product validation. Fractional covers of nine shrubland components were estimated: annual herbaceous, bare ground, big sagebrush, herbaceous, litter, sagebrush, shrub, sagebrush height, and shrub height. Our study area included the footprint of six Landsat 8 scenes in the northwestern United States. Results show that most components have relatively significant correlations with validation data, have small normalized root mean square errors, and correspond well with expected ecological gradients. While some uncertainties remain with height estimates, the model formulated in this study provides a cross-validated, unbiased, and cost effective approach to quantify shrubland components at a regional scale and advances knowledge of horizontal and vertical variability of these components.
Least Principal Components Analysis (LPCA): An Alternative to Regression Analysis.
ERIC Educational Resources Information Center
Olson, Jeffery E.
Often, all of the variables in a model are latent, random, or subject to measurement error, or there is not an obvious dependent variable. When any of these conditions exist, an appropriate method for estimating the linear relationships among the variables is Least Principal Components Analysis. Least Principal Components are robust, consistent,…
Assessment of Weighted Quantile Sum Regression for Modeling Chemical Mixtures and Cancer Risk
Czarnota, Jenna; Gennings, Chris; Wheeler, David C
2015-01-01
In evaluation of cancer risk related to environmental chemical exposures, the effect of many chemicals on disease is ultimately of interest. However, because of potentially strong correlations among chemicals that occur together, traditional regression methods suffer from collinearity effects, including regression coefficient sign reversal and variance inflation. In addition, penalized regression methods designed to remediate collinearity may have limitations in selecting the truly bad actors among many correlated components. The recently proposed method of weighted quantile sum (WQS) regression attempts to overcome these problems by estimating a body burden index, which identifies important chemicals in a mixture of correlated environmental chemicals. Our focus was on assessing through simulation studies the accuracy of WQS regression in detecting subsets of chemicals associated with health outcomes (binary and continuous) in site-specific analyses and in non-site-specific analyses. We also evaluated the performance of the penalized regression methods of lasso, adaptive lasso, and elastic net in correctly classifying chemicals as bad actors or unrelated to the outcome. We based the simulation study on data from the National Cancer Institute Surveillance Epidemiology and End Results Program (NCI-SEER) case–control study of non-Hodgkin lymphoma (NHL) to achieve realistic exposure situations. Our results showed that WQS regression had good sensitivity and specificity across a variety of conditions considered in this study. The shrinkage methods had a tendency to incorrectly identify a large number of components, especially in the case of strong association with the outcome. PMID:26005323
Assessment of weighted quantile sum regression for modeling chemical mixtures and cancer risk.
Czarnota, Jenna; Gennings, Chris; Wheeler, David C
2015-01-01
In evaluation of cancer risk related to environmental chemical exposures, the effect of many chemicals on disease is ultimately of interest. However, because of potentially strong correlations among chemicals that occur together, traditional regression methods suffer from collinearity effects, including regression coefficient sign reversal and variance inflation. In addition, penalized regression methods designed to remediate collinearity may have limitations in selecting the truly bad actors among many correlated components. The recently proposed method of weighted quantile sum (WQS) regression attempts to overcome these problems by estimating a body burden index, which identifies important chemicals in a mixture of correlated environmental chemicals. Our focus was on assessing through simulation studies the accuracy of WQS regression in detecting subsets of chemicals associated with health outcomes (binary and continuous) in site-specific analyses and in non-site-specific analyses. We also evaluated the performance of the penalized regression methods of lasso, adaptive lasso, and elastic net in correctly classifying chemicals as bad actors or unrelated to the outcome. We based the simulation study on data from the National Cancer Institute Surveillance Epidemiology and End Results Program (NCI-SEER) case-control study of non-Hodgkin lymphoma (NHL) to achieve realistic exposure situations. Our results showed that WQS regression had good sensitivity and specificity across a variety of conditions considered in this study. The shrinkage methods had a tendency to incorrectly identify a large number of components, especially in the case of strong association with the outcome.
Azevedo, C F; Nascimento, M; Silva, F F; Resende, M D V; Lopes, P S; Guimarães, S E F; Glória, L S
2015-10-09
A significant contribution of molecular genetics is the direct use of DNA information to identify genetically superior individuals. With this approach, genome-wide selection (GWS) can be used for this purpose. GWS consists of analyzing a large number of single nucleotide polymorphism markers widely distributed in the genome; however, because the number of markers is much larger than the number of genotyped individuals, and such markers are highly correlated, special statistical methods are widely required. Among these methods, independent component regression, principal component regression, partial least squares, and partial principal components stand out. Thus, the aim of this study was to propose an application of the methods of dimensionality reduction to GWS of carcass traits in an F2 (Piau x commercial line) pig population. The results show similarities between the principal and the independent component methods and provided the most accurate genomic breeding estimates for most carcass traits in pigs.
Using Generalized Additive Models to Analyze Single-Case Designs
ERIC Educational Resources Information Center
Shadish, William; Sullivan, Kristynn
2013-01-01
Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators-- currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…
Trends in Cost-Sharing in the US and Potential International Implications
ERIC Educational Resources Information Center
Taylor, Barrett J.; Morphew, Christopher C.
2015-01-01
"Cost-sharing" refers to the principle that a variety of sources contribute to the cost of higher education. This study utilizes university-level data from the United States to explore the increasing shift of cost burdens from governments to students. Panel regression results suggest that the share of expenditures drawn from tuition…
Modeling Governance KB with CATPCA to Overcome Multicollinearity in the Logistic Regression
NASA Astrophysics Data System (ADS)
Khikmah, L.; Wijayanto, H.; Syafitri, U. D.
2017-04-01
The problem often encounters in logistic regression modeling are multicollinearity problems. Data that have multicollinearity between explanatory variables with the result in the estimation of parameters to be bias. Besides, the multicollinearity will result in error in the classification. In general, to overcome multicollinearity in regression used stepwise regression. They are also another method to overcome multicollinearity which involves all variable for prediction. That is Principal Component Analysis (PCA). However, classical PCA in only for numeric data. Its data are categorical, one method to solve the problems is Categorical Principal Component Analysis (CATPCA). Data were used in this research were a part of data Demographic and Population Survey Indonesia (IDHS) 2012. This research focuses on the characteristic of women of using the contraceptive methods. Classification results evaluated using Area Under Curve (AUC) values. The higher the AUC value, the better. Based on AUC values, the classification of the contraceptive method using stepwise method (58.66%) is better than the logistic regression model (57.39%) and CATPCA (57.39%). Evaluation of the results of logistic regression using sensitivity, shows the opposite where CATPCA method (99.79%) is better than logistic regression method (92.43%) and stepwise (92.05%). Therefore in this study focuses on major class classification (using a contraceptive method), then the selected model is CATPCA because it can raise the level of the major class model accuracy.
NASA Astrophysics Data System (ADS)
Radhakrishnan, Aparna; Gupta, Jancy; Ravindran, Dileepkumar
2017-04-01
The study aims at assessing the vulnerability and tradeoffs of dairy based livelihoods to Climate Variability and Change (CVC) in the Western Ghats ecosystem, India. For this purpose; data were aggregated to an overall Livelihood Vulnerability Index (LVI) to CVC underlying the principles of IPCC, using 40 indicators under 7 LVI components. Fussel framework was used for the nomenclature of vulnerable situation and trade-off between vulnerability components and milk production was calculated. Data were collected through participatory rural appraisal and personal interviews from 360 randomly selected dairy farmers of nine blocks from three states of Western Ghat region, complemented by thirty years of gridded weather data and livestock data. The LVI score of dairy based livelihoods of six taluks were negative. The data were normalized and then combined into three indices of sensitivity, exposure and adaptive capacity, which were then averaged with weights given using principal component analysis, to obtain the overall vulnerability index. Mann Whitney U test was used to find the significant difference between the taluks in terms of LVI and cumulative square root frequency method was used to categorise the farmers. Even though the taluks are geographically closer, there is significant difference in the LVI values of the regions. Results indicated that the Lanja taluks of Maharashtra is the most vulnerable having an overall LVI value -4.17 with 48% farmers falling in highly vulnerable category. Panel regression analysis reveals that there is significant synergy between average milk production and livestock, social network component and trade-off between natural disasters climate variability component of LVI. Policies for incentivizing the 'climate risk adaptation' costs for small and marginal farmers and livelihood infrastructure for mitigating risks and promoting grass root level innovations are necessary to sustain dairy farming of the region. Thus the research will provide an important basis for policy makers to develop appropriate adaptation strategies for alarming situation and decision making for farmers to minimize the risk of dairy sector to climate variability.
NASA Astrophysics Data System (ADS)
Stück, H. L.; Siegesmund, S.
2012-04-01
Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany and ones described worldwide were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contact respectively contact thickness, type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and calculating the linear regression of the basic petrophysical properties of density, porosity, water uptake as well as the strength. The sandstones were classified into three different pore size distributions and evaluated with the other petrophysical properties. Weathering behavior like hygric swelling and salt loading tests were also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principle component analysis, cluster analysis and factor analysis were applied. Our results show that composition and porosity evolution during diagenesis is a very important control on the petrophysical properties of a building stone. The relationship between intergranular volume, cementation and grain contact, can also provide valuable information to predict the strength properties. Since the samples investigated mainly originate from the Triassic German epicontinental basin, arkoses and feldspar-arenites are underrepresented. In general, the sandstones can be grouped as follows: i) quartzites, highly mature with a primary porosity of about 40%, ii) quartzites, highly mature, showing a primary porosity of 40% but with early clay infiltration, iii) sublitharenites-lithic arenites exhibiting a lower primary porosity, higher cementation with quartz and Fe-oxides ferritic and iv) sublitharenites-lithic arenites with a higher content of pseudomatrix. However, in the last two groups the feldspar and lithoclasts can also show considerable alteration. All sandstone groups differ with respect to the pore space and strength data, as well as water uptake properties, which were obtained by linear regression analysis. Similar petrophysical properties are discernible for each type when using principle component analysis. Furthermore, strength as well as the porosity of sandstones shows distinct differences considering their stratigraphic ages and the compositions. The relationship between porosity, strength as well as salt resistance could also be verified. Hygric swelling shows an interrelation to pore size type, porosity and strength but also to the degree of alteration (e.g. lithoclasts, pseudomatrix). To summarize, the different regression analyses and the calculated confidence regions provide a significant tool to classify the petrographical and petrophysical parameters of sandstones. Based on this, the durability and the weathering behavior of the sandstone groups can be constrained. Keywords: sandstones, petrographical & petrophysical properties, predictive approach, statistical investigation
An Excel Solver Exercise to Introduce Nonlinear Regression
ERIC Educational Resources Information Center
Pinder, Jonathan P.
2013-01-01
Business students taking business analytics courses that have significant predictive modeling components, such as marketing research, data mining, forecasting, and advanced financial modeling, are introduced to nonlinear regression using application software that is a "black box" to the students. Thus, although correct models are…
Oil and gas pipeline construction cost analysis and developing regression models for cost estimation
NASA Astrophysics Data System (ADS)
Thaduri, Ravi Kiran
In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, Labor, ROW and miscellaneous costs make up the total cost of a pipeline construction. The pipelines are analyzed based on different pipeline lengths, diameter, location, pipeline volume and year of completion. In a pipeline construction, labor costs dominate the total costs with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths and locations. The Compressor stations are analyzed based on the capacity, year of completion and location. Unlike the pipeline costs, material costs dominate the total costs in the construction of compressor station, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor station for various capacities and locations.
Shi, J Q; Wang, B; Will, E J; West, R M
2012-11-20
We propose a new semiparametric model for functional regression analysis, combining a parametric mixed-effects model with a nonparametric Gaussian process regression model, namely a mixed-effects Gaussian process functional regression model. The parametric component can provide explanatory information between the response and the covariates, whereas the nonparametric component can add nonlinearity. We can model the mean and covariance structures simultaneously, combining the information borrowed from other subjects with the information collected from each individual subject. We apply the model to dose-response curves that describe changes in the responses of subjects for differing levels of the dose of a drug or agent and have a wide application in many areas. We illustrate the method for the management of renal anaemia. An individual dose-response curve is improved when more information is included by this mechanism from the subject/patient over time, enabling a patient-specific treatment regime. Copyright © 2012 John Wiley & Sons, Ltd.
Shrinkage regression-based methods for microarray missing value imputation.
Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng
2013-01-01
Missing values commonly occur in the microarray data, which usually contain more than 5% missing values with up to 90% of genes affected. Inaccurate missing value estimation results in reducing the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods in many testing microarray datasets. To further improve the performances of the regression-based methods, we propose shrinkage regression-based methods. Our methods take the advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. Besides, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods can provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
GMP in blood collection and processing.
Wagstaff, W
1998-01-01
The principles of Good Manufacturing Practice have, in the main, been universally developed for the guidance of the pharmaceutical industry rather than for transfusion services. However, these rules and guides are increasingly being adapted for use in blood centres, in the production of labile blood components and of plasma for fractionation. The guide for pharmaceutical industries produced by the commission of the European Communities is used as a model here, the nine basic requirements being those applicable to Quality Management, personnel, premises and equipment, document, production, Quality Control, contract manufacture and analysis, complaints and product recall, and self-inspection. Though having more direct application to the production laboratory preparing blood components, the majority of these requirements and principles are also directly applicable to all of the activities involved in blood collection.
Graphical methods for the sensitivity analysis in discriminant analysis
Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang
2015-09-30
Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow similar principles as the diagnostic measures used in linear regression in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretative compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern ofmore » the change.« less
Applications of Support Vector Machines In Chemo And Bioinformatics
NASA Astrophysics Data System (ADS)
Jayaraman, V. K.; Sundararajan, V.
2010-10-01
Conventional linear & nonlinear tools for classification, regression & data driven modeling are being replaced on a rapid scale by newer techniques & tools based on artificial intelligence and machine learning. While the linear techniques are not applicable for inherently nonlinear problems, newer methods serve as attractive alternatives for solving real life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward network based classification algorithms that have been formulated from statistical learning theory and structural risk minimization principle. SVM regression closely follows the classification methodology. In this work recent applications of SVM in Chemo & Bioinformatics will be described with suitable illustrative examples.
NASA Astrophysics Data System (ADS)
Sumantari, Y. D.; Slamet, I.; Sugiyanto
2017-06-01
Semiparametric regression is a statistical analysis method that consists of parametric and nonparametric regression. There are various approach techniques in nonparametric regression. One of the approach techniques is spline. Central Java is one of the most densely populated province in Indonesia. Population density in this province can be modeled by semiparametric regression because it consists of parametric and nonparametric component. Therefore, the purpose of this paper is to determine the factors that in uence population density in Central Java using the semiparametric spline regression model. The result shows that the factors which in uence population density in Central Java is Family Planning (FP) active participants and district minimum wage.
Freyre-González, Julio A; Treviño-Quintanilla, Luis G; Valtierra-Gutiérrez, Ilse A; Gutiérrez-Ríos, Rosa María; Alonso-Pavón, José A
2012-10-31
Escherichia coli and Bacillus subtilis are two of the best-studied prokaryotic model organisms. Previous analyses of their transcriptional regulatory networks have shown that they exhibit high plasticity during evolution and suggested that both converge to scale-free-like structures. Nevertheless, beyond this suggestion, no analyses have been carried out to identify the common systems-level components and principles governing these organisms. Here we show that these two phylogenetically distant organisms follow a set of common novel biologically consistent systems principles revealed by the mathematically and biologically founded natural decomposition approach. The discovered common functional architecture is a diamond-shaped, matryoshka-like, three-layer (coordination, processing, and integration) hierarchy exhibiting feedback, which is shaped by four systems-level components: global transcription factors (global TFs), locally autonomous modules, basal machinery and intermodular genes. The first mathematical criterion to identify global TFs, the κ-value, was reassessed on B. subtilis and confirmed its high predictive power by identifying all the previously reported, plus three potential, master regulators and eight sigma factors. The functionally conserved cores of modules, basal cell machinery, and a set of non-orthologous common physiological global responses were identified via both orthologous genes and non-orthologous conserved functions. This study reveals novel common systems principles maintained between two phylogenetically distant organisms and provides a comparison of their lifestyle adaptations. Our results shed new light on the systems-level principles and the fundamental functions required by bacteria to sustain life. Copyright © 2012 Elsevier B.V. All rights reserved.
Analysis on Sealing Reliability of Bolted Joint Ball Head Component of Satellite Propulsion System
NASA Astrophysics Data System (ADS)
Guo, Tao; Fan, Yougao; Gao, Feng; Gu, Shixin; Wang, Wei
2018-01-01
Propulsion system is one of the important subsystems of satellite, and its performance directly affects the service life, attitude control and reliability of the satellite. The Paper analyzes the sealing principle of bolted joint ball head component of satellite propulsion system and discuss from the compatibility of hydrazine anhydrous and bolted joint ball head component, influence of ground environment on the sealing performance of bolted joint ball heads, and material failure caused by environment, showing that the sealing reliability of bolted joint ball head component is good and the influence of above three aspects on sealing of bolted joint ball head component can be ignored.
Instruction manual model 600F, data transmission test set
NASA Technical Reports Server (NTRS)
1972-01-01
Information necessary for the operation and maintenance of the Model 600F Data Transmission Test Set is presented. A description is contained of the physical and functional characteristics; pertinent installation data; instructions for operating the equipment; general and detailed principles of operation; preventive and corrective maintenance procedures; and block, logic, and component layout diagrams of the equipment and its major component assemblies.
Hu, Shan-Zhou; Chen, Fen-Fei; Zeng, Li-Bo; Wu, Qiong-Shui
2013-01-01
Imaging AOTF is an important optical filter component for new spectral imaging instruments developed in recent years. The principle of imaging AOTF component was demonstrated, and a set of testing methods for some key performances were studied, such as diffraction efficiency, wavelength shift with temperature, homogeneity in space for diffraction efficiency, imaging shift, etc.
Optimal Estimation of Clock Values and Trends from Finite Data
NASA Technical Reports Server (NTRS)
Greenhall, Charles
2005-01-01
We show how to solve two problems of optimal linear estimation from a finite set of phase data. Clock noise is modeled as a stochastic process with stationary dth increments. The covariance properties of such a process are contained in the generalized autocovariance function (GACV). We set up two principles for optimal estimation: with the help of the GACV, these principles lead to a set of linear equations for the regression coefficients and some auxiliary parameters. The mean square errors of the estimators are easily calculated. The method can be used to check the results of other methods and to find good suboptimal estimators based on a small subset of the available data.
Mechanisms of developmental neurite pruning
Schuldiner, Oren; Yaron, Avraham
2016-01-01
The precise wiring of the nervous system is a combined outcome of progressive and regressive events during development. Axon guidance and synapse formation intertwined with cell death and neurite pruning sculpt the mature circuitry. It is now well recognized that pruning of dendrites and axons as means to refine neuronal networks, is a wide spread phenomena required for the normal development of vertebrate and invertebrate nervous systems. Here we will review the arising principles of cellular and molecular mechanisms of neurite pruning. We will discuss these principles in light of studies in multiple neuronal systems, and speculate on potential explanations for the emergence of neurite pruning as a mechanism to sculpt the nervous system. PMID:25213356
Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie
2017-01-01
The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10 item self-report questionnaire measuring personal endorsement of these principles has been tested by self-report questionnaires with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub- samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub- samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977
Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie
2017-01-01
The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10 item self-report questionnaire measuring personal endorsement of these principles has been tested by self-report questionnaires with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub- samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub- samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary should be accompanied by thorough informational campaigns about UN educational goals.
Fore, Amanda M; Sculli, Gary L; Albee, Doreen; Neily, Julia
2013-01-01
To implement the sterile cockpit principle to decrease interruptions and distractions during high volume medication administration and reduce the number of medication errors. While some studies have described the importance of reducing interruptions as a tactic to reduce medication errors, work is needed to assess the impact on patient outcomes. Data regarding the type and frequency of distractions were collected during the first 11 weeks of implementation. Medication error rates were tracked 1 year before and after 1 year implementation. Simple regression analysis showed a decrease in the mean number of distractions, (β = -0.193, P = 0.02) over time. The medication error rate decreased by 42.78% (P = 0.04) after implementation of the sterile cockpit principle. The use of crew resource management techniques, including the sterile cockpit principle, applied to medication administration has a significant impact on patient safety. Applying the sterile cockpit principle to inpatient medical units is a feasible approach to reduce the number of distractions during the administration of medication, thus, reducing the likelihood of medication error. 'Do Not Disturb' signs and vests are inexpensive, simple interventions that can be used as reminders to decrease distractions. © 2012 Blackwell Publishing Ltd.
The patient-centered medical home: an ethical analysis of principles and practice.
Braddock, Clarence H; Snyder, Lois; Neubauer, Richard L; Fischer, Gary S
2013-01-01
The patient-centered medical home (PCMH), with its focus on patient-centered care, holds promise as a way to reinvigorate the primary care of patients and as a necessary component of health care reform. While its tenets have been the subject of review, the ethical dimensions of the PCMH have not been fully explored. Consideration of the ethical foundations for the core principles of the PCMH can and should be part of the debate concerning its merits. The PCMH can align with the principles of medical ethics and potentially strengthen the patient-physician relationship and aspects of health care that patients value. Patient choice and these ethical considerations are central and at least as important as the economic and practical arguments in support of the PCMH, if not more so. Further, the ethical principles that support key concepts of the PCMH have implications for the design and implementation of the PCMH. This paper explores the PCMH in light of core principles of ethics and professionalism, with an emphasis both on how the concept of the PCMH may reinforce core ethical principles of medical practice and on further implications of these principles.
Additivity of nonlinear biomass equations
Bernard R. Parresol
2001-01-01
Two procedures that guarantee the property of additivity among the components of tree biomass and total tree biomass utilizing nonlinear functions are developed. Procedure 1 is a simple combination approach, and procedure 2 is based on nonlinear joint-generalized regression (nonlinear seemingly unrelated regressions) with parameter restrictions. Statistical theory is...
A Demonstration of Regression False Positive Selection in Data Mining
ERIC Educational Resources Information Center
Pinder, Jonathan P.
2014-01-01
Business analytics courses, such as marketing research, data mining, forecasting, and advanced financial modeling, have substantial predictive modeling components. The predictive modeling in these courses requires students to estimate and test many linear regressions. As a result, false positive variable selection ("type I errors") is…
Modeling causes of death: an integrated approach using CODEm
2012-01-01
Background Data on causes of death by age and sex are a critical input into health decision-making. Priority setting in public health should be informed not only by the current magnitude of health problems but by trends in them. However, cause of death data are often not available or are subject to substantial problems of comparability. We propose five general principles for cause of death model development, validation, and reporting. Methods We detail a specific implementation of these principles that is embodied in an analytical tool - the Cause of Death Ensemble model (CODEm) - which explores a large variety of possible models to estimate trends in causes of death. Possible models are identified using a covariate selection algorithm that yields many plausible combinations of covariates, which are then run through four model classes. The model classes include mixed effects linear models and spatial-temporal Gaussian Process Regression models for cause fractions and death rates. All models for each cause of death are then assessed using out-of-sample predictive validity and combined into an ensemble with optimal out-of-sample predictive performance. Results Ensemble models for cause of death estimation outperform any single component model in tests of root mean square error, frequency of predicting correct temporal trends, and achieving 95% coverage of the prediction interval. We present detailed results for CODEm applied to maternal mortality and summary results for several other causes of death, including cardiovascular disease and several cancers. Conclusions CODEm produces better estimates of cause of death trends than previous methods and is less susceptible to bias in model specification. We demonstrate the utility of CODEm for the estimation of several major causes of death. PMID:22226226
Ferrier, J; Saleem, A; Carter Ramirez, A; Liu, R; Chen, Eric; Pesek, T; Cal, V; Balick, M; Arnason, J T
2018-06-21
Because of the recent increase in type 2 diabetes and the need for complementary treatments in remote communities in many parts of the world, we undertook a study of treatments for diabetic symptoms used by traditional Q'eqchi' Maya healers of Belize. We used quantitative ethnobotany to rank culturally important taxa and subsequent pharmacological and phytochemical studies to assess bioactivity. Antidiabetic plants identified in field interviews with traditional healers were ranked by syndromic importance value (SIV) based on 15 symptoms of diabetes.. Species ranked with high SIV were tested in an assay relevant to many diabetes complications, the advanced glycation endproduct (AGE) inhibition assay. Active principles were identified by phytochemical analysis and bioassay. We collected over 70 plant species having a promising SIV score. The plants represented a broad range of neotropical taxa. Selected Q'eqchi' antidiabetic plants with high SIV were collected in bulk and tested in the advanced glycation endproduct (AGE) inhibition assay. All plant extracts showed AGE inhibition and the half maximal inhibitory concentration (IC 50 ) ranged from 40.8 to 733µg/mL, while the most active species was Tynanthus guatemalensis Donn (Bignoniaceae). A linear regression showed a significant relationship between 1/ IC 50 and SIV. Phytochemical analysis revealed the presence of verbascoside, as a major component and active principle of the T guatemalensis which had an IC 50 = 5.1µg/mL, comparable to the positive control quercetin. The results reveal a rich botanical tradition of antidiabetic symptom treatments among the Q'eqchi'. Study of highly ranked plants revealed their activity in AGE inhibition correlated with SIV. T. guatemalensis was identified as a promising species for further evaluation and local use. Copyright © 2018. Published by Elsevier B.V.
Gillon, R
2003-01-01
It is hypothesised and argued that "the four principles of medical ethics" can explain and justify, alone or in combination, all the substantive and universalisable claims of medical ethics and probably of ethics more generally. A request is renewed for falsification of this hypothesis showing reason to reject any one of the principles or to require any additional principle(s) that can't be explained by one or some combination of the four principles. This approach is argued to be compatible with a wide variety of moral theories that are often themselves mutually incompatible. It affords a way forward in the context of intercultural ethics, that treads the delicate path between moral relativism and moral imperialism. Reasons are given for regarding the principle of respect for autonomy as "first among equals", not least because it is a necessary component of aspects of the other three. A plea is made for bioethicists to celebrate the approach as a basis for global moral ecumenism rather than mistakenly perceiving and denigrating it as an attempt at global moral imperialism. PMID:14519842
NASA Astrophysics Data System (ADS)
Gaunaa, Mac; Heinz, Joachim; Skrzypiński, Witold
2016-09-01
The crossflow principle is one of the key elements used in engineering models for prediction of the aerodynamic loads on wind turbine blades in standstill or blade installation situations, where the flow direction relative to the wind turbine blade has a component in the direction of the blade span direction. In the present work, the performance of the crossflow principle is assessed on the DTU 10MW reference blade using extensive 3D CFD calculations. Analysis of the computational results shows that there is only a relatively narrow region in which the crossflow principle describes the aerodynamic loading well. In some conditions the deviation of the predicted loadings can be quite significant, having a large influence on for instance the integral aerodynamic moments around the blade centre of mass; which is very important for single blade installation applications. The main features of these deviations, however, have a systematic behaviour on all force components, which in this paper is employed to formulate the first version of an engineering correction method to the crossflow principle applicable for wind turbine blades. The new correction model improves the agreement with CFD results for the key aerodynamic loads in crossflow situations. The general validity of this model for other blade shapes should be investigated in subsequent works.
A "good death": perspectives of Muslim patients and health care providers.
Tayeb, Mohamad A; Al-Zamel, Ersan; Fareed, Muhammed M; Abouellail, Hesham A
2010-01-01
Twelve "good death" principles have been identified that apply to Westerners. This study aimed to review the TFHCOP good death perception to determine its validity for Muslim patients and health care providers, and to identify and describe other components of the Muslim good death perspective. Participants included 284 Muslims of both genders with different nationalities and careers. We used a 12-question questionnaire based on the 12 principles of the TFHCOP good death definition, followed by face-to-face interviews. We used descriptive statistics to analyze questionnaire responses. However, for new themes, we used a grounded theory approach with a "constant comparisons" method. On average, each participant agreed on eight principles of the questionnaire. Dignity, privacy, spiritual and emotional support, access to hospice care, ability to issue advance directives, and to have time to say goodbye were the top priorities. Participants identified three main domains. The first domain was related to faith and belief. The second domain included some principles related to self-esteem and person's image to friends and family. The third domain was related to satisfaction about family security after the death of the patient. Professional role distinctions were more pronounced than were gender or nationality differences. Several aspects of "good death," as perceived by Western communities, are not recognized as being important by many Muslim patients and health care providers. Furthermore, our study introduced three novel components of good death in Muslim society.
Electrostatic engineering of strained ferroelectric perovskites from first principles
NASA Astrophysics Data System (ADS)
Cazorla, Claudio; Stengel, Massimiliano
2015-12-01
Design of novel artificial materials based on ferroelectric perovskites relies on the basic principles of electrostatic coupling and in-plane lattice matching. These rules state that the out-of-plane component of the electric displacement field and the in-plane components of the strain are preserved across a layered superlattice, provided that certain growth conditions are respected. Intense research is currently directed at optimizing materials functionalities based on these guidelines, often with remarkable success. Such principles, however, are of limited practical use unless one disposes of reliable data on how a given material behaves under arbitrary electrical and mechanical boundary conditions. Here we demonstrate, by focusing on the prototypical ferroelectrics PbTiO3 and BiFeO3 as test cases, how such information can be calculated from first principles in a systematic and efficient way. In particular, we construct a series of two-dimensional maps that describe the behavior of either compound (e.g., concerning the ferroelectric polarization and antiferrodistortive instabilities) at any conceivable choice of the in-plane lattice parameter, a , and out-of-plane electric displacement, D . In addition to being of immediate practical applicability to superlattice design, our results bring new insight into the complex interplay of competing degrees of freedom in perovskite materials and reveal some notable instances where the behavior of these materials depart from what naively is expected.
Improved Quantitative Analysis of Ion Mobility Spectrometry by Chemometric Multivariate Calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraga, Carlos G.; Kerr, Dayle; Atkinson, David A.
2009-09-01
Traditional peak-area calibration and the multivariate calibration methods of principle component regression (PCR) and partial least squares (PLS), including unfolded PLS (U-PLS) and multi-way PLS (N-PLS), were evaluated for the quantification of 2,4,6-trinitrotoluene (TNT) and cyclo-1,3,5-trimethylene-2,4,6-trinitramine (RDX) in Composition B samples analyzed by temperature step desorption ion mobility spectrometry (TSD-IMS). The true TNT and RDX concentrations of eight Composition B samples were determined by high performance liquid chromatography with UV absorbance detection. Most of the Composition B samples were found to have distinct TNT and RDX concentrations. Applying PCR and PLS on the exact same IMS spectra used for themore » peak-area study improved quantitative accuracy and precision approximately 3 to 5 fold and 2 to 4 fold, respectively. This in turn improved the probability of correctly identifying Composition B samples based upon the estimated RDX and TNT concentrations from 11% with peak area to 44% and 89% with PLS. This improvement increases the potential of obtaining forensic information from IMS analyzers by providing some ability to differentiate or match Composition B samples based on their TNT and RDX concentrations.« less
Wang, Lutao; Xiao, Jun; Chai, Hua
2015-08-01
The successful suppression of clutter arising from stationary or slowly moving tissue is one of the key issues in medical ultrasound color blood imaging. Remaining clutter may cause bias in the mean blood frequency estimation and results in a potentially misleading description of blood-flow. In this paper, based on the principle of general wall-filter, the design process of three classes of filters, infinitely impulse response with projection initialization (Prj-IIR), polynomials regression (Pol-Reg), and eigen-based filters are previewed and analyzed. The performance of the filters was assessed by calculating the bias and variance of a mean blood velocity using a standard autocorrelation estimator. Simulation results show that the performance of Pol-Reg filter is similar to Prj-IIR filters. Both of them can offer accurate estimation of mean blood flow speed under steady clutter conditions, and the clutter rejection ability can be enhanced by increasing the ensemble size of Doppler vector. Eigen-based filters can effectively remove the non-stationary clutter component, and further improve the estimation accuracy for low speed blood flow signals. There is also no significant increase in computation complexity for eigen-based filters when the ensemble size is less than 10.
Wahid, N B A; Latif, M T; Suan, L S; Dominick, D; Sahani, M; Jaafar, S A; Mohd Tahir, N
2014-03-01
This study aims to determine the composition and sources of particulate matter with an aerodynamic diameter of 10 μm or less (PM10) in a semi-urban area. PM10 samples were collected using a high volume sampler. Heavy metals (Fe, Zn, Pb, Mn, Cu, Cd and Ni) and cations (Na(+), K(+), Ca(2+) and Mg(2+)) were detected using inductively coupled plasma mass spectrometry, while anions (SO4 (2-), NO3 (-), Cl(-) and F(-)) were analysed using Ion Chromatography. Principle component analysis and multiple linear regressions were used to identify the source apportionment of PM10. Results showed the average concentration of PM10 was 29.5 ± 5.1 μg/m(3). The heavy metals found were dominated by Fe, followed by Zn, Pb, Cu, Mn, Cd and Ni. Na(+) was the dominant cation, followed by Ca(2+), K(+) and Mg(2+), whereas SO4 (2-) was the dominant anion, followed by NO3 (-), Cl(-) and F(-). The main sources of PM10 were the Earth's crust/road dust, followed by vehicle emissions, industrial emissions/road activity, and construction/biomass burning.
NASA Astrophysics Data System (ADS)
Riad, Safaa M.; Salem, Hesham; Elbalkiny, Heba T.; Khattab, Fatma I.
2015-04-01
Five, accurate, precise, and sensitive univariate and multivariate spectrophotometric methods were developed for the simultaneous determination of a ternary mixture containing Trimethoprim (TMP), Sulphamethoxazole (SMZ) and Oxytetracycline (OTC) in waste water samples collected from different cites either production wastewater or livestock wastewater after their solid phase extraction using OASIS HLB cartridges. In univariate methods OTC was determined at its λmax 355.7 nm (0D), while (TMP) and (SMZ) were determined by three different univariate methods. Method (A) is based on successive spectrophotometric resolution technique (SSRT). The technique starts with the ratio subtraction method followed by ratio difference method for determination of TMP and SMZ. Method (B) is successive derivative ratio technique (SDR). Method (C) is mean centering of the ratio spectra (MCR). The developed multivariate methods are principle component regression (PCR) and partial least squares (PLS). The specificity of the developed methods is investigated by analyzing laboratory prepared mixtures containing different ratios of the three drugs. The obtained results are statistically compared with those obtained by the official methods, showing no significant difference with respect to accuracy and precision at p = 0.05.
Tommasino, Paolo; Campolo, Domenico
2017-01-01
A major challenge in robotics and computational neuroscience is relative to the posture/movement problem in presence of kinematic redundancy. We recently addressed this issue using a principled approach which, in conjunction with nonlinear inverse optimization, allowed capturing postural strategies such as Donders' law. In this work, after presenting this general model specifying it as an extension of the Passive Motion Paradigm, we show how, once fitted to capture experimental postural strategies, the model is actually able to also predict movements. More specifically, the passive motion paradigm embeds two main intrinsic components: joint damping and joint stiffness. In previous work we showed that joint stiffness is responsible for static postures and, in this sense, its parameters are regressed to fit to experimental postural strategies. Here, we show how joint damping, in particular its anisotropy, directly affects task-space movements. Rather than using damping parameters to fit a posteriori task-space motions, we make the a priori hypothesis that damping is proportional to stiffness. This remarkably allows a postural-fitted model to also capture dynamic performance such as curvature and hysteresis of task-space trajectories during wrist pointing tasks, confirming and extending previous findings in literature. PMID:29249954
Riad, Safaa M; Salem, Hesham; Elbalkiny, Heba T; Khattab, Fatma I
2015-04-05
Five, accurate, precise, and sensitive univariate and multivariate spectrophotometric methods were developed for the simultaneous determination of a ternary mixture containing Trimethoprim (TMP), Sulphamethoxazole (SMZ) and Oxytetracycline (OTC) in waste water samples collected from different cites either production wastewater or livestock wastewater after their solid phase extraction using OASIS HLB cartridges. In univariate methods OTC was determined at its λmax 355.7 nm (0D), while (TMP) and (SMZ) were determined by three different univariate methods. Method (A) is based on successive spectrophotometric resolution technique (SSRT). The technique starts with the ratio subtraction method followed by ratio difference method for determination of TMP and SMZ. Method (B) is successive derivative ratio technique (SDR). Method (C) is mean centering of the ratio spectra (MCR). The developed multivariate methods are principle component regression (PCR) and partial least squares (PLS). The specificity of the developed methods is investigated by analyzing laboratory prepared mixtures containing different ratios of the three drugs. The obtained results are statistically compared with those obtained by the official methods, showing no significant difference with respect to accuracy and precision at p=0.05. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohammed, Irshad; Gnedin, Nickolay Y.
Baryonic effects are amongst the most severe systematics to the tomographic analysis of weak lensing data which is the principal probe in many future generations of cosmological surveys like LSST, Euclid etc.. Modeling or parameterizing these effects is essential in order to extract valuable constraints on cosmological parameters. In a recent paper, Eifler et al. (2015) suggested a reduction technique for baryonic effects by conducting a principal component analysis (PCA) and removing the largest baryonic eigenmodes from the data. In this article, we conducted the investigation further and addressed two critical aspects. Firstly, we performed the analysis by separating the simulations into training and test sets, computing a minimal set of principle components from the training set and examining the fits on the test set. We found that using only four parameters, corresponding to the four largest eigenmodes of the training set, the test sets can be fitted thoroughly with an RMSmore » $$\\sim 0.0011$$. Secondly, we explored the significance of outliers, the most exotic/extreme baryonic scenarios, in this method. We found that excluding the outliers from the training set results in a relatively bad fit and degraded the RMS by nearly a factor of 3. Therefore, for a direct employment of this method to the tomographic analysis of the weak lensing data, the principle components should be derived from a training set that comprises adequately exotic but reasonable models such that the reality is included inside the parameter domain sampled by the training set. The baryonic effects can be parameterized as the coefficients of these principle components and should be marginalized over the cosmological parameter space.« less
21 CFR 812.25 - Investigational plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... number, age, sex, and condition. (d) Description of device. A description of each important component, ingredient, property, and principle of operation of the device and of each anticipated change in the device...
21 CFR 812.25 - Investigational plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... number, age, sex, and condition. (d) Description of device. A description of each important component, ingredient, property, and principle of operation of the device and of each anticipated change in the device...
21 CFR 812.25 - Investigational plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... number, age, sex, and condition. (d) Description of device. A description of each important component, ingredient, property, and principle of operation of the device and of each anticipated change in the device...
21 CFR 812.25 - Investigational plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... number, age, sex, and condition. (d) Description of device. A description of each important component, ingredient, property, and principle of operation of the device and of each anticipated change in the device...
21 CFR 812.25 - Investigational plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... number, age, sex, and condition. (d) Description of device. A description of each important component, ingredient, property, and principle of operation of the device and of each anticipated change in the device...
QSAR modeling of flotation collectors using principal components extracted from topological indices.
Natarajan, R; Nirdosh, Inderjit; Basak, Subhash C; Mills, Denise R
2002-01-01
Several topological indices were calculated for substituted-cupferrons that were tested as collectors for the froth flotation of uranium. The principal component analysis (PCA) was used for data reduction. Seven principal components (PC) were found to account for 98.6% of the variance among the computed indices. The principal components thus extracted were used in stepwise regression analyses to construct regression models for the prediction of separation efficiencies (Es) of the collectors. A two-parameter model with a correlation coefficient of 0.889 and a three-parameter model with a correlation coefficient of 0.913 were formed. PCs were found to be better than partition coefficient to form regression equations, and inclusion of an electronic parameter such as Hammett sigma or quantum mechanically derived electronic charges on the chelating atoms did not improve the correlation coefficient significantly. The method was extended to model the separation efficiencies of mercaptobenzothiazoles (MBT) and aminothiophenols (ATP) used in the flotation of lead and zinc ores, respectively. Five principal components were found to explain 99% of the data variability in each series. A three-parameter equation with correlation coefficient of 0.985 and a two-parameter equation with correlation coefficient of 0.926 were obtained for MBT and ATP, respectively. The amenability of separation efficiencies of chelating collectors to QSAR modeling using PCs based on topological indices might lead to the selection of collectors for synthesis and testing from a virtual database.
From perception to art: how vision creates meanings.
Pinna, Baingio; Reeves, Adam
2009-01-01
This article describes the relationship between Art, as painting or sculpture, and a new theory of perceptual meaning, which builds on and now further develops the Gestalt principles. A key new idea in the theory is that higher-order groupings principles exist which, like the spatial grouping articulated by the principle of Prägnanz, helps to associate and combine stimuli, but which, unlike the Gestalt laws, can explain combinations of dissimilar as well as similar forms of visual information in a lawful manner. Similarities and dissimilarities are put together again by virtue of another and more global grouping factor that overcomes the dissimilarities of the components: it is some kind of meaning principle that perceptually solves the differences among whole and elements at a higher level, making them appear strongly linked just by virtue of the differences. In this way, similarities and dissimilarities complement and do not exclude each other. Such higher-order principles of grouping-by-meaning are articulated and illustrated using Art, from prehistoric to modern.
Analysis and improvement measures of flight delay in China
NASA Astrophysics Data System (ADS)
Zang, Yuhang
2017-03-01
Firstly, this paper establishes the principal component regression model to analyze the data quantitatively, based on principal component analysis to get the three principal component factors of flight delays. Then the least square method is used to analyze the factors and obtained the regression equation expression by substitution, and then found that the main reason for flight delays is airlines, followed by weather and traffic. Aiming at the above problems, this paper improves the controllable aspects of traffic flow control. For reasons of traffic flow control, an adaptive genetic queuing model is established for the runway terminal area. This paper, establish optimization method that fifteen planes landed simultaneously on the three runway based on Beijing capital international airport, comparing the results with the existing FCFS algorithm, the superiority of the model is proved.
Structured functional additive regression in reproducing kernel Hilbert spaces.
Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen
2014-06-01
Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.
Introduction to uses and interpretation of principal component analyses in forest biology.
J. G. Isebrands; Thomas R. Crow
1975-01-01
The application of principal component analysis for interpretation of multivariate data sets is reviewed with emphasis on (1) reduction of the number of variables, (2) ordination of variables, and (3) applications in conjunction with multiple regression.
Pinna, Baingio; Tanca, Maria
2008-05-23
The watercolor illusion is a long-range color assimilation (coloration effect) imparting a figure-ground segregation (figural effect) across large enclosed areas (B. Pinna, 1987; B. Pinna, G. Brelstaff, & L. Spillmann, 2001; B. Pinna, L. Spillmann, & J. S. Werner, 2003; B. Pinna, J. S. Werner, & L. Spillmann, 2003). The watercolored figure has a very poorly reversible or univocal figure-ground segregation and strongly enhances the unilateral belongingness of the boundaries (E. Rubin, 1915), a principle stating that the boundaries belong only to the figure and not to the background. The figural effect determines grouping and figure-ground segregation more strongly than the well-known Gestalt principles. Under watercolor conditions both the figure and the background assume new properties becoming respectively bulging object and hole both with a 3-D volumetric appearance (object-hole effect). Our purposes were: (i) to demonstrate that the hole induced by the watercolor illusion has unique figural properties comparable to those of the object and not present in the background induced by the known figure-ground principles; (ii) to demonstrate a dissociation of the object-hole effect from the coloration one; (iii) to demonstrate that the object-hole effect depends on a new principle. This was psychophysically tested by weakening (ungrouping) the whole figural organization of the watercolor illusion, i.e. by imparting motion to only some components of a stimulus, while other components remain stationary. The results showed that (i) subjects perceived moving holes more strongly than moving figures or objects enlarging and shrinking. (ii) Paradoxically, moving holes appear more as figures than the bulging surfaces. (iii) When motion was imparted to components that while stationary were perceived as objects, their figurality is further enhanced (summation effect). (iv) When object-hole and coloration effects were dissociated no significant difference compared to illusory colored conditions was reported. Coloration can be considered independent from the object-hole effect of the watercolor illusion. The object-hole effect may depend on the "asymmetric luminance contrast principle" (B. Pinna, 2005).
NASA Astrophysics Data System (ADS)
Mazurova, Elena; Lapshin, Aleksey
2013-04-01
The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT due to the action of Heisenberg's uncertainty principle indicates weak spatial localization that manifests in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in case of one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives the global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT that differs from the SFT only by the presence of an additional factor (window) is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, it results in have significant enough uncertainty in frequency. If one chooses a wide enough window it, according to the same principle, will increase time uncertainty. Thus, if the signal is narrowly localized in time its spectrum, on the contrary, is spread on the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to define the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely, what frequency is present in the signal at the current moment of time: it is possible to speak only about the range of frequencies. Besides, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about the time frame. It is this feature that imposes major constrains on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independent of the transform applied, there is a possibility to analyze any signal, using the alternative approach - the multiresolutional analysis (MRA). The wavelet-transform is one of the methods for making a MRA-type analysis. Thanks to it, low frequencies can be shown in a more detailed form with respect to time, and high ones - with respect to frequency. The paper presents the results of calculating of the components of the deflection of the vertical, done by the SFT, STFT and WT. The results are presented in the form of 3-d models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of wavelet-transform to calculate of the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
Principles for system level electrochemistry
NASA Technical Reports Server (NTRS)
Thaller, L. H.
1986-01-01
The higher power and higher voltage levels anticipated for future space missions have required a careful review of the techniques currently in use to preclude battery problems that are related to the dispersion characteristics of the individual cells. Not only are the out-of-balance problems accentuated in these larger systems, but the thermal management considerations also require a greater degree of accurate design. Newer concepts which employ active cooling techniques are being developed which permit higher rates of discharge and tighter packing densities for the electrochemical components. This paper will put forward six semi-independent principles relating to battery systems. These principles will progressively address cell, battery and finally system related aspects of large electrochemical storage systems.
... resulting from translocation there is a hereditary component – accounting for about 1% of all cases of Down ... Another genetic test called FISH can apply similar principles and confirm a diagnosis in a shorter amount ...
Validation of Virtual Environments Incorporating Virtual Operators for Procedural Learning
2012-09-01
according to Hierarchical Task Analysis principles (Annett, 2003; Annett & Duncan, 1967; Annett, Duncan, Stammers & Gray, 1971; Annett & Stanton, 2000...the literature (Anderson, 2001; Haider & Grensch, 2002; Heathcote et al., 2000; Suzuki & Ohnishi, 2007 ). Nevertheless, the analysis of the regression...analysis and training design. Occupational Psychology, 41. Annett, J., Duncan, K. D., Stammers , R. B. & Gray, M. J. (1971). Task Analysis. London
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
Selection of optimal complexity for ENSO-EMR model by minimum description length principle
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Mukhina, A.; Gavrilov, A.; Kondrashov, D. A.; Feigin, A. M.
2012-12-01
One of the main problems arising in modeling of data taken from natural system is finding a phase space suitable for construction of the evolution operator model. Since we usually deal with strongly high-dimensional behavior, we are forced to construct a model working in some projection of system phase space corresponding to time scales of interest. Selection of optimal projection is non-trivial problem since there are many ways to reconstruct phase variables from given time series, especially in the case of a spatio-temporal data field. Actually, finding optimal projection is significant part of model selection, because, on the one hand, the transformation of data to some phase variables vector can be considered as a required component of the model. On the other hand, such an optimization of a phase space makes sense only in relation to the parametrization of the model we use, i.e. representation of evolution operator, so we should find an optimal structure of the model together with phase variables vector. In this paper we propose to use principle of minimal description length (Molkov et al., 2009) for selection models of optimal complexity. The proposed method is applied to optimization of Empirical Model Reduction (EMR) of ENSO phenomenon (Kravtsov et al. 2005, Kondrashov et. al., 2005). This model operates within a subset of leading EOFs constructed from spatio-temporal field of SST in Equatorial Pacific, and has a form of multi-level stochastic differential equations (SDE) with polynomial parameterization of the right-hand side. Optimal values for both the number of EOF, the order of polynomial and number of levels are estimated from the Equatorial Pacific SST dataset. References: Ya. Molkov, D. Mukhin, E. Loskutov, G. Fidelin and A. Feigin, Using the minimum description length principle for global reconstruction of dynamic systems from noisy time series, Phys. Rev. E, Vol. 80, P 046207, 2009 Kravtsov S, Kondrashov D, Ghil M, 2005: Multilevel regression modeling of nonlinear processes: Derivation and applications to climatic variability. J. Climate, 18 (21): 4404-4424. D. Kondrashov, S. Kravtsov, A. W. Robertson and M. Ghil, 2005. A hierarchy of data-based ENSO models. J. Climate, 18, 4425-4444.
Sanford, Ward E.; Nelms, David L.; Pope, Jason P.; Selnick, David L.
2015-01-01
Mean long-term hydrologic budget components, such as recharge and base flow, are often difficult to estimate because they can vary substantially in space and time. Mean long-term fluxes were calculated in this study for precipitation, surface runoff, infiltration, total evapotranspiration (ET), riparian ET, recharge, base flow (or groundwater discharge) and net total outflow using long-term estimates of mean ET and precipitation and the assumption that the relative change in storage over that 30-year period is small compared to the total ET or precipitation. Fluxes of these components were first estimated on a number of real-time-gaged watersheds across Virginia. Specific conductance was used to distinguish and separate surface runoff from base flow. Specific-conductance (SC) data were collected every 15 minutes at 75 real-time gages for approximately 18 months between March 2007 and August 2008. Precipitation was estimated for 1971-2000 using PRISM climate data. Precipitation and temperature from the PRISM data were used to develop a regression-based relation to estimate total ET. The proportion of watershed precipitation that becomes surface runoff was related to physiographic province and rock type in a runoff regression equation. A new approach to estimate riparian ET using seasonal SC data gave results consistent with those from other methods. Component flux estimates from the watersheds were transferred to flux estimates for counties and independent cities using the ET and runoff regression equations. Only 48 of the 75 watersheds yielded sufficient data, and data from these 48 were used in the final runoff regression equation. Final results for the study are presented as component flux estimates for all counties and independent cities in Virginia. The method has the potential to be applied in many other states in the U.S. or in other regions or countries of the world where climate and stream flow data are plentiful.
A generalized theory of chromatography and multistep liquid extraction
NASA Astrophysics Data System (ADS)
Chizhkov, V. P.; Boitsov, V. N.
2017-03-01
A generalized theory of chromatography and multistep liquid extraction is developed. The principles of highly efficient processes for fine preparative separation of binary mixture components on a fixed sorbent layer are discussed.
ERIC Educational Resources Information Center
Rechnitz, Garry A.
1988-01-01
Describes theory and principles behind biosensors that incorporate biological components as part of a sensor or probe. Projects major applications in medicine and veterinary medicine, biotechnology, food and agriculture, environmental studies, and the military. Surveys current use of biosensors. (ML)
Principles of Precision Spectrophotometry: An Advanced Undergraduate Experiment
ERIC Educational Resources Information Center
Billmeyer, Fred W., Jr.
1974-01-01
Describes an experiment designed to familiarize students with the operation of a precision spectrophotometer, the effects of changes in operating variables, and the characteristics of such components as sources and detectors. (SLH)
Changes in Cross-Correlations as an Indicator for Systemic Risk
NASA Astrophysics Data System (ADS)
Zheng, Zeyu; Podobnik, Boris; Feng, Ling; Li, Baowen
2012-11-01
The 2008-2012 global financial crisis began with the global recession in December 2007 and exacerbated in September 2008, during which the U.S. stock markets lost 20% of value from its October 11 2007 peak. Various studies reported that financial crisis are associated with increase in both cross-correlations among stocks and stock indices and the level of systemic risk. In this paper, we study 10 different Dow Jones economic sector indexes, and applying principle component analysis (PCA) we demonstrate that the rate of increase in principle components with short 12-month time windows can be effectively used as an indicator of systemic risk--the larger the change of PC1, the higher the increase of systemic risk. Clearly, the higher the level of systemic risk, the more likely a financial crisis would occur in the near future.
Biominerals- hierarchical nanocomposites: the example of bone
Beniash, Elia
2010-01-01
Many organisms incorporate inorganic solids in their tissues to enhance their functional, primarily mechanical, properties. These mineralized tissues, also called biominerals, are unique organo-mineral nanocomposites, organized at several hierarchical levels, from nano- to macroscale. Unlike man made composite materials, which often are simple physical blends of their components, the organic and inorganic phases in biominerals interface at the molecular level. Although these tissues are made of relatively weak components at ambient conditions, their hierarchical structural organization and intimate interactions between different elements lead to superior mechanical properties. Understanding basic principles of formation, structure and functional properties of these tissues might lead to novel bioinspired strategies for material design and better treatments for diseases of the mineralized tissues. This review focuses on general principles of structural organization, formation and functional properties of biominerals on the example the bone tissues. PMID:20827739
Essentials of finance for occupational physicians.
Miller, K; Fallon, L F
2001-01-01
Comprehending the principles of finance is paramount to understanding the way an organization chooses to generate and use its financial resources. Financial principles may be employed in the same way a physician reviews fundamental systems to gauge a person s health. Just as basic anatomical and physiological components are used to assess the health of an individual, basic financial elements exist to ascertain the health of an organization. This chapter explains risk assessment, accounts receivable management, inventory, depreciation, capital formation, ratio analysis, and more.
EHV systems technology - A look at the principles and current status. [Electric and Hybrid Vehicle
NASA Technical Reports Server (NTRS)
Kurtz, D. W.; Levin, R. R.
1983-01-01
An examination of the basic principles and practices of systems engineering is undertaken in the context of their application to the component and subsystem technologies involved in electric and hybrid vehicle (EHV) development. The limitations of purely electric vehicles are contrasted with hybrid, heat engine-incorporating vehicle technology, which is inherently more versatile. A hybrid vehicle concept assessment methodology is presented which employs current technology and yet fully satisfies U.S. Department of Energy petroleum displacement goals.
Breast volume assessment: comparing five different techniques.
Bulstrode, N; Bellamy, E; Shrotria, S
2001-04-01
Breast volume assessment is not routinely performed pre-operatively because as yet there is no accepted technique. There have been a variety of methods published, but this is the first study to compare these techniques. We compared volume measurements obtained from mammograms (previously compared to mastectomy specimens) with estimates of volume obtained from four other techniques: thermoplastic moulding, magnetic resonance imaging, Archimedes principle and anatomical measurements. We also assessed the acceptability of each method to the patient. Measurements were performed on 10 women, which produced results for 20 breasts. We were able to calculate regression lines between volume measurements obtained from mammography to the other four methods: (1) magnetic resonance imaging (MRI), 379+(0.75 MRI) [r=0.48], (2) Thermoplastic moulding, 132+(1.46 Thermoplastic moulding) [r=0.82], (3) Anatomical measurements, 168+(1.55 Anatomical measurements) [r=0.83]. (4) Archimedes principle, 359+(0.6 Archimedes principle) [r=0.61] all units in cc. The regression curves for the different techniques are variable and it is difficult to reliably compare results. A standard method of volume measurement should be used when comparing volumes before and after intervention or between individual patients, and it is unreliable to compare volume measurements using different methods. Calculating the breast volume from mammography has previously been compared to mastectomy samples and shown to be reasonably accurate. However we feel thermoplastic moulding shows promise and should be further investigated as it gives not only a volume assessment but a three-dimensional impression of the breast shape, which may be valuable in assessing cosmesis following breast-conserving-surgery.
Ridge Regression Signal Processing
NASA Technical Reports Server (NTRS)
Kuhl, Mark R.
1990-01-01
The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
Krishna P. Poudel; Temesgen Hailemariam
2016-01-01
Using data from destructively sampled Douglas-fir and lodgepole pine trees, we evaluated the performance of regional volume and component biomass equations in terms of bias and RMSE. The volume and component biomass equations were calibrated using three different adjustment methods that used: (a) a correction factor based on ordinary least square regression through...
Khazaei, Salman; Rezaeian, Shahab; Khazaei, Somayeh; Mansori, Kamyar; Sanjari Moghaddam, Ali; Ayubi, Erfan
2016-01-01
Geographic disparity for colorectal cancer (CRC) incidence and mortality according to the human development index (HDI) might be expected. This study aimed at quantifying the effect measure of association HDI and its components on the CRC incidence and mortality. In this ecological study, CRC incidence and mortality was obtained from GLOBOCAN, the global cancer project for 172 countries. Data were extracted about HDI 2013 for 169 countries from the World Bank report. Linear regression was constructed to measure effects of HDI and its components on CRC incidence and mortality. A positive trend between increasing HDI of countries and age-standardized rates per 100,000 of CRC incidence and mortality was observed. Among HDI components education was the strongest effect measure of association on CRC incidence and mortality, regression coefficients (95% confidence intervals) being 2.8 (2.4, 3.2) and 0.9 (0.8, 1), respectively. HDI and its components were positively related with CRC incidence and mortality and can be considered as targets for prevention and treatment intervention or tracking geographic disparities.
NASA Astrophysics Data System (ADS)
Nakamuta, Y.; Urata, K.; Shibata, Y.; Kuwahara, Y.
2017-03-01
In Lindsley's thermometry, a revised sequence of calculation of components is proposed for clinopyroxene, in which kosmochlor component is added. Temperatures obtained for the components calculated by the revised method are about 50 °C lower than those obtained for the components calculated by the Lindsley's original method and agree well with temperatures obtained from orthopyroxenes. Ca-partitioning between clino- and orthopyroxenes is then thought to be equilibrated in types 5 to 7 ordinary chondrites. The temperatures for Tuxtuac (LL5), Dhurmsala (LL6), NWA 2092 (LL6/7), and Dho 011 (LL7) are 767-793°, 818-835°, 872-892°, and 917-936°C, respectively, suggesting that chondrites of higher petrographic types show higher equilibrium temperatures of pyroxenes. The regression equations which relate temperature and Wo and Fs contents in the temperature-contoured pyroxene quadrilateral of 1 atm of Lindsley (1983) are also determined by the least squares method. It is possible to reproduce temperatures with an error less than 20 °C (2SE) using the regression equations.
Naval Research Logistics Quarterly. Volume 28. Number 3,
1981-09-01
denotes component-wise maximum. f has antone (isotone) differences on C x D if for cl < c2 and d, < d2, NAVAL RESEARCH LOGISTICS QUARTERLY VOL. 28...or negative correlations and linear or nonlinear regressions. Given are the mo- ments to order two and, for special cases, (he regression function and...data sets. We designate this bnb distribution as G - B - N(a, 0, v). The distribution admits only of positive correlation and linear regressions
Zhong, Cong; Yang, Zhongfang; Jiang, Wei; Hu, Baoqing; Hou, Qingye; Yu, Tao; Li, Jie
2016-12-15
Industrialization and urbanization have led to a deterioration in air quality and provoked some serious environmental concerns. Fifty-four samples of atmospheric deposition were collected from an emerging industrial area and analyzed to determine the concentrations of 11 trace elements (As, Cd, Cu, Fe, Hg, Mn, Mo, Pb, Se, S and Zn). Multivariate geostatistical analyses were conducted to determine the spatial distribution, possible sources and enrichment degrees of trace elements in atmospheric deposition. Results indicate that As, Fe and Mo mainly originated from soil, their natural parent materials, while the remaining trace elements were strongly influenced by anthropogenic or natural activities, such as coal combustion in coal-fired power plants (Pb, Se and S), manganese ore (Mn, Cd and Hg) and metal smelting (Cu and Zn). The results of ecological geochemical assessment indicate that Cd, Pb and Zn are the elements of priority concern, followed by Mn and Cu, and other heavy metals, which represent little threat to local environment. It was determine that the resuspension of soil particles impacted the behavior of heavy metals by 55.3%; the impact of the coal-fired power plants was 18.9%; and the contribution of the local manganese industry was 9.6%. The comparison of consequences from various statistical methods (principal component analysis (PCA), cluster analysis (CA), enrichment factor (EF) and absolute principle component score (APCS)-multiple linear regression (MLR)) confirmed the credibility of this research. Copyright © 2016 Elsevier B.V. All rights reserved.
Yusuf, O B; Bamgboye, E A; Afolabi, R F; Shodimu, M A
2014-09-01
Logistic regression model is widely used in health research for description and predictive purposes. Unfortunately, most researchers are sometimes not aware that the underlying principles of the techniques have failed when the algorithm for maximum likelihood does not converge. Young researchers particularly postgraduate students may not know why separation problem whether quasi or complete occurs, how to identify it and how to fix it. This study was designed to critically evaluate convergence issues in articles that employed logistic regression analysis published in an African Journal of Medicine and medical sciences between 2004 and 2013. Problems of quasi or complete separation were described and were illustrated with the National Demographic and Health Survey dataset. A critical evaluation of articles that employed logistic regression was conducted. A total of 581 articles was reviewed, of which 40 (6.9%) used binary logistic regression. Twenty-four (60.0%) stated the use of logistic regression model in the methodology while none of the articles assessed model fit. Only 3 (12.5%) properly described the procedures. Of the 40 that used the logistic regression model, the problem of convergence occurred in 6 (15.0%) of the articles. Logistic regression tends to be poorly reported in studies published between 2004 and 2013. Our findings showed that the procedure may not be well understood by researchers since very few described the process in their reports and may be totally unaware of the problem of convergence or how to deal with it.
The cognitive nexus between Bohr's analogy for the atom and Pauli's exclusion schema.
Ulazia, Alain
2016-03-01
The correspondence principle is the primary tool Bohr used to guide his contributions to quantum theory. By examining the cognitive features of the correspondence principle and comparing it with those of Pauli's exclusion principle, I will show that it did more than simply 'save the phenomena'. The correspondence principle in fact rested on powerful analogies and mental schemas. Pauli's rejection of model-based methods in favor of a phenomenological, rule-based approach was therefore not as disruptive as some historians have indicated. Even at a stage that seems purely phenomenological, historical studies of theoretical development should take into account non-formal, model-based approaches in the form of mental schemas, analogies and images. In fact, Bohr's images and analogies had non-classical components which were able to evoke the idea of exclusion as a prohibition law and as a preliminary mental schema. Copyright © 2016 Elsevier Ltd. All rights reserved.
Gestalt theory: implications for radiology education.
Koontz, Nicholas A; Gunderman, Richard B
2008-05-01
The Gestalt theory of modern psychology is grounded in the ideas that holistic rather than atomistic approaches are necessary to understand the mind, and that the mental whole is greater than the sum of its component parts. Although the Gestalt school fell out of favor due to its descriptive rather than explanatory nature, it permanently changed our understanding of perception. For the radiologist, such fundamental Gestalt concepts as figure-ground relationships and a variety of "grouping principles" (the laws of closure, proximity, similarity, common region, continuity, and symmetry) are ubiquitous in daily work, not to mention in art and personal life. By considering the applications of these principles and the stereotypical ways in which humans perceive visual stimuli, a radiology learner may incur fewer errors of diagnosis. This article serves to introduce several important principles of Gestalt theory, identify examples of these principles in widely recognizable fine art, and highlight their implications for radiology education.
Zheng, Jinghui; Wan, Yi; Chi, Jianhuai; Shen, Dekai; Wu, Tingting; Li, Weimin; Du, Pengcheng
2012-01-01
The present study induced in vitro-cultured passage 4 bone marrow-derived mesenchymal stem cells to differentiate into neural-like cells with a mixture of alkaloid, polysaccharide, aglycone, glycoside, essential oils, and effective components of Buyang Huanwu decoction (active principle region of decoction for invigorating yang for recuperation). After 28 days, nestin and neuron-specific enolase were expressed in the cytoplasm. Reverse transcription-PCR and western blot analyses showed that nestin and neuron-specific enolase mRNA and protein expression was greater in the active principle region group compared with the original formula group. Results demonstrated that the active principle region of Buyang Huanwu decoction induced greater differentiation of rat bone marrow-derived mesenchymal stem cells into neural-like cells in vitro than the original Buyang Huanwu decoction formula. PMID:25806066
Uncertainty principles for inverse source problems for electromagnetic and elastic waves
NASA Astrophysics Data System (ADS)
Griesmaier, Roland; Sylvester, John
2018-06-01
In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.
Alarcón, Renato D; Suarez-Richards, Manuel; Sarabia, Silvana
2014-01-01
Medical education has incorporated psychiatric or mental health components more consistently during the last decades thanks to various factors such as: advances in neurobiological research; the increasing prevalence of mental disorders in global health; the increasingly close relationship between mental health and public health; comorbidities with medical conditions and the impact of sociocultural phenomena in clinical manifestations, diagnosis, treatment, prognosis and prevention. Based on acquisition of core competencies and ethical principles of universal acceptance, the teaching process examined in this article proposes an education based on the provision of clinical experiences integrated throughout the collection of adequate information, the development of diagnostic capabilities, and exposure to a wide variety of forms of academic assessment of students and residents in training. The cultural components of psychiatric education receive special mention; we provide examples of their systematic integration with the acquisition of general skills. The teaching tools include theoretical and applied activities and supervision. Particular attention is paid to how the principles of modern psychiatric medical education, including cultural aspects and practice of holistic health care objectives, can and should be in effect in Latin American countries.
McClements, David Julian; Decker, Eric Andrew; Park, Yeonhwa; Weiss, Jochen
2009-06-01
There have been major advances in the design and fabrication of structured delivery systems for the encapsulation of nutraceutical and functional food components. A wide variety of delivery systems is now available, each with its own advantages and disadvantages for particular applications. This review begins by discussing some of the major nutraceutical and functional food components that need to be delivered and highlights the main limitations to their current utilization within the food industry. It then discusses the principles underpinning the rational design of structured delivery systems: the structural characteristics of the building blocks; the nature of the forces holding these building blocks together; and, the different ways of assembling these building blocks into structured delivery systems. Finally, we review the major types of structured delivery systems that are currently available to food scientists: lipid-based (simple, multiple, multilayer, and solid lipid particle emulsions); surfactant-based (simple micelles, mixed micelles, vesicles, and microemulsions) and biopolymer-based (soluble complexes, coacervates, hydrogel droplets, and particles). For each type of delivery system we describe its preparation, properties, advantages, and limitations.
Basic Science Considerations in Primary Total Hip Replacement Arthroplasty
Mirza, Saqeb B; Dunlop, Douglas G; Panesar, Sukhmeet S; Naqvi, Syed G; Gangoo, Shafat; Salih, Saif
2010-01-01
Total Hip Replacement is one of the most common operations performed in the developed world today. An increasingly ageing population means that the numbers of people undergoing this operation is set to rise. There are a numerous number of prosthesis on the market and it is often difficult to choose between them. It is therefore necessary to have a good understanding of the basic scientific principles in Total Hip Replacement and the evidence base underpinning them. This paper reviews the relevant anatomical and biomechanical principles in THA. It goes on to elaborate on the structural properties of materials used in modern implants and looks at the evidence base for different types of fixation including cemented and uncemented components. Modern bearing surfaces are discussed in addition to the scientific basis of various surface engineering modifications in THA prostheses. The basic science considerations in component alignment and abductor tension are also discussed. A brief discussion on modular and custom designs of THR is also included. This article reviews basic science concepts and the rationale underpinning the use of the femoral and acetabular component in total hip replacement. PMID:20582240
NASA Astrophysics Data System (ADS)
Baptistao, Mariana; Rocha, Werickson Fortunato de Carvalho; Poppi, Ronei Jesus
2011-09-01
In this work, it was used imaging spectroscopy and chemometric tools for the development and analysis of paracetamol and excipients in pharmaceutical formulations. It was also built concentration maps to study the distribution of the drug in the tablets surface. Multivariate models based on PLS regression were developed for paracetamol and excipients concentrations prediction. For the construction of the models it was used 31 samples in the tablet form containing the active principle in a concentration range of 30.0-90.0% (w/w) and errors below to 5% were obtained for validation samples. Finally, the study of the distribution in the drug was performed through the distribution maps of concentration of active principle and excipients. The analysis of maps showed the complementarity between the active principle and excipients in the tablets. The region with a high concentration of a constituent must have, necessarily, absence or low concentration of the other one. Thus, an alternative method for the paracetamol drug quality monitoring is presented.
An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys
ERIC Educational Resources Information Center
Wetzel, Eunike; Xu, Xueli; von Davier, Matthias
2015-01-01
In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…
Modeling health survey data with excessive zero and K responses.
Lin, Ting Hsiang; Tsai, Min-Hsiao
2013-04-30
Zero-inflated Poisson regression is a popular tool used to analyze data with excessive zeros. Although much work has already been performed to fit zero-inflated data, most models heavily depend on special features of the individual data. To be specific, this means that there is a sizable group of respondents who endorse the same answers making the data have peaks. In this paper, we propose a new model with the flexibility to model excessive counts other than zero, and the model is a mixture of multinomial logistic and Poisson regression, in which the multinomial logistic component models the occurrence of excessive counts, including zeros, K (where K is a positive integer) and all other values. The Poisson regression component models the counts that are assumed to follow a Poisson distribution. Two examples are provided to illustrate our models when the data have counts containing many ones and sixes. As a result, the zero-inflated and K-inflated models exhibit a better fit than the zero-inflated Poisson and standard Poisson regressions. Copyright © 2012 John Wiley & Sons, Ltd.
Overview of MDX-A System for Medical Diagnosis
Mittal, S.; Chandrasekaran, B.; Smith, J.
1979-01-01
We describe the design and performance of MDX, an experimental medical diagnosis system, which currently diagnoses in the syndrome called Cholestasis. The needed medical knowledge is represented in a scheme called conceptual structures, which can be viewed as a collection of conceptual experts interacting according to certain well-defined principles. MDX has three components: the diagnostic system, a patient data base and a radiology consultant. We describe these components, the inter-expert communication system and the query language used by these components. The system is illustrated by means of its performance on a real case.
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principle focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission and safety critical requirements at run time. This focus is extremely important because of the contribution of the scaling direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifestyle issues of three environments: host, integration, and target.
Safiri, Saeid; Kelishadi, Roya; Heshmat, Ramin; Rahimi, Ali; Djalalinia, Shirin; Ghasemian, Anoosheh; Sheidaei, Ali; Motlagh, Mohammad Esmaeil; Ardalan, Gelayol; Mansourian, Morteza; Asayesh, Hamid; Sepidarkish, Mahdi; Qorbani, Mostafa
2016-09-14
The present study set to describe the socioeconomic inequality associated with oral hygiene behavior among Iranian pediatric population. A representative sample of 13486 school students aged 6-18 years was selected through multistage random cluster sampling method from urban and rural areas of 30 provinces in Iran. Principle Component Analyses (PCA) correlated variables summarized as socioeconomic status (SES). Association of independent variables with tooth brushing was assessed through logistic regression analysis. Decomposition of the gap in tooth brushing between the first and fifth SES quintiles was assessed using the counterfactual decomposition technique. To assess the relation between tooth brushing and each socioeconomic category, Concentration Index (C) and the slope index of inequality (SII) were used, representing the linear regression coefficient. The participation rate was 90.6 % (50.7 % boys and 75.6 % urban inhabitants). The mean age of participants was 12.47 ± 3.36 years. The frequency of tooth brushing increased across SES quintiles, prevalence of tooth brushing between the first and fifth quintile, under 20 % difference, increased from 58.22 (95 % CI: 56.24,60.20) to 78.61 (95 % CI: 77.00,80.24). Only 3 % of the difference is explained by the factors considered in the study, and 17 % remained unknown. Residence area, family size, and smoking status made a significant contribution to the gap between the first and last SE groups. Residence area [ -2.01 (95 % CI: -3.46, -0.55)] was along the maximum levels of gaps between SE categories. The findings revealed a socio-economic inequality in oral health behavior in Iranian children and adolescents. Also, factors influencing oral health are addressed to develop and implement complementary public health actions.
Gorban, A N; Mirkes, E M; Zinovyev, A
2016-12-01
Most of machine learning approaches have stemmed from the application of minimizing the mean squared distance principle, based on the computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, the quadratic error functionals demonstrated many weaknesses including high sensitivity to contaminating factors and dimensionality curse. Therefore, a lot of recent applications in machine learning exploited properties of non-quadratic error functionals based on L 1 norm or even sub-linear potentials corresponding to quasinorms L p (0
NASA Technical Reports Server (NTRS)
Wolf, S. F.; Lipschutz, M. E.
1993-01-01
Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855 - 1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that a totally different criterion, labile trace element contents - hence thermal histories - or 13 Cluster 1 meteorites are distinguishable from those of 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
Structured functional additive regression in reproducing kernel Hilbert spaces
Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen
2013-01-01
Summary Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application. PMID:25013362
Bayesian kernel machine regression for estimating the health effects of multi-pollutant mixtures.
Bobb, Jennifer F; Valeri, Linda; Claus Henn, Birgit; Christiani, David C; Wright, Robert O; Mazumdar, Maitreyi; Godleski, John J; Coull, Brent A
2015-07-01
Because humans are invariably exposed to complex chemical mixtures, estimating the health effects of multi-pollutant exposures is of critical concern in environmental epidemiology, and to regulatory agencies such as the U.S. Environmental Protection Agency. However, most health effects studies focus on single agents or consider simple two-way interaction models, in part because we lack the statistical methodology to more realistically capture the complexity of mixed exposures. We introduce Bayesian kernel machine regression (BKMR) as a new approach to study mixtures, in which the health outcome is regressed on a flexible function of the mixture (e.g. air pollution or toxic waste) components that is specified using a kernel function. In high-dimensional settings, a novel hierarchical variable selection approach is incorporated to identify important mixture components and account for the correlated structure of the mixture. Simulation studies demonstrate the success of BKMR in estimating the exposure-response function and in identifying the individual components of the mixture responsible for health effects. We demonstrate the features of the method through epidemiology and toxicology applications. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Sparse modeling of spatial environmental variables associated with asthma
Chang, Timothy S.; Gangnon, Ronald E.; Page, C. David; Buckingham, William R.; Tandias, Aman; Cowan, Kelly J.; Tomasallo, Carrie D.; Arndt, Brian G.; Hanrahan, Lawrence P.; Guilbert, Theresa W.
2014-01-01
Geographically distributed environmental factors influence the burden of diseases such as asthma. Our objective was to identify sparse environmental variables associated with asthma diagnosis gathered from a large electronic health record (EHR) dataset while controlling for spatial variation. An EHR dataset from the University of Wisconsin’s Family Medicine, Internal Medicine and Pediatrics Departments was obtained for 199,220 patients aged 5–50 years over a three-year period. Each patient’s home address was geocoded to one of 3,456 geographic census block groups. Over one thousand block group variables were obtained from a commercial database. We developed a Sparse Spatial Environmental Analysis (SASEA). Using this method, the environmental variables were first dimensionally reduced with sparse principal component analysis. Logistic thin plate regression spline modeling was then used to identify block group variables associated with asthma from sparse principal components. The addresses of patients from the EHR dataset were distributed throughout the majority of Wisconsin’s geography. Logistic thin plate regression spline modeling captured spatial variation of asthma. Four sparse principal components identified via model selection consisted of food at home, dog ownership, household size, and disposable income variables. In rural areas, dog ownership and renter occupied housing units from significant sparse principal components were associated with asthma. Our main contribution is the incorporation of sparsity in spatial modeling. SASEA sequentially added sparse principal components to Logistic thin plate regression spline modeling. This method allowed association of geographically distributed environmental factors with asthma using EHR and environmental datasets. SASEA can be applied to other diseases with environmental risk factors. PMID:25533437
Sparse modeling of spatial environmental variables associated with asthma.
Chang, Timothy S; Gangnon, Ronald E; David Page, C; Buckingham, William R; Tandias, Aman; Cowan, Kelly J; Tomasallo, Carrie D; Arndt, Brian G; Hanrahan, Lawrence P; Guilbert, Theresa W
2015-02-01
Geographically distributed environmental factors influence the burden of diseases such as asthma. Our objective was to identify sparse environmental variables associated with asthma diagnosis gathered from a large electronic health record (EHR) dataset while controlling for spatial variation. An EHR dataset from the University of Wisconsin's Family Medicine, Internal Medicine and Pediatrics Departments was obtained for 199,220 patients aged 5-50years over a three-year period. Each patient's home address was geocoded to one of 3456 geographic census block groups. Over one thousand block group variables were obtained from a commercial database. We developed a Sparse Spatial Environmental Analysis (SASEA). Using this method, the environmental variables were first dimensionally reduced with sparse principal component analysis. Logistic thin plate regression spline modeling was then used to identify block group variables associated with asthma from sparse principal components. The addresses of patients from the EHR dataset were distributed throughout the majority of Wisconsin's geography. Logistic thin plate regression spline modeling captured spatial variation of asthma. Four sparse principal components identified via model selection consisted of food at home, dog ownership, household size, and disposable income variables. In rural areas, dog ownership and renter occupied housing units from significant sparse principal components were associated with asthma. Our main contribution is the incorporation of sparsity in spatial modeling. SASEA sequentially added sparse principal components to Logistic thin plate regression spline modeling. This method allowed association of geographically distributed environmental factors with asthma using EHR and environmental datasets. SASEA can be applied to other diseases with environmental risk factors. Copyright © 2014 Elsevier Inc. All rights reserved.
Design principles for nickel-hydrogen cells and batteries
NASA Technical Reports Server (NTRS)
Thaller, L. H.; Manzo, M. A.; Gonzalez-Sanabria, O. D.
1985-01-01
Nickel-hydrogen cells and, more recently, bipolar batteries have been built by a variety of organizations. The design principles that have been used by the technology group at the NASA Lewis Research Center draw upon their extensive background in separator technology, alkaline fuel cell technology, and several alkaline cell technology areas. These design principles have been incorporated into both the more contemporary individual pressure vessel (IPV) designs that were pioneered by other groups, as well as the more recent bipolar battery designs using active cooling that are being developed at NASA Lewis Research Center and under contract. These principles are rather straightforward applications of capillary force formalisms, coupled with the slowly developing data base resulting from careful post test analyses. The objective of this overall effort is directed towards the low-Earth-orbit (LEO) application where the cycle life requirements are much more severe than the geosynchronous-orbit (GEO) application. A summary of the design principles employed is presented along with a discussion of the recommendations for component pore sizes and pore size distributions, as well as suggested materials of construction. These will be made based on our experience in these areas to show how these design principles have been translated into operating hardware.
Design principles for nickel-hydrogen cells and batteries
NASA Technical Reports Server (NTRS)
Thaller, L. H.; Manzo, M. A.; Gonzalez-Sanabria, O. D.
1985-01-01
Nickel-hydrogen cells and, more recently, bipolar batteries have been built by a variety of organizations. The design principles that have been used by the technology group at the NASA Lewis Research Center draw upon their extensive background in separator technology, alkaline fuel cell technology, and several alkaline cell technology areas. These design principles have been incorporated into both the more contemporary individual pressure vessel (IPV) designs that were pioneered by other groups, as well as the more recent bipolar battery designs using active cooling that are being developed at NASA Lewis Research Center and under contract. These principles are rather straightforward applications of capillary force formalisms, coupled with the slowly developing data base resulting from careful post test analyses. The objective of this overall effort is directed towards the low-earth-orbit (LEO) application where the cycle life requirements are much more severe than the geosynchronous-orbit (GEO) application. A summary of the design principles employed is presented along with a discussion of the recommendations for component pore sizes and pore size distributions, as well as suggested materials of construction. These will be made based on our experience in these areas to show how these design principles have been translated into operating hardware.
An audience-channel-message-evaluation (ACME) framework for health communication campaigns.
Noar, Seth M
2012-07-01
Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework.
ERIC Educational Resources Information Center
Miller, T. E.; Herreid, C. H.
2008-01-01
This article presents a project intended to produce a model for predicting the risk of attrition of individual students enrolled at the University of South Florida. The project is premised upon the principle that college student attrition is as highly individual and personal as any other aspect of the college-going experience. Students make…
A “good death”: perspectives of Muslim patients and health care providers
Tayeb, Mohamad A.; Al-Zamel, Ersan; Fareed, Muhammed M.; Abouellail, Hesham A.
2010-01-01
BACKGROUND AND OBJECTIVES: Twelve “good death” principles have been identified that apply to Westerners. This study aimed to review the TFHCOP good death perception to determine its validity for Muslim patients and health care providers, and to identify and describe other components of the Muslim good death perspective. SUBJECTS AND METHODS: Participants included 284 Muslims of both genders with different nationalities and careers. We used a 12-question questionnaire based on the 12 principles of the TFHCOP good death definition, followed by face-to-face interviews. We used descriptive statistics to analyze questionnaire responses. However, for new themes, we used a grounded theory approach with a “constant comparisons” method. RESULT: On average, each participant agreed on eight principles of the questionnaire. Dignity, privacy, spiritual and emotional support, access to hospice care, ability to issue advance directives, and to have time to say goodbye were the top priorities. Participants identified three main domains. The first domain was related to faith and belief. The second domain included some principles related to self-esteem and person>s image to friends and family. The third domain was related to satisfaction about family security after the death of the patient. Professional role distinctions were more pronounced than were gender or nationality differences. CONCLUSION: Several aspects of «good death,» as perceived by Western communities, are not recognized as being important by many Muslim patients and health care providers. Furthermore, our study introduced three novel components of good death in Muslim society. PMID:20427938
Data Mining Technologies Inspired from Visual Principle
NASA Astrophysics Data System (ADS)
Xu, Zongben
In this talk we review the recent work done by our group on data mining (DM) technologies deduced from simulating visual principle. Through viewing a DM problem as a cognition problems and treading a data set as an image with each light point located at a datum position, we developed a series of high efficient algorithms for clustering, classification and regression via mimicking visual principles. In pattern recognition, human eyes seem to possess a singular aptitude to group objects and find important structure in an efficient way. Thus, a DM algorithm simulating visual system may solve some basic problems in DM research. From this point of view, we proposed a new approach for data clustering by modeling the blurring effect of lateral retinal interconnections based on scale space theory. In this approach, as the data image blurs, smaller light blobs merge into large ones until the whole image becomes one light blob at a low enough level of resolution. By identifying each blob with a cluster, the blurring process then generates a family of clustering along the hierarchy. The proposed approach provides unique solutions to many long standing problems, such as the cluster validity and the sensitivity to initialization problems, in clustering. We extended such an approach to classification and regression problems, through combatively employing the Weber's law in physiology and the cell response classification facts. The resultant classification and regression algorithms are proven to be very efficient and solve the problems of model selection and applicability to huge size of data set in DM technologies. We finally applied the similar idea to the difficult parameter setting problem in support vector machine (SVM). Viewing the parameter setting problem as a recognition problem of choosing a visual scale at which the global and local structures of a data set can be preserved, and the difference between the two structures be maximized in the feature space, we derived a direct parameter setting formula for the Gaussian SVM. The simulations and applications show that the suggested formula significantly outperforms the known model selection methods in terms of efficiency and precision.
Pourat, Nadereh; Charles, Shana A; Snyder, Sophie
2016-03-01
Care delivery redesign in the form of patient-centered medical home (PCMH) is considered as a potential solution to improve patient outcomes and reduce costs, particularly for patients with chronic conditions. But studies of prevalence or impact at the population level are rare. We aimed to assess whether desired outcomes indicating better care delivery and patient-centeredness were associated with receipt of care according to 3 important PCMH principles. We analyzed data from a representative population survey in California in 2009, focusing on a population with chronic condition who had a usual source of care. We used bivariate, logistic, and negative-binomial regressions. The indicators of PCMH concordant care included continuity of care (personal doctor), care coordination, and care management (individual treatment plan). Outcomes included flu shots, count of outpatient visits, any emergency department visit, timely provider communication, and confidence in self-care. We found that patients whose care was concordant with all 3 PCMH principles were more likely to receive flu shots, more outpatient care, and timely response from providers. Concordance with 2 principles led to some desired outcomes. Concordance with only 1 principle was not associated with desired outcomes. Patients who received care that met 3 key aspects of PCMH: coordination, continuity, and management, had better quality of care and more efficient use of the health care system.
NASA Astrophysics Data System (ADS)
Shadrina, A.; Saruev, L.; Vasenin, S.
2016-09-01
This paper addresses the effectiveness of impact energy use in pilot bore directional drilling at pipe driving. We establish and develop new design-engineering principles for this method. These principles are based on a drill string construction with a new nipple thread connection and a generator construction of strain waves transferred through the drill string. The experiment was conducted on a test bench. Strain measurement is used to estimate compression, tensile, shear and bending stresses in the drill string during the propagation of elastic waves. Finally, the main directions of pilot bore directional drilling improvement during pipe driving are determinated. The new engineering design, as components of the pilot bore directional drilling technology are presented.
Mechatronics design principles for biotechnology product development.
Mandenius, Carl-Fredrik; Björkman, Mats
2010-05-01
Traditionally, biotechnology design has focused on the manufacture of chemicals and biologics. Still, a majority of biotechnology products that appear on the market today is the result of mechanical-electric (mechatronic) construction. For these, the biological components play decisive roles in the design solution; the biological entities are either integral parts of the design, or are transformed by the mechatronic system. This article explains how the development and production engineering design principles used for typical mechanical products can be adapted to the demands of biotechnology products, and how electronics, mechanics and biology can be integrated more successfully. We discuss three emerging areas of biotechnology in which mechatronic design principles can apply: stem cell manufacture, artificial organs, and bioreactors. Copyright 2010 Elsevier Ltd. All rights reserved.
Designing Serious Game Interventions for Individuals with Autism.
Whyte, Elisabeth M; Smyth, Joshua M; Scherf, K Suzanne
2015-12-01
The design of "Serious games" that use game components (e.g., storyline, long-term goals, rewards) to create engaging learning experiences has increased in recent years. We examine of the core principles of serious game design and examine the current use of these principles in computer-based interventions for individuals with autism. Participants who undergo these computer-based interventions often show little evidence of the ability to generalize such learning to novel, everyday social communicative interactions. This lack of generalized learning may result, in part, from the limited use of fundamental elements of serious game design that are known to maximize learning. We suggest that future computer-based interventions should consider the full range of serious game design principles that promote generalization of learning.
Estimated long-term outdoor air pollution concentrations in a cohort study
NASA Astrophysics Data System (ADS)
Beelen, Rob; Hoek, Gerard; Fischer, Paul; Brandt, Piet A. van den; Brunekreef, Bert
Several recent studies associated long-term exposure to air pollution with increased mortality. An ongoing cohort study, the Netherlands Cohort Study on Diet and Cancer (NLCS), was used to study the association between long-term exposure to traffic-related air pollution and mortality. Following on a previous exposure assessment study in the NLCS, we improved the exposure assessment methods. Long-term exposure to nitrogen dioxide (NO 2), nitrogen oxide (NO), black smoke (BS), and sulphur dioxide (SO 2) was estimated. Exposure at each home address ( N=21 868) was considered as a function of a regional, an urban and a local component. The regional component was estimated using inverse distance weighed interpolation of measurement data from regional background sites in a national monitoring network. Regression models with urban concentrations as dependent variables, and number of inhabitants in different buffers and land use variables, derived with a Geographic Information System (GIS), as predictor variables were used to estimate the urban component. The local component was assessed using a GIS and a digital road network with linked traffic intensities. Traffic intensity on the nearest road and on the nearest major road, and the sum of traffic intensity in a buffer of 100 m around each home address were assessed. Further, a quantitative estimate of the local component was estimated. The regression models to estimate the urban component explained 67%, 46%, 49% and 35% of the variances of NO 2, NO, BS, and SO 2 concentrations, respectively. Overall regression models which incorporated the regional, urban and local component explained 84%, 44%, 59% and 56% of the variability in concentrations for NO 2, NO, BS and SO 2, respectively. We were able to develop an exposure assessment model using GIS methods and traffic intensities that explained a large part of the variations in outdoor air pollution concentrations.
Poisson Mixture Regression Models for Heart Disease Prediction.
Mufudza, Chipo; Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criteria value. Furthermore, a Zero Inflated Poisson Mixture Regression model turned out to be the best model for heart prediction over all models as it both clusters individuals into high or low risk category and predicts rate to heart disease componentwise given clusters available. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression model.
Poisson Mixture Regression Models for Heart Disease Prediction
Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criteria value. Furthermore, a Zero Inflated Poisson Mixture Regression model turned out to be the best model for heart prediction over all models as it both clusters individuals into high or low risk category and predicts rate to heart disease componentwise given clusters available. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression model. PMID:27999611
DC motor proportional control system for orthotic devices
NASA Technical Reports Server (NTRS)
Blaise, H. T.; Allen, J. R.
1972-01-01
Multi-channel proportional control system for operation of dc motors for use with externally-powered orthotic arm braces is described. Components of circuitry and principles of operation are described. Schematic diagram of control circuit is provided.
NASA Astrophysics Data System (ADS)
LaRue, James P.; Luzanov, Yuriy
2013-05-01
A new extension to the way in which the Bidirectional Associative Memory (BAM) algorithms are implemented is presented here. We will show that by utilizing the singular value decomposition (SVD) and integrating principles of independent component analysis (ICA) into the nullspace (NS) we have created a novel approach to mitigating spurious attractors. We demonstrate this with two applications. The first application utilizes a one-layer association while the second application is modeled after the several hierarchal associations of ventral pathways. The first application will detail the way in which we manage the associations in terms of matrices. The second application will take what we have learned from the first example and apply it to a cascade of a convolutional neural network (CNN) and perceptron this being our signal processing model of the ventral pathways, i.e., visual systems.
Young, Craig C; Campbell, Aaron D; Lemery, Jay; Young, David S
2015-12-01
Preparticipation evaluations (PPEs) are common in team, organized, or traditional sports but not common in wilderness sports or adventures. Regarding ethical, legal, and administrative considerations, the same principles can be used as in traditional sports. Clinicians should be trained to perform such a PPE to avoid missing essential components and to maximize the quality of the PPE. In general, participants' privacy should be observed; office-based settings may be best for professional and billing purposes, and adequate documentation of a complete evaluation, including clearance issues, should be essential components. Additional environmental and personal health issues relative to the wilderness activity should be documented, and referral for further screening should be made as deemed necessary, if unable to be performed by the primary clinician. Travel medicine principles should be incorporated, and recommendations for travel or adventure insurance should be made. Copyright © 2015. Published by Elsevier Inc.
Changes in Cross-Correlations as an Indicator for Systemic Risk
Zheng, Zeyu; Podobnik, Boris; Feng, Ling; Li, Baowen
2012-01-01
The 2008–2012 global financial crisis began with the global recession in December 2007 and exacerbated in September 2008, during which the U.S. stock markets lost 20% of value from its October 11 2007 peak. Various studies reported that financial crisis are associated with increase in both cross-correlations among stocks and stock indices and the level of systemic risk. In this paper, we study 10 different Dow Jones economic sector indexes, and applying principle component analysis (PCA) we demonstrate that the rate of increase in principle components with short 12-month time windows can be effectively used as an indicator of systemic risk—the larger the change of PC1, the higher the increase of systemic risk. Clearly, the higher the level of systemic risk, the more likely a financial crisis would occur in the near future. PMID:23185692
NASA Astrophysics Data System (ADS)
Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.
2016-09-01
In present paper, an ability to differentiate basal cell carcinoma (BCC) and healthy skin by combining multi-spectral autofluorescence imaging, principle component analysis (PCA), and linear discriminant analysis (LDA) has been demonstrated. For this purpose, the experimental setup, which includes excitation and detection branches, has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with the central wavelength of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the data of multi-spectral fluorescence imaging. Observed results of this pilot study highlight the advantages of proposed imaging technique for skin cancer diagnosis.
Anderson, Hazel P; Ward, Jamie
2015-05-01
Questionnaires have been developed for categorising grapheme-colour synaesthetes into two sub-types based on phenomenology: associators and projectors. The general approach has been to assume a priori the existence of two sub-types on a single dimension (with endpoints as projector and associator) rather than explore, in a data-driven fashion, other possible models. We collected responses from 175 grapheme-colour synaesthetes on two questionnaires, the Illustrated Synaesthetic Experience Questionnaire (Skelton, Ludwig, & Mohr, 2009) and Rouw and Scholte's (2007) Projector-Associator Questionnaire. After Principle Component Analysis both questionnaires were comprised of two factors which coincide with the projector/associator distinction. This suggests that projectors and associators are not opposites of each other, but separate dimensions of experience (e.g. some synaesthetes claim to be both, others claim to be neither). The revised questionnaires provide a useful tool for researchers and insights into the phenomenology of synaesthesia. Copyright © 2015 Elsevier Inc. All rights reserved.
Does the Coherent Lidar System Corroborate Non-Interaction of Waves (NIW)?
NASA Technical Reports Server (NTRS)
Prasad, Narasimha S.; Roychoudhari, Chandrasekhar
2013-01-01
The NIW (non-interaction of waves) property has been proposed by one of the coauthors. The NIW property states that in the absence of any "obstructing" detectors, all the Huygens-Fresnel secondary wavelets will continue to propagate unhindered and without interacting (interfering) with each other. Since a coherent lidar system incorporates complex behaviors of optical components with different polarizations including circular polarization for the transmitted radiation, then the question arises whether the NIW principle accommodate elliptical polarization of light. Elliptical polarization presumes the summation of orthogonally polarized electric field vectors which contradicts the NIW principle. In this paper, we present working of a coherent lidar system using Jones matrix formulation. The Jones matrix elements represent the anisotropic dipolar properties of molecules of optical components. Accordingly, when we use the Jones matrix methodology to analyze the coherent lidar system, we find that the system behavior is congruent with the NIW property.
Young, Craig C; Campbell, Aaron D; Lemery, Jay; Young, David S
2015-09-01
Preparticipation evaluations (PPEs) are common in team, organized, or traditional sports but not common in wilderness sports or adventures. Regarding ethical, legal, and administrative considerations, the same principles can be used as in traditional sports. Clinicians should be trained to perform such a PPE to avoid missing essential components and to maximize the quality of the PPE. In general, participants' privacy should be observed; office-based settings may be best for professional and billing purposes, and adequate documentation of a complete evaluation, including clearance issues, should be essential components. Additional environmental and personal health issues relative to the wilderness activity should be documented, and referral for further screening should be made as deemed necessary, if unable to be performed by the primary clinician. Travel medicine principles should be incorporated, and recommendations for travel or adventure insurance should be made.
Who let the demon out? Laplace and Boscovich on determinism.
Kožnjak, Boris
2015-06-01
In this paper, I compare Pierre-Simon Laplace's celebrated formulation of the principle of determinism in his 1814 Essai philosophique sur les probabilités with the formulation of the same principle offered by Roger Joseph Boscovich in his Theoria philosophiae naturalis, published 56 years earlier. This comparison discloses a striking general similarity between the two formulations of determinism as well as certain important differences. Regarding their similarities, both Boscovich's and Laplace's conceptions of determinism involve two mutually interdependent components-ontological and epistemic-and they are both intimately linked with the principles of causality and continuity. Regarding their differences, however, Boscovich's formulation of the principle of determinism turns out not only to be temporally prior to Laplace's but also-being founded on fewer metaphysical principles and more rooted in and elaborated by physical assumptions-to be more precise, complete and comprehensive than Laplace's somewhat parenthetical statement of the doctrine. A detailed analysis of these similarities and differences, so far missing in the literature on the history and philosophy of the concept of determinism, is the main goal of the present paper. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ahmed, S.; Abdul-Aziz, O. I.
2015-12-01
We used a systematic data-analytics approach to analyze and quantify relative linkages of four stream water quality indicators (total nitrogen, TN; total phosphorus, TP; chlorophyll-a, Chla; and dissolved oxygen, DO) with six land use and four hydrologic variables, along with the potential external (upstream in-land and downstream coastal) controls in highly complex coastal urban watersheds of southeast Florida, U.S.A. Multivariate pattern recognition techniques of principle component and factor analyses, in concert with Pearson correlation analysis, were applied to map interrelations and identify latent patterns of the participatory variables. Relative linkages of the in-stream water quality variables with their associated drivers were then quantified by developing dimensionless partial least squares (PLS) regression model based on standardized data. Model fitting efficiency (R2=0.71-0.87) and accuracy (ratio of root-mean-square error to the standard deviation of the observations, RSR=0.35-0.53) suggested good predictions of the water quality variables in both wet and dry seasons. Agricultural land and groundwater exhibited substantial controls on surface water quality. In-stream TN concentration appeared to be mostly contributed by the upstream water entering from Everglades in both wet and dry seasons. In contrast, watershed land uses had stronger linkages with TP and Chla than that of the watershed hydrologic and upstream (Everglades) components for both seasons. Both land use and hydrologic components showed strong linkages with DO in wet season; however, the land use linkage appeared to be less in dry season. The data-analytics method provided a comprehensive empirical framework to achieve crucial mechanistic insights into the urban stream water quality processes. Our study quantitatively identified dominant drivers of water quality, indicating key management targets to maintain healthy stream ecosystems in complex urban-natural environments near the coast.
NASA Technical Reports Server (NTRS)
Kalton, G.
1983-01-01
A number of surveys were conducted to study the relationship between the level of aircraft or traffic noise exposure experienced by people living in a particular area and their annoyance with it. These surveys generally employ a clustered sample design which affects the precision of the survey estimates. Regression analysis of annoyance on noise measures and other variables is often an important component of the survey analysis. Formulae are presented for estimating the standard errors of regression coefficients and ratio of regression coefficients that are applicable with a two- or three-stage clustered sample design. Using a simple cost function, they also determine the optimum allocation of the sample across the stages of the sample design for the estimation of a regression coefficient.
2011-01-01
Background Hemorrhagic fever with renal syndrome (HFRS) is an important infectious disease caused by different species of hantaviruses. As a rodent-borne disease with a seasonal distribution, external environmental factors including climate factors may play a significant role in its transmission. The city of Shenyang is one of the most seriously endemic areas for HFRS. Here, we characterized the dynamic temporal trend of HFRS, and identified climate-related risk factors and their roles in HFRS transmission in Shenyang, China. Methods The annual and monthly cumulative numbers of HFRS cases from 2004 to 2009 were calculated and plotted to show the annual and seasonal fluctuation in Shenyang. Cross-correlation and autocorrelation analyses were performed to detect the lagged effect of climate factors on HFRS transmission and the autocorrelation of monthly HFRS cases. Principal component analysis was constructed by using climate data from 2004 to 2009 to extract principal components of climate factors to reduce co-linearity. The extracted principal components and autocorrelation terms of monthly HFRS cases were added into a multiple regression model called principal components regression model (PCR) to quantify the relationship between climate factors, autocorrelation terms and transmission of HFRS. The PCR model was compared to a general multiple regression model conducted only with climate factors as independent variables. Results A distinctly declining temporal trend of annual HFRS incidence was identified. HFRS cases were reported every month, and the two peak periods occurred in spring (March to May) and winter (November to January), during which, nearly 75% of the HFRS cases were reported. Three principal components were extracted with a cumulative contribution rate of 86.06%. Component 1 represented MinRH0, MT1, RH1, and MWV1; component 2 represented RH2, MaxT3, and MAP3; and component 3 represented MaxT2, MAP2, and MWV2. The PCR model was composed of three principal components and two autocorrelation terms. The association between HFRS epidemics and climate factors was better explained in the PCR model (F = 446.452, P < 0.001, adjusted R2 = 0.75) than in the general multiple regression model (F = 223.670, P < 0.000, adjusted R2 = 0.51). Conclusion The temporal distribution of HFRS in Shenyang varied in different years with a distinctly declining trend. The monthly trends of HFRS were significantly associated with local temperature, relative humidity, precipitation, air pressure, and wind velocity of the different previous months. The model conducted in this study will make HFRS surveillance simpler and the control of HFRS more targeted in Shenyang. PMID:22133347
Construction of mathematical model for measuring material concentration by colorimetric method
NASA Astrophysics Data System (ADS)
Liu, Bing; Gao, Lingceng; Yu, Kairong; Tan, Xianghua
2018-06-01
This paper use the method of multiple linear regression to discuss the data of C problem of mathematical modeling in 2017. First, we have established a regression model for the concentration of 5 substances. But only the regression model of the substance concentration of urea in milk can pass through the significance test. The regression model established by the second sets of data can pass the significance test. But this model exists serious multicollinearity. We have improved the model by principal component analysis. The improved model is used to control the system so that it is possible to measure the concentration of material by direct colorimetric method.
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
The Gyroplane : Its Principles and Its Possibilities
NASA Technical Reports Server (NTRS)
Breguet, Louis
1937-01-01
This report begins by indicating several simple principles concerning the velocity distribution over the blades of a lifting propeller of diameter D, revolving at n revolutions per second, and animated by a horizontal movement of translation at speed V. The calculation, compared with the test data, has shown that the aerodynamic action of the air on the blades depends almost only on the velocity components in a plane at right angles to the blade span. A history of gyroplane and gyrocopter development is presented as well as the advantages in using both types of craft.
Social Learning Theory: its application in the context of nurse education.
Bahn, D
2001-02-01
Cognitive theories are fundamental to enable problem solving and the ability to understand and apply principles in a variety of situations. This article looks at Social Learning Theory, critically analysing its principles, which are based on observational learning and modelling, and considering its value and application in the context of nurse education. It also considers the component processes that will determine the outcome of observed behaviour, other than reinforcement, as identified by Bandura, namely: attention, retention, motor reproduction, and motivation. Copyright 2001 Harcourt Publishers Ltd.
The principles and conduct of anaesthesia for emergency surgery.
Gray, L D; Morris, C
2013-01-01
In this second article we examine the principles underlying delivery of the components of anaesthesia. Topics considered include anaesthetic technique, management of the airway and lung ventilation, induction and maintenance of anaesthesia, patient monitoring including the place of cardiac output devices. We summarise recent research on the management of shock and sepsis syndromes including goal directed therapy and examine some controversies around intravenous fluid therapy. Finally, we discuss intra-operative awareness and challenges during emergence including peri-operative cognitive dysfunction. Anaesthesia © 2012 The Association of Anaesthetists of Great Britain and Ireland.
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1991-01-01
A summary and viewgraphs of a discussion presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop is included. Some of the experiences of the Scientific Computing Division at the National Center for Atmospheric Research (NCAR) dealing the the 'data problem' are discussed. A brief history and a development of some basic mass storage system (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. Future MSS needs for future computing environments is discussed.
Using principles from emergency management to improve emergency response plans for research animals.
Vogelweid, Catherine M
2013-10-01
Animal research regulatory agencies have issued updated requirements for emergency response planning by regulated research institutions. A thorough emergency response plan is an essential component of an institution's animal care and use program, but developing an effective plan can be a daunting task. The author provides basic information drawn from the field of emergency management about best practices for developing emergency response plans. Planners should use the basic principles of emergency management to develop a common-sense approach to managing emergencies in their facilities.
Monakhova, Yulia B; Godelmann, Rolf; Kuballa, Thomas; Mushtakova, Svetlana P; Rutledge, Douglas N
2015-08-15
Discriminant analysis (DA) methods, such as linear discriminant analysis (LDA) or factorial discriminant analysis (FDA), are well-known chemometric approaches for solving classification problems in chemistry. In most applications, principle components analysis (PCA) is used as the first step to generate orthogonal eigenvectors and the corresponding sample scores are utilized to generate discriminant features for the discrimination. Independent components analysis (ICA) based on the minimization of mutual information can be used as an alternative to PCA as a preprocessing tool for LDA and FDA classification. To illustrate the performance of this ICA/DA methodology, four representative nuclear magnetic resonance (NMR) data sets of wine samples were used. The classification was performed regarding grape variety, year of vintage and geographical origin. The average increase for ICA/DA in comparison with PCA/DA in the percentage of correct classification varied between 6±1% and 8±2%. The maximum increase in classification efficiency of 11±2% was observed for discrimination of the year of vintage (ICA/FDA) and geographical origin (ICA/LDA). The procedure to determine the number of extracted features (PCs, ICs) for the optimum DA models was discussed. The use of independent components (ICs) instead of principle components (PCs) resulted in improved classification performance of DA methods. The ICA/LDA method is preferable to ICA/FDA for recognition tasks based on NMR spectroscopic measurements. Copyright © 2015 Elsevier B.V. All rights reserved.
The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.
Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin
2007-11-01
This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasizes safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
NASA Astrophysics Data System (ADS)
Kim, Young-Pil; Hong, Mi-Young; Shon, Hyun Kyong; Chegal, Won; Cho, Hyun Mo; Moon, Dae Won; Kim, Hak-Sung; Lee, Tae Geol
2008-12-01
Interaction between streptavidin and biotin on poly(amidoamine) (PAMAM) dendrimer-activated surfaces and on self-assembled monolayers (SAMs) was quantitatively studied by using time-of-flight secondary ion mass spectrometry (ToF-SIMS). The surface protein density was systematically varied as a function of protein concentration and independently quantified using the ellipsometry technique. Principal component analysis (PCA) and principal component regression (PCR) were used to identify a correlation between the intensities of the secondary ion peaks and the surface protein densities. From the ToF-SIMS and ellipsometry results, a good linear correlation of protein density was found. Our study shows that surface protein densities are higher on dendrimer-activated surfaces than on SAMs surfaces due to the spherical property of the dendrimer, and that these surface protein densities can be easily quantified with high sensitivity in a label-free manner by ToF-SIMS.
González-Costa, Juan José; Reigosa, Manuel Joaquín; Matías, José María; Fernández-Covelo, Emma
2017-01-01
This study determines the influence of the different soil components and of the cation-exchange capacity on the adsorption and retention of different heavy metals: cadmium, chromium, copper, nickel, lead and zinc. In order to do so, regression models were created through decision trees and the importance of soil components was assessed. Used variables were: humified organic matter, specific cation-exchange capacity, percentages of sand and silt, proportions of Mn, Fe and Al oxides and hematite, and the proportion of quartz, plagioclase and mica, and the proportions of the different clays: kaolinite, vermiculite, gibbsite and chlorite. The most important components in the obtained models were vermiculite and gibbsite, especially for the adsorption of cadmium and zinc, while clays were less relevant. Oxides are less important than clays, especially for the adsorption of chromium and lead and the retention of chromium, copper and lead. PMID:28072849
On the influence of high-pass filtering on ICA-based artifact reduction in EEG-ERP.
Winkler, Irene; Debener, Stefan; Müller, Klaus-Robert; Tangermann, Michael
2015-01-01
Standard artifact removal methods for electroencephalographic (EEG) signals are either based on Independent Component Analysis (ICA) or they regress out ocular activity measured at electrooculogram (EOG) channels. Successful ICA-based artifact reduction relies on suitable pre-processing. Here we systematically evaluate the effects of high-pass filtering at different frequencies. Offline analyses were based on event-related potential data from 21 participants performing a standard auditory oddball task and an automatic artifactual component classifier method (MARA). As a pre-processing step for ICA, high-pass filtering between 1-2 Hz consistently produced good results in terms of signal-to-noise ratio (SNR), single-trial classification accuracy and the percentage of `near-dipolar' ICA components. Relative to no artifact reduction, ICA-based artifact removal significantly improved SNR and classification accuracy. This was not the case for a regression-based approach to remove EOG artifacts.
Threshold regression to accommodate a censored covariate.
Qian, Jing; Chiou, Sy Han; Maye, Jacqueline E; Atem, Folefac; Johnson, Keith A; Betensky, Rebecca A
2018-06-22
In several common study designs, regression modeling is complicated by the presence of censored covariates. Examples of such covariates include maternal age of onset of dementia that may be right censored in an Alzheimer's amyloid imaging study of healthy subjects, metabolite measurements that are subject to limit of detection censoring in a case-control study of cardiovascular disease, and progressive biomarkers whose baseline values are of interest, but are measured post-baseline in longitudinal neuropsychological studies of Alzheimer's disease. We propose threshold regression approaches for linear regression models with a covariate that is subject to random censoring. Threshold regression methods allow for immediate testing of the significance of the effect of a censored covariate. In addition, they provide for unbiased estimation of the regression coefficient of the censored covariate. We derive the asymptotic properties of the resulting estimators under mild regularity conditions. Simulations demonstrate that the proposed estimators have good finite-sample performance, and often offer improved efficiency over existing methods. We also derive a principled method for selection of the threshold. We illustrate the approach in application to an Alzheimer's disease study that investigated brain amyloid levels in older individuals, as measured through positron emission tomography scans, as a function of maternal age of dementia onset, with adjustment for other covariates. We have developed an R package, censCov, for implementation of our method, available at CRAN. © 2018, The International Biometric Society.
Analysis of Sting Balance Calibration Data Using Optimized Regression Models
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Bader, Jon B.
2010-01-01
Calibration data of a wind tunnel sting balance was processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm s two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation when terms of a regression model of a balance can directly be derived from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm s term selection process.
Determination of Components in Beverages by Thin-Layer Chromatography.
ERIC Educational Resources Information Center
Ma, Yinfa; Yeung, Edward S.
1990-01-01
Described is a simple and interesting chromatography experiment using three different fluorescence detection principles for the determination of caffeine, saccharin and sodium benzoate in beverages. Experimental procedures and an analysis and discussion of the results are included. (CW)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
... specifications (updated regarding components and weighting methodology).\\4\\ The only post- proposal difference in... practices, to promote just and equitable principles of trade, to remove impediments to and perfect the...
IMPROVING THE TMDL PROCESS USING WATERSHED RISK ASSESSMENT PRINCIPLES
Watershed ecological risk assessment (WERA) evaluates potential causal relationships between multiple sources and stressors and impacts on valued ecosystem components. This has many similarities tothe placed-based analuses that are undertaken to develop total maximum daily loads...
LIFE CYCLE ASSESSMENT: INVENTORY GUIDELINES AND PRINCIPLES
The U.S. Environmental Protection Agency (EPA) is describing the process, the underlying data, and the Inherent assumptions Involved in conducting the Inventory component of a life-cycle assessment (LCA) In order to facilitate understanding by potential users. This Inventory...
Management Development at Hewlett-Packard.
ERIC Educational Resources Information Center
Nilsson, William P.
This presentation describes the principles and policies underlying the successful management development program at Hewlett-Packard Company, a manufacturer of electronic instruments and components. The company is organized into relatively independent product divisions with decentralized decision-making responsibilities, flexible working hours, and…
Automatic method of measuring silicon-controlled-rectifier holding current
NASA Technical Reports Server (NTRS)
Maslowski, E. A.
1972-01-01
Development of automated silicon controlled rectifier circuit for measuring minimum anode current required to maintain rectifiers in conducting state is discussed. Components of circuit are described and principles of operation are explained. Illustration of circuit is provided.
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport Modelling is important. For certain cases, the conventional model still has to be used, in which having a good trip production model is capital. A good model can only be obtained from a good sample. Two of the basic principles of a good sampling is having a sample capable to represent the population characteristics and capable to produce an acceptable error at a certain confidence level. It seems that this principle is not yet quite understood and used in trip production modeling. Therefore, investigating the Trip Production Modelling practice in Indonesia and try to formulate a better modeling method for ensuring the Model Quality is necessary. This research result is presented as follows. Statistics knows a method to calculate span of prediction value at a certain confidence level for linear regression, which is called Confidence Interval of Predicted Value. The common modeling practice uses R2 as the principal quality measure, the sampling practice varies and not always conform to the sampling principles. An experiment indicates that small sample is already capable to give excellent R2 value and sample composition can significantly change the model. Hence, good R2 value, in fact, does not always mean good model quality. These lead to three basic ideas for ensuring good model quality, i.e. reformulating quality measure, calculation procedure, and sampling method. A quality measure is defined as having a good R2 value and a good Confidence Interval of Predicted Value. Calculation procedure must incorporate statistical calculation method and appropriate statistical tests needed. A good sampling method must incorporate random well distributed stratified sampling with a certain minimum number of samples. These three ideas need to be more developed and tested.
Zhang, Yuanke; Lu, Hongbing; Rong, Junyan; Meng, Jing; Shang, Junliang; Ren, Pinghong; Zhang, Junying
2017-09-01
Low-dose CT (LDCT) technique can reduce the x-ray radiation exposure to patients at the cost of degraded images with severe noise and artifacts. Non-local means (NLM) filtering has shown its potential in improving LDCT image quality. However, currently most NLM-based approaches employ a weighted average operation directly on all neighbor pixels with a fixed filtering parameter throughout the NLM filtering process, ignoring the non-stationary noise nature of LDCT images. In this paper, an adaptive NLM filtering scheme on local principle neighborhoods (PC-NLM) is proposed for structure-preserving noise/artifacts reduction in LDCT images. Instead of using neighboring patches directly, in the PC-NLM scheme, the principle component analysis (PCA) is first applied on local neighboring patches of the target patch to decompose the local patches into uncorrelated principle components (PCs), then a NLM filtering is used to regularize each PC of the target patch and finally the regularized components is transformed to get the target patch in image domain. Especially, in the NLM scheme, the filtering parameter is estimated adaptively from local noise level of the neighborhood as well as the signal-to-noise ratio (SNR) of the corresponding PC, which guarantees a "weaker" NLM filtering on PCs with higher SNR and a "stronger" filtering on PCs with lower SNR. The PC-NLM procedure is iteratively performed several times for better removal of the noise and artifacts, and an adaptive iteration strategy is developed to reduce the computational load by determining whether a patch should be processed or not in next round of the PC-NLM filtering. The effectiveness of the presented PC-NLM algorithm is validated by experimental phantom studies and clinical studies. The results show that it can achieve promising gain over some state-of-the-art methods in terms of artifact suppression and structure preservation. With the use of PCA on local neighborhoods to extract principal structural components, as well as adaptive NLM filtering on PCs of the target patch using filtering parameter estimated based on the local noise level and corresponding SNR, the proposed PC-NLM method shows its efficacy in preserving fine anatomical structures and suppressing noise/artifacts in LDCT images. © 2017 American Association of Physicists in Medicine.
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.
Cunningham, Jennifer; Wallston, Kenneth A; Wilkins, Consuelo H; Hull, Pamela C; Miller, Stephania T
2015-12-01
This study describes the development and psychometric evaluation of HPV Clinical Trial Survey for Parents with Children Aged 9 to 15 (CTSP-HPV) using traditional instrument development methods and community engagement principles. An expert panel and parental input informed survey content and parents recommended study design changes (e.g., flyer wording). A convenience sample of 256 parents completed the final survey measuring parental willingness to consent to HPV clinical trial (CT) participation and other factors hypothesized to influence willingness (e.g., HPV vaccine benefits). Cronbach's a, Spearman correlations, and multiple linear regression were used to estimate internal consistency, convergent and discriminant validity, and predictively validity, respectively. Internal reliability was confirmed for all scales (a ≥ 0.70.). Parental willingness was positively associated (p < 0.05) with trust in medical researchers, adolescent CT knowledge, HPV vaccine benefits, advantages of adolescent CTs (r range 0.33-0.42), supporting convergent validity. Moderate discriminant construct validity was also demonstrated. Regression results indicate reasonable predictive validity with the six scales accounting for 31% of the variance in parents' willingness. This instrument can inform interventions based on factors that influence parental willingness, which may lead to the eventual increase in trial participation. Further psychometric testing is warranted. © 2015 Wiley Periodicals, Inc.
Frequency-domain nonlinear regression algorithm for spectral analysis of broadband SFG spectroscopy.
He, Yuhan; Wang, Ying; Wang, Jingjing; Guo, Wei; Wang, Zhaohui
2016-03-01
The resonant spectral bands of the broadband sum frequency generation (BB-SFG) spectra are often distorted by the nonresonant portion and the lineshapes of the laser pulses. Frequency domain nonlinear regression (FDNLR) algorithm was proposed to retrieve the first-order polarization induced by the infrared pulse and to improve the analysis of SFG spectra through simultaneous fitting of a series of time-resolved BB-SFG spectra. The principle of FDNLR was presented, and the validity and reliability were tested by the analysis of the virtual and measured SFG spectra. The relative phase, dephasing time, and lineshapes of the resonant vibrational SFG bands can be retrieved without any preset assumptions about the SFG bands and the incident laser pulses.
Compensation effects in molecular interactions and the quantum chemical le Chatelier principle.
Mezey, Paul G
2015-05-28
Components of molecular interactions and various changes in the components of total energy changes during molecular processes typically exhibit some degrees of compensation. This may be as prominent as the over 90% compensation of the electronic energy and nuclear repulsion energy components of the total energy in some conformational changes. Some of these compensations are enhanced by solvent effects. For various arrangements of ions in a solvent, however, not only compensation but also a formal, mutual enhancement between the electronic energy and nuclear repulsion energy components of the total energy may also occur, when the tools of nuclear charge variation are applied to establish quantum chemically rigorous energy inequalities.
NASA Astrophysics Data System (ADS)
Di Martino, Gerardo; Iodice, Antonio; Natale, Antonio; Riccio, Daniele; Ruello, Giuseppe
2015-04-01
The recently proposed polarimetric two-scale two- component model (PTSTCM) in principle allows us obtaining a reasonable estimation of the soil moisture even in moderately vegetated areas, where the volumetric scattering contribution is non-negligible, provided that the surface component is dominant and the double-bounce component is negligible. Here we test the PTSTCM validity range by applying it to polarimetric SAR data acquired on areas for which, at the same times of SAR acquisitions, ground measurements of soil moisture were performed. In particular, we employ the AGRISAR'06 database, which includes data from several fields covering a period that spans all the phases of vegetation growth.
NASA Technical Reports Server (NTRS)
Palusinski, O. A.; Allgyer, T. T.; Mosher, R. A.; Bier, M.; Saville, D. A.
1981-01-01
A mathematical model of isoelectric focusing at the steady state has been developed for an M-component system of electrochemically defined ampholytes. The model is formulated from fundamental principles describing the components' chemical equilibria, mass transfer resulting from diffusion and electromigration, and electroneutrality. The model consists of ordinary differential equations coupled with a system of algebraic equations. The model is implemented on a digital computer using FORTRAN-based simulation software. Computer simulation data are presented for several two-component systems showing the effects of varying the isoelectric points and dissociation constants of the constituents.
Jusot, Florence; Tubeuf, Sandy; Trannoy, Alain
2013-12-01
The way to treat the correlation between circumstances and effort is a central, yet largely neglected issue in the applied literature on inequality of opportunity. This paper adopts three alternative normative ways of treating this correlation championed by Roemer, Barry and Swift and assesses their empirical relevance using survey data. We combine regression analysis with the natural decomposition of the variance to compare the relative contributions of circumstances and efforts to overall health inequality according to the different normative principles. Our results suggest that, in practice, the normative principle on the way to treat the correlation between circumstances and effort makes little difference on the relative contributions of circumstances and efforts to explained health inequality. Copyright © 2013 John Wiley & Sons, Ltd.
Orthogonal decomposition of left ventricular remodeling in myocardial infarction
Zhang, Xingyu; Medrano-Gracia, Pau; Ambale-Venkatesh, Bharath; Bluemke, David A.; Cowan, Brett R; Finn, J. Paul; Kadish, Alan H.; Lee, Daniel C.; Lima, Joao A. C.; Young, Alistair A.; Suinesiaputra, Avan
2017-01-01
Abstract Left ventricular size and shape are important for quantifying cardiac remodeling in response to cardiovascular disease. Geometric remodeling indices have been shown to have prognostic value in predicting adverse events in the clinical literature, but these often describe interrelated shape changes. We developed a novel method for deriving orthogonal remodeling components directly from any (moderately independent) set of clinical remodeling indices. Results: Six clinical remodeling indices (end-diastolic volume index, sphericity, relative wall thickness, ejection fraction, apical conicity, and longitudinal shortening) were evaluated using cardiac magnetic resonance images of 300 patients with myocardial infarction, and 1991 asymptomatic subjects, obtained from the Cardiac Atlas Project. Partial least squares (PLS) regression of left ventricular shape models resulted in remodeling components that were optimally associated with each remodeling index. A Gram–Schmidt orthogonalization process, by which remodeling components were successively removed from the shape space in the order of shape variance explained, resulted in a set of orthonormal remodeling components. Remodeling scores could then be calculated that quantify the amount of each remodeling component present in each case. A one-factor PLS regression led to more decoupling between scores from the different remodeling components across the entire cohort, and zero correlation between clinical indices and subsequent scores. Conclusions: The PLS orthogonal remodeling components had similar power to describe differences between myocardial infarction patients and asymptomatic subjects as principal component analysis, but were better associated with well-understood clinical indices of cardiac remodeling. The data and analyses are available from www.cardiacatlas.org. PMID:28327972
Ohseto, Hisashi; Ishikuro, Mami; Kikuya, Masahiro; Obara, Taku; Igarashi, Yuko; Takahashi, Satomi; Kikuchi, Daisuke; Shigihara, Michiko; Yamanaka, Chizuru; Miyashita, Masako; Mizuno, Satoshi; Nagai, Masato; Matsubara, Hiroko; Sato, Yuki; Metoki, Hirohito; Tachibana, Hirofumi; Maeda-Yamamoto, Mari; Kuriyama, Shinichi
2018-04-01
Metabolic syndrome and the presence of metabolic syndrome components are risk factors for cardiovascular disease (CVD). However, the association between personality traits and metabolic syndrome remains controversial, and few studies have been conducted in East Asian populations. We measured personality traits using the Japanese version of the Eysenck Personality Questionnaire (Revised Short Form) and five metabolic syndrome components-elevated waist circumference, elevated triglycerides, reduced high-density lipoprotein cholesterol, elevated blood pressure, and elevated fasting glucose-in 1322 participants aged 51.1±12.7years old from Kakegawa city, Japan. Metabolic syndrome score (MS score) was defined as the number of metabolic syndrome components present, and metabolic syndrome as having the MS score of 3 or higher. We performed multiple logistic regression analyses to examine the relationship between personality traits and metabolic syndrome components and multiple regression analyses to examine the relationship between personality traits and MS scores adjusted for age, sex, education, income, smoking status, alcohol use, and family history of CVD and diabetes mellitus. We also examine the relationship between personality traits and metabolic syndrome presence by multiple logistic regression analyses. "Extraversion" scores were higher in those with metabolic syndrome components (elevated waist circumference: P=0.001; elevated triglycerides: P=0.01; elevated blood pressure: P=0.004; elevated fasting glucose: P=0.002). "Extraversion" was associated with the MS score (coefficient=0.12, P=0.0003). No personality trait was significantly associated with the presence of metabolic syndrome. Higher "extraversion" scores were related to higher MS scores, but no personality trait was significantly associated with the presence of metabolic syndrome. Copyright © 2018 Elsevier Inc. All rights reserved.
Design Principles of Regulatory Networks: Searching for the Molecular Algorithms of the Cell
Lim, Wendell A.; Lee, Connie M.; Tang, Chao
2013-01-01
A challenge in biology is to understand how complex molecular networks in the cell execute sophisticated regulatory functions. Here we explore the idea that there are common and general principles that link network structures to biological functions, principles that constrain the design solutions that evolution can converge upon for accomplishing a given cellular task. We describe approaches for classifying networks based on abstract architectures and functions, rather than on the specific molecular components of the networks. For any common regulatory task, can we define the space of all possible molecular solutions? Such inverse approaches might ultimately allow the assembly of a design table of core molecular algorithms that could serve as a guide for building synthetic networks and modulating disease networks. PMID:23352241
Design of virtual SCADA simulation system for pressurized water reactor
NASA Astrophysics Data System (ADS)
Wijaksono, Umar; Abdullah, Ade Gafar; Hakim, Dadang Lukman
2016-02-01
The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper described the results of the virtual SCADA system design that aims to recognize the principle of the Nuclear Power Plant type Pressurized Water Reactor. This simulation uses technical data of the Nuclear Power Plant Unit Olkiluoto 3 in Finland. This device was developed using Wonderware Intouch, which is equipped with manual books for each component, animation links, alarm systems, real time and historical trending, and security system. The results showed that in general this device can demonstrate clearly the principles of energy flow and energy conversion processes in Pressurized Water Reactors. This virtual SCADA simulation system can be used as instructional media to recognize the principle of Pressurized Water Reactor.
NASA Astrophysics Data System (ADS)
Hassanzadeh, S.; Hosseinibalam, F.; Omidvari, M.
2008-04-01
Data of seven meteorological variables (relative humidity, wet temperature, dry temperature, maximum temperature, minimum temperature, ground temperature and sun radiation time) and ozone values have been used for statistical analysis. Meteorological variables and ozone values were analyzed using both multiple linear regression and principal component methods. Data for the period 1999-2004 are analyzed jointly using both methods. For all periods, temperature dependent variables were highly correlated, but were all negatively correlated with relative humidity. Multiple regression analysis was used to fit the meteorological variables using the meteorological variables as predictors. A variable selection method based on high loading of varimax rotated principal components was used to obtain subsets of the predictor variables to be included in the linear regression model of the meteorological variables. In 1999, 2001 and 2002 one of the meteorological variables was weakly influenced predominantly by the ozone concentrations. However, the model did not predict that the meteorological variables for the year 2000 were not influenced predominantly by the ozone concentrations that point to variation in sun radiation. This could be due to other factors that were not explicitly considered in this study.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of stand alone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task, and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massive parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
Effects of general principles of person transfer techniques on low back joint extension moment.
Katsuhira, Junji; Yamasaki, Syun; Yamamoto, Sumiko; Maruyama, Hitoshi
2010-01-01
The purpose of this study was to examine the effects of general principles of person transfer techniques specifically on the low back joint extension moment. These effects were examined by the following measurable quantitative parameters: 1) trunk bending angle, 2) knee flexion angle, 3) distance between the centers of gravity (COGs) of the caregiver and patient, representing the distance between the caregiver and patient, and 4) the vertical component of the ground reaction force representing the amount of the weight-bearing load on the caregiver's low back during transfers with and without assistive devices. Twenty students each took the role of caregiver, and one healthy adult simulated a patient. The participants performed three different transfer tasks: without any assistive device, with the patient wearing a low back belt, and with the caregiver using a transfer board. We found that the distance between the COGs and the vertical component of the ground reaction force, but not the trunk bending and knee flexion angles, were the variables that affected the low back joint extension moment. Our results suggest that the general principle of decreasing the distance between COGs is most effective for decreasing the low back joint extension moment during transfers under all conditions.
USDA-ARS?s Scientific Manuscript database
Improvement of cold tolerance of winter wheat (Triticum aestivum L.) through breeding methods has been problematic. A better understanding of how individual wheat cultivars respond to components of the freezing process may provide new information that can be used to develop more cold tolerance culti...
Which Components of Working Memory Are Important in the Writing Process?
ERIC Educational Resources Information Center
Vanderberg, Robert; Swanson, H. Lee
2007-01-01
This study investigated the relationship between components of working memory (WM) and the macrostructure (e.g., planning, writing, and revision) and microstructure (e.g., grammar, punctuation) of writing. A battery of WM and writing measures were administered to 160 high-school students. Overall, hierarchical regression analyses showed that the…
Age-Dependent and Age-Independent Measures of Locus of Control.
ERIC Educational Resources Information Center
Sherman, Lawrence W.; Hofmann, Richard
Using a longitudinal data set obtained from 169 pre-adolescent children between the ages of 8 and 13 years, this study statistically divided locus of control into two independent components. The first component was noted as "age-dependent" (AD) and was determined by predicted values generated by regressing children's ages onto their…
LIFE CYCLE ASSESSMENT: PRINCIPLES AND PRACTICE
The following document provides an introductory overview of Life Cycle Assessment (LCA) and describes the general uses and major components of LCA. This document is an update and merger of two previous EPA documents on LCA ("Life Cycle Assessment: Inventory Guidelines and Princip...
Code of Federal Regulations, 2014 CFR
2014-07-01
... a minimum, such persons must have education and experience in soil resistivity, stray current, structure-to-soil potential, and component electrical isolation measurements of buried metal piping and tank... reason of thorough knowledge of the physical sciences and the principles of engineering and mathematics...
Code of Federal Regulations, 2012 CFR
2012-07-01
... a minimum, such persons must have education and experience in soil resistivity, stray current, structure-to-soil potential, and component electrical isolation measurements of buried metal piping and tank... reason of thorough knowledge of the physical sciences and the principles of engineering and mathematics...
Code of Federal Regulations, 2013 CFR
2013-07-01
... a minimum, such persons must have education and experience in soil resistivity, stray current, structure-to-soil potential, and component electrical isolation measurements of buried metal piping and tank... reason of thorough knowledge of the physical sciences and the principles of engineering and mathematics...
ERIC Educational Resources Information Center
Foutes, William A.
Written in student performance terms, this curriculum guide on diesel engine repair is divided into the following eight sections: an orientation to the occupational field and instructional program; instruction in operating principles; instruction in engine components; instruction in auxiliary systems; instruction in fuel systems; instruction in…
Reliability testing across the Environmental Quality Index and national environmental indices.
One challenge in environmental epidemiology is the exploration of cumulative environmental exposure across multiple domains (e.g. air, water, land). The Environmental Quality Index (EQI), created by the U.S. EPA, uses principle component analyses combining environmental domains (...
Decety, Jean; Yoder, Keith J.
2015-01-01
Why do people tend to care for upholding principles of justice? This study examined the association between individual differences in the affective, motivational and cognitive components of empathy, sensitivity to justice, and psychopathy in participants (N 265) who were also asked to rate the permissibility of everyday moral situations that pit personal benefit against moral standards of justice. Counter to commonsense, emotional empathy was not associated with sensitivity to injustice for others. Rather, individual differences in cognitive empathy and empathic concern predicted sensitivity to justice for others, as well as the endorsement of moral rules. Psychopathy coldheartedness scores were inversely associated with motivation for justice. Moreover, hierarchical multiple linear regression analysis revealed that self-focused and other-focused orientations toward justice had opposing influences on the permissibility of moral judgments. High scores on psychopathy were associated with less moral condemnation of immoral behavior. Together, these results contribute to a better understanding of the information processing mechanisms underlying justice motivation, and may guide interventions designed to foster justice and moral behavior. In order to promote justice motivation, it may be more effective to encourage perspective taking and reasoning to induce concern for others than emphasizing emotional sharing with the misfortune of others. PMID:25768232
NASA Astrophysics Data System (ADS)
Records, R.; Fassnacht, S. R.; Arabi, M.; Duffy, W. G.
2014-12-01
Elevated total phosphorus (P) loading into Upper Klamath Lake, southern Oregon, United States has caused hypereutrophic conditions impacting endangered lake fish species. Increases in P loading have been attributed to land use changes, such as timber harvest and wetland drainage. The contribution of P to Upper Klamath Lake has been estimated from each major tributary, yet little research has explored what land use or other variables have most influence on P loading within the tributaries. In addition, previous work has shown a range of potential hydroclimatic shifts by the 2040s, with potential to alter P loading mechanisms. In this study, we use statistical methods including principle component analysis and multiple linear regression to determine what hydroclimatic and landscape variables best explain flow-weighted P concentration in the Sprague River, one of three main tributaries to Upper Klamath Lake. Identification of key variables affecting P loading has direct implications for management decisions in the Upper Klamath River Basin. Increases in P loading related to sediment loading are due to bank and upslope erosion. The former is more prevalent in areas of historic channel alteration and cattle grazing, while the latter is more dominant in areas of heavy timber harvesting and more precipitation as rain.
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets thus contributing to increase product and process understanding which is at the core of the Food and Drug Administration's Process Analytical Tools (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed with an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
Guastello, Stephen J; Craven, Joanna; Zygowicz, Karen M; Bock, Benjamin R
2005-07-01
The process by which an initially leaderless group differentiates into one containing leadership and secondary role structures was examined using the swallowtail catastrophe model and principles of selforganization. The objectives were to identify the control variables in the process of leadership emergence in creative problem solving groups and production groups. In the first of two experiments, groups of university students (total N = 114) played a creative problem solving game. Participants later rated each other on leadership behavior, styles, and variables related to the process of conversation. A performance quality measure was included also. Control parameters in the swallowtail catastrophe model were identified through a combination of factor analysis and nonlinear regression. Leaders displayed a broad spectrum of behaviors in the general categories of Controlling the Conversation and Creativity in their role-play. In the second experiment, groups of university students (total N = 197) engaged in a laboratory work experiment that had a substantial production goal component. The same system of ratings and modeling strategy was used along with a work production measure. Leaders in the production task emerged to the extent that they exhibited control over both the creative and production aspects of the task, they could keep tension low, and the externally imposed production goals were realistic.
Nyamukapa, Constance A.; Gregson, Simon; Lopman, Ben; Saito, Suzue; Watts, Helen J.; Monasch, Roeland; Jukes, Matthew C.H.
2008-01-01
Objectives. We measured the psychosocial effect of orphanhood in a sub-Saharan African population and evaluated a new framework for understanding the causes and consequences of psychosocial distress among orphans and other vulnerable children. Methods. The framework was evaluated using data from 5321 children aged 12 to 17 years who were interviewed in a 2004 national survey in Zimbabwe. We constructed a measure of psychosocial distress using principle components analysis. We used regression analyses to obtain standardized parameter estimates of psychosocial distress and odds ratios of early sexual activity. Results. Orphans had more psychosocial distress than did nonorphans. For both genders, paternal, maternal, and double orphans exhibited more-severe distress than did nonorphaned, nonvulnerable children. Orphanhood remained associated with psychosocial distress after we controlled for differences in more-proximate determinants. Maternal and paternal orphans were significantly more likely than were nonorphaned, nonvulnerable children to have engaged in sexual activity. These differences were reduced after we controlled for psychosocial distress. Conclusions. Orphaned adolescents in Zimbabwe suffer greater psychosocial distress than do nonorphaned, nonvulnerable children, which may lead to increased likelihood of early onset of sexual intercourse and HIV infection. The effect of strategies to provide psychosocial support should be evaluated scientifically. PMID:18048777
Muhammad, Said; Tahir Shah, M; Khan, Sardar
2010-10-01
The present study was conducted in Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from the springs, streams and Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and arsenic total). The water quality in Kohistan region was evaluated by comparing the physio-chemical parameters with permissible limits set by Pakistan environmental protection agency and world health organization. Most of the studied parameters were found within their respective permissible limits. However in some samples, the iron and arsenic concentrations exceeded their permissible limits. For health risk assessment of arsenic, the average daily dose, hazards quotient (HQ) and cancer risk were calculated by using statistical formulas. The values of HQ were found >1 in the samples collected from Jabba, Dubair, while HQ values were <1 in rest of the samples. This level of contamination should have low chronic risk and medium cancer risk when compared with US EPA guidelines. Furthermore, the inter-dependence of physio-chemical parameters and pollution load was also calculated by using multivariate statistical techniques like one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principle component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
Decety, Jean; Yoder, Keith J
2016-01-01
Why do people tend to care for upholding principles of justice? This study examined the association between individual differences in the affective, motivational and cognitive components of empathy, sensitivity to justice, and psychopathy in participants (N 265) who were also asked to rate the permissibility of everyday moral situations that pit personal benefit against moral standards of justice. Counter to common sense, emotional empathy was not associated with sensitivity to injustice for others. Rather, individual differences in cognitive empathy and empathic concern predicted sensitivity to justice for others, as well as the endorsement of moral rules. Psychopathy coldheartedness scores were inversely associated with motivation for justice. Moreover, hierarchical multiple linear regression analysis revealed that self-focused and other-focused orientations toward justice had opposing influences on the permissibility of moral judgments. High scores on psychopathy were associated with less moral condemnation of immoral behavior. Together, these results contribute to a better understanding of the information processing mechanisms underlying justice motivation, and may guide interventions designed to foster justice and moral behavior. In order to promote justice motivation, it may be more effective to encourage perspective taking and reasoning than emphasizing emotional sharing with the misfortune of others.
Qu, Mingkai; Wang, Yan; Huang, Biao; Zhao, Yongcun
2018-06-01
The traditional source apportionment models, such as absolute principal component scores-multiple linear regression (APCS-MLR), are usually susceptible to outliers, which may be widely present in the regional geochemical dataset. Furthermore, the models are merely built on variable space instead of geographical space and thus cannot effectively capture the local spatial characteristics of each source contributions. To overcome the limitations, a new receptor model, robust absolute principal component scores-robust geographically weighted regression (RAPCS-RGWR), was proposed based on the traditional APCS-MLR model. Then, the new method was applied to the source apportionment of soil metal elements in a region of Wuhan City, China as a case study. Evaluations revealed that: (i) RAPCS-RGWR model had better performance than APCS-MLR model in the identification of the major sources of soil metal elements, and (ii) source contributions estimated by RAPCS-RGWR model were more close to the true soil metal concentrations than that estimated by APCS-MLR model. It is shown that the proposed RAPCS-RGWR model is a more effective source apportionment method than APCS-MLR (i.e., non-robust and global model) in dealing with the regional geochemical dataset. Copyright © 2018 Elsevier B.V. All rights reserved.
Tremblay, Marlène; Crim, Stacy M; Cole, Dana J; Hoekstra, Robert M; Henao, Olga L; Döpfer, Dörte
2017-10-01
The Foodborne Diseases Active Surveillance Network (FoodNet) is currently using a negative binomial (NB) regression model to estimate temporal changes in the incidence of Campylobacter infection. FoodNet active surveillance in 483 counties collected data on 40,212 Campylobacter cases between years 2004 and 2011. We explored models that disaggregated these data to allow us to account for demographic, geographic, and seasonal factors when examining changes in incidence of Campylobacter infection. We hypothesized that modeling structural zeros and including demographic variables would increase the fit of FoodNet's Campylobacter incidence regression models. Five different models were compared: NB without demographic covariates, NB with demographic covariates, hurdle NB with covariates in the count component only, hurdle NB with covariates in both zero and count components, and zero-inflated NB with covariates in the count component only. Of the models evaluated, the nonzero-augmented NB model with demographic variables provided the best fit. Results suggest that even though zero inflation was not present at this level, individualizing the level of aggregation and using different model structures and predictors per site might be required to correctly distinguish between structural and observational zeros and account for risk factors that vary geographically.
Creep prediction of a layered fiberglass plastic
NASA Astrophysics Data System (ADS)
Aniskevich, K.; Korsgaard, J.; Mālmeisters, A.; Jansons, J.
1998-05-01
The results of short-term creep tests of a layered glass fiber/polyester resin plastic in tension at angles of 90, 70, and 45° to the direction of the principal fiber orientation are presented. The applicability of the principle of time-temperature analogy for the prediction of long-term creep of the composite and its structural components is revealed. The possibility of evaluating the viscoelastic properties of the composite from the properties of structural components is shown.
2014-12-01
Historically, MRT found its most extensive application in the inspection of critical high-strength steel components of the F-111 aircraft to...Steve Burke is Group Leader Acoustic Material Systems within Maritime Division and Task Leader for AIR 07/101 Assessment and Control of Aircraft ...Maritime Division. He has previously led research programs in advanced electromagnetic and ultrasonic NDE for aircraft applications. Geoff has BSc and BE
2014-12-01
and low Hmax. In this way, the flux density in the material for the specimen with intermediate k (k = 36.4) magnetised using Hmax = 250 Oe... aerospace components for surface-breaking fatigue cracks. In the residual-field variant of MRT, inspections are performed following the application...packages which can adequately predict the fields produced in practical residual-field MRT. Finally, central- conductor MRT, for which there are fewer
Measuring Accurately Single-Phase Sinusoidal and Non-Sinusoidal Power.
1983-01-01
current component. Since the induction watthour meter is designed for measuring ac variations only, the creation of a dc component in an ac circuit due...available and the basic principle of measurement used in each. 3.1 Power Measuring Meters Instruments designed to measure the amount of average power...1.0 percent of full scale and + 0.5% of reading. 3.2 Encrgy Measuring Meters Instruments designed to measure the amount of power consumed in a circuit
Discrimination of serum Raman spectroscopy between normal and colorectal cancer
NASA Astrophysics Data System (ADS)
Li, Xiaozhou; Yang, Tianyue; Yu, Ting; Li, Siqi
2011-07-01
Raman spectroscopy of tissues has been widely studied for the diagnosis of various cancers, but biofluids were seldom used as the analyte because of the low concentration. Herein, serum of 30 normal people, 46 colon cancer, and 44 rectum cancer patients were measured Raman spectra and analyzed. The information of Raman peaks (intensity and width) and that of the fluorescence background (baseline function coefficients) were selected as parameters for statistical analysis. Principal component regression (PCR) and partial least square regression (PLSR) were used on the selected parameters separately to see the performance of the parameters. PCR performed better than PLSR in our spectral data. Then linear discriminant analysis (LDA) was used on the principal components (PCs) of the two regression method on the selected parameters, and a diagnostic accuracy of 88% and 83% were obtained. The conclusion is that the selected features can maintain the information of original spectra well and Raman spectroscopy of serum has the potential for the diagnosis of colorectal cancer.
Endorsement of universal health coverage financial principles in Burkina Faso.
Agier, Isabelle; Ly, Antarou; Kadio, Kadidiatou; Kouanda, Seni; Ridde, Valéry
2016-02-01
In West Africa, health system funding rarely involves cross-subsidization among population segments. In some countries, a few community-based or professional health insurance programs are present, but coverage is very low. The financial principles underlying universal health coverage (UHC) sustainability and solidarity are threefold: 1) anticipation of potential health risks; 2) risk sharing and; 3) socio-economic status solidarity. In Burkina Faso, where decision-makers are favorable to national health insurance, we measured endorsement of these principles and discerned which management configurations would achieve the greatest adherence. We used a sequential exploratory design. In a qualitative step (9 interviews, 12 focus groups), we adapted an instrument proposed by Goudge et al. (2012) to the local context and addressed desirability bias. Then, in a quantitative step (1255 respondents from the general population), we measured endorsement. Thematic analysis (qualitative) and logistic regressions (quantitative) were used. High levels of endorsement were found for each principle. Actual practices showed that anticipation and risk sharing were not only intentions. Preferences were given to solidarity between socio-economic status (SES) levels and progressivity. Although respondents seemed to prefer the national level for implementation, their current solidarity practices were mainly focused on close family. Thus, contribution levels should be set so that the entire family benefits from healthcare. Some critical conditions must be met to make UHC financial principles a reality through health insurance in Burkina Faso: trust, fair and mandatory contributions, and education. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Revisiting the Relationship between Managed Care and Hospital Consolidation
Town, Robert J; Wholey, Douglas; Feldman, Roger; Burns, Lawton R
2007-01-01
Objective This paper analyzes whether the rise in managed care during the 1990s caused the increase in hospital concentration. Data Sources We assemble data from the American Hospital Association, InterStudy and government censuses from 1990 to 2000. Study Design We employ linear regression analyses on long differenced data to estimate the impact of managed care penetration on hospital consolidation. Instrumental variable analogs of these regressions are also analyzed to control for potential endogeneity. Data Collection All data are from secondary sources merged at the level of the Health Care Services Area. Principle Findings In 1990, the mean population-weighted hospital Herfindahl–Hirschman index (HHI) in a Health Services Area was .19. By 2000, the HHI had risen to .26. Most of this increase in hospital concentration is due to hospital consolidation. Over the same time frame HMO penetration increased three fold. However, our regression analysis strongly implies that the rise of managed care did not cause the hospital consolidation wave. This finding is robust to a number of different specifications. PMID:17355590
Synchronization in spread spectrum laser radar systems based on PMD-DLL
NASA Astrophysics Data System (ADS)
Buxbaum, Bernd; Schwarte, Rudolf; Ringbeck, Thorsten; Luan, Xuming; Zhang, Zhigang; Xu, Zhanping; Hess, H.
2000-09-01
This paper proposes a new optoelectronic delay locked loop (OE-DLL) and its use in optical ranging systems. The so called PMD-DLL receiver module is based on a novel electro-optical modulator (EOM), called the Photonic Mixer Device (PMD). This sensor element is a semiconductor device, which combines fast optical sensing and mixing of incoherent light signals in one component part by its unique and powerful principle of operation. Integration of some simple additional on-chip components offers a high integrated electro-optical correlation unit. Simulations and experimental results have already impressively verified the operation principle of PMD structures, all realized in CMOS technology so far. Although other technologies are also promising candidates for the PMD realization they should not be further discussed in this contribution. The principle of the new DLL approach is intensively discussed in this paper. Theoretical analysis as well as experimental results of a realized PMD-DLL system are demonstrated and judged. Due to the operation principle of sophisticated PMD devices and their unique features, a correlation process may be realized in order to synchronize a reflected incoherent light wave with an electronic reference signal. The phase shift between both signals represents the distance to an obstacle and may be determined by means of the synchronization process. This new approach, avoiding so far needed critical components such as broadband amplifiers and mixers for the detection of small photo currents in optical distance measurement, offers an extremely fast and precise phase determination in ranging applications based on the time- of-flight (TOF) principle. However, the optical measurement signal may be incoherent -- therefore a laser source is not needed imperatively. The kind of waveform used for the modulation of the light signal is variable and depends on the demands of every specific application. Even if there are plenty other alternatives (e.g., heterodyne techniques), in this contribution only so called quasi-heterodyne techniques - - also known as phase shifting methods -- are discussed and used for the implementation. The light modulation schemes described in this contribution are square-wave as well as pseudo-noise modulation. The latter approach, inspired by the wide spread use in communication as well as in position detection (e.g., IS-95 and GPS), offers essential advantages and is the most promising modulation method for the ranging approach. So called CDMA (code division multiple access) systems form a major task in communication technology investigations since the third generation mobile phone standard is also partly based on this principle. Fast and reliable synchronization in direct sequence spread spectrum communication systems (DSSS) differs hardly from the already mentioned ranging approach and will also be discussed. The possibility to integrate all components in a monolithic PMD based DLL design is also presented and discussed. This method might offer the feature to integrate complete lines or matrixes of PMD based DLLs for highly parallel, multidimensional ranging. Finally, an outlook is given with regard to further optimized PMD front ends. An estimation of the expected characteristics concerning accuracy and speed of the distance measurement is given in conclusion.
Blanco, Marco A.; Sahin, Erinc; Li, Yi; Roberts, Christopher J.
2011-01-01
The classic analysis of Rayleigh light scattering (LS) is re-examined for multi-component protein solutions, within the context of Kirkwood-Buff (KB) theory as well as a more generalized canonical treatment. Significant differences arise when traditional treatments that approximate constant pressure and neglect concentration fluctuations in one or more (co)solvent∕co-solute species are compared with more rigorous treatments at constant volume and with all species free to fluctuate. For dilute solutions, it is shown that LS can be used to rigorously and unambiguously obtain values for the osmotic second virial coefficient (B22), in contrast with recent arguments regarding protein interactions deduced from LS experiments. For more concentrated solutions, it is shown that conventional analysis over(under)-estimates the magnitude of B22 for significantly repulsive(attractive) conditions, and that protein-protein KB integrals (G22) are the more relevant quantity obtainable from LS. Published data for α–chymotrypsinogen A and a series of monoclonal antibodies at different pH and salt concentrations are re-analyzed using traditional and new treatments. The results illustrate that while traditional analysis may be sufficient if one is interested in only the sign of B22 or G22, the quantitative values can be significantly in error. A simple approach is illustrated for determining whether protein concentration (c2) is sufficiently dilute for B22 to apply, and for correcting B22 values from traditional LS regression at higher c2 values. The apparent molecular weight M2, app obtained from LS is shown to generally not be equal to the true molecular weight, with the differences arising from a combination of protein-solute and protein-cosolute interactions that may, in principle, also be determined from LS. PMID:21682538
Seguchi, Noriko; Quintyn, Conrad B; Yonemoto, Shiori; Takamuku, Hirofumi
2017-09-10
We explore variations in body and limb proportions of the Jomon hunter-gatherers (14,000-2500 BP), the Yayoi agriculturalists (2500-1700 BP) of Japan, and the Kumejima Islanders of the Ryukyus (1600-1800 AD) with 11 geographically diverse skeletal postcranial samples from Africa, Europe, Asia, Australia, and North America using brachial-crural indices, femur head-breadth-to-femur length ratio, femur head-breadth-to-lower-limb-length ratio, and body mass as indicators of phenotypic climatic adaptation. Specifically, we test the hypothesis that variation in limb proportions seen in Jomon, Yayoi, and Kumejima is a complex interaction of genetic adaptation; development and allometric constraints; selection, gene flow and genetic drift with changing cultural factors (i.e., nutrition) and climate. The skeletal data (1127 individuals) were subjected to principle components analysis, Manly's permutation multiple regression tests, and Relethford-Blangero analysis. The results of Manly's tests indicate that body proportions and body mass are significantly correlated with latitude, and minimum and maximum temperatures while limb proportions were not significantly correlated with these climatic variables. Principal components plots separated "climatic zones:" tropical, temperate, and arctic populations. The indigenous Jomon showed cold-adapted body proportions and warm-adapted limb proportions. Kumejima showed cold-adapted body proportions and limbs. The Yayoi adhered to the Allen-Bergmann expectation of cold-adapted body and limb proportions. Relethford-Blangero analysis showed that Kumejima experienced gene flow indicated by high observed variances while Jomon experienced genetic drift indicated by low observed variances. The complex interaction of evolutionary forces and development/nutritional constraints are implicated in the mismatch of limb and body proportions. © 2017 Wiley Periodicals, Inc.
A Method for Calculating the Probability of Successfully Completing a Rocket Propulsion Ground Test
NASA Technical Reports Server (NTRS)
Messer, Bradley P.
2004-01-01
Propulsion ground test facilities face the daily challenges of scheduling multiple customers into limited facility space and successfully completing their propulsion test projects. Due to budgetary and schedule constraints, NASA and industry customers are pushing to test more components, for less money, in a shorter period of time. As these new rocket engine component test programs are undertaken, the lack of technology maturity in the test articles, combined with pushing the test facilities capabilities to their limits, tends to lead to an increase in facility breakdowns and unsuccessful tests. Over the last five years Stennis Space Center's propulsion test facilities have performed hundreds of tests, collected thousands of seconds of test data, and broken numerous test facility and test article parts. While various initiatives have been implemented to provide better propulsion test techniques and improve the quality, reliability, and maintainability of goods and parts used in the propulsion test facilities, unexpected failures during testing still occur quite regularly due to the harsh environment in which the propulsion test facilities operate. Previous attempts at modeling the lifecycle of a propulsion component test project have met with little success. Each of the attempts suffered form incomplete or inconsistent data on which to base the models. By focusing on the actual test phase of the tests project rather than the formulation, design or construction phases of the test project, the quality and quantity of available data increases dramatically. A logistic regression model has been developed form the data collected over the last five years, allowing the probability of successfully completing a rocket propulsion component test to be calculated. A logistic regression model is a mathematical modeling approach that can be used to describe the relationship of several independent predictor variables X(sub 1), X(sub 2),..,X(sub k) to a binary or dichotomous dependent variable Y, where Y can only be one of two possible outcomes, in this case Success or Failure. Logistic regression has primarily been used in the fields of epidemiology and biomedical research, but lends itself to many other applications. As indicated the use of logistic regression is not new, however, modeling propulsion ground test facilities using logistic regression is both a new and unique application of the statistical technique. Results from the models provide project managers with insight and confidence into the affectivity of rocket engine component ground test projects. The initial success in modeling rocket propulsion ground test projects clears the way for more complex models to be developed in this area.
Mi, Baibing; Dang, Shaonong; Li, Qiang; Zhao, Yaling; Yang, Ruihai; Wang, Duolao; Yan, Hong
2015-07-01
Hypertensive patients have more complex health care needs and are more likely to have poorer health-related quality of life than normotensive people. The awareness of hypertension could be related to reduce health-related quality of life. We propose the use of quantile regression to explore more detailed relationships between awareness of hypertension and health-related quality of life. In a cross-sectional, population-based study, 2737 participants (including 1035 hypertensive patients and 1702 normotensive participants) completed the Short-Form Health Survey. A quantile regression model was employed to investigate the association of physical component summary scores and mental component summary scores with awareness of hypertension and to evaluate the associated factors. Patients who were aware of hypertension (N = 554) had lower scores than patients who were unaware of hypertension (N = 481). The median (IQR) of physical component summary scores: 48.20 (13.88) versus 53.27 (10.79), P < 0.01; the mental component summary scores: 50.68 (15.09) versus 51.70 (10.65), P = 0.03. adjusting for covariates, the quantile regression results suggest awareness of hypertension was associated with most physical component summary scores quantiles (P < 0.05 except 10th and 20th quantiles) in which the β-estimates from -2.14 (95% CI: -3.80 to -0.48) to -1.45 (95% CI: -2.42 to -0.47), as the same significant trend with some poorer mental component summary scores quantiles in which the β-estimates from -3.47 (95% CI: -6.65 to -0.39) to -2.18 (95% CI: -4.30 to -0.06). The awareness of hypertension has a greater effect on those with intermediate physical component summary status: the β-estimates were equal to -2.04 (95% CI: -3.51 to -0.57, P < 0.05) at the 40th and decreased further to -1.45 (95% CI: -2.42 to -0.47, P < 0.01) at the 90th quantile. Awareness of hypertension was negatively related to health-related quality of life in hypertensive patients in rural western China, which has a greater effect on mental component summary scores with the poorer status and on physical component summary scores with the intermediate status.
21 CFR 814.39 - PMA supplements.
Code of Federal Regulations, 2012 CFR
2012-04-01
.... (6) Changes in the performance or design specifications, circuits, components, ingredients, principle of operation, or physical layout of the device. (7) Extension of the expiration date of the device... new indications for use of the device, significant changes in the performance or design specifications...
21 CFR 814.39 - PMA supplements.
Code of Federal Regulations, 2011 CFR
2011-04-01
.... (6) Changes in the performance or design specifications, circuits, components, ingredients, principle of operation, or physical layout of the device. (7) Extension of the expiration date of the device... new indications for use of the device, significant changes in the performance or design specifications...
21 CFR 814.39 - PMA supplements.
Code of Federal Regulations, 2013 CFR
2013-04-01
.... (6) Changes in the performance or design specifications, circuits, components, ingredients, principle of operation, or physical layout of the device. (7) Extension of the expiration date of the device... new indications for use of the device, significant changes in the performance or design specifications...
Melvin T. Tyree
2003-01-01
Matric potential, r, is a component of water potential, ?, but has different meanings in plant physiology vs. soil science. A rigorous definition of r requires a reference to principles of thermodynamics (both classical and irreversible thermodynamics). A rigorous treatment is beyond the scope of this brief overview. Readers...
FUNDAMENTALS OF TELEVISION SYSTEMS.
ERIC Educational Resources Information Center
KESSLER, WILLIAM J.
DESIGNED FOR A READER WITHOUT SPECIAL TECHNICAL KNOWLEDGE, THIS ILLUSTRATED RESOURCE PAPER EXPLAINS THE COMPONENTS OF A TELEVISION SYSTEM AND RELATES THEM TO THE COMPLETE SYSTEM. SUBJECTS DISCUSSED ARE THE FOLLOWING--STUDIO ORGANIZATION AND COMPATIBLE COLOR TELEVISION PRINCIPLES, WIRED AND RADIO TRANSMISSION SYSTEMS, DIRECT VIEW AND PROJECTION…
ERIC Educational Resources Information Center
Pettersson, Rune
2014-01-01
Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…
DESIGNA ND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES
Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
Concepts in Environmental Education.
ERIC Educational Resources Information Center
Hopkins, Sally
Presented is a discussion of the components and concepts of an ecology typical of the coastal southeastern United States. Principles presented are applicable to other areas. The discussion includes several major sections: the environment, wildlife management, freshwater ecosystems, and the estuarine environment. Numerous figures and illustrations…
20 CFR 633.309 - Recordkeeping requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... data components provide federally-required records and reports that are accurate, uniform in definition... accordance with established program definitions contained in the Act and these regulations; (2) Follow..., consistent, and accurate; (5) Meet generally accepted accounting principles as prescribed in 41 CFR part 29...
1981-01-01
explanatory variable has been ommitted. Ramsey (1974) has developed a rather interesting test for detecting specification errors using estimates of the...Peter. (1979) A Guide to Econometrics , Cambridge, MA: The MIT Press. Ramsey , J.B. (1974), "Classical Model Selection Through Specification Error... Tests ," in P. Zarembka, Ed. Frontiers in Econometrics , New York: Academia Press. Theil, Henri. (1971), Principles of Econometrics , New York: John Wiley
Hatton, Leslie; Warr, Gregory
2015-01-01
That the physicochemical properties of amino acids constrain the structure, function and evolution of proteins is not in doubt. However, principles derived from information theory may also set bounds on the structure (and thus also the evolution) of proteins. Here we analyze the global properties of the full set of proteins in release 13-11 of the SwissProt database, showing by experimental test of predictions from information theory that their collective structure exhibits properties that are consistent with their being guided by a conservation principle. This principle (Conservation of Information) defines the global properties of systems composed of discrete components each of which is in turn assembled from discrete smaller pieces. In the system of proteins, each protein is a component, and each protein is assembled from amino acids. Central to this principle is the inter-relationship of the unique amino acid count and total length of a protein and its implications for both average protein length and occurrence of proteins with specific unique amino acid counts. The unique amino acid count is simply the number of distinct amino acids (including those that are post-translationally modified) that occur in a protein, and is independent of the number of times that the particular amino acid occurs in the sequence. Conservation of Information does not operate at the local level (it is independent of the physicochemical properties of the amino acids) where the influences of natural selection are manifest in the variety of protein structure and function that is well understood. Rather, this analysis implies that Conservation of Information would define the global bounds within which the whole system of proteins is constrained; thus it appears to be acting to constrain evolution at a level different from natural selection, a conclusion that appears counter-intuitive but is supported by the studies described herein.
YÜCEL, MERYEM A.; SELB, JULIETTE; COOPER, ROBERT J.; BOAS, DAVID A.
2014-01-01
As near-infrared spectroscopy (NIRS) broadens its application area to different age and disease groups, motion artifacts in the NIRS signal due to subject movement is becoming an important challenge. Motion artifacts generally produce signal fluctuations that are larger than physiological NIRS signals, thus it is crucial to correct for them before obtaining an estimate of stimulus evoked hemodynamic responses. There are various methods for correction such as principle component analysis (PCA), wavelet-based filtering and spline interpolation. Here, we introduce a new approach to motion artifact correction, targeted principle component analysis (tPCA), which incorporates a PCA filter only on the segments of data identified as motion artifacts. It is expected that this will overcome the issues of filtering desired signals that plagues standard PCA filtering of entire data sets. We compared the new approach with the most effective motion artifact correction algorithms on a set of data acquired simultaneously with a collodion-fixed probe (low motion artifact content) and a standard Velcro probe (high motion artifact content). Our results show that tPCA gives statistically better results in recovering hemodynamic response function (HRF) as compared to wavelet-based filtering and spline interpolation for the Velcro probe. It results in a significant reduction in mean-squared error (MSE) and significant enhancement in Pearson’s correlation coefficient to the true HRF. The collodion-fixed fiber probe with no motion correction performed better than the Velcro probe corrected for motion artifacts in terms of MSE and Pearson’s correlation coefficient. Thus, if the experimental study permits, the use of a collodion-fixed fiber probe may be desirable. If the use of a collodion-fixed probe is not feasible, then we suggest the use of tPCA in the processing of motion artifact contaminated data. PMID:25360181
Nakajima, Toshiyuki
2017-12-01
Evolution by natural selection requires the following conditions: (1) a particular selective environment; (2) variation of traits in the population; (3) differential survival/reproduction among the types of organisms; and (4) heritable traits. However, the traditional (standard) model does not clearly explain how and why these conditions are generated or determined. What generates a selective environment? What generates new types? How does a certain type replace, or coexist with, others? In this paper, based on the holistic philosophy of Western and Eastern traditions, I focus on the ecosystem as a higher-level system and generator of conditions that induce the evolution of component populations; I also aim to identify the ecosystem processes that generate those conditions. In particular, I employ what I call the scientific principle of dependent-arising (SDA), which is tailored for scientific use and is based on Buddhism principle called "pratītya-samutpāda" in Sanskrit. The SDA principle asserts that there exists a higher-level system, or entity, which includes a focal process of a system as a part within it; this determines or generates the conditions required for the focal process to work in a particular way. I conclude that the ecosystem generates (1) selective environments for component species through ecosystem dynamics; (2) new genetic types through lateral gene transfer, hybridization, and symbiogenesis among the component species of the ecosystem; (3) mechanistic processes of replacement of an old type with a new one. The results of this study indicate that the ecological extension of the theoretical model of adaptive evolution is required for better understanding of adaptive evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.
Time Projection Compton Spectrometer (TPCS). User`s guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landron, C.O.; Baldwin, G.T.
1994-04-01
The Time Projection Compton Spectrometer (TPCS) is a radiation diagnostic designed to determine the time-integrated energy spectrum between 100 keV -- 2 MeV of flash x-ray sources. This guide is intended as a reference for the routine operator of the TPCS. Contents include a brief overview of the principle of operation, detailed component descriptions, detailed assembly and disassembly procedures, guide to routine operations, and troubleshooting flowcharts. Detailed principle of operation, signal analysis and spectrum unfold algorithms are beyond the scope of this guide; however, the guide makes reference to sources containing this information.
[Advance in imaging spectropolarimeter].
Wang, Xin-quan; Xiangli, Bin; Huang, Min; Hu, Liang; Zhou, Jin-song; Jing, Juan-juan
2011-07-01
Imaging spectropolarimeter (ISP) is a type of novel photoelectric sensor which integrated the functions of imaging, spectrometry and polarimetry. In the present paper, the concept of the ISP is introduced, and the advances in ISP at home and abroad in recent years is reviewed. The principles of ISPs based on novel devices, such as acousto-optic tunable filter (AOTF) and liquid crystal tunable filter (LCTF), are illustrated. In addition, the principles of ISPs developed by adding polarized components to the dispersing-type imaging spectrometer, spatially modulated Fourier transform imaging spectrometer, and computer tomography imaging spectrometer are introduced. Moreover, the trends of ISP are discussed too.
2011-01-01
Background We investigate whether the changing environment caused by rapid economic growth yielded differential effects for successive Taiwanese generations on 8 components of metabolic syndrome (MetS): body mass index (BMI), systolic blood pressure (SBP), diastolic blood pressure (DBP), fasting plasma glucose (FPG), triglycerides (TG), high-density lipoprotein (HDL), Low-density lipoproteins (LDL) and uric acid (UA). Methods To assess the impact of age, birth year and year of examination on MetS components, we used partial least squares regression to analyze data collected by Mei-Jaw clinics in Taiwan in years 1996 and 2006. Confounders, such as the number of years in formal education, alcohol intake, smoking history status, and betel-nut chewing were adjusted for. Results As the age of individuals increased, the values of components generally increased except for UA. Men born after 1970 had lower FPG, lower BMI, lower DBP, lower TG, Lower LDL and greater HDL; women born after 1970 had lower BMI, lower DBP, lower TG, Lower LDL and greater HDL and UA. There is a similar pattern between the trend in levels of metabolic syndrome components against birth year of birth and economic growth in Taiwan. Conclusions We found cohort effects in some MetS components, suggesting associations between the changing environment and health outcomes in later life. This ecological association is worthy of further investigation. PMID:21619595
Aaron Weiskittel; Jereme Frank; David Walker; Phil Radtke; David Macfarlane; James Westfall
2015-01-01
Prediction of forest biomass and carbon is becoming important issues in the United States. However, estimating forest biomass and carbon is difficult and relies on empirically-derived regression equations. Based on recent findings from a national gap analysis and comprehensive assessment of the USDA Forest Service Forest Inventory and Analysis (USFS-FIA) component...
NASA Astrophysics Data System (ADS)
Cox, John
2014-05-01
Part 1. The Winning of the Principles: 1. Introduction; 2. The beginnings of statics. Archimedes. Problem of the lever and of the centre of gravity; 2. Experimental verification and applications of the principle of the lever; 3. The centre of gravity; 4. The balance; 5. Stevinus of Bruges. The principle of the inclined plane; 6. The parallelogram of forces; 7. The principle of virtual work; 8. Review of the principles of statics; 9. The beginnings of dynamics. Galileo. The problem of falling bodies; 10. Huyghens. The problem of uniform motion in a circle. 'Centrifugal force'; 11. Final statement of the principles of dynamics. Extension to the motions of the heavenly bodies. The law of universal gravitation. Newton; Part II. Mathematical Statement of the Principles: Introduction; 12. Kinematics; 13. Kinetics of a particle moving in a straight line. The laws of motion; 14. Experimental verification of the laws of motion. Atwood's machine; 15. Work and energy; 16. The parallelogram law; 17. The composition and resolution of forces. Resultant. Component. Equilibrium; 18. Forces in one plane; 19. Friction; Part III. Application to Various Problems: 20. Motion on an inclined plane. Brachistochrones; 21. Projectiles; 22. Simple harmonic motion; 23. The simple pendulum; 24. Central forces. The law of gravitation; 25. Impact and impulsive forces; Part IV. The Elements of Rigid Dynamics: 26. The compound pendulum. Huyghens' solution; 27. D'alembert's principle; 28. Moment of inertia; 29. Experimental determination of moments of inertia; 30. Determination of the value of gravity by Kater's pendulum; 31. The constant of gravitation, or weighing the Earth. The Cavendish experiment; Answers to the examples; Index.
Solar cell array design handbook - The principles and technology of photovoltaic energy conversion
NASA Technical Reports Server (NTRS)
Rauschenbach, H. S.
1980-01-01
Photovoltaic solar cell array design and technology for ground-based and space applications are discussed from the user's point of view. Solar array systems are described, with attention given to array concepts, historical development, applications and performance, and the analysis of array characteristics, circuits, components, performance and reliability is examined. Aspects of solar cell array design considered include the design process, photovoltaic system and detailed array design, and the design of array thermal, radiation shielding and electromagnetic components. Attention is then given to the characteristics and design of the separate components of solar arrays, including the solar cells, optical elements and mechanical elements, and the fabrication, testing, environmental conditions and effects and material properties of arrays and their components are discussed.
bioWidgets: data interaction components for genomics.
Fischer, S; Crabtree, J; Brunk, B; Gibson, M; Overton, G C
1999-10-01
The presentation of genomics data in a perspicuous visual format is critical for its rapid interpretation and validation. Relatively few public database developers have the resources to implement sophisticated front-end user interfaces themselves. Accordingly, these developers would benefit from a reusable toolkit of user interface and data visualization components. We have designed the bioWidget toolkit as a set of JavaBean components. It includes a wide array of user interface components and defines an architecture for assembling applications. The toolkit is founded on established software engineering design patterns and principles, including componentry, Model-View-Controller, factored models and schema neutrality. As a proof of concept, we have used the bioWidget toolkit to create three extendible applications: AnnotView, BlastView and AlignView.
Multivariate Analysis of Seismic Field Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alam, M. Kathleen
1999-06-01
This report includes the details of the model building procedure and prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built that included only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present datamore » sets, the present data analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.« less
Symplectic geometry spectrum regression for prediction of noisy time series
NASA Astrophysics Data System (ADS)
Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie
2016-05-01
We present the symplectic geometry spectrum regression (SGSR) technique as well as a regularized method based on SGSR for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data and electromyographic and mechanomyographic signal recorded from human body).
Metacognitive components in smart learning environment
NASA Astrophysics Data System (ADS)
Sumadyo, M.; Santoso, H. B.; Sensuse, D. I.
2018-03-01
Metacognitive ability in digital-based learning process helps students in achieving learning goals. So that digital-based learning environment should make the metacognitive component as a facility that must be equipped. Smart Learning Environment is the concept of a learning environment that certainly has more advanced components than just a digital learning environment. This study examines the metacognitive component of the smart learning environment to support the learning process. A review of the metacognitive literature was conducted to examine the components involved in metacognitive learning strategies. Review is also conducted on the results of study smart learning environment, ranging from design to context in building smart learning. Metacognitive learning strategies certainly require the support of adaptable, responsive and personalize learning environments in accordance with the principles of smart learning. The current study proposed the role of metacognitive component in smart learning environment, which is useful as the basis of research in building environment in smart learning.
Organization of project works in Industry 4.0 digital item designing companies
NASA Astrophysics Data System (ADS)
Gurjanov, A. V.; Zakoldaev, D. A.; Shukalov, A. V.; Zharinov, I. O.
2018-05-01
The task of the project works organization in the Industry 4.0 item designing digital factories is being studied. There is a scheme of the item designing component life cycle. There is also a scheme how to develop and confirm the quality of the item designing component documentation using the mathematical modelling. There is a description of the self-organization principles for the cyber and physical technological equipment in the Industry 4.0 «smart factory» company during the manufacturing process.
1985-01-01
components must also perform accurately if control is to be accurate, tests were made to determine if these components were likely to introduce more...efficient. However, it also greatly increases the com- plexity of the control systems, since room temperature measurements must be made for each zone, with...involving a psychrometer (a dry-bulb and a wet-bulb mercury thermometer) provides only a rough indication. Calibration is time- consuming and only partly
Forensic applications of metallurgy - Failure analysis of metal screw and bolt products
NASA Astrophysics Data System (ADS)
Tiner, Nathan A.
1993-03-01
It is often necessary for engineering consultants in liability lawsuits to consider whether a component has a manufacturing and/or design defect, as judged by industry standards, as well as whether the component was strong enough to resist service loads. Attention is presently given to the principles that must be appealed to in order to clarify these two issues in the cases of metal screw and bolt failures, which are subject to fatigue and brittle fractures and ductile dimple rupture.
2008-10-01
and UTCHEM (Clement et al., 1998). While all four of these software packages use conservation of mass as the basic principle for tracking NAPL...simulate dissolution of a single NAPL component. UTCHEM can be used to simulate dissolution of a multiple NAPL components using either linear or first...parameters. No UTCHEM a/ 3D model, general purpose NAPL simulator. Yes Virulo a/ Probabilistic model for predicting leaching of viruses in unsaturated
Improving animal research facility operations through the application of lean principles.
Khan, Nabeel; Umrysh, Brian M
2008-06-01
Animal research is a vital component of US research and well-functioning animal research facilities are critical both to the research itself and to the housing and feeding of the animals. The Office of Animal Care (OAC) at Seattle Children's Hospital Research Institute realized it had to improve the efficiency and safety of its animal research facility (ARF) to prepare for expansion and to advance the Institute's mission. The main areas for improvement concerned excessive turnaround time to process animal housing and feeding equipment; the movement and flow of equipment and inventory; and personnel safety. To address these problems, management held two process improvement workshops to educate employees about lean principles. In this article we discuss the application of these principles and corresponding methods to advance Children's Research Institute's mission of preventing, treating, and eliminating childhood diseases.
Design of virtual SCADA simulation system for pressurized water reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wijaksono, Umar, E-mail: umar.wijaksono@student.upi.edu; Abdullah, Ade Gafar; Hakim, Dadang Lukman
The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper described the results of the virtual SCADA system design that aims to recognize the principle of the Nuclear Power Plant type Pressurized Water Reactor. This simulation uses technical data of the Nuclear Power Plant Unit Olkiluoto 3 in Finland. This device was developed using Wonderware Intouch, which is equipped with manual books for each component, animation links, alarm systems, real time and historical trending, and security system. The results showed that in general this device can demonstrate clearly the principles ofmore » energy flow and energy conversion processes in Pressurized Water Reactors. This virtual SCADA simulation system can be used as instructional media to recognize the principle of Pressurized Water Reactor.« less
Ibrahim, George M; Morgan, Benjamin R; Macdonald, R Loch
2014-03-01
Predictors of outcome after aneurysmal subarachnoid hemorrhage have been determined previously through hypothesis-driven methods that often exclude putative covariates and require a priori knowledge of potential confounders. Here, we apply a data-driven approach, principal component analysis, to identify baseline patient phenotypes that may predict neurological outcomes. Principal component analysis was performed on 120 subjects enrolled in a prospective randomized trial of clazosentan for the prevention of angiographic vasospasm. Correlation matrices were created using a combination of Pearson, polyserial, and polychoric regressions among 46 variables. Scores of significant components (with eigenvalues>1) were included in multivariate logistic regression models with incidence of severe angiographic vasospasm, delayed ischemic neurological deficit, and long-term outcome as outcomes of interest. Sixteen significant principal components accounting for 74.6% of the variance were identified. A single component dominated by the patients' initial hemodynamic status, World Federation of Neurosurgical Societies score, neurological injury, and initial neutrophil/leukocyte counts was significantly associated with poor outcome. Two additional components were associated with angiographic vasospasm, of which one was also associated with delayed ischemic neurological deficit. The first was dominated by the aneurysm-securing procedure, subarachnoid clot clearance, and intracerebral hemorrhage, whereas the second had high contributions from markers of anemia and albumin levels. Principal component analysis, a data-driven approach, identified patient phenotypes that are associated with worse neurological outcomes. Such data reduction methods may provide a better approximation of unique patient phenotypes and may inform clinical care as well as patient recruitment into clinical trials. http://www.clinicaltrials.gov. Unique identifier: NCT00111085.
Orthogonal decomposition of left ventricular remodeling in myocardial infarction.
Zhang, Xingyu; Medrano-Gracia, Pau; Ambale-Venkatesh, Bharath; Bluemke, David A; Cowan, Brett R; Finn, J Paul; Kadish, Alan H; Lee, Daniel C; Lima, Joao A C; Young, Alistair A; Suinesiaputra, Avan
2017-03-01
Left ventricular size and shape are important for quantifying cardiac remodeling in response to cardiovascular disease. Geometric remodeling indices have been shown to have prognostic value in predicting adverse events in the clinical literature, but these often describe interrelated shape changes. We developed a novel method for deriving orthogonal remodeling components directly from any (moderately independent) set of clinical remodeling indices. Six clinical remodeling indices (end-diastolic volume index, sphericity, relative wall thickness, ejection fraction, apical conicity, and longitudinal shortening) were evaluated using cardiac magnetic resonance images of 300 patients with myocardial infarction, and 1991 asymptomatic subjects, obtained from the Cardiac Atlas Project. Partial least squares (PLS) regression of left ventricular shape models resulted in remodeling components that were optimally associated with each remodeling index. A Gram-Schmidt orthogonalization process, by which remodeling components were successively removed from the shape space in the order of shape variance explained, resulted in a set of orthonormal remodeling components. Remodeling scores could then be calculated that quantify the amount of each remodeling component present in each case. A one-factor PLS regression led to more decoupling between scores from the different remodeling components across the entire cohort, and zero correlation between clinical indices and subsequent scores. The PLS orthogonal remodeling components had similar power to describe differences between myocardial infarction patients and asymptomatic subjects as principal component analysis, but were better associated with well-understood clinical indices of cardiac remodeling. The data and analyses are available from www.cardiacatlas.org. © The Author 2017. Published by Oxford University Press.
Multicomponent analysis of a digital Trail Making Test.
Fellows, Robert P; Dahmen, Jessamyn; Cook, Diane; Schmitter-Edgecombe, Maureen
2017-01-01
The purpose of the current study was to use a newly developed digital tablet-based variant of the TMT to isolate component cognitive processes underlying TMT performance. Similar to the paper-based trail making test, this digital variant consists of two conditions, Part A and Part B. However, this digital version automatically collects additional data to create component subtest scores to isolate cognitive abilities. Specifically, in addition to the total time to completion and number of errors, the digital Trail Making Test (dTMT) records several unique components including the number of pauses, pause duration, lifts, lift duration, time inside each circle, and time between circles. Participants were community-dwelling older adults who completed a neuropsychological evaluation including measures of processing speed, inhibitory control, visual working memory/sequencing, and set-switching. The abilities underlying TMT performance were assessed through regression analyses of component scores from the dTMT with traditional neuropsychological measures. Results revealed significant correlations between paper and digital variants of Part A (r s = .541, p < .001) and paper and digital versions of Part B (r s = .799, p < .001). Regression analyses with traditional neuropsychological measures revealed that Part A components were best predicted by speeded processing, while inhibitory control and visual/spatial sequencing were predictors of specific components of Part B. Exploratory analyses revealed that specific dTMT-B components were associated with a performance-based medication management task. Taken together, these results elucidate specific cognitive abilities underlying TMT performance, as well as the utility of isolating digital components.
Modeling longitudinal data, I: principles of multivariate analysis.
Ravani, Pietro; Barrett, Brendan; Parfrey, Patrick
2009-01-01
Statistical models are used to study the relationship between exposure and disease while accounting for the potential role of other factors' impact on outcomes. This adjustment is useful to obtain unbiased estimates of true effects or to predict future outcomes. Statistical models include a systematic component and an error component. The systematic component explains the variability of the response variable as a function of the predictors and is summarized in the effect estimates (model coefficients). The error element of the model represents the variability in the data unexplained by the model and is used to build measures of precision around the point estimates (confidence intervals).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nemec, Patrik, E-mail: patrik.nemec@fstroj.uniza.sk; Malcho, Milan, E-mail: milan.malcho@fstroj.uniza.sk
This work deal with experimental evaluation of cooling efficiency of cooling device capable transfer high heat fluxes from electric elements to the surrounding. The work contain description of cooling device, working principle of cooling device, construction of cooling device. Experimental part describe the measuring method of device cooling efficiency evaluation. The work results are presented in graphic visualization of temperature dependence of the contact area surface between cooling device evaporator and electronic components on the loaded heat of electronic components in range from 250 to 740 W and temperature dependence of the loop thermosiphon condenser surface on the loaded heatmore » of electronic components in range from 250 to 740 W.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dierauf, Timothy; Kurtz, Sarah; Riley, Evan
This paper provides a recommended method for evaluating the AC capacity of a photovoltaic (PV) generating station. It also presents companion guidance on setting the facilitys capacity guarantee value. This is a principles-based approach that incorporates plant fundamental design parameters such as loss factors, module coefficients, and inverter constraints. This method has been used to prove contract guarantees for over 700 MW of installed projects. The method is transparent, and the results are deterministic. In contrast, current industry practices incorporate statistical regression where the empirical coefficients may only characterize the collected data. Though these methods may work well when extrapolationmore » is not required, there are other situations where the empirical coefficients may not adequately model actual performance.This proposed Fundamentals Approach method provides consistent results even where regression methods start to lose fidelity.« less
Understanding Preservice Teachers' Technology Use through TPACK Framework
ERIC Educational Resources Information Center
Pamuk, S.
2012-01-01
This study discusses preservice teachers' achievement barriers to technology integration, using principles of technological pedagogical content knowledge (TPACK) as an evaluative framework. Technology-capable participants each freely chose a content area to comprise project. Data analysis based on interactions among core components of TPACK…
Evolution: Advancing Communities of Practice in Naval Intelligence
2003-06-01
Office of Management and Budget, Paperwork Reduction Project (0704-0188) Washington DC 20503. 1. AGENCY USE ONLY (Leave blank) 2...principles of Knowledge Management (KM). One of the key components of KM is the Community of Practice. Communities of Practice are groups that form to
Multiculturalism: Being Canadian.
ERIC Educational Resources Information Center
Department of the Secretary of State, Ottawa (Ontario). Multiculturalism Directorate.
This booklet introduces Canada's Multiculturalism Act which provides for a new government-wide commitment to the principles and policy objectives of multiculturalism. As an essential component of the Canadian identity, multiculturalism has been fundamental to nation building and has allowed Canadians to enjoy the benefits of life in a culturally…
Quantitative site type delineation for pastures in the northeastern United States
USDA-ARS?s Scientific Manuscript database
The Grazing Lands component of the USDA-NRCS Conservation Effects Assessment Project (CEAP) is a national assessment of the environmental effects of management practices used on pasture and rangelands. The pasture subcomponent is based around applying ecological principles to temperate humid grazing...
Introductory Course in Biomedical Ethics in the Obstetrics-Gynecology Residency.
ERIC Educational Resources Information Center
Elkins, Thomas E.
1988-01-01
Information used in a brief lecture that introduces a biomedical ethics curriculum in an obstetrics and gynecology residency is described. Major components include theories of philosophic ethics (formalist and consequentialist) and principles of biomedical ethics (honesty, contract-keeping, nonmaleficence, justice, autonomy, beneficence,…
Simulation of a Moving Elastic Beam Using Hamilton’s Weak Principle
2006-03-01
versions were limited to two-dimensional systems with open tree configurations (where a cut in any component separates the system in half) [48]. This...whose com- ponents experienced large angular rotations (turbomachinery, camshafts , flywheels, etc.). More complex systems required the simultaneous
COMPARISON OF LARGE RIVER SAMPLING METHOD USING DIATOM METRICS
We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principle Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...
DEMONSTRATION BULLETIN: SOIL WASHING SYSTEM - BIOTROL, INC.
The three component technologies of the BioTrol Soil Washing System (BSWS). Tested in the SITE demonstration were a Soil Washer (SW), and Aqueous Treatment System (ATS), and a Slurry Bio-Reactor (SBR). The Soil Washer operates on the principle that a significant fraction of the...
Human health is affected by simultaneous exposure to stressors and amenities, but research typically considers single exposures. In order to account for multiple ambient environmental conditions, we constructed an Environmental Quality Index (EQI) using principle components analy...
Diesel Technology: Engines. [Teacher and Student Editions.
ERIC Educational Resources Information Center
Barbieri, Dave; Miller, Roger; Kellum, Mary
Competency-based teacher and student materials on diesel engines are provided for a diesel technology curriculum. Seventeen units of instruction cover the following topics: introduction to engine principles and procedures; engine systems and components; fuel systems; engine diagnosis and maintenance. The materials are based on the…
COMPARISON OF LARGE RIVER SAMPLING METHODS ON ALGAL METRICS
We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principle Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...