Sample records for uninformative variable elimination

  1. [Determination of soluble solids content in Nanfeng Mandarin by Vis/NIR spectroscopy and UVE-ICA-LS-SVM].

    PubMed

    Sun, Tong; Xu, Wen-Li; Hu, Tian; Liu, Mu-Hua

    2013-12-01

    The objective of the present research was to assess the soluble solids content (SSC) of Nanfeng mandarin by visible/near-infrared (Vis/NIR) spectroscopy combined with a new variable selection method, in order to simplify the SSC prediction model and improve its performance. A total of 300 Nanfeng mandarin samples were used; the calibration, validation and prediction sets contained 150, 75 and 75 samples, respectively. Vis/NIR spectra were acquired with a QualitySpec spectrometer over the wavelength range of 350-1000 nm. Uninformative variable elimination (UVE) was used to discard wavelength variables carrying little information about SSC, and independent component analysis (ICA) was then used to extract independent components (ICs) from the reduced spectra. Finally, least squares support vector machine (LS-SVM) calibration models for SSC were built from the extracted ICs, and the 75 prediction samples that had not been used for model development were used to evaluate model performance. The results indicate that Vis/NIR spectroscopy combined with UVE-ICA-LS-SVM is suitable for assessing the SSC of Nanfeng mandarin, with high prediction precision. UVE-ICA is an effective method to eliminate uninformative wavelength variables, extract the important spectral information, simplify the prediction model and improve its performance. The SSC model developed by UVE-ICA-LS-SVM was superior to those developed by PLS, PCA-LS-SVM or ICA-LS-SVM; the coefficient of determination and root mean square error in the calibration, validation and prediction sets were 0.978, 0.230%, 0.965, 0.301% and 0.967, 0.292%, respectively.
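    The UVE step described in this record can be sketched in a few lines. This is a minimal illustration of the reliability-ratio idea, not the authors' code: ordinary least squares stands in for the PLS model, and the noise-augmentation scale and cutoff rule are common textbook choices rather than details taken from the paper.

```python
import numpy as np

def uve_select(X, y, seed=0):
    """Uninformative variable elimination (simplified sketch).

    Appends as many tiny random noise columns as real ones, refits a
    linear model under leave-one-out resampling, and keeps the real
    variables whose reliability |mean(b)/std(b)| beats the best noise
    column. Ordinary least squares stands in for the PLS step.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # small-amplitude noise columns serve as the "uninformative" baseline
    Xa = np.hstack([X, 1e-6 * rng.standard_normal((n, p))])
    coefs = np.empty((n, 2 * p))
    for i in range(n):                           # leave-one-out refits
        keep = np.arange(n) != i
        coefs[i], *_ = np.linalg.lstsq(Xa[keep], y[keep], rcond=None)
    c = np.abs(coefs.mean(axis=0) / coefs.std(axis=0))   # reliability ratio
    cutoff = c[p:].max()                         # best-scoring noise column
    return np.where(c[:p] > cutoff)[0]           # surviving real variables
```

    Variables whose coefficients are no more stable than artificial noise are discarded before any further modeling step (ICA and LS-SVM in this record).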

  2. [Variable selection methods combined with local linear embedding theory used for optimization of near infrared spectral quantitative models].

    PubMed

    Hao, Yong; Sun, Xu-Dong; Yang, Qiang

    2012-12-01

    A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near-infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE followed by SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling the complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of the models. Compared with PLSR models, LLE-PLSR models achieved more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
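    The successive projections algorithm used here admits a compact sketch. The version below assumes the usual greedy formulation (repeatedly pick the column with the largest component orthogonal to the columns already chosen, which minimizes collinearity among the selected variables); practical SPA implementations also search over starting variables and subset sizes, which is omitted for brevity.

```python
import numpy as np

def spa(X, n_select, start=0):
    """Successive projections algorithm (sketch).

    Greedily selects the column with the largest norm after projecting
    out everything already selected, so the chosen variables are as
    mutually uncorrelated as possible.
    """
    Xp = np.array(X, dtype=float)
    selected = [start]
    for _ in range(n_select - 1):
        ref = Xp[:, selected[-1]].copy()
        ref /= np.linalg.norm(ref)
        # deflate: remove each column's component along the last pick
        Xp = Xp - np.outer(ref, ref @ Xp)
        norms = np.linalg.norm(Xp, axis=0)
        norms[selected] = -1.0            # never re-select a variable
        selected.append(int(np.argmax(norms)))
    return selected
```

    A perfectly collinear copy of an already-selected column has zero orthogonal component and can never be chosen, which is exactly the redundancy-elimination behaviour this record relies on.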

  3. Combination of multiple model population analysis and mid-infrared technology for the estimation of copper content in Tegillarca granosa

    NASA Astrophysics Data System (ADS)

    Hu, Meng-Han; Chen, Xiao-Jing; Ye, Peng-Chao; Chen, Xi; Shi, Yi-Jian; Zhai, Guang-Tao; Yang, Xiao-Kang

    2016-11-01

    The aim of this study was to use mid-infrared spectroscopy coupled with multiple model population analysis based on Monte Carlo-uninformative variable elimination (MC-UVE) for rapidly estimating the copper content of Tegillarca granosa. Copper-specific wavelengths were first extracted from the whole spectra, and subsequently a least squares-support vector machine was used to develop the prediction models. Compared with the prediction model based on the full wavelengths, models built on 100 multiple MC-UVE selected wavelengths, without and with a bin operation, showed comparable performance, with Rp (root mean square error of prediction in parentheses) of 0.97 (14.60 mg/kg) and 0.94 (20.85 mg/kg) versus 0.96 (17.27 mg/kg), and a ratio of percent deviation (number of wavelengths in parentheses) of 2.77 (407) and 1.84 (45) versus 2.32 (1762). The obtained results demonstrated that the mid-infrared technique can be used for estimating copper content in T. granosa. In addition, the proposed multiple model population analysis effectively eliminates uninformative, weakly informative and interfering wavelengths, which substantially reduces model complexity and computation time.
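    The MC-UVE reliability calculation underlying this approach can be sketched as follows. It differs from classical UVE by resampling random calibration subsets rather than using leave-one-out; as a simplification, ordinary least squares replaces the PLS sub-models of the paper, and the subset fraction and run count are illustrative defaults, not values from the study.

```python
import numpy as np

def mcuve_rank(X, y, n_runs=200, frac=0.8, seed=0):
    """Monte Carlo UVE (sketch): fit a linear model on many random
    sample subsets and rank variables by the stability |mean/std| of
    their coefficients across runs (most informative first)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = int(frac * n)
    coefs = np.empty((n_runs, p))
    for i in range(n_runs):
        idx = rng.choice(n, size=m, replace=False)   # random subset
        coefs[i], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    stability = np.abs(coefs.mean(axis=0) / coefs.std(axis=0))
    return np.argsort(stability)[::-1]
```

    Truly informative wavelengths keep a large, stable coefficient across subsets; uninformative ones fluctuate around zero and fall to the bottom of the ranking.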

  4. Variables selection methods in near-infrared spectroscopy.

    PubMed

    Xiaobo, Zou; Jiewen, Zhao; Povey, Malcolm J W; Holmes, Mel; Hanpin, Mao

    2010-05-14

    Near-infrared (NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields, such as the petrochemical, pharmaceutical, environmental, clinical, agricultural, food and biomedical sectors, during the past 15 years. An NIR spectrum of a sample is typically measured by modern scanning instruments at hundreds of equally spaced wavelengths. The large number of spectral variables in most data sets encountered in NIR spectral chemometrics often renders the prediction of a dependent variable unreliable. Recently, considerable effort has been directed towards developing and evaluating different procedures that objectively identify variables which contribute useful information and/or eliminate variables containing mostly noise. This review focuses on variable selection methods in NIR spectroscopy. These include classical approaches, such as the manual (knowledge-based) approach and "univariate" and "sequential" selection methods; sophisticated methods such as the successive projections algorithm (SPA) and uninformative variable elimination (UVE); elaborate search-based strategies such as simulated annealing (SA), artificial neural networks (ANN) and genetic algorithms (GAs); and interval-based algorithms such as interval partial least squares (iPLS), windows PLS and iterative PLS. Wavelength selection with B-splines, Kalman filtering, Fisher's weights and Bayesian methods is also mentioned. Finally, the websites of some variable selection software and toolboxes for non-commercial use are given. Copyright 2010 Elsevier B.V. All rights reserved.

  5. Close-range hyperspectral image analysis for the early detection of stress responses in individual plants in a high-throughput phenotyping platform

    NASA Astrophysics Data System (ADS)

    Mohd Asaari, Mohd Shahrimie; Mishra, Puneet; Mertens, Stien; Dhondt, Stijn; Inzé, Dirk; Wuyts, Nathalie; Scheunders, Paul

    2018-04-01

    The potential of close-range hyperspectral imaging (HSI) as a tool for detecting early drought stress responses in plants grown in a high-throughput plant phenotyping platform (HTPPP) was explored. Reflectance spectra from leaves in close-range imaging are highly influenced by plant geometry and its specific alignment towards the imaging system. This induces high uninformative variability in the recorded signals, whereas the spectral signature informing on plant biological traits remains undisclosed. A linear reflectance model that describes the effect of the distance and orientation of each pixel of a plant with respect to the imaging system was applied. By solving this model for the linear coefficients, the spectra were corrected for the uninformative illumination effects. This approach, however, was constrained by the requirement of a reference spectrum, which was difficult to obtain. As an alternative, the standard normal variate (SNV) normalisation method was applied to reduce this uninformative variability. Once the envisioned illumination effects were eliminated, the remaining differences in plant spectra were assumed to be related to changes in plant traits. To distinguish stress-related phenomena from regular growth dynamics, a spectral analysis procedure was developed based on clustering, a supervised band selection, and a direct calculation of a spectral similarity measure against a reference. To test the significance of the discrimination between healthy and stressed plants, a statistical test was conducted using one-way analysis of variance (ANOVA). The proposed analysis technique was validated with HSI data of maize plants (Zea mays L.) acquired in an HTPPP for the early detection of drought stress. Results showed that pre-processing of the reflectance spectra with SNV effectively reduces the variability due to the expected illumination effects. The proposed spectral analysis method on the normalized spectra successfully detected drought stress from the third day of drought induction, confirming the potential of HSI for drought stress detection studies and further supporting its adoption in HTPPP.
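    The SNV normalisation used in this record is simple enough to state exactly. The sketch below follows the standard definition (each spectrum is centred and scaled by its own mean and standard deviation), so additive offsets and multiplicative gain differences between pixels cancel out:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum (row)
    by its own mean and standard deviation."""
    spectra = np.asarray(spectra, dtype=float)
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd
```

    Two measurements of the same underlying reflectance curve that differ only by gain and offset become identical after SNV, which is exactly the illumination-effect removal relied on above.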

  6. Nondestructive determination of the modulus of elasticity of Fraxinus mandschurica using near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Liang, Hao; Lin, Xue; Zhang, Yizhuo

    2018-04-01

    A nondestructive methodology is proposed to determine the modulus of elasticity (MOE) of Fraxinus mandschurica samples by using near-infrared (NIR) spectroscopy. The test data consisted of 150 NIR absorption spectra of the wood samples, obtained using an NIR spectrometer over the wavelength range of 900 to 1900 nm. To eliminate high-frequency noise and systematic variations in the baseline, Savitzky-Golay convolution combined with standard normal variate and detrending transformation was applied as the data pretreatment method. Uninformative variable elimination (UVE), improved by an evolutionary Monte Carlo (EMC) algorithm and the successive projections algorithm (SPA), selected three characteristic variables from the full 117 variables. The predictive ability of the models was evaluated in terms of the root-mean-square error of prediction (RMSEP) and the coefficient of determination (Rp2) in the prediction set. In comparison with the predicted results of all the models established in the experiments, UVE-EMC-SPA-LS-SVM presented the best results, with the smallest RMSEP of 0.652 and the highest Rp2 of 0.887. Thus, it is feasible to accurately determine the MOE of F. mandschurica using NIR spectroscopy.

  7. Detection of drug active ingredients by chemometric processing of solid-state NMR spectrometry data -- the case of acetaminophen.

    PubMed

    Paradowska, Katarzyna; Jamróz, Marta Katarzyna; Kobyłka, Mariola; Gowin, Ewelina; Maczka, Paulina; Skibiński, Robert; Komsta, Łukasz

    2012-01-01

    This paper presents a preliminary study in building discriminant models from solid-state NMR spectrometry data to detect the presence of acetaminophen in over-the-counter pharmaceutical formulations. The dataset, containing 11 spectra of pure substances and 21 spectra of various formulations, was processed by partial least squares discriminant analysis (PLS-DA). The resulting model coped well with the discrimination task, and its quality parameters were acceptable. Standard normal variate preprocessing was found to have almost no influence on unsupervised investigation of the dataset. Variable selection with uninformative variable elimination by PLS was also studied; it reduced the dataset from 7601 variables to around 300 informative variables but did not improve model performance. The results show that it is possible to construct well-working PLS-DA models from such small datasets without a full experimental design.

  8. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    PubMed

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.

  9. Distress among women receiving uninformative BRCA1/2 results: 12-month outcomes.

    PubMed

    O'Neill, Suzanne C; Rini, Christine; Goldsmith, Rachel E; Valdimarsdottir, Heiddis; Cohen, Lawrence H; Schwartz, Marc D

    2009-10-01

    Few data are available regarding the long-term psychological impact of uninformative BRCA1/2 test results. This study examines change in distress from pretesting to 12 months post-disclosure, with medical, family history, and psychological variables, such as perceived risk of carrying a deleterious mutation prior to testing and primary and secondary appraisals, as predictors. Two hundred and nine women with uninformative BRCA1/2 test results completed questionnaires at pretesting and at 1, 6, and 12 months post-disclosure, including measures of anxiety and depression and of cancer-specific and genetic testing distress. We used a mixed models approach to predict change in post-disclosure distress. Distress declined from pretesting to 1 month post-disclosure, but remained stable thereafter. Primary appraisals predicted all types of distress at 1 month post-disclosure. Primary and secondary appraisals predicted genetic testing distress at 1 month as well as its change over time. Receiving a variant of uncertain clinical significance and entering testing with a high expectation of carrying a deleterious mutation predicted genetic testing distress that persisted through the year after testing. As a whole, women receiving uninformative BRCA1/2 test results are a resilient group. For some women, however, distress experienced in the month after testing does not dissipate. Variables such as heightened pretesting perceived risk and cognitive appraisals predict a greater likelihood of sustained distress in this group and could be amenable to intervention.

  10. Petroleomics by electrospray ionization FT-ICR mass spectrometry coupled to partial least squares with variable selection methods: prediction of the total acid number of crude oils.

    PubMed

    Terra, Luciana A; Filgueiras, Paulo R; Tose, Lílian V; Romão, Wanderson; de Souza, Douglas D; de Castro, Eustáquio V R; de Oliveira, Mirela S L; Dias, Júlio C M; Poppi, Ronei J

    2014-10-07

    Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to a Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a power of resolution of ca. 500,000 and a mass accuracy less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be more appropriate for selecting important variables, reducing the dimension of the variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables with their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.

  11. Anticipatory activity in anterior cingulate cortex can be independent of conflict and error likelihood.

    PubMed

    Aarts, Esther; Roelofs, Ardi; van Turennout, Miranda

    2008-04-30

    Previous studies have found no agreement on whether anticipatory activity in the anterior cingulate cortex (ACC) reflects upcoming conflict, error likelihood, or actual control adjustments. Using event-related functional magnetic resonance imaging, we investigated the nature of preparatory activity in the ACC. Informative cues told the participants whether an upcoming target would or would not involve conflict in a Stroop-like task. Uninformative cues provided no such information. Behavioral responses were faster after informative than after uninformative cues, indicating cue-based adjustments in control. ACC activity was larger after informative than uninformative cues, as would be expected if the ACC is involved in anticipatory control. Importantly, this activation in the ACC was observed for informative cues even when the information conveyed by the cue was that the upcoming target evokes no response conflict and has low error likelihood. This finding demonstrates that the ACC is involved in anticipatory control processes independent of upcoming response conflict or error likelihood. Moreover, the response of the ACC to the target stimuli was critically dependent on whether the cue was informative or not. ACC activity differed among target conditions after uninformative cues only, indicating ACC involvement in actual control adjustments. Together, these findings argue strongly for a role of the ACC in anticipatory control independent of anticipated conflict and error likelihood, and also show that such control can eliminate conflict-related ACC activity during target processing. Models of frontal cortex conflict-detection and conflict-resolution mechanisms require modification to include consideration of these anticipatory control properties of the ACC.

  12. Quantitative determination of additive Chlorantraniliprole in Abamectin preparation: Investigation of bootstrapping soft shrinkage approach by mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Yan, Hong; Song, Xiangzhong; Tian, Kuangda; Chen, Yilin; Xiong, Yanmei; Min, Shungeng

    2018-02-01

    A novel method based on mid-infrared (MIR) spectroscopy, which enables the determination of Chlorantraniliprole in Abamectin within minutes, is proposed. We further evaluate the prediction ability of four wavelength selection methods: the bootstrapping soft shrinkage approach (BOSS), Monte Carlo uninformative variable elimination (MCUVE), genetic algorithm partial least squares (GA-PLS) and competitive adaptive reweighted sampling (CARS). The results showed that the BOSS method obtained the lowest root mean squared error of cross validation (RMSECV) (0.0245) and root mean squared error of prediction (RMSEP) (0.0271), as well as the highest coefficient of determination of cross-validation (Qcv2) (0.9998) and of the test set (Q2test) (0.9989), which demonstrates that mid-infrared spectroscopy can be used to detect Chlorantraniliprole in Abamectin conveniently. Meanwhile, a suitable wavelength selection method (here, BOSS) is essential for component spectral analysis.

  13. Development of predictive models for total phenolics and free p-coumaric acid contents in barley grain by near-infrared spectroscopy.

    PubMed

    Han, Zhigang; Cai, Shengguan; Zhang, Xuelei; Qian, Qiufeng; Huang, Yuqing; Dai, Fei; Zhang, Guoping

    2017-07-15

    Barley grains are rich in phenolic compounds, which are associated with reduced risk of chronic diseases. Development of barley cultivars with high phenolic acid content has become one of the main objectives in breeding programs, and a rapid and accurate method for measuring phenolic compounds would be helpful for crop breeding. We developed predictive models for both total phenolics (TPC) and p-coumaric acid (PA) based on near-infrared spectroscopy (NIRS) analysis. Partial least squares (PLS) and least squares support vector machine (LS-SVM) regressions were compared for improving the models, and Monte Carlo-uninformative variable elimination (MC-UVE) was applied to select informative wavelengths. The optimal calibration models generated high correlation coefficients (rpre) and ratios of performance deviation (RPD) for TPC and PA. These results indicate that the models are suitable for rapid determination of phenolic compounds in barley grains. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. [Gaussian process regression and its application in near-infrared spectroscopy analysis].

    PubMed

    Feng, Ai-Ming; Fang, Li-Min; Lin, Min

    2011-06-01

    Gaussian process (GP) regression is applied in the present paper as a chemometric method to explore the complicated relationship between near-infrared (NIR) spectra and sample ingredients. After outliers were detected by the Monte Carlo cross validation (MCCV) method and removed from the dataset, different preprocessing methods, such as multiplicative scatter correction (MSC), smoothing and derivatives, were tried for the best performance of the models. Furthermore, uninformative variable elimination (UVE) was introduced as a variable selection technique, and the characteristic wavelengths obtained were further employed as model inputs. A public dataset with 80 NIR spectra of corn was used as an example for evaluating the new algorithm. The optimal models for oil, starch and protein were obtained by the GP regression method. The performance of the final models was evaluated according to the root mean square error of calibration (RMSEC), root mean square error of cross-validation (RMSECV), root mean square error of prediction (RMSEP) and the correlation coefficient (r). The models show good calibration ability, with r values above 0.99, and the prediction ability is also satisfactory, with r values higher than 0.96. The overall results demonstrate that the GP algorithm is an effective chemometric method and is promising for NIR analysis.
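    A bare-bones version of Gaussian process regression with a squared-exponential (RBF) kernel is shown below. It is a sketch under fixed, hand-picked hyperparameters; the paper additionally tunes the model and applies the preprocessing and UVE steps described above, none of which is reproduced here.

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, noise=1e-2):
    """GP regression with an RBF kernel (unit prior variance) and fixed
    hyperparameters. Returns the posterior mean and variance at X_test."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length ** 2)
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf(X_test, X_train)
    alpha = np.linalg.solve(K, y_train)            # K^-1 y
    mean = Ks @ alpha
    reduction = Ks @ np.linalg.solve(K, Ks.T)      # variance explained
    var = 1.0 - np.diag(reduction) + noise
    return mean, var
```

    On a smooth target sampled densely relative to the kernel length scale, the posterior mean interpolates the data almost exactly, and the posterior variance stays strictly positive.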

  15. Thoughts on selected movement disorder terminology and a plea for clarity.

    PubMed

    Walker, Ruth H

    2013-01-01

    Description of the phenomenology of movement disorders requires precise and accurate terminology. Many of the terms that have been widely used in the literature are imprecise and open to interpretation. An examination of these terms and the assumptions implicit in their usage is important to improve communication and hence the definition, diagnosis, and treatment of movement disorders. I recommend that the term dyskinesia should be used primarily in the settings of Parkinson's disease and tardive dyskinesia, in which its clinical implications are relatively clear; it should not be used in other situations where a precise description could more usefully facilitate diagnosis and treatment. In general, dyskinesia should be used in the singular form. Extrapyramidal is based upon obsolete anatomical concepts, is uninformative, and should be discarded. The term abnormal involuntary movements (AIMs) is similarly vague and uninformative, although it is unlikely to be eliminated from the psychiatric literature. Movement disorder neurologists, as teachers, clinicians, article reviewers, and journal editors, have the responsibility to educate our colleagues regarding appropriate usage and the importance of employing correct descriptors.

  16. Predictive ability of mid-infrared spectroscopy for major mineral composition and coagulation traits of bovine milk by using the uninformative variable selection algorithm.

    PubMed

    Visentin, G; Penasa, M; Gottardo, P; Cassandro, M; De Marchi, M

    2016-10-01

    Milk minerals and coagulation properties are important for both consumers and processors, and they can aid in increasing milk added value. However, large-scale monitoring of these traits is hampered by expensive and time-consuming reference analyses. The objective of the present study was to develop prediction models for major mineral contents (Ca, K, Mg, Na, and P) and milk coagulation properties (MCP: rennet coagulation time, curd-firming time, and curd firmness) using mid-infrared spectroscopy. Individual milk samples (n=923) of Holstein-Friesian, Brown Swiss, Alpine Grey, and Simmental cows were collected from single-breed herds between January and December 2014. Reference analysis for the determination of both mineral contents and MCP was undertaken with standardized methods. For each milk sample, the mid-infrared spectrum in the range from 900 to 5,000 cm(-1) was stored. Prediction models were calibrated using partial least squares regression coupled with a wavenumber selection technique called uninformative variable elimination to improve model accuracy, and were validated both internally and externally. The average reduction in the number of wavenumbers used in partial least squares regression was 80%, which was accompanied by an average increase of 20% in the explained variance in external validation. The proportion of explained variance in external validation was about 70% for P, K, Ca, and Mg, and lower (40%) for Na. Prediction models for milk coagulation properties explained between 54% (rennet coagulation time) and 56% (curd-firming time) of the total variance in external validation. The ratio of the standard deviation of each trait to the respective root mean square error of prediction, an indicator of the predictive ability of an equation, suggested that the developed models might be effective for screening and collection of milk minerals and coagulation properties at the population level. Although the prediction equations were not accurate enough to be proposed for analytic purposes, mid-infrared spectroscopy predictions could be evaluated as phenotypic information to genetically improve milk minerals and MCP on a large scale. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
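    Since partial least squares regression underlies this calibration (and most records above), a compact PLS1 sketch may be useful. It follows the standard NIPALS deflation loop for a single response variable; it is a generic illustration, not the dairy-spectroscopy pipeline of the paper.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """PLS1 regression via the NIPALS deflation loop (bare-bones sketch).
    Returns (x_mean, y_mean, b) so that predictions for new samples are
    (X_new - x_mean) @ b + y_mean."""
    X = np.array(X, dtype=float)
    y = np.array(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                    # weight: covariance with y
        w /= np.linalg.norm(w)
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        qk = yc @ t / tt                 # y loading
        Xc = Xc - np.outer(t, p)         # deflate X
        yc = yc - qk * t                 # deflate y
        W.append(w), P.append(p), q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)  # regression vector
    return x_mean, y_mean, b
```

    With as many components as there are predictors, the fit coincides with ordinary least squares; wavenumber selection such as UVE reduces the predictor set before this loop runs.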

  17. [Characteristic wavelengths selection of soluble solids content of pear based on NIR spectral and LS-SVM].

    PubMed

    Fan, Shu-xiang; Huang, Wen-qian; Li, Jiang-bo; Zhao, Chun-jiang; Zhang, Bao-hua

    2014-08-01

    The aim was to improve the precision and robustness of the NIR model for the soluble solids content (SSC) of pear. A total of 160 pears were used, divided into calibration (n=120) and prediction (n=40) sets. Different spectral pretreatment methods, including standard normal variate (SNV) and multiplicative scatter correction (MSC), were applied before further analysis. A combination of genetic algorithm (GA) and successive projections algorithm (SPA) was proposed to select the most effective wavelengths after uninformative variable elimination (UVE) from the original spectra, the SNV-pretreated spectra and the MSC-pretreated spectra, respectively. The selected variables were used as inputs to a least squares-support vector machine (LS-SVM) model for determining the SSC of pear. The results indicated that the LS-SVM model built using SNV-UVE-GA-SPA on 30 characteristic wavelengths selected from the full spectrum of 3112 wavelengths achieved the optimal performance. The correlation coefficient (Rp) and root mean square error of prediction (RMSEP) for the prediction set were 0.956 and 0.271 for SSC. The model is reliable and its predictions are effective. The method can meet the requirement of rapid SSC measurement of pear and might be important for the development of portable instruments and online monitoring.

  18. [Study on Application of NIR Spectral Information Screening in Identification of Maca Origin].

    PubMed

    Wang, Yuan-zhong; Zhao, Yan-li; Zhang, Ji; Jin, Hang

    2016-02-01

    Medicinal and edible plant Maca is rich in various nutrients and has great medicinal value. Based on near-infrared diffuse reflectance spectra, 139 Maca samples collected from Peru and Yunnan were used to identify their geographical origins. Multiplicative signal correction (MSC) coupled with second derivative (SD) and Norris derivative filtering (ND) was employed in spectral pretreatment. The spectral range (7,500-4,061 cm⁻¹) was chosen by spectrum standard deviation. Combined with principal component analysis-Mahalanobis distance (PCA-MD), the appropriate number of principal components was selected as 5. Based on the spectral range and the number of principal components selected, two abnormal samples were eliminated by the modular group iterative singular sample diagnosis method. Then, four methods were used to filter spectral variable information: competitive adaptive reweighted sampling (CARS), Monte Carlo-uninformative variable elimination (MC-UVE), genetic algorithm (GA) and subwindow permutation analysis (SPA). The filtered spectral variable information was evaluated by model population analysis (MPA). The results showed that RMSECV(SPA) > RMSECV(CARS) > RMSECV(MC-UVE) > RMSECV(GA), at 2.14, 2.05, 2.02, and 1.98, with 250, 240, 250 and 70 spectral variables, respectively. According to the spectral variables filtered, partial least squares discriminant analysis (PLS-DA) was used to build the model, with a random selection of 97 samples as the training set and the other 40 samples as the validation set. The results showed that, for R²: GA > MC-UVE > CARS > SPA, and for RMSEC and RMSEP: GA < MC-UVE < CARS

  19. Uninformative contexts support word learning for high-skill spellers.

    PubMed

    Eskenazi, Michael A; Swischuk, Natascha K; Folk, Jocelyn R; Abraham, Ashley N

    2018-04-30

    The current study investigated how high-skill spellers and low-skill spellers incidentally learn words during reading. The purpose of the study was to determine whether readers can use uninformative contexts to support word learning after forming a lexical representation for a novel word, consistent with instance-based resonance processes. Previous research has found that uninformative contexts damage word learning; however, there may have been insufficient exposure to informative contexts (only one) prior to exposure to uninformative contexts (Webb, 2007; Webb, 2008). In Experiment 1, participants read sentences with one novel word (i.e., blaph, clurge) embedded in them in three different conditions: Informative (six informative contexts to support word learning), Mixed (three informative contexts followed by three uninformative contexts), and Uninformative (six uninformative contexts). Experiment 2 added a new condition with only three informative contexts to further clarify the conclusions of Experiment 1. Results indicated that uninformative contexts can support word learning, but only for high-skill spellers. Further, when participants learned the spelling of the novel word, they were more likely to learn the meaning of that word. This effect was much larger for high-skill spellers than for low-skill spellers. Results are consistent with the Lexical Quality Hypothesis (LQH) in that high-skill spellers form stronger orthographic representations which support word learning (Perfetti, 2007). Results also support an instance-based resonance process of word learning in that prior informative contexts can be reactivated to support word learning in future contexts (Bolger, Balass, Landen, & Perfetti, 2008; Balass, Nelson, & Perfetti, 2010; Reichle & Perfetti, 2003). (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. A method of alignment masking for refining the phylogenetic signal of multiple sequence alignments.

    PubMed

    Rajan, Vaibhav

    2013-03-01

Inaccurate inference of positional homologies in multiple sequence alignments and systematic errors introduced by alignment heuristics obfuscate phylogenetic inference. Alignment masking, the elimination of phylogenetically uninformative or misleading sites from an alignment before phylogenetic analysis, is a common practice in phylogenetic analysis. Although masking is often done manually, automated methods are necessary to handle the much larger data sets being prepared today. In this study, we introduce the concept of subsplits and demonstrate their use in extracting phylogenetic signal from alignments. We design a clustering approach for alignment masking where each cluster contains similar columns, with similarity defined on the basis of compatible subsplits; our approach then identifies noisy clusters and eliminates them. Trees inferred from the columns in the retained clusters are found to be topologically closer to the reference trees. We test our method on numerous standard benchmarks (both synthetic and biological data sets) and compare its performance with other methods of alignment masking. We find that our method can eliminate sites more accurately than other methods, particularly on divergent data, and can improve the topologies of the inferred trees in likelihood-based analyses. Software available upon request from the author.

  1. Detection of Soil Nitrogen Using Near Infrared Sensors Based on Soil Pretreatment and Algorithms

    PubMed Central

    Nie, Pengcheng; Dong, Tao; He, Yong; Qu, Fangfang

    2017-01-01

Soil nitrogen content is one of the important growth nutrient parameters of crops. Accurately grasping soil nutrient information is a prerequisite for scientific fertilization in precision agriculture. Information about nutrients such as nitrogen in the soil can be obtained quickly by using a near-infrared sensor, and the data can be analyzed during the detection process, which is nondestructive and non-polluting. In order to investigate the effect of soil pretreatment on nitrogen content detection by a near-infrared sensor, 16 nitrogen concentrations were mixed with soil and the soil samples were divided into three groups with different pretreatments. The first group of soil samples, with strict pretreatment, were dried, ground, sieved and pressed. The second group of soil samples were dried and ground. The third group of soil samples were simply dried. Three different linear modeling methods were used to analyze the spectra: partial least squares (PLS), uninformative variable elimination (UVE) and the competitive adaptive reweighted sampling algorithm (CARS). A nonlinear least squares support vector machine (LS-SVM) model was also used to analyze the soil reflectance spectra. The results show that the soil samples with strict pretreatment give the best accuracy in predicting nitrogen content by near-infrared sensor, and this pretreatment method is suitable for practical application. PMID:28492480
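The UVE step mentioned in the record above can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: ordinary least squares stands in for PLS, and leave-one-out models score each variable's coefficient stability against appended artificial noise variables.

```python
import numpy as np

def uve_select(X, y, n_noise=None, seed=0):
    """Uninformative Variable Elimination, simplified: append tiny
    random noise columns, fit leave-one-out linear models (ordinary
    least squares standing in for PLS), and keep only real variables
    whose coefficient stability |mean/std| across the leave-one-out
    models beats the best stability achieved by any noise column."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    n_noise = p if n_noise is None else n_noise
    Xa = np.hstack([X, 1e-8 * rng.standard_normal((n, n_noise))])
    coefs = np.empty((n, p + n_noise))
    for i in range(n):                        # leave-one-out models
        mask = np.arange(n) != i
        coefs[i], *_ = np.linalg.lstsq(Xa[mask], y[mask], rcond=None)
    stability = np.abs(coefs.mean(axis=0) / coefs.std(axis=0))
    cutoff = stability[p:].max()              # best noise-column stability
    return np.flatnonzero(stability[:p] > cutoff)

# demo on synthetic data: 5 candidate variables, only the first two carry signal
rng = np.random.default_rng(42)
X = rng.standard_normal((80, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.01 * rng.standard_normal(80)
selected = uve_select(X, y)
```

Variables whose coefficients are no more stable than pure noise are cut, which is the core idea behind eliminating "uninformative" wavelengths before building the final calibration model.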

  2. Classification of fresh and frozen-thawed pork muscles using visible and near infrared hyperspectral imaging and textural analysis.

    PubMed

    Pu, Hongbin; Sun, Da-Wen; Ma, Ji; Cheng, Jun-Hu

    2015-01-01

The potential of visible and near infrared hyperspectral imaging was investigated as a rapid and nondestructive technique for classifying fresh and frozen-thawed meats by integrating critical spectral and image features extracted from hyperspectral images in the region of 400-1000 nm. Six feature wavelengths (400, 446, 477, 516, 592 and 686 nm) were identified using uninformative variable elimination and the successive projections algorithm. Image textural features of the principal component images from hyperspectral images were obtained using histogram statistics (HS), the gray level co-occurrence matrix (GLCM) and the gray level-gradient co-occurrence matrix (GLGCM). Using these spectral and textural features, probabilistic neural network (PNN) models for classification of fresh and frozen-thawed pork meats were established. Compared with the models using the optimum wavelengths only, optimum wavelengths with HS image features, and optimum wavelengths with GLCM image features, the model integrating optimum wavelengths with GLGCM features gave the highest classification rates of 93.14% and 90.91% for the calibration and validation sets, respectively. Results indicated that classification accuracy can be improved by combining spectral features with textural features, and that the fusion of critical spectral and textural features has better potential than spectral extraction alone for classifying fresh and frozen-thawed pork meat. Copyright © 2014 Elsevier Ltd. All rights reserved.
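As a rough illustration of the GLCM texture features mentioned above (a minimal sketch, not the paper's pipeline), a co-occurrence matrix for the horizontal-neighbor offset and two classic Haralick-style statistics can be computed as follows; the image sizes and the choice of 8 gray levels are arbitrary assumptions.

```python
import numpy as np

def glcm_features(img, levels=8):
    """Gray-level co-occurrence matrix for the horizontal-neighbor
    offset, plus two classic Haralick-style statistics (contrast and
    homogeneity). Quantizes the image to `levels` gray levels first."""
    q = np.floor(img / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1                 # count pixel/right-neighbor pairs
    glcm /= glcm.sum()                  # normalize to joint probabilities
    i, j = np.indices((levels, levels))
    contrast = float((glcm * (i - j) ** 2).sum())
    homogeneity = float((glcm / (1 + np.abs(i - j))).sum())
    return contrast, homogeneity

# a flat patch has zero contrast; a checkerboard maximizes it
flat_c, flat_h = glcm_features(np.full((8, 8), 5.0))
checker_c, checker_h = glcm_features(np.indices((8, 8)).sum(axis=0) % 2)
```

Statistics like these, computed per principal-component image, are the kind of textural descriptors that get concatenated with the selected wavelengths before classification.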

  3. Lossed in translation: an off-the-shelf method to recover probabilistic beliefs from loss-averse agents.

    PubMed

    Offerman, Theo; Palley, Asa B

    2016-01-01

    Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of [Formula: see text], and (ii) for moderate beliefs agents simply report [Formula: see text]. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
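The truthfulness property of strictly proper scoring rules for risk-neutral agents, which the record above builds on, can be checked numerically with the quadratic (Brier) rule; the belief value below is an arbitrary example, and loss-averse agents would deviate from this optimum.

```python
import numpy as np

def expected_quadratic_score(report, belief):
    """Expected payoff of the quadratic (Brier) scoring rule
    1 - (outcome - report)**2 for a risk-neutral agent who assigns
    probability `belief` to the event occurring."""
    score_if_event = 1 - (1 - report) ** 2
    score_if_not = 1 - report ** 2
    return belief * score_if_event + (1 - belief) * score_if_not

# for a risk-neutral agent, the expected score is maximized by truthful reporting
reports = np.linspace(0.0, 1.0, 1001)
belief = 0.7                         # arbitrary example belief
best_report = reports[np.argmax(expected_quadratic_score(reports, belief))]
```

The grid search recovers the belief itself as the optimal report; the cited biases (toward 1/2, or flat reports of 1/2) arise only once risk or loss aversion distorts this expected-payoff calculation.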

  4. Information Seeking in a Natural Stress Situation

    ERIC Educational Resources Information Center

    Vernon, David T. A.

    1971-01-01

    Compares hospitalized tuberculosis patients with informative and uninformative physicians as to their use of library books. Finds that the two groups did not differ in general reading, but that those with uninformative physicians tended to seek out books about tuberculosis and its treatment more often. (MB)

  5. (E)pistemological Awareness, Instantiation of Methods, and Uninformed Methodological Ambiguity in Qualitative Research Projects

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Yendol-Hoppey, Diane; Smith, Jason Jude; Hayes, Sharon B.

    2009-01-01

    This article explores epistemological awareness and instantiation of methods, as well as uninformed ambiguity, in qualitative methodological decision making and research reporting. The authors argue that efforts should be made to make the research process, epistemologies, values, methodological decision points, and argumentative logic open,…

  6. Variable selection in near-infrared spectroscopy: benchmarking of feature selection methods on biodiesel data.

    PubMed

    Balabin, Roman M; Smirnov, Sergey V

    2011-04-29

During the past several years, near-infrared (near-IR/NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields from petroleum to biomedical sectors. The NIR spectrum (above 4000 cm(-1)) of a sample is typically measured by modern instruments at a few hundred wavelengths. Recently, considerable effort has been directed towards developing procedures to identify variables (wavelengths) that contribute useful information. Variable selection (VS) or feature selection, also called frequency selection or wavelength selection, is a critical step in data analysis for vibrational spectroscopy (infrared, Raman, or NIRS). In this paper, we compare the performance of 16 different feature selection methods for the prediction of properties of biodiesel fuel, including density, viscosity, methanol content, and water concentration. The feature selection algorithms tested include stepwise multiple linear regression (MLR-step), interval partial least squares regression (iPLS), backward iPLS (BiPLS), forward iPLS (FiPLS), moving window partial least squares regression (MWPLS), (modified) changeable size moving window partial least squares (CSMWPLS/MCSMWPLSR), searching combination moving window partial least squares (SCMWPLS), successive projections algorithm (SPA), uninformative variable elimination (UVE, including UVE-SPA), simulated annealing (SA), back-propagation artificial neural networks (BP-ANN), Kohonen artificial neural network (K-ANN), and genetic algorithms (GAs, including GA-iPLS). Two linear techniques for calibration model building, namely multiple linear regression (MLR) and partial least squares regression/projection to latent structures (PLS/PLSR), are used for the evaluation of biofuel properties. A comparison with a non-linear calibration model, artificial neural networks (ANN-MLP), is also provided. Discussion of gasoline, ethanol-gasoline (bioethanol), and diesel fuel data is presented. The results of applying other spectroscopic techniques, such as Raman, ultraviolet-visible (UV-vis), or nuclear magnetic resonance (NMR) spectroscopy, can also be greatly improved by an appropriate choice of feature selection. Copyright © 2011 Elsevier B.V. All rights reserved.
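One family of the benchmarked methods, interval-based selection (iPLS and its variants), can be sketched as follows. This is a hypothetical simplification: ordinary least squares substitutes for PLS, and intervals are ranked by leave-one-out RMSE rather than by the papers' exact cross-validation schemes.

```python
import numpy as np

def rank_intervals(X, y, n_intervals=10):
    """Rank contiguous wavelength intervals by the leave-one-out RMSE
    of a model fitted on each interval alone (ordinary least squares
    standing in for PLS). Returns interval indices, best first."""
    n, p = X.shape
    edges = np.linspace(0, p, n_intervals + 1).astype(int)
    rmse = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        Xi = np.column_stack([np.ones(n), X[:, lo:hi]])  # intercept + interval
        errs = []
        for k in range(n):                   # leave-one-out cross-validation
            mask = np.arange(n) != k
            b, *_ = np.linalg.lstsq(Xi[mask], y[mask], rcond=None)
            errs.append(y[k] - Xi[k] @ b)
        rmse.append(np.sqrt(np.mean(np.square(errs))))
    return np.argsort(rmse)

# demo: 50 synthetic wavelengths, all signal concentrated in column 22
rng = np.random.default_rng(3)
X = rng.standard_normal((40, 50))
y = 2.0 * X[:, 22]
order = rank_intervals(X, y)                 # order[0] is the best interval
```

Forward and backward iPLS then grow or prune the model from this ranking, rather than keeping a single interval.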

  7. Informed and Uninformed Naïve Assessment Constructors' Strategies for Item Selection

    ERIC Educational Resources Information Center

    Fives, Helenrose; Barnes, Nicole

    2017-01-01

    We present a descriptive analysis of 53 naïve assessment constructors' explanations for selecting test items to include on a summative assessment. We randomly assigned participants to an informed and uninformed condition (i.e., informed participants read an article describing a Table of Specifications). Through recursive thematic analyses of…

  8. Agricultural subsidies and the American obesity epidemic.

    PubMed

    Franck, Caroline; Grandi, Sonia M; Eisenberg, Mark J

    2013-09-01

    Government-issued agricultural subsidies are worsening obesity trends in America. Current agricultural policy remains largely uninformed by public health discourse. Although findings suggest that eliminating all subsidies would have a mild impact on the prevalence of obesity, a revision of commodity programs could have a measurable public health impact on a population scale, over time. Policy reforms will be important determinants of the future of obesity in America, primarily through indemnity program revisions, and the allocation of increasing amounts of resources to sustainable agriculture. Public health intervention will be required at the policy level to promote healthy behavioral changes in consumers. The 2013 Farm Bill will be the key mechanism to induce such policy change in the near future. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  9. Ligand-based 3D QSAR analysis of reactivation potency of mono- and bis-pyridinium aldoximes toward VX-inhibited rat acetylcholinesterase.

    PubMed

    Dolezal, Rafael; Korabecny, Jan; Malinak, David; Honegr, Jan; Musilek, Kamil; Kuca, Kamil

    2015-03-01

    To predict unknown reactivation potencies of 12 mono- and bis-pyridinium aldoximes for VX-inhibited rat acetylcholinesterase (rAChE), three-dimensional quantitative structure-activity relationship (3D QSAR) analysis has been carried out. Utilizing molecular interaction fields (MIFs) calculated by molecular mechanical (MMFF94) and quantum chemical (B3LYP/6-31G*) methods, two satisfactory ligand-based CoMFA models have been developed: 1. R(2)=0.9989, Q(LOO)(2)=0.9090, Q(LTO)(2)=0.8921, Q(LMO(20%))(2)=0.8853, R(ext)(2)=0.9259, SDEP(ext)=6.8938; 2. R(2)=0.9962, Q(LOO)(2)=0.9368, Q(LTO)(2)=0.9298, Q(LMO(20%))(2)=0.9248, R(ext)(2)=0.8905, SDEP(ext)=6.6756. High statistical significance of the 3D QSAR models has been achieved through the application of several data noise reduction techniques (i.e. smart region definition SRD, fractional factor design FFD, uninformative/iterative variable elimination UVE/IVE) on the original MIFs. Besides the ligand-based CoMFA models, an alignment molecular set constructed by flexible molecular docking has been also studied. The contour maps as well as the predicted reactivation potencies resulting from 3D QSAR analyses help better understand which structural features are associated with increased reactivation potency of studied compounds. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Estimating Tree Height-Diameter Models with the Bayesian Method

    PubMed Central

    Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models take the classical (frequentist) approach based on the frequency interpretation of probability, for example the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has the distinct advantage over classical methods that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were each used to estimate the six height-diameter models. Both the classical method and the Bayesian method showed that the Weibull model was the “best” model for data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands for the predicted values than those of the classical method, and the credible bands of the parameters estimated with informative priors were also narrower than those obtained with uninformative priors or the classical method. The estimated posterior distributions of the parameters can be set as new priors when estimating the parameters using data2. PMID:24711733

  11. Estimating tree height-diameter models with the Bayesian method.

    PubMed

    Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models take the classical (frequentist) approach based on the frequency interpretation of probability, for example the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has the distinct advantage over classical methods that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were each used to estimate the six height-diameter models. Both the classical method and the Bayesian method showed that the Weibull model was the "best" model for data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands for the predicted values than those of the classical method, and the credible bands of the parameters estimated with informative priors were also narrower than those obtained with uninformative priors or the classical method. The estimated posterior distributions of the parameters can be set as new priors when estimating the parameters using data2.
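The finding that informative priors yield narrower credible bands than uninformative ones can be reproduced on toy data with a simple grid posterior for a normal mean; the data, grid, and prior widths below are arbitrary assumptions, not the paper's height-diameter models.

```python
import numpy as np

# grid posterior for the mean of normally distributed tree heights,
# comparing a flat (uninformative) prior with an informative one
rng = np.random.default_rng(1)
heights = rng.normal(20.0, 2.0, size=15)     # hypothetical heights, sd = 2 m
mu = np.linspace(10.0, 30.0, 2001)           # candidate mean values
loglik = (-0.5 * ((heights[:, None] - mu) / 2.0) ** 2).sum(axis=0)

def credible_width(logprior):
    """Width of the central 95% credible interval on the grid."""
    logpost = loglik + logprior
    post = np.exp(logpost - logpost.max())   # unnormalized posterior
    cdf = np.cumsum(post / post.sum())
    return mu[np.searchsorted(cdf, 0.975)] - mu[np.searchsorted(cdf, 0.025)]

flat_width = credible_width(np.zeros_like(mu))                  # uninformative
inform_width = credible_width(-0.5 * ((mu - 20.0) / 0.5) ** 2)  # N(20, 0.5) prior
```

Because precisions add, the informative prior always tightens the interval relative to the flat prior, which mirrors the abstract's comparison; the posterior obtained here could likewise serve as the prior for the next data set.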

  12. Understanding of BRCA1/2 genetic tests results: the importance of objective and subjective numeracy.

    PubMed

    Hanoch, Yaniv; Miron-Shatz, Talya; Rolison, Jonathan J; Ozanne, Elissa

    2014-10-01

The majority of women (71%) who undergo BRCA1/2 testing, which is designed to identify genetic mutations associated with increased risk of cancer, receive results that are termed 'ambiguous' or 'uninformative negative'. How women interpret these results, and the association with numerical ability, was examined. In this study, 477 women at increased risk for breast and ovarian cancer were recruited via the Cancer Genetics Network. They were presented with information about the four different possible BRCA1/2 test results (positive, true negative, ambiguous and uninformative negative) and asked to indicate which of six options represents the best response. Participants were then asked which treatment options they thought a woman receiving the results should discuss with her doctor. Finally, participants completed measures of objective and subjective numeracy. Almost all of the participants correctly interpreted the positive and negative BRCA1/2 genetic test results. However, they encountered difficulties interpreting the uninformative and ambiguous BRCA1/2 genetic test results. Participants were almost equally likely to think either that the woman had learned nothing from the test result or that she was as likely to develop cancer as the average woman. Highly numerate participants were more likely to correctly interpret inconclusive test results (ambiguous, OR = 1.62; 95% CI [1.28, 2.07]; p < 0.001; uninformative, OR = 1.40; 95% CI [1.10, 1.80]). Given the medical and psychological ramifications of genetic testing, healthcare professionals should consider devoting extra effort to ensuring proper comprehension of ambiguous and uninformative negative test results by women. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Mill and the right to remain uninformed.

    PubMed

    Strasser, M

    1986-08-01

    In a recent article in the Journal of Medicine and Philosophy, David Ost (1984) claims that patients do not have a right to waive their right to information. He argues that patients cannot make informed rational decisions without full information and thus, a right to waive information would involve a right to avoid one's responsibility to act as an autonomous moral agent. In support of his position, Ost cites a passage from Mill. Yet, a correct interpretation of the passage in question would support one's right to remain uninformed in certain situations. If the information would hurt one's chances for survival or hurt one's ability to make calm, rational decisions, then one not only does not have a duty to find out the information, but one's exercising one's right to remain uninformed may be the only rational course of action to take.

  14. Architecture-Based Self-Adaptation for Moving Target Defense

    DTIC Science & Technology

    2014-08-01

Uses stochastic multiplayer games to verify the behavior of a variety of MTD scenarios, from uninformed to predictive-reactive. This work is applied in the context of architecture-based self-adaptation for moving target defense.

  15. The role of initial affective impressions in responses to educational communications: the case of carbon capture and sequestration (CCS).

    PubMed

    Bruine de Bruin, Wändi; Wong-Parodi, Gabrielle

    2014-06-01

Emerging technologies promise potential benefits at a potential cost. Developers of educational communications aim to improve people's understanding and to facilitate public debate. However, even relatively uninformed recipients may have initial feelings that are difficult to change. We report that people's initial affective impressions about carbon capture and sequestration (CCS), a low-carbon coal-based electricity-generation technology with which most people are unfamiliar, influence how they interpret previously validated education materials. As a result, even individuals who had originally self-identified as uninformed persisted in their initial feelings after reading the educational communication, though perseverance of feelings about CCS was stronger among recipients who had originally self-identified as relatively informed (Study 1). Moreover, uninformed recipients whose initial feelings were experimentally manipulated by relatively uninformative pro-CCS or anti-CCS arguments persisted in their manipulated feelings after reading the educational communication, due to evaluating the educational communication in line with their manipulated impressions (Study 2). Hence, our results suggest that educational communications will have more impact if they are disseminated before people form strong feelings about the topic under consideration, especially if these are based on little to no factual understanding. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  16. [Analyzing and modeling methods of near infrared spectroscopy for in-situ prediction of oil yield from oil shale].

    PubMed

    Liu, Jie; Zhang, Fu-Dong; Teng, Fei; Li, Jun; Wang, Zhi-Hong

    2014-10-01

    In order to in-situ detect the oil yield of oil shale, based on portable near infrared spectroscopy analytical technology, with 66 rock core samples from No. 2 well drilling of Fuyu oil shale base in Jilin, the modeling and analyzing methods for in-situ detection were researched. By the developed portable spectrometer, 3 data formats (reflectance, absorbance and K-M function) spectra were acquired. With 4 different modeling data optimization methods: principal component-mahalanobis distance (PCA-MD) for eliminating abnormal samples, uninformative variables elimination (UVE) for wavelength selection and their combina- tions: PCA-MD + UVE and UVE + PCA-MD, 2 modeling methods: partial least square (PLS) and back propagation artificial neural network (BPANN), and the same data pre-processing, the modeling and analyzing experiment were performed to determine the optimum analysis model and method. The results show that the data format, modeling data optimization method and modeling method all affect the analysis precision of model. Results show that whether or not using the optimization method, reflectance or K-M function is the proper spectrum format of the modeling database for two modeling methods. Using two different modeling methods and four different data optimization methods, the model precisions of the same modeling database are different. For PLS modeling method, the PCA-MD and UVE + PCA-MD data optimization methods can improve the modeling precision of database using K-M function spectrum data format. For BPANN modeling method, UVE, UVE + PCA-MD and PCA- MD + UVE data optimization methods can improve the modeling precision of database using any of the 3 spectrum data formats. In addition to using the reflectance spectra and PCA-MD data optimization method, modeling precision by BPANN method is better than that by PLS method. 
And modeling with reflectance spectra, UVE optimization method and BPANN modeling method, the model gets the highest analysis precision, its correlation coefficient (Rp) is 0.92, and its standard error of prediction (SEP) is 0.69%.

  17. Cognitive and emotional factors predicting decisional conflict among high-risk breast cancer survivors who receive uninformative BRCA1/2 results.

    PubMed

    Rini, Christine; O'Neill, Suzanne C; Valdimarsdottir, Heiddis; Goldsmith, Rachel E; Jandorf, Lina; Brown, Karen; DeMarco, Tiffani A; Peshkin, Beth N; Schwartz, Marc D

    2009-09-01

To investigate high-risk breast cancer survivors' risk reduction decision making and decisional conflict after an uninformative BRCA1/2 test. Prospective, longitudinal study of 182 probands undergoing BRCA1/2 testing, with assessments 1-, 6-, and 12-months postdisclosure. Primary predictors were health beliefs and emotional responses to testing assessed 1-month postdisclosure. Main outcomes included women's perception of whether they had made a final risk management decision (decision status) and decisional conflict related to this issue. There were four patterns of decision making, depending on how long it took women to make a final decision and the stability of their decision status across assessments. Late decision makers and nondecision makers reported the highest decisional conflict; however, substantial numbers of women, even early and intermediate decision makers, reported elevated decisional conflict. Analyses predicting decisional conflict 1- and 12-months postdisclosure found that, after accounting for control variables and decision status, health beliefs and emotional factors predicted decisional conflict at different timepoints, with health beliefs more important 1 month after test disclosure and emotional factors more important 1 year later. Many of these women may benefit from decision making assistance. Copyright 2009 APA, all rights reserved.

  18. Uninformed Clinical Decisions Resulting From Lack of Adherence Assessment in Children with New Onset Epilepsy

    PubMed Central

    Modi, Avani C.; Wu, Yelena P.; Guilfoyle, Shanna M.; Glauser, Tracy A.

    2012-01-01

    This study examined the relationship between non-adherence to antiepileptic drug (AED) therapy and clinical decision-making in a cohort of 112 children with newly-diagnosed epilepsy. AED adherence was monitored using electronic monitoring over the first six months of therapy. The primary outcome measure was rate of uninformed clinical decisions as defined by number of participants with AED dosage or drug changes to address continued seizures who demonstrated non-adherence prior to the seizure. Among the 52 (47%) participants who had an AED change for continued seizures, 30 (27% of the overall cohort) had imperfect medication adherence prior to their seizures. A quarter of children with new onset epilepsy had uninformed medication changes because adherence was not rigorously assessed in clinical practice. Results highlight the importance of routinely assessing medication adherence in this population. PMID:23159375

  19. Use of uninformative priors to initialize state estimation for dynamical systems

    NASA Astrophysics Data System (ADS)

    Worthy, Johnny L.; Holzinger, Marcus J.

    2017-10-01

    The admissible region must be expressed probabilistically in order to be used in Bayesian estimation schemes. When treated as a probability density function (PDF), a uniform admissible region can be shown to have non-uniform probability density after a transformation. An alternative approach can be used to express the admissible region probabilistically according to the Principle of Transformation Groups. This paper uses a fundamental multivariate probability transformation theorem to show that regardless of which state space an admissible region is expressed in, the probability density must remain the same under the Principle of Transformation Groups. The admissible region can be shown to be analogous to an uninformative prior with a probability density that remains constant under reparameterization. This paper introduces requirements on how these uninformative priors may be transformed and used for state estimation and the difference in results when initializing an estimation scheme via a traditional transformation versus the alternative approach.
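The paper's starting point, that a uniform density does not stay uniform under a nonlinear change of variables unless the Jacobian is carried along, can be verified numerically with a one-dimensional example; the transformation y = 1/x is an arbitrary illustrative choice, not the admissible-region mapping itself.

```python
import numpy as np

# a uniform density is not uniform after a nonlinear transformation:
# if x ~ U(1, 2) and y = 1/x, the transformed density is
# p(y) = p(x) * |dx/dy| = 1 / y**2 on the interval (1/2, 1)
rng = np.random.default_rng(0)
y = 1.0 / rng.uniform(1.0, 2.0, size=200_000)
hist, edges = np.histogram(y, bins=20, range=(0.5, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = float(np.abs(hist - 1.0 / centers ** 2).max())  # histogram vs. theory
```

The Monte Carlo histogram tracks 1/y², not a constant, which is exactly why a uniform admissible region treated naively as a PDF acquires non-uniform density after reparameterization.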

  20. BRCA1/2 Test Results Impact Risk Management Attitudes, Intentions and Uptake

    PubMed Central

    O’Neill, Suzanne C.; Valdimarsdottir, Heiddis B.; DeMarco, Tiffani A.; Peshkin, Beth N.; Graves, Kristi D.; Brown, Karen; Hurley, Karen E.; Isaacs, Claudine; Hecker, Sharon; Schwartz, Marc D.

    2011-01-01

    BACKGROUND Women who receive positive or uninformative BRCA1/2 test results face a number of decisions about how to manage their cancer risk. The purpose of this study was to prospectively examine the effect of receiving a positive vs. uninformative BRCA1/2 genetic test result on the perceived pros and cons of risk-reducing mastectomy (RRM) and risk-reducing oophorectomy (RRO) and breast cancer screening. We further examined how perceived pros and cons of surgery predict intention for and uptake of surgery. METHODS 308 women (146 positive, 162 uninformative) were included in RRM and breast cancer screening analyses. 276 women were included in RRO analyses. Participants completed questionnaires at pre-disclosure baseline and 1-, 6-and 12-months post-disclosure. We used linear multiple regression to assess whether test result contributed to change in pros and cons and logistic regression to predict intentions and surgery uptake. RESULTS Receipt of a positive BRCA1/2 test result predicted stronger pros for RRM and RRO (Ps < .001), but not perceived cons of RRM and RRO. Pros of surgery predicted RRM and RRO intentions in carriers and RRO intentions in uninformatives. Cons predicted RRM intentions in carriers. Pros and cons predicted carriers’ RRO uptake in the year after testing (Ps < .001). CONCLUSIONS Receipt of BRCA1/2 mutation test results impacts how carriers see the positive aspects of RRO and RRM and their surgical intentions. Both the positive and negative aspects predict uptake of surgery. PMID:20383578

  1. [Research on fast detecting tomato seedlings nitrogen content based on NIR characteristic spectrum selection].

    PubMed

    Wu, Jing-zhu; Wang, Feng-zhu; Wang, Li-li; Zhang, Xiao-chao; Mao, Wen-hua

    2015-01-01

In order to improve the accuracy and robustness of detecting tomato seedling nitrogen content based on near-infrared spectroscopy (NIR), 4 kinds of characteristic spectrum selection methods were studied in the present paper: competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE), backward interval partial least squares (BiPLS) and synergy interval partial least squares (SiPLS). A total of 60 tomato seedlings were cultivated at 10 different nitrogen-treatment levels (urea concentration from 0 to 120 mg·L-1), with 6 samples at each nitrogen-treatment level, spanning excess-nitrogen, moderate-nitrogen, nitrogen-deficient and no-nitrogen statuses. Leaves from each sample were collected and scanned by near-infrared spectroscopy from 12 500 to 3 600 cm-1. Quantitative models based on the above 4 methods were established. According to the experimental results, the calibration models based on the CARS and MCUVE selection methods show better performance than those based on the BiPLS and SiPLS selection methods, but their prediction ability is much lower than that of the latter. Among them, the model built by BiPLS has the best prediction performance: its correlation coefficient (r), root mean square error of prediction (RMSEP) and ratio of performance to standard deviation (RPD) are 0.9527, 0.1183 and 3.291, respectively. Therefore, NIR technology combined with characteristic spectrum selection methods can improve model performance, but the selection methods are not universal. Because a model built on single-wavelength variable selection is more sensitive, it is more suitable for uniform samples, whereas a model built on wavelength-interval selection has stronger anti-interference ability and is more suitable for uneven samples with poor reproducibility. Therefore, characteristic spectrum selection plays its best role in model building only when combined with consideration of the sample state and the model indexes.

  2. Optimal recruitment strategies for groups of interacting walkers with leaders

    NASA Astrophysics Data System (ADS)

    Martínez-García, Ricardo; López, Cristóbal; Vazquez, Federico

    2015-02-01

    We introduce a model of interacting random walkers on a finite one-dimensional chain with absorbing boundaries or targets at the ends. Walkers are of two types: informed particles that move ballistically towards a given target and diffusing uninformed particles that are biased towards close informed individuals. This model mimics the dynamics of hierarchical groups of animals, where an informed individual tries to persuade and lead the movement of its conspecifics. We characterize the success of this persuasion by the first-passage probability of the uninformed particle to the target, and we interpret the speed of the informed particle as a strategic parameter that the particle can tune to maximize its success. We find that the success probability is nonmonotonic, reaching its maximum at an intermediate speed whose value increases with the diffusing rate of the uninformed particle. When two different groups of informed leaders traveling in opposite directions compete, usually the largest group is the most successful. However, the minority can reverse this situation and become the most probable winner by following two different strategies: increasing its attraction strength or adjusting its speed to an optimal value relative to the majority's speed.
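The first-passage probability used above as the success measure can be estimated by direct simulation. The sketch below is a toy single biased walker on a chain with absorbing ends, not the paper's interacting informed/uninformed particle model, and is compared against the gambler's-ruin closed form.

```python
import random

def first_passage_prob(n_sites, start, p_right, trials=20_000, seed=7):
    """Monte Carlo estimate of the probability that a biased random
    walker on sites 0..n_sites with absorbing ends exits at the right
    target (a toy stand-in for the uninformed walker's success)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = start
        while 0 < x < n_sites:                 # walk until absorbed
            x += 1 if rng.random() < p_right else -1
        wins += x == n_sites
    return wins / trials

p_hit = first_passage_prob(n_sites=10, start=5, p_right=0.6)
# gambler's-ruin closed form for the same walk, for comparison
r = 0.4 / 0.6
p_exact = (1 - r ** 5) / (1 - r ** 10)
```

In the paper's setting the bias itself depends on the positions of the informed leaders, which is what makes the success probability nonmonotonic in the leader's speed; this sketch only fixes the bias to show the estimation machinery.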

  3. Research on the Optimum Water Content of Detecting Soil Nitrogen Using Near Infrared Sensor

    PubMed Central

    He, Yong; Nie, Pengcheng; Dong, Tao; Qu, Fangfang; Lin, Lei

    2017-01-01

    Nitrogen is one of the important indexes for evaluating the physiological and biochemical properties of soil, and the soil nitrogen level directly influences the nutrient status of crops. A near infrared sensor can detect soil nitrogen content rapidly, nondestructively and conveniently. To investigate the effect of soil water content on soil nitrogen detection by a near infrared sensor, soil samples were dried for different lengths of time and the corresponding water content was measured. The drying time was set from 1 h to 8 h, and every hour 90 samples (10 samples at each nitrogen concentration) were measured; the spectral information of the samples was obtained by the near infrared sensor and the soil water content was calculated at the same time. Prediction models of soil nitrogen content were established with two linear modeling methods, partial least squares (PLS) and uninformative variable elimination (UVE). The experiment shows that detection accuracy is highest when the drying time is 3 h and the corresponding soil water content is 1.03%. The correlation coefficients of the calibration set are 0.9721 and 0.9656, and those of the prediction set are 0.9712 and 0.9682, respectively. The prediction accuracy of both models is high, while the PLS model is better and more stable. The results indicate that a soil water content of 1.03% has the minimum influence on the detection of soil nitrogen content by a near infrared sensor, with the highest detection accuracy and the lowest time cost, which is of great significance for developing a portable apparatus to detect soil nitrogen in the field accurately and rapidly. PMID:28880202
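The UVE step summarized in this record can be sketched compactly. In the sketch below, ordinary least squares stands in for the paper's regression model, leave-one-out resampling supplies the coefficient distributions, and appended random columns set the elimination threshold; all sizes and names are illustrative.

```python
import numpy as np

def uve_select(X, y, n_noise=None, seed=0):
    """Uninformative variable elimination (UVE), minimal sketch.
    Appends random noise columns to X, refits a linear model under
    leave-one-out resampling, and keeps only the real variables whose
    coefficient stability (mean/std over refits) beats every noise column.
    The stability ratio is scale-free, so unit-scale noise suffices here.
    Data are assumed centered (no intercept term)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    n_noise = n_noise or p
    Xa = np.hstack([X, rng.normal(size=(n, n_noise))])
    coefs = np.empty((n, p + n_noise))
    for i in range(n):                             # leave-one-out refits
        keep = np.arange(n) != i
        coefs[i], *_ = np.linalg.lstsq(Xa[keep], y[keep], rcond=None)
    c = coefs.mean(axis=0) / coefs.std(axis=0)     # coefficient stability
    cutoff = np.abs(c[p:]).max()                   # worst noise column sets the bar
    return np.abs(c[:p]) > cutoff                  # mask of retained variables
```

Variables whose coefficients are no more stable than pure noise are the "uninformative" ones the method removes before the final calibration.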

  4. Research on the Optimum Water Content of Detecting Soil Nitrogen Using Near Infrared Sensor.

    PubMed

    He, Yong; Xiao, Shupei; Nie, Pengcheng; Dong, Tao; Qu, Fangfang; Lin, Lei

    2017-09-07

    Nitrogen is one of the important indexes for evaluating the physiological and biochemical properties of soil, and the soil nitrogen level directly influences the nutrient status of crops. A near infrared sensor can detect soil nitrogen content rapidly, nondestructively and conveniently. To investigate the effect of soil water content on soil nitrogen detection by a near infrared sensor, soil samples were dried for different lengths of time and the corresponding water content was measured. The drying time was set from 1 h to 8 h, and every hour 90 samples (10 samples at each nitrogen concentration) were measured; the spectral information of the samples was obtained by the near infrared sensor and the soil water content was calculated at the same time. Prediction models of soil nitrogen content were established with two linear modeling methods, partial least squares (PLS) and uninformative variable elimination (UVE). The experiment shows that detection accuracy is highest when the drying time is 3 h and the corresponding soil water content is 1.03%. The correlation coefficients of the calibration set are 0.9721 and 0.9656, and those of the prediction set are 0.9712 and 0.9682, respectively. The prediction accuracy of both models is high, while the PLS model is better and more stable. The results indicate that a soil water content of 1.03% has the minimum influence on the detection of soil nitrogen content by a near infrared sensor, with the highest detection accuracy and the lowest time cost, which is of great significance for developing a portable apparatus to detect soil nitrogen in the field accurately and rapidly.

  5. The evidential value of developmental age imaging for assessing age of majority

    PubMed Central

    Cole, T. J.

    2015-01-01

    Aim: To consider the evidential value of developmental age images for identifying age of majority. Methods: The published literature on hand-wrist X-rays, MRI scans of the distal radius and orthopantomograms of the lower left third molar is considered in terms of the mean age of attainment of the adult appearance and the diagnostic test performance of the adult appearance to predict adult status, either administratively (under-17 football) or forensically. Results: The mean age of attainment of a mature hand-wrist X-ray is under 18 years, and most individuals are mature before age 18. For the MRI wrist scan and the third molar, the age of attainment is over 19 years and the adult appearance is an indicator of adulthood, while the immature appearance is uninformative about likely age. Thus MRI and third molars have high specificity but low sensitivity. Conclusions: Bone age assessed by hand-wrist X-ray is uninformative and should not be used. The adult appearance of MRI wrist scans and third molars provides evidence of being over-age, although there remains a small risk of minors being misclassified as adult. The immature appearance is uninformative about likely age and, overall, more than one third of assessments are wrong. PMID:26133364

  6. Characterization of groups using composite kernels and multi-source fMRI analysis data: application to schizophrenia

    PubMed Central

    Castro, Eduardo; Martínez-Ramón, Manel; Pearlson, Godfrey; Sui, Jing; Calhoun, Vince D.

    2011-01-01

    Pattern classification of brain imaging data can enable the automatic detection of differences in cognitive processes of specific groups of interest. Furthermore, it can also give neuroanatomical information related to the regions of the brain that are most relevant to detect these differences by means of feature selection procedures, which are also well-suited to deal with the high dimensionality of brain imaging data. This work proposes the application of recursive feature elimination using a machine learning algorithm based on composite kernels to the classification of healthy controls and patients with schizophrenia. This framework, which evaluates nonlinear relationships between voxels, analyzes whole-brain fMRI data from an auditory task experiment that is segmented into anatomical regions and recursively eliminates the uninformative ones based on their relevance estimates, thus yielding the set of most discriminative brain areas for group classification. The collected data were processed using two analysis methods: the general linear model (GLM) and independent component analysis (ICA). GLM spatial maps as well as ICA temporal lobe and default mode component maps were then input to the classifier. A mean classification accuracy of up to 95% estimated with a leave-two-out cross-validation procedure was achieved by doing multi-source data classification. In addition, it is shown that the classification accuracy rate obtained by using multi-source data surpasses that reached by using single-source data, hence showing that this algorithm takes advantage of the complementary nature of GLM and ICA. PMID:21723948
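Recursive elimination of the least informative features, as used in the record above, can be illustrated in a few lines. The sketch is generic: a plain linear least-squares scorer stands in for the paper's composite-kernel machine, and features are dropped one at a time by smallest absolute coefficient.

```python
import numpy as np

def recursive_elimination(X, y, n_keep):
    """Minimal recursive feature elimination sketch: standardize the active
    features, fit a linear model, drop the feature with the smallest
    absolute coefficient, refit, and repeat until n_keep features remain.
    Returns the indices of the surviving columns of X."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        Z = X[:, active]
        Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)   # comparable coefficients
        coef, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
        active.pop(int(np.argmin(np.abs(coef))))   # remove weakest feature
    return active
```

In the paper the eliminated units are whole anatomical regions scored by kernel relevance rather than single standardized columns, but the recursive drop-and-refit loop is the same idea.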

  7. Social foraging and individual consistency in following behaviour: testing the information centre hypothesis in free-ranging vultures.

    PubMed

    Harel, Roi; Spiegel, Orr; Getz, Wayne M; Nathan, Ran

    2017-04-12

    Uncertainties regarding food location and quality are among the greatest challenges faced by foragers and communal roosting may facilitate success through social foraging. The information centre hypothesis (ICH) suggests that uninformed individuals at shared roosts benefit from following informed individuals to previously visited resources. We tested several key prerequisites of the ICH in a social obligate scavenger, the Eurasian griffon vulture (Gyps fulvus), by tracking movements and behaviour of sympatric individuals over extended periods and across relatively large spatial scales, thereby precluding alternative explanations such as local enhancement. In agreement with the ICH, we found that 'informed' individuals returning to previously visited carcasses were followed by 'uninformed' vultures that consequently got access to these resources. When a dyad (two individuals that depart from the same roost within 2 min of each other) included an informed individual, they spent a higher proportion of the flight time close to each other, at a shorter distance between them, than otherwise. Although all individuals occasionally profited from following others, they differed in their tendencies to be informed or uninformed. This study provides evidence for 'following behaviour' in natural conditions and demonstrates differential roles and information states among foragers within a population. Moreover, demonstrating the possible reliance of vultures on following behaviour emphasizes that individuals in declining populations may suffer from reduced foraging efficiency. © 2017 The Author(s).

  8. The effect of concrete supplements on metacognitive regulation during learning and open-book test taking.

    PubMed

    Ackerman, Rakefet; Leiser, David

    2014-06-01

    Previous studies have suggested that when reading texts, lower achievers are more sensitive than their stronger counterparts to surface-level cues, such as graphic illustrations, and that even when uninformative, such concrete supplements tend to raise the text's subjective comprehensibility. We examined how being led astray by uninformative concrete supplements in expository texts affects achievement. We focused on the mediating role of metacognitive processes by partialling out the role of cognitive ability, as indicated by SAT scores, in accounting for the found differences between higher and lower achievers. Undergraduate students studied expository texts in their base versions or in concrete versions, including uninformative supplements, in a within-participant design. The procedure had three phases: Studying, open-book test taking, and reanswering questions of one's choice. Overall, judgements of comprehension (JCOMPs) were higher after participants studied the concrete than the base versions, and the participants benefited from the open-book test and the reanswering opportunity. An in-depth examination of time investment, JCOMP, confidence in test answers, choice of questions to reanswer, and test scores indicated that those whose metacognitive processes were more effective and goal driven achieved higher scores. The effectiveness of metacognitive processes during learning and test taking constitutes an important factor differentiating between higher and lower achievers when studying texts that include potentially misleading cues. © 2013 The British Psychological Society.

  9. Adolescent pregnancy (image)

    MedlinePlus

    Clear, specific information about sexual behavior and its consequences is frequently not provided to adolescents by their families, schools and communities. The "sex education" that many receive comes from misinformed or uninformed peers.

  10. An Uninformative Truth: The Logic of Amarin's Off-Label Promotion.

    PubMed

    Hey, Spencer Phillips; Kesselheim, Aaron S

    2016-03-01

    Spencer Phillips Hey and Aaron Kesselheim propose that informativeness (asserting scientific facts), rather than truthfulness, ought to be the standard for regulating commercial speech about pharmaceuticals.

  11. An analysis of herding behavior in security analysts’ networks

    NASA Astrophysics Data System (ADS)

    Zhao, Zheng; Zhang, YongJie; Feng, Xu; Zhang, Wei

    2014-11-01

    In this paper, we build undirected weighted networks to study herding behavior among analysts and to analyze the characteristics and the structure of these networks. We then construct a new indicator based on the average degree of nodes and the average weighted clustering coefficient to research the various types of herding behavior. Our findings suggest that every industry has, to a certain degree, herding behavior among analysts. While there is obvious uninformed herding behavior in real estate and certain other industries, industries such as mining and nonferrous metals have informed herding behavior caused by analysts’ similar reactions to public information. Furthermore, we relate the two types of herding behavior to stock price and find that uninformed herding behavior has a positive effect on market prices, whereas informed herding behavior has a negative effect.

  12. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
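A maximally selected rank statistic, as used in this record for split-point selection, can be sketched without the survival-specific machinery. The version below is a simplified, censoring-free illustration built on a linear rank statistic; the paper's method additionally handles censored outcomes and supplies p-value approximations for comparing candidate splitting variables.

```python
import numpy as np

def max_selected_rank_stat(x, y):
    """For every candidate cut point of x, compute the standardized sum of
    outcome ranks falling left of the cut, and return the cut with the
    largest absolute standardized value together with that value."""
    n = len(x)
    r = np.argsort(np.argsort(y)) + 1.0      # ranks of the outcome y
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    csum = np.cumsum(rs)[:-1]                # left-of-cut rank sums
    m = np.arange(1, n)                      # left group sizes
    mean = m * r.mean()                      # expected rank sum under H0
    var = m * (n - m) / (n - 1.0) * r.var()  # hypergeometric-style variance
    stat = np.abs(csum - mean) / np.sqrt(var)
    best = int(np.argmax(stat))
    cut = (xs[best] + xs[best + 1]) / 2.0    # midpoint between sorted values
    return cut, float(stat[best])
```

Because every cut point is standardized before taking the maximum, variables with many possible split points no longer gain the automatic advantage that the plain log-rank criterion gives them.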

  13. Multicollinearity in canonical correlation analysis in maize.

    PubMed

    Alves, B M; Cargnelutti Filho, A; Burin, C

    2017-03-30

    The objective of this study was to evaluate the effects of multicollinearity under two methods of canonical correlation analysis (with and without elimination of variables) in maize (Zea mays L.) crop. Seventy-six maize genotypes were evaluated in three experiments, conducted in a randomized block design with three replications, during the 2009/2010 crop season. Eleven agronomic variables (number of days from sowing until female flowering, number of days from sowing until male flowering, plant height, ear insertion height, ear placement, number of plants, number of ears, ear index, ear weight, grain yield, and one thousand grain weight), 12 protein-nutritional variables (crude protein, lysine, methionine, cysteine, threonine, tryptophan, valine, isoleucine, leucine, phenylalanine, histidine, and arginine), and 6 energetic-nutritional variables (apparent metabolizable energy, apparent metabolizable energy corrected for nitrogen, ether extract, crude fiber, starch, and amylose) were measured. A phenotypic correlation matrix was first generated among the 29 variables for each of the experiments. A multicollinearity diagnosis was later performed within each group of variables using methodologies such as variance inflation factor and condition number. Canonical correlation analysis was then performed, with and without the elimination of variables, among groups of agronomic and protein-nutritional, and agronomic and energetic-nutritional variables. The canonical correlation analysis in the presence of multicollinearity (without elimination of variables) overestimates the variability of canonical coefficients. The elimination of variables is an efficient method to circumvent multicollinearity in canonical correlation analysis.
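The two multicollinearity diagnostics named in this record have standard definitions; a small numpy sketch (the ~10 and ~100 cutoffs in the comments are common rules of thumb, not values taken from the study):

```python
import numpy as np

def vif(X):
    """Variance inflation factors: regress each standardized column on the
    others and return 1 / (1 - R^2) per column. Values above roughly 10 are
    commonly read as problematic collinearity."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        beta, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ beta
        r2 = 1.0 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

def condition_number(X):
    """Condition number of the correlation matrix: ratio of its largest to
    smallest eigenvalue. Values above roughly 100 are commonly read as
    severe collinearity."""
    ev = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))
    return float(ev.max() / ev.min())
```

Running these diagnostics within each variable group, as the study does, identifies which variables to eliminate before the canonical correlation analysis.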

  14. Cables.

    PubMed

    Cushing, M

    1994-01-01

    If you want to control your own computer installation, get the satisfaction of doing your own maintenance, and compensate for an inept or uninformed vendor, the information in this article will help you achieve these ends. Good luck and good cabling!

  15. Intersubject Variability in Fearful Face Processing: The Link Between Behavior and Neural Activation

    PubMed Central

    Doty, Tracy J.; Japee, Shruti; Ingvar, Martin; Ungerleider, Leslie G.

    2014-01-01

    Stimuli that signal threat show considerable variability in the extent to which they enhance behavior, even among healthy individuals. However, the neural underpinning of this behavioral variability is not well understood. By manipulating expectation of threat in an fMRI study of fearful vs. neutral face categorization, we uncovered a network of areas underlying variability in threat processing in healthy adults. We explicitly altered expectation by presenting face images at three different expectation levels: 80%, 50%, and 20%. Subjects were instructed to report as fast and as accurately as possible whether the face was fearful (signaled threat) or not. An uninformative cue preceded each face by 4 seconds (s). By taking the difference between response times (RT) to fearful compared to neutral faces, we quantified an overall fear RT bias (i.e. faster to fearful than neutral faces) for each subject. This bias correlated positively with late trial fMRI activation (8 s after the face) during unexpected fearful face trials in bilateral ventromedial prefrontal cortex, the left subgenual cingulate cortex, and the right caudate nucleus and correlated negatively with early trial fMRI activation (4 s after the cue) during expected neutral face trials in bilateral dorsal striatum and the right ventral striatum. These results demonstrate that the variability in threat processing among healthy adults is reflected not only in behavior but also in the magnitude of activation in medial prefrontal and striatal regions that appear to encode affective value. PMID:24841078

  16. Intersubject variability in fearful face processing: the link between behavior and neural activation.

    PubMed

    Doty, Tracy J; Japee, Shruti; Ingvar, Martin; Ungerleider, Leslie G

    2014-12-01

    Stimuli that signal threat show considerable variability in the extents to which they enhance behavior, even among healthy individuals. However, the neural underpinning of this behavioral variability is not well understood. By manipulating expectation of threat in an fMRI study of fearful versus neutral face categorization, we uncovered a network of areas underlying variability in threat processing in healthy adults. We explicitly altered expectations by presenting face images at three different expectation levels: 80 %, 50 %, and 20 %. Subjects were instructed to report as quickly and accurately as possible whether the face was fearful (signaled threat) or not. An uninformative cue preceded each face by 4 s. By taking the difference between reaction times (RTs) to fearful and neutral faces, we quantified an overall fear RT bias (i.e., faster to fearful than to neutral faces) for each subject. This bias correlated positively with late-trial fMRI activation (8 s after the face) during unexpected-fearful-face trials in bilateral ventromedial prefrontal cortex, the left subgenual cingulate cortex, and the right caudate nucleus, and correlated negatively with early-trial fMRI activation (4 s after the cue) during expected-neutral-face trials in bilateral dorsal striatum and the right ventral striatum. These results demonstrate that the variability in threat processing among healthy adults is reflected not only in behavior, but also in the magnitude of activation in medial prefrontal and striatal regions that appear to encode affective value.

  17. Not accounting for interindividual variability can mask habitat selection patterns: a case study on black bears.

    PubMed

    Lesmerises, Rémi; St-Laurent, Martin-Hugues

    2017-11-01

    Habitat selection studies conducted at the population scale commonly aim to describe general patterns that could improve our understanding of the limiting factors in species-habitat relationships. Researchers often consider interindividual variation in selection patterns to control for its effects and avoid pseudoreplication by using mixed-effect models that include individuals as random factors. Here, we highlight common pitfalls and possible misinterpretations of this strategy by describing habitat selection of 21 black bears Ursus americanus. We used Bayesian mixed-effect models and compared results obtained when using random intercept (i.e., population level) versus calculating individual coefficients for each independent variable (i.e., individual level). We then related interindividual variability to individual characteristics (i.e., age, sex, reproductive status, body condition) in a multivariate analysis. The assumption of comparable behavior among individuals was verified only in 40% of the cases in our seasonal best models. Indeed, we found strong and opposite responses among sampled bears and individual coefficients were linked to individual characteristics. For some covariates, contrasted responses canceled each other out at the population level. In other cases, interindividual variability was concealed by the composition of our sample, with the majority of the bears (e.g., old individuals and bears in good physical condition) driving the population response (e.g., selection of young forest cuts). Our results stress the need to consider interindividual variability to avoid misinterpretation and uninformative results, especially for a flexible and opportunistic species. This study helps to identify some ecological drivers of interindividual variability in bear habitat selection patterns.

  18. Speed and Cardiac Recovery Variables Predict the Probability of Elimination in Equine Endurance Events.

    PubMed

    Younes, Mohamed; Robert, Céline; Cottin, François; Barrey, Eric

    2015-01-01

    Nearly 50% of the horses participating in endurance events are eliminated at a veterinary examination (a vet gate). Detecting unfit horses before a health problem occurs and treatment is required is a challenge for veterinarians but is essential for improving equine welfare. We hypothesized that it would be possible to detect unfit horses earlier in the event by measuring heart rate recovery variables. Hence, the objective of the present study was to compute logistic regressions of heart rate, cardiac recovery time and average speed data recorded at the previous vet gate (n-1) and thus predict the probability of elimination during successive phases (n and following) in endurance events. Speed and heart rate data were extracted from an electronic database of endurance events (80-160 km in length) organized in four countries. Overall, 39% of the horses that started an event were eliminated, mostly due to lameness (64%) or metabolic disorders (15%). For each vet gate, logistic regressions of explanatory variables (average speed, cardiac recovery time and heart rate measured at the previous vet gate) and categorical variables (age and/or event distance) were computed to estimate the probability of elimination. The predictive logistic regressions for vet gates 2 to 5 correctly classified between 62% and 86% of the eliminated horses. The robustness of these results was confirmed by high areas under the receiver operating characteristic curves (0.68-0.84). Overall, a horse has a 70% chance of being eliminated at the next gate if its cardiac recovery time is longer than 11 min at vet gate 1 or 2, or longer than 13 min at vet gates 3 or 4. Heart rate recovery and average speed variables measured at the previous vet gate(s) enabled us to predict elimination at the following vet gate. These variables should be checked at each veterinary examination, in order to detect unfit horses as early as possible.
Our predictive method may help to improve equine welfare and ethical considerations in endurance events.
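The kind of vet-gate model described above, a logistic regression of elimination probability on recovery variables, can be sketched with plain gradient ascent; the data, coefficients and variable choices below are synthetic illustrations, not the study's fitted model.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression by batch gradient ascent (no external libraries).
    X holds standardized predictors such as cardiac recovery time; y is 1
    for eliminated horses and 0 otherwise. Returns [intercept, weights...]."""
    Xb = np.hstack([np.ones((len(X), 1)), X])      # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w += lr * Xb.T @ (y - p) / len(y)          # log-likelihood gradient
    return w

def predict_prob(w, x):
    """Probability of elimination for one predictor vector x."""
    return float(1.0 / (1.0 + np.exp(-(w[0] + w[1:] @ np.asarray(x)))))
```

A fitted positive weight on recovery time reproduces the qualitative finding above: the longer a horse takes to recover at one gate, the higher its predicted probability of elimination at the next.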

  19. Future direction of the roadway weather information system (RWIS) at PennDOT.

    DOT National Transportation Integrated Search

    2007-08-01

    Weather events have a significant impact on our transportation network. Motorist safety can be jeopardized if roadways are not maintained in the most efficient method possible or if motorists are uninformed about roadway conditions. Mobility can be...

  20. Observing behavior in a computer game.

    PubMed Central

    Case, D A; Ploog, B O; Fantino, E

    1990-01-01

    Contingencies studied in lever-pressing procedures were incorporated into a popular computer game, "Star Trek," played by college students. One putative reinforcer, the opportunity to destroy Klingon invaders, was scheduled independently of responding according to a variable-time schedule that alternated unpredictably with equal periods of Klingon unavailability (mixed variable time, extinction schedule of reinforcement). Two commands ("observing responses") each produced stimuli that were either correlated or uncorrelated with the two components. In several variations of the basic game, an S-, or bad news, was not as reinforcing as an S+, or good news. In addition, in other conditions for the same subjects observing responses were not maintained better by bad news than by an uninformative stimulus. In both choices, more observing tended to be maintained by an S- for response-independent Klingons when its information could be (and was) used to advantage with respect to other types of reinforcement in the situation (Parts 1 and 2) than when the information could not be so used (Part 3). The findings favor the conditioned reinforcement hypothesis of observing behavior over the uncertainty-reduction hypothesis. This extends research to a more natural setting and to multialternative concurrent schedules of events of seemingly intrinsic value. PMID:2103581

  1. America's Uninformed Electorate

    ERIC Educational Resources Information Center

    Vandermyn, Gaye

    1974-01-01

    Highlights a recently reported national survey of what young Americans (ages 9-35) know and understand about their constitutional rights, the political process, the role of government and basic democratic principles. Findings indicate that many Americans are unfamiliar with the political functionings of the country or their rights guaranteed under…

  2. Psycholinguistic Foundations of Language Assessment.

    ERIC Educational Resources Information Center

    Genesee, Fred

    A review of literature on foreign language testing indicates that the earliest approaches to language assessment were generally uninformed by contemporaneous linguistic and psychological theories and were characterized by a lack of psychometric sophistication. This trend was followed by the development of test instruments that were heavily…

  3. Toward Transgender Affirmative Social Work Education

    ERIC Educational Resources Information Center

    Austin, Ashley; Craig, Shelley L.; McInroy, Lauren B.

    2016-01-01

    Social work has professional and academic standards consistent with transgender affirmative education and practice. Nevertheless, a growing body of research suggests that transgender issues are largely absent from social work education, resulting in practitioners who are uninformed or biased against transgender issues. The present study expands…

  4. Parameterization of the InVEST Crop Pollination Model to spatially predict abundance of wild blueberry (Vaccinium angustifolium Aiton) native bee pollinators in Maine, USA

    USGS Publications Warehouse

    Groff, Shannon C.; Loftin, Cynthia S.; Drummond, Frank; Bushmann, Sara; McGill, Brian J.

    2016-01-01

    Non-native honeybees historically have been managed for crop pollination, however, recent population declines draw attention to pollination services provided by native bees. We applied the InVEST Crop Pollination model, developed to predict native bee abundance from habitat resources, in Maine's wild blueberry crop landscape. We evaluated model performance with parameters informed by four approaches: 1) expert opinion; 2) sensitivity analysis; 3) sensitivity analysis informed model optimization; and, 4) simulated annealing (uninformed) model optimization. Uninformed optimization improved model performance by 29% compared to expert opinion-informed model, while sensitivity-analysis informed optimization improved model performance by 54%. This suggests that expert opinion may not result in the best parameter values for the InVEST model. The proportion of deciduous/mixed forest within 2000 m of a blueberry field also reliably predicted native bee abundance in blueberry fields, however, the InVEST model provides an efficient tool to estimate bee abundance beyond the field perimeter.

  5. Information Cost, Memory Length and Market Instability.

    PubMed

    Diks, Cees; Li, Xindan; Wu, Chengyao

    2018-07-01

    In this article, we study the instability of a stock market with a modified version of Diks and Dindo's (2008) model where the market is characterized by nonlinear interactions between informed traders and uninformed traders. In the interaction of heterogeneous agents, we replace the replicator dynamics for the fractions by logistic strategy switching. This modification makes the model more suitable for describing realistic price dynamics, as well as more robust with respect to parameter changes. One goal of our paper is to use this model to explore if the arrival of new information (news) and investor behavior have an effect on market instability. A second, related, goal is to study the way markets absorb new information, especially when the market is unstable and the price is far from being fully informative. We find that the dynamics become locally unstable and prices may deviate far from the fundamental price, routing to chaos through bifurcation, with increasing information costs or decreasing memory length of the uninformed traders.
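The switching rule mentioned above, logit (logistic) strategy switching in place of replicator dynamics, has a standard generic form; the sketch below is illustrative, with an assumed intensity of choice and a simple exponential profit memory to echo the memory-length parameter of the uninformed traders.

```python
import math

def logit_switch(profit_informed, profit_uninformed, cost, beta=2.0):
    """Fraction of traders choosing the informed strategy next period:
    a logistic function of the realized profit difference net of the
    information cost, with intensity of choice beta (illustrative value)."""
    diff = (profit_informed - cost) - profit_uninformed
    return 1.0 / (1.0 + math.exp(-beta * diff))

def remembered_profit(history, memory=0.9):
    """Exponentially discounted average of past profits; a smaller `memory`
    means a shorter memory length, weighting recent outcomes more heavily."""
    acc, w_total = 0.0, 0.0
    for k, profit in enumerate(reversed(history)):
        w = memory ** k
        acc += w * profit
        w_total += w
    return acc / w_total
```

Raising the information cost lowers the fraction of informed traders for any profit history, which is the channel through which higher costs destabilize prices in the article's model.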

  6. Speed and Cardiac Recovery Variables Predict the Probability of Elimination in Equine Endurance Events

    PubMed Central

    Younes, Mohamed; Robert, Céline; Cottin, François; Barrey, Eric

    2015-01-01

    Nearly 50% of the horses participating in endurance events are eliminated at a veterinary examination (a vet gate). Detecting unfit horses before a health problem occurs and treatment is required is a challenge for veterinarians but is essential for improving equine welfare. We hypothesized that it would be possible to detect unfit horses earlier in the event by measuring heart rate recovery variables. Hence, the objective of the present study was to compute logistic regressions of heart rate, cardiac recovery time and average speed data recorded at the previous vet gate (n-1) and thus predict the probability of elimination during successive phases (n and following) in endurance events. Speed and heart rate data were extracted from an electronic database of endurance events (80–160 km in length) organized in four countries. Overall, 39% of the horses that started an event were eliminated, mostly due to lameness (64%) or metabolic disorders (15%). For each vet gate, logistic regressions of explanatory variables (average speed, cardiac recovery time and heart rate measured at the previous vet gate) and categorical variables (age and/or event distance) were computed to estimate the probability of elimination. The predictive logistic regressions for vet gates 2 to 5 correctly classified between 62% and 86% of the eliminated horses. The robustness of these results was confirmed by high areas under the receiver operating characteristic curves (0.68–0.84). Overall, a horse has a 70% chance of being eliminated at the next gate if its cardiac recovery time is longer than 11 min at vet gate 1 or 2, or longer than 13 min at vet gates 3 or 4. Heart rate recovery and average speed variables measured at the previous vet gate(s) enabled us to predict elimination at the following vet gate. These variables should be checked at each veterinary examination, in order to detect unfit horses as early as possible.
Our predictive method may help to improve equine welfare and ethical considerations in endurance events. PMID:26322506
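The modelling approach in the abstract above (logistic regression on recovery variables, evaluated by the area under the ROC curve) can be sketched as follows. This is a toy illustration on synthetic data; the variable names, coefficients, and thresholds are hypothetical, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: cardiac recovery time (min) and average speed (km/h)
recovery = rng.normal(10.0, 3.0, n)
speed = rng.normal(15.0, 2.0, n)
# Synthetic ground truth: longer recovery time raises the odds of elimination
true_logit = 0.6 * (recovery - 11.0) - 0.2 * (speed - 15.0)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit a logistic regression by gradient descent on the log-loss
X = np.column_stack([np.ones(n), recovery, speed])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
    w -= 0.01 * X.T @ (p - y) / n

# Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
# the fraction of (eliminated, finished) pairs the model ranks correctly
scores = X @ w
pos, neg = scores[y == 1], scores[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
print(round(auc, 2))
```

An AUC in the 0.68–0.84 range reported by the study corresponds, in this identity, to ranking roughly 68–84% of eliminated/finished horse pairs correctly.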

  7. Item-level psychometrics and predictors of performance for Spanish/English bilingual speakers on an object and action naming battery.

    PubMed

    Edmonds, Lisa A; Donovan, Neila J

    2012-04-01

    There is a pressing need for psychometrically sound naming materials for Spanish/English bilingual adults. To address this need, in this study the authors examined the psychometric properties of An Object and Action Naming Battery (An O&A Battery; Druks & Masterson, 2000) in bilingual speakers. Ninety-one Spanish/English bilinguals named O&A Battery items in English and Spanish. Responses underwent a Rasch analysis. Using correlation and regression analyses, the authors evaluated the effect of psycholinguistic (e.g., imageability) and participant (e.g., proficiency ratings) variables on accuracy. Rasch analysis determined unidimensionality across English and Spanish nouns and verbs and robust item-level psychometric properties, evidence for content validity. Few items did not fit the model, there were no ceiling or floor effects after uninformative and misfit items were removed, and items reflected a range of difficulty. Reliability coefficients were high, and the number of statistically different ability levels provided indices of sensitivity. Regression analyses revealed significant correlations between psycholinguistic variables and accuracy, providing preliminary construct validity. The participant variables that contributed most to accuracy were proficiency ratings and time of language use. Results suggest adequate content and construct validity of O&A items retained in the analysis for Spanish/English bilingual adults and support future efforts to evaluate naming in older bilinguals and persons with bilingual aphasia.

  8. Students with AIDS. A Legal Memorandum.

    ERIC Educational Resources Information Center

    Strope, John L., Jr.; Broadwell, Cathy Allen

    When confronted with a student with acquired immune deficiency syndrome (AIDS), administrators must act very cautiously. In addition to the public relations and political problems associated with students with AIDS, administrators are faced with the legal implications of their decisions; their actions, if uninformed, can result in monetary…

  9. Log-normal distribution of the trace element data results from a mixture of stochastic input and deterministic internal dynamics.

    PubMed

    Usuda, Kan; Kono, Koichi; Dote, Tomotaro; Shimizu, Hiroyasu; Tominaga, Mika; Koizumi, Chisato; Nakase, Emiko; Toshina, Yumi; Iwai, Junko; Kawasaki, Takashi; Akashi, Mitsuya

    2002-04-01

    In a previous article, we showed a log-normal distribution of boron and lithium in human urine. This type of distribution is common in both biological and nonbiological applications. It can be observed when the effects of many independent variables are combined, each of which may have any underlying distribution. Although elemental excretion depends on many variables, the one-compartment open model following a first-order process can be used to explain the elimination of elements. The rate of excretion is proportional to the amount of any given element present; that is, the same percentage of an existing element is eliminated per unit time, and the element concentration is represented by a deterministic negative power function of time over the elimination time-course. Sampling is of a stochastic nature, so the dataset of time variables in the elimination phase when the sample was obtained is expected to show a Normal distribution. The time variable appears as an exponent of the power function, so a concentration histogram is that of an exponential transformation of Normally distributed time. This is why the element concentration shows a log-normal distribution. The distribution is determined not by the element concentration itself, but by the time variable that defines the pharmacokinetic equation.
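The mechanism described above can be checked numerically: under first-order elimination the log-concentration is a linear function of sampling time, so Normally distributed sampling times yield log-normally distributed concentrations. A minimal sketch with made-up parameters C0 and k:

```python
import numpy as np

rng = np.random.default_rng(1)
# First-order elimination: C(t) = C0 * exp(-k * t)  (C0 and k are made up)
C0, k = 100.0, 0.3
# Stochastic sampling: collection times in the elimination phase are Normal
t = rng.normal(8.0, 2.0, 100_000)
conc = C0 * np.exp(-k * t)

# log C(t) = log C0 - k*t is linear in the Normal variable t,
# so concentrations are log-normal: their logarithms are Normally distributed
logc = np.log(conc)
print(round(logc.mean(), 2), round(logc.std(), 2))
```

The mean and standard deviation of the log-concentrations match the linear transform of the sampling-time distribution (log C0 - k·mean(t) and k·sd(t)), which is exactly the log-normal mechanism the abstract describes.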

  10. Curve fitting and modeling with splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.

  11. Fitting multidimensional splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    This report demonstrates the successful application of statistical variable selection techniques to fit splines. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs using the B-spline basis were developed, and the one for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
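The backward-elimination approach to knot selection described in the two reports above can be sketched as a toy illustration. This is not the FORTRAN programs from the reports, and a truncated-power cubic basis stands in for the B-spline basis; the data, candidate knots, and stopping rule are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.1, x.size)

def design(knots):
    # Cubic truncated-power basis: 1, x, x^2, x^3, plus (x - k)_+^3 per knot
    cols = [x ** p for p in range(4)]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

def sse(knots):
    A = design(knots)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((A @ beta - y) ** 2))

# Backward elimination: repeatedly drop the knot whose removal increases
# the residual sum of squares the least, until removal clearly hurts the fit
knots = list(np.linspace(0.1, 0.9, 9))
while len(knots) > 2:
    current = sse(knots)
    trials = [sse(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
    best = int(np.argmin(trials))
    if trials[best] > 1.5 * current:   # ad hoc stopping threshold
        break
    knots.pop(best)

print(len(knots))
```

In practice the drop/stop decision would use a proper statistical criterion (e.g. an F-test or information criterion) rather than the ad hoc 1.5× threshold used here for brevity.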

  12. The Forces That Shape the Work of Community College Counselors

    ERIC Educational Resources Information Center

    Ryan, Edward Francis

    2011-01-01

    Concerns exist about the quality of counseling within community colleges. Some counselors lower student aspirations and are inaccessible, uninformed, or discouraging. Evidence also suggests that ineffective counseling might be explained by role conflict. Although role conflict should not be used as an excuse to justify poor performance, it may…

  13. Using Perceived Differences in Views of Agricultural Water Use to Inform Practice

    ERIC Educational Resources Information Center

    Lamm, Alexa J.; Taylor, Melissa R.; Lamm, Kevan W.

    2016-01-01

    Water use has become increasingly contentious as the population grows and water resources become scarcer. Recent media coverage of agricultural water use has brought negative attention potentially influencing public and decision makers' attitudes towards agriculture. Negative perceptions could result in uninformed decisions being made that impact…

  14. The Capital Costs Conundrum: Why Are Capital Costs Ignored and What Are the Consequences?

    ERIC Educational Resources Information Center

    Winston, Gordon C.

    1993-01-01

    Colleges and universities historically have ignored the capital costs associated with institutional administration in their estimates of overall and per-student costs. This neglect leads to distortion of data, misunderstandings, and uninformed decision making. The real costs should be recognized in institutional accounting. (MSE)

  15. Architecture and the Ethics of Authenticity

    ERIC Educational Resources Information Center

    Spector, Tom

    2011-01-01

    Across most of Oklahoma's gently rolling prairie countryside these artistically uninformed structures often provide the only vertical punctuation to a landscape otherwise made of mostly horizontal lines. One of the pleasures of teaching architecture is to participate in the intellectual progress of students--many of whom hail from rural areas and…

  16. A Curriculum on Obedience to Authority.

    ERIC Educational Resources Information Center

    Bushman, Brad J.

    This document is a curriculum guide on obedience to authority based on the assumption that informed, educated, thoughtful individuals are more likely to make intelligent decisions regarding obedience or disobedience to authority figures' requests than are uninformed individuals. The intent of this curriculum is to expose students to a small number…

  17. Reference Letters and the Uninformed Business Educator: A U.S. Legal Perspective

    ERIC Educational Resources Information Center

    Compton, Nina; Albinsson, Pia A.

    2013-01-01

    While providing references to students, business professors have to meet dual demands of giving sincere references to prospective employers while avoiding any potential litigation claims of "defamation" and "violation of privacy" from the students. While the approach of providing bare minimum information may seem to mitigate…

  18. Dental ethics and emotional intelligence.

    PubMed

    Rosenblum, Alvin B; Wolf, Steve

    2014-01-01

    Dental ethics is often taught, viewed, and conducted as an intellectual enterprise, uninformed by other noncognitive factors. Emotional intelligence (EQ) is defined and distinguished from the cognitive intelligence measured by the Intelligence Quotient (IQ). This essay recommends more inclusion of emotional, noncognitive input in the ethical decision process in dental education and dental practice.

  19. Communicating rules in recreation areas

    Treesearch

    Terence L. Ross; George H. Moeller

    1974-01-01

    Five hundred fifty-eight campers were surveyed on the Allegheny National Forest to determine their knowledge of rules governing recreation behavior. Most of them were uninformed about the rules. Results of the study suggest that previous camping experience, age, camping style, and residence significantly affect knowledge of rules. Campers who received rule brochures or...

  20. Orienting to Eye Gaze and Face Processing

    ERIC Educational Resources Information Center

    Tipples, Jason

    2005-01-01

    The author conducted 7 experiments to examine possible interactions between orienting to eye gaze and specific forms of face processing. Participants classified a letter following either an upright or inverted face with averted, uninformative eye gaze. Eye gaze orienting effects were recorded for upright and inverted faces, irrespective of whether…

  1. Sleeping Beauty in the Classroom: What Do Teachers Know about Narcolepsy?

    ERIC Educational Resources Information Center

    Cosgrove, Maryellen S.

    2002-01-01

    Investigated teachers' awareness of narcolepsy and the accuracy of their knowledge. Found educators uninformed about narcolepsy and how to accommodate narcoleptic students. Recommended that teachers be aware of narcolepsy's symptoms, plan variety and movement into students' lessons, and allow students to redo assignments if a sleep attack…

  2. A Search for Transits of Proxima b in MOST Photometry

    NASA Astrophysics Data System (ADS)

    Kipping, David M.

    2017-01-01

    The recent discovery of a potentially rocky planet in the habitable zone of our nearest star presents exciting prospects for future detailed characterization of another world. If Proxima b transits its star, the road to characterization would be considerably eased. In 2014 and 2015, we monitored Proxima Centauri with the Canadian space telescope MOST for a total of 43 days. As expected, the star presents considerable photometric variability due to flares, which greatly complicate our analysis. Using Gaussian process regression and Bayesian model selection with informative priors for the time of transit of Proxima b, we do find evidence for a transit of the expected depth. However, relaxing the prior on the transit time to an uninformative one returns a distinct solution highlighting the high false-positive rate induced by flaring. Using ground-based photometry from HATSouth, we show that our candidate transit is unlikely to be genuine although a conclusive answer will likely require infrared photometry, such as that from Spitzer, where flaring should be suppressed.

  3. Avoiding overstating the strength of forensic evidence: Shrunk likelihood ratios/Bayes factors.

    PubMed

    Morrison, Geoffrey Stewart; Poh, Norman

    2018-05-01

    When strength of forensic evidence is quantified using sample data and statistical models, a concern may be raised as to whether the output of a model overestimates the strength of evidence. This is particularly the case when the amount of sample data is small, and hence sampling variability is high. This concern is related to concern about precision. This paper describes, explores, and tests three procedures which shrink the value of the likelihood ratio or Bayes factor toward the neutral value of one. The procedures are: (1) a Bayesian procedure with uninformative priors, (2) use of empirical lower and upper bounds (ELUB), and (3) a novel form of regularized logistic regression. As a benchmark, they are compared with linear discriminant analysis, and in some instances with non-regularized logistic regression. The behaviours of the procedures are explored using Monte Carlo simulated data, and tested on real data from comparisons of voice recordings, face images, and glass fragments. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
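The general idea of shrinking a likelihood ratio toward the neutral value of one when calibration data are scarce can be illustrated with a minimal sketch. This is a generic shrinkage of the log-LR, not any of the three procedures tested in the paper; all parameters and distributions are made up.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_lr(x, mu_same, mu_diff, sigma=1.0):
    # Log likelihood ratio of same-source vs different-source Gaussian models
    ll_same = -0.5 * ((x - mu_same) / sigma) ** 2
    ll_diff = -0.5 * ((x - mu_diff) / sigma) ** 2
    return ll_same - ll_diff

# Tiny calibration sample -> high sampling variability in the estimated means
n = 5
mu_same = rng.normal(0.0, 1.0, n).mean()
mu_diff = 2.0 + rng.normal(0.0, 1.0, n).mean()

x = 0.2                      # the questioned measurement
raw = log_lr(x, mu_same, mu_diff)
# Shrink the log-LR toward 0 (the LR toward 1) in proportion to sample size,
# so a sparsely calibrated model reports a more tempered strength of evidence
lam = 10.0
shrunk = raw * n / (n + lam)
print(round(raw, 3), round(shrunk, 3))
```

The shrunk value never overstates the raw one: it keeps the direction of the evidence but moves its magnitude toward neutrality, more strongly the smaller the calibration sample.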

  4. Do Illustrations Enhance Preschoolers' Memories for Stories? Age-Related Change in the Picture Facilitation Effect

    ERIC Educational Resources Information Center

    Greenhoot, Andrea Follmer; Semb, Patricia A.

    2008-01-01

    This study investigated whether illustrations facilitate story recall in preschoolers (N = 58) 46 to 63 months of age. Each child was exposed to either a verbal story narrative with illustrations (Verbal and Picture condition), the narrative alone (Verbal Only condition), the narrative with uninformative illustrations (Verbal and Irrelevant…

  5. Adolescents' Knowledge of Nuclear Issues and the Effects of Nuclear War.

    ERIC Educational Resources Information Center

    Roscoe, Bruce; Goodwin, Megan P.

    1987-01-01

    Surveyed 357 college students to assess awareness of the status of nuclear arms development and possible effects of nuclear war on people and environment. Results suggest that older adolescents are extremely uninformed regarding the current status of nuclear issues and consequences of nuclear war. Indicates a strong need to educate young people…

  6. DNA as information.

    PubMed

    Wills, Peter R

    2016-03-13

    This article reviews contributions to this theme issue covering the topic 'DNA as information' in relation to the structure of DNA, the measure of its information content, the role and meaning of information in biology and the origin of genetic coding as a transition from uninformed to meaningful computational processes in physical systems. © 2016 The Author(s).

  7. Report on Lyme disease Prepared for U.S. Army Corps of Engineers Field Personnel

    DTIC Science & Technology

    1992-01-01

    uninformed citizens. Ticks serve as vectors for a number of important human diseases. Rocky Mountain spotted fever (RMSF) is primarily a disease of...H. 1989. "Selected tickborne infections - A review of Lyme disease, Rocky Mountain spotted fever, and Babesiosis." N.Y. State J. Med. 26 References

  8. Do Grades Tell Parents What They Want and Need to Know?

    ERIC Educational Resources Information Center

    Webber, Jim; Wilson, Maja

    2012-01-01

    Teachers' objections to an emphasis on narrative, descriptive evaluation and a de-emphasis on grades cannot rest on uninformed claims about what parents want. Decades of research show that grades don't lead to deeper understandings, increased intellectual risk-taking, or better performance on complex tasks. Similarly, conversations based around…

  9. The Impact of Information on Death Penalty Support, Revisited

    ERIC Educational Resources Information Center

    Lambert, Eric G.; Camp, Scott D.; Clarke, Alan; Jiang, Shanhe

    2011-01-01

    In 1972, former Supreme Court Justice Marshall postulated that the public was uninformed about the death penalty and information would change their support for it. There is some indication that information about the death penalty may change people's level of support. This study re-examines data used by Lambert and Clarke (2001). Using multivariate…

  10. Making Informed Decisions: Management Issues Influencing Computers in the Classroom.

    ERIC Educational Resources Information Center

    Strickland, James

    A number of noninstructional factors appear to determine the extent to which computers make a difference in writing instruction. Once computers have been purchased and installed, it is generally school administrators who make management decisions, often from an uninformed pedagogical orientation. Issues such as what hardware and software to buy,…

  11. Concurrent Schedules of Positive and Negative Reinforcement: Differential-Impact and Differential-Outcomes Hypotheses

    ERIC Educational Resources Information Center

    Magoon, Michael A.; Critchfield, Thomas S.

    2008-01-01

    Considerable evidence from outside of operant psychology suggests that aversive events exert greater influence over behavior than equal-sized positive-reinforcement events. Operant theory is largely moot on this point, and most operant research is uninformative because of a scaling problem that prevents aversive events and those based on positive…

  12. Legal Literacy: A Necessity for Florida Preservice/Inservice Teachers.

    ERIC Educational Resources Information Center

    Ringenberger, Barbara K.; Funk, Fanchon F.

    The increasing impact of judicial and legislative intervention at all levels of school operations makes it necessary for educators to be more aware of their legal rights and liabilities. Few inservice and preservice teacher education programs in Florida provide legal education for teachers, and a study showed that teachers are largely uninformed of…

  13. Massing Support for a Levy without Mass Media

    ERIC Educational Resources Information Center

    Whitmoyer, Ron

    2005-01-01

    The classic campaign strategy in most school communities involves using the mass media to attract widespread attention to the upcoming budget or tax levy vote. Such strategies tend to bring uninformed voters in unknown quantities to the polls. The authors' recent experience, working with a committed and well-organized campaign chairperson, helped…

  14. Age-related differences in orienting attention to sound object representations.

    PubMed

    Alain, Claude; Cusimano, Madeline; Garami, Linda; Backer, Kristina C; Habelt, Bettina; Chan, Vanessa; Hasher, Lynn

    2018-06-01

    We examined the effect of age on listeners' ability to orient attention to an item in auditory short-term memory (ASTM) using high-density electroencephalography, while participants completed a delayed match-to-sample task. During the retention interval, an uninformative or an informative visual retro-cue guided attention to an item in ASTM. Informative cues speeded response times, but only for young adults. In young adults, informative retro-cues generated greater event-related potential amplitude between 450 and 650 ms at parietal sites, and an increased sustained potential over the left central scalp region, thought to index the deployment of attention and maintenance of the cued item in ASTM, respectively. Both modulations were reduced in older adults. Alpha and low beta oscillatory power suppression was greater when the retro-cue was informative than uninformative, especially in young adults. Our results point toward an age-related decline in orienting attention to the cued item in ASTM. Older adults may be dividing their attention between all items in working memory rather than selectively focusing attention on a single cued item. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. [Health-care reforms and insurees' preferences: a cluster analysis with data from the representative KBV survey 2010].

    PubMed

    Schnitzer, S; Grittner, U; Balke, K; Kuhlmey, A

    2013-12-01

    This study examines insurees' knowledge of and attitudes towards a number of recent reforms of the German healthcare system (electronic health insurance card, reimbursement tariff, etc.). It further examines whether it is possible to identify groups of respondents with similar patterns of preferences and knowledge. The analysis draws on a representative survey conducted by the German National Association of Statutory Health Insurance Physicians (Kassenärztliche Bundesvereinigung, KBV, 2010), in which 6,065 respondents aged between 18 and 79 years were interviewed. Four groups of respondents can be distinguished: the "quality oriented," the "uninformed," the "internet savvy," and the "informed." The most striking finding is that members of the "uninformed" group, who knew the least about recent reforms, tended to be employed middle-aged men with a high level of formal education. To foster the successful implementation of reforms in the health-care system and their communication to specific target groups, it is recommended to take into account not only specific social determinants, but the full personal circumstances and situation of insurees. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Information spreading on mobile communication networks: A new model that incorporates human behaviors

    NASA Astrophysics Data System (ADS)

    Ren, Fei; Li, Sai-Ping; Liu, Chuang

    2017-03-01

    Recently, there has been growing interest in modeling and simulation based on real social networks among researchers across multiple disciplines. Using an empirical social network constructed from the calling records of a Chinese mobile service provider, we here propose a new model to simulate the information spreading process. This model takes into account two important ingredients that exist in real human behaviors: information prevalence and preferential spreading. The fraction of informed nodes when the system reaches an asymptotically stable state is primarily determined by information prevalence, and the heterogeneity of link weights would slow down the information diffusion. Moreover, the sizes of blind clusters which consist of connected uninformed nodes show a power-law distribution, and these uninformed nodes correspond to a particular portion of nodes which are located at special positions in the network, namely at the edges of large clusters or inside the clusters connected through weak links. Since the simulations are performed on a real world network, the results should be useful in the understanding of the influences of social network structures and human behaviors on information propagation.
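A toy sketch in the spirit of the model described above (not the authors' model; the network, weights, and prevalence parameter here are invented) shows how an information-prevalence parameter and weak links shape the split between informed and blind (uninformed) nodes:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
# Toy weighted network: a ring lattice with a few weak long-range links
adj = {i: {(i - 1) % n: 1.0, (i + 1) % n: 1.0} for i in range(n)}
for _ in range(40):
    a, b = (int(v) for v in rng.integers(0, n, 2))
    if a != b:
        adj[a][b] = adj[b][a] = 0.2   # weak link: lower transmission weight

prevalence = 0.5                      # information prevalence parameter
informed = {0}                        # node 0 starts with the information
frontier = [0]
while frontier:
    nxt = []
    for u in frontier:
        for v, w in adj[u].items():
            # Preferential spreading: pass-on probability scales with weight
            if v not in informed and rng.random() < prevalence * w:
                informed.add(v)
                nxt.append(v)
    frontier = nxt

print(len(informed), n - len(informed))   # informed vs blind (uninformed) nodes
```

Raising `prevalence` increases the asymptotic informed fraction, while the weak 0.2-weight links act as bottlenecks behind which blind clusters of connected uninformed nodes persist.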

  17. Use of Speaker’s Gaze and Syntax in Verb Learning

    PubMed Central

    Nappa, Rebecca; Wessel, Allison; McEldoon, Katherine L.; Gleitman, Lila R.; Trueswell, John C.

    2013-01-01

    Speaker eye gaze and gesture are known to help child and adult listeners establish communicative alignment and learn object labels. Here we consider how learners use these cues, along with linguistic information, to acquire abstract relational verbs. Test items were perspective verb pairs (e.g., chase/flee, win/lose), which pose a special problem for observational accounts of word learning because their situational contexts overlap very closely; the learner must infer the speaker’s chosen perspective on the event. Two cues to the speaker’s perspective on a depicted event were compared and combined: (a) the speaker’s eye gaze to an event participant (e.g., looking at the Chaser vs. looking at the Flee-er) and (b) the speaker’s linguistic choice of which event participant occupies Subject position in his utterance. Participants (3-, 4-, and 5-year-olds) were eye-tracked as they watched a series of videos of a man describing drawings of perspective events (e.g., a rabbit chasing an elephant). The speaker looked at one of the two characters and then uttered either an utterance that was referentially uninformative (He’s mooping him) or informative (The rabbit’s mooping the elephant/The elephant’s mooping the rabbit) because of the syntactic positioning of the nouns. Eye-tracking results showed that all participants regardless of age followed the speaker’s gaze in both uninformative and informative contexts. However, verb-meaning choices were responsive to speaker’s gaze direction only in the linguistically uninformative condition. In the presence of a linguistically informative context, effects of speaker gaze on meaning were minimal for the youngest children to nonexistent for the older populations. Thus children, like adults, can use multiple cues to inform verb-meaning choice but rapidly learn that the syntactic positioning of referring expressions is an especially informative source of evidence for these decisions. PMID:24465183

  18. Informed Choice in the German Mammography Screening Program by Education and Migrant Status: Survey among First-Time Invitees.

    PubMed

    Berens, Eva-Maria; Reder, Maren; Razum, Oliver; Kolip, Petra; Spallek, Jacob

    2015-01-01

    Breast cancer is the most prevalent cancer among women and mammography screening programs are seen as a key strategy to reduce breast cancer mortality. In Germany, women are invited to the population-based mammography screening program between ages 50 to 69. It is still discussed whether the benefits of mammography screening outweigh its harms. Therefore, the concept of informed choice comprising knowledge, attitude and intention has gained importance. The objective of this observational study was to assess the proportion of informed choices among women invited to the German mammography screening program for the first time. A representative sample of 17,349 women aged 50 years from a sub-region of North Rhine Westphalia was invited to participate in a postal survey. Turkish immigrant women were oversampled. The effects of education level and migration status on informed choice and its components were assessed. 5,847 (33.7%) women responded to the postal questionnaire of which 4,113 were used for analyses. 31.5% of the women had sufficient knowledge. The proportion of sufficient knowledge was lower among immigrants and among women with low education levels. The proportion of women making informed choices was low (27.1%), with similar associations with education level and migration status. Women of low (OR 2.75; 95% CI 2.18-3.46) and medium education level (OR 1.49; 95% CI 1.27-1.75) were more likely to make an uninformed choice than women of high education level. Turkish immigrant women had the greatest odds for making an uninformed choice (OR 5.30, 95% CI 1.92-14.66) compared to non-immigrant women. Other immigrant women only had slightly greater odds for making an uninformed choice than non-immigrant women. As immigrant populations and women with low education level have been shown to have poor knowledge, they need special attention in measures to increase knowledge and thus informed choices.

  19. Treatment selection of early stage non-small cell lung cancer: the role of the patient in clinical decision making.

    PubMed

    Mokhles, S; Nuyttens, J J M E; de Mol, M; Aerts, J G J V; Maat, A P W M; Birim, Ö; Bogers, A J J C; Takkenberg, J J M

    2018-01-15

    The objective of this study is to investigate the role and experience of early stage non-small cell lung cancer (NSCLC) patients in the decision-making process concerning treatment selection in current clinical practice. Stage I-II NSCLC patients (surgery 55 patients, SBRT 29 patients, median age 68) were included in this prospective study and completed a questionnaire that explored: (1) perceived patient knowledge of the advantages and disadvantages of the treatment options, (2) experience with current clinical decision making, and (3) the information that the patient reported to have received from their treating physician. This was assessed by multiple-choice, 1-5 Likert Scale, and open questions. The Decisional Conflict Scale was used to assess decisional conflict. Health related quality of life (HRQoL) was measured with the SF-36 questionnaire. In 19% of patients, there was self-reported perceived lack of knowledge about the advantages and disadvantages of the treatment options. Seventy-four percent of patients felt that they were sufficiently involved in decision-making by their physician, and 81% found it important to be involved in decision making. Forty percent experienced decisional conflict, and one-in-five patients to such an extent that it made them feel unsure about the decision. Subscores with regard to feeling uninformed and uncertainty contributed the most to decisional conflict, as 36% felt uninformed and 17% of patients were not satisfied with their decision. HRQoL was not influenced by patient experience with decision-making or patient preferences for shared decision making. Dutch early-stage NSCLC patients find it important to be involved in treatment decision making. Yet a substantial proportion experiences decisional conflict and feels uninformed. Better patient information and/or involvement in treatment decision making is needed in order to improve patient knowledge and hopefully reduce decisional conflict.

  20. Outcomes assessment of science & engineering doctor of philosophy (Ph.D.) programs: An exploratory study of prospective influencers in distinguished graduate placement

    NASA Astrophysics Data System (ADS)

    Williamson, Louise M.

    This exploratory study was an investigation of the mission and emphases of twenty-two science & engineering doctor of philosophy (Ph.D.) programs in ten fields of study at nine public research universities in the United States and the corresponding influence those factors impose on placement of Ph.D. graduates of those programs into academic program settings. Ph.D. program chairs participated via protocol to provide descriptive, statistical, and experiential details of their Ph.D. programs and offered insight on current conditions for academic placement opportunities. The quantitative analysis served as the basis for examination of influencers in graduate placement for those Ph.D. programs that are informed about the placement activity of their graduates. Among the nine tested hypotheses there were no statistically significant findings. The qualitative expressions of this study (those found in the confounding variables, the limitations of the study, the questions that elicited opinions and further discussion, and follow-up queries with program chairs) added most meaningfully to the study, however, in that they served as a gauge of the implications of neglect for those Ph.D. programs that remain uninformed about their graduate placement activity. Central to the findings of this study was that one compelling fact remains the same. Denecke, Director of Best Practice at the Council of Graduate Schools, pointed out years ago that just as "we know very little about why those who finish and why those who leave do so, we also know surprisingly little about where students go after their degrees...we therefore have little information about how effective doctoral programs are in preparing doctorates for short- and long-term career success." The fact remains that the effectiveness of doctoral programs in the context of career success is just as uncertain today.
A serious admonition is that one-half of those programs that participated in this study remain uninformed about the placement activity of their graduates and therefore a more complete understanding of the underlying tenets of effectiveness in those doctoral programs remains elusive.

  1. THE INFLUENCE OF VARIABLE ELIMINATION RATE AND BODY FAT MASS IN A PBPK MODEL FOR TCDD IN PREDICTING THE SERUM TCDD CONCENTRATIONS FROM VETERANS OF OPERATION RANCH HAND

    EPA Science Inventory

    The Influence of Variable Elimination Rate and Body Fat Mass in a PBPK Model for TCDD in Predicting the Serum TCDD Concentrations from Veterans of Operation Ranch Hand.
    C Emond1,2, LS Birnbaum2, JE Michalek3, MJ DeVito2
    1 National Research Council, National Academy of Scien...

  2. The Importance of Source: A Mixed Methods Analysis of Undergraduate Students' Attitudes toward Genetically Modified Food

    ERIC Educational Resources Information Center

    Ruth, Taylor K.; Rumble, Joy N.; Gay, Keegan D.; Rodriguez, Mary T.

    2016-01-01

    Even though science says genetically modified (GM) foods are safe, many consumers remain skeptical of the technology. Additionally, the scientific community has trouble communicating to the public, causing consumers to make uninformed decisions. The Millennial Generation will have more buying power than any other generation before them, and more…

  3. Communicating the Benefits of a Full Sequence of High School Science Courses

    ERIC Educational Resources Information Center

    Nicholas, Catherine Marie

    2014-01-01

    High school students are generally uninformed about the benefits of enrolling in a full sequence of science courses, therefore only about a third of our nation's high school graduates have completed the science sequence of Biology, Chemistry and Physics. The lack of students completing a full sequence of science courses contributes to the deficit…

  4. Age 60 Study. Part 1. Bibliographic Database

    DTIC Science & Technology

    1994-10-01

    seven of these aircraft types participated in a spectacle design study. Experimental spectacles were designed for each pilot and evaluated for...observation flight administered by observers who were uninformed of the details of the experimental design. Students and instructors also completed a critique...intraindividual lability in field-dependence-field independence, and (4) various measurement, sampling, and experimental design concerns associated

  5. In-utero diagnosis of Norrie disease by ultrasonography.

    PubMed

    Redmond, R M; Vaughan, J I; Jay, M; Jay, B

    1993-03-01

    Obstetric ultrasonography of an obligate Norrie disease carrier revealed bilateral retinal detachments in a third trimester male fetus. Postnatal examination confirmed the diagnosis of Norrie disease. DNA linkage analysis with the markers L1.28 and MAO had been uninformative for this family. This report suggests that retinal detachment occurs late in the gestation of the affected fetus.

  6. "(Un)informed College and Major Choice": Verification in an Alternate Setting. CEDR Working Paper. WP #2015-11

    ERIC Educational Resources Information Center

    Huntington-Klein, Nick

    2015-01-01

    The decision to pursue formal education has significant labor market implications. To approach the decision rationally, a student must consider the costs and benefits of each available option. However, mounting empirical evidence suggests that reported expectations of costs and benefits are uncertain and vary across students. Hastings et al.…

  7. The Effect of Concrete Supplements on Metacognitive Regulation during Learning and Open-Book Test Taking

    ERIC Educational Resources Information Center

    Ackerman, Rakefet; Leiser, David

    2014-01-01

    Background: Previous studies have suggested that when reading texts, lower achievers are more sensitive than their stronger counterparts to surface-level cues, such as graphic illustrations, and that even when uninformative, such concrete supplements tend to raise the text's subjective comprehensibility. Aims: We examined how being led astray…

  8. Rethinking the Roles of Mentor and Mentee in the Context of Student Suicide

    ERIC Educational Resources Information Center

    Rishel, Teresa J.

    2006-01-01

    Student suicides and attempted suicides are on the rise globally, and the numbers continue to climb steeply in most Westernized countries. School personnel remain untrained and uninformed in the area of suicide; few teachers or administrators can accurately identify the characteristics of suicidal behavior and fewer still have any knowledge of…

  9. Uninformed in the Information Age: Why Media Necessitate Critical Thinking Education

    ERIC Educational Resources Information Center

    McBrien, J. Lynn

    2005-01-01

    Much has been written about the power of media to influence the public through instruments of advertising and a variety of venues (Considine & Haley, 1999; Cortes, 2000; Kilbourne, 1999). In this chapter, the author has chosen to use a singular media focus--news--to support the critical importance of teaching media literacy skills to students. She…

  10. Knowledge of the Health Care Law as an Issue in Teacher Education

    ERIC Educational Resources Information Center

    Wright, Jennifer M.; Henze, Erin E. C.; Coles, Jeremy T.; Miller, Nicole A.; Williams, Robert L.

    2016-01-01

    Of all the collegiate majors that affect society, none is more critical than teacher education. If teacher education students are uninformed or misinformed about issues central to society, they are likely to be inept in responding to queries and opinions voiced by their future students regarding such issues. The current study investigated one such…

  11. When Is the Local Average Treatment Close to the Average? Evidence from Fertility and Labor Supply

    ERIC Educational Resources Information Center

    Ebenstein, Avraham

    2009-01-01

    The local average treatment effect (LATE) may differ from the average treatment effect (ATE) when those influenced by the instrument are not representative of the overall population. Heterogeneity in treatment effects may imply that parameter estimates from 2SLS are uninformative regarding the average treatment effect, motivating a search for…
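
    For readers unfamiliar with the estimand: under the standard instrumental-variable assumptions, the LATE identified by 2SLS with a binary instrument Z and binary treatment D is the Wald ratio (textbook notation, not taken from this abstract):

```latex
\mathrm{LATE}
\;=\; \frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]}
\;=\; E[\,Y_1 - Y_0 \mid \text{complier}\,],
```

    which coincides with the ATE only when treatment effects are homogeneous or when compliers are representative of the overall population, exactly the condition the abstract questions.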

  12. The Composing Processes of Three College Freshmen: Focus on Revision.

    ERIC Educational Resources Information Center

    Peitzman, Faye

    As part of a larger research study, three freshman writers were observed in order to determine when they rewrote compositions, when they rethought major concepts and designs, and when they reviewed stylistic options and made purposive as opposed to uninformed changes in their writing. Their tape recorded comments were guided by questions on (1)…

  13. Where Cognitive Development and Aging Meet: Face Learning Ability Peaks after Age 30

    ERIC Educational Resources Information Center

    Germine, Laura T.; Duchaine, Bradley; Nakayama, Ken

    2011-01-01

    Research on age-related cognitive change traditionally focuses on either development or aging, where development ends with adulthood and aging begins around 55 years. This approach ignores age-related changes during the 35 years in-between, implying that this period is uninformative. Here we investigated face recognition as an ability that may…

  14. One Earth: Why Care? Red Cross Youth International Development Resource Package.

    ERIC Educational Resources Information Center

    Ormston, Randy, Ed.

    To examine the cultural characteristics of a society without exploring the human condition of that society and how it relates to all as citizens on this planet is to ignore the realities of today. Most Canadians see global problems as massive and overwhelming. Some are uninformed and others are misinformed. As a result, gross misconceptions have…

  15. Edge Pushing is Equivalent to Vertex Elimination for Computing Hessians

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Mu; Pothen, Alex; Hovland, Paul

    We prove the equivalence of two different Hessian evaluation algorithms in AD. The first is the Edge Pushing algorithm of Gower and Mello, which may be viewed as a second-order Reverse mode algorithm for computing the Hessian. In earlier work, we have derived the Edge Pushing algorithm by exploiting a Reverse mode invariant based on the concept of live variables in compiler theory. The second algorithm is based on eliminating vertices in a computational graph of the gradient, in which intermediate variables are successively eliminated from the graph, and the weights of the edges are updated suitably. We prove that if the vertices are eliminated in a reverse topological order while preserving symmetry in the computational graph of the gradient, then the Vertex Elimination algorithm and the Edge Pushing algorithm perform identical computations. In this sense, the two algorithms are equivalent. This insight that unifies two seemingly disparate approaches to Hessian computations could lead to improved algorithms and implementations for computing Hessians.
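
    As a first-order illustration of the vertex elimination rule the abstract builds on (the Hessian case adds symmetry-preserving updates, which this sketch omits), eliminating an intermediate vertex v replaces each path u→v→w by a direct edge whose weight is the product of the local partials, accumulated into any existing u→w edge. A minimal sketch, with our own function and variable names:

```python
import math

def eliminate_vertex(edges, v):
    """Vertex elimination on a linearized computational graph.

    edges: dict mapping (src, dst) -> local partial derivative d(dst)/d(src).
    Eliminating v rewires every u -> v -> w pair into u -> w with weight
    edges[u, v] * edges[v, w] (chain rule), summed into any existing edge,
    then removes all edges incident to v.
    """
    preds = [(u, w_uv) for (u, t), w_uv in edges.items() if t == v]
    succs = [(w, w_vw) for (s, w), w_vw in edges.items() if s == v]
    for u, w_uv in preds:
        for w, w_vw in succs:
            edges[(u, w)] = edges.get((u, w), 0.0) + w_uv * w_vw
    for key in [k for k in edges if v in k]:
        del edges[key]
    return edges

# f(x) = sin(x**2): graph x -> v = x**2 -> y = sin(v), linearized at x = 1.3.
x = 1.3
edges = {("x", "v"): 2.0 * x,            # d(x**2)/dx
         ("v", "y"): math.cos(x * x)}    # d(sin v)/dv
eliminate_vertex(edges, "v")
# The surviving edge ("x", "y") carries dy/dx = 2*x*cos(x**2).
```

    The Hessian variant in the paper works on the computational graph of the gradient and keeps the updates symmetric; the accumulation rule above is the same chain-rule step performed in reverse topological order.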

  16. Bone Sialoproteins and Breast Cancer Detection

    DTIC Science & Technology

    2004-07-01

    used to follow proteolytic activity on more natural macromolecular substrates. These substrates are so highly substituted with fluorescein moieties that...uninformative for breast cancer, but does correlate with bone mineral density, parathyroid hormone and phosphorus. (Summary of Appendix II). Normal MEPE...calcium, phosphorus, vitamin D, as well as novel phosphatonin(s), and the bone and kidney organs. Candidate phosphaturic factors include MEPE; PHEX, a

  17. 2002 Industry Studies: News Media

    DTIC Science & Technology

    2002-01-01

    News Media responsibility introductory critique: Mustering the moxie to master the media mess: some introductory comments in the quest for media...accountable for their actions.2 Bad news reporting, on the other hand, can leave the people uninformed by failing to report important news, or by...the most alarming weaknesses of the news media have been systemic, and they have seriously underestimated or ignored America’s

  18. "Under God" and the Pledge of Allegiance: Examining a 1954 Sermon and Its Meaning

    ERIC Educational Resources Information Center

    Groce, Eric C.; Heafner, Tina; Bellows, Elizabeth

    2013-01-01

    A lesson exploring the Pledge of Allegiance, its history, and the addition of the phrase "under God," can serve as a jumping off point into major themes of U.S. history and First Amendment freedoms. Although the Pledge is ubiquitous in contemporary America, educators and students are often uninformed about the history and meaning of the…

  19. The Multiple Dimensions of Child Abuse and Neglect: New Insights into an Old Problem. Child Trends Research Brief.

    ERIC Educational Resources Information Center

    Chalk, Rosemary; Gibbons, Alison; Scarupa, Harriet J.

    Despite persistent media headlines about extreme cases of child abuse and neglect, the public remains largely uninformed about the developmental status of children affected by this tragic problem. The research brief draws on available data and recent research to summarize what is known about these outcomes in several critical areas: physical and…

  20. The Democratic Dilemma: Can Citizens Learn What They Need To Know? Political Economy of Institutions and Decisions Series.

    ERIC Educational Resources Information Center

    Lupia, Arthur; McCubbins, Mathew D.

    Although this book concedes that people lack political information and that this ignorance can allow people "of sinister designs" to deceive and betray the uninformed, it does not concede that democracy must succumb to these threats. Rather, the book argues that limited information need not prevent people from making reasoned choices,…

  1. A Bone of Contention: Teacher Beliefs on the Pedagogical Value of English Idioms for Second Language Learners

    ERIC Educational Resources Information Center

    Ramonda, Kris

    2016-01-01

    Teacher beliefs are an important area of inquiry because research has found that these beliefs are often diverse (Breen et al., 2001) and strongly impact classroom practices (Borg, 1998, 2003; Burns, 1992; Farrell & Bennis, 2013). Therefore, uninformed teacher beliefs could be to the detriment of the L2 learner. Despite the fact that knowledge…

  2. Uninformed and disinformed society and the GMO market.

    PubMed

    Twardowski, Tomasz; Małyska, Aleksandra

    2015-01-01

    The EU has a complicated regulatory framework, and this is slowing down the approval process of new genetically modified (GM) crops. Currently, labeling of GM organisms (GMOs) is mandatory in all Member States. However, the USA, in which GMO labeling is not mandatory, continues to lead the production of biotech crops, biopharmaceuticals, biomaterials, and bioenergy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Leadership, consensus decision making and collective behaviour in humans

    PubMed Central

    Dyer, John R.G.; Johansson, Anders; Helbing, Dirk; Couzin, Iain D.; Krause, Jens

    2008-01-01

    This paper reviews the literature on leadership in vertebrate groups, including recent work on human groups, before presenting the results of three new experiments looking at leadership and decision making in small and large human groups. In experiment 1, we find that both group size and the presence of uninformed individuals can affect the speed with which small human groups (eight people) decide between two opposing directional preferences and the likelihood of the group splitting. In experiment 2, we show that the spatial positioning of informed individuals within small human groups (10 people) can affect the speed and accuracy of group motion. We find that having a mixture of leaders positioned in the centre and on the edge of a group increases the speed and accuracy with which the group reaches their target. In experiment 3, we use large human crowds (100 and 200 people) to demonstrate that the trends observed from earlier work using small human groups can be applied to larger crowds. We find that only a small minority of informed individuals is needed to guide a large uninformed group. These studies build upon important theoretical and empirical work on leadership and decision making in animal groups. PMID:19073481

  4. Objective Bayesian analysis of neutrino masses and hierarchy

    NASA Astrophysics Data System (ADS)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still impact noticeably the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data is hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
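
    For context on the construction the authors use: a reference prior in Bernardo's Objective Bayesian sense maximizes the mutual information between parameter and data, i.e. the expected information gain from prior to posterior. Schematically, in standard notation (the formal definition takes an asymptotic limit over repeated experiments):

```latex
\pi^{*}(\theta)
\;=\; \operatorname*{arg\,max}_{\pi}\; I(\Theta; X)
\;=\; \operatorname*{arg\,max}_{\pi} \iint p(x \mid \theta)\,\pi(\theta)\,
\log \frac{p(\theta \mid x)}{\pi(\theta)}\;\mathrm{d}x\,\mathrm{d}\theta .
```

    This is the sense in which the prior is "minimally informative": among all priors, it is the one the data move the most.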

  5. Role of Corticospinal Suppression during Motor Preparation

    PubMed Central

    Ivry, Richard B.

    2009-01-01

    Behavior arises from a constant competition between potential actions. For example, movements performed unimanually require selecting one hand rather than the other. Corticospinal (CS) excitability of the nonselected hand is typically decreased prior to movement initiation, suggesting that response selection may involve mechanisms that inhibit nonselected candidate movements. To examine this hypothesis, participants performed a reaction time task, responding with the left, right, or both index fingers. Transcranial magnetic stimulation was applied over the right primary motor cortex (M1) to induce motor-evoked potentials (MEPs) in a left hand muscle at various stages during response preparation. To vary the time of response selection, an imperative signal was preceded by a preparatory cue that was either informative or uninformative. Left MEPs decreased following the cue. Surprisingly, this decrease was greater when an informative cue indicated that the response might require the left hand than when it indicated a right hand response. In the uninformative condition, we did not observe additional attenuation of left MEP after an imperative indicating a right hand response. These results argue against the “deselection” hypothesis. Rather, CS suppression seems to arise from “impulse control” mechanisms that ensure that responses associated with potentially selected actions are not initiated prematurely. PMID:19126798

  6. The GCRP Climate Health Assessment: From Scientific Literature to Climate Health Literacy

    NASA Astrophysics Data System (ADS)

    Crimmins, A. R.; Balbus, J. M.

    2016-12-01

    As noted by the new report from the US GCRP, the Impacts of Climate Change on Human Health in the United States: A Scientific Assessment, climate change is a significant threat to the health of the American people. Despite a growing awareness of the significance of climate change in general among Americans, however, recognition of the health significance of climate change is lacking. Not only are the general public and many climate scientists relatively uninformed about the myriad health implications of climate change; health professionals, including physicians and nurses, are in need of enhanced climate literacy. This presentation will provide an overview of the new GCRP Climate Health Assessment, introducing the audience to the systems thinking that underlies the assessment of health impacts, and reviewing frameworks that tie climate and earth systems phenomena to human vulnerability and health. The impacts on health through changes in temperature, precipitation, severity of weather extremes and climate variability, and alteration of ecosystems and phenology will be explored. The process of developing the assessment report will be discussed in the context of raising climate and health literacy within the federal government.

  7. Does More Schooling Improve Health Outcomes and Health Related Behaviors? Evidence from U.K. Twins

    PubMed Central

    Amin, Vikesh; Behrman, Jere R.; Spector, Tim D.

    2013-01-01

    Several recent studies using instrumental variables based on changes in compulsory school-leaving age laws have estimated the causal effect of schooling on health outcomes and health-related behaviors in the U.K. Despite using the same identification strategy and similar datasets, no consensus has been reached. We contribute to the literature by providing results for the U.K. using a different research design and a different dataset. Specifically, we estimate the effect of schooling on health outcomes (obesity and physical health) and health-related behaviors (smoking, alcohol consumption and exercise) for women through within-MZ twins estimates using the TwinsUK database. For physical health, alcohol consumption and exercise, the within-MZ twins estimates are uninformative about whether there is a causal effect. However, we find (1) that the significant association between schooling and smoking status is due to unobserved endowments that are correlated with schooling and smoking (2) there is some indication that more schooling reduces the body mass index for women, even once these unobserved endowments have been controlled for. PMID:24415826
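
    The within-MZ design mentioned above can be written as a standard twin fixed-effects model (our notation, not the paper's): for twin i in pair j,

```latex
Y_{ij} = \beta S_{ij} + f_j + \varepsilon_{ij}
\qquad\Longrightarrow\qquad
Y_{1j} - Y_{2j} = \beta\,(S_{1j} - S_{2j}) + (\varepsilon_{1j} - \varepsilon_{2j}),
```

    so differencing within a pair of identical twins removes the shared endowment $f_j$ (genes and family background) that would otherwise confound the schooling coefficient $\beta$.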

  8. Dual-Mode Combustor

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J (Inventor); Dippold, Vance F (Inventor)

    2013-01-01

    A new dual-mode ramjet combustor used for operation over a wide flight Mach number range is described. Subsonic combustion mode is usable to lower flight Mach numbers than current dual-mode scramjets. High speed mode is characterized by supersonic combustion in a free-jet that traverses the subsonic combustion chamber to a variable nozzle throat. Although a variable combustor exit aperture is required, the need for fuel staging to accommodate the combustion process is eliminated. Local heating from shock-boundary-layer interactions on combustor walls is also eliminated.

  9. Impact of Threat Level, Task Instruction, and Individual Characteristics on Cold Pressor Pain and Fear among Children and Their Parents.

    PubMed

    Boerner, Katelynn E; Noel, Melanie; Birnie, Kathryn A; Caes, Line; Petter, Mark; Chambers, Christine T

    2016-07-01

    The cold pressor task (CPT) is increasingly used to induce experimental pain in children, but the specific methodology of the CPT is quite variable across pediatric studies. This study examined how subtle variations in CPT methodology (e.g., provision of low- or high-threat information regarding the task; provision or omission of maximum immersion time) may influence children's and parents' perceptions of the pain experience. Forty-eight children (8 to 14 years) and their parents were randomly assigned to receive information about the CPT that varied on 2 dimensions, prior to completing the task: (i) threat level: high-threat (task described as very painful, high pain expressions depicted) or low-threat (standard CPT instructions provided, low pain expressions depicted); (ii) ceiling: informed (provided maximum immersion time) or uninformed (information about maximum immersion time omitted). Parents and children in the high-threat condition expected greater child pain, and these children reported higher perceived threat of pain and state pain catastrophizing. For children in the low-threat condition, an informed ceiling was associated with less state pain catastrophizing during the CPT. Pain intensity, tolerance, and fear during the CPT did not differ by experimental group, but were predicted by child characteristics. Findings suggest that provision of threatening information may impact anticipatory outcomes, but experienced pain was better explained by individual child variables. © 2015 World Institute of Pain.

  10. Weight-elimination neural networks applied to coronary surgery mortality prediction.

    PubMed

    Ennett, Colleen M; Frize, Monique

    2003-06-01

    The objective was to assess the effectiveness of the weight-elimination cost function in improving classification performance of artificial neural networks (ANNs) and to observe how changing the a priori distribution of the training set affects network performance. Backpropagation feedforward ANNs with and without weight-elimination estimated mortality for coronary artery surgery patients. The ANNs were trained and tested on cases with 32 input variables describing the patient's medical history; the output variable was in-hospital mortality (mortality rates: training 3.7%, test 3.8%). Artificial training sets with mortality rates of 20%, 50%, and 80% were created to observe the impact of training with a higher-than-normal prevalence. When the results were averaged, weight-elimination networks achieved higher sensitivity rates than those without weight-elimination. Networks trained on higher-than-normal prevalence achieved higher sensitivity rates at the cost of lower specificity and correct classification. The weight-elimination cost function can improve the classification performance when the network is trained with a higher-than-normal prevalence. A network trained with a moderately high artificial mortality rate (artificial mortality rate of 20%) can improve the sensitivity of the model without significantly affecting other aspects of the model's performance. The ANN mortality model achieved comparable performance as additive and statistical models for coronary surgery mortality estimation in the literature.
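
    The weight-elimination cost function referenced above is usually attributed to Weigend, Rumelhart, and Huberman: a penalty λ Σ (w/w0)² / (1 + (w/w0)²) is added to the data error, so small weights incur a near-quadratic cost while each large weight saturates toward a fixed cost λ, which prunes rather than uniformly shrinks. A minimal sketch of the penalty and its gradient (our own illustration; the λ and w0 values are arbitrary):

```python
def weight_elimination_penalty(weights, lam=0.01, w0=1.0):
    """Weigend-style weight-elimination penalty and its gradient.

    penalty = lam * sum_i (w_i/w0)^2 / (1 + (w_i/w0)^2)
    Small |w_i| are penalized roughly quadratically; large |w_i|
    contribute at most lam each, unlike plain L2 weight decay.
    """
    penalty, grad = 0.0, []
    for w in weights:
        r2 = (w / w0) ** 2
        penalty += lam * r2 / (1.0 + r2)
        # d/dw of lam * r2/(1+r2)  =  lam * (2*w/w0^2) / (1+r2)^2
        grad.append(lam * 2.0 * w / (w0 ** 2) / (1.0 + r2) ** 2)
    return penalty, grad

# During training, total cost = data_error + penalty, and the penalty
# gradient is simply added to the backpropagated error gradient.
p_small, g_small = weight_elimination_penalty([0.1])    # near-quadratic regime
p_large, _ = weight_elimination_penalty([100.0])        # saturated near lam
```

    The saturation is what lets the network "eliminate" weights: once a weight is clearly useful (large), growing it further costs almost nothing extra.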

  11. The method for measuring the groove density of variable-line-space gratings with elimination of the eccentricity effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qingbo; Liu, Zhengkun, E-mail: zhkliu@ustc.edu.cn; Chen, Huoyao

    2015-02-15

    To eliminate the eccentricity effect, a new method for measuring the groove density of a variable-line-space grating was adapted. Based on the grating equation, groove density is calculated by measuring the internal angles between zeroth-order and first-order diffracted light for two different wavelengths with the same angle of incidence. The measurement system mainly includes two laser sources, a phase plate, plane mirror, and charge coupled device. The measurement results of a variable-line-space grating demonstrate that the experiment data agree well with theoretical values, and the value of measurement error (ΔN/N) is less than 2.72 × 10⁻⁴.
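
    A numerical sketch of the two-wavelength idea. This is our own reconstruction under one common sign convention for a reflection grating (first order: sin θ_d + sin θ_i = Nλ, with the internal angle between the first-order and the zeroth-order/specular beam equal to θ_d + θ_i); the paper's actual geometry may differ. Measuring two internal angles at the same (unknown) incidence gives two equations in the two unknowns N and θ_i:

```python
import math

def groove_density(alpha1, alpha2, lam1, lam2):
    """Recover groove density N (lines/mm) from the internal angles
    alpha1, alpha2 (radians) between the first-order and zeroth-order
    beams at wavelengths lam1, lam2 (mm), assuming
    sin(alpha - theta_i) + sin(theta_i) = N * lam for each wavelength.
    theta_i is solved for numerically, so it never needs to be measured
    directly (this is how angle offsets drop out of the result).
    """
    def f(theta_i):
        # Zero exactly when both wavelengths imply the same N.
        return (lam2 * (math.sin(alpha1 - theta_i) + math.sin(theta_i))
                - lam1 * (math.sin(alpha2 - theta_i) + math.sin(theta_i)))

    # Coarse scan for a sign change, then bisection (assumes one root
    # exists in the scanned physical range of incidence angles).
    lo = hi = None
    prev_t, prev_f = -1.4, f(-1.4)
    t = -1.4
    while t < 1.4:
        t += 0.01
        ft = f(t)
        if prev_f == 0 or prev_f * ft < 0:
            lo, hi = prev_t, t
            break
        prev_t, prev_f = t, ft
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    theta_i = 0.5 * (lo + hi)
    return (math.sin(alpha1 - theta_i) + math.sin(theta_i)) / lam1

# Forward-simulate a 1200 lines/mm grating at 10 deg incidence with
# He-Ne (632.8 nm) and 532 nm sources, then recover the density.
N_true, theta = 1200.0, math.radians(10.0)
alphas = []
for lam in (632.8e-6, 532e-6):  # wavelengths in mm
    theta_d = math.asin(N_true * lam - math.sin(theta))
    alphas.append(theta_d + theta)
N_est = groove_density(alphas[0], alphas[1], 632.8e-6, 532e-6)
```

    Because the incidence angle is solved for rather than measured, a constant angular offset (such as the eccentricity error the paper targets) does not bias the recovered density.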

  12. Process for Operating a Dual-Mode Combustor

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J. (Inventor); Dippold, Vance F. (Inventor)

    2017-01-01

    A new dual-mode ramjet combustor used for operation over a wide flight Mach number range is described. Subsonic combustion mode is usable to lower flight Mach numbers than current dual-mode scramjets. High speed mode is characterized by supersonic combustion in a free-jet that traverses the subsonic combustion chamber to a variable nozzle throat. Although a variable combustor exit aperture is required, the need for fuel staging to accommodate the combustion process is eliminated. Local heating from shock-boundary-layer interactions on combustor walls is also eliminated.

  13. A Variable-Selection Heuristic for K-Means Clustering.

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Cradit, J. Dennis

    2001-01-01

    Presents a variable selection heuristic for nonhierarchical (K-means) cluster analysis based on the adjusted Rand index for measuring cluster recovery. Subjected the heuristic to Monte Carlo testing across more than 2,200 datasets. Results indicate that the heuristic is extremely effective at eliminating masking variables. (SLD)
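
    The adjusted Rand index used as the heuristic's recovery criterion compares two partitions of the same items via pair counts, corrected for chance. A pure-Python sketch (the heuristic itself adds or drops candidate variables to maximize this score against a reference clustering; that search loop is omitted here):

```python
from collections import Counter
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Adjusted Rand index between two labelings of the same items.

    ARI = (sum_ij C(n_ij,2) - E) / (M - E), where
    E = [sum_i C(a_i,2) * sum_j C(b_j,2)] / C(n,2) and
    M = 0.5 * [sum_i C(a_i,2) + sum_j C(b_j,2)].
    1.0 means identical partitions; ~0 is chance-level agreement.
    """
    n = len(labels_a)
    contingency = Counter(zip(labels_a, labels_b))
    sum_ij = sum(comb(c, 2) for c in contingency.values())
    sum_a = sum(comb(c, 2) for c in Counter(labels_a).values())
    sum_b = sum(comb(c, 2) for c in Counter(labels_b).values())
    expected = sum_a * sum_b / comb(n, 2)
    max_index = 0.5 * (sum_a + sum_b)
    return (sum_ij - expected) / (max_index - expected)

# Identical partitions (up to label names) score exactly 1.0.
score = adjusted_rand_index([0, 0, 1, 1, 2, 2],
                            ["x", "x", "y", "y", "z", "z"])
```

    A masking variable, in this framing, is one whose inclusion drags the K-means solution away from the reference partition and thus lowers this score.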

  14. Towards a Cognitive Organisational Framework for Knowledge Management

    DTIC Science & Technology

    2001-09-01

    of knowledge now exist only at the bottom of the organisation, with management uninformed on specific detail. For many knowledge-based...the organisation’s internal context - its internal management practices, learning culture and knowledge base. This has particularly been found to...of the process is based in ’Kaizen’. Within the Western culture we may well have ignored important insights in our own tradition. Traditionally, in

  15. BMI z-scores are a poor indicator of adiposity among 2- to 19-year-olds with very high BMIs, NHANES 1999-2000 to 2013-2014

    USDA-ARS?s Scientific Manuscript database

    Although the Centers for Disease Control and Prevention (CDC) growth charts are widely used, BMI-for-age z-scores (BMIz) are known to be uninformative above the 97th percentile. This study compared the relations of BMIz and other BMI metrics (%BMIp95, percent of 95th percentile, and BMI minus 95th ...

  16. Sex differences in the use of delayed semantic context when listening to disrupted speech.

    PubMed

    Liederman, Jacqueline; Fisher, Janet McGraw; Coty, Alexis; Matthews, Geetha; Frye, Richard E; Lincoln, Alexis; Alexander, Rebecca

    2013-02-01

    Female as opposed to male listeners were better able to use a delayed informative cue at the end of a long sentence to report an earlier word which was disrupted by noise. Informative (semantically related) or uninformative (semantically unrelated) word cues were presented 2, 6, or 10 words after a target word whose initial phoneme had been replaced with noise. A total of 84 young adults (45 males) listened to each sentence and then repeated it after its offset. The semantic benefit effect (SBE) was the difference in the accuracy of report of the disrupted target word during informative vs. uninformative sentences. Women had significantly higher SBEs than men even though there were no significant sex differences in terms of number of non-target words reported, the effect of distance between the disrupted target word and the informative cue, or kinds of errors generated. We suggest that the superior ability of women to use delayed semantic information to decode an earlier ambiguous speech signal may be linked to women's tendency to engage the hemispheres more bilaterally than men during word processing. Since the maintenance of semantic context under ambiguous conditions demands more right than left hemispheric resources, this may give women an advantage.

  17. Effects of preference heterogeneity among landowners on spatial conservation prioritization.

    PubMed

    Nielsen, Anne Sofie Elberg; Strange, Niels; Bruun, Hans Henrik; Jacobsen, Jette Bredahl

    2017-06-01

    The participation of private landowners in conservation is crucial to efficient biodiversity conservation. This is especially the case in settings where the share of private ownership is large and the economic costs associated with land acquisition are high. We used probit regression analysis and historical participation data to examine the likelihood of participation of Danish forest owners in a voluntary conservation program. We used the results to spatially predict the likelihood of participation of all forest owners in Denmark. We merged spatial data on the presence of forest, cadastral information on participation contracts, and individual-level socioeconomic information about the forest owners and their households. We included predicted participation in a probability model for species survival. Uninformed and informed (including landowner characteristics) models were then incorporated into a spatial prioritization for conservation of unmanaged forests. The choice models are based on sociodemographic data on the entire population of Danish forest owners and historical data on their participation in conservation schemes. Inclusion in the model of information on private landowners' willingness to supply land for conservation yielded at intermediate budget levels up to 30% more expected species coverage than the uninformed prioritization scheme. Our landowner-choice model provides an example of moving toward more implementable conservation planning. © 2016 Society for Conservation Biology.

  18. Unbiased feature selection in learning random forests for high-dimensional data.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This makes RFs have poor accuracy when working with high-dimensional data. Besides that, RFs have bias in the feature selection process where multivalued features are favored. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in increasing the accuracy and the AUC measures.
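
    The p-value screening step described above can be illustrated with a simple permutation test per feature: shuffle the class labels, and keep a feature only if its observed class-mean gap is rarely matched by chance. This is our own generic sketch, not the xRF implementation (which pairs the screening with statistical measures and weighted feature sampling when growing trees):

```python
import random

def permutation_p_value(values, labels, n_perm=199, rng=None):
    """One-sided permutation p-value for the absolute difference in
    class means of a single feature (binary labels 0/1)."""
    rng = rng or random.Random(0)

    def gap(lab):
        g0 = [v for v, l in zip(values, lab) if l == 0]
        g1 = [v for v, l in zip(values, lab) if l == 1]
        return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

    observed = gap(labels)
    lab = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(lab)  # break any feature-label association
        if gap(lab) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one to avoid p = 0

def screen_features(columns, labels, alpha=0.05):
    """Return indices of feature columns whose p-value is <= alpha."""
    return [j for j, col in enumerate(columns)
            if permutation_p_value(col, labels) <= alpha]

# Column 0 separates the classes cleanly; column 1 is pure noise.
labels = [0] * 10 + [1] * 10
informative = [0.0] * 10 + [10.0] * 10
noise = [random.Random(42).random() for _ in range(20)]
kept = screen_features([informative, noise], labels)
```

    Trees are then grown only over the retained (and, in xRF, debiased and weighted) feature subset, which is what restores accuracy on high-dimensional data.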

  19. Colonoscopy video quality assessment using hidden Markov random fields

    NASA Astrophysics Data System (ADS)

    Park, Sun Young; Sargent, Dusty; Spofford, Inbar; Vosburgh, Kirby

    2011-03-01

    With colonoscopy becoming a common procedure for individuals aged 50 or more who are at risk of developing colorectal cancer (CRC), colon video data is being accumulated at an ever increasing rate. However, the clinically valuable information contained in these videos is not being maximally exploited to improve patient care and accelerate the development of new screening methods. One of the well-known difficulties in colonoscopy video analysis is the abundance of frames with no diagnostic information. Approximately 40%-50% of the frames in a colonoscopy video are contaminated by noise, acquisition errors, glare, blur, and uneven illumination. Therefore, filtering out low quality frames containing no diagnostic information can significantly improve the efficiency of colonoscopy video analysis. To address this challenge, we present a quality assessment algorithm to detect and remove low quality, uninformative frames. The goal of our algorithm is to discard low quality frames while retaining all diagnostically relevant information. Our algorithm is based on a hidden Markov model (HMM) in combination with two measures of data quality to filter out uninformative frames. Furthermore, we present a two-level framework based on an embedded hidden Markov model (EHMM) to incorporate the proposed quality assessment algorithm into a complete, automated diagnostic image analysis system for colonoscopy video.
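
    A toy version of HMM-based frame filtering can be sketched as follows, assuming a single scalar quality score per frame and Gaussian emissions; the actual system combines two quality measures and an embedded two-level HMM, which this sketch does not reproduce.

```python
# Toy two-state HMM (uninformative vs. informative frames) decoded with
# Viterbi over a scalar per-frame quality score. All parameters are invented.
import numpy as np
from scipy.stats import norm

# Per-frame quality scores: low for blurred/glare frames, high for clean ones.
scores = np.array([0.1, 0.2, 0.15, 0.8, 0.9, 0.85, 0.2, 0.1, 0.95, 0.9])

log_pi = np.log([0.5, 0.5])                   # initial state probabilities
log_A = np.log([[0.8, 0.2], [0.2, 0.8]])      # sticky state transitions
means, sds = [0.15, 0.85], [0.1, 0.1]         # Gaussian emission models
log_B = np.stack([norm.logpdf(scores, means[s], sds[s]) for s in range(2)])

# Viterbi decoding of the most likely state sequence.
T = len(scores)
delta = np.zeros((2, T))
psi = np.zeros((2, T), dtype=int)
delta[:, 0] = log_pi + log_B[:, 0]
for t in range(1, T):
    trans = delta[:, t - 1][:, None] + log_A  # trans[i, j]: prev i -> current j
    psi[:, t] = trans.argmax(axis=0)
    delta[:, t] = trans.max(axis=0) + log_B[:, t]
path = np.zeros(T, dtype=int)
path[-1] = delta[:, -1].argmax()
for t in range(T - 2, -1, -1):
    path[t] = psi[path[t + 1], t + 1]

keep = [t for t in range(T) if path[t] == 1]  # retain informative frames only
```

    The sticky transition matrix smooths the decision over neighbouring frames, so isolated score fluctuations do not flip the informative/uninformative label.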

  20. Do prostate cancer patients want to choose their own radiation treatment?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tol-Geerdink, Julia J. van; Stalmeier, Peep F.M.; Department of Medical Technology Assessment, Radboud University Nijmegen Medical Center, Nijmegen

    2006-11-15

    Purpose: The aims of this study were to investigate whether prostate cancer patients want to be involved in the choice of radiation dose, and which patients want to be involved. Methods and Materials: This prospective study involved 150 patients with localized prostate cancer treated with three-dimensional conformal radiotherapy. A decision aid was used to explain the effects of two alternative radiation doses (70 and 74 Gy) in terms of cure and side effects. Patients were then asked whether they wanted to choose their treatment (accept choice), or leave the decision to the physician (decline choice). The treatment preference was carried out. Results: Even in this older population (mean age, 70 years), most patients (79%) accepted the option to choose. A lower score on the designations Pre-existent bowel morbidity, Anxiety, Depression, Hopelessness and a higher score on Autonomy and Numeracy were associated with an increase in choice acceptance, of which only Hopelessness held up in multiple regression (p < 0.03). The uninformed participation preference at baseline was not significantly related to choice acceptance (p = 0.10). Conclusion: Uninformed participation preference does not predict choice behavior. However, once the decision aid is provided, most patients want to choose their treatment. Consideration should, therefore, be given to informing patients first and asking their participation preferences afterwards.

  1. Cognitive and Emotional Factors Predicting Decisional Conflict among High-Risk Breast Cancer Survivors Who Receive Uninformative BRCA1/2 Results

    PubMed Central

    Rini, Christine; O’Neill, Suzanne C.; Valdimarsdottir, Heiddis; Goldsmith, Rachel E.; DeMarco, Tiffani A.; Peshkin, Beth N.; Schwartz, Marc D.

    2012-01-01

    Objective To investigate high-risk breast cancer survivors’ risk reduction decision making and decisional conflict after an uninformative BRCA1/2 test. Design Prospective, longitudinal study of 182 probands undergoing BRCA1/2 testing, with assessments 1-, 6-, and 12-months post-disclosure. Measures Primary predictors were health beliefs and emotional responses to testing assessed 1-month post-disclosure. Main outcomes included women’s perception of whether they had made a final risk management decision (decision status) and decisional conflict related to this issue. Results There were four patterns of decision making, depending on how long it took women to make a final decision and the stability of their decision status across assessments. Late decision makers and non-decision makers reported the highest decisional conflict; however, substantial numbers of women—even early and intermediate decision makers—reported elevated decisional conflict. Analyses predicting decisional conflict 1- and 12-months post-disclosure found that, after accounting for controls and decision status, health beliefs and emotional factors predicted decisional conflict at different timepoints, with health beliefs more important one month after test disclosure and emotional factors more important one year later. Conclusion Many of these women may benefit from decision making assistance. PMID:19751083

  2. A longitudinal study of UK military personnel offered anthrax vaccination: informed choice, symptom reporting, uptake and pre-vaccination health.

    PubMed

    Murphy, D; Marteau, T M; Wessely, S

    2012-02-01

    To determine longer term health outcome in a cohort of UK service personnel who received the anthrax vaccination. We conducted a three year follow up of UK service personnel all of whom were in the Armed Forces at the start of the Iraq War. 3206 had been offered the anthrax vaccination as part of preparations for the 2003 invasion of Iraq. A further 1190 individuals who did not deploy to Iraq in 2003 were subsequently offered the vaccination as part of later deployments, and in whom we therefore had prospective pre-exposure data. There was no overall adverse health effect following receipt of the anthrax vaccination, with follow up data ranging from three to six years following vaccination. The previous retrospective association between making an uninformed choice to receive the anthrax vaccination and increased symptom reporting was replicated within a longitudinal sample where pre-vaccination health was known. Anthrax vaccination was not associated with long term adverse health problems. However, symptoms were associated with making an uninformed choice to undergo the vaccination. The results are important both for the safety of the vaccine and for future policies should anthrax vaccination be required in either military or non military populations. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Suspect/foil identification in actual crimes and in the laboratory: a reality monitoring analysis.

    PubMed

    Behrman, Bruce W; Richards, Regina E

    2005-06-01

    Four reality monitoring variables were used to discriminate suspect from foil identifications in 183 actual criminal cases. Four hundred sixty-one identification attempts based on five and six-person lineups were analyzed. These identification attempts resulted in 238 suspect identifications and 68 foil identifications. Confidence, automatic processing, eliminative processing and feature use comprised the set of reality monitoring variables. Thirty-five verbal confidence phrases taken from police reports were assigned numerical values on a 10-point confidence scale. Automatic processing identifications were those that occurred "immediately" or "without hesitation." Eliminative processing identifications occurred when witnesses compared or eliminated persons in the lineups. Confidence, automatic processing and eliminative processing were significant predictors, but feature use was not. Confidence was the most effective discriminator. In cases that involved substantial evidence extrinsic to the identification 43% of the suspect identifications were made with high confidence, whereas only 10% of the foil identifications were made with high confidence. The results of a laboratory study using the same predictors generally paralleled the archival results. Forensic implications are discussed.

  4. Organizational Culture Challenges to Interagency and Intelligence Community Communication and Interaction

    DTIC Science & Technology

    2006-05-31

    York: McGraw-Hill, 2003), p 121. 103 Gortner, Mahler, and Nicholson, p 74. 104 Robert B. Denhardt, Theories of Public Organization. (Belmont, CA...Uninformed: Government Secrecy in the 1980’s. New York: Pilgrim Press, 1984. Denhardt, Robert B. Theories of Public Organization. Belmont, CA: Brooks/Cole...Fixing Intelligence: For a More Secure America. New Haven, CT: Yale University Press, 2003. Odom, William E. and Robert Dujarric. America’s Inadvertent

  5. Pose-free structure from motion using depth from motion constraints.

    PubMed

    Zhang, Ji; Boutin, Mireille; Aliaga, Daniel G

    2011-10-01

    Structure from motion (SFM) is the problem of recovering the geometry of a scene from a stream of images taken from unknown viewpoints. One popular approach to estimate the geometry of a scene is to track scene features on several images and reconstruct their position in 3-D. During this process, the unknown camera pose must also be recovered. Unfortunately, recovering the pose can be an ill-conditioned problem which, in turn, can make the SFM problem difficult to solve accurately. We propose an alternative formulation of the SFM problem with fixed internal camera parameters known a priori. In this formulation, obtained by algebraic variable elimination, the external camera pose parameters do not appear. As a result, the problem is better conditioned in addition to involving much fewer variables. Variable elimination is done in three steps. First, we take the standard SFM equations in projective coordinates and eliminate the camera orientations from the equations. We then further eliminate the camera center positions. Finally, we also eliminate all 3-D point positions coordinates, except for their depths with respect to the camera center, thus obtaining a set of simple polynomial equations of degree two and three. We show that, when there are merely a few points and pictures, these "depth-only equations" can be solved in a global fashion using homotopy methods. We also show that, in general, these same equations can be used to formulate a pose-free cost function to refine SFM solutions in a way that is more accurate than by minimizing the total reprojection error, as done when using the bundle adjustment method. The generalization of our approach to the case of varying internal camera parameters is briefly discussed. © 2011 IEEE

  6. The Impact of Eliminating Extraneous Sound and Light on Students' Achievement: An Empirical Study

    ERIC Educational Resources Information Center

    Mangipudy, Rajarajeswari

    2010-01-01

    The impact of eliminating extraneous sound and light on students' achievement was investigated under four conditions: Light and Sound controlled, Sound Only controlled, Light Only controlled and neither Light nor Sound controlled. Group, age and gender were the control variables. Four randomly selected groups of high school freshmen students with…

  7. Using Propensity Score Methods to Approximate Factorial Experimental Designs to Analyze the Relationship between Two Variables and an Outcome

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2015-01-01

    Researchers have become increasingly interested in programs' main and interaction effects of two variables (A and B, e.g., two treatment variables or one treatment variable and one moderator) on outcomes. A challenge for estimating main and interaction effects is to eliminate selection bias across A-by-B groups. I introduce Rubin's causal model to…

  8. EMBRYONIC DEVELOPMENT AND A QUANTITATIVE MODEL OF PROGRAMMED DNA ELIMINATION IN MESOCYCLOPS EDAX (S. A. FORBES, 1891) (COPEPODA: CYCLOPOIDA)

    PubMed Central

    Clower, Michelle K.; Holub, Ashton S.; Smith, Rebecca T.; Wyngaard, Grace A.

    2016-01-01

    The highly programmed fragmentation of chromosomes and elimination of large amounts of nuclear DNA from the presomatic cell lineages (i.e., chromatin diminution), occurs in the embryos of the freshwater zooplankton Mesocyclops edax (S. A. Forbes, 1891) (Crustacea: Copepoda). The somatic genome is reorganized and reduced to a size five times smaller even though the germline genome remains intact. We present the first comprehensive, quantitative model of DNA content throughout embryogenesis in a copepod that possesses embryonic DNA elimination. We used densitometric image analysis to measure the DNA content of polar bodies, germline and somatic nuclei, and excised DNA “droplets.” We report: 1) variable DNA contents of polar bodies, some of which do not contain the amount corresponding to the haploid germline genome size; 2) presence of pronuclei in newly laid embryo sacs; 3) gonomeric chromosomes in the second to fourth cleavage divisions and in the primordial germ cell and primordial endoderm cell during the fifth cleavage division; 4) timing of early embryonic cell stages, elimination of DNA, and divisions of the primordial germ cell and primordial endoderm cell at 22°C; and 5) persistence of a portion of the excised DNA “droplets” throughout embryogenesis. DNA elimination is a trait that spans multiple embryonic stages and a knowledge of the timing and variability of the associated cytological events with DNA elimination will promote the study of the molecular mechanisms involved in this trait. We propose the “genome yolk hypothesis” as a functional explanation for the persistence of the eliminated DNA that might serve as a resource during postdiminution cleavage divisions. PMID:27857452

  9. Algorithm and program for information processing with the filin apparatus

    NASA Technical Reports Server (NTRS)

    Gurin, L. S.; Morkrov, V. S.; Moskalenko, Y. I.; Tsoy, K. A.

    1979-01-01

    The reduction of spectral radiation data from space sources is described. The algorithm and program for identifying segments of information obtained from the Filin telescope-spectrometer on the Salyut-4 are presented. The information segments represent suspected X-ray sources. The proposed algorithm is an algorithm of the lowest level. Following evaluation, information free of uninformative segments is subject to further processing with algorithms of a higher level. The language used is FORTRAN 4.

  10. Is comprehension of problem solutions resistant to misleading heuristic cues?

    PubMed

    Ackerman, Rakefet; Leiser, David; Shpigelman, Maya

    2013-05-01

    Previous studies in the domain of metacomprehension judgments have primarily used expository texts. When these texts include illustrations, even uninformative ones, people were found to judge that they understand their content better. The present study aimed to delineate the metacognitive processes involved in understanding problem solutions - a text type often perceived as allowing reliable judgments regarding understanding, and was not previously considered from a metacognitive perspective. Undergraduate students faced difficult problems. They then studied solution explanations with or without uninformative illustrations and provided judgments of comprehension (JCOMPs). Learning was assessed by application to near-transfer problems in an open-book test format. As expected, JCOMPs were polarized - they tended to reflect good or poor understanding. Yet, JCOMPs were higher for the illustrated solutions and even high certainty did not ensure resistance to this effect. Moreover, success in the transfer problems was lower in the presence of illustrations, demonstrating a bias stronger than that found with expository texts. Previous studies have suggested that weak learners are especially prone to being misled by superficial cues. In the present study, matching the difficulty of the task to the ability of the target population revealed that even highly able participants were not immune to misleading cues. The study extends previous findings regarding potential detrimental effects of illustrations and highlights aspects of the metacomprehension process that have not been considered before. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Alpha Oscillatory Dynamics Index Temporal Expectation Benefits in Working Memory.

    PubMed

    Wilsch, Anna; Henry, Molly J; Herrmann, Björn; Maess, Burkhard; Obleser, Jonas

    2015-07-01

    Enhanced alpha power compared with a baseline can reflect states of increased cognitive load, for example, when listening to speech in noise. Can knowledge about "when" to listen (temporal expectations) potentially counteract cognitive load and concomitantly reduce alpha? The current magnetoencephalography (MEG) experiment induced cognitive load using an auditory delayed-matching-to-sample task with 2 syllables S1 and S2 presented in speech-shaped noise. Temporal expectation about the occurrence of S1 was manipulated in 3 different cue conditions: "Neutral" (uninformative about foreperiod), "early-cued" (short foreperiod), and "late-cued" (long foreperiod). Alpha power throughout the trial was highest when the cue was uninformative about the onset time of S1 (neutral) and lowest for the late-cued condition. This alpha-reducing effect of late compared with neutral cues was most evident during memory retention in noise and originated primarily in the right insula. Moreover, individual alpha effects during retention accounted best for observed individual performance differences between late-cued and neutral conditions, indicating a tradeoff between allocation of neural resources and the benefits drawn from temporal cues. Overall, the results indicate that temporal expectations can facilitate the encoding of speech in noise, and concomitantly reduce neural markers of cognitive load. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  12. Canceling the momentum in a phase-shifting algorithm to eliminate spatially uniform errors.

    PubMed

    Hibino, Kenichi; Kim, Yangjin

    2016-08-10

    In phase-shifting interferometry, phase modulation nonlinearity causes both spatially uniform and nonuniform errors in the measured phase. Conventional linear-detuning error-compensating algorithms only eliminate the spatially variable error component. The uniform error is proportional to the inertial momentum of the data-sampling weight of a phase-shifting algorithm. This paper proposes a design approach to cancel the momentum by using characteristic polynomials in the Z-transform space and shows that an arbitrary M-frame algorithm can be modified to a new (M+2)-frame algorithm that acquires new symmetry to eliminate the uniform error.

  13. Implementation of the Department of Defense Small Business Innovation Research Commercialization Pilot Program: Recent Experience and International Lessons

    DTIC Science & Technology

    2012-07-25

    Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington VA 22202-4302. Respondents should be aware ...by the program offices (DoD Inspector General [DoDIG], 2009). The Army and Air Force labs should be well aware of the defense science plans that...either the organizations are uninformed regarding the statutory alignment requirement or they are aware but do not put the requirements in place

  14. The Successive Projections Algorithm for interval selection in trilinear partial least-squares with residual bilinearization.

    PubMed

    Gomes, Adriano de Araújo; Alcaraz, Mirta Raquel; Goicoechea, Hector C; Araújo, Mario Cesar U

    2014-02-06

    In this work the Successive Projection Algorithm is presented for intervals selection in N-PLS for three-way data modeling. The proposed algorithm combines noise-reduction properties of PLS with the possibility of discarding uninformative variables in SPA. In addition, second-order advantage can be achieved by the residual bilinearization (RBL) procedure when an unexpected constituent is present in a test sample. For this purpose, SPA was modified in order to select intervals for use in trilinear PLS. The ability of the proposed algorithm, namely iSPA-N-PLS, was evaluated on one simulated and two experimental data sets, comparing the results to those obtained by N-PLS. In the simulated system, two analytes were quantitated in two test sets, with and without unexpected constituent. In the first experimental system, the determination of the four fluorophores (l-phenylalanine; l-3,4-dihydroxyphenylalanine; 1,4-dihydroxybenzene and l-tryptophan) was conducted with excitation-emission data matrices. In the second experimental system, quantitation of ofloxacin was performed in water samples containing two other uncalibrated quinolones (ciprofloxacin and danofloxacin) by high performance liquid chromatography with UV-vis diode array detector. For comparison purpose, a GA algorithm coupled with N-PLS/RBL was also used in this work. In most of the studied cases iSPA-N-PLS proved to be a promising tool for selection of variables in second-order calibration, generating models with smaller RMSEP, when compared to both the global model using all of the sensors in two dimensions and GA-NPLS/RBL. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Less Unique Variance Than Meets the Eye: Overlap Among Traditional Neuropsychological Dimensions in Schizophrenia

    PubMed Central

    Dickinson, Dwight; Gold, James M.

    2008-01-01

    The magnitude of the overlap among dimensions of neuropsychological test performance in schizophrenia has been the subject of perennial controversy. This issue has taken on renewed importance with the recent focus on cognition as a treatment target in schizophrenia. A substantial body of factor analytic literature indicates that dimensions are separable in schizophrenia. However, this literature is generally uninformative as to whether the separable dimensions are independent, weakly correlated, or strongly correlated. Factor analyses have often used methods (ie, principal components analysis with orthogonal rotation) that preclude this determination, and correlations among factor-based domain composites and underlying measures have been reported infrequently in these studies. Current meta-analyses of reported “between-dimension” correlations for individual neuropsychological measures and for cognitive domain composite variables indicate that cognition variables in schizophrenia are correlated, on average, at a “medium” level of r = 0.37 for individual measures from different cognitive dimensions and r = 0.45 for domain composites. Because these are mean bivariate correlations, the multiple correlation of an individual measure with all the other measures in a cognitive battery is likely to be higher. Measure reliabilities of 0.80 or less also imply greater commonality among traditional neuropsychological measures. In short, there are underappreciated constraints on the amount of reliable cognitive performance variance in traditional neuropsychological test batteries that is free to vary independently. The ability of such batteries to reveal cognitive domain–specific treatment effects in schizophrenia may be much more limited than is generally assumed. PMID:17702991
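
    The abstract's central point can be reproduced numerically: when measures in a battery share even a "medium" average bivariate correlation, the multiple correlation of any one measure with the rest is considerably higher. The battery size and uniform correlation below are illustrative choices, not the meta-analytic data.

```python
# Multiple correlation of one measure with the rest of a battery, given a
# uniform "medium" bivariate correlation (illustrative values, not the
# meta-analytic data from the abstract).
import numpy as np

k = 10                      # number of measures in the battery
r = 0.37                    # uniform between-measure correlation
C = np.full((k, k), r)      # equicorrelated correlation matrix
np.fill_diagonal(C, 1.0)

# For a correlation matrix C, the squared multiple correlation of variable i
# regressed on all the others is R_i^2 = 1 - 1/(C^{-1})_ii.
prec = np.linalg.inv(C)
R2 = 1.0 - 1.0 / np.diag(prec)
multiple_R = float(np.sqrt(R2[0]))   # ≈ 0.56, well above the bivariate 0.37
```

    This is exactly the abstract's argument that the multiple correlation of an individual measure with a whole battery exceeds the mean bivariate correlation, constraining how much variance is free to vary independently.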

  16. Statistical aspects of quantitative real-time PCR experiment design.

    PubMed

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
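
    A minimal sketch of the variance-component reasoning (this is not powerNest itself): pilot estimates of biological and technical variance are combined into the variance of a group mean under the nested replicate design, and an approximate prospective power is computed for a two-group comparison. All numbers are hypothetical.

```python
# Sketch (not powerNest): nested variance components -> prospective power
# for a two-group qPCR comparison. All parameter values are hypothetical.
import math
from scipy.stats import norm

sigma2_bio = 0.40      # biological variance (pilot estimate, Cq units^2)
sigma2_tech = 0.05     # technical variance (pilot estimate)
n_bio, n_tech = 6, 3   # biological replicates; technical replicates per sample
effect = 1.0           # expected treatment effect (Cq units)
alpha = 0.05

# Variance of a group mean under the nested design: technical noise is
# averaged over all n_bio * n_tech wells, biological noise over n_bio samples.
var_mean = sigma2_bio / n_bio + sigma2_tech / (n_bio * n_tech)

# Normal-approximation power for a two-sided two-group z-test.
se = math.sqrt(2 * var_mean)
z_alpha = norm.ppf(1 - alpha / 2)
power = norm.cdf(effect / se - z_alpha) + norm.cdf(-effect / se - z_alpha)
```

    Varying `n_bio` and `n_tech` under a cost constraint then reproduces the design question the paper addresses: biological replicates reduce the dominant variance term, so they usually buy more power per unit cost than technical replicates.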

  17. Predicting when biliary excretion of parent drug is a major route of elimination in humans.

    PubMed

    Hosey, Chelsea M; Broccatelli, Fabio; Benet, Leslie Z

    2014-09-01

    Biliary excretion is an important route of elimination for many drugs, yet measuring the extent of biliary elimination is difficult, invasive, and variable. Biliary elimination has been quantified for few drugs with a limited number of subjects, who are often diseased patients. An accurate prediction of which drugs or new molecular entities are significantly eliminated in the bile may predict potential drug-drug interactions, pharmacokinetics, and toxicities. The Biopharmaceutics Drug Disposition Classification System (BDDCS) characterizes significant routes of drug elimination, identifies potential transporter effects, and is useful in understanding drug-drug interactions. Class 1 and 2 drugs are primarily eliminated in humans via metabolism and will not exhibit significant biliary excretion of parent compound. In contrast, class 3 and 4 drugs are primarily excreted unchanged in the urine or bile. Here, we characterize the significant elimination route of 105 orally administered class 3 and 4 drugs. We introduce and validate a novel model, predicting significant biliary elimination using a simple classification scheme. The model is accurate for 83% of 30 drugs collected after model development. The model corroborates the observation that biliarily eliminated drugs have high molecular weights, while demonstrating the necessity of considering route of administration and extent of metabolism when predicting biliary excretion. Interestingly, a predictor of potential metabolism significantly improves predictions of major elimination routes of poorly metabolized drugs. This model successfully predicts the major elimination route for poorly permeable/poorly metabolized drugs and may be applied prior to human dosing.
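
    The abstract names the model's ingredients (molecular weight, route of administration, extent of metabolism) but not its actual cutoffs, so the following rule-of-thumb sketch is purely hypothetical and shows only the shape of such a simple classification scheme.

```python
# Hypothetical sketch only: the abstract does not report the model's actual
# thresholds, so the cutoff values below are invented for illustration.
def predict_major_route(mol_weight, extensively_metabolized, oral):
    """Guess the major elimination route of a BDDCS class 3/4-style drug."""
    if extensively_metabolized:
        return "metabolism"            # class 1/2 behaviour: little unchanged drug
    # Poorly metabolized drugs: high molecular weight favours biliary excretion.
    if oral and mol_weight >= 400:     # 400 Da cutoff is a hypothetical value
        return "biliary"
    return "renal"

routes = [predict_major_route(600, False, True),   # large, unmetabolized -> biliary
          predict_major_route(250, False, True),   # small, unmetabolized -> renal
          predict_major_route(350, True, True)]    # metabolized -> metabolism
```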

  18. Large variable conductance heat pipe. Transverse header

    NASA Technical Reports Server (NTRS)

    Edelstein, F.

    1975-01-01

    The characteristics of gas-loaded, variable conductance heat pipes (VCHP) are discussed. The difficulties involved in developing a large VCHP header are analyzed. The construction of the large capacity VCHP is described. A research project to eliminate some of the problems involved in large capacity VCHP operation is explained.

  19. Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Rich

    1995-01-01

    Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)

  20. Challenges for malaria elimination in Brazil.

    PubMed

    Ferreira, Marcelo U; Castro, Marcia C

    2016-05-20

    Brazil currently contributes 42 % of all malaria cases reported in Latin America and the Caribbean, a region where major progress towards malaria elimination has been achieved in recent years. In 2014, the malaria burden in Brazil (143,910 microscopically confirmed cases and 41 malaria-related deaths) reached its lowest level in 35 years, Plasmodium falciparum is highly focal, and the geographic boundary of transmission has considerably shrunk. Transmission in Brazil remains entrenched in the Amazon Basin, which accounts for 99.5 % of the country's malaria burden. This paper reviews major lessons learned from past and current malaria control policies in Brazil. A comprehensive discussion of the scientific and logistic challenges that may impact malaria elimination efforts in the country is presented in light of the launching of the Plan for Elimination of Malaria in Brazil in November 2015. Challenges for malaria elimination addressed include the high prevalence of symptomless and submicroscopic infections, emerging anti-malarial drug resistance in P. falciparum and Plasmodium vivax and the lack of safe anti-relapse drugs, the largely neglected burden of malaria in pregnancy, the need for better vector control strategies where Anopheles mosquitoes present a highly variable biting behaviour, human movement, the need for effective surveillance and tools to identify foci of infection in areas with low transmission, and the effects of environmental changes and climatic variability in transmission. Control actions launched in Brazil and results to come are likely to influence control programs in other countries in the Americas.

  1. A sticky situation: the unexpected stability of malaria elimination

    PubMed Central

    Smith, David L.; Cohen, Justin M.; Chiyaka, Christinah; Johnston, Geoffrey; Gething, Peter W.; Gosling, Roly; Buckee, Caroline O.; Laxminarayan, Ramanan; Hay, Simon I.; Tatem, Andrew J.

    2013-01-01

    Malaria eradication involves eliminating malaria from every country where transmission occurs. Current theory suggests that the post-elimination challenges of remaining malaria-free by stopping transmission from imported malaria will have onerous operational and financial requirements. Although resurgent malaria has occurred in a majority of countries that tried but failed to eliminate malaria, a review of resurgence in countries that successfully eliminated finds only four such failures out of 50 successful programmes. Data documenting malaria importation and onwards transmission in these countries suggests malaria transmission potential has declined by more than 50-fold (i.e. more than 98%) since before elimination. These outcomes suggest that elimination is a surprisingly stable state. Elimination's ‘stickiness’ must be explained either by eliminating countries starting off qualitatively different from non-eliminating countries or becoming different once elimination was achieved. Countries that successfully eliminated were wealthier and had lower baseline endemicity than those that were unsuccessful, but our analysis shows that those same variables were at best incomplete predictors of the patterns of resurgence. Stability is reinforced by the loss of immunity to disease and by the health system's increasing capacity to control malaria transmission after elimination through routine treatment of cases with antimalarial drugs supplemented by malaria outbreak control. Human travel patterns reinforce these patterns; as malaria recedes, fewer people carry malaria from remote endemic areas to remote areas where transmission potential remains high. Establishment of an international resource with backup capacity to control large outbreaks can make elimination stickier, increase the incentives for countries to eliminate, and ensure steady progress towards global eradication. Although available evidence supports malaria elimination's stickiness at moderate-to-low transmission in areas with well-developed health systems, it is not yet clear if such patterns will hold in all areas. The sticky endpoint changes the projected costs of maintaining elimination and makes it substantially more attractive for countries acting alone, and it makes spatially progressive elimination a sensible strategy for a malaria eradication endgame. PMID:23798693

  2. 'Informed and uninformed decision making'--women's reasoning, experiences and perceptions with regard to advanced maternal age and delayed childbearing: a meta-synthesis.

    PubMed

    Cooke, Alison; Mills, Tracey A; Lavender, Tina

    2010-10-01

    To identify what factors affect women's decisions to delay childbearing, and to explore women's experiences and their perceptions of associated risks. Systematic procedures were used for search strategy, study selection, data extraction and analysis. Findings were synthesised using an approach developed from meta-ethnography. We included qualitative papers, not confined to geographical area (1980-2009). Databases included CINAHL, MEDLINE, EMBASE, PsycInfo, ASSIA, MIDIRS, British Nursing Index and the National Research Register. We selected qualitative empirical studies exploring the views and experiences of women of advanced maternal age who were childless or primigravidae with a singleton pregnancy or primiparous. Twelve papers fulfilled the selection criteria and were included for synthesis. Women appear to face an issue of 'informed and uninformed decision making'; those who believe they are informed but may not be, those who are not informed and find out they are at risk once pregnant, and those who are well informed but choose to delay pregnancy anyway. Maternity services could provide information to enable informed choice regarding timing of childbearing. Health professionals need to be mindful of the fact that women delay childbearing for various reasons. A strategy of pre-conception education may be beneficial in informing childbearing decisions. Obstetricians and midwives should be sensitive to the fact that women may not be aware of all the risks associated with delayed childbearing. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  3. A Diagnosis-Prognosis Feedback Loop for Improved Performance Under Uncertainties

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Warner, James E.

    2017-01-01

    The feed-forward relationship between diagnosis and prognosis is the foundation of both aircraft structural health management and the digital twin concept. Measurements of structural response are obtained either in-situ with mounted sensor networks or offline using more traditional techniques (e.g., nondestructive evaluation). Diagnosis algorithms process this information to detect and quantify damage and then feed this data forward to a prognostic framework. A prognosis of the structure's future operational readiness (e.g., remaining useful life or residual strength) is then made and is used to inform mission-critical decision-making. Years of research have been devoted to improving the elements of this process, but the process itself has not changed significantly. Here, a new approach is proposed in which prognosis information is not only fed forward for decision-making, but it is also fed back to the forthcoming diagnosis. In this way, diagnosis algorithms can take advantage of a priori information about the expected state of health, rather than operating in an uninformed condition. As a feasibility test, a diagnosis-prognosis feedback loop of this manner is demonstrated. The approach is applied to a numerical example in which fatigue crack growth is simulated in a simple aluminum alloy test specimen. A prognosis was derived from a set of diagnoses which provided feedback to a subsequent set of diagnoses. Improvements in accuracy and a reduction in uncertainty in the prognosis-informed diagnoses were observed when compared with an uninformed diagnostic approach.
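    The benefit claimed above, lower uncertainty when a diagnosis starts from a prognosis-informed prior rather than an uninformed one, can be sketched as a one-dimensional Gaussian fusion. This is an illustrative sketch with assumed numbers (crack sizes and standard deviations are invented for the example), not the paper's actual framework:

```python
# Hedged sketch: fuse a prognosis-predicted crack size (the prior) with a
# noisy measurement-based diagnosis, instead of starting from a flat prior.
# All numbers are assumed for illustration only.

mu_p, s_p = 5.2, 0.5   # prior from fatigue-growth prognosis: N(mu_p, s_p^2), mm
z, s_m = 5.8, 0.8      # measurement-based diagnosis: N(z, s_m^2), mm

# Gaussian fusion: posterior precision is the sum of the two precisions.
w = s_m**2 / (s_p**2 + s_m**2)
mu_post = w * mu_p + (1 - w) * z
s_post = (1 / s_p**2 + 1 / s_m**2) ** -0.5

print(f"posterior: {mu_post:.2f} mm, sd {s_post:.2f} "
      f"(prior sd {s_p}, measurement sd {s_m})")
```

The posterior standard deviation is smaller than that of either the prognosis or the measurement alone, mirroring the reduction in uncertainty the abstract reports for prognosis-informed diagnoses.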

  4. An electrophysiological insight into visual attention mechanisms underlying schizotypy.

    PubMed

    Fuggetta, Giorgio; Bennett, Matthew A; Duke, Philip A

    2015-07-01

    A theoretical framework has been put forward to understand attention deficits in schizophrenia (Luck SJ & Gold JM. Biological Psychiatry. 2008; 64:34-39). We adopted this framework to evaluate any deficits in attentional processes in schizotypy. Sixteen low schizotypal (LoS) and 16 high schizotypal (HiS) individuals performed a novel paradigm combining a match-to-sample task, with inhibition of return (using spatially uninformative cues) and memory-guided efficient visual-search within one trial sequence. Behavioural measures and event-related potentials (ERPs) were recorded. Behaviourally, HiS individuals exhibited a spatial cueing effect while LoS individuals showed the more typical inhibition of return effect. These results suggest HiS individuals have a relative deficit in rule selection - the endogenous control process involved in disengaging attention from the uninformative location cue. ERP results showed that the late-phase of N2pc evoked by the target stimulus had greater peak latency and amplitude in HiS individuals. This suggests a relative deficit in the implementation of selection - the process of focusing attention onto target features that enhances relevant/suppresses irrelevant inputs. This conclusion differs from that reached when the same theoretical framework was applied to schizophrenia, which argued for little or no deficit in implementation of selection amongst patients. Also, HiS individuals exhibited earlier onset and greater amplitude of the mismatch-triggered negativity component. In summary, our results indicate deficits of both control and implementation of selection in HiS individuals. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. The Interpretation of Scholars' Interpretations of Confidence Intervals: Criticism, Replication, and Extension of Hoekstra et al. (2014)

    PubMed Central

    García-Pérez, Miguel A.; Alcalá-Quintana, Rocío

    2016-01-01

    Hoekstra et al. (Psychonomic Bulletin & Review, 2014, 21:1157–1164) surveyed the interpretation of confidence intervals (CIs) by first-year students, master students, and researchers with six items expressing misinterpretations of CIs. They asked respondents to answer all items, computed the number of items endorsed, and concluded that misinterpretation of CIs is robust across groups. Their design may have produced this outcome artifactually for reasons that we describe. This paper discusses first the two interpretations of CIs and, hence, why misinterpretation cannot be inferred from endorsement of some of the items. Next, a re-analysis of Hoekstra et al.'s data reveals some puzzling differences between first-year and master students that demand further investigation. For that purpose, we designed a replication study with an extended questionnaire including two additional items that express correct interpretations of CIs (to compare endorsement of correct vs. nominally incorrect interpretations) and we asked master students to indicate which items they would have omitted had they had the option (to distinguish deliberate from uninformed endorsement caused by the forced-response format). Results showed that incognizant first-year students endorsed correct and nominally incorrect items identically, revealing that the two item types are not differentially attractive superficially; in contrast, master students were distinctively more prone to endorsing correct items when their uninformed responses were removed, although they admitted to nescience more often than might have been expected. Implications for teaching practices are discussed. PMID:27458424
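    The distinction between correct and incorrect CI interpretations can be made concrete with a small simulation: a 95% CI is a statement about the long-run behaviour of the interval-construction procedure, not about any single interval. A minimal sketch (known-sigma intervals on synthetic data; not from the paper):

```python
import numpy as np

# Correct frequentist reading of a 95% CI: over many repeated samples,
# about 95% of the intervals constructed this way cover the fixed true mean.
# Any single interval either contains it or does not.

rng = np.random.default_rng(0)
true_mean, sigma, n, trials = 10.0, 2.0, 30, 10_000
z = 1.96  # approximate 97.5th percentile of the standard normal

covered = 0
for _ in range(trials):
    sample = rng.normal(true_mean, sigma, n)
    half_width = z * sigma / np.sqrt(n)  # known-sigma interval for simplicity
    lo, hi = sample.mean() - half_width, sample.mean() + half_width
    covered += (lo <= true_mean <= hi)

print(f"empirical coverage: {covered / trials:.3f}")  # close to 0.95
```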

  6. May Stakeholders be Involved in Design Without Informed Consent? The Case of Hidden Design.

    PubMed

    Pols, A J K

    2017-06-01

    Stakeholder involvement in design is desirable from both a practical and an ethical point of view. It is difficult to do well, however, and some problems recur again and again, both of a practical nature, e.g. stakeholders acting strategically rather than openly, and of an ethical nature, e.g. power imbalances unduly affecting the outcome of the process. Hidden Design has been proposed as a method to deal with the practical problems of stakeholder involvement. It aims to do so by taking the observation of stakeholder actions, rather than the outcomes of a deliberative process, as its input. Furthermore, it hides from stakeholders the fact that a design process is taking place so that they will not behave differently than they otherwise would. Both aspects of Hidden Design have raised ethical worries. In this paper I make an ethical analysis of what it means for a design process to leave participants uninformed or deceived rather than acquiring their informed consent beforehand, and to use observation of actions rather than deliberation as input for design, using Hidden Design as a case study. This analysis is based on two sets of normative guidelines: the ethical guidelines for psychological research involving deception or uninformed participants from two professional psychological organisations, and Habermasian norms for a fair and just (deliberative) process. It supports the conclusion that stakeholder involvement in design organised in this way can be ethically acceptable, though under a number of conditions and constraints.

  7. Adolescents' and Young Adults' Beliefs about Mental Health Services and Care: A Systematic Review.

    PubMed

    Goodwin, John; Savage, Eileen; Horgan, Aine

    2016-10-01

    Adolescents and young people are known to hold negative views about mental illness. There is less known about their beliefs about mental health services and care. The aim of this study was to systematically examine literature on the beliefs of adolescents and young people from the general population about mental health services and care. Factors that positively and negatively influence these beliefs are also explored. Relevant electronic databases were searched for papers published in the English language between January 2004 and October 2015. Culture seemed to influence how adolescents and young adults perceived mental health interventions. This was particularly evident in countries such as Palestine and South Africa where prayer was highly valued. Adolescents and young people were uninformed about psychiatric medication. They believed that accessing mental health care was a sign of weakness. Furthermore, they viewed psychiatric hospitals and various mental health professionals negatively. Film was found to have a negative impact on how adolescents and young people perceived mental health services, whereas open communication with family members was found to have a positive impact. Adolescents and young adults hold uninformed and stigmatizing beliefs about mental health treatments, mental health professionals, and access to care. The sources of these beliefs remain unclear although some at least seem influenced by culture. Further research (particularly qualitative research) in this area is recommended in order to address current gaps in knowledge. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Risk Judgment by General Dental practitioners: Rational but Uninformed.

    PubMed

    Ellervall, Eva; Brehmer, Berndt; Knutsson, Kerstin

    2010-01-01

    Decisions by dentists to administer antibiotic prophylaxis to prevent infectious complications in patients involve professional risk assessment. While recommendations for rational use have been published, several studies have shown that dentists have low adherence to these recommendations. To examine general dental practitioners' (GDPs') assessments of the risk of complications if not administering antibiotic prophylaxis in connection with dental procedures in patients with specific medical conditions. Postal questionnaires in combination with telephone interviews. Risk assessments were made using visual analogue scales (VAS), where zero represented "insignificant risk" and 100 represented a "very significant risk". Response rate: 51%. The mean risk assessments were higher for GDPs who administered antibiotics (mean = 54, SD = 23, range 26-72 mm on the VAS) than those who did not (mean = 14, SD = 12, range 7-31 mm) (P < 0.05). Generally, GDPs made higher risk assessments for patients with medical conditions that are included in recommendations than those with conditions that are not included. Overall, risk assessments were higher for tooth removal than for scaling or root canal treatment, even though the risk assessments should be considered equal for these interventions. GDPs' risk assessments were rational but uninformed. They administered antibiotics in a manner that was consistent with their risk assessments. Their risk assessments, however, were overestimated. Inaccurate judgments of risk should not be expected to disappear in the presence of new information. To achieve change, clinicians must be motivated to improve behaviour and an evidence-based implementation strategy is required.

  9. Algebraic solution for the forward displacement analysis of the general 6-6 stewart mechanism

    NASA Astrophysics Data System (ADS)

    Wei, Feng; Wei, Shimin; Zhang, Ying; Liao, Qizheng

    2016-01-01

    The solution for the forward displacement analysis (FDA) of the general 6-6 Stewart mechanism (i.e., the connection points of the moving and fixed platforms are not restricted to lying in a plane) has been extensively studied, but the efficiency of the solution remains to be effectively addressed. To this end, an algebraic elimination method is proposed for the FDA of the general 6-6 Stewart mechanism. The kinematic constraint equations are built using conformal geometric algebra (CGA). The kinematic constraint equations are transformed by a substitution of variables into seven equations with seven unknown variables. According to the characteristic of anti-symmetric matrices, the aforementioned seven equations can be further transformed into seven equations with four unknown variables by a substitution of variables using the Gröbner basis. Its elimination weight is increased through changing the degree of one variable, and sixteen equations with four unknown variables can be obtained using the Gröbner basis. A 40th-degree univariate polynomial equation is derived by constructing a relatively small-sized 9×9 Sylvester resultant matrix. Finally, two numerical examples are employed to verify the proposed method. The results indicate that the proposed method can effectively improve the efficiency of the solution and reduce the computational burden because of the small-sized resultant matrix.
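    The elimination strategy described above, collapsing a multivariate system to a univariate polynomial via a Sylvester resultant, can be illustrated on a toy bivariate system. This sketch uses SymPy's resultant on two simple curves (a 3×3 Sylvester matrix here, not the paper's 9×9 one, and not the Stewart-platform equations):

```python
import sympy as sp

# Toy resultant-based elimination: the resultant of f and g with respect to y
# is the determinant of their Sylvester matrix and vanishes exactly when
# f and g share a common root in y.
x, y = sp.symbols('x y')
f = x**2 + y**2 - 5   # degree 2 in y (a circle)
g = x*y - 2           # degree 1 in y (a hyperbola) -> 3x3 Sylvester matrix

res = sp.resultant(f, g, y)   # univariate polynomial in x
print(sp.expand(res))         # x**4 - 5*x**2 + 4
```

The real roots x = ±1, ±2 recover the four intersection points of the two curves, analogous to how the 40th-degree univariate polynomial enumerates the mechanism's assembly configurations.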

  10. Geometry-aware multiscale image registration via OBBTree-based polyaffine log-demons.

    PubMed

    Seiler, Christof; Pennec, Xavier; Reyes, Mauricio

    2011-01-01

    Non-linear image registration is an important tool in many areas of image analysis. For instance, in morphometric studies of a population of brains, free-form deformations between images are analyzed to describe the structural anatomical variability. Such a simple deformation model is justified by the absence of an easy expressible prior about the shape changes. Applying the same algorithms used in brain imaging to orthopedic images might not be optimal due to the difference in the underlying prior on the inter-subject deformations. In particular, using an un-informed deformation prior often leads to local minima far from the expected solution. To improve robustness and promote anatomically meaningful deformations, we propose a locally affine and geometry-aware registration algorithm that automatically adapts to the data. We build upon the log-domain demons algorithm and introduce a new type of OBBTree-based regularization in the registration with a natural multiscale structure. The regularization model is composed of a hierarchy of locally affine transformations via their logarithms. Experiments on mandibles show improved accuracy and robustness when used to initialize the demons, and even similar performance by direct comparison to the demons, with a significantly lower degree of freedom. This closes the gap between polyaffine and non-rigid registration and opens new ways to statistically analyze the registration results.

  11. A flexible model for multivariate interval-censored survival times with complex correlation structure.

    PubMed

    Falcaro, Milena; Pickles, Andrew

    2007-02-10

    We focus on the analysis of multivariate survival times with highly structured interdependency and subject to interval censoring. Such data are common in developmental genetics and genetic epidemiology. We propose a flexible mixed probit model that deals naturally with complex but uninformative censoring. The recorded ages of onset are treated as possibly censored ordinal outcomes with the interval censoring mechanism seen as arising from a coarsened measurement of a continuous variable observed as falling between subject-specific thresholds. This bypasses the requirement for the failure times to be observed as falling into non-overlapping intervals. The assumption of a normal age-of-onset distribution of the standard probit model is relaxed by embedding within it a multivariate Box-Cox transformation whose parameters are jointly estimated with the other parameters of the model. Complex decompositions of the underlying multivariate normal covariance matrix of the transformed ages of onset become possible. The new methodology is here applied to a multivariate study of the ages of first use of tobacco and first consumption of alcohol without parental permission in twins. The proposed model allows estimation of the genetic and environmental effects that are shared by both of these risk behaviours as well as those that are specific. 2006 John Wiley & Sons, Ltd.

  12. Time-division multiplexer uses digital gates

    NASA Technical Reports Server (NTRS)

    Myers, C. E.; Vreeland, A. E.

    1977-01-01

    Device eliminates errors caused by analog gates in multiplexing a large number of channels at high frequency. System was designed for use in aerospace work to multiplex signals for monitoring such variables as fuel consumption, pressure, temperature, strain, and stress. Circuit may be useful in monitoring variables in process control and medicine as well.

  13. Reduced-Sodium Lunches Are Well-Accepted by Uninformed Consumers Over a 3-Week Period and Result in Decreased Daily Dietary Sodium Intakes: A Randomized Controlled Trial.

    PubMed

    Janssen, Anke M; Kremer, Stefanie; van Stipriaan, Willeke L; Noort, Martijn W J; de Vries, Jeanne H M; Temme, Elisabeth H M

    2015-10-01

    Processed foods are major contributors to excessive sodium intake in Western populations. We investigated the effect of food reformulation on daily dietary sodium intake. To determine whether uninformed consumers accept reduced-sodium lunches and to determine the effect of consuming reduced-sodium lunches on 24-hour urinary sodium excretion. A single-blind randomized controlled pretest-posttest design with two parallel treatment groups was used. Participants chose foods in an experimental real-life canteen setting at the Restaurant of the Future in Wageningen, the Netherlands, from May 16 until July 1, 2011. After a run-in period with regular foods for both groups, the intervention group (n=36) consumed foods with 29% to 61% sodium reduction (some were partially flavor compensated). The control group (n=38) continued consuming regular foods. Outcomes for assessment of acceptance were the amount of foods consumed, energy and sodium intake, remembered food liking, and intensity of sensory aspects. Influence on daily dietary sodium intake was assessed by 24-hour urinary sodium excretion. Between- and within-subject comparisons were assessed by analysis of covariance. Energy intake and amount consumed of each food category per lunch remained similar for both groups. Compared with the control group, the intervention group's sodium intake per lunch was significantly reduced by -1,093 mg (adjusted difference) (95% CI -1,285 to -901), equivalent to 43 mmol sodium. Remembered food liking, taste intensity, and saltiness were scored similarly for almost all of the reduced-sodium foods compared with the regular foods. After consuming reduced-sodium lunches, compared with the control group, intervention participants' 24-hour urinary sodium excretion was significantly lower by -40 mEq (adjusted difference) (95% CI -63 to -16) than after consuming regular lunches, and this reflects a decreased daily sodium intake of 1 g. 
Comparing the two treatment groups, consumption of reduced-sodium foods over a 3-week period was well accepted by the uninformed participants in an experimental real-life canteen setting. The reduced-sodium foods did not trigger compensation behavior during the remainder of the day in the intervention group compared with the control group, as reflected by 24-hour urinary sodium excretion. Therefore, offering reduced-sodium foods without explicitly informing consumers of the sodium reduction can contribute to daily sodium intake reduction. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  14. Wavelength selection-based nonlinear calibration for transcutaneous blood glucose sensing using Raman spectroscopy

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Kang, Jeon Woong; Kong, Chae-Ryon; Dasari, Ramachandra R.; Feld, Michael S.

    2011-01-01

    While Raman spectroscopy provides a powerful tool for noninvasive and real time diagnostics of biological samples, its translation to the clinical setting has been impeded by the lack of robustness of spectroscopic calibration models and the size and cumbersome nature of conventional laboratory Raman systems. Linear multivariate calibration models employing full spectrum analysis are often misled by spurious correlations, such as system drift and covariations among constituents. In addition, such calibration schemes are prone to overfitting, especially in the presence of external interferences that may create nonlinearities in the spectra-concentration relationship. To address both of these issues we incorporate residue error plot-based wavelength selection and nonlinear support vector regression (SVR). Wavelength selection is used to eliminate uninformative regions of the spectrum, while SVR is used to model the curved effects such as those created by tissue turbidity and temperature fluctuations. Using glucose detection in tissue phantoms as a representative example, we show that even a substantial reduction in the number of wavelengths analyzed using SVR leads to calibration models of prediction accuracy equivalent to that of linear full spectrum analysis. Further, with clinical datasets obtained from human subject studies, we also demonstrate the prospective applicability of the selected wavelength subsets without sacrificing prediction accuracy, which has extensive implications for calibration maintenance and transfer. Additionally, such wavelength selection could substantially reduce the collection time of serial Raman acquisition systems. Given the reduced footprint of serial Raman systems in relation to conventional dispersive Raman spectrometers, we anticipate that the incorporation of wavelength selection in such hardware designs will enhance the possibility of miniaturized clinical systems for disease diagnosis in the near future. PMID:21895336
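    The two-step idea above can be pictured with a self-contained NumPy sketch on synthetic spectra. Both substitutions here are mine, made to keep the sketch dependency-free: plain correlation stands in for the paper's residue-error-plot wavelength selection, and kernel ridge regression stands in for SVR as the nonlinear model:

```python
import numpy as np

# Synthetic "spectra": 100 channels, only channels 10 and 40 carry signal.
rng = np.random.default_rng(1)
n, p = 200, 100
X = rng.normal(size=(n, p))
y = np.sin(X[:, 10]) + 0.8 * X[:, 40] + 0.1 * rng.normal(size=n)

# Step 1: drop uninformative channels by absolute correlation with the target
# (a crude stand-in for residue-error-plot selection).
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
sel = np.flatnonzero(corr > 0.15)   # threshold chosen for illustration
Xs = X[:, sel]

# Step 2: nonlinear regression on the reduced spectrum
# (RBF kernel ridge as a stand-in for SVR).
def rbf(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(Xs, Xs)
alpha = np.linalg.solve(K + 1e-2 * np.eye(n), y)   # ridge-regularized fit
y_hat = K @ alpha
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"{sel.size} of {p} channels kept; in-sample R^2 = {r2:.2f}")
```

Most channels are discarded, yet the nonlinear fit on the reduced subset still recovers the target well, the same qualitative outcome the abstract reports for SVR on selected wavelengths.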

  15. Analysis of sequence variability in the macronuclear DNA of Paramecium tetraurelia: A somatic view of the germline

    PubMed Central

    Duret, Laurent; Cohen, Jean; Jubin, Claire; Dessen, Philippe; Goût, Jean-François; Mousset, Sylvain; Aury, Jean-Marc; Jaillon, Olivier; Noël, Benjamin; Arnaiz, Olivier; Bétermier, Mireille; Wincker, Patrick; Meyer, Eric; Sperling, Linda

    2008-01-01

    Ciliates are the only unicellular eukaryotes known to separate germinal and somatic functions. Diploid but silent micronuclei transmit the genetic information to the next sexual generation. Polyploid macronuclei express the genetic information from a streamlined version of the genome but are replaced at each sexual generation. The macronuclear genome of Paramecium tetraurelia was recently sequenced by a shotgun approach, providing access to the gene repertoire. The 72-Mb assembly represents a consensus sequence for the somatic DNA, which is produced after sexual events by reproducible rearrangements of the zygotic genome involving elimination of repeated sequences, precise excision of unique-copy internal eliminated sequences (IES), and amplification of the cellular genes to high copy number. We report use of the shotgun sequencing data (>10^6 reads representing 13× coverage of a completely homozygous clone) to evaluate variability in the somatic DNA produced by these developmental genome rearrangements. Although DNA amplification appears uniform, both of the DNA elimination processes produce sequence heterogeneity. The variability that arises from IES excision allowed identification of hundreds of putative new IESs, compared to 42 that were previously known, and revealed cases of erroneous excision of segments of coding sequences. We demonstrate that IESs in coding regions are under selective pressure to introduce premature termination of translation in case of excision failure. PMID:18256234

  16. Special Interests and the Media: Theory and an Application to Climate Change.

    PubMed

    Shapiro, Jesse M

    2016-12-01

    A journalist reports to a voter on an unknown, policy-relevant state. Competing special interests can make claims that contradict the facts but seem credible to the voter. A reputational incentive to avoid taking sides leads the journalist to report special interests' claims to the voter. In equilibrium, the voter can remain uninformed even when the journalist is perfectly informed. Communication is improved if the journalist discloses her partisan leanings. The model provides an account of persistent public ignorance on climate change that is consistent with narrative and quantitative evidence.

  17. Special Interests and the Media

    PubMed Central

    Shapiro, Jesse M.

    2017-01-01

    A journalist reports to a voter on an unknown, policy-relevant state. Competing special interests can make claims that contradict the facts but seem credible to the voter. A reputational incentive to avoid taking sides leads the journalist to report special interests’ claims to the voter. In equilibrium, the voter can remain uninformed even when the journalist is perfectly informed. Communication is improved if the journalist discloses her partisan leanings. The model provides an account of persistent public ignorance on climate change that is consistent with narrative and quantitative evidence. PMID:28725092

  18. The automatic lumber planing mill

    Treesearch

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in the mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  19. Driving towards malaria elimination in Botswana by 2018: progress on case-based surveillance, 2013–2014

    PubMed Central

    Edwards, J.; Namboze, J.; Butt, W.; Moakofhi, K.; Obopile, M.; Manzi, M.; Takarinda, K. C.; Zachariah, R.; Owiti, P.; Oumer, N.; Mosweunyane, T.

    2018-01-01

    Background: Reliable information reporting systems ensure that all malaria cases are tested, treated and tracked to avoid further transmission. Botswana aimed to eliminate malaria by 2018, and surveillance is key. This study focused on assessing the uptake of the new malaria case-based surveillance (CBS) system introduced in 2012, which captures information on malaria cases reported in the Integrated Disease Surveillance and Response (IDSR) system. Methods: This was a retrospective descriptive study based on routine data focusing on Ngami, Chobe and Okavango, three high-risk districts in Botswana. Aggregated data variables were extracted from the IDSR and compared with data from the CBS. Results: The IDSR reported 456 malaria cases in 2013 and 1346 in 2014, of which respectively only 305 and 884 were reported by the CBS. The CBS reported 34% fewer cases than the IDSR system, indicating substantial differences between the two systems. The key malaria indicators with the greatest variability among the districts included in the study were case identification number and date of diagnosis. Conclusion: The IDSR and CBS systems are essential for malaria elimination, as shown by the significant gaps in reporting between the two systems. These findings highlight the need for further investigation into these discrepancies. Strengthening the CBS system will help to reach the objective of malaria elimination in Botswana. PMID:29713590

  20. Driving towards malaria elimination in Botswana by 2018: progress on case-based surveillance, 2013-2014.

    PubMed

    Motlaleng, M; Edwards, J; Namboze, J; Butt, W; Moakofhi, K; Obopile, M; Manzi, M; Takarinda, K C; Zachariah, R; Owiti, P; Oumer, N; Mosweunyane, T

    2018-04-25

    Background: Reliable information reporting systems ensure that all malaria cases are tested, treated and tracked to avoid further transmission. Botswana aimed to eliminate malaria by 2018, and surveillance is key. This study focused on assessing the uptake of the new malaria case-based surveillance (CBS) system introduced in 2012, which captures information on malaria cases reported in the Integrated Disease Surveillance and Response (IDSR) system. Methods: This was a retrospective descriptive study based on routine data focusing on Ngami, Chobe and Okavango, three high-risk districts in Botswana. Aggregated data variables were extracted from the IDSR and compared with data from the CBS. Results: The IDSR reported 456 malaria cases in 2013 and 1346 in 2014, of which respectively only 305 and 884 were reported by the CBS. The CBS reported 34% fewer cases than the IDSR system, indicating substantial differences between the two systems. The key malaria indicators with the greatest variability among the districts included in the study were case identification number and date of diagnosis. Conclusion: The IDSR and CBS systems are essential for malaria elimination, as shown by the significant gaps in reporting between the two systems. These findings highlight the need for further investigation into these discrepancies. Strengthening the CBS system will help to reach the objective of malaria elimination in Botswana.
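    The 34% figure in the results follows directly from the reported counts, as a quick arithmetic check using only the numbers in the abstract shows:

```python
# Totals over 2013-2014 from the abstract: cases in the IDSR system
# versus the subset also captured by the CBS system.
idsr = 456 + 1346   # 1802 cases reported in IDSR
cbs = 305 + 884     # 1189 of those reported by CBS
shortfall = 1 - cbs / idsr
print(f"CBS reported {shortfall:.0%} fewer cases than IDSR")  # -> 34%
```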

  1. Dissolving variables in connectionist combinatory logic

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    A connectionist system which can represent and execute combinator expressions to elegantly solve the variable binding problem in connectionist networks is presented. This system is a graph reduction machine utilizing graph representations and traversal mechanisms similar to ones described in the BoltzCONS system of Touretzky (1986). It is shown that, as combinators eliminate variables by introducing special functions, these functions can be connectionistically implemented without reintroducing variable binding. This approach 'dissolves' an important part of the variable binding problem, in that a connectionist system still has to manipulate complex data structures, but those structures and their manipulations are rendered more uniform.
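    The way combinators eliminate variables can be made concrete with classic bracket abstraction, which rewrites a lambda binding into S/K/I terms. The sketch below is the standard unoptimized textbook translation (the `abstract` helper and term encoding are mine), not the paper's connectionist implementation:

```python
# Standard (unoptimized) bracket abstraction: compile 'lambda var: expr'
# into S/K/I combinators so the bound variable disappears from the term.
# Terms are strings (atoms) or 2-tuples (applications).
def abstract(var, expr):
    if expr == var:
        return 'I'                          # [x] x     = I
    if isinstance(expr, tuple):
        f, a = expr                         # [x] (f a) = S ([x] f) ([x] a)
        return (('S', abstract(var, f)), abstract(var, a))
    return ('K', expr)                      # [x] c     = K c   (x not free in c)

# Eliminating x from (f x) yields S (K f) I -- no occurrence of x remains.
print(abstract('x', ('f', 'x')))   # (('S', ('K', 'f')), 'I')
```

Applying S (K f) I to any argument x reduces to (K f x) (I x) = f x, so behaviour is preserved while the bound variable has been dissolved into combinator structure, which is the "dissolving" the abstract describes.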

  2. Predicting factors for malaria re-introduction: an applied model in an elimination setting to prevent malaria outbreaks.

    PubMed

    Ranjbar, Mansour; Shoghli, Alireza; Kolifarhood, Goodarz; Tabatabaei, Seyed Mehdi; Amlashi, Morteza; Mohammadi, Mahdi

    2016-03-02

    Malaria re-introduction is a challenge in elimination settings. To prevent re-introduction, the receptivity, vulnerability, and health system capacity of foci should be monitored using appropriate tools. This study aimed to design an applicable model to monitor predicting factors of malaria re-introduction in highly prone areas. This exploratory, descriptive study was conducted in a pre-elimination setting with a high risk of malaria transmission re-introduction. Using the nominal group technique and a literature review, a list of predicting indicators for malaria re-introduction and outbreak was defined. Accordingly, a checklist was developed and completed in the field for foci affected by re-introduction and for cleared-up foci as a control group, for a period of 12 weeks before re-introduction and for the same period in the previous year. Using field data and the analytic hierarchy process (AHP), each variable and its sub-categories were weighted; by calculating geometric means for each sub-category, the scores of the corresponding cells of the interaction matrices and the lower and upper thresholds of the risk strata (low and mild risk of re-introduction, moderate and high risk of malaria outbreaks) were determined. The developed predictive model was calibrated through resampling with different sets of explanatory variables using R software. Sensitivity and specificity of the model were calculated based on new samples. Twenty explanatory predictive variables of malaria re-introduction were identified and a predictive model was developed. Unpermitted immigrants from endemic neighbouring countries were determined to be the pivotal factor (AHP score: 0.181). Moreover, quality of population movement (0.114), following malaria transmission season (0.088), average daily minimum temperature in the previous 8 weeks (0.062), outdoor resting shelters for vectors (0.045), and rainfall (0.042) were identified. Positive and negative predictive values of the model were 81.8 and 100%, respectively. This study introduced a new, simple, yet reliable model to forecast malaria re-introduction and outbreaks eight weeks in advance in pre-elimination and elimination settings. The model incorporates comprehensive deterministic factors that can easily be measured in the field, thereby facilitating preventive measures.
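
    The AHP weighting step described above can be sketched as follows; the pairwise-comparison matrix here is illustrative, not the study's field data:

```python
import math

def ahp_weights(pairwise):
    """Row geometric means of a pairwise-comparison matrix, normalized."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Three hypothetical factors, e.g. immigration, temperature, rainfall.
matrix = [
    [1,     3,   5],   # immigration judged 3x / 5x more important
    [1/3,   1,   2],
    [1/5, 1/2,   1],
]
w = ahp_weights(matrix)
print([round(x, 3) for x in w])  # weights sum to 1
```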

  3. Imitation Combined with a Characteristic Stimulus Duration Results in Robust Collective Decision-Making.

    PubMed

    Toulet, Sylvain; Gautrais, Jacques; Bon, Richard; Peruani, Fernando

    2015-01-01

    For group-living animals, reaching consensus to stay cohesive is crucial for their fitness, particularly when collective motion starts and stops. Understanding decision-making at the individual and collective levels upon sudden disturbances is central to the study of collective animal behavior, and concerns the broader question of how information is distributed and evaluated in groups. Despite the relevance of the problem, well-controlled experimental studies that quantify the collective response of groups facing disruptive events are lacking. Here we study the behavior of small groups of uninformed individuals subject to the departure and stop of a trained conspecific. We find that the groups reach an effective consensus: either all uninformed individuals follow the trained one (and collective motion occurs) or none does. Combining experiments and a simple mathematical model, we show that the observed phenomena result from the interplay between simple mimetic rules and the characteristic duration of the stimulus, here the time during which the trained individual is moving away. The proposed mechanism depends strongly on group size, as observed in the experiments, and even if group splitting can occur, the most likely outcome is always a coherent collective group response (consensus). A consensus is expected even if groups of naive individuals face conflicting information, e.g. if groups contain two subgroups of trained individuals, one trained to stay and one trained to leave. Our results indicate that collective decision-making and consensus in (small) animal groups are likely to be self-organized phenomena that do not involve concertation or even communication among the group members.

  4. Sure, or unsure? Measuring students' confidence and the potential impact on patient safety in multiple-choice questions.

    PubMed

    Rangel, Rafael Henrique; Möller, Leona; Sitter, Helmut; Stibane, Tina; Strzelczyk, Adam

    2017-11-01

    Multiple-choice questions (MCQs) provide useful information about correct and incorrect answers, but they do not offer information about students' confidence. Two cohorts of 90 and 81 medical students each took a curricular neurology multiple-choice exam (exams I and II, respectively) and indicated their confidence for every single MCQ. Each MCQ had a defined level of potential clinical impact on patient safety (uncritical, risky, harmful). Our first objective was to detect informed (IF), guessed (GU), misinformed (MI), and uninformed (UI) answers. Further, we evaluated whether there were significant differences in confidence between correct and incorrect answers. Then, we explored whether clinical impact had a significant influence on students' confidence. There were 1818 IF, 635 GU, 71 MI, and 176 UI answers in exam I and 1453 IF, 613 GU, 92 MI, and 191 UI answers in exam II. Students' confidence was significantly higher for correct than for incorrect answers in both exams (p < 0.001). For exam I, students' confidence was significantly higher for incorrect harmful than for incorrect risky classified MCQs (p = 0.01). For exam II, students' confidence was significantly higher for incorrect harmful than for incorrect benign MCQs (p < 0.01) and significantly higher for correct benign than for correct harmful categorized MCQs (p = 0.01). We were pleased to see that there were more informed than guessed answers, more uninformed than misinformed answers, and higher students' confidence for correct than for incorrect answers. Our expectation that students would state higher confidence for correct harmful MCQs and lower confidence for incorrect harmful MCQs could not be confirmed.

  5. Labeling and advertising of home insulation. Final staff report to the Federal Trade Commission and proposed trade regulation rule (16 CFR Part 460)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Insulation can save significant amounts of fuel and money, and has therefore captured public attention as a desirable energy conservation measure. Because insulation is a very difficult product for uninformed consumers to evaluate, there was broad support for a rule requiring the disclosure of information facilitating choices among insulation products. With the information that the Recommended Rule will require, consumers will be able to compare the thermal properties of various types of insulation and make the best purchases for their needs. In order to provide consumers, as quickly as possible, with information aiding their purchase of this major conservation measure, and to protect consumers from the abuses that rising demand has brought, the Commission undertook this rulemaking proceeding on an expedited schedule. The Rule was proposed on November 18, 1977. The tests mandated by the Rule will provide reproducible and accurate R-values, permitting comparisons of thermal performance. As a result of the testing and required disclosures of R-values and related information, consumers should be able to make sound choices for their needs, without being uninformed or misinformed about the relative values of different types of insulation. The Recommended Rule covers the testing, advertising, and labeling of thermal insulation products. It includes organic, fibrous, cellular, and reflective insulations sold for use in homes, apartments, and other residential dwellings. Insulation sold directly to consumers for do-it-yourself installation is covered, as well as insulation installed by professionals.

  6. Predictive value of the complex magnetocardiographic index in patients with intermediate pretest probability of chronic coronary artery disease: results of a two-center study.

    PubMed

    Chaikovsky, Illya; Hailer, Birgit; Sosnytskyy, Volodymyr; Lutay, Mykhaylo; Mjasnikov, Georgiy; Kazmirchuk, Anatoly; Bydnyk, Mykola; Lomakovskyy, Alexander; Sosnytskaja, Taisia

    2014-09-01

    The aim of this paper is to investigate the predictive value of the new integrated magnetocardiographic (MCG) index (CI) in the diagnosis of coronary artery disease (CAD) in patients with suspected CAD who have an intermediate pretest probability of the disease and uninformative results of routine tests. The study was carried out in the Clinic of Cardiology of the Main Military Clinical Hospital of Ukraine, Kiev (clinic 1), and in the Second Medical Clinic of the 'Katholisches Klinikum Essen', Germany (clinic 2). The main group (group 1) included 89 patients without a history of myocardial infarction. Coronary angiography was performed because of chest pain. Depending on the results of coronary angiography, this group was divided into two subgroups: (i) those with at least 70% stenosis in at least one of the main coronary arteries (subgroup 1a) and (ii) those without hemodynamically significant stenosis (subgroup 1b). The control group included 43 healthy volunteers. In all participants, the MCG examination was performed using a seven-channel MCG system located in an unshielded room. An integrated MCG index (CI), consisting of six parameters, was calculated. CI was significantly higher in patients with stenosis of 70% or more than in the patients without stenosis and in healthy volunteers. Sensitivity was 93%, specificity was 84%, positive predictive value was 85%, and negative predictive value was 93%. The MCG test at rest has the potential to be useful in the noninvasive diagnosis of CAD in patients with intermediate pretest probability of disease and uninformative results of routine tests.

  7. A pilot study on the Chinese Minnesota Multiphasic Personality Inventory-2 in detecting feigned mental disorders: Simulators classified by using the Structured Interview of Reported Symptoms.

    PubMed

    Chang, Yi-Ting; Tam, Wai-Cheong C; Shiah, Yung-Jong; Chiang, Shih-Kuang

    2017-09-01

    The Minnesota Multiphasic Personality Inventory-2 (MMPI-2) is often used in forensic psychological/psychiatric assessment. This was a pilot study on the utility of the Chinese MMPI-2 in detecting feigned mental disorders. The sample consisted of 194 university students who were either simulators (informed or uninformed) or controls. All the participants were administered the Chinese MMPI-2 and the Structured Interview of Reported Symptoms-2 (SIRS-2). The results of the SIRS-2 were utilized to classify the participants into the feigning or control groups. The effectiveness of eight detection indices was investigated by using item analysis, multivariate analysis of covariance (MANCOVA), and receiver operating characteristic (ROC) analysis. Results indicated that informed-simulating participants with prior knowledge of mental disorders did not perform better in avoiding feigning detection than uninformed-simulating participants. In addition, the eight detection indices of the Chinese MMPI-2 were effective in discriminating participants in the feigning and control groups, and the best cut-off scores of three of the indices were higher than those obtained from the studies using the English MMPI-2. Thus, in this sample of university students, the utility of the Chinese MMPI-2 in detecting feigned mental disorders was tentatively supported, and the Chinese Infrequency Scale (ICH), a scale developed specifically for the Chinese MMPI-2, was also supported as a valid scale for validity checking. © 2017 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  8. Silencing of omega-5 gliadins in transgenic wheat eliminates a major source of environmental variability and improves dough mixing properties of flour

    USDA-ARS?s Scientific Manuscript database

    Background The end-use quality of wheat flour varies as a result of the growth conditions of the plant. Among the wheat gluten proteins, the omega-5 gliadins have been identified as a major source of environmental variability, increasing in proportion in grain from plants that receive fertilizer or ...

  9. P09.62 Towards individualized survival prediction in glioblastoma patients using machine learning methods

    PubMed Central

    Vera, L.; Pérez-Beteta, J.; Molina, D.; Borrás, J. M.; Benavides, M.; Barcia, J. A.; Velásquez, C.; Albillo, D.; Lara, P.; Pérez-García, V. M.

    2017-01-01

    Abstract Introduction: Machine learning methods are integrated in clinical research studies due to their strong capability to discover parameters with high information content and their combined predictive potential. Several studies have been developed using glioblastoma patients' imaging data. Many of them have focused on including large numbers of variables, mostly two-dimensional textural features and/or genomic data, regardless of their meaning or potential clinical relevance. Materials and methods: 193 glioblastoma patients were included in the study. Preoperative 3D magnetic resonance images were collected and semi-automatically segmented using an in-house software. After segmentation, a database of 90 parameters including geometrical and textural image-based measures together with patients' clinical data (including age, survival, type of treatment, etc.) was constructed. The criterion for including variables in the study was that they had either shown individual impact on survival in single or multivariate analyses or had a precise clinical or geometrical meaning. These variables were used to perform several machine learning experiments. In a first set of computational cross-validation experiments based on regression trees, the attributes showing the highest information measures were extracted. In the second phase, more sophisticated learning methods were employed in order to validate the potential of the previous variables for predicting survival. Concretely, support vector machines, neural networks and sparse grid methods were used. Results: Variables showing a high information measure in the first phase provided the best prediction results in the second phase. Specifically, patient age, Stupp regimen and a geometrical measure related to the irregularity of contrast-enhancing areas were the variables showing the highest information measure in the first stage. For the second phase, the combination of patient age and Stupp regimen together with one tumor geometrical measure and one tumor heterogeneity feature reached the best prediction quality. Conclusions: Advanced machine learning methods identified the parameters with the highest information measure and survival-predictive potential. The uninformed machine learning methods identified a novel feature measure with direct impact on survival. Used in combination with other previously known variables, multi-indexes can be defined that can help in tumor characterization and prognosis prediction. Recent advances on the definition of those multi-indexes will be reported in the conference. Funding: James S. McDonnell Foundation (USA) 21st Century Science Initiative in Mathematical and Complex Systems Approaches for Brain Cancer [Collaborative award 220020450 and planning grant 220020420], MINECO/FEDER [MTM2015-71200-R], JCCM [PEII-2014-031-P].

  10. Suppressing Transients In Digital Phase-Locked Loops

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1993-01-01

    Loop of arbitrary order starts in steady-state lock. Method for initializing variables of digital phase-locked loop reduces or eliminates transients in phase and frequency typically occurring during acquisition of lock on signal or when changes made in values of loop-filter parameters called "loop constants". Enables direct acquisition by third-order loop without prior acquisition by second-order loop of greater bandwidth, and eliminates those perturbations in phase and frequency lock occurring when loop constants changed by arbitrarily large amounts.

  11. Developing deterioration models for Wyoming bridges.

    DOT National Transportation Integrated Search

    2016-05-01

    Deterioration models for the Wyoming Bridge Inventory were developed using both stochastic and deterministic models. : The selection of explanatory variables is investigated and a new method using LASSO regression to eliminate human bias : in explana...
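
    The variable elimination performed by LASSO comes from soft-thresholding: coefficients whose signal falls below the penalty are set exactly to zero. A one-variable sketch of that operator (illustrative; the report's actual regression model is not reproduced here):

```python
def soft_threshold(z, lam):
    """LASSO's coordinate-wise update: shrink toward zero, clip to zero."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print(soft_threshold(0.08, 0.1))  # weak predictor: eliminated (0.0)
print(soft_threshold(0.90, 0.1))  # strong predictor: kept, but shrunk
```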

  12. Using an Adjoint Approach to Eliminate Mesh Sensitivities in Computational Design

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Park, Michael A.

    2006-01-01

    An algorithm for efficiently incorporating the effects of mesh sensitivities in a computational design framework is introduced. The method is based on an adjoint approach and eliminates the need for explicit linearizations of the mesh movement scheme with respect to the geometric parameterization variables, an expense that has hindered practical large-scale design optimization using discrete adjoint methods. The effects of the mesh sensitivities can be accounted for through the solution of an adjoint problem equivalent in cost to a single mesh movement computation, followed by an explicit matrix-vector product scaling with the number of design variables and the resolution of the parameterized surface grid. The accuracy of the implementation is established and dramatic computational savings obtained using the new approach are demonstrated using several test cases. Sample design optimizations are also shown.
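
    The cost argument can be stated schematically (our notation, not the paper's). If the volume mesh X satisfies a linear mesh-movement system KX = b(D) for design variables D, then for an objective f,

```latex
\frac{df}{dD} = \frac{\partial f}{\partial D} + \lambda^{T}\,\frac{\partial b}{\partial D},
\qquad
K^{T}\lambda = \left(\frac{\partial f}{\partial X}\right)^{T}.
```

    A single adjoint solve for \lambda (comparable in cost to one mesh movement) replaces one \partial X/\partial D solve per design variable; what remains per variable is only the cheap matrix-vector product \lambda^{T}\,\partial b/\partial D.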

  13. Using an Adjoint Approach to Eliminate Mesh Sensitivities in Computational Design

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Park, Michael A.

    2005-01-01

    An algorithm for efficiently incorporating the effects of mesh sensitivities in a computational design framework is introduced. The method is based on an adjoint approach and eliminates the need for explicit linearizations of the mesh movement scheme with respect to the geometric parameterization variables, an expense that has hindered practical large-scale design optimization using discrete adjoint methods. The effects of the mesh sensitivities can be accounted for through the solution of an adjoint problem equivalent in cost to a single mesh movement computation, followed by an explicit matrix-vector product scaling with the number of design variables and the resolution of the parameterized surface grid. The accuracy of the implementation is established and dramatic computational savings obtained using the new approach are demonstrated using several test cases. Sample design optimizations are also shown.

  14. Does partial Granger causality really eliminate the influence of exogenous inputs and latent variables?

    PubMed

    Roelstraete, Bjorn; Rosseel, Yves

    2012-04-30

    Partial Granger causality was introduced by Guo et al. (2008), who showed that it could better eliminate the influence of latent variables and exogenous inputs than conditional G-causality. In the recent literature we can find several reviews and applications of this type of Granger causality (e.g. Smith et al., 2011; Bressler and Seth, 2010; Barrett et al., 2010). These articles apparently do not take into account a serious flaw in the original work on partial G-causality, namely the negative F values that were reported and even proven to be plausible. In our opinion, this undermines the credibility of the obtained results and thus the validity of the approach. Our study aims to further validate partial G-causality and to determine why negative partial Granger causality estimates were reported. Time series were simulated from the same toy model as used in the original paper, and partial and conditional causal measures were compared in the presence of confounding variables. Inference was done parametrically and using non-parametric block bootstrapping. We counter the proof that partial Granger F values can be negative, but the main conclusion of the original article remains. In the presence of unknown latent and exogenous influences, it appears that partial G-causality will better eliminate their influence than conditional G-causality, at least when non-parametric inference is used. Copyright © 2012 Elsevier B.V. All rights reserved.
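
    For the simpler conditional measure the sign question has an immediate answer (standard notation, not necessarily the authors'):

```latex
F_{X \to Y \mid Z} \;=\; \ln \frac{\operatorname{var}(\epsilon_{t})}{\operatorname{var}(\epsilon'_{t})},
```

    where \epsilon_t are the residuals of Y regressed on its own past and on Z, and \epsilon'_t come from the model that additionally includes past X. For nested models fit by least squares, \operatorname{var}(\epsilon') \le \operatorname{var}(\epsilon), so this measure is nonnegative; the debate above concerns whether the extra correction terms in the partial variant can legitimately push estimates below zero.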

  15. Elimination of residual amplitude modulation in tunable diode laser wavelength modulation spectroscopy using an optical fiber delay line.

    PubMed

    Chakraborty, Arup Lal; Ruxton, Keith; Johnstone, Walter; Lengden, Michael; Duffin, Kevin

    2009-06-08

    A new fiber-optic technique to eliminate residual amplitude modulation in tunable diode laser wavelength modulation spectroscopy is presented. The modulated laser output is split to pass in parallel through the gas measurement cell and an optical fiber delay line, with the modulation frequency / delay chosen to introduce a relative phase shift of pi between them. The two signals are balanced using a variable attenuator and recombined through a fiber coupler. In the absence of gas, the direct laser intensity modulation cancels, thereby eliminating the high background. The presence of gas induces a concentration-dependent imbalance at the coupler's output from which the absolute absorption profile is directly recovered with high accuracy using 1f detection.
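
    The cancellation can be checked numerically. A sketch with illustrative parameters (not the paper's experimental values): the split modulated intensity I(t) ∝ 1 + m·sin(2πft) is recombined with a copy delayed by half a modulation period, i.e. a π phase shift.

```python
import math

f = 100e3                # modulation frequency in Hz (assumed value)
delay = 1.0 / (2 * f)    # half a modulation period -> pi phase shift
I0, m = 1.0, 0.2         # mean intensity and modulation depth (assumed)

def arm(t):
    """Intensity in one arm after a balanced 50/50 split."""
    return 0.5 * I0 * (1 + m * math.sin(2 * math.pi * f * t))

ts = [i * 1e-8 for i in range(1000)]
combined = [arm(t) + arm(t - delay) for t in ts]
ripple = max(combined) - min(combined)
print(ripple)  # essentially zero: the background modulation cancels
```

    With gas in one arm only, the balance is broken and a concentration-dependent signal survives at the coupler output, which is the effect the authors exploit.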

  16. Stochastic processes on multiple scales: averaging, decimation and beyond

    NASA Astrophysics Data System (ADS)

    Bo, Stefano; Celani, Antonio

    The recent advances in handling microscopic systems are increasingly motivating stochastic modeling in a large number of physical, chemical and biological phenomena. Relevant processes often take place on widely separated time scales. In order to simplify the description, one usually focuses on the slower degrees of freedom and only the average effect of the fast ones is retained. It is then fundamental to eliminate such fast variables in a controlled fashion, carefully accounting for their net effect on the slower dynamics. We shall present how this can be done by either decimating or coarse-graining the fast processes and discuss applications to physical, biological and chemical examples. With the same tools we will address the fate of functionals of the stochastic trajectories (such as residence times, counting statistics, fluxes, entropy production, etc.) upon elimination of the fast variables. In general, for functionals, such elimination can present additional difficulties. In some cases, it is not possible to express them in terms of the effective trajectories on the slow degrees of freedom but additional details of the fast processes must be retained. We will focus on such cases and show how naive procedures can lead to inconsistent results.
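
    A toy example of the averaging step (our illustration, not from the talk): a slow variable driven by a fast oscillation, compared with the dynamics obtained after eliminating the fast variable by averaging.

```python
import math

w, dt, T = 500.0, 1e-4, 5.0   # fast frequency, step, horizon (illustrative)
x_full = x_avg = 0.0
t = 0.0
while t < T:
    # full dynamics: dx/dt = -x + 1 + sin(w t)   (fast forcing retained)
    x_full += dt * (-x_full + 1.0 + math.sin(w * t))
    # averaged dynamics: dx/dt = -x + 1          (fast variable eliminated)
    x_avg += dt * (-x_avg + 1.0)
    t += dt
print(abs(x_full - x_avg))  # O(1/w): the averaged description is accurate
```

    As the abstract warns, the same replacement applied to trajectory functionals (residence times, entropy production, ...) need not be this benign.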

  17. Steady-state pharmacokinetics of isotretinoin and its 4-oxo metabolite: implications for fetal safety.

    PubMed

    Nulman, I; Berkovitch, M; Klein, J; Pastuszak, A; Lester, R S; Shear, N; Koren, G

    1998-10-01

    Isotretinoin is the most potent human teratogen on the market. Women for whom contraception fails may conceive during or soon after discontinuing isotretinoin therapy, making its elimination kinetics a crucial determinant of fetal safety. The steady-state pharmacokinetics of isotretinoin and its major 4-oxo metabolite were studied in 16 adult patients treated for acne who were receiving doses that ranged from 0.47 to 1.7 mg/kg daily. This is the first study of the pharmacokinetics of isotretinoin in women of childbearing age (n = 11). The clinical efficacy and tolerability of isotretinoin were investigated, and the correlation between these data and steady-state serum concentrations of isotretinoin was tested. The concentration-time data best fitted a two-compartment open model with linear elimination. There was no correlation between the efficacy and tolerability of isotretinoin and steady-state serum concentrations. There was no correlation between dose of isotretinoin and steady-state concentration, due to the large variability in apparent clearance. Values for the elimination half-life (t1/2) of isotretinoin and its metabolite were 29+/-40 hours and 22+/-10 hours, respectively. These data suggest a longer elimination t1/2 of the parent drug than previously reported, probably because of the longer sampling time used in this study (up to 28 days). This study suggests that the safe interval between discontinuation of the drug and conception is more variable than previously assumed.
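
    The practical washout implication of a first-order elimination half-life can be sketched as follows (a back-of-envelope calculation, not the authors' analysis):

```python
import math

def washout_days(t_half_hours, fraction_remaining):
    """Days until concentration falls to the given fraction of steady
    state, assuming simple first-order (exponential) elimination."""
    half_lives = math.log(1.0 / fraction_remaining, 2)
    return t_half_hours * half_lives / 24.0

# Using the ~29 h parent-drug half-life reported above, reaching 1% of
# the steady-state concentration takes roughly:
print(round(washout_days(29.0, 0.01), 1), "days")
```

    The large inter-patient variability in t1/2 reported above means any such point estimate understates the safe interval for some patients.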

  18. Parsing brain activity with fMRI and mixed designs: what kind of a state is neuroimaging in?

    PubMed

    Donaldson, David I

    2004-08-01

    Neuroimaging is often pilloried for providing little more than pretty pictures that simply show where activity occurs in the brain. Strong critics (notably Uttal) have even argued that neuroimaging is nothing more than a modern day version of phrenology: destined to fail, and fundamentally uninformative. Here, I make the opposite case, arguing that neuroimaging is in a vibrant and healthy state of development. As recent investigations of memory illustrate, when used well, neuroimaging goes beyond asking 'where' activity is occurring, to ask questions concerned more with 'what' functional role the activity reflects.

  19. What types of investors generate the two-phase phenomenon?

    NASA Astrophysics Data System (ADS)

    Ryu, Doojin

    2013-12-01

    We examine the two-phase phenomenon described by Plerou, Gopikrishnan, and Stanley (2003) [1] in the KOSPI 200 options market, one of the most liquid options markets in the world. By analysing a unique intraday dataset that contains information about investor type for each trade and quote, we find that the two-phase phenomenon is generated primarily by domestic individual investors, who are generally considered to be uninformed and noisy traders. In contrast, our empirical results indicate that trades by foreign institutions, who are generally considered informed and sophisticated investors, do not exhibit two-phase behaviour.

  20. TERMINAL ILLNESS IN AN INDIAN SETTING: PROBLEMS OF COMMUNICATION

    PubMed Central

    Khanna, R.; Singh, R.P.N.

    1988-01-01

    SUMMARY A study of 50 terminally ill cancer patients revealed that 52% were uninformed regarding their diagnosis and prognosis. In almost all cases the relatives had been adequately informed. No less than 82% of the terminally ill patients showed an awareness of the fatal prognosis. Most of the patients found communication with the doctor and with their relatives unsatisfactory. Comparison with a group of non-terminally ill medical patients showed striking differences between the two groups. The findings are compared with those reported from the West, and the implications of these observations are discussed. PMID:21927320

  1. Neuroethics vs neurophysiologically and neuropsychologically uninformed influences in child-rearing, education, emerging hunter-gatherers, and artificial intelligence models of the brain.

    PubMed

    Pontius, A A

    1993-04-01

    Potentially negative long-term consequences are emphasized in four areas where specific neuromaturational, neurophysiological, and neuropsychological facts within a neurodevelopmental and ecological context may be neglected: in normal functional levels of child development and the maturational lag of the frontal lobe system in "Attention Deficit Disorder"; in education (reading/writing and arithmetic); in the assessment of cognitive functioning in hunter-gatherer populations, whose cognition is specifically modified in the service of their survival; and in constructing computer models of the brain that neglect consciousness and intentionality, as criticized recently by Searle.

  2. Characterization and Quantification of Uncertainty in the NARCCAP Regional Climate Model Ensemble and Application to Impacts on Water Systems

    NASA Astrophysics Data System (ADS)

    Mearns, L. O.; Sain, S. R.; McGinnis, S. A.; Steinschneider, S.; Brown, C. M.

    2015-12-01

    In this talk we present the development of a joint Bayesian Probabilistic Model for the climate change results of the North American Regional Climate Change Assessment Program (NARCCAP) that uses a unique prior in the model formulation. We use the climate change results (joint distribution of seasonal temperature and precipitation changes (future vs. current)) from the global climate models (GCMs) that provided boundary conditions for the six different regional climate models used in the program as informative priors for the bivariate Bayesian Model. The two variables involved are seasonal temperature and precipitation over sub-regions (i.e., Bukovsky Regions) of the full NARCCAP domain. The basic approach to the joint Bayesian hierarchical model follows the approach of Tebaldi and Sansó (2009). We compare model results using informative (i.e., GCM information) as well as uninformative priors. We apply these results to the Water Evaluation and Planning System (WEAP) model for the Colorado Springs Utility in Colorado. We investigate the layout of the joint pdfs in the context of the water model sensitivities to ranges of temperature and precipitation results to determine the likelihoods of future climate conditions that cannot be accommodated by possible adaptation options. Comparisons may also be made with joint pdfs formed from the CMIP5 collection of global climate models and empirically downscaled to the region of interest.
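
    The informative-vs-uninformative contrast has a simple conjugate-normal caricature (illustrative numbers, not NARCCAP output): the posterior mean of a regional change is a precision-weighted average of the prior (e.g. GCM-based) and the regional-model data, and a nearly flat prior leaves the data essentially untouched.

```python
def posterior(prior_mean, prior_var, data_mean, data_var):
    """Conjugate normal-normal update with known variances."""
    w = (1.0 / prior_var) / (1.0 / prior_var + 1.0 / data_var)
    mean = w * prior_mean + (1.0 - w) * data_mean
    var = 1.0 / (1.0 / prior_var + 1.0 / data_var)
    return mean, var

informative = posterior(2.0, 0.25, 2.6, 0.5)  # GCM prior pulls the estimate
flat        = posterior(2.0, 1e6,  2.6, 0.5)  # near-uninformative prior
print(informative[0], flat[0])
```

    The full NARCCAP model is hierarchical and bivariate (temperature and precipitation jointly), but the same weighting logic drives how much the GCM-based prior shapes the posterior.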

  3. Variable polarity plasma arc welding on the Space Shuttle external tank

    NASA Technical Reports Server (NTRS)

    Nunes, A. C., Jr.; Bayless, E. O., Jr.; Jones, C. S., III; Munafo, P. M.; Biddle, A. P.; Wilson, W. A.

    1984-01-01

    Variable polarity plasma arc (VPPA) techniques used at NASA's Marshall Space Flight Center for the fabrication of the Space Shuttle External Tank are presented. The high plasma arc jet velocities of 300-2000 m/s are produced by heating the plasma gas as it passes through a constraining orifice, the plasma arc torch becoming a miniature jet engine. Compared to the GTA jet, the VPPA has the following advantages: (1) it is less sensitive to contamination, (2) it produces a more symmetrical fusion zone, and (3) it achieves greater joint penetration. The VPPA welding system is computerized, operating with a microprocessor to set welding variables in accordance with set-point inputs, and controls the manipulator and wire feeder as well as the torch and power supply. Other advantages of the VPPA welding technique are: reduction in weld repair costs through elimination of porosity; reduction of joint preparation costs through elimination of the need to scrape or file faying surfaces; reduction in depeaking costs; and eventual reduction of the 100 percent X-ray inspection requirement. The paper includes a series of schematic and block diagrams.

  4. Random forest feature selection approach for image segmentation

    NASA Astrophysics Data System (ADS)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin

    2017-03-01

    In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors create their discriminative model by using many features without applying any selection criteria. A more reliable model can be built by using a framework that selects the variables that are important from the point of view of the classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an algorithm based on RF that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
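
    A permutation-importance loop of the kind such RF frameworks build on can be sketched with a toy stand-in model (illustrative data; RF itself is not reimplemented here):

```python
import random

random.seed(0)
# Toy data: the label depends on feature 0 only; feature 1 is noise.
X = [[i, random.random()] for i in range(100)]
y = [1 if row[0] > 50 else 0 for row in X]

def model(row):
    """Stand-in for a trained classifier."""
    return 1 if row[0] > 50 else 0

def accuracy(data, labels):
    return sum(model(r) == t for r, t in zip(data, labels)) / len(labels)

base = accuracy(X, y)
importance = []
for j in range(2):
    col = [r[j] for r in X]
    random.shuffle(col)                        # destroy feature j's information
    Xp = [r[:j] + [c] + r[j + 1:] for r, c in zip(X, col)]
    importance.append(base - accuracy(Xp, y))  # accuracy drop = importance
print(importance)  # feature 0 matters; feature 1 can be eliminated
```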

  5. Influence of UV illumination on the cold temperature operation of a LiNbO(3) Q-switched Nd:YAG laser.

    PubMed

    Cole, Brian; Goldberg, Lew; King, Vernon; Leach, Jeff

    2010-04-26

    UV illumination of a lithium niobate Q-switch was demonstrated as an effective means to eliminate a loss in hold-off and associated prelasing that occurs under cold temperature operation of Q-switched lasers. This degradation occurs due to the pyroelectric effect, where an accumulation of charge on crystal faces results in a reduction in the Q-switch hold-off and a spatially variable loss of the Q-switch in its high-transmission state, both resulting in lowering of the maximum Q-switched pulse energy. With UV illumination, the resulting creation of photo-generated carriers was shown to be effective in eliminating both of these effects. A Q-switched Nd:YAG laser utilizing UV-illuminated LiNbO(3) was shown to operate under cold temperatures without prelasing or spatially variable loss.

  6. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    PubMed

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in using RF to develop predictive models with large environmental data sets.
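A minimal backward-elimination loop of the kind examined above can be sketched with scikit-learn's out-of-bag score; the stopping tolerance, floor on the feature count, and synthetic data are illustrative assumptions, not the paper's protocol:

```python
# Sketch of RF backward elimination driven by out-of-bag (OOB) accuracy.
# All settings here (tolerance 0.02, floor of 5 features) are hypothetical.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=20,
                           n_informative=5, random_state=1)
features = list(range(X.shape[1]))

def oob_fit(cols):
    rf = RandomForestClassifier(n_estimators=100, oob_score=True,
                                bootstrap=True, random_state=1)
    rf.fit(X[:, cols], y)
    return rf

rf = oob_fit(features)
best_oob = rf.oob_score_
while len(features) > 5:
    # Drop the least important remaining feature and refit.
    worst = features[int(np.argmin(rf.feature_importances_))]
    trial = [f for f in features if f != worst]
    rf_trial = oob_fit(trial)
    if rf_trial.oob_score_ < best_oob - 0.02:  # stop if accuracy degrades
        break
    features, rf = trial, rf_trial
    best_oob = max(best_oob, rf_trial.oob_score_)
```

As the abstract cautions, judging the reduced model by the same OOB estimate that guided the selection tends to bias accuracy upward; an external validation fold avoids this.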

  7. [Interspecies differences of noopept pharmacokinetics].

    PubMed

    Boĭko, S S; Korotkov, S A; Zherdev, V P; Gudasheva, T A; Ostrovskaia, R U; Voronina, T A

    2004-01-01

    Significant interspecific differences in the pharmacokinetics of noopept are manifested by a decrease in the drug elimination rate on the passage from rats to rabbits and humans. Very intensive metabolism of noopept was observed upon intravenous administration in rats. In these animals, presystemic elimination mechanisms lead to the formation of a specific metabolite representing a product of drug biotransformation hydroxylated at the phenyl ring. In rabbits, unchanged noopept circulates in the blood for a longer time upon both intravenous and peroral introduction, biotransformation proceeds at a much slower rate, and no metabolites analogous to that found in rats are detected. The noopept pharmacokinetics in humans differs from that in animals by still slower elimination and considerable individual variability. No drug metabolites are found in the human blood plasma, probably because of a relatively small dose and low concentration.

  8. Optimization of non-thermal plasma efficiency in the simultaneous elimination of benzene, toluene, ethyl-benzene, and xylene from polluted airstreams using response surface methodology.

    PubMed

    Najafpoor, Ali Asghar; Jonidi Jafari, Ahmad; Hosseinzadeh, Ahmad; Khani Jazani, Reza; Bargozin, Hasan

    2018-01-01

    Treatment with a non-thermal plasma (NTP) is a new and effective technology applied recently for the conversion of gases in air pollution control. This research was initiated to optimize the efficient application of the NTP process for benzene, toluene, ethyl-benzene, and xylene (BTEX) removal. The effects of four variables, including temperature, initial BTEX concentration, voltage, and flow rate, on the BTEX elimination efficiency were investigated using response surface methodology (RSM). The constructed model was evaluated by analysis of variance (ANOVA). The model goodness-of-fit and statistical significance were assessed using the coefficients of determination (R² and adjusted R²) and the F-test. The results revealed that R² was greater than 0.96 for BTEX removal efficiency. The statistical analysis demonstrated that the BTEX removal efficiency was significantly correlated with the temperature, BTEX concentration, voltage, and flow rate. Voltage was the most influential variable, exerting a significant effect (p < 0.0001) on the response variable. According to these results, NTP can be applied as a progressive, cost-effective, and practical process for the treatment of airstreams polluted with BTEX under conditions of low residence time and high pollutant concentrations.
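The RSM workflow described here, fitting a second-order model and judging it by R² and adjusted R², can be sketched on synthetic coded factors (all coefficients and data below are hypothetical, not the study's):

```python
# Hedged sketch: fit a second-order response-surface model to synthetic
# removal-efficiency data and compute R^2 / adjusted R^2.
import numpy as np

rng = np.random.default_rng(0)
n = 30
temp, conc, volt = rng.uniform(-1, 1, (3, n))   # coded factor levels
y = 80 + 5*volt + 2*temp - 3*conc - 4*volt**2 + rng.normal(0, 0.5, n)

# Design matrix: intercept, linear, quadratic, and interaction terms.
Xd = np.column_stack([np.ones(n), temp, conc, volt,
                      temp**2, conc**2, volt**2,
                      temp*conc, temp*volt, conc*volt])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ beta

r2 = 1 - resid.var() / y.var()
r2_adj = 1 - (1 - r2) * (n - 1) / (n - Xd.shape[1])
```

Adjusted R² penalizes the extra quadratic and interaction terms, which is why both statistics are reported when comparing RSM models of different sizes.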

  9. A simple technique investigating baseline heterogeneity helped to eliminate potential bias in meta-analyses.

    PubMed

    Hicks, Amy; Fairhurst, Caroline; Torgerson, David J

    2018-03-01

    To perform a worked example of an approach that can be used to identify and remove potentially biased trials from meta-analyses via the analysis of baseline variables. True randomisation produces treatment groups that differ only by chance; therefore, a meta-analysis of a baseline measurement should produce no overall difference and zero heterogeneity. A meta-analysis from the British Medical Journal, known to contain significant heterogeneity and imbalance in baseline age, was chosen. Meta-analyses of baseline variables were performed and trials systematically removed, starting with those with the largest t-statistic, until the I² measure of heterogeneity became 0%; the outcome meta-analysis was then repeated with only the remaining trials as a sensitivity check. We argue that heterogeneity in a meta-analysis of baseline variables should not exist, and therefore removing trials which contribute to heterogeneity from a meta-analysis will produce a more valid result. In our example, none of the overall outcomes changed when studies contributing to heterogeneity were removed. We recommend routine use of this technique, using age and a second baseline variable predictive of outcome for the particular study chosen, to help eliminate potential bias in meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
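The removal procedure can be sketched as follows, assuming a fixed-effect meta-analysis of baseline differences and illustrative trial data (not the BMJ meta-analysis itself):

```python
# Sketch of the screening idea: compute Cochran's Q and I^2 for a
# meta-analysis of a baseline variable, then drop the trial with the
# largest standardized difference until I^2 reaches 0%.
import numpy as np

def i_squared(effects, variances):
    w = 1 / np.asarray(variances)
    pooled = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - pooled) ** 2)
    df = len(effects) - 1
    return max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# Baseline mean-age differences (years) and their variances per trial;
# one outlying trial induces the heterogeneity (illustrative values).
d = np.array([0.1, -0.2, 0.05, 3.0, -0.1])
v = np.array([0.2, 0.25, 0.3, 0.2, 0.22])

while i_squared(d, v) > 0:
    z = np.abs(d) / np.sqrt(v)      # approximate t/z statistic per trial
    worst = int(np.argmax(z))
    d, v = np.delete(d, worst), np.delete(v, worst)
```

Here removing the single outlying trial drives I² to 0%; the outcome meta-analysis would then be rerun on the retained trials as the sensitivity check.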

  10. Theophylline kinetics in relation to age: the importance of smoking.

    PubMed Central

    Cusack, B; Kelly, J G; Lavan, J; Noel, J; O'Malley, K

    1980-01-01

    1 Single dose studies of theophylline kinetics were compared in groups of young and elderly smokers and non-smokers to assess the effect of age on theophylline absorption and the effect of smoking on drug metabolising enzyme activity in old age. 2 The rate and extent of absorption was not affected by age. Distribution and elimination kinetics were similar in young and elderly non-smokers. 3 In young subjects the elimination half-life of theophylline was shorter and clearance was significantly greater in smokers than in non-smokers. 4 In the elderly mean elimination half-life was significantly shorter in smokers and their plasma clearance was 40% higher than in non-smokers. The statistical difference for clearance was at the 7% level of significance. 5 These data indicate that ageing per se does not affect theophylline elimination and also that induction of theophylline metabolism due to smoking occurs in old age. Smoking is a variable that should be taken account of when assessing drug metabolism in elderly patients. PMID:7426272
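The half-life and clearance comparisons above follow standard first-order elimination kinetics; a small worked example with hypothetical parameter values (not the study's data):

```python
# First-order elimination: t_half = ln(2)/k and CL = k * Vd.
# The rate constant and volume of distribution below are hypothetical.
import math

k = 0.116    # elimination rate constant, 1/h (hypothetical)
Vd = 35.0    # volume of distribution, L (hypothetical)

t_half = math.log(2) / k    # elimination half-life, h
CL = k * Vd                 # plasma clearance, L/h

# A smoker's 40% higher clearance implies a proportionally shorter
# half-life at the same volume of distribution.
CL_smoker = 1.4 * CL
t_half_smoker = math.log(2) * Vd / CL_smoker
```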

  11. A black box optimization approach to parameter estimation in a model for long/short term variations dynamics of commodity prices

    NASA Astrophysics Data System (ADS)

    De Santis, Alberto; Dellepiane, Umberto; Lucidi, Stefano

    2012-11-01

    In this paper we investigate the estimation problem for a model of commodity prices. The model is a stochastic state-space dynamical model whose unknowns are the state variables and the system parameters. Data are represented by commodity spot prices; time series of futures contracts are only very seldom freely available. Both the joint likelihood function of the system (state variables and parameters) and the marginal likelihood function (with the state variables eliminated) are addressed.

  12. Medications influencing central cholinergic pathways affect fixation stability, saccadic response time and associated eye movement dynamics during a temporally-cued visual reaction time task.

    PubMed

    Naicker, Preshanta; Anoopkumar-Dukie, Shailendra; Grant, Gary D; Modenese, Luca; Kavanagh, Justin J

    2017-02-01

    Anticholinergic medications largely exert their effects due to actions on the muscarinic receptor, which mediates the functions of acetylcholine in the peripheral and central nervous systems. In the central nervous system, acetylcholine plays an important role in the modulation of movement. This study investigated the effects of over-the-counter medications with varying degrees of central anticholinergic properties on fixation stability, saccadic response time and the dynamics associated with this eye movement during a temporally-cued visual reaction time task, in order to establish the significance of central cholinergic pathways in influencing eye movements during reaction time tasks. Twenty-two participants were recruited into the placebo-controlled, human double-blind, four-way crossover investigation. Eye tracking technology recorded eye movements while participants reacted to visual stimuli following temporally informative and uninformative cues. The task was performed pre-ingestion as well as 0.5 and 2 h post-ingestion of promethazine hydrochloride (strong centrally acting anticholinergic), hyoscine hydrobromide (moderate centrally acting anticholinergic), hyoscine butylbromide (anticholinergic devoid of central properties) and a placebo. Promethazine decreased fixation stability during the reaction time task. In addition, promethazine was the only drug to increase saccadic response time during temporally informative and uninformative cued trials, whereby effects on response time were more pronounced following temporally informative cues. Promethazine also decreased saccadic amplitude and increased saccadic duration during the temporally-cued reaction time task. Collectively, the results of the study highlight the significant role that central cholinergic pathways play in the control of eye movements during tasks that involve stimulus identification and motor responses following temporal cues.

  13. Push-pull tracer tests: Their information content and use for characterizing non-Fickian, mobile-immobile behavior: INFORMATION CONTENT OF PUSH-PULL TESTS

    DOE PAGES

    Hansen, Scott K.; Berkowitz, Brian; Vesselinov, Velimir V.; ...

    2016-12-01

    Path reversibility and radial symmetry are often assumed in push-pull tracer test analysis. In reality, heterogeneous flow fields mean that both assumptions are idealizations. In this paper, to understand their impact, we perform a parametric study which quantifies the scattering effects of ambient flow, local-scale dispersion, and velocity field heterogeneity on push-pull breakthrough curves and compares them to the effects of mobile-immobile mass transfer (MIMT) processes including sorption and diffusion into secondary porosity. We identify specific circumstances in which MIMT overwhelmingly determines the breakthrough curve, which may then be considered uninformative about drift and local-scale dispersion. Assuming path reversibility, we develop a continuous-time-random-walk-based interpretation framework which is flow-field-agnostic and well suited to quantifying MIMT. Adopting this perspective, we show that the radial flow assumption is often harmless: to the extent that solute paths are reversible, the breakthrough curve is uninformative about velocity field heterogeneity. Our interpretation method determines a mapping function (i.e., subordinator) from travel time in the absence of MIMT to travel time in its presence. A mathematical theory allowing this function to be directly “plugged into” an existing Laplace-domain transport model to incorporate MIMT is presented and demonstrated. Algorithms implementing the calibration are presented and applied to interpretation of data from a push-pull test performed in a heterogeneous environment. A successful four-parameter fit is obtained, of comparable fidelity to one obtained using a million-node 3-D numerical model. In conclusion, we demonstrate analytically and numerically how push-pull tests quantifying MIMT are sensitive to remobilization, but not immobilization, kinetics.

  15. Boosting pitch encoding with audiovisual interactions in congenital amusia.

    PubMed

    Albouy, Philippe; Lévêque, Yohana; Hyde, Krista L; Bouchet, Patrick; Tillmann, Barbara; Caclin, Anne

    2015-01-01

    The combination of information across senses can enhance perception, as revealed for example by decreased reaction times or improved stimulus detection. Interestingly, these facilitatory effects have been shown to be maximal when responses to unisensory modalities are weak. The present study investigated whether audiovisual facilitation can be observed in congenital amusia, a music-specific disorder primarily ascribed to impairments of pitch processing. Amusic individuals and their matched controls performed two tasks. In Task 1, they were required to detect auditory, visual, or audiovisual stimuli as rapidly as possible. In Task 2, they were required to detect as accurately and as rapidly as possible a pitch change within an otherwise monotonic 5-tone sequence that was presented either only auditorily (A condition), or simultaneously with a temporally congruent, but otherwise uninformative visual stimulus (AV condition). Results of Task 1 showed that amusics exhibit typical auditory and visual detection, and typical audiovisual integration capacities: both amusics and controls exhibited shorter response times for audiovisual stimuli than for either auditory stimuli or visual stimuli. Results of Task 2 revealed that both groups benefited from simultaneous uninformative visual stimuli to detect pitch changes: accuracy was higher and response times shorter in the AV condition than in the A condition. The audiovisual improvements of response times were observed for different pitch interval sizes depending on the group. These results suggest that both typical listeners and amusic individuals can benefit from multisensory integration to improve their pitch processing abilities and that this benefit varies as a function of task difficulty. These findings constitute a first step towards exploiting multisensory paradigms to reduce pitch-related deficits in congenital amusia, notably by suggesting that audiovisual paradigms are effective in an appropriate range of unimodal performance. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Can an agent’s false belief be corrected by an appropriate communication? Psychological reasoning in 18-month-old infants

    PubMed Central

    Song, Hyun-joo; Onishi, Kristine H.; Baillargeon, Renée; Fisher, Cynthia

    2008-01-01

    Do 18-month-olds understand that an agent’s false belief can be corrected by an appropriate, though not an inappropriate, communication? In Experiment 1, infants watched a series of events involving two agents, a ball, and two containers: a box and a cup. To start, agent1 played with the ball and then hid it in the box, while agent2 looked on. Next, in agent1’s absence, agent2 moved the ball from the box to the cup. When agent1 returned, agent2 told her “The ball is in the cup!” (informative-intervention condition) or “I like the cup!” (uninformative-intervention condition). During test, agent1 reached for either the box (box event) or the cup (cup event). In the informative-intervention condition, infants who saw the box event looked reliably longer than those who saw the cup event; in the uninformative-intervention condition, the reverse pattern was found. These results suggest that infants expected agent1’s false belief about the ball’s location to be corrected when she was told “The ball is in the cup!”, but not “I like the cup!”. In Experiment 2, agent2 simply pointed to the ball’s new location, and infants again expected agent1’s false belief to be corrected. These and control results provide additional evidence that infants in the second year of life can attribute false beliefs to agents. In addition, the results suggest that by 18 months of age infants expect agents’ false beliefs to be corrected by relevant communications involving words or gestures. PMID:18976745

  17. Imprint cytology on microcalcifications excised by vacuum-assisted breast biopsy: a rapid preliminary diagnosis.

    PubMed

    Fotou, Maria; Oikonomou, Vassiliki; Zagouri, Flora; Sergentanis, Theodoros N; Nonni, Afroditi; Athanassiadou, Pauline; Drouveli, Theodora; Atsouris, Efstratios; Kotzia, Evagelia; Zografos, George C

    2007-04-03

    To evaluate imprint cytology in the context of specimens with microcalcifications derived from Vacuum-Assisted Breast Biopsy (VABB). A total of 93 women with microcalcifications BI-RADS 3 and 4 underwent VABB and imprint samples were examined. VABB was performed on Fischer's table using 11-gauge Mammotome vacuum probes. A mammogram of the cores after the procedure confirmed the excision of microcalcifications. For the application of imprint cytology, the cores with microcalcifications confirmed by mammogram were gently rolled against glass microscope slides and thus imprint smears were made. For rapid preliminary diagnosis Diff-Quick stain, modified Papanicolaou stain and May Grunwald Giemsa were used. Afterwards, the core was dipped into a CytoRich Red Collection fluid for a few seconds in order to obtain samples with the use of the specimen wash. After the completion of cytological procedures, the core was prepared for routine histological study. The pathologist was blind to the preliminary cytological results. The cytological and pathological diagnoses were comparatively evaluated. According to the pathological examination, 73 lesions were benign, 15 lesions were carcinomas (12 ductal carcinomas in situ, 3 invasive ductal carcinomas), and 5 lesions were precursor: 3 cases of atypical ductal hyperplasia (ADH) and 2 cases of lobular neoplasia (LN). The observed sensitivity and specificity of the cytological imprints for cancer were 100% (one-sided, 97.5% CI: 78.2%-100%). Only one case of ADH could be detected by imprint cytology. Neither of the two LN cases was detected by the imprints. The imprints were uninformative in 11 out of 93 cases (11.8%). There was no uninformative case among women with malignancy. Imprint cytology provides a rapid, accurate preliminary diagnosis in a few minutes. This method might contribute to the diagnosis of early breast cancer and possibly attenuates patients' anxiety.
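The reported lower confidence limit of 78.2% for a 15/15 sensitivity is what the exact (Clopper-Pearson) one-sided bound gives: when every one of the n positive cases is detected, the bound reduces to alpha**(1/n). A quick check:

```python
# Exact one-sided 97.5% lower confidence limit for an observed
# sensitivity of 15/15 (all carcinomas detected by imprint cytology).
alpha = 0.025
n_cancers = 15
lower = alpha ** (1 / n_cancers)   # Clopper-Pearson bound when x = n
# lower is approximately 0.782, matching the abstract's 78.2%-100% CI.
```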

  18. Informed Decision-Making in the Context of Prenatal Chromosomal Microarray.

    PubMed

    Baker, Jessica; Shuman, Cheryl; Chitayat, David; Wasim, Syed; Okun, Nan; Keunen, Johannes; Hofstedter, Renee; Silver, Rachel

    2018-03-07

    The introduction of chromosomal microarray (CMA) into the prenatal setting has involved considerable deliberation due to the wide range of possible outcomes (e.g., copy number variants of uncertain clinical significance). Such issues are typically discussed in pre-test counseling for pregnant women to support informed decision-making regarding prenatal testing options. This research study aimed to assess the level of informed decision-making with respect to prenatal CMA and the factor(s) influencing decision-making to accept CMA for the selected prenatal testing procedure (i.e., chorionic villus sampling or amniocentesis). We employed a questionnaire that was adapted from a three-dimensional measure previously used to assess informed decision-making with respect to prenatal screening for Down syndrome and neural tube defects. This measure classifies an informed decision as one that is knowledgeable, value-consistent, and deliberated. Our questionnaire also included an optional open-ended question, soliciting factors that may have influenced the participants' decision to accept prenatal CMA; these responses were analyzed qualitatively. Data analysis on 106 participants indicated that 49% made an informed decision (i.e., meeting all three criteria of knowledgeable, deliberated, and value-consistent). Analysis of 59 responses to the open-ended question showed that "the more information the better" emerged as the dominant factor influencing both informed and uninformed participants' decisions to accept prenatal CMA. Despite learning about the key issues in pre-test genetic counseling, our study classified a significant portion of women as making uninformed decisions due to insufficient knowledge, lack of deliberation, value-inconsistency, or a combination of these three measures. Future efforts should focus on developing educational approaches and counseling strategies to effectively increase the rate of informed decision-making among women offered prenatal CMA.

  19. Assessing biofiltration repeatability: statistical comparison of two identical toluene removal systems.

    PubMed

    Jiménez, Lucero; Arriaga, Sonia; Aizpuru, Aitor

    2016-01-01

    Biofiltration of volatile organic compounds is still considered an emerging technology. Its reliability remains questionable as no data is available regarding process intrinsic repeatability. Herein, two identically operated toluene biofiltration systems are comprehensively compared, during long-term operation (129 days). Globally, reactors responded very similarly, even during transient conditions, with, for example, strong biological activities from the first days of operation, and comparable periods of lower removal efficiency (81.2%) after exposure to high inlet loads (140 g m(-3) h(-1)). Regarding steady states, very similar maximum elimination capacities up to 99 g m(-3) h(-1) were attained. Estimation of the process repeatability, with the paired samples Student's t-test, indicated no statistically significant difference between elimination capacities. Repeatability was also established for several descriptors of the process such as the carbon dioxide and biomass production, the pH and organic content of the leachates, and the moisture content of the packing material. While some parameters, such as the pH, presented a remarkably low divergence between biofilters (coefficient of variability of 1.4%), others, such as the organic content of the leachates, presented higher variability (30.6%) due to an uneven biomass lixiviation associated with stochastic hydrodynamics and biomass repartitions. Regarding process efficiency, it was established that less than 10% of fluctuation is to be expected between the elimination capacities of identical biofilter set-ups. A further statistical comparison between the first halves of the biofilter columns indicated very similar coefficients of variability, confirming the repeatability of the process, for different biofilter lengths.
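The paired samples Student's t-test used to compare the two reactors can be sketched with SciPy on synthetic elimination-capacity data (the values are illustrative, not the study's measurements):

```python
# Repeatability check sketch: paired-sample t-test on elimination
# capacities measured on two identically operated biofilters.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
ec_biofilter_a = rng.normal(90, 5, 20)                  # g m^-3 h^-1
ec_biofilter_b = ec_biofilter_a + rng.normal(0, 2, 20)  # small divergence

t_stat, p_value = stats.ttest_rel(ec_biofilter_a, ec_biofilter_b)
# A large p-value (> 0.05) would indicate no statistically significant
# difference between the reactors, as the study reports.
```

The pairing matters here: both reactors are sampled on the same days, so the test compares within-day differences rather than pooling all measurements.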

  20. Malaria Elimination Campaigns in the Lake Kariba Region of Zambia: A Spatial Dynamical Model

    PubMed Central

    Nikolov, Milen; Bever, Caitlin A.; Upfill-Brown, Alexander; Hamainza, Busiku; Miller, John M.; Eckhoff, Philip A.; Wenger, Edward A.; Gerardin, Jaline

    2016-01-01

    As more regions approach malaria elimination, understanding how different interventions interact to reduce transmission becomes critical. The Lake Kariba area of Southern Province, Zambia, is part of a multi-country elimination effort and presents a particular challenge as it is an interconnected region of variable transmission intensities. In 2012–13, six rounds of mass test-and-treat drug campaigns were carried out in the Lake Kariba region. A spatial dynamical model of malaria transmission in the Lake Kariba area, with transmission and climate modeled at the village scale, was calibrated to the 2012–13 prevalence survey data, with case management rates, insecticide-treated net usage, and drug campaign coverage informed by surveillance. The model captured the spatio-temporal trends of decline and rebound in malaria prevalence in 2012–13 at the village scale. Various interventions implemented between 2016–22 were simulated to compare their effects on reducing regional transmission and achieving and maintaining elimination through 2030. Simulations predict that elimination requires sustaining high coverage with vector control over several years. When vector control measures are well-implemented, targeted mass drug campaigns in high-burden areas further increase the likelihood of elimination, although drug campaigns cannot compensate for insufficient vector control. If infections are regularly imported from outside the region into highly receptive areas, vector control must be maintained within the region until importations cease. Elimination in the Lake Kariba region is possible, although human movement both within and from outside the region risk damaging the success of elimination programs. PMID:27880764

  1. Design of an optimized biomixture for the degradation of carbofuran based on pesticide removal and toxicity reduction of the matrix.

    PubMed

    Chin-Pampillo, Juan Salvador; Ruiz-Hidalgo, Karla; Masís-Mora, Mario; Carazo-Rojas, Elizabeth; Rodríguez-Rodríguez, Carlos E

    2015-12-01

    Pesticide biopurification systems contain a biologically active matrix (biomixture) responsible for the accelerated elimination of pesticides in wastewaters derived from pest control in crop fields. Biomixtures have been typically prepared using the volumetric composition 50:25:25 (lignocellulosic substrate/humic component/soil); nonetheless, formal composition optimization has not been performed so far. Carbofuran is an insecticide/nematicide of high toxicity widely employed in developing countries. Therefore, the composition of a highly efficient biomixture (composed of coconut fiber, compost, and soil, FCS) for the removal of carbofuran was optimized by means of a central composite design and response surface methodology. The volumetric content of soil and the coconut fiber/compost ratio were used as the design variables. The performance of the biomixture was assayed by considering the elimination of carbofuran, the mineralization of (14)C-carbofuran, and the residual toxicity of the matrix as response variables. Based on the models, the optimal volumetric composition of the FCS biomixture consists of 45:13:42 (coconut fiber/compost/soil), which resulted in minimal residual toxicity and ∼99% carbofuran elimination after 3 days. This optimized biomixture differs considerably from the standard 50:25:25 composition, which underscores the importance of assessing the performance of newly developed biomixtures during the design of biopurification systems.
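A response-surface optimization of this kind can be sketched by fitting a quadratic model in the two design variables and locating its optimum on a grid; the surface below is synthetic and only illustrates the mechanics, not the study's measurements:

```python
# Illustrative RSM-style optimization: fit a quadratic model of removal
# efficiency vs. two design variables, then pick the grid maximizer.
import numpy as np

rng = np.random.default_rng(3)
soil = rng.uniform(0.2, 0.5, 25)    # soil volume fraction (hypothetical)
ratio = rng.uniform(0.5, 4.0, 25)   # fiber/compost ratio (hypothetical)
y = 99 - 200*(soil - 0.42)**2 - 3*(ratio - 3.5)**2 + rng.normal(0, 0.2, 25)

X = np.column_stack([np.ones(25), soil, ratio,
                     soil**2, ratio**2, soil*ratio])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate the fitted surface on a grid and locate its maximum.
s, r = np.meshgrid(np.linspace(0.2, 0.5, 61), np.linspace(0.5, 4.0, 71))
grid = np.column_stack([np.ones(s.size), s.ravel(), r.ravel(),
                        s.ravel()**2, r.ravel()**2, (s*r).ravel()])
best = int(np.argmax(grid @ beta))
best_soil, best_ratio = s.ravel()[best], r.ravel()[best]
```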

  2. Random chromosome elimination in synthetic Triticum-Aegilops amphiploids leads to development of a stable partial amphiploid with high grain micro- and macronutrient content and powdery mildew resistance.

    PubMed

    Tiwari, Vijay K; Rawat, Nidhi; Neelam, Kumari; Kumar, Sundip; Randhawa, Gursharn S; Dhaliwal, Harcharan S

    2010-12-01

    Synthetic amphiploids are the immortal sources for studies on crop evolution, genome dissection, and introgression of useful variability from related species. Cytological analysis of synthetic decaploid wheat (Triticum aestivum L.) - Aegilops kotschyi Boiss. amphiploids (AABBDDUkUkSkSk) showed some univalents from the C1 generation onward followed by chromosome elimination. Most of the univalents came to metaphase I plate after the reductional division of paired chromosomes and underwent equational division leading to their elimination through laggards and micronuclei. Substantial variation in the chromosome number of pollen mother cells from different tillers, spikelets, and anthers of some plants also indicated somatic chromosome elimination. Genomic in situ hybridization, fluorescence in situ hybridization, and simple sequence repeat markers analysis of two amphiploids with reduced chromosomes indicated random chromosome elimination of various genomes with higher sensitivity of D followed by the Sk and Uk genomes to elimination, whereas 1D chromosome was preferentially eliminated in both the amphiploids investigated. One of the partial amphiploids, C4 T. aestivum 'Chinese Spring' - Ae. kotschyi 396 (2n = 58), with 34 T. aestivum, 14 Uk, and 10 Sk had stable meiosis and high fertility. The partial amphiploids with white glumes, bold seeds, and tough rachis with high grain macro- and micronutrients and resistance to powdery mildew could be used for T. aestivum biofortification and transfer of powdery mildew resistance.

  3. A quasi-Newton approach to optimization problems with probability density constraints. [problem solving in mathematical programming]

    NASA Technical Reports Server (NTRS)

    Tapia, R. A.; Vanrooy, D. L.

    1976-01-01

    A quasi-Newton method is presented for minimizing a nonlinear function while constraining the variables to be nonnegative and sum to one. The nonnegativity constraints were eliminated by working with the squares of the variables and the resulting problem was solved using Tapia's general theory of quasi-Newton methods for constrained optimization. A user's guide for a computer program implementing this algorithm is provided.
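    A minimal sketch of the squared-variable idea follows. Two simplifications relative to the paper are assumed: the sum-to-one constraint is absorbed by normalizing the squared variables rather than treated via Tapia's constrained quasi-Newton theory, and a finite-difference gradient descent stands in for the quasi-Newton update:

```python
import numpy as np

def to_simplex(y):
    """x_i = y_i^2 / sum_j y_j^2: squaring removes the nonnegativity
    constraints, and the normalization (a simplification used here)
    enforces sum(x) = 1."""
    s = y ** 2
    return s / s.sum()

def minimize_on_simplex(f, y0, lr=0.1, steps=3000, h=1e-6):
    """Gradient descent in the unconstrained y variables with a
    forward-difference gradient (an illustrative stand-in for quasi-Newton)."""
    y = np.asarray(y0, dtype=float)
    for _ in range(steps):
        fy = f(to_simplex(y))
        g = np.zeros_like(y)
        for i in range(y.size):
            yp = y.copy()
            yp[i] += h
            g[i] = (f(to_simplex(yp)) - fy) / h
        y -= lr * g
    return to_simplex(y)

# Minimize the distance to a point of the probability simplex; every iterate
# satisfies x >= 0 and sum(x) = 1 by construction.
target = np.array([0.5, 0.3, 0.2])
x_opt = minimize_on_simplex(lambda x: ((x - target) ** 2).sum(),
                            y0=[1.0, 1.0, 1.0])
```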

  4. Statokinesigram normalization method.

    PubMed

    de Oliveira, José Magalhães

    2017-02-01

    Stabilometry is a technique that aims to study the body sway of human subjects, employing a force platform. The signal obtained from this technique refers to the position of the foot base ground-reaction vector, known as the center of pressure (CoP). The parameters calculated from the signal are used to quantify the displacement of the CoP over time; there is a large variability, both between and within subjects, which prevents the definition of normative values. The intersubject variability is related to differences between subjects in terms of their anthropometry, in conjunction with their muscle activation patterns (biomechanics); and the intrasubject variability can be caused by a learning effect or fatigue. Age and foot placement on the platform are also known to influence variability. Normalization is the main method used to decrease this variability and to bring distributions of adjusted values into alignment. In 1996, O'Malley proposed three normalization techniques to eliminate the effect of age and anthropometric factors from temporal-distance parameters of gait. These techniques were adopted to normalize the stabilometric signal by some authors. This paper proposes a new method of normalization of stabilometric signals to be applied in balance studies. The method was applied to a data set collected in a previous study, and the results of normalized and nonnormalized signals were compared. The results showed that the new method, if used in a well-designed experiment, can eliminate undesirable correlations between the analyzed parameters and the subjects' characteristics and show only the experimental conditions' effects.
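    One regression-based normalization of the kind referenced here (dividing a parameter by the value predicted from an anthropometric covariate) can be sketched as follows; the data, the choice of covariate, and the linear model are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a sway parameter that scales linearly with subject height.
height = rng.uniform(1.5, 2.0, 200)                 # m
sway = 3.0 * height + rng.normal(0.0, 0.05, 200)    # raw CoP parameter, a.u.

# Divide each value by the regression prediction for that subject, so the
# normalized parameter is decorrelated from the anthropometric covariate.
slope, intercept = np.polyfit(height, sway, 1)
sway_norm = sway / (slope * height + intercept)

r_raw = np.corrcoef(height, sway)[0, 1]        # strong built-in correlation
r_norm = np.corrcoef(height, sway_norm)[0, 1]  # near zero after normalization
```

    In a well-designed experiment the same check (correlation of the normalized parameter against each subject characteristic) is what verifies that only the experimental conditions' effects remain.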

  5. Effects of sunscreen on skin cancer and photoaging.

    PubMed

    Iannacone, Michelle R; Hughes, Maria Celia B; Green, Adèle C

    2014-01-01

    Application of sunscreen to the skin is widely used as an adjunct strategy, along with wearing protective clothing and seeking shade, to protect against skin cancer and photoaging that result from excessive sun exposure. Many epidemiological studies of case-control and cohort design have examined the effects of sunscreen use on skin cancer, and more recently photoaging, but their findings have been mostly uninformative. This review of results of randomized controlled trials shows that the evidence, though limited, supports beneficial effects of sunscreen application on the occurrence of skin cancers and skin photoaging. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Predicting population survival under future climate change: density dependence, drought and extraction in an insular bighorn sheep.

    PubMed

    Colchero, Fernando; Medellin, Rodrigo A; Clark, James S; Lee, Raymond; Katul, Gabriel G

    2009-05-01

    1. Our understanding of the interplay between density dependence, climatic perturbations, and conservation practices on the dynamics of small populations is still limited. This can result in uninformed strategies that put endangered populations at risk. Moreover, the data available for a large number of populations in such circumstances are sparse and mined with missing data. Under the current climate change scenarios, it is essential to develop appropriate inferential methods that can make use of such data sets. 2. We studied a population of desert bighorn sheep introduced to Tiburon Island, Mexico in 1975 and subjected to irregular extractions for the last 10 years. The unique attributes of this population are absence of predation and disease, thereby permitting us to explore the combined effect of density dependence, environmental variability and extraction in a 'controlled setting.' Using a combination of nonlinear discrete models with long-term field data, we constructed three basic Bayesian state space models with increasing density dependence (DD), and the same three models with the addition of summer drought effects. 3. We subsequently used Monte Carlo simulations to evaluate the combined effect of drought, DD, and increasing extractions on the probability of population survival under two climate change scenarios (based on the Intergovernmental Panel on Climate Change predictions): (i) increase in drought variability; and (ii) increase in mean drought severity. 4. The population grew from 16 individuals introduced in 1975 to close to 700 by 1993. Our results show that the population's growth was dominated by DD, with drought having a secondary but still relevant effect on its dynamics. 5. Our predictions suggest that under climate change scenario (i), extraction dominates the fate of the population, while for scenario (ii), an increase in mean drought affects the population's probability of survival in an equivalent magnitude as extractions. Thus, for the long-term survival of the population, our results stress that a more variable environment is less threatening than one in which the mean conditions become harsher. Current climate change scenarios and their underlying uncertainty make studies such as this one crucial for understanding the dynamics of ungulate populations and their conservation.

  7. Population pharmacokinetic study of memantine: effects of clinical and genetic factors.

    PubMed

    Noetzli, Muriel; Guidi, Monia; Ebbing, Karsten; Eyer, Stephan; Wilhelm, Laurence; Michon, Agnès; Thomazic, Valérie; Alnawaqil, Abdel-Messieh; Maurer, Sophie; Zumbach, Serge; Giannakopoulos, Panteleimon; von Gunten, Armin; Csajka, Chantal; Eap, Chin B

    2013-03-01

    Memantine, a frequently prescribed anti-dementia drug, is mainly eliminated unchanged by the kidneys, partly via tubular secretion. Considerable inter-individual variability in plasma concentrations has been reported. We aimed to investigate clinical and genetic factors influencing memantine disposition. A population pharmacokinetic study was performed including data from 108 patients recruited in a naturalistic setting. Patients were genotyped for common polymorphisms in renal cation transporters (SLC22A1/2/5, SLC47A1, ABCB1) and nuclear receptors (NR1I2, NR1I3, RXR, PPAR) involved in transporter expression. The average clearance was 5.2 L/h with a 27 % inter-individual variability (percentage coefficient of variation). Glomerular filtration rate (p = 0.007) and sex (p = 0.001) markedly influenced memantine clearance. NR1I2 rs1523130 was identified as the unique significant genetic covariate for memantine clearance (p = 0.006), with carriers of the NR1I2 rs1523130 CT/TT genotypes presenting a 16 % slower memantine elimination than carriers of the CC genotype. The better understanding of inter-individual variability of memantine disposition might be beneficial in the context of individual dose optimization.

  8. Thrust control system design of ducted rockets

    NASA Astrophysics Data System (ADS)

    Chang, Juntao; Li, Bin; Bao, Wen; Niu, Wenyu; Yu, Daren

    2011-07-01

    The investigation of the thrust control system is motivated by the propulsion requirements of ducted rockets. First, the dynamic mathematical models of the gas flow regulating system, the pneumatic servo system and the ducted rocket engine were established and analyzed. Then, to overcome the problems of thrust control discussed, the idea of information fusion was proposed to construct a new feedback variable. With this fused feedback variable, the thrust control system was designed. According to the simulation results, the introduction of the new fused feedback variable is effective in eliminating the contradiction between rapid response and stability for the thrust control system of ducted rockets.

  9. The Ultrasensitivity of Living Polymers

    NASA Astrophysics Data System (ADS)

    O'Shaughnessy, Ben; Vavylonis, Dimitrios

    2003-03-01

    Synthetic and biological living polymers are self-assembling chains whose chain length distributions (CLDs) are dynamic. We show these dynamics are ultrasensitive: Even a small perturbation (e.g., temperature jump) nonlinearly distorts the CLD, eliminating or massively augmenting short chains. The origin is fast relaxation of mass variables (mean chain length, monomer concentration) which perturbs CLD shape variables before these can relax via slow chain growth rate fluctuations. Viscosity relaxation predictions agree with experiments on the best-studied synthetic system, α-methylstyrene.

  10. Novel formulation of abiraterone acetate might allow significant dose reduction and eliminates substantial positive food effect.

    PubMed

    Solymosi, Tamás; Ötvös, Zsolt; Angi, Réka; Ordasi, Betti; Jordán, Tamás; Molnár, László; McDermott, John; Zann, Vanessa; Church, Ann; Mair, Stuart; Filipcsei, Genovéva; Heltovics, Gábor; Glavinas, Hristos

    2017-10-01

    Zytiga (abiraterone acetate, AA) is known to exhibit very low bioavailability and a significant positive food effect in men. The unfavorable pharmacokinetic properties are attributed to the inadequate and variable dissolution of the compound. Using a continuous flow precipitation technology, a novel AA formulation has been developed with improved solubility and dissolution characteristics. The current study was performed to evaluate the pharmacokinetics and safety of this novel formulation in healthy volunteers. The study was conducted in 11 healthy men aged 47-57 years. All subjects received 3 consecutive single doses of the novel formulation of AA (100 and 200 mg in the fasted state and 200 mg in the fed state). Data were compared with pharmacokinetic and safety data reported for 1000 mg Zytiga, the marketed drug. The novel formulation of AA allows rapid absorption of the compound with t max values within 1 hour. Based on AUC values, a ~250 mg dose of the novel formulation is predicted to give the same exposure as 1000 mg Zytiga in the fasted state. The significant positive food effect was also eliminated; actually, a slight, but statistically significant negative food effect was observed. Variability of exposure was significantly reduced when compared to Zytiga. AA administered in the novel formulation was well tolerated with no IMP-related safety AEs reported. The novel formulation might allow a 75% dose reduction with significant reduction of inter-individual variability. The negative food effect observed requires further investigations; however, elimination of the significant positive food effect could be adequate to negate the restriction of a food label.

  11. Review of four publications on the Danish cohort study on mobile phone subscribers and risk of brain tumors.

    PubMed

    Söderqvist, Fredrik; Carlberg, Michael; Hardell, Lennart

    2012-01-01

    Since the International Agency for Research on Cancer recently classified radiofrequency electromagnetic fields, such as those emanating from mobile and cordless phones, as possibly carcinogenic to humans (group 2B), two additional reports relevant to the topic have been published. Both articles were new updates of a Danish cohort on mobile phone subscribers and concern the possible association between assumed use of mobile phones and risk of brain tumors. The aim of the present review is to reexamine all four publications on this cohort. In brief, publications were scrutinized, and in particular, if the authors made explicit claims to have either proved or disproved their hypothesis, such claims were reviewed in light of the applied methods and study design; in principle, the stronger the claims, the more careful our review. The nationwide Danish cohort study on mobile phone subscribers and risk of brain tumors, including at best 420,095 persons (58% of the initial cohort), is the only one of its kind. In comparison with previous investigations, i.e., case-control studies, its strength lies in the possibility to eliminate non-response, selection, and recall bias. Although at least non-response and recall bias can be excluded, the study has serious limitations related to exposure assessment. In fact, these limitations cloud the findings of the four reports to such an extent that they are uninformative at best. At worst, they may be used in a seemingly solid argument against an increased risk--as reassuring results from a large nationwide cohort study, which rules out not only non-response and recall bias but also an increased risk as indicated by tight confidence intervals. Although two of the most comprehensive case-control studies on the matter both have methodological limitations that need to be carefully considered, type I errors are not the only threats to the validity of studies on this topic--the Danish cohort study is a textbook example of that.

  12. Cost Effectiveness of Malaria Interventions from Preelimination through Elimination: a Study in Iran.

    PubMed

    Rezaei-Hemami, Mohsen; Akbari-Sari, Ali; Raiesi, Ahmad; Vatandoost, Hassan; Majdzadeh, Reza

    2014-01-01

    Malaria is still considered a public health problem in Iran. The aim of the National Malaria Control Department is to reach elimination by 2024. As the number of malaria cases decreases in the preelimination phase, the cost effectiveness of malaria interventions decreases considerably. This study estimated the cost effectiveness of various strategies to combat malaria in the preelimination and elimination phases in Iran. The running costs of the interventions at each level of intervention were estimated using evidence and expert opinions. The effect of each intervention was estimated using the documentary evidence available and expert opinions. Using a point estimate and a distribution for each variable, sensitivity was evaluated with the Monte Carlo method. The most cost-effective interventions were insecticide-treated nets (ITN), larviciding, surveillance for diagnosis and treatment of patients within 24 hours, and indoor residual spraying (IRS), respectively. No evidence was found for the effectiveness of the border facilities. This study showed that interventions in the elimination phase of malaria have low cost effectiveness in Iran, as in many other countries. However, ITN is the most cost-effective intervention among those available.
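    The probabilistic sensitivity step described here (sampling each uncertain variable from a distribution and propagating it by Monte Carlo) can be sketched for a single hypothetical intervention; the distributions and units below are invented for illustration and are not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical inputs for one intervention (say, ITN distribution):
# cost per person protected (USD) and cases averted per 1000 protected.
cost = rng.normal(5.0, 1.0, n).clip(min=0.1)
cases_averted = rng.normal(12.0, 3.0, n).clip(min=0.1)

# Cost-effectiveness ratio for each simulated draw (USD per case averted).
cer = cost * 1000.0 / cases_averted

median = np.median(cer)
lo, hi = np.percentile(cer, [2.5, 97.5])  # 95% uncertainty interval
```

    Ranking interventions by such simulated distributions, rather than by point estimates alone, is what makes the Monte Carlo step informative when case numbers are small.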

  13. Clinical diagnostic exome evaluation for an infant with a lethal disorder: genetic diagnosis of TARP syndrome and expansion of the phenotype in a patient with a newly reported RBM10 alteration.

    PubMed

    Powis, Zöe; Hart, Alexa; Cherny, Sara; Petrik, Igor; Palmaer, Erika; Tang, Sha; Jones, Carolyn

    2017-06-02

    Diagnostic Exome Sequencing (DES) has been shown to be an effective tool for diagnosis individuals with suspected genetic conditions. We report a male infant born with multiple anomalies including bilateral dysplastic kidneys, cleft palate, bilateral talipes, and bilateral absence of thumbs and first toes. Prenatal testing including chromosome analysis and microarray did not identify a cause for the multiple congenital anomalies. Postnatal diagnostic exome studies (DES) were utilized to find a molecular diagnosis for the patient. Exome sequencing of the proband, mother, and father showed a previously unreported maternally inherited RNA binding motif protein 10 (RBM10) c.1352_1353delAG (p.E451Vfs*66) alteration. Mutations in RBM10 are associated with TARP syndrome, an X-linked recessive disorder originally described with cardinal features of talipes equinovarus, atrial septal defect, Robin sequence, and persistent left superior vena cava. DES established a molecular genetic diagnosis of TARP syndrome for a neonatal patient with a poor prognosis in whom traditional testing methods were uninformative and allowed for efficient diagnosis and future reproductive options for the parents. Other reported cases of TARP syndrome demonstrate significant variability in clinical phenotype. The reported features in this infant including multiple hemivertebrae, imperforate anus, aplasia of thumbs and first toes have not been reported in previous patients, thus expanding the clinical phenotype for this rare disorder.

  14. Visually guided auditory attention in a dynamic "cocktail-party" speech perception task: ERP evidence for age-related differences.

    PubMed

    Getzmann, Stephan; Wascher, Edmund

    2017-02-01

    Speech understanding in the presence of concurring sound is a major challenge especially for older persons. In particular, conversational turn-takings usually result in switch costs, as indicated by declined speech perception after changes in the relevant target talker. Here, we investigated whether visual cues indicating the future position of a target talker may reduce the costs of switching in younger and older adults. We employed a speech perception task, in which sequences of short words were simultaneously presented by three talkers, and analysed behavioural measures and event-related potentials (ERPs). Informative cues resulted in increased performance after a spatial change in target talker compared to uninformative cues, not indicating the future target position. Especially the older participants benefited from knowing the future target position in advance, indicated by reduced response times after informative cues. The ERP analysis revealed an overall reduced N2, and a reduced P3b to changes in the target talker location in older participants, suggesting reduced inhibitory control and context updating. On the other hand, a pronounced frontal late positive complex (f-LPC) to the informative cues indicated increased allocation of attentional resources to changes in target talker in the older group, in line with the decline-compensation hypothesis. Thus, knowing where to listen has the potential to compensate for age-related decline in attentional switching in a highly variable cocktail-party environment. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. STUDY OF THE SUBARCTIC HEAT ISLAND AT FAIRBANKS, ALASKA

    EPA Science Inventory

    The heat island associated with the City of Fairbanks, Alaska was studied as a means of isolating the effects of self-heating modified radiative transfer from other causes of heat islands. Minimal winter insolation virtually eliminated the effects of variable albedo and the daily...

  16. Strong diffusion formulation of Markov chain ensembles and its optimal weaker reductions

    NASA Astrophysics Data System (ADS)

    Güler, Marifi

    2017-10-01

    Two self-contained diffusion formulations, in the form of coupled stochastic differential equations, are developed for the temporal evolution of state densities over an ensemble of Markov chains evolving independently under a common transition rate matrix. Our first formulation derives from Kurtz's strong approximation theorem of density-dependent Markov jump processes [Stoch. Process. Their Appl. 6, 223 (1978), 10.1016/0304-4149(78)90020-0] and, therefore, strongly converges with an error bound of the order of lnN /N for ensemble size N . The second formulation eliminates some fluctuation variables, and correspondingly some noise terms, within the governing equations of the strong formulation, with the objective of achieving a simpler analytic formulation and a faster computation algorithm when the transition rates are constant or slowly varying. There, the reduction of the structural complexity is optimal in the sense that the elimination of any given set of variables takes place with the lowest attainable increase in the error bound. The resultant formulations are supported by numerical simulations.
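    For the simplest possible ensemble (N independent two-state chains with constant rates a for 0 to 1 and b for 1 to 0), a diffusion of this general form for the fraction p of chains in state 1 can be integrated with Euler-Maruyama. This toy sketch only illustrates the idea of a self-contained stochastic differential equation for a state density; the rates, step sizes, and boundary clamping are arbitrary choices, not the paper's formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

a, b, N = 2.0, 1.0, 10_000   # transition rates and ensemble size
dt, steps = 0.01, 5_000
p = 0.0                      # fraction of chains in state 1

for _ in range(steps):
    drift = a * (1.0 - p) - b * p                             # mean-field flow
    diffusion = np.sqrt(max(a * (1.0 - p) + b * p, 0.0) / N)  # O(1/sqrt(N)) noise
    p += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
    p = min(max(p, 0.0), 1.0)                                 # keep density in [0, 1]

stationary = a / (a + b)  # exact stationary mean fraction (2/3 here)
```

    With large N the trajectory hugs the deterministic rate equation, which is the sense in which the strong formulation converges as its error bound of order ln N / N shrinks.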

  17. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
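    The basic two-variable partial covariance that the threefold analysis extends can be sketched directly: subtracting the part of cov(x, y) explained by a common fluctuating parameter removes the false correlation when both signals scale linearly with it. The simulated pulse-intensity scenario below is an illustrative assumption:

```python
import numpy as np

def partial_covariance(x, y, i):
    """cov(x, y) - cov(x, i) * cov(i, y) / var(i): the two-variable partial
    covariance with respect to a fluctuating experimental parameter i."""
    c = np.cov(np.vstack([x, y, i]))
    return c[0, 1] - c[0, 2] * c[2, 1] / c[2, 2]

rng = np.random.default_rng(7)
n = 50_000

# A shot-to-shot fluctuating parameter (e.g. pulse intensity) scales two
# otherwise independent Poisson signals, inducing a false correlation.
intensity = rng.uniform(0.5, 1.5, n)
x = intensity * rng.poisson(5.0, n)
y = intensity * rng.poisson(3.0, n)

raw = np.cov(x, y)[0, 1]                         # large false covariance
corrected = partial_covariance(x, y, intensity)  # close to zero
```

    When the signal dependence on the unstable parameter is nonlinear, this linear subtraction no longer suffices, which is the situation the contingent covariance technique in the paper addresses.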

  18. [Variability patterns of nest construction, physiological state, and morphometric traits in honey bee].

    PubMed

    Es'kov, E K; Es'kova, M D

    2014-01-01

    High variability of cell size is used selectively for reproduction of worker bees and drones. A decrease in both the distance between cells and cell size itself causes similar effects on the body mass and morphometric traits of developing individuals. Adaptation of honey bees to living in shelters has led to their becoming tolerant to hypoxia. Improvement of ethological and physiological mechanisms of thermal regulation is associated with a limitation of ecological valence and the acquisition of stenothermic features by the brood. Optimal thermal conditions for the brood are limited to the interval 33-34.5 degrees C. Temperature deviations of 3-4 degrees C beyond this range have a minimal lethal effect at the embryonic stage of development and a medium effect at the pre-pupal and pupal stages. Development at the lower bound of the vital range leads to an increase, while development at the upper bound leads to a decrease, in the mass of the body, the mandibular and hypopharyngeal glands, and other organs, which later affects the variability of these traits during the adult stage of development. The eliminative and teratogenic effects of ecological factors on the brood are most often manifested in underdevelopment of the wings. However, their size (in the case of wing lamina formation) is characterized by relatively low variability and size-dependent asymmetry. Variability in the asymmetry of wings and other paired organs is expressed through a shift of the size excess from the right to the left side as their size increases. Selective elimination based on traits whose probability of appearance increases as developmental conditions deviate from the optimum restricts individual variability. Physiological mechanisms that enhance adaptability under increasing anthropogenic contamination of the environment and of the trophic substrates consumed by honey bees appear to be the accumulation of toxicants in the rectum and the crop's ability to absorb contaminants from nectar in the course of its processing into honey.

  19. A proposed national strategy for tuberculosis vaccine development.

    PubMed

    Ginsberg, A M

    2000-06-01

    The global tuberculosis epidemic causes approximately 5% of deaths worldwide. Despite recent concerted and largely successful tuberculosis control efforts, the incidence of tuberculosis in the United States remains 74-fold higher than the stated elimination goal of <1 case per million population by the year 2010. Current bacille Calmette-Guérin vaccines, although efficacious in preventing extrapulmonary tuberculosis in young children, have shown widely variable efficacy in preventing adult pulmonary tuberculosis, confound skin test screening, and are not recommended for use in the United States. The Advisory Council for Elimination of Tuberculosis recently stated that tuberculosis would not be eliminated from the United States without a more effective vaccine. Recent scientific advances have created unprecedented opportunity for tuberculosis vaccine development. Therefore, members of the broad tuberculosis research and control communities have recently created and proposed a national strategy, or blueprint, for tuberculosis vaccine development, which is presented here.

  20. Electron dynamics and transverse-kick elimination in a high-field short-period helical microwave undulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C.; Shumail, M.; Tantawi, S.

    2012-10-15

    Single electron dynamics for a circularly polarized standing wave (CPSW) undulator synthesized from a corrugated cavity operating with a very low-loss HE{sub 11} mode are analyzed. The mechanism of the transverse drift of the CPSW undulator and its elimination are investigated, and tapered-field ends are found to effectively suppress the kick. A prototype of the CPSW undulator, characterized by a short undulating period of 1.4 cm, high field K {approx} 1, large aperture {approx} 5 cm, and variable polarization, is designed and modeled; its 3-dimensional electromagnetic fields are used to study the suppression of the transverse kick.

  1. Microcounseling Skill Discrimination Scale: A Methodological Note

    ERIC Educational Resources Information Center

    Stokes, Joseph; Romer, Daniel

    1977-01-01

    Absolute ratings on the Microcounseling Skill Discrimination Scale (MSDS) confound the individual's use of the rating scale and actual ability to discriminate effective and ineffective counselor behaviors. This note suggests methods of scoring the MSDS that will eliminate variability attributable to response language and improve the validity of…

  2. Novel and traditional traits of frozen-thawed porcine sperm related to in vitro fertilization success

    USDA-ARS?s Scientific Manuscript database

    Cryopreserved semen allows the use of single ejaculates for repeated analyses, potentially improving in vitro fertilization (IVF) consistency by eliminating inter-ejaculate variability observed with fresh semen. However, the freezing and thawing processes result in compromised sperm function and IVF...

  3. Cryptic diversity in Afro-tropical lowland forests: The systematics and biogeography of the avian genus Bleda.

    PubMed

    Huntley, Jerry W; Voelker, Gary

    2016-06-01

    Recent investigations of distributional patterns of Afro-tropical lowland forest species have demonstrated to some degree our overall lack of understanding involving historical diversification patterns. Traditionally, researchers have relied upon two hypotheses, each of which views the lowland forest of Africa in differing roles. The Pleistocene Forest Refuge Hypothesis (PFRH) posits that biogeographic patterns of avian lowland species are explained via allopatric speciation during forest fragmentation cycles in the Pleistocene epoch (c. 1.8 Ma-11.7 ka). The Montane Speciation Hypothesis (MSH) countered by suggesting that lowland forests are "evolutionary museums" where species, which originally evolved in montane forest refuge centers, remained without further diversification. Furthermore, investigations have largely regarded widespread avian species which lack phenotypic variability as biogeographically "uninformative" with regard to historical biogeographic patterns. To test the tenets of these ideas, we investigated the systematics and biogeography of the genus Bleda, whose constituent species are restricted to lowland forest and are lacking in phenotypic variation. Using extracted DNA from 179 individuals, we amplified two mitochondrial genes and three nuclear loci and utilized Bayesian phylogenetic methods and molecular clock dating to develop a time-calibrated phylogeny of Bleda. We used LaGrange to develop an ancestral area reconstruction for the genus. Haplotype networks for three species were generated using Network. We recovered the four currently recognized species of Bleda, plus a monophyletic B. ugandae, a current sub-species which may warrant full species status. We found that the origins of the genus Bleda are estimated to be in the Upper Guinean forests of West Africa, dating to the Miocene (c. 7.5 Ma), while the speciation events for the rest of the genus are dated to the Pliocene (c. 5-1.8 Ma). Our analyses recovered discrete and highly differentiated geographic structuring of genetic diversity in West and Central Africa in three of five species, with many of the diversification events dating to the Pleistocene. The biogeographic patterns observed in Bleda can be explained through a combination of isolation via forest refuges during the Plio-Pleistocene and riverine barriers limiting secondary contact after forest expansion. We find evidence for the PFRH as a driver of intra-specific diversity, but conclude that it does not facilitate an explanation for speciation in the genus Bleda. The "evolutionary museum" concept furnished by the MSH is countered by our evidence of in situ diversification in the lowland forests of Africa. Additionally, our results provide strong evidence of the value of seemingly "uninformative" widespread avian taxa for revealing complex patterns of forest diversity. Overall, our study highlights that past researchers have both underestimated the amount of diversity found in lowland forests and failed to understand the complexity of historical forces shaping that diversity. Gaining a better understanding of lowland forest diversity and the historical factors which have shaped it will be crucial in determining conservation tactics in the near future. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. ELIMINATION OF THE CHARACTERIZATION OF DWPF POUR STREAM SAMPLE AND THE GLASS FABRICATION AND TESTING OF THE DWPF SLUDGE BATCH QUALIFICATION SAMPLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amoroso, J.; Peeler, D.; Edwards, T.

    2012-05-11

    A recommendation to eliminate all characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification sample was made by a Six-Sigma team chartered to eliminate non-value-added activities for the Defense Waste Processing Facility (DWPF) sludge batch qualification program and is documented in the report SS-PIP-2006-00030. That recommendation was supported through a technical data review by the Savannah River National Laboratory (SRNL) and is documented in the memorandums SRNL-PSE-2007-00079 and SRNL-PSE-2007-00080. At the time of writing those memorandums, the DWPF was processing sludge-only waste but, has since transitioned to a coupledmore » operation (sludge and salt). The SRNL was recently tasked to perform a similar data review relevant to coupled operations and re-evaluate the previous recommendations. This report evaluates the validity of eliminating the characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification samples based on sludge-only and coupled operations. The pour stream sample has confirmed the DWPF's ability to produce an acceptable waste form from Slurry Mix Evaporator (SME) blending and product composition/durability predictions for the previous sixteen years but, ultimately the pour stream analysis has added minimal value to the DWPF's waste qualification strategy. Similarly, the information gained from the glass fabrication and PCT of the sludge batch qualification sample was determined to add minimal value to the waste qualification strategy since that sample is routinely not representative of the waste composition ultimately processed at the DWPF due to blending and salt processing considerations. Moreover, the qualification process has repeatedly confirmed minimal differences in glass behavior from actual radioactive waste to glasses fabricated from simulants or batch chemicals. 
In contrast, the variability study has significantly added value to the DWPF's qualification strategy. The variability study has evolved to become the primary aspect of the DWPF's compliance strategy as it has been shown to be versatile and capable of adapting to the DWPF's various and diverse waste streams and blending strategies. The variability study, which aims to ensure that the durability requirements and the PCT and chemical composition correlations are valid for the compositional region to be processed at the DWPF, must continue to be performed. Due to the importance of the variability study and its place in the DWPF's qualification strategy, it will also be discussed in this report. An analysis of historical data and Production Records indicated that the recommendation of the Six-Sigma team to eliminate all characterization of pour stream glass samples and the glass fabrication and PCT performed with the qualification glass does not compromise the DWPF's current compliance plan. Furthermore, the DWPF should continue to produce an acceptable waste form following the remaining elements of the Glass Product Control Program, regardless of a sludge-only or coupled operations strategy. If the DWPF does decide to eliminate the characterization of pour stream samples, pour stream samples should continue to be collected for archival reasons, which would allow testing to be performed should any issues arise or new repository test methods be developed.

  5. Misconduct accounts for the majority of retracted scientific publications

    PubMed Central

    Fang, Ferric C.; Steen, R. Grant; Casadevall, Arturo

    2012-01-01

    A detailed review of all 2,047 biomedical and life-science research articles indexed by PubMed as retracted on May 3, 2012 revealed that only 21.3% of retractions were attributable to error. In contrast, 67.4% of retractions were attributable to misconduct, including fraud or suspected fraud (43.4%), duplicate publication (14.2%), and plagiarism (9.8%). Incomplete, uninformative or misleading retraction announcements have led to a previous underestimation of the role of fraud in the ongoing retraction epidemic. The percentage of scientific articles retracted because of fraud has increased ∼10-fold since 1975. Retractions exhibit distinctive temporal and geographic patterns that may reveal underlying causes. PMID:23027971

  6. The impact of recreational video game play on children's and adolescents' cognition.

    PubMed

    Blumberg, Fran C; Altschuler, Elizabeth A; Almonte, Debby E; Mileaf, Maxwell I

    2013-01-01

    Current empirical findings show linkages between recreational video game play and enhanced cognitive skills, primarily among young adults. However, consideration of this linkage among children and adolescents is sparse. Thus, discussions about facilitating transfer of cognitive skills from video game play to academic tasks among children and adolescents remain largely uninformed by research. To inform this discussion, we review available research concerning the cognitive benefits of video game play among children and adolescents and their impressions of video games as learning tools, as these impressions may impact their application of cognitive skills used during game play to academic tasks. Copyright © 2013 Wiley Periodicals, Inc., A Wiley Company.

  7. [Evaluation of the virus-elimination efficacy of nanofiltration (Viresolve NFP) for the parvovirus B19 and hepatitis A virus].

    PubMed

    Oh, Deok Ja; Lee, Yoo La; Kang, Jae Won; Kwon, So Yong; Cho, Nam Sun; Kim, In Seop

    2010-02-01

    The safety of plasma derivatives has been reinforced since the 1980s by various pathogen inactivation and elimination techniques. Nucleic acid amplification testing (NAT) of source plasma has also been implemented worldwide. Recently, nanofiltration has been used in some countries to ensure the safety of plasma derivatives by eliminating non-enveloped viruses such as parvovirus B19 (B19V) and hepatitis A virus (HAV). We evaluated the efficacy of nanofiltration for the elimination of B19V and HAV. To verify the efficacy of nanofiltration, we adopted a 20 nm Viresolve NFP filter (Millipore, USA) in a scaled-down (1:1,370) model of antithrombin III production. As virus stock solutions, we used B19V-reactive plasma as well as porcine parvovirus (PPV) and HAV obtained from cell culture, with the 50% tissue culture infectious dose used as the infectious dose. The virus-elimination efficacy was evaluated by reverse-transcriptase polymerase chain reaction (RT-PCR) for B19V and by cytopathic effect calculation after filtration for PPV and HAV. B19V was not detected by RT-PCR in the filtered antithrombin III solutions with initial viral loads of 6.42 × 10^5 IU/mL and 1.42 × 10^5 IU/mL before filtration. The virus-elimination efficacies of nanofiltration for PPV and HAV were ≥ 3.32 and ≥ 3.31, respectively. Nanofiltration would be an effective method for the elimination of B19V and HAV. It may be used as a substitute for NAT screening of these viruses in source plasma to ensure the safety of plasma derivatives in Korea.
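    Viral clearance figures like the ≥ 3.32 and ≥ 3.31 quoted above are conventionally expressed as log10 reduction values (LRV): the log of the ratio of total virus load entering the step to the load recovered after it. A minimal sketch of that calculation (the numbers below are illustrative, not taken from the study):

```python
import math

def log_reduction_value(load_before: float, load_after: float) -> float:
    """Log10 reduction value: log10(virus load before / virus load after)."""
    if load_after <= 0:
        raise ValueError("undetectable output: report LRV as >= the assay limit")
    return math.log10(load_before / load_after)

# Illustrative numbers: input load 2.0e6 infectious units, output 5.0e2.
lrv = log_reduction_value(2.0e6, 5.0e2)
print(f"LRV = {lrv:.2f}")
```

When the filtrate titer falls below the assay's detection limit, the detection limit is substituted for the output load and the result is reported as a "greater than or equal to" value, which is presumably why the abstract quotes the efficacies with ≥ signs.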

  8. Motion of the angular momentum vector in body coordinates for torque-free dual-spin spacecraft

    NASA Technical Reports Server (NTRS)

    Fedor, J. V.

    1981-01-01

    The motion of the angular momentum vector in body coordinates for torque-free, asymmetric dual-spin spacecraft without and, for a special case, with energy dissipation on the main spacecraft is investigated. Without energy dissipation, two integrals can be obtained from the Euler equations of motion. Using the classical method of elimination of variables, the motion about the equilibrium points (six for the general case) is derived from these integrals. For small nutation angle theta, the trajectories about the theta = 0 deg and theta = 180 deg points readily show the requirements for stable motion about these points. The conditions needed to eliminate stable motion about the theta = 180 deg point, as well as about the other undesirable equilibrium points, also follow directly from these equations. For the special case where the angular momentum vector moves about the principal axis that contains the momentum wheel, the notion of a 'free variable' azimuth angle is used. Physically, this angle must vary from 0 to 2 pi in a circular, periodic fashion. Expressions are thus obtained for the nutation angle in terms of the free variable and other spacecraft parameters. Results show that in general there are two separate trajectory expressions that govern the motion of the angular momentum vector in body coordinates.

  9. Gene × environment interaction studies have not properly controlled for potential confounders: the problem and the (simple) solution.

    PubMed

    Keller, Matthew C

    2014-01-01

    Candidate gene × environment (G × E) interaction research tests the hypothesis that the effects of some environmental variable (e.g., childhood maltreatment) on some outcome measure (e.g., depression) depend on a particular genetic polymorphism. Because this research is inherently nonexperimental, investigators have been rightly concerned that detected interactions could be driven by confounders (e.g., ethnicity, gender, age, socioeconomic status) rather than by the specified genetic or environmental variables per se. In an attempt to eliminate such alternative explanations for detected G × E interactions, investigators routinely enter the potential confounders as covariates in general linear models. However, this practice does not control for the effects these variables might have on the G × E interaction. Rather, to properly control for confounders, researchers need to enter the covariate × environment and the covariate × gene interaction terms in the same model that tests the G × E term. In this manuscript, I demonstrate this point analytically and show that the practice of improperly controlling for covariates is the norm in the G × E interaction literature to date. Thus, many alternative explanations for G × E findings that investigators had thought were eliminated have not been. © 2013 Society of Biological Psychiatry. All rights reserved.
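    The paper's point can be seen in a small simulation: if a confounder C both interacts with the environment and correlates with the genotype, a model that enters C only as a main-effect covariate lets the C × E effect leak into the G × E estimate, while adding the C × E and C × G terms removes the bias. A minimal sketch (all variable names and effect sizes are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Confounder C (e.g., a demographic group), genotype G correlated with C,
# environment E, and an outcome driven by a C x E interaction only.
C = rng.binomial(1, 0.5, n).astype(float)
G = rng.binomial(1, np.where(C == 1, 0.7, 0.3)).astype(float)
E = rng.normal(size=n)
y = 0.5 * C * E + rng.normal(scale=0.1, size=n)

def gxe_estimate(covariate_columns):
    """OLS fit; returns the coefficient on the G x E term (last column)."""
    X = np.column_stack([np.ones(n)] + covariate_columns + [G * E])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]

# Improper control: C entered only as a main effect.
biased = gxe_estimate([G, E, C])
# Proper control: C x E and C x G interaction terms included as well.
unbiased = gxe_estimate([G, E, C, C * E, C * G])

print(f"G x E estimate, C as main effect only:   {biased:.3f}")
print(f"G x E estimate, with C x E and C x G:    {unbiased:.3f}")
```

The true G × E effect here is zero, yet the improperly controlled model reports a clearly nonzero interaction; including the covariate interaction terms restores an estimate near zero.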

  10. Classical versus Computer Algebra Methods in Elementary Geometry

    ERIC Educational Resources Information Center

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra like Groebner bases of ideals and elimination of variables make it possible to solve complex, elementary and non-elementary problems of geometry, which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…
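    A standard textbook instance of elimination of variables via Groebner bases (not taken from the article; shown only to make the technique concrete): to find the implicit equation of the parametric curve x = t^2, y = t^3, compute a Groebner basis of the defining ideal with a lexicographic order ranking t highest; the basis elements free of t generate the elimination ideal, i.e. the implicit equation.

```latex
% Eliminating the parameter t from x = t^2, y = t^3.
% With lex order t > x > y, the elimination ideal is generated by the
% basis elements not involving t.
I = \langle\, x - t^2,\; y - t^3 \,\rangle \subset k[t, x, y],
\qquad
I \cap k[x, y] = \langle\, y^2 - x^3 \,\rangle .
```

The surviving generator y^2 - x^3 = 0 is exactly the implicit equation of the cuspidal cubic traced by the parametrization.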

  11. The Iodochlorination of Styrene: An Experiment that Makes a Difference

    ERIC Educational Resources Information Center

    Amiet, R. Gary; Urban, Sylvia

    2008-01-01

    The iodochlorination of styrene, involving the addition of iodine monochloride to styrene, followed by the sodium methoxide-initiated dehydrohalogenation of the product, results in a variable mixture of substituted styrenes by way of various substitution and elimination reaction mechanisms. As a result, individual results are obtained for each…

  12. Cost Effectiveness of Malaria Interventions from Preelimination through Elimination: a Study in Iran

    PubMed Central

    Rezaei-Hemami, Mohsen; Akbari-Sari, Ali; Raiesi, Ahmad; Vatandoost, Hassan; Majdzadeh, Reza

    2014-01-01

    Background: Malaria is still considered a public health problem in Iran. The aim of the National Malaria Control Department is to reach elimination by 2024. As the number of malaria cases decreases in the preelimination phase, the cost effectiveness of malaria interventions decreases considerably. This study estimated the cost effectiveness of various strategies to combat malaria in the preelimination and elimination phases in Iran. Methods: Running costs of the interventions at each level were estimated using evidence and expert opinions. The effect of each intervention was estimated using the available documentary evidence and expert opinions. Using a point estimate and distribution of each variable, sensitivity was evaluated with the Monte Carlo method. Results: The most cost-effective interventions were insecticide-treated nets (ITN), larviciding, surveillance for diagnosis and treatment of patients within 24 hours, and indoor residual spraying (IRS), respectively. No related evidence was found for the effectiveness of the border facilities. Conclusion: This study showed that interventions in the elimination phase of malaria have low cost effectiveness in Iran, as in many other countries. However, ITN is the most cost-effective of the available interventions. PMID:25629064

  13. An automated A-value measurement tool for accurate cochlear duct length estimation.

    PubMed

    Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K

    2018-01-22

    There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochlea specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High-resolution micro-CT images of the same 19 specimens were used to measure the gold-standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold-standard micro-CT A-values (r = 0.70, p < 0.01 and r = 0.69, p < 0.01, respectively). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated observer variability and improved accuracy compared to manual measurements by experts. This open-source tool has the potential to benefit cochlear implant recipients in the future.

  14. Emergent collective decision-making: Control, model and behavior

    NASA Astrophysics Data System (ADS)

    Shen, Tian

    In this dissertation we study emergent collective decision-making in social groups with time-varying interactions and heterogeneously informed individuals. First we analyze a nonlinear dynamical systems model motivated by animal collective motion with heterogeneously informed subpopulations, to examine the role of uninformed individuals. We find through formal analysis that adding uninformed individuals in a group increases the likelihood of a collective decision. Secondly, we propose a model for human shared decision-making with continuous-time feedback and where individuals have little information about the true preferences of other group members. We study model equilibria using bifurcation analysis to understand how the model predicts decisions based on the critical threshold parameters that represent an individual's tradeoff between social and environmental influences. Thirdly, we analyze continuous-time data of pairs of human subjects performing an experimental shared tracking task using our second proposed model in order to understand transient behavior and the decision-making process. We fit the model to data and show that it reproduces a wide range of human behaviors surprisingly well, suggesting that the model may have captured the mechanisms of observed behaviors. Finally, we study human behavior from a game-theoretic perspective by modeling the aforementioned tracking task as a repeated game with incomplete information. We show that the majority of the players are able to converge to playing Nash equilibrium strategies. We then suggest with simulations that the mean field evolution of strategies in the population resembles replicator dynamics, indicating that the individual strategies may be myopic. Decisions form the basis of control, and problems involving deciding collectively between alternatives are ubiquitous in nature and in engineering. Understanding how multi-agent systems make decisions among alternatives also provides insight for designing decentralized control laws for engineering applications from mobile sensor networks for environmental monitoring to collective construction robots. With this dissertation we hope to provide additional methodology and mathematical models for understanding the behavior and control of collective decision-making in multi-agent systems.

  15. Is anterior N2 enhancement a reliable electrophysiological index of concealed information?

    PubMed

    Ganis, Giorgio; Bridges, David; Hsu, Chun-Wei; Schendan, Haline E

    2016-12-01

    Concealed information tests (CITs) are used to determine whether an individual possesses information about an item of interest. Event-related potential (ERP) measures in CITs have focused almost exclusively on the P3b component, showing that this component is larger when lying about the item of interest (probe) than telling the truth about control items (irrelevants). Recent studies have begun to examine other ERP components, such as the anterior N2, with mixed results. A seminal CIT study found that visual probes elicit a larger anterior N2 than irrelevants (Gamer and Berti, 2010) and suggested that this component indexes cognitive control processes engaged when lying about probes. However, this study did not control for potential intrinsic differences among the stimuli: the same probe and irrelevants were used for all participants, and there was no control condition composed of uninformed participants. Here, first we show that the N2 effect found in the study by Gamer and Berti (2010) was in large part due to stimulus differences, as the effect observed in a concealed information condition was comparable to that found in two matched control conditions without any concealed information (Experiments 1 and 2). Next, we addressed the issue of the generality of the N2 findings by counterbalancing a new set of stimuli across participants and by using a control condition with uninformed participants (Experiment 3). Results show that the probe did not elicit a larger anterior N2 than the irrelevants under these controlled conditions. These findings suggest that caution should be taken in using the N2 as an index of concealed information in CITs. Furthermore, they are a reminder that results of CIT studies (not only with ERPs) performed without stimulus counterbalancing and suitable control conditions may be confounded by differential intrinsic properties of the stimuli employed. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Predictive data modeling of human type II diabetes related statistics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.

    2009-04-01

    During the course of routine Type II diabetes treatment of one of the authors, it was decided to derive predictive analytical Data Models of the daily sampled vital statistics: namely weight, blood pressure, and blood sugar, to determine if the covariance among the observed variables could yield a descriptive equation-based model, or better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger stickings required to monitor blood sugar levels. The personal history and analysis with resulting models are presented.

  17. New evidence favoring multilevel decomposition and optimization

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Polignone, Debra A.

    1990-01-01

    The issue of the utility of multilevel decomposition and optimization remains controversial. To date, only the structural optimization community has actively developed and promoted multilevel optimization techniques. However, even this community acknowledges that multilevel optimization is ideally suited for a rather limited set of problems. It is warned that decomposition typically requires eliminating local variables by using global variables and that this in turn causes ill-conditioning of the multilevel optimization by adding equality constraints. The purpose is to suggest a new multilevel optimization technique. This technique uses behavior variables, in addition to design variables and constraints, to decompose the problem. The new technique removes the need for equality constraints, simplifies the decomposition of the design problem, simplifies the programming task, and improves the convergence speed of multilevel optimization compared to conventional optimization.

  18. Ratio index variables or ANCOVA? Fisher's cats revisited.

    PubMed

    Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S

    2010-01-01

    Over 60 years ago, Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. © 2009 John Wiley & Sons, Ltd.
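    The classic divergence between a ratio-index analysis and ANCOVA is easy to reproduce: when the underlying relationship has a nonzero intercept, two groups with the identical Y-on-X relationship but different X distributions show a spurious group difference in the ratio Y/X, while ANCOVA (regressing Y on X and a group indicator) correctly finds none. A minimal simulation sketch (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Same relationship y = 5 + 2x + noise in both groups; only the
# x distributions differ between groups.
x_a = rng.normal(10, 1, n)
x_b = rng.normal(15, 1, n)
y_a = 5 + 2 * x_a + rng.normal(size=n)
y_b = 5 + 2 * x_b + rng.normal(size=n)

# Ratio-index analysis: compare mean y/x between groups.
ratio_gap = (y_a / x_a).mean() - (y_b / x_b).mean()

# ANCOVA: regress y on x and a group indicator; the group coefficient
# estimates the x-adjusted group difference.
x = np.concatenate([x_a, x_b])
y = np.concatenate([y_a, y_b])
group = np.concatenate([np.zeros(n), np.ones(n)])
X = np.column_stack([np.ones(2 * n), x, group])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"difference in mean y/x between groups: {ratio_gap:.3f}")
print(f"ANCOVA group coefficient:              {beta[2]:.3f}")
```

Because y/x = 2 + 5/x, the ratio differs systematically with the group's typical x even though the groups follow the same law; the ANCOVA group coefficient stays near zero.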

  19. Genetic diversity in natural populations of a soil bacterium across a landscape gradient

    PubMed Central

    McArthur, J. Vaun; Kovacic, David A.; Smith, Michael H.

    1988-01-01

    Genetic diversity in natural populations of the bacterium Pseudomonas cepacia was surveyed in 10 enzymes from 70 clones isolated along a landscape gradient. Estimates of genetic diversity, ranging from 0.54 to 0.70, were higher than any previously reported values of which we are aware and were positively correlated with habitat variability. Patterns of bacterial genetic diversity were correlated with habitat variability. Findings indicate that the source of strains used in genetic engineering will greatly affect the outcome of planned releases in variable environments. Selection of generalist strains may confer a large advantage to engineered populations, while selection of laboratory strains may result in quick elimination of the engineered strains. PMID:16594009

  20. On Rater Agreement and Rater Training

    ERIC Educational Resources Information Center

    Wang, Binhong

    2010-01-01

    This paper first analyzes two studies on rater factors and rating criteria to raise the problem of rater agreement. The author then reveals the causes of discrepancies in rating administration by discussing rater variability and rater bias. The author argues that rater bias cannot be eliminated completely; we can only reduce the error to a…

  1. Sex-Ratio and Gender Differences in Depression in an Unselected Adult Population.

    ERIC Educational Resources Information Center

    Baumgart, E. P.; Oliver, J. M.

    1981-01-01

    Neither sex-ratio nor gender differences in depression were found in an adult sample, a pattern similar to that found among university students. No demographic variable was correlated significantly with depression. Suggests results may be due to the elimination of face-to-face interviews, which expose males to greater negative repercussions for exhibiting…

  2. USE OF HOPANE AS A CONSERVATIVE BIOMARKER FOR MONITORING THE BIOREMEDIATION EFFECTIVENESS OF CRUDE OIL CONTAMINATING A SANDY BEACH

    EPA Science Inventory

    Much of the variability inherent in crude oil bioremediation field studies can be eliminated by normalizing analyte concentrations to the concentration of a nonbiodegradable biomarker such as hopane. This was demonstrated with data from a field study in which crude oil was intent...
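    Hopane normalization is a simple computation: because hopane is not biodegraded, the ratio of an analyte to hopane tracks biodegradation while cancelling sampling variability such as patchy oiling or dilution. A minimal sketch of the percent-depletion calculation (the function name and all numbers are invented for illustration):

```python
def percent_depletion(analyte_t, hopane_t, analyte_0, hopane_0):
    """Percent of the analyte lost to biodegradation, computed from the
    hopane-normalized ratio at time t versus time 0."""
    ratio_t = analyte_t / hopane_t
    ratio_0 = analyte_0 / hopane_0
    return 100.0 * (1.0 - ratio_t / ratio_0)

# Initial oil: 50 mg analyte and 0.10 mg hopane per g oil.
# Weathered sample: 12 mg analyte and 0.08 mg hopane per g oil; the
# raw concentrations differ for sampling reasons, but the hopane
# ratio cancels that out.
print(f"{percent_depletion(12, 0.08, 50, 0.10):.1f}% depleted")
```

Comparing raw analyte concentrations here would confound biodegradation with sample-to-sample variation; the normalized ratio isolates the biodegradative loss.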

  3. Wind, Wave, and Tidal Energy Without Power Conditioning

    NASA Technical Reports Server (NTRS)

    Jones, Jack A.

    2013-01-01

    Most present wind, wave, and tidal energy systems require expensive power conditioning systems that reduce overall efficiency. This new design eliminates power conditioning all, or nearly all, of the time. Wind, wave, and tidal energy systems can transmit their energy to pumps that send high-pressure fluid to a central power production area. The central power production area can consist of a series of hydraulic generators. The hydraulic generators can be variable-displacement generators such that the RPM, and thus the voltage, remains constant, eliminating the need for further power conditioning. A series of wind blades is attached to a series of radial piston pumps, which pump fluid to a series of axial piston motors attached to generators. As the wind is reduced, the amount of energy is reduced, and the number of active hydraulic generators can be reduced to maintain a nearly constant RPM. If the axial piston motors have variable displacement, an exact RPM can be maintained for all, or nearly all, wind speeds. Analyses have been performed that show over 20% performance improvements with this technique over conventional wind turbines.

  4. Study to eliminate ground resonance using active controls

    NASA Technical Reports Server (NTRS)

    Straub, F. K.

    1984-01-01

    The effectiveness of active control blade feathering in increasing rotor body damping and the possibility to eliminate ground resonance instabilities were investigated. An analytical model representing rotor flapping and lead-lag degrees of freedom and body pitch, roll, longitudinal and lateral motion is developed. Active control blade feathering is implemented as state variable feedback through a conventional swashplate. The influence of various feedback states, feedback gain, and weighting between the cyclic controls is studied through stability and response analyses. It is shown that blade cyclic inplane motion, roll rate, and roll acceleration feedback can add considerable damping to the system and eliminate ground resonance instabilities. The feedback phase is also a powerful parameter; if chosen properly, it maximizes augmentation of the inherent regressing lag mode damping. It is shown that rotor configuration parameters, like blade root hinge offset, flapping stiffness, and precone, considerably influence the control effectiveness. It is found that active control is particularly powerful for hingeless and bearingless rotor systems.

  5. Investigation of air flow in open-throat wind tunnels

    NASA Technical Reports Server (NTRS)

    Jacobs, Eastman N

    1930-01-01

    Tests were conducted on the 6-inch wind tunnel of the National Advisory Committee for Aeronautics to form a part of a research on open-throat wind tunnels. The primary object of this part of the research was to study a type of air pulsation which has been encountered in open-throat tunnels, and to find the most satisfactory means of eliminating such pulsations. In order to do this it was necessary to study the effects of different variables on all of the important characteristics of the tunnel. This paper gives not only the results of the study of air pulsations and methods of eliminating them, but also the effects of changing the exit-cone diameter and flare and the effects of air leakage from the return passage. It was found that the air pulsations in the 6-inch wind tunnel could be practically eliminated by using a moderately large flare on the exit cone in conjunction with leakage introduced by cutting holes in the exit cone somewhat aft of its minimum diameter.

  6. EDAC: Epithelial defence against cancer-cell competition between normal and transformed epithelial cells in mammals.

    PubMed

    Kajita, Mihoko; Fujita, Yasuyuki

    2015-07-01

    During embryonic development or under certain pathological conditions, viable but suboptimal cells are often eliminated from the cellular society through a process termed cell competition. Cell competition was originally identified in Drosophila where cells with different properties compete for survival; 'loser' cells are eliminated from tissues and consequently 'winner' cells become dominant. Recent studies have shown that cell competition also occurs in mammals. While apoptotic cell death is the major fate for losers in Drosophila, outcompeted cells show more variable phenotypes in mammals, such as cell death-independent apical extrusion and cellular senescence. Molecular mechanisms underlying these processes have been recently revealed. Especially, in epithelial tissues, normal cells sense and actively eliminate the neighbouring transformed cells via cytoskeletal proteins by the process named epithelial defence against cancer (EDAC). Here, we introduce this newly emerging research field: cell competition in mammals. © The Authors 2015. Published by Oxford University Press on behalf of the Japanese Biochemical Society. All rights reserved.

  7. Retraction of Ross and LoLordo findings concerning blocking in serial feature-positive discriminations.

    PubMed

    LoLordo, V M; Ross, R T

    1990-10-01

    Findings concerning the effectiveness of stimuli from various conditioning procedures in blocking conditioned excitation and occasion-setting functions of an added stimulus in a serial feature-positive discrimination training procedure (LoLordo & Ross, 1987; Ross & LoLordo, 1986, 1987) are retracted. Videotapes on which the published data were based were rescored by 2-5 people, most of whom were uninformed about group memberships of the subjects. In no case did the rescoring confirm any of the original findings of blocking. Possible factors contributing to the discrepancies are discussed. The experiments should be repeated with feature stimuli that are less similar to each other and with several scorers, at least one of whom is unaware of the group assignment of the subjects.

  8. How Can Evolution Learn?

    PubMed

    Watson, Richard A; Szathmáry, Eörs

    2016-02-01

    The theory of evolution links random variation and selection to incremental adaptation. In a different intellectual domain, learning theory links incremental adaptation (e.g., from positive and/or negative reinforcement) to intelligent behaviour. Specifically, learning theory explains how incremental adaptation can acquire knowledge from past experience and use it to direct future behaviours toward favourable outcomes. Until recently such cognitive learning seemed irrelevant to the 'uninformed' process of evolution. In our opinion, however, new results formally linking evolutionary processes to the principles of learning might provide solutions to several evolutionary puzzles - the evolution of evolvability, the evolution of ecological organisation, and evolutionary transitions in individuality. If so, the ability for evolution to learn might explain how it produces such apparently intelligent designs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. First, do no harm: the US sexually transmitted disease experiments in Guatemala.

    PubMed

    Rodriguez, Michael A; García, Robert

    2013-12-01

    Beginning in 1946, the United States government immorally and unethically-and, arguably, illegally-engaged in research experiments in which more than 5000 uninformed and unconsenting Guatemalan people were intentionally infected with bacteria that cause sexually transmitted diseases. Many have been left untreated to the present day. Although US President Barack Obama apologized in 2010, and although the US Presidential Commission for the Study of Bioethical Issues found the Guatemalan experiments morally wrong, little if anything has been done to compensate the victims and their families. We explore the backdrop for this unethical medical research and violation of human rights and call for steps the United States should take to provide relief and compensation to Guatemala and its people.

  10. First, Do No Harm: The US Sexually Transmitted Disease Experiments in Guatemala

    PubMed Central

    García, Robert

    2013-01-01

    Beginning in 1946, the United States government immorally and unethically—and, arguably, illegally—engaged in research experiments in which more than 5000 uninformed and unconsenting Guatemalan people were intentionally infected with bacteria that cause sexually transmitted diseases. Many have been left untreated to the present day. Although US President Barack Obama apologized in 2010, and although the US Presidential Commission for the Study of Bioethical Issues found the Guatemalan experiments morally wrong, little if anything has been done to compensate the victims and their families. We explore the backdrop for this unethical medical research and violation of human rights and call for steps the United States should take to provide relief and compensation to Guatemala and its people. PMID:24134370

  11. A review of ecologic studies of lung cancer and indoor radon.

    PubMed

    Stidley, C A; Samet, J M

    1993-09-01

    Although radon exposure is an established cause of lung cancer among underground miners, the lung cancer risk to the general population from indoor radon remains controversial. This controversy stems in part from the contradictory results of published studies of indoor radon and lung cancer, including 15 ecologic studies, seven of which found a positive association, six no association, and two a negative association. To address the misunderstanding of the indoor radon risk that has resulted from these ecologic studies, the authors discuss the general methodologic problems and limitations of ecologic studies, and the particular limitations of these 15 studies. The authors conclude that the shortcomings of the ecologic studies render them uninformative on the lung cancer risk associated with indoor radon.

  12. The Cyber Security Crisis

    ScienceCinema

    Spafford, Eugene

    2018-05-11

    Despite considerable activity and attention, the overall state of information security continues to get worse. Attacks are increasing, fraud and theft are rising, and losses may exceed $100 billion per year worldwide. Many factors contribute to this, including misplaced incentives for industry, a lack of attention by government, ineffective law enforcement, and an uninformed image of who the perpetrators really are. As a result, many of the intended attempts at solutions are of limited (if any) overall effectiveness. This presentation will illustrate some key aspects of the cyber security problem and its magnitude, as well as provide some insight into causes and enabling factors. The talk will conclude with some observations on how the computing community can help improve the situation, as well as some suggestions for 'cyber self-defense.'

  13. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time-series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.

  14. Pharmacokinetics of Intravenous Finafloxacin in Healthy Volunteers

    PubMed Central

    Chiesa, Joseph; Lückermann, Mark; Fischer, Carsten; Dalhoff, Axel; Fuhr, Uwe

    2017-01-01

    Finafloxacin is a novel fluoroquinolone exhibiting enhanced activity under acidic conditions and a broad-spectrum antibacterial profile. The present study assessed the pharmacokinetic properties and the safety and tolerability of finafloxacin following intravenous infusions. In this mixed-parallel-group, crossover study, healthy male and female volunteers received single ascending doses (18 volunteers, 200 to 1,000 mg) or multiple ascending doses (40 volunteers, 600 to 1,000 mg) of finafloxacin or placebo. Plasma and urine samples were collected by a dense sampling scheme to determine the pharmacokinetics of finafloxacin using a noncompartmental approach. Standard safety and tolerability data were documented. Finafloxacin had a volume of distribution of 90 to 127 liters (range) at steady state and 446 to 550 liters at pseudoequilibrium, indicating the elimination of a large fraction before pseudoequilibrium was reached. Areas under the concentration-time curves and maximum plasma concentrations (geometric means) increased slightly more than proportionally (6.73 to 45.9 μg · h/ml and 2.56 to 20.2 μg/ml, respectively), the terminal elimination half-life increased (10.6 to 17.1 h), and the urinary recovery decreased (44.2% to 31.7%) with increasing finafloxacin doses (single doses of 200 to 1,000 mg). The pharmacokinetic profiles suggested multiphasic elimination by both glomerular filtration and saturable tubular secretion. The values of the parameters were similar for single and multiple administrations. The coefficient of variation for the between-subject variability of exposure ranged from 10% (≤600 mg) to 38% (>600 mg). Adverse events were mild and nonspecific, with no dependence of adverse events on dose or treatment (including placebo) being detected. Despite a relatively high interindividual variability at higher doses, the level of exposure following intravenous administration of finafloxacin appears to be predictable. Individual elimination processes should be evaluated in more detail. Finafloxacin exhibited a favorable safety and tolerability profile. (This study has been registered at ClinicalTrials.gov under registration no. NCT01910883.) PMID:28784673
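
    The noncompartmental quantities named above (area under the concentration-time curve, terminal elimination half-life) can be sketched in a few lines. This is an illustrative stand-in, not the study's analysis: the function names, sampling times, and concentrations below are hypothetical, and a simple mono-exponential profile is assumed.

```python
import math

def auc_trapezoid(times, conc):
    """Linear trapezoidal AUC from the first to the last sample."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

def terminal_half_life(times, conc, n_points=3):
    """Half-life from a log-linear fit to the last n_points samples."""
    ts = times[-n_points:]
    ys = [math.log(c) for c in conc[-n_points:]]
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return math.log(2) / (-slope)   # lambda_z = -slope of ln C vs t

# hypothetical profile C(t) = 10 * exp(-0.1 t), sampled at typical times
times = [0, 1, 2, 4, 8, 12, 24]
conc = [10 * math.exp(-0.1 * t) for t in times]
print(round(terminal_half_life(times, conc), 2))   # ≈ 6.93 h
```

    For exponential data the trapezoidal rule slightly overestimates the true AUC (here 90.9 μg·h/ml analytically), which is why log-trapezoidal variants are often preferred in practice.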

  15. Humid tropical forest clearing from 2000 to 2005 quantified by using multitemporal and multiresolution remotely sensed data.

    PubMed

    Hansen, Matthew C; Stehman, Stephen V; Potapov, Peter V; Loveland, Thomas R; Townshend, John R G; DeFries, Ruth S; Pittman, Kyle W; Arunarwati, Belinda; Stolle, Fred; Steininger, Marc K; Carroll, Mark; Dimiceli, Charlene

    2008-07-08

    Forest cover is an important input variable for assessing changes to carbon stocks, climate and hydrological systems, biodiversity richness, and other sustainability science disciplines. Despite incremental improvements in our ability to quantify rates of forest clearing, there is still no definitive understanding on global trends. Without timely and accurate forest monitoring methods, policy responses will be uninformed concerning the most basic facts of forest cover change. Results of a feasible and cost-effective monitoring strategy are presented that enable timely, precise, and internally consistent estimates of forest clearing within the humid tropics. A probability-based sampling approach that synergistically employs low and high spatial resolution satellite datasets was used to quantify humid tropical forest clearing from 2000 to 2005. Forest clearing is estimated to be 1.39% (SE 0.084%) of the total biome area. This translates to an estimated forest area cleared of 27.2 million hectares (SE 2.28 million hectares), and represents a 2.36% reduction in area of humid tropical forest. Fifty-five percent of total biome clearing occurs within only 6% of the biome area, emphasizing the presence of forest clearing "hotspots." Forest loss in Brazil accounts for 47.8% of total biome clearing, nearly four times that of the next highest country, Indonesia, which accounts for 12.8%. Over three-fifths of clearing occurs in Latin America and over one-third in Asia. Africa contributes 5.4% to the estimated loss of humid tropical forest cover, reflecting the absence of current agro-industrial scale clearing in humid tropical Africa.

  16. Humid tropical forest clearing from 2000 to 2005 quantified by using multitemporal and multiresolution remotely sensed data

    USGS Publications Warehouse

    Hansen, Matthew C.; Stehman, S.V.; Potapov, Peter V.; Loveland, Thomas R.; Townshend, J.R.G.; DeFries, R.S.; Pittman, K.W.; Arunarwati, B.; Stolle, F.; Steininger, M.K.; Carroll, M.; DiMiceli, C.

    2008-01-01

    Forest cover is an important input variable for assessing changes to carbon stocks, climate and hydrological systems, biodiversity richness, and other sustainability science disciplines. Despite incremental improvements in our ability to quantify rates of forest clearing, there is still no definitive understanding on global trends. Without timely and accurate forest monitoring methods, policy responses will be uninformed concerning the most basic facts of forest cover change. Results of a feasible and cost-effective monitoring strategy are presented that enable timely, precise, and internally consistent estimates of forest clearing within the humid tropics. A probability-based sampling approach that synergistically employs low and high spatial resolution satellite datasets was used to quantify humid tropical forest clearing from 2000 to 2005. Forest clearing is estimated to be 1.39% (SE 0.084%) of the total biome area. This translates to an estimated forest area cleared of 27.2 million hectares (SE 2.28 million hectares), and represents a 2.36% reduction in area of humid tropical forest. Fifty-five percent of total biome clearing occurs within only 6% of the biome area, emphasizing the presence of forest clearing “hotspots.” Forest loss in Brazil accounts for 47.8% of total biome clearing, nearly four times that of the next highest country, Indonesia, which accounts for 12.8%. Over three-fifths of clearing occurs in Latin America and over one-third in Asia. Africa contributes 5.4% to the estimated loss of humid tropical forest cover, reflecting the absence of current agro-industrial scale clearing in humid tropical Africa.

  17. Normalization of flow-mediated dilation to shear stress area under the curve eliminates the impact of variable hyperemic stimulus.

    PubMed

    Padilla, Jaume; Johnson, Blair D; Newcomer, Sean C; Wilhite, Daniel P; Mickleborough, Timothy D; Fly, Alyce D; Mather, Kieren J; Wallace, Janet P

    2008-09-04

    Normalization of brachial artery flow-mediated dilation (FMD) to individual shear stress area under the curve (peak FMD:SSAUC ratio) has recently been proposed as an approach to control for the large inter-subject variability in reactive hyperemia-induced shear stress; however, the adoption of this approach among researchers has been slow. The present study was designed to further examine the efficacy of FMD normalization to shear stress in reducing measurement variability. Five different magnitudes of reactive hyperemia-induced shear stress were applied to 20 healthy, physically active young adults (25.3 +/- 0.6 yrs; 10 men, 10 women) by manipulating forearm cuff occlusion duration: 1, 2, 3, 4, and 5 min, in a randomized order. A venous blood draw was performed for determination of baseline whole blood viscosity and hematocrit. The magnitude of occlusion-induced forearm ischemia was quantified by dual-wavelength near-infrared spectrometry (NIRS). Brachial artery diameters and velocities were obtained via high-resolution ultrasound. The SSAUC was individually calculated for the duration of time-to-peak dilation. One-way repeated measures ANOVA demonstrated distinct magnitudes of occlusion-induced ischemia (volume and peak), hyperemic shear stress, and peak FMD responses (all p < 0.0001) across forearm occlusion durations. Differences in peak FMD were abolished when normalizing FMD to SSAUC (p = 0.785). Our data confirm that normalization of FMD to SSAUC eliminates the influences of variable shear stress and solidifies the utility of FMD:SSAUC ratio as an index of endothelial function.
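
    The normalization itself is a simple ratio: integrate shear stress (e.g., by the trapezoidal rule) up to the time of peak dilation, then divide peak FMD by that area. A minimal sketch with hypothetical sample values; `normalized_fmd` and its inputs are illustrative names, not the study's code.

```python
def trapezoid(ts, ys):
    """Trapezoidal-rule area under (ts, ys)."""
    return sum((t2 - t1) * (y1 + y2) / 2.0
               for t1, t2, y1, y2 in zip(ts, ts[1:], ys, ys[1:]))

def normalized_fmd(peak_fmd_pct, shear_times, shear_stress, t_peak):
    """Peak FMD divided by shear stress AUC up to time-to-peak dilation."""
    pts = [(t, s) for t, s in zip(shear_times, shear_stress) if t <= t_peak]
    ts, ss = zip(*pts)
    return peak_fmd_pct / trapezoid(list(ts), list(ss))

# hypothetical post-occlusion shear profile; peak dilation at t = 30 s
print(normalized_fmd(6.0, [0, 10, 20, 30, 40], [0, 40, 30, 20, 10], t_peak=30))
```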

  18. Malaria control under unstable dynamics: reactive vs. climate-based strategies.

    PubMed

    Baeza, Andres; Bouma, Menno J; Dhiman, Ramesh; Pascual, Mercedes

    2014-01-01

    In areas of the world where malaria prevails under unstable conditions, attacking the adult vector population through insecticide-based Indoor Residual Spraying (IRS) is the most common method for controlling epidemics. Defined in policy guidance, the use of Annual Parasitic Incidence (API) is an important tool for assessing the effectiveness of control and for planning new interventions. To investigate the consequences that a policy based on API in previous seasons might have on the population dynamics of the disease and on control itself in regions of low and seasonal transmission, we formulate a mathematical malaria model that couples epidemiologic and vector dynamics with IRS intervention. This model is parameterized for a low transmission and semi-arid region in northwest India, where epidemics are driven by high rainfall variability. We show that this type of feedback mechanism in control strategies can generate transient cycles in malaria even in the absence of environmental variability, and that this tendency to cycle can in turn limit the effectiveness of control in the presence of such variability. Specifically, for realistic rainfall conditions and over a range of control intensities, the effectiveness of such 'reactive' intervention is compared to that of an alternative strategy based on rainfall and therefore vector variability. Results show that the efficacy of intervention is strongly influenced by rainfall variability and the type of policy implemented. In particular, under an API 'reactive' policy, high vector populations can coincide more frequently with low control coverage, and in so doing generate large unexpected epidemics and decrease the likelihood of elimination. These results highlight the importance of incorporating information on climate variability, rather than previous incidence, in planning IRS interventions in regions of unstable malaria. These findings are discussed in the more general context of elimination and other low transmission regions such as highlands. Copyright © 2013. Published by Elsevier B.V.

  19. The extraction of simple relationships in growth factor-specific multiple-input and multiple-output systems in cell-fate decisions by backward elimination PLS regression.

    PubMed

    Akimoto, Yuki; Yugi, Katsuyuki; Uda, Shinsuke; Kudo, Takamasa; Komori, Yasunori; Kubota, Hiroyuki; Kuroda, Shinya

    2013-01-01

    Cells use common signaling molecules for the selective control of downstream gene expression and cell-fate decisions. The relationship between signaling molecules and downstream gene expression and cellular phenotypes is a multiple-input and multiple-output (MIMO) system and is difficult to understand due to its complexity. For example, it has been reported that, in PC12 cells, different types of growth factors activate MAP kinases (MAPKs) including ERK, JNK, and p38, and CREB, for selective protein expression of immediate early genes (IEGs) such as c-FOS, c-JUN, EGR1, JUNB, and FOSB, leading to cell differentiation, proliferation and cell death; however, how multiple inputs such as MAPKs and CREB regulate multiple outputs such as expression of the IEGs and cellular phenotypes remains unclear. To address this issue, we employed a statistical method called partial least squares (PLS) regression, which involves a reduction of the dimensionality of the inputs and outputs into latent variables and a linear regression between these latent variables. We measured 1,200 data points for MAPKs and CREB as the inputs and 1,900 data points for IEGs and cellular phenotypes as the outputs, and we constructed the PLS model from these data. The PLS model highlighted the complexity of the MIMO system and growth factor-specific input-output relationships of cell-fate decisions in PC12 cells. Furthermore, to reduce the complexity, we applied a backward elimination method to the PLS regression, in which 60 input variables were reduced to 5 variables, including the phosphorylation of ERK at 10 min, CREB at 5 min and 60 min, AKT at 5 min and JNK at 30 min. The simple PLS model with only 5 input variables demonstrated a predictive ability comparable to that of the full PLS model. The 5 input variables effectively extracted the growth factor-specific simple relationships within the MIMO system in cell-fate decisions in PC12 cells.
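
    The backward-elimination loop described above can be sketched generically. The study applies it to PLS regression; as a simpler stand-in, the sketch below wraps ordinary least squares and repeatedly drops the variable whose removal least degrades the fit. All names and the toy data are illustrative, not the authors' code.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def sse_for(X, y, cols):
    """Least-squares sum of squared errors using only the listed columns."""
    Xs = [[row[c] for c in cols] for row in X]
    k = len(cols)
    XtX = [[sum(r[i] * r[j] for r in Xs) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xs, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * v for b, v in zip(beta, r))) ** 2
               for r, yi in zip(Xs, y))

def backward_eliminate(X, y, tol=1e-8):
    """Drop variables greedily while the fit does not meaningfully worsen."""
    cols = list(range(len(X[0])))
    while len(cols) > 1:
        trials = [(sse_for(X, y, [c for c in cols if c != d]), d) for d in cols]
        best_sse, drop = min(trials)
        if best_sse > sse_for(X, y, cols) + tol:
            break                      # every removal hurts the fit: stop
        cols.remove(drop)
    return cols

# toy data: y depends only on columns 0 and 2; columns 1 and 3 are uninformative
X = [[1, 1, 2, 1], [2, 0, 1, 1], [3, 1, 4, 2], [4, 0, 3, 2],
     [5, 1, 6, 3], [6, 0, 5, 3], [7, 1, 8, 4], [8, 0, 7, 5]]
y = [2 * r[0] - r[2] for r in X]
print(backward_eliminate(X, y))   # → [0, 2]
```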

  20. Assessment of the Status of Measles Elimination in the United States, 2001-2014.

    PubMed

    Gastañaduy, Paul A; Paul, Prabasaj; Fiebelkorn, Amy Parker; Redd, Susan B; Lopman, Ben A; Gambhir, Manoj; Wallace, Gregory S

    2017-04-01

    We assessed the status of measles elimination in the United States using outbreak notification data. Measles transmissibility was assessed by estimation of the reproduction number, R, the average number of secondary cases per infection, using 4 methods; elimination requires maintaining R at <1. Method 1 estimates R as 1 minus the proportion of cases that are imported. Methods 2 and 3 estimate R by fitting a model of the spread of infection to data on the sizes and generations of chains of transmission, respectively. Method 4 assesses transmissibility before public health interventions, by estimating R for the case with the earliest symptom onset in each cluster (Rindex). During 2001-2014, R and Rindex estimates obtained using methods 1-4 were 0.72 (95% confidence interval (CI): 0.68, 0.76), 0.66 (95% CI: 0.62, 0.70), 0.45 (95% CI: 0.40, 0.49), and 0.63 (95% CI: 0.57, 0.69), respectively. Year-to-year variability in the values of R and Rindex and an increase in transmissibility in recent years were noted with all methods. Elimination of endemic measles transmission is maintained in the United States. A suggested increase in measles transmissibility since elimination warrants continued monitoring and emphasizes the importance of high measles vaccination coverage throughout the population. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
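
    Method 1 above has a closed form: R is 1 minus the imported fraction of cases. A minimal sketch with hypothetical case counts; the chain-size estimator uses the subcritical branching-process identity (mean outbreak size = 1/(1 − R)) as a simplified stand-in for the model fitting of methods 2 and 3, and both function names are illustrative.

```python
def r_from_importation(n_imported, n_total):
    """Method 1: R = 1 - proportion of cases that are imported."""
    return 1.0 - n_imported / n_total

def r_from_chain_sizes(chain_sizes):
    """For a subcritical branching process the mean outbreak size is
    1 / (1 - R), so R = 1 - 1 / (mean size). A simplified stand-in for
    fitting a transmission model to chain-size data."""
    mean = sum(chain_sizes) / len(chain_sizes)
    return 1.0 - 1.0 / mean

# hypothetical surveillance data: 70 of 250 cases imported
print(r_from_importation(70, 250))        # 0.72, safely below 1
# hypothetical chain sizes with mean 2  ->  R = 0.5
print(r_from_chain_sizes([1, 1, 2, 4]))
```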

  1. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the selection bias described above under these additivity assumptions.
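
    A two-stage instrumental variable estimator follows the usual recipe: regress the exposure on the instrument, then regress the outcome on the fitted exposure. The sketch below shows only that generic two-stage idea on exactly confounded toy data, not the article's estimator (which additionally corrects for survival); all names and values are illustrative.

```python
def ols_slope(x, y):
    """Simple one-regressor OLS slope on centered data."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def two_stage_iv(z, x, y):
    """Stage 1: regress exposure x on instrument z.
    Stage 2: regress outcome y on the stage-1 fitted exposure."""
    g = ols_slope(z, x)
    mz, mx = sum(z) / len(z), sum(x) / len(x)
    xhat = [mx + g * (zi - mz) for zi in z]
    return ols_slope(xhat, y)

# toy data: U confounds X and Y; Z is a valid instrument (uncorrelated with U)
Z = [1, -1, 1, -1]
U = [1, 1, -1, -1]
X = [2 * z + 3 * u for z, u in zip(Z, U)]
Y = [1.5 * x - 2 * u for x, u in zip(X, U)]
print(two_stage_iv(Z, X, Y))   # recovers the causal slope 1.5
print(ols_slope(X, Y))         # naive OLS is biased by the confounder
```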

  2. Fractional Factorial Experiment Designs to Minimize Configuration Changes in Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Cler, Daniel L.; Graham, Albert B.

    2002-01-01

    This paper serves as a tutorial to introduce the wind tunnel research community to configuration experiment designs that can satisfy resource constraints in a configuration study involving several variables, without arbitrarily eliminating any of them from the experiment initially. The special case of a configuration study featuring variables at two levels is examined in detail. This is the type of study in which each configuration variable has two natural states - 'on or off', 'deployed or not deployed', 'low or high', and so forth. The basic principles are illustrated by results obtained in configuration studies conducted in the Langley National Transonic Facility and in the ViGYAN Low Speed Tunnel in Hampton, Virginia. The crucial role of interactions among configuration variables is highlighted with an illustration of difficulties that can be encountered when they are not properly taken into account.
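
    A two-level fractional factorial of the kind discussed can be generated by taking a full factorial on a subset of "basis" factors and defining each remaining factor as a product of basis columns. A sketch under that convention; the defining relation C = AB is an illustrative choice, not one from the paper.

```python
from itertools import product

def fractional_factorial(k_basis, generators):
    """Two-level fractional factorial: full factorial on k_basis factors,
    plus one aliased column per generator (a tuple of basis indices whose
    product defines the new factor's level)."""
    runs = []
    for basis in product((-1, 1), repeat=k_basis):
        extra = []
        for gen in generators:          # e.g. (0, 1) means new factor = A*B
            v = 1
            for idx in gen:
                v *= basis[idx]
            extra.append(v)
        runs.append(tuple(basis) + tuple(extra))
    return runs

# 2^(3-1) half fraction with defining relation C = AB: 4 runs instead of 8
design = fractional_factorial(2, [(0, 1)])
for run in design:
    print(run)
```

    Every run satisfies A·B·C = +1, which is exactly the defining relation I = ABC; the price is that C's main effect is aliased with the AB interaction, the kind of confounding the paper warns must be accounted for.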

  3. Solution of the Time-Dependent Schrödinger Equation by the Laplace Transform Method

    PubMed Central

    Lin, S. H.; Eyring, H.

    1971-01-01

    The time-dependent Schrödinger equation for two quite general types of perturbation has been solved by introducing the Laplace transforms to eliminate the time variable. The resulting time-independent differential equation can then be solved by the perturbation method, the variation method, the variation-perturbation method, and other methods. PMID:16591898
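
    The transform step the abstract describes can be written out explicitly. The following is the standard sketch for a time-independent Hamiltonian, not the paper's own derivation:

```latex
% Laplace transform in t:  \Psi(x,s) = \int_0^\infty e^{-st}\,\psi(x,t)\,dt
i\hbar\,\frac{\partial\psi}{\partial t} = \hat{H}\,\psi
\;\;\Longrightarrow\;\;
i\hbar\bigl(s\,\Psi(x,s) - \psi(x,0)\bigr) = \hat{H}\,\Psi(x,s)
\;\;\Longrightarrow\;\;
\bigl(\hat{H} - i\hbar s\bigr)\Psi(x,s) = -\,i\hbar\,\psi(x,0).
```

    The last equation contains no time variable: it is an inhomogeneous time-independent equation in the transform parameter s, solvable by perturbation or variation methods, after which ψ(x,t) is recovered by inverting the transform.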

  4. Research and Development of Advanced Life Support Equipment.

    DTIC Science & Technology

    1999-02-01

    kg.) were catheterized for measurement of left ventricular pressure (LVP), right ventricular pressure (RVP), mean aortic pressure (MAP), central ... Orientation Laboratory, Venous Gas Emboli, Variable Profile Breathing Simulator, Wingate Anaerobic Test, Weapons System Trainer, World Wide Web ... history screening of the potential subjects was conducted to eliminate those individuals who have known health conditions/histories which would

  5. A tree classification for the selection forests of the Sierra Nevada

    Treesearch

    Duncan Dunning

    1928-01-01

    Individuality in man is accepted without question. In domestic animals, also, good and bad individuals are generally recognized. Even in some cultivated plants, such as orange trees and rubber trees, the poor producers are searched out and eliminated. Indeed, individual variability is a normal condition in all groups of organisms. Yet forest trees are...

  6. Current advances on polynomial resultant formulations

    NASA Astrophysics Data System (ADS)

    Sulaiman, Surajo; Aris, Nor'aini; Ahmad, Shamsatun Nahar

    2017-08-01

    The availability of computer algebra systems (CAS) has led to the resurrection of the resultant method for eliminating one or more variables from a polynomial system. The resultant matrix method has advantages over the Groebner basis and Ritt-Wu methods, which suffer from high complexity and storage requirements. This paper focuses on current resultant matrix formulations and investigates their ability, or otherwise, to produce optimal resultant matrices. A determinantal formula that gives the exact resultant, or a formulation that minimizes the presence of extraneous factors, is often sought when the conditions for its existence can be determined. We present some applications of elimination theory via resultant formulations, and examples are given to explain each of the presented settings.
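
    The classical construction behind these formulations is the Sylvester matrix: the resultant of two univariate polynomials is its determinant, and it vanishes exactly when the polynomials share a root, which is what makes it useful for elimination. A minimal exact-arithmetic sketch (illustrative, not any of the surveyed formulations):

```python
from fractions import Fraction

def sylvester_matrix(f, g):
    """f, g: coefficient lists, highest degree first. Build the Sylvester matrix."""
    m, n = len(f) - 1, len(g) - 1
    size = m + n
    M = [[Fraction(0)] * size for _ in range(size)]
    for i in range(n):                  # n shifted copies of f
        for j, c in enumerate(f):
            M[i][i + j] = Fraction(c)
    for i in range(m):                  # m shifted copies of g
        for j, c in enumerate(g):
            M[n + i][i + j] = Fraction(c)
    return M

def det(M):
    """Determinant by exact fraction-based Gaussian elimination."""
    M = [row[:] for row in M]
    n = len(M)
    sign, result = 1, Fraction(1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            M[col], M[pivot] = M[pivot], M[col]
            sign = -sign
        result *= M[col][col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= factor * M[col][c]
    return sign * result

def resultant(f, g):
    return det(sylvester_matrix(f, g))

# f = x^2 - 3x + 2 has roots 1 and 2; g = x - 1 shares a root, g = x - 3 does not
print(resultant([1, -3, 2], [1, -1]))   # 0: a common root exists
print(resultant([1, -3, 2], [1, -3]))   # nonzero: no common root
```

    For multivariate elimination the same determinant is taken with coefficients that are themselves polynomials in the remaining variables, which is where extraneous factors can enter.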

  7. Mechanisms for the dehydrogenation of alkanes on platinum: insights gained from the reactivity of gaseous cluster cations, Ptn + n=1-21.

    PubMed

    Adlhart, Christian; Uggerud, Einar

    2007-01-01

    Rates for the dihydrogen elimination of methane, ethane, and propane with cationic platinum clusters, Pt(n) (+) (1

  8. Fokker-Planck description of electron and photon transport in homogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akcasu, A.Z.; Holloway, J.P.

    1997-06-01

    Starting from a Fokker-Planck description of particle transport, which is valid when the scattering is forwardly peaked and the energy change in scattering is small, we systematically obtain an approximate diffusionlike equation for the particle density by eliminating the direction variable Ω̂ with an elimination scheme based on Zwanzig's projection operator formalism in the interaction representation. The elimination procedure closely follows one described by Grigolini and Marchesoni [in Memory Function Approaches to Stochastic Problems in Condensed Matter, edited by Myron W. Evans, Paolo Grigolini, and Guiseppe P. Parravicini, Advances in Physical Chemistry, Vol. 62 (Wiley-Interscience, New York, 1985), Chap. II, p. 29], but with a different projection operator. The resulting diffusion equation is correct up to the second order in the coupling operator between the particle direction and position variable. The diffusion coefficients and mobility in the resulting diffusion equation depend on the initial distribution of the particles in direction and on the path length traveled by the particles. The full solution is obtained for a monoenergetic and monodirectional pulsed point source of particles in an infinite homogeneous medium. This solution is used to study the penetration and the transverse and longitudinal spread of the particles as they are transported through the medium. Application to diffusive wave spectroscopy in calculating the path-length distribution of photons, as well as application to dose calculations in tissue due to an electron beam are mentioned. © 1997 The American Physical Society

  9. Automated Welding System

    NASA Technical Reports Server (NTRS)

    Bayless, E. O.; Lawless, K. G.; Kurgan, C.; Nunes, A. C.; Graham, B. F.; Hoffman, D.; Jones, C. S.; Shepard, R.

    1993-01-01

    Fully automated variable-polarity plasma arc (VPPA) welding system developed at Marshall Space Flight Center. System eliminates defects caused by human error. Integrates many sensors with mathematical model of the weld and computer-controlled welding equipment. Sensors provide real-time information on geometry of weld bead, location of weld joint, and wire-feed entry. Mathematical model relates geometry of weld to critical parameters of welding process.

  10. Evaluation of the reference unit method for herbaceous biomass estimation in native grasslands of southwestern South Dakota

    Treesearch

    Eric D. Boyda

    2013-01-01

    The high costs associated with physically harvesting plant biomass may prevent sufficient data collection, which is necessary to account for the natural variability of vegetation at a landscape scale. A biomass estimation technique was previously developed using representative samples or "reference units", which eliminated the need to harvest biomass from all...

  11. Response of terrestrial small mammals to varying amounts and patterns of green-tree retention in Pacific Northwest forests

    Treesearch

    Robert A. Gitzen; Stephen West; Chris C. Maguireb; Tom Manning; Charles B. Halpern

    2007-01-01

    To sustain native species in managed forests, landowners need silvicultural strategies that retain habitat elements often eliminated during traditional harvests such as clearcut logging. One alternative is green-tree or variable retention. We investigated the response of terrestrial small mammals to experimental harvests that retained large live trees in varying...

  12. Can Response Speed Be Fixed Experimentally, and Does This Lead to Unconfounded Measurement of Ability?

    ERIC Educational Resources Information Center

    Bolsinova, Maria; Tijmstra, Jesper

    2015-01-01

    Goldhammer (this issue) proposes an interesting approach to dealing with the speededness of item responses. Rather than modeling speed as a latent variable that varies from person to person, he proposes to use experimental conditions that are expected to fix the speed, thereby eliminating individual differences on this dimension in order to make…

  13. Responsibility Factors of Reducing Inefficiencies in Information System Processes and Their Role on Intention to Acquire Six Sigma Certification

    ERIC Educational Resources Information Center

    Hejazi, Sara

    2009-01-01

    Organizations worldwide have been turning to Six Sigma program (SSP) to eliminate the defects in their products or drive out the variability in their processes to attain a competitive advantage in their marketplace. An effective certification program has been touted as a major contributor to successful implementation of SSP. An effective…

  14. Case study: Differences in milk characteristics between a cow herd transitioning to organic versus milk from a conventional dairy herd

    USDA-ARS?s Scientific Manuscript database

    Differences between organic and conventional milk were studied by comparing two adjacent farms over a 12-mo period starting at the beginning of the grazing season, thus eliminating variables due to geography and weather. Milk was collected from a farm where cows were fed a conventional total mixed ...

  15. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    PubMed

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and heavy memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are addressed by feature reduction software built around a new modified selection mechanism that adds middle-region solution candidates. The hybrid software reduces the input attributes of systems with large numbers of input variables. It also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. Premature convergence to local solutions, a common problem in genetic algorithm based soft computing methods, is likewise avoided by the developed software. Faster and more effective results are obtained in the test procedures: the twelve input variables of the urological system were reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The results show that the developed software with modified selection outperforms the other reduction algorithms on the urological test data in memory allocation, execution time, classification accuracy, sensitivity, and specificity.
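
    Roulette wheel selection, mentioned above, picks individuals with probability proportional to their fitness. A minimal sketch with an illustrative fitness vector; `roulette_select` is a hypothetical name, not the paper's software.

```python
import random

def roulette_select(fitnesses, rng):
    """Return one index, chosen with probability proportional to fitness."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)       # spin the wheel
    acc = 0.0
    for i, f in enumerate(fitnesses):
        acc += f
        if r <= acc:                # landed in this individual's slice
            return i
    return len(fitnesses) - 1       # guard against float rounding

rng = random.Random(42)
fitnesses = [10.0, 1.0, 1.0, 1.0]   # one dominant individual
picks = [roulette_select(fitnesses, rng) for _ in range(1000)]
print(picks.count(0))               # the fittest individual dominates the draws
```

    Unlike truncation selection, low-fitness individuals retain a nonzero chance of being chosen, which helps maintain diversity and resist the premature convergence the abstract mentions.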

  16. Contributions of sociodemographic factors to criminal behavior

    PubMed Central

    Mundia, Lawrence; Matzin, Rohani; Mahalle, Salwa; Hamid, Malai Hayati; Osman, Ratna Suriani

    2016-01-01

    We explored the extent to which prisoner sociodemographic variables (age, education, marital status, employment, and whether their parents were married or not) influenced offending in 64 randomly selected Brunei inmates, comprising both sexes. A quantitative field survey design, ideal for this type of participant in a prison context, was employed to investigate the problem. Hierarchical multiple regression analysis with backward elimination identified prisoner marital status and age groups as significantly related to offending. Furthermore, hierarchical multinomial logistic regression analysis with backward elimination indicated that prisoners’ age, primary level education, marital status, employment status, and parental marital status were significantly related to stealing offenses, with high odds ratios. All 29 nonrecidivists were false negatives and predicted to reoffend upon release. Similarly, all 33 recidivists were projected to reoffend after release. Hierarchical binary logistic regression analysis revealed age groups (24–29 years and 30–35 years), employed prisoner status, and primary level education as variables with high likelihood trends for reoffending. The results suggested that prisoner interventions (educational, counseling, and psychotherapy) in Brunei should treat not only antisocial personality, psychopathy, and mental health problems but also sociodemographic factors. The study generated offending patterns, trends, and norms that may inform subsequent investigations on Brunei prisoners. PMID:27382342

  17. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    PubMed Central

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and heavy memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are addressed by feature reduction software built around a new modified selection mechanism that adds middle-region solution candidates. The hybrid software reduces the input attributes of systems with large numbers of input variables. It also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. Premature convergence to local solutions, a common problem in genetic algorithm based soft computing methods, is likewise avoided by the developed software. Faster and more effective results are obtained in the test procedures: the twelve input variables of the urological system were reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The results show that the developed software with modified selection outperforms the other reduction algorithms on the urological test data in memory allocation, execution time, classification accuracy, sensitivity, and specificity. PMID:23573172

  18. Managerial process improvement: a lean approach to eliminating medication delivery errors.

    PubMed

    Hussain, Aftab; Stewart, LaShonda M; Rivers, Patrick A; Munchus, George

    2015-01-01

    Statistical evidence shows that medication errors are a major cause of injuries, a concern for all health care organizations. Despite all the efforts to improve the quality of care, the lack of understanding and the inability of management to design a robust system that strategically targets those factors is a major cause of distress. This paper aims to discuss these issues. Achieving optimum organizational performance requires attention to two key variables: work process factors and human performance factors. Healthcare administrators must take both variables into account when designing a strategy to reduce medication errors. However, strategies to combat this phenomenon require that managers and administrators understand the key factors causing medication delivery errors. The authors recommend that healthcare organizations implement the Toyota Production System (TPS) combined with human performance improvement (HPI) methodologies to eliminate medication delivery errors in hospitals. This paper proposes a solution to an ambiguous workflow process using the TPS combined with the HPI system.

  19. Sensored Field Oriented Control of a Robust Induction Motor Drive Using a Novel Boundary Layer Fuzzy Controller

    PubMed Central

    Saghafinia, Ali; Ping, Hew Wooi; Uddin, Mohammad Nasir

    2013-01-01

    Physical sensors have a key role in the implementation of real-time vector control for an induction motor (IM) drive. This paper presents a novel boundary layer fuzzy controller (NBLFC) based on the boundary layer approach for speed control of an indirect field-oriented control (IFOC) induction motor (IM) drive using physical sensors. The boundary layer approach leads to a trade-off between control performance and chattering elimination. For the NBLFC, a fuzzy system is used to adjust the boundary layer thickness to improve the tracking performance and eliminate the chattering problem under small uncertainties. Also, to eliminate chattering under possibly large uncertainties, an integral filter is proposed inside the variable boundary layer. In addition, the stability of the system is analyzed through the Lyapunov stability theorem. The proposed NBLFC-based IM drive is implemented in real-time using a digital signal processor (DSP) board, the TI TMS320F28335. The experimental and simulation results show the effectiveness of the proposed NBLFC-based IM drive at different operating conditions.
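The paper's fuzzy adaptation of the boundary layer thickness is not reproduced here, but the underlying boundary-layer idea can be sketched: replacing the discontinuous sign function with a saturation function removes chattering at the cost of a small tracking error. Gains and function names below are illustrative assumptions, not the paper's design:

```python
def sat(s, phi):
    """Boundary-layer saturation: linear inside |s| <= phi, clipped to +/-1
    outside. Substituting sat(s/phi) for sign(s) removes the discontinuity
    that causes chattering, at the cost of a small steady-state error."""
    return max(-1.0, min(1.0, s / phi))

def switching_term(s, K=10.0, phi=0.1):
    """Sliding-mode-style switching control with a fixed boundary layer.
    The paper's NBLFC instead adapts phi online with a fuzzy system."""
    return -K * sat(s, phi)

u_inside = switching_term(0.05)   # inside the layer: proportional, smooth
u_outside = switching_term(0.5)   # outside the layer: saturated, no sign-flip chatter
```

A thicker `phi` gives smoother control but looser tracking, which is exactly the trade-off the fuzzy system in the paper is used to manage.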

  20. Fetal anemia as a signal of congenital syphilis.

    PubMed

    Macé, Guillaume; Castaigne, Vanina; Trabbia, Aurore; Guigue, Virginie; Cynober, Evelyne; Cortey, Anne; Lalande, Valérie; Carbonne, Bruno

    2014-09-01

    An upsurge in syphilis has been observed almost everywhere over the past decade. The mother's clinical presentation is often uninformative. The diagnosis of maternal syphilis infection is most often based on serologic tests that allow early Extencilline treatment. Syphilis ultrasound findings are non-specific, and delay before treatment can be decisive for prognosis. Fetal anemia is a physiological consequence of severe infection. We confirmed that syphilis can be suggested non-invasively by middle cerebral artery peak systolic velocity (MCA-PSV) measurements in a context of ascites or atypical hydrops in the absence of the usual causes. It is therefore important to perform maternal TPHA/VDRL serology if fetal anemia is suspected. In association with Extencilline treatment, intrauterine transfusion can limit the consequences of infection. Reduced fetal movements and a non-reactive fetal heart rate may prefigure acute perinatal complications or stillbirth.

  1. Particular applications of food irradiation: Meat, fish and others

    NASA Astrophysics Data System (ADS)

    Ehlermann, Dieter A. E.

    2016-12-01

    It is surprising how much can be achieved by radiation processing of food; this chapter describes a number of less obvious applications, mostly hidden from the consumer. The labelling regulations, which differ world-wide, are also responsible for leaving the consumer uninformed. Several of the early proposals could not reach technological maturity or are commercially uncompetitive, yet considerable energy is still spent on research into such applications. Other applications serve a certain niche, and companies are mostly reluctant to release reliable information about their activities. Because labelling regulations vary significantly world-wide, the market place does not really give a full picture of the irradiated food available to the consumer. Despite those restrictions, this report intends to give a full picture of the actual situation for meat, fish and other foods, and of unique uses.

  2. The Cyber Security Crisis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spafford, Eugene

    2006-05-10

    Despite considerable activity and attention, the overall state of information security continues to get worse. Attacks are increasing, fraud and theft are rising, and losses may exceed $100 billion per year worldwide. Many factors contribute to this, including misplaced incentives for industry, a lack of attention by government, ineffective law enforcement, and an uninformed image of who the perpetrators really are. As a result, many of the intended attempts at solutions are of limited (if any) overall effectiveness. This presentation will illustrate some key aspects of the cyber security problem and its magnitude, as well as provide some insight into causes and enabling factors. The talk will conclude with some observations on how the computing community can help improve the situation, as well as some suggestions for 'cyber self-defense.'

  3. Rational choice(s)? Rethinking decision-making on breast cancer risk and screening mammography.

    PubMed

    Vahabi, Mandana; Gastaldo, Denise

    2003-12-01

    Women who refrain from undergoing breast cancer screening are believed to be uninformed about risks and are usually labeled as irrational. Our purpose in writing this paper is to challenge the traditional notion of rational behaviour, illustrating with qualitative data that people's rationality is influenced by their socio-cultural and political identities. We explore three major themes: (1) cultural explanations regarding intention to use screening mammography, (2) (dis)trust in science and expert opinion, and (3) self-responsibility and self-surveillance in caring for one's body. Understanding that women rely on different risk discourses to make decisions about their health should aid researchers, health professionals, and the community in better understanding alternative ways of conceptualizing people's health-related behaviours when they do not coincide with health authorities' recommendations.

  4. The confidence-accuracy relationship in eyewitness identification: effects of lineup instructions, foil similarity, and target-absent base rates.

    PubMed

    Brewer, Neil; Wells, Gary L

    2006-03-01

    Discriminating accurate from mistaken eyewitness identifications is a major issue facing criminal justice systems. This study examined whether eyewitness confidence assists such decisions under a variety of conditions using a confidence-accuracy (CA) calibration approach. Participants (N = 1,200) viewed a simulated crime and attempted 2 separate identifications from 8-person target-present or target-absent lineups. Confidence and accuracy were calibrated for choosers (but not nonchoosers) for both targets under all conditions. Lower overconfidence was associated with higher diagnosticity, lower target-absent base rates, and shorter identification latencies. Although researchers agree that courtroom expressions of confidence are uninformative, our findings indicate that confidence assessments obtained immediately after a positive identification can provide a useful guide for investigators about the likely accuracy of an identification.
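The confidence-accuracy calibration approach the study uses can be sketched numerically: bin witnesses by stated confidence, compare mean confidence with mean accuracy per bin, and summarize overconfidence as the gap between the two. This is a generic sketch on synthetic data (the function names and the perfectly calibrated toy "witness" are my assumptions, not the study's data):

```python
import numpy as np

def calibration_curve(confidence, correct, bins=5):
    """Per-bin (mean confidence, mean accuracy) pairs, confidence in [0, 1]."""
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(confidence, edges) - 1, 0, bins - 1)
    curve = {}
    for b in range(bins):
        mask = idx == b
        if mask.any():
            curve[b] = (confidence[mask].mean(), correct[mask].mean())
    return curve

def overconfidence(confidence, correct):
    """Positive when stated confidence exceeds achieved accuracy."""
    return float(confidence.mean() - correct.mean())

# synthetic, perfectly calibrated identifications: P(correct) equals confidence
rng = np.random.default_rng(0)
conf = rng.uniform(0.0, 1.0, 20000)
correct = (rng.uniform(0.0, 1.0, 20000) < conf).astype(float)
oc = overconfidence(conf, correct)        # near zero for calibrated data
curve = calibration_curve(conf, correct)  # accuracy rises with confidence
```

Real eyewitness data of course deviate from this ideal; the study's point is that the deviation (overconfidence) varies systematically with base rates and latency.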

  5. Forensic analysis of dyed textile fibers.

    PubMed

    Goodpaster, John V; Liszewski, Elisa A

    2009-08-01

    Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.

  6. Maximizing the information learned from finite data selects a simple model

    NASA Astrophysics Data System (ADS)

    Mattingly, Henry H.; Transtrum, Mark K.; Abbott, Michael C.; Machta, Benjamin B.

    2018-02-01

    We use the language of uninformative Bayesian prior choice to study the selection of appropriately simple effective models. We advocate for the prior which maximizes the mutual information between parameters and predictions, learning as much as possible from limited data. When many parameters are poorly constrained by the available data, we find that this prior puts weight only on boundaries of the parameter space. Thus, it selects a lower-dimensional effective theory in a principled way, ignoring irrelevant parameter directions. In the limit where there are sufficient data to tightly constrain any number of parameters, this reduces to the Jeffreys prior. However, we argue that this limit is pathological when applied to the hyperribbon parameter manifolds generic in science, because it leads to dramatic dependence on effects invisible to experiment.
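The Jeffreys prior that the authors recover in the data-rich limit can be made concrete with the simplest possible example, a single Bernoulli parameter, where sqrt of the Fisher information normalizes to the Beta(1/2, 1/2) density. This sketch is only an illustration of that limit, not of the paper's mutual-information prior:

```python
import math

def fisher_info_bernoulli(theta):
    """Fisher information of one Bernoulli observation: I(t) = 1 / (t(1-t))."""
    return 1.0 / (theta * (1.0 - theta))

def jeffreys_pdf(theta):
    """Jeffreys prior is proportional to sqrt(I(t)); normalized it is the
    Beta(1/2, 1/2) density: p(t) = 1 / (pi * sqrt(t(1-t)))."""
    return 1.0 / (math.pi * math.sqrt(theta * (1.0 - theta)))

# a crude Riemann sum of sqrt(I) approaches the normalizing constant pi
thetas = [i / 1000.0 for i in range(1, 1000)]
norm = sum(math.sqrt(fisher_info_bernoulli(t)) * 0.001 for t in thetas)
```

The prior piles weight on the boundary values 0 and 1, a one-parameter glimpse of the boundary-seeking behavior the paper finds for poorly constrained models.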

  7. Intrapopulation polymorphism in Anopheles messeae (An. maculipennis complex) inferred by molecular analysis.

    PubMed

    Di Luca, Marco; Boccolini, Daniela; Marinuccil, Marino; Romi, Roberto

    2004-07-01

    We evaluated the internal transcribed spacer two (ITS2) sequence to detect intraspecific polymorphism in the Palearctic Anopheles maculipennis complex, analyzing 52 populations from 12 countries and representing six species. For An. messeae, two fragments of the cytochrome oxidase I (COI) gene were also evaluated. The results were compared with GenBank sequences and data from the literature. ITS2 analysis revealed evident intraspecific polymorphism for An. messeae and a slightly less evident polymorphism for An. melanoon, whereas for each of the other species, 100% identity was found among populations. ITS2 analysis of An. messeae identified five haplotypes that were consistent with the geographical origin of the populations. ITS2 seems to be a reliable marker of intraspecific polymorphism for this complex, whereas the COI gene is apparently uninformative.

  8. Necessity of guides in pedestrian emergency evacuation

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxia; Dong, Hairong; Yao, Xiuming; Sun, Xubin; Wang, Qianling; Zhou, Min

    2016-01-01

    A guide in charge of leading pedestrians to evacuate in an emergency plays a critical role for uninformed people. This paper first investigates the influence of mass behavior on evacuation dynamics and mainly focuses on guided evacuation dynamics. In the extended crowd model proposed in this paper, individualistic behavior, herding behavior and environmental influence are all considered for pedestrians who are not informed by the guide. According to the simulation results, herding behavior enables more pedestrians to evacuate from the room in the same period of time. Besides, the guided crowd demonstrates the group dynamics characterized by gathering, conflict and balance. Moreover, simulation results indicate that guides with appropriate initial positions and numbers are more conducive to evacuation under a moderate initial pedestrian density.

  9. Role of meaningful subgroups in explaining differences in perceived variability for in-groups and out-groups.

    PubMed

    Park, B; Ryan, C S; Judd, C M

    1992-10-01

    Five aspects of the complexity of the knowledge representation of business and engineering majors were examined to see whether these differed by group membership and whether these differences were related to differences in perceived variability. Significantly more subgroups were generated when describing the in-group than the out-group; this difference predicted the relative tendency to see the in-group as more variable, and when controlled for statistically, out-group homogeneity effects were eliminated. Familiarity, redundancy, number of attributes used to describe the group, and the deviance of the subgroups from the larger group generally showed differences for in-group and out-group but did not show consistent evidence of mediation. In a 2nd study, Ss who were asked to sort group members into meaningful subgroups perceived greater variability relative to those who did not perform the sorting task.

  10. Alternative metrics for real-ear-to-coupler difference average values in children.

    PubMed

    Blumsack, Judith T; Clark-Lewis, Sandra; Watts, Kelli M; Wilson, Martha W; Ross, Margaret E; Soles, Lindsey; Ennis, Cydney

    2014-10-01

    Ideally, individual real-ear-to-coupler difference (RECD) measurements are obtained for pediatric hearing instrument-fitting purposes. When RECD measurements cannot be obtained, age-related average RECDs based on typically developing North American children are used. Evidence suggests that these values may not be appropriate for populations of children with retarded growth patterns. The purpose of this study was to determine if another metric, such as head circumference, height, or weight, can be used for prediction of RECDs in children. Design was a correlational study. For all participants, RECD values in both ears, head circumference, height, and weight were measured. The sample consisted of 68 North American children (ages 3-11 yr). Height, weight, head circumference, and RECDs were measured and were analyzed for both ears at 500, 750, 1000, 1500, 2000, 3000, 4000, and 6000 Hz. A backward elimination multiple-regression analysis was used to determine if age, height, weight, and/or head circumference are significant predictors of RECDs. For the left ear, head circumference was retained as the only statistically significant variable in the final model. For the right ear, head circumference was retained as the only statistically significant independent variable at all frequencies except at 2000 and 4000 Hz. At these latter frequencies, weight was retained as the only statistically significant independent variable after all other variables were eliminated. Head circumference can be considered as a metric for RECD prediction in children when individual measurements cannot be obtained. In developing countries where equipment is often unavailable and stunted growth can reduce the value of using age as a metric, head circumference can be considered as an alternative metric in the prediction of RECDs. American Academy of Audiology.
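The backward elimination multiple regression used here can be sketched with ordinary least squares: repeatedly drop the predictor with the least significant coefficient until all survivors pass a significance threshold. The data below are synthetic, built only to mimic the reported finding that head circumference survives while height and weight are eliminated; threshold, names and values are my assumptions:

```python
import numpy as np

def backward_eliminate(X, y, names, t_crit=2.0):
    """Backward elimination for OLS with an intercept: repeatedly drop the
    predictor with the smallest |t|-statistic until every remaining
    predictor satisfies |t| >= t_crit (roughly p < 0.05 for large dof)."""
    keep = list(range(X.shape[1]))
    while keep:
        Z = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        dof = len(y) - Z.shape[1]
        s2 = resid @ resid / dof
        se = np.sqrt(np.diag(s2 * np.linalg.inv(Z.T @ Z)))
        t_abs = np.abs(beta / se)[1:]  # skip the intercept
        worst = int(np.argmin(t_abs))
        if t_abs[worst] >= t_crit:
            break
        keep.pop(worst)
    return [names[j] for j in keep]

# synthetic data: head circumference predicts RECD, height and weight do not
rng = np.random.default_rng(1)
n = 200
head = rng.normal(50.0, 3.0, n)
height = rng.normal(120.0, 10.0, n)
weight = rng.normal(25.0, 4.0, n)
recd = 0.8 * head + rng.normal(0.0, 1.0, n)
X = np.column_stack([head, height, weight])
kept = backward_eliminate(X, recd, ["head_circumference", "height", "weight"])
```

A production analysis would use exact p-values per frequency band, as the study does; the |t| cutoff here is a simplification.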

  11. Removing Batch Effects from Longitudinal Gene Expression - Quantile Normalization Plus ComBat as Best Approach for Microarray Transcriptome Data

    PubMed Central

    Müller, Christian; Schillert, Arne; Röthemeier, Caroline; Trégouët, David-Alexandre; Proust, Carole; Binder, Harald; Pfeiffer, Norbert; Beutel, Manfred; Lackner, Karl J.; Schnabel, Renate B.; Tiret, Laurence; Wild, Philipp S.; Blankenberg, Stefan

    2016-01-01

    Technical variation plays an important role in microarray-based gene expression studies, and batch effects explain a large proportion of this noise. It is therefore mandatory to eliminate technical variation while maintaining biological variability. Several strategies have been proposed for the removal of batch effects, although they have not been evaluated in large-scale longitudinal gene expression data. In this study, we aimed at identifying a suitable method for batch effect removal in a large study of microarray-based longitudinal gene expression. Monocytic gene expression was measured in 1092 participants of the Gutenberg Health Study at baseline and 5-year follow up. Replicates of selected samples were measured at both time points to identify technical variability. Deming regression, Passing-Bablok regression, linear mixed models, non-linear models as well as ReplicateRUV and ComBat were applied to eliminate batch effects between replicates. In a second step, quantile normalization prior to batch effect correction was performed for each method. Technical variation between batches was evaluated by principal component analysis. Associations between body mass index and transcriptomes were calculated before and after batch removal. Results from association analyses were compared to evaluate maintenance of biological variability. Quantile normalization, separately performed in each batch, combined with ComBat successfully reduced batch effects and maintained biological variability. ReplicateRUV performed perfectly in the replicate data subset of the study, but failed when applied to all samples. All other methods did not substantially reduce batch effects in the replicate data subset. Quantile normalization plus ComBat appears to be a valuable approach for batch correction in longitudinal gene expression data. PMID:27272489
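ComBat itself (an empirical Bayes batch adjustment) is beyond a short sketch, but the quantile normalization step that the study pairs with it is compact: force every sample (column) to share one reference distribution by replacing each value with the mean of the values at the same rank. A minimal numpy version (tie handling is arbitrary here; the toy matrix is mine):

```python
import numpy as np

def quantile_normalize(X):
    """Columns are samples, rows are genes. Replace each value by the mean,
    across samples, of the values at the same within-sample rank, so every
    column ends up with an identical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank of each entry per column
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)   # reference distribution
    return mean_quantiles[ranks]

# toy expression matrix where the third sample carries an additive batch shift
X = np.array([[5.0, 4.0, 9.0],
              [2.0, 1.0, 6.0],
              [3.0, 4.5, 8.0]])
Xn = quantile_normalize(X)  # all columns now share the same value set
```

After normalization, only the ordering of genes within each sample differs between columns, which is why the study applies it per batch before ComBat removes the remaining batch structure.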

  12. Variations in respiratory excretion of carbon dioxide can be used to calculate pulmonary blood flow.

    PubMed

    Preiss, David A; Azami, Takafumi; Urman, Richard D

    2015-02-01

    A non-invasive means of measuring pulmonary blood flow (PBF) would have numerous benefits in medicine. Traditionally, respiratory-based methods require breathing maneuvers, partial rebreathing, or foreign gas mixing because exhaled CO2 volume on a per-breath basis does not accurately represent alveolar exchange of CO2. We hypothesized that if the dilutional effect of the functional residual capacity was accounted for, the relationship between the calculated volume of CO2 removed per breath and the alveolar partial pressure of CO2 would be negatively linear. A computer model was developed that uses variable tidal breathing to calculate CO2 removal per breath at the level of the alveoli. We iterated estimates of functional residual capacity to create the best linear fit of alveolar CO2 pressure against CO2 elimination for 10 minutes of breathing, and incorporated the volume of CO2 elimination into the Fick equation to calculate PBF. The relationship between alveolar pressure of CO2 and CO2 elimination produced an R² = 0.83. The optimal functional residual capacity differed from the "actual" capacity by 0.25 L (8.3%). The repeatability coefficient leveled off at 0.09 after 10 breaths, and the difference between the PBF calculated by the model and the preset blood flow was 0.62 ± 0.53 L/minute. With variations in tidal breathing, a linear relationship exists between alveolar CO2 pressure and CO2 elimination. Existing technology may be used to calculate CO2 elimination during quiet breathing and might therefore be used to accurately calculate PBF in humans with healthy lungs.
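The core of the method is the Fick principle: per-breath CO2 elimination is linear in alveolar PCO2 with a slope proportional to PBF, so fitting that line recovers the flow. The sketch below uses that relationship on synthetic breaths; all constants are illustrative assumptions (not the paper's values), and the paper's simultaneous estimation of functional residual capacity is omitted:

```python
import numpy as np

# Illustrative constants (my assumptions, not values from the paper)
Q_TRUE = 5.0         # pulmonary blood flow, L/min
S_CO2 = 5.0          # slope of the CO2 dissociation curve, mL CO2/(L blood * mmHg)
PV_CO2 = 46.0        # mixed-venous PCO2, mmHg
T_BREATH = 1.0 / 15  # breath period, min (15 breaths per minute)

rng = np.random.default_rng(7)
pa_co2 = rng.uniform(35.0, 45.0, 30)  # alveolar PCO2 varying breath to breath

# Fick: per-breath CO2 elimination = Q * S * (PvCO2 - PACO2) * T,
# i.e. linear in PACO2 with slope -Q*S*T (the reported negative linearity)
vco2 = Q_TRUE * S_CO2 * (PV_CO2 - pa_co2) * T_BREATH
vco2 += rng.normal(0.0, 0.2, vco2.size)  # measurement noise, mL

slope, intercept = np.polyfit(pa_co2, vco2, 1)
q_est = -slope / (S_CO2 * T_BREATH)  # recovered PBF estimate, L/min
```

Varying tidal breathing is what spreads the alveolar PCO2 values enough to make the line fittable, which is why the model relies on natural breath-to-breath variability.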

  13. Gene selection in cancer classification using sparse logistic regression with Bayesian regularization.

    PubMed

    Cawley, Gavin C; Talbot, Nicola L C

    2006-10-01

    Gene selection algorithms for cancer classification, based on the expression of a small number of biomarker genes, have been the subject of considerable research in recent years. Shevade and Keerthi propose a gene selection algorithm based on sparse logistic regression (SLogReg) incorporating a Laplace prior to promote sparsity in the model parameters, and provide a simple but efficient training procedure. The degree of sparsity obtained is determined by the value of a regularization parameter, which must be carefully tuned in order to optimize performance. This normally involves a model selection stage, based on a computationally intensive search for the minimizer of the cross-validation error. In this paper, we demonstrate that a simple Bayesian approach can be taken to eliminate this regularization parameter entirely, by integrating it out analytically using an uninformative Jeffreys prior. The improved algorithm (BLogReg) is then typically two or three orders of magnitude faster than the original algorithm, as there is no longer a need for a model selection step. The BLogReg algorithm is also free from selection bias in performance estimation, a common pitfall in the application of machine learning algorithms in cancer classification. The SLogReg, BLogReg and Relevance Vector Machine (RVM) gene selection algorithms are evaluated over the well-studied colon cancer and leukaemia benchmark datasets. The leave-one-out estimates of the probability of test error and cross-entropy of the BLogReg and SLogReg algorithms are very similar; however, the BLogReg algorithm is found to be considerably faster than the original SLogReg algorithm. Using nested cross-validation to avoid selection bias, performance estimation for SLogReg on the leukaemia dataset takes almost 48 h, whereas the corresponding result for BLogReg is obtained in only 1 min 24 s, making BLogReg by far the more practical algorithm. BLogReg also demonstrates better estimates of conditional probability than the RVM, which are of great importance in medical applications, with similar computational expense. A MATLAB implementation of the sparse logistic regression algorithm with Bayesian regularization (BLogReg) is available from http://theoval.cmp.uea.ac.uk/~gcc/cbl/blogreg/
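BLogReg's analytic integration over the regularization parameter is not reproduced here, but the Laplace-prior sparsity it builds on is equivalent to L1-penalized logistic regression, which can be sketched with proximal gradient descent at a fixed penalty. Names and the toy data are my assumptions:

```python
import numpy as np

def l1_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    """Sparse logistic regression by proximal gradient descent: a gradient
    step on the logistic loss followed by soft-thresholding, the proximal
    map of the L1 penalty induced by a Laplace prior. y must be in {0, 1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * (X.T @ (p - y) / n)                           # loss gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # prox (soft-threshold)
    return w

# toy "expression data": only gene 0 carries the class signal
rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, (200, 10))
y = (X[:, 0] + 0.3 * rng.normal(0.0, 1.0, 200) > 0).astype(float)
w = l1_logistic(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-6)  # retained "biomarker" indices
```

SLogReg tunes `lam` by cross-validation; BLogReg's contribution is removing that tuning loop entirely by integrating `lam` out under a Jeffreys prior.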

  14. Partial Granger causality--eliminating exogenous inputs and latent variables.

    PubMed

    Guo, Shuixia; Seth, Anil K; Kendrick, Keith M; Zhou, Cong; Feng, Jianfeng

    2008-07-15

    Attempts to identify causal interactions in multivariable biological time series (e.g., gene data, protein data, physiological data) can be undermined by the confounding influence of environmental (exogenous) inputs. Compounding this problem, we are commonly only able to record a subset of all related variables in a system. These recorded variables are likely to be influenced by unrecorded (latent) variables. To address this problem, we introduce a novel variant of a widely used statistical measure of causality--Granger causality--that is inspired by the definition of partial correlation. Our 'partial Granger causality' measure is extensively tested with toy models, both linear and nonlinear, and is applied to experimental data: in vivo multielectrode array (MEA) local field potentials (LFPs) recorded from the inferotemporal cortex of sheep. Our results demonstrate that partial Granger causality can reveal the underlying interactions among elements in a network in the presence of exogenous inputs and latent variables in many cases where the existing conditional Granger causality fails.
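The baseline that partial Granger causality extends can be sketched in a few lines: y Granger-causes x if adding lagged y to an autoregressive model of x shrinks the residual variance. The correction terms for exogenous and latent inputs that define *partial* Granger causality are not included in this sketch; it shows only the standard bivariate measure, on a toy system I construct so that y drives x:

```python
import numpy as np

def lagged(v, lags, n):
    """Columns v[t-1], ..., v[t-lags] for t = lags .. n-1."""
    return np.column_stack([v[lags - k : n - k] for k in range(1, lags + 1)])

def granger(x, y, lags=2):
    """Standard bivariate Granger causality y -> x: log-ratio of residual
    variances of an AR model of x without vs. with past values of y."""
    n = len(x)
    target = x[lags:]
    Xr = np.column_stack([np.ones(n - lags), lagged(x, lags, n)])  # restricted
    Xu = np.column_stack([Xr, lagged(y, lags, n)])                 # unrestricted
    br, *_ = np.linalg.lstsq(Xr, target, rcond=None)
    bu, *_ = np.linalg.lstsq(Xu, target, rcond=None)
    var_r = np.var(target - Xr @ br)
    var_u = np.var(target - Xu @ bu)
    return float(np.log(var_r / var_u))

# toy system in which y drives x but not vice versa
rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
eps = rng.normal(0.0, 1.0, (n, 2))
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + eps[t, 0]
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + eps[t, 1]

gc_y_to_x = granger(x, y)  # substantial
gc_x_to_y = granger(y, x)  # near zero
```

A common exogenous input added to both series would inflate both measures; that confound is exactly what the paper's partial variant is built to remove.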

  15. SVM-RFE based feature selection and Taguchi parameters optimization for multiclass SVM classifier.

    PubMed

    Huang, Mei-Ling; Hung, Yung-Hsiang; Lee, W M; Li, R K; Jiang, Bo-Ru

    2014-01-01

    Recently, the support vector machine (SVM) has shown excellent performance in classification and prediction and is widely used in disease diagnosis and medical assistance. However, SVM functions well only on two-group classification problems. This study combines feature selection and SVM recursive feature elimination (SVM-RFE) to investigate the classification accuracy of multiclass problems for the Dermatology and Zoo databases. The Dermatology dataset contains 33 feature variables, 1 class variable, and 366 testing instances; the Zoo dataset contains 16 feature variables, 1 class variable, and 101 testing instances. The feature variables in the two datasets were sorted in descending order of explanatory power, and different feature sets were selected by SVM-RFE to explore classification accuracy. Meanwhile, the Taguchi method was combined with the SVM classifier to optimize the parameters C and γ and increase classification accuracy for multiclass classification. The experimental results show that classification accuracy can exceed 95% after SVM-RFE feature selection and Taguchi parameter optimization for the Dermatology and Zoo databases.
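SVM-RFE itself is a simple loop: train a linear SVM, drop the feature with the smallest weight magnitude, retrain, and repeat. This sketch uses a tiny subgradient-descent linear SVM on synthetic binary data; it is not the study's multiclass, Taguchi-tuned setup, and all names and data are my assumptions:

```python
import numpy as np

def linear_svm(X, y, lam=0.01, lr=0.01, epochs=300):
    """Tiny linear SVM trained by full-batch subgradient descent on the
    regularized hinge loss. y must be in {-1, +1}; no bias term, for brevity."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1  # samples violating the margin
        grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        w -= lr * grad
    return w

def svm_rfe(X, y, n_keep=2):
    """Recursive feature elimination: retrain, drop the feature with the
    smallest |w|, repeat until n_keep features remain."""
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        w = linear_svm(X[:, keep], y)
        keep.pop(int(np.argmin(np.abs(w))))
    return keep

# synthetic data: only features 1 and 4 carry the class signal
rng = np.random.default_rng(5)
X = rng.normal(0.0, 1.0, (300, 8))
y = np.sign(1.5 * X[:, 1] + 2.0 * X[:, 4] + 0.1 * rng.normal(0.0, 1.0, 300))
kept = svm_rfe(X, y, n_keep=2)
```

Retraining after every elimination matters: weights shift as correlated features leave, which is what distinguishes RFE from a one-shot ranking.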

  16. SVM-RFE Based Feature Selection and Taguchi Parameters Optimization for Multiclass SVM Classifier

    PubMed Central

    Huang, Mei-Ling; Hung, Yung-Hsiang; Lee, W. M.; Li, R. K.; Jiang, Bo-Ru

    2014-01-01

    Recently, the support vector machine (SVM) has shown excellent performance in classification and prediction and is widely used in disease diagnosis and medical assistance. However, SVM functions well only on two-group classification problems. This study combines feature selection and SVM recursive feature elimination (SVM-RFE) to investigate the classification accuracy of multiclass problems for the Dermatology and Zoo databases. The Dermatology dataset contains 33 feature variables, 1 class variable, and 366 testing instances; the Zoo dataset contains 16 feature variables, 1 class variable, and 101 testing instances. The feature variables in the two datasets were sorted in descending order of explanatory power, and different feature sets were selected by SVM-RFE to explore classification accuracy. Meanwhile, the Taguchi method was combined with the SVM classifier to optimize the parameters C and γ and increase classification accuracy for multiclass classification. The experimental results show that classification accuracy can exceed 95% after SVM-RFE feature selection and Taguchi parameter optimization for the Dermatology and Zoo databases. PMID:25295306

  17. Study of the Algorithm of Backtracking Decoupling and Adaptive Extended Kalman Filter Based on the Quaternion Expanded to the State Variable for Underwater Glider Navigation

    PubMed Central

    Huang, Haoqian; Chen, Xiyuan; Zhou, Zhikai; Xu, Yuan; Lv, Caiping

    2014-01-01

    High accuracy attitude and position determination is very important for underwater gliders. The cross-coupling among the three attitude angles (heading, pitch and roll) becomes more serious when pitch or roll motion occurs, making the attitude angles inaccurate or even erroneous. Therefore, high accuracy attitude and position determination is a difficult problem for a practical underwater glider. To solve this problem, this paper proposes backtracking decoupling and an adaptive extended Kalman filter (EKF) based on the quaternion expanded to the state variable (BD-AEKF). The backtracking decoupling can effectively eliminate the cross-coupling among the three attitude angles when pitch or roll motion occurs. After decoupling, the adaptive extended Kalman filter (AEKF) based on the quaternion expanded to the state variable further smoothes the filtering output to improve the accuracy and stability of attitude and position determination. To evaluate the performance of the proposed BD-AEKF method, pitch and roll motion are simulated and the performance of the proposed method is analyzed and compared with the traditional method. Simulation results demonstrate that the proposed BD-AEKF performs better. Furthermore, for further verification, a new underwater navigation system is designed, and three-axis non-magnetic turntable experiments and vehicle experiments are performed. The results show that the proposed BD-AEKF is effective in eliminating cross-coupling and reducing the errors compared with the conventional method. PMID:25479331

  18. Selenium analysis by an integrated microwave digestion-needle trap device with hydride sorption on carbon nanotubes and electrothermal atomic absorption spectrometry determination

    NASA Astrophysics Data System (ADS)

    Maratta Martínez, Ariel; Vázquez, Sandra; Lara, Rodolfo; Martínez, Luis Dante; Pacheco, Pablo

    2018-02-01

    An integrated microwave assisted digestion (MW-AD) - needle trap device (NTD) system for selenium determination in grape pomace samples is presented. The NTD was filled with oxidized multiwall carbon nanotubes (oxMWCNTs) on which Se hydrides were preconcentrated. Determination was carried out by flow injection-electrothermal atomic absorption spectrometry (FI-ETAAS). The variables affecting the system were established by a multivariate design (Plackett-Burman), indicating that the following variables significantly affect the system: sample amount, HNO3 digestion solution concentration, NaBH4 volume and elution volume. A Box-Behnken design was implemented to determine the optimized values of these variables. The system improved Se atomization in the graphite furnace, since only trapped hydrides reached the graphite furnace, and the pyrolysis stage could be eliminated owing to the aqueous matrix of the eluate. Under optimized conditions the system reached a limit of quantification of 0.11 μg kg⁻¹, a detection limit of 0.032 μg kg⁻¹, a relative standard deviation of 4% and a preconcentration factor (PF) of 100, achieving a throughput of 5 samples per hour. Sample analysis shows Se concentrations between 0.34 ± 0.03 μg kg⁻¹ and 0.48 ± 0.03 μg kg⁻¹ in grape pomace. This system requires minimal reagent and sample consumption and eliminates discontinuous stages between sample processing steps, making Se analysis simpler and faster.

  19. Study of the algorithm of backtracking decoupling and adaptive extended Kalman filter based on the quaternion expanded to the state variable for underwater glider navigation.

    PubMed

    Huang, Haoqian; Chen, Xiyuan; Zhou, Zhikai; Xu, Yuan; Lv, Caiping

    2014-12-03

    High accuracy attitude and position determination is very important for underwater gliders. The cross-coupling among the three attitude angles (heading, pitch and roll) becomes more serious when pitch or roll motion occurs, making the attitude angles inaccurate or even erroneous. Therefore, high accuracy attitude and position determination is a difficult problem for a practical underwater glider. To solve this problem, this paper proposes backtracking decoupling and an adaptive extended Kalman filter (EKF) based on the quaternion expanded to the state variable (BD-AEKF). The backtracking decoupling can effectively eliminate the cross-coupling among the three attitude angles when pitch or roll motion occurs. After decoupling, the adaptive extended Kalman filter (AEKF) based on the quaternion expanded to the state variable further smoothes the filtering output to improve the accuracy and stability of attitude and position determination. To evaluate the performance of the proposed BD-AEKF method, pitch and roll motion are simulated and the performance of the proposed method is analyzed and compared with the traditional method. Simulation results demonstrate that the proposed BD-AEKF performs better. Furthermore, for further verification, a new underwater navigation system is designed, and three-axis non-magnetic turntable experiments and vehicle experiments are performed. The results show that the proposed BD-AEKF is effective in eliminating cross-coupling and reducing the errors compared with the conventional method.

  20. Generalized hydrodynamic correlations and fractional memory functions

    NASA Astrophysics Data System (ADS)

    Rodríguez, Rosalio F.; Fujioka, Jorge

    2015-12-01

    A fractional generalized hydrodynamic (GH) model of the longitudinal velocity fluctuation correlation, and its associated memory function, for a complex fluid is analyzed. The adiabatic elimination of fast variables introduces memory effects into the transport equations, and the dynamics of the fluctuations are described by a generalized Langevin equation with long-range noise correlations. These features motivate the introduction of Caputo time fractional derivatives and allow us to calculate analytic expressions for the fractional longitudinal velocity correlation function and its associated memory function. Our analysis eliminates a spurious constant term found in the non-fractional memory function. It also produces a significantly slower power-law decay of the memory function in the GH regime that reduces to the well-known exponential decay in the non-fractional Navier-Stokes limit.

  1. Primary encopresis: evaluation and treatment.

    PubMed Central

    O'Brien, S; Ross, L V; Christophersen, E R

    1986-01-01

    Cathartic and behavioral treatment procedures for eliminating diurnal and nocturnal primary encopresis were investigated using a multiple-baseline design across four children. The dependent and independent variables measured were appropriate bowel movements, soiling accidents, independent toiletings, and cathartic use. Over 177 reliability observations (home visits) were conducted. For two of the children, treatment with cathartics and child-time remedied their soiling accidents and increased their independent toiletings in 8 to 11 weeks. While the cathartics and child-time increased the rate of appropriate bowel movements, they did not eliminate the soiling accidents with the other two children. Independent toiletings for these two children were achieved after 32 to 39 weeks of treatment when punishment procedures (positive practice, time-out, and hourly toilet sits) were incorporated and the suppositories were faded systematically. PMID:3733585

  2. An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.

    PubMed

    Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin

    2015-07-01

    We propose an integrated mechanism for discarding derogatory features and extracting fuzzy rules based on an interval type-2 neural fuzzy system (NFS); in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variables and a large number of rules not only increase the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for simultaneously extracting fuzzy rules and reducing the impact of (or eliminating) inferior features is necessary. The proposed approach, namely the interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. Poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.

  3. Reinforcement Learning Trees

    PubMed Central

    Zhu, Ruoqing; Zeng, Donglin; Kosorok, Michael R.

    2015-01-01

    In this paper, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as random forests (Breiman, 2001) under high-dimensional settings. The innovations are three-fold. First, the new method implements reinforcement learning at each selection of a splitting variable during the tree construction process. By splitting on the variable that brings the greatest future improvement in later splits, rather than choosing the one with the largest marginal effect from the immediate split, the constructed tree utilizes the available samples in a more efficient way. Moreover, such an approach enables linear combination cuts at little extra computational cost. Second, we propose a variable muting procedure that progressively eliminates noise variables during the construction of each individual tree. The muting procedure also takes advantage of reinforcement learning and prevents noise variables from being considered in the search for splitting rules, so that towards terminal nodes, where the sample size is small, the splitting rules are still constructed from only strong variables. Last, we investigate asymptotic properties of the proposed method under basic assumptions and discuss the rationale in general settings. PMID:26903687
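The RLT muting procedure is driven by reinforcement learning; as a deliberately simplified stand-in, the idea of restricting splits to strong variables can be sketched with a marginal-importance filter (hypothetical scoring rule, not the paper's criterion):

```python
import numpy as np

def mute_variables(X, y, keep_frac=0.5):
    """Toy 'muting': rank features by absolute correlation with y and keep
    only the strongest fraction for later splits. This marginal score is a
    hypothetical stand-in for RLT's reinforcement-learning-based importance."""
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    k = max(1, int(keep_frac * X.shape[1]))
    return np.argsort(scores)[::-1][:k]   # indices of retained variables

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = 3 * X[:, 2] - 2 * X[:, 7] + rng.normal(scale=0.1, size=500)
kept = mute_variables(X, y, keep_frac=0.2)  # keep the top 2 of 10 variables
```

With two strong signal variables and eight noise variables, the filter retains columns 2 and 7; RLT applies the same principle progressively as each tree grows, so that deep nodes only search over strong variables.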

  4. Acute brain trauma

    PubMed Central

    Martin, GT

    2016-01-01

    In the 20th century, the complications of head injuries were controlled but not eliminated. The wars of the 21st century turned attention to blast, the instant of impact and the primary injury of concussion. Computer calculations have established that in the first 5 milliseconds after the impact, four independent injuries on the brain are inflicted: 1) impact and its shockwave, 2) deceleration, 3) rotation and 4) skull deformity with vibration (or resonance). The recovery, pathology and symptoms after acute brain trauma have always been something of a puzzle. The variability of these four modes of injury, along with a variable reserve of neurones, explains some of this problem. PMID:26688392

  5. A variable mixing-length ratio for convection theory

    NASA Technical Reports Server (NTRS)

    Chan, K. L.; Wolff, C. L.; Sofia, S.

    1981-01-01

    It is argued that a natural choice for the local mixing length in the mixing-length theory of convection has a value proportional to the local density scale height of the convective bubbles. The resultant variable mixing-length ratio (the ratio between the mixing length and the pressure scale height) of this theory is enhanced in the superadiabatic region and approaches a constant in deeper layers. Numerical tests show that the new mixing length successfully eliminates most of the density inversion that typically plagues conventional results. The new approach also seems to indicate the existence of granular motion at the top of the convection zone.

  6. An introduction to instrumental variables analysis: part 1.

    PubMed

    Bennett, Derrick A

    2010-01-01

    There are several examples in the medical literature where the associations of treatment effects predicted by observational studies have been refuted by evidence from subsequent large-scale randomised trials. This is because non-experimental studies are subject to confounding, and confounding cannot be entirely eliminated even if all known confounders have been measured in the study, as there may be unknown confounders. The aim of this two-part methodological primer is to introduce an emerging methodology, known as the method of instrumental variables, for estimating treatment effects using observational data in the absence of good randomised evidence. Copyright © 2010 S. Karger AG, Basel.
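As a minimal sketch of the instrumental-variables idea the primer introduces (simulated toy data and a simple Wald-ratio estimator, assumed here for illustration only):

```python
import numpy as np

# Simulated confounding: u affects both treatment x and outcome y, so a
# naive regression of y on x is biased; the instrument z shifts x but has
# no direct path to y. The true treatment effect is 2.
rng = np.random.default_rng(42)
n = 20000
z = rng.normal(size=n)                      # instrument
u = rng.normal(size=n)                      # unmeasured confounder
x = 0.8 * z + u + rng.normal(size=n)        # endogenous treatment
y = 2.0 * x + u + rng.normal(size=n)        # outcome

beta_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)   # confounded estimate
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]   # IV (Wald ratio)
```

Here the naive estimate is biased upward by the unmeasured confounder, while the instrumental-variables ratio recovers the true effect because the instrument is independent of the confounder.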

  7. High-efficiency Gaussian key reconciliation in continuous variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bai, ZengLiang; Wang, XuYang; Yang, ShenShen; Li, YongMin

    2016-01-01

    Efficient reconciliation is a crucial step in continuous variable quantum key distribution. The progressive-edge-growth (PEG) algorithm is an efficient method to construct relatively short block length low-density parity-check (LDPC) codes. The quasi-cyclic construction method can extend short block length codes and further eliminate the shortest cycle. In this paper, by combining the PEG algorithm and the quasi-cyclic construction method, we design long block length irregular LDPC codes with high error-correcting capacity. Based on these LDPC codes, we achieve high-efficiency Gaussian key reconciliation with slice reconciliation based on multilevel coding/multistage decoding, with an efficiency of 93.7%.

  8. Population pharmacokinetic study of isepamicin with intensive care unit patients.

    PubMed Central

    Tod, M; Padoin, C; Minozzi, C; Cougnard, J; Petitjean, O

    1996-01-01

    The pharmacokinetics (PK) of isepamicin, a new aminoglycoside, were studied in 85 intensive care unit (ICU) patients and were compared with those observed in 10 healthy volunteers. A parametric method based on a nonlinear mixed-effect model was used to assess population PK. Isepamicin was given intravenously over 0.5 h at dosages of 15 mg/kg once daily or 7.5 mg/kg twice daily. The data were fitted to a bicompartmental open model. Compared with healthy volunteers, the mean values of the PK parameters were profoundly modified in ICU patients: elimination clearance was reduced by 48%, the volume of distribution in the central compartment (Vc) was increased by 50%, the peripheral volume of distribution was 70% higher, the distribution clearance was 146% lower, and the elimination half-life was ca. 3.4 times higher. The interindividual variability in PK parameters was about 50% in ICU patients. Five covariates (body weight [BW], simplified acute physiology score [SAPS], temperature, serum creatinine level, and creatinine clearance [CLCR]) were tentatively correlated with PK parameters by multivariate linear regression analysis with stepwise addition and deletion. The variability of isepamicin clearance was explained by three covariates (BW, SAPS, and CLCR), that of Vc was explained by BW and SAPS, and that of the elimination half-life was explained by CLCR and SAPS. Simulation of the concentration-versus-time profile for 500 individuals showed that the mean peak (0.75 h) concentration was 18% lower in ICU patients than in healthy volunteers and that the range in ICU patients was very broad (28.4 to 95.4 mg/liter). Therefore, monitoring of the isepamicin concentration in ICU patients is mandatory. PMID:8849264

  9. Detection of dehydration by using volume kinetics.

    PubMed

    Zdolsek, Joachim; Li, Yuhong; Hahn, Robert G

    2012-10-01

    Patients admitted to surgery may be dehydrated, which is difficult to diagnose except when it is severe (>5% of the body weight). We hypothesized that modest dehydration can be detected by kinetic analysis of the blood hemoglobin concentration after a bolus infusion of crystalloid fluid. Four series of experiments were performed on 10 conscious, healthy male volunteers. Separated by at least 2 days, they received 5 or 10 mL/kg acetated Ringer's solution over 15 minutes. Before starting half of the IV infusions, volume depletion amounting to 1.5 to 2.0 L (approximately 2% of body weight) was induced with furosemide. The elimination clearance and the half-life of the infused fluid were calculated based on blood hemoglobin over 120 minutes. The perfusion index and the pleth variability index were monitored by pulse oximetry after a change of body position. Dehydration decreased the elimination clearance of acetated Ringer's solution [median (25th-75th percentile)] from 1.84 (1.23-2.57) to 0.53 (0.41-0.79) mL/kg/min (Wilcoxon matched-pair test P < 0.001) and increased the half-life from 23 (12-37) to 76 (57-101) minutes (P < 0.001). The smaller infusion, 5 mL/kg, fully discriminated between experiments performed in the euhydrated and dehydrated states, whereas the urinary excretion provided a less-reliable indication of hydration status. Dehydration decreased the perfusion index but did not affect the pleth variability index. Dehydration amounting to 2% of the body weight could be detected from the elimination clearance and the half-life of an infusion of 5 mL/kg Ringer's solution.
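A back-of-the-envelope check on the reported medians, assuming simple one-compartment kinetics where t1/2 = ln 2 · Vd / CL (the distribution-volume term is our assumption for illustration; the study does not report volumes):

```python
import math

# One-compartment relationship: t_half = ln(2) * Vd / CL. Rearranged,
# each reported clearance/half-life pair implies a distribution volume.
def implied_volume(cl_ml_per_kg_min, t_half_min):
    return t_half_min * cl_ml_per_kg_min / math.log(2)   # mL/kg

v_euhydrated = implied_volume(1.84, 23)   # CL 1.84 mL/kg/min, t1/2 23 min
v_dehydrated = implied_volume(0.53, 76)   # CL 0.53 mL/kg/min, t1/2 76 min
```

The two implied volumes come out similar (roughly 60 mL/kg), which would be consistent with dehydration chiefly slowing elimination rather than altering distribution, though this reading goes beyond what the abstract states.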

  10. Inhibitory input from slowly adapting lung stretch receptors to retrotrapezoid nucleus chemoreceptors

    PubMed Central

    Moreira, Thiago S; Takakura, Ana C; Colombari, Eduardo; West, Gavin H; Guyenet, Patrice G

    2007-01-01

    The retrotrapezoid nucleus (RTN) contains CO2-activated interneurons with properties consistent with central respiratory chemoreceptors. These neurons are glutamatergic and express the transcription factor Phox2b. Here we tested whether RTN neurons receive an input from slowly adapting pulmonary stretch receptors (SARs) in halothane-anaesthetized ventilated rats. In vagotomized rats, RTN neurons were inhibited to a variable extent by stimulating myelinated vagal afferents using the lowest intensity needed to inhibit the phrenic nerve discharge (PND). In rats with intact vagus nerves, RTN neurons were inhibited, also to a variable extent, by increasing positive end-expiratory pressure (PEEP; 2–6 cmH2O). The cells most sensitive to PEEP were inhibited during each lung inflation at rest and were instantly activated by stopping ventilation. Muscimol (GABA-A agonist) injection in or next to the solitary tract at area postrema level desynchronized PND from ventilation, eliminated the lung inflation-synchronous inhibition of RTN neurons and their steady inhibition by PEEP but did not change their CO2 sensitivity. Muscimol injection into the rostral ventral respiratory group eliminated PND but did not change RTN neuron response to either lung inflation, PEEP increases, vagal stimulation or CO2. Generalized glutamate receptor blockade with intracerebroventricular (i.c.v.) kynurenate eliminated PND and the response of RTN neurons to lung inflation but did not change their CO2 sensitivity. PEEP-sensitive RTN neurons expressed Phox2b. In conclusion, RTN chemoreceptors receive an inhibitory input from myelinated lung stretch receptors, presumably SARs. The lung input to RTN may be di-synaptic with inhibitory pump cells as sole interneurons. PMID:17255166

  11. Population Pharmacokinetic Analysis of Isoniazid, Acetylisoniazid, and Isonicotinic Acid in Healthy Volunteers

    PubMed Central

    Seng, Kok-Yong; Hee, Kim-Hor; Soon, Gaik-Hong; Chew, Nicholas; Khoo, Saye H.

    2015-01-01

    In this study, we aimed to quantify the effects of the N-acetyltransferase 2 (NAT2) phenotype on isoniazid (INH) metabolism in vivo and identify other sources of pharmacokinetic variability following single-dose administration in healthy Asian adults. The concentrations of INH and its metabolites acetylisoniazid (AcINH) and isonicotinic acid (INA) in plasma were evaluated in 33 healthy Asians who were also given efavirenz and rifampin. The pharmacokinetics of INH, AcINH, and INA were analyzed using nonlinear mixed-effects modeling (NONMEM) to estimate the population pharmacokinetic parameters and evaluate the relationships between the parameters and the elimination status (fast, intermediate, and slow acetylators), demographic status, and measures of renal and hepatic function. A two-compartment model with first-order absorption best described the INH pharmacokinetics. AcINH and INA data were best described by a two- and a one-compartment model, respectively, linked to the INH model. In the final model for INH, the derived metabolic phenotypes for NAT2 were identified as a significant covariate in the INH clearance, reducing its interindividual variability from 86% to 14%. The INH clearance in fast eliminators was 1.9- and 7.7-fold higher than in intermediate and slow eliminators, respectively (65 versus 35 and 8 liters/h). Creatinine clearance was confirmed as a significant covariate for AcINH clearance. Simulations suggested that the current dosing guidelines (200 mg for 30 to 45 kg and 300 mg for >45 kg) may be suboptimal (3 mg/liter ≤ Cmax ≤ 6 mg/liter) irrespective of the acetylator class. The analysis established a model that adequately characterizes INH, AcINH, and INA pharmacokinetics in healthy Asians. Our results refine the NAT2 phenotype-based predictions of the pharmacokinetics for INH. PMID:26282412

  12. Systematic review of methods to predict and detect anastomotic leakage in colorectal surgery.

    PubMed

    Hirst, N A; Tiernan, J P; Millner, P A; Jayne, D G

    2014-02-01

    Anastomotic leakage is a serious complication of gastrointestinal surgery resulting in increased morbidity and mortality, poor function and predisposing to cancer recurrence. Earlier diagnosis and intervention can minimize systemic complications but is hindered by current diagnostic methods that are non-specific and often uninformative. The purpose of this paper is to review current developments in the field and to identify strategies for early detection and treatment of anastomotic leakage. A systematic literature search was performed using the MEDLINE, Embase, PubMed and Cochrane Library databases. Search terms included 'anastomosis' and 'leak' and 'diagnosis' or 'detection' and 'gastrointestinal' or 'colorectal'. Papers concentrating on the diagnosis of gastrointestinal anastomotic leak were identified and further searches were performed by cross-referencing. Computerized tomography (CT) scanning and water-soluble contrast studies are the current preferred techniques for diagnosing anastomotic leakage but suffer from variable sensitivity and specificity, have logistical constraints and may delay timely intervention. Intra-operative endoscopy and imaging may offer certain advantages, but the ability to predict anastomotic leakage is unproven. Newer techniques involve measurement of biomarkers for anastomotic leakage and have the potential advantage of providing cheap real-time monitoring for postoperative complications. Current diagnostic tests often fail to diagnose anastomotic leak at an early stage that enables timely intervention and minimizes serious morbidity and mortality. Emerging technologies, based on detection of local biomarkers, have achieved proof of concept status but require further evaluation to determine whether they translate into improved patient outcomes. Further research is needed to address this important, yet relatively unrecognized, area of unmet clinical need.
Colorectal Disease © 2013 The Association of Coloproctology of Great Britain and Ireland.

  13. Non-linear effects of transcranial direct current stimulation as a function of individual baseline performance: Evidence from biparietal tDCS influence on lateralized attention bias.

    PubMed

    Benwell, Christopher S Y; Learmonth, Gemma; Miniussi, Carlo; Harvey, Monika; Thut, Gregor

    2015-08-01

    Transcranial direct current stimulation (tDCS) is a well-established technique for non-invasive brain stimulation (NIBS). However, the technique suffers from a high variability in outcome, some of which is likely explained by the state of the brain at tDCS delivery but for which explanatory, mechanistic models are lacking. Here, we tested the effects of bi-parietal tDCS on perceptual line bisection as a function of tDCS current strength (1 mA vs 2 mA) and individual baseline discrimination sensitivity (a measure associated with intrinsic uncertainty/signal-to-noise balance). Our main findings were threefold. We replicated a previous finding (Giglia et al., 2011) of a rightward shift in subjective midpoint after Left anode/Right cathode tDCS over parietal cortex (sham-controlled). We found this effect to be weak over our entire sample (n = 38), but substantial in a subset of participants when they were split according to tDCS intensity and baseline performance. This was due to a complex, nonlinear interaction between these two factors. Our data lend further support to the notion of state-dependency in NIBS, which suggests that outcome depends on the endogenous balance between task-informative 'signal' and task-uninformative 'noise' at baseline. The results highlight the strong influence of individual differences and variations in experimental parameters on tDCS outcome, and the importance of fostering knowledge on the factors influencing tDCS outcome across cognitive domains. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Light Actuation of Liquid by Optoelectrowetting

    DTIC Science & Technology

    2005-06-01

    liquid lenses with variable focal length [7]. Transport of liquid in droplet forms offers many advantages. It eliminates the need for pumps and...novel mechanism for light actuation of liquid droplets. This is realized by integrating a photoconductive material underneath the electrowetting ...optoelectrowetting 2.1. General concept Fig. 1(a) shows the general electrowetting mechanism. A droplet of polarizable liquid is placed on a substrate

  15. Guaranteed Student Loans: Eliminating Interest Rate Floors Could Generate Substantial Savings. Report to the Honorable George D. Mitchell, U.S. Senate.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    A study was done of how interest rate floors on certain guaranteed student loans affect the federal government's and students' costs when rates on short-term government securities decline. The study developed cost comparisons using fixed and variable loan interest rates. For comparison, Department of Education projections of loan volumes for fiscal…

  16. Noise enhanced stability of a metastable state containing coupled Brownian particles

    NASA Astrophysics Data System (ADS)

    Singh, R. K.

    2017-05-01

    The dynamics of coupled Brownian particles in a metastable state, driven by color-correlated additive Gaussian colored noises, are analyzed to study the phenomenon of noise enhanced stability. The lifetime of such a metastable state is found to depend on the noise correlations and initial conditions. The dynamics of the slow variable are analyzed using the method of adiabatic elimination in the weak color limit.
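Schematically, adiabatic elimination of a fast variable can be written as follows (a generic two-variable sketch under a timescale-separation assumption, not the paper's specific model):

```latex
\dot{x} = f(x, y), \qquad \varepsilon\,\dot{y} = g(x, y), \qquad \varepsilon \ll 1 .
```

In the limit $\varepsilon \to 0$ the fast variable relaxes to its quasi-steady state $y^{*}(x)$ defined by $g\bigl(x, y^{*}(x)\bigr) = 0$, and the slow dynamics reduce to $\dot{x} = f\bigl(x, y^{*}(x)\bigr)$. The weak color limit mentioned in the abstract plays the role of the small parameter here.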

  17. Development and fabrication of a chargeable magnet system for spacecraft control

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design of variable permanent magnets for use in magnetic balancing and control of earth orbiting spacecraft is discussed. These magnets can be used instead of air coils or electromagnets in applications where the objective is to produce, or eliminate, torque on the spacecraft through interaction with the earth's magnetic field. The configuration of the magnet for minimum size and weight is described.

  18. Influence of persistent exchangeable oxygen on biogenic silica δ18O in deep sea cores

    NASA Astrophysics Data System (ADS)

    Menicucci, A. J.; Spero, H. J.

    2016-12-01

    The removal of exchangeable oxygen from biogenic opal prior to IRMS analysis is critical during sample preparation. Exchangeable oxygen is found in the form of hydroxyl and between defects within the amorphous silicate lattice structure. Typical analytical procedures utilize a variety of dehydroxylation methods to eliminate this exchangeable oxygen, including vacuum dehydroxylation and prefluorination. Such methods are generally considered sufficient for elimination of non-lattice-bound oxygen that would obfuscate environmental oxygen isotopic signals contained within the silicate tetrahedra. The δ18O data are then empirically calibrated against modern hydrographic data and applied down core in paleoceanographic applications. We have conducted a suite of experiments on purified marine opal samples using the new microfluorination method (Menicucci et al., 2013). Our data demonstrate that the amount of exchangeable oxygen in biogenic opal decreases as sample age/depth in core increases. These changes are not accounted for by current researchers. Further, our experimental data indicate that vacuum dehydroxylation does not eliminate all exchangeable oxygen, even after hydroxyl is undetectable. We have conducted experiments to quantify the amount of time necessary to ensure vacuum dehydroxylation has eliminated exchangeable oxygen so that opal samples are stable prior to δ18Odiatom analysis. Our experiments suggest that previously generated opal δ18O data may contain a variable down-core offset due to the presence of exchangeable, non-lattice-bound oxygen sources. Our experiments indicate that diatom silica requires dehydroxylation for ≥ 44 hours at 1060 °C to quantitatively remove all non-lattice-bound oxygen. Further, this variable amount of exchangeable oxygen may be responsible for some of the disagreement between existing empirical calibrations based on core-top diatom frustule remains.
Analysis of δ18Odiatom values after this long vacuum dehydroxylation time is necessary for quantitative comparisons of stable isotopic values across geologic time periods. Menicucci, A. J., et al. (2013). "Oxygen isotope analyses of biogenic opal and quartz using a novel microfluorination technique." Rapid Communications in Mass Spectrometry 27(16): 1873-1881.

  19. Quantifying Grain-Size Variability of Metal Pollutants in Road-Deposited Sediments Using the Coefficient of Variation

    PubMed Central

    Wang, Xiaoxue; Li, Xuyong

    2017-01-01

    Particle grain size is an important indicator of the variability in the physical characteristics and pollutant composition of road-deposited sediments (RDS). Quantitative assessment of the grain-size variability in RDS amount, metal concentration, metal load and GSFLoad is essential to eliminating the uncertainty it causes in the estimation of RDS emission load and the formulation of control strategies. In this study, grain-size variability was explored and quantified using the coefficient of variation (Cv) of the particle size compositions, metal concentrations, metal loads, and GSFLoad values in RDS. Several trends in the grain-size variability of RDS were identified: (i) the medium class (105–450 µm) variability in terms of particle size composition, metal loads, and GSFLoad values was smaller than that of the fine (<105 µm) and coarse (450–2000 µm) classes; (ii) the grain-size variability in terms of metal concentrations increased as the particle size increased, while the metal concentrations decreased; (iii) when compared to the Lorenz coefficient (Lc), the Cv was similarly effective at describing the grain-size variability, while being simpler to calculate because it does not require the data to be pre-processed. The results of this study will facilitate identification of the uncertainty in modelling RDS caused by grain-size class variability. PMID:28788078
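The Cv itself is just the sample standard deviation scaled by the mean; a minimal sketch with hypothetical metal-load values (illustrative numbers, not data from the study):

```python
import numpy as np

def cv(values):
    """Coefficient of variation: sample standard deviation over the mean."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

# Hypothetical metal loads (arbitrary units) for two grain-size classes:
fine_class   = [120.0, 95.0, 140.0, 110.0]
medium_class = [80.0, 84.0, 78.0, 82.0]
```

Because it is dimensionless, the Cv lets classes with very different mean loads be compared on one scale, which is why no pre-processing of the data is needed.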

  20. Nutritional management of cow's milk allergy in children: An update.

    PubMed

    Dupont, C; Chouraqui, J-P; Linglart, A; Bocquet, A; Darmaun, D; Feillet, F; Frelut, M-L; Girardet, J-P; Hankard, R; Rozé, J-C; Simeoni, U; Briend, A

    2018-04-01

    Cow's milk is one of the most common foods responsible for allergic reactions in children. Cow's milk allergy (CMA) involves immunoglobulin E (IgE)- and non-IgE-mediated reactions, the latter being both variable and nonspecific. Guidelines thus emphasize the need for physicians to recognize the specific syndromes of CMA and to respect strict diagnostic modalities. Whatever the clinical pattern of CMA, the mainstay of treatment is the elimination from the diet of cow's milk proteins. The challenge is that both the disease and the elimination diet may result in insufficient height and weight gain and bone mineralization. If, during CMA, the mother is not able or willing to breastfeed, the child must be fed a formula adapted to CMA dietary management, during infancy and later, if the disease persists. This type of formula must be adequate in terms of allergic efficacy and nutritional safety. In older children, when CMA persists, the use of cow's milk baked or heated at a sufficient temperature, frequently tolerated by children with CMA, may help alleviate the stringency of the elimination diet. Guidance on the implementation of the elimination diet by qualified healthcare professionals is always necessary. This guidance should also include advice to ensure adequate bone growth, especially relating to calcium intake. Specific attention should be given to children presenting with several risk factors for weak bone mineral density, i.e., multiple food allergies, vitamin D deficiency, poor sun exposure, steroid use, or severe eczema. When CMA is outgrown, a prolonged elimination diet may negatively impact the quality of the diet over the long term. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  1. Identity method for particle number fluctuations and correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorenstein, M. I.

    An incomplete particle identification distorts the observed event-by-event fluctuations of the hadron chemical composition in nucleus-nucleus collisions. A new experimental technique called the identity method was recently proposed. It eliminated the misidentification problem for one specific combination of the second moments in a system of two hadron species. In the present paper, this method is extended to calculate all the second moments in a system with an arbitrary number of hadron species. Special linear combinations of the second moments are introduced. These combinations are presented in terms of single-particle variables and can be found experimentally from the event-by-event averaging. The mathematical problem is then reduced to solving a system of linear equations. The effect of incomplete particle identification is fully eliminated from the final results.
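The final step described above is a linear-algebra problem; as a generic illustration only (hypothetical matrix and moment values, not the paper's actual equations), recovering true moments from measured linear combinations looks like:

```python
import numpy as np

# Hypothetical mixing: the measured event-by-event combinations b are
# known linear mixtures (matrix A) of the true second moments m.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4],
              [0.1, 0.2, 1.0]])
m_true = np.array([2.0, 3.0, 1.5])
b = A @ m_true                     # what event-by-event averaging yields
m_recovered = np.linalg.solve(A, b)
```

Provided the mixing matrix is non-singular, the solve step returns the moments exactly, which is why the misidentification effect can be fully eliminated from the final results.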

  2. Idiopathic orthostatic hypotension: Recent data (eleven cases) and review of the literature

    NASA Technical Reports Server (NTRS)

    Ninet, J.; Annat, G.; Boisson, D.; Holzhapfel, L.; Vincent, M.; Peyrin, L.; Michel, D.; Schott, B.; Devic, M.; Levrat, R.

    1981-01-01

    Eight cases of Shy-Drager syndrome and three of Bradbury-Eggleston idiopathic orthostatic hypotension were examined. In all cases, examination of circulatory reflexes showed major dysfunction of the sympathetic vasoconstrictor system. Anomalies in the vagal cardiomoderator system were less constant. Normal daily urinary elimination of catecholamines was recorded. Characteristically, no elevation of blood or urine norepinephrine levels was found in orthostatism. Insulin hypoglycemia normally raised urinary adrenalin elimination in three of ten patients. Plasma dopa-beta-hydroxylase activity was normal. The renin-angiotensin-aldosterone system showed variable activity in the basal state but usually rose during orthostatism. On average, very low homovanillic acid levels were found in cerebrospinal fluid before and after probenecid; hydroxyindolacetic acid was normal. Cerebral autoregulation had deteriorated in two of four cases. Physiopathologically, the two clinical types are indistinguishable with or without central neurological signs.

  3. Insulin aspart pharmacokinetics: an assessment of its variability and underlying mechanisms.

    PubMed

    Rasmussen, Christian Hove; Røge, Rikke Meldgaard; Ma, Zhulin; Thomsen, Maria; Thorisdottir, Rannveig Linda; Chen, Jian-Wen; Mosekilde, Erik; Colding-Jørgensen, Morten

    2014-10-01

    Insulin aspart (IAsp) is used by many diabetics as a meal-time insulin to control post-prandial glucose levels. As is the case with many other insulin types, the pharmacokinetics (PK), and consequently the pharmacodynamics (PD), is associated with clinical variability, both between and within individuals. The present article identifies the main physiological mechanisms that govern the PK of IAsp following subcutaneous administration and quantifies them in terms of their contribution to the overall variability. CT scanning data from Thomsen et al. (2012) are used to investigate and quantify the properties of the subcutaneous depot. Data from Brange et al. (1990) are used to determine the effects of insulin chemistry in subcutis on the absorption rate. Intravenous (i.v.) bolus and infusion PK data for human insulin are used to understand and quantify the systemic distribution and elimination (Pørksen et al., 1997; Sjöstrand et al., 2002). PK and PD profiles for type 1 diabetics from Chen et al. (2005) are analyzed to demonstrate the effects of IAsp antibodies in terms of bound and unbound insulin. PK profiles from Thorisdottir et al. (2009) and Ma et al. (2012b) are analyzed in the nonlinear mixed effects software Monolix® to determine the presence and effects of the mechanisms described in this article. The distribution of IAsp in the subcutaneous depot shows an initial dilution of approximately a factor of two in a single experiment. Injected insulin hexamers exist in a chemical equilibrium with monomers and dimers, which depends strongly on the degree of dilution in subcutis, the presence of auxiliary substances, and a variety of other factors. Sensitivity to the initial dilution in subcutis can thus be a cause of some of the variability. Temporal variations in the PK are explained by variations in the subcutaneous blood flow. IAsp antibodies are found to be a large contributor to the variability of total insulin PK in a study by Chen et al.
(2005), since only the free fraction is eliminated via the receptors. The contribution of these and other sources of variability to the total variability is quantified via a population PK analysis and two recent clinical studies (Thorisdottir et al., 2009; Ma et al., 2012b), which support the presence and significance of the identified mechanisms. IAsp antibody binding, oligomeric transitions in subcutis, and blood flow dependent variations in absorption rate seem to dominate the PK variability of IAsp. It may be possible, e.g. via formulation design, to reduce some of these variability factors. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Order Selection for General Expression of Nonlinear Autoregressive Model Based on Multivariate Stepwise Regression

    NASA Astrophysics Data System (ADS)

    Shi, Jinfei; Zhu, Songqing; Chen, Ruwen

    2017-12-01

    An order selection method based on multivariate stepwise regression is proposed for the General expression of the Nonlinear AutoRegressive (GNAR) model; it converts the order-determination problem into variable selection for a multiple linear regression equation. The partial autocorrelation function is adopted to define the linear terms of the GNAR model, which form the initial model; the nonlinear terms are then introduced gradually. Statistics measuring the improvement contributed by each newly introduced variable, and by the variables already in the model, are used to decide which variables to retain or eliminate, and the optimal model is obtained through goodness-of-fit measures or significance tests. Simulation results and experiments on classic time-series data show that the proposed method is simple, reliable, and applicable to practical engineering.
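
    The introduce-and-test loop described above can be sketched as a generic forward stepwise selection with a partial-F test over candidate autoregressive terms. This is only an illustrative sketch: the simulated data, the candidate terms and the F-to-enter threshold below are invented and are not taken from the paper.

```python
import numpy as np

def rss(X, y):
    # Residual sum of squares of an OLS fit of y on X (with intercept).
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def forward_stepwise(candidates, y, f_in=4.0):
    """Greedy forward selection: at each step add the candidate term whose
    partial-F statistic versus the current model is largest, as long as it
    exceeds the F-to-enter threshold f_in."""
    n = len(y)
    selected = []
    while True:
        if selected:
            base = rss(np.column_stack([candidates[k] for k in selected]), y)
        else:
            base = float(((y - y.mean()) ** 2).sum())
        best, best_f = None, f_in
        for k in candidates:
            if k in selected:
                continue
            cols = [candidates[j] for j in selected] + [candidates[k]]
            new = rss(np.column_stack(cols), y)
            df2 = n - len(selected) - 2        # residual dof of the larger model
            f = (base - new) / (new / df2)     # partial F for the added term
            if f > best_f:
                best, best_f = k, f
        if best is None:
            return selected
        selected.append(best)

# Toy GNAR-style series: y_t depends on y_{t-1} and the nonlinear term y_{t-1}^2.
rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 1] ** 2 + 0.15 * rng.standard_normal()
yt, y1 = y[2:], y[1:-1]
cands = {"y[t-1]": y1, "y[t-1]^2": y1 ** 2, "y[t-2]": y[:-2], "y[t-2]^2": y[:-2] ** 2}
print(forward_stepwise(cands, yt))
```

    Candidates whose partial-F statistic never exceeds the threshold are left out of the model, which is the elimination half of the stepwise procedure.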

  5. Robust high-precision attitude control for flexible spacecraft with improved mixed H2/H∞ control strategy under poles assignment constraint

    NASA Astrophysics Data System (ADS)

    Liu, Chuang; Ye, Dong; Shi, Keke; Sun, Zhaowei

    2017-07-01

    A novel improved mixed H2/H∞ control technique combined with poles assignment theory is presented in this paper to achieve attitude stabilization and vibration suppression simultaneously for flexible spacecraft. The flexible spacecraft dynamics are described and transformed into the corresponding state-space form. Based on linear matrix inequalities (LMIs) and poles assignment theory, the improved mixed H2/H∞ controller does not restrict the two Lyapunov variables involved in the H2 and H∞ performance criteria to be equal, which reduces conservatism compared with the traditional mixed H2/H∞ controller. Moreover, it eliminates the coupling of the Lyapunov matrix variables and the system matrices by introducing a slack variable that provides additional degrees of freedom. Several simulations demonstrate the effectiveness and feasibility of the proposed method.

  6. Believe it or not: on the possibility of suspending belief.

    PubMed

    Hasson, Uri; Simmons, Joseph P; Todorov, Alexander

    2005-07-01

    We present two experiments that cast doubt on existing evidence suggesting that it is impossible to suspend belief in a comprehended proposition. In Experiment 1, we found that interrupting the encoding of a statement's veracity decreased memory for the statement's falsity when the false version of the statement was uninformative, but not when the false version was informative. This suggests that statements that are informative when false are not represented as if they were true. In Experiment 2, participants made faster lexical decisions to words implied by preceding statements when they were told that the statements were true than when the veracity of the statements was unknown or when the statements were false. The findings suggest that comprehending a statement may not require believing it, and that it may be possible to suspend belief in comprehended propositions.

  7. SNOMED CT's RF2: Is the future bright?

    PubMed

    Ceusters, Werner

    2011-01-01

    SNOMED CT's new RF2 format is said to come with features for better configuration management of the SNOMED vocabulary, thereby accommodating evolving requirements without the need for further fundamental change in the foreseeable future. Although the available documentation is not yet convincing enough to support this claim, the newly introduced Model Component hierarchy and associated reference set mechanism seem to hold real promise of being able to deal successfully with a number of ontological issues that have been discussed in the recent literature. Backed up by a study of the old and new format and of the relevant literature and documentation, three recommendations are presented that would free SNOMED CT from use-mention confusions, unclear referencing of real-world entities and uninformative reasons for change in a way that does not force SNOMED CT to take a specific philosophical or ontological position.

  8. Visual speech segmentation: using facial cues to locate word boundaries in continuous speech

    PubMed Central

    Mitchel, Aaron D.; Weiss, Daniel J.

    2014-01-01

    Speech is typically a multimodal phenomenon, yet few studies have focused on the exclusive contributions of visual cues to language acquisition. To address this gap, we investigated whether visual prosodic information can facilitate speech segmentation. Previous research has demonstrated that language learners can use lexical stress and pitch cues to segment speech and that learners can extract this information from talking faces. Thus, we created an artificial speech stream that contained minimal segmentation cues and paired it with two synchronous facial displays in which visual prosody was either informative or uninformative for identifying word boundaries. Across three familiarisation conditions (audio stream alone, facial streams alone, and paired audiovisual), learning occurred only when the facial displays were informative to word boundaries, suggesting that facial cues can help learners solve the early challenges of language acquisition. PMID:25018577

  9. Devious Chatbots - Interactive Malware with a Plot

    NASA Astrophysics Data System (ADS)

    Jonathan, Pan Juin Yang; Fung, Chun Che; Wong, Kok Wai

    Many social robots in the form of conversation agents, or chatbots, have been put to practical use in recent years. Their typical roles are providing online help or acting as cyber agents representing an organisation. However, a new form of devious chatbot is lurking on the Internet: effectively an interactive malware that seeks to lure its prey not through vicious assault but with seductive conversation, talking to its prey through the same channel normally used for human-to-human communication. These devious chatbots use social engineering to attack uninformed and unprepared victims, and this type of attack is becoming more pervasive with the advent of Web 2.0. This survey paper presents results from research on how this breed of devious malware is spreading, and what could be done to stop it.

  10. Study of market model describing the contrary behaviors of informed and uninformed agents: Being minority and being majority

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Xia; Liao, Hao; Medo, Matus; Shang, Ming-Sheng; Yeung, Chi Ho

    2016-05-01

    In this paper we analyze the contrary behaviors of informed and uninformed investors, and construct a competition model with two groups of agents: those who intend to stay in the minority and those who intend to stay in the majority. We find two kinds of competition, inter-group and intra-group. The model exhibits periodic fluctuations. The average distribution of strategies shows a prominent central peak, which is relevant to the peaked, fat-tailed character of the price-change distribution in stock markets. Furthermore, in the modified model a tolerance-time parameter makes the agents diversified. Finally, we compare the strategy distribution with the price-change distribution in a real stock market, and conclude that the contrary behavior rules and the tolerance-time parameter are indeed valid in the description of the market model.

  11. Mario Bunge's Materialist Theory of Mind and Contemporary Cognitive Science

    NASA Astrophysics Data System (ADS)

    Slezak, Peter

    2012-10-01

    Bunge's writings on the mind-body problem provide a rigorous, analytical antidote to the persistent anti-materialist tendency that has characterized the history of philosophy and science. Bunge gives special attention to dualism and its shortcomings, and this attention is welcome in view of the resurgence of the doctrine today. However, I focus my comments selectively on Bunge's more controversial, provocative claims, not to dismiss them, but to engage with them seriously. A difficulty arising from Bunge's rhetorical style and its undoubted virtues is that not all the targets of his self-confessed "bashings" (2010, xi) are equally deserving. For example, Bunge suggests that "most contemporary philosophers of mind are indifferent to psychology, or are remarkably uninformed about it". This charge cannot be sustained in light of the work of today's foremost philosophers.

  12. Repliscan: a tool for classifying replication timing regions.

    PubMed

    Zynda, Gregory J; Song, Jawon; Concia, Lorenzo; Wear, Emily E; Hanley-Bowdoin, Linda; Thompson, William F; Vaughn, Matthew W

    2017-08-07

    Replication timing experiments that use label incorporation and high throughput sequencing produce peaked data similar to ChIP-Seq experiments. However, the differences in experimental design, coverage density, and possible results make traditional ChIP-Seq analysis methods inappropriate for use with replication timing. To accurately detect and classify regions of replication across the genome, we present Repliscan. Repliscan robustly normalizes, automatically removes outlying and uninformative data points, and classifies Repli-seq signals into discrete combinations of replication signatures. The quality control steps and self-fitting methods make Repliscan generally applicable and more robust than previous methods that classify regions based on thresholds. Repliscan is simple and effective to use on organisms with different genome sizes. Even with analysis window sizes as small as 1 kilobase, reliable profiles can be generated with as little as 2.4x coverage.

  13. An Avian Basal Ganglia-Forebrain Circuit Contributes Differentially to Syllable Versus Sequence Variability of Adult Bengalese Finch Song

    PubMed Central

    Hampton, Cara M.; Sakata, Jon T.; Brainard, Michael S.

    2009-01-01

    Behavioral variability is important for motor skill learning but continues to be present and actively regulated even in well-learned behaviors. In adult songbirds, two types of song variability can persist and are modulated by social context: variability in syllable structure and variability in syllable sequencing. The degree to which the control of both types of adult variability is shared or distinct remains unknown. The output of a basal ganglia-forebrain circuit, LMAN (the lateral magnocellular nucleus of the anterior nidopallium), has been implicated in song variability. For example, in adult zebra finches, neurons in LMAN actively control the variability of syllable structure. It is unclear, however, whether LMAN contributes to variability in adult syllable sequencing because sequence variability in adult zebra finch song is minimal. In contrast, Bengalese finches retain variability in both syllable structure and syllable sequencing into adulthood. We analyzed the effects of LMAN lesions on the variability of syllable structure and sequencing and on the social modulation of these forms of variability in adult Bengalese finches. We found that lesions of LMAN significantly reduced the variability of syllable structure but not of syllable sequencing. We also found that LMAN lesions eliminated the social modulation of the variability of syllable structure but did not detect significant effects on the modulation of sequence variability. These results show that LMAN contributes differentially to syllable versus sequence variability of adult song and suggest that these forms of variability are regulated by distinct neural pathways. PMID:19357331

  14. Transient Finite Element Computations on a Variable Transputer System

    NASA Technical Reports Server (NTRS)

    Smolinski, Patrick J.; Lapczyk, Ireneusz

    1993-01-01

    A parallel program to analyze transient finite element problems was written and implemented on a system of transputer processors. The program uses the explicit time integration algorithm which eliminates the need for equation solving, making it more suitable for parallel computations. An interprocessor communication scheme was developed for arbitrary two dimensional grid processor configurations. Several 3-D problems were analyzed on a system with a small number of processors.

  15. Beyond the Floor Effect on the Wechsler Intelligence Scale for Children-4th Ed. (WISC-IV): Calculating IQ and Indexes of Subjects Presenting a Floored Pattern of Results

    ERIC Educational Resources Information Center

    Orsini, A.; Pezzuti, L.; Hulbert, S.

    2015-01-01

    Background: It is now widely known that children with severe intellectual disability show a 'floor effect' on the Wechsler scales. This effect emerges because the practice of transforming raw scores into scaled scores eliminates any variability present in participants with low intellectual ability and because intelligence quotient (IQ) scores are…

  16. Range of control of cardiovascular variables by the hypothalamus

    NASA Technical Reports Server (NTRS)

    Smith, O. A.; Stephenson, R. B.; Randall, D. C.

    1974-01-01

    New methodologies were utilized to study the influence of the hypothalamus on the cardiovascular system. The regulation of myocardial activity was investigated in monkeys with hypothalamic lesions that eliminate cardiovascular responses. Observations showed that a specific part of the hypothalamus regulates the changes in myocardial contractility that accompany emotion. Studies of hypothalamic control of renal blood flow showed the powerful potential control of this organ over the renal circulation.

  17. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system operable on command having a variable frequency oscillator generating an audio frequency signal proportional to the number of said seismic movements.

  18. Brief functional analysis and treatment of a vocal tic.

    PubMed

    Watson, T S; Sterling, H E

    1998-01-01

    This study sought to extend functional methodology to the assessment and treatment of habits. After a descriptive assessment indicated that coughing occurred while eating, a brief functional analysis suggested that social attention was the maintaining variable. Results demonstrated that treatment, derived from the assessment and analysis data, rapidly eliminated the cough. We discuss the appropriateness of using functional analysis procedures for deriving treatments for habits in a clinical setting.

  19. Organic Isothiocyanates: Dietary Modulators of Doxorubicin Resistance in Breast Cancer

    DTIC Science & Technology

    2004-06-01

    ...et al. (30) have demonstrated the inhibition of renal clearance of colchicine by... Anticarcinogenic activities of organic isothiocyanates: chemistry and mechanisms. Cancer Res. 54: 1976s-1981s... (Chemistry, University at Buffalo) for his assistance with the internal standard... possibility of flip-flop kinetics (absorption being slower than elimination) in... result from interindividual variability and limited subject...

  20. Relativistic and the first sectorial harmonics corrections in the critical inclination

    NASA Astrophysics Data System (ADS)

    Rahoma, W. A.; Khattab, E. H.; Abd El-Salam, F. A.

    2014-05-01

    The problem of the critical inclination is treated in the Hamiltonian framework taking into consideration post-Newtonian corrections as well as the main correction term of sectorial harmonics for an earth-like planet. The Hamiltonian is expressed in terms of Delaunay canonical variables. A canonical transformation is applied to eliminate short period terms. A modified critical inclination is obtained due to relativistic and the first sectorial harmonics corrections.

  1. Network Design for Reliability and Resilience to Attack

    DTIC Science & Technology

    2014-03-01

    ...attacker can destroy n arcs in the network... SPNI: Shortest-Path Network-Interdiction problem; TSP: Traveling Salesman Problem; UB: upper bound; UKR: Ukraine... elimination from the traveling salesman problem (TSP). Literature calls a walk that does not contain a cycle a path [19]. The objective function in... arc lengths as random variables with known probability distributions. The m-median problem seeks to design a network with minimum average travel cost

  2. Simple and fast spectral domain algorithm for quantitative phase imaging of living cells with digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Min, Junwei; Yao, Baoli; Ketelhut, Steffi; Kemper, Björn

    2017-02-01

    The modular combination of optical microscopes with digital holographic microscopy (DHM) has been proven to be a powerful tool for quantitative live cell imaging. The introduction of a condenser and different microscope objectives (MO) simplifies the usage of the technique and makes it easier to measure different kinds of specimens at different magnifications. However, the high flexibility of illumination and imaging also causes variable phase aberrations that need to be eliminated for high resolution quantitative phase imaging. Existing phase aberration compensation methods either require adding additional elements to the reference arm or need specimen-free reference areas or separate reference holograms to build suitable digital phase masks. These inherent requirements make them impractical for use with highly variable illumination and imaging systems and prevent on-line monitoring of living cells. In this paper, we present a simple numerical method for phase aberration compensation based on the analysis of holograms in the spatial frequency domain, with capabilities for on-line quantitative phase imaging. From a single-shot off-axis hologram, the whole phase aberration can be eliminated automatically without numerical fitting or pre-knowledge of the setup. The capabilities and robustness for quantitative phase imaging of living cancer cells are demonstrated.

  3. Conditions that influence the elimination of postural constraints after office employees working with VDU have received ergonomics training.

    PubMed

    Montreuil, Sylvie; Laflamme, Lucie; Brisson, Chantal; Teiger, Catherine

    2006-01-01

    The goal of this article is to better understand how preventive measures are undertaken after training. It examines how certain variables, such as musculoskeletal pain, participant age, and workstation and work-content characteristics, influence the reduction of postural constraints after office employees working with a computer have received ergonomics training. A pre-test/post-test design was used. The 207 female office workers were given 6 hours of ergonomics training. The variables were determined using a self-administered questionnaire and an observation grid filled out 2 weeks before and 6 months after the training session. The FAC and HAC were used in the data processing. The presence or absence of musculoskeletal pain had no statistically significant influence on whether or not postural constraints were eliminated. The age of the participants and the possibility of adjusting the workstation characteristics and work content produced differentiated results with regard to postural constraint reduction. We concluded that trained people succeed in taking relevant and effective measures to reduce the postural constraints found in VDU work. However, measures other than workstation adjustments also contribute to this prevention, and such training must be strongly supported by the various hierarchical levels of an enterprise or institution.

  4. Evolutionary patterns in trace metal (Cd and Zn) efflux capacity in aquatic organisms.

    PubMed

    Poteat, Monica D; Garland, Theodore; Fisher, Nicholas S; Wang, Wen-Xiong; Buchwalter, David B

    2013-07-16

    The ability to eliminate (efflux) metals is a physiological trait that acts as a major driver of bioaccumulation differences among species. This species-specific trait plays a large role in determining the metal loads that species will need to detoxify to persist in chronically contaminated environments and, therefore, contributes significantly to differences in environmental sensitivity among species. To develop a better understanding of how efflux varies within and among taxonomic groupings, we compared Cd and Zn efflux rate constants (ke values) among members of two species-rich aquatic insect families, Ephemerellidae and Hydropsychidae, and discovered that ke values strongly covaried across species. This relationship allowed us to successfully predict Zn efflux from Cd data gathered from aquatic species belonging to other insect orders and families. We then performed a broader, comparative analysis of Cd and Zn ke values from existing data for arthropods, mollusks, annelids, and chordates (77 species total) and found significant phylogenetic patterns. Taxonomic groups exhibited marked variability in ke magnitudes and ranges, suggesting that some groups are more constrained than others in their abilities to eliminate metals. Understanding broader patterns of variability can lead to more rational extrapolations across species and improved protectiveness in water-quality criteria and ecological assessment.

  5. Preliminary design of a prototype particulate stack sampler. [For stack gas temperature under 300 °C]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elder, J.C.; Littlefield, L.G.; Tillery, M.I.

    1978-06-01

    A preliminary design of a prototype particulate stack sampler (PPSS) has been prepared, and development of several components is under way. The objective of this Environmental Protection Agency (EPA)-sponsored program is to develop and demonstrate a prototype sampler with capabilities similar to EPA Method 5 apparatus but without some of the more troublesome aspects. Features of the new design include higher sampling flow; display (on demand) of all variables and periodic calculation of percent isokinetic, sample volume, and stack velocity; automatic control of probe and filter heaters; stainless steel surfaces in contact with the sample stream; single-point particle size separation in the probe nozzle; null-probe capability in the nozzle; and lower weight in the components of the sampling train. Design considerations will limit use of the PPSS to stack gas temperatures under approximately 300 °C, which will exclude sampling some high-temperature stacks such as incinerators. Although the need for filter weighing has not been eliminated in the new design, introduction of a variable-slit virtual impactor nozzle may eliminate the need for mass analysis of particles washed from the probe. Component development has shown some promise for continuous humidity measurement by an in-line wet-bulb, dry-bulb psychrometer.

  6. Contamination of the turbine air chamber: a risk of cross infection.

    PubMed

    Checchi, L; Montebugnoli, L; Samaritani, S

    1998-08-01

    In the present work, we evaluated (a) the influx of contaminating fluid into the air chamber when a high-speed turbine stops rotating, (b) the significance of a series of variables (type of handpiece and dental unit, shape of the bur, number of stops set on the turbine) which condition it, and (c) the time required to expel the contaminating fluid from the turbine head. Results showed that contamination takes place every time the turbine stops rotating with the bur in contact with an external fluid. The main variable affecting the influx of contaminating fluid into the air chamber of the turbine head was represented by the shape of the bur (F=54.9; p<0.01). Another significant variable was the type of handpiece and dental unit (F=7.3; p<0.01). The number of stops set on the turbine was irrelevant (F=0.03; p=n.s.). The expulsion of the contaminant from the turbine head showed 2 different exponential rates: a very rapid-elimination phase within 30 s and a slow-elimination phase between 60 and 300 s. In order to remove over 99% of the contaminant from the air chamber, a turbine had to run for more than 4-7 min depending on the type of the handpiece. In conclusion, data from the present study suggest that a significant cross-infection potential exists with high-speed handpieces whenever they are only externally scrubbed and disinfected so the internal cleaning and sterilization between patients is mandatory. The practice of flushing by running the turbines between patients should be discouraged.

  7. Utilization of lean management principles in the ambulatory clinic setting.

    PubMed

    Casey, Jessica T; Brinton, Thomas S; Gonzalez, Chris M

    2009-03-01

    The principles of 'lean management' have permeated many sectors of today's business world, secondary to the success of the Toyota Production System. This management method enables workers to eliminate mistakes, reduce delays, lower costs, and improve the overall quality of the product or service they deliver. These lean management principles can be applied to health care. Their implementation within the ambulatory care setting is predicated on the continuous identification and elimination of waste within the process. The key concepts of flow time, inventory and throughput are utilized to improve the flow of patients through the clinic, and to identify points that slow this process -- so-called bottlenecks. Nonessential activities are shifted away from bottlenecks (i.e. the physician), and extra work capacity is generated from existing resources, rather than being added. The additional work capacity facilitates a more efficient response to variability, which in turn results in cost savings, more time for the physician to interact with patients, and faster completion of patient visits. Finally, application of the lean management principle of 'just-in-time' management can eliminate excess clinic inventory, better synchronize office supply with patient demand, and reduce costs.

  8. [Application of characteristic NIR variables selection in portable detection of soluble solids content of apple by near infrared spectroscopy].

    PubMed

    Fan, Shu-Xiang; Huang, Wen-Qian; Li, Jiang-Bo; Guo, Zhi-Ming; Zhao, Chun-Jiang

    2014-10-01

    In order to detect the soluble solids content (SSC) of apple conveniently and rapidly, a ring fiber probe and a portable spectrometer were applied to obtain the spectroscopy of apple. Different wavelength variable selection methods, including uninformative variable elimination (UVE), competitive adaptive reweighted sampling (CARS) and genetic algorithm (GA), were proposed to select effective wavelength variables of the NIR spectroscopy of the SSC in apple based on PLS. The back interval LS-SVM (BiLS-SVM) and GA were used to select effective wavelength variables based on LS-SVM. Selected wavelength variables and the full wavelength range were set as input variables of the PLS model and the LS-SVM model, respectively. The results indicated that the PLS model built using GA-CARS on 50 characteristic variables selected from the full spectrum of 1512 wavelengths achieved the optimal performance. The correlation coefficient (Rp) and root mean square error of prediction (RMSEP) for the prediction set were 0.962 and 0.403 °Brix, respectively, for SSC. The proposed GA-CARS method could effectively simplify the portable detection model of SSC in apple based on near infrared spectroscopy and enhance the predictive precision. The study can provide a reference for the development of a portable apple soluble solids content spectrometer.
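
    The UVE step used above can be sketched in a few lines. Two loud assumptions in this minimal illustration: ordinary least squares stands in for the PLS regression used in practice, and the "spectra" are simulated random data. The core recipe follows standard UVE: append random noise variables, then keep only real variables whose leave-one-out coefficient stability beats the best noise variable.

```python
import numpy as np

def uve_select(X, y, n_noise=None, seed=0):
    """Uninformative Variable Elimination, sketched with OLS in place of the
    PLS regression normally used. Random noise variables are appended to X;
    a real variable is kept only if the stability |mean/std| of its
    leave-one-out coefficient exceeds the largest stability observed among
    the pure-noise variables."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    n_noise = p if n_noise is None else n_noise
    Xa = np.hstack([X, 1e-3 * rng.standard_normal((n, n_noise))])
    coefs = np.empty((n, p + n_noise))
    for i in range(n):                       # leave-one-out models
        keep = np.arange(n) != i
        A = np.column_stack([np.ones(n - 1), Xa[keep]])
        beta, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        coefs[i] = beta[1:]
    stability = np.abs(coefs.mean(axis=0) / coefs.std(axis=0))
    cutoff = stability[p:].max()             # best stability any noise variable achieves
    return np.where(stability[:p] > cutoff)[0]

# Simulated "spectra": 40 samples, 10 wavelengths, 3 of them informative.
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 1.0 * X[:, 7] + 0.1 * rng.standard_normal(40)
print(uve_select(X, y))
```

    The informative wavelengths survive because their coefficients are consistently reproduced across the leave-one-out models, whereas uninformative ones behave like the appended noise block.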

  9. Population pharmacokinetic study of teicoplanin in severely neutropenic patients.

    PubMed Central

    Lortholary, O; Tod, M; Rizzo, N; Padoin, C; Biard, O; Casassus, P; Guillevin, L; Petitjean, O

    1996-01-01

    The teicoplanin pharmacokinetics (PK) of 30 febrile and severely neutropenic patients (polymorphonuclear count, < 500/mm3) with hematologic malignancies were compared with those determined for five healthy volunteers (HV). Neutropenic patients were given piperacillin combined with amikacin, and teicoplanin was added to the regimen the day fever developed in patients suspected of having a staphylococcal infection or 48 h later. Teicoplanin was given intravenously at a dosage of 6 mg/kg of body weight at 0, 12, and 24 h and once a day thereafter. Five to eleven blood samples per patient were collected. Teicoplanin concentrations were measured by liquid chromatography. A bicompartmental model was fitted to the data by a nonlinear mixed-effect-model approach. Multiple-linear regression analysis was applied in an attempt to correlate PK parameters to nine covariates. The mean trough concentrations of teicoplanin 48 h after the onset of treatment and 24 h after the last injection (last trough) +/- standard deviations were 8.8 +/- 4.1 and 17.5 +/- 13.5 mg/liter, respectively. A significant increase was noted in the mean rate of elimination clearance of teicoplanin in neutropenic patients compared with that of HV (0.86 versus 0.73 liter/h, P = 0.002), as was the case with rates of distribution clearance (5.89 versus 4.94 liter/h, P = 0.002); the mean half-life of distribution was significantly shorter in patients than in HV (0.43 versus 0.61 h, P = 0.002). In contrast, the volumes of the central compartment (ca. 5.8 liters for both groups), the volumes of distribution at steady state (HV, 37.6 liters; patients, 55.9 liters), and the elimination half-lives (HV, 39.6 h; patients, 52.7 h) were not significantly different between HV and neutropenic patients. Interindividual variabilities of rates of clearance (coefficient of variation [CV], 43%) and elimination half-lives (CV, 56%) were mainly explained by the variabilities among rates of creatinine clearance. 
Interindividual variabilities of the volumes of the central compartment (CV, 33%) and the volumes of distribution at steady state (CV = 51%) were correlated to interindividual variabilities among numbers of leukocytes and the ages of patients, respectively. On the basis of the population PK model of teicoplanin, simulations were made to optimize the dosing schedule. A supplemental 6 mg/kg dose of teicoplanin at 36 h resulted in a trough concentration at 48 h of 16.0 +/- 4.5 mg/liter, with only 7% of patients having a trough concentration of less than 10 mg/liter, compared with 46% of patients on the usual schedule. PMID:8723474
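
    The dosing simulation reported above can be caricatured with a two-compartment model using the population means quoted in the abstract (CL = 0.86 liter/h, distribution clearance 5.89 liter/h, Vc = 5.8 liters, Vss = 55.9 liters). The 70-kg body weight, bolus (rather than infusion) dosing, and simple Euler integration are simplifying assumptions of this sketch, so the numbers are indicative only and will not reproduce the paper's population estimates.

```python
# Population means from the abstract (patients): CL = 0.86 L/h, distribution
# clearance CLd = 5.89 L/h, Vc = 5.8 L, Vss = 55.9 L.
CL, CLD, VC, VSS = 0.86, 5.89, 5.8, 55.9
VP = VSS - VC                    # peripheral compartment volume (L)
DOSE = 6.0 * 70.0                # 6 mg/kg for an assumed 70-kg patient (mg)

def trough_at(dose_times, t_end=48.0, dt=0.01):
    """Euler integration of a two-compartment IV model; returns the central
    concentration (mg/L) at t_end, just before any dose scheduled there."""
    dose_steps = {int(round(td / dt)) for td in dose_times}
    ac = ap = 0.0                # drug amounts (mg) in central/peripheral
    for i in range(int(round(t_end / dt))):
        if i in dose_steps:
            ac += DOSE           # crude bolus dosing at scheduled times
        dac = -(CL / VC) * ac - (CLD / VC) * ac + (CLD / VP) * ap
        dap = (CLD / VC) * ac - (CLD / VP) * ap
        ac += dac * dt
        ap += dap * dt
    return ac / VC

usual = trough_at([0.0, 12.0, 24.0])
supplemented = trough_at([0.0, 12.0, 24.0, 36.0])
print(round(usual, 1), round(supplemented, 1))
```

    As in the abstract's schedule optimization, the supplemental 36-h dose raises the 48-h trough relative to the usual schedule.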

  10. Input variable selection for data-driven models of Coriolis flowmeters for two-phase flow measurement

    NASA Astrophysics Data System (ADS)

    Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao

    2017-03-01

    Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. By eliminating irrelevant or redundant variables, input variable selection identifies a suitable subset of variables as the input of a model; it also simplifies the model structure and improves computational efficiency. This paper describes the procedures of input variable selection for data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machines (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM-based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected by the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction.
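
    A greatly simplified cousin of the PMI method above is plain mutual-information ranking of candidate inputs. The sketch below uses a histogram MI estimate on invented sensor variables (the names and relationships are hypothetical, not the paper's) and ignores the "partial" aspect of PMI, which would additionally discount information already carried by previously selected inputs.

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Histogram estimate of the mutual information (nats) between two 1-D arrays."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y
    nz = pxy > 0                             # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Toy two-phase-flow-style data: the target depends nonlinearly on two of
# four candidate sensor variables (names and relationships are invented).
rng = np.random.default_rng(0)
n = 2000
cands = {
    "observed_density": rng.standard_normal(n),
    "damping": rng.standard_normal(n),
    "pressure": rng.standard_normal(n),
    "temperature": rng.standard_normal(n),
}
target = np.sin(cands["observed_density"]) + 0.5 * cands["damping"] ** 2 \
         + 0.1 * rng.standard_normal(n)

ranking = sorted(cands, key=lambda k: mutual_info(cands[k], target), reverse=True)
print(ranking)
```

    Because MI is sensitive to nonlinear dependence, the quadratic "damping" effect is detected even though its linear correlation with the target is near zero, which is the usual motivation for information-theoretic selection over correlation ranking.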

  11. An imbalance fault detection method based on data normalization and EMD for marine current turbines.

    PubMed

    Zhang, Milu; Wang, Tianzhen; Tang, Tianhao; Benbouzid, Mohamed; Diallo, Demba

    2017-05-01

This paper proposes an imbalance fault detection method based on data normalization and Empirical Mode Decomposition (EMD) for variable-speed direct-drive Marine Current Turbine (MCT) systems. The method operates on the MCT stator current under wave and turbulence conditions. Its goal is to extract the blade imbalance fault feature, which is concealed by the supply frequency and environmental noise. First, a Generalized Likelihood Ratio Test (GLRT) detector is developed and the monitoring variable is selected by analyzing the relationships between the variables. Then, the selected monitoring variable is converted into a time series through data normalization, which turns the imbalance fault characteristic frequency into a constant. Finally, the monitoring variable is filtered by the EMD method to eliminate the effect of turbulence. Experiments comparing different fault severities and turbulence intensities show that the proposed method is robust against turbulence. Compared with other methods, the experimental results indicate the feasibility and efficacy of the proposed method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
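The GLRT step can be sketched for the simplest textbook case, a shift in the mean of Gaussian noise with known variance. This is a generic GLRT, not the detector developed in the paper, and the signal parameters are invented:

```python
import random

def glrt_mean_shift(samples, sigma=1.0):
    """GLRT statistic for H0: zero-mean Gaussian noise vs H1: unknown
    constant mean, with known noise std sigma. Large values favour H1
    (fault present); under H0 the statistic is chi-square with 1 dof."""
    n = len(samples)
    mean = sum(samples) / n           # maximum-likelihood estimate under H1
    return n * mean * mean / (sigma * sigma)

random.seed(1)
healthy = [random.gauss(0.0, 1.0) for _ in range(500)]   # no fault feature
faulty = [random.gauss(0.4, 1.0) for _ in range(500)]    # small mean offset
```

A detection threshold would be chosen from the chi-square(1) distribution to fix the false-alarm rate.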

  12. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex-post-facto adjustment of performance variables amongst groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. In an attempt to illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group test performance differences and levels of covariate discrepancy. The results were robust and consistent in their nature, and were held to support the wisdom of previous cautions by statisticians concerning the employment of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups unequal on variables such as age and education or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.

  13. An analysis of rotor blade twist variables associated with different Euler sequences and pretwist treatments

    NASA Technical Reports Server (NTRS)

    Alkire, K.

    1984-01-01

A nonlinear analysis, necessary to adequately model elastic helicopter rotor blades experiencing moderately large deformations, was examined. The analysis must be based on an appropriate description of the blade's deformation geometry, including elastic bending and twist. Built-in pretwist angles complicate the deformation process and its definition. Relationships between the twist variables associated with different rotation sequences, and the corresponding forms of the transformation matrix, are listed. Relationships between the twist variables for the two pretwist treatments (pretwist applied initially, and pretwist combined with the deformation twist) are included, and many of the corresponding forms of the transformation matrix for the two cases are listed. It is shown that twist variables connected with the combined twist treatment are related to those where the pretwist is applied initially. A method to determine the relationships and some results are outlined. A procedure to evaluate the transformation matrix that eliminates the Euler-like sequence altogether is demonstrated. The resulting form of the transformation matrix is unaffected by rotation sequence or pretwist treatment.

  14. ACA-mandated elimination of cost sharing for preventive screening has had limited early impact.

    PubMed

    Mehta, Shivan J; Polsky, Daniel; Zhu, Jingsan; Lewis, James D; Kolstad, Jonathan T; Loewenstein, George; Volpp, Kevin G

    2015-07-01

    The Affordable Care Act eliminated patient cost sharing for evidence-based preventive care, yet the impact of this policy on colonoscopy and mammography rates is unclear. We examined the elimination of cost sharing among small business beneficiaries of Humana, a large national insurer. This was a retrospective interrupted time series analysis of whether the change in cost-sharing policy was associated with a change in screening utilization, using grandfathered plans as a comparison group. We compared beneficiaries in small business nongrandfathered plans that were required to eliminate cost sharing (intervention) with those in grandfathered plans that did not have to change cost sharing (control). There were 63,246 men and women aged 50 to 64 years eligible for colorectal cancer screening, and 30,802 women aged 50 to 64 years eligible for breast cancer screening. The primary outcome variables were rates of colonoscopy and mammography per person-month, with secondary analysis of colonoscopy rates coded as preventive only. There was no significant change in the level or slope of colonoscopy and mammography utilization for intervention plans relative to the control plans. There was also no significant relevant change among those colonoscopies coded as preventive. The results suggest that the implementation of the policy is not having its intended effects, as cost sharing rates for colonoscopy and mammography did not change substantially, and utilization of colonoscopy and mammography changed little, following this new policy approach.

  15. Induced static asymmetry of the pelvis is associated with functional asymmetry of the lumbo-pelvo-hip complex.

    PubMed

    Gnat, Rafał; Saulicz, Edward

    2008-03-01

    This study evaluates the hypothesis that triggering and eliminating induced static pelvic asymmetry (SPA) may be followed by immediate change in functional asymmetry of the lumbo-pelvo-hip complex. Repeated measures experimental design with 2 levels of independent variable, that is, induced SPA triggered and induced SPA eliminated, was implemented. Three series of measurements were performed, that is, baseline, after triggering SPA, and after eliminating SPA. A group of 84 subjects with no initial symptoms of SPA was studied. Different forms of mechanical stimulation were applied aiming to induce SPA, and the 2 manual stretching-manipulating techniques were performed aiming to eliminate it. A hand inclinometer was used to measure SPA in standing posture. Selected ranges of motion of the hip joints and lumbar spine were used to depict functional asymmetry of the lumbo-pelvo-hip complex. The functional asymmetry indices for individual movements were calculated. Repeated measures design of analysis of variance, dependent data Student t test, and linear Pearson's correlation test were used. Assessment of the SPA showed its significant increase between baseline and series 2 measurements, with a subsequent significant decrease between series 2 and series 3 measurements. Values of the functional asymmetry indices changed accordingly, that is, they increased significantly between series 1 and series 2 and had returned to their initial level in series 3 measurements. Induced SPA shows considerable association with functional asymmetry of the lumbo-pelvo-hip complex.

  16. PYTHON for Variable Star Astronomy (Abstract)

    NASA Astrophysics Data System (ADS)

    Craig, M.

    2018-06-01

    (Abstract only) Open source PYTHON packages that are useful for data reduction, photometry, and other tasks relevant to variable star astronomy have been developed over the last three to four years as part of the Astropy project. Using this software, it is relatively straightforward to reduce images, automatically detect sources, and match them to catalogs. Over the last year browser-based tools for performing some of those tasks have been developed that minimize or eliminate the need to write any of your own code. After providing an overview of the current state of the software, an application that calculates transformation coefficients on a frame-by-frame basis by matching stars in an image to the APASS catalog will be described.

  17. Quantifying the variability in stiffness and damping of an automotive vehicle's trim-structure mounts

    NASA Astrophysics Data System (ADS)

    Abolfathi, Ali; O'Boy, Dan J.; Walsh, Stephen J.; Dowsett, Amy; Fisher, Stephen A.

    2016-09-01

Small plastic clips are used in large numbers in automotive vehicles to connect interior trims to vehicle structures. The variability in their properties can contribute to the overall variability in the noise and vibration response of the vehicle. The variability arises from material and manufacturing tolerances and, more importantly, from the boundary conditions. To measure stiffness and damping, a simple experimental rig is used in which a mass supported by the clip is modelled as a single-degree-of-freedom system. The rig is designed to simulate the boundary conditions of the real vehicle. The variability of the clip itself and of the boundary condition at the structure side is examined first; it is 7% for stiffness and 8% for damping. To simulate the connection on the trim side, a mount is built using a 3D printer. Rattling occurs in the response of clips with loose connections; however, preloading the mount increases the effective stiffness and eliminates the rattling. The variability due to the boundary condition at the trim side was as large as 40% for stiffness and 52% for damping.

  18. Assessing the accuracy and stability of variable selection ...

    EPA Pesticide Factsheets

Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used, or stepwise procedures are employed which iteratively add/remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating dataset consists of the good/poor condition of n=1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p=212) of landscape features from the StreamCat dataset. Two types of RF models are compared: a full variable set model with all 212 predictors, and a reduced variable set model selected using a backwards elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors, and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substanti
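A generic backwards-elimination loop of the kind described can be sketched with a pluggable scoring function; the toy score below stands in for an out-of-bag or cross-validated RF accuracy and is purely illustrative:

```python
def backward_elimination(variables, score_fn, min_vars=1):
    """Greedy backwards elimination: repeatedly drop the variable whose
    removal gives the best score, stopping when every removal hurts."""
    current = list(variables)
    best = score_fn(current)
    while len(current) > min_vars:
        trials = [(score_fn([v for v in current if v != drop]), drop)
                  for drop in current]
        trial_score, drop = max(trials)
        if trial_score < best:
            break           # every candidate removal lowers the score
        current.remove(drop)
        best = trial_score
    return current

# Toy stand-in for an out-of-bag accuracy: only "flow" and "slope" carry
# signal; every extra variable costs a small complexity penalty.
def toy_score(subset):
    return sum(1.0 for v in subset if v in ("flow", "slope")) - 0.01 * len(subset)

selected = backward_elimination(["flow", "slope", "noise1", "noise2"], toy_score)
```

With an RF model, `score_fn` would refit the forest on each candidate subset, which is why the abstract stresses keeping validation folds external to the selection loop.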

  19. Interrelations between different canonical descriptions of dissipative systems

    NASA Astrophysics Data System (ADS)

    Schuch, D.; Guerrero, J.; López-Ruiz, F. F.; Aldaya, V.

    2015-04-01

    There are many approaches for the description of dissipative systems coupled to some kind of environment. This environment can be described in different ways; only effective models are being considered here. In the Bateman model, the environment is represented by one additional degree of freedom and the corresponding momentum. In two other canonical approaches, no environmental degree of freedom appears explicitly, but the canonical variables are connected with the physical ones via non-canonical transformations. The link between the Bateman approach and those without additional variables is achieved via comparison with a canonical approach using expanding coordinates, as, in this case, both Hamiltonians are constants of motion. This leads to constraints that allow for the elimination of the additional degree of freedom in the Bateman approach. These constraints are not unique. Several choices are studied explicitly, and the consequences for the physical interpretation of the additional variable in the Bateman model are discussed.

  20. A two-field modified Lagrangian formulation for robust simulations of extrinsic cohesive zone models

    NASA Astrophysics Data System (ADS)

    Cazes, F.; Coret, M.; Combescure, A.

    2013-06-01

    This paper presents the robust implementation of a cohesive zone model based on extrinsic cohesive laws (i.e. laws involving an infinite initial stiffness). To this end, a two-field Lagrangian weak formulation in which cohesive tractions are chosen as the field variables along the crack's path is presented. Unfortunately, this formulation cannot model the infinite compliance of the broken elements accurately, and no simple criterion can be defined to determine the loading-unloading change of state at the integration points of the cohesive elements. Therefore, a modified Lagrangian formulation using a fictitious cohesive traction instead of the classical cohesive traction as the field variable is proposed. Thanks to this change of variable, the cohesive law becomes an increasing function of the equivalent displacement jump, which eliminates the problems mentioned previously. The ability of the proposed formulations to simulate fracture accurately and without field oscillations is investigated through three numerical test examples.

  1. Electronic Thermometer Readings

    NASA Technical Reports Server (NTRS)

    2001-01-01

NASA Stennis' adaptive predictive algorithm for electronic thermometers samples readings during the initial rise in temperature and accurately and rapidly predicts the steady-state temperature. The final steady-state temperature of an object can be calculated based on the second-order logarithm of the temperature signals acquired by the sensor and predetermined variables from the sensor characteristics. These variables are calculated during tests of the sensor. Once the variables are determined, the algorithm requires relatively little data acquisition and data processing time to provide an accurate approximation of the final temperature. This reduces the delay in the steady-state response time of a temperature sensor. This advanced algorithm can be implemented in existing software or hardware with an erasable programmable read-only memory (EPROM). The capability for easy integration eliminates the expense of developing a whole new system to obtain the benefits provided by NASA Stennis' technology.
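A minimal version of such steady-state prediction can be sketched with a first-order exponential sensor model (the NASA Stennis algorithm is described as second-order, so this is a simplified stand-in with invented parameters):

```python
import math

def predict_steady_state(t1, t2, t3):
    """Predict the final temperature from three equally spaced readings on
    the initial rise, assuming a first-order exponential approach."""
    d1, d2 = t2 - t1, t3 - t2
    r = d2 / d1                  # equals exp(-dt/tau) for a first-order sensor
    if not 0.0 < r < 1.0:
        raise ValueError("readings inconsistent with an exponential rise")
    return t3 + d2 * r / (1.0 - r)

# Hypothetical probe: 20 C ambient stepping to 37 C, tau = 8 s, sampled at 1, 2, 3 s
tau, t_final, t0 = 8.0, 37.0, 20.0
reading = lambda t: t_final - (t_final - t0) * math.exp(-t / tau)
pred = predict_steady_state(reading(1), reading(2), reading(3))
```

For a noiseless first-order sensor the prediction is exact; real readings would be fitted over many samples to average out noise.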

  2. Assessment of Communications-related Admissions Criteria in a Three-year Pharmacy Program

    PubMed Central

    Tejada, Frederick R.; Lang, Lynn A.; Purnell, Miriam; Acedera, Lisa; Ngonga, Ferdinand

    2015-01-01

    Objective. To determine if there is a correlation between TOEFL and other admissions criteria that assess communications skills (ie, PCAT variables: verbal, reading, essay, and composite), interview, and observational scores and to evaluate TOEFL and these admissions criteria as predictors of academic performance. Methods. Statistical analyses included two sample t tests, multiple regression and Pearson’s correlations for parametric variables, and Mann-Whitney U for nonparametric variables, which were conducted on the retrospective data of 162 students, 57 of whom were foreign-born. Results. The multiple regression model of the other admissions criteria on TOEFL was significant. There was no significant correlation between TOEFL scores and academic performance. However, significant correlations were found between the other admissions criteria and academic performance. Conclusion. Since TOEFL is not a significant predictor of either communication skills or academic success of foreign-born PharmD students in the program, it may be eliminated as an admissions criterion. PMID:26430273

  3. Assessment of Communications-related Admissions Criteria in a Three-year Pharmacy Program.

    PubMed

    Parmar, Jayesh R; Tejada, Frederick R; Lang, Lynn A; Purnell, Miriam; Acedera, Lisa; Ngonga, Ferdinand

    2015-08-25

    To determine if there is a correlation between TOEFL and other admissions criteria that assess communications skills (ie, PCAT variables: verbal, reading, essay, and composite), interview, and observational scores and to evaluate TOEFL and these admissions criteria as predictors of academic performance. Statistical analyses included two sample t tests, multiple regression and Pearson's correlations for parametric variables, and Mann-Whitney U for nonparametric variables, which were conducted on the retrospective data of 162 students, 57 of whom were foreign-born. The multiple regression model of the other admissions criteria on TOEFL was significant. There was no significant correlation between TOEFL scores and academic performance. However, significant correlations were found between the other admissions criteria and academic performance. Since TOEFL is not a significant predictor of either communication skills or academic success of foreign-born PharmD students in the program, it may be eliminated as an admissions criterion.

4. [Customizing drug dosage: what is the contribution of therapeutic drug monitoring?].

    PubMed

    Abdessadek, Mohammed; Magoul, Rabia; Amarti, Afaf; El Ouezzani, Seloua; Khabbal, Youssef

    2014-01-01

Drug response is often variable from one individual to another: the same dose of a drug administered to different patients can cause pharmacological effects that vary in nature and intensity. Those effects are often the result of variability in the drug's pharmacokinetics (absorption, distribution, metabolism and elimination), which alters its bioavailability. In fact, two factors should be taken into account: the disease(s) from which the patient suffers, and the associated drugs, because many drug interactions may alter pharmacokinetics and consequently produce quite different therapeutic effects. The choice of the drug assay used in monitoring is crucial: it quantifies the in vivo dose of the drug and the quality of compliance, while the pharmacokinetic characteristics allow the clinician to adjust the dosage by different approaches so that plasma concentrations fall within the therapeutic range. Therapeutic monitoring aims to increase clinical efficacy and to minimize toxicity.

  5. Drivers of solar radiation variability in the McMurdo Dry Valleys, Antarctica

    USGS Publications Warehouse

    Obryk, Maciej; Fountain, Andrew G.; Doran, Peter; Lyons, Berry; Eastman, Ryan

    2018-01-01

Annually averaged solar radiation in the McMurdo Dry Valleys, Antarctica has varied by over 20 W m−2 during the past three decades; however, the drivers of this variability are unknown. Because small differences in radiation are important to water availability and ecosystem functioning in polar deserts, determining the causes is important to predictions of future desert processes. We examine the potential drivers of solar variability and systematically eliminate all but stratospheric sulfur dioxide. We argue that increases in stratospheric sulfur dioxide increase stratospheric aerosol optical depth and decrease solar intensity. Because of the polar location of the McMurdo Dry Valleys (77–78°S) and the relatively long solar ray path through the stratosphere, terrestrial solar intensity is sensitive to small differences in stratospheric transmissivity. Important sources of sulfur dioxide include natural (wildfires and volcanic eruptions) and anthropogenic emissions.

  6. Study of a Vocal Feature Selection Method and Vocal Properties for Discriminating Four Constitution Types

    PubMed Central

    Kim, Keun Ho; Ku, Boncho; Kang, Namsik; Kim, Young-Su; Jang, Jun-Su; Kim, Jong Yeol

    2012-01-01

    The voice has been used to classify the four constitution types, and to recognize a subject's health condition by extracting meaningful physical quantities, in traditional Korean medicine. In this paper, we propose a method of selecting the reliable variables from various voice features, such as frequency derivative features, frequency band ratios, and intensity, from vowels and a sentence. Further, we suggest a process to extract independent variables by eliminating explanatory variables and reducing their correlation and remove outlying data to enable reliable discriminant analysis. Moreover, the suitable division of data for analysis, according to the gender and age of subjects, is discussed. Finally, the vocal features are applied to a discriminant analysis to classify each constitution type. This method of voice classification can be widely used in the u-Healthcare system of personalized medicine and for improving diagnostic accuracy. PMID:22529874

  7. Taming the nonlinearity of the Einstein equation.

    PubMed

    Harte, Abraham I

    2014-12-31

    Many of the technical complications associated with the general theory of relativity ultimately stem from the nonlinearity of Einstein's equation. It is shown here that an appropriate choice of dynamical variables may be used to eliminate all such nonlinearities beyond a particular order: Both Landau-Lifshitz and tetrad formulations of Einstein's equation are obtained that involve only finite products of the unknowns and their derivatives. Considerable additional simplifications arise in physically interesting cases where metrics become approximately Kerr or, e.g., plane waves, suggesting that the variables described here can be used to efficiently reformulate perturbation theory in a variety of contexts. In all cases, these variables are shown to have simple geometrical interpretations that directly relate the local causal structure associated with the metric of interest to the causal structure associated with a prescribed background. A new method to search for exact solutions is outlined as well.

  8. Iterative Stable Alignment and Clustering of 2D Transmission Electron Microscope Images

    PubMed Central

    Yang, Zhengfan; Fang, Jia; Chittuluru, Johnathan; Asturias, Francisco J.; Penczek, Pawel A.

    2012-01-01

Identification of homogeneous subsets of images in a macromolecular electron microscopy (EM) image data set is a critical step in single-particle analysis. The task is handled by iterative algorithms, whose performance is compromised by the compounded limitations of image alignment and K-means clustering. Here we describe an approach, iterative stable alignment and clustering (ISAC), that, relying on a new clustering method and on the concepts of stability and reproducibility, can extract validated, homogeneous subsets of images. ISAC requires only a small number of simple parameters and, with minimal human intervention, can eliminate bias from two-dimensional image clustering and maximize the quality of group averages that can be used for ab initio three-dimensional structural determination and analysis of macromolecular conformational variability. Repeated testing of the stability and reproducibility of a solution within ISAC eliminates heterogeneous or incorrect classes and introduces critical validation to the process of EM image clustering. PMID:22325773

  9. Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.

    PubMed

    Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe

    2016-08-01

    Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used for the modelling of the process scale-up and the different configurations of lipid extraction to optimise this step, as it is the most expensive. The operational variables with a major influence in the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even price of biodiesel of 1232 $/t, being economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Seamless image stitching by homography refinement and structure deformation using optimal seam pair detection

    NASA Astrophysics Data System (ADS)

    Lee, Daeho; Lee, Seohyung

    2017-11-01

    We propose an image stitching method that can remove ghost effects and realign the structure misalignments that occur in common image stitching methods. To reduce the artifacts caused by different parallaxes, an optimal seam pair is selected by comparing the cross correlations from multiple seams detected by variable cost weights. Along the optimal seam pair, a histogram of oriented gradients is calculated, and feature points for matching are detected. The homography is refined using the matching points, and the remaining misalignment is eliminated using the propagation of deformation vectors calculated from matching points. In multiband blending, the overlapping regions are determined from a distance between the matching points to remove overlapping artifacts. The experimental results show that the proposed method more robustly eliminates misalignments and overlapping artifacts than the existing method that uses single seam detection and gradient features.

  11. A method to eliminate the influence of incident light variations in spectral analysis

    NASA Astrophysics Data System (ADS)

    Luo, Yongshun; Li, Gang; Fu, Zhigang; Guan, Yang; Zhang, Shengzhao; Lin, Ling

    2018-06-01

    The intensity of the light source and consistency of the spectrum are the most important factors influencing the accuracy in quantitative spectrometric analysis. An efficient "measuring in layer" method was proposed in this paper to limit the influence of inconsistencies in the intensity and spectrum of the light source. In order to verify the effectiveness of this method, a light source with a variable intensity and spectrum was designed according to Planck's law and Wien's displacement law. Intra-lipid samples with 12 different concentrations were prepared and divided into modeling sets and prediction sets according to different incident lights and solution concentrations. The spectra of each sample were measured with five different light intensities. The experimental results showed that the proposed method was effective in eliminating the influence caused by incident light changes and was more effective than normalized processing.
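The core idea of cancelling a multiplicative change in source intensity can be sketched by normalizing each spectrum to unit mean; the actual "measuring in layer" method is more involved, and the spectra below are invented:

```python
def normalize(spectrum):
    """Scale a spectrum to unit mean, so a uniform (wavelength-independent)
    change in source intensity cancels out."""
    m = sum(spectrum) / len(spectrum)
    return [v / m for v in spectrum]

base = [0.2, 0.5, 0.9, 0.7, 0.3]      # invented transmission spectrum
dimmed = [0.6 * v for v in base]      # same sample, source at 60% intensity
na, nb = normalize(base), normalize(dimmed)
```

After normalization the two measurements coincide, since the intensity change is a common scale factor; a change in the source's spectral shape, by contrast, would not cancel this way.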

  12. An inter-laboratory proficiency testing exercise for rabies diagnosis in Latin America and the Caribbean

    PubMed Central

    Clavijo, Alfonso; Freire de Carvalho, Mary H.; Orciari, Lillian A.; Velasco-Villa, Andres; Ellison, James A.; Greenberg, Lauren; Yager, Pamela A.; Green, Douglas B.; Vigilato, Marco A.; Cosivi, Ottorino; Del Rio-Vilas, Victor J.

    2017-01-01

The direct fluorescent antibody test (DFA) is performed in all rabies reference laboratories across Latin America and the Caribbean (LAC). Although DFA is a critical capacity in the control of rabies, there is no standardized protocol in the region. We describe the results of the first inter-laboratory proficiency exercise of national rabies laboratories in LAC countries, part of the regional efforts towards dog-maintained rabies elimination in the American region. Twenty-three laboratories affiliated with the Ministries of Health and Ministries of Agriculture participated in this exercise. In addition, the laboratories completed an online questionnaire to assess laboratory practices. Answers to the online questionnaire indicated large variability in the laboratories' throughput, equipment used, protocol availability, quality control standards and biosafety requirements. Our results will inform actions to improve and harmonize rabies laboratory capacities across LAC in support of the regional efforts towards elimination of dog-maintained rabies. PMID:28369139

  13. On the efficacy of procedures to normalize Ex-Gaussian distributions.

    PubMed

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2014-01-01

Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; hence, it is widely acknowledged that this assumption is not met for RT data. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution so that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that transformation methods are better than elimination methods at normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are. Specifically, transformation with parameter lambda = -1 leads to the best results.
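The lambda = -1 power transform (a reciprocal with a sign flip to preserve order) can be sketched on synthetic Ex-Gaussian reaction times; the distribution parameters are invented for illustration:

```python
import random

def skewness(xs):
    """Moment-based sample skewness."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

random.seed(3)
# Synthetic Ex-Gaussian RTs (ms): Gaussian(mu=400, sigma=40) plus Exponential(tau=150)
rts = [random.gauss(400, 40) + random.expovariate(1 / 150) for _ in range(5000)]
# lambda = -1 power transform: reciprocal with a sign flip so rank order is kept
transformed = [-1000.0 / rt for rt in rts]
```

The raw sample is strongly positively skewed; the reciprocal compresses the long right tail, leaving a distribution much closer to symmetric.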

  14. Confidence mediates the sex difference in mental rotation performance.

    PubMed

    Estes, Zachary; Felker, Sydney

    2012-06-01

    On tasks that require the mental rotation of 3-dimensional figures, males typically exhibit higher accuracy than females. Using the most common measure of mental rotation (i.e., the Mental Rotations Test), we investigated whether individual variability in confidence mediates this sex difference in mental rotation performance. In each of four experiments, the sex difference was reliably elicited and eliminated by controlling or manipulating participants' confidence. Specifically, confidence predicted performance within and between sexes (Experiment 1), rendering confidence irrelevant to the task reliably eliminated the sex difference in performance (Experiments 2 and 3), and manipulating confidence significantly affected performance (Experiment 4). Thus, confidence mediates the sex difference in mental rotation performance and hence the sex difference appears to be a difference of performance rather than ability. Results are discussed in relation to other potential mediators and mechanisms, such as gender roles, sex stereotypes, spatial experience, rotation strategies, working memory, and spatial attention.

  15. The effect of changes in the USF/NASA toxicity screening test method on data from some cellular polymers

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Miller, C. M.

    1976-01-01

    Rankings of relative toxicity can be markedly affected by changes in test variables. Revision of the USF/NASA toxicity screening test procedure to eliminate the connecting tube and supporting floor and incorporate a 1.0 g sample weight, 200 C starting temperature, and 800 C upper limit temperature for pyrolysis, reversed the rankings of flexible polyurethane and polychloroprene foams, not only in relation to each other, but also in relation to cotton and red oak. Much of the change is attributed to reduction of the distance between the sample and the test animals, and reduction of the sample weight charged. Elimination of the connecting tube increased the relative toxicity of the polyurethane foams. The materials tested were flexible polyurethane foam, without and with fire retardant; rigid polyurethane foam with fire retardant; flexible polychloroprene foam; cotton, Douglas fir, red oak, hemlock, hardboard, particle board, polystyrene, and polymethyl methacrylate.

  16. The Unsupervised Acquisition of a Lexicon from Continuous Speech.

    DTIC Science & Technology

    1995-11-01

    Communication, 2(1):57-89, 1982. [42] J. Ziv and A. Lempel. Compression of individual sequences by variable rate coding. IEEE Transactions on...parameters of the compression algorithm, in a never-ending attempt to identify and eliminate the predictable. They lead us to a class of grammars in...the first 10 sentences of the test set, previously unseen by the algorithm. Vertical bars indicate word boundaries. 7.1 Text Compression and Language
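    The Ziv-Lempel paper cited in this snippet ([42], variable-rate coding) underlies dictionary-based compression, in which the predictable is identified and eliminated incrementally. A minimal LZ78-style sketch in Python, purely illustrative and not the report's own implementation:

```python
def lz78_compress(s):
    """LZ78: scan the input, extending the longest phrase already in the
    dictionary; emit (dictionary_index, next_char) pairs (index 0 = empty)."""
    dictionary = {}  # phrase -> index (1-based)
    result = []
    phrase = ""
    for ch in s:
        candidate = phrase + ch
        if candidate in dictionary:
            phrase = candidate  # keep extending a known phrase
        else:
            result.append((dictionary.get(phrase, 0), ch))
            dictionary[candidate] = len(dictionary) + 1
            phrase = ""
    if phrase:  # input ended mid-phrase
        result.append((dictionary[phrase], ""))
    return result
```

    Each emitted pair references a previously seen phrase, so repetitive input compresses to few pairs, e.g. `lz78_compress("aaab")` yields three pairs rather than four symbols.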

  17. Study of three-dimensional effects on vortex breakdown

    NASA Technical Reports Server (NTRS)

    Salas, M. D.; Kuruvila, G.

    1988-01-01

    The incompressible axisymmetric steady Navier-Stokes equations in primitive variables are used to simulate vortex breakdown. The equations, discretized using a second-order, central-difference scheme, are linearized and then solved using an exact LU decomposition, Gaussian elimination, and Newton iteration. Solutions are presented for Reynolds numbers, based on vortex-core radius, as high as 1500. An attempt to study the stability of the axisymmetric solutions against three-dimensional perturbations is discussed.
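    The solution strategy described above, linearize the discretized equations and solve with exact LU decomposition / Gaussian elimination inside a Newton iteration, can be sketched on a small nonlinear system. A pure-Python illustration of the numerical idea only, not the paper's Navier-Stokes solver:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def newton(F, x, tol=1e-12, h=1e-7):
    """Newton iteration with a finite-difference Jacobian."""
    for _ in range(50):
        fx = F(x)
        if max(abs(v) for v in fx) < tol:
            break
        cols = []
        for i in range(len(x)):  # Jacobian column i = dF/dx_i
            xp = x[:]
            xp[i] += h
            fp = F(xp)
            cols.append([(fp[k] - fx[k]) / h for k in range(len(fx))])
        J = [[cols[c][r] for c in range(len(x))] for r in range(len(fx))]
        dx = gauss_solve(J, [-v for v in fx])  # Newton step: J dx = -F
        x = [x[i] + dx[i] for i in range(len(x))]
    return x
```

    For example, solving x² + y² = 2 together with x = y from the starting guess (2, 0.5) converges to (1, 1) in a handful of iterations.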

  18. High Fidelity and Multiscale Algorithms for Collisional-radiative and Nonequilibrium Plasmas (Briefing Charts)

    DTIC Science & Technology

    2014-07-01

    of models for variable conditions: – Use implicit models to eliminate constraint of sequence of fast time scales: c, ve, – Price to pay: lack...collisions: – Elastic – Braginskii terms – Inelastic – warning! Rates depend on both T and relative velocity – Multi-fluid CR model from...merge/split for particle management, efficient sampling, inelastic collisions … – Level grouping schemes of electronic states, for dynamical coarse

  19. Identification of spectral regions for the quantification of red wine tannins with fourier transform mid-infrared spectroscopy.

    PubMed

    Jensen, Jacob S; Egebo, Max; Meyer, Anne S

    2008-05-28

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which therefore were concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
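    The backward-elimination idea behind interval-selection methods such as the one introduced above can be expressed generically: repeatedly drop the spectral interval whose removal most reduces a cross-validated prediction error, and stop when no removal helps. A hedged Python sketch of that greedy loop only; the PLS model and the cross-validation scheme are abstracted into a caller-supplied `cv_error` function, and the names are illustrative, not the paper's algorithm:

```python
def backward_eliminate_intervals(intervals, cv_error):
    """Greedy backward elimination over spectral intervals.

    intervals: list of interval identifiers.
    cv_error:  callable mapping a list of intervals to a cross-validated
               prediction error (lower is better).
    Returns the retained intervals and their error.
    """
    selected = list(intervals)
    best = cv_error(selected)
    improved = True
    while improved and len(selected) > 1:
        improved = False
        for iv in list(selected):
            trial = [j for j in selected if j != iv]
            err = cv_error(trial)
            if err < best:  # dropping this interval helps: accept and restart
                best, selected, improved = err, trial, True
                break
    return selected, best
```

    With a toy error function that rewards keeping two informative intervals and penalizes model size, the loop prunes the uninformative ones and keeps the informative pair.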

  20. Exposure reconstruction for the TCDD-exposed NIOSH cohort using a concentration- and age-dependent model of elimination.

    PubMed

    Aylward, Lesa L; Brunet, Robert C; Starr, Thomas B; Carrier, Gaétan; Delzell, Elizabeth; Cheng, Hong; Beall, Colleen

    2005-08-01

    Recent studies demonstrating a concentration dependence of elimination of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) suggest that previous estimates of exposure for occupationally exposed cohorts may have underestimated actual exposure, resulting in a potential overestimate of the carcinogenic potency of TCDD in humans based on the mortality data for these cohorts. Using a database on U.S. chemical manufacturing workers potentially exposed to TCDD compiled by the National Institute for Occupational Safety and Health (NIOSH), we evaluated the impact of using a concentration- and age-dependent elimination model (CADM) (Aylward et al., 2005) on estimates of serum lipid area under the curve (AUC) for the NIOSH cohort. These data were used previously by Steenland et al. (2001) in combination with a first-order elimination model with an 8.7-year half-life to estimate cumulative serum lipid concentration (equivalent to AUC) for these workers for use in cancer dose-response assessment. Serum lipid TCDD measurements taken in 1988 for a subset of the cohort were combined with the NIOSH job exposure matrix and work histories to estimate dose rates per unit of exposure score. We evaluated the effect of choices in regression model (regression on untransformed vs. ln-transformed data and inclusion of a nonzero regression intercept) as well as the impact of choices of elimination models and parameters on estimated AUCs for the cohort. Central estimates for dose rate parameters derived from the serum-sampled subcohort were applied with the elimination models to time-specific exposure scores for the entire cohort to generate AUC estimates for all cohort members. Use of the CADM resulted in improved model fits to the serum sampling data compared to the first-order models. Dose rates varied by a factor of 50 among different combinations of elimination model, parameter sets, and regression models. 
Use of a CADM results in increases of up to five-fold in AUC estimates for the more highly exposed members of the cohort compared to estimates obtained using the first-order model with 8.7-year half-life. This degree of variation in the AUC estimates for this cohort would affect substantially the cancer potency estimates derived from the mortality data from this cohort. Such variability and uncertainty in the reconstructed serum lipid AUC estimates for this cohort, depending on elimination model, parameter set, and regression model, have not been described previously and are critical components in evaluating the dose-response data from the occupationally exposed populations.
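    The first-order elimination model used in the earlier analysis (an 8.7-year half-life) and the AUC computation can be sketched simply; the concentration- and age-dependent CADM itself is not reproduced here:

```python
import math

def first_order_series(c0, half_life_years, times):
    """Serum concentration under first-order elimination:
    C(t) = C0 * exp(-k t), with k = ln(2) / half-life."""
    k = math.log(2) / half_life_years
    return [c0 * math.exp(-k * t) for t in times]

def auc_trapezoid(times, conc):
    """Area under the concentration-time curve by the trapezoid rule."""
    return sum((times[i + 1] - times[i]) * (conc[i] + conc[i + 1]) / 2
               for i in range(len(times) - 1))
```

    As a sanity check, with the 8.7-year half-life a starting concentration of 100 falls to 50 after exactly 8.7 years.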

  1. Uninformed sacrifice: Evidence against long-range alarm transmission in foraging ants exposed to localized abduction

    NASA Astrophysics Data System (ADS)

    Tejera, F.; Reyes, A.; Altshuler, E.

    2016-07-01

    It is well established that ants can transmit danger information over relatively small distances, provoking either a state of alarm, in which they move away from the potentially dangerous stimulus, or an aggressive charge toward it. Little is known, however, about whether danger information can be transmitted over large distances. In this paper, we abduct leaf cutting ants of the species Atta insularis while they forage in their natural environment at a certain point of the foraging line, so ants make a "U" turn to escape from the danger zone and go back to the nest. Our results strongly suggest that those ants do not transmit "danger information" to other nestmates marching towards the abduction area. The individualistic behavior of the ants returning from the danger zone results in a depression of the foraging activity due to the systematic sacrifice of non-informed individuals.

  2. Reciprocity of weighted networks

    PubMed Central

    Squartini, Tiziano; Picciolo, Francesco; Ruzzenenti, Franco; Garlaschelli, Diego

    2013-01-01

    In directed networks, reciprocal links have dramatic effects on dynamical processes, network growth, and higher-order structures such as motifs and communities. While the reciprocity of binary networks has been extensively studied, that of weighted networks is still poorly understood, implying an ever-increasing gap between the availability of weighted network data and our understanding of their dyadic properties. Here we introduce a general approach to the reciprocity of weighted networks, and define quantities and null models that consistently capture empirical reciprocity patterns at different structural levels. We show that, counter-intuitively, previous reciprocity measures based on the similarity of mutual weights are uninformative. By contrast, our measures allow us to consistently classify different weighted networks according to their reciprocity, track the evolution of a network's reciprocity over time, identify patterns at the level of dyads and vertices, and distinguish the effects of flux (im)balances or other (a)symmetries from a true tendency towards (anti-)reciprocation. PMID:24056721
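    A basic reciprocated-weight ratio in the spirit of this work, the fraction of total weight that is mutual, r = Σ_{i≠j} min(w_ij, w_ji) / Σ_{i≠j} w_ij, can be sketched as follows. This is a simplified reading of the measure; the paper's null models and higher-order quantities are not included:

```python
def weighted_reciprocity(w):
    """Reciprocated-weight ratio of a weighted directed network.

    w: square matrix (list of lists) of non-negative link weights,
       with w[i][j] the weight of the link i -> j.
    Returns min-weight overlap divided by total weight (0.0 if empty)."""
    n = len(w)
    total = sum(w[i][j] for i in range(n) for j in range(n) if i != j)
    recip = sum(min(w[i][j], w[j][i])
                for i in range(n) for j in range(n) if i != j)
    return recip / total if total else 0.0
```

    A perfectly symmetric network scores 1.0, a purely one-directional one scores 0.0, and mixed cases fall in between regardless of how similar the mutual weights look.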

  3. Reciprocity of weighted networks.

    PubMed

    Squartini, Tiziano; Picciolo, Francesco; Ruzzenenti, Franco; Garlaschelli, Diego

    2013-01-01

    In directed networks, reciprocal links have dramatic effects on dynamical processes, network growth, and higher-order structures such as motifs and communities. While the reciprocity of binary networks has been extensively studied, that of weighted networks is still poorly understood, implying an ever-increasing gap between the availability of weighted network data and our understanding of their dyadic properties. Here we introduce a general approach to the reciprocity of weighted networks, and define quantities and null models that consistently capture empirical reciprocity patterns at different structural levels. We show that, counter-intuitively, previous reciprocity measures based on the similarity of mutual weights are uninformative. By contrast, our measures allow us to consistently classify different weighted networks according to their reciprocity, track the evolution of a network's reciprocity over time, identify patterns at the level of dyads and vertices, and distinguish the effects of flux (im)balances or other (a)symmetries from a true tendency towards (anti-)reciprocation.

  4. Informal STEM Education in Antarctica

    NASA Astrophysics Data System (ADS)

    Chell, K.

    2010-12-01

    Tourism in Antarctica has increased dramatically with tens of thousands of tourists visiting the White Continent each year. Tourism cruises to Antarctica offer a unique educational experience for lay people through informal science-technology-engineering-mathematics (STEM) education. Passengers attend numerous scientific lectures that cover topics such as the geology of Antarctica, plate tectonics, glaciology, and climate change. Furthermore, tourists experience the geology and glaciology first hand during shore excursions. Currently, the grand challenges facing our global society are closely connected to the Earth sciences. Issues such as energy, climate change, water security, and natural hazards, are consistently on the legislative docket of policymakers around the world. However, the majority of the world’s population is uninformed about the role Earth sciences play in their everyday lives. Tourism in Antarctica provides opportunities for informal STEM learning and, as a result, tourists leave with a better understanding and greater appreciation for both Antarctica and Earth sciences.

  5. Transformation of a Spatial Map across the Hippocampal-Lateral Septal Circuit.

    PubMed

    Tingley, David; Buzsáki, György

    2018-05-15

    The hippocampus constructs a map of the environment. How this "cognitive map" is utilized by other brain regions to guide behavior remains unexplored. To examine how neuronal firing patterns in the hippocampus are transmitted and transformed, we recorded neurons in its principal subcortical target, the lateral septum (LS). We observed that LS neurons carry reliable spatial information in the phase of action potentials, relative to hippocampal theta oscillations, while the firing rates of LS neurons remained uninformative. Furthermore, this spatial phase code had an anatomical microstructure within the LS and was bound to the hippocampal spatial code by synchronous gamma frequency cell assemblies. Using a data-driven model, we show that rate-independent spatial tuning arises through the dynamic weighting of CA1 and CA3 cell assemblies. Our findings demonstrate that transformation of the hippocampal spatial map depends on higher-order theta-dependent neuronal sequences. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. A glossary of theories for understanding policymaking.

    PubMed

    Smith, Katherine Elizabeth; Katikireddi, Srinivasa Vittal

    2013-02-01

    Public health practitioners and researchers often seek to influence public policies in order to improve population health and/or reduce health inequalities. However, these efforts frequently appear to be uninformed by the many empirically-based theories about policymaking that have been developed within political science. This glossary provides a brief overview of some of the most popular of these theories, describing how each: frames the policymaking process; portrays the relationships and influence of specific policy actors; and depicts the potential for policy change (or inertia). Examples of their application to public health are provided to help improve understanding of the material presented. Throughout the article, the implications of the different theories for public health researchers and advocates seeking to inform policy decisions are emphasised. The glossary aims to provide an accessible overview to key theories about policy and decision-making, with a view to supporting public health efforts to achieve healthier public policies.

  7. Transgender identity development as represented by a group of female-to-male transgendered adults.

    PubMed

    Morgan, Sarah W; Stevens, Patricia E

    2008-06-01

    This article represents work done in the discipline of nursing to raise awareness about the lives and experiences of transgendered persons, who receive little coverage in our nursing textbooks, professional journals, or student clinical experiences. The findings presented here are from a larger qualitative examination of the lives and experiences of a group of 11 transgendered adults that examined four broad areas: transgender identity recognition, acknowledgement, and development; bodily experiences; relationships with others; and health care experiences. The focus of this article is the relevant findings related to four participants in the study who identified as female-to-male (FTM), meaning they were born female-bodied, but identify as male. The highlight here is on the recognition, acknowledgement, and development of transgender identity. Our intention is to expose uninformed people to first-hand accounts by FTM transgendered persons about their life trajectories, particularly during childhood, adolescence, and the early adult years.

  8. Disclosure of hydraulic fracturing fluid chemical additives: analysis of regulations.

    PubMed

    Maule, Alexis L; Makey, Colleen M; Benson, Eugene B; Burrows, Isaac J; Scammell, Madeleine K

    2013-01-01

    Hydraulic fracturing is used to extract natural gas from shale formations. The process involves injecting into the ground fracturing fluids that contain thousands of gallons of chemical additives. Companies are not mandated by federal regulations to disclose the identities or quantities of chemicals used during hydraulic fracturing operations on private or public lands. States have begun to regulate hydraulic fracturing fluids by mandating chemical disclosure. These laws have shortcomings including nondisclosure of proprietary or "trade secret" mixtures, insufficient penalties for reporting inaccurate or incomplete information, and timelines that allow for after-the-fact reporting. These limitations leave lawmakers, regulators, public safety officers, and the public uninformed and ill-prepared to anticipate and respond to possible environmental and human health hazards associated with hydraulic fracturing fluids. We explore hydraulic fracturing exemptions from federal regulations, as well as current and future efforts to mandate chemical disclosure at the federal and state level.

  9. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    PubMed

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  10. Rare occurrence of heart lesions in Pacific oysters Crassostrea gigas caused by an unknown bacterial infection.

    PubMed

    Meyer, Gary R; Lowe, Geoffrey J; Bower, Susan M

    2017-09-20

    On rare occasions, small cream-coloured cysts have been observed in the heart and pericardial cavity of Pacific oysters Crassostrea gigas from British Columbia, Canada. Histopathology revealed the presence of large colonies of bacteria (up to 800 µm in diameter) causing significant host response and hypertrophy of the heart epithelium. The causative bacteria were characterized as follows: Gram-negative, coccoid to small rod-shaped, typically <1.5 µm in size, cell walls highly endowed with surface fimbriae and division via binary fission. Although these bacteria shared some morphological characteristics with the order Rickettsiales, they did not require an intracellular existence for multiplication. Unfortunately, a cultured isolate was not available, and a retrospective attempt to further characterize the bacteria using DNA sequence analysis of a fragment from the 16S rDNA region proved to be uninformative.

  11. Assessing the Public’s Views in Research Ethics Controversies: Deliberative Democracy and Bioethics as Natural Allies

    PubMed Central

    Kim, Scott Y. H.; Wall, Ian F.; Stanczyk, Aimee; Vries, Raymond De

    2010-01-01

    In a liberal democracy, policy decisions regarding ethical controversies, including those in research ethics, should incorporate the opinions of its citizens. Eliciting informed and well-considered ethical opinions can be challenging. The issues may not be widely familiar and they may involve complex scientific, legal, historical, and ethical dimensions. Traditional surveys risk eliciting superficial and uninformed opinions that may be of dubious quality for policy formation. We argue that the theory and practice of deliberative democracy (DD) is especially useful in overcoming such inadequacies. We explain DD theory and practice, discuss the rationale for using DD methods in research ethics, and illustrate in depth the use of a DD method for a long-standing research ethics controversy involving research based on surrogate consent. The potential pitfalls of DD and the means of minimizing them as well as future research directions are also discussed. PMID:19919315

  12. Hi-C 2.0: An optimized Hi-C procedure for high-resolution genome-wide mapping of chromosome conformation.

    PubMed

    Belaghzal, Houda; Dekker, Job; Gibcus, Johan H

    2017-07-01

    Chromosome conformation capture-based methods such as Hi-C have become mainstream techniques for the study of the 3D organization of genomes. These methods convert chromatin interactions reflecting topological chromatin structures into digital information (counts of pair-wise interactions). Here, we describe an updated protocol for Hi-C (Hi-C 2.0) that integrates recent improvements into a single protocol for efficient and high-resolution capture of chromatin interactions. This protocol combines chromatin digestion and frequently cutting enzymes to obtain kilobase (kb) resolution. It also includes steps to reduce random ligation and the generation of uninformative molecules, such as unligated ends, to improve the amount of valid intra-chromosomal read pairs. This protocol allows for obtaining information on conformational structures such as compartment and topologically associating domains, as well as high-resolution conformational features such as DNA loops. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Natural and Strategic Generosity as Signals of Trustworthiness

    PubMed Central

    Gambetta, Diego; Przepiorka, Wojtek

    2014-01-01

    We exploit the fact that generosity and trustworthiness are highly correlated and the former can thus be a sign of the latter. Subjects decide between a generous and a mean split in a dictator game. Some of them are informed from the start that afterwards they will participate in a trust game and that their choice in the dictator game may matter; others are not informed in advance. In the trust game, before trusters decide whether or not to trust, some trustees can reveal (or conceal) only their true choice in the dictator game, while others can say to trusters, truthfully or otherwise, what they chose. We find that a generous choice made naturally by uninformed trustees and reliably revealed is more effective in persuading trusters to trust than a generous choice that could be strategic or a lie. Moreover, we find that, when they can, mean subjects lie and go on to be untrustworthy. PMID:24831097

  14. Concussions and Risk Within Cultural Contexts of Play.

    PubMed

    Torres Colón, Gabriel Alejandro; Smith, Sharia; Fucillo, Jenny

    2017-06-01

    Concussions are a type of traumatic injury caused by a jolting of the brain that disrupts normal brain function, and multiple concussions can lead to serious long-term health consequences. In this article, we examine the relationship between college students' understanding of concussions and their willingness to continue playing despite the possibility of sustaining multiple head injuries. We use a mixed-methods approach that includes participant observation, cultural domain analysis, and structured interviews. Our research finds that students hold a robust cognitive understanding of concussion yet discursively frame concussions as skeletomuscular injuries. More importantly, students affirm the importance of playing sports for themselves and others, so their decisions to risk multiple concussions must be understood within cultural and biocultural contexts of meaningful social play. We suggest that people's decisions to risk multiple head injuries should be understood as a desire for meaningful social play rather than an uninformed health risk.

  15. Impact of guided exploration and enactive exploration on self-regulatory mechanisms and information acquisition through electronic search.

    PubMed

    Debowski, S; Wood, R E; Bandura, A

    2001-12-01

    Following instruction in basic skills for electronic search, participants who practiced in a guided exploration mode developed stronger self-efficacy and greater satisfaction than those who practiced in a self-guided exploratory mode. Intrinsic motivation was not affected by exploration mode. On 2 post-training tasks, guided exploration participants produced more effective search strategies, expended less effort, made fewer errors, rejected fewer lines of search, and achieved higher performance. Relative lack of support for self-regulatory factors as mediators of exploration mode impacts was attributed to the uninformative feedback from electronic search, which causes most people to remain at a novice level and to require external guidance for development of self-efficacy and skills. Self-guided learning will be more effective on structured tasks with more informative feedback and for individuals with greater expertise on dynamic tasks.

  16. Assessing the public's views in research ethics controversies: deliberative democracy and bioethics as natural allies.

    PubMed

    Kim, Scott Y H; Wall, Ian F; Stanczyk, Aimee; De Vries, Raymond

    2009-12-01

    In a liberal democracy, policy decisions regarding ethical controversies, including those in research ethics, should incorporate the opinions of its citizens. Eliciting informed and well-considered ethical opinions can be challenging. The issues may not be widely familiar and they may involve complex scientific, legal, historical, and ethical dimensions. Traditional surveys risk eliciting superficial and uninformed opinions that may be of dubious quality for policy formation. We argue that the theory and practice of deliberative democracy (DD) is especially useful in overcoming such inadequacies. We explain DD theory and practice, discuss the rationale for using DD methods in research ethics, and illustrate in depth the use of a DD method for a longstanding research ethics controversy involving research based on surrogate consent. The potential pitfalls of DD and the means of minimizing them as well as future research directions are also discussed.

  17. Investor structure and the price-volume relationship in a continuous double auction market: An agent-based modeling perspective

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Bi, Zhengzheng; Shen, Dehua

    2017-02-01

    This paper investigates the impact of investor structure on the price-volume relationship by simulating a continuous double auction market. Connected with the underlying mechanisms of the price-volume relationship, i.e., the Mixture of Distribution Hypothesis (MDH) and the Sequential Information Arrival Hypothesis (SIAH), the simulation results show that: (1) there exists a strong lead-lag relationship between the return volatility and trading volume when the number of informed investors is close to the number of uninformed investors in the market; (2) as more and more informed investors enter the market, the lead-lag relationship becomes weaker and weaker, while the contemporaneous relationship between the return volatility and trading volume becomes more prominent; (3) when the informed investors are in the absolute majority, the market can achieve the new equilibrium immediately. Therefore, we can conclude that the investor structure is a key factor in affecting the price-volume relationship.
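    The order-matching mechanism underlying such simulations can be illustrated with a toy continuous double auction: an incoming one-share limit order trades immediately if it crosses the best opposite quote, and otherwise rests in the book. This is a minimal sketch with price priority only, not the authors' model:

```python
import heapq

class DoubleAuction:
    """Minimal continuous double auction book for one-share limit orders."""

    def __init__(self):
        self.bids = []  # max-heap of bid prices, stored negated
        self.asks = []  # min-heap of ask prices

    def submit(self, side, price):
        """Submit a limit order; return the trade price, or None if it rests."""
        if side == "buy":
            if self.asks and self.asks[0] <= price:
                return heapq.heappop(self.asks)   # cross: trade at resting ask
            heapq.heappush(self.bids, -price)
        else:
            if self.bids and -self.bids[0] >= price:
                return -heapq.heappop(self.bids)  # cross: trade at resting bid
            heapq.heappush(self.asks, price)
        return None
```

    For example, after a resting bid at 10 and a resting ask at 12, a sell at 9 crosses and trades at 10, and a subsequent buy at 13 trades at 12.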

  18. The Effect of Eye Contact Is Contingent on Visual Awareness

    PubMed Central

    Xu, Shan; Zhang, Shen; Geng, Haiyan

    2018-01-01

    The present study explored how eye contact at different levels of visual awareness influences gaze-induced joint attention. We adopted a spatial-cueing paradigm, in which an averted gaze was used as an uninformative central cue for a joint-attention task. Prior to the onset of the averted-gaze cue, either supraliminal (Experiment 1) or subliminal (Experiment 2) eye contact was presented. The results revealed a larger subsequent gaze-cueing effect following supraliminal eye contact compared to a no-contact condition. In contrast, the gaze-cueing effect was smaller in the subliminal eye-contact condition than in the no-contact condition. These findings suggest that the facilitation effect of eye contact on coordinating social attention depends on visual awareness. Furthermore, subliminal eye contact might have an impact on subsequent social attention processes that differ from supraliminal eye contact. This study highlights the need to further investigate the role of eye contact in implicit social cognition. PMID:29467703

  19. Enhancing citizen engagement in cancer screening through deliberative democracy.

    PubMed

    Rychetnik, Lucie; Carter, Stacy M; Abelson, Julia; Thornton, Hazel; Barratt, Alexandra; Entwistle, Vikki A; Mackenzie, Geraldine; Salkeld, Glenn; Glasziou, Paul

    2013-03-20

    Cancer screening is widely practiced and participation is promoted by various social, technical, and commercial drivers, but there are growing concerns about the emerging harms, risks, and costs of cancer screening. Deliberative democracy methods engage citizens in dialogue on substantial and complex problems: especially when evidence and values are important and people need time to understand and consider the relevant issues. Information derived from such deliberations can provide important guidance to cancer screening policies: citizens' values are made explicit, revealing what really matters to people and why. Policy makers can see what informed, rather than uninformed, citizens would decide on the provision of services and information on cancer screening. Caveats can be elicited to guide changes to existing policies and practices. Policies that take account of citizens' opinions through a deliberative democracy process can be considered more legitimate, justifiable, and feasible than those that don't.

  20. Educators' evaluations of children's ideas on the social exclusion of classmates with intellectual and learning disabilities.

    PubMed

    Nowicki, Elizabeth A; Brown, Jason D; Dare, Lynn

    2018-01-01

    Reasons underlying the social exclusion of children with intellectual or learning disabilities are not entirely understood. Although it is important to heed the voices of children on this issue, it is also important to consider the degree to which these ideas are informed. The present authors invited educators to evaluate the content of children's ideas on the causes of social exclusion. Educators thematically sorted and rated children's ideas on why classmates with intellectual or learning disabilities are socially excluded. Sorted data were analysed with multidimensional scaling and hierarchical cluster analysis. Six thematic clusters were identified, differing in content from those provided by children in an earlier study. Educators generally rated children's ideas about why social exclusion occurs as somewhat uninformed. Educators indicated that children need to be better informed about intellectual and learning disabilities. Limitations and implications are discussed. © 2017 John Wiley & Sons Ltd.

  1. The value of information in a multi-agent market model. The luck of the uninformed

    NASA Astrophysics Data System (ADS)

    Tóth, B.; Scalas, E.; Huber, J.; Kirchler, M.

    2007-01-01

    We present an experimental and simulated model of a multi-agent stock market driven by a double auction order matching mechanism. Studying the effect of cumulative information on the performance of traders, we find a non-monotonic relationship of net returns of traders as a function of information levels, both in the experiments and in the simulations. In particular, averagely informed traders perform worse than the non-informed, and only traders with high levels of information (insiders) are able to beat the market. The simulations and the experiments reproduce many stylized facts of tick-by-tick stock-exchange data, such as fast decay of autocorrelation of returns, volatility clustering and fat-tailed distribution of returns. These results have an important message for everyday life: they offer a possible explanation of why, on average, professional fund managers perform worse than the market index.

  2. Information from multiple modalities helps 5-month-olds learn abstract rules.

    PubMed

    Frank, Michael C; Slemmer, Jonathan A; Marcus, Gary F; Johnson, Scott P

    2009-07-01

    By 7 months of age, infants are able to learn rules based on the abstract relationships between stimuli (Marcus et al., 1999), but they are better able to do so when exposed to speech than to some other classes of stimuli. In the current experiments we ask whether multimodal stimulus information will aid younger infants in identifying abstract rules. We habituated 5-month-olds to simple abstract patterns (ABA or ABB) instantiated in coordinated looming visual shapes and speech sounds (Experiment 1), shapes alone (Experiment 2), and speech sounds accompanied by uninformative but coordinated shapes (Experiment 3). Infants showed evidence of rule learning only in the presence of the informative multimodal cues. We hypothesize that the additional evidence present in these multimodal displays was responsible for the success of younger infants in learning rules, congruent with both a Bayesian account and with the Intersensory Redundancy Hypothesis.

  3. Chimpanzee choice rates in competitive games match equilibrium game theory predictions.

    PubMed

    Martin, Christopher Flynn; Bhui, Rahul; Bossaerts, Peter; Matsuzawa, Tetsuro; Camerer, Colin

    2014-06-05

    The capacity for strategic thinking about the payoff-relevant actions of conspecifics is not well understood across species. We use game theory to make predictions about choices and temporal dynamics in three abstract competitive situations with chimpanzee participants. Frequencies of chimpanzee choices are extremely close to equilibrium (accurate-guessing) predictions, and shift as payoffs change, just as equilibrium theory predicts. The chimpanzee choices are also closer to the equilibrium prediction, and more responsive to past history and payoff changes, than two samples of human choices from experiments in which humans were also initially uninformed about opponent payoffs and could not communicate verbally. The results are consistent with a tentative interpretation of game theory as explaining evolved behavior, with the additional hypothesis that chimpanzees may retain or practice a specialized capacity to adjust strategy choice during competition to perform at least as well as, or better than, humans have.

  4. Federal funding at a time of budget austerity

    NASA Astrophysics Data System (ADS)

    Frank Press

    I served as chair of a committee of the National Academies of Sciences and Engineering that released a report last December entitled “Allocating Federal Funds for Science and Technology” [National Academy Press, 1995; Carlowicz, 1995]. The report had at least one salutary effect: it triggered a lively national discussion of science policy, much of it useful but some occasionally uninformed. The report was issued in an environment that can be characterized by the end of the Cold War, changing national needs, and the restructuring of research and educational institutions. It is also a stressful time when extraordinary opportunities for scientific progress are constrained by a future of austere R&D funding as both political parties struggle to honor their commitments to balance the federal budget in 7 years. In the interest of furthering debate, I will summarize the premises of the report, the principal recommendations, the main objections that have arisen, and our responses to them.

  5. Body integrity identity disorder beyond amputation: consent and liberty.

    PubMed

    White, Amy

    2014-09-01

    In this article, I argue that persons suffering from Body Integrity Identity Disorder (BIID) can give informed consent to surgical measures designed to treat this disorder. This is true even if the surgery seems radical or irrational to most people. The decision to have surgery made by a BIID patient is not necessarily coerced, incompetent or uninformed. If surgery for BIID is offered, there should certainly be a screening process in place to ensure informed consent. It is beyond the scope of this work to define all the conditions that should be placed on the availability of surgery. However, given the similarities between BIID and gender dysphoria and the success of such gatekeeping measures for the surgical treatment of gender dysphoria, I argue that it is reasonable for similar conditions to be in place for BIID. Once other treatment options have been tried and gatekeeping measures satisfied, a BIID patient can give informed consent to radical surgery.

  6. Continental-scale, data-driven predictive assessment of eliminating the vector-borne disease, lymphatic filariasis, in sub-Saharan Africa by 2020.

    PubMed

    Michael, Edwin; Singh, Brajendra K; Mayala, Benjamin K; Smith, Morgan E; Hampton, Scott; Nabrzyski, Jaroslaw

    2017-09-27

    There are growing demands for predicting the prospects of achieving the global elimination of neglected tropical diseases as a result of the institution of large-scale nation-wide intervention programs by the WHO-set target year of 2020. Such predictions will be uncertain due to the impacts that spatial heterogeneity and scaling effects will have on parasite transmission processes, which will introduce significant aggregation errors into any attempt aiming to predict the outcomes of interventions at the broader spatial levels relevant to policy making. We describe a modeling platform that addresses this problem of upscaling from local settings to facilitate predictions at regional levels by the discovery and use of locality-specific transmission models, and we illustrate the utility of using this approach to evaluate the prospects for eliminating the vector-borne disease, lymphatic filariasis (LF), in sub-Saharan Africa by the WHO target year of 2020 using currently applied or newly proposed intervention strategies. Methods and results: We show how a computational platform that couples site-specific data discovery with model fitting and calibration can allow both learning of local LF transmission models and simulations of the impact of interventions that take a fuller account of the fine-scale heterogeneous transmission of this parasitic disease within endemic countries. We highlight how such a spatially hierarchical modeling tool that incorporates actual data regarding the roll-out of national drug treatment programs and spatial variability in infection patterns into the modeling process can produce more realistic predictions of timelines to LF elimination at coarse spatial scales, ranging from district to country to continental levels.
Our results show that when locally applicable extinction thresholds are used, only three countries are likely to meet the goal of LF elimination by 2020 using currently applied mass drug treatments, and that switching to more intensive drug regimens, increasing the frequency of treatments, or switching to new triple drug regimens will be required if LF elimination is to be accelerated in Africa. The proportion of countries that would meet the goal of eliminating LF by 2020 may, however, reach up to 24/36 if the WHO 1% microfilaremia prevalence threshold is used and sequential mass drug deliveries are applied in countries. We have developed and applied a data-driven spatially hierarchical computational platform that uses the discovery of locally applicable transmission models in order to predict the prospects for eliminating the macroparasitic disease, LF, at the coarser country level in sub-Saharan Africa. We show that fine-scale spatial heterogeneity in local parasite transmission and extinction dynamics, as well as the exact nature of intervention roll-outs in countries, will impact the timelines to achieving national LF elimination on this continent.

  7. Micro-spatial distribution of malaria cases and control strategies at ward level in Gwanda district, Matabeleland South, Zimbabwe.

    PubMed

    Manyangadze, Tawanda; Chimbari, Moses J; Macherera, Margaret; Mukaratirwa, Samson

    2017-11-21

    Although there has been a decline in the number of malaria cases in Zimbabwe since 2010, the disease remains the biggest public health threat in the country. Gwanda district, located in Matabeleland South Province of Zimbabwe has progressed to the malaria pre-elimination phase. The aim of this study was to determine the spatial distribution of malaria incidence at ward level for improving the planning and implementation of malaria elimination in the district. The Poisson purely spatial model was used to detect malaria clusters and their properties, including relative risk and significance levels at ward level. The geographically weighted Poisson regression (GWPR) model was used to explore the potential role and significance of environmental variables [rainfall, minimum and maximum temperature, altitude, Enhanced Vegetation Index (EVI), Normalized Difference Vegetation Index (NDVI), Normalized Difference Water Index (NDWI), rural/urban] and malaria control strategies [indoor residual spraying (IRS) and long-lasting insecticide-treated nets (LLINs)] on the spatial patterns of malaria incidence at ward level. Two significant clusters (p < 0.05) of malaria cases were identified: (1) ward 24 south of Gwanda district and (2) ward 9 in the urban municipality, with relative risks of 5.583 and 4.316, respectively. The semiparametric-GWPR model with both local and global variables had higher performance based on AICc (70.882) compared to global regression (74.390) and GWPR which assumed that all variables varied locally (73.364). The semiparametric-GWPR captured the spatially non-stationary relationship between malaria cases and minimum temperature, NDVI, NDWI, and altitude at the ward level. The influence of LLINs, IRS and rural or urban did not vary and remained in the model as global terms. NDWI (positive coefficients) and NDVI (range from negative to positive coefficients) showed significant association with malaria cases in some of the wards. 
    IRS had a protective effect on malaria incidence, as expected. Malaria incidence is heterogeneous even in low-transmission zones, including those in the pre-elimination phase. The relationship between malaria cases and NDWI, NDVI, altitude, and minimum temperature may vary at the local level. The results of this study can be used in the planning and implementation of malaria control strategies at district and ward levels.
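
    The model comparison above rests on the corrected Akaike Information Criterion (AICc), which penalizes parameter count more strongly at small sample sizes. A minimal sketch of the standard formula; the log-likelihood, parameter counts, and sample size below are hypothetical illustrations, not the study's values:

```python
def aicc(log_lik, k, n):
    """Corrected AIC: AIC = 2k - 2*logL, plus a small-sample penalty."""
    aic = 2 * k - 2 * log_lik
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical comparison: same log-likelihood, n = 26 observations,
# a 5-parameter model versus an 8-parameter model.
print(aicc(-30.0, 5, 26))  # -> 73.0
print(aicc(-30.0, 8, 26))  # larger: more parameters, heavier penalty
```

    A lower AICc indicates the preferred model, which is how the semiparametric-GWPR (70.882) was ranked above the global regression (74.390) and the fully local GWPR (73.364).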

  8. Effects of Dose and Route on the Disposition and Kinetics of 1-Butyl-1-methylpyrrolidinium Chloride in Male F-344 Rats

    PubMed Central

    Knudsen, G. A.; Cheng, Y.; Kuester, R. K.; Hooth, M. J.

    2009-01-01

    Studies were conducted to characterize the effects of dose and route of administration on the disposition of 1-butyl-1-methylpyrrolidinium (BmPy-Cl) in male Fischer-344 rats. After a single oral administration of [14C]BmPy-Cl (50 mg/kg), BmPy-Cl in the blood decreased rapidly after reaching Cmax at 89.1 min, with a distribution half-life (t1/2α) of 21 min, an elimination half-life (t1/2β) of 5.6 h, and a total body clearance of 7.6 ml/min. After oral administration (50, 5, and 0.5 mg/kg), 50 to 70% of the administered radioactivity was recovered in the feces, with the remainder recovered in the urine. Serial daily oral administrations of [14C]BmPy-Cl (50 mg/kg/day for 5 days) did not result in a notable alteration in disposition or elimination. After each administration, 88 to 94% of the dose was eliminated in a 24-h period, with 63 to 76% of dose recovered in the feces. Intravenous administration of [14C]BmPy-Cl (5 mg/kg) resulted in biphasic elimination. Oral systemic bioavailability was 43.4%, approximately equal to the dose recovered in urine after oral administration (29–38%). Total dermal absorption of [14C]BmPy-Cl (5 mg/kg) was moderate when it was applied in dimethylformamide-water (34 ± 13%), variable in water (22 ± 8%), or minimal in ethanol-water (13 ± 1%) vehicles. Urine was the predominant route of elimination regardless of vehicle. Only parent [14C]BmPy-Cl was detected in the urine after all doses and routes of administration. BmPy-Cl was found to be a substrate for (Kt = 37 μM) and inhibitor of (IC50/tetraethylammonium = 0.5 μM) human organic cation transporter 2. In summary, BmPy-Cl is moderately absorbed, extracted by the kidney, and eliminated in the urine as parent compound, independent of dose, number, or route of administration. PMID:19704025
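
    The biphasic elimination reported above corresponds to a standard two-compartment biexponential decay, C(t) = A·exp(-αt) + B·exp(-βt). The rate constants below follow from the reported half-lives; the amplitudes A and B are hypothetical, chosen only for illustration:

```python
import math

# Rate constants from the reported half-lives (converted to per-hour).
alpha = math.log(2) / (21 / 60)   # distribution phase, t1/2 = 21 min
beta = math.log(2) / 5.6          # elimination phase, t1/2 = 5.6 h

def conc(t_hr, A=10.0, B=2.0):
    """Biexponential blood concentration; A and B are hypothetical amplitudes."""
    return A * math.exp(-alpha * t_hr) + B * math.exp(-beta * t_hr)

print(conc(0.0))   # -> 12.0 (A + B at time zero)
print(conc(10.0))  # late times: essentially the slow elimination phase only
```

    The early decline is dominated by the fast distribution phase, the late decline by the slower elimination phase, which is why a single half-life cannot describe the whole curve.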

  9. Dose requirements of alfentanil to eliminate autonomic responses during rapid-sequence induction with thiopental 4 mg/kg and rocuronium 0.6 mg/kg.

    PubMed

    Abou-Arab, Mohammad H; Rostrup, Morten; Heier, Tom

    2016-12-01

    Opioids are an integral part of anesthesia induction, but information on optimal dosing is limited. We aimed to determine doses of alfentanil needed to eliminate increases in 5 autonomic response variables (plasma concentrations of epinephrine, norepinephrine and vasopressin, arterial blood pressure [ABP], and heart rate) during rapid-sequence induction of anesthesia with thiopental 4 mg/kg and rocuronium 0.6 mg/kg. Prospective, randomized, observer-blinded, interventional clinical study. Large academic institution. Eighty-four healthy patients, aged 18 to 55 years, received 1 of 7 assessor-blinded doses of alfentanil (0, 10, 20, 30, 40, 50, and 60 μg/kg) together with thiopental 4 mg/kg and rocuronium 0.6 mg/kg, administered in rapid succession (15 seconds). Laryngoscopy was initiated 40 seconds after rocuronium, and tracheal intubation was concluded within 15 seconds thereafter. An indwelling radial artery catheter was used for hemodynamic monitoring and blood sampling. Relationships between alfentanil dose and response variables were tested with linear regression, and the influence of covariates (sex, body weight, and age) was determined. The alfentanil dose needed to prevent increases in ABP >10% above baseline with 95% probability was estimated with logistic regression. Significant relationships were determined between alfentanil dose and response variables. No clinically relevant influence of covariates was found. Alfentanil 55 μg/kg was needed to prevent increases in ABP postintubation >10% above baseline with 95% probability. One individual needed a bolus of vasopressor postintubation. Optimal control of autonomic responses during rapid-sequence induction was achieved with clinically relevant doses of alfentanil in healthy patients anesthetized with thiopental 4 mg/kg and rocuronium 0.6 mg/kg. Copyright © 2016 Elsevier Inc. All rights reserved.
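
    The dose estimate "with 95% probability" comes from inverting a fitted logistic dose-response curve: find the dose at which the probability of a >10% ABP rise falls to 5%. A minimal sketch; the coefficients b0 and b1 are hypothetical values chosen for illustration, not the study's fit:

```python
import math

# Hypothetical logistic coefficients: P(ABP rise > 10%) = 1 / (1 + exp(-(b0 + b1*dose)))
b0, b1 = 2.5, -0.099

def p_response(dose):
    """Probability of a >10% ABP rise at a given alfentanil dose (ug/kg)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * dose)))

def dose_for_probability(p):
    """Invert the logistic curve: dose at which P(response) equals p."""
    return (math.log(p / (1 - p)) - b0) / b1

# Dose preventing a >10% ABP rise with 95% probability, i.e. P(response) = 0.05
print(round(dose_for_probability(0.05), 1))  # ~55 ug/kg with these coefficients
```

    The same inversion works for any target probability; the clinical question fixes p = 0.05.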

  10. Prediction of surface roughness in turning of Ti-6Al-4V using cutting parameters, forces and tool vibration

    NASA Astrophysics Data System (ADS)

    Sahu, Neelesh Kumar; Andhare, Atul B.; Andhale, Sandip; Raju Abraham, Roja

    2018-04-01

    The present work deals with prediction of surface roughness using cutting parameters along with in-process measured cutting force and tool vibration (acceleration) during turning of Ti-6Al-4V with cubic boron nitride (CBN) inserts. A full factorial design is used for design of experiments, with cutting speed, feed rate and depth of cut as design variables. A prediction model for surface roughness is developed using response surface methodology (RSM) with cutting speed, feed rate, depth of cut, resultant cutting force and acceleration as control variables. Analysis of variance (ANOVA) is performed to find significant terms in the model; insignificant terms are removed after statistical testing using a backward elimination approach. The effect of each control variable on surface roughness is also studied. A correlation coefficient (R²pred) of 99.4% shows that the model explains the experimental results well and remains robust when factors are adjusted, added, or removed. The model is validated with five fresh experiments and measured force and acceleration values. The average absolute error between the RSM model and the experimentally measured surface roughness is 10.2%. Additionally, an artificial neural network (ANN) model is developed for prediction of surface roughness, and its predictions are compared with those of the modified regression model. Both the RSM model and the ANN (average absolute error 7.5%) predict roughness with more than 90% accuracy. The results show that including cutting force and vibration yields better surface roughness predictions than cutting parameters alone, and that the ANN outperforms the RSM model.
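
    The "average absolute error" used to validate both models is a mean absolute percentage error over the validation runs. A minimal sketch of the metric; the five measured/predicted roughness pairs below are invented for illustration, not the paper's data:

```python
# Hypothetical validation pairs of surface roughness values (micrometres).
measured = [1.10, 0.95, 1.40, 1.25, 1.05]
predicted = [1.18, 0.90, 1.30, 1.33, 1.00]

def mape(actual, pred):
    """Mean absolute percentage error between measured and predicted values."""
    return 100.0 * sum(abs(a - p) / a for a, p in zip(actual, pred)) / len(actual)

print(round(mape(measured, predicted), 1))  # -> 6.2 (percent)
```

    Reported this way, the paper's RSM model scores 10.2% and the ANN 7.5%, i.e. both predict roughness with better than 90% accuracy.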

  11. Optimization of tocopherol concentration process from soybean oil deodorized distillate using response surface methodology.

    PubMed

    Ito, Vanessa Mayumi; Batistella, César Benedito; Maciel, Maria Regina Wolf; Maciel Filho, Rubens

    2007-04-01

    Soybean oil deodorized distillate is a product derived from the refining process and is rich in high value-added products. The recovery of these unsaponifiable fractions is of great commercial interest, because in many cases the "valuable products" have vitamin activity, such as tocopherols (vitamin E), or anticarcinogenic properties, such as sterols. Molecular distillation has great potential for concentrating tocopherols, as it operates at very low temperatures owing to the high vacuum, requires only a short separation time, and uses no solvents. It can therefore be used to separate and purify thermosensitive materials such as vitamins. In this work, the molecular distillation process was applied to tocopherol concentration, and response surface methodology was used to optimize free fatty acid (FFA) elimination and tocopherol concentration in the residue and distillate streams, both of which are products of the molecular distiller. The independent variables studied were feed flow rate (F) and evaporator temperature (T), because previous experience indicates they are the most important process variables. The experimental range was 4-12 mL/min for F and 130-200 degrees C for T. Feed flow rate and evaporator temperature proved to be important operating variables in FFA elimination. To decrease the loss of FFA in the residue stream, the operating range should be shifted toward higher evaporator temperature and lower feed flow rate; the D/F ratio likewise increases with increasing evaporator temperature and decreasing feed flow rate. High concentrations of tocopherols were obtained in the residue stream at low feed flow rates and high evaporator temperatures. These results were obtained from experiments based on a statistical experimental design.

  12. The role of SST variability in the simulation of the MJO

    NASA Astrophysics Data System (ADS)

    Stan, Cristiana

    2017-12-01

    The sensitivity of the Madden-Julian Oscillation to high-frequency variability (period 1-5 days) of sea surface temperature (SST) is investigated using numerical experiments with the super-parameterized Community Climate System Model. The findings of this study emphasize the importance of air-sea interactions in the simulation of the MJO, and stress the necessity of an accurate representation of ocean variability on short time scales. Eliminating 1-5-day variability of surface boundary forcing reduces the intraseasonal variability (ISV) of the tropics during the boreal winter. The ISV spectrum becomes close to the red noise background spectrum. The variability of atmospheric circulation shifts to longer time scales. In the absence of high-frequency variability of SST the MJO power gets confined to wavenumbers 1-2 and the magnitude of westward power associated with Rossby waves increases. The MJO convective activity propagating eastward from the Indian Ocean does not cross the Maritime Continent, and convection in the western Pacific Ocean is locally generated. In the Indian Ocean convection tends to follow the meridional propagation of SST anomalies. The response of the MJO to 1-5-day variability in the SST is through the charging and discharging mechanisms contributing to the atmospheric column moist static energy before and after peak MJO convection. Horizontal advection and surface fluxes show the largest sensitivity to SST perturbations.

  13. Flight data analysis and further development of variable-conductance heat pipes. [for aircraft control

    NASA Technical Reports Server (NTRS)

    Enginer, J. E.; Luedke, E. E.; Wanous, D. J.

    1976-01-01

    Continuing efforts toward large gains in heat-pipe performance are reported. It was found that gas-controlled variable-conductance heat pipes can perform reliably for long periods in space and effectively provide temperature stabilization for spacecraft electronics. A solution was formulated that allows the control gas to vent through arterial heat-pipe walls, thus eliminating the problem of arterial failure under load caused by trace impurities of noncondensable gas trapped in an arterial bubble during priming. This solution functions well in zero gravity. Another solution was found that allows priming at a much lower fluid charge. A high-capacity heat pipe was fabricated that provides close temperature control of the heat source independent of large variations in sink temperature.

  14. Pharmacometric Approaches to Personalize Use of Primarily Renally Eliminated Antibiotics in Preterm and Term Neonates.

    PubMed

    Wilbaux, Mélanie; Fuchs, Aline; Samardzic, Janko; Rodieux, Frédérique; Csajka, Chantal; Allegaert, Karel; van den Anker, Johannes N; Pfister, Marc

    2016-08-01

    Sepsis remains a major cause of mortality and morbidity in neonates, and, as a consequence, antibiotics are the most frequently prescribed drugs in this vulnerable patient population. Growth and dynamic maturation processes during the first weeks of life result in large inter- and intrasubject variability in the pharmacokinetics (PK) and pharmacodynamics (PD) of antibiotics. In this review we (1) summarize the available population PK data and models for primarily renally eliminated antibiotics, (2) discuss quantitative approaches to account for effects of growth and maturation processes on drug exposure and response, (3) evaluate current dose recommendations, and (4) identify opportunities to further optimize and personalize dosing strategies of these antibiotics in preterm and term neonates. Although population PK models have been developed for several of these drugs, exposure-response relationships of primarily renally eliminated antibiotics in these fragile infants are not well understood, monitoring strategies remain inconsistent, and consensus on optimal, personalized dosing of these drugs in these patients is absent. Tailored PK/PD studies and models are useful to better understand relationships between drug exposures and microbiological or clinical outcomes. Pharmacometric modeling and simulation approaches facilitate quantitative evaluation and optimization of treatment strategies. National and international collaborations and platforms are essential to standardize and harmonize not only studies and models but also monitoring and dosing strategies. Simple bedside decision tools assist clinical pharmacologists and neonatologists in their efforts to fine-tune and personalize the use of primarily renally eliminated antibiotics in term and preterm neonates. © 2016, The American College of Clinical Pharmacology.

  15. Elimination of ascorbic acid after high-dose infusion in prostate cancer patients: a pharmacokinetic evaluation.

    PubMed

    Nielsen, Torben K; Højgaard, Martin; Andersen, Jon T; Poulsen, Henrik E; Lykkesfeldt, Jens; Mikines, Kári J

    2015-04-01

    Treatment with high-dose intravenous (IV) ascorbic acid (AA) is used in complementary and alternative medicine for various conditions including cancer. Cytotoxicity to cancer cell lines has been observed with millimolar concentrations of AA. Little is known about the pharmacokinetics of high-dose IV AA. The purpose of this study was to assess the basic kinetic variables in human beings over a relevant AA dosing interval for proper design of future clinical trials. Ten patients with metastatic prostate cancer were treated for 4 weeks with fixed AA doses of 5, 30 and 60 g. AA was measured consecutively in plasma and indicated first-order elimination kinetics throughout the dosing range with supra-physiological concentrations. The target dose of 60 g AA IV produced a peak plasma AA concentration of 20.3 mM. Elimination half-life was 1.87 hr (mean, S.D. ± 0.40), volume of distribution 0.19 L/kg (S.D. ±0.05) and clearance rate 6.02 L/hr (100 mL/min). No differences in pharmacokinetic parameters were observed between weeks/doses. A relatively fast first-order elimination with a half-life of about 2 hr makes it impossible to maintain AA concentrations in the potentially cytotoxic range after the end of infusion in prostate cancer patients with normal kidney function. We propose a regimen with a bolus loading followed by a maintenance infusion based on the calculated clearance. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
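
    The reported parameters can be cross-checked for internal consistency, since first-order elimination implies CL = k·Vd with k = ln 2 / t½. A sketch assuming a roughly 85 kg patient (per-patient body weight is not reported, so the weight below is an assumption):

```python
import math

t_half_hr = 1.87        # reported elimination half-life (hours)
vd_l_per_kg = 0.19      # reported volume of distribution (L/kg)
body_weight_kg = 85.0   # hypothetical patient weight (assumption)

# First-order kinetics: elimination rate constant k = ln(2) / t_half,
# and clearance CL = k * Vd.
k = math.log(2) / t_half_hr
cl_l_per_hr = k * vd_l_per_kg * body_weight_kg

print(round(cl_l_per_hr, 1))  # -> 6.0, close to the reported 6.02 L/hr
```

    The agreement with the reported 6.02 L/hr (about 100 mL/min) suggests the three quoted parameters are mutually consistent under first-order kinetics.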

  16. Advanced diesel engine component development program, tasks 4-14

    NASA Astrophysics Data System (ADS)

    Kaushal, Tony S.; Weber, Karen E.

    1994-11-01

    This report summarizes the Advanced Diesel Engine Component Development (ADECD) Program to develop and demonstrate critical technology needed to advance the heavy-duty low heat rejection engine concept. Major development activities reported are the design, analysis, and fabrication of monolithic ceramic components; vapor phase and solid film lubrication; electrohydraulic valve actuation; and high pressure common rail injection. An advanced single cylinder test bed was fabricated as a laboratory tool in studying these advanced technologies. This test bed simulates the reciprocator for a system having no cooling system, turbo compounding, Rankine bottoming cycle, common rail injection, and variable valve actuation to achieve fuel consumption of 160 g/kW-hr (.26 lb/hp-hr). The advanced concepts were successfully integrated into the test engine. All ceramic components met their functional and reliability requirements. The firedeck, cast-in-place ports, valves, valve guides, piston cap, and piston ring were made from silicon nitride. Breakthroughs required to implement a 'ceramic' engine included the fabrication of air-gap cylinder heads, elimination of compression gaskets, machining of ceramic valve seats within the ceramic firedeck, fabrication of cast-in-place ceramic port liners, implementation of vapor phase lubrication, and elimination of the engine coolant system. Silicon nitride valves were successfully developed to meet several production abuse test requirements and incorporated into the test bed with a ceramic valve guide and solid film lubrication. The ADECD cylinder head features ceramic port shields to increase insulation and exhaust energy recovery. The combustion chamber includes a ceramic firedeck and piston cap. The tribological challenge posed by top ring reversal temperatures of 550 C was met through the development of vapor phase lubrication using tricresyl phosphate at the ring-liner interface. 
A solenoid-controlled, variable valve actuation system that eliminated the conventional camshaft was demonstrated on the test bed. High pressure fuel injection via a common rail system was also developed to reduce particulate emissions.

  17. Advanced diesel engine component development program, tasks 4-14

    NASA Technical Reports Server (NTRS)

    Kaushal, Tony S.; Weber, Karen E.

    1994-01-01

    This report summarizes the Advanced Diesel Engine Component Development (ADECD) Program to develop and demonstrate critical technology needed to advance the heavy-duty low heat rejection engine concept. Major development activities reported are the design, analysis, and fabrication of monolithic ceramic components; vapor phase and solid film lubrication; electrohydraulic valve actuation; and high pressure common rail injection. An advanced single cylinder test bed was fabricated as a laboratory tool in studying these advanced technologies. This test bed simulates the reciprocator for a system having no cooling system, turbo compounding, Rankine bottoming cycle, common rail injection, and variable valve actuation to achieve fuel consumption of 160 g/kW-hr (.26 lb/hp-hr). The advanced concepts were successfully integrated into the test engine. All ceramic components met their functional and reliability requirements. The firedeck, cast-in-place ports, valves, valve guides, piston cap, and piston ring were made from silicon nitride. Breakthroughs required to implement a 'ceramic' engine included the fabrication of air-gap cylinder heads, elimination of compression gaskets, machining of ceramic valve seats within the ceramic firedeck, fabrication of cast-in-place ceramic port liners, implementation of vapor phase lubrication, and elimination of the engine coolant system. Silicon nitride valves were successfully developed to meet several production abuse test requirements and incorporated into the test bed with a ceramic valve guide and solid film lubrication. The ADECD cylinder head features ceramic port shields to increase insulation and exhaust energy recovery. The combustion chamber includes a ceramic firedeck and piston cap. The tribological challenge posed by top ring reversal temperatures of 550 C was met through the development of vapor phase lubrication using tricresyl phosphate at the ring-liner interface. 
A solenoid-controlled, variable valve actuation system that eliminated the conventional camshaft was demonstrated on the test bed. High pressure fuel injection via a common rail system was also developed to reduce particulate emissions.

  18. Continuous Process Improvement at Tinker Air Logistics Complex

    DTIC Science & Technology

    2013-03-01

    principles of Lean thinking published by Womack and Jones (US Air Force, 2008). The goal of AFSO21 was to eliminate waste from organizational...survey because we did not think a survey could accurately capture the depth and of the independent variables. Intangible elements of leadership and...workers within the firm itself. People are at the heart of the TPS. One of Ohno’s 14 principles of the Toyota Way is “Develop exceptional people and

  19. Control of Gas Tungsten Arc welding pool shape by trace element addition to the weld pool

    DOEpatents

    Heiple, C.R.; Burgardt, P.

    1984-03-13

    An improved process for Gas Tungsten Arc welding maximizes the depth/width ratio of the weld pool by adding a sufficient amount of a surface active element to insure inward fluid flow, resulting in deep, narrow welds. The process is especially useful to eliminate variable weld penetration and shape in GTA welding of steels and stainless steels, particularly by using a sulfur-doped weld wire in a cold wire feed technique.

  20. Photo-z-SQL: Photometric redshift estimation framework

    NASA Astrophysics Data System (ADS)

    Beck, Róbert; Dobos, László; Budavári, Tamás; Szalay, Alexander S.; Csabai, István

    2017-04-01

    Photo-z-SQL is a flexible template-based photometric redshift estimation framework that can be seamlessly integrated into a SQL database (or DB) server and executed on demand in SQL. The DB integration eliminates the need to move large photometric datasets outside a database for redshift estimation, and uses the computational capabilities of DB hardware. Photo-z-SQL performs both maximum likelihood and Bayesian estimation and handles inputs of variable photometric filter sets and corresponding broad-band magnitudes.
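
    Template-based maximum-likelihood estimation of the kind Photo-z-SQL performs amounts to minimizing chi-square between observed broad-band fluxes and template-predicted fluxes over a grid of templates and redshifts. A minimal Python sketch of that core idea; the flux values and template names are invented, and Photo-z-SQL itself runs this inside a SQL database rather than in Python:

```python
def chi2(observed, errors, model):
    """Chi-square misfit between observed and model fluxes."""
    return sum(((o - m) / e) ** 2 for o, m, e in zip(observed, model, errors))

# Hypothetical observed fluxes and uncertainties in three bands.
observed = [2.1, 3.0, 4.2]
errors = [0.2, 0.2, 0.3]

# Hypothetical grid: (template_id, redshift) -> predicted band fluxes.
model_grid = {
    ("elliptical", 0.3): [2.0, 3.1, 4.1],
    ("elliptical", 0.5): [1.5, 2.4, 3.2],
    ("spiral", 0.3): [2.6, 2.7, 3.9],
}

# Maximum-likelihood estimate: the grid point with minimum chi-square.
best = min(model_grid, key=lambda key: chi2(observed, errors, model_grid[key]))
print(best)  # the best-fitting (template, redshift) pair
```

    The Bayesian mode mentioned in the abstract would instead weight each grid point by exp(-chi2/2) times a prior and marginalize, rather than taking the single minimum.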

  1. Boolean Reasoning and Informed Search in the Minimization of Logic Circuits

    DTIC Science & Technology

    1992-03-01

    motivation of this project as well as a definition of the problem. The scope of the effort was presented, as well as the assumptions found to be...in the resulting formula than the expansion-based product operation. The primary motive for using the expansion-based product versus a cross-product...eliminant is formed is the least-binate-variable heuristic described in Chapter 2. The motivation for this heuristic was illustrated in Example 3.3. The

  2. Variable photosynthetic units, energy transfer and light-induced evolution of hydrogen in algae and bacteria.

    NASA Technical Reports Server (NTRS)

    Gaffron, H.

    1971-01-01

    The present state of knowledge regarding the truly photochemical reactions in photosynthesis is considered. Nine-tenths of the available knowledge is of a biochemical nature. Questions regarding the activities of the chlorophyll system are examined. The simplest photochemical response observed in living hydrogen-adapted algal cells is the release of molecular hydrogen, which continues even after all other known natural reactions have been eliminated either by heating or the action of poisons.

  3. Quantum mechanical generalized phase-shift approach to atom-surface scattering: a Feshbach projection approach to dealing with closed channel effects.

    PubMed

    Maji, Kaushik; Kouri, Donald J

    2011-03-28

We have developed a new method for solving quantum dynamical scattering problems, using the time-independent Schrödinger equation (TISE), based on a novel method to generalize a "one-way" quantum mechanical wave equation, impose correct boundary conditions, and eliminate exponentially growing closed channel solutions. The approach is readily parallelized to achieve approximate N² scaling, where N is the number of coupled equations. The full two-way nature of the TISE is included while propagating the wave function in the scattering variable and the full S-matrix is obtained. The new algorithm is based on a "Modified Cayley" operator splitting approach, generalizing earlier work where the method was applied to the time-dependent Schrödinger equation. All scattering variable propagation approaches to solving the TISE involve solving a Helmholtz-type equation, and for more than one degree of freedom, these are notoriously ill-behaved, due to the unavoidable presence of exponentially growing contributions to the numerical solution. Traditionally, the method used to eliminate exponential growth has posed a major obstacle to the full parallelization of such propagation algorithms. We stabilize by using the Feshbach projection operator technique to remove all the nonphysical exponentially growing closed channels, while retaining all of the propagating open channel components, as well as exponentially decaying closed channel components.

  4. WRC bulletin. A review of underclad cracking in pressure-vessel components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinckier, A.G.; Pense, A.W.

    1974-01-01

This review of cracking underneath weld cladding determines what factors contribute to the condition and outlines means for alleviating or eliminating it. Considerable data on manufacture, heat treatment, and cladding of heavy-section pressure-vessel steels for nuclear service are also included. Three factors in combination promote underclad cracking: a susceptible microstructure, a favorable residual-stress pattern, and a thermal treatment bringing the steel into a critical temperature region (600-650°C) where creep ductility is low. High-heat-input weld-overlay cladding produces the susceptible microstructure and residual-stress pattern, and postweld heat treatment produces the critical temperature. Most underclad cracking was found in SA508 Class 2 steel forgings clad with one-layer submerged-arc strip electrodes or multi-electrode processes. It was not produced in SA533 Grade B plate or when multilayer overlay processes were used. Underclad cracking can be reduced or eliminated by a two-layer cladding technique, by controlling welding process variables (low heat input), by renormalizing the sensitive HAZ region prior to heat treatment, by use of nonsusceptible materials, or by eliminating the postweld heat treatment. Results of a questionnaire survey are also included. 50 references. (DLC)

  5. Elimination of endogenous aberrant kappa chain transcripts from sp2/0-derived hybridoma cells by specific ribozyme cleavage: utility in genetic therapy of HIV-1 infections.

    PubMed Central

    Duan, L; Pomerantz, R J

    1994-01-01

The pooled degenerate-primer polymerase chain reaction (PCR) technology is now widely used in the amplification and cloning of murine hybridoma-specific immunoglobulin gene cDNAs. The design of primers is mainly based on the highly conserved 5' terminus of immunoglobulin gene variable regions and the constant region in the 3' terminus. Of note, most murine hybridoma cell lines are derived from the Sp2/0 cell line, which has been demonstrated to express endogenous aberrant kappa chains (abV kappa). This high-level endogenous abV kappa mixes with specific kappa chains in the hybridomas and interferes with the efficiency of the reverse transcriptase (RT)-PCR cloning strategy. In this report, during the cloning of murine anti-human immunodeficiency virus type I (HIV-1) hybridoma immunoglobulin cDNAs, a specific primer-PCR screening system was developed, based on the abV kappa complementarity-determining region (CDR), to eliminate abV kappa-carrying plasmids. Furthermore, an abV kappa sequence-specific ribozyme was developed and packaged in a retroviral expression vector system. This abV kappa ribozyme can be transduced into different murine hybridomas and expressed intracellularly to potently eliminate endogenous abV kappa RNA. PMID:7816635

  6. Evaluation of the whole body physiologically based pharmacokinetic (WB-PBPK) modeling of drugs.

    PubMed

    Munir, Anum; Azam, Shumaila; Fazal, Sahar; Bhatti, A I

    2018-08-14

Physiologically based pharmacokinetic (PBPK) modeling is a supporting tool in drug discovery and development. Simulations produced by these models help to save time and aid in examining the effects of different variables on the pharmacokinetics of drugs. For this purpose, Sheila and Peters suggested a PBPK model capable of performing simulations to study a given drug's absorption. There is a need to extend this model to the whole body, covering all the other processes, namely distribution, metabolism, and elimination, besides absorption. The aim of this study is to propose a WB-PBPK model by integrating absorption, distribution, metabolism, and elimination processes with the existing PBPK model. Absorption, distribution, metabolism, and elimination models are designed, integrated with the PBPK model, and validated. For validation purposes, clinical records of a few drugs were collected from the literature. The developed WB-PBPK model is affirmed by comparing the simulations produced by the model against the collected clinical data. It is proposed that the WB-PBPK model may be used in the pharmaceutical industry to create pharmacokinetic profiles of drug candidates for better outcomes, as it is an advanced PBPK model and creates comprehensive PK profiles for drug ADME in concentration-time plots. Copyright © 2018 Elsevier Ltd. All rights reserved.
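
    To make the compartmental ideas concrete, here is a minimal one-compartment sketch (far simpler than a whole-body PBPK model) with first-order absorption and elimination, using the closed-form Bateman equation for plasma concentration. All parameter values are illustrative, not taken from the study:

    ```python
    import math

    def concentration(t, dose=100.0, F=0.9, ka=1.5, ke=0.2, V=40.0):
        """Plasma concentration for a one-compartment model with first-order
        absorption rate ka and elimination rate ke (Bateman equation).
        F = bioavailable fraction, V = volume of distribution."""
        return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

    # Concentration rises during absorption, peaks, then declines by elimination.
    times = [0.5 * i for i in range(49)]          # 0 to 24 h in 0.5 h steps
    curve = [concentration(t) for t in times]
    tmax = times[curve.index(max(curve))]         # grid time of peak concentration
    ```

    With these illustrative parameters the analytic peak time ln(ka/ke)/(ka-ke) ≈ 1.55 h, so the 0.5 h grid peaks at 1.5 h.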

  7. On rules of induction and the raven paradox in Bayesian confirmation theory

    NASA Astrophysics Data System (ADS)

    Afshar, H. M.; Sunehag, P.

    2014-12-01

Confirmation theory studies how one can confirm a universal statement like "All ravens are black". Early authors discussed how one's degree of belief in such a statement should change with new evidence and suggested various rules of induction. Nicod's Condition (NC) says that the claim that all F are G is supported by observing a previously unseen object that is both F and G. Hempel pointed out that NC implies the paradoxical conclusion that observing a white sock supports the claim that all ravens are black. In our time, confirmation is studied using subjective conditional probability as degrees of belief, with Kolmogorov's axioms as the main rules of induction. The old rules and problems of induction are, however, still studied within the probabilistic framework. We consider a setting where the number of individuals having a particular property is given and find that NC can contradict a simpler principle, namely projectability (PJ), which says that if we observe an object with property ψ then other objects are also more likely to have property ψ. We find that intuition can side with either one depending on the situation. We suggest that a more appropriate formalization of the intuition behind NC is the weaker principle of reasoning by analogy (RA). RA says that if we see an object that is F and G and we know that another object is F, then it is more likely to also be G. Projectability might still be considered valid for relatively uninformed a priori beliefs. If one decides that a principle like projectability is valid for confirmation in an uninformed situation, it provides a test that an a priori distribution must satisfy, thereby decreasing the arbitrariness of the choice of measure. Further, by considering background knowledge saying only how many ravens there are in the world, we conclude that an agent who accepts the projectability principle will not increase the belief that all ravens are black after observing a white sock. Most Bayesian approaches have so far derived a small increment in confirmation by relying on some particular a priori measure, a conclusion rejected by common sense. We here resolve the contradiction by formally identifying the natural background knowledge considered and the inductive rule that the a priori belief should comply with in the situation at hand. The original paradox is dispelled by rejecting NC.
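
    The finite-population setting can be made concrete with a small Bayesian computation: given n ravens and a uniform prior over how many are black, observing one uniformly sampled raven that turns out black raises the belief that all are black from 1/(n+1) to 2/(n+1). This is a generic illustration of probabilistic confirmation, not the paper's own formalism:

    ```python
    from fractions import Fraction

    def update_on_black_raven(n, prior=None):
        """Posterior over k = number of black ravens among n, after sampling
        one raven uniformly at random and finding it black."""
        if prior is None:  # uniform prior over k = 0..n
            prior = [Fraction(1, n + 1)] * (n + 1)
        likelihood = [Fraction(k, n) for k in range(n + 1)]  # P(black sample | k)
        unnorm = [p * l for p, l in zip(prior, likelihood)]
        z = sum(unnorm)
        return [u / z for u in unnorm]

    n = 10
    post = update_on_black_raven(n)
    # post[n] is the belief that *all* n ravens are black: 2/(n+1).
    ```

    The exact rational arithmetic makes the closed-form result 2/(n+1) visible without floating-point noise.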

  8. The treatment of parental height as a biological factor in studies of birth weight and childhood growth

    PubMed Central

    Spencer, N; Logan, S

    2002-01-01

    Parental height is frequently treated as a biological variable in studies of birth weight and childhood growth. Elimination of social variables from multivariate models including parental height as a biological variable leads researchers to conclude that social factors have no independent effect on the outcome. This paper challenges the treatment of parental height as a biological variable, drawing on extensive evidence for the determination of adult height through a complex interaction of genetic and social factors. The paper firstly seeks to establish the importance of social factors in the determination of height. The methodological problems associated with treatment of parental height as a purely biological variable are then discussed, illustrated by data from published studies and by analysis of data from the 1958 National Childhood Development Study (NCDS). The paper concludes that a framework for studying pathways to pregnancy and childhood outcomes needs to take account of the complexity of the relation between genetic and social factors and be able to account for the effects of multiple risk factors acting cumulatively across time and across generations. Illustrations of these approaches are given using NCDS data. PMID:12193422

  9. Environmental strategies: A case study of systematic evaluation

    NASA Astrophysics Data System (ADS)

    Sherman, Douglas J.; Garès, Paul A.

    1982-09-01

A major problem facing environmental managers is the necessity to effectively evaluate management alternatives. Traditional environmental assessments have emphasized the use of economic analyses. These approaches are often deficient due to difficulty in assigning dollar values to environmental systems and to social amenities. A more flexible decision-making model has been developed to analyze management options for coping with beach erosion problems at the Sandy Hook Unit of Gateway National Recreation Area in New Jersey. The model comprises decision-making variables formulated from a combination of environmental and management criteria, and it has an accept-reject format in which the management options are analyzed in terms of the variables. Through logical ordering of the insertion of the variables into the model, stepwise elimination of alternatives is possible. A hierarchy of variables is determined by estimating the work required to complete an assessment of the alternatives for each variable. The assessment requiring the least work is performed first so that the more difficult evaluations are limited to fewer alternatives. The application of this approach is illustrated with a case study in which beach protection alternatives were evaluated for the United States National Park Service.

  10. Influence of Cushioning Variables in the Workplace and in the Family on the Probability of Suffering Stress.

    PubMed

    Gonzalo, David Cárdenas

    2016-09-01

    Stress at work and in the family is a very common issue in our society that generates many health-related problems. During recent years, numerous studies have sought to define the term stress, raising many contradictions that various authors have studied. Other authors have attempted to establish some criteria, in subjective and not very quantitative ways, in an attempt to reduce and even to eliminate stressors and their effects at work and in the family context. The purpose of this study was to quantify so-called cushioning variables, such as control, social support, home/work life conciliation, and even sports and leisure activities, with the purpose of, as much as possible, reducing the negative effects of stress, which seriously affects the health of workers. The study employs data from the Fifth European Working Conditions Survey, in which nearly 44,000 interviewees from 34 countries in the European Union participated. We constructed a probabilistic model based on a Bayesian network, using variables from both the workplace and the family, the aforementioned cushioning variables, as well as the variable stress. If action is taken on the above variables, then the probabilities of suffering high levels of stress may be reduced. Such action may improve the quality of life of people at work and in the family.

  11. Factors that affect the fatigue strength of power transmission shafting

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1984-01-01

    A long standing objective in the design of power transmission shafting is to eliminate excess shaft material without compromising operational reliability. A shaft design method is presented which accounts for variable amplitude loading histories and their influence on limited life designs. The effects of combined bending and torsional loading are considered along with a number of application factors known to influence the fatigue strength of shafting materials. Among the factors examined are surface condition, size, stress concentration, residual stress and corrosion fatigue.

  12. Hypersurface Insertion Window for Long Term Orbital Stability of Artificial Satellites About the Planet Venus

    DTIC Science & Technology

    1988-12-01

    Conversion of the Geopotential into the Modified Orbital Elements 83 Appendix C: Useful Derivatives for the Geopotential Calculations 87 Appendix D...replaced by two equinoctial elements , h and k (from a coordinate system with singularities at i = x and for rectilinear orbits ). Also, for long term 3...0. 10 and 0.55 i 15.5) a more well behaved set of variables will be used: two of the equinoctial elements , h and k. These elements eliminate the

  13. Prayer: folk home remedy vs. spiritual practice.

    PubMed

    Easom, Leisa R; Easom, Linda R

    2006-01-01

    A multidisciplinary review of the literature reveals that prayer, in a multicultural context, may be viewed as both a folk home remedy and a practice of spirituality. Understanding cultural differences and similarities of the use of prayer as a variable for health promotion may have implications for tailoring treatment approaches to eliminate disparities in providing care to clients of diverse cultural backgrounds. This paper presents these similarities and differences within the cultural beliefs of the White, African-American, and Hispanic populations.

  14. The Effects of Climatological and Transient Wind Forcing on Eddy Generation in the California Current System

    DTIC Science & Technology

    1989-09-01

    most dynamically important area in the coastal upwelling region (Philander and Yoon, 1982; Allen, 1980) and as such, the treatment of this region becomes...the area of the model domain (Nelson, 1977). This treatment of the wind stress eliminates all spatial variability in the nearshore region, reducing...that of Experiment 3, again using data from Nelson (1977). The difference in the two experiments lies in the treatment of the wind field next to the

  15. Highly-reliable fly-by-light/power-by-wire technology

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.

    1993-01-01

    This paper presents in viewgraph format an overview of the program at NASA Langley Research Center to develop fly-by-light/power-by-wire (FBL/PBW) technology. Benefits of FBL/PBW include intrinsic electromagnetic interference (EMI) immunity and lifetime immunity to signal EMI of optics; simplified certification; the elimination of hydraulics, engine bleed air, and variable speed, constant frequency drive; and weight and volume reduction. The paper summarizes a study on the electromagnetic environmental effects on FBL/PBW systems. The paper concludes with FY 1993 plans.

  16. An Analysis of the United States Special Operations Command’s Acquisition Process to Determine Its Compliance with Acquisition Reform Initiatives of the Past Decade

    DTIC Science & Technology

    1996-12-01

This includes an exemption from publishing the opportunity in the Commerce Business Daily (CBD) and elimination of the requirement to hold the...of assigned programs. In discharging this responsibility, the 990 coordinates his efforts with other ASN(RDMA) offices, b. TEFlO..M-VAL OPERATIONS...Communications, Computers and Information Systems CA Civil Affairs CAIV Cost as an Independent Variable CBD Commerce Business Daily CBPL Capabilities

  17. The Effects of Selected Instructional Variables on the Acquisition of Cognitive Learning Strategies

    DTIC Science & Technology

    1980-08-01

Free Recall Scores in Experiment I.......................11 Table 4. Source Table for Analysis of Variance on the Paired-Associate Task in Experiment I...analysis of the free recall data, the scores of two students in the traditional instructions group had to be eliminated due to a faulty slide projector...For the analysis of the paired-associate data, the score of one student in the combined instructions group and the score of one student in the

  18. A Multiple Ranking Procedure Adapted to Discrete-Event Simulation.

    DTIC Science & Technology

    1983-12-01

stopped if 50% of the original data set has been truncated and the bias effects have still not been eliminated. Use of this limit is reinforced by the... immediately; thus when the store opens on Monday, the order placed on the preceding Friday has already arrived. Backorders are permitted. The...INITIALIZE VARIABLES USED DURING EACH PASS THROUGH THE DATA: AMAX = 0.0, CUSUM = 0.0, PMEAN = 0.0, AMIN = 0.0, PSUM = 0.0, M = 0, NEGTIV = 0, POSTIV = 0

  19. Evaluation of variable selection methods for random forests and omics data sets.

    PubMed

    Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke

    2017-10-16

Machine learning methods and in particular random forests are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann) as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
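
    The shadow-feature idea behind Boruta can be sketched in a few lines: append a permuted copy of each feature and keep only the real features whose importance exceeds the best shadow's. A toy stand-in importance (absolute Pearson correlation) replaces the random-forest importance the real algorithm uses:

    ```python
    import random

    def shadow_select(features, y, seed=0):
        """Boruta-style sketch: permute each feature to build a 'shadow' copy,
        then keep real features whose importance beats the best shadow's.
        Importance here is |Pearson correlation| with y, a toy stand-in for
        random-forest variable importance."""
        rng = random.Random(seed)

        def abs_corr(xs, ys):
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            sx = sum((a - mx) ** 2 for a in xs) ** 0.5
            sy = sum((b - my) ** 2 for b in ys) ** 0.5
            return abs(cov / (sx * sy)) if sx > 0 and sy > 0 else 0.0

        importances = [abs_corr(col, y) for col in features]
        shadow_importances = []
        for col in features:
            shadow = col[:]
            rng.shuffle(shadow)           # break the feature-outcome link
            shadow_importances.append(abs_corr(shadow, y))
        threshold = max(shadow_importances)
        return [j for j, imp in enumerate(importances) if imp > threshold]

    # Feature 0 tracks the outcome perfectly; feature 1 is pure noise.
    y = [float(i) for i in range(40)]
    x0 = [2.0 * v + 1.0 for v in y]
    noise = random.Random(1)
    x1 = [noise.random() for _ in range(40)]
    selected = shadow_select([x0, x1], y)   # feature 0 should survive
    ```

    The real Boruta repeats this over many forest fits and uses a statistical test rather than a single threshold; this sketch only shows the mechanism.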

  20. Stabilizing detached Bridgman melt crystal growth: Model-based nonlinear feedback control

    NASA Astrophysics Data System (ADS)

    Yeckel, Andrew; Daoutidis, Prodromos; Derby, Jeffrey J.

    2012-12-01

    The dynamics and operability limits of a nonlinear-proportional-integral controller designed to stabilize detached vertical Bridgman crystal growth are studied. The manipulated variable is the pressure difference between upper and lower vapor spaces, and the controlled variable is the gap width at the triple-phase line. The controller consists of a model-based nonlinear component coupled with a standard proportional-integral controller. The nonlinear component is based on a capillary model of shape stability. Perturbations to gap width, pressure difference, wetting angle, and growth angle are studied under both shape stable and shape unstable conditions. The nonlinear-PI controller allows a wider operating range of gain than a standard PI controller used alone, is easier to tune, and eliminates solution multiplicity from closed-loop operation.
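
    The proportional-integral half of such a controller is straightforward to sketch; the model-based nonlinear component (the capillary shape model) is specific to the paper and omitted here. A discrete PI loop regulating a generic first-order plant, with illustrative gains:

    ```python
    def simulate_pi(setpoint=1.0, kp=2.0, ki=1.0, dt=0.01, steps=3000, tau=0.5):
        """Discrete PI controller on a first-order plant dy/dt = (u - y) / tau.
        The integral term accumulates error, driving steady-state error to zero."""
        y = 0.0
        integral = 0.0
        for _ in range(steps):
            error = setpoint - y
            integral += error * dt
            u = kp * error + ki * integral   # PI control law
            y += dt * (u - y) / tau          # explicit Euler step of the plant
        return y

    final = simulate_pi()   # converges close to the setpoint
    ```

    In the paper's scheme, the manipulated variable u would be the pressure difference and y the gap width, with the nonlinear model-based term added to this PI law.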

  1. Improving University Ranking to Achieve University Competitiveness by Management Information System

    NASA Astrophysics Data System (ADS)

    Dachyar, M.; Dewi, F.

    2015-05-01

One way to increase university competitiveness is through management of information systems. A literature review was done to find information system factors that affect university performance in the Quacquarelli Symonds (QS) University Rankings: Asia evaluation. Information system factors were then eliminated using the Delphi method through the consensus of 7 experts. The result from the Delphi method was used as the measured variables in PLS-SEM. Estimation with the PLS-SEM method on 72 respondents shows that the latent variables academic reputation and citations per paper have a significant correlation with university competitiveness. At the University of Indonesia (UI), the priorities for increasing university competitiveness are as follows: (i) network building at international conferences, (ii) availability of research data to the public, (iii) international conference information, (iv) information on achievements and accreditations of each major, (v) ease of employment for alumni.

  2. The Strength and Characteristics of VPPA Welded 2219-T87 Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Jemian, W. A.

    1985-01-01

    A study of the variable polarity plasma arc (VPPA) welding process and those factors that control the structure and properties of VPPA welded aluminum alloy 2219-T87 was conducted. The importance of joint preparation, alignment of parts and welding process variables are already established. Internal weld defects have been eliminated. However, a variation of properties was found to be due to the size variation of interdendritic particles in the fusion zone. These particles contribute to the void formation process, which controls the ultimate tensile strength of the welded alloy. A variation of 150 microns in particle size correlated with a 10 ksi variation of ultimate tensile strength. It was found that all fracture surfaces were of the dimple rupture type, with fracture initiating within the fusion zone.

  3. Three Cs in measurement models: causal indicators, composite indicators, and covariates.

    PubMed

    Bollen, Kenneth A; Bauldry, Shawn

    2011-09-01

    In the last 2 decades attention to causal (and formative) indicators has grown. Accompanying this growth has been the belief that one can classify indicators into 2 categories: effect (reflective) indicators and causal (formative) indicators. We argue that the dichotomous view is too simple. Instead, there are effect indicators and 3 types of variables on which a latent variable depends: causal indicators, composite (formative) indicators, and covariates (the "Three Cs"). Causal indicators have conceptual unity, and their effects on latent variables are structural. Covariates are not concept measures, but are variables to control to avoid bias in estimating the relations between measures and latent variables. Composite (formative) indicators form exact linear combinations of variables that need not share a concept. Their coefficients are weights rather than structural effects, and composites are a matter of convenience. The failure to distinguish the Three Cs has led to confusion and questions, such as, Are causal and formative indicators different names for the same indicator type? Should an equation with causal or formative indicators have an error term? Are the coefficients of causal indicators less stable than effect indicators? Distinguishing between causal and composite indicators and covariates goes a long way toward eliminating this confusion. We emphasize the key role that subject matter expertise plays in making these distinctions. We provide new guidelines for working with these variable types, including identification of models, scaling latent variables, parameter estimation, and validity assessment. A running empirical example on self-perceived health illustrates our major points.

  4. Bi-Level Integrated System Synthesis (BLISS) for Concurrent and Distributed Processing

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Altus, Troy D.; Phillips, Matthew; Sandusky, Robert

    2002-01-01

    The paper introduces a new version of the Bi-Level Integrated System Synthesis (BLISS) methods intended for optimization of engineering systems conducted by distributed specialty groups working concurrently and using a multiprocessor computing environment. The method decomposes the overall optimization task into subtasks associated with disciplines or subsystems where the local design variables are numerous and a single, system-level optimization whose design variables are relatively few. The subtasks are fully autonomous as to their inner operations and decision making. Their purpose is to eliminate the local design variables and generate a wide spectrum of feasible designs whose behavior is represented by Response Surfaces to be accessed by a system-level optimization. It is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition. A simplified example of an aircraft design shows the method working as intended. The paper includes a discussion of the method merits and demerits and recommendations for further research.

  5. Enhancing Multimedia Imbalanced Concept Detection Using VIMP in Random Forests.

    PubMed

    Sadiq, Saad; Yan, Yilin; Shyu, Mei-Ling; Chen, Shu-Ching; Ishwaran, Hemant

    2016-07-01

Recent developments in social media and cloud storage have led to exponential growth in the amount of multimedia data, which increases the complexity of managing, storing, indexing, and retrieving information from such big data. Many current content-based concept detection approaches fall short of successfully bridging the semantic gap. To solve this problem, a multi-stage random forest framework is proposed to generate predictor variables based on multivariate regressions using variable importance (VIMP). By fine-tuning the forests and significantly reducing the predictor variables, the concept detection scores are evaluated when the concept of interest is rare and imbalanced, i.e., having little collaboration with other high-level concepts. Using classical multivariate statistics, estimating the value of one coordinate from the other coordinates standardizes the covariates, so that the result depends upon the variance of the correlations instead of the mean; conditional dependence on the data being normally distributed is thus eliminated. Experimental results demonstrate that the proposed framework outperforms the compared approaches in terms of Mean Average Precision (MAP).
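
    Permutation-based variable importance (VIMP) itself is simple to illustrate: shuffle one feature column and measure how much the model's error grows. The "model" below is a hand-specified stand-in for a trained forest; the mechanism is the same:

    ```python
    import random

    def permutation_vimp(model, X, y, seed=0):
        """Permutation variable importance: the increase in mean-squared error
        when one feature column is shuffled, breaking its link to the outcome."""
        rng = random.Random(seed)

        def mse(rows):
            preds = [model(r) for r in rows]
            return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

        baseline = mse(X)
        vimps = []
        for j in range(len(X[0])):
            column = [row[j] for row in X]
            rng.shuffle(column)
            permuted = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, column)]
            vimps.append(mse(permuted) - baseline)
        return vimps

    # Stand-in "trained model": relies only on feature 0.
    model = lambda row: 2.0 * row[0]
    X = [[float(i), float(i % 3)] for i in range(30)]
    y = [2.0 * i for i in range(30)]
    vimps = permutation_vimp(model, X, y)
    # Shuffling feature 0 hurts accuracy; shuffling the ignored feature 1 does not.
    ```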

  6. Variability of PCB burden in 5 fish and sharks species of the French Mediterranean continental slope.

    PubMed

    Cresson, Pierre; Fabri, Marie Claire; Miralles, Françoise Marco; Dufour, Jean-Louis; Elleboode, Romain; Sevin, Karine; Mahé, Kelig; Bouchoucha, Marc

    2016-05-01

Despite being generally located far from contamination sources, deep marine ecosystems are impacted by chemicals like PCBs. The PCB contamination of five fish and shark species collected on the continental slope of the Gulf of Lions (NW Mediterranean Sea) was measured, with a special focus on intra- and interspecific variability and on its driving factors. Significant differences occurred between species. Higher values were measured in Scyliorhinus canicula, Galeus melastomus and Helicolenus dactylopterus, and lower values in Phycis blennoides and Lepidorhombus boscii. These differences might be explained by species-specific abilities to accumulate and eliminate contaminants, mostly through the cytochrome P450 pathway. Interindividual variation was also high, and no correlation was observed between contamination and length, age or trophic level. Despite its importance, actual bioaccumulation of PCBs in deep-sea fish is not as well documented as in other marine ecosystems, calling for a better assessment of the factors driving individual bioaccumulation mechanisms and producing the high variability in PCB contamination. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Variability Extraction and Synthesis via Multi-Resolution Analysis using Distribution Transformer High-Speed Power Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Mather, Barry A

A library of load variability classes is created to produce scalable synthetic data sets using historical high-speed raw data. These data are collected from distribution monitoring units connected at the secondary side of a distribution transformer. Because of the irregular patterns and large volume of historical high-speed data sets, the utilization of current load characterization and modeling techniques is challenging. Multi-resolution analysis techniques are applied to extract the necessary components and eliminate the unnecessary components from the historical high-speed raw data to create the library of classes, which are then utilized to create new synthetic load data sets. A validation is performed to ensure that the synthesized data sets contain the same variability characteristics as the training data sets. The synthesized data sets are intended to be utilized in quasi-static time-series studies for distribution system planning studies on a granular scale, such as detailed PV interconnection studies.
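
    As a generic illustration of multi-resolution analysis (not the toolchain used in the study), a single Haar wavelet step splits a signal into a coarse approximation and a detail component, and the split is exactly invertible, which is what lets selected components be kept or discarded without losing the rest:

    ```python
    def haar_step(signal):
        """One level of the Haar wavelet transform: pairwise averages
        (approximation) and pairwise half-differences (detail)."""
        approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        return approx, detail

    def haar_inverse(approx, detail):
        """Exact reconstruction from one Haar decomposition level."""
        out = []
        for a, d in zip(approx, detail):
            out.extend([a + d, a - d])
        return out

    sig = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
    approx, detail = haar_step(sig)   # approx = [5.0, 11.0, 7.0, 5.0]
    ```

    Applying `haar_step` recursively to the approximation yields the multi-level decomposition; zeroing unwanted detail levels before inverting is the component-elimination step the abstract describes.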

  8. A review of employment conditions as social determinants of health part II: the workplace.

    PubMed

    Moure-Eraso, Rafael; Flum, Marian; Lahiri, Supriya; Tilly, Chris; Massawe, Ephraim

    2006-01-01

    This is the second part of an article on employment conditions as social determinants of health and health inequalities. In part I of this article, we explored structural (external) employment conditions that affect health inequalities and health gradients. In this article, we try to examine the internal aspects of employment conditions that affect the same variables. It is not our intention to "box" employment conditions in a rigid framework within an internal domain of person-hazard interaction. The objective of examining this variable is to scrutinize internal aspects of employment conditions at a comprehensive policy level in conjunction with external contextual variables. Major occupational health concerns are examined in relationship to globalization, child labor, and work in the formal and informal sectors. Interventions that can eliminate or greatly reduce these exposures as well as those that have been unsuccessful are reviewed. Innovative interventions including work organization change, cleaner production, control banding, national and international coalitions, participatory training, and participatory approaches to improving the work environment are reviewed.

  9. Selection of a Geostatistical Method to Interpolate Soil Properties of the State Crop Testing Fields using Attributes of a Digital Terrain Model

    NASA Astrophysics Data System (ADS)

    Sahabiev, I. A.; Ryazanov, S. S.; Kolcova, T. G.; Grigoryan, B. R.

    2018-03-01

    The three most common techniques to interpolate soil properties at a field scale—ordinary kriging (OK), regression kriging with multiple linear regression drift model (RK + MLR), and regression kriging with principal component regression drift model (RK + PCR)—were examined. The results of the performed study were compiled into an algorithm of choosing the most appropriate soil mapping technique. Relief attributes were used as the auxiliary variables. When spatial dependence of a target variable was strong, the OK method showed more accurate interpolation results, and the inclusion of the auxiliary data resulted in an insignificant improvement in prediction accuracy. According to the algorithm, the RK + PCR method effectively eliminates multicollinearity of explanatory variables. However, if the number of predictors is less than ten, the probability of multicollinearity is reduced, and application of the PCR becomes irrational. In that case, the multiple linear regression should be used instead.
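    The multicollinearity screening step described above can be sketched with a condition-number check on the predictor matrix (the threshold of 30 and the synthetic terrain predictors below are assumptions for illustration, not values from the study):

    ```python
    import numpy as np

    def multicollinearity_check(X):
        """Condition number of the standardized predictor matrix; values
        above ~30 are a common rule of thumb for problematic collinearity."""
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        return np.linalg.cond(Z)

    rng = np.random.default_rng(1)
    elev = rng.normal(size=200)
    slope = elev + 0.01 * rng.normal(size=200)   # nearly collinear with elevation
    aspect = rng.normal(size=200)

    cond = multicollinearity_check(np.column_stack([elev, slope, aspect]))
    collinear = cond > 30.0   # high: a PCR drift model is warranted; with few,
                              # independent predictors, plain MLR would suffice
    ```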

  10. Experimental investigation of wall shock cancellation and reduction of wall interference in transonic testing

    NASA Technical Reports Server (NTRS)

    Ferri, A.; Roffe, G.

    1975-01-01

    A series of experiments was performed to evaluate the effectiveness of a three-dimensional land-and-groove wall geometry and a variable permeability distribution in reducing the interference produced by the porous walls of a supercritical transonic test section. The three-dimensional wall geometry was found to diffuse the pressure perturbations caused by small local mismatches in wall porosity, permitting the use of a relatively coarse wall porosity control to reduce or eliminate wall interference effects. The required wall porosity distribution was found to be a sensitive function of Mach number, requiring that the Mach number repeatability characteristics of the test apparatus be quite good. The effectiveness of a variable porosity wall is greatest in the upstream region of the test section, where the pressure differences across the wall are largest. An effective variable porosity wall in the downstream region of the test section requires the use of a slightly convergent test section geometry.

  11. Design and testing of a novel multi-stroke micropositioning system with variable resolutions.

    PubMed

    Xu, Qingsong

    2014-02-01

    Multi-stroke stages are demanded in micro-/nanopositioning applications which require smaller and larger motion strokes with fine and coarse resolutions, respectively. This paper presents the conceptual design of a novel multi-stroke, multi-resolution micropositioning stage driven by a single actuator for each working axis. It eliminates the issue of the interference among different drives, which resides in conventional multi-actuation stages. The stage is devised based on a fully compliant variable stiffness mechanism, which exhibits unequal stiffnesses in different strokes. Resistive strain sensors are employed to offer variable position resolutions in the different strokes. To quantify the design of the motion strokes and coarse/fine resolution ratio, analytical models are established. These models are verified through finite-element analysis simulations. A proof-of-concept prototype XY stage is designed, fabricated, and tested to demonstrate the feasibility of the presented ideas. Experimental results of static and dynamic testing validate the effectiveness of the proposed design.

  12. Elimination of Neglected Diseases in Latin America and the Caribbean: A Mapping of Selected Diseases

    PubMed Central

    Schneider, Maria Cristina; Aguilera, Ximena Paz; Barbosa da Silva Junior, Jarbas; Ault, Steven Kenyon; Najera, Patricia; Martinez, Julio; Requejo, Raquel; Nicholls, Ruben Santiago; Yadon, Zaida; Silva, Juan Carlos; Leanes, Luis Fernando; Periago, Mirta Roses

    2011-01-01

    In Latin America and the Caribbean, around 195 million people live in poverty, a situation that increases the burden of some infectious diseases. Neglected diseases, in particular, are often restricted to poor, marginalized sections of the population. Tools exist to combat these diseases, making it imperative to work towards their elimination. In 2009, the Pan American Health Organization (PAHO) received a mandate to support the countries in the Region in eliminating neglected diseases and other poverty-related infections. The objective of this study is to analyze the presence of selected diseases using geo-processing techniques. Five diseases with information available at the first sub-national level (states) were mapped, showing the presence of the disease (“hotspots”) and overlap of diseases (“major hotspots”). In the 45 countries/territories (approximately 570 states) of the Region, there is: lymphatic filariasis in four countries (29 states), onchocerciasis in six countries (25 states), schistosomiasis in four countries (39 states), trachoma in three countries (29 states), and human rabies transmitted by dogs in ten countries (20 states). Of the 108 states with one or more of the selected diseases, 36 states present the diseases in overlapping areas (“major hotspots”). Additional information about soil-transmitted helminths was included. The analysis suggests a majority of the selected diseases are not widespread and can be considered part of an unfinished agenda with elimination as a goal. Integrated plans and a comprehensive approach, ensuring access to existing diagnostic and treatment methods, and establishing a multi-sectoral agenda that addresses social determinants, including access to adequate water and sanitation, are required. Future studies can include additional diseases, socio-economic and environmental variables. PMID:21358810

  13. Pharmacokinetics of phenylbutazone in camels.

    PubMed

    Wasfi, I A; Abdel Hadi, A H; Zorob, O; Osman M al-G; Boni, N S

    1997-06-01

    To document disposition variables of phenylbutazone and its metabolite, oxyphenbutazone, in camels (Camelus dromedarius) after single i.v. bolus administration of phenylbutazone, with a view to making recommendations on avoiding violative residues in racing camels. 6 healthy camels (4 males, 2 females), 5 to 7 years old, and weighing from 350 to 450 kg. Blood samples were collected at 0, 5, 10, 15, 45, and 60 minutes and at 1.5, 2, 2.5, 3, 3.5, 4, 5, 6, 8, 12, 24, 26, 28, 30, 40, 48, 50, 53, and 60 hours after i.v. administration of 4.5 mg of phenylbutazone per kg of body weight. Urine was obtained in fractions during the entire blood sample collection period. Serum and urine phenylbutazone concentrations were measured by high-performance liquid chromatography; assay sensitivity was 100 ng/ml. Serum oxyphenbutazone concentration was measured by gas chromatography/mass spectrometry; assay sensitivity was 10 ng/ml. Disposition of phenylbutazone was best described by a two-compartment open model. Mean +/- SEM elimination half-life was 13.44 +/- 0.44 hours. Total body clearance was 12.63 +/- 1.64 mg/kg/h. Renal clearance was between 0.3 and 0.4% of total body clearance. The elimination half-life of oxyphenbutazone was 23.9 +/- 2.09 hours. The elimination half-life and total body clearance of phenylbutazone in camels are intermediate between reported values in horses and cattle. Extrapolation of a dosage regimen from either species to camels is, therefore, not appropriate. Elimination of phenylbutazone in camels is mainly via metabolism. Owing to the long half-lives of phenylbutazone and oxyphenbutazone, and to the zero drug concentration regulation adopted by the racing commissioner in the United Arab Emirates, practicing veterinarians would be advised not to use phenylbutazone in camels for at least 7 days prior to racing.
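    An elimination half-life like the one reported here follows from a log-linear fit to the terminal phase of the concentration-time curve, since ln C = ln C0 - k·t gives t1/2 = ln 2 / k. A minimal sketch on synthetic data constructed to match the reported 13.4 h value (the data points are illustrative, not the study's measurements):

    ```python
    import numpy as np

    def terminal_half_life(times_h, conc):
        """Elimination half-life from a log-linear fit to the terminal phase:
        ln C = ln C0 - k*t, hence t1/2 = ln(2)/k."""
        slope, _ = np.polyfit(times_h, np.log(conc), 1)
        k_el = -slope
        return np.log(2) / k_el

    # Synthetic terminal-phase serum concentrations with a 13.4 h half-life
    k = np.log(2) / 13.4
    t = np.array([12.0, 24.0, 30.0, 40.0, 48.0, 60.0])   # hours post-dose
    c = 20.0 * np.exp(-k * t)                            # ug/ml, illustrative

    t_half = terminal_half_life(t, c)
    ```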

  14. malERA: An updated research agenda for characterising the reservoir and measuring transmission in malaria elimination and eradication

    PubMed Central

    2017-01-01

    This paper summarises key advances in defining the infectious reservoir for malaria and the measurement of transmission for research and programmatic use since the Malaria Eradication Research Agenda (malERA) publication in 2011. Rapid and effective progress towards elimination requires an improved understanding of the sources of transmission as well as those at risk of infection. Characterising the transmission reservoir in different settings will enable the most appropriate choice, delivery, and evaluation of interventions. Since 2011, progress has been made in a number of areas. The extent of submicroscopic and asymptomatic infections is better understood, as are the biological parameters governing transmission of sexual stage parasites. Limitations of existing transmission measures have been documented, and proof-of-concept has been established for new innovative serological and molecular methods to better characterise transmission. Finally, there now exists a concerted effort towards the use of ensemble datasets across the spectrum of metrics, from passive and active sources, to develop more accurate risk maps of transmission. These can be used to better target interventions and effectively monitor progress toward elimination. The success of interventions depends not only on the level of endemicity but also on how rapidly or recently an area has undergone changes in transmission. Improved understanding of the biology of mosquito–human and human–mosquito transmission is needed particularly in low-endemic settings, where heterogeneity of infection is pronounced and local vector ecology is variable. New and improved measures of transmission need to be operationally feasible for the malaria programmes. 
Outputs from these research priorities should allow the development of a set of approaches (applicable to both research and control programmes) that address the unique challenges of measuring and monitoring transmission in near-elimination settings and defining the absence of transmission. PMID:29190279

  15. Operator control systems and methods for swing-free gantry-style cranes

    DOEpatents

    Feddema, J.T.; Petterson, B.J.; Robinett, R.D. III

    1998-07-28

    A system and method are disclosed for eliminating swing motions in gantry-style cranes while subject to operator control. The present invention comprises an infinite impulse response (IIR) filter and a proportional-integral (PI) feedback controller. The IIR filter receives input signals (commanded velocity or acceleration) from an operator input device and transforms them into output signals in such a fashion that the resulting motion is swing free (i.e., end-point swinging prevented). The parameters of the IIR filter are updated in real time using measurements from a hoist cable length encoder. The PI feedback controller compensates for modeling errors and external disturbances, such as wind or perturbations caused by collision with objects. The PI feedback controller operates on cable swing angle measurements provided by a cable angle sensor. The present invention adjusts acceleration and deceleration to eliminate oscillations. An especially important feature of the present invention is that it compensates for variable-length cable motions from multiple cables attached to a suspended payload. 10 figs.
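    The patent's actual gains and crane dynamics are not given in this record; the PI-feedback part of the scheme can nonetheless be sketched against a toy damped-pendulum model of cable swing (all gains, the cable length, and the damping value below are illustrative assumptions):

    ```python
    import math

    class PIController:
        """Discrete PI feedback acting on the measured cable swing angle."""
        def __init__(self, kp, ki, dt):
            self.kp, self.ki, self.dt = kp, ki, dt
            self.integral = 0.0

        def update(self, error):
            self.integral += error * self.dt
            return self.kp * error + self.ki * self.integral

    # Damp a 2 m pendulum's swing toward zero with the PI loop
    dt, g, length = 0.01, 9.81, 2.0
    ctrl = PIController(kp=4.0, ki=1.0, dt=dt)
    theta, omega = 0.2, 0.0               # initial swing angle (rad) and rate
    for _ in range(5000):                 # 50 s of simulated time
        u = ctrl.update(0.0 - theta)      # corrective acceleration command
        omega += (-(g / length) * math.sin(theta) - 0.5 * omega + u) * dt
        theta += omega * dt
    ```

    In the patented system this loop runs on cable-angle sensor measurements and is combined with the IIR input-shaping filter, whose parameters track the hoist cable length.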

  16. Operator control systems and methods for swing-free gantry-style cranes

    DOEpatents

    Feddema, John T.; Petterson, Ben J.; Robinett, III, Rush D.

    1998-01-01

    A system and method for eliminating swing motions in gantry-style cranes while subject to operator control is presented. The present invention comprises an infinite impulse response ("IIR") filter and a proportional-integral ("PI") feedback controller (50). The IIR filter receives input signals (46) (commanded velocity or acceleration) from an operator input device (45) and transforms them into output signals (47) in such a fashion that the resulting motion is swing free (i.e., end-point swinging prevented). The parameters of the IIR filter are updated in real time using measurements from a hoist cable length encoder (25). The PI feedback controller compensates for modeling errors and external disturbances, such as wind or perturbations caused by collision with objects. The PI feedback controller operates on cable swing angle measurements provided by a cable angle sensor (27). The present invention adjusts acceleration and deceleration to eliminate oscillations. An especially important feature of the present invention is that it compensates for variable-length cable motions from multiple cables attached to a suspended payload.

  17. Effects of a new glucocorticoid, oxazacort, on some variables connected with bone metabolism in man: a comparison with prednisone.

    PubMed

    Caniggia, A; Marchetti, M; Gennari, C; Vattimo, A; Nicolis, F B

    1977-03-01

    The urinary elimination of calcium, other electrolytes, and hydroxyproline and the oral absorption of 47Ca have been evaluated in three groups of 8 patients before and during a 15-day treatment with prednisone at daily doses of 25 and 50 mg and with oxazacort, a new glucocorticoid, at a daily dose of 50 mg. The results obtained demonstrate that oxazacort in short-term treatment with a high dose has no significant effect on the urinary elimination of calcium and hydroxyproline in experimental conditions in which prednisone produces a statistically significant and clinically relevant increase, both when given at the same dose and when given at half that dose. On the other hand, the oral absorption of 47Ca is decreased by oxazacort, but less than by prednisone at the same dose. As the antirheumatic activity of oxazacort appears to be only slightly lower than that of prednisone (activity ratio of about 0.84:1), these findings may have interesting therapeutic implications.

  18. The Effect of Compliance on the Impact of Mass Drug Administration for Elimination of Lymphatic Filariasis in Egypt

    PubMed Central

    El-Setouhy, Maged; Abd Elaziz, Khaled M.; Helmy, Hanan; Farid, Hoda A.; Kamal, Hussein A.; Ramzy, Reda M. R.; Shannon, William D.; Weil, Gary J.

    2008-01-01

    We studied effects of compliance on the impact of mass drug administration (MDA) with diethylcarbamazine and albendazole for lymphatic filariasis (LF) in an Egyptian village. Baseline microfilaremia (mf) and filarial antigenemia rates were 11.5% and 19.0%, respectively. The MDA compliance rates were excellent (> 85%). However, individual compliance was highly variable; 7.4% of those surveyed after five rounds of MDA denied having ever taken the medications and 52.4% reported that they had taken all five doses. The mf and antigenemia rates were 0.2% and 2.7% in those who reported five doses of MDA and 8.3% and 13.8% in those who reported zero doses. There was no significant difference in residual infection rates among those who had taken two or more doses. These results underscore the importance of compliance for LF elimination programs based on MDA and suggest that two ingested doses of MDA are as effective as five doses for reducing filariasis infection rates. PMID:18165524

  19. Matrix-elimination with steam distillation for determination of short-chain fatty acids in hypersaline waters from pre-salt layer by ion-exclusion chromatography.

    PubMed

    Ferreira, Fernanda N; Carneiro, Manuel C; Vaitsman, Delmo S; Pontes, Fernanda V M; Monteiro, Maria Inês C; Silva, Lílian Irene D da; Neto, Arnaldo Alcover

    2012-02-03

    A method for determination of formic, acetic, propionic and butyric acids in hypersaline waters by ion-exclusion chromatography (IEC), using steam distillation to eliminate matrix interference, was developed. The steam distillation variables, such as the type of solution used to collect the distillate, the distillation time and the volume of the 50% v/v H₂SO₄ solution, were optimized. The effect of adding different NaCl concentrations to the calibration standards on the carboxylic acid recovery was also investigated. Detection limits of 0.2, 0.5, 0.3 and 1.5 mg L⁻¹ were obtained for formic, acetic, propionic and butyric acids, respectively. Produced waters from petroleum reservoirs in the Brazilian pre-salt layer containing about 19% m/v of NaCl were analyzed. Good recoveries (99-108%) were obtained for all acids in spiked produced water samples. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Input filter compensation for switching regulators

    NASA Technical Reports Server (NTRS)

    Kelkar, S. S.; Lee, F. C.

    1983-01-01

    A novel input filter compensation scheme for a buck regulator that eliminates the interaction between the input filter output impedance and the regulator control loop is presented. The scheme is implemented using a feedforward loop that senses the input filter state variables and uses this information to modulate the duty cycle signal. The feedforward design process presented is seen to be straightforward and the feedforward easy to implement. Extensive experimental data supported by analytical results show that significant performance improvement is achieved with the use of feedforward in the following performance categories: loop stability, audiosusceptibility, output impedance and transient response. The use of feedforward results in isolating the switching regulator from its power source thus eliminating all interaction between the regulator and equipment upstream. In addition the use of feedforward removes some of the input filter design constraints and makes the input filter design process simpler thus making it possible to optimize the input filter. The concept of feedforward compensation can also be extended to other types of switching regulators.

  1. Unified treatment of microscopic boundary conditions and efficient algorithms for estimating tangent operators of the homogenized behavior in the computational homogenization method

    NASA Astrophysics Data System (ADS)

    Nguyen, Van-Dung; Wu, Ling; Noels, Ludovic

    2017-03-01

    This work provides a unified treatment of arbitrary kinds of microscopic boundary conditions usually considered in the multi-scale computational homogenization method for nonlinear multi-physics problems. An efficient procedure is developed to enforce the multi-point linear constraints arising from the microscopic boundary condition, either by direct constraint elimination or by Lagrange multiplier elimination. The macroscopic tangent operators are computed efficiently from a linear system with multiple right-hand sides, whose left-hand-side matrix is the stiffness matrix of the microscopic linearized system at the converged solution. The number of right-hand-side vectors equals the number of macroscopic kinematic variables used to formulate the microscopic boundary condition. As the resolution of the microscopic linearized system often follows a direct factorization procedure, the macroscopic tangent operators can then be computed by reusing this factorized matrix at a reduced computational cost.
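    The factor-once, solve-many idea behind the tangent computation can be sketched as follows (a minimal illustration using `scipy.linalg.lu_factor` and a random positive-definite stand-in for the microscopic stiffness matrix; the sizes are assumptions, not from the paper):

    ```python
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    rng = np.random.default_rng(0)
    n, n_macro = 50, 6                      # system size / macroscopic kinematic variables
    K = rng.standard_normal((n, n))
    K = K @ K.T + n * np.eye(n)             # SPD stand-in for the microscopic stiffness
    B = rng.standard_normal((n, n_macro))   # one right-hand side per macroscopic variable

    lu, piv = lu_factor(K)                  # factor once (in practice, reuse the
                                            # factorization from the converged Newton solve)
    X = lu_solve((lu, piv), B)              # cheap back-substitution for all RHS columns
    ```

    Since the factorization dominates the cost, solving for all `n_macro` columns adds only back-substitutions, which is the efficiency gain the abstract describes.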

  2. Swine Dysentery.

    PubMed

    Burrough, E R

    2017-01-01

    Swine dysentery is a severe enteric disease in pigs, which is characterized by bloody to mucoid diarrhea and associated with reduced growth performance and variable mortality. This disease is most often observed in grower-finisher pigs, wherein susceptible pigs develop a significant mucohemorrhagic typhlocolitis following infection with strongly hemolytic spirochetes of the genus Brachyspira. While swine dysentery is endemic in many parts of the world, the disease had essentially disappeared in much of the United States by the mid-1990s as a result of industry consolidation and effective treatment, control, and elimination methods. However, since 2007, there has been a reported increase in laboratory diagnosis of swine dysentery in parts of North America along with the detection of novel pathogenic Brachyspira spp worldwide. Accordingly, there has been a renewed interest in swine dysentery and Brachyspira spp infections in pigs, particularly in areas where the disease was previously eliminated. This review provides an overview of knowledge on the etiology, pathogenesis, and diagnosis of swine dysentery, with insights into risk factors and control.

  3. On the efficacy of procedures to normalize Ex-Gaussian distributions

    PubMed Central

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2015-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier elimination and transformation procedures were tested against the level of normality they provide. The results suggest that transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing it. Specifically, transformation with parameter lambda -1 leads to the best results. PMID:25709588
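    As an illustrative sketch of the winning procedure (the Ex-Gaussian parameters and sample size below are assumptions, not the paper's simulation settings), the lambda = -1 power transformation, i.e. the reciprocal member of the Box-Cox family, applied to synthetic RT data:

    ```python
    import numpy as np
    from scipy import stats

    def boxcox_lambda(x, lam):
        """Box-Cox power transform; lam = -1 is the reciprocal transform."""
        return (x ** lam - 1) / lam if lam != 0 else np.log(x)

    # Ex-Gaussian RTs: a normal component convolved with an exponential tail
    rng = np.random.default_rng(0)
    rt = rng.normal(400, 40, 5000) + rng.exponential(150, 5000)   # ms, illustrative

    transformed = boxcox_lambda(rt, -1.0)

    skew_before = stats.skew(rt)          # strongly positive for Ex-Gaussian data
    skew_after = stats.skew(transformed)  # much closer to the Gaussian value of 0
    ```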

  4. Spatial distribution estimation of malaria in northern China and its scenarios in 2020, 2030, 2040 and 2050.

    PubMed

    Song, Yongze; Ge, Yong; Wang, Jinfeng; Ren, Zhoupeng; Liao, Yilan; Peng, Junhuan

    2016-07-07

    Malaria is one of the most severe parasitic diseases in the world. Spatial distribution estimation of malaria and its future scenarios are important issues for malaria control and elimination. Furthermore, sophisticated nonlinear relationships for prediction between malaria incidence and potential variables have not been well constructed in previous research. This study aims to estimate these nonlinear relationships and predict future malaria scenarios in northern China. Nonlinear relationships between malaria incidence and predictor variables were constructed using a genetic programming (GP) method, to predict the spatial distributions of malaria under climate change scenarios. For this, monthly average malaria incidence records from each county of northern China from 2004 to 2010 were used. Among the five county-level variables, precipitation rate and temperature are used for projections, while elevation, water density index, and gross domestic product are held at their present-day values. Average malaria incidence was 0.107 ‰ per annum in northern China, with significant spatial clustering. A GP-based model fit the relationships with average relative error (ARE) = 8.127 % for training data (R(2) = 0.825) and 17.102 % for test data (R(2) = 0.532). The fitness of the GP results is significantly improved compared with that of generalized additive models (GAM) and linear regressions. With the future precipitation rate and temperature conditions in Special Report on Emission Scenarios (SRES) family B1, A1B and A2 scenarios, spatial distributions and changes in malaria incidences in 2020, 2030, 2040 and 2050 were predicted and mapped. The GP method increases the precision of predicting the spatial distribution of malaria incidence. 
With precipitation rate and temperature varied and the other variables held constant, the relationships between incidence and the varied variables exhibit sophisticated nonlinearity and spatial differentiation. Under the projected fluctuating precipitation and increasing temperature, median malaria incidence in 2020, 2030, 2040 and 2050 would increase significantly, by an estimated 19 to 29 % in 2020; however, China is currently in the malaria elimination phase, indicating that effective strategies and actions have been taken. Mean incidence, by contrast, will not increase and may even decline, owing to incidence reduction in high-risk regions despite the simultaneous expansion of high-risk areas.

  5. Clustering and variable selection in the presence of mixed variable types and missing data.

    PubMed

    Storlie, C B; Myers, S M; Katusic, S K; Weaver, A L; Voigt, R G; Croarkin, P E; Stoeckel, R E; Port, J D

    2018-05-17

    We consider the problem of model-based clustering in the presence of many correlated, mixed continuous, and discrete variables, some of which may have missing values. Discrete variables are treated with a latent continuous variable approach, and the Dirichlet process is used to construct a mixture model with an unknown number of components. Variable selection is also performed to identify the variables that are most influential for determining cluster membership. The work is motivated by the need to cluster patients thought to potentially have autism spectrum disorder on the basis of many cognitive and/or behavioral test scores. There are a modest number of patients (486) in the data set along with many (55) test score variables (many of which are discrete valued and/or missing). The goal of the work is to (1) cluster these patients into similar groups to help identify those with similar clinical presentation and (2) identify a sparse subset of tests that inform the clusters in order to eliminate unnecessary testing. The proposed approach compares very favorably with other methods via simulation of problems of this type. The results of the autism spectrum disorder analysis suggested 3 clusters to be most likely, while only 4 test scores had high (>0.5) posterior probability of being informative. This will result in much more efficient and informative testing. The need to cluster observations on the basis of many correlated, continuous/discrete variables with missing values is a common problem in the health sciences as well as in many other disciplines. Copyright © 2018 John Wiley & Sons, Ltd.

  6. Three Cs in Measurement Models: Causal Indicators, Composite Indicators, and Covariates

    PubMed Central

    Bollen, Kenneth A.; Bauldry, Shawn

    2013-01-01

    In the last two decades attention to causal (and formative) indicators has grown. Accompanying this growth has been the belief that we can classify indicators into two categories, effect (reflective) indicators and causal (formative) indicators. This paper argues that the dichotomous view is too simple. Instead, there are effect indicators and three types of variables on which a latent variable depends: causal indicators, composite (formative) indicators, and covariates (the “three Cs”). Causal indicators have conceptual unity and their effects on latent variables are structural. Covariates are not concept measures, but are variables to control to avoid bias in estimating the relations between measures and latent variable(s). Composite (formative) indicators form exact linear combinations of variables that need not share a concept. Their coefficients are weights rather than structural effects and composites are a matter of convenience. The failure to distinguish the “three Cs” has led to confusion and questions such as: are causal and formative indicators different names for the same indicator type? Should an equation with causal or formative indicators have an error term? Are the coefficients of causal indicators less stable than effect indicators? Distinguishing between causal and composite indicators and covariates goes a long way toward eliminating this confusion. We emphasize the key role that subject matter expertise plays in making these distinctions. We provide new guidelines for working with these variable types, including identification of models, scaling latent variables, parameter estimation, and validity assessment. A running empirical example on self-perceived health illustrates our major points. PMID:21767021

  7. How predictable are equatorial Atlantic surface winds?

    NASA Astrophysics Data System (ADS)

    Richter, Ingo; Doi, Takeshi; Behera, Swadhin

    2017-04-01

    Sensitivity tests with the SINTEX-F general circulation model (GCM) as well as experiments from the Coupled Model Intercomparison Project phase 5 (CMIP5) are used to examine the extent to which sea-surface temperature (SST) anomalies contribute to the variability and predictability of monthly mean surface winds in the equatorial Atlantic. In the SINTEX-F experiments, a control experiment with prescribed observed SST for the period 1982-2014 is modified by inserting climatological values in certain regions, thereby eliminating SST anomalies. When SSTs are set to climatology in the tropical Atlantic only (30S to 30N), surface wind variability over the equatorial Atlantic (5S-5N) decreases by about 40% in April-May-June (AMJ). This suggests that about 60% of surface wind variability is due to either internal atmospheric variability or SST anomalies outside the tropical Atlantic. A further experiment with climatological SSTs in the equatorial Pacific indicates that another 10% of variability in AMJ may be due to remote influences from that basin. Experiments from the CMIP5 archive, in which climatological SSTs are prescribed globally, tend to confirm the results from SINTEX-F but show a wide spread. In some models, the equatorial Atlantic surface wind variability decreases by more than 90%, while in others it even increases. Overall, the results suggest that about 50-60% of surface wind variance in AMJ is predictable, while the rest is due to internal atmospheric variability. Other months show significantly lower predictability. The relatively strong internal variability as well as the influence of remote SSTs suggest a limited role for coupled ocean-atmosphere feedbacks in equatorial Atlantic variability.

  8. Dysfunctional elimination symptoms in childhood and adulthood.

    PubMed

    Bower, W F; Yip, S K; Yeung, C K

    2005-10-01

    The dysfunctional elimination syndrome (DES) is rare in adulthood. We evaluate the natural history of DES to identify aspects of the disorder that may be carried into adulthood. A 2-part questionnaire was devised and self-administered to 191 consecutive women attending a urogynecological clinic (UG) and to 251 normal women. The first section asked for recall of childhood symptoms known to be associated with DES, while the latter section explored current bladder and bowel problems. Data sets from the normal cohort (55) reporting current bladder problems were excluded. Descriptive statistics, chi-square and Mann-Whitney U tests were used to compare variables. UG patients had significantly higher childhood DES scores than normal women. Overall, 41.7% of UG patients could be labeled as having dysfunctional elimination as an adult. Symptoms reported significantly more often in childhood by UG patients than by control women were frequent urinary tract infection, vesicoureteral reflux, frequency, urge incontinence, slow and intermittent urine flow, small volume high urge voids, hospitalization for constipation, frequent fecal soiling and nocturnal enuresis. Higher DES scores correlated significantly with current adult urgency, urge leak, stress incontinence, incomplete emptying, post-void leak, hesitancy, nocturia and nocturnal enuresis. Constipation and fecal incontinence in adulthood also showed a significant association with high DES scores. Logistic regression revealed childhood urgency to be associated with adult DES. Childhood lower urinary tract dysfunction may have a negative impact on bladder and bowel function in later life.
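    The group comparisons used here can be sketched with a standard chi-square test on a 2x2 contingency table of childhood symptom by group (the counts below are invented for illustration and are not data from the study):

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical counts: childhood urgency (yes / no) by group
    table = np.array([[80, 111],    # UG patients
                      [40, 156]])   # control women

    chi2, p, dof, expected = chi2_contingency(table)
    # p < 0.05 would indicate the symptom was reported significantly more
    # often by UG patients than by controls
    ```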

  9. A single network adaptive critic (SNAC) architecture for optimal control synthesis for a class of nonlinear systems.

    PubMed

    Padhi, Radhakant; Unnikrishnan, Nishant; Wang, Xiaohua; Balakrishnan, S N

    2006-12-01

    Even though dynamic programming offers an optimal control solution in a state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computations and storage requirements in solving optimal control problems. In this paper, an improvement to the AC architecture, called the "Single Network Adaptive Critic (SNAC)", is presented. This approach is applicable to a wide class of nonlinear systems where the optimal control (stationary) equation can be explicitly expressed in terms of the state and costate variables. The selection of this terminology is guided by the fact that it eliminates the use of one neural network (namely the action network) that is part of a typical dual network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, a lower computational load, and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems have been solved with the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life micro-electro-mechanical systems (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.

  10. Bayesian Analysis for Inference of an Emerging Epidemic: Citrus Canker in Urban Landscapes

    PubMed Central

    Neri, Franco M.; Cook, Alex R.; Gibson, Gavin J.; Gottwald, Tim R.; Gilligan, Christopher A.

    2014-01-01

    Outbreaks of infectious diseases require a rapid response from policy makers. The choice of an adequate level of response relies upon available knowledge of the spatial and temporal parameters governing pathogen spread, affecting, amongst others, the predicted severity of the epidemic. Yet, when a new pathogen is introduced into an alien environment, such information is often lacking or of no use, and epidemiological parameters must be estimated from the first observations of the epidemic. This poses a challenge to epidemiologists: how quickly can the parameters of an emerging disease be estimated? How soon can the future progress of the epidemic be reliably predicted? We investigate these issues using a unique, spatially and temporally resolved dataset for the invasion of a plant disease, Asiatic citrus canker in urban Miami. We use epidemiological models, Bayesian Markov chain Monte Carlo, and advanced spatial statistical methods to analyse rates and extent of spread of the disease. A rich and complex epidemic behaviour is revealed. The spatial scale of spread is approximately constant over time and can be estimated rapidly with great precision (although the evidence for long-range transmission is inconclusive). In contrast, the rate of infection is characterised by strong monthly fluctuations that we associate with extreme weather events. Uninformed predictions from the early stages of the epidemic, assuming complete ignorance of the future environmental drivers, fail because of the unpredictable variability of the infection rate. Conversely, predictions improve dramatically if we assume prior knowledge of either the main environmental trend, or the main environmental events. A contrast emerges between the high detail attained by modelling in the spatiotemporal description of the epidemic and the bottleneck imposed on epidemic prediction by the limits of meteorological predictability. We argue that identifying such bottlenecks will be a fundamental step in future modelling of weather-driven epidemics. PMID:24762851
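The Bayesian estimation step described in this record can be illustrated with a minimal random-walk Metropolis sampler for a single infection-rate parameter. This is only a sketch under strong simplifying assumptions: a Poisson likelihood with a hypothetical exposure term, synthetic case counts, and a flat prior, none of which come from the paper's far richer spatial model.

```python
import math
import random

def log_likelihood(beta, cases, exposure):
    # Poisson log-likelihood (constant terms dropped) for new case counts,
    # with mean beta * exposure; illustrative only, not the paper's model
    if beta <= 0:
        return float("-inf")
    return sum(c * math.log(beta * e) - beta * e
               for c, e in zip(cases, exposure))

def metropolis(cases, exposure, n_iter=5000, step=0.05, seed=1):
    """Random-walk Metropolis sampler for the infection rate beta."""
    rng = random.Random(seed)
    beta, ll = 1.0, log_likelihood(1.0, cases, exposure)
    samples = []
    for _ in range(n_iter):
        prop = beta + rng.gauss(0.0, step)
        ll_prop = log_likelihood(prop, cases, exposure)
        # accept with probability min(1, exp(ll_prop - ll))
        if rng.random() < math.exp(min(0.0, ll_prop - ll)):
            beta, ll = prop, ll_prop
        samples.append(beta)
    return samples

# Hypothetical monthly case counts and a matching exposure term
cases = [3, 5, 8, 13, 21, 30]
exposure = [10, 20, 30, 50, 80, 120]
post = metropolis(cases, exposure)
est = sum(post[1000:]) / len(post[1000:])  # posterior mean after burn-in
```

With these synthetic data the posterior mean settles near sum(cases)/sum(exposure) ≈ 0.26, the maximum-likelihood rate, illustrating how quickly a rate parameter can be pinned down once observations accumulate.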

  11. The 2011 Report on Dietary Reference Intakes for Calcium and Vitamin D from the Institute of Medicine: What Clinicians Need to Know

    PubMed Central

    Ross, A. Catharine; Manson, JoAnn E.; Abrams, Steven A.; Aloia, John F.; Brannon, Patsy M.; Clinton, Steven K.; Durazo-Arvizu, Ramon A.; Gallagher, J. Christopher; Gallo, Richard L.; Jones, Glenville; Kovacs, Christopher S.; Mayne, Susan T.; Rosen, Clifford J.; Shapses, Sue A.

    2011-01-01

    This article summarizes the new 2011 report on dietary requirements for calcium and vitamin D from the Institute of Medicine (IOM). An IOM Committee charged with determining the population needs for these nutrients in North America conducted a comprehensive review of the evidence for both skeletal and extraskeletal outcomes. The Committee concluded that available scientific evidence supports a key role of calcium and vitamin D in skeletal health, consistent with a cause-and-effect relationship and providing a sound basis for determination of intake requirements. For extraskeletal outcomes, including cancer, cardiovascular disease, diabetes, and autoimmune disorders, the evidence was inconsistent, inconclusive as to causality, and insufficient to inform nutritional requirements. Randomized clinical trial evidence for extraskeletal outcomes was limited and generally uninformative. Based on bone health, Recommended Dietary Allowances (RDAs; covering requirements of ≥97.5% of the population) for calcium range from 700 to 1300 mg/d for life-stage groups at least 1 yr of age. For vitamin D, RDAs of 600 IU/d for ages 1–70 yr and 800 IU/d for ages 71 yr and older, corresponding to a serum 25-hydroxyvitamin D level of at least 20 ng/ml (50 nmol/liter), meet the requirements of at least 97.5% of the population. RDAs for vitamin D were derived based on conditions of minimal sun exposure due to wide variability in vitamin D synthesis from ultraviolet light and the risks of skin cancer. Higher values were not consistently associated with greater benefit, and for some outcomes U-shaped associations were observed, with risks at both low and high levels. The Committee concluded that the prevalence of vitamin D inadequacy in North America has been overestimated. Urgent research and clinical priorities were identified, including reassessment of laboratory ranges for 25-hydroxyvitamin D, to avoid problems of both undertreatment and overtreatment. PMID:21118827

  12. Age related changes in fractional elimination pathways for drugs: assessing the impact of variable ontogeny on metabolic drug-drug interactions.

    PubMed

    Salem, Farzaneh; Johnson, Trevor N; Barter, Zoe E; Leeder, J Steven; Rostami-Hodjegan, Amin

    2013-08-01

    The magnitude of any metabolic drug-drug interactions (DDIs) depends on the fractional importance of the inhibited pathway, which may not necessarily be the same in young children when compared to adults. The ontogeny pattern of cytochrome P450 (CYP) enzymes (CYPs 1A2, 2B6, 2C8, 2C9, 2C18/19, 2D6, 2E1, 3A4) and renal function were analyzed systematically. Bootstrap methodology was used to account for variability and to define the age range over which statistical differences existed between each pair of specific pathways. A number of DDIs were simulated (Simcyp Pediatric v12) for virtual compounds to highlight the effects of age on fractional elimination and the consequent magnitude of DDI. For a theoretical drug metabolized 50% by each of the CYP2D6 and CYP3A4 pathways at birth, co-administration of ketoconazole (3 mg/kg) resulted in a 1.65-fold difference between inhibited versus uninhibited AUC, compared to 2.4-fold in 1-year-olds and 3.2-fold in adults. Conversely, neonates could be more sensitive to DDI than adults in certain scenarios. Thus, extrapolation from adult data may not be applicable across all pediatric age groups. The use of pediatric physiologically based pharmacokinetic (p-PBPK) models may offer an interim solution to uncovering potential periods of vulnerability to DDI where there are no existing clinical data derived from children. © The Author(s) 2013.
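The way interaction magnitude scales with the fraction of clearance through the inhibited pathway can be sketched with the standard static AUC-ratio model. The abstract's results come from full p-PBPK simulation in Simcyp; the function below, and the inhibitor concentration and Ki values, are purely illustrative.

```python
def auc_ratio(fm_inhibited, inhibitor_conc, ki):
    """Static-model fold-change in AUC when one elimination pathway
    is competitively inhibited.

    fm_inhibited: fraction of total clearance via the inhibited enzyme
    inhibitor_conc, ki: inhibitor concentration and inhibition constant
    (same units). All other pathways are assumed unaffected.
    """
    residual = fm_inhibited / (1.0 + inhibitor_conc / ki)
    return 1.0 / (residual + (1.0 - fm_inhibited))

# With 50% of clearance via the inhibited enzyme, even complete
# inhibition (I >> Ki) can at most double the AUC ...
print(round(auc_ratio(0.5, 1000.0, 1.0), 2))  # 2.0
# ... whereas a 90% fractional pathway gives a ~10-fold interaction
print(round(auc_ratio(0.9, 1000.0, 1.0), 2))  # 9.91
```

This is why the same inhibitor can produce a 1.65-fold change in neonates but a 3.2-fold change in adults: the fractional importance of the inhibited pathway changes with ontogeny.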

  13. Beyond the floor effect on the Wechsler Intelligence Scale for Children--4th Ed. (WISC-IV): calculating IQ and Indexes of subjects presenting a floored pattern of results.

    PubMed

    Orsini, A; Pezzuti, L; Hulbert, S

    2015-05-01

    It is now widely known that children with severe intellectual disability show a 'floor effect' on the Wechsler scales. This effect emerges because the practice of transforming raw scores into scaled scores eliminates any variability present in participants with low intellectual ability, and because intelligence quotient (IQ) scores are limited insofar as they do not measure scores lower than 40. Following Hessl et al.'s results, the present authors propose a method for the computation of the Wechsler Intelligence Scale for Children--4th Ed. (WISC-IV)'s IQ and Indexes in intellectually disabled participants affected by a floored pattern of results. The Italian standardization sample (n = 2200) for the WISC-IV was used. The method presented in this study highlights the limits of the 'floor effect' of the WISC-IV in children with serious intellectual disability who present a profile with weighted scores of 1 in all the subtests despite some variability in the raw scores. This method eliminates the floor effect of the scale and therefore makes it possible to analyse the strengths and weaknesses of the WISC-IV's Indexes in these participants. The authors reflect on the clinical utility of this method and on the meaning of a raw score of 0 on a subtest. © 2014 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.

  14. In-situ implant containing PCL-curcumin nanoparticles developed using design of experiments.

    PubMed

    Kasinathan, Narayanan; Amirthalingam, Muthukumar; Reddy, Neetinkumar D; Jagani, Hitesh V; Volety, Subrahmanyam M; Rao, Josyula Venkata

    2016-01-01

    Polymeric delivery systems are useful in reducing the pharmacokinetic limitations, viz. poor absorption and rapid elimination, associated with the clinical use of curcumin. Design of experiments is a precise and cost-effective tool for analyzing the effect of independent variables and their interactions on product attributes. The objective was to evaluate the effect of process variables involved in the preparation of curcumin-loaded polycaprolactone (PCL) nanoparticles (CPN). In the present experiment, CPNs were prepared by an emulsification solvent evaporation technique. The effect of the independent variables on the dependent variable was analyzed using design of experiments. Anticancer activity of CPN was studied using an Ehrlich ascites carcinoma (EAC) model. An in-situ implant was developed using PLGA as the polymer. The effect of the independent variables was studied in two stages. First, the effect of drug-polymer ratio, homogenization speed and surfactant concentration on size was studied using a factorial design. The interaction of homogenization speed with homogenization time on the mean particle size of CPN was then evaluated using a central composite design. In the second stage, the effect of these variables (under the conditions optimized for producing particles <500 nm) on percentage drug encapsulation was evaluated using a factorial design. CPN prepared under optimized conditions were able to control the development of EAC in Swiss albino mice and enhanced their survival time. The PLGA-based in-situ implant containing CPN prepared under optimized conditions showed sustained drug release. This implant could be further evaluated for pharmacological activities.
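The first-stage factorial screening can be sketched as a coded two-level, three-factor design with main effects computed as contrast averages. The factor names follow the abstract; the particle-size responses are hypothetical, invented only to show the arithmetic.

```python
from itertools import product

# Coded levels (-1/+1) for three factors: drug-polymer ratio,
# homogenization speed, surfactant concentration
runs = list(product([-1, 1], repeat=3))
# Hypothetical mean particle sizes (nm) for the 8 runs, in run order
sizes = [620, 540, 480, 410, 590, 515, 450, 385]

def main_effect(factor_idx):
    """Average response at the +1 level minus average at the -1 level."""
    high = [y for x, y in zip(runs, sizes) if x[factor_idx] == 1]
    low = [y for x, y in zip(runs, sizes) if x[factor_idx] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effects = {name: main_effect(i) for i, name in
           enumerate(["ratio", "speed", "surfactant"])}
print(effects)  # with these data, speed has the largest (negative) effect
```

A negative effect means raising the factor shrinks the particles, which is how a screening design ranks which process variables matter before a central composite design maps their interactions.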

  15. The mechanical and chemical equations of motion of muscle contraction

    NASA Astrophysics Data System (ADS)

    Shiner, J. S.; Sieniutycz, Stanislaw

    1997-11-01

    Up to now no formulation of muscle contraction has provided both the chemical kinetic equations for the reactions responsible for the contraction and the mechanical equation of motion for the muscle. This has most likely been due to the lack of general formalisms for nonlinear systems with chemical-nonchemical coupling valid under the far from equilibrium conditions under which muscle operates physiologically. We have recently developed such formalisms and apply them here to the formulation of muscle contraction to obtain both the chemical and the mechanical equations. The standard formulation up to now has yielded only the dynamic equations for the chemical variables and has considered these to be functions of both time and an appropriate mechanical variable. The macroscopically observable quantities were then obtained by averaging over the mechanical variable. When attempting to derive the dynamic equations for both the chemistry and the mechanics, this choice of variables leads to conflicting results for the mechanical equation of motion when two different general formalisms are applied. The conflict can be resolved by choosing the variables such that both the chemical variables and the mechanical variables are considered to be functions of time alone. This adds one equation to the set of differential equations to be solved but is actually a simplification of the problem: the resulting equations are ordinary differential equations rather than the partial differential equations of the now-standard formulation, and because the variables themselves are the macroscopic observables, the procedure of averaging over the mechanical variable is eliminated. Furthermore, the parameters occurring in the equations at this level of description should be accessible to direct experimental determination.

  16. Loop Heat Pipe with Thermal Control Valve as a Variable Thermal Link

    NASA Technical Reports Server (NTRS)

    Hartenstine, John; Anderson, William G.; Walker, Kara; Dussinger, Pete

    2012-01-01

    Future lunar landers and rovers will require variable thermal links that allow for heat rejection during the lunar daytime and passively prevent heat rejection during the lunar night. During the lunar day, the thermal management system must reject the waste heat from the electronics and batteries to maintain them below the maximum acceptable temperature. During the lunar night, the heat rejection system must either be shut down or significant amounts of guard heat must be added to keep the electronics and batteries above the minimum acceptable temperature. Because guard heater power adds to system size and complexity, a variable thermal link is preferred to limit heat removal from the electronics and batteries during the long lunar night. Conventional loop heat pipes (LHPs) can provide the required variable thermal conductance, but they still consume electrical power to shut down the heat transfer. This innovation adds a thermal control valve (TCV) and a bypass line to a conventional LHP that proportionally allows vapor to flow back into the compensation chamber of the LHP. The addition of this valve can achieve completely passive thermal control of the LHP, eliminating the need for guard heaters and complex controls.

  17. Balancing Europe's wind power output through spatial deployment informed by weather regimes.

    PubMed

    Grams, Christian M; Beerli, Remo; Pfenninger, Stefan; Staffell, Iain; Wernli, Heini

    2017-08-01

    As wind and solar power provide a growing share of Europe's electricity [1], understanding and accommodating their variability on multiple timescales remains a critical problem. On weekly timescales, variability is related to long-lasting weather conditions, called weather regimes [2-5], which can cause lulls with a loss of wind power across neighbouring countries [6]. Here we show that weather regimes provide a meteorological explanation for multi-day fluctuations in Europe's wind power and can help guide new deployment pathways which minimise this variability. Mean generation during different regimes currently ranges from 22 GW to 44 GW and is expected to triple by 2030 with current planning strategies. However, balancing future wind capacity across regions with contrasting inter-regime behaviour (specifically, deploying in the Balkans instead of the North Sea) would almost eliminate these output variations, maintain mean generation, and increase fleet-wide minimum output. Solar photovoltaics could balance low-wind regimes locally, but only by expanding current capacity tenfold. New deployment strategies based on an understanding of continent-scale wind patterns and pan-European collaboration could enable a high share of wind energy whilst minimising the negative impacts of output variability.

  18. Maintenance of Genetic Variability under Strong Stabilizing Selection: A Two-Locus Model

    PubMed Central

    Gavrilets, S.; Hastings, A.

    1993-01-01

    We study a two locus model with additive contributions to the phenotype to explore the relationship between stabilizing selection and recombination. We show that if the double heterozygote has the optimum phenotype and the contributions of the loci to the trait are different, then any symmetric stabilizing selection fitness function can maintain genetic variability provided selection is sufficiently strong relative to linkage. We present results of a detailed analysis of the quadratic fitness function which show that selection need not be extremely strong relative to recombination for the polymorphic equilibria to be stable. At these polymorphic equilibria the mean value of the trait, in general, is not equal to the optimum phenotype, there exists a large level of negative linkage disequilibrium which "hides" additive genetic variance, and different equilibria can be stable simultaneously. We analyze dependence of different characteristics of these equilibria on the location of optimum phenotype, on the difference in allelic effect, and on the strength of selection relative to recombination. Our overall result that stabilizing selection does not necessarily eliminate genetic variability is compatible with some experimental results where the lines subject to strong stabilizing selection did not have significant reductions in genetic variability. PMID:8514145

  19. Rapid-cadence optical monitoring for short-period variability of ɛ Aurigae

    NASA Astrophysics Data System (ADS)

    Billings, Gary

    2013-07-01

    ɛ Aurigae was observed with CCD cameras and 35 mm SLR camera lenses, at rapid cadence (>1/minute), for long runs (up to 11 hours), on multiple occasions during 2009 - 2011, to monitor for variability of the system at scales of minutes to hours. The lens and camera were changed during the period to improve results, finalizing on a 135 mm focal length Canon f/2 lens (at f/2.8), an ND8 neutral density filter, a Johnson V filter, and an SBIG ST-8XME camera (Kodak KAF-1603ME microlensed chip). Differential photometry was attempted, but because of the large separation between the variable and comparison star (η Aur), noise caused by transient extinction variations was not consistently eliminated. The lowest-noise time series for searching for short-period variability proved to be the extinction-corrected instrumental magnitude of ɛ Aur obtained on "photometric nights", with η Aur used to determine and monitor the extinction coefficient for the night. No flares or short-period variations of ɛ Aur were detected by visual inspection of the light curves from observing runs with noise levels as low as 0.008 magnitudes rms.
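The extinction correction described in this record follows the usual relation m0 = m_inst - kX, with the nightly coefficient k determined from a constant comparison star. A minimal sketch of that procedure, with hypothetical measurements standing in for the real data:

```python
import math

def airmass(zenith_angle_deg):
    """Plane-parallel approximation X = sec(z); adequate below z ~ 60 deg."""
    return 1.0 / math.cos(math.radians(zenith_angle_deg))

def extinction_coefficient(inst_mags, airmasses):
    """Least-squares slope of instrumental magnitude versus airmass,
    measured on a constant comparison star (eta Aur in this record)."""
    n = len(inst_mags)
    mx = sum(airmasses) / n
    my = sum(inst_mags) / n
    num = sum((x - mx) * (y - my) for x, y in zip(airmasses, inst_mags))
    den = sum((x - mx) ** 2 for x in airmasses)
    return num / den

def corrected_mag(m_inst, k, x):
    """Extinction-corrected (above-atmosphere) instrumental magnitude."""
    return m_inst - k * x

# Hypothetical comparison-star measurements with k = 0.25 mag/airmass
xs = [1.0, 1.2, 1.5, 2.0]
ms = [3.40 + 0.25 * x for x in xs]
k = extinction_coefficient(ms, xs)
print(round(k, 2))  # 0.25
```

On a photometric night the same k then corrects the target star's instrumental magnitudes, which is why the extinction-corrected series was quieter than differential photometry against a widely separated comparison star.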

  20. Stock discrimination of spottedtail goby ( Synechogobius ommaturus) in the Yellow Sea by analysis of otolith shape

    NASA Astrophysics Data System (ADS)

    Wang, Yingjun; Ye, Zhenjiang; Liu, Qun; Cao, Liang

    2011-01-01

    Otolith shape is species specific and is an ideal marker of fish population affiliation. In this study, otolith shape of spottedtail goby Synechogobius ommaturus is used to identify stocks in different spawning locations in the Yellow Sea. The main objectives of this study are to explore the potential existence of local stocks of spottedtail goby in the Yellow Sea by analysis of otolith shape, and to investigate ambient impacts on otolith shape. Spottedtail goby was sampled at five locations in the Yellow Sea in 2007 and 2008. Otoliths are described using variables correlated to size (otolith area, perimeter, length, width, and weight) and shape (rectangularity, circularity, and 20 Fourier harmonics). Only standardized otolith variables are used so that the effect of otolith size on the shape variables is eliminated. There are no significant differences in the otolith variables by sex, year, or side (left vs. right). However, the otolith shapes of the spring stocks and the autumn stocks differ significantly. Otolith shape differences are greater among locations than between years. Correct classification rates of spottedtail goby based on otolith shape at the different sampling locations range from 29.7% to 77.4%.
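Size standardization of shape variables, as used in this record, is commonly done by removing the linear dependence of each shape variable on otolith size and keeping the residuals. The abstract does not specify its exact procedure, so the sketch below, with made-up numbers, shows only one common choice.

```python
def size_corrected(shape_vals, size_vals):
    """Residuals of a shape variable after ordinary least-squares
    regression on otolith size; residuals are uncorrelated with size."""
    n = len(shape_vals)
    mx = sum(size_vals) / n
    my = sum(shape_vals) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(size_vals, shape_vals))
             / sum((x - mx) ** 2 for x in size_vals))
    return [y - slope * (x - mx) - my
            for x, y in zip(size_vals, shape_vals)]

# Hypothetical shape values that grow linearly with otolith size
res = size_corrected([2.1, 4.0, 6.2, 7.9], [1.0, 2.0, 3.0, 4.0])
```

After this correction, any remaining differences among samples reflect shape rather than fish size, which is the property needed before comparing stocks across locations.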
