Sample records for method-based analyses

  1. A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades

    DTIC Science & Technology

    2016-09-01

    UNCLASSIFIED A Review of Sparsity-Based Methods for Analysing Radar Returns from Helicopter Rotor Blades Ngoc Hung Nguyen, Hai-Tan Tran, Kutluyıl...TR–3292 ABSTRACT Radar imaging of rotating blade-like objects, such as helicopter rotors, using narrowband radar has lately been of significant...Methods for Analysing Radar Returns from Helicopter Rotor Blades Executive Summary Signal analysis and radar imaging of fast-rotating objects such as

  2. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in the microarray data, which usually contain more than 5% missing values with up to 90% of genes affected. Inaccurate missing value estimation results in reducing the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods in many testing microarray datasets. To further improve the performances of the regression-based methods, we propose shrinkage regression-based methods. Our methods take the advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. Besides, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods can provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
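
    The abstract above describes the estimator only in words; the following Python snippet is a hedged, minimal illustration of that general recipe (select the k genes most correlated with the target, fit ordinary least squares on the observed samples, shrink the slopes, predict the missing entries). Function and parameter names, and the fixed shrinkage factor, are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def impute_missing(expr, target_row, missing_cols, k=10, shrinkage=0.9):
          """Shrinkage-regression imputation for one gene (row), as a sketch.

          expr         : genes x samples matrix; predictor rows are assumed complete
          target_row   : index of the gene with missing entries
          missing_cols : sample indices that are missing for that gene
          k            : number of most-correlated genes used as predictors
          shrinkage    : factor in (0, 1] applied to the least-squares slopes
          """
          observed_cols = [j for j in range(expr.shape[1]) if j not in set(missing_cols)]
          y = expr[target_row, observed_cols]

          # 1) rank candidate genes by |Pearson correlation| with the target gene
          candidates = [i for i in range(expr.shape[0]) if i != target_row]
          cors = [abs(np.corrcoef(expr[i, observed_cols], y)[0, 1]) for i in candidates]
          neighbours = [candidates[i] for i in np.argsort(cors)[::-1][:k]]

          # 2) ordinary least squares on the observed samples
          X = np.column_stack([np.ones(len(observed_cols)), expr[neighbours][:, observed_cols].T])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)

          # 3) shrink the regression coefficients towards zero (keep the intercept)
          beta[1:] *= shrinkage

          # 4) predict the missing entries from the neighbour genes
          X_miss = np.column_stack([np.ones(len(missing_cols)), expr[neighbours][:, missing_cols].T])
          return X_miss @ beta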

  3. Review of Statistical Methods for Analysing Healthcare Resources and Costs

    PubMed Central

    Mihaylova, Borislava; Briggs, Andrew; O'Hagan, Anthony; Thompson, Simon G

    2011-01-01

    We review statistical methods for analysing healthcare resource use and costs, their ability to address skewness, excess zeros, multimodality and heavy right tails, and their ease for general use. We aim to provide guidance on analysing resource use and costs focusing on randomised trials, although methods often have wider applicability. Twelve broad categories of methods were identified: (I) methods based on the normal distribution, (II) methods following transformation of data, (III) single-distribution generalized linear models (GLMs), (IV) parametric models based on skewed distributions outside the GLM family, (V) models based on mixtures of parametric distributions, (VI) two (or multi)-part and Tobit models, (VII) survival methods, (VIII) non-parametric methods, (IX) methods based on truncation or trimming of data, (X) data components models, (XI) methods based on averaging across models, and (XII) Markov chain methods. Based on this review, our recommendations are that, first, simple methods are preferred in large samples where the near-normality of sample means is assured. Second, in somewhat smaller samples, relatively simple methods, able to deal with one or two of above data characteristics, may be preferable but checking sensitivity to assumptions is necessary. Finally, some more complex methods hold promise, but are relatively untried; their implementation requires substantial expertise and they are not currently recommended for wider applied work. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20799344
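
    As a hedged illustration of category (III) above, the snippet below fits a single-distribution generalized linear model with a Gamma family and log link to simulated, right-skewed cost data using Python's statsmodels. The variable names and simulated values are placeholders, not data or code from the review.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 400
      arm = rng.integers(0, 2, n)                    # 0 = control, 1 = treatment
      mu = np.exp(7.0 + 0.3 * arm)                   # mean cost, log scale
      cost = rng.gamma(shape=2.0, scale=mu / 2.0)    # skewed, strictly positive costs
      df = pd.DataFrame({"cost": cost, "arm": arm})

      # Gamma GLM with log link: coefficients are interpretable as log cost ratios
      model = smf.glm("cost ~ arm", data=df,
                      family=sm.families.Gamma(link=sm.families.links.Log()))
      print(model.fit().summary())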

  4. Environmental life cycle assessment of methanol and electricity co-production system based on coal gasification technology.

    PubMed

    Śliwińska, Anna; Burchart-Korol, Dorota; Smoliński, Adam

    2017-01-01

    This paper presents a life cycle assessment (LCA) of greenhouse gas emissions generated through methanol and electricity co-production system based on coal gasification technology. The analysis focuses on polygeneration technologies from which two products are produced, and thus, issues related to an allocation procedure for LCA are addressed in this paper. In the LCA, two methods were used: a 'system expansion' method based on two approaches, the 'avoided burdens approach' and 'direct system enlargement' methods and an 'allocation' method involving proportional partitioning based on physical relationships in a technological process. Cause-effect relationships in the analysed production process were identified, allowing for the identification of allocation factors. The 'system expansion' method involved expanding the analysis to include five additional variants of electricity production technologies in Poland (alternative technologies). This method revealed environmental consequences of implementation for the analysed technologies. It was found that the LCA of polygeneration technologies based on the 'system expansion' method generated a more complete source of information on environmental consequences than the 'allocation' method. The analysis shows that alternative technologies chosen for generating LCA results are crucial. Life cycle assessment was performed for the analysed, reference and variant alternative technologies. Comparative analysis was performed between the analysed technologies of methanol and electricity co-production from coal gasification as well as a reference technology of methanol production from the natural gas reforming process. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Comparing Distance vs. Campus-Based Delivery of Research Methods Courses

    ERIC Educational Resources Information Center

    Girod, Mark; Wojcikiewicz, Steve

    2009-01-01

    A causal-comparative pre-test, post-test design was used to investigate differences in learning in a research methods course for face-to-face and web-based delivery models. Analyses of participant achievement (N = 205) revealed almost no differences but post-hoc analyses revealed important differences in pedagogy between delivery models despite…

  6. Sarma-based key-group method for rock slope reliability analyses

    NASA Astrophysics Data System (ADS)

    Yarahmadi Bafghi, A. R.; Verdel, T.

    2005-08-01

    The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsible blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We will discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We will then show how such reliability analyses can be introduced into SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
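
    The reliability analyses mentioned above (FOSM/ASM) reduce, in the simplest first-order second-moment case, to computing a reliability index from the mean and standard deviation of the factor of safety. The sketch below shows only that textbook step, under the assumption of a normally distributed factor of safety; it is not the SKGM/PSKGM implementation.

      from math import erf, sqrt

      def fosm_reliability(mean_fs, std_fs):
          """FOSM estimate: reliability index and failure probability for a
          normally distributed factor of safety FS (failure when FS < 1)."""
          beta = (mean_fs - 1.0) / std_fs                    # reliability index
          p_failure = 0.5 * (1.0 + erf(-beta / sqrt(2.0)))   # Phi(-beta)
          return beta, p_failure

      # Example: mean FS = 1.4 with a standard deviation of 0.2
      print(fosm_reliability(1.4, 0.2))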

  7. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  8. Methodology for processing pressure traces used as inputs for combustion analyses in diesel engines

    NASA Astrophysics Data System (ADS)

    Rašić, Davor; Vihar, Rok; Žvar Baškovič, Urban; Katrašnik, Tomaž

    2017-05-01

    This study proposes a novel methodology for designing an optimum equiripple finite impulse response (FIR) filter for processing in-cylinder pressure traces of a diesel internal combustion engine, which serve as inputs for high-precision combustion analyses. The proposed automated workflow is based on an innovative approach of determining the transition band frequencies and optimum filter order. The methodology is based on discrete Fourier transform analysis, which is the first step to estimate the location of the pass-band and stop-band frequencies. The second step uses short-time Fourier transform analysis to refine the estimated aforementioned frequencies. These pass-band and stop-band frequencies are further used to determine the most appropriate FIR filter order. The most widely used existing methods for estimating the FIR filter order are not effective in suppressing the oscillations in the rate-of-heat-release (ROHR) trace, thus hindering the accuracy of combustion analyses. To address this problem, an innovative method for determining the order of an FIR filter is proposed in this study. This method is based on the minimization of the integral of normalized signal-to-noise differences between the stop-band frequency and the Nyquist frequency. Developed filters were validated using spectral analysis and calculation of the ROHR. The validation results showed that the filters designed using the proposed innovative method were superior compared with those using the existing methods for all analyzed cases. Highlights:
    • Pressure traces of a diesel engine were processed by finite impulse response (FIR) filters with different orders
    • Transition band frequencies were determined with an innovative method based on discrete Fourier transform and short-time Fourier transform
    • Spectral analyses showed deficiencies of existing methods in determining the FIR filter order
    • A new method of determining the FIR filter order for processing pressure traces was proposed
    • The efficiency of the new method was demonstrated by spectral analyses and calculations of rate-of-heat-release traces
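
    As a rough companion to the abstract, the following Python sketch designs an equiripple (Parks-McClellan) low-pass FIR filter with scipy.signal.remez and applies it with zero-phase filtering. The sampling rate, transition-band edges and filter length are placeholder assumptions; the paper derives the band edges from DFT/STFT analysis and optimises the order, which is not reproduced here.

      import numpy as np
      from scipy.signal import remez, filtfilt

      fs = 90_000.0                       # sampling frequency, Hz (assumed)
      f_pass, f_stop = 3_000.0, 5_000.0   # transition band edges, Hz (assumed)
      numtaps = 101                       # filter length (assumed; the paper optimises this)

      # Equiripple low-pass design: unit gain up to f_pass, zero gain above f_stop
      taps = remez(numtaps, [0, f_pass, f_stop, fs / 2], [1, 0], fs=fs)

      # Zero-phase filtering avoids shifting the pressure trace in crank angle
      pressure = np.random.default_rng(0).normal(size=4096)  # stand-in signal
      smoothed = filtfilt(taps, [1.0], pressure)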

  9. [Development of selective determination methods for quinones with fluorescence and chemiluminescence detection and their application to environmental and biological samples].

    PubMed

    Kishikawa, Naoya

    2010-10-01

    Quinones are compounds that have various characteristics such as a biological electron transporter, an industrial product and a harmful environmental pollutant. Therefore, an effective determination method for quinones is required in many fields. This review describes the development of sensitive and selective determination methods for quinones based on some detection principles and their application to analyses in environmental, pharmaceutical and biological samples. Firstly, a fluorescence method was developed based on fluorogenic derivatization of quinones and applied to environmental analysis. Secondly, a luminol chemiluminescence method was developed based on generation of reactive oxygen species through the redox cycle of quinone and applied to pharmaceutical analysis. Thirdly, a photo-induced chemiluminescence method was developed based on formation of reactive oxygen species and fluorophore or chemiluminescence enhancer by the photoreaction of quinones and applied to biological and environmental analyses.

  10. Effects of different preservation methods on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) molecular markers in botanic samples.

    PubMed

    Wang, Xiaolong; Li, Lin; Zhao, Jiaxin; Li, Fangliang; Guo, Wei; Chen, Xia

    2017-04-01

    To evaluate the effects of different preservation methods (stored in a -20°C ice chest, preserved in liquid nitrogen and dried in silica gel) on inter simple sequence repeat (ISSR) or random amplified polymorphic DNA (RAPD) analyses in various botanical specimens (including broad-leaved plants, needle-leaved plants and succulent plants) for different times (three weeks and three years), we used a statistical analysis based on the number of bands, genetic index and cluster analysis. The results demonstrate that the methods used to preserve samples can provide sufficient amounts of genomic DNA for ISSR and RAPD analyses; however, the effects of different preservation methods on these analyses vary significantly, and the preservation time has little effect on these analyses. Our results provide a reference for researchers to select the most suitable preservation method, depending on their study subject, for the analysis of molecular markers based on genomic DNA. Copyright © 2017 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.

  11. Analyses of Crime Patterns in NIBRS Data Based on a Novel Graph Theory Clustering Method: Virginia as a Case Study

    PubMed Central

    Nolan, Jim

    2014-01-01

    This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS) data, which include the determination of correlation of different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested by using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some different crime types are correlated and some different crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in their area. PMID:24778585

  12. Application of the Gini correlation coefficient to infer regulatory relationships in transcriptome analysis.

    PubMed

    Ma, Chuang; Wang, Xiangfeng

    2012-09-01

    One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey's biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses.
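
    For readers unfamiliar with the statistic, one standard form of the Gini correlation is cov(x, rank(y)) / cov(x, rank(x)), which is asymmetric in its arguments. The Python sketch below computes that form directly; it is a minimal stand-in for illustration and not the authors' rsgcc R package.

      import numpy as np
      from scipy.stats import rankdata

      def gini_correlation(x, y):
          """Gini correlation cov(x, rank(y)) / cov(x, rank(x)); note the asymmetry."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          rx, ry = rankdata(x), rankdata(y)
          return np.cov(x, ry)[0, 1] / np.cov(x, rx)[0, 1]

      # Two noisy, monotonically related expression profiles (stand-in data)
      rng = np.random.default_rng(1)
      a = rng.gamma(shape=2.0, size=200)
      b = np.exp(0.8 * np.log(a + 1.0)) + rng.normal(scale=0.3, size=200)
      print(gini_correlation(a, b), gini_correlation(b, a))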

  13. Application of the Gini Correlation Coefficient to Infer Regulatory Relationships in Transcriptome Analysis

    PubMed Central

    Ma, Chuang; Wang, Xiangfeng

    2012-01-01

    One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey’s biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses. PMID:22797655

  14. mRNA-Based Parallel Detection of Active Methanotroph Populations by Use of a Diagnostic Microarray

    PubMed Central

    Bodrossy, Levente; Stralis-Pavese, Nancy; Konrad-Köszler, Marianne; Weilharter, Alexandra; Reichenauer, Thomas G.; Schöfer, David; Sessitsch, Angela

    2006-01-01

    A method was developed for the mRNA-based application of microbial diagnostic microarrays to detect active microbial populations. DNA- and mRNA-based analyses of environmental samples were compared and confirmed via quantitative PCR. Results indicated that mRNA-based microarray analyses may provide additional information on the composition and functioning of microbial communities. PMID:16461725

  15. A survey of functional behavior assessment methods used by behavior analysts in practice.

    PubMed

    Oliver, Anthony C; Pratt, Leigh A; Normand, Matthew P

    2015-12-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially descriptive assessments. Moreover, the data suggest that the majority of students are being formally taught about the various FBA methods and that educators are emphasizing the range of FBA methods in their teaching. However, less than half of the respondents reported using functional analyses in practice, although many considered descriptive assessments and functional analyses to be the most useful FBA methods. Most respondents reported using informant and descriptive assessments more frequently than functional analyses, and a majority of respondents indicated that they "never" or "almost never" used functional analyses to identify the function of behavior. © Society for the Experimental Analysis of Behavior.

  16. A need for determination of arsenic species at low levels in cereal-based food and infant cereals. Validation of a method by IC-ICPMS.

    PubMed

    Llorente-Mirandes, Toni; Calderón, Josep; Centrich, Francesc; Rubio, Roser; López-Sánchez, José Fermín

    2014-03-15

    The present study arose from the need to determine inorganic arsenic (iAs) at low levels in cereal-based food. Validated methods with a low limit of detection (LOD) are required to analyse these kinds of food. An analytical method for the determination of iAs, methylarsonic acid (MA) and dimethylarsinic acid (DMA) in cereal-based food and infant cereals is reported. The method was optimised and validated to achieve low LODs. Ion chromatography-inductively coupled plasma mass spectrometry (IC-ICPMS) was used for arsenic speciation. The main quality parameters were established. To expand the applicability of the method, different cereal products were analysed: bread, biscuits, breakfast cereals, wheat flour, corn snacks, pasta and infant cereals. The total and inorganic arsenic content of 29 cereal-based food samples ranged between 3.7-35.6 and 3.1-26.0 μg As kg(-1), respectively. The present method could be considered a valuable tool for assessing inorganic arsenic contents in cereal-based foods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Genes with minimal phylogenetic information are problematic for coalescent analyses when gene tree estimation is biased.

    PubMed

    Xi, Zhenxiang; Liu, Liang; Davis, Charles C

    2015-11-01

    The development and application of coalescent methods are undergoing rapid changes. One little explored area that bears on the application of gene-tree-based coalescent methods to species tree estimation is gene informativeness. Here, we investigate the accuracy of these coalescent methods when genes have minimal phylogenetic information, including the implementation of the multilocus bootstrap approach. Using simulated DNA sequences, we demonstrate that genes with minimal phylogenetic information can produce unreliable gene trees (i.e., high error in gene tree estimation), which may in turn reduce the accuracy of species tree estimation using gene-tree-based coalescent methods. We demonstrate that this problem can be alleviated by sampling more genes, as is commonly done in large-scale phylogenomic analyses. This applies even when these genes are minimally informative. If gene tree estimation is biased, however, gene-tree-based coalescent analyses will produce inconsistent results, which cannot be remedied by increasing the number of genes. In this case, it is not the gene-tree-based coalescent methods that are flawed, but rather the input data (i.e., estimated gene trees). Along these lines, the commonly used program PhyML has a tendency to infer one particular bifurcating topology even though it is best represented as a polytomy. We additionally corroborate these findings by analyzing the 183-locus mammal data set assembled by McCormack et al. (2012) using ultra-conserved elements (UCEs) and flanking DNA. Lastly, we demonstrate that when employing the multilocus bootstrap approach on this 183-locus data set, there is no strong conflict between species trees estimated from concatenation and gene-tree-based coalescent analyses, as has been previously suggested by Gatesy and Springer (2014). Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Partial spline models for the inclusion of tropopause and frontal boundary information in otherwise smooth two- and three-dimensional objective analysis

    NASA Technical Reports Server (NTRS)

    Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.

    1986-01-01

    A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.

  19. Determination of mycotoxins in plant-based beverages using QuEChERS and liquid chromatography-tandem mass spectrometry.

    PubMed

    Miró-Abella, Eugènia; Herrero, Pol; Canela, Núria; Arola, Lluís; Borrull, Francesc; Ras, Rosa; Fontanals, Núria

    2017-08-15

    A method was developed for the simultaneous determination of 11 mycotoxins in plant-based beverage matrices, using a QuEChERS extraction followed by ultra-high performance liquid chromatography coupled to tandem mass spectrometry detection (UHPLC-(ESI)MS/MS). This multi-mycotoxin method was applied to analyse plant-based beverages such as soy, oat and rice. QuEChERS extraction was applied, obtaining suitable extraction recoveries between 80 and 91% and good repeatability and reproducibility values. Method Quantification Limits were between 0.05 μg L(-1) (for aflatoxin G1 and aflatoxin B1) and 15 μg L(-1) (for deoxynivalenol and fumonisin B2). This is the first time that plant-based beverages have been analysed, and certain mycotoxins, such as deoxynivalenol, aflatoxin B1, aflatoxin B2, aflatoxin G1, aflatoxin G2, ochratoxin A, T-2 toxin and zearalenone, were found in the analysed samples, some of them quantified at between 0.1 μg L(-1) and 19 μg L(-1). Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. [A method for the analysis of overlapped peaks in the high performance liquid chromatogram based on spectrum analysis].

    PubMed

    Liu, Bao; Fan, Xiaoming; Huo, Shengnan; Zhou, Lili; Wang, Jun; Zhang, Hui; Hu, Mei; Zhu, Jianhua

    2011-12-01

    A method was established to analyse overlapped chromatographic peaks based on the chromatographic-spectral data detected by a diode-array ultraviolet detector. In the method, the three-dimensional data were first de-noised and normalized; secondly, the differences and clustering of the spectra at different time points were calculated; then the purity of the whole chromatographic peak was analysed and the region in which the spectra at different time points were stable was sought out. The feature spectra were extracted from the spectrum-stable region as the basic foundation. The non-negative least-squares method was chosen to separate the overlapped peaks and obtain the flow curve based on the feature spectra. The three-dimensional resolved chromatographic-spectral peaks could be obtained by matrix operations of the feature spectra with the flow curves. The results showed that this method could separate the overlapped peaks.
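
    The final separation step described above amounts to solving a non-negative least-squares problem at each time point: the observed spectrum is modelled as a non-negative combination of the extracted feature spectra. The Python sketch below shows only that step, with hypothetical array shapes; it is not the authors' code.

      import numpy as np
      from scipy.optimize import nnls

      def resolve_overlapped_peaks(data, feature_spectra):
          """data            : wavelengths x time points (diode-array detector matrix)
             feature_spectra : wavelengths x components, from the spectrum-stable regions
             returns         : components x time points, the non-negative elution profiles"""
          n_components = feature_spectra.shape[1]
          n_times = data.shape[1]
          profiles = np.zeros((n_components, n_times))
          for t in range(n_times):
              profiles[:, t], _ = nnls(feature_spectra, data[:, t])
          return profiles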

  1. Gene sequence analyses and other DNA-based methods for yeast species recognition

    USDA-ARS?s Scientific Manuscript database

    DNA sequence analyses, as well as other DNA-based methodologies, have transformed the way in which yeasts are identified. The focus of this chapter will be on the resolution of species using various types of DNA comparisons. In other chapters in this book, Rozpedowska, Piškur and Wolfe discuss mul...

  2. Identification of Active Bacterial Communities in Drinking Water Using 16S rRNA-Based Sequence Analyses

    EPA Science Inventory

    DNA-based methods have considerably increased our understanding of the bacterial diversity of water distribution systems (WDS). However, as DNA may persist after cell death, the use of DNA-based methods cannot be used to describe metabolically-active microbes. In contrast, intra...

  3. Reading between the Lines: Exploring Methods for Analysing Professional Examiner Feedback Discourse

    ERIC Educational Resources Information Center

    Johnson, Martin

    2017-01-01

    This paper uses the remote interactions of professional examiners working for a UK-based awarding body as a vehicle for discussing the benefits of the use of different methods for analysing such discourse. Communication is an area of interest for sociocultural theory because it can potentiate cognitive shifts in participants and affords learning.…

  4. A stepwedge-based method for measuring breast density: observer variability and comparison with human reading

    NASA Astrophysics Data System (ADS)

    Diffey, Jenny; Berks, Michael; Hufton, Alan; Chung, Camilla; Verow, Rosanne; Morrison, Joanna; Wilson, Mary; Boggis, Caroline; Morris, Julie; Maxwell, Anthony; Astley, Susan

    2010-04-01

    Breast density is positively linked to the risk of developing breast cancer. We have developed a semi-automated, stepwedge-based method that has been applied to the mammograms of 1,289 women in the UK breast screening programme to measure breast density by volume and area. 116 images were analysed by three independent operators to assess inter-observer variability; 24 of these were analysed on 10 separate occasions by the same operator to determine intra-observer variability. 168 separate images were analysed using the stepwedge method and by two radiologists who independently estimated percentage breast density by area. There was little intra-observer variability in the stepwedge method (average coefficients of variation 3.49% - 5.73%). There were significant differences in the volumes of glandular tissue obtained by the three operators. This was attributed to variations in the operators' definition of the breast edge. For fatty and dense breasts, there was good correlation between breast density assessed by the stepwedge method and the radiologists. This was also observed between radiologists, despite significant inter-observer variation. Based on analysis of thresholds used in the stepwedge method, radiologists' definition of a dense pixel is one in which the percentage of glandular tissue is between 10 and 20% of the total thickness of tissue.

  5. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    PubMed

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.
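
    The model described above (repeated measurements nested within individuals, with random intercepts and slopes) is fitted in the paper with the SPSS MIXED procedure; as a hedged, language-neutral illustration only, the same growth-curve structure can be expressed with statsmodels in Python. The simulated data and column names below are placeholders, not Project P.A.T.H.S. data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(42)
      n_subj, n_wave = 100, 6
      subj = np.repeat(np.arange(n_subj), n_wave)
      wave = np.tile(np.arange(n_wave), n_subj)
      u0 = rng.normal(0, 1.0, n_subj)     # subject-specific intercepts
      u1 = rng.normal(0, 0.3, n_subj)     # subject-specific slopes
      score = 10 + 0.5 * wave + u0[subj] + u1[subj] * wave + rng.normal(0, 1, n_subj * n_wave)
      df = pd.DataFrame({"subject": subj, "wave": wave, "score": score})

      # Linear mixed model with a random intercept and slope per subject
      model = smf.mixedlm("score ~ wave", df, groups=df["subject"], re_formula="~wave")
      print(model.fit().summary())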

  6. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    PubMed Central

    Shek, Daniel T. L.; Ma, Cecilia M. S.

    2011-01-01

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented. PMID:21218263

  7. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
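
    One of the metrics named above, the levelized cost of energy, is simply discounted lifetime cost divided by discounted lifetime energy output. A small worked sketch follows, with placeholder numbers for a hypothetical 1 kW system.

      def levelized_cost_of_energy(capital_cost, annual_cost, annual_energy_kwh,
                                   discount_rate, lifetime_years):
          """LCOE = discounted lifetime costs / discounted lifetime energy (cost per kWh)."""
          disc_costs = capital_cost + sum(
              annual_cost / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
          disc_energy = sum(
              annual_energy_kwh / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
          return disc_costs / disc_energy

      print(levelized_cost_of_energy(capital_cost=2000.0, annual_cost=20.0,
                                     annual_energy_kwh=1500.0,
                                     discount_rate=0.05, lifetime_years=25))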

  8. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    PubMed

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on a univariate and a multivariate approach and the analytical performances of the obtained models were compared. Moreover, on one hand, in-line data were correlated with the real API concentration present in the sample quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopoeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will be within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its ability to be used in routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. A Web service substitution method based on service cluster nets

    NASA Astrophysics Data System (ADS)

    Du, YuYue; Gai, JunJing; Zhou, MengChu

    2017-11-01

    Service substitution is an important research topic in the fields of Web services and service-oriented computing. This work presents a novel method to analyse and substitute Web services. A new concept, called a Service Cluster Net Unit, is proposed based on Web service clusters. A service cluster is converted into a Service Cluster Net Unit. Then it is used to analyse whether the services in the cluster can satisfy some service requests. Meanwhile, the substitution methods of an atomic service and a composite service are proposed. The correctness of the proposed method is proved, and the effectiveness is shown and compared with the state-of-the-art method via an experiment. It can be readily applied to e-commerce service substitution to meet the business automation needs.

  10. Experimental Quasi-Microwave Whole-Body Averaged SAR Estimation Method Using Cylindrical-External Field Scanning

    NASA Astrophysics Data System (ADS)

    Kawamura, Yoshifumi; Hikage, Takashi; Nojima, Toshio

    The aim of this study is to develop a new whole-body averaged specific absorption rate (SAR) estimation method based on the external-cylindrical field scanning technique. This technique is adopted with the goal of simplifying the dosimetry estimation of human phantoms that have different postures or sizes. An experimental scaled model system is constructed. In order to examine the validity of the proposed method for realistic human models, we discuss the pros and cons of measurements and numerical analyses based on the finite-difference time-domain (FDTD) method. We consider anatomical European human phantoms and plane-wave exposure in the 2-GHz mobile phone frequency band. The measured whole-body averaged SAR results obtained by the proposed method are compared with the results of the FDTD analyses.

  11. A voxel-based approach to gray matter asymmetries.

    PubMed

    Luders, E; Gaser, C; Jancke, L; Schlaug, G

    2004-06-01

    Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.

  12. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
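
    The study runs PAM, FANNY, AGNES and DIANA in R and compares solutions with the Dunn and Davies-Bouldin indices. As a hedged analogue only, the Python sketch below applies agglomerative clustering (an AGNES-like method) to placeholder concept-map coordinates and scores candidate cluster counts with both indices; it does not reproduce the R cluster package workflow.

      import numpy as np
      from sklearn.cluster import AgglomerativeClustering
      from sklearn.metrics import davies_bouldin_score, pairwise_distances

      def dunn_index(X, labels):
          """Smallest between-cluster distance divided by largest cluster diameter."""
          d = pairwise_distances(X)
          clusters = np.unique(labels)
          min_between, max_diameter = np.inf, 0.0
          for a in clusters:
              ia = labels == a
              max_diameter = max(max_diameter, d[np.ix_(ia, ia)].max())
              for b in clusters:
                  if a < b:
                      min_between = min(min_between, d[np.ix_(ia, labels == b)].min())
          return min_between / max_diameter

      # Higher Dunn and lower Davies-Bouldin values suggest a better partition
      X = np.random.default_rng(0).normal(size=(80, 2))    # stand-in point coordinates
      for k in range(2, 7):
          labels = AgglomerativeClustering(n_clusters=k).fit_predict(X)
          print(k, round(dunn_index(X, labels), 3), round(davies_bouldin_score(X, labels), 3))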

  13. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning.

    PubMed

    Davatzikos, Christos

    2016-10-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. Copyright © 2016. Published by Elsevier B.V.

  14. Computational neuroanatomy using brain deformations: From brain parcellation to multivariate pattern analysis and machine learning

    PubMed Central

    Davatzikos, Christos

    2017-01-01

    The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. PMID:27514582

  15. Wavelet bases on the L-shaped domain

    NASA Astrophysics Data System (ADS)

    Jouini, Abdellatif; Lemarié-Rieusset, Pierre Gilles

    2013-07-01

    We present in this paper two elementary constructions of multiresolution analyses on the L-shaped domain D. In the first one, we shall describe a direct method to define an orthonormal multiresolution analysis. In the second one, we use the decomposition method for constructing a biorthogonal multiresolution analysis. These analyses are adapted for the study of the Sobolev spaces H^s(D), s ∈ N.

  16. Estimates of recharge to unconfined aquifers and leakage to confined aquifers in the seven-county metropolitan area of Minneapolis-St. Paul, Minnesota

    USGS Publications Warehouse

    Ruhl, James F.; Kanivetsky, Roman; Shmagin, Boris

    2002-01-01

    Recharge estimates, which varied within about 10 in./yr for each method, generally were largest based on the precipitation, ground-water level fluctuation, and age dating of shallow ground water methods, slightly smaller based on the streamflow-recession displacement method, and smallest based on the watershed characteristics method. Leakage, which was less than 1 in./yr, varied within 1 order of magnitude based on the ground-water level fluctuation method and by as much as 4 orders of magnitude based on analyses of vertical hydraulic gradients.

  17. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 µm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
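
    The rocking-curve fit mentioned above can be reproduced in outline with a standard least-squares fit of a symmetric Pearson type VII profile. The parametrisation and the synthetic data below are illustrative assumptions, not the authors' processing code.

      import numpy as np
      from scipy.optimize import curve_fit

      def pearson_vii(x, amp, x0, w, m):
          """Symmetric Pearson VII; m = 1 is Lorentzian, large m approaches a Gaussian."""
          return amp / (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** m

      # Synthetic rocking curve: x is the analyser angle, y the measured intensity
      x = np.linspace(-20.0, 20.0, 201)
      y = pearson_vii(x, 1.0, 0.0, 5.0, 1.8) \
          + np.random.default_rng(3).normal(scale=0.01, size=x.size)
      popt, _ = curve_fit(pearson_vii, x, y, p0=[1.0, 0.0, 5.0, 1.5])
      print(popt)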

  18. Light Scattering based detection of food pathogens

    USDA-ARS?s Scientific Manuscript database

    The current methods for detecting foodborne pathogens are mostly destructive (i.e., samples need to be pretreated), and require time, personnel, and laboratories for analyses. Optical methods, including light scattering-based techniques, have gained a lot of attention recently due to their rapid a...

  19. Functional Analyses and Treatment of Precursor Behavior

    PubMed Central

    Najdowski, Adel C; Wallace, Michele D; Ellsworth, Carrie L; MacAleese, Alicia N; Cleveland, Jackie M

    2008-01-01

    Functional analysis has been demonstrated to be an effective method to identify environmental variables that maintain problem behavior. However, there are cases when conducting functional analyses of severe problem behavior may be contraindicated. The current study applied functional analysis procedures to a class of behavior that preceded severe problem behavior (precursor behavior) and evaluated treatments based on the outcomes of the functional analyses of precursor behavior. Responding for all participants was differentiated during the functional analyses, and individualized treatments eliminated precursor behavior. These results suggest that functional analysis of precursor behavior may offer an alternative, indirect method to assess the operant function of severe problem behavior. PMID:18468282

  20. Reduced size first-order subsonic and supersonic aeroelastic modeling

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Various aeroelastic, aeroservoelastic, dynamic-response, and sensitivity analyses are based on a time-domain first-order (state-space) formulation of the equations of motion. The formulation of this paper is based on the minimum-state (MS) aerodynamic approximation method, which yields a low number of aerodynamic augmenting states. Modifications of the MS and the physical weighting procedures make the modeling method even more attractive. The flexibility of constraint selection is increased without increasing the approximation problem size; the accuracy of dynamic residualization of high-frequency modes is improved; and the resulting model is less sensitive to parametric changes in subsequent analyses. Applications to subsonic and supersonic cases demonstrate the generality, flexibility, accuracy, and efficiency of the method.

  1. Data-Based Decision-Making: Developing a Method for Capturing Teachers' Understanding of CBM Graphs

    ERIC Educational Resources Information Center

    Espin, Christine A.; Wayman, Miya Miura; Deno, Stanley L.; McMaster, Kristen L.; de Rooij, Mark

    2017-01-01

    In this special issue, we explore the decision-making aspect of "data-based decision-making". The articles in the issue address a wide range of research questions, designs, methods, and analyses, but all focus on data-based decision-making for students with learning difficulties. In this first article, we introduce the topic of…

  2. Alignment methods: strategies, challenges, benchmarking, and comparative overview.

    PubMed

    Löytynoja, Ari

    2012-01-01

    Comparative evolutionary analyses of molecular sequences are solely based on the identities and differences detected between homologous characters. Errors in this homology statement, that is errors in the alignment of the sequences, are likely to lead to errors in the downstream analyses. Sequence alignment and phylogenetic inference are tightly connected and many popular alignment programs use the phylogeny to divide the alignment problem into smaller tasks. They then neglect the phylogenetic tree, however, and produce alignments that are not evolutionarily meaningful. The use of phylogeny-aware methods reduces the error but the resulting alignments, with evolutionarily correct representation of homology, can challenge the existing practices and methods for viewing and visualising the sequences. The inter-dependency of alignment and phylogeny can be resolved by joint estimation of the two; methods based on statistical models allow for inferring the alignment parameters from the data and correctly take into account the uncertainty of the solution but remain computationally challenging. Widely used alignment methods are based on heuristic algorithms and unlikely to find globally optimal solutions. The whole concept of one correct alignment for the sequences is questionable, however, as there typically exist vast numbers of alternative, roughly equally good alignments that should also be considered. This uncertainty is hidden by many popular alignment programs and is rarely correctly taken into account in the downstream analyses. The quest for finding and improving the alignment solution is complicated by the lack of suitable measures of alignment goodness. The difficulty of comparing alternative solutions also affects benchmarks of alignment methods and the results strongly depend on the measure used. As the effects of alignment error cannot be predicted, comparing the alignments' performance in downstream analyses is recommended.

  3. Species-richness of the Anopheles annulipes Complex (Diptera: Culicidae) Revealed by Tree and Model-Based Allozyme Clustering Analyses

    DTIC Science & Technology

    2007-01-01

    including tree-based methods such as the unweighted pair group method of analysis (UPGMA) and Neighbour-joining (NJ) (Saitou & Nei, 1987). By...based Bayesian approach and the tree-based UPGMA and NJ clustering methods. The results obtained suggest that far more species occur in the An...unlikely that groups that differ by more than these levels are conspecific. Genetic distances were clustered using the UPGMA and NJ algorithms in MEGA

  4. An efficient sensitivity analysis method for modified geometry of Macpherson suspension based on Pearson correlation coefficient

    NASA Astrophysics Data System (ADS)

    Shojaeefard, Mohammad Hasan; Khalkhali, Abolfazl; Yarmohammadisatri, Sadegh

    2017-06-01

    The main purpose of this paper is to propose a new method for designing the Macpherson suspension, based on Sobol indices in terms of the Pearson correlation, which determines the importance of each member for the behaviour of the vehicle suspension. The formulation of the dynamic analysis of the Macpherson suspension system is developed using the suspension members as modified links in order to achieve the desired kinematic behaviour. The mechanical system is replaced with equivalent constrained links, and kinematic laws are then utilised to obtain a new modified geometry of the Macpherson suspension. The equivalent mechanism of the Macpherson suspension increased the speed of analysis and reduced its complexity. The ADAMS/CAR software is utilised to simulate a full vehicle, a Renault Logan car, in order to analyse the accuracy of the modified geometry model. An experimental 4-poster test rig is used to validate both the ADAMS/CAR simulation and the analytical geometry model. The Pearson correlation coefficient is applied to analyse the sensitivity of each suspension member with respect to vehicle objective functions such as sprung-mass acceleration. In addition, the estimation of the Pearson correlation coefficient between variables is analysed. The Pearson correlation coefficient is found to be an efficient tool for analysing the vehicle suspension, leading to a better design of the Macpherson suspension system.
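
    The core ranking step, correlating each sampled suspension parameter with an objective such as sprung-mass acceleration, can be sketched as follows. The data and parameter names are hypothetical; this is not the authors' ADAMS/CAR workflow.

      import numpy as np

      def pearson_sensitivity(samples, response):
          """samples : runs x variables matrix of sampled suspension parameters
             response: objective value per run (e.g. sprung-mass acceleration)
             returns : variable indices ranked by |Pearson correlation|, and the correlations"""
          r = np.array([np.corrcoef(samples[:, j], response)[0, 1]
                        for j in range(samples.shape[1])])
          return np.argsort(-np.abs(r)), r

      # Toy example with three hypothetical parameters; only the first two matter
      rng = np.random.default_rng(7)
      X = rng.uniform(size=(500, 3))
      y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=500)
      order, r = pearson_sensitivity(X, y)
      print(order, np.round(r, 2))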

  5. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  6. Evolution of microbiological analytical methods for dairy industry needs.

    PubMed

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.

  7. Use of MALDI-TOF Mass Spectrometry and a Custom Database to Characterize Bacteria Indigenous to a Unique Cave Environment (Kartchner Caverns, AZ, USA)

    PubMed Central

    Zhang, Lin; Vranckx, Katleen; Janssens, Koen; Sandrin, Todd R.

    2015-01-01

    MALDI-TOF mass spectrometry has been shown to be a rapid and reliable tool for identification of bacteria at the genus and species, and in some cases, strain levels. Commercially available and open source software tools have been developed to facilitate identification; however, no universal/standardized data analysis pipeline has been described in the literature. Here, we provide a comprehensive and detailed demonstration of bacterial identification procedures using a MALDI-TOF mass spectrometer. Mass spectra were collected from 15 diverse bacteria isolated from Kartchner Caverns, AZ, USA, and identified by 16S rDNA sequencing. Databases were constructed in BioNumerics 7.1. Follow-up analyses of mass spectra were performed, including cluster analyses, peak matching, and statistical analyses. Identification was performed using blind-coded samples randomly selected from these 15 bacteria. Two identification methods are presented: similarity coefficient-based and biomarker-based methods. Results show that both identification methods can identify the bacteria to the species level. PMID:25590854

  8. Use of MALDI-TOF mass spectrometry and a custom database to characterize bacteria indigenous to a unique cave environment (Kartchner Caverns, AZ, USA).

    PubMed

    Zhang, Lin; Vranckx, Katleen; Janssens, Koen; Sandrin, Todd R

    2015-01-02

    MALDI-TOF mass spectrometry has been shown to be a rapid and reliable tool for identification of bacteria at the genus and species, and in some cases, strain levels. Commercially available and open source software tools have been developed to facilitate identification; however, no universal/standardized data analysis pipeline has been described in the literature. Here, we provide a comprehensive and detailed demonstration of bacterial identification procedures using a MALDI-TOF mass spectrometer. Mass spectra were collected from 15 diverse bacteria isolated from Kartchner Caverns, AZ, USA, and identified by 16S rDNA sequencing. Databases were constructed in BioNumerics 7.1. Follow-up analyses of mass spectra were performed, including cluster analyses, peak matching, and statistical analyses. Identification was performed using blind-coded samples randomly selected from these 15 bacteria. Two identification methods are presented: similarity coefficient-based and biomarker-based methods. Results show that both identification methods can identify the bacteria to the species level.
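
    The similarity coefficient-based identification route mentioned above can be illustrated, in a heavily simplified form, by binning peak lists into fixed m/z windows and ranking database entries by cosine similarity to the query spectrum. The binning scheme, strain names and spectra below are invented and do not reflect the BioNumerics implementation.

    ```python
    import numpy as np

    def bin_spectrum(mz, intensity, mz_min=2000, mz_max=20000, bin_width=5):
        """Sum peak intensities into fixed m/z bins and normalise to unit length."""
        edges = np.arange(mz_min, mz_max + bin_width, bin_width)
        binned, _ = np.histogram(mz, bins=edges, weights=intensity)
        norm = np.linalg.norm(binned)
        return binned / norm if norm > 0 else binned

    def identify(query, reference_db):
        """Return reference entries ranked by cosine similarity to the query spectrum."""
        scores = {name: float(np.dot(query, ref)) for name, ref in reference_db.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical peak lists (m/z, intensity) for two database strains and one unknown.
    rng = np.random.default_rng(1)
    db = {
        "Bacillus sp. KC-A1": bin_spectrum(rng.uniform(2000, 20000, 60), rng.random(60)),
        "Pseudomonas sp. KC-B2": bin_spectrum(rng.uniform(2000, 20000, 60), rng.random(60)),
    }
    unknown = db["Bacillus sp. KC-A1"] + 0.05 * rng.random(db["Bacillus sp. KC-A1"].shape)
    unknown /= np.linalg.norm(unknown)

    print(identify(unknown, db))
    ```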

  9. Weight-based determination of fluid overload status and mortality in pediatric intensive care unit patients requiring continuous renal replacement therapy

    PubMed Central

    Selewski, David T.; Cornell, Timothy T.; Lombel, Rebecca M.; Blatt, Neal B.; Han, Yong Y.; Mottes, Theresa; Kommareddi, Mallika; Kershaw, David B.; Shanley, Thomas P.; Heung, Michael

    2012-01-01

    Purpose: In pediatric intensive care unit (PICU) patients, fluid overload (FO) at initiation of continuous renal replacement therapy (CRRT) has been reported to be an independent risk factor for mortality. Previous studies have calculated FO based on daily fluid balance during ICU admission, which is labor intensive and error prone. We hypothesized that a weight-based definition of FO at CRRT initiation would correlate with the fluid balance method and prove predictive of outcome. Methods: This is a retrospective single-center review of PICU patients requiring CRRT from July 2006 through February 2010 (n = 113). We compared the degree of FO at CRRT initiation using the standard fluid balance method versus methods based on patient weight changes assessed by both univariate and multivariate analyses. Results: The degree of fluid overload at CRRT initiation was significantly greater in nonsurvivors, irrespective of which method was used. The univariate odds ratio for PICU mortality per 1% increase in FO was 1.056 [95% confidence interval (CI) 1.025, 1.087] by the fluid balance method, 1.044 (95% CI 1.019, 1.069) by the weight-based method using PICU admission weight, and 1.045 (95% CI 1.022, 1.07) by the weight-based method using hospital admission weight. On multivariate analyses, all three methods approached significance in predicting PICU survival. Conclusions: Our findings suggest that weight-based definitions of FO are useful in defining FO at CRRT initiation and are associated with increased mortality in a broad PICU patient population. This study provides evidence for a more practical weight-based definition of FO that can be used at the bedside. PMID:21533569
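
    For readers unfamiliar with the two definitions compared above, the sketch below computes fluid overload (FO, %) both from weight change and from cumulative fluid balance. The formulas follow common conventions (weight change relative to admission weight; 1 L of balance approximated as 1 kg); all numbers are invented for illustration.

    ```python
    def fo_percent_weight_based(weight_at_crrt_kg, admission_weight_kg):
        """Fluid overload (%) from weight change relative to admission weight."""
        return 100.0 * (weight_at_crrt_kg - admission_weight_kg) / admission_weight_kg

    def fo_percent_fluid_balance(daily_in_L, daily_out_L, admission_weight_kg):
        """Fluid overload (%) from cumulative fluid balance (1 L assumed ~ 1 kg)."""
        cumulative_balance_L = sum(i - o for i, o in zip(daily_in_L, daily_out_L))
        return 100.0 * cumulative_balance_L / admission_weight_kg

    # Illustrative numbers only.
    print(fo_percent_weight_based(weight_at_crrt_kg=13.1, admission_weight_kg=11.8))
    print(fo_percent_fluid_balance([1.9, 2.1, 1.8], [1.2, 1.4, 1.5], admission_weight_kg=11.8))
    ```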

  10. Evaluation method based on the image correlation for laser jamming image

    NASA Astrophysics Data System (ADS)

    Che, Jinxi; Li, Zhongmin; Gao, Bo

    2013-09-01

    The evaluation of jamming effectiveness against infrared imaging systems is an important part of electro-optical countermeasures. Infrared imaging devices are widely used in military applications such as search, tracking and guidance. At the same time, with the continuous development of laser technology, research on laser interference and damage effects has progressed, and lasers have been used to disturb infrared imaging devices. Therefore, evaluating the effect of laser jamming on infrared imaging systems has become a meaningful problem to solve. The information that an infrared imaging system ultimately presents to the user is an image, so the jamming effect can be evaluated in terms of image quality. An image contains two kinds of information, light amplitude and light phase, so image correlation can accurately capture the difference between the original image and the disturbed image. In this paper, the digital image correlation evaluation method, the image quality assessment method based on the Fourier transform, the image quality estimation method based on error statistics, and the evaluation method based on peak signal-to-noise ratio are analysed, together with their advantages and disadvantages. Moreover, infrared images from experiments in which a thermal infrared imager was interfered with by laser were analysed using these methods. The results show that the methods reflect the jamming effect of laser on the infrared imaging system well, that the evaluation results are consistent with subjective visual evaluation, and that the methods offer good repeatability and convenient quantitative analysis. The feasibility of the methods for evaluating the jamming effect was demonstrated, providing a useful reference for the study and development of electro-optical countermeasure equipment and effectiveness evaluation.
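
    A minimal sketch of two of the metrics named above, the image correlation coefficient and peak signal-to-noise ratio (PSNR), applied to a synthetic original/disturbed image pair. The image data and noise model are placeholders; real evaluations would use the recorded infrared frames.

    ```python
    import numpy as np

    def correlation_coefficient(img_a, img_b):
        """Normalised cross-correlation between two equally sized grey-level images."""
        a = img_a.astype(float).ravel() - img_a.mean()
        b = img_b.astype(float).ravel() - img_b.mean()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def psnr(img_a, img_b, max_value=255.0):
        """Peak signal-to-noise ratio in dB between a reference and a disturbed image."""
        mse = np.mean((img_a.astype(float) - img_b.astype(float)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

    # Hypothetical 8-bit infrared frames: an original and a laser-disturbed copy.
    rng = np.random.default_rng(2)
    original = rng.integers(0, 256, size=(240, 320)).astype(float)
    disturbed = np.clip(original + rng.normal(0, 25, original.shape), 0, 255)

    print("correlation:", correlation_coefficient(original, disturbed))
    print("PSNR [dB]:", psnr(original, disturbed))
    ```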

  11. [Value-based medicine in ophthalmology].

    PubMed

    Hirneiss, C; Neubauer, A S; Tribus, C; Kampik, A

    2006-06-01

    Value-based medicine (VBM) unifies the costs and the patient-perceived value (improvement in quality of life, length of life, or both) of an intervention. Value-based ophthalmology is of increasing importance for decisions in eye care. The methods of VBM are explained and definitions of the specific terminology in this field are given. Cost-utility analysis, as part of health economic analysis, is explained. VBM goes beyond evidence-based medicine by incorporating the costs and benefits of an ophthalmological intervention. The benefit of the intervention is defined as an increase or maintenance of visual quality of life and can be determined by utility analysis. The time trade-off method is valid and reliable for utility analysis. The resources expended for the value gained in VBM are measured with cost-utility analysis in terms of cost per quality-adjusted life year gained (euros/QALY). Numerous cost-utility analyses of different ophthalmological interventions have been published. The fundamental instrument of VBM is cost-utility analysis. The results, in cost per QALY, allow estimation of the cost effectiveness of an ophthalmological intervention. Using the time trade-off method for utility analysis allows the comparison of ophthalmological cost-utility analyses with those of other medical interventions. VBM is important for individual medical decision making and for general health care.
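
    The cost-utility arithmetic described above can be illustrated with a short, deliberately simplified calculation: QALYs gained are taken as the utility change (e.g. from a time trade-off assessment) multiplied by its duration, and the cost-utility ratio is the incremental cost divided by the QALYs gained. Discounting and uncertainty are ignored here, and all numbers are invented.

    ```python
    def qalys_gained(utility_before, utility_after, years):
        """QALYs gained from a sustained utility improvement over a given horizon."""
        return (utility_after - utility_before) * years

    def cost_per_qaly(incremental_cost_eur, utility_before, utility_after, years):
        """Cost-utility ratio in euros per QALY gained."""
        return incremental_cost_eur / qalys_gained(utility_before, utility_after, years)

    # Illustrative numbers only: time trade-off utility rising from 0.74 to 0.80
    # for 12 years after an intervention costing 3,000 euros more than the comparator.
    print(cost_per_qaly(3000.0, 0.74, 0.80, 12))  # ~4,167 euros/QALY
    ```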

  12. Predicting critical micelle concentration and micelle molecular weight of polysorbate 80 using compendial methods.

    PubMed

    Braun, Alexandra C; Ilko, David; Merget, Benjamin; Gieseler, Henning; Germershaus, Oliver; Holzgrabe, Ulrike; Meinel, Lorenz

    2015-08-01

    This manuscript addresses the capability of compendial methods to control polysorbate 80 (PS80) functionality. Based on the analysis of sixteen batches, functionality-related characteristics (FRCs) including critical micelle concentration (CMC), cloud point, hydrophilic-lipophilic balance (HLB) value and micelle molecular weight were correlated with chemical composition, including fatty acids before and after hydrolysis, content of non-esterified polyethylene glycols and sorbitan polyethoxylates, sorbitan and isosorbide polyethoxylate fatty acid mono- and diesters, polyoxyethylene diesters, and peroxide values. Batches from some suppliers showed high variability in FRCs, calling into question the ability of the current monograph to control them. Interestingly, the combined use of the input parameters oleic acid content and peroxide value - both of which are monographed methods - resulted in a model adequately predicting CMC. Confining the batches to those complying with specifications for peroxide value proved oleic acid content alone to be predictive of CMC. Similarly, a four-parameter model based on chemical analyses alone was instrumental in predicting the molecular weight of PS80 micelles. Improved models based on analytical outcomes from fingerprint analyses are also presented. A road map for controlling PS80 batches with respect to FRCs, based on chemical analyses alone, is provided for the formulator. Copyright © 2014 Elsevier B.V. All rights reserved.
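
    The two-parameter model structure described above (CMC predicted from oleic acid content and peroxide value) can be sketched as an ordinary least-squares fit. The batch data below are invented for illustration; the paper's actual coefficients and measurements are not reproduced here.

    ```python
    import numpy as np

    # Hypothetical batch data: oleic acid content (% of fatty acids), peroxide value
    # (meq O2/kg) and measured CMC (mg/L). Values are illustrative, not from the paper.
    oleic = np.array([98.1, 72.4, 85.0, 99.0, 60.5, 91.2, 78.8, 66.3])
    perox = np.array([1.2, 4.8, 2.5, 0.9, 6.1, 1.8, 3.9, 5.5])
    cmc = np.array([13.5, 11.0, 12.4, 13.8, 10.1, 13.0, 11.6, 10.5])

    # Two-parameter linear model: CMC ~ b0 + b1*oleic + b2*peroxide
    X = np.column_stack([np.ones_like(oleic), oleic, perox])
    beta, *_ = np.linalg.lstsq(X, cmc, rcond=None)
    pred = X @ beta

    print("coefficients (b0, b1, b2):", np.round(beta, 4))
    print("R^2:", 1 - np.sum((cmc - pred) ** 2) / np.sum((cmc - cmc.mean()) ** 2))
    ```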

  13. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    PubMed

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  14. Fluorescence-based methods for detecting caries lesions: systematic review, meta-analysis and sources of heterogeneity.

    PubMed

    Gimenez, Thais; Braga, Mariana Minatel; Raggio, Daniela Procida; Deery, Chris; Ricketts, David N; Mendes, Fausto Medeiros

    2013-01-01

    Fluorescence-based methods have been proposed to aid caries lesion detection. Summarizing and analysing the findings of studies on fluorescence-based methods could clarify their real benefits. We aimed to perform a comprehensive systematic review and meta-analysis to evaluate the accuracy of fluorescence-based methods in detecting caries lesions. Two independent reviewers searched PubMed, Embase and Scopus through June 2012 to identify published papers; other sources were checked to identify non-published literature. The eligibility criteria were studies that: (1) assessed the accuracy of fluorescence-based methods in detecting caries lesions on occlusal, approximal or smooth surfaces, in primary or permanent human teeth, in the laboratory or clinical setting; (2) used a reference standard; and (3) reported sufficient data on the sample size and the accuracy of the methods. A diagnostic 2×2 table was extracted from the included studies to calculate the pooled sensitivity, specificity and overall accuracy parameters (diagnostic odds ratio and summary receiver operating characteristic curve). The analyses were performed separately for each method and for different characteristics of the studies. The quality of the studies and heterogeneity were also evaluated. Seventy-five studies met the inclusion criteria from the 434 articles initially identified; the search of the grey or non-published literature did not identify any further studies. In general, the analysis demonstrated that the fluorescence-based methods tend to have similar accuracy for all types of teeth, dental surfaces or settings. There was a trend of better performance of fluorescence methods in detecting more advanced caries lesions. We also observed moderate to high heterogeneity and evidence of publication bias. Fluorescence-based devices have similar overall performance; however, better accuracy in detecting more advanced caries lesions has been observed.
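
    The per-study accuracy measures pooled in such a meta-analysis are derived from a diagnostic 2×2 table. A minimal sketch of sensitivity, specificity and the diagnostic odds ratio (with a simple 0.5 continuity correction for zero cells) is shown below; the tables are hypothetical.

    ```python
    import math

    def diagnostic_summary(tp, fp, fn, tn):
        """Sensitivity, specificity and diagnostic odds ratio from a 2x2 table.

        A continuity correction of 0.5 is added when any cell is zero (a common
        convention; individual meta-analyses may handle this differently).
        """
        if 0 in (tp, fp, fn, tn):
            tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        dor = (tp * tn) / (fp * fn)
        return sens, spec, dor

    # Hypothetical per-study 2x2 tables (TP, FP, FN, TN) for a fluorescence device.
    studies = [(45, 10, 8, 60), (30, 5, 12, 70), (55, 15, 6, 40)]

    for i, table in enumerate(studies, 1):
        sens, spec, dor = diagnostic_summary(*table)
        print(f"study {i}: sens={sens:.2f} spec={spec:.2f} DOR={dor:.1f} lnDOR={math.log(dor):.2f}")
    ```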

  15. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  16. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    NASA Astrophysics Data System (ADS)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) are presented for the investigation of the organic carbonate-based electrolyte systems commonly used in lithium ion batteries. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples such as electrodes is reported, and limits of detection are given for FID. Finally, the developed methods were applied to the electrolyte systems of commercially available lithium ion batteries as well as to in-house assembled cells.

  17. Conceptual Chemical Process Design for Sustainability. ...

    EPA Pesticide Factsheets

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods are used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented, with an economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability.

  18. Forecast and analysis of the ratio of electric energy to terminal energy consumption for global energy internet

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si

    2018-02-01

    Against the background of building a global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy consumption and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency and total population. Finally, a prediction of the proportion of electric energy is obtained from a combination-forecasting model based on the multiple linear regression, trend analysis and variance-covariance methods. This forecast describes the development trend of the proportion of electric energy over 2017-2050, and the proportion of electric energy in 2050 is analysed in detail using scenario analysis.
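
    The combination-forecasting step can be sketched as follows: two simple forecasts (a linear regression on the year and a mean-increment trend extrapolation) are blended with inverse-variance weights derived from their historical residuals, which is the usual variance-covariance weighting idea. All series and numbers below are synthetic placeholders, not the paper's data.

    ```python
    import numpy as np

    # Hypothetical historical share of electricity in terminal energy consumption (%).
    years = np.arange(2000, 2017)
    share = 15.0 + 0.25 * (years - 2000) + np.random.default_rng(3).normal(0, 0.2, years.size)

    target_years = np.arange(2017, 2051)

    # Forecast 1: ordinary least-squares linear regression on the year.
    b = np.polyfit(years, share, deg=1)
    f_reg = np.polyval(b, target_years)
    resid_reg = share - np.polyval(b, years)

    # Forecast 2: simple trend extrapolation from the mean annual increment.
    step = np.diff(share).mean()
    f_trend = share[-1] + step * (target_years - years[-1])
    resid_trend = share - (share[0] + step * (years - years[0]))

    # Variance-covariance (inverse-variance) combination weights.
    v = np.array([resid_reg.var(ddof=1), resid_trend.var(ddof=1)])
    w = (1.0 / v) / np.sum(1.0 / v)
    f_comb = w[0] * f_reg + w[1] * f_trend

    print("weights:", np.round(w, 3))
    print("combined 2050 share (%):", round(float(f_comb[-1]), 2))
    ```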

  19. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternatives to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct this bias by separating the procedure for selecting the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of which have more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that, based on bootstrap cross-validated estimates of the integrated Brier score, the conditional inference forest model is superior to random survival forest models in analysing time-to-event data with covariates that have many split-points. However, conditional inference forests perform comparably to random survival forest models on time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates in the dataset in question.

  20. ANALYSES OF FISH TISSUE BY VACUUM DISTILLATION/GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    The analyses of fish tissue using VD/GC/MS with surrogate-based matrix corrections is described. Techniques for equilibrating surrogate and analyte spikes with a tissue matrix are presented, and equilibrated spiked samples are used to document method performance. The removal of a...

  1. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
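
    Several of the approaches identified in this review reduce to simple closed-form approximations from reported summary statistics, for example estimating the mean from the quartiles and the SD from the range or interquartile range (in the spirit of the Hozo- and Wan-style formulas). The sketch below shows such approximations with invented inputs; it is illustrative only and is not the review authors' code.

    ```python
    def mean_from_quartiles(q1, median, q3):
        """Approximate mean from the median and quartiles."""
        return (q1 + median + q3) / 3.0

    def sd_from_range(minimum, maximum):
        """Rough SD approximation from the range (often range/4 for moderate n)."""
        return (maximum - minimum) / 4.0

    def sd_from_iqr(q1, q3):
        """SD approximation from the interquartile range, assuming near-normality."""
        return (q3 - q1) / 1.35

    # Illustrative summary statistics reported by a trial that omits mean and SD.
    print(mean_from_quartiles(q1=4.0, median=6.0, q3=9.0))   # ~6.33
    print(sd_from_range(minimum=1.0, maximum=15.0))          # 3.5
    print(sd_from_iqr(q1=4.0, q3=9.0))                       # ~3.7
    ```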

  2. Assessing Commercial and Alternative Poultry Processing Methods using Microbiome Analyses

    USDA-ARS?s Scientific Manuscript database

    Assessing poultry processing methods/strategies has historically used culture-based methods to assess bacterial changes or reductions, both in terms of general microbial communities (e.g. total aerobic bacteria) or zoonotic pathogens of interest (e.g. Salmonella, Campylobacter). The advent of next ...

  3. Differentially co-expressed interacting protein pairs discriminate samples under distinct stages of HIV type 1 infection.

    PubMed

    Yoon, Dukyong; Kim, Hyosil; Suh-Kim, Haeyoung; Park, Rae Woong; Lee, KiYoung

    2011-01-01

    Microarray analyses based on differentially expressed genes (DEGs) have been widely used to distinguish samples across different cellular conditions. However, studies based on DEGs have not been able to clearly determine significant differences between samples of pathophysiologically similar HIV-1 stages, e.g., between acute and chronic progressive (or AIDS) or between uninfected and clinically latent stages. We here suggest a novel approach to allow such discrimination based on stage-specific genetic features of HIV-1 infection. Our approach is based on co-expression changes of genes known to interact. The method can identify a genetic signature for a single sample, in contrast with existing protein-protein-based analyses with correlational designs. Our approach distinguishes each sample using differentially co-expressed interacting protein pairs (DEPs), based on co-expression scores of individual interacting pairs within a sample. The co-expression score is positive if the two genes in a sample are simultaneously up-regulated or down-regulated, and its absolute value is higher when the expression-change ratios of the two genes are similar. We compared the characteristics of DEPs with those of DEGs by evaluating their usefulness in separating HIV-1 stages, and we identified DEP-based network modules and their gene-ontology enrichment to identify the HIV-1 stage-specific gene signature. Based on the DEP approach, we observed clear separation among samples from distinct HIV-1 stages using clustering and principal component analyses. Moreover, the discrimination power of DEPs on the samples (70-100% accuracy) was much higher than that of DEGs (35-45%) using several well-known classifiers. DEP-based network analysis also revealed the HIV-1 stage-specific network modules; the main biological processes were related to "translation," "RNA splicing," "mRNA, RNA, and nucleic acid transport," and "DNA metabolism." Through the HIV-1 stage-related modules, changing stage-specific patterns of protein interactions could be observed. The DEP-based method discriminated the HIV-1 infection stages clearly and revealed an HIV-1 stage-specific gene signature. The proposed DEP-based method might complement existing DEG-based approaches in various microarray expression analyses.
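
    The abstract describes the sign and magnitude behaviour of the per-sample co-expression score but not its exact formula. The sketch below implements one plausible formulation consistent with that description (positive when the two interacting genes change in the same direction, larger in absolute value when their change ratios are similar); it is a hypothetical illustration, not the authors' definition.

    ```python
    def coexpression_score(log_fc_a, log_fc_b):
        """Per-sample co-expression score for an interacting gene pair.

        Positive when both genes move in the same direction; the absolute value
        approaches 1 when their expression-change ratios are similar. This is a
        hypothetical formulation, not the published one.
        """
        if log_fc_a == 0 or log_fc_b == 0:
            return 0.0
        sign = 1.0 if log_fc_a * log_fc_b > 0 else -1.0
        a, b = abs(log_fc_a), abs(log_fc_b)
        return sign * min(a, b) / max(a, b)

    # Illustrative log2 fold changes (sample vs. reference) for interacting pairs.
    pairs = {("GENE_A", "GENE_B"): (1.8, 2.1),    # co-up-regulated, similar ratio
             ("GENE_C", "GENE_D"): (-1.5, -0.3),  # co-down-regulated, dissimilar
             ("GENE_E", "GENE_F"): (2.0, -1.9)}   # opposite directions

    for pair, (fa, fb) in pairs.items():
        print(pair, round(coexpression_score(fa, fb), 3))
    ```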

  4. Study on optimal design of 210kW traction IPMSM considering thermal demagnetization characteristics

    NASA Astrophysics Data System (ADS)

    Kim, Young Hyun; Lee, Seong Soo; Cheon, Byung Chul; Lee, Jung Ho

    2018-04-01

    This study analyses the permanent magnet (PM) used in the rotor of an interior permanent magnet synchronous motor (IPMSM) for driving an electric railway vehicle (ERV), with respect to magnet shape, temperature, and external magnetic field. The positioning of the inserted magnets is a degree of freedom in the design of such machines. This paper describes a preliminary analysis using a parametric finite-element method, performed with the aim of achieving an effective design. Next, design-of-experiments techniques, such as the central composite, Box-Behnken and Taguchi methods, are explored to optimise the shape for high power density. The results are used to produce an optimal design for IPMSMs, with design errors minimized using Maxwell 2D, a commercial program. Furthermore, the demagnetization process is analysed in computer simulation based on magnetization and demagnetization theory for PM materials; the result of the analysis can be used to calculate the magnetization and demagnetization phenomena according to the input B-H curve. This paper presents the conditions for demagnetization by the external magnetic field in the driving and stopped states, and proposes a simulation method that can analyse demagnetization phenomena under each condition and design an IPMSM that maximizes efficiency and torque characteristics. Finally, operational characteristics are analysed in terms of the operation patterns of railway vehicles, and control conditions are deduced to achieve maximum efficiency in all sections. These results were experimentally verified.

  5. Towards fully automated structure-based function prediction in structural genomics: a case study.

    PubMed

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.

  6. A Classroom-Based Assessment Method to Test Speaking Skills in English for Specific Purposes

    ERIC Educational Resources Information Center

    Alberola Colomar, María Pilar

    2014-01-01

    This article presents and analyses a classroom-based assessment method to test students' speaking skills in a variety of professional settings in tourism. The assessment system has been implemented in the Communication in English for Tourism course, as part of the Tourism Management degree programme, at Florida Universitaria (affiliated to the…

  7. A Review of Missing Data Handling Methods in Education Research

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb R.

    2014-01-01

    Missing data are a common occurrence in survey-based research studies in education, and the way missing values are handled can significantly affect the results of analyses based on such data. Despite known problems with performance of some missing data handling methods, such as mean imputation, many researchers in education continue to use those…

  8. Relation between scattering and production amplitude--Case of intermediate σ-particle in ππ-system--

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishida, Muneyuki; Ishida, Shin; Ishida, Taku

    1998-05-29

    The relation between scattering and production amplitudes is investigated, using a simple field theoretical model, from the general viewpoint of unitarity and the applicability of the final state interaction (FSI) theorem. The IA-method and VMW-method, which are applied in our phenomenological analyses [2,3] suggesting the σ-existence, are obtained as the physical state representations of the scattering and production amplitudes, respectively. Moreover, the VMW-method is shown to be an effective method for obtaining resonance properties from general production processes, while the conventional analyses based on the 'universality' of the ππ-scattering amplitude are powerless for this purpose.

  9. Relation between scattering and production amplitude—Case of intermediate σ-particle in ππ-system—

    NASA Astrophysics Data System (ADS)

    Ishida, Muneyuki; Ishida, Shin; Ishida, Taku

    1998-05-01

    The relation between scattering and production amplitudes is investigated, using a simple field theoretical model, from the general viewpoint of unitarity and the applicability of the final state interaction (FSI) theorem. The IA-method and VMW-method, which are applied in our phenomenological analyses [2,3] suggesting the σ-existence, are obtained as the physical state representations of the scattering and production amplitudes, respectively. Moreover, the VMW-method is shown to be an effective method for obtaining resonance properties from general production processes, while the conventional analyses based on the "universality" of the ππ-scattering amplitude are powerless for this purpose.

  10. Annoyance survey by means of social media.

    PubMed

    Silva, Bruno; Santos, Gustavo; Eller, Rogeria; Gjestland, Truls

    2017-02-01

    Social surveys have been the conventional means of evaluating the annoyance caused by transportation noise. Sampling and interviewing by telephone, mail, or in person are often costly and time consuming, however. Data collection by web-based survey methods is less costly and may be completed more quickly, and hence could be conducted in countries with fewer resources. Such methods, however, raise issues about the generalizability and comparability of findings. These issues were investigated in a study of the annoyance caused by aircraft noise exposure around Brazil's Guarulhos Airport. The findings of 547 interviews obtained with the aid of Facebook advertisements and web-based forms were analysed with respect to estimated aircraft noise exposure levels at respondents' residences. The results were analysed to assess whether and how web-based surveys might yield generalizable noise dose-response relationships.

  11. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background: In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality, which may not hold in small samples. Creating a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods: We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results: We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions: We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
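
    A permutation P value for a meta-regression coefficient can be sketched as follows: fit a weighted least-squares regression of the trial effect sizes on the covariate (weighted by inverse variances), then repeatedly permute the covariate across trials, refit, and count how often the permuted slope is at least as extreme as the observed one. The data below are invented and the model is a minimal single-covariate version, not the study's stepwise procedure.

    ```python
    import numpy as np

    def wls_slope(x, y, w):
        """Weighted least-squares slope of y on x with weights w (inverse variances)."""
        X = np.column_stack([np.ones_like(x), x])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return beta[1]

    def permutation_p_value(x, y, w, n_perm=10000, seed=0):
        """Two-sided permutation P value for the meta-regression slope."""
        rng = np.random.default_rng(seed)
        observed = wls_slope(x, y, w)
        count = sum(abs(wls_slope(rng.permutation(x), y, w)) >= abs(observed)
                    for _ in range(n_perm))
        return observed, (count + 1) / (n_perm + 1)

    # Hypothetical meta-analysis of 8 trials: effect sizes, variances, Jadad scores.
    effect = np.array([0.35, 0.10, 0.42, 0.05, 0.50, 0.22, 0.15, 0.38])
    var = np.array([0.04, 0.06, 0.05, 0.09, 0.03, 0.07, 0.08, 0.05])
    jadad = np.array([5, 2, 4, 1, 5, 3, 2, 4], dtype=float)

    slope, p = permutation_p_value(jadad, effect, 1.0 / var)
    print(f"slope = {slope:.3f}, permutation P = {p:.3f}")
    ```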

  12. FARVATX: FAmily-based Rare Variant Association Test for X-linked genes

    PubMed Central

    Choi, Sungkyoung; Lee, Sungyoung; Qiao, Dandi; Hardin, Megan; Cho, Michael H.; Silverman, Edwin K; Park, Taesung; Won, Sungho

    2016-01-01

    Although the X chromosome has many genes that are functionally related to human diseases, the complicated biological properties of the X chromosome have prevented efficient genetic association analyses, and only a few significantly associated X-linked variants have been reported for complex traits. For instance, dosage compensation of X-linked genes is often achieved via the inactivation of one allele in each X-linked variant in females; however, some X-linked variants can escape this X chromosome inactivation. Efficient genetic analyses cannot be conducted without prior knowledge about the gene expression process of X-linked variants, and misspecified information can lead to power loss. In this report, we propose new statistical methods for rare X-linked variant genetic association analysis of dichotomous phenotypes with family-based samples. The proposed methods are computationally efficient and can complete X-linked analyses within a few hours. Simulation studies demonstrate the statistical efficiency of the proposed methods, which were then applied to rare-variant association analysis of the X chromosome in chronic obstructive pulmonary disease (COPD). Some promising significant X-linked genes were identified, illustrating the practical importance of the proposed methods. PMID:27325607

  13. FARVATX: Family-Based Rare Variant Association Test for X-Linked Genes.

    PubMed

    Choi, Sungkyoung; Lee, Sungyoung; Qiao, Dandi; Hardin, Megan; Cho, Michael H; Silverman, Edwin K; Park, Taesung; Won, Sungho

    2016-09-01

    Although the X chromosome has many genes that are functionally related to human diseases, the complicated biological properties of the X chromosome have prevented efficient genetic association analyses, and only a few significantly associated X-linked variants have been reported for complex traits. For instance, dosage compensation of X-linked genes is often achieved via the inactivation of one allele in each X-linked variant in females; however, some X-linked variants can escape this X chromosome inactivation. Efficient genetic analyses cannot be conducted without prior knowledge about the gene expression process of X-linked variants, and misspecified information can lead to power loss. In this report, we propose new statistical methods for rare X-linked variant genetic association analysis of dichotomous phenotypes with family-based samples. The proposed methods are computationally efficient and can complete X-linked analyses within a few hours. Simulation studies demonstrate the statistical efficiency of the proposed methods, which were then applied to rare-variant association analysis of the X chromosome in chronic obstructive pulmonary disease. Some promising significant X-linked genes were identified, illustrating the practical importance of the proposed methods. © 2016 WILEY PERIODICALS, INC.

  14. The influence of transient change of total body water on relative body fats based on three bioelectrical impedance analyses methods. Comparison between before and after exercise with sweat loss, and after drinking.

    PubMed

    Demura, S; Yamaji, S; Goshi, F; Nagasawa, Y

    2002-03-01

    The purpose of this study was to clarify the influence of changes in total body water caused by exercise and drinking on relative body fat (%BF) estimated by three bioelectrical impedance analysis (BIA) methods: between hand and foot (H-F), between hand and hand (H-H), and between foot and foot (F-F). The subjects were 30 healthy young Japanese adults aged 18 to 23 years (15 males, 15 females). Measurements were made three times for each BIA method (before and after exercise with sweat loss, and after drinking) and twice with the underwater weighing (UW) method (before exercise and after drinking). A 60-minute pedaling exercise on a bicycle ergometer was used as the exercise. The correlation of %BF between the UW method and each BIA method was moderate or higher (r=0.765-0.839). However, %BF based on the H-F and F-F BIA methods was higher than that based on the UW method. After drinking, %BF from all the BIA methods was higher than from the UW method. %BF from the BIA methods after exercise was lower than before exercise. %BF from the H-F and H-H BIA methods after drinking was slightly higher than before exercise, indicating that those measurements reflect a slight change in body water. It was demonstrated that %BF from any BIA method reflects the change in body water caused by exercise, sweating, and drinking.

  15. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods

    PubMed Central

    2014-01-01

    Background: Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods: We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results: In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions: The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
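
    One simple way to convey the idea (not the authors' exact algorithm) is to weight ordinary bootstrap replicates of the trial data by how consistent each replicate's treatment-effect estimate is with an external prior on the effect size, and then summarise the weighted replicates. The per-patient data, prior and weighting scheme below are hypothetical placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical per-patient (cost, QALY) data from a two-arm trial.
    n = 150
    cost_t = rng.gamma(shape=2.0, scale=2500, size=n)   # treatment arm costs
    cost_c = rng.gamma(shape=2.0, scale=2000, size=n)   # control arm costs
    eff_t = rng.normal(0.70, 0.20, size=n)              # treatment arm QALYs
    eff_c = rng.normal(0.62, 0.20, size=n)              # control arm QALYs

    # External evidence on the effect difference, summarised as a normal prior.
    prior_mean, prior_sd = 0.05, 0.03

    B = 5000
    inc_cost, inc_eff, weights = np.empty(B), np.empty(B), np.empty(B)
    for b in range(B):
        it = rng.integers(0, n, n)   # bootstrap resample of each arm
        ic = rng.integers(0, n, n)
        inc_cost[b] = cost_t[it].mean() - cost_c[ic].mean()
        inc_eff[b] = eff_t[it].mean() - eff_c[ic].mean()
        # Importance weight: prior density (up to a constant) at the replicate's effect.
        z = (inc_eff[b] - prior_mean) / prior_sd
        weights[b] = np.exp(-0.5 * z * z)

    weights /= weights.sum()
    print("weighted incremental cost:", round(float(np.sum(weights * inc_cost)), 1))
    print("weighted incremental QALYs:", round(float(np.sum(weights * inc_eff)), 4))
    print("ICER (cost per QALY):",
          round(float(np.sum(weights * inc_cost) / np.sum(weights * inc_eff)), 0))
    ```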

  16. Automated analysis of clonal cancer cells by intravital imaging

    PubMed Central

    Coffey, Sarah Earley; Giedt, Randy J; Weissleder, Ralph

    2013-01-01

    Longitudinal analyses of single cell lineages over prolonged periods have been challenging particularly in processes characterized by high cell turn-over such as inflammation, proliferation, or cancer. RGB marking has emerged as an elegant approach for enabling such investigations. However, methods for automated image analysis continue to be lacking. Here, to address this, we created a number of different multicolored poly- and monoclonal cancer cell lines for in vitro and in vivo use. To classify these cells in large scale data sets, we subsequently developed and tested an automated algorithm based on hue selection. Our results showed that this method allows accurate analyses at a fraction of the computational time required by more complex color classification methods. Moreover, the methodology should be broadly applicable to both in vitro and in vivo analyses. PMID:24349895
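
    A minimal sketch of the hue-selection idea described above: the mean RGB colour of each segmented cell is converted to HSV and assigned to a clone by binning the hue. The bin count, colours and cell names are invented; the published pipeline is more involved.

    ```python
    import colorsys

    def classify_by_hue(rgb, n_bins=12):
        """Assign an RGB-marked cell to a clone by binning its hue.

        `rgb` is an (R, G, B) tuple with components in [0, 255]; the returned
        bin index serves as the clone label.
        """
        r, g, b = (c / 255.0 for c in rgb)
        hue, _, _ = colorsys.rgb_to_hsv(r, g, b)   # hue in [0, 1)
        return int(hue * n_bins) % n_bins

    # Hypothetical mean RGB values measured for three segmented cells.
    cells = {"cell_001": (220, 40, 35),    # reddish clone
             "cell_002": (50, 200, 60),    # greenish clone
             "cell_003": (215, 45, 30)}    # likely same clone as cell_001

    for name, rgb in cells.items():
        print(name, "-> hue bin", classify_by_hue(rgb))
    ```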

  17. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model based on applying two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequences via quantification method IV and group similar audit event sequences together based on cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where normal and attack activities are intermingled.
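
    The sketch below conveys the two-stage idea (quantify qualitative audit events, then cluster similar sequences) in a heavily simplified form: a bag-of-events encoding stands in for Hayashi's quantification method IV, which in reality derives numerical scores from similarity relations among categories, and k-means stands in for the clustering step. Event names and sequences are invented.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical audit event sequences observed on a host (one event type per step).
    sequences = [
        ["login", "read", "read", "write", "logout"],
        ["login", "read", "write", "logout"],
        ["login", "su", "write", "write", "exec", "exec"],   # suspicious-looking
        ["login", "su", "exec", "exec", "write"],             # suspicious-looking
    ]

    # Bag-of-events encoding: a crude numerical quantification of the qualitative
    # sequences (a stand-in for quantification method IV, not an implementation of it).
    vocab = sorted({e for seq in sequences for e in seq})
    X = np.array([[seq.count(e) for e in vocab] for seq in sequences], dtype=float)
    X /= X.sum(axis=1, keepdims=True)   # normalise by sequence length

    # Group similar event sequences; in a detection setting, small or outlying
    # clusters would be flagged for inspection as potential attack activity.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    for seq, lab in zip(sequences, labels):
        print(lab, seq)
    ```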

  18. Guidelines for cost control and analysis of cost-type research and development contracts

    NASA Technical Reports Server (NTRS)

    Sibbers, C. W.

    1981-01-01

    The cost information which should be obtained from a contractor(s) on a major, cost type research and development contract(s), and the analyses and effective use of these data are discussed. Specific type(s) of information which should be required, methods for analyzing such information, and methods for effectively using the results of such analyses to enhance NASA contract and project management are included. The material presented is based primarily on the principal methods which have been effectively used in the management of major cost type research and development contracts.

  19. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

    Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method requires highly...

  20. Advantages of virulotyping foodborne pathogens over traditional identification and characterization methods

    USDA-ARS?s Scientific Manuscript database

    This chapter provides an overview regarding the advantages of virulotyping over historic serology-based, PCR-based on genes that identify an organism, or enzymatic and biochemical-based analyses of foodborne pathogens in clinical diagnostics and food industry microbiology testing. Traditional ident...

  1. Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.

    PubMed

    Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei

    2017-09-01

    Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Missing data in trial‐based cost‐effectiveness analysis: An incomplete journey

    PubMed Central

    Gomes, Manuel; Carpenter, James R.

    2018-01-01

    SUMMARY Cost‐effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial‐based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty‐two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost‐effectiveness data was 63% (interquartile range: 47%–81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing‐at‐random assumption. Further improvements are needed to address missing data in cost‐effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing‐at‐random assumption. PMID:29573044
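
    A simple sensitivity analysis to departures from the missing-at-random assumption, of the kind the review finds to be rarely reported, can be sketched as a delta adjustment: impute missing costs under a baseline assumption and then shift the imputed values by increasing amounts to see how the conclusions change. The data and deltas below are invented, and real analyses would typically use multiple imputation rather than single mean imputation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical per-patient costs with ~35% missing values (np.nan).
    cost = rng.gamma(shape=2.0, scale=1500, size=200)
    missing = rng.random(200) < 0.35
    cost[missing] = np.nan

    def mean_with_delta_adjustment(values, delta):
        """Impute missing costs with the observed mean shifted by `delta`.

        delta = 0 corresponds to a (simple) missing-at-random analysis; positive
        deltas probe the assumption that unobserved patients were more costly.
        """
        observed_mean = np.nanmean(values)
        filled = np.where(np.isnan(values), observed_mean + delta, values)
        return filled.mean()

    for delta in (0.0, 250.0, 500.0, 1000.0):
        print(f"delta = {delta:6.0f}  ->  mean cost = {mean_with_delta_adjustment(cost, delta):8.1f}")
    ```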

  3. Bioinformatics for spermatogenesis: annotation of male reproduction based on proteomics

    PubMed Central

    Zhou, Tao; Zhou, Zuo-Min; Guo, Xue-Jiang

    2013-01-01

    Proteomics strategies have been widely used in the field of male reproduction, both in basic and clinical research. Bioinformatics methods are indispensable in proteomics-based studies and are used for data presentation, database construction and functional annotation. In the present review, we focus on the functional annotation of gene lists obtained through qualitative or quantitative methods, summarizing the common and male reproduction specialized proteomics databases. We introduce several integrated tools used to find the hidden biological significance from the data obtained. We further describe in detail the information on male reproduction derived from Gene Ontology analyses, pathway analyses and biomedical analyses. We provide an overview of bioinformatics annotations in spermatogenesis, from gene function to biological function and from biological function to clinical application. On the basis of recently published proteomics studies and associated data, we show that bioinformatics methods help us to discover drug targets for sperm motility and to scan for cancer-testis genes. In addition, we summarize the online resources relevant to male reproduction research for the exploration of the regulation of spermatogenesis. PMID:23852026

  4. Using Case Study Multi-Methods to Investigate Close(r) Collaboration: Course-Based Tutoring and the Directive/Nondirective Instructional Continuum

    ERIC Educational Resources Information Center

    Corbett, Steven J.

    2011-01-01

    This essay presents case studies of "course-based tutoring" (CBT) and one-to-one tutorials in two sections of developmental first-year composition (FYC) at a large West Coast research university. The author's study uses a combination of rhetorical and discourse analyses and ethnographic and case study multi-methods to investigate both…

  5. Case Study of a Gifted and Talented Catholic Dominican Nun

    ERIC Educational Resources Information Center

    Lavin, Angela

    2017-01-01

    The case of a gifted and talented Catholic Dominican nun is described and analysed in the context of Renzulli's Three-Ring Conception of Giftedness and Gagne's Differentiated Model of Giftedness and Talent. Using qualitative methods, semi-structured interviews of relevant individuals were conducted and analysed. Based on the conclusions of this…

  6. Electronic-projecting Moire method applying CBR-technology

    NASA Astrophysics Data System (ADS)

    Kuzyakov, O. N.; Lapteva, U. V.; Andreeva, M. A.

    2018-01-01

    An electronic-projecting method based on the Moire effect for examining surface topology is suggested. The conditions for forming Moire fringes and the dependence of their parameters on the reference parameters of the object and virtual grids are analyzed. The control system structure and decision-making subsystem are elaborated. The subsystem is implemented using CBR technology, based on a case base. The approach forms a decision for each separate local area, with subsequent formation of a common topology map.

  7. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.

    2016-05-03

    Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE: In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample.

  8. Template based rotation: A method for functional connectivity analysis with a priori templates

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

    Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual-regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based analyses. This flexibility owes to the reduced spatial and temporal orthogonality constraints of template based rotation as compared to dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630
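
    The core operation, mapping data from a new session onto previously defined a priori templates, can be pictured as a spatial least-squares projection. The sketch below illustrates that generic projection (closer in spirit to the first stage of dual regression than to the specific rotation used by the method, whose details are in the paper); all array shapes and names are assumptions.

      import numpy as np

      def template_timecourses(data, templates):
          """Estimate one time course per a priori network template.

          data      : (n_timepoints, n_voxels) preprocessed fMRI matrix
          templates : (n_networks,  n_voxels) a priori spatial maps

          Each volume is regressed on all templates simultaneously, so the
          returned (n_timepoints, n_networks) matrix contains the template
          loadings over time; correlating its columns gives a simple
          inter-network connectivity estimate.
          """
          T = templates.T                                    # (voxels, networks)
          beta, *_ = np.linalg.lstsq(T, data.T, rcond=None)  # (networks, timepoints)
          return beta.T

      # Toy usage with random numbers, only to show the shapes involved
      rng = np.random.default_rng(0)
      tc = template_timecourses(rng.standard_normal((200, 5000)),
                                rng.standard_normal((12, 5000)))
      connectivity = np.corrcoef(tc.T)                       # 12 x 12 network matrix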

  9. Analysis of the width-w non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices

    PubMed Central

    Krenn, Daniel

    2013-01-01

    In this work the number of occurrences of a fixed non-zero digit in the width-w non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are on numeral systems with an algebraic integer as base. Those come from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curve cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula, where its main term coincides with the full block length analysis. In its second order term a periodic fluctuation is exhibited. The proof follows Delange’s method. PMID:23805020
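
    For readers unfamiliar with the representation analysed above, the following minimal Python sketch computes the width-w non-adjacent form of an ordinary non-negative integer (the rational-integer special case of the lattice setting studied in the paper); the function name and interface are illustrative only.

      def wnaf(n, w=3):
          """Return the width-w non-adjacent form of a non-negative integer n,
          least significant digit first. Every non-zero digit d is odd with
          |d| < 2**(w-1), and any w consecutive digits contain at most one
          non-zero digit."""
          digits = []
          while n > 0:
              if n % 2 == 1:
                  d = n % (1 << w)              # n mod 2^w
                  if d >= (1 << (w - 1)):       # map into the symmetric digit range
                      d -= (1 << w)
                  n -= d
              else:
                  d = 0
              digits.append(d)
              n >>= 1
          return digits

      # Example: the 2-NAF of 7 is 8 - 1, i.e. digits [-1, 0, 0, 1] (LSB first)
      assert sum(d * 2**i for i, d in enumerate(wnaf(7, w=2))) == 7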

  10. Analysis of the width-w non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices.

    PubMed

    Krenn, Daniel

    2013-06-17

    In this work the number of occurrences of a fixed non-zero digit in the width-w non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are on numeral systems with an algebraic integer as base. Those come from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curve cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula, where its main term coincides with the full block length analysis. In its second order term a periodic fluctuation is exhibited. The proof follows Delange's method.

  11. Studying distributed cognition of simulation-based team training with DiCoT.

    PubMed

    Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus

    2016-03-01

    Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure SBTTs effectiveness. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises where medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are of importance for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. Using a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.

  12. Meta-Analysis of Inquiry-Based Instruction Research

    NASA Astrophysics Data System (ADS)

    Hasanah, N.; Prasetyo, A. P. B.; Rudyatmi, E.

    2017-04-01

    Inquiry-based instruction in biology has been the focus of educational research conducted by Unnes biology department students in collaboration with their university supervisors. This study aimed to critically describe the methodological aspects and inquiry teaching methods of four selected student research reports grounded in inquiry, and to analyse their results claims, drawing on the 2014 database of the Unnes biology department. Four of 16 experimental quantitative studies were selected as research objects using purposive sampling. Data collected through a documentation study were qualitatively analysed with respect to the methods used, the quality of the inquiry syntax, and the claims made about findings. The findings showed that the student research still lacked relevant aspects of research methodology, namely appropriate sampling procedures, validity tests for all research instruments, and parametric statistics (t-tests) that were not preceded by tests of data normality. Consistent inquiry syntax supported the four mini-thesis claims that inquiry-based teaching significantly influenced the dependent variables. In other words, the findings indicated that the positive claims of the research results were not fully supported by sound research methods and well-defined implementation of inquiry procedures.

  13. A reanalysis of cluster randomized trials showed interrupted time-series studies were valuable in health system evaluation.

    PubMed

    Fretheim, Atle; Zhang, Fang; Ross-Degnan, Dennis; Oxman, Andrew D; Cheyne, Helen; Foy, Robbie; Goodacre, Steve; Herrin, Jeph; Kerse, Ngaire; McKinlay, R James; Wright, Adam; Soumerai, Stephen B

    2015-03-01

    There is often substantial uncertainty about the impacts of health system and policy interventions. Despite that, randomized controlled trials (RCTs) are uncommon in this field, partly because experiments can be difficult to carry out. An alternative method for impact evaluation is the interrupted time-series (ITS) design. Little is known, however, about how results from the two methods compare. Our aim was to explore whether ITS studies yield results that differ from those of randomized trials. We conducted single-arm ITS analyses (segmented regression) based on data from the intervention arm of cluster randomized trials (C-RCTs), that is, discarding control arm data. Secondarily, we included the control group data in the analyses, by subtracting control group data points from intervention group data points, thereby constructing a time series representing the difference between the intervention and control groups. We compared the results from the single-arm and controlled ITS analyses with results based on conventional aggregated analyses of trial data. The findings were largely concordant, yielding effect estimates with overlapping 95% confidence intervals (CI) across different analytical methods. However, our analyses revealed the importance of a concurrent control group and of taking baseline and follow-up trends into account in the analysis of C-RCTs. The ITS design is valuable for evaluation of health systems interventions, both when RCTs are not feasible and in the analysis and interpretation of data from C-RCTs. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
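
    A single-arm interrupted time-series analysis of the kind described above is commonly fitted as a segmented regression with a level-change term and a slope-change term. The following sketch uses statsmodels; the tiny data frame and variable names are assumptions for illustration (a real analysis would also address autocorrelation, e.g. with robust or GLS standard errors).

      import pandas as pd
      import statsmodels.formula.api as smf

      # One row per time point from the intervention arm:
      #   outcome    aggregated outcome at that time point
      #   time       1, 2, ..., T                 (baseline trend)
      #   post       0 before, 1 after the intervention (level change)
      #   time_post  0 before; 1, 2, ... after    (change in trend)
      df = pd.DataFrame({
          "outcome":   [10, 11, 10, 12, 11, 8, 7, 7, 6, 5],
          "time":      list(range(1, 11)),
          "post":      [0] * 5 + [1] * 5,
          "time_post": [0] * 5 + list(range(1, 6)),
      })

      model = smf.ols("outcome ~ time + post + time_post", data=df).fit()
      print(model.params)   # level change = 'post', trend change = 'time_post'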

  14. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods.

    PubMed

    Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling

    2014-06-03

    Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes.
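
    A minimal sketch of the ordinary Bayesian bootstrap step that this approach builds on, in which Dirichlet observation weights replace resampling with replacement, applied to the incremental net benefit; the paper's external-evidence modification is described in the article itself and is not reproduced here. All data and names below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      def bayesian_bootstrap_inb(cost_t, eff_t, cost_c, eff_c, lam=50_000, draws=2000):
          """Posterior draws of the incremental net benefit (INB) under the
          Bayesian bootstrap: each draw re-weights patients with
          Dirichlet(1, ..., 1) weights instead of resampling them."""
          inb = np.empty(draws)
          for i in range(draws):
              wt = rng.dirichlet(np.ones(len(cost_t)))   # weights, treatment arm
              wc = rng.dirichlet(np.ones(len(cost_c)))   # weights, control arm
              d_eff = np.sum(wt * eff_t) - np.sum(wc * eff_c)
              d_cost = np.sum(wt * cost_t) - np.sum(wc * cost_c)
              inb[i] = lam * d_eff - d_cost
          return inb

      # Toy data standing in for patient-level trial outcomes
      inb = bayesian_bootstrap_inb(cost_t=rng.gamma(2, 1500, 100),
                                   eff_t=rng.normal(0.70, 0.1, 100),
                                   cost_c=rng.gamma(2, 1000, 100),
                                   eff_c=rng.normal(0.65, 0.1, 100))
      print("P(cost-effective at lambda = 50,000):", np.mean(inb > 0))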

  15. Stability of operational taxonomic units: an important but neglected property for analyzing microbial diversity.

    PubMed

    He, Yan; Caporaso, J Gregory; Jiang, Xiao-Tao; Sheng, Hua-Fang; Huse, Susan M; Rideout, Jai Ram; Edgar, Robert C; Kopylova, Evguenia; Walters, William A; Knight, Rob; Zhou, Hong-Wei

    2015-01-01

    The operational taxonomic unit (OTU) is widely used in microbial ecology. Reproducibility in microbial ecology research depends on the reliability of OTU-based 16S ribosomal subunit RNA (rRNA) analyses. Here, we report that many hierarchical and greedy clustering methods produce unstable OTUs, with membership that depends on the number of sequences clustered. If OTUs are regenerated with additional sequences or samples, sequences originally assigned to a given OTU can be split into different OTUs. Alternatively, sequences assigned to different OTUs can be merged into a single OTU. This OTU instability affects alpha-diversity analyses such as rarefaction curves, beta-diversity analyses such as distance-based ordination (for example, Principal Coordinate Analysis (PCoA)), and the identification of differentially represented OTUs. Our results show that the proportion of unstable OTUs varies for different clustering methods. We found that the closed-reference method is the only one that produces completely stable OTUs, with the caveat that sequences that do not match a pre-existing reference sequence collection are discarded. As a compromise to the factors listed above, we propose using an open-reference method to enhance OTU stability. This type of method clusters sequences against a database and includes unmatched sequences by clustering them via a relatively stable de novo clustering method. OTU stability is an important consideration when analyzing microbial diversity and is a feature that should be taken into account during the development of novel OTU clustering methods.

  16. Assay Development for the Determination of Phosphorylation Stoichiometry using MRM methods with and without Phosphatase Treatment: Application to Breast Cancer Signaling Pathways

    PubMed Central

    Domanski, Dominik; Murphy, Leigh C.; Borchers, Christoph H.

    2010-01-01

    We have developed a phosphatase-based phosphopeptide quantitation (PPQ) method for determining phosphorylation stoichiometry in complex biological samples. This PPQ method is based on enzymatic dephosphorylation, combined with specific and accurate peptide identification and quantification by multiple reaction monitoring (MRM) detection with stable-isotope-labeled standard peptides. In contrast with the classical MRM methods for the quantitation of phosphorylation stoichiometry, the PPQ-MRM method needs only one non-phosphorylated SIS (stable isotope-coded standard) and two analyses (one for the untreated and one for the phosphatase-treated sample), from which the expression and modification levels can accurately be determined. From these analyses, the % phosphorylation can be determined. In this manuscript, we compare the PPQ-MRM method with an MRM method without phosphatase, and demonstrate the application of these methods to the detection and quantitation of phosphorylation of the classic phosphorylated breast cancer biomarkers (ERα and HER2), and for phosphorylated RAF and ERK1, which also contain phosphorylation sites with important biological implications. Using synthetic peptides spiked into a complex protein digest, we were able to use our PPQ-MRM method to accurately determine the total phosphorylation stoichiometry on specific peptides, as well as the absolute amount of the peptide and phosphopeptide present. Analyses of samples containing ERα protein revealed that the PPQ-MRM is capable of determining phosphorylation stoichiometry in proteins from cell lines, and is in good agreement with determinations obtained using the direct MRM approach in terms of phosphorylation and total protein amount. PMID:20524616
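
    The arithmetic behind the PPQ approach is simple: the untreated sample yields the amount of non-phosphorylated peptide, the phosphatase-treated sample yields the total (phosphorylated plus non-phosphorylated) peptide, and the difference gives the phosphorylated fraction. A hedged sketch of that calculation, with illustrative names and numbers only:

      def phospho_stoichiometry(untreated_fmol, treated_fmol):
          """Percent phosphorylation from MRM quantitation of the
          non-phosphorylated peptide before (untreated) and after
          (phosphatase-treated) dephosphorylation."""
          if treated_fmol <= 0:
              raise ValueError("treated amount must be positive")
          return 100.0 * (treated_fmol - untreated_fmol) / treated_fmol

      # e.g. 40 fmol non-phospho peptide before, 100 fmol after dephosphorylation
      # => 60% of the peptide was phosphorylated at this site
      print(phospho_stoichiometry(40.0, 100.0))   # 60.0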

  17. Identifying city PV roof resource based on Gabor filter

    NASA Astrophysics Data System (ADS)

    Ruhang, Xu; Zhilin, Liu; Yong, Huang; Xiaoyu, Zhang

    2017-06-01

    To identify a city’s PV roof resources, the area and ownership distribution of residential buildings in an urban district should be assessed. Analysing remote sensing data is a promising approach to this assessment. Urban building roof area estimation is a major topic in remote sensing image information extraction. There are normally three ways to solve this problem: pixel-based analysis, which relies on mathematical morphology or statistical methods; object-based analysis, which can combine semantic information and expert knowledge; and signal-processing approaches. This paper presents a Gabor filter based method, which belongs to the third category. The results show that the method is fast and of acceptable accuracy.
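
    A minimal sketch of the signal-processing idea, a bank of Gabor filters applied to an aerial image tile to emphasise oriented roof texture, is shown below. The OpenCV function names are real, but the parameter values, the random stand-in image and the thresholding step are illustrative assumptions rather than the paper's actual pipeline.

      import cv2
      import numpy as np

      def gabor_response(image_gray, orientations=8, ksize=21,
                         sigma=4.0, lambd=10.0, gamma=0.5):
          """Maximum response over a bank of Gabor filters with different
          orientations; strong oriented texture (e.g. roof edges and ridges)
          yields high values."""
          responses = []
          for k in range(orientations):
              theta = k * np.pi / orientations
              kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta,
                                          lambd, gamma, psi=0,
                                          ktype=cv2.CV_32F)
              responses.append(cv2.filter2D(image_gray, cv2.CV_32F, kernel))
          return np.max(np.stack(responses), axis=0)

      img = np.random.rand(256, 256).astype(np.float32)   # stand-in for an aerial tile
      resp = gabor_response(img)
      roof_mask = resp > resp.mean() + resp.std()          # crude threshold for illustration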

  18. Finite Element Creep Damage Analyses and Life Prediction of P91 Pipe Containing Local Wall Thinning Defect

    NASA Astrophysics Data System (ADS)

    Xue, Jilin; Zhou, Changyu

    2016-03-01

    Creep continuum damage finite element (FE) analyses were performed for a P91 steel pipe containing a local wall thinning (LWT) defect subjected to monotonic internal pressure, monotonic bending moment, and combined internal pressure and bending moment, using an orthogonal experimental design method. The creep damage lives of the pipe containing the LWT defect under different load conditions were obtained. Creep damage life formulas were then regressed from the FE creep damage life results. At the same time, a skeletal point rupture stress was identified and used for life prediction, which was compared with the creep damage lives obtained from the continuum damage analyses. The results show that the failure lives of pipes containing LWT defects can be obtained accurately using the skeletal point rupture stress method. Finally, the influence of LWT defect geometry was analysed, which indicated that relative defect depth was the most significant factor for the creep damage life of pipes containing LWT defects.

  19. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  20. A Cash Management Model.

    ERIC Educational Resources Information Center

    Boyles, William W.

    1975-01-01

    In 1973, Ronald G. Lykins presented a model for cash management and analysed its benefits for Ohio University. This paper attempts to expand on that method by using a series of simple algebraic formulas to answer questions raised by the Lykins method. Both methods are based on two premises: (1) all cash over which the business…

  1. Analysing Culture and Interculture in Saudi EFL Textbooks: A Corpus Linguistic Approach

    ERIC Educational Resources Information Center

    Almujaiwel, Sultan

    2018-01-01

    This paper combines corpus processing tools to investigate the cultural elements of Saudi education of English as a foreign language (EFL). The latest Saudi EFL textbooks (2016 onwards) are available in researchable PDF formats. This helps process them through corpus search software tools. The method adopted is based on analysing 20 cultural…

  2. Determining the 95% limit of detection for waterborne pathogen analyses from primary concentration to qPCR

    USDA-ARS?s Scientific Manuscript database

    The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay rather than the entire sample process. Our objective was to develop a method to determine the 95% LOD (lowest co...

  3. Using Toulmin Analysis to Analyse an Instructor's Proof Presentation in Abstract Algebra

    ERIC Educational Resources Information Center

    Fukawa-Connelly, Timothy

    2014-01-01

    This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of…

  4. An IC-MS/MS Method for the Determination of 1-Hydroxyethylidene-1,1-diphosphonic Acid on Uncooked Foods Treated with Peracetic Acid-Based Sanitizers.

    PubMed

    Suzuki, Ippei; Kubota, Hiroki; Ohtsuki, Takashi; Tatebe, Chiye; Tada, Atsuko; Yano, Takeo; Akiyama, Hiroshi; Sato, Kyoko

    2016-01-01

    A rapid, sensitive, and specific analytical method for the determination of 1-hydroxyethylidene-1,1-diphosphonic acid (HEDP) on uncooked foods after treatment with a peracetic acid-based sanitizer (PAS) was developed. The method involves simple sample preparation steps and analysis using ion chromatography (IC) coupled with tandem mass spectrometry (MS/MS). The quantification limits of HEDP on uncooked foods are 0.007 mg/kg for vegetables and fruits and 0.2 mg/kg for meats. The recovery and relative standard deviation (RSD) of HEDP analyses of uncooked foods ranged from 73.9 to 103.8% and 1.9 to 12.6%, respectively. The method's accuracy and precision were evaluated by inter-day recovery tests. The recovery for all samples ranged from 93.6 to 101.2%, and the within-laboratory repeatability and reproducibility were evaluated based on RSD values, which were less than 6.9 and 11.5%, respectively. Analyses of PAS-treated fruits and vegetables using the developed method indicated levels of HEDP ranging from 0.008 to 0.351 mg/kg. Therefore, the results of the present study suggest that the proposed method is an accurate, precise, and reliable way to determine residual HEDP levels on PAS-treated uncooked foods.

  5. The Co-simulation of Humanoid Robot Based on Solidworks, ADAMS and Simulink

    NASA Astrophysics Data System (ADS)

    Song, Dalei; Zheng, Lidan; Wang, Li; Qi, Weiwei; Li, Yanli

    A simulation method for an adaptive controller is proposed for a humanoid robot system based on the co-simulation of Solidworks, ADAMS and Simulink. The method avoids a complex mathematical modelling process, makes full use of the real-time dynamic simulation capability of Simulink, and can be generalized to other complicated control systems. The method is used to build and analyse the model of the humanoid robot, and the trajectory tracking and adaptive controller design proceed on that basis. The quality of trajectory tracking is evaluated by least-squares curve fitting. Comparative analysis shows that the anti-interference capability of the robot is considerably improved.
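
    The trajectory-tracking evaluation mentioned above amounts to least-squares curve fitting of the commanded versus the achieved joint trajectory; a tiny illustrative sketch with simulated data and assumed names:

      import numpy as np

      t = np.linspace(0, 2, 200)
      reference = np.sin(t)                                     # commanded joint angle
      achieved = np.sin(t) + 0.02 * np.random.randn(t.size)     # simulated response

      # Fit a low-order polynomial to the tracking error and report its RMS,
      # a simple figure of merit for how well the controller follows the path.
      error = achieved - reference
      coeffs = np.polyfit(t, error, deg=3)
      rms_error = np.sqrt(np.mean(error ** 2))
      print("fitted error polynomial:", coeffs, "RMS tracking error:", rms_error)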

  6. Microscale Concentration Measurements Using Laser Light Scattering Methods

    NASA Technical Reports Server (NTRS)

    Niederhaus, Charles; Miller, Fletcher

    2004-01-01

    The development of lab-on-a-chip devices for microscale biochemical assays has led to the need for microscale concentration measurements of specific analytes. While fluorescence methods are the current choice, they require developing fluorophore-tagged conjugates for each analyte of interest. In addition, fluorescent imaging is a volume-based method and can become limiting as smaller detection regions are required.

  7. Modelling Coastal Cliff Recession Based on the GIM-DDD Method

    NASA Astrophysics Data System (ADS)

    Gong, Bin; Wang, Shanyong; Sloan, Scott William; Sheng, Daichao; Tang, Chun'an

    2018-04-01

    The unpredictable and instantaneous collapse behaviour of coastal rocky cliffs may cause damage that extends significantly beyond the area of failure. Gravitational movements that occur during coastal cliff recession involve two major stages: the small deformation stage and the large displacement stage. In this paper, a method of simulating the entire progressive failure process of coastal rocky cliffs is developed based on the gravity increase method (GIM), the rock failure process analysis method and the discontinuous deformation analysis method, and it is referred to as the GIM-DDD method. The small deformation stage, which includes crack initiation, propagation and coalescence processes, and the large displacement stage, which includes block translation and rotation processes during the rocky cliff collapse, are modelled using the GIM-DDD method. In addition, acoustic emissions, stress field variations, crack propagation and failure mode characteristics are further analysed to provide insights that can be used to predict, prevent and minimize potential economic losses and casualties. The calculation and analytical results are consistent with previous studies, which indicate that the developed method provides an effective and reliable approach for performing rocky cliff stability evaluations and coastal cliff recession analyses and has considerable potential for improving the safety and protection of seaside cliff areas.

  8. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    PubMed Central

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing how complex analyses are accomplished. The first is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field was limited by mass spectrometric methods that depended on knowing in advance which compounds were of interest. The second is that, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts can dilute the sample sufficiently to minimize ionization changes from varied matrices. PMID:26784175

  9. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
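
    The equivalence noted above between conditional-logistic case-crossover models and log-linear time-series models is what makes log-linear diagnostics available. As a rough illustration, a time-stratified analysis can be fitted as a Poisson regression with indicator terms for each time stratum; the simulated data, variable names and coefficients below are assumptions, not results from the paper.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Simulated stand-in for daily mortality data; in a real analysis,
      # 'stratum' would be the year-by-month label defining the time strata.
      rng = np.random.default_rng(0)
      n_days = 730
      df = pd.DataFrame({
          "pm10": rng.gamma(3, 10, n_days),
          "temp": 20 + 10 * np.sin(np.arange(n_days) * 2 * np.pi / 365),
          "stratum": [f"{1995 + d // 365}-{(d % 365) // 30:02d}" for d in range(n_days)],
      })
      df["deaths"] = rng.poisson(np.exp(3 + 0.001 * df["pm10"] - 0.005 * df["temp"]))

      fit = smf.poisson("deaths ~ pm10 + temp + C(stratum)", data=df).fit()
      print(fit.params["pm10"])   # log rate ratio per unit PM10

      # Residual and influence diagnostics on this log-linear fit provide the
      # model-checking that an equivalent conditional-logistic case-crossover
      # analysis does not expose directly.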

  10. Sources of Variability in Chlorophyll Analysis by Fluorometry and by High Performance Liquid Chromatography. Chapter 22

    NASA Technical Reports Server (NTRS)

    VanHeukelem, Laurie; Thomas, Crystal S.; Glibert, Patricia M.

    2001-01-01

    The need for accurate determination of chlorophyll a (chl a) is of interest for numerous reasons. From the need for ground-truth data for remote sensing to pigment detection for laboratory experimentation, it is essential to know the accuracy of the analyses and the factors potentially contributing to variability and error. Numerous methods and instrument techniques are currently employed in the analyses of chl a. These methods range from spectrophotometric quantification, to fluorometric analysis and determination by high performance liquid chromatography. Even within the application of HPLC techniques, methods vary. Here we provide the results of a comparison among methods and provide some guidance for improving the accuracy of these analyses. These results are based on a round-robin conducted among numerous investigators, including several in the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and HyCODE Programs. Our purpose here is not to present the full results of the laboratory intercalibration; those results will be presented elsewhere. Rather, here we highlight some of the major factors that may contribute to the variability observed. Specifically, we aim to assess the comparability of chl a analyses performed by fluorometry and HPLC, and we identify several factors in the analyses which may contribute disproportionately to this variability.

  11. Measurement correction method for force sensor used in dynamic pressure calibration based on artificial neural network optimized by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Tingwei; Kong, Deren; Shang, Fei; Chen, Jing

    2017-12-01

    We present an optimization algorithm to obtain low-uncertainty dynamic pressure measurements from a force-transducer-based device. In this paper, the advantages and disadvantages of the methods that are commonly used to measure the propellant powder gas pressure, the applicable scope of dynamic pressure calibration devices, and the shortcomings of the traditional comparison calibration method based on the drop-weight device are first analysed in detail. Then, a dynamic calibration method for measuring pressure using a force sensor based on a drop-weight device is introduced. This method can effectively save time when many pressure sensors are calibrated simultaneously and extend the life of expensive reference sensors. However, the force sensor is installed between the drop-weight and the hammerhead by transition pieces through the connection mode of bolt fastening, which introduces adverse effects such as additional pretightening and inertia forces. To address these effects, the influence mechanisms of the pretightening force, the inertia force and other influence factors on the force measurement are theoretically analysed. Then a measurement correction method for the force measurement is proposed based on an artificial neural network optimized by a genetic algorithm. The training and testing data sets are obtained from calibration tests, and the selection criteria for the key parameters of the correction model are discussed. The evaluation results for the test data show that the correction model can effectively improve the force measurement accuracy of the force sensor. Compared with the traditional high-accuracy comparison calibration method, the percentage difference of the impact-force-based measurement is less than 0.6% and the relative uncertainty of the corrected force value is 1.95%, which can meet the requirements of engineering applications.
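
    The correction model described above pairs a feed-forward network with a genetic algorithm; since the abstract does not give the architecture or GA settings, the following self-contained sketch shows the general pattern only (a one-hidden-layer network whose weights are evolved by a toy GA), with every dimension, constant and data value assumed.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical training data: inputs such as drop height and raw measured
      # force; target = reference ("true") force from the calibration rig.
      X = rng.uniform(-1, 1, size=(200, 3))
      y = 0.8 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.1 * X[:, 2] + 0.05 * rng.standard_normal(200)

      HIDDEN = 6
      N_W = 3 * HIDDEN + HIDDEN + HIDDEN + 1    # weights + biases of the network

      def predict(w, X):
          """One-hidden-layer tanh network, parameters packed in a flat vector."""
          i = 0
          W1 = w[i:i + 3 * HIDDEN].reshape(3, HIDDEN); i += 3 * HIDDEN
          b1 = w[i:i + HIDDEN];                        i += HIDDEN
          W2 = w[i:i + HIDDEN];                        i += HIDDEN
          b2 = w[i]
          return np.tanh(X @ W1 + b1) @ W2 + b2

      def fitness(w):
          return -np.mean((predict(w, X) - y) ** 2)   # negative MSE (higher = better)

      # Toy GA: truncation selection, elitism, blend crossover, Gaussian mutation
      pop = rng.normal(0, 0.5, size=(60, N_W))
      for gen in range(200):
          scores = np.array([fitness(ind) for ind in pop])
          parents = pop[np.argsort(scores)[::-1][:30]]        # keep the fitter half
          new_pop = [parents[0].copy()]                       # elitism
          while len(new_pop) < len(pop):
              a, b = parents[rng.choice(len(parents), 2, replace=False)]
              new_pop.append(0.5 * (a + b) + rng.normal(0, 0.05, N_W))
          pop = np.array(new_pop)

      best = pop[np.argmax([fitness(ind) for ind in pop])]
      print("corrected-model MSE:", -fitness(best))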

  12. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    PubMed Central

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are results on spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty related to this kind of analysis due to sampling effort bias and the need for its quantification. Although a number of methods are available for this purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with other methods showed no relationship due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore, it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses. PMID:23326357
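
    All three methods compared above are built on species accumulation curves; a minimal sketch of computing such a curve from per-survey species records for one grid cell is shown below, leaving the FIDEGAM-specific fitting to the original paper. The data structure and names are assumptions.

      import numpy as np

      def accumulation_curve(records, permutations=100, rng=None):
          """records: list of per-survey species sets for one grid cell.
          Returns the mean number of distinct species found after adding
          1, 2, ..., n surveys in random order (a species accumulation curve);
          a curve that is still rising steeply suggests under-sampling."""
          rng = rng or np.random.default_rng(0)
          n = len(records)
          curve = np.zeros(n)
          for _ in range(permutations):
              order = rng.permutation(n)
              seen = set()
              for i, idx in enumerate(order):
                  seen |= records[idx]
                  curve[i] += len(seen)
          return curve / permutations

      surveys = [{"A", "B"}, {"B"}, {"A", "C"}, {"C", "D"}, {"A"}]
      print(accumulation_curve(surveys))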

  13. A novel method to handle the effect of uneven sampling effort in biodiversity databases.

    PubMed

    Pardo, Iker; Pata, María P; Gómez, Daniel; García, María B

    2013-01-01

    How reliable are results on spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty related to this kind of analysis due to sampling effort bias and the need for its quantification. Although a number of methods are available for this purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with other methods showed no relationship due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore, it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses.

  14. Nonlinear earthquake analysis of reinforced concrete frames with fiber and Bernoulli-Euler beam-column element.

    PubMed

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on Euler-Bernoulli beam theory is investigated for the nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames, and a predictor-corrector form of the Bossak-α method is applied as the dynamic integration scheme. Experimental data for an RC column element are compared with numerical results obtained from the proposed solution technique to verify the numerical solutions. Furthermore, nonlinear cyclic analysis results for a portal reinforced concrete frame are obtained to compare the proposed solution technique with a fibre element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped and distributed mass and load, and the damage regions, propagation, and intensities obtained with both approaches are examined.
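
    The Bossak-α scheme applied above belongs to the Newmark family of implicit time integrators. For orientation, the sketch below integrates a linear single-degree-of-freedom system with the plain Newmark average-acceleration method, not the α-modified predictor-corrector variant used in the paper; all numerical values are illustrative.

      import numpy as np

      def newmark_sdof(m, c, k, p, dt, gamma=0.5, beta=0.25):
          """Implicit Newmark (average-acceleration) integration of a linear
          single-degree-of-freedom system m*u'' + c*u' + k*u = p(t)."""
          n = len(p)
          u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
          a[0] = (p[0] - c * v[0] - k * u[0]) / m
          k_eff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
          for i in range(n - 1):
              p_eff = (p[i + 1]
                       + m * (u[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                              + (1 / (2 * beta) - 1) * a[i])
                       + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                              + dt * (gamma / (2 * beta) - 1) * a[i]))
              u[i + 1] = p_eff / k_eff
              v[i + 1] = (gamma / (beta * dt)) * (u[i + 1] - u[i]) \
                         + (1 - gamma / beta) * v[i] + dt * (1 - gamma / (2 * beta)) * a[i]
              a[i + 1] = (u[i + 1] - u[i]) / (beta * dt ** 2) - v[i] / (beta * dt) \
                         - (1 / (2 * beta) - 1) * a[i]
          return u, v, a

      # Forced vibration of a lightly damped oscillator under a sinusoidal load
      t = np.arange(0, 2.0, 0.01)
      u, v, a = newmark_sdof(m=1.0, c=0.1, k=100.0, p=np.sin(5 * t), dt=0.01)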

  15. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  16. GSimp: A Gibbs sampler based left-censored missing value imputation approach for metabolomics studies

    PubMed Central

    Jia, Erik; Chen, Tianlu

    2018-01-01

    Left-censored missing values commonly exist in targeted metabolomics datasets and can be considered as missing not at random (MNAR). Improper data processing procedures for missing values will cause adverse impacts on subsequent statistical analyses. However, few imputation methods have been developed and applied to the situation of MNAR in the field of metabolomics. Thus, a practical left-censored missing value imputation method is urgently needed. We developed an iterative Gibbs sampler based left-censored missing value imputation approach (GSimp). We compared GSimp with three other imputation methods on two real-world targeted metabolomics datasets and one simulation dataset using our imputation evaluation pipeline. The results show that GSimp outperforms other imputation methods in terms of imputation accuracy, observation distribution, univariate and multivariate analyses, and statistical sensitivity. Additionally, a parallel version of GSimp was developed for dealing with large-scale metabolomics datasets. The R code for GSimp, evaluation pipeline, tutorial, real-world and simulated targeted metabolomics datasets are available at: https://github.com/WandeRum/GSimp. PMID:29385130
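
    GSimp itself iterates draws from a prediction model within a Gibbs framework; the core idea of drawing left-censored values from a distribution truncated above at the detection limit can be illustrated much more simply. The sketch below is not the GSimp algorithm, and all names and numbers are assumptions.

      import numpy as np
      from scipy.stats import truncnorm

      def impute_left_censored(x, lod):
          """Replace missing (left-censored) entries of a 1-D metabolite vector
          with draws from a normal distribution fitted to the observed values
          and truncated above at the limit of detection (lod)."""
          x = np.array(x, dtype=float)                # work on a copy
          observed = x[~np.isnan(x)]
          mu, sd = observed.mean(), observed.std(ddof=1)
          a, b = -np.inf, (lod - mu) / sd             # standardised truncation bounds
          n_missing = int(np.isnan(x).sum())
          x[np.isnan(x)] = truncnorm.rvs(a, b, loc=mu, scale=sd, size=n_missing)
          return x

      values = np.array([5.2, 4.8, np.nan, 6.1, np.nan, 5.5])   # NaN = below LOD
      print(impute_left_censored(values, lod=4.5))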

  17. Live births after simultaneous avoidance of monogenic diseases and chromosome abnormality by next-generation sequencing with linkage analyses.

    PubMed

    Yan, Liying; Huang, Lei; Xu, Liya; Huang, Jin; Ma, Fei; Zhu, Xiaohui; Tang, Yaqiong; Liu, Mingshan; Lian, Ying; Liu, Ping; Li, Rong; Lu, Sijia; Tang, Fuchou; Qiao, Jie; Xie, X Sunney

    2015-12-29

    In vitro fertilization (IVF), preimplantation genetic diagnosis (PGD), and preimplantation genetic screening (PGS) help patients to select embryos free of monogenic diseases and aneuploidy (chromosome abnormality). Next-generation sequencing (NGS) methods, while experiencing a rapid cost reduction, have improved the precision of PGD/PGS. However, the precision of PGD has been limited by the false-positive and false-negative single-nucleotide variations (SNVs), which are not acceptable in IVF and can be circumvented by linkage analyses, such as short tandem repeats or karyomapping. It is noteworthy that existing methods of detecting SNV/copy number variation (CNV) and linkage analysis often require separate procedures for the same embryo. Here we report an NGS-based PGD/PGS procedure that can simultaneously detect a single-gene disorder and aneuploidy and is capable of linkage analysis in a cost-effective way. This method, called "mutated allele revealed by sequencing with aneuploidy and linkage analyses" (MARSALA), involves multiple annealing and looping-based amplification cycles (MALBAC) for single-cell whole-genome amplification. Aneuploidy is determined by CNVs, whereas SNVs associated with the monogenic diseases are detected by PCR amplification of the MALBAC product. The false-positive and -negative SNVs are avoided by an NGS-based linkage analysis. Two healthy babies, free of the monogenic diseases of their parents, were born after such embryo selection. The monogenic diseases originated from a single base mutation on the autosome and the X-chromosome of the disease-carrying father and mother, respectively.

  18. Zirconium determination by cooling curve analysis during the pyroprocessing of used nuclear fuel

    NASA Astrophysics Data System (ADS)

    Westphal, B. R.; Price, J. C.; Bateman, K. J.; Marsden, K. C.

    2015-02-01

    An alternative method to sampling and chemical analyses has been developed to monitor the concentration of zirconium in real-time during the casting of uranium products from the pyroprocessing of used nuclear fuel. The method utilizes the solidification characteristics of the uranium products to determine zirconium levels based on standard cooling curve analyses and established binary phase diagram data. Numerous uranium products have been analyzed for their zirconium content and compared against measured zirconium data. From this data, the following equation was derived for the zirconium content of uranium products:

  19. [Point of Care 2.0: Coagulation Monitoring Using Rotem® Sigma and Teg® 6S].

    PubMed

    Weber, Christian Friedrich; Zacharowski, Kai

    2018-06-01

    New-generation methods for point-of-care coagulation monitoring enable fully automated viscoelastic analyses for the assessment of particular parts of hemostasis. In contrast to the measuring techniques of earlier models, the viscoelastic ROTEM® sigma and TEG® 6s analyses are performed in single-use test cartridges without time- and personnel-intensive pre-analytical procedures. This review highlights methodical strengths and limitations of the devices and addresses concerns associated with their integration into routine clinical practice. Georg Thieme Verlag KG Stuttgart · New York.

  20. Canonical decomposition of magnetotelluric responses: Experiment on 1D anisotropic structures

    NASA Astrophysics Data System (ADS)

    Guo, Ze-qiu; Wei, Wen-bo; Ye, Gao-feng; Jin, Sheng; Jing, Jian-en

    2015-08-01

    Horizontal electrical heterogeneity of the subsurface originates mostly from structural complexity and electrical anisotropy, and local near-surface electrical heterogeneity severely distorts regional electromagnetic responses. Conventional distortion analyses for magnetotelluric soundings are primarily physical decomposition methods with respect to isotropic models, which mostly presume that the geoelectric distribution of geological structures follows local and regional patterns represented by 3D/2D models. Because of the widespread anisotropy of earth media, the confusion between 1D anisotropic responses and 2D isotropic responses, and the shortcomings of physical decomposition methods, we propose modelling experiments with canonical decomposition in terms of 1D layered anisotropic models. This method is a mathematical decomposition method based on eigenstate analyses, as distinct from distortion analyses, and can be used to recover electrical information such as strike directions and maximum and minimum conductivity. We tested the method with numerical simulation experiments on several 1D synthetic models, which showed that canonical decomposition is quite effective at revealing geological anisotropy information. Finally, given the anisotropy background established by previous geological and seismological studies, canonical decomposition is applied to real data acquired in the North China Craton for 1D anisotropy analyses; the result shows that, with effective modelling and cautious interpretation, canonical decomposition can be another good method for detecting anisotropy of geological media.

  1. Ulex Europaeus Agglutinin-1 Is a Reliable Taste Bud Marker for In Situ Hybridization Analyses.

    PubMed

    Yoshimoto, Joto; Okada, Shinji; Kishi, Mikiya; Misaka, Takumi

    2016-03-01

    Taste signals are received by taste buds. To better understand the taste reception system, expression patterns of taste-related molecules are determined by in situ hybridization (ISH) analyses at the histological level. Nevertheless, even though ISH is essential for determining mRNA expression, few taste bud markers can be applied together with ISH. Ulex europaeus agglutinin-1 (UEA-1) appears to be a reliable murine taste bud marker based on immunohistochemistry (IHC) analyses. However, there is no evidence as to whether UEA-1 can be used for ISH. Thus, the present study evaluated UEA-1 using various histochemical methods, especially ISH. When lectin staining was performed after ISH procedures, UEA-1 clearly labeled taste cellular membranes and distinctly indicated boundaries between taste buds and the surrounding epithelial cells. Additionally, UEA-1 was determined as a taste bud marker not only when used in single-colored ISH but also when employed with double-labeled ISH or during simultaneous detection using IHC and ISH methods. These results suggest that UEA-1 is a useful marker when conducting analyses based on ISH methods. To clarify UEA-1 staining details, multi-fluorescent IHC (together with UEA-1 staining) was examined, resulting in more than 99% of cells being labeled by UEA-1 and overlapping with KCNQ1-expressing cells. © 2016 The Histochemical Society.

  2. Missing data in trial-based cost-effectiveness analysis: An incomplete journey.

    PubMed

    Leurent, Baptiste; Gomes, Manuel; Carpenter, James R

    2018-06-01

    Cost-effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial-based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty-two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost-effectiveness data was 63% (interquartile range: 47%-81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing-at-random assumption. Further improvements are needed to address missing data in cost-effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing-at-random assumption. © 2018 The Authors Health Economics published by John Wiley & Sons Ltd.

  3. Ulex Europaeus Agglutinin-1 Is a Reliable Taste Bud Marker for In Situ Hybridization Analyses

    PubMed Central

    Yoshimoto, Joto; Okada, Shinji; Kishi, Mikiya; Misaka, Takumi

    2015-01-01

    Taste signals are received by taste buds. To better understand the taste reception system, expression patterns of taste-related molecules are determined by in situ hybridization (ISH) analyses at the histological level. Nevertheless, even though ISH is essential for determining mRNA expression, few taste bud markers can be applied together with ISH. Ulex europaeus agglutinin-1 (UEA-1) appears to be a reliable murine taste bud marker based on immunohistochemistry (IHC) analyses. However, there is no evidence as to whether UEA-1 can be used for ISH. Thus, the present study evaluated UEA-1 using various histochemical methods, especially ISH. When lectin staining was performed after ISH procedures, UEA-1 clearly labeled taste cellular membranes and distinctly indicated boundaries between taste buds and the surrounding epithelial cells. Additionally, UEA-1 was determined as a taste bud marker not only when used in single-colored ISH but also when employed with double-labeled ISH or during simultaneous detection using IHC and ISH methods. These results suggest that UEA-1 is a useful marker when conducting analyses based on ISH methods. To clarify UEA-1 staining details, multi-fluorescent IHC (together with UEA-1 staining) was examined, resulting in more than 99% of cells being labeled by UEA-1 and overlapping with KCNQ1-expressing cells. PMID:26718243

  4. Initiating communication about parental mental illness in families: an issue of confidence and security.

    PubMed

    Pihkala, Heljä; Sandlund, Mikael; Cederström, Anita

    2012-05-01

    Beardslee's family intervention (FI) is a family-based intervention to prevent psychiatric problems for children of mentally ill parents. The parents' experiences are of importance in family-based interventions. Twenty five parents were interviewed about their experiences of FI. Data were analysed by qualitative methods. Confidence and security in the professionals and in FI as a method were prerequisites for initiating communication about the parents' mental illness with the children. FI provides a solid base for an alliance with the parents and might be a practicable method when parenthood and children are discussed with psychiatric patients.

  5. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making (Guide OTAN pour l’analyse operationnelle basee sur le jugement dans la prise de decision de defense). Analyst-Oriented Volume: Code of Best Practice for Soft Operational Analysis

    DTIC Science & Technology

    2012-06-01

    Military Operational Research, with special theme ‘The use of ‘soft’ methods in OR’. OR52 (7 – 9 September 2010, Royal Holloway University of London...on human judgement. Judgement-based OA applies the methods of ‘Soft Operational Research’ developed in academia. It has appeared, however, that the...similarity between judgemental methods in operational research practice and a number of other modes of professional analytical practice. The closest

  6. Methodological Standards for Meta-Analyses and Qualitative Systematic Reviews of Cardiac Prevention and Treatment Studies: A Scientific Statement From the American Heart Association.

    PubMed

    Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer

    2017-09-05

    Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
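
    For orientation, the conventional pooling step that the statement discusses, a DerSimonian-Laird random-effects model combining per-study effect estimates and their variances, can be written in a few lines. This is a generic textbook sketch with made-up numbers, not code or data from the statement.

      import numpy as np

      def random_effects_pool(effects, variances):
          """DerSimonian-Laird random-effects pooling of per-study effect
          estimates (e.g. log risk ratios) with within-study variances."""
          y = np.asarray(effects, dtype=float)
          v = np.asarray(variances, dtype=float)
          w = 1.0 / v                                     # fixed-effect weights
          y_fe = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_fe) ** 2)                 # Cochran's Q
          df = len(y) - 1
          tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
          w_re = 1.0 / (v + tau2)                         # random-effects weights
          pooled = np.sum(w_re * y) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return pooled, se, tau2

      pooled, se, tau2 = random_effects_pool([0.10, -0.05, 0.22, 0.08],
                                             [0.02, 0.05, 0.03, 0.04])
      print(f"pooled = {pooled:.3f} (95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")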

  7. ShapeRotator: An R tool for standardized rigid rotations of articulated three-dimensional structures with application for geometric morphometrics.

    PubMed

    Vidal-García, Marta; Bandara, Lashi; Keogh, J Scott

    2018-05-01

    The quantification of complex morphological patterns typically involves comprehensive shape and size analyses, usually obtained by gathering morphological data from all the structures that capture the phenotypic diversity of an organism or object. Articulated structures are a critical component of overall phenotypic diversity, but data gathered from these structures are difficult to incorporate into modern analyses because of the complexities associated with jointly quantifying 3D shape in multiple structures. While there are existing methods for analyzing shape variation in articulated structures in two-dimensional (2D) space, these methods do not work in 3D, a rapidly growing area of capability and research. Here, we describe a simple geometric rigid rotation approach that removes the effect of random translation and rotation, enabling the morphological analysis of 3D articulated structures. Our method is based on Cartesian coordinates in 3D space, so it can be applied to any morphometric problem that also uses 3D coordinates (e.g., spherical harmonics). We demonstrate the method by applying it to a landmark-based dataset for analyzing shape variation using geometric morphometrics. We have developed an R tool (ShapeRotator) so that the method can be easily implemented in the commonly used R package geomorph and MorphoJ software. This method will be a valuable tool for 3D morphological analyses in articulated structures by allowing an exhaustive examination of shape and size diversity.
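
    Removing random translation and rotation from a 3D landmark configuration is, at bottom, a rigid (Procrustes/Kabsch) alignment. The sketch below shows that generic step with NumPy, not the ShapeRotator-specific handling of two articulated subsets, which is described in the paper and implemented in the R tool; the toy data are assumptions.

      import numpy as np

      def rigid_align(moving, target):
          """Rigidly align one (n_landmarks, 3) configuration to another by
          removing translation and finding the optimal rotation via SVD
          (Kabsch algorithm). No scaling or reflection is applied."""
          mov_c = moving - moving.mean(axis=0)
          tgt_c = target - target.mean(axis=0)
          u, _, vt = np.linalg.svd(mov_c.T @ tgt_c)
          d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflection
          rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
          return (rot @ mov_c.T).T + target.mean(axis=0)

      # Toy check: a rotated-and-shifted copy aligns back onto the original
      rng = np.random.default_rng(3)
      pts = rng.standard_normal((10, 3))
      theta = 0.7
      rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                     [np.sin(theta),  np.cos(theta), 0],
                     [0, 0, 1]])
      moved = pts @ rz.T + np.array([2.0, -1.0, 0.5])
      print(np.allclose(rigid_align(moved, pts), pts, atol=1e-8))   # True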

  8. Molecular species delimitation methods and population genetics data reveal extensive lineage diversity and cryptic species in Aglaopheniidae (Hydrozoa).

    PubMed

    Postaire, Bautisse; Magalon, Hélène; Bourmaud, Chloé A-F; Bruggemann, J Henrich

    2016-12-01

    A comprehensive inventory of global biodiversity would be greatly improved by automating methods for species delimitation. The Automatic Barcode Gap Discovery method, the Poisson tree processes algorithm and the Generalized mixed Yule-coalescent model have been proposed as means of increasing the rate of biodiversity description using single locus data. We applied these methods to explore the diversity within the Aglaopheniidae, a hydrozoan family with many species widely distributed across tropical and temperate oceans. Our analyses revealed widespread cryptic diversity in this family, almost half of the morpho-species presenting several independent evolutionary lineages, as well as support for cases of synonymy. For two common species of this family, Lytocarpia brevirostris and Macrorhynchia phoenicea, we compared the outputs to clustering analyses based on microsatellite data and to nuclear gene phylogenies. For L. brevirostris, microsatellite data were congruent with results of the species delimitation methods, revealing the existence of two cryptic species with Indo-Pacific distribution. For M. phoenicea, all analyses confirmed the presence of two cryptic species within the South-Western Indian Ocean. Our study suggests that the diversity of Aglaopheniidae might be much higher than assumed, likely related to low dispersal capacities. Sequence-based species delimitation methods seem highly valuable to reveal cryptic diversity in hydrozoans; their application in an integrative framework will be very useful in describing the phyletic diversity of these organisms. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. CyTOF workflow: differential discovery in high-throughput high-dimensional cytometry datasets

    PubMed Central

    Nowicka, Malgorzata; Krieg, Carsten; Weber, Lukas M.; Hartmann, Felix J.; Guglietta, Silvia; Becher, Burkhard; Levesque, Mitchell P.; Robinson, Mark D.

    2017-01-01

    High dimensional mass and flow cytometry (HDCyto) experiments have become a method of choice for high throughput interrogation and characterization of cell populations. Here, we present an R-based pipeline for differential analyses of HDCyto data, largely based on Bioconductor packages. We computationally define cell populations using FlowSOM clustering, and facilitate an optional but reproducible strategy for manual merging of algorithm-generated clusters. Our workflow offers different analysis paths, including association of cell type abundance with a phenotype or changes in signaling markers within specific subpopulations, or differential analyses of aggregated signals. Importantly, the differential analyses we show are based on regression frameworks where the HDCyto data is the response; thus, we are able to model arbitrary experimental designs, such as those with batch effects, paired designs and so on. In particular, we apply generalized linear mixed models to analyses of cell population abundance or cell-population-specific analyses of signaling markers, allowing overdispersion in cell count or aggregated signals across samples to be appropriately modeled. To support the formal statistical analyses, we encourage exploratory data analysis at every step, including quality control (e.g. multi-dimensional scaling plots), reporting of clustering results (dimensionality reduction, heatmaps with dendrograms) and differential analyses (e.g. plots of aggregated signals). PMID:28663787
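
    As a simplified illustration of modeling cell-population abundance with overdispersion, the hedged Python sketch below fits a negative binomial GLM to hypothetical per-sample cluster counts with a log-total-cells offset; the actual workflow uses generalized linear mixed models in Bioconductor packages, and all column names, counts and the fixed dispersion value here are assumptions.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Hypothetical per-sample counts for one FlowSOM cluster (8 samples, 2 conditions).
      df = pd.DataFrame({
          "cluster_count": [120, 95, 143, 110, 310, 280, 295, 330],   # cells in the cluster
          "total_count":   [5000, 4800, 5200, 5100, 5050, 4900, 5150, 5000],
          "condition":     [0, 0, 0, 0, 1, 1, 1, 1],                  # e.g. reference vs stimulated
      })

      # Negative binomial GLM of cluster counts, offset by log total cells per sample,
      # allowing overdispersion relative to a Poisson model of abundance.
      # The dispersion parameter is fixed here for simplicity; in practice it would be estimated.
      X = sm.add_constant(df["condition"])
      model = sm.GLM(df["cluster_count"], X,
                     family=sm.families.NegativeBinomial(alpha=0.1),
                     offset=np.log(df["total_count"]))
      fit = model.fit()
      print(fit.summary())   # the 'condition' coefficient is the log fold-change in abundance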

  10. Parent-Based Adolescent Sexual Health Interventions And Effect on Communication Outcomes: A Systematic Review and Meta-Analyses

    PubMed Central

    Maria, Diane Santa; Markham, Christine; Mullen, Patricia Dolan; Bluethmann, Shirley

    2016-01-01

    Context Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. Methods A systematic search of databases for the period 1998–2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Results Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. Conclusions These findings point to gaps in the range of programs examined in published trials—for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. PMID:25639664

  11. Accurate Phylogenetic Tree Reconstruction from Quartets: A Heuristic Approach

    PubMed Central

    Reaz, Rezwana; Bayzid, Md. Shamsuzzoha; Rahman, M. Sohel

    2014-01-01

    Supertree methods construct trees on a set of taxa (species) by combining many smaller trees on overlapping subsets of the entire set of taxa. A ‘quartet’ is an unrooted tree over four taxa, hence quartet-based supertree methods combine many four-taxon unrooted trees into a single coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have been receiving considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets. PMID:25117474

  12. Bayesian techniques for analyzing group differences in the Iowa Gambling Task: A case study of intuitive and deliberate decision-makers.

    PubMed

    Steingroever, Helen; Pachur, Thorsten; Šmíra, Martin; Lee, Michael D

    2018-06-01

    The Iowa Gambling Task (IGT) is one of the most popular experimental paradigms for comparing complex decision-making across groups. Most commonly, IGT behavior is analyzed using frequentist tests to compare performance across groups, and to compare inferred parameters of cognitive models developed for the IGT. Here, we present a Bayesian alternative based on Bayesian repeated-measures ANOVA for comparing performance, and a suite of three complementary model-based methods for assessing the cognitive processes underlying IGT performance. The three model-based methods involve Bayesian hierarchical parameter estimation, Bayes factor model comparison, and Bayesian latent-mixture modeling. We illustrate these Bayesian methods by applying them to test the extent to which differences in intuitive versus deliberate decision style are associated with differences in IGT performance. The results show that intuitive and deliberate decision-makers behave similarly on the IGT, and the modeling analyses consistently suggest that both groups of decision-makers rely on similar cognitive processes. Our results challenge the notion that individual differences in intuitive and deliberate decision styles have a broad impact on decision-making. They also highlight the advantages of Bayesian methods, especially their ability to quantify evidence in favor of the null hypothesis, and that they allow model-based analyses to incorporate hierarchical and latent-mixture structures.

  13. puma: a Bioconductor package for propagating uncertainty in microarray analysis.

    PubMed

    Pearson, Richard D; Liu, Xuejun; Sanguinetti, Guido; Milo, Marta; Lawrence, Neil D; Rattray, Magnus

    2009-07-09

    Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also suffered from being very costly to compute, and in the case of differential expression detection, have been limited in the experimental designs to which they can be applied. puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself but also in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods. puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.

  14. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    PubMed

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECTci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned, based on the method of Harris and Boyd, into three scenarios: combined sexes; males and females separately; and age and sex. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.
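
    A small, hedged Python sketch of the basic reference-interval computation (the central 95% of a healthy cohort, optionally partitioned by sex); the analyte values are simulated, and the Harris and Boyd partitioning test used in the study is not reproduced here.

      import numpy as np

      def reference_interval(values, low=2.5, high=97.5):
          """Nonparametric reference interval: central 95% of a healthy cohort."""
          values = np.asarray(values, dtype=float)
          return np.percentile(values, [low, high])

      # Hypothetical serum analyte results (arbitrary units) for a healthy cohort.
      rng = np.random.default_rng(1)
      males = rng.normal(loc=14.5, scale=1.2, size=900)
      females = rng.normal(loc=13.2, scale=1.1, size=950)

      print("combined:", reference_interval(np.concatenate([males, females])))
      print("males:   ", reference_interval(males))
      print("females: ", reference_interval(females))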

  15. Whale song analyses using bioinformatics sequence analysis approaches

    NASA Astrophysics Data System (ADS)

    Chen, Yian A.; Almeida, Jonas S.; Chou, Lien-Siang

    2005-04-01

    Animal songs are frequently analyzed using discrete hierarchical levels, such as units, themes and songs. Because animal songs and bio-sequences may be understood as analogous, bioinformatics analysis tools, namely DNA/protein sequence alignment and alignment-free methods, are proposed to quantify the theme similarities of the songs of false killer whales recorded off northeast Taiwan. The eighteen themes with discrete units that were identified in an earlier study [Y. A. Chen, master's thesis, University of Charleston, 2001] were compared quantitatively using several distance metrics. These metrics included the scores calculated using the Smith-Waterman algorithm with the repeated procedure, the standardized Euclidean distance, and the angle metrics based on word frequencies. The theme classifications based on the different metrics were summarized and compared in dendrograms using cluster analyses. The results agree qualitatively with earlier classifications derived by human observation. These methods further quantify the similarities among themes and could be applied to the analyses of other animal songs on a larger scale. For instance, these techniques could be used to investigate song evolution and cultural transmission by quantifying the dissimilarities of humpback whale songs across different seasons, years, populations, and geographic regions. [Work supported by SC Sea Grant, and Ilan County Government, Taiwan.]
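
    A hedged Python sketch of the alignment-free angle metric mentioned above: each theme is reduced to a vector of word (unit k-gram) frequencies and the angle between two such vectors serves as a pairwise distance for clustering; the unit labels and themes are invented for illustration.

      import math
      from collections import Counter

      def unit_frequencies(theme, k=1):
          """Count k-grams ('words') of discrete song units in a theme."""
          grams = [tuple(theme[i:i + k]) for i in range(len(theme) - k + 1)]
          return Counter(grams)

      def angle_distance(theme_a, theme_b, k=1):
          """Angle between word-frequency vectors of two themes (0 = identical usage)."""
          fa, fb = unit_frequencies(theme_a, k), unit_frequencies(theme_b, k)
          keys = set(fa) | set(fb)
          dot = sum(fa[w] * fb[w] for w in keys)
          na = math.sqrt(sum(v * v for v in fa.values()))
          nb = math.sqrt(sum(v * v for v in fb.values()))
          return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

      # Hypothetical themes coded as sequences of discrete unit labels.
      theme1 = ["A", "B", "B", "C", "A", "B"]
      theme2 = ["A", "B", "C", "C", "A", "B"]
      print(angle_distance(theme1, theme2, k=2))   # pairwise distance used for clustering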

  16. Analysis of concrete beams using applied element method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, as in FEM, but the elements are connected by springs rather than at nodes as in FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it is used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs has little influence on the results, and AEM could predict deflections and reactions with a reasonable degree of accuracy.
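
    To illustrate the spring-connection idea in the simplest possible setting, the hedged Python sketch below assembles and solves a one-dimensional chain of elements joined by normal springs with one fixed end; the 2D AEM formulation of the paper (with normal and shear springs on element faces) is not reproduced, and the stiffness and load values are arbitrary.

      import numpy as np

      # Toy 1D chain of AEM-style elements joined by springs (fixed at the left end).
      # Each spring contributes the usual 2x2 stiffness block k * [[1, -1], [-1, 1]].
      n_elements = 4
      k = 2.0e6            # hypothetical spring stiffness, N/m
      P = 1.0e3            # load applied at the free end, N

      n_dof = n_elements + 1
      K = np.zeros((n_dof, n_dof))
      for e in range(n_elements):
          K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

      F = np.zeros(n_dof)
      F[-1] = P

      # Apply the fixed support by removing the first degree of freedom.
      u = np.zeros(n_dof)
      u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
      reaction = K[0, :] @ u - F[0]       # support reaction recovered from equilibrium
      print(u)                            # displacements grow linearly along the chain
      print(reaction)                     # equals -P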

  17. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products; ANALYSE QUANTITATIVE DU THORIUM DANS LES MINERAIS ET LES PRODUITS THORIFERES PAR UNE METHODE BASEE SUR LA RADIOACTIVITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

    A brief summary of the classical methods for the quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements are discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or smaller. (J.S.R.)

  18. Rhythmic entrainment source separation: Optimizing analyses of neural responses to rhythmic sensory stimulation.

    PubMed

    Cohen, Michael X; Gulbinaite, Rasa

    2017-02-15

    Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
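
    One common way to build such a spatial filter, shown here as a hedged Python sketch, is a generalized eigendecomposition of a narrow-band covariance (at the stimulation frequency) against a broadband reference covariance; the simulated data, filter bandwidth and ridge regularization are illustrative assumptions rather than the exact RESS recipe.

      import numpy as np
      from scipy.linalg import eigh
      from scipy.signal import butter, filtfilt

      def ress_filter(data, fs, f_stim, bw=0.5):
          """Return a spatial filter maximizing narrow-band power at f_stim.

          data: channels x samples. The filter is the leading generalized
          eigenvector of (narrow-band covariance, broadband covariance).
          """
          b, a = butter(4, [(f_stim - bw) / (fs / 2), (f_stim + bw) / (fs / 2)], btype="band")
          narrow = filtfilt(b, a, data, axis=1)
          S = np.cov(narrow)            # covariance of the narrow-band signal
          R = np.cov(data)              # reference (broadband) covariance
          ridge = 1e-6 * np.trace(R) / len(R) * np.eye(len(R))   # small regularization
          evals, evecs = eigh(S, R + ridge)
          return evecs[:, np.argmax(evals)]   # component time course: w @ data

      # Hypothetical data: 16 channels, 10 s at 250 Hz, SSEP-like 12 Hz component in noise.
      fs, f_stim = 250, 12.0
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(2)
      mixing = rng.normal(size=16)
      data = np.outer(mixing, np.sin(2 * np.pi * f_stim * t)) + rng.normal(size=(16, t.size))
      w = ress_filter(data, fs, f_stim)
      component = w @ data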

  19. Novel citation-based search method for scientific literature: application to meta-analyses.

    PubMed

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
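
    A hedged Python sketch of the ranking step: candidate articles are scored by how often they appear in reference lists together with one or more 'known' articles, and only those above a selection threshold are screened; the reference lists and threshold below are invented for illustration.

      from collections import Counter

      def co_citation_scores(known_articles, citing_papers):
          """Score candidate articles by how often they are cited together with
          one or more 'known' articles.

          citing_papers: iterable of reference lists (one list per citing paper).
          """
          scores = Counter()
          known = set(known_articles)
          for references in citing_papers:
              refs = set(references)
              if refs & known:                       # this paper cites a known article
                  for article in refs - known:
                      scores[article] += 1           # co-cited once more
          return scores

      # Hypothetical reference lists from a citation database.
      known = ["known_trial_1", "known_trial_2"]
      citing = [
          ["known_trial_1", "candidate_A", "candidate_B"],
          ["known_trial_2", "candidate_A"],
          ["candidate_B", "candidate_C"],            # does not cite a known article
      ]
      scores = co_citation_scores(known, citing)
      threshold = 2
      to_screen = [a for a, s in scores.items() if s >= threshold]
      print(scores, to_screen)                       # candidate_A is screened first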

  20. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions on the basis of original articles. Meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those which evaluate quantitative variables (for example, the body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or has recovered or not). Quantitative variables are generally analysed as mean differences, while qualitative variables can be analysed using several measures: the odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented in forest plots, which allow the evaluation of each individual study, the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
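
    A short worked example (in Python) of the qualitative-outcome measures listed above, computed from a hypothetical 2x2 table; the numbers are invented purely to show the arithmetic.

      # Hypothetical 2x2 table: events / no events in treatment and control groups.
      events_t, no_events_t = 30, 70      # treatment arm
      events_c, no_events_c = 50, 50      # control arm

      risk_t = events_t / (events_t + no_events_t)
      risk_c = events_c / (events_c + no_events_c)

      rr = risk_t / risk_c                                 # relative risk
      arr = risk_c - risk_t                                # absolute risk reduction
      odds_t = events_t / no_events_t
      odds_c = events_c / no_events_c
      or_ = odds_t / odds_c                                # odds ratio

      print(f"RR = {rr:.2f}, ARR = {arr:.2f}, OR = {or_:.2f}")
      # RR = 0.60, ARR = 0.20, OR = 0.43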

  1. Size-based separation methods of circulating tumor cells.

    PubMed

    Hao, Si-Jie; Wan, Yuan; Xia, Yi-Qiu; Zou, Xin; Zheng, Si-Yang

    2018-02-01

    Circulating tumor cells (CTCs) originate from the primary tumor mass and enter the peripheral bloodstream. Compared to other "liquid biopsy" analytes such as exosomes and circulating tumor DNA/RNA (ctDNA/RNA), CTCs have incomparable advantages in analyses of transcriptomics, proteomics, and signal colocalization. Hence, CTCs hold the key to understanding the biology of metastasis and play a vital role in cancer diagnosis, treatment monitoring, and prognosis. Size-based enrichment is prominent in CTC isolation: it is a label-free, simple and fast method, and the enriched CTCs remain unmodified and viable for a wide range of subsequent analyses. In this review, we comprehensively summarize the differences in size and deformability between CTCs and blood cells, which should facilitate the development of technologies for size-based CTC isolation. We then review representative size- and deformability-based technologies available for CTC isolation and highlight recent achievements in the molecular analysis of isolated CTCs. Finally, we discuss the substantial challenges facing the field and elaborate on its prospects. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects at corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we give a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate its possible uses by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
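
    The core ISC computation can be sketched in a few lines of Python: for a given brain location, correlate each subject's time series with every other subject's and average the pairwise coefficients; the simulated data below and the plain averaging are illustrative, and the toolbox's resampling inference and phase-synchronization analyses are not shown.

      import numpy as np
      from itertools import combinations

      def inter_subject_correlation(time_series):
          """Mean pairwise Pearson correlation across subjects for one voxel/region.

          time_series: subjects x timepoints array from corresponding brain locations.
          """
          pairs = [np.corrcoef(time_series[i], time_series[j])[0, 1]
                   for i, j in combinations(range(time_series.shape[0]), 2)]
          return np.mean(pairs)

      # Hypothetical data: 10 subjects, 200 timepoints, sharing a common stimulus-driven signal.
      rng = np.random.default_rng(3)
      shared = rng.normal(size=200)
      data = shared + 0.8 * rng.normal(size=(10, 200))   # shared signal + subject-specific noise
      print(inter_subject_correlation(data))             # clearly above zero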

  4. A risk-based statistical investigation of the quantification of polymorphic purity of a pharmaceutical candidate by solid-state 19F NMR.

    PubMed

    Barry, Samantha J; Pham, Tran N; Borman, Phil J; Edwards, Andrew J; Watson, Simon A

    2012-01-27

    The DMAIC (Define, Measure, Analyse, Improve and Control) framework and associated statistical tools have been applied to both identify and reduce variability observed in a quantitative (19)F solid-state NMR (SSNMR) analytical method. The method had been developed to quantify levels of an additional polymorph (Form 3) in batches of an active pharmaceutical ingredient (API), where Form 1 is the predominant polymorph. In order to validate analyses of the polymorphic form, a single batch of API was used as a standard each time the method was used. The level of Form 3 in this standard was observed to gradually increase over time, the effect not being immediately apparent due to method variability. In order to determine the cause of this unexpected increase and to reduce method variability, a risk-based statistical investigation was performed to identify potential factors which could be responsible for these effects. Factors identified by the risk assessment were investigated using a series of designed experiments to gain a greater understanding of the method. The increase of the level of Form 3 in the standard was primarily found to correlate with the number of repeat analyses, an effect not previously reported in SSNMR literature. Differences in data processing (phasing and linewidth) were found to be responsible for the variability in the method. After implementing corrective actions the variability was reduced such that the level of Form 3 was within an acceptable range of ±1% ww(-1) in fresh samples of API. Copyright © 2011. Published by Elsevier B.V.

  5. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and determinist approaches to seismic hazard analysis.

  6. Concept mapping as a method to enhance evidence-based public health.

    PubMed

    van Bon-Martens, Marja J H; van de Goor, Ien A M; van Oers, Hans A M

    2017-02-01

    In this paper we explore the suitability of concept mapping as a method for integrating knowledge from science, practice, and policy. In earlier research we described and analysed five cases of concept mapping procedures in the Netherlands, serving different purposes and fields in public health. In the current paper, seven new concept mapping studies of co-produced work are added to extend this analysis. For each of these twelve studies we analysed: (1) how the method was able to integrate knowledge from practice with scientific knowledge by facilitating dialogue and collaboration between different stakeholders in the field of public health, such as academic researchers, practitioners, policy-makers and the public; (2) how the method was able to bring theory development a step further (scientific relevance); and (3) how the method was able to act as a sound basis for practical decision-making (practical relevance). Based on the answers to these research questions, all but one study was considered useful for building more evidence-based public health, even though the extent to which they underpinned actual decision-making varied. The chance of actually being implemented in practice seems strongly related to the extent to which the responsible decision-makers are involved in the way the concept map is prepared and executed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Problems in Choosing Tools and Methods for Teaching Programming

    ERIC Educational Resources Information Center

    Vitkute-Adžgauskiene, Davia; Vidžiunas, Antanas

    2012-01-01

    The paper analyses the problems of selecting and integrating tools for delivering basic programming knowledge at the university level. A discussion and analysis of the teaching of programming disciplines, the main principles of study programme design, and the requirements for teaching tools, methods and corresponding languages are presented, based on literature…

  8. The High School & Beyond Data Set: Academic Self-Concept Measures.

    ERIC Educational Resources Information Center

    Strein, William

    A series of confirmatory factor analyses using both LISREL VI (maximum likelihood method) and LISCOMP (weighted least squares method using covariance matrix based on polychoric correlations) and including cross-validation on independent samples were applied to items from the High School and Beyond data set to explore the measurement…

  9. Analysis of Processed Foods Containing Oils and Fats by Time of Flight Mass Spectrometry with an APCI Direct Probe.

    PubMed

    Ito, Shihomi; Chikasou, Masato; Inohana, Shuichi; Fujita, Kazuhiro

    2016-01-01

    Discriminating vegetable oils and animal and milk fats by infrared absorption spectroscopy is difficult due to similarities in their spectral patterns. Therefore, a rapid and simple method for analyzing vegetable oils, animal fats, and milk fats using TOF/MS with an APCI direct probe ion source was developed. This method enabled discrimination of these oils and fats based on mass spectra and detailed analyses of the ions derived from sterols, even in samples consisting of only a few milligrams. Analyses of the mass spectra of processed foods containing oils and milk fats, such as butter, cheese, and chocolate, enabled confirmation of the raw material origin based on specific ions derived from the oils and fats used to produce the final product.

  10. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    NASA Astrophysics Data System (ADS)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically-based, spatially distributed daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a (fully automated) variation of logic previously presented in the literature for definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead, a path is derived based on a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed with a traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
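
    A hedged Python sketch of deriving a path from a cost function rather than from downslope tracing: Dijkstra's algorithm over a 2D cost grid in which glacier margins are penalized; the cost values and grid are invented and the authors' actual cost function is not reproduced.

      import heapq
      import numpy as np

      def least_cost_path(cost, start, end):
          """Dijkstra shortest path over a 2D cost grid (4-connected)."""
          rows, cols = cost.shape
          dist = np.full(cost.shape, np.inf)
          prev = {}
          dist[start] = cost[start]
          heap = [(cost[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == end:
                  break
              if d > dist[r, c]:
                  continue
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nd = d + cost[nr, nc]
                      if nd < dist[nr, nc]:
                          dist[nr, nc] = nd
                          prev[(nr, nc)] = (r, c)
                          heapq.heappush(heap, (nd, (nr, nc)))
          path, node = [end], end
          while node != start:
              node = prev[node]
              path.append(node)
          return path[::-1]

      # Hypothetical cost grid for a small glacier: cheap along the flow axis, expensive near margins.
      cost = np.ones((6, 8))
      cost[0, :] = cost[-1, :] = 10.0          # penalize the glacier margins
      path = least_cost_path(cost, start=(2, 0), end=(3, 7))
      print(path)                               # stays in the glacier interior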

  11. Improved neutron-gamma discrimination for a 3He neutron detector using subspace learning methods

    DOE PAGES

    Wang, C. L.; Funk, L. L.; Riedel, R. A.; ...

    2017-02-10

    3He gas based neutron linear-position-sensitive detectors (LPSDs) have been applied in many neutron scattering instruments. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in neutron-gamma efficiency ratios on the order of 10^5-10^6. The NGD ratios of 3He detectors need to be improved for even better scientific results from neutron scattering. Digital Signal Processing (DSP) analyses of waveforms were proposed for obtaining better NGD ratios, based on features extracted from rise-time, pulse amplitude, charge integration, a simplified Wiener filter, and the cross-correlation between individual and template waveforms of neutron and gamma events. Fisher linear discriminant analysis (FLDA) and three multivariate analyses (MVAs) of the features were performed. The NGD ratios are improved by about 10^2-10^3 times compared with the traditional PHA method. Finally, our results indicate that the NGD capabilities of 3He tube detectors can be significantly improved with subspace-learning based methods, which may result in reduced data-collection time and better data quality for further data reduction.
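
    A hedged Python sketch of the FLDA step on waveform-derived features (rise time, pulse amplitude, integrated charge), using scikit-learn's LinearDiscriminantAnalysis on synthetic neutron and gamma feature distributions; the feature values and class separations are invented for illustration.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Synthetic waveform features: [rise time (us), pulse amplitude, integrated charge].
      rng = np.random.default_rng(4)
      neutrons = rng.normal([3.0, 1.0, 50.0], [0.4, 0.15, 6.0], size=(500, 3))
      gammas   = rng.normal([1.2, 0.9, 20.0], [0.3, 0.15, 5.0], size=(500, 3))

      X = np.vstack([neutrons, gammas])
      y = np.array([1] * 500 + [0] * 500)          # 1 = neutron, 0 = gamma

      flda = LinearDiscriminantAnalysis()
      flda.fit(X, y)

      # Project onto the discriminant axis; a threshold on this score separates the two classes.
      scores = flda.decision_function(X)
      print("mean discriminant score (neutron, gamma):", scores[y == 1].mean(), scores[y == 0].mean())
      print("training accuracy:", (flda.predict(X) == y).mean())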

  12. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.

  13. Direct sampling of cystic fibrosis lungs indicates that DNA-based analyses of upper-airway specimens can misrepresent lung microbiota.

    PubMed

    Goddard, Amanda F; Staudinger, Benjamin J; Dowd, Scot E; Joshi-Datar, Amruta; Wolcott, Randall D; Aitken, Moira L; Fligner, Corinne L; Singh, Pradeep K

    2012-08-21

    Recent work using culture-independent methods suggests that the lungs of cystic fibrosis (CF) patients harbor a vast array of bacteria not conventionally implicated in CF lung disease. However, sampling lung secretions in living subjects requires that expectorated specimens or collection devices pass through the oropharynx. Thus, contamination could confound results. Here, we compared culture-independent analyses of throat and sputum specimens to samples directly obtained from the lungs at the time of transplantation. We found that CF lungs with advanced disease contained relatively homogenous populations of typical CF pathogens. In contrast, upper-airway specimens from the same subjects contained higher levels of microbial diversity and organisms not typically considered CF pathogens. Furthermore, sputum exhibited day-to-day variation in the abundance of nontypical organisms, even in the absence of clinical changes. These findings suggest that oropharyngeal contamination could limit the accuracy of DNA-based measurements on upper-airway specimens. This work highlights the importance of sampling procedures for microbiome studies and suggests that methods that account for contamination are needed when DNA-based methods are used on clinical specimens.

  14. Factor Structure of Cognition and Functional Capacity in Two Studies of Schizophrenia and Bipolar Disorder: Implications for Genomic Studies

    PubMed Central

    Harvey, Philip D.; Aslan, Mihaela; Du, Mengtian; Zhao, Hongyu; Siever, Larry J.; Pulver, Ann; Gaziano, J. Michael; Concato, John

    2015-01-01

    Objective Impairments in cognition and everyday functioning are common in schizophrenia and bipolar disorder. Based on two studies of schizophrenia (SCZ) and bipolar I disorder (BPI) with similar methods, this paper presents factor analyses of cognitive and functional capacity (FC) measures. The overall goal of these analyses was to determine whether performance-based assessments should be examined individually, or aggregated on the basis of the correlational structure of the tests, as well as to evaluate the similarity of factor structures in SCZ and BPI. Method Veterans Affairs (VA) Cooperative Studies Program study #572 evaluated cognitive and FC measures among 5,414 BPI and 3,942 SZ patients. A second study evaluated similar neuropsychological (NP) and FC measures among 368 BPI and 436 SZ patients. Principal components analysis, as well as exploratory and confirmatory factor analyses, were used to examine the data. Results Analyses in both datasets suggested that NP and FC measures were explained by a single underlying factor in BPI and SCZ patients, whether analyzed separately or in a combined sample. The factor structure in both studies was similar, with or without inclusion of FC measures; homogeneous loadings were observed for that single factor across cognitive and FC domains across the samples. Conclusions The empirically derived factor model suggests that NP performance and FC are best explained as a single latent trait applicable to people with schizophrenia and bipolar illness. This single measure may enhance the robustness of the analyses relating genomic data to performance-based phenotypes. PMID:26710094

  15. A noninvasive, direct real-time PCR method for sex determination in multiple avian species

    USGS Publications Warehouse

    Brubaker, Jessica L.; Karouna-Renier, Natalie K.; Chen, Yu; Jenko, Kathryn; Sprague, Daniel T.; Henry, Paula F.P.

    2011-01-01

    Polymerase chain reaction (PCR)-based methods to determine the sex of birds are well established and have seen few modifications since they were first introduced in the 1990s. Although these methods allowed for sex determination in species that were previously difficult to analyse, they were not conducive to high-throughput analysis because of the laboriousness of DNA extraction and gel electrophoresis. We developed a high-throughput real-time PCR-based method for analysis of sex in birds, which uses noninvasive sample collection and avoids DNA extraction and gel electrophoresis.

  16. [A research on real-time ventricular QRS classification methods for single-chip-microcomputers].

    PubMed

    Peng, L; Yang, Z; Li, L; Chen, H; Chen, E; Lin, J

    1997-05-01

    Ventricular QRS classification is a key technique for ventricular arrhythmia detection in single-chip-microcomputer based real-time dynamic electrocardiogram analysers. This paper adopts a morphological feature vector, including QRS amplitude and interval information, to describe QRS morphology. After studying the distribution of QRS morphology feature vectors in the MIT/BIH DB ventricular arrhythmia files, we use clustering of the morphological feature vectors to classify multi-morphology QRS complexes. Based on this method, a scheme for adapting the morphological feature parameters, suitable for catching occasional ventricular arrhythmias, is presented. Clinical experiments verify that the rate of missed ventricular arrhythmias with this method is less than 1%.
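
    A hedged Python sketch of one way to cluster morphological feature vectors online: each beat joins the nearest existing morphology class if it lies within a distance tolerance, otherwise it starts a new class; the feature definitions, tolerance and beat values are assumptions, not the paper's exact algorithm.

      import numpy as np

      def cluster_qrs(feature_vectors, tol=0.25):
          """Online clustering of QRS morphology feature vectors.

          Each beat joins the nearest cluster centre if within `tol` (Euclidean
          distance on normalized features); otherwise a new morphology class is
          created. Centres are updated as running means.
          """
          centres, counts, labels = [], [], []
          for v in feature_vectors:
              if centres:
                  d = [np.linalg.norm(v - c) for c in centres]
                  best = int(np.argmin(d))
                  if d[best] < tol:
                      counts[best] += 1
                      centres[best] += (v - centres[best]) / counts[best]
                      labels.append(best)
                      continue
              centres.append(np.array(v, dtype=float))
              counts.append(1)
              labels.append(len(centres) - 1)
          return labels, centres

      # Hypothetical normalized features per beat: [R amplitude, QRS width, RR interval].
      beats = np.array([[1.0, 0.30, 0.80], [0.98, 0.31, 0.82],   # normal beats
                        [0.60, 0.55, 0.55], [1.01, 0.29, 0.81]])  # one wide, premature beat
      labels, centres = cluster_qrs(beats)
      print(labels)    # [0, 0, 1, 0] -- the ventricular beat forms its own class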

  17. Earthquake Building Damage Mapping Based on Feature Analyzing Method from Synthetic Aperture Radar Data

    NASA Astrophysics Data System (ADS)

    An, L.; Zhang, J.; Gong, L.

    2018-04-01

    Playing an important role in gathering information on damage to social infrastructure, Synthetic Aperture Radar (SAR) remote sensing is a useful tool for monitoring earthquake disasters. With the wide application of this technique, a standard method of comparing post-seismic to pre-seismic data has become common. However, multi-temporal SAR processing is not always achievable, so developing a building damage detection method that uses only post-seismic data is of great importance. In this paper, the authors initiate an experimental investigation to establish an object-based feature-analysis classification method for building damage recognition.

  18. The research on user behavior evaluation method for network state

    NASA Astrophysics Data System (ADS)

    Zhang, Chengyuan; Xu, Haishui

    2017-08-01

    Starting from the correlation between user behavior and the network's running state, this paper proposes a method for evaluating user behavior based on network state. Drawing on analysis and evaluation methods from other fields of study, we introduce the theory and tools of data mining. Using the network status information provided by the trusted network view, the user behavior data and the network state data are analysed. Finally, we construct user behavior evaluation indices and weights; on this basis, the degree to which the specific behavior of different users influences changes in the network's running state can be quantified accurately, providing a basis for user behavior control decisions.

  19. Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Gebhardt, Sarah N.

    2012-01-01

    This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…

  20. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

    Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses could be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, which has the potential to derive brain network connectivity models from activation information in the literature, without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas by using the automatic anatomical label atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis on the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful to find brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for the corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect, thus increasing the efficiency of the dynamic causal modeling analysis.
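
    A hedged Python sketch of the support-counting idea behind an Apriori-style co-activation analysis: each study contributes the set of atlas regions it activates, frequent single regions are kept first, and region pairs reported together in enough studies become candidate connectivity edges; the study sets, region labels and support threshold are invented, and a full Apriori run would also extend to larger itemsets.

      from itertools import combinations
      from collections import Counter

      # Hypothetical per-study activation sets after mapping coordinates to atlas regions.
      studies = [
          {"IFG_L", "STG_L", "MTG_L"},
          {"IFG_L", "STG_L"},
          {"IFG_L", "MTG_L", "SMA"},
          {"STG_L", "MTG_L"},
          {"IFG_L", "STG_L", "MTG_L"},
      ]

      min_support = 0.4                      # fraction of studies required

      # Frequent single regions (Apriori's first pass).
      region_counts = Counter(r for s in studies for r in s)
      frequent = {r for r, c in region_counts.items() if c / len(studies) >= min_support}

      # Candidate region pairs built only from frequent regions (Apriori pruning),
      # kept if the pair itself is co-activated often enough.
      pair_counts = Counter()
      for s in studies:
          for pair in combinations(sorted(s & frequent), 2):
              pair_counts[pair] += 1
      edges = {p: c / len(studies) for p, c in pair_counts.items() if c / len(studies) >= min_support}
      print(edges)   # co-activation support for candidate connectivity edges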

  1. Examining the Impact of a Video Case-Based Mathematics Methods Course on Secondary Pre-Service Teachers' Skills at Analysing Students' Strategies

    ERIC Educational Resources Information Center

    Martinez, Mara Vanina; Superfine, Alison Castro; Carlton, Theresa; Dasgupta, Chandan

    2015-01-01

    This paper focuses on results from a study conducted with two cohorts of pre-service teachers (PSTs) in a video case-based mathematics methods course at a large Midwestern university in the US. The motivation for this study was to look beyond whether or not PSTs pay attention to mathematical thinking of students, as shown by previous studies when…

  2. A web-based endpoint adjudication system for interim analyses in clinical trials.

    PubMed

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues arose initially. However, once these issues were resolved, the reviewers found the system user-friendly and easy to navigate. Web-based data adjudication depends upon expeditious data collection and verification. Further, the ability to use web-based technologies, in addition to clinical expertise, must be considered in selecting EAC members. The automated nature of this system makes it a practical mechanism for ensuring timely endpoint adjudication. The authors believe a similar approach could be useful for handling endpoint adjudication for future clinical trials.

  3. What Do We Know and How Well Do We Know It? Identifying Practice-Based Insights in Education

    ERIC Educational Resources Information Center

    Miller, Barbara; Pasley, Joan

    2012-01-01

    Knowledge derived from practice forms a significant portion of the knowledge base in the education field, yet is not accessible using existing empirical research methods. This paper describes a systematic, rigorous, grounded approach to collecting and analysing practice-based knowledge using the authors' research in teacher leadership as an…

  4. Fast-Running Aeroelastic Code Based on Unsteady Linearized Aerodynamic Solver Developed

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Bakhle, Milind A.; Keith, T., Jr.

    2003-01-01

    The NASA Glenn Research Center has been developing aeroelastic analyses for turbomachines for use by NASA and industry. An aeroelastic analysis consists of a structural dynamic model, an unsteady aerodynamic model, and a procedure to couple the two models. The structural models are well developed. Hence, most of the development for the aeroelastic analysis of turbomachines has involved adapting and using unsteady aerodynamic models. Two methods are used in developing unsteady aerodynamic analysis procedures for the flutter and forced response of turbomachines: (1) the time domain method and (2) the frequency domain method. Codes based on time domain methods require considerable computational time and, hence, cannot be used during the design process. Frequency domain methods eliminate the time dependence by assuming harmonic motion and, hence, require less computational time. Early frequency domain analyses methods neglected the important physics of steady loading on the analyses for simplicity. A fast-running unsteady aerodynamic code, LINFLUX, which includes steady loading and is based on the frequency domain method, has been modified for flutter and response calculations. LINFLUX, solves unsteady linearized Euler equations for calculating the unsteady aerodynamic forces on the blades, starting from a steady nonlinear aerodynamic solution. First, we obtained a steady aerodynamic solution for a given flow condition using the nonlinear unsteady aerodynamic code TURBO. A blade vibration analysis was done to determine the frequencies and mode shapes of the vibrating blades, and an interface code was used to convert the steady aerodynamic solution to a form required by LINFLUX. A preprocessor was used to interpolate the mode shapes from the structural dynamic mesh onto the computational dynamics mesh. Then, we used LINFLUX to calculate the unsteady aerodynamic forces for a given mode, frequency, and phase angle. A postprocessor read these unsteady pressures and calculated the generalized aerodynamic forces, eigenvalues, and response amplitudes. The eigenvalues determine the flutter frequency and damping. As a test case, the flutter of a helical fan was calculated with LINFLUX and compared with calculations from TURBO-AE, a nonlinear time domain code, and from ASTROP2, a code based on linear unsteady aerodynamics.

  5. Analyses of Fatigue Crack Growth and Closure Near Threshold Conditions for Large-Crack Behavior

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1999-01-01

    A plasticity-induced crack-closure model was used to study fatigue crack growth and closure in thin 2024-T3 aluminum alloy under constant-R and constant-K(sub max) threshold testing procedures. Two methods of calculating crack-opening stresses were compared. One method was based on contact-K analyses and the other on crack-opening-displacement (COD) analyses. These methods gave nearly identical results under constant-amplitude loading, but under threshold simulations the contact-K analyses gave lower opening stresses than the contact COD method. Crack-growth predictions tend to support the use of contact-K analyses. Crack-growth simulations showed that remote closure can cause a rapid rise in opening stresses in the near threshold regime for low-constraint and high applied stress levels. Under low applied stress levels and high constraint, a rise in opening stresses was not observed near threshold conditions. However, crack-tip-opening displacements (CTODs) were of the order of measured oxide thicknesses in the 2024 alloy under constant-R simulations. In contrast, under constant-K(sub max) testing the CTODs near threshold conditions were an order of magnitude larger than measured oxide thicknesses. Residual-plastic deformations under both constant-R and constant-K(sub max) threshold simulations were several times larger than the expected oxide thicknesses. Thus, residual-plastic deformations, in addition to oxide and roughness, play an integral part in threshold development.

  6. An evaluation of a bioelectrical impedance analyser for the estimation of body fat content.

    PubMed Central

    Maughan, R J

    1993-01-01

    Measurement of body composition is an important part of any assessment of health or fitness. Hydrostatic weighing is generally accepted as the most reliable method for the measurement of body fat content, but is inconvenient. Electrical impedance analysers have recently been proposed as an alternative to the measurement of skinfold thickness. Both these latter methods are convenient, but give values based on estimates obtained from population studies. This study compared values of body fat content obtained by hydrostatic weighing, skinfold thickness measurement and electrical impedance on 50 (28 women, 22 men) healthy volunteers. Mean(s.e.m.) values obtained by the three methods were: hydrostatic weighing, 20.5(1.2)%; skinfold thickness, 21.8(1.0)%; impedance, 20.8(0.9)%. The results indicate that the correlation between the skinfold method and hydrostatic weighing (0.931) is somewhat higher than that between the impedance method and hydrostatic weighing (0.830). This is, perhaps, not surprising given the fact that the impedance method is based on an estimate of total body water which is then used to calculate body fat content. The skinfold method gives an estimate of body density, and the assumptions involved in the conversion from body density to body fat content are the same for both methods. PMID:8457817
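
    Both hydrostatic weighing and skinfold equations pass through a body-density estimate; a hedged Python sketch of one widely used density-to-fat conversion (the Siri equation) is shown below with an invented density value, since the abstract does not state which conversion the study used.

      def siri_percent_fat(body_density):
          """Siri equation: convert body density (g/cm^3) to percent body fat."""
          return 495.0 / body_density - 450.0

      # Hypothetical subject: body density estimated by hydrostatic weighing.
      density = 1.052          # g/cm^3
      print(f"{siri_percent_fat(density):.1f}% body fat")   # about 20.5%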

  7. Rapid methods for the detection of foodborne bacterial pathogens: principles, applications, advantages and limitations

    PubMed Central

    Law, Jodi Woan-Fei; Ab Mutalib, Nurul-Syakima; Chan, Kok-Gan; Lee, Learn-Han

    2015-01-01

    The incidence of foodborne diseases has increased over the years and has become a major public health problem globally. Foodborne pathogens can be found in various foods, and it is important to detect them to provide a safe food supply and to prevent foodborne diseases. The conventional methods used to detect foodborne pathogens are time consuming and laborious. Hence, a variety of methods have been developed for the rapid detection of foodborne pathogens, as required in many food analyses. Rapid detection methods can be categorized into nucleic acid-based, biosensor-based and immunological methods. This review emphasizes the principles and applications of recent rapid methods for the detection of foodborne bacterial pathogens. The detection methods covered are simple polymerase chain reaction (PCR), multiplex PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), loop-mediated isothermal amplification (LAMP) and oligonucleotide DNA microarrays, classified as nucleic acid-based methods; optical, electrochemical and mass-based biosensors, classified as biosensor-based methods; and enzyme-linked immunosorbent assay (ELISA) and lateral flow immunoassay, classified as immunological methods. In general, rapid detection methods are time-efficient, sensitive, specific and labor-saving. The development of rapid detection methods is vital for the prevention and treatment of foodborne diseases. PMID:25628612

  8. SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments

    ERIC Educational Resources Information Center

    Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.

    2013-01-01

    When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…

  9. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    PubMed Central

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.

    2016-01-01

    ABSTRACT Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample. Author Video: An author video summary of this article is available. PMID:27822525

  10. Coordinate based random effect size meta-analysis of neuroimaging studies.

    PubMed

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and the reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of the censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
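
    The FCDR described above builds on the ordinary false discovery rate; the sketch below shows only the generic Benjamini-Hochberg step-up procedure applied to hypothetical cluster-level p-values, not the ClusterZ algorithm itself.

      import numpy as np

      def benjamini_hochberg(p_values, q=0.05):
          """Benjamini-Hochberg step-up procedure: returns a boolean mask of
          p-values declared significant at false discovery rate level q."""
          p = np.asarray(p_values, dtype=float)
          m = p.size
          order = np.argsort(p)
          ranked = p[order]
          below = ranked <= (np.arange(1, m + 1) / m) * q
          significant = np.zeros(m, dtype=bool)
          if below.any():
              k = np.nonzero(below)[0].max()      # largest rank passing the test
              significant[order[:k + 1]] = True
          return significant

      # Hypothetical cluster-level p-values from a coordinate based meta-analysis.
      print(benjamini_hochberg([0.001, 0.02, 0.04, 0.30, 0.80]))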

  11. The SPAR thermal analyzer: Present and future

    NASA Astrophysics Data System (ADS)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.

  12. The SPAR thermal analyzer: Present and future

    NASA Technical Reports Server (NTRS)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    1982-01-01

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.

  13. Wavelet threshold method of resolving noise interference in periodic short-impulse signals chaotic detection

    NASA Astrophysics Data System (ADS)

    Deng, Ke; Zhang, Lu; Luo, Mao-Kang

    2010-03-01

    The chaotic oscillator has been considered a powerful method for detecting weak signals, even weak signals accompanied by noise. However, many examples, analyses and simulations indicate that a chaotic oscillator detection system cannot guarantee immunity to noise (even white noise). In fact, the randomness of noise has a serious, or even destructive, effect on the detection results in many cases. To solve this problem, we present a new detection method based on wavelet threshold processing that can detect a chaotic weak signal accompanied by noise. All theoretical analyses and simulation experiments indicate that the new method significantly reduces the noise interference with detection, thereby making chaotic-oscillator detection of weak signals accompanied by noise more stable and reliable.
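
    A minimal Python sketch of the wavelet-threshold pre-processing idea is given below, using the PyWavelets package, a soft universal threshold, and a synthetic impulse train; these choices are assumptions for illustration and not the exact scheme of the paper.

      import numpy as np
      import pywt  # PyWavelets

      def wavelet_denoise(signal, wavelet="db4", level=4):
          """Soft-threshold wavelet denoising with Donoho's universal threshold;
          the noise level is estimated from the finest-scale detail coefficients."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # robust noise estimate
          thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))  # universal threshold
          denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                    for c in coeffs[1:]]
          return pywt.waverec(denoised, wavelet)[:len(signal)]

      # Example: a weak periodic short-impulse signal buried in white noise.
      t = np.arange(4096)
      clean = 0.2 * np.sin(2 * np.pi * t / 64) ** 20
      noisy = clean + 0.1 * np.random.randn(t.size)
      recovered = wavelet_denoise(noisy)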

  14. Mean-variance portfolio optimization by using time series approaches based on logarithmic utility function

    NASA Astrophysics Data System (ADS)

    Soeryana, E.; Fadhlina, N.; Sukono; Rusyaman, E.; Supian, S.

    2017-01-01

    Investors in stocks also face risk, because daily stock prices fluctuate. To minimize the level of risk, investors usually form an investment portfolio. A portfolio consisting of several stocks is intended to obtain the optimal composition of the investment. This paper discusses Mean-Variance portfolio optimization for stocks with non-constant mean and volatility, based on a logarithmic utility function. The non-constant mean is analysed using Autoregressive Moving Average (ARMA) models, while the non-constant volatility is analysed using Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models. The optimization is performed using the Lagrangian multiplier technique. As a numerical illustration, the method is used to analyse several Islamic stocks in Indonesia, yielding the proportion of investment in each of the stocks analysed.
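
    The Lagrangian-multiplier step has a well-known closed form when the objective is portfolio variance with a target-return constraint; the sketch below shows that step only, with made-up return and covariance figures, and omits the ARMA/GARCH estimation and the log-utility criterion of the paper.

      import numpy as np

      def mean_variance_weights(mu, cov, target_return):
          """Minimise w' cov w subject to w' mu = target_return and sum(w) = 1,
          solved in closed form via Lagrange multipliers."""
          mu = np.asarray(mu, dtype=float)
          ones = np.ones_like(mu)
          inv = np.linalg.inv(cov)
          a, b, c = ones @ inv @ ones, ones @ inv @ mu, mu @ inv @ mu
          d = a * c - b ** 2
          lam = (a * target_return - b) / d
          gam = (c - b * target_return) / d
          return lam * (inv @ mu) + gam * (inv @ ones)

      mu = [0.010, 0.015, 0.012]                       # hypothetical expected returns
      cov = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.06]])             # hypothetical covariance matrix
      print(mean_variance_weights(mu, cov, target_return=0.012))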

  15. Factors affecting the relationship between quantitative polymerase chain reaction (qPCR) and culture-based enumeration of Enterococcus in environmental waters.

    PubMed

    Raith, M R; Ebentier, D L; Cao, Y; Griffith, J F; Weisberg, S B

    2014-03-01

    To determine the extent to which discrepancies between qPCR and culture-based results in beach water quality monitoring can be attributed to: (i) within-method variability, (ii) between-method difference within each method class (qPCR or culture) and (iii) between-class difference. We analysed 306 samples using two culture-based (EPA1600 and Enterolert) and two qPCR (Taqman and Scorpion) methods, each in duplicate. Both qPCR methods correlated with EPA1600, but regression analyses indicated approximately 0·8 log10 unit overestimation by qPCR compared to culture methods. Differences between methods within a class were less than half of this, and between-replicate differences within a method were minimal. Using the 104 Enterococcus per 100 ml management decision threshold, Taqman qPCR indicated the same decisions as EPA1600 for 87% of the samples, but indicated beach posting for unhealthful water when EPA1600 did not for 12% of the samples. After accounting for within-method and within-class variability, 8% of the samples exhibited true between-class discrepancy, where both qPCR methods indicated beach posting while both culture methods did not. Measurement target difference (DNA vs growth) accounted for the majority of the qPCR-vs-culture discrepancy, but its influence on monitoring application is outweighed by frequent incorrect posting with culture methods due to incubation time delay. This is the first study to quantify the frequency with which culture-vs-qPCR discrepancies can be attributed to target difference vs method variability. © 2013 The Society for Applied Microbiology.

  16. Designing for Learner Engagement with Computer Based Testing

    ERIC Educational Resources Information Center

    Walker, Richard; Handley, Zoe

    2016-01-01

    The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the…

  17. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    ERIC Educational Resources Information Center

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  18. 40 CFR 98.147 - Records that must be retained.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (metric tons). (3) Data on carbonate-based mineral mass fractions provided by the raw material supplier... of this subpart. (4) Results of all tests used to verify the carbonate-based mineral mass fraction...(s), and any variations of the methods, used in the analyses. (iii) Mass fraction of each sample...

  19. 40 CFR 98.147 - Records that must be retained.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (metric tons). (3) Data on carbonate-based mineral mass fractions provided by the raw material supplier... of this subpart. (4) Results of all tests used to verify the carbonate-based mineral mass fraction...(s), and any variations of the methods, used in the analyses. (iii) Mass fraction of each sample...

  20. 40 CFR 98.147 - Records that must be retained.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (metric tons). (3) Data on carbonate-based mineral mass fractions provided by the raw material supplier... of this subpart. (4) Results of all tests used to verify the carbonate-based mineral mass fraction...(s), and any variations of the methods, used in the analyses. (iii) Mass fraction of each sample...

  1. User's guide: RPGrow$: a red pine growth and analysis spreadsheet for the Lake States.

    Treesearch

    Carol A. Hyldahl; Gerald H. Grossman

    1993-01-01

    Describes RPGrow$, a stand-level, interactive spreadsheet for projecting growth and yield and estimating financial returns of red pine plantations in the Lake States. This spreadsheet is based on published growth models for red pine. Financial analyses are based on discounted cash flow methods.
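
    The discounted cash flow step mentioned above reduces to a net present value calculation; the sketch below uses made-up plantation cash flows and a 4% discount rate, not figures from RPGrow$.

      def net_present_value(cash_flows, discount_rate):
          """NPV = sum of CF_t / (1 + r)**t over years t = 0, 1, 2, ..."""
          return sum(cf / (1.0 + discount_rate) ** t
                     for t, cf in enumerate(cash_flows))

      # Year 0 planting cost, a thinning revenue in year 20, final harvest in year 40.
      flows = [-500.0] + [0.0] * 19 + [300.0] + [0.0] * 19 + [2500.0]
      print(round(net_present_value(flows, discount_rate=0.04), 2))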

  2. In-service teachers' perceptions of project-based learning.

    PubMed

    Habók, Anita; Nagy, Judit

    2016-01-01

    The study analyses teachers' perceptions of methods, teacher roles, success and evaluation in PBL and traditional classroom instruction. The analysis is based on empirical data collected in primary schools and vocational secondary schools. An analysis of 109 questionnaires revealed numerous differences based on degree of experience and type of school. In general, project-based methods were preferred among teachers, who mostly perceived themselves as facilitators and considered motivation and transmission of values central to their work. Teachers appeared not to capitalize on the use of ICT tools or emotions. Students actively participated in the evaluation process via oral evaluation.

  3. Agreement between methods of measurement of mean aortic wall thickness by MRI.

    PubMed

    Rosero, Eric B; Peshock, Ronald M; Khera, Amit; Clagett, G Patrick; Lo, Hao; Timaran, Carlos

    2009-03-01

    To assess the agreement between three methods of calculation of mean aortic wall thickness (MAWT) using magnetic resonance imaging (MRI). High-resolution MRI of the infrarenal abdominal aorta was performed on 70 subjects with a history of coronary artery disease who were part of a multi-ethnic population-based sample. MAWT was calculated as the mean distance between the adventitial and luminal aortic boundaries using three different methods: average distance at four standard positions (AWT-4P), average distance at 100 automated positions (AWT-100P), and using a mathematical computation derived from the total vessel and luminal areas (AWT-VA). Bland-Altman plots and Passing-Bablok regression analyses were used to assess agreement between methods. Bland-Altman analyses demonstrated a positive bias of 3.02+/-7.31% between the AWT-VA and the AWT-4P methods, and of 1.76+/-6.82% between the AWT-100P and the AWT-4P methods. Passing-Bablok regression analyses demonstrated constant bias between the AWT-4P method and the other two methods. Proportional bias was, however, not evident among the three methods. MRI methods of measurement of MAWT using a limited number of positions of the aortic wall systematically underestimate the MAWT value compared with the method that calculates MAWT from the vessel areas. Copyright (c) 2009 Wiley-Liss, Inc.
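
    The Bland-Altman part of the analysis reduces to the bias and 95% limits of agreement of the paired differences; a minimal sketch follows, with hypothetical wall-thickness values standing in for the study data.

      import numpy as np

      def bland_altman(method_a, method_b):
          """Return the bias and 95% limits of agreement between two methods."""
          diff = np.asarray(method_a, float) - np.asarray(method_b, float)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, bias - 1.96 * sd, bias + 1.96 * sd

      # Hypothetical mean aortic wall thickness values (mm) from two methods.
      awt_va = np.array([2.18, 2.40, 2.05, 2.55, 2.31])
      awt_4p = np.array([2.10, 2.35, 1.95, 2.50, 2.20])
      print(bland_altman(awt_va, awt_4p))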

  4. Corolla morphology influences diversification rates in bifid toadflaxes (Linaria sect. Versicolores)

    PubMed Central

    Fernández-Mazuecos, Mario; Blanco-Pastor, José Luis; Gómez, José M.; Vargas, Pablo

    2013-01-01

    Background and Aims The role of flower specialization in plant speciation and evolution remains controversial. In this study the evolution of flower traits restricting access to pollinators was analysed in the bifid toadflaxes (Linaria sect. Versicolores), a monophyletic group of ∼30 species and subspecies with highly specialized corollas. Methods A time-calibrated phylogeny based on both nuclear and plastid DNA sequences was obtained using a coalescent-based method, and flower morphology was characterized by means of morphometric analyses. Directional trends in flower shape evolution and trait-dependent diversification rates were jointly analysed using recently developed methods, and morphological shifts were reconstructed along the phylogeny. Pollinator surveys were conducted for a representative sample of species. Key Results A restrictive character state (narrow corolla tube) was reconstructed in the most recent common ancestor of Linaria sect. Versicolores. After its early loss in the most species-rich clade, this character state has been convergently reacquired in multiple lineages of this clade in recent times, yet it seems to have exerted a negative influence on diversification rates. Comparative analyses and pollinator surveys suggest that the narrow- and broad-tubed flowers are evolutionary optima representing divergent strategies of pollen placement on nectar-feeding insects. Conclusions The results confirm that different forms of floral specialization can lead to dissimilar evolutionary success in terms of diversification. It is additionally suggested that opposing individual-level and species-level selection pressures may have driven the evolution of pollinator-restrictive traits in bifid toadflaxes. PMID:24142920

  5. Health economics and outcomes methods in risk-based decision-making for blood safety.

    PubMed

    Custer, Brian; Janssen, Mart P

    2015-08-01

    Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost to implement an intervention both to the blood operator and in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making. © 2015 AABB.

  6. Integrated analysis of large space systems

    NASA Technical Reports Server (NTRS)

    Young, J. P.

    1980-01-01

    Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.

  7. Time-frequency analysis based on ensemble local mean decomposition and fast kurtogram for rotating machinery fault diagnosis

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Liu, Zhiwen; Miao, Qiang; Zhang, Xin

    2018-03-01

    A time-frequency analysis method based on ensemble local mean decomposition (ELMD) and fast kurtogram (FK) is proposed for rotating machinery fault diagnosis. Local mean decomposition (LMD), an adaptive non-stationary and nonlinear signal processing method, provides the capability to decompose a multicomponent modulated signal into a series of demodulated mono-components. However, mode mixing is a serious drawback. To alleviate this, ELMD, based on a noise-assisted approach, was developed. Still, the environmental noise present in the raw signal remains in the corresponding PF together with the component of interest. FK performs well at impulse detection while strong environmental noise exists, but it is susceptible to non-Gaussian noise. The proposed method combines the merits of ELMD and FK to detect faults in rotating machinery. First, by applying ELMD the raw signal is decomposed into a set of product functions (PFs). Then, the PF that best characterizes the fault information is selected according to a kurtosis index. Finally, the selected PF signal is further filtered by an optimal band-pass filter based on FK to extract the impulse signal. Fault identification can be deduced from the appearance of fault characteristic frequencies in the squared envelope spectrum of the filtered signal. The advantages of ELMD over LMD and EEMD are illustrated in the simulation analyses. Furthermore, the efficiency of the proposed method in fault diagnosis for rotating machinery is demonstrated on gearbox and rolling bearing case analyses.
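
    Two of the steps above, selecting the PF with the highest kurtosis and inspecting the squared envelope spectrum, are easy to sketch; the Python fragment below shows only those steps (the ELMD decomposition and the fast-kurtogram band selection are not reproduced).

      import numpy as np
      from scipy.signal import hilbert
      from scipy.stats import kurtosis

      def select_pf_by_kurtosis(pfs):
          """Pick the product function with the largest kurtosis, i.e. the one
          most likely to carry the impulsive fault signature."""
          return pfs[int(np.argmax([kurtosis(pf) for pf in pfs]))]

      def squared_envelope_spectrum(signal, fs):
          """Squared envelope spectrum via the Hilbert transform; fault
          characteristic frequencies appear as peaks."""
          env = np.abs(hilbert(signal)) ** 2
          spectrum = np.abs(np.fft.rfft(env - env.mean()))
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          return freqs, spectrum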

  8. A Survey of Current and Projected Ethical Dilemmas of Rehabilitation Counselors

    ERIC Educational Resources Information Center

    Hartley, Michael T.; Cartwright, Brenda Y.

    2016-01-01

    Purpose: This study surveyed current and projected ethical dilemmas of rehabilitation counselors. Method: As a mixed-methods approach, the study used both quantitative and qualitative analyses. Results: Of the 211 participants who completed the survey, 116 (55.0%) reported an ethical dilemma. Based on the descriptions, common themes involved roles…

  9. An Empirical Comparison of Heterogeneity Variance Estimators in 12,894 Meta-Analyses

    ERIC Educational Resources Information Center

    Langan, Dean; Higgins, Julian P. T.; Simmonds, Mark

    2015-01-01

    Heterogeneity in meta-analysis is most commonly estimated using a moment-based approach described by DerSimonian and Laird. However, this method has been shown to produce biased estimates. Alternative methods to estimate heterogeneity include the restricted maximum likelihood approach and those proposed by Paule and Mandel, Sidik and Jonkman, and…
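
    The moment-based DerSimonian and Laird estimator referred to above has a simple closed form; the sketch below computes it for a handful of hypothetical study effects and within-study variances.

      import numpy as np

      def dersimonian_laird_tau2(effects, variances):
          """Moment-based estimate of the between-study variance tau^2."""
          y = np.asarray(effects, float)
          w = 1.0 / np.asarray(variances, float)
          y_fixed = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
          df = y.size - 1
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          return max(0.0, (q - df) / c)

      # Hypothetical effect estimates and within-study variances for five studies.
      print(dersimonian_laird_tau2([0.2, 0.5, 0.1, 0.6, 0.3],
                                   [0.04, 0.06, 0.05, 0.09, 0.03]))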

  10. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualise human information activity, denoted information work analysis is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  11. Application of a label-free, gel-free quantitative proteomics method for ecotoxicological studies of small fish species

    EPA Science Inventory

    Although two-dimensional electrophoresis (2D-GE) remains the basis for many ecotoxicoproteomic analyses, new, non gel-based methods are beginning to be applied to overcome throughput and coverage limitations of 2D-GE. The overall objective of our research was to apply a comprehe...

  12. Equivalent orthotropic elastic moduli identification method for laminated electrical steel sheets

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Nishikawa, Yasunari; Yamasaki, Shintaro; Fujita, Kikuo; Kawamoto, Atsushi; Kuroishi, Masakatsu; Nakai, Hideo

    2016-05-01

    In this paper, a combined numerical-experimental methodology for the identification of elastic moduli of orthotropic media is presented. Special attention is given to the laminated electrical steel sheets, which are modeled as orthotropic media with nine independent engineering elastic moduli. The elastic moduli are determined specifically for use with finite element vibration analyses. We propose a three-step methodology based on a conventional nonlinear least squares fit between measured and computed natural frequencies. The methodology consists of: (1) successive augmentations of the objective function by increasing the number of modes, (2) initial condition updates, and (3) appropriate selection of the natural frequencies based on their sensitivities on the elastic moduli. Using the results of numerical experiments, it is shown that the proposed method achieves more accurate converged solution than a conventional approach. Finally, the proposed method is applied to measured natural frequencies and mode shapes of the laminated electrical steel sheets. It is shown that the method can successfully identify the orthotropic elastic moduli that can reproduce the measured natural frequencies and frequency response functions by using finite element analyses with a reasonable accuracy.
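
    The identification step is in essence a nonlinear least squares fit between measured and computed natural frequencies; the sketch below illustrates that fit with a made-up three-parameter forward model standing in for the finite element modal analysis.

      import numpy as np
      from scipy.optimize import least_squares

      def predicted_frequencies(moduli):
          """Placeholder forward model; in the actual method this would be a
          finite element modal analysis returning natural frequencies for a
          given set of orthotropic elastic moduli."""
          m = np.asarray(moduli, float)
          return np.sqrt(np.array([m[0] + 0.5 * m[1],
                                   2.0 * m[1] + 0.2 * m[2],
                                   m[2] + 0.3 * m[0]]))

      measured = np.array([105.0, 160.0, 120.0])   # hypothetical measured frequencies (Hz)

      fit = least_squares(lambda m: predicted_frequencies(m) - measured,
                          x0=np.array([1.0e4, 1.0e4, 1.0e4]), bounds=(0.0, np.inf))
      print(fit.x)   # identified moduli (arbitrary units in this toy example)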

  13. Exchange coupling and magnetic anisotropy of exchanged-biased quantum tunnelling single-molecule magnet Ni3Mn2 complexes using theoretical methods based on Density Functional Theory.

    PubMed

    Gómez-Coca, Silvia; Ruiz, Eliseo

    2012-03-07

    The magnetic properties of a new family of single-molecule magnet Ni(3)Mn(2) complexes were studied using theoretical methods based on Density Functional Theory (DFT). The first part of this study is devoted to analysing the exchange coupling constants, focusing on the intramolecular as well as the intermolecular interactions. The calculated intramolecular J values were in excellent agreement with the experimental data, which show that all the couplings are ferromagnetic, leading to an S = 7 ground state. The intermolecular interactions were investigated because the two complexes studied do not show tunnelling at zero magnetic field. Usually, this exchange-biased quantum tunnelling is attributed to the presence of intermolecular interactions calculated with the help of theoretical methods. The results indicate the presence of weak intermolecular antiferromagnetic couplings that cannot explain the ferromagnetic value found experimentally for one of the systems. In the second part, the goal is to analyse magnetic anisotropy through the calculation of the zero-field splitting parameters (D and E), using DFT methods including the spin-orbit effect.

  14. The analysis of verbal interaction sequences in dyadic clinical communication: a review of methods.

    PubMed

    Connor, Martin; Fletcher, Ian; Salmon, Peter

    2009-05-01

    To identify methods available for sequential analysis of dyadic verbal clinical communication and to review their methodological and conceptual differences. Critical review, based on literature describing sequential analyses of clinical and other relevant social interaction. Dominant approaches are based on analysis of communication according to its precise position in the series of utterances that constitute event-coded dialogue. For practical reasons, methods focus on very short-term processes, typically the influence of one party's speech on what the other says next. Studies of longer-term influences are rare. Some analyses have statistical limitations, particularly in disregarding heterogeneity between consultations, patients or practitioners. Additional techniques, including ones that can use information about timing and duration of speech from interval-coding are becoming available. There is a danger that constraints of commonly used methods shape research questions and divert researchers from potentially important communication processes including ones that operate over a longer-term than one or two speech turns. Given that no one method can model the complexity of clinical communication, multiple methods, both quantitative and qualitative, are necessary. Broadening the range of methods will allow the current emphasis on exploratory studies to be balanced by tests of hypotheses about clinically important communication processes.

  15. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    PubMed

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy-to-produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared to the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml and 10^0 CFU/ml for the invA target and the ttrRSBC target, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method to be used in routine analyses in the medical, veterinary, food and water/environmental sectors: I--The method provides fast analyses, including the simultaneous detection and determination of correct pathogen counts; II--The method is applicable to challenging samples, such as milk; III--The method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
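
    Quantification against a recombinant-plasmid standard curve amounts to inverting the linear relation between Cq and log10 copy number; the sketch below uses a hypothetical slope and intercept, not the calibration of the cited assay.

      def copies_from_cq(cq, slope, intercept):
          """Back-calculate target copies from a Cq value using a standard curve
          of the form Cq = slope * log10(copies) + intercept."""
          return 10.0 ** ((cq - intercept) / slope)

      # Hypothetical standard curve for the invA plasmid dilution series;
      # a slope near -3.32 corresponds to ~100% amplification efficiency.
      print(round(copies_from_cq(cq=25.0, slope=-3.32, intercept=38.5)))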

  16. Nonlinear Earthquake Analysis of Reinforced Concrete Frames with Fiber and Bernoulli-Euler Beam-Column Element

    PubMed Central

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained by using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predictor-corrector form of the Bossak-α method is applied for the dynamic integration scheme. A comparison of experimental data from an RC column element with numerical results obtained from the proposed solution technique is presented to verify the numerical solutions. Furthermore, nonlinear cyclic analysis results of a portal reinforced concrete frame are obtained for comparing the proposed solution technique with a fibre element based on the flexibility method. In addition, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. Damage regions, propagation, and intensities according to both approaches are investigated. PMID:24578667

  17. On the use of log-transformation vs. nonlinear regression for analyzing biological power laws.

    PubMed

    Xiao, Xiao; White, Ethan P; Hooten, Mevin B; Durham, Susan L

    2011-10-01

    Power-law relationships are among the most well-studied functional relationships in biology. Recently the common practice of fitting power laws using linear regression (LR) on log-transformed data has been criticized, calling into question the conclusions of hundreds of studies. It has been suggested that nonlinear regression (NLR) is preferable, but no rigorous comparison of these two methods has been conducted. Using Monte Carlo simulations, we demonstrate that the error distribution determines which method performs better, with NLR better characterizing data with additive, homoscedastic, normal error and LR better characterizing data with multiplicative, heteroscedastic, lognormal error. Analysis of 471 biological power laws shows that both forms of error occur in nature. While previous analyses based on log-transformation appear to be generally valid, future analyses should choose methods based on a combination of biological plausibility and analysis of the error distribution. We provide detailed guidelines and associated computer code for doing so, including a model averaging approach for cases where the error structure is uncertain.
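
    The two fitting strategies compared in the abstract are easy to reproduce on simulated data; the sketch below generates a power law with multiplicative lognormal error and fits it both by linear regression on log-transformed data and by nonlinear regression on the original scale.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(0)
      x = np.linspace(1.0, 50.0, 200)
      # Simulated power law y = 2 * x**0.75 with multiplicative lognormal error.
      y = 2.0 * x ** 0.75 * rng.lognormal(mean=0.0, sigma=0.2, size=x.size)

      # Method 1: linear regression (LR) on log-transformed data.
      slope, intercept = np.polyfit(np.log(x), np.log(y), deg=1)
      a_lr, b_lr = np.exp(intercept), slope

      # Method 2: nonlinear regression (NLR) on the original scale.
      (a_nlr, b_nlr), _ = curve_fit(lambda x, a, b: a * x ** b, x, y, p0=(1.0, 1.0))

      print(f"LR : a={a_lr:.2f}, b={b_lr:.2f}")
      print(f"NLR: a={a_nlr:.2f}, b={b_nlr:.2f}")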

  18. Conducting Meta-Analyses Based on p Values

    PubMed Central

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466

  19. Microplate-based filter paper assay to measure total cellulase activity.

    PubMed

    Xiao, Zhizhuang; Storms, Reginald; Tsang, Adrian

    2004-12-30

    The standard filter paper assay (FPA) published by the International Union of Pure and Applied Chemistry (IUPAC) is widely used to determine total cellulase activity. However, the IUPAC method is not suitable for the parallel analyses of large sample numbers. We describe here a microplate-based method for assaying large sample numbers. To achieve this, we reduced the enzymatic reaction volume to 60 microl from the 1.5 ml used in the IUPAC method. The modified 60-microl format FPA can be carried out in 96-well assay plates. Statistical analyses showed that the cellulase activities of commercial cellulases from Trichoderma reesei and Aspergillus species determined with our 60-microl format FPA were not significantly different from the activities measured with the standard FPA. Our results also indicate that the 60-microl format FPA is quantitative and highly reproducible. Moreover, the addition of excess beta-glucosidase increased the sensitivity of the assay by up to 60%. 2004 Wiley Periodicals, Inc.

  20. An efficient temporal database design method based on EER

    NASA Astrophysics Data System (ADS)

    Liu, Zhi; Huang, Jiping; Miao, Hua

    2007-12-01

    Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and more complicated. In this paper, based on the conventional EER model, the authors attempt to analyse and abstract temporal information in the conceptual modelling phase according to the concrete requirements for historical information. A temporal data model named BTEER is then presented. BTEER not only retains all the design ideas and methods of EER, which gives BTEER good upward compatibility, but also effectively supports the modelling of valid time and transaction time. In addition, BTEER can be transformed to EER easily and automatically. Practice has shown that this method can model temporal information well.

  1. A computer program to obtain time-correlated gust loads for nonlinear aircraft using the matched-filter-based method

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III

    1994-01-01

    NASA Langley Research Center has, for several years, conducted research in the area of time-correlated gust loads for linear and nonlinear aircraft. The results of this work led NASA to recommend that the Matched-Filter-Based One-Dimensional Search Method be used for gust load analyses of nonlinear aircraft. This manual describes this method, describes a FORTRAN code which performs this method, and presents example calculations for a sample nonlinear aircraft model. The name of the code is MFD1DS (Matched-Filter-Based One-Dimensional Search). The program source code, the example aircraft equations of motion, a sample input file, and a sample program output are all listed in the appendices.

  2. Visualizing dispersive features in 2D image via minimum gradient method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Yu; Wang, Yan; Shen, Zhi -Xun

    Here, we developed a minimum gradient based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast especially for weak intensity features and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high temperature superconductors is demonstrated.

  3. Visualizing dispersive features in 2D image via minimum gradient method

    DOE PAGES

    He, Yu; Wang, Yan; Shen, Zhi -Xun

    2017-07-24

    Here, we developed a minimum gradient based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast especially for weak intensity features and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high temperature superconductors is demonstrated.
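
    A deliberately naive per-column version of the minimum-gradient idea is sketched below; it simply looks, within each column of the image, for the above-background pixel with the smallest gradient magnitude, and should not be mistaken for the authors' published algorithm.

      import numpy as np

      def ridge_by_minimum_gradient(image):
          """Naive ridge tracker: per column, return the row of the pixel with
          above-average intensity whose gradient magnitude is smallest."""
          gy, gx = np.gradient(image.astype(float))
          grad_mag = np.hypot(gx, gy)
          ridge = np.empty(image.shape[1], dtype=int)
          for j in range(image.shape[1]):
              col = image[:, j]
              candidates = np.flatnonzero(col > col.mean())   # ignore background
              if candidates.size == 0:
                  candidates = np.arange(col.size)
              ridge[j] = candidates[np.argmin(grad_mag[candidates, j])]
          return ridge   # row index of the estimated ridge in each column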

  4. The MPLEx Protocol for Multi-omic Analyses of Soil Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicora, Carrie D.; Burnum-Johnson, Kristin E.; Nakayasu, Ernesto S.

    Mass spectrometry (MS)-based integrated metaproteomic, metabolomic and lipidomic (multi-omic) studies are transforming our ability to understand and characterize microbial communities in environmental and biological systems. These measurements are even enabling enhanced analyses of complex soil microbial communities, which are the most complex microbial systems known to date. Multi-omic analyses, however, do have sample preparation challenges since separate extractions are typically needed for each omic study, thereby greatly amplifying the preparation time and amount of sample required. To address this limitation, a 3-in-1 method for simultaneous metabolite, protein, and lipid extraction (MPLEx) from the exact same soil sample was created by adapting a solvent-based approach. This MPLEx protocol has proven to be simple yet robust for many sample types and even when utilized for limited quantities of complex soil samples. The MPLEx method also greatly enabled the rapid multi-omic measurements needed to gain a better understanding of the members of each microbial community, while evaluating the changes taking place upon biological and environmental perturbations.

  5. Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2014-07-01

    The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure, using a method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Haplotype frequency estimation using the discrete Laplace method has previously been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis, further validating the method. A very important practical fact is that the calculations can be performed on a normal computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes, similar to the results of previous studies. We also compared pairwise distances (between geographically separated samples) with those obtained using the AMOVA method and found good agreement. Further analyses that are impossible with AMOVA were made using the discrete Laplace method: analysis of the homogeneity in two different ways and calculation of marginal STR distributions. We found that the Y-STR haplotypes from e.g. Finland were relatively homogeneous, as opposed to the relatively heterogeneous Y-STR haplotypes from e.g. Lublin, Eastern Poland and Berlin, Germany. We demonstrated that the observed distributions of alleles at each locus were similar to the expected ones. We also compared pairwise distances between geographically separated samples from Africa with those obtained using the AMOVA method and found good agreement. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
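
    The building block of the model named above is the discrete Laplace distribution; the toy sketch below evaluates a haplotype probability as a product of one such term per STR locus, ignoring the mixture over clusters and the parameter estimation of the actual method.

      import numpy as np

      def discrete_laplace_pmf(x, p, mu):
          """P(X = x) = (1 - p) / (1 + p) * p**|x - mu| for integer x."""
          return (1.0 - p) / (1.0 + p) * p ** abs(x - mu)

      def haplotype_probability(haplotype, ps, mus):
          """Toy haplotype probability: independent discrete Laplace terms,
          one per locus, around a central haplotype mus."""
          return float(np.prod([discrete_laplace_pmf(x, p, m)
                                for x, p, m in zip(haplotype, ps, mus)]))

      # Hypothetical three-locus Y-STR haplotype and dispersion parameters.
      print(haplotype_probability(haplotype=[14, 29, 16],
                                  ps=[0.30, 0.40, 0.25],
                                  mus=[14, 30, 16]))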

  6. Instance Analysis for the Error of Three-pivot Pressure Transducer Static Balancing Method for Hydraulic Turbine Runner

    NASA Astrophysics Data System (ADS)

    Weng, Hanli; Li, Youping

    2017-04-01

    The working principle, process device and test procedure of the runner static balancing test method, based on weighing with three-pivot pressure transducers, are introduced in this paper. Based on an actual instance of a V hydraulic turbine runner, the error and sensitivity of the three-pivot pressure transducer static balancing method are analysed. Suggestions for improving the accuracy and the application of the method are also proposed.

  7. Exploring the potential of analysing visual search behaviour data using FROC (free-response receiver operating characteristic) method: an initial study

    NASA Astrophysics Data System (ADS)

    Dong, Leng; Chen, Yan; Dias, Sarah; Stone, William; Dias, Joseph; Rout, John; Gale, Alastair G.

    2017-03-01

    Visual search techniques and FROC analysis have been widely used in radiology to understand medical image perceptual behaviour and diagnostic performance. The potential of exploiting the advantages of both methodologies is of great interest to medical researchers. In this study, eye tracking data of eight dental practitioners were investigated. The visual search measures and their analyses are considered here. Each participant interpreted 20 dental radiographs which were chosen by an expert dental radiologist. Various eye movement measurements were obtained based on image area of interest (AOI) information. FROC analysis was then carried out by using these eye movement measurements as a direct input source. The performance of FROC methods using different input parameters was tested. The results showed that there were significant differences in FROC measures, based on eye movement data, between groups with different experience levels. Namely, the area under the curve (AUC) score showed higher values for the experienced group for the measurements of fixation and dwell time. Also, positive correlations were found for AUC scores between the FROC based on eye movement data and the rating-based FROC. FROC analysis using eye movement measurements as input variables can act as a potential performance indicator for assessment in medical imaging interpretation and for assessing training procedures. Visual search data analyses lead to new ways of combining eye movement data and FROC methods to provide an alternative dimension for assessing performance and visual search behaviour in medical imaging perceptual tasks.

  8. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for knowing which scores best quantify a particular skill of a model, but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
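
    For the dichotomous part of such a verification, the usual contingency-table scores are easy to compute; the sketch below shows generic formulas for POD, FAR, CSI and frequency bias at a fixed precipitation threshold, not the verification package used in the study.

      import numpy as np

      def dichotomous_scores(forecast, observed, threshold=1.0):
          """POD, FAR, CSI and frequency bias for a yes/no event at a threshold (mm)."""
          f = np.asarray(forecast) >= threshold
          o = np.asarray(observed) >= threshold
          hits = np.sum(f & o)
          misses = np.sum(~f & o)
          false_alarms = np.sum(f & ~o)
          pod = hits / (hits + misses)
          far = false_alarms / (hits + false_alarms)
          csi = hits / (hits + misses + false_alarms)
          bias = (hits + false_alarms) / (hits + misses)
          return pod, far, csi, bias

      # Hypothetical 24-h accumulations (mm) at six verification points.
      print(dichotomous_scores([0.0, 2.5, 5.0, 0.2, 8.0, 1.5],
                               [0.0, 3.0, 0.5, 1.2, 7.5, 2.0]))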

  9. A methodological investigation of hominoid craniodental morphology and phylogenetics.

    PubMed

    Bjarnason, Alexander; Chamberlain, Andrew T; Lockwood, Charles A

    2011-01-01

    The evolutionary relationships of extant great apes and humans have been largely resolved by molecular studies, yet morphology-based phylogenetic analyses continue to provide conflicting results. In order to further investigate this discrepancy we present bootstrap clade support of morphological data based on two quantitative datasets, one dataset consisting of linear measurements of the whole skull from 5 hominoid genera and the second dataset consisting of 3D landmark data from the temporal bone of 5 hominoid genera, including 11 sub-species. Using similar protocols for both datasets, we were able to 1) compare distance-based phylogenetic methods to cladistic parsimony of quantitative data converted into discrete character states, 2) vary outgroup choice to observe its effect on phylogenetic inference, and 3) analyse male and female data separately to observe the effect of sexual dimorphism on phylogenies. Phylogenetic analysis was sensitive to methodological decisions, particularly outgroup selection, where designation of Pongo as an outgroup and removal of Hylobates resulted in greater congruence with the proposed molecular phylogeny. The performance of distance-based methods also justifies their use in phylogenetic analysis of morphological data. It is clear from our analyses that hominoid phylogenetics ought not to be used as an example of conflict between the morphological and molecular, but as an example of how outgroup and methodological choices can affect the outcome of phylogenetic analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. Comparative analysis of EV isolation procedures for miRNAs detection in serum samples.

    PubMed

    Andreu, Zoraida; Rivas, Eva; Sanguino-Pascual, Aitana; Lamana, Amalia; Marazuela, Mónica; González-Alvaro, Isidoro; Sánchez-Madrid, Francisco; de la Fuente, Hortensia; Yáñez-Mó, María

    2016-01-01

    Extracellular vesicles (EVs) are emerging as potent non-invasive biomarkers. However, current methodologies are time-consuming and difficult to translate to clinical practice. To analyse EV-encapsulated circulating miRNA, we searched for a quick, easy and economic method to enrich frozen human serum samples for EV. We compared the efficiency of several protocols and commercial kits to isolate EVs. Different methods based on precipitation, columns or filter systems were tested and compared with ultracentrifugation, which is the most classical protocol to isolate EVs. EV samples were assessed for purity and quantity by nanoparticle tracking analysis and western blot or cytometry against major EV protein markers. For biomarker validation, levels of a set of miRNAs were determined in EV fractions and compared with their levels in total serum. EVs isolated with precipitation-based methods were enriched for a subgroup of miRNAs that corresponded to miRNAs described to be encapsulated into EVs (miR-126, miR-30c and miR-143), while the detection of miR-21, miR-16-5p and miR-19a was very low compared with total serum. Our results point to precipitation using polyethylene glycol (PEG) as a suitable method for an easy and cheap enrichment of serum EVs for miRNA analyses. The overall performance of PEG was very similar to, or better than, that of other commercial precipitating reagents in both protein and miRNA yield, but in comparison to them PEG is much cheaper. Other methods presented poorer results, mostly when assessing miRNA by qPCR analyses. Using PEG precipitation in a longitudinal study with human samples, we demonstrated that miRNA could be assessed in frozen samples after up to 8 years of storage. We report a method based on a cut-off value of mean of fold EV detection versus serum that provides an estimate of the degree of encapsulation of a given miRNA.

  11. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

    Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  12. Simple Sodium Dodecyl Sulfate-Assisted Sample Preparation Method for LC-MS-based Proteomic Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jianying; Dann, Geoffrey P.; Shi, Tujin

    2012-03-10

    Sodium dodecyl sulfate (SDS) is one of the most popular laboratory reagents used for highly efficient biological sample extraction; however, SDS presents a significant challenge to LC-MS-based proteomic analyses due to its severe interference with reversed-phase LC separations and electrospray ionization interfaces. This study reports a simple SDS-assisted proteomic sample preparation method facilitated by a novel peptide-level SDS removal protocol. After SDS-assisted protein extraction and digestion, SDS was effectively (>99.9%) removed from peptides through ion substitution-mediated DS- precipitation with potassium chloride (KCl) followed by ~10 min centrifugation. Excellent peptide recovery (>95%) was observed for less than 20 µg of peptides. Further experiments demonstrated the compatibility of this protocol with LC-MS/MS analyses. The resulting proteome coverage from this SDS-assisted protocol was comparable to or better than those obtained from other standard proteomic preparation methods in both mammalian tissues and bacterial samples. These results suggest that this SDS-assisted protocol is a practical, simple, and broadly applicable proteomic sample processing method, which can be particularly useful when dealing with samples difficult to solubilize by other methods.

  13. Component isolation for multi-component signal analysis using a non-parametric gaussian latent feature model

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Peng, Zhike; Dong, Xingjian; Zhang, Wenming; Clifton, David A.

    2018-03-01

    A challenge in analysing non-stationary multi-component signals is to isolate nonlinearly time-varying signals, especially when they overlap in the time-frequency plane. In this paper, a framework integrating time-frequency analysis-based demodulation and a non-parametric Gaussian latent feature model is proposed to isolate and recover components of such signals. The former aims to remove high-order frequency modulation (FM) such that the latter is able to infer demodulated components while simultaneously discovering the number of target components. The proposed method is effective in isolating multiple components that have the same FM behavior. In addition, the results show that the proposed method is superior to the generalised demodulation with singular-value decomposition-based method, the parametric time-frequency analysis with filter-based method, and the empirical mode decomposition-based method in recovering the amplitude and phase of superimposed components.

  14. Statistical strategies to quantify respiratory sinus arrhythmia: Are commonly used metrics equivalent?

    PubMed Central

    Lewis, Gregory F.; Furman, Senta A.; McCool, Martha F.; Porges, Stephen W.

    2011-01-01

    Three frequently used RSA metrics are investigated to document violations of assumptions for parametric analyses, moderation by respiration, influences of nonstationarity, and sensitivity to vagal blockade. Although all metrics are highly correlated, new findings illustrate that the metrics are noticeably different on the above dimensions. Only one method conforms to the assumptions for parametric analyses, is not moderated by respiration, is not influenced by nonstationarity, and reliably generates stronger effect sizes. Moreover, this method is also the most sensitive to vagal blockade. Specific features of this method may provide insights into improving the statistical characteristics of other commonly used RSA metrics. These data provide the evidence to question, based on statistical grounds, published reports using particular metrics of RSA. PMID:22138367

  15. Partial differential equation techniques for analysing animal movement: A comparison of different methods.

    PubMed

    Wang, Yi-Shan; Potts, Jonathan R

    2017-03-07

    Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
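    For orientation, the PDE approximations discussed here are generally of advection-diffusion (drift-diffusion) type; a minimal generic form, not the specific equations analysed in the paper, is

$$\frac{\partial u}{\partial t} \;=\; \nabla\cdot\bigl(D(\mathbf{x})\,\nabla u\bigr) \;-\; \nabla\cdot\bigl(\mathbf{c}(\mathbf{x})\,u\bigr),$$

    where $u(\mathbf{x},t)$ is the space-use density and the diffusion coefficient $D$ and drift term $\mathbf{c}$ are derived from the underlying movement kernel.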

  16. Displacement-based back-analysis of the model parameters of the Nuozhadu high earth-rockfill dam.

    PubMed

    Wu, Yongkang; Yuan, Huina; Zhang, Bingyin; Zhang, Zongliang; Yu, Yuzhen

    2014-01-01

    The parameters of the constitutive model, the creep model, and the wetting model of materials of the Nuozhadu high earth-rockfill dam were back-analyzed together based on field monitoring displacement data by employing an intelligent back-analysis method. In this method, an artificial neural network is used as a substitute for time-consuming finite element analysis, and an evolutionary algorithm is applied for both network training and parameter optimization. To avoid simultaneous back-analysis of many parameters, the model parameters of the three main dam materials are decoupled and back-analyzed separately in a particular order. Displacement back-analyses were performed at different stages of the construction period, with and without considering the creep and wetting deformations. Good agreement between the numerical results and the monitoring data was obtained for most observation points, which implies that the back-analysis method and decoupling method are effective for solving complex problems with multiple models and parameters. The comparison of calculation results based on different sets of back-analyzed model parameters indicates the necessity of taking the effects of creep and wetting into consideration in the numerical analyses of high earth-rockfill dams. With the resulting model parameters, the stress and deformation distributions at completion are predicted and analyzed.
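    The surrogate-plus-optimizer pattern described above can be sketched in a few lines. The following is a hedged illustration only, not the authors' implementation: it assumes hypothetical files of finite-element parameter samples, computed displacements and field monitoring data, uses a scikit-learn neural network as the surrogate, and differential evolution as the evolutionary optimizer.

```python
# Hedged sketch of surrogate-assisted displacement back-analysis (illustrative data files).
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import differential_evolution

# X_train: candidate model-parameter sets; Y_train: FE-computed displacements
X_train = np.load("fe_parameter_samples.npy")     # shape (n_samples, n_params)
Y_train = np.load("fe_displacements.npy")         # shape (n_samples, n_points)
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000).fit(X_train, Y_train)

observed = np.load("monitoring_displacements.npy")   # field data at observation points

def misfit(params):
    predicted = surrogate.predict(params.reshape(1, -1))[0]
    return np.sum((predicted - observed) ** 2)

bounds = list(zip(X_train.min(axis=0), X_train.max(axis=0)))
result = differential_evolution(misfit, bounds, seed=0)
print("back-analysed parameters:", result.x)
```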

  17. Fast determination of royal jelly freshness by a chromogenic reaction.

    PubMed

    Zheng, Huo-Qing; Wei, Wen-Ting; Wu, Li-Ming; Hu, Fu-Liang; Dietemann, Vincent

    2012-06-01

    Royal jelly is one of the most important products of honeybees. Given its role in the development of bee brood into fertile individuals of the royal caste, it is also used in health products for human consumption. Royal jelly spoils and loses its health-promoting properties depending on storage duration and conditions. To ensure product quality before selling, it is therefore necessary to assess royal jelly freshness. Many indexes of freshness have been suggested, but they all lack reliability or require complex and time-consuming analyses. Here we describe a method to detect royal jelly freshness based on a chromogenic reaction between royal jelly and HCl. We demonstrate that analyses based on color parameters allow for the discrimination of royal jelly samples based on the duration of their storage. Color parameters of royal jelly stored at -18 and 4 °C for 28 d remained comparable to those of fresh samples, which supports the reliability of the method. The method of freshness determination described here is practical, cheap, and fast and can thus be used in real time when trading royal jelly. © 2012 Institute of Food Technologists®

  18. Recent advances in the analysis of behavioural organization and interpretation as indicators of animal welfare

    PubMed Central

    Asher, Lucy; Collins, Lisa M.; Ortiz-Pelaez, Angel; Drewe, Julian A.; Nicol, Christine J.; Pfeiffer, Dirk U.

    2009-01-01

    While the incorporation of mathematical and engineering methods has greatly advanced in other areas of the life sciences, they have been under-utilized in the field of animal welfare. Exceptions are beginning to emerge and share a common motivation to quantify ‘hidden’ aspects in the structure of the behaviour of an individual, or group of animals. Such analyses have the potential to quantify behavioural markers of pain and stress and quantify abnormal behaviour objectively. This review seeks to explore the scope of such analytical methods as behavioural indicators of welfare. We outline four classes of analyses that can be used to quantify aspects of behavioural organization. The underlying principles, possible applications and limitations are described for: fractal analysis, temporal methods, social network analysis, and agent-based modelling and simulation. We hope to encourage further application of analyses of behavioural organization by highlighting potential applications in the assessment of animal welfare, and increasing awareness of the scope for the development of new mathematical methods in this area. PMID:19740922

  19. Tolerance analyses of a quadrupole magnet for advanced photon source upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, J., E-mail: Jieliu@aps.anl.gov; Jaski, M., E-mail: jaski@aps.anl.gov; Borland, M., E-mail: borland@aps.anl.gov

    2016-07-27

    Given physics requirements, the mechanical fabrication and assembly tolerances for storage ring magnets can be calculated using analytical methods [1, 2]. However, this method is not easy for complicated magnet designs [1]. In this paper, a novel method is proposed to determine fabrication and assembly tolerances consistent with physics requirements, through a combination of magnetic and mechanical tolerance analyses. In this study, finite element analysis using OPERA is conducted to estimate the effect of fabrication and assembly errors on the magnetic field of a quadrupole magnet and to determine the allowable tolerances to achieve the specified magnetic performances. Based on the study, allowable fabrication and assembly tolerances for the quadrupole assembly are specified for the mechanical design of the quadrupole magnet. Next, to achieve the required assembly-level tolerances, mechanical tolerance stack-up analyses using a 3D tolerance analysis package are carried out to determine the part and subassembly level fabrication tolerances. This method can be used to determine the tolerances for the design of other individual magnets and of magnet strings.

  20. A novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China

    PubMed Central

    Yang, Yanzheng; Zhu, Qiuan; Peng, Changhui; Wang, Han; Xue, Wei; Lin, Guanghui; Wen, Zhongming; Chang, Jie; Wang, Meng; Liu, Guobin; Li, Shiqing

    2016-01-01

    Increasing evidence indicates that current dynamic global vegetation models (DGVMs) have suffered from insufficient realism and are difficult to improve, particularly because they are built on plant functional type (PFT) schemes. Therefore, new approaches, such as plant trait-based methods, are urgently needed to replace PFT schemes when predicting the distribution of vegetation and investigating vegetation sensitivity. As an important direction towards constructing next-generation DGVMs based on plant functional traits, we propose a novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China. The results demonstrated that a Gaussian mixture model (GMM) trained with a LMA-Nmass-LAI data combination yielded an accuracy of 72.82% in simulating vegetation distribution, providing more detailed parameter information regarding community structures and ecosystem functions. The new approach also performed well in analyses of vegetation sensitivity to different climatic scenarios. Although the trait-climate relationship is not the only candidate useful for predicting vegetation distributions and analysing climatic sensitivity, it sheds new light on the development of next-generation trait-based DGVMs. PMID:27052108
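    A hedged sketch of the general idea, not the authors' model: one Gaussian mixture is fitted per vegetation class on trait features (e.g. LMA, Nmass, LAI) and grid cells are assigned to the class with the highest likelihood. The data files, array shapes and number of mixture components are illustrative assumptions.

```python
# Hedged illustration of GMM-based vegetation classification from trait data.
import numpy as np
from sklearn.mixture import GaussianMixture

X_train = np.load("traits_train.npy")     # shape (n_sites, 3): LMA, Nmass, LAI
y_train = np.load("veg_class_train.npy")  # integer vegetation class per site
X_grid = np.load("traits_grid.npy")       # trait values predicted on a climate grid

classes = np.unique(y_train)
models = {c: GaussianMixture(n_components=2, random_state=0).fit(X_train[y_train == c])
          for c in classes}
loglik = np.column_stack([models[c].score_samples(X_grid) for c in classes])
predicted = classes[np.argmax(loglik, axis=1)]   # most likely vegetation class per grid cell
```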

  1. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis

    PubMed Central

    Turner, Rebecca M; Jackson, Dan; Wei, Yinghui; Thompson, Simon G; Higgins, Julian P T

    2015-01-01

    Numerous meta-analyses in healthcare research combine results from only a small number of studies, for which the variance representing between-study heterogeneity is estimated imprecisely. A Bayesian approach to estimation allows external evidence on the expected magnitude of heterogeneity to be incorporated. The aim of this paper is to provide tools that improve the accessibility of Bayesian meta-analysis. We present two methods for implementing Bayesian meta-analysis, using numerical integration and importance sampling techniques. Based on 14 886 binary outcome meta-analyses in the Cochrane Database of Systematic Reviews, we derive a novel set of predictive distributions for the degree of heterogeneity expected in 80 settings depending on the outcomes assessed and comparisons made. These can be used as prior distributions for heterogeneity in future meta-analyses. The two methods are implemented in R, for which code is provided. Both methods produce equivalent results to standard but more complex Markov chain Monte Carlo approaches. The priors are derived as log-normal distributions for the between-study variance, applicable to meta-analyses of binary outcomes on the log odds-ratio scale. The methods are applied to two example meta-analyses, incorporating the relevant predictive distributions as prior distributions for between-study heterogeneity. We have provided resources to facilitate Bayesian meta-analysis, in a form accessible to applied researchers, which allow relevant prior information on the degree of heterogeneity to be incorporated. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25475839
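    The numerical-integration route can be sketched directly. The following is an illustrative grid-based computation of the posterior for the between-study variance under a log-normal prior; the example data and the prior's location and scale on log(tau^2) are assumptions, and it is not the authors' R code.

```python
# Hedged sketch: random-effects meta-analysis, flat prior on the pooled effect mu,
# log-normal predictive prior on the between-study variance tau^2, posterior by grid.
import numpy as np
from scipy import stats

y = np.array([-0.36, -0.10, -0.55, -0.21])   # study log odds-ratios (illustrative)
v = np.array([0.04, 0.09, 0.12, 0.06])       # within-study variances (illustrative)

log_tau2 = np.linspace(-8.0, 3.0, 400)
tau2 = np.exp(log_tau2)
prior = stats.norm.pdf(log_tau2, loc=-2.0, scale=1.5)   # assumed predictive prior on log(tau^2)

def log_marginal(t2):
    # marginal likelihood of tau^2 with mu integrated out under a flat prior
    w = 1.0 / (v + t2)
    mu_hat = np.sum(w * y) / np.sum(w)
    return -0.5 * np.sum(np.log(v + t2)) - 0.5 * np.log(np.sum(w)) - 0.5 * np.sum(w * (y - mu_hat) ** 2)

loglik = np.array([log_marginal(t) for t in tau2])
post = prior * np.exp(loglik - loglik.max())
d = log_tau2[1] - log_tau2[0]
post /= post.sum() * d                         # normalised posterior density on the grid
cdf = np.cumsum(post) * d
print("posterior median of tau^2:", np.exp(np.interp(0.5, cdf, log_tau2)))
```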

  2. The effect of instructional methodology on high school students' natural sciences standardized test scores

    NASA Astrophysics Data System (ADS)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content, meeting the school's mission and goals.
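    The Stage 2 comparison amounts to a standard ANCOVA; a hedged sketch with hypothetical variable names (posttest, pretest, method) and data file follows.

```python
# Hedged sketch of an ANCOVA comparing posttest scores by teaching method,
# adjusting for a pretest covariate (illustrative data file and column names).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

scores = pd.read_csv("student_scores.csv")   # columns: posttest, pretest, method
model = smf.ols("posttest ~ pretest + C(method)", data=scores).fit()
print(anova_lm(model, typ=2))                # method effect adjusted for pretest
```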

  3. Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method

    PubMed Central

    Lu, Zhaolin

    2017-01-01

    Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using mechanical sieving approach. Considering the great variation of particle size from micrometer to millimeter, the powders greater than 250 μm were photographed by a flatbed scanner without zoom function, and the others were photographed using a scanning electron microscopy (SEM) with high-image resolution. Actual imaging tests confirmed the excellent effect of backscattered electron (BSE) imaging mode of SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method. In sample preparation, the singulated arrangement and ultrasonic dispersion methods were used to separate powders into particles that were larger and smaller than the nominal size of 250 μm. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method was suitable to analyse the particle size and shape distributions of ground biomass materials and solve the size inconsistencies in sieving analysis. PMID:28298925
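    As an orientation to the measurement step, a basic thresholding-and-labelling pipeline of the kind described (not the paper's improved segmentation algorithm) might look as follows; the image file and pixel calibration are assumptions.

```python
# Hedged sketch: threshold, label and measure particles in a scanned powder image.
import numpy as np
from skimage import io, filters, measure, morphology

image = io.imread("straw_particles.png", as_gray=True)
binary = image < filters.threshold_otsu(image)            # particles darker than background
binary = morphology.remove_small_objects(binary, min_size=50)

labels = measure.label(binary)
pixel_size_um = 10.0                                        # assumed calibration
for region in measure.regionprops(labels):
    d_eq = region.equivalent_diameter * pixel_size_um      # equivalent circular diameter
    aspect = region.major_axis_length / max(region.minor_axis_length, 1e-9)
    print(f"particle {region.label}: d_eq={d_eq:.1f} um, aspect ratio={aspect:.2f}")
```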

  4. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  5. A ‘reader’ unit of the chemical computer

    PubMed Central

    Smelov, Pavel S.

    2018-01-01

    We suggest the main principles and functional units of the parallel chemical computer, namely, (i) a generator (which is a network of coupled oscillators) of oscillatory dynamic modes, (ii) a unit which is able to recognize these modes (a ‘reader’) and (iii) a decision-making unit, which analyses the current mode, compares it with the external signal and sends a command to the mode generator to switch it to another dynamical regime. Three main methods for the functioning of the reader unit are suggested and tested computationally: (a) the polychronization method, which explores the differences between the phases of the generator oscillators; (b) the amplitude method, which detects clusters of the generator; and (c) the resonance method, which is based on the resonances between the frequencies of the generator modes and the internal frequencies of the damped oscillations of the reader cells. The pros and cons of these methods have been analysed. PMID:29410852

  6. The Establishment of LTO Emission Inventory of Civil Aviation Airports Based on Big Data

    NASA Astrophysics Data System (ADS)

    Lu, Chengwei; Liu, Hefan; Song, Danlin; Yang, Xinyue; Tan, Qinwen; Hu, Xiang; Kang, Xue

    2018-03-01

    An estimation model for the LTO (landing and take-off) emissions of civil aviation airports was developed in this paper. LTO big data were acquired from the internet using Python, the LTO emissions were dynamically calculated based on daily LTO data, and an uncertainty analysis was conducted with the Monte Carlo method. Using the model, the LTO emissions of Shuangliu International Airport were calculated, and the characteristics and temporal distribution of LTO activity in 2015 were analysed. The results indicate that, compared with traditional methods, the established model can calculate the LTO emissions from different types of airplanes more accurately. Based on the hourly LTO information of 302 valid days, the total number of LTO cycles in Chengdu Shuangliu International Airport was found to be 274,645; the annual emissions of SO2, NOx, VOCs, CO, PM10 and PM2.5 were estimated, and the uncertainty of the model was around 7% to 10%, varying by pollutant.
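    The Monte Carlo uncertainty step can be sketched generically. The following is an illustration only, not the paper's model: annual emissions are summed over aircraft groups as LTO counts times emission factors, with the factors perturbed by an assumed 10% relative uncertainty; all counts and factors are made-up numbers.

```python
# Hedged sketch of a Monte Carlo uncertainty analysis of annual LTO NOx emissions.
import numpy as np

rng = np.random.default_rng(0)
lto_counts = {"A320": 120_000, "B737": 90_000, "A330": 30_000}   # LTO cycles/year (illustrative)
nox_factor_kg = {"A320": 10.5, "B737": 9.8, "A330": 27.0}        # kg NOx per LTO cycle (illustrative)

n_draws = 10_000
totals = np.zeros(n_draws)
for aircraft, count in lto_counts.items():
    factors = rng.normal(nox_factor_kg[aircraft], 0.10 * nox_factor_kg[aircraft], n_draws)
    totals += count * factors / 1_000.0                          # tonnes NOx

low, mid, high = np.percentile(totals, [2.5, 50, 97.5])
print(f"annual NOx: {mid:.0f} t (95% interval {low:.0f}-{high:.0f} t)")
```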

  7. Estimating population diversity with CatchAll

    PubMed Central

    Bunge, John; Woodard, Linda; Böhning, Dankmar; Foster, James A.; Connolly, Sean; Allen, Heather K.

    2012-01-01

    Motivation: The massive data produced by next-generation sequencing require advanced statistical tools. We address estimating the total diversity or species richness in a population. To date, only relatively simple methods have been implemented in available software. There is a need for software employing modern, computationally intensive statistical analyses including error, goodness-of-fit and robustness assessments. Results: We present CatchAll, a fast, easy-to-use, platform-independent program that computes maximum likelihood estimates for finite-mixture models, weighted linear regression-based analyses and coverage-based non-parametric methods, along with outlier diagnostics. Given sample ‘frequency count’ data, CatchAll computes 12 different diversity estimates and applies a model-selection algorithm. CatchAll also derives discounted diversity estimates to adjust for possibly uncertain low-frequency counts. It is accompanied by an Excel-based graphics program. Availability: Free executable downloads for Linux, Windows and Mac OS, with manual and source code, at www.northeastern.edu/catchall. Contact: jab18@cornell.edu PMID:22333246

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C. L.; Funk, L. L.; Riedel, R. A.

    3He gas-based neutron linear-position-sensitive detectors (LPSDs) have been applied for many neutron scattering instruments. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in neutron-gamma efficiency ratios on the order of 10^5-10^6. The NGD ratios of 3He detectors need to be improved for even better scientific results from neutron scattering. Digital Signal Processing (DSP) analyses of waveforms were proposed for obtaining better NGD ratios, based on features extracted from rise-time, pulse amplitude, charge integration, a simplified Wiener filter, and the cross-correlation between individual and template waveforms of neutron and gamma events. Fisher linear discriminant analysis (FLDA) and three multivariate analyses (MVAs) of the features were performed. The NGD ratios are improved by about 10^2-10^3 times compared with the traditional PHA method. Finally, our results indicate the NGD capabilities of 3He tube detectors can be significantly improved with subspace-learning based methods, which may result in a reduced data-collection time and better data quality for further data reduction.
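    A hedged sketch of the FLDA step on extracted waveform features (rise time, amplitude, integrated charge, and so on) follows; it is not the authors' pipeline, and the feature file and labels are illustrative assumptions.

```python
# Hedged sketch: Fisher linear discriminant analysis on waveform features for
# neutron-gamma discrimination (illustrative data files).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

X = np.load("waveform_features.npy")   # shape (n_events, n_features)
y = np.load("event_labels.npy")        # 0 = gamma, 1 = neutron

flda = LinearDiscriminantAnalysis()
print("cross-validated discrimination accuracy:", cross_val_score(flda, X, y, cv=5).mean())

# The 1-D discriminant score can be thresholded to trade neutron efficiency
# against gamma rejection.
scores = flda.fit(X, y).transform(X).ravel()
```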

  9. Ab initio O(N) elongation-counterpoise method for BSSE-corrected interaction energy analyses in biosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orimoto, Yuuichi; Xie, Peng; Liu, Kai

    2015-03-14

    An elongation-counterpoise (ELG-CP) method was developed for performing accurate and efficient interaction energy analysis and correcting the basis set superposition error (BSSE) in biosystems. The method was achieved by combining our developed ab initio O(N) elongation method with the conventional counterpoise method proposed for solving the BSSE problem. As a test, the ELG-CP method was applied to the analysis of the DNAs' inter-strand interaction energies with respect to the alkylation-induced base pair mismatch phenomenon that causes a transition from G⋯C to A⋯T. It was found that the ELG-CP method showed high efficiency (nearly linear-scaling) and high accuracy with a negligibly small energy error in the total energy calculations (on the order of 10^−7–10^−8 hartree/atom) as compared with the conventional method during the counterpoise treatment. Furthermore, the magnitude of the BSSE was found to be ca. −290 kcal/mol for the calculation of a DNA model with 21 base pairs. This emphasizes the importance of BSSE correction when a limited-size basis set is used to study the DNA models and compare small energy differences between them. In this work, we quantitatively estimated the inter-strand interaction energy for each possible step in the transition process from G⋯C to A⋯T by the ELG-CP method. It was found that the base pair replacement in the process only affects the interaction energy for a limited area around the mismatch position with a few adjacent base pairs. From the interaction energy point of view, our results showed that a base pair sliding mechanism possibly occurs after the alkylation of guanine to gain the maximum possible number of hydrogen bonds between the bases. In addition, the steps leading to the A⋯T replacement accompanied with replications were found to be unfavorable processes corresponding to ca. 10 kcal/mol loss in stabilization energy. The present study indicated that the ELG-CP method is promising for performing effective interaction energy analyses in biosystems.

  10. Galaxy Cluster Mass Reconstruction Project - II. Quantifying scatter and bias using contrasting mock catalogues

    DOE PAGES

    Old, L.; Wojtak, R.; Mamon, G. A.; ...

    2015-03-26

    Our paper is the second in a series in which we perform an extensive comparison of various galaxy-based cluster mass estimation techniques that utilize the positions, velocities and colours of galaxies. Our aim is to quantify the scatter, systematic bias and completeness of cluster masses derived from a diverse set of 25 galaxy-based methods using two contrasting mock galaxy catalogues based on a sophisticated halo occupation model and a semi-analytic model. Analysing 968 clusters, we find a wide range in the rms errors in log M200c delivered by the different methods (0.18–1.08 dex, i.e. a factor of ~1.5–12), with abundance-matching and richness methods providing the best results, irrespective of the input model assumptions. In addition, certain methods produce a significant number of catastrophic cases where the mass is under- or overestimated by a factor greater than 10. Given the steeply falling high-mass end of the cluster mass function, we recommend that richness- or abundance-matching-based methods are used in conjunction with these methods as a sanity check for studies selecting high-mass clusters. We also see a stronger correlation of the recovered to input number of galaxies for both catalogues in comparison with the group/cluster mass; however, this does not guarantee that the correct member galaxies are being selected. Finally, we did not observe significantly higher scatter for either mock galaxy catalogue. These results have implications for cosmological analyses that utilize the masses, richnesses, or abundances of clusters, which have different uncertainties when different methods are used.

  11. A new analytical method for characterizing nonlinear visual processes with stimuli of arbitrary distribution: Theory and applications.

    PubMed

    Hayashi, Ryusuke; Watanabe, Osamu; Yokoyama, Hiroki; Nishida, Shin'ya

    2017-06-01

    Characterization of the functional relationship between sensory inputs and neuronal or observers' perceptual responses is one of the fundamental goals of systems neuroscience and psychophysics. Conventional methods, such as reverse correlation and spike-triggered data analyses are limited in their ability to resolve complex and inherently nonlinear neuronal/perceptual processes because these methods require input stimuli to be Gaussian with a zero mean. Recent studies have shown that analyses based on a generalized linear model (GLM) do not require such specific input characteristics and have advantages over conventional methods. GLM, however, relies on iterative optimization algorithms and its calculation costs become very expensive when estimating the nonlinear parameters of a large-scale system using large volumes of data. In this paper, we introduce a new analytical method for identifying a nonlinear system without relying on iterative calculations and yet also not requiring any specific stimulus distribution. We demonstrate the results of numerical simulations, showing that our noniterative method is as accurate as GLM in estimating nonlinear parameters in many cases and outperforms conventional, spike-triggered data analyses. As an example of the application of our method to actual psychophysical data, we investigated how different spatiotemporal frequency channels interact in assessments of motion direction. The nonlinear interaction estimated by our method was consistent with findings from previous vision studies and supports the validity of our method for nonlinear system identification.

  12. Influence of exposure assessment and parameterization on exposure response. Aspects of epidemiologic cohort analysis using the Libby Amphibole asbestos worker cohort.

    PubMed

    Bateson, Thomas F; Kopylev, Leonid

    2015-01-01

    Recent meta-analyses of occupational epidemiology studies identified two important exposure data quality factors in predicting summary effect measures for asbestos-associated lung cancer mortality risk: sufficiency of job history data and percent coverage of work history by measured exposures. The objective was to evaluate different exposure parameterizations suggested in the asbestos literature using the Libby, MT asbestos worker cohort and to evaluate influences of exposure measurement error caused by historically estimated exposure data on lung cancer risks. Focusing on workers hired after 1959, when job histories were well-known and occupational exposures were predominantly based on measured exposures (85% coverage), we found that cumulative exposure alone, and with allowance of exponential decay, fit lung cancer mortality data similarly. Residence-time-weighted metrics did not fit well. Compared with previous analyses based on the whole cohort of Libby workers hired after 1935, when job histories were less well-known and exposures less frequently measured (47% coverage), our analyses based on higher quality exposure data yielded an effect size as much as 3.6 times higher. Future occupational cohort studies should continue to refine retrospective exposure assessment methods, consider multiple exposure metrics, and explore new methods of maintaining statistical power while minimizing exposure measurement error.

  13. Assessing Student Openness to Inquiry-Based Learning in Precalculus

    ERIC Educational Resources Information Center

    Cooper, Thomas; Bailey, Brad; Briggs, Karen; Holliday, John

    2017-01-01

    The authors have completed a 2-year quasi-experimental study on the use of inquiry-based learning (IBL) in precalculus. This study included six traditional lecture-style courses and seven modified Moore method courses taught by three instructors. Both quantitative and qualitative analyses were used to investigate the attitudes and beliefs of the…

  14. Effects of Experiential-Based Videos in Multi-Disciplinary Learning

    ERIC Educational Resources Information Center

    Jabbar, Khalid Bin Abdul; Ong, Alex; Choy, Jeanette; Lim, Lisa

    2013-01-01

    This study examined the use of authentic experiential-based videos in self-explanation activities on 32 polytechnic students' learning and motivation, using a mixed method quasi-experimental design. The control group analysed a set of six pre-recorded videos of a subject performing the standing broad jump (SBJ). The experimental group captured…

  15. Let's Be PALS: An Evidence-Based Approach to Professional Development

    ERIC Educational Resources Information Center

    Dunst, Carl J.; Trivette, Carol M.

    2009-01-01

    An evidence-based approach to professional development is described on the basis of the findings from a series of research syntheses and meta-analyses of adult learning methods and strategies. The approach, called PALS (Participatory Adult Learning Strategy), places major emphasis on both active learner involvement in all aspects of training…

  16. Chronic Disease Risk Reduction with a Community-Based Lifestyle Change Programme

    ERIC Educational Resources Information Center

    Merrill, Ray M; Aldana, Steven G; Greenlaw, Roger L; Salberg, Audrey; Englert, Heike

    2008-01-01

    Objective: To assess whether reduced health risks resulting from the Coronary Health Improvement Project (CHIP) persist through 18 months. Methods: The CHIP is a four-week health education course designed to help individuals reduce cardiovascular risk by improving nutrition and physical activity behaviors. Analyses were based on 211 CHIP enrollees,…

  17. A Method to Examine Content Domain Structures

    ERIC Educational Resources Information Center

    D'Agostino, Jerome; Karpinski, Aryn; Welsh, Megan

    2011-01-01

    After a test is developed, most content validation analyses shift from ascertaining domain definition to studying domain representation and relevance because the domain is assumed to be set once a test exists. We present an approach that allows for the examination of alternative domain structures based on extant test items. In our example based on…

  18. Identification of Bacterial Populations in Drinking Water Using 16S rRNA-Based Sequence Analyses

    EPA Science Inventory

    Intracellular RNA is rapidly degraded in stressed cells and is more unstable outside of the cell than DNA. As a result, RNA-based methods have been suggested to study the active microbial fraction in environmental matrices. The aim of this study was to identify bacterial populati...

  19. On existence of the σ(600): Its physical implications and related problems

    NASA Astrophysics Data System (ADS)

    Ishida, Shin

    1998-05-01

    We make a re-analysis of the I=0 ππ scattering phase shift δ₀⁰ through a new method of S-matrix parametrization (IA; interfering amplitude method), and show a result strongly suggesting the existence of the σ particle, the long-sought chiral partner of the π meson. Furthermore, through phenomenological analyses of typical production processes of the 2π system, the pp central collision and the J/Ψ→ωππ decay, by applying an intuitive formula given as a sum of Breit-Wigner amplitudes (VMW; variant mass and width method), further evidence for the existence of the σ is given. The validity of the methods used in the above analyses is investigated, using a simple field-theoretical model, from the general viewpoint of unitarity and the applicability of the final-state interaction (FSI) theorem, especially in relation to the "universality" argument. It is shown that the IA and VMW are obtained as the physical-state representations of scattering and production amplitudes, respectively. The VMW is shown to be an effective method for obtaining resonance properties from production processes, which generally have unknown strong phases. The conventional analyses based on "universality" appear powerless for this purpose.

  20. Incorporating nonlinearity into mediation analyses.

    PubMed

    Knafl, George J; Knafl, Kathleen A; Grey, Margaret; Dixon, Jane; Deatrick, Janet A; Gallo, Agatha M

    2017-03-21

    Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.
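    For orientation, a basic linear mediation decomposition (the special case that the adaptive, monotonic-nonlinear approach generalises) can be sketched as two regressions and a bootstrapped product of coefficients; variable names follow the X, M, Y convention above and the data file is hypothetical.

```python
# Hedged sketch of linear mediation: a = effect of X on M, b = effect of M on Y
# adjusting for X, indirect effect = a*b, with a percentile bootstrap interval.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("mediation_data.csv")          # columns: X, M, Y (illustrative)

def indirect_effect(df):
    a = smf.ols("M ~ X", data=df).fit().params["X"]        # X -> M path
    b = smf.ols("Y ~ M + X", data=df).fit().params["M"]    # M -> Y path, adjusting for X
    return a * b                                            # product-of-coefficients estimate

rng = np.random.default_rng(0)
n = len(data)
boot = [indirect_effect(data.iloc[rng.integers(0, n, n)]) for _ in range(1000)]
print("indirect effect:", indirect_effect(data),
      "95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))
```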

  1. What do results from coordinate-based meta-analyses tell us?

    PubMed

    Albajes-Eizagirre, Anton; Radua, Joaquim

    2018-08-01

    Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and have the same probability to have a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods consider these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, whereas lower in presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. DNA-based methods of geochemical prospecting

    DOEpatents

    Ashby, Matthew [Mill Valley, CA]

    2011-12-06

    The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.

  3. A sediment graph model based on SCS-CN method

    NASA Astrophysics Data System (ADS)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    This paper proposes new conceptual sediment graph models based on coupling of popular and extensively used methods, viz., the Nash model based instantaneous unit sediment graph (IUSG), the soil conservation service curve number (SCS-CN) method, and the power law. These models vary in their complexity and this paper tests their performance using data from the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the power law, β, is more sensitive than the other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
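    The SCS-CN component rests on the standard curve-number runoff relation; a minimal sketch (not the paper's sediment graph model itself) with illustrative rainfall and curve-number values is:

```python
# Hedged illustration of the standard SCS-CN rainfall-runoff relation.
def scs_cn_runoff(p_mm: float, cn: float, lambda_ia: float = 0.2) -> float:
    """Direct runoff depth Q (mm) from rainfall P (mm) using the SCS-CN method."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = lambda_ia * s                # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_cn_runoff(p_mm=75.0, cn=78.0))   # runoff depth for a 75 mm storm, CN = 78
```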

  4. Process for Assessing the Stability of HAN (Hydroxylamine)-Based Liquid Propellants.

    DTIC Science & Technology

    1987-07-29

    The water content of liquid propellants based on HAN (cf. Fig. 1) can be determined directly by Karl Fischer titration; this method requires a special unit. The report cites "Wasserreagenzien nach Eugen Scholz für die Karl-Fischer-Titration" (water reagents according to Eugen Scholz for Karl Fischer titration; guidelines by Riedel-de Haën) and covers propellant components, methods of determination, acid/base titration and pK values, the Titroprozessor 636, and propellant analyses.

  5. Mass spectrometry-based protein identification with accurate statistical significance assignment.

    PubMed

    Alves, Gelio; Yu, Yi-Kuo

    2015-03-01

    Assigning statistical significance accurately has become increasingly important as metadata of many types, often assembled in hierarchies, are constructed and combined for further biological analyses. Statistical inaccuracy of metadata at any level may propagate to downstream analyses, undermining the validity of scientific conclusions thus drawn. From the perspective of mass spectrometry-based proteomics, even though accurate statistics for peptide identification can now be achieved, accurate protein level statistics remain challenging. We have constructed a protein ID method that combines peptide evidences of a candidate protein based on a rigorous formula derived earlier; in this formula the database P-value of every peptide is weighted, prior to the final combination, according to the number of proteins it maps to. We have also shown that this protein ID method provides accurate protein level E-value, eliminating the need of using empirical post-processing methods for type-I error control. Using a known protein mixture, we find that this protein ID method, when combined with the Sorić formula, yields accurate values for the proportion of false discoveries. In terms of retrieval efficacy, the results from our method are comparable with other methods tested. The source code, implemented in C++ on a linux system, is available for download at ftp://ftp.ncbi.nlm.nih.gov/pub/qmbp/qmbp_ms/RAId/RAId_Linux_64Bit. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.

  6. Learning Financial Accounting in a Tertiary Institution of a Developing Country. An Investigation into Instructional Methods

    ERIC Educational Resources Information Center

    Abeysekera, Indra

    2011-01-01

    This study examines three instructional methods (traditional, interactive, and group case-based study), and student opinions on their preference for learning financial accounting in large classes at a metropolitan university in Sri Lanka. It analyses the results of a survey questionnaire of students, using quantitative techniques to determine the…

  7. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method

    Treesearch

    Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome Chave

    2014-01-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...

  8. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  9. Exploring the Relationship between Fidelity of Implementation and Academic Achievement in a Third-Grade Gifted Curriculum: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Azano, Amy; Missett, Tracy C.; Callahan, Carolyn M.; Oh, Sarah; Brunner, Marguerite; Foster, Lisa H.; Moon, Tonya R.

    2011-01-01

    This study used sequential mixed-methods analyses to investigate the effectiveness of a research-based language arts curriculum for gifted third graders. Using analytic induction, researchers found that teachers' beliefs and expectations (time, sense of autonomy, expectations for students, professional expertise) influenced the degree to which…

  10. GPs’ thoughts on prescribing medication and evidence-based knowledge: The benefit aspect is a strong motivator

    PubMed Central

    Skoglund, Ingmarie; Segesten, Kerstin; Björkelund, Cecilia

    2007-01-01

    Objective: To describe GPs' thoughts on prescribing medication and evidence-based knowledge (EBM) concerning drug therapy. Design: Tape-recorded focus-group interviews transcribed verbatim and analysed using qualitative methods. Setting: GPs from the south-eastern part of Västra Götaland, Sweden. Subjects: A total of 16 GPs out of 178 from the south-eastern part of the region, strategically chosen to represent urban and rural settings, male and female GPs, and long and short GP experience. Methods: Transcripts were analysed using a descriptive qualitative method. Results: The categories were: benefits, time and space, and expert knowledge. Benefit was a merging of positive elements across all aspects of the GPs' tasks. Time and space were limitations on the GPs' tasks. EBM, as a constituent of expert knowledge, should be more customer-adjusted to be usable in practice. Benefit was the most important category, present in every decision-making situation for the GP. The core category was prompt and pragmatic benefit, which was the utmost benefit. Conclusion: GPs' thoughts on evidence-based medicine and prescribing medication were highly related to reflecting on benefit and results. The interviews indicated that prompt and pragmatic benefit is important for comprehending their thoughts. PMID:17497487

  11. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325

  12. Local durian (Durio zibethinus Murr.) exploration for potentially superior trees as parents in Ngrambe District, Ngawi

    NASA Astrophysics Data System (ADS)

    Yuniastuti, E.; Anggita, A.; Nandariyah; Sukaya

    2018-03-01

    The characteristics of durian vary with the specific area, giving a wide diversity of phenotypes. The objectives of this research were to build an inventory of the local durian of Ngrambe and to identify potentially superior local durians as prospective parent trees. The research was conducted in Ngrambe sub-district from October 2015 until April 2016 using an explorative descriptive method. Sample points were determined using the non-probability method of the snowball sampling type. Primary data included the morphology of plant characters, trunks, leaves, flowers, fruits and seeds, and their superiority. The data were analysed using the SIMQUAL (Similarity for Qualitative) function based on the DICE coefficient in NTSYS v.2.02. Cluster and dendrogram analyses were performed with the Unweighted Pair-Group Method with Arithmetic Average (UPGMA). The DICE coefficients among the 58 local durian accessions, based on the phenotypic characters of vegetative organs, ranged from 0.84 to 1.0; those based on the phenotypic characters of the vegetative and generative organs of the 3 potentially superior local durian accessions ranged from 0.7 to 0.8. In conclusion, the local durian accessions Miyem and Rusmiyati show advantages and potential as prospective parent trees.
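    The similarity-and-clustering step can be sketched with standard tools: Dice similarity on binary phenotypic characters followed by UPGMA (average-linkage) clustering. This is not the NTSYS workflow itself, and the trait matrix and accession names are illustrative.

```python
# Hedged sketch: Dice similarity on binary characters and UPGMA clustering.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, dendrogram

# Rows = accessions, columns = presence/absence of qualitative characters (illustrative).
traits = np.array([[1, 0, 1, 1, 0],
                   [1, 1, 1, 0, 0],
                   [0, 1, 1, 1, 1]], dtype=bool)
names = ["Miyem", "Rusmiyati", "Accession-3"]

dice_distance = pdist(traits, metric="dice")        # 1 - Dice similarity
print("Dice similarity matrix:\n", 1.0 - squareform(dice_distance))

upgma = linkage(dice_distance, method="average")    # UPGMA = average linkage
dendrogram(upgma, labels=names, no_plot=True)       # set no_plot=False with matplotlib to draw
```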

  13. Students' satisfaction to hybrid problem-based learning format for basic life support/advanced cardiac life support teaching.

    PubMed

    Chilkoti, Geetanjali; Mohta, Medha; Wadhwa, Rachna; Saxena, Ashok Kumar; Sharma, Chhavi Sarabpreet; Shankar, Neelima

    2016-11-01

    Students are exposed to basic life support (BLS) and advanced cardiac life support (ACLS) training in the first semester in some medical colleges. The aim of this study was to compare students' satisfaction between the lecture-based traditional method and hybrid problem-based learning (PBL) in BLS/ACLS teaching to undergraduate medical students. We conducted a questionnaire-based, cross-sectional survey among 118 first-year medical students from a university medical college in the city of New Delhi, India. We aimed to assess the students' satisfaction between the lecture-based and hybrid-PBL methods in BLS/ACLS teaching. A 5-point Likert scale was used to assess students' satisfaction levels with the two teaching methods. Data were collected, and scores regarding the students' satisfaction levels with these two teaching methods were analysed using a two-sided paired t-test. Most students preferred the hybrid-PBL format over the traditional lecture-based method in the following four aspects: learning and understanding, interest and motivation, training of personal abilities, and being confident and satisfied with the teaching method (P < 0.05). Implementation of the hybrid-PBL format along with the lecture-based method in BLS/ACLS teaching provided high satisfaction among undergraduate medical students.
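    The core comparison is a paired t-test on per-student Likert scores; a minimal sketch with made-up ratings (not the study's data) is:

```python
# Hedged sketch: each student rates both teaching formats on a 1-5 Likert scale,
# and the paired ratings are compared with a two-sided paired t-test.
import numpy as np
from scipy import stats

lecture = np.array([3, 4, 2, 3, 4, 3, 2, 4, 3, 3])      # satisfaction with lectures (illustrative)
hybrid_pbl = np.array([4, 5, 4, 4, 5, 4, 3, 5, 4, 4])   # satisfaction with hybrid PBL (illustrative)

t_stat, p_value = stats.ttest_rel(hybrid_pbl, lecture)  # two-sided by default
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```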

  14. Automated detection of retinal landmarks for the identification of clinically relevant regions in fundus photography

    NASA Astrophysics Data System (ADS)

    Ometto, Giovanni; Calivá, Francesco; Al-Diri, Bashir; Bek, Toke; Hunter, Andrew

    2016-03-01

    Automatic, quick and reliable identification of retinal landmarks from fundus photography is key for measurements used in research, diagnosis, screening and treatment of common diseases affecting the eyes. This study presents a fast method for the detection of the centre of mass of the vascular arcades, the optic nerve head (ONH) and the fovea, used in the definition of five clinically relevant areas in use for screening programmes for diabetic retinopathy (DR). Thirty-eight fundus photographs showing 7203 DR lesions were analysed to find the landmarks manually by two retina experts and automatically by the proposed method. The automatic identification of the ONH and fovea was performed using template matching based on normalised cross-correlation. The centre of mass of the arcades was obtained by fitting an ellipse to sample coordinates of the main vessels. The coordinates were obtained by processing the image with Hessian filtering followed by shape analyses and finally sampling the results. The regions obtained manually and automatically were used to count the retinal lesions falling within each, and to evaluate the method. 92.7% of the lesions fell within the same regions based on the landmarks selected by the two experts. 91.7% and 89.0% fell within the same areas when the method was compared with the first and second expert, respectively. The inter-repeatability of the proposed method and the experts is comparable, while the 100% intra-repeatability makes the algorithm a valuable tool for tasks such as real-time analyses, analyses of large datasets, and analyses of intra-patient variability.
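    The ONH/fovea localisation step relies on normalised cross-correlation template matching; a hedged sketch of that single step (not the paper's full pipeline), assuming illustrative image and template files, is:

```python
# Hedged sketch: locate the optic nerve head by normalised cross-correlation template matching.
import numpy as np
from skimage import io
from skimage.feature import match_template

fundus = io.imread("fundus.png", as_gray=True)
onh_template = io.imread("onh_template.png", as_gray=True)   # small patch of a typical ONH

ncc = match_template(fundus, onh_template, pad_input=True)   # normalised cross-correlation map
row, col = np.unravel_index(np.argmax(ncc), ncc.shape)       # peak = estimated ONH centre
print(f"ONH centre estimate: row={row}, col={col}, score={ncc.max():.2f}")
```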

  15. Phylogeny of Syndermata (syn. Rotifera): Mitochondrial gene order verifies epizoic Seisonidea as sister to endoparasitic Acanthocephala within monophyletic Hemirotifera.

    PubMed

    Sielaff, Malte; Schmidt, Hanno; Struck, Torsten H; Rosenkranz, David; Mark Welch, David B; Hankeln, Thomas; Herlyn, Holger

    2016-03-01

    A monophyletic origin of endoparasitic thorny-headed worms (Acanthocephala) and wheel-animals (Rotifera) is widely accepted. However, the phylogeny inside the clade, be it called Syndermata or Rotifera, has lacked validation by mitochondrial (mt) data. Herein, we present the first mt genome of the key taxon Seison and report conflicting results of phylogenetic analyses: while mt sequence-based topologies showed monophyletic Lemniscea (Bdelloidea+Acanthocephala), gene order analyses supported monophyly of Pararotatoria (Seisonidea+Acanthocephala) and Hemirotifera (Bdelloidea+Pararotatoria). Sequence-based analyses obviously suffered from substitution saturation, compositional bias, and branch length heterogeneity; however, we observed no compromising effects in gene order analyses. Moreover, gene order-based topologies were robust to changes in coding (genes vs. gene pairs, two-state vs. multistate, aligned vs. non-aligned), tree reconstruction methods, and the treatment of the two monogonont mt genomes. Thus, mt gene order verifies seisonids as sister to acanthocephalans within monophyletic Hemirotifera, while deviating results of sequence-based analyses reflect artificial signal. This conclusion implies that the complex life cycle of extant acanthocephalans evolved from a free-living state, as retained by most monogononts and bdelloids, via an epizoic state with a simple life cycle, as shown by seisonids. Hence, Acanthocephala represent a rare example where ancestral transitional stages have counterparts amongst the closest relatives. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.

    PubMed

    Martínez-Aquino, Andrés

    2016-08-01

    Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogetenic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetical inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches, can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.

  17. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees

    PubMed Central

    2016-01-01

    Abstract Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host–parasite. In the past 2 decades, algorithms have been developed for cophylogetenic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetical inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches, can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a “compass” when “walking” through jungles of tangled phylogenetic trees. PMID:29491928

  18. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  19. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique, based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
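
    As a purely illustrative sketch (not the MADS implementation), a basic Latin Hypercube sampler of the kind mentioned above can be written in a few lines of NumPy; the parameter names and bounds below are hypothetical.

      import numpy as np

      def latin_hypercube(n_samples, bounds, seed=None):
          # One stratified draw per interval in each dimension, with the strata
          # permuted independently per dimension so the columns are decoupled.
          rng = np.random.default_rng(seed)
          bounds = np.asarray(bounds, dtype=float)          # shape (n_dims, 2)
          n_dims = bounds.shape[0]
          u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
          for j in range(n_dims):
              u[:, j] = u[rng.permutation(n_samples), j]
          return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

      # e.g. 100 samples of log10 hydraulic conductivity and porosity (assumed ranges)
      samples = latin_hypercube(100, [(-6.0, -3.0), (0.05, 0.35)], seed=1)
      print(samples[:3])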

  20. Economic evaluation of home-based telebehavioural health care compared to in-person treatment delivery for depression.

    PubMed

    Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N

    2018-02-01

    Introduction Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. Discussion Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
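
    A minimal sketch of the cost-utility arithmetic referred to above (an incremental cost-effectiveness ratio with a dominance check) is shown below; the QALY figures are invented for illustration, and only the cost totals echo the scenario analysis in the abstract.

      def icer(cost_new, effect_new, cost_old, effect_old):
          # Incremental cost-effectiveness ratio with a simple dominance check.
          d_cost, d_effect = cost_new - cost_old, effect_new - effect_old
          if d_cost <= 0 and d_effect >= 0:
              return "new strategy dominates"
          if d_cost >= 0 and d_effect <= 0:
              return "new strategy is dominated"
          return d_cost / d_effect          # cost per unit of effect (e.g. per QALY)

      # hypothetical QALYs; costs taken from the scenario analysis above
      print(icer(cost_new=19_177, effect_new=0.17, cost_old=20_322, effect_old=0.16))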

  1. Methods for detecting, quantifying, and adjusting for dissemination bias in meta-analysis are described.

    PubMed

    Mueller, Katharina Felicitas; Meerpohl, Joerg J; Briel, Matthias; Antes, Gerd; von Elm, Erik; Lang, Britta; Motschall, Edith; Schwarzer, Guido; Bassler, Dirk

    2016-12-01

    To systematically review methodological articles which focus on nonpublication of studies and to describe methods of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. To evaluate whether the methods have been applied to an empirical data set for which one can be reasonably confident that all studies conducted have been included. We systematically searched Medline, the Cochrane Library, and Web of Science for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. The literature search retrieved 2,224 records, of which we finally included 150 full-text articles. A great variety of methods to detect, quantify, or adjust for dissemination bias were described. Methods included graphical methods mainly based on funnel plot approaches, statistical methods, such as regression tests, selection models, sensitivity analyses, and a great number of more recent statistical approaches. Only a few methods have been validated in empirical evaluations using unpublished studies obtained from regulators (Food and Drug Administration, European Medicines Agency). We present an overview of existing methods to detect, quantify, or adjust for dissemination bias. It remains difficult to advise which method should be used as they are all limited and their validity has rarely been assessed. Therefore, a thorough literature search remains crucial in systematic reviews, and further steps to increase the availability of all research results need to be taken. Copyright © 2016 Elsevier Inc. All rights reserved.
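
    One of the regression tests mentioned above can be sketched very compactly; the following is an illustrative Egger-type asymmetry test (assuming SciPy 1.6 or later for the intercept standard error), with made-up study estimates, not a validated implementation.

      import numpy as np
      from scipy import stats

      def egger_test(effects, standard_errors):
          # Regress the standardized effect (effect / SE) on precision (1 / SE);
          # an intercept far from zero suggests funnel-plot asymmetry.
          y = np.asarray(effects, float) / np.asarray(standard_errors, float)
          x = 1.0 / np.asarray(standard_errors, float)
          res = stats.linregress(x, y)
          t = res.intercept / res.intercept_stderr
          p = 2 * stats.t.sf(abs(t), df=len(x) - 2)
          return res.intercept, p

      effects = [0.42, 0.31, 0.55, 0.12, 0.68, 0.25]        # invented study estimates
      ses = [0.10, 0.15, 0.20, 0.08, 0.30, 0.12]
      print(egger_test(effects, ses))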

  2. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  3. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  4. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  5. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  6. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    PubMed

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
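
    The "computation-based" idea of relating voxel activity to a time-varying stimulus feature can be illustrated with a toy general linear model; the haemodynamic response shape and all signals below are synthetic and only indicative of the general approach, not of the study's analysis pipeline.

      import numpy as np

      rng = np.random.default_rng(0)
      tr, n_vols = 2.0, 200
      feature = rng.random(n_vols)                        # e.g. frame-wise disparity signal
      t = np.arange(0, 30, tr)
      hrf = (t / 6.0) ** 2 * np.exp(-t / 3.0)             # crude gamma-like response (assumed)
      regressor = np.convolve(feature, hrf)[:n_vols]
      regressor = (regressor - regressor.mean()) / regressor.std()

      voxel = 0.8 * regressor + rng.normal(0.0, 1.0, n_vols)   # synthetic voxel time course
      X = np.column_stack([np.ones(n_vols), regressor])
      beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
      print("estimated effect of the disparity regressor:", round(beta[1], 2))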

  7. Diet behaviour among young people in transition to adulthood (18-25 year olds): a mixed method study.

    PubMed

    Poobalan, Amudha S; Aucott, Lorna S; Clarke, Amanda; Smith, William Cairns S

    2014-01-01

    Background : Young people (18-25 years) during the adolescence/adulthood transition are vulnerable to weight gain and notoriously hard to reach. Despite increased levels of overweight/obesity in this age group, diet behaviour, a major contributor to obesity, is poorly understood. The purpose of this study was to explore diet behaviour among 18-25 year olds with influential factors including attitudes, motivators and barriers. Methods : An explanatory mixed method study design, based on health Behaviour Change Theories was used. Those at University/college and in the community, including those Not in Education, Employment or Training (NEET) were included. An initial quantitative questionnaire survey underpinned by the Theory of Planned Behaviour and Social Cognitive Theory was conducted and the results from this were incorporated into the qualitative phase. Seven focus groups were conducted among similar young people, varying in education and socioeconomic status. Exploratory univariate analysis was followed by multi-staged modelling to analyse the quantitative data. 'Framework Analysis' was used to analyse the focus groups. Results : 1313 questionnaires were analysed. Self-reported overweight/obesity prevalence was 22%, increasing with age, particularly in males. Based on the survey, 40% of young people reported eating an adequate amount of fruits and vegetables and 59% eating regular meals, but 32% reported unhealthy snacking. Based on the statistical modelling, positive attitudes towards diet and high intention (89%), did not translate into healthy diet behaviour. From the focus group discussions, the main motivators for diet behaviour were 'self-appearance' and having 'variety of food'. There were mixed opinions on 'cost' of food and 'taste'. Conclusion : Elements deemed really important to young people have been identified. This mixed method study is the largest in this vulnerable and neglected group covering a wide spectrum of the community. It provides evidence base to inform tailored interventions for a healthy diet within this age group.

  8. Diet behaviour among young people in transition to adulthood (18–25 year olds): a mixed method study

    PubMed Central

    Poobalan, Amudha S.; Aucott, Lorna S.; Clarke, Amanda; Smith, William Cairns S.

    2014-01-01

    Background : Young people (18–25 years) during the adolescence/adulthood transition are vulnerable to weight gain and notoriously hard to reach. Despite increased levels of overweight/obesity in this age group, diet behaviour, a major contributor to obesity, is poorly understood. The purpose of this study was to explore diet behaviour among 18–25 year olds with influential factors including attitudes, motivators and barriers. Methods: An explanatory mixed method study design, based on health Behaviour Change Theories was used. Those at University/college and in the community, including those Not in Education, Employment or Training (NEET) were included. An initial quantitative questionnaire survey underpinned by the Theory of Planned Behaviour and Social Cognitive Theory was conducted and the results from this were incorporated into the qualitative phase. Seven focus groups were conducted among similar young people, varying in education and socioeconomic status. Exploratory univariate analysis was followed by multi-staged modelling to analyse the quantitative data. ‘Framework Analysis’ was used to analyse the focus groups. Results: 1313 questionnaires were analysed. Self-reported overweight/obesity prevalence was 22%, increasing with age, particularly in males. Based on the survey, 40% of young people reported eating an adequate amount of fruits and vegetables and 59% eating regular meals, but 32% reported unhealthy snacking. Based on the statistical modelling, positive attitudes towards diet and high intention (89%), did not translate into healthy diet behaviour. From the focus group discussions, the main motivators for diet behaviour were ‘self-appearance’ and having ‘variety of food’. There were mixed opinions on ‘cost’ of food and ‘taste’. Conclusion: Elements deemed really important to young people have been identified. This mixed method study is the largest in this vulnerable and neglected group covering a wide spectrum of the community. It provides evidence base to inform tailored interventions for a healthy diet within this age group. PMID:25750826

  9. A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G. (Compiler)

    1993-01-01

    The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.

  10. Evolution of Rhizaria: new insights from phylogenomic analysis of uncultivated protists.

    PubMed

    Burki, Fabien; Kudryavtsev, Alexander; Matz, Mikhail V; Aglyamova, Galina V; Bulman, Simon; Fiers, Mark; Keeling, Patrick J; Pawlowski, Jan

    2010-12-02

    Recent phylogenomic analyses have revolutionized our view of eukaryote evolution by revealing unexpected relationships between and within the eukaryotic supergroups. However, for several groups of uncultivable protists, only the ribosomal RNA genes and a handful of proteins are available, often leading to unresolved evolutionary relationships. A striking example concerns the supergroup Rhizaria, which comprises several groups of uncultivable free-living protists such as radiolarians, foraminiferans and gromiids, as well as the parasitic plasmodiophorids and haplosporids. Thus far, the relationships within this supergroup have been inferred almost exclusively from rRNA, actin, and polyubiquitin genes, and remain poorly resolved. To address this, we have generated large Expressed Sequence Tag (EST) datasets for 5 species of Rhizaria belonging to 3 important groups: Acantharea (Astrolonche sp., Phyllostaurus sp.), Phytomyxea (Spongospora subterranea, Plasmodiophora brassicae) and Gromiida (Gromia sphaerica). 167 genes were selected for phylogenetic analyses based on the representation of at least one rhizarian species for each gene. Concatenation of these genes produced a supermatrix composed of 36,735 amino acid positions, including 10 rhizarians, 9 stramenopiles, and 9 alveolates. Phylogenomic analyses of this large dataset revealed a strongly supported clade grouping Foraminifera and Acantharea. The position of this clade within Rhizaria was sensitive to the method employed and the taxon sampling: Maximum Likelihood (ML) and Bayesian analyses using an empirical model of evolution favoured an early divergence, whereas the CAT model and ML analyses with fast-evolving sites or the foraminiferan species Reticulomyxa filosa removed suggested a derived position, closely related to Gromia and Phytomyxea. In contrast to what has been previously reported, our analyses also uncovered the presence of the rhizarian-specific polyubiquitin insertion in Acantharea. Finally, this work reveals another possible rhizarian signature in the 60S ribosomal protein L10a. Our study provides new insights into the evolution of Rhizaria based on phylogenomic analyses of ESTs from three groups of previously under-sampled protists. It was enabled through the application of a recently developed method of transcriptome analysis, requiring only a very small amount of starting material. Our study illustrates the potential of this method to elucidate the early evolution of eukaryotes by providing a large amount of data for uncultivable free-living and parasitic protists.
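
    A small sketch of the supermatrix step described above (concatenating per-gene alignments and gap-padding taxa that lack a gene) is given below; the taxa are named after genera in the abstract, but the alignments themselves are invented.

      # Build a concatenated supermatrix from per-gene alignments (dicts mapping
      # taxon -> aligned sequence); taxa missing from a gene are padded with gaps.
      def concatenate_alignments(gene_alignments):
          taxa = sorted({t for aln in gene_alignments for t in aln})
          supermatrix = {t: [] for t in taxa}
          for aln in gene_alignments:
              length = len(next(iter(aln.values())))
              for t in taxa:
                  supermatrix[t].append(aln.get(t, "-" * length))
          return {t: "".join(parts) for t, parts in supermatrix.items()}

      genes = [{"Gromia": "MKV", "Phyllostaurus": "MKI"},
               {"Gromia": "LLR-", "Reticulomyxa": "LLRA"}]
      print(concatenate_alignments(genes))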

  11. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
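
    The reported spectrum can be evaluated directly; the short sketch below simply restates P(m) = 8πε²a³/(1 + a²m²)² with the fitted values ε = 0.05 and a = 3.1 km over an assumed range of wavenumbers.

      import numpy as np

      def heterogeneity_spectrum(m, eps=0.05, a=3.1):
          # P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2, with m in 1/km and a in km
          return 8.0 * np.pi * eps**2 * a**3 / (1.0 + (a * m) ** 2) ** 2

      m = np.array([0.01, 1.0 / 3.1, 1.0, 10.0])   # the roll-off sets in near m ~ 1/a
      print(heterogeneity_spectrum(m))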

  12. Decision tree and PCA-based fault diagnosis of rotating machinery

    NASA Astrophysics Data System (ADS)

    Sun, Weixiang; Chen, Jin; Li, Jiaqing

    2007-04-01

    After analysing the flaws of conventional fault diagnosis methods, data mining technology is introduced to the fault diagnosis field, and a new method based on the C4.5 decision tree and principal component analysis (PCA) is proposed. In this method, PCA is used to reduce features after data collection, preprocessing and feature extraction. Then, C4.5 is trained on the samples to generate a decision tree model with diagnosis knowledge. Finally, the tree model is used to perform the diagnosis analysis. To validate the proposed method, six kinds of running states (normal or without any defect, unbalance, rotor radial rub, oil whirl, shaft crack and a simultaneous state of unbalance and radial rub) are simulated on a Bently Rotor Kit RK4 to test the C4.5- and PCA-based method and a back-propagation neural network (BPNN). The result shows that the C4.5- and PCA-based diagnosis method has higher accuracy and needs less training time than the BPNN.
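
    A hedged scikit-learn sketch of the PCA-then-decision-tree pipeline is shown below; note that scikit-learn's DecisionTreeClassifier implements CART rather than C4.5, and the vibration features here are random placeholders rather than Bently Rotor Kit data.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.decomposition import PCA
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(600, 20))                      # placeholder feature vectors
      y = rng.integers(0, 6, size=600)                    # six simulated running states

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = make_pipeline(PCA(n_components=8), DecisionTreeClassifier(random_state=0))
      model.fit(X_tr, y_tr)
      print("test accuracy:", model.score(X_te, y_te))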

  13. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    Due to the upcoming data deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analyses tools, efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements, therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analyses tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  14. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming data deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analyses tools, efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements, therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analyses tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Protein detection through different platforms of immuno-loop-mediated isothermal amplification

    NASA Astrophysics Data System (ADS)

    Pourhassan-Moghaddam, Mohammad; Rahmati-Yamchi, Mohammad; Akbarzadeh, Abolfazl; Daraee, Hadis; Nejati-Koshki, Kazem; Hanifehpour, Younes; Joo, Sang Woo

    2013-11-01

    Different immunoassay-based methods have been devised to detect protein targets. These methods have some challenges that make them inefficient for assaying proteins present at ultra-low amounts. ELISA, iPCR, iRCA, and iNASBA are the common immunoassay-based methods of protein detection, each of which has its own specific challenges as well as challenges common to all, making it necessary to introduce a novel method that avoids these problems in the detection of target proteins. Here we propose a new method, termed 'immuno-loop-mediated isothermal amplification' or 'iLAMP'. This new method is free from the problems of the previous methods and has significant advantages over them. In this paper we also offer various configurations in order to improve the applicability of this method in real-world sample analyses. Important potential applications of this method are stated as well.

  16. Identification of Poly-N-Acetyllactosamine-Carrying Glycoproteins from HL-60 Human Promyelocytic Leukemia Cells Using a Site-Specific Glycome Analysis Method, Glyco-RIDGE

    NASA Astrophysics Data System (ADS)

    Togayachi, Akira; Tomioka, Azusa; Fujita, Mika; Sukegawa, Masako; Noro, Erika; Takakura, Daisuke; Miyazaki, Michiyo; Shikanai, Toshihide; Narimatsu, Hisashi; Kaji, Hiroyuki

    2018-04-01

    To elucidate the relationship between protein function and the diversity and heterogeneity of glycans conjugated to the protein, the glycosylation sites, glycan variation, and glycan proportions at each site of the glycoprotein must be analyzed. Glycopeptide-based structural analysis technology using mass spectrometry has been developed; however, complicated analyses of complex spectra obtained by multistage fragmentation are necessary, and the sensitivity and throughput of the analyses are low. Therefore, we developed a liquid chromatography/mass spectrometry (MS)-based glycopeptide analysis method to reveal the site-specific glycome (Glycan heterogeneity-based Relational IDentification of Glycopeptide signals on Elution profile, Glyco-RIDGE). This method used accurate masses and retention times of glycopeptides, without requiring MS2, and could be applied to complex mixtures. To increase the number of identified peptides, fractionation of the sample glycopeptides is required to reduce sample complexity. Therefore, in this study, glycopeptides were fractionated into four fractions by hydrophilic interaction chromatography, and each fraction was analyzed using the Glyco-RIDGE method. As a result, many glycopeptides carrying long glycans were enriched in the most hydrophilic fraction. Based on the monosaccharide composition, these glycans were thought to be poly-N-acetyllactosamine (polylactosamine [pLN]), and 31 pLN-carrier proteins were identified in HL-60 cells. Gene ontology enrichment analysis revealed that pLN carriers included many molecules related to signal transduction, receptors, and cell adhesion. Thus, these findings provided important insights into the analysis of the glycoproteome using our novel Glyco-RIDGE method.

  17. Molecular epidemiologic analysis of a Pneumocystis pneumonia outbreak among renal transplant patients.

    PubMed

    Urabe, N; Ishii, Y; Hyodo, Y; Aoki, K; Yoshizawa, S; Saga, T; Murayama, S Y; Sakai, K; Homma, S; Tateda, K

    2016-04-01

    Between 18 November and 3 December 2011, five renal transplant patients at the Department of Nephrology, Toho University Omori Medical Centre, Tokyo, were diagnosed with Pneumocystis pneumonia (PCP). We used molecular epidemiologic methods to determine whether the patients were infected with the same strain of Pneumocystis jirovecii. DNA extracted from the residual bronchoalveolar lavage fluid from the five outbreak cases and from another 20 cases of PCP between 2007 and 2014 were used for multilocus sequence typing to compare the genetic similarity of the P. jirovecii. DNA base sequencing by the Sanger method showed some regions where two bases overlapped and could not be defined. A next-generation sequencer was used to analyse the types and ratios of these overlapping bases. DNA base sequences of P. jirovecii in the bronchoalveolar lavage fluid from four of the five PCP patients in the 2011 outbreak and from another two renal transplant patients who developed PCP in 2013 were highly homologous. The Sanger method revealed 14 genomic regions where two differing DNA bases overlapped and could not be identified. Analyses of the overlapping bases by a next-generation sequencer revealed that the differing types of base were present in almost identical ratios. There is a strong possibility that the PCP outbreak at the Toho University Omori Medical Centre was caused by the same strain of P. jirovecii. Two different types of base present in some regions may be due to P. jirovecii's being a diploid species. Copyright © 2015 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  18. A study of direct moxibustion using mathematical methods.

    PubMed

    Liu, Miao; Kauh, Sang Ken; Lim, Sabina

    2012-01-01

    Direct moxibustion is an important and widely used treatment method in traditional medical science. The use of a mathematical method to analyse direct moxibustion treatment is necessary and helpful in exploring new direct moxibustion instruments and their standardisation. Thus, this paper aims to use a mathematical method to study direct moxibustion in skin and to demonstrate a direct relationship between direct moxibustion and skin stimuli. In this paper, the transient thermal response of the skin layers is analysed using data obtained from a standardised method for measuring the temperature of a burning moxa cone. Numerical simulations based on an appropriate finite element model are developed to predict the heat transfer, thermal damage and thermal stress distribution of barley moxa cones and jujube moxa cones in the skin tissue. The results are verified against the ancient literature of traditional Chinese medicine and clinical practice, and show that the mathematical method can provide a good interface between the moxa cone and the skin tissue, giving a numerical basis for moxibustion.
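
    The paper uses a finite element model; as a much simpler stand-in, the sketch below runs a 1-D explicit finite-difference simulation of transient heat conduction into tissue under an assumed surface-temperature curve for the burning moxa cone. All material constants and the heating curve are illustrative assumptions, not values from the study.

      import numpy as np

      k, rho, c = 0.5, 1000.0, 3600.0        # W/m/K, kg/m^3, J/kg/K (assumed tissue values)
      alpha = k / (rho * c)                   # thermal diffusivity
      dx, dt, depth, t_end = 1e-4, 0.01, 5e-3, 30.0
      n = int(round(depth / dx)) + 1
      T = np.full(n, 37.0)                    # initial tissue temperature, deg C

      assert alpha * dt / dx**2 <= 0.5        # stability condition for the explicit scheme
      for step in range(int(t_end / dt)):
          t = step * dt
          T[0] = 37.0 + 30.0 * np.exp(-t / 10.0)   # assumed surface heating by the moxa cone
          T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
          T[-1] = 37.0                             # deep boundary held at core temperature
      print("temperature at 1 mm depth:", round(T[int(round(1e-3 / dx))], 2), "deg C")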

  19. Quasi-Static Probabilistic Structural Analyses Process and Criteria

    NASA Technical Reports Server (NTRS)

    Goldberg, B.; Verderaime, V.

    1999-01-01

    Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insights and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws, which can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing deterministic stress processes leads to a safety factor in statistical format, which, integrated into the safety index, provides a safety factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with the traditional safety factor methods and NASA Std. 5001 criteria.
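
    Under the normal-distribution assumption discussed above, the safety index and its link to a mean-based safety factor can be sketched in a few lines; the strength and stress statistics below are invented for illustration.

      from math import sqrt
      from statistics import NormalDist

      mu_R, sd_R = 60.0, 4.0          # capability (strength) statistics, assumed
      mu_S, sd_S = 40.0, 5.0          # applied stress statistics, assumed

      beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)    # safety index
      p_fail = NormalDist().cdf(-beta)                   # first-order failure probability
      safety_factor = mu_R / mu_S                        # central (mean-based) safety factor
      print(f"beta = {beta:.2f}, Pf = {p_fail:.2e}, SF = {safety_factor:.2f}")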

  20. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items and the development trend, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using data provided by the Global Terrorism Database, and the results demonstrate the feasibility of the methods.

  1. Mitochondrial genomes of Meloidogyne chitwoodi and M. incognita (Nematoda: Tylenchina): comparative analysis, gene order and phylogenetic relationships with other nematodes.

    PubMed

    Humphreys-Pereira, Danny A; Elling, Axel A

    2014-01-01

    Root-knot nematodes (Meloidogyne spp.) are among the most important plant pathogens. In this study, the mitochondrial (mt) genomes of the root-knot nematodes, M. chitwoodi and M. incognita were sequenced. PCR analyses suggest that both mt genomes are circular, with an estimated size of 19.7 and 18.6-19.1kb, respectively. The mt genomes each contain a large non-coding region with tandem repeats and the control region. The mt gene arrangement of M. chitwoodi and M. incognita is unlike that of other nematodes. Sequence alignments of the two Meloidogyne mt genomes showed three translocations; two in transfer RNAs and one in cox2. Compared with other nematode mt genomes, the gene arrangement of M. chitwoodi and M. incognita was most similar to Pratylenchus vulnus. Phylogenetic analyses (Maximum Likelihood and Bayesian inference) were conducted using 78 complete mt genomes of diverse nematode species. Analyses based on nucleotides and amino acids of the 12 protein-coding mt genes showed strong support for the monophyly of class Chromadorea, but only amino acid-based analyses supported the monophyly of class Enoplea. The suborder Spirurina was not monophyletic in any of the phylogenetic analyses, contradicting the Clade III model, which groups Ascaridomorpha, Spiruromorpha and Oxyuridomorpha based on the small subunit ribosomal RNA gene. Importantly, comparisons of mt gene arrangement and tree-based methods placed Meloidogyne as sister taxa of Pratylenchus, a migratory plant endoparasitic nematode, and not with the sedentary endoparasitic Heterodera. Thus, comparative analyses of mt genomes suggest that sedentary endoparasitism in Meloidogyne and Heterodera is based on convergent evolution. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. [Quality assessment in anesthesia].

    PubMed

    Kupperwasser, B

    1996-01-01

    Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which checks all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process). Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes observed before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.

  3. Review of amateur meteor research

    NASA Astrophysics Data System (ADS)

    Rendtel, Jürgen

    2017-09-01

    Significant amounts of meteor astronomical data are provided by amateurs worldwide, using various methods. This review concentrates on optical data. Long-term meteor shower analyses based on consistent data are possible over decades (Orionids, Geminids, κ-Cygnids) and allow combination with modelling results. Small and weak structures related to individual stream filaments of cometary dust have been analysed in both major and minor showers (Quadrantids, September ε-Perseids), providing feedback to meteoroid ejection and stream evolution processes. Meteoroid orbit determination from video meteor networks contributes to the improvement of the IAU meteor data base. Professional-amateur cooperation also concerns observations and detailed analysis of fireball data, including meteorite ground searches.

  4. Pooling across cells to normalize single-cell RNA sequencing data with many zero counts.

    PubMed

    Lun, Aaron T L; Bach, Karsten; Marioni, John C

    2016-04-27

    Normalization of single-cell RNA sequencing data is necessary to eliminate cell-specific biases prior to downstream analyses. However, this is not straightforward for noisy single-cell data where many counts are zero. We present a novel approach where expression values are summed across pools of cells, and the summed values are used for normalization. Pool-based size factors are then deconvolved to yield cell-based factors. Our deconvolution approach outperforms existing methods for accurate normalization of cell-specific biases in simulated data. Similar behavior is observed in real data, where deconvolution improves the relevance of results of downstream analyses.
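
    A heavily simplified sketch of the pool-and-deconvolve idea (not the published implementation) is given below: pooled size factors are computed for overlapping pools of cells, each pool factor is modelled as the sum of its member cells' factors, and per-cell factors are recovered by least squares. The simulated counts and pool scheme are placeholders.

      import numpy as np

      rng = np.random.default_rng(0)
      gene_means = rng.gamma(2.0, 1.0, size=(200, 1))          # per-gene expression levels
      cell_biases = rng.gamma(5.0, 1.0, size=(1, 50))          # per-cell capture biases
      counts = rng.poisson(gene_means * cell_biases)           # genes x cells count matrix
      n_cells, pool_size = counts.shape[1], 7

      ref = counts.mean(axis=1)                                # reference pseudo-cell
      A, b = [], []
      for start in range(n_cells):                             # ring of overlapping pools
          pool = [(start + k) % n_cells for k in range(pool_size)]
          pooled = counts[:, pool].sum(axis=1)
          ratios = pooled[ref > 0] / ref[ref > 0]
          row = np.zeros(n_cells)
          row[pool] = 1.0
          A.append(row)
          b.append(np.median(ratios))                          # pool-based size factor
      cell_factors, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
      cell_factors /= cell_factors.mean()                      # centre to unit mean
      print(cell_factors[:5])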

  5. Comparison between two methodologies for urban drainage decision aid.

    PubMed

    Moura, P M; Baptista, M B; Barraud, S

    2006-01-01

    The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis, the second one is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies have equivalent results, and present low sensitivity and high robustness. These results prove that the Brazilian methodology is consistent and can be used safely in order to select a good solution or a small set of good solutions that could be compared with more detailed methods afterwards.

  6. Diversity and evolutionary origins of fungi associated with seeds of a neotropical pioneer tree: a case study for analysing fungal environmental samples.

    PubMed

    U'ren, Jana M; Dalling, James W; Gallery, Rachel E; Maddison, David R; Davis, E Christine; Gibson, Cara M; Arnold, A Elizabeth

    2009-04-01

    Fungi associated with seeds of tropical trees pervasively affect seed survival and germination, and thus are an important, but understudied, component of forest ecology. Here, we examine the diversity and evolutionary origins of fungi isolated from seeds of an important pioneer tree (Cecropia insignis, Cecropiaceae) following burial in soil for five months in a tropical moist forest in Panama. Our approach, which relied on molecular sequence data because most isolates did not sporulate in culture, provides an opportunity to evaluate several methods currently used to analyse environmental samples of fungi. First, intra- and interspecific divergence were estimated for the nu-rITS and 5.8S gene for four genera of Ascomycota that are commonly recovered from seeds. Using these values we estimated species boundaries for 527 isolates, showing that seed-associated fungi are highly diverse, horizontally transmitted, and genotypically congruent with some foliar endophytes from the same site. We then examined methods for inferring the taxonomic placement and phylogenetic relationships of these fungi, evaluating the effects of manual versus automated alignment, model selection, and inference methods, as well as the quality of BLAST-based identification using GenBank. We found that common methods such as neighbor-joining and Bayesian inference differ in their sensitivity to alignment methods; analyses of particular fungal genera differ in their sensitivity to alignments; and numerous and sometimes intricate disparities exist between BLAST-based versus phylogeny-based identification methods. Lastly, we used our most robust methods to infer phylogenetic relationships of seed-associated fungi in four focal genera, and reconstructed ancestral states to generate preliminary hypotheses regarding the evolutionary origins of this guild. Our results illustrate the dynamic evolutionary relationships among endophytic fungi, pathogens, and seed-associated fungi, and the apparent evolutionary distinctiveness of saprotrophs. Our study also elucidates the diversity, taxonomy, and ecology of an important group of plant-associated fungi and highlights some of the advantages and challenges inherent in the use of ITS data for environmental sampling of fungi.
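
    As a rough illustration of how isolates can be grouped into putative species once intra- versus interspecific ITS divergence thresholds are chosen, the sketch below performs greedy, centroid-based clustering at a fixed identity cut-off; it assumes pre-aligned, equal-length sequences, and the reads and threshold are invented rather than taken from the study.

      # Pairwise identity over positions where neither aligned sequence has a gap.
      def identity(a, b):
          matches = sum(x == y for x, y in zip(a, b) if x != "-" and y != "-")
          compared = sum(1 for x, y in zip(a, b) if x != "-" and y != "-")
          return matches / compared if compared else 0.0

      def greedy_otus(seqs, threshold=0.95):
          centroids, clusters = [], []
          for name, seq in seqs.items():
              for i, centroid in enumerate(centroids):
                  if identity(seq, centroid) >= threshold:
                      clusters[i].append(name)
                      break
              else:
                  centroids.append(seq)
                  clusters.append([name])
          return clusters

      reads = {"iso1": "ACGTACGTTA", "iso2": "ACGTACGTTT", "iso3": "TTGGACCATA"}
      print(greedy_otus(reads, threshold=0.9))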

  7. A Method of DTM Construction Based on Quadrangular Irregular Networks and Related Error Analysis

    PubMed Central

    Kang, Mengjun

    2015-01-01

    A new method of DTM construction based on quadrangular irregular networks (QINs) that considers all the original data points and has a topological matrix is presented. A numerical test and a real-world example are used to comparatively analyse the accuracy of QINs against classical interpolation methods and other DTM representation methods, including SPLINE, KRIGING and triangulated irregular networks (TINs). The numerical test finds that the QIN method is the second-most accurate of the four methods. In the real-world example, DTMs are constructed using QINs and the three classical interpolation methods. The results indicate that the QIN method is the most accurate method tested. The difference in accuracy rank seems to be caused by the locations of the data points sampled. Although the QIN method has drawbacks, it is an alternative method for DTM construction. PMID:25996691

  8. Some important considerations in the development of stress corrosion cracking test methods.

    NASA Technical Reports Server (NTRS)

    Wei, R. P.; Novak, S. R.; Williams, D. P.

    1972-01-01

    Discussion of some of the precautions that the development of fracture-mechanics-based test methods for studying stress corrosion cracking involves. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the crack growth kinetics that determine time to failure and life are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics based test methods.

  9. Controlling bias and inflation in epigenome- and transcriptome-wide association studies using the empirical null distribution.

    PubMed

    van Iterson, Maarten; van Zwet, Erik W; Heijmans, Bastiaan T

    2017-01-27

    We show that epigenome- and transcriptome-wide association studies (EWAS and TWAS) are prone to significant inflation and bias of test statistics, an unrecognized phenomenon introducing spurious findings if left unaddressed. Neither GWAS-based methodology nor state-of-the-art confounder adjustment methods completely remove bias and inflation. We propose a Bayesian method to control bias and inflation in EWAS and TWAS based on estimation of the empirical null distribution. Using simulations and real data, we demonstrate that our method maximizes power while properly controlling the false positive rate. We illustrate the utility of our method in large-scale EWAS and TWAS meta-analyses of age and smoking.
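
    The authors' method is Bayesian; as a rough, non-equivalent illustration of the underlying idea, the sketch below estimates the empirical null location and scale of a vector of z-statistics robustly (median and MAD), reports them as bias and inflation, and rescales the statistics. The simulated z-scores are placeholders (SciPy 1.5 or later assumed).

      import numpy as np
      from scipy import stats

      def empirical_null_rescale(z):
          z = np.asarray(z, float)
          bias = np.median(z)                                        # empirical null mean
          inflation = stats.median_abs_deviation(z, scale="normal")  # empirical null SD
          z_adj = (z - bias) / inflation
          p_adj = 2 * stats.norm.sf(np.abs(z_adj))
          return bias, inflation, z_adj, p_adj

      rng = np.random.default_rng(1)
      z = np.concatenate([rng.normal(0.1, 1.2, 9_900),   # mostly-null tests, biased/inflated
                          rng.normal(4.0, 1.0, 100)])    # a small set of true signals
      bias, inflation, z_adj, _ = empirical_null_rescale(z)
      print(f"bias = {bias:.2f}, inflation = {inflation:.2f}")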

  10. Doping control analysis of trimetazidine and characterization of major metabolites using mass spectrometric approaches.

    PubMed

    Sigmund, Gerd; Koch, Anja; Orlovius, Anne-Katrin; Guddat, Sven; Thomas, Andreas; Schänzer, Wilhelm; Thevis, Mario

    2014-01-01

    Since January 2014, the anti-anginal drug trimetazidine [1-(2,3,4-trimethoxybenzyl)-piperazine] has been classified as prohibited substance by the World Anti-Doping Agency (WADA), necessitating specific and robust detection methods in sports drug testing laboratories. In the present study, the implementation of the intact therapeutic agent into two different initial testing procedures based on gas chromatography-mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) is reported, along with the characterization of urinary metabolites by electrospray ionization-high resolution/high accuracy (tandem) mass spectrometry. For GC-MS analyses, urine samples were subjected to liquid-liquid extraction sample preparation, while LC-MS/MS analyses were conducted by established 'dilute-and-inject' approaches. Both screening methods were validated for trimetazidine concerning specificity, limits of detection (0.5-50 ng/mL), intra-day and inter-day imprecision (<20%), and recovery (41%) in case of the GC-MS-based method. In addition, major metabolites such as the desmethylated trimetazidine and the corresponding sulfoconjugate, oxo-trimetazidine, and trimetazidine-N-oxide as identified in doping control samples were used to complement the LC-MS/MS-based assay, although intact trimetazidine was found at highest abundance of the relevant trimetazidine-related analytes in all tested sports drug testing samples. Retrospective data mining regarding doping control analyses conducted between 1999 and 2013 at the Cologne Doping Control Laboratory concerning trimetazidine revealed a considerable prevalence of the drug particularly in endurance and strength sports accounting for up to 39 findings per year. Copyright © 2014 John Wiley & Sons, Ltd.

  11. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE PAGES

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...

    2018-03-28

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  12. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  13. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples-thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities-to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.
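
    For orientation only, the sketch below pools sensitivity across studies with the univariate normal-approximation (inverse-variance, DerSimonian-Laird) approach that the comparison above contrasts with binomial-likelihood models; the 2x2 counts are invented.

      import numpy as np

      tp = np.array([45, 30, 88, 12, 60])          # true positives per study (made up)
      fn = np.array([5, 10, 12, 3, 15])            # false negatives per study (made up)

      tp_c, fn_c = tp + 0.5, fn + 0.5              # continuity correction for the normal approx.
      y = np.log(tp_c / fn_c)                      # logit(sensitivity) per study
      v = 1.0 / tp_c + 1.0 / fn_c                  # approximate within-study variance

      w = 1.0 / v                                  # fixed-effect (inverse-variance) weights
      y_fixed = np.sum(w * y) / np.sum(w)
      q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q
      tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
      w_re = 1.0 / (v + tau2)                      # random-effects weights
      logit_pooled = np.sum(w_re * y) / np.sum(w_re)
      pooled_sens = 1.0 / (1.0 + np.exp(-logit_pooled))
      print(f"pooled sensitivity = {pooled_sens:.3f}, tau^2 = {tau2:.3f}")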

  14. Evaluation of a real-time quantitative PCR method with propidium monoazide treatment for analyses of viable fecal indicator bacteria in wastewater samples

    EPA Science Inventory

    The U.S. EPA is currently evaluating rapid, real-time quantitative PCR (qPCR) methods for determining recreational water quality based on measurements of fecal indicator bacteria DNA sequences. In order to potentially use qPCR for other Clean Water Act needs, such as updating cri...

  15. Assessing the Accuracy of Classwide Direct Observation Methods: Two Analyses Using Simulated and Naturalistic Data

    ERIC Educational Resources Information Center

    Dart, Evan H.; Radley, Keith C.; Briesch, Amy M.; Furlow, Christopher M.; Cavell, Hannah J.

    2016-01-01

    Two studies investigated the accuracy of eight different interval-based group observation methods that are commonly used to assess the effects of classwide interventions. In Study 1, a Microsoft Visual Basic program was created to simulate a large set of observational data. Binary data were randomly generated at the student level to represent…

  16. Comparison of the occupational safety applications in marble quarries of Carrara (Italy) and Iscehisar (Turkey) by using Elmeri method.

    PubMed

    Ersoy, Metin; Yesilkaya, Liyaddin

    2016-01-01

    In this paper, a brief summary is given of marble quarries in Carrara (Italy) and Iscehisar (Turkey), the Elmeri method is introduced, work accidents that can happen in marble quarries and their causes, as well as work safety behaviours in the field, are explained, and the Elmeri monitoring method is applied and analysed. For this purpose, quarry operations are divided into seven categories of working conditions, and six active quarries in the Carrara and Iscehisar areas are analysed for work safety behaviours. The analysis is based on a true/false observation method with 18 items in total, three items under each of six main topics. Safety indexes are also calculated for each section and for the main topics. According to the calculated safety indexes, the Carrara area marble quarries (65.08%) are safer than the Iscehisar area marble quarries (46.01%).
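
    The safety index itself is a simple proportion, so a short Python sketch suffices to show the calculation; the topic names and observation counts below are hypothetical illustrations, not values from the study.

      # Elmeri-style safety index: percentage of observations judged correct (safe)
      observations = {
          "machinery safety":    {"correct": 40, "incorrect": 12},
          "order and tidiness":  {"correct": 25, "incorrect": 30},
          "work environment":    {"correct": 33, "incorrect": 9},
      }

      def safety_index(correct, incorrect):
          return 100.0 * correct / (correct + incorrect)

      for topic, counts in observations.items():
          print(topic, round(safety_index(**counts), 2))

      overall = safety_index(sum(c["correct"] for c in observations.values()),
                             sum(c["incorrect"] for c in observations.values()))
      print("overall safety index:", round(overall, 2))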

  17. Spectrophotometric analyses of hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in water.

    PubMed

    Shi, Cong; Xu, Zhonghou; Smolinski, Benjamin L; Arienti, Per M; O'Connor, Gregory; Meng, Xiaoguang

    2015-07-01

    A simple and accurate spectrophotometric method for on-site analysis of royal demolition explosive (RDX) in water samples was developed based on the Berthelot reaction. The sensitivity and accuracy of an existing spectrophotometric method was improved by: replacing toxic chemicals with more stable and safer reagents; optimizing the reagent dose and reaction time; improving color stability; and eliminating the interference from inorganic nitrogen compounds in water samples. Cation and anion exchange resin cartridges were developed and used for sample pretreatment to eliminate the effect of ammonia and nitrate on RDX analyses. The detection limit of the method was determined to be 100 μg/L. The method was used successfully for analysis of RDX in untreated industrial wastewater samples. It can be used for on-site monitoring of RDX in wastewater for early detection of chemical spills and failure of wastewater treatment systems. Copyright © 2015. Published by Elsevier B.V.

  18. Comparability of HbA1c and lipids measured with dried blood spot versus venous samples: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background Levels of haemoglobin A1c (HbA1c) and blood lipids are important determinants of risk in patients with diabetes. Standard analysis methods based upon venous blood samples can be logistically challenging in resource-poor settings where much of the diabetes epidemic is occurring. Dried blood spots (DBS) provide a simple alternative method for sample collection but the comparability of data from analyses based on DBS is not well established. Methods We conducted a systematic review and meta-analysis to define the association of findings for HbA1c and blood lipids for analyses based upon standard methods compared to DBS. The Cochrane, Embase and Medline databases were searched for relevant reports and summary regression lines were estimated. Results 705 abstracts were found by the initial electronic search with 6 further reports identified by manual review of the full papers. 16 studies provided data for one or more outcomes of interest. There was a close agreement between the results for HbA1c assays based on venous and DBS samples (DBS = 0.9858 × venous + 0.3809), except for assays based upon affinity chromatography. Significant adjustment was required for assays of total cholesterol (DBS = 0.6807 × venous + 1.151) but results for triglycerides (DBS = 0.9557 × venous + 0.1427) were directly comparable. Conclusions For HbA1c and selected blood lipids, assays based on DBS samples are clearly associated with assays based on standard venous samples. There are, however, significant uncertainties about the nature of these associations and there is a need for standardisation of the sample collection, transportation, storage and analysis methods before the technique can be considered mainstream. This should be a research priority because better elucidation of metabolic risks in resource-poor settings, where venous sampling is infeasible, will be key to addressing the global epidemic of cardiovascular diseases. PMID:25045323
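
    As a quick numerical aid (not part of the review itself), the summary regression lines quoted above can be inverted to convert a DBS reading into its venous-equivalent estimate. The Python helper below assumes exactly those published slopes and intercepts; the input values are illustrative only.

      # Summary regression lines of the form DBS = slope * venous + intercept
      LINES = {
          "HbA1c":             (0.9858, 0.3809),
          "total cholesterol": (0.6807, 1.151),
          "triglycerides":     (0.9557, 0.1427),
      }

      def venous_from_dbs(dbs_value, analyte):
          slope, intercept = LINES[analyte]
          return (dbs_value - intercept) / slope

      print(venous_from_dbs(7.0, "HbA1c"))              # illustrative DBS HbA1c of 7.0%
      print(venous_from_dbs(4.5, "total cholesterol"))  # illustrative DBS cholesterol value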

  19. Folding to Curved Surfaces: A Generalized Design Method and Mechanics of Origami-based Cylindrical Structures.

    PubMed

    Wang, Fei; Gong, Haoran; Chen, Xi; Chen, C Q

    2016-09-14

    Origami structures enrich the field of mechanical metamaterials with the ability to convert morphologically and systematically between two-dimensional (2D) thin sheets and three-dimensional (3D) spatial structures. In this study, an in-plane design method is proposed to approximate curved surfaces of interest with generalized Miura-ori units. Using this method, two combination types of crease lines are unified in one reprogrammable procedure, generating multiple types of cylindrical structures. Structural completeness conditions of the finite-thickness counterparts to the two types are also proposed. As an example of the design method, the kinematics and elastic properties of an origami-based circular cylindrical shell are analysed. The concept of Poisson's ratio is extended to the cylindrical structures, demonstrating their auxetic property. An analytical model of rigid plates linked by elastic hinges, consistent with numerical simulations, is employed to describe the mechanical response of the structures. Under particular load patterns, the circular shells display novel mechanical behaviour such as snap-through and limiting folding positions. By analysing the geometry and mechanics of the origami structures, we extend the design space of mechanical metamaterials and provide a basis for their practical applications in science and engineering.

  20. Folding to Curved Surfaces: A Generalized Design Method and Mechanics of Origami-based Cylindrical Structures

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Gong, Haoran; Chen, Xi; Chen, C. Q.

    2016-09-01

    Origami structures enrich the field of mechanical metamaterials with the ability to convert morphologically and systematically between two-dimensional (2D) thin sheets and three-dimensional (3D) spatial structures. In this study, an in-plane design method is proposed to approximate curved surfaces of interest with generalized Miura-ori units. Using this method, two combination types of crease lines are unified in one reprogrammable procedure, generating multiple types of cylindrical structures. Structural completeness conditions of the finite-thickness counterparts to the two types are also proposed. As an example of the design method, the kinematics and elastic properties of an origami-based circular cylindrical shell are analysed. The concept of Poisson’s ratio is extended to the cylindrical structures, demonstrating their auxetic property. An analytical model of rigid plates linked by elastic hinges, consistent with numerical simulations, is employed to describe the mechanical response of the structures. Under particular load patterns, the circular shells display novel mechanical behaviour such as snap-through and limiting folding positions. By analysing the geometry and mechanics of the origami structures, we extend the design space of mechanical metamaterials and provide a basis for their practical applications in science and engineering.

  1. Folding to Curved Surfaces: A Generalized Design Method and Mechanics of Origami-based Cylindrical Structures

    PubMed Central

    Wang, Fei; Gong, Haoran; Chen, Xi; Chen, C. Q.

    2016-01-01

    Origami structures enrich the field of mechanical metamaterials with the ability to convert morphologically and systematically between two-dimensional (2D) thin sheets and three-dimensional (3D) spatial structures. In this study, an in-plane design method is proposed to approximate curved surfaces of interest with generalized Miura-ori units. Using this method, two combination types of crease lines are unified in one reprogrammable procedure, generating multiple types of cylindrical structures. Structural completeness conditions of the finite-thickness counterparts to the two types are also proposed. As an example of the design method, the kinematics and elastic properties of an origami-based circular cylindrical shell are analysed. The concept of Poisson’s ratio is extended to the cylindrical structures, demonstrating their auxetic property. An analytical model of rigid plates linked by elastic hinges, consistent with numerical simulations, is employed to describe the mechanical response of the structures. Under particular load patterns, the circular shells display novel mechanical behaviour such as snap-through and limiting folding positions. By analysing the geometry and mechanics of the origami structures, we extend the design space of mechanical metamaterials and provide a basis for their practical applications in science and engineering. PMID:27624892

  2. Audio-Visual Perception of 3D Cinematography: An fMRI Study Using Condition-Based and Computation-Based Analyses

    PubMed Central

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard “condition-based” designs, as well as “computational” methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli. PMID:24194828

  3. A review of the application of propensity score methods yielded increasing use, advantages in specific settings, but not substantially different estimates compared with conventional multivariable methods

    PubMed Central

    Stürmer, Til; Joshi, Manisha; Glynn, Robert J.; Avorn, Jerry; Rothman, Kenneth J.; Schneeweiss, Sebastian

    2006-01-01

    Objective Propensity score analyses attempt to control for confounding in non-experimental studies by adjusting for the likelihood that a given patient is exposed. Such analyses have been proposed to address confounding by indication, but there is little empirical evidence that they achieve better control than conventional multivariate outcome modeling. Study design and methods Using PubMed and Science Citation Index, we assessed the use of propensity scores over time and critically evaluated studies published through 2003. Results Use of propensity scores increased from a total of 8 papers before 1998 to 71 in 2003. Most of the 177 published studies abstracted assessed medications (N=60) or surgical interventions (N=51), mainly in cardiology and cardiac surgery (N=90). Whether PS methods or conventional outcome models were used to control for confounding had little effect on results in those studies in which such comparison was possible. Only 9 out of 69 studies (13%) had an effect estimate that differed by more than 20% from that obtained with a conventional outcome model in all PS analyses presented. Conclusions Publication of results based on propensity score methods has increased dramatically, but there is little evidence that these methods yield substantially different estimates compared with conventional multivariable methods. PMID:16632131
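
    To make the comparison concrete, the Python sketch below (a simulation, not a reanalysis of any reviewed study) estimates a treatment effect once with a conventional multivariable outcome model and once after adjusting for an estimated propensity score; with measured confounding only, the two approaches give similar answers, mirroring the review's conclusion.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 5000
      x = rng.normal(size=(n, 3))                           # measured confounders
      p_treat = 1 / (1 + np.exp(-(x @ np.array([0.5, -0.3, 0.2]))))
      treat = rng.binomial(1, p_treat)
      y = 0.4 * treat + x @ np.array([0.6, 0.2, -0.4]) + rng.normal(size=n)

      # Conventional multivariable outcome model
      conventional = sm.OLS(y, sm.add_constant(np.column_stack([treat, x]))).fit()

      # Propensity score model, then outcome model adjusting for the estimated score
      ps = sm.Logit(treat, sm.add_constant(x)).fit(disp=0).predict()
      ps_adjusted = sm.OLS(y, sm.add_constant(np.column_stack([treat, ps]))).fit()

      print(conventional.params[1], ps_adjusted.params[1])  # both near the true effect 0.4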

  4. Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome

    PubMed Central

    Margulies, Elliott H.; Cooper, Gregory M.; Asimenos, George; Thomas, Daryl J.; Dewey, Colin N.; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S.; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I.; Löytynoja, Ari; Whelan, Simon; Pardi, Fabio; Massingham, Tim; Brown, James B.; Bickel, Peter; Holmes, Ian; Mullikin, James C.; Ureta-Vidal, Abel; Paten, Benedict; Stone, Eric A.; Rosenbloom, Kate R.; Kent, W. James; Bouffard, Gerard G.; Guan, Xiaobin; Hansen, Nancy F.; Idol, Jacquelyn R.; Maduro, Valerie V.B.; Maskeri, Baishali; McDowell, Jennifer C.; Park, Morgan; Thomas, Pamela J.; Young, Alice C.; Blakesley, Robert W.; Muzny, Donna M.; Sodergren, Erica; Wheeler, David A.; Worley, Kim C.; Jiang, Huaiyang; Weinstock, George M.; Gibbs, Richard A.; Graves, Tina; Fulton, Robert; Mardis, Elaine R.; Wilson, Richard K.; Clamp, Michele; Cuff, James; Gnerre, Sante; Jaffe, David B.; Chang, Jean L.; Lindblad-Toh, Kerstin; Lander, Eric S.; Hinrichs, Angie; Trumbower, Heather; Clawson, Hiram; Zweig, Ann; Kuhn, Robert M.; Barber, Galt; Harte, Rachel; Karolchik, Donna; Field, Matthew A.; Moore, Richard A.; Matthewson, Carrie A.; Schein, Jacqueline E.; Marra, Marco A.; Antonarakis, Stylianos E.; Batzoglou, Serafim; Goldman, Nick; Hardison, Ross; Haussler, David; Miller, Webb; Pachter, Lior; Green, Eric D.; Sidow, Arend

    2007-01-01

    A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequence coverage), and specificity (alignment accuracy). We describe the quantitative and qualitative trade-offs concomitant with alignment method choice and the levels of technical error that need to be accounted for in applications that require multisequence alignments. Using the generated alignments, we identified constrained regions using three different methods. While the different constraint-detecting methods are in general agreement, there are important discrepancies relating to both the underlying alignments and the specific algorithms. However, by integrating the results across the alignments and constraint-detecting methods, we produced constraint annotations that were found to be robust based on multiple independent measures. Analyses of these annotations illustrate that most classes of experimentally annotated functional elements are enriched for constrained sequences; however, large portions of each class (with the exception of protein-coding sequences) do not overlap constrained regions. The latter elements might not be under primary sequence constraint, might not be constrained across all mammals, or might have expendable molecular functions. Conversely, 40% of the constrained sequences do not overlap any of the functional elements that have been experimentally identified. Together, these findings demonstrate and quantify how many genomic functional elements await basic molecular characterization. PMID:17567995

  5. Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    PubMed Central

    2011-01-01

    Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
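
    For reference, the standard Q and I2 statistics mentioned above can be computed from per-trial effect estimates and their variances in a few lines; the Python sketch below uses made-up numbers purely to show the formulae (Q as the weighted sum of squared deviations from the fixed-effect mean, and I2 = max(0, (Q - df)/Q)).

      import numpy as np

      effects = np.array([0.12, -0.05, 0.30, 0.22, 0.08])    # hypothetical trial estimates
      variances = np.array([0.02, 0.05, 0.04, 0.03, 0.06])   # their within-trial variances

      w = 1.0 / variances
      pooled = np.sum(w * effects) / np.sum(w)                # fixed-effect pooled estimate
      Q = np.sum(w * (effects - pooled) ** 2)                 # Cochran's Q
      df = len(effects) - 1
      I2 = max(0.0, (Q - df) / Q) * 100.0                     # I-squared in percent

      print(round(Q, 3), round(I2, 1))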

  6. An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid.

    PubMed

    van Staden, J F; Mashamba, Mulalo G; Stefan, Raluca I

    2002-09-01

    An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid is proposed. A solution of 0.1 mol L(-1) sodium chloride is used as carrier. Titration is achieved by aspirating acetic acid samples between two strong base-zone volumes into a holding coil and by channelling the stack of well-defined zones with flow reversal through a reaction coil to a potentiometric sensor where the peak widths were measured. A linear relationship between peak width and logarithm of the acid concentration was obtained in the range 1-9 g/100 mL. Vinegar samples were analysed without any sample pre-treatment. The method has a relative standard deviation of 0.4% with a sample frequency of 28 samples per hour. The results revealed good agreement between the proposed sequential injection and an automated batch titration method.
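
    The reported linear relationship between peak width and the logarithm of acid concentration amounts to a one-line calibration. The Python sketch below fits such a line to hypothetical standards and inverts it for an unknown vinegar sample; all numbers are illustrative, not data from the study.

      import numpy as np

      conc = np.array([1.0, 3.0, 5.0, 7.0, 9.0])              # acetic acid standards, g/100 mL
      peak_width = np.array([12.1, 18.5, 21.4, 23.6, 25.0])   # measured peak widths (hypothetical)

      slope, intercept = np.polyfit(np.log10(conc), peak_width, 1)

      def concentration_from_width(width):
          """Invert width = slope * log10(conc) + intercept."""
          return 10 ** ((width - intercept) / slope)

      print(concentration_from_width(20.0))                    # unknown sample, g/100 mL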

  7. Developing Environmentally Responsible Behaviours Through the Implementation of Argumentation- and Problem-Based Learning Models

    NASA Astrophysics Data System (ADS)

    Fettahlıoğlu, Pınar; Aydoğdu, Mustafa

    2018-04-01

    The purpose of this research is to investigate the effect of using argumentation and problem-based learning approaches on the development of environmentally responsible behaviours among pre-service science teachers. Experimental activities were implemented for 14 weeks for 52 class hours in an environmental education class within a science teaching department. A mixed method was used as a research design; particularly, a special type of Concurrent Nested Strategy was applied. The quantitative portion was based on the one-group pre-test and post-test models, and the qualitative portion was based on the holistic multiple-case study method. The quantitative portion of the research was conducted with 34 third-year pre-service science teachers studying at a state university. The qualitative portion of the study was conducted with six pre-service science teachers selected among the 34 pre-service science teachers based on the pre-test results obtained from an environmentally responsible behaviour scale. t tests for dependent groups were used to analyse quantitative data. Both descriptive and content analyses of the qualitative data were performed. The results of the study showed that the use of the argumentation and problem-based learning approaches significantly contributed to the development of environmentally responsible behaviours among pre-service science teachers.

  8. Fracture Mechanics Analyses for Interface Crack Problems - A Review

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Shivakumar, Kunigal; Raju, Ivatury S.

    2013-01-01

    Recent developments in fracture mechanics analyses of the interfacial crack problem are reviewed. The intent of the review is to renew the awareness of the oscillatory singularity at the crack tip of a bimaterial interface and the problems that occur when calculating mode mixity using numerical methods such as the finite element method in conjunction with the virtual crack closure technique. Established approaches to overcome the nonconvergence issue of the individual mode strain energy release rates are reviewed. In the recent literature many attempts to overcome the nonconvergence issue have been developed. Among the many approaches found only a few methods hold the promise of providing practical solutions. These are the resin interlayer method, the method that chooses the crack tip element size greater than the oscillation zone, the crack tip element method that is based on plate theory and the crack surface displacement extrapolation method. Each of the methods is validated on a very limited set of simple interface crack problems. However, their utility for a wide range of interfacial crack problems is yet to be established.

  9. Patients' views on changes in doctor-patient communication between 1982 and 2001: a mixed-methods study.

    PubMed

    Butalid, Ligaya; Verhaak, Peter F M; Boeije, Hennie R; Bensing, Jozien M

    2012-08-08

    Doctor-patient communication has been influenced over time by factors such as the rise of evidence-based medicine and a growing emphasis on patient-centred care. Despite disputes in the literature on the tension between evidence-based medicine and patient-centered medicine, patients' views on what constitutes high quality of doctor-patient communication are seldom an explicit topic for research. The aim of this study is to examine whether analogue patients (lay people judging videotaped consultations) perceive shifts in the quality of doctor-patient communication over a twenty-year period. Analogue patients (N = 108) assessed 189 videotaped general practice consultations from two periods (1982-1984 and 2000-2001). They provided ratings on three dimensions (scale 1-10) and gave written feedback. With a mixed-methods research design, we examined these assessments quantitatively (in relation to observed communication coded with RIAS) and qualitatively. 1) The quantitative analyses showed that biomedical communication and rapport building were positively associated with the quality assessments of videotaped consultations from the first period, but not from the second. Psychosocial communication and personal remarks were related to positive quality assessments of both periods; 2) the qualitative analyses showed that in both periods, participants provided the same balance between positive and negative comments. Listening, giving support, and showing respect were considered equally important in both periods. We identified shifts in the participants' observations on how GPs explained things to the patient, the division of roles and responsibilities, and the emphasis on problem-focused communication (first period) versus solution-focused communication (last period). Analogue patients recognize shifts in the quality of doctor-patient communication from two different periods, including a shift from problem-focused communication to solution-focused communication, and they value an egalitarian doctor-patient relationship. The two research methods were complementary; based on the quantitative analyses we found shifts in communication, which we confirmed and specified in our qualitative analyses.

  10. Determination of fossil carbon content in Swedish waste fuel by four different methods.

    PubMed

    Jones, Frida C; Blomqvist, Evalena W; Bisaillon, Mattias; Lindberg, Daniel K; Hupa, Mikko

    2013-10-01

    This study aimed to determine the content of fossil carbon in waste combusted in Sweden by using four different methods at seven geographically spread combustion plants. In total, the measurement campaign included 42 solid samples, 21 flue gas samples, 3 sorting analyses and 2 investigations using the balance method. The fossil carbon content in the solid samples and in the flue gas samples was determined using (14)C-analysis. From the analyses it was concluded that about a third of the carbon in mixed Swedish waste (municipal solid waste and industrial waste collected at Swedish industry sites) is fossil. The two other methods (the balance method and calculations from sorting analyses), based on assumptions and calculations, gave similar results in the plants in which they were used. Furthermore, the results indicate that the difference between samples containing as much as 80% industrial waste and samples consisting of solely municipal solid waste was not as large as expected. Besides investigating the fossil content of the waste, the project was also established to investigate the usability of various methods. However, it is difficult to directly compare the different methods used in this project because besides the estimation of emitted fossil carbon the methods provide other information, which is valuable to the plant owner. Therefore, the choice of method can also be controlled by factors other than direct determination of the fossil fuel emissions when considering implementation in the combustion plants.

  11. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models coming from Item Response Theory (IRT), relying on a response model relating the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group covariate Wald's test, performed on the random effects Rasch model. This model displayed the highest observed power, which was similar to the power using the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.

  12. Single-cell analysis of population context advances RNAi screening at multiple levels

    PubMed Central

    Snijder, Berend; Sacher, Raphael; Rämö, Pauli; Liberali, Prisca; Mench, Karin; Wolfrum, Nina; Burleigh, Laura; Scott, Cameron C; Verheije, Monique H; Mercer, Jason; Moese, Stefan; Heger, Thomas; Theusner, Kristina; Jurgeit, Andreas; Lamparter, David; Balistreri, Giuseppe; Schelhaas, Mario; De Haan, Cornelis A M; Marjomäki, Varpu; Hyypiä, Timo; Rottier, Peter J M; Sodeik, Beate; Marsh, Mark; Gruenberg, Jean; Amara, Ali; Greber, Urs; Helenius, Ari; Pelkmans, Lucas

    2012-01-01

    Isogenic cells in culture show strong variability, which arises from dynamic adaptations to the microenvironment of individual cells. Here we study the influence of the cell population context, which determines a single cell's microenvironment, in image-based RNAi screens. We developed a comprehensive computational approach that employs Bayesian and multivariate methods at the single-cell level. We applied these methods to 45 RNA interference screens of various sizes, including 7 druggable genome and 2 genome-wide screens, analysing 17 different mammalian virus infections and four related cell physiological processes. Analysing cell-based screens at this depth reveals widespread RNAi-induced changes in the population context of individual cells leading to indirect RNAi effects, as well as perturbations of cell-to-cell variability regulators. We find that accounting for indirect effects improves the consistency between siRNAs targeted against the same gene, and between replicate RNAi screens performed in different cell lines, in different labs, and with different siRNA libraries. In an era where large-scale RNAi screens are increasingly performed to reach a systems-level understanding of cellular processes, we show that this is often improved by analyses that account for and incorporate the single-cell microenvironment. PMID:22531119

  13. Evaluating the healthiness of chain-restaurant menu items using crowdsourcing: a new method.

    PubMed

    Lesser, Lenard I; Wu, Leslie; Matthiessen, Timothy B; Luft, Harold S

    2017-01-01

    To develop a technology-based method for evaluating the nutritional quality of chain-restaurant menus to increase the efficiency and lower the cost of large-scale data analysis of food items. Using a Modified Nutrient Profiling Index (MNPI), we assessed chain-restaurant items from the MenuStat database with a process involving three steps: (i) testing 'extreme' scores; (ii) crowdsourcing to analyse fruit, nut and vegetable (FNV) amounts; and (iii) analysis of the ambiguous items by a registered dietitian. In applying the approach to assess 22 422 foods, only 3566 could not be scored automatically based on MenuStat data and required further evaluation to determine healthiness. Items for which there was low agreement between trusted crowd workers, or where the FNV amount was estimated to be >40 %, were sent to a registered dietitian. Crowdsourcing was able to evaluate 3199, leaving only 367 to be reviewed by the registered dietitian. Overall, 7 % of items were categorized as healthy. The healthiest category was soups (26 % healthy), while desserts were the least healthy (2 % healthy). An algorithm incorporating crowdsourcing and a dietitian can quickly and efficiently analyse restaurant menus, allowing public health researchers to analyse the healthiness of menu items.

  14. A simple distillation method to extract bromine from natural water and salt samples for isotope analysis by multi-collector inductively coupled plasma mass spectrometry.

    PubMed

    Eggenkamp, H G M; Louvat, P

    2018-04-30

    In natural samples bromine is present in trace amounts, and measurement of stable Br isotopes necessitates its separation from the matrix. Most methods described previously need large samples or samples with high Br/Cl ratios. The use of metals as reagents, proposed in previous Br distillation methods, must be avoided for multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) analyses, because of risk of cross-contamination, since the instrument is also used to measure stable isotopes of metals. Dedicated to water and evaporite samples with low Br/Cl ratios, the proposed method is a simple distillation that separates bromide from chloride for isotopic analyses by MC-ICP-MS. It is based on the difference in oxidation potential between chloride and bromide in the presence of nitric acid. The sample is mixed with dilute (1:5) nitric acid in a distillation flask and heated over a candle flame for 10 min. The distillate (bromine) is trapped in an ammonia solution and reduced to bromide. Chloride is only distilled to a very small extent. The obtained solution can be measured directly by MC-ICP-MS for stable Br isotopes. The method was tested for a variety of volumes, ammonia concentrations, pH values and distillation times and compared with the classic ion-exchange chromatography method. The method more efficiently separates Br from Cl, so that samples with lower Br/Cl ratios can be analysed, with Br isotope data in agreement with those obtained by previous methods. Unlike other Br extraction methods based on oxidation, the distillation method presented here does not use any metallic ion for redox reactions that could contaminate the mass spectrometer. It is efficient in separating Br from samples with low Br/Cl ratios. The method ensures reproducible recovery yields and a long-term reproducibility of ±0.11‰ (1 standard deviation). The distillation method was successfully applied to samples with low Br/Cl ratios and low Br amounts (down to 20 μg). Copyright © 2018 John Wiley & Sons, Ltd.

  15. BASTet: Shareable and Reproducible Analysis and Visualization of Mass Spectrometry Imaging Data via OpenMSI.

    PubMed

    Rubel, Oliver; Bowen, Benjamin P

    2018-01-01

    Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial-innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.

  16. Photovoltaic panel extraction from very high-resolution aerial imagery using region-line primitive association analysis and template matching

    NASA Astrophysics Data System (ADS)

    Wang, Min; Cui, Qi; Sun, Yujie; Wang, Qiao

    2018-07-01

    In object-based image analysis (OBIA), object classification performance is jointly determined by image segmentation, sample or rule setting, and classifiers. Typically, as a crucial step to obtain object primitives, image segmentation quality significantly influences subsequent feature extraction and analyses. By contrast, template matching extracts specific objects from images and prevents shape defects caused by image segmentation. However, creating or editing templates is tedious and sometimes results in incomplete or inaccurate templates. In this study, we combine OBIA and template matching techniques to address these problems and aim for accurate photovoltaic panel (PVP) extraction from very high-resolution (VHR) aerial imagery. The proposed method is based on the previously proposed region-line primitive association framework, in which complementary information between region (segment) and line (straight line) primitives is utilized to achieve a more powerful performance than routine OBIA. Several novel concepts, including the mutual fitting ratio and best-fitting template based on region-line primitive association analyses, are proposed. Automatic template generation and matching method for PVP extraction from VHR imagery are designed for concept and model validation. Results show that the proposed method can successfully extract PVPs without any user-specified matching template or training sample. High user independency and accuracy are the main characteristics of the proposed method in comparison with routine OBIA and template matching techniques.

  17. Systematic literature reviews and meta-analyses: part 6 of a series on evaluation of scientific publications.

    PubMed

    Ressing, Meike; Blettner, Maria; Klug, Stefanie J

    2009-07-01

    Because of the rising number of scientific publications, it is important to have a means of jointly summarizing and assessing different studies on a single topic. Systematic literature reviews, meta-analyses of published data, and meta-analyses of individual data (pooled reanalyses) are now being published with increasing frequency. We here describe the essential features of these methods and discuss their strengths and weaknesses. This article is based on a selective literature search. The different types of review and meta-analysis are described, the methods used in each are outlined so that they can be evaluated, and a checklist is given for the assessment of reviews and meta-analyses of scientific articles. Systematic literature reviews provide an overview of the state of research on a given topic and enable an assessment of the quality of individual studies. They also allow the results of different studies to be evaluated together when these are inconsistent. Meta-analyses additionally allow calculation of pooled estimates of an effect. The different types of review and meta-analysis are discussed with examples from the literature on one particular topic. Systematic literature reviews and meta-analyses enable the research findings and treatment effects obtained in different individual studies to be summed up and evaluated.

  18. A combined emitter threat assessment method based on ICW-RCM

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wang, Hongwei; Guo, Xiaotao; Wang, Yubing

    2017-08-01

    Traditional emitter threat assessment methods struggle to reflect the degree of target threat intuitively and suffer from poor real-time performance and high complexity. On the basis of the radar chart method (RCM), an algorithm for combined emitter threat assessment based on ICW-RCM (improved combination weighting method) is therefore proposed. Coarse sorting is integrated with fine sorting in the combined threat assessment: emitter threat levels are first ranked roughly according to radar operation mode, reducing the task priority of low-threat emitters; emitters within the same radar operation mode are then ranked using ICW-RCM, and the final threat assessment results are obtained through the combined coarse and fine sorting. Simulation analyses show the correctness and effectiveness of this algorithm. Compared with the classical CW-RCM-based method of emitter threat assessment, the algorithm is visually intuitive and works quickly with lower complexity.

  19. On the use of log-transformation vs. nonlinear regression for analyzing biological power laws

    USGS Publications Warehouse

    Xiao, X.; White, E.P.; Hooten, M.B.; Durham, S.L.

    2011-01-01

    Power-law relationships are among the most well-studied functional relationships in biology. Recently the common practice of fitting power laws using linear regression (LR) on log-transformed data has been criticized, calling into question the conclusions of hundreds of studies. It has been suggested that nonlinear regression (NLR) is preferable, but no rigorous comparison of these two methods has been conducted. Using Monte Carlo simulations, we demonstrate that the error distribution determines which method performs better, with NLR better characterizing data with additive, homoscedastic, normal error and LR better characterizing data with multiplicative, heteroscedastic, lognormal error. Analysis of 471 biological power laws shows that both forms of error occur in nature. While previous analyses based on log-transformation appear to be generally valid, future analyses should choose methods based on a combination of biological plausibility and analysis of the error distribution. We provide detailed guidelines and associated computer code for doing so, including a model averaging approach for cases where the error structure is uncertain. © 2011 by the Ecological Society of America.
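
    The two fitting strategies being compared are easy to reproduce. The Python sketch below (independent of the authors' published code) simulates a power law with multiplicative lognormal error, the case where log-transformation plus linear regression is expected to do better, and fits it both ways.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)
      x = np.linspace(1, 100, 200)
      y = 2.5 * x ** 0.75 * rng.lognormal(sigma=0.3, size=x.size)   # true exponent 0.75

      # Linear regression on log-transformed data
      exponent_lr, log_coef_lr = np.polyfit(np.log(x), np.log(y), 1)

      # Nonlinear regression on the original scale
      (coef_nlr, exponent_nlr), _ = curve_fit(lambda x, a, b: a * x ** b, x, y, p0=[1.0, 1.0])

      print("LR exponent:", round(exponent_lr, 3), "NLR exponent:", round(exponent_nlr, 3))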

  20. Parent-based adolescent sexual health interventions and effect on communication outcomes: a systematic review and meta-analyses.

    PubMed

    Santa Maria, Diane; Markham, Christine; Bluethmann, Shirley; Mullen, Patricia Dolan

    2015-03-01

    Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. A systematic search of databases for the period 1998-2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. These findings point to gaps in the range of programs examined in published trials (for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services). Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. Copyright © 2015 by the Guttmacher Institute.

  1. A practical method for extending the biuret assay to protein determination of corn-based products.

    PubMed

    Liu, Zelong; Pan, Junhui

    2017-06-01

    A modified biuret method suitable for protein determination of corn-based products was developed by introducing a combination of an alkaline reagent with sodium dodecyl sulfate (reagent A) and heat treatments. The method was tested on seven corn-based samples. The results showed mostly good agreement (P>0.05) as compared to the Kjeldahl values. The proposed method was found to enhance the accuracy of prediction of zein content using bovine serum albumin as standard. Reagent A and sample treatment were proved to effectively improve protein solubilization for the thermally-dried corn-based products, e.g. corn gluten meal. The absorbance was stable for at least 1 h. Moreover, the whole measurement of protein content only needs 15-20 min more than the traditional biuret assay, and can be performed in batches. The findings suggest that the proposed method could be a timesaving alternative for routine protein analyses in corn processing factories. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    I. W. Ginsberg

    Multiresolutional decompositions known as spectral fingerprints are often used to extract spectral features from multispectral/hyperspectral data. In this study, the authors investigate the use of wavelet-based algorithms for generating spectral fingerprints. The wavelet-based algorithms are compared to the currently used method, traditional convolution with first-derivative Gaussian filters. The comparison analysis consists of two parts: (a) the computational expense of the new method is compared with the computational costs of the current method and (b) the outputs of the wavelet-based methods are compared with those of the current method to determine any practical differences in the resulting spectral fingerprints. The results show that the wavelet-based algorithms can greatly reduce the computational expense of generating spectral fingerprints, while practically no differences exist in the resulting fingerprints. The analysis is conducted on a database of hyperspectral signatures, namely, Hyperspectral Digital Image Collection Experiment (HYDICE) signatures. The reduction in computational expense is by a factor of about 30, and the average Euclidean distance between resulting fingerprints is on the order of 0.02.
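
    A minimal Python sketch of the two fingerprinting routes is given below (assuming SciPy and PyWavelets are available); the synthetic spectrum, scales and wavelet choice are illustrative stand-ins, not the HYDICE processing chain.

      import numpy as np
      from scipy.ndimage import gaussian_filter1d
      import pywt

      # Toy spectrum with a single absorption-like feature
      channels = np.arange(512)
      spectrum = np.exp(-0.5 * ((channels - 200) / 15.0) ** 2)

      scales = [2, 4, 8, 16]

      # Current method: convolution with first-derivative Gaussian filters at several scales
      gauss_fingerprint = np.stack(
          [gaussian_filter1d(spectrum, sigma=s, order=1) for s in scales])

      # Wavelet-based alternative: continuous wavelet transform with a Gaussian-derivative wavelet
      wavelet_fingerprint, _ = pywt.cwt(spectrum, scales=scales, wavelet="gaus1")

      print(gauss_fingerprint.shape, wavelet_fingerprint.shape)   # both (4, 512)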

  3. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data.

    PubMed

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-05-15

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection.
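
    A rough Python sketch of this selection-then-classification idea is shown below. It is a schematic reading of the abstract (first band by mutual information with the labels, second by least correlation, later bands by largest linear-prediction residual, then an SVM), using random placeholder data rather than real hyperspectral imagery.

      import numpy as np
      from sklearn.feature_selection import mutual_info_classif
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 30))            # placeholder pixels x bands
      y = rng.integers(0, 2, size=500)          # placeholder sea-ice / open-water labels

      # Step 1: first band by mutual information with the class labels
      selected = [int(np.argmax(mutual_info_classif(X, y, random_state=0)))]

      # Step 2: second band least correlated with the first
      corr = np.abs(np.corrcoef(X, rowvar=False)[selected[0]])
      corr[selected[0]] = np.inf
      selected.append(int(np.argmin(corr)))

      # Step 3: grow the subset with the band worst predicted by those already chosen
      while len(selected) < 6:
          A = X[:, selected]
          residuals = np.full(X.shape[1], -np.inf)
          for band in range(X.shape[1]):
              if band in selected:
                  continue
              coef, *_ = np.linalg.lstsq(A, X[:, band], rcond=None)
              residuals[band] = np.linalg.norm(X[:, band] - A @ coef)
          selected.append(int(np.argmax(residuals)))

      clf = SVC(kernel="rbf").fit(X[:, selected], y)
      print("selected bands:", selected)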

  4. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data

    PubMed Central

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-01-01

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection. PMID:28505135

  5. On the existence of the σ(600): its physical implications and related problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishida, Shin

    1998-05-29

    We make a re-analysis of the I=0 ππ scattering phase shift δ₀⁰ through a new method of S-matrix parametrization (IA; interfering amplitude method), and show a result strongly suggesting the existence of the σ particle, the long-sought chiral partner of the π meson. Furthermore, through phenomenological analyses of typical production processes of the 2π system, the pp central collision and the J/ψ → ωππ decay, by applying an intuitive formula given as a sum of Breit-Wigner amplitudes (VMW; variant mass and width method), further evidence for the existence of the σ is given. The validity of the methods used in the above analyses is investigated, using a simple field theoretical model, from the general viewpoint of unitarity and the applicability of the final-state interaction (FSI) theorem, especially in relation to the 'universality' argument. It is shown that IA and VMW are obtained as the physical-state representations of scattering and production amplitudes, respectively. VMW is shown to be an effective method for obtaining resonance properties from production processes, which generally have unknown strong phases. The conventional analyses based on 'universality' seem to be powerless for this purpose.

  6. Missing data and multiple imputation in clinical epidemiological research.

    PubMed

    Pedersen, Alma B; Mikkelsen, Ellen M; Cronin-Fenton, Deirdre; Kristensen, Nickolaj R; Pham, Tra My; Pedersen, Lars; Petersen, Irene

    2017-01-01

    Missing data are ubiquitous in clinical epidemiological research. Individuals with missing data may differ from those with no missing data in terms of the outcome of interest and prognosis in general. Missing data are often categorized into the following three types: missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR). In clinical epidemiological research, missing data are seldom MCAR. Missing data can constitute considerable challenges in the analyses and interpretation of results and can potentially weaken the validity of results and conclusions. A number of methods have been developed for dealing with missing data. These include complete-case analyses, missing indicator method, single value imputation, and sensitivity analyses incorporating worst-case and best-case scenarios. If applied under the MCAR assumption, some of these methods can provide unbiased but often less precise estimates. Multiple imputation is an alternative method to deal with missing data, which accounts for the uncertainty associated with missing data. Multiple imputation is implemented in most statistical software under the MAR assumption and provides unbiased and valid estimates of associations based on information from the available data. The method affects not only the coefficient estimates for variables with missing data but also the estimates for other variables with no missing data.
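
    As an illustration of the multiple-imputation workflow (imputing several completed datasets under MAR, analysing each, then pooling), the Python sketch below uses scikit-learn's IterativeImputer with posterior sampling on simulated data; it shows the mechanics only, and the per-dataset analysis step is a placeholder.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 4))
      X[rng.random(X.shape) < 0.1] = np.nan          # introduce ~10% missing values

      estimates = []
      for m in range(5):                             # five imputed datasets
          imputer = IterativeImputer(sample_posterior=True, random_state=m)
          completed = imputer.fit_transform(X)
          estimates.append(completed[:, 0].mean())   # placeholder analysis per completed dataset

      pooled_estimate = np.mean(estimates)           # Rubin's rules: pooled point estimate
      between_var = np.var(estimates, ddof=1)        # between-imputation variance component
      print(pooled_estimate, between_var)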

  7. Missing data and multiple imputation in clinical epidemiological research

    PubMed Central

    Pedersen, Alma B; Mikkelsen, Ellen M; Cronin-Fenton, Deirdre; Kristensen, Nickolaj R; Pham, Tra My; Pedersen, Lars; Petersen, Irene

    2017-01-01

    Missing data are ubiquitous in clinical epidemiological research. Individuals with missing data may differ from those with no missing data in terms of the outcome of interest and prognosis in general. Missing data are often categorized into the following three types: missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR). In clinical epidemiological research, missing data are seldom MCAR. Missing data can constitute considerable challenges in the analyses and interpretation of results and can potentially weaken the validity of results and conclusions. A number of methods have been developed for dealing with missing data. These include complete-case analyses, missing indicator method, single value imputation, and sensitivity analyses incorporating worst-case and best-case scenarios. If applied under the MCAR assumption, some of these methods can provide unbiased but often less precise estimates. Multiple imputation is an alternative method to deal with missing data, which accounts for the uncertainty associated with missing data. Multiple imputation is implemented in most statistical software under the MAR assumption and provides unbiased and valid estimates of associations based on information from the available data. The method affects not only the coefficient estimates for variables with missing data but also the estimates for other variables with no missing data. PMID:28352203

  8. The Use of Linear Instrumental Variables Methods in Health Services Research and Health Economics: A Cautionary Note

    PubMed Central

    Terza, Joseph V; Bradford, W David; Dismuke, Clara E

    2008-01-01

    Objective To investigate potential bias in the use of the conventional linear instrumental variables (IV) method for the estimation of causal effects in inherently nonlinear regression settings. Data Sources Smoking Supplement to the 1979 National Health Interview Survey, National Longitudinal Alcohol Epidemiologic Survey, and simulated data. Study Design Potential bias from the use of the linear IV method in nonlinear models is assessed via simulation studies and real world data analyses in two commonly encountered regression settings: (1) models with a nonnegative outcome (e.g., a count) and a continuous endogenous regressor; and (2) models with a binary outcome and a binary endogenous regressor. Principal Findings The simulation analyses show that substantial bias in the estimation of causal effects can result from applying the conventional IV method in inherently nonlinear regression settings. Moreover, the bias is not attenuated as the sample size increases. This point is further illustrated in the survey data analyses in which IV-based estimates of the relevant causal effects diverge substantially from those obtained with appropriate nonlinear estimation methods. Conclusions We offer this research as a cautionary note to those who would opt for the use of linear specifications in inherently nonlinear settings involving endogeneity. PMID:18546544
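
    The first setting (a count outcome with a continuous endogenous regressor) is easy to mimic in simulation. The Python sketch below illustrates the general point rather than the authors' design: a manual linear 2SLS estimate is contrasted with a two-stage residual inclusion (2SRI) Poisson model, one example of an appropriate nonlinear estimation method, on data where the true log-scale effect is 0.3.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 20000
      z = rng.normal(size=n)                         # instrument
      u = rng.normal(size=n)                         # unobserved confounder
      x = 0.8 * z + u + rng.normal(size=n)           # endogenous continuous regressor
      y = rng.poisson(np.exp(0.3 * x + 0.5 * u))     # count outcome, true log-scale effect 0.3

      # Conventional linear IV (manual two-stage least squares)
      x_hat = sm.OLS(x, sm.add_constant(z)).fit().predict()
      linear_iv_slope = sm.OLS(y, sm.add_constant(x_hat)).fit().params[1]

      # Nonlinear alternative: first-stage residual included in a Poisson outcome model (2SRI)
      first_stage_resid = x - x_hat
      poisson_2sri = sm.GLM(y, sm.add_constant(np.column_stack([x, first_stage_resid])),
                            family=sm.families.Poisson()).fit()

      print(linear_iv_slope, poisson_2sri.params[1])  # the 2SRI coefficient should be near 0.3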

  9. A Simple Method for Estimating Informative Node Age Priors for the Fossil Calibration of Molecular Divergence Time Analyses

    PubMed Central

    Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.

    2013-01-01

    Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303

  10. Cesarean delivery rates among family physicians versus obstetricians: a population-based cohort study using instrumental variable methods

    PubMed Central

    Dawe, Russell Eric; Bishop, Jessica; Pendergast, Amanda; Avery, Susan; Monaghan, Kelly; Duggan, Norah; Aubrey-Bassler, Kris

    2017-01-01

    Background: Previous research suggests that family physicians have rates of cesarean delivery that are lower than or equivalent to those for obstetricians, but adjustments for risk differences in these analyses may have been inadequate. We used an econometric method to adjust for observed and unobserved factors affecting the risk of cesarean delivery among women attended by family physicians versus obstetricians. Methods: This retrospective population-based cohort study included all Canadian (except Quebec) hospital deliveries by family physicians and obstetricians between Apr. 1, 2006, and Mar. 31, 2009. We excluded women with multiple gestations, and newborns with a birth weight less than 500 g or gestational age less than 20 weeks. We estimated the relative risk of cesarean delivery using instrumental-variable-adjusted and logistic regression. Results: The final cohort included 776 299 women who gave birth in 390 hospitals. The risk of cesarean delivery was 27.3%, and the mean proportion of deliveries by family physicians was 26.9% (standard deviation 23.8%). The relative risk of cesarean delivery for family physicians versus obstetricians was 0.48 (95% confidence interval [CI] 0.41-0.56) with logistic regression and 1.27 (95% CI 1.02-1.57) with instrumental-variable-adjusted regression. Interpretation: Our conventional analyses suggest that family physicians have a lower rate of cesarean delivery than obstetricians, but instrumental variable analyses suggest the opposite. Because instrumental variable methods adjust for unmeasured factors and traditional methods do not, the large discrepancy between these estimates of risk suggests that clinical and/or sociocultural factors affecting the decision to perform cesarean delivery may not be accounted for in our database. PMID:29233843

  11. Applying machine learning to identify autistic adults using imitation: An exploratory study.

    PubMed

    Li, Baihua; Sharma, Arjun; Meng, James; Purushwalkam, Senthil; Gowen, Emma

    2017-01-01

    Autism spectrum condition (ASC) is primarily diagnosed by behavioural symptoms including social, sensory and motor aspects. Although stereotyped, repetitive motor movements are considered during diagnosis, quantitative measures that identify kinematic characteristics in the movement patterns of autistic individuals are poorly studied, preventing advances in understanding the aetiology of motor impairment, or whether a wider range of motor characteristics could be used for diagnosis. The aim of this study was to investigate whether data-driven, machine learning-based methods could be used to address some fundamental problems with regard to identifying discriminative test conditions and kinematic parameters to classify between ASC and neurotypical controls. Data were based on a previous task in which 16 ASC participants and 14 age- and IQ-matched controls observed and then imitated a series of hand movements. Forty kinematic parameters extracted from eight imitation conditions were analysed using machine learning-based methods. Two optimal imitation conditions and nine most significant kinematic parameters were identified and compared with some standard attribute evaluators. To our knowledge, this is the first attempt to apply machine learning to kinematic movement parameters measured during imitation of hand movements to investigate the identification of ASC. Although based on a small sample, the work demonstrates the feasibility of applying machine learning methods to analyse high-dimensional data and suggests the potential of machine learning for identifying kinematic biomarkers that could contribute to the diagnostic classification of autism.
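
    As a generic sketch (not the study's actual pipeline or data), the following scikit-learn snippet shows how high-dimensional kinematic parameters could be ranked and used to classify two small groups under cross-validation; the data are simulated and all sizes and settings are hypothetical.

      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score, StratifiedKFold

      rng = np.random.default_rng(0)

      # Simulated stand-in: 30 participants x 40 kinematic parameters,
      # with a handful of genuinely discriminative features.
      n_asc, n_ctrl, n_feat = 16, 14, 40
      X = rng.normal(size=(n_asc + n_ctrl, n_feat))
      X[:n_asc, :5] += 0.8                         # group difference in 5 features
      y = np.array([1] * n_asc + [0] * n_ctrl)

      clf = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=9),   # keep the 9 top-ranked features
                          SVC(kernel="linear"))
      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      scores = cross_val_score(clf, X, y, cv=cv)
      print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))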

  12. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
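
    A minimal numpy sketch of the random-projection idea described (not the paper's implementation): feature vectors are projected with an i.i.d. Gaussian random matrix, pairwise distances are approximately preserved under a fixed projection, and re-issuing the projection matrix yields a dissimilar (changeable) template from the same biometric. Dimensions and seeds are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)
      d_in, d_out = 1024, 128            # original and projected dimensionality

      def random_projection_matrix(d_out, d_in, seed):
          """i.i.d. Gaussian entries, scaled so squared norms are preserved in expectation."""
          return np.random.default_rng(seed).normal(size=(d_out, d_in)) / np.sqrt(d_out)

      x1 = rng.normal(size=d_in)         # stand-ins for two biometric feature vectors
      x2 = rng.normal(size=d_in)

      R_a = random_projection_matrix(d_out, d_in, seed=42)   # template key A
      R_b = random_projection_matrix(d_out, d_in, seed=99)   # re-issued key B

      # Approximate distance preservation under a fixed projection
      print(np.linalg.norm(x1 - x2), np.linalg.norm(R_a @ x1 - R_a @ x2))

      # Changeability: the same biometric yields dissimilar templates under different keys
      t_a, t_b = R_a @ x1, R_b @ x1
      cos = t_a @ t_b / (np.linalg.norm(t_a) * np.linalg.norm(t_b))
      print("cosine similarity between re-issued templates: %.3f" % cos)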

  13. Sybil--efficient constraint-based modelling in R.

    PubMed

    Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J

    2013-11-13

    Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
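
    Sybil itself is an R package; purely to illustrate the kind of constraint-based optimisation it performs, the sketch below sets up flux-balance analysis as a linear program on a made-up toy network in Python with scipy (maximise an objective flux subject to steady-state mass balance S·v = 0 and flux bounds). The network and bounds are hypothetical.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network with one metabolite A and three reactions:
      #   R1: -> A (uptake), R2: A -> (objective/biomass), R3: A -> (byproduct)
      S = np.array([[1.0, -1.0, -1.0]])          # stoichiometric matrix (metabolites x reactions)
      bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds; uptake capped at 10
      c = np.array([0.0, -1.0, 0.0])             # linprog minimises, so negate the objective flux

      res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
      print("optimal objective flux:", -res.fun)  # all available uptake routed to the objective
      print("flux distribution:", res.x)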

  14. Alternatives to Piloting Textbooks.

    ERIC Educational Resources Information Center

    Muther, Connie

    1985-01-01

    Using short-term pilot programs to evaluate textbooks can lead to unreliable results and interfere with effective education. Alternative methods for evaluating textbook-based programs include obtaining documented analyses of competitors' products from sales agents, visiting districts using programs being considered, and examining publishers' own…

  15. Using an Evidential Reasoning Approach for Portfolio Assessments in a Project-Based Learning Engineering Design Course

    ERIC Educational Resources Information Center

    Jaeger, Martin; Adair, Desmond

    2015-01-01

    The purpose of this study is to analyse the feasibility of an evidential reasoning (ER) method for portfolio assessments and comparison of the results found with those based on a traditional holistic judgement. An ER approach has been incorporated into portfolio assessment of an undergraduate engineering design course delivered as a project-based…

  16. Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data.

    PubMed

    Johnson, Brent A

    2009-10-01

    We consider estimation and variable selection in the partial linear model for censored data. The partial linear model for censored data is a direct extension of the accelerated failure time model, the latter of which is a very important alternative model to the proportional hazards model. We extend rank-based lasso-type estimators to a model that may contain nonlinear effects. Variable selection in such partial linear model has direct application to high-dimensional survival analyses that attempt to adjust for clinical predictors. In the microarray setting, previous methods can adjust for other clinical predictors by assuming that clinical and gene expression data enter the model linearly in the same fashion. Here, we select important variables after adjusting for prognostic clinical variables but the clinical effects are assumed nonlinear. Our estimator is based on stratification and can be extended naturally to account for multiple nonlinear effects. We illustrate the utility of our method through simulation studies and application to the Wisconsin prognostic breast cancer data set.

  17. Transportation Network Role for Central Italy Macroregion Development in a Territorial Frames Model Based

    NASA Astrophysics Data System (ADS)

    Di Ludovico, Donato; D'Ovidio, Gino

    2017-10-01

    This paper describes an interdisciplinary planning research approach that aims to combine urban aspects of territorial spatial development with transport requirements for efficient and sustainable mobility. The proposed research method is based on a “Territorial Frames” (TFs) model derived from an original interpretation of the local context, divided into a summation of territorial settlement fabrics characterized in terms of spatial tile, morphology and mobility axes. The TFs, each autonomous and differing in size and structure, are used as the main plot, able to assemble the settlement systems and their post-urban forms. With a view to polycentric spatial development, the research method allows us to analyse the completeness of the TFs and their connective potential, in order to locate the missing or inefficient elements of the transportation network and to plan other TFs essential to support the economic and social development processes of the most isolated and disadvantaged inland areas. Finally, a case study of the Italian Median Macroregion configuration based on the TFs model approach is proposed, analysed and discussed.

  18. Meta-analysis of two studies in the presence of heterogeneity with applications in rare diseases.

    PubMed

    Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat

    2017-07-01

    Random-effects meta-analyses are used to combine evidence of treatment effects from multiple studies. Since treatment effects may vary across trials due to differences in study characteristics, heterogeneity in treatment effects between studies must be accounted for to achieve valid inference. The standard model for random-effects meta-analysis assumes approximately normal effect estimates and a normal random-effects model. However, standard methods based on this model ignore the uncertainty in estimating the between-trial heterogeneity. In the special setting of only two studies and in the presence of heterogeneity, we investigate here alternatives such as the Hartung-Knapp-Sidik-Jonkman method (HKSJ), the modified Knapp-Hartung method (mKH, a variation of the HKSJ method) and Bayesian random-effects meta-analyses with priors covering plausible heterogeneity values; R code to reproduce the examples is presented in an appendix. The properties of these methods are assessed by applying them to five examples from various rare diseases and by a simulation study. Whereas the standard method based on normal quantiles has poor coverage, the HKSJ and mKH generally lead to very long, and therefore inconclusive, confidence intervals. The Bayesian intervals on the whole show satisfying properties and offer a reasonable compromise between these two extremes. © 2016 The Authors. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
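
    A small numpy/scipy sketch (not the R code referenced in the abstract) contrasting the standard normal-quantile interval with the Hartung-Knapp-Sidik-Jonkman interval for a two-study random-effects meta-analysis; the effect estimates and standard errors are made-up numbers.

      import numpy as np
      from scipy import stats

      # Hypothetical log-odds-ratio estimates and standard errors from two trials
      y = np.array([-0.6, -0.1])
      se = np.array([0.25, 0.30])
      k = len(y)

      # DerSimonian-Laird estimate of between-trial heterogeneity
      w0 = 1.0 / se**2
      mu0 = np.sum(w0 * y) / np.sum(w0)
      Q = np.sum(w0 * (y - mu0)**2)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w0) - np.sum(w0**2) / np.sum(w0)))

      # Random-effects pooled estimate
      w = 1.0 / (se**2 + tau2)
      mu = np.sum(w * y) / np.sum(w)

      # Standard (normal-quantile) interval
      se_std = np.sqrt(1.0 / np.sum(w))
      ci_std = mu + np.array([-1, 1]) * stats.norm.ppf(0.975) * se_std

      # HKSJ interval: adjusted variance and a t quantile with k - 1 degrees of freedom
      se_hksj = np.sqrt(np.sum(w * (y - mu)**2) / ((k - 1) * np.sum(w)))
      ci_hksj = mu + np.array([-1, 1]) * stats.t.ppf(0.975, df=k - 1) * se_hksj

      print("pooled effect:", mu)
      print("standard CI:", ci_std)
      print("HKSJ CI:    ", ci_hksj)   # typically far wider when only two studies are pooled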

  19. A model-based test for treatment effects with probabilistic classifications.

    PubMed

    Cavagnaro, Daniel R; Davis-Stober, Clintin P

    2018-05-21

    Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to type-1 errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor, derived from our method, and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Classification and regression tree analysis vs. multivariable linear and logistic regression methods as statistical tools for studying haemophilia.

    PubMed

    Henrard, S; Speybroeck, N; Hermans, C

    2015-11-01

    Haemophilia is a rare genetic haemorrhagic disease characterized by partial or complete deficiency of coagulation factor VIII, for haemophilia A, or IX, for haemophilia B. As in any other medical research domain, the field of haemophilia research is increasingly concerned with finding factors associated with binary or continuous outcomes through multivariable models. Traditional models include multiple logistic regressions, for binary outcomes, and multiple linear regressions for continuous outcomes. Yet these regression models are at times difficult to implement, especially for non-statisticians, and can be difficult to interpret. The present paper sought to didactically explain how, why, and when to use classification and regression tree (CART) analysis for haemophilia research. The CART method, developed by Breiman in 1984, is non-parametric and non-linear, and is based on the repeated partitioning of a sample into subgroups according to a certain criterion. Classification trees (CTs) are used to analyse categorical outcomes and regression trees (RTs) to analyse continuous ones. The CART methodology has become increasingly popular in the medical field, yet only a few examples of studies using this methodology specifically in haemophilia have to date been published. Two examples using CART analysis previously published in this field are didactically explained in detail. There is increasing interest in using CART analysis in the health domain, primarily due to its ease of implementation, use, and interpretation, thus facilitating medical decision-making. This method should be promoted for analysing continuous or categorical outcomes in haemophilia, when applicable. © 2015 John Wiley & Sons Ltd.
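
    The following scikit-learn sketch illustrates the recursive partitioning idea on simulated data (not haemophilia data); the predictors, thresholds and sample size are invented for the example.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)

      # Simulated stand-in: two predictors and a binary outcome driven by a threshold rule
      n = 500
      age = rng.uniform(10, 70, n)
      factor_level = rng.uniform(0, 40, n)            # e.g. residual clotting-factor activity (%)
      p_event = np.where((factor_level < 5) & (age > 40), 0.7, 0.15)
      event = rng.random(n) < p_event

      X = np.column_stack([age, factor_level])
      tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=20).fit(X, event)

      # Plain-text rendering of the fitted splits; easy to read off as decision rules
      print(export_text(tree, feature_names=["age", "factor_level"]))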

  1. Detecting the Presence of Bacterial DNA and RNA by Polymerase Chain Reaction to Diagnose Suspected Periprosthetic Joint Infection after Antibiotic Therapy.

    PubMed

    Fang, Xin-Yu; Li, Wen-Bo; Zhang, Chao-Fan; Huang, Zi-da; Zeng, Hui-Yi; Dong, Zheng; Zhang, Wen-Ming

    2018-02-01

    To explore the diagnostic efficiency of DNA-based and RNA-based quantitative polymerase chain reaction (qPCR) analyses for periprosthetic joint infection (PJI). To determine the detection limit of DNA-based and RNA-based qPCR in vitro, Staphylococcus aureus and Escherichia coli strains were added to sterile synovial fluid obtained from a patient with knee osteoarthritis. Serial dilutions of samples were analyzed by DNA-based and RNA-based qPCR. Clinically, patients who were suspected of having PJI and eventually underwent revision arthroplasty in our hospital from July 2014 to December 2016 were screened. Preoperative puncture or intraoperative collection was performed on patients who met the inclusion and exclusion criteria to obtain synovial fluid. DNA-based and RNA-based PCR analyses and culture were performed on each synovial fluid sample. The patients' demographic characteristics, medical history, and laboratory test results were recorded. The diagnostic efficiency of both PCR assays was compared with culture methods. The in vitro analysis demonstrated that the DNA-based qPCR assay was highly sensitive, with the detection limit being 1200 colony forming units (CFU)/mL of S. aureus and 3200 CFU/mL of E. coli. Meanwhile, the RNA-based qPCR assay could detect 2300 CFU/mL of S. aureus and 11 000 CFU/mL of E. coli. Clinically, the sensitivity, specificity, and accuracy were 65.7%, 100%, and 81.6%, respectively, for the culture method; 81.5%, 84.8%, and 83.1%, respectively, for DNA-based qPCR; and 73.6%, 100%, and 85.9%, respectively, for RNA-based qPCR. DNA-based qPCR could detect suspected PJI with high sensitivity after antibiotic therapy. RNA-based qPCR could reduce the false positive rates of DNA-based assays. qPCR-based methods could improve the efficiency of PJI diagnosis. © 2018 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.
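
    The diagnostic summary statistics reported above follow directly from a 2x2 table of test result against the reference standard; a tiny sketch with hypothetical counts:

      def diagnostic_summary(tp, fp, fn, tn):
          """Sensitivity, specificity and accuracy from a 2x2 table of test vs. reference."""
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          acc = (tp + tn) / (tp + fp + fn + tn)
          return sens, spec, acc

      # Hypothetical counts for a qPCR assay against a clinical PJI reference standard
      print("sens=%.3f spec=%.3f acc=%.3f" % diagnostic_summary(tp=31, fp=5, fn=7, tn=28))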

  2. Invasive candidiasis: future directions in non-culture based diagnosis.

    PubMed

    Posch, Wilfried; Heimdörfer, David; Wilflingseder, Doris; Lass-Flörl, Cornelia

    2017-09-01

    Delayed initial antifungal therapy is associated with high mortality rates caused by invasive candida infections, since accurate detection of the opportunistic pathogenic yeast and its identification display a diagnostic challenge. diagnosis of candida infections relies on time-consuming methods such as blood cultures, serologic and histopathologic examination. to allow for fast detection and characterization of invasive candidiasis, there is a need to improve diagnostic tools. trends in diagnostics switch to non-culture-based methods, which allow specified diagnosis within significantly shorter periods of time in order to provide early and appropriate antifungal treatment. Areas covered: within this review comprise novel pathogen- and host-related testing methods, e.g. multiplex-PCR analyses, T2 magnetic resonance, fungus-specific DNA microarrays, microRNA characterization or analyses of IL-17 as biomarker for early detection of invasive candidiasis. Expert commentary: Early recognition and diagnosis of fungal infections is a key issue for improved patient management. As shown in this review, a broad range of novel molecular based tests for the detection and identification of Candida species is available. However, several assays are in-house assays and lack standardization, clinical validation as well as data on sensitivity and specificity. This underscores the need for the development of faster and more accurate diagnostic tests.

  3. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
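
    The sampler described builds on probit regression with data augmentation. As a sketch of that core update only (an Albert-Chib-style Gibbs sampler for plain probit regression, without the occupancy and detection layers of the paper, and in Python rather than R), with simulated data and an arbitrary vague prior:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Simulated probit-regression data (stand-in for a covariate of occurrence)
      n = 400
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      beta_true = np.array([-0.3, 1.0])
      y = (X @ beta_true + rng.normal(size=n)) > 0

      # Gibbs sampler with latent normal variables z_i and a vague N(0, 100 I) prior on beta
      B0_inv = np.eye(2) / 100.0
      V = np.linalg.inv(X.T @ X + B0_inv)
      beta = np.zeros(2)
      draws = []
      for it in range(2000):
          mu = X @ beta
          # z_i | y_i, beta: normal truncated to be positive if y_i = 1, negative otherwise
          lo = np.where(y, 0.0 - mu, -np.inf)
          hi = np.where(y, np.inf, 0.0 - mu)
          z = mu + stats.truncnorm.rvs(lo, hi, size=n, random_state=rng)
          # beta | z: multivariate normal
          m = V @ (X.T @ z)
          beta = rng.multivariate_normal(m, V)
          if it >= 500:                       # discard burn-in
              draws.append(beta)

      print("posterior mean of beta:", np.mean(draws, axis=0))   # close to beta_true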

  4. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.

  5. Qualitative and quantitative determination of yohimbine in authentic yohimbe bark and in commercial aphrodisiacs by HPLC-UV-API/MS methods.

    PubMed

    Zanolari, Boris; Ndjoko, Karine; Ioset, Jean-Robert; Marston, Andrew; Hostettmann, Kurt

    2003-01-01

    The development and validation of a rapid qualitative and quantitative method based on an HPLC-UV-MS technique with atmospheric pressure chemical ionisation and electrospray ionisation for the analysis of yohimbine in a number of commercial aphrodisiac products is reported. HPLC with multiple-stage mass spectrometry experiments allowed the identification of the target compound and increased the selectivity of complex analyses such as those involved with multi-botanical preparations. The precision and the robustness of the method were improved by the use of two internal standards: codeine for UV detection and deuterium-labelled yohimbine for MS detection. Twenty commercial aphrodisiac preparations were analysed and the amount of yohimbine measured and expressed as the maximal dose per day suggested on product labels ranged from 1.32 to 23.16 mg.

  6. [The mathematical methods for the estimation of the workload on the personnel of the Bureau of Forensic Medical Expertise].

    PubMed

    Édeleva, A N; Proĭdakova, E V

    2014-01-01

    The objective of the present study was to analyse the financial support of forensic medical research with the application of mathematical methods based at the Nizhni Novgorod Regional Bureau of Forensic Medical Expertise. The authors elaborated the prognosis of the expenses for the forensic medical expertise of the corpses of the elderly and senile subjects.

  7. Estimation of S/G ratio in woods using 1064 nm FT-Raman spectroscopy

    Treesearch

    Umesh P. Agarwal; Sally A. Ralph; Dharshana Padmakshan; Sarah Liu; Steven D. Karlen; Cliff Foster; John Ralph

    2015-01-01

    Two simple methods based on the 370 cm⁻¹ Raman band intensity were developed for estimation of the syringyl-to-guaiacyl (S/G) ratio in woods. The methods, in principle, are representative of the whole cell wall lignin and not just the portion of lignin that gets cleaved to release monomers, for example, during certain S/G chemical analyses. As such,...

  8. Optimisation of DNA extraction from the crustacean Daphnia

    PubMed Central

    Athanasio, Camila Gonçalves; Chipman, James K.; Viant, Mark R.

    2016-01-01

    Daphnia are key model organisms for mechanistic studies of phenotypic plasticity, adaptation and microevolution, which have led to an increasing demand for genomics resources. A key step in any genomics analysis, such as high-throughput sequencing, is the availability of sufficient and high quality DNA. Although commercial kits exist to extract genomic DNA from several species, preparation of high quality DNA from Daphnia spp. and other chitinous species can be challenging. Here, we optimise methods for tissue homogenisation, DNA extraction and quantification customised for different downstream analyses (e.g., LC-MS/MS, HiSeq, mate pair sequencing or Nanopore). We demonstrate that if Daphnia magna are homogenised as whole animals (including the carapace), absorbance-based DNA quantification methods significantly over-estimate the amount of DNA, resulting in the use of insufficient starting material for experiments, such as preparation of sequencing libraries. This is attributed to the high refractive index of chitin in Daphnia’s carapace at 260 nm. Therefore, unless the carapace is removed by overnight proteinase digestion, the extracted DNA should be quantified with fluorescence-based methods. However, overnight proteinase digestion will result in partial fragmentation of the DNA; therefore, the prepared DNA is not suitable for downstream methods that require high molecular weight DNA, such as PacBio, mate pair sequencing and Nanopore. In conclusion, we found that the MasterPure DNA purification kit, coupled with grinding of frozen tissue, is the best method for extraction of high molecular weight DNA as long as the extracted DNA is quantified with fluorescence-based methods. This method generated high-yield, high-molecular-weight DNA (3.10 ± 0.63 ng/µg dry mass, fragments >60 kb), free of organic contaminants (phenol, chloroform), and is suitable for a large number of downstream analyses. PMID:27190714

  9. Single-case synthesis tools II: Comparing quantitative outcome measures.

    PubMed

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes were compared to determine whether the choice of synthesis method within and across classes impacts conclusions regarding effectiveness: overlap measures (percentage non-overlapping data, improvement rate difference [IRD], and Tau) and parametric within-case effect sizes (standardized mean difference and the log response ratio [increasing and decreasing]). The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except IRD. However, subgroup analyses resulted in variable conclusions, indicating that the choice of measures for SCD meta-analyses can impact conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
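
    Two of the parametric within-case effect sizes named above are easy to compute by hand; the sketch below does so on hypothetical baseline (A) and intervention (B) phase counts from a single case, and is not drawn from the review's data.

      import numpy as np

      # Hypothetical counts of a target behaviour per session in baseline (A) and
      # intervention (B) phases of one single-case design
      A = np.array([8, 7, 9, 8, 10], dtype=float)
      B = np.array([5, 4, 6, 3, 4, 5], dtype=float)

      # Log response ratio (decreasing): positive values indicate behaviour reduction
      lrr_d = np.log(A.mean() / B.mean())

      # Within-case standardized mean difference, scaled by the baseline SD
      smd = (B.mean() - A.mean()) / A.std(ddof=1)

      print("LRR (decreasing): %.3f" % lrr_d)
      print("Within-case SMD:  %.3f" % smd)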

  10. Vander Lugt correlation of DNA sequence data

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Hawk, James F.; Martin, James C.

    1990-12-01

    DNA, the molecule containing the genetic code of an organism, is a linear chain of subunits. It is the sequence of subunits, of which there are four kinds, that constitutes the unique blueprint of an individual. This sequence is the focus of a large number of analyses performed by an army of geneticists, biologists, and computer scientists. Most of these analyses entail searches for specific subsequences within the larger set of sequence data. Thus, most analyses are essentially pattern recognition or correlation tasks. Yet, there are special features to such analysis that influence the strategy and methods of an optical pattern recognition approach. While the serial processing employed in digital electronic computers remains the main engine of sequence analyses, there is no fundamental reason that more efficient parallel methods cannot be used. We describe an approach using optical pattern recognition (OPR) techniques based on matched spatial filtering. This allows parallel comparison of large blocks of sequence data. In this study we have simulated a Vander Lugt architecture implementing our approach. Searches for specific target sequence strings within a block of DNA sequence from the ColE1 plasmid are performed.
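
    The optical approach is, at heart, a correlation search. A digital analogue (a numpy sketch, not the authors' optical simulation) one-hot encodes the sequences and slides the target over the data block, with the peak score marking the match; the sequences here are randomly generated.

      import numpy as np

      def one_hot(seq):
          """Encode a DNA string as a 4 x len(seq) array of indicator rows (A, C, G, T)."""
          lut = {"A": 0, "C": 1, "G": 2, "T": 3}
          m = np.zeros((4, len(seq)))
          for i, base in enumerate(seq):
              m[lut[base], i] = 1.0
          return m

      rng = np.random.default_rng(0)
      block = "".join(rng.choice(list("ACGT"), size=200))
      target = "GATTACA"
      block = block[:120] + target + block[120:]        # plant the target subsequence

      B, T = one_hot(block), one_hot(target)
      # Correlation score at each offset: number of matching bases (matched-filter analogue)
      scores = np.array([np.sum(B[:, i:i + T.shape[1]] * T)
                         for i in range(B.shape[1] - T.shape[1] + 1)])
      print("best offset:", int(scores.argmax()), "score:", scores.max())  # peak at the planted site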

  11. Ground State and Finite Temperature Lanczos Methods

    NASA Astrophysics Data System (ADS)

    Prelovšek, P.; Bonča, J.

    The present review focuses on recent developments of exact-diagonalization (ED) methods that use the Lanczos algorithm to transform large sparse matrices into tridiagonal form. We begin with a review of the basic principles of the Lanczos method for computing ground-state static as well as dynamical properties. Next, the generalization to finite temperatures, in the form of the well-established finite-temperature Lanczos method, is described; it allows the evaluation of static and dynamic quantities at temperatures T>0 within various correlated models. Several more recently introduced extensions and modifications of this method are analysed, in particular the low-temperature Lanczos method and the microcanonical Lanczos method, which is especially applicable in the high-T regime. In order to overcome the problem of exponentially growing Hilbert spaces that prevents ED calculations on larger lattices, different approaches based on Lanczos diagonalization within a reduced basis have been developed; in this context, a recently developed method based on ED within a limited functional space is reviewed. Finally, we briefly discuss the real-time evolution of correlated systems far from equilibrium, which can be simulated using ED and Lanczos-based methods, as well as approaches based on diagonalization in a reduced basis.
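
    A minimal numpy sketch of the basic Lanczos recursion for a ground-state estimate (build the tridiagonal matrix from a symmetric operator and diagonalise it), with full reorthogonalisation for stability on a small dense test matrix; this is only the textbook building block, not the finite-temperature machinery discussed in the review.

      import numpy as np

      def lanczos_ground_state(H, m=50, seed=0):
          """Approximate the lowest eigenvalue of a real symmetric matrix H
          with an m-step Lanczos recursion (full reorthogonalisation for stability)."""
          rng = np.random.default_rng(seed)
          n = H.shape[0]
          v = rng.normal(size=n)
          v /= np.linalg.norm(v)
          V = [v]
          alphas, betas = [], []
          for j in range(m):
              w = H @ V[j]
              alpha = V[j] @ w
              w = w - alpha * V[j] - (betas[-1] * V[j - 1] if j > 0 else 0.0)
              w -= np.array(V).T @ (np.array(V) @ w)      # reorthogonalise against all previous vectors
              beta = np.linalg.norm(w)
              alphas.append(alpha)
              if beta < 1e-12 or j == m - 1:
                  break
              betas.append(beta)
              V.append(w / beta)
          T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
          return np.linalg.eigvalsh(T)[0]

      # Small random symmetric test matrix
      rng = np.random.default_rng(1)
      A = rng.normal(size=(200, 200))
      H = (A + A.T) / 2
      print("Lanczos estimate:       ", lanczos_ground_state(H, m=60))
      print("exact lowest eigenvalue:", np.linalg.eigvalsh(H)[0])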

  12. Spatial extent of a hydrothermal system at Kilauea Volcano, Hawaii, determined from array analyses of shallow long-period seismicity 1. Method

    USGS Publications Warehouse

    Almendros, J.; Chouet, B.; Dawson, P.

    2001-01-01

    We present a probabilistic method to locate the source of seismic events using seismic antennas. The method is based on a comparison of the event azimuths and slownesses derived from frequency-slowness analyses of array data, with a slowness vector model. Several slowness vector models are considered including both homogeneous and horizontally layered half-spaces and also a more complex medium representing the actual topography and three-dimensional velocity structure of the region under study. In this latter model the slowness vector is obtained from frequency-slowness analyses of synthetic signals. These signals are generated using the finite difference method and include the effects of topography and velocity structure to reproduce as closely as possible the behavior of the observed wave fields. A comparison of these results with those obtained with a homogeneous half-space demonstrates the importance of structural and topographic effects, which, if ignored, lead to a bias in the source location. We use synthetic seismograms to test the accuracy and stability of the method and to investigate the effect of our choice of probability distributions. We conclude that this location method can provide the source position of shallow events within a complex volcanic structure such as Kilauea Volcano with an error of ±200 m. Copyright 2001 by the American Geophysical Union.

  13. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Layered clustering multi-fault diagnosis for hydraulic piston pump

    NASA Astrophysics Data System (ADS)

    Du, Jun; Wang, Shaoping; Zhang, Haiyan

    2013-04-01

    Efficient diagnosis is very important for improving the reliability and performance of aircraft hydraulic piston pumps, and it is one of the key technologies in prognostic and health management systems. In practice, due to the harsh working environment and heavy working loads, multiple faults of an aircraft hydraulic pump may occur simultaneously after long periods of operation. However, most existing diagnosis methods can only distinguish pump faults that occur individually. Therefore, new methods need to be developed to realize effective diagnosis of simultaneous multiple faults in aircraft hydraulic pumps. In this paper, a new method based on a layered clustering algorithm is proposed to diagnose multiple faults of an aircraft hydraulic pump that occur simultaneously. Intensive failure mechanism analyses of the five main types of faults are carried out, and based on these analyses the optimal combination and layout of diagnostic sensors is attained. A three-layered diagnosis reasoning engine is designed according to the faults' risk priority numbers and the characteristics of different fault feature extraction methods. The most serious failures are first distinguished with individual signal processing. For the remaining faults, i.e., swash plate eccentricity and incremental clearance increases between piston and slipper, a clustering diagnosis algorithm based on the statistical average relative power difference (ARPD) is proposed. By effectively enhancing the fault features of these two faults, the ARPDs calculated from vibration signals are employed to complete the hypothesis testing. The ARPDs of the different faults follow different probability distributions. Compared with the classical fast Fourier transform-based spectrum diagnosis method, the experimental results demonstrate that the proposed algorithm can diagnose multiple faults that occur synchronously with higher precision and reliability.

  15. Trends in study design and the statistical methods employed in a leading general medicine journal.

    PubMed

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate study design and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Due to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (eg Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. These methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel design, such as adaptive dose selection and sample size re-estimation, was sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. Use of adaptive design with interim analyses is increasing after the presentation of the FDA guidance for adaptive design. © 2017 John Wiley & Sons Ltd.

  16. Validation of a self-administered web-based 24-hour dietary recall among pregnant women.

    PubMed

    Savard, Claudia; Lemieux, Simone; Lafrenière, Jacynthe; Laramée, Catherine; Robitaille, Julie; Morisset, Anne-Sophie

    2018-04-23

    The use of valid dietary assessment methods is crucial to analyse adherence to dietary recommendations among pregnant women. This study aims to assess the relative validity of a self-administered Web-based 24-h dietary recall, the R24W, against a pen-paper 3-day food record (FR) among pregnant women. Sixty (60) pregnant women recruited at 9.3 ± 0.7 weeks of pregnancy in Quebec City completed, at each trimester, three R24W recalls and a 3-day FR. Mean energy and nutrient intakes reported by both tools were compared using paired Student t-tests. Pearson correlations were used to analyze the association between both methods. Agreement between the two methods was evaluated using cross-classification analyses, weighted kappa coefficients and Bland-Altman analyses. Pearson correlation coefficients were all significant, except for vitamin B12 (r = 0.03; p = 0.83) and ranged from 0.27 to 0.76 (p < 0.05). Differences between mean intakes assessed by the R24W and the FR did not exceed 10% in 19 variables and were not significant for 16 out of 26 variables. In cross-classification analyses, the R24W ranked, on average, 79.1% of participants in the same or adjacent quartiles as the FR. Compared to a 3-day FR, the R24W is a valid method to assess intakes of energy and most nutrients but may be less accurate in the evaluation of intakes of fat (as a proportion of energy intake), vitamin D, zinc and folic acid. During pregnancy, the R24W was a more accurate tool at the group level than at the individual level and should, therefore, be used in an epidemiological rather than a clinical setting. The R24W may be particularly valuable as a tool used in cohort studies to provide valid information on pregnant women's dietary intakes and facilitate evaluation of associations between diet and adverse pregnancy outcomes.
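
    The Bland-Altman agreement calculation used in such validations reduces to a mean bias and limits of agreement; a small numpy sketch on made-up paired energy intakes (not the study's data):

      import numpy as np

      # Hypothetical energy intakes (kcal/day) for the same women from the two tools
      r24w = np.array([1950, 2210, 1780, 2450, 2050, 1890, 2320, 2140], dtype=float)
      fr = np.array([2010, 2150, 1850, 2380, 2000, 1975, 2280, 2200], dtype=float)

      diff = r24w - fr
      bias = diff.mean()
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)     # 95% limits of agreement

      print("mean bias: %.1f kcal/day" % bias)
      print("limits of agreement: %.1f to %.1f kcal/day" % loa)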

  17. Effectiveness of Jigsaw learning compared to lecture-based learning in dental education.

    PubMed

    Sagsoz, O; Karatas, O; Turel, V; Yildiz, M; Kaya, E

    2017-02-01

    The objective of this study was to evaluate the success levels of students using the Jigsaw learning method in dental education. Fifty students with similar grade point average (GPA) scores were selected and randomly assigned into one of two groups (n = 25). A pretest concerning 'adhesion and bonding agents in dentistry' was administered to all students before classes. The Jigsaw learning method was applied to the experimental group for 3 weeks. At the same time, the control group was taking classes using the lecture-based learning method. At the end of the 3 weeks, all students were retested (post-test) on the subject. A retention test was administered 3 weeks after the post-test. Mean scores were calculated for each test for the experimental and control groups, and the data obtained were analysed using the independent samples t-test. No significant difference was determined between the Jigsaw and lecture-based methods at pretest or post-test. The highest mean test score was observed in the post-test with the Jigsaw method. In the retention test, success with the Jigsaw method was significantly higher than that with the lecture-based method. The Jigsaw method is as effective as the lecture-based method. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Dispensing Processes Impact Apparent Biological Activity as Determined by Computational and Statistical Analyses

    PubMed Central

    Ekins, Sean; Olechno, Joe; Williams, Antony J.

    2013-01-01

    Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research. PMID:23658723

  19. The analysis of morphometric data on rocky mountain wolves and arctic wolves using statistical methods

    NASA Astrophysics Data System (ADS)

    Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad

    2018-04-01

    Morphometrics is a quantitative analysis depending on the shape and size of several specimens. Morphometric quantitative analyses are commonly used to analyse fossil records, and the shape and size of specimens, among other applications. The aim of the study is to find the differences between rocky mountain wolves and arctic wolves based on gender. The sample utilised secondary data which included seven independent variables and two dependent variables. Statistical modelling was used in the analysis, such as the analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed differences between arctic wolves and rocky mountain wolves based on the independent factors and gender.
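
    As a small illustration of the kind of univariate comparison used here (not the study's data), the scipy sketch below runs a one-way ANOVA on simulated measurements for the two wolf groups; the variable and values are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Simulated skull lengths (mm) as a stand-in morphometric variable
      rocky_mountain = rng.normal(260, 8, size=25)
      arctic = rng.normal(252, 8, size=25)

      f_stat, p_value = stats.f_oneway(rocky_mountain, arctic)
      print("one-way ANOVA: F = %.2f, p = %.4f" % (f_stat, p_value))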

  20. Cost-effectiveness of different strategies for selecting and treating individuals at increased risk of osteoporosis or osteopenia: a systematic review.

    PubMed

    Müller, Dirk; Pulm, Jannis; Gandjour, Afschin

    2012-01-01

    To compare cost-effectiveness modeling analyses of strategies to prevent osteoporotic and osteopenic fractures either based on fixed thresholds using bone mineral density or based on variable thresholds including bone mineral density and clinical risk factors. A systematic review was performed by using the MEDLINE database and reference lists from previous reviews. On the basis of predefined inclusion/exclusion criteria, we identified relevant studies published since January 2006. Articles included for the review were assessed for their methodological quality and results. The literature search resulted in 24 analyses, 14 of them using a fixed-threshold approach and 10 using a variable-threshold approach. On average, 70% of the criteria for methodological quality were fulfilled, but almost half of the analyses did not include medication adherence in the base case. The results of variable-threshold strategies were more homogeneous and showed more favorable incremental cost-effectiveness ratios compared with those based on a fixed threshold with bone mineral density. For analyses with fixed thresholds, incremental cost-effectiveness ratios varied from €80,000 per quality-adjusted life-year in women aged 55 years to cost saving in women aged 80 years. For analyses with variable thresholds, the range was €47,000 to cost savings. Risk assessment using variable thresholds appears to be more cost-effective than selecting high-risk individuals by fixed thresholds. Although the overall quality of the studies was fairly good, future economic analyses should further improve their methods, particularly in terms of including more fracture types, incorporating medication adherence, and including or discussing unrelated costs during added life-years. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. Comparison of sequencing-based methods to profile DNA methylation and identification of monoallelic epigenetic modifications

    PubMed Central

    Harris, R. Alan; Wang, Ting; Coarfa, Cristian; Nagarajan, Raman P.; Hong, Chibo; Downey, Sara L.; Johnson, Brett E.; Fouse, Shaun D.; Delaney, Allen; Zhao, Yongjun; Olshen, Adam; Ballinger, Tracy; Zhou, Xin; Forsberg, Kevin J.; Gu, Junchen; Echipare, Lorigail; O’Geen, Henriette; Lister, Ryan; Pelizzola, Mattia; Xi, Yuanxin; Epstein, Charles B.; Bernstein, Bradley E.; Hawkins, R. David; Ren, Bing; Chung, Wen-Yu; Gu, Hongcang; Bock, Christoph; Gnirke, Andreas; Zhang, Michael Q.; Haussler, David; Ecker, Joseph; Li, Wei; Farnham, Peggy J.; Waterland, Robert A.; Meissner, Alexander; Marra, Marco A.; Hirst, Martin; Milosavljevic, Aleksandar; Costello, Joseph F.

    2010-01-01

    Sequencing-based DNA methylation profiling methods are comprehensive and, as accuracy and affordability improve, will increasingly supplant microarrays for genome-scale analyses. Here, four sequencing-based methodologies were applied to biological replicates of human embryonic stem cells to compare their CpG coverage genome-wide and in transposons, resolution, cost, concordance and its relationship with CpG density and genomic context. The two bisulfite methods reached concordance of 82% for CpG methylation levels and 99% for non-CpG cytosine methylation levels. Using binary methylation calls, two enrichment methods were 99% concordant, while regions assessed by all four methods were 97% concordant. To achieve comprehensive methylome coverage while reducing cost, an approach integrating two complementary methods was examined. The integrative methylome profile along with histone methylation, RNA, and SNP profiles derived from the sequence reads allowed genome-wide assessment of allele-specific epigenetic states, identifying most known imprinted regions and new loci with monoallelic epigenetic marks and monoallelic expression. PMID:20852635

  2. Teaching evidence-based synthesis: an examination of the development and delivery of two innovative methodologies used at the University of Portsmouth.

    PubMed

    Gorczynski, Paul; Burnell, Karen; Dewey, Ann; Costello, Joseph T

    2017-02-01

    Evidence based practice (EBP) is a process that involves making conscientious decisions that take into account the best available information, clinical expertise, and values and experiences of the patient. EBP helps empower health care professionals to establish service provisions that are clinically excellent, cost-effective, and culturally sensitive to the wishes of their patients. With a need for rapid integration of new evidence into EBP, systematic reviews and meta-analyses have become important tools for health care professionals. Systematic reviews and meta-analyses are conducted in a conscientious manner, following an established set of rules where individuals identify studies that address a particular question based on clearly defined inclusion and exclusion criteria along with a predetermined method of analysis. Conducting systematic reviews and meta-analyses is neither easy nor quick and requires knowledge of a particular subject area, research methods, and statistics. Teaching health care professionals, including undergraduate and graduate students, the processes and skills necessary to carry out systematic reviews and meta-analyses is essential, yet few teaching resources exist for academic staff to facilitate this endeavor. The purpose of this article is to present two strategies taken by academic staff in the Faculty of Science at the University of Portsmouth, UK to teach evidence synthesis and processes to enhance EBP. One case involves a pedagogical approach used with exercise science master's students while the other details the work of an on-line postgraduate certificate program that has been developed in collaboration with Cochrane UK. © 2016 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  3. Next-Generation Sequencing-Based Approaches for Mutation Mapping and Identification in Caenorhabditis elegans

    PubMed Central

    Doitsidou, Maria; Jarriault, Sophie; Poole, Richard J.

    2016-01-01

    The use of next-generation sequencing (NGS) has revolutionized the way phenotypic traits are assigned to genes. In this review, we describe NGS-based methods for mapping a mutation and identifying its molecular identity, with an emphasis on applications in Caenorhabditis elegans. In addition to an overview of the general principles and concepts, we discuss the main methods, provide practical and conceptual pointers, and guide the reader in the types of bioinformatics analyses that are required. Owing to the speed and the plummeting costs of NGS-based methods, mapping and cloning a mutation of interest has become straightforward, quick, and relatively easy. Removing this bottleneck previously associated with forward genetic screens has significantly advanced the use of genetics to probe fundamental biological processes in an unbiased manner. PMID:27729495

  4. Sea surface temperature predictions using a multi-ocean analysis ensemble scheme

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Zhu, Jieshun; Li, Zhongxian; Chen, Haishan; Zeng, Gang

    2017-08-01

    This study examined the global sea surface temperature (SST) predictions by a so-called multiple-ocean analysis ensemble (MAE) initialization method which was applied in the National Centers for Environmental Prediction (NCEP) Climate Forecast System Version 2 (CFSv2). Different from most operational climate prediction practices which are initialized by a specific ocean analysis system, the MAE method is based on multiple ocean analyses. In the paper, the MAE method was first justified by analyzing the ocean temperature variability in four ocean analyses which all are/were applied for operational climate predictions either at the European Centre for Medium-range Weather Forecasts or at NCEP. It was found that these systems exhibit substantial uncertainties in estimating the ocean states, especially at the deep layers. Further, a set of MAE hindcasts was conducted based on the four ocean analyses with CFSv2, starting from each April during 1982-2007. The MAE hindcasts were verified against a subset of hindcasts from the NCEP CFS Reanalysis and Reforecast (CFSRR) Project. Comparisons suggested that MAE shows better SST predictions than CFSRR over most regions where ocean dynamics plays a vital role in SST evolutions, such as the El Niño and Atlantic Niño regions. Furthermore, significant improvements were also found in summer precipitation predictions over the equatorial eastern Pacific and Atlantic oceans, for which the local SST prediction improvements should be responsible. The prediction improvements by MAE imply a problem for most current climate predictions which are based on a specific ocean analysis system. That is, their predictions would drift towards states biased by errors inherent in their ocean initialization system, and thus have large prediction errors. In contrast, MAE arguably has an advantage by sampling such structural uncertainties, and could efficiently cancel these errors out in their predictions.

  5. Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system

    NASA Technical Reports Server (NTRS)

    Appa, Kari; Smith, Michael J. C.

    1987-01-01

    A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.

  6. District nursing workforce planning: a review of the methods.

    PubMed

    Reid, Bernie; Kane, Kay; Curran, Carol

    2008-11-01

    District nursing services in Northern Ireland face increasing demands and challenges that may be met through effective and efficient workforce planning and development. The aim of this paper is to critically analyse district nursing workforce planning and development methods, in an attempt to find a suitable method for Northern Ireland. A systematic analysis of the literature reveals four methods: professional judgement, population-based health needs, caseload analysis and dependency-acuity. Each method has strengths and weaknesses. Professional judgement offers a 'belt and braces' approach but lacks sensitivity to fluctuating patient numbers. Population-based health needs methods develop staffing algorithms that reflect deprivation and geographical spread, but are poorly understood by district nurses. Caseload analysis promotes equitable workloads but poorly performing district nursing localities may continue if benchmarking processes only consider local data. Dependency-acuity methods provide a means of equalizing and prioritizing workload but are prone to district nurses overstating factors in patient dependency or understating carers' capability. In summary, a mixed-method approach is advocated to evaluate and adjust the size and mix of district nursing teams using empirically determined patient dependency and activity-based variables based on the population's health needs.

  7. A brief measure of attitudes toward mixed methods research in psychology.

    PubMed

    Roberts, Lynne D; Povee, Kate

    2014-01-01

    The adoption of mixed methods research in psychology has trailed behind other social science disciplines. Teaching psychology students, academics, and practitioners about mixed methodologies may increase the use of mixed methods within the discipline. However, tailoring and evaluating education and training in mixed methodologies requires an understanding of, and way of measuring, attitudes toward mixed methods research in psychology. To date, no such measure exists. In this article we present the development and initial validation of a new measure: Attitudes toward Mixed Methods Research in Psychology. A pool of 42 items, developed from previous qualitative research on attitudes toward mixed methods research, was administered along with validation measures via an online survey to a convenience sample of 274 psychology students, academics and psychologists. Principal axis factoring with varimax rotation on a subset of the sample produced a four-factor, 12-item solution. Confirmatory factor analysis on a separate subset of the sample indicated that a higher-order four-factor model provided the best fit to the data. The four factors, 'Limited Exposure,' '(in)Compatibility,' 'Validity,' and 'Tokenistic Qualitative Component,' each have acceptable internal reliability. Known-groups validity analyses based on preferred research orientation and self-rated mixed methods research skills, and convergent and divergent validity analyses based on measures of attitudes toward psychology as a science and scientist and practitioner orientation, provide initial validation of the measure. This brief, internally reliable measure can be used in assessing attitudes toward mixed methods research in psychology, measuring change in attitudes as part of the evaluation of mixed methods education, and in larger research programs.
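
    For readers who want to reproduce this kind of factor structure on their own item data, the sketch below uses scikit-learn's FactorAnalysis with varimax rotation as a rough stand-in for the principal axis factoring reported in the abstract (the extraction methods differ); the simulated item responses and dimensions are assumptions, not the authors' survey data.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Simulated responses: 274 respondents x 12 items (placeholder for the real survey).
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(274, 4))                  # four hypothetical factors
    loadings = rng.normal(scale=0.7, size=(4, 12))
    items = latent @ loadings + rng.normal(scale=0.5, size=(274, 12))

    # Maximum-likelihood factor analysis with varimax rotation (an approximation of
    # principal axis factoring followed by varimax).
    fa = FactorAnalysis(n_components=4, rotation="varimax").fit(items)
    print(np.round(fa.components_, 2))                  # rotated loadings, factors x items
    ```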

  8. Stability analysis of Eulerian-Lagrangian methods for the one-dimensional shallow-water equations

    USGS Publications Warehouse

    Casulli, V.; Cheng, R.T.

    1990-01-01

    In this paper, stability and error analyses are discussed for some finite difference methods when applied to the one-dimensional shallow-water equations. Two finite difference formulations, which are based on a combined Eulerian-Lagrangian approach, are discussed. In the first part of this paper the results of numerical analyses for an explicit Eulerian-Lagrangian method (ELM) have shown that the method is unconditionally stable. This method, which is a generalized fixed grid method of characteristics, covers the Courant-Isaacson-Rees method as a special case. Some artificial viscosity is introduced by this scheme. However, because the method is unconditionally stable, the artificial viscosity can be brought under control either by reducing the spatial increment or by increasing the size of the time step. The second part of the paper discusses a class of semi-implicit finite difference methods for the one-dimensional shallow-water equations. This method, when the Eulerian-Lagrangian approach is used for the convective terms, is also unconditionally stable and highly accurate for small space increments or large time steps. The semi-implicit methods seem to be more computationally efficient than the explicit ELM; at each time step a single tridiagonal system of linear equations is solved. The combined explicit and implicit ELM is best used in formulating a solution strategy for solving a network of interconnected channels. The explicit ELM is used at channel junctions for each time step. The semi-implicit method is then applied to the interior points in each channel segment. Following this solution strategy, the channel network problem can be reduced to a set of independent one-dimensional open-channel flow problems. Numerical results support properties given by the stability and error analyses. © 1990.
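
    The practical appeal of the semi-implicit scheme is that each time step reduces to one tridiagonal solve. A minimal Thomas-algorithm sketch (not the authors' code) shows how cheaply such a system is handled:

    ```python
    import numpy as np

    def thomas_solve(lower, diag, upper, rhs):
        """Solve a tridiagonal system A x = rhs; lower[i]=A[i,i-1], diag[i]=A[i,i], upper[i]=A[i,i+1]."""
        n = len(diag)
        c, d = np.empty(n), np.empty(n)
        c[0], d[0] = upper[0] / diag[0], rhs[0] / diag[0]
        for i in range(1, n):
            denom = diag[i] - lower[i] * c[i - 1]
            c[i] = upper[i] / denom if i < n - 1 else 0.0
            d[i] = (rhs[i] - lower[i] * d[i - 1]) / denom
        x = np.empty(n)
        x[-1] = d[-1]
        for i in range(n - 2, -1, -1):
            x[i] = d[i] - c[i] * x[i + 1]
        return x

    # Small consistency check against a dense solver.
    n = 5
    lower = np.array([0.0, -1, -1, -1, -1])
    diag = np.full(n, 4.0)
    upper = np.array([-1.0, -1, -1, -1, 0])
    rhs = np.arange(1.0, n + 1)
    A = np.diag(diag) + np.diag(lower[1:], -1) + np.diag(upper[:-1], 1)
    assert np.allclose(thomas_solve(lower, diag, upper, rhs), np.linalg.solve(A, rhs))
    ```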

  9. A method for the automated processing and analysis of images of ULVWF-platelet strings.

    PubMed

    Reeve, Scott R; Abbitt, Katherine B; Cruise, Thomas D; Hose, D Rodney; Lawford, Patricia V

    2013-01-01

    We present a method for identifying and analysing unusually large von Willebrand factor (ULVWF)-platelet strings in noisy low-quality images. The method requires relatively inexpensive, non-specialist equipment and allows multiple users to be employed in the capture of images. Images are subsequently enhanced and analysed, using custom-written software to perform the processing tasks. The formation and properties of ULVWF-platelet strings released in in vitro flow-based assays have recently become a popular research area. Endothelial cells are incorporated into a flow chamber, chemically stimulated to induce ULVWF release and perfused with isolated platelets which are able to bind to the ULVWF to form strings. The numbers and lengths of the strings released are related to characteristics of the flow. ULVWF-platelet strings are routinely identified by eye from video recordings captured during experiments and analysed manually using basic NIH image software to determine the number of strings and their lengths. This is a laborious, time-consuming task and a single experiment, often consisting of data from four to six dishes of endothelial cells, can take 2 or more days to analyse. The method described here allows analysis of the strings to provide data such as the number and length of strings, number of platelets per string and the distance between each platelet to be found. The software reduces analysis time, and more importantly removes user subjectivity, producing highly reproducible results with an error of less than 2% when compared with detailed manual analysis.
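
    The image-processing steps described (noise suppression, thresholding, and measurement of labelled strings) can be sketched with scikit-image; the file name, threshold choice and size filter below are placeholders and not taken from the authors' custom software.

    ```python
    import numpy as np
    from skimage import io, filters, measure, morphology

    def measure_strings(image_path, min_area=20):
        """Rough ULVWF-platelet string measurement from one grey-scale frame."""
        img = io.imread(image_path, as_gray=True)
        smoothed = filters.gaussian(img, sigma=1.0)           # suppress camera noise
        mask = smoothed > filters.threshold_otsu(smoothed)    # global threshold
        mask = morphology.remove_small_objects(mask, min_size=min_area)
        labels = measure.label(mask)
        return [
            {"label": r.label, "length_px": r.major_axis_length, "area_px": r.area}
            for r in measure.regionprops(labels)
        ]

    # Usage sketch (placeholder path):
    # for s in measure_strings("frame_0001.tif"):
    #     print(s)
    ```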

  10. Performance Assessment of Kernel Density Clustering for Gene Expression Profile Data

    PubMed Central

    Zeng, Beiyan; Chen, Yiping P.; Smith, Oscar H.

    2003-01-01

    Kernel density smoothing techniques have been used in classification or supervised learning of gene expression profile (GEP) data, but their applications to clustering or unsupervised learning of those data have not been explored and assessed. Here we report a kernel density clustering method for analysing GEP data and compare its performance with the three most widely used clustering methods: hierarchical clustering, K-means clustering, and multivariate mixture model-based clustering. Using several methods to measure agreement, between-cluster isolation, and within-cluster coherence, such as the Adjusted Rand Index, the Pseudo F test, the r² test, and the profile plot, we have assessed the effectiveness of kernel density clustering for recovering clusters, and its robustness against noise on clustering both simulated and real GEP data. Our results show that the kernel density clustering method has excellent performance in recovering clusters from simulated data and in grouping large real expression profile data sets into compact and well-isolated clusters, and that it is the most robust clustering method for analysing noisy expression profile data compared to the other three methods assessed. PMID:18629292
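
    As a rough illustration of the comparison, the sketch below uses scikit-learn's MeanShift (a kernel-density, mode-seeking clusterer) as a stand-in for the paper's kernel density clustering and scores agreement with the simulated class labels by the Adjusted Rand Index; the simulated data are not the paper's datasets.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans, MeanShift
    from sklearn.datasets import make_blobs
    from sklearn.metrics import adjusted_rand_score

    # Simulated expression profiles: 300 genes x 10 conditions, three true groups plus noise.
    X, true_labels = make_blobs(n_samples=300, n_features=10, centers=3, random_state=0)
    X += np.random.default_rng(0).normal(scale=0.5, size=X.shape)

    kde_like = MeanShift().fit_predict(X)                    # kernel-density (mode-seeking) clustering
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    print("mean-shift vs truth:", adjusted_rand_score(true_labels, kde_like))
    print("k-means    vs truth:", adjusted_rand_score(true_labels, kmeans))
    ```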

  11. Estimation of gene induction enables a relevance-based ranking of gene sets.

    PubMed

    Bartholomé, Kilian; Kreutz, Clemens; Timmer, Jens

    2009-07-01

    In order to handle and interpret the vast amounts of data produced by microarray experiments, the analysis of sets of genes with a common biological functionality has been shown to be advantageous compared to single-gene analyses. Some statistical methods have been proposed to analyse the differential gene expression of gene sets in microarray experiments. However, most of these methods either require threshold values to be chosen for the analysis or need some reference set for the determination of significance. We present a method that estimates the number of differentially expressed genes in a gene set without requiring a threshold value for significance of genes. The method is self-contained (i.e., it does not require a reference set for comparison). In contrast to other methods which are focused on significance, our approach emphasizes the relevance of the regulation of gene sets. The presented method measures the degree of regulation of a gene set and is a useful tool to compare the induction of different gene sets and place the results of microarray experiments into a biological context. An R-package is available.

  12. Assessment of Sample Preparation Bias in Mass Spectrometry-Based Proteomics.

    PubMed

    Klont, Frank; Bras, Linda; Wolters, Justina C; Ongay, Sara; Bischoff, Rainer; Halmos, Gyorgy B; Horvatovich, Péter

    2018-04-17

    For mass spectrometry-based proteomics, the selected sample preparation strategy is a key determinant for information that will be obtained. However, the corresponding selection is often not based on a fit-for-purpose evaluation. Here we report a comparison of in-gel (IGD), in-solution (ISD), on-filter (OFD), and on-pellet digestion (OPD) workflows on the basis of targeted (QconCAT-multiple reaction monitoring (MRM) method for mitochondrial proteins) and discovery proteomics (data-dependent acquisition, DDA) analyses using three different human head and neck tissues (i.e., nasal polyps, parotid gland, and palatine tonsils). Our study reveals differences between the sample preparation methods, for example, with respect to protein and peptide losses, quantification variability, protocol-induced methionine oxidation, and asparagine/glutamine deamidation as well as identification of cysteine-containing peptides. However, none of the methods performed best for all types of tissues, which argues against the existence of a universal sample preparation method for proteome analysis.

  13. On Some Methods in Safety Evaluation in Geotechnics

    NASA Astrophysics Data System (ADS)

    Puła, Wojciech; Zaskórski, Łukasz

    2015-06-01

    The paper demonstrates how the reliability methods can be utilised in order to evaluate safety in geotechnics. Special attention is paid to the so-called reliability based design that can play a useful and complementary role to Eurocode 7. In the first part, a brief review of first- and second-order reliability methods is given. Next, two examples of reliability-based design are demonstrated. The first one is focussed on bearing capacity calculation and is dedicated to comparison with EC7 requirements. The second one analyses a rigid pile subjected to lateral load and is oriented towards working stress design method. In the second part, applications of random field to safety evaluations in geotechnics are addressed. After a short review of the theory a Random Finite Element algorithm to reliability based design of shallow strip foundation is given. Finally, two illustrative examples for cohesive and cohesionless soils are demonstrated.
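
    For the simplest case discussed by first-order reliability methods, a linear limit state g = R − S with independent normal resistance and load, the reliability index and failure probability follow in a few lines; the numbers below are illustrative, not the paper's examples.

    ```python
    from math import sqrt
    from scipy.stats import norm

    # Illustrative bearing-capacity check: resistance R and load effect S (made-up values).
    mu_R, sigma_R = 450.0, 60.0   # kPa
    mu_S, sigma_S = 300.0, 45.0   # kPa

    # For g = R - S with independent normals, the Hasofer-Lind index reduces to:
    beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
    pf = norm.cdf(-beta)          # first-order estimate of the failure probability

    print(f"reliability index beta = {beta:.2f}, Pf ~ {pf:.2e}")
    ```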

  14. Advanced Aeroservoelastic Testing and Data Analysis (Les Essais Aeroservoelastiques et l’Analyse des Donnees).

    DTIC Science & Technology

    1995-11-01

    [Excerpt fragments; only partially recoverable. The snippets concern neural-network-based AFS concepts and neural-network-based methods (Section 8.6) for estimating unknown parameters of a postulated state-space model, covering (i) feed-forward and (ii) recurrent neural networks [117-119].]

  15. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) ∙ Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
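
    A minimal sketch of the efficiency-weighted quantity and an unpaired comparison in the log scale is shown below; the Cq values and efficiency are invented, and reference-gene normalisation and blocking are omitted for brevity, so this is only a bare-bones illustration of the idea rather than the authors' full procedure.

    ```python
    import numpy as np
    from scipy import stats

    def efficiency_weighted_cq(cq, efficiency):
        """Common Base Method quantity log10(E) * Cq, kept in the log scale."""
        return np.log10(efficiency) * np.asarray(cq, dtype=float)

    # Hypothetical Cq values and a shared amplification efficiency (E = 2 is perfect doubling).
    control_cq   = [24.1, 24.4, 23.9, 24.2]
    treatment_cq = [22.8, 23.1, 22.6, 23.0]
    E = 1.95

    w_control = efficiency_weighted_cq(control_cq, E)
    w_treated = efficiency_weighted_cq(treatment_cq, E)

    # Unpaired comparison applied in the log scale.
    t, p = stats.ttest_ind(w_control, w_treated)
    delta = w_control.mean() - w_treated.mean()   # log10 fold-change of the target (unnormalised)
    print(f"log10 fold-change = {delta:.2f} (fold = {10**delta:.1f}), p = {p:.3g}")
    ```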

  16. Assessment of growth dynamics of human cranium middle fossa in foetal period.

    PubMed

    Skomra, Andrzej; Kędzia, Alicja; Dudek, Krzysztof; Bogacz, Wiesław

    2014-01-01

    Analysis of the available literature revealed a paucity of studies on the cranial base. The goal of the study was to analyse the middle fossa of the human cranium in the foetal period against the other fossae. The material consisted of 110 human foetuses at a morphological age of 16-28 weeks of foetal life, CRL 98-220 mm. Anthropological and preparation methods, a reverse (impression) method and statistical analysis were used. The survey incorporated the following computer programmes: Renishaw, TraceSurf, AutoCAD, CATIA. The reverse method is of particular interest: an impression is taken with polysiloxane (a silicone elastomer of high adhesive power used in dentistry) with 18 D 4823 activator, and the resulting impression accurately reflects the complex shape of the cranial base. The relative growth rate of the cranial middle fossa was found to be stable (linear model) over the whole analysed period, at 0.19%/week, indicating gradual and steady growth of the middle fossa in relation to the cranial base as a whole. At the same time, from the 16th to the 28th week of foetal life, the relative volume of the middle fossa increases more intensively than that of the anterior fossa, whereas its growth relative to the posterior fossa is distinctly slower. In the analysed period, the growth rate of the middle fossa of the cranial base was greater in the 4th and 5th months than in the 6th and 7th months of foetal life. The investigations also revealed asymmetry of the cranial base favouring the left side: the volume of the anterior fossa on the left side is significantly larger than that on the right. Volume growth is more intensive in the 4th and 5th months than in the 6th and 7th months of foetal life. In the examined period, the relative growth rate of the middle fossa of the cranial base is 0.19%/week and is stable (linear model). The study revealed correlations in the form of mathematical models, which enable assessment of foetal age.

  17. A Review of Recent Aeroelastic Analysis Methods for Propulsion at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Bakhle, Milind A.; Srivastava, R.; Mehmed, Oral; Stefko, George L.

    1993-01-01

    This report reviews aeroelastic analyses for propulsion components (propfans, compressors and turbines) being developed and used at NASA LeRC. These aeroelastic analyses include both structural and aerodynamic models. The structural models include a typical section, a beam (with and without disk flexibility), and a finite-element blade model (with plate bending elements). The aerodynamic models are based on the solution of equations ranging from the two-dimensional linear potential equation to the three-dimensional Euler equations for multibladed configurations. Typical calculated results are presented for each aeroelastic model. Suggestions for further research are made. Many of the currently available aeroelastic models and analysis methods are being incorporated in a unified computer program, APPLE (Aeroelasticity Program for Propulsion at LEwis).

  18. Fingerprint of Hedyotis diffusa Willd. by HPLC-MS.

    PubMed

    Yang, Ting; Yang, Yi-Hua; Yang, Ju-Yun; Chen, Ben-Mei; Duan, Ju-Ping; Yu, Shu-Yi; Ouyang, Hong-Tao; Cheng, Jun-Ping; Chen, Yu-Xiang

    2008-01-01

    An HPLC-MS fingerprint method has been developed based on the consistent chromatographic features of the major chemical constituents among 10 batches of Hedyotis diffusa Willd. Chromatographic separation was conducted on a Hypersil-Keystone Hypurity C(18) column using methanol:water:acetic acid as the mobile phase. Major compounds, including oleanolic acid, ursolic acid and ferulic acid, were analysed by HPLC-MS. Their identification was confirmed by comparison with data derived from the standard compounds. The HPLC-MS fingerprint was successfully applied to analyse and differentiate samples from different geographical origins or processing methods. H. diffusa was well distinguished from Hedyotis chrysotricha by HPLC-MS. Therefore, the establishment of a fingerprint for H. diffusa is critical in assessing and controlling its overall quality.

  19. Fuzzy risk analysis of a modern γ-ray industrial irradiator.

    PubMed

    Castiglia, F; Giardina, M

    2011-06-01

    Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first-generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual deaths, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by the International Commission on Radiological Protection.

  20. New encoded single-indicator sequences based on physico-chemical parameters for efficient exon identification.

    PubMed

    Meher, J K; Meher, P K; Dash, G N; Raval, M K

    2012-01-01

    The first step in the gene identification problem based on genomic signal processing is to convert character strings into numerical sequences. These numerical sequences are then analysed spectrally or using digital filtering techniques for the period-3 peaks, which are present in exons (coding areas) and absent in introns (non-coding areas). In this paper, we have shown that single-indicator sequences can be generated by encoding schemes based on physico-chemical properties. Two new methods are proposed for generating single-indicator sequences based on hydration energy and dipole moments. The proposed methods produce high peaks at exon locations and effectively suppress false exons (intron regions showing stronger peaks than exon regions), resulting in a high discrimination factor, sensitivity and specificity.
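
    The overall pipeline, encoding bases with a single physico-chemical value and reading off the period-3 component of the spectrum, can be sketched as follows; the per-base values are placeholders and are not the hydration-energy or dipole-moment figures used in the paper.

    ```python
    import numpy as np

    # Placeholder single-value encoding per base (NOT the paper's property values).
    PROPERTY = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

    def period3_power(seq):
        """Relative power of the period-3 (index N/3) component of an encoded DNA window."""
        x = np.array([PROPERTY[b] for b in seq.upper()])
        x = x - x.mean()                         # remove the DC component
        spectrum = np.abs(np.fft.fft(x)) ** 2
        n = len(x)
        return spectrum[n // 3] / spectrum[1:n // 2].mean()

    window = "ATGGCC" * 51                       # toy coding-like sequence, length 306
    print(f"period-3 signal-to-average ratio: {period3_power(window):.1f}")
    ```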

  1. METHODS TO CLASSIFY ENVIRONMENTAL SAMPLES BASED ON MOLD ANALYSES BY QPCR

    EPA Science Inventory

    Quantitative PCR (QPCR) analysis of molds in indoor environmental samples produces highly accurate speciation and enumeration data. In a number of studies, eighty of the most common or potentially problematic indoor molds were identified and quantified in dust samples from homes...

  2. 78 FR 13597 - Proposed Priority-National Institute on Disability and Rehabilitation Research-Rehabilitation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-28

    ... and rehabilitation research; (2) foster an exchange of expertise, information, and training methods to... refined analyses of data, producing observational findings, and creating other sources of research-based... Institute on Disability and Rehabilitation Research--Rehabilitation Research and Training Center AGENCY...

  3. Neuroanatomical Correlates of Intelligence

    ERIC Educational Resources Information Center

    Luders, Eileen; Narr, Katherine L.; Thompson, Paul M.; Toga, Arthur W.

    2009-01-01

    With the advancement of image acquisition and analysis methods in recent decades, unique opportunities have emerged to study the neuroanatomical correlates of intelligence. Traditional approaches examining global measures have been complemented by insights from more regional analyses based on pre-defined areas. Newer state-of-the-art approaches…

  4. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event-time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
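
    A bare-bones version of the cluster-bootstrap SE, resampling whole clusters with replacement and refitting the Cox model each time, might look as follows with the lifelines package; the column names and data layout are assumptions, and the paper's two-step variant would add a second resampling of individuals within each selected cluster.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    def cluster_bootstrap_se(df, cluster_col, n_boot=200, seed=0):
        """Bootstrap SEs for Cox coefficients by resampling whole clusters with replacement."""
        rng = np.random.default_rng(seed)
        clusters = df[cluster_col].unique()
        coef_samples = []
        for _ in range(n_boot):
            picked = rng.choice(clusters, size=len(clusters), replace=True)
            boot = pd.concat([df[df[cluster_col] == c] for c in picked], ignore_index=True)
            cph = CoxPHFitter()
            cph.fit(boot.drop(columns=cluster_col), duration_col="time", event_col="event")
            coef_samples.append(cph.params_)
        return pd.DataFrame(coef_samples).std()

    # Usage sketch (assumes columns: time, event, covariates..., cluster_id):
    # print(cluster_bootstrap_se(my_clustered_data, cluster_col="cluster_id"))
    ```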

  5. A ricin forensic profiling approach based on a complex set of biomarkers.

    PubMed

    Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister

    2018-08-15

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The statistics of the two models were good, and a 100% rate of correct predictions was achieved for the test set. Copyright © 2018 Elsevier B.V. All rights reserved.
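
    Scikit-learn has no OPLS-DA implementation, so the sketch below uses plain PLS-DA on simulated biomarker profiles as a rough stand-in for the paper's modelling step; the feature counts, class sizes and data are all invented.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import LabelBinarizer

    # Simulated biomarker profiles: 80 samples x 30 features, four preparation methods.
    rng = np.random.default_rng(2)
    y = np.repeat(["PM1", "PM2", "PM3", "PM4"], 20)
    offsets = {m: rng.normal(scale=1.5, size=30) for m in np.unique(y)}
    X = np.vstack([offsets[m] + rng.normal(size=30) for m in y])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    lb = LabelBinarizer()
    pls = PLSRegression(n_components=3).fit(X_tr, lb.fit_transform(y_tr))

    pred = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
    print("test-set accuracy:", np.mean(pred == y_te))
    ```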

  6. Biases and Power for Groups Comparison on Subjective Health Measurements

    PubMed Central

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for patient group comparisons. Two main analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from item response theory (IRT), relying on a response model relating item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method for comparing two independent groups of patients on a patient-reported outcome measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random or fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches, and the latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed for the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group-covariate Wald test performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case, frequently encountered in practice, where data are missing and possibly informative. PMID:23115620
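
    The CTT arm of such a simulation, generating dichotomous Rasch responses for two groups that differ in mean latent trait and comparing their observed sum scores with a t-test, can be sketched as follows; the sample sizes, item difficulties and effect size are illustrative, not the study's design.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_per_group, n_items = 100, 15
    difficulties = rng.normal(size=n_items)

    def rasch_responses(thetas):
        """Dichotomous responses under the Rasch model P(X=1) = logistic(theta - b)."""
        p = 1.0 / (1.0 + np.exp(-(thetas[:, None] - difficulties[None, :])))
        return (rng.random(p.shape) < p).astype(int)

    theta_a = rng.normal(loc=0.0, size=n_per_group)    # reference group
    theta_b = rng.normal(loc=0.5, size=n_per_group)    # shifted group (true effect)

    scores_a = rasch_responses(theta_a).sum(axis=1)    # CTT observed scores
    scores_b = rasch_responses(theta_b).sum(axis=1)

    t, p = stats.ttest_ind(scores_a, scores_b)
    print(f"score t-test: t = {t:.2f}, p = {p:.4f}")
    ```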

  7. The Opiliones tree of life: shedding light on harvestmen relationships through transcriptomics.

    PubMed

    Fernández, Rosa; Sharma, Prashant P; Tourinho, Ana Lúcia; Giribet, Gonzalo

    2017-02-22

    Opiliones are iconic arachnids with a Palaeozoic origin and a diversity that reflects ancient biogeographic patterns dating back at least to the times of Pangea. Owing to interest in harvestman diversity, evolution and biogeography, their relationships have been thoroughly studied using morphology and PCR-based Sanger approaches to infer their systematic relationships. More recently, two studies utilized transcriptomics-based phylogenomics to explore their basal relationships and diversification, but sampling was limiting for understanding deep evolutionary patterns, as they lacked good taxon representation at the family level. Here, we analysed a set of the 14 existing transcriptomes with 40 additional ones generated for this study, representing approximately 80% of the extant familial diversity in Opiliones. Our phylogenetic analyses, including a set of data matrices with different gene occupancy and evolutionary rates, and using a multitude of methods correcting for a diversity of factors affecting phylogenomic data matrices, provide a robust and stable Opiliones tree of life, where most families and higher taxa are precisely placed. Our dating analyses using alternative calibration points, methods and analytical parameters provide well-resolved old divergences, consistent with ancient regionalization in Pangea in some groups, and Pangean vicariance in others. The integration of state-of-the-art molecular techniques and analyses, together with the broadest taxonomic sampling to date presented in a phylogenomic study of harvestmen, provide new insights into harvestmen interrelationships, as well as an overview of the general biogeographic patterns of this ancient arthropod group. © 2017 The Author(s).

  8. The Opiliones tree of life: shedding light on harvestmen relationships through transcriptomics

    PubMed Central

    Sharma, Prashant P.; Tourinho, Ana Lúcia

    2017-01-01

    Opiliones are iconic arachnids with a Palaeozoic origin and a diversity that reflects ancient biogeographic patterns dating back at least to the times of Pangea. Owing to interest in harvestman diversity, evolution and biogeography, their relationships have been thoroughly studied using morphology and PCR-based Sanger approaches to infer their systematic relationships. More recently, two studies utilized transcriptomics-based phylogenomics to explore their basal relationships and diversification, but sampling was limiting for understanding deep evolutionary patterns, as they lacked good taxon representation at the family level. Here, we analysed a set of the 14 existing transcriptomes with 40 additional ones generated for this study, representing approximately 80% of the extant familial diversity in Opiliones. Our phylogenetic analyses, including a set of data matrices with different gene occupancy and evolutionary rates, and using a multitude of methods correcting for a diversity of factors affecting phylogenomic data matrices, provide a robust and stable Opiliones tree of life, where most families and higher taxa are precisely placed. Our dating analyses using alternative calibration points, methods and analytical parameters provide well-resolved old divergences, consistent with ancient regionalization in Pangea in some groups, and Pangean vicariance in others. The integration of state-of-the-art molecular techniques and analyses, together with the broadest taxonomic sampling to date presented in a phylogenomic study of harvestmen, provide new insights into harvestmen interrelationships, as well as an overview of the general biogeographic patterns of this ancient arthropod group. PMID:28228511

  9. Genomic Repeat Abundances Contain Phylogenetic Signal

    PubMed Central

    Dodsworth, Steven; Chase, Mark W.; Kelly, Laura J.; Leitch, Ilia J.; Macas, Jiří; Novák, Petr; Piednoël, Mathieu; Weiss-Schneeweiss, Hanna; Leitch, Andrew R.

    2015-01-01

    A large proportion of genomic information, particularly repetitive elements, is usually ignored when researchers are using next-generation sequencing. Here we demonstrate the usefulness of this repetitive fraction in phylogenetic analyses, utilizing comparative graph-based clustering of next-generation sequence reads, which results in abundance estimates of different classes of genomic repeats. Phylogenetic trees are then inferred based on the genome-wide abundance of different repeat types treated as continuously varying characters; such repeats are scattered across chromosomes and in angiosperms can constitute a majority of nuclear genomic DNA. In six diverse examples, five angiosperms and one insect, this method provides generally well-supported relationships at interspecific and intergeneric levels that agree with results from more standard phylogenetic analyses of commonly used markers. We propose that this methodology may prove especially useful in groups where there is little genetic differentiation in standard phylogenetic markers. At the same time as providing data for phylogenetic inference, this method additionally yields a wealth of data for comparative studies of genome evolution. PMID:25261464

  10. A multivariate variational objective analysis-assimilation method. Part 2: Case study results with and without satellite data

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.; Kidder, Stanley Q.; Scott, Robert W.

    1988-01-01

    The variational multivariate assimilation method described in a companion paper by Achtemeier and Ochs is applied to conventional and conventional plus satellite data. Ground-based and space-based meteorological data are weighted according to the respective measurement errors and blended into a data set that is a solution of numerical forms of the two nonlinear horizontal momentum equations, the hydrostatic equation, and an integrated continuity equation for a dry atmosphere. The analyses serve, first, to evaluate the accuracy of the model and, second, to contrast the analyses with and without satellite data. Evaluation criteria measure the extent to which: (1) the assimilated fields satisfy the dynamical constraints, (2) the assimilated fields depart from the observations, and (3) the assimilated fields are judged to be realistic through pattern analysis. The last criterion requires that the signs, magnitudes, and patterns of the hypersensitive vertical velocity and local tendencies of the horizontal velocity components be physically consistent with respect to the larger scale weather systems.

  11. Maximizing beneficence and autonomy. Ethical support for the use of nonpharmacological methods for managing dental anxiety.

    PubMed

    Donate-Bartfield, Evelyn; Spellecy, Ryan; Shane, Nicholas J

    2010-01-01

    This article examines advantages associated with nonpharmacological behavioral management techniques and suggests that there are benefits to their use (such as achieving a more lasting solution to the problem of dental anxiety) that are not realized with medication-based interventions. Analyses that use Kantian and existential viewpoints for exploring the use of medication versus behavioral interventions for managing life problems yield parallel conclusions: there are advantages gained by using behavioral interventions that are not always associated with medication-based interventions. These analyses, taken together with an understanding of the psychology of dental anxiety management, suggest that using nonpharmacological techniques for the management of dental anxiety can maximize adherence to the ethical principles of beneficence and patient autonomy. The authors discuss the barriers that make nonpharmacological interventions for anxiety management difficult for dentists to routinely use, and suggest that additional training in these methods and increased collaboration with mental health professionals are needed for dentists.

  12. Study on Electricity Business Expansion and Electricity Sales Based on Seasonal Adjustment

    NASA Astrophysics Data System (ADS)

    Zhang, Yumin; Han, Xueshan; Wang, Yong; Zhang, Li; Yang, Guangsen; Sun, Donglei; Wang, Bolun

    2017-05-01

    Reference [1] proposed a novel analysis and forecasting method for electricity business expansion based on seasonal adjustment; we extend this work to include effects from both the micro and macro aspects. From the micro aspect, we introduce the concept of load factor to forecast the stable value of electricity consumption of a single new consumer after installation of new high-voltage transformer capacity. From the macro aspect, since the growth of business expansion is also stimulated by the growth of electricity sales, it is necessary to analyse the antecedent relationship between business expansion and electricity sales. First, the electricity consumption of the customer group and the release rules of expanded capacity are forecast separately. Second, the degree of fit and prediction accuracy are compared to identify the antecedence relationship and analyse its causes; this comparison also shows how customer groups of different scope affect prediction precision. Finally, simulation results indicate that the proposed method is accurate and helps determine the value of expanding capacity and electricity consumption.
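
    The micro-level step, turning newly installed transformer capacity into a stable consumption estimate via a load factor, amounts to simple arithmetic; the constant power factor, full-period operation and the numbers below are assumptions for illustration only.

    ```python
    def stable_monthly_consumption(capacity_kva, load_factor, power_factor=0.9, hours=730):
        """Rough stable monthly energy use implied by newly installed transformer capacity.

        capacity_kva : newly installed capacity (kVA)
        load_factor  : average load / installed capacity, estimated from similar customers
        """
        return capacity_kva * power_factor * load_factor * hours   # kWh per month

    # A hypothetical 800 kVA expansion with a 0.35 load factor:
    print(f"{stable_monthly_consumption(800, 0.35):,.0f} kWh/month")
    ```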

  13. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool for conducting material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly suited to linear or linearizable degradation paths and are therefore not applicable where the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed for accelerated degradation data analysis. The general model accounts for the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. Statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. A simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
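
    A toy version of a Wiener degradation model with a nonlinear time scale, simulating a path Y(t) = mu·Lambda(t) + sigma·B(Lambda(t)) with Lambda(t) = t^b and recovering the drift and diffusion from the increments, is sketched below; it ignores unit-to-unit variation and acceleration variables, so it is a simplified illustration of the model class, not the paper's full ADT model.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    mu, sigma, b = 0.8, 0.3, 1.4            # drift, diffusion, time-transformation exponent

    t = np.linspace(0.0, 10.0, 201)
    lam = t ** b                            # nonlinear time scale Lambda(t) = t^b
    d_lam = np.diff(lam)
    increments = rng.normal(mu * d_lam, sigma * np.sqrt(d_lam))
    y = np.concatenate([[0.0], np.cumsum(increments)])          # degradation path Y(t)

    # Moment/MLE-style estimates from the increments (b assumed known here).
    mu_hat = (y[-1] - y[0]) / (lam[-1] - lam[0])
    sigma2_hat = np.mean((np.diff(y) - mu_hat * d_lam) ** 2 / d_lam)
    print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {np.sqrt(sigma2_hat):.3f}")
    ```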

  14. Geo-information processing service composition for concurrent tasks: A QoS-aware game theory approach

    NASA Astrophysics Data System (ADS)

    Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong

    2012-10-01

    Concurrent tasks, such as those found in disaster rapid response, are a typical characteristic of remote sensing applications. The existing composition approach for geographical information processing service chains searches for an optimal solution in what may be deemed a "selfish" way, which leads to conflicts among concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model for analysing the competitive relationships between tasks is proposed. A best-response function is used to ensure that each task maintains its utility optimisation by considering the composition strategies of other tasks and quantifying the conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and to maximise the utility of all tasks under concurrent-task conditions. Theoretical analyses and experiments showed that, compared to existing service composition methods, the newly proposed method has better practical utility for all tasks.
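
    The flavour of best-response dynamics can be shown with a toy congestion example in which each concurrent task repeatedly picks the service whose delay (growing with shared load) minimises its own cost until no task wants to switch; the delays, congestion penalty and task count are invented and bear no relation to the paper's QoS model.

    ```python
    # Toy best-response iteration for concurrent tasks sharing service instances.
    base_delay = [1.0, 1.2, 1.5]        # three candidate service instances
    congestion = 0.8                    # extra delay per additional task on a service
    n_tasks = 5
    choice = [0] * n_tasks              # every task starts on the nominally fastest service

    def delay(service, load):
        return base_delay[service] + congestion * load

    for _ in range(20):                 # iterate until no task benefits from switching
        changed = False
        for task in range(n_tasks):
            loads = [sum(1 for t, c in enumerate(choice) if c == s and t != task)
                     for s in range(len(base_delay))]
            best = min(range(len(base_delay)), key=lambda s: delay(s, loads[s] + 1))
            if best != choice[task]:
                choice[task], changed = best, True
        if not changed:
            break                       # Nash equilibrium: unilateral switching cannot help

    print("equilibrium assignment:", choice)
    ```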

  15. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool for conducting material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly suited to linear or linearizable degradation paths and are therefore not applicable where the degradation process cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed for accelerated degradation data analysis. The general model accounts for the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. Statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. A simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  16. Cryogenic Tank Structure Sizing With Structural Optimization Method

    NASA Technical Reports Server (NTRS)

    Wang, J. T.; Johnson, T. F.; Sleight, D. W.; Saether, E.

    2001-01-01

    Structural optimization methods in MSC/NASTRAN are used to size substructures and to reduce the weight of a composite sandwich cryogenic tank for future launch vehicles. Because the feasible design space of this problem is non-convex, many local minima are found. This non-convex problem is investigated in detail by conducting a series of analyses along a design line connecting two feasible designs. Strain constraint violations occur for some design points along the design line. Since MSC/NASTRAN uses gradient-based optimization procedures, it does not guarantee that the lowest-weight design can be found. In this study, a simple procedure is introduced to create a new starting point based on design variable values from previous optimization analyses. Optimization analysis using this new starting point can produce a lower-weight design. Detailed inputs for setting up the MSC/NASTRAN optimization analysis and final tank design results are presented in this paper. Approaches for obtaining further weight reductions are also discussed.
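
    The restart idea, re-seeding a gradient-based optimizer from the best design variables found so far, is generic enough to sketch on a toy non-convex objective with SciPy; the objective, unconstrained formulation and perturbation size are assumptions and have nothing to do with the actual MSC/NASTRAN tank model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def weight_like(x):
        """Toy non-convex objective standing in for the tank-weight response."""
        return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2 + 5 * np.sin(3 * x[0])

    rng = np.random.default_rng(5)
    best = None
    x0 = rng.uniform(-4, 4, size=2)               # initial design variables
    for restart in range(6):
        res = minimize(weight_like, x0, method="SLSQP")
        if best is None or res.fun < best.fun:
            best = res
        # New starting point: perturb the best design found so far, mimicking the idea of
        # reusing design-variable values from previous optimization analyses.
        x0 = best.x + rng.normal(scale=0.5, size=2)

    print(f"best objective {best.fun:.3f} at x = {np.round(best.x, 3)}")
    ```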

  17. A method for identifying EMI critical circuits during development of a large C3

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
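
    At its core the screening step compares a predicted induced level with a susceptibility threshold and flags circuits whose margin falls below a required value; the 6 dB criterion and the sample circuit data below are illustrative assumptions, not figures from the Boeing analysis.

    ```python
    # Hypothetical per-circuit data (dBuV); the 6 dB required margin is an assumed criterion.
    REQUIRED_MARGIN_DB = 6.0

    circuits = [
        {"id": "CCTV-VID-012",   "susceptibility_dbuv": 66.0, "predicted_dbuv": 48.0},
        {"id": "ALARM-DISC-003", "susceptibility_dbuv": 54.0, "predicted_dbuv": 51.5},
        {"id": "RADIO-AUD-114",  "susceptibility_dbuv": 60.0, "predicted_dbuv": 58.0},
    ]

    for c in circuits:
        margin = c["susceptibility_dbuv"] - c["predicted_dbuv"]
        status = "EMI critical" if margin < REQUIRED_MARGIN_DB else "ok"
        print(f"{c['id']}: margin {margin:+.1f} dB -> {status}")
    ```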

  18. Collision analysis of one kind of chaos-based hash function

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Peng, Wenbing; Liao, Xiaofeng; Xiang, Tao

    2010-02-01

    In the last decade, various chaos-based hash functions have been proposed. Nevertheless, the corresponding analyses lag far behind. In this Letter, we first take a chaos-based hash function proposed very recently in Amin, Faragallah and Abd El-Latif (2009) [11] as a sample and analyze its computational collision problem; we then generalize the construction method of one kind of chaos-based hash function and summarize some precautions for avoiding the collision problem. This is beneficial to future chaos-based hash function design.

  19. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. On an interface of the online system for a stochastic analysis of the varied information flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorshenin, Andrey K.; MIREA, MGUPI; Kuzmin, Victor Yu.

    The article describes a possible approach to constructing the interface of an online asynchronous system that allows researchers to analyse varied information flows. The implemented stochastic methods are based on mixture models and the method of moving separation of mixtures. The general ideas of the system's functionality are demonstrated with an example involving some moments of a finite normal mixture.
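
    A rough stand-in for the moving separation of mixtures is to fit a finite normal mixture in sliding windows over the flow and watch the component parameters drift; the window sizes and synthetic data below are assumptions, and the actual system's algorithm is not reproduced here.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic information-flow metric whose mixture structure drifts over time.
    rng = np.random.default_rng(6)
    flow = np.concatenate([
        np.where(rng.random(2000) < 0.7, rng.normal(0, 1, 2000), rng.normal(4, 1, 2000)),
        np.where(rng.random(2000) < 0.4, rng.normal(0, 1, 2000), rng.normal(6, 1, 2000)),
    ])

    window, step = 500, 250
    for start in range(0, len(flow) - window + 1, step):
        gm = GaussianMixture(n_components=2, random_state=0)
        gm.fit(flow[start:start + window].reshape(-1, 1))
        means = np.sort(gm.means_.ravel())
        print(f"t={start:4d}: component means ~ {means[0]:5.2f}, {means[1]:5.2f}")
    ```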

  1. When do species-tree and concatenated estimates disagree? An empirical analysis with higher-level scincid lizard phylogeny.

    PubMed

    Lambert, Shea M; Reeder, Tod W; Wiens, John J

    2015-01-01

    Simulation studies suggest that coalescent-based species-tree methods are generally more accurate than concatenated analyses. However, these species-tree methods remain impractical for many large datasets. Thus, a critical but unresolved issue is when and why concatenated and coalescent species-tree estimates will differ. We predict such differences for branches in concatenated trees that are short, weakly supported, and have conflicting gene trees. We test these predictions in Scincidae, the largest lizard family, with data from 10 nuclear genes for 17 ingroup taxa and 44 genes for 12 taxa. We support our initial predictions, and suggest that simply considering uncertainty in concatenated trees may sometimes encompass the differences between these methods. We also found that relaxed-clock concatenated trees can be surprisingly similar to the species-tree estimate. Remarkably, the coalescent species-tree estimates had slightly lower support values when based on many more genes (44 vs. 10) and a small (∼30%) reduction in taxon sampling. Thus, taxon sampling may be more important than gene sampling when applying species-tree methods to deep phylogenetic questions. Finally, our coalescent species-tree estimates tentatively support division of Scincidae into three monophyletic subfamilies, a result otherwise found only in concatenated analyses with extensive species sampling. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Changes in monosaccharides, organic acids and amino acids during Cabernet Sauvignon wine ageing based on a simultaneous analysis using gas chromatography-mass spectrometry.

    PubMed

    Zhang, Xin-Ke; Lan, Yi-Bin; Zhu, Bao-Qing; Xiang, Xiao-Feng; Duan, Chang-Qing; Shi, Ying

    2018-01-01

    Monosaccharides, organic acids and amino acids are the important flavour-related components in wines. The aim of this article is to develop and validate a method that could simultaneously analyse these compounds in wine based on silylation derivatisation and gas chromatography-mass spectrometry (GC-MS), and apply this method to the investigation of the changes of these compounds and speculate upon their related influences on Cabernet Sauvignon wine flavour during wine ageing. This work presented a new approach for wine analysis and provided more information concerning red wine ageing. This method could simultaneously quantitatively analyse 2 monosaccharides, 8 organic acids and 13 amino acids in wine. A validation experiment showed good linearity, sensitivity, reproducibility and recovery. Multiple derivatives of five amino acids have been found but their effects on quantitative analysis were negligible, except for methionine. The evolution pattern of each category was different, and we speculated that the corresponding mechanisms involving microorganism activities, physical interactions and chemical reactions had a great correlation with red wine flavours during ageing. Simultaneously quantitative analysis of monosaccharides, organic acids and amino acids in wine was feasible and reliable and this method has extensive application prospects. © 2017 Society of Chemical Industry.

  3. The role of renin–angiotensin–aldosterone system genes in the progression of chronic kidney disease: findings from the Chronic Renal Insufficiency Cohort (CRIC) study

    PubMed Central

    Kelly, Tanika N.; Raj, Dominic; Rahman, Mahboob; Kretzler, Matthias; Kallem, Radhakrishna R.; Ricardo, Ana C.; Rosas, Sylvia E.; Tao, Kaixiang; Xie, Dawei; Hamm, Lotuce Lee; He, Jiang; Appel, J.; Feldman, Harold I.; Go, Alan S.; Kusek, John W.; Lash, James P.; Ojo, Akinlolu; Townsend, Raymond R.

    2015-01-01

    Background We conducted single-marker, gene- and pathway-based analyses to examine the association between renin–angiotensin–aldosterone system (RAAS) variants and chronic kidney disease (CKD) progression among Chronic Renal Insufficiency Cohort study participants. Methods A total of 1523 white and 1490 black subjects were genotyped for 490 single nucleotide polymorphisms (SNPs) in 12 RAAS genes as part of the ITMAT-Broad-CARe array. CKD progression phenotypes included decline in estimated glomerular filtration rate (eGFR) over time and the occurrence of a renal disease event, defined as incident end-stage renal disease or halving of eGFR from baseline. Mixed-effects models were used to examine SNP associations with eGFR decline, while Cox proportional hazards models tested SNP associations with renal events. Gene- and pathway-based analyses were conducted using the truncated product method. All analyses were stratified by race, and a Bonferroni correction was applied to adjust for multiple testing. Results Among white and black participants, eGFR declined an average of 1.2 and 2.3 mL/min/1.73 m²/year, respectively, while renal events occurred in a respective 11.5 and 24.9% of participants. We identified strong gene- and pathway-based associations with CKD progression. The AGT and RENBP genes were consistently associated with risk of renal events in separate analyses of white and black participants (both P < 1.00 × 10⁻⁶). Driven by the significant gene-based findings, the entire RAAS pathway was also associated with renal events in both groups (both P < 1.00 × 10⁻⁶). No single-marker associations with CKD progression were observed. Conclusions The current study provides strong evidence for a role of the RAAS in CKD progression. PMID:25906781
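
    For readers unfamiliar with the truncated product method used for the gene- and pathway-based tests, a minimal Monte Carlo version (assuming independent SNP p-values under the null, which the real analysis would need to relax) looks like this; the p-values and threshold are invented.

    ```python
    import numpy as np

    def truncated_product_stat(pvals, tau=0.05):
        """Product of the p-values at or below tau (truncated product method statistic)."""
        pvals = np.asarray(pvals)
        kept = pvals[pvals <= tau]
        return kept.prod() if kept.size else 1.0

    def tpm_pvalue(pvals, tau=0.05, n_sim=50_000, seed=0):
        """Monte Carlo gene-level p-value assuming independent markers under the null."""
        rng = np.random.default_rng(seed)
        observed = truncated_product_stat(pvals, tau)
        null = rng.random((n_sim, len(pvals)))
        null_stats = np.array([truncated_product_stat(row, tau) for row in null])
        return float(np.mean(null_stats <= observed))

    snp_pvals = [0.002, 0.04, 0.3, 0.7, 0.01, 0.55]     # hypothetical single-marker results
    print(f"gene-level TPM p-value: {tpm_pvalue(snp_pvals):.4f}")
    ```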

  4. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services.

    PubMed

    Aarons, Gregory A; Fettes, Danielle L; Sommerfeld, David H; Palinkas, Lawrence A

    2012-02-01

    Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches is particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This article describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. The authors integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed-method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research.

  5. Mixed Methods for Implementation Research: Application to Evidence-Based Practice Implementation and Staff Turnover in Community Based Organizations Providing Child Welfare Services

    PubMed Central

    Aarons, Gregory A.; Fettes, Danielle L.; Sommerfeld, David H.; Palinkas, Lawrence

    2013-01-01

    Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches is particularly well-suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This paper describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. We integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed-method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research. PMID:22146861

  6. Does Subjective Rating Reflect Behavioural Coding? Personality in 2 Month-Old Dog Puppies: An Open-Field Test and Adjective-Based Questionnaire.

    PubMed

    Barnard, Shanis; Marshall-Pescini, Sarah; Passalacqua, Chiara; Beghelli, Valentina; Capra, Alexa; Normando, Simona; Pelosi, Annalisa; Valsecchi, Paola

    2016-01-01

    A number of studies have recently investigated personality traits in non-human species, with the dog gaining popularity as a subject species for research in this area. Recent research has shown the consistency of personality traits across both context and time for adult dogs, both when using questionnaire-based methods of investigation and behavioural analyses of the dogs' behaviour. However, only a few studies have assessed the correspondence between these two methods, with results varying considerably across studies. Furthermore, most studies have focused on adult dogs, despite the fact that an understanding of personality traits in young puppies may be important for research focusing on the genetic basis of personality traits. In the current study, we sought to evaluate the correspondence between a questionnaire-based method and the in-depth analysis of the behaviour of 2-month old puppies in an open-field test in which a number of both social and non-social stimuli were presented to the subjects. We further evaluated consistency of traits over time by re-testing a subset of puppies. The correspondence between methods was high and test-retest consistency (for the main trait) was also good using both evaluation methods. Results showed clear factors referring to the two main personality traits 'extroversion' (i.e., the enthusiastic, exuberant approach to the stimuli) and 'neuroticism' (i.e., the more cautious and fearful approach to the stimuli), potentially similar to the shyness-boldness dimension found in previous studies. Furthermore, both methods identified an 'amicability' dimension, expressing the positive interactions the pups directed at the human stranger, and a 'reservedness' dimension which identified pups who largely chose not to interact with the stimuli, and were defined as quiet and not nosey in the questionnaire.

  7. Costing evidence for health care decision-making in Austria: A systematic review

    PubMed Central

    Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata

    2017-01-01

    Background With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods both at international and national levels are imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It describes systematically the current economic evaluation and costing studies landscape focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impacts on evidence-based decision-making and potential suggestions for areas of development. Methods A systematic literature review of English and German language peer-reviewed as well as grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results A total of 93 studies were included. 87% were journal articles, 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% of the studies did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion There are substantial inconsistencies in the costing methods and reporting standards in economic analyses in Austria, which may contribute to a low acceptance and lack of interest in economic evaluation-informed decision making. To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728

  8. IFPTarget: A Customized Virtual Target Identification Method Based on Protein-Ligand Interaction Fingerprinting Analyses.

    PubMed

    Li, Guo-Bo; Yu, Zhu-Jun; Liu, Sha; Huang, Lu-Yi; Yang, Ling-Ling; Lohans, Christopher T; Yang, Sheng-Yong

    2017-07-24

    Small-molecule target identification is an important and challenging task for chemical biology and drug discovery. Structure-based virtual target identification has been widely used, which infers and prioritizes potential protein targets for the molecule of interest (MOI) principally via a scoring function. However, current "universal" scoring functions may not always accurately identify targets to which the MOI binds from the retrieved target database, in part due to a lack of consideration of the important binding features for an individual target. Here, we present IFPTarget, a customized virtual target identification method, which uses an interaction fingerprinting (IFP) method for target-specific interaction analyses and a comprehensive index (Cvalue) for target ranking. Evaluation results indicate that the IFP method enables substantially improved binding pose prediction, and Cvalue has an excellent performance in target ranking for the test set. When applied to screen against our established target library that contains 11,863 protein structures covering 2842 unique targets, IFPTarget could retrieve known targets within the top-ranked list and identified new potential targets for chemically diverse drugs. IFPTarget prediction led to the identification of the metallo-β-lactamase VIM-2 as a target for quercetin as validated by enzymatic inhibition assays. This study provides a new in silico target identification tool and will aid future efforts to develop new target-customized methods for target identification.
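
    The abstract does not give the definition of the comprehensive index (Cvalue), but the core idea of comparing poses through interaction fingerprints can be illustrated with a small sketch. The bit layout and the example fingerprints below are hypothetical; the sketch only shows how a bit-vector IFP for a docked pose might be scored against a reference IFP with Tanimoto similarity.

    ```python
    # Illustrative sketch (not the authors' implementation): scoring a docked
    # pose's protein-ligand interaction fingerprint (IFP) against a reference
    # IFP with Tanimoto similarity. The bit layout (one block of interaction
    # types per binding-site residue) is an assumption for illustration.

    def tanimoto(fp_a, fp_b):
        """Tanimoto similarity between two equal-length binary fingerprints."""
        on_a = sum(fp_a)
        on_b = sum(fp_b)
        common = sum(1 for a, b in zip(fp_a, fp_b) if a and b)
        union = on_a + on_b - common
        return common / union if union else 0.0

    # Hypothetical fingerprints: one bit per (residue, interaction type) pair,
    # e.g. [hydrophobic, H-bond donor, H-bond acceptor] x binding-site residues.
    reference_ifp = [1, 0, 1, 0, 1, 0, 0, 1, 0]   # known binder in the target pocket
    candidate_ifp = [1, 0, 1, 0, 0, 0, 0, 1, 0]   # pose of the molecule of interest

    print(f"IFP Tanimoto similarity: {tanimoto(reference_ifp, candidate_ifp):.2f}")
    ```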

  9. Improved accuracy of quantitative parameter estimates in dynamic contrast-enhanced CT study with low temporal resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Sun Mo, E-mail: Sunmo.Kim@rmp.uhn.on.ca; Haider, Masoom A.; Jaffray, David A.

    Purpose: A previously proposed method to reduce radiation dose to the patient in dynamic contrast-enhanced (DCE) CT is enhanced by principal component analysis (PCA) filtering, which improves the signal-to-noise ratio (SNR) of time-concentration curves in the DCE-CT study. The efficacy of the combined method to maintain the accuracy of kinetic parameter estimates at low temporal resolution is investigated with pixel-by-pixel kinetic analysis of DCE-CT data. Methods: The method is based on DCE-CT scanning performed with low temporal resolution to reduce the radiation dose to the patient. The arterial input function (AIF) with high temporal resolution can be generated from a coarsely sampled AIF through a previously published method of AIF estimation. To increase the SNR of time-concentration curves (tissue curves), first, a region-of-interest is segmented into squares composed of 3 × 3 pixels in size. Subsequently, the PCA filtering combined with a fraction of residual information criterion is applied to all the segmented squares for further improvement of their SNRs. The proposed method was applied to each DCE-CT data set of a cohort of 14 patients at varying levels of down-sampling. Kinetic analyses using the modified Tofts model and the singular value decomposition method were then carried out for each of the down-sampling schemes with intervals from 2 to 15 s. The results were compared with analyses done with the measured data in high temporal resolution (i.e., original scanning frequency) as the reference. Results: The patients’ AIFs were estimated to high accuracy based on the 11 orthonormal bases of arterial impulse responses established in the previous paper. In addition, noise in the images was effectively reduced by using five principal components of the tissue curves for filtering. Kinetic analyses using the proposed method showed superior results compared to those with down-sampling alone; they were able to maintain the accuracy in the quantitative histogram parameters of volume transfer constant [standard deviation (SD), 98th percentile, and range], rate constant (SD), blood volume fraction (mean, SD, 98th percentile, and range), and blood flow (mean, SD, median, 98th percentile, and range) for sampling intervals between 10 and 15 s. Conclusions: The proposed method of PCA filtering combined with the AIF estimation technique allows low-frequency scanning in DCE-CT studies to reduce patient radiation dose. The results indicate that the method is useful in pixel-by-pixel kinetic analysis of DCE-CT data for patients with cervical cancer.
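
    As a rough illustration of the PCA filtering step described above, the following minimal sketch denoises a block of time-concentration curves by truncated PCA reconstruction. It assumes the curves for one segmented 3 × 3 square are stored as a (pixels × timepoints) array and retains five principal components, as reported in the abstract; the fraction-of-residual-information criterion and the AIF estimation step are not reproduced.

    ```python
    # Minimal sketch: truncated-PCA denoising of time-concentration curves for
    # one 3x3 pixel block (assumptions: a (n_pixels, n_timepoints) array and a
    # fixed number of retained components).
    import numpy as np

    def pca_filter(curves, n_components=5):
        """Denoise time-concentration curves by truncated PCA reconstruction."""
        mean_curve = curves.mean(axis=0)
        centered = curves - mean_curve
        # SVD of the centered data; principal components are the rows of vt.
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        k = min(n_components, len(s))
        return u[:, :k] * s[:k] @ vt[:k, :] + mean_curve

    rng = np.random.default_rng(0)
    t = np.linspace(0, 120, 60)                        # 60 samples over 2 minutes
    clean = np.outer(np.ones(9), 1 - np.exp(-t / 30))  # 9 pixels, shared kinetics
    noisy = clean + rng.normal(0, 0.05, clean.shape)
    denoised = pca_filter(noisy)
    print("noise std before/after:",
          float((noisy - clean).std()), float((denoised - clean).std()))
    ```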

  10. Methodological Issues Surrounding the Use of Baseline Health-Related Quality of Life Data to Inform Trial-Based Economic Evaluations of Interventions Within Emergency and Critical Care Settings: A Systematic Literature Review.

    PubMed

    Dritsaki, Melina; Achana, Felix; Mason, James; Petrou, Stavros

    2017-05-01

    Trial-based cost-utility analyses require health-related quality of life data that generate utility values in order to express health outcomes in terms of quality-adjusted life years (QALYs). Assessments of baseline health-related quality of life are problematic where trial participants are incapacitated or critically ill at the time of randomisation. This review aims to identify and critique methods for handling non-availability of baseline health-related quality of life data in trial-based cost-utility analyses within emergency and critical illness settings. A systematic literature review was conducted, following PRISMA guidelines, to identify trial-based cost-utility analyses of interventions within emergency and critical care settings. Databases searched included the National Institute for Health Research (NIHR) Journals Library (1991-July 2016), Cochrane Library (all years); National Health Service (NHS) Economic Evaluation Database (all years) and Ovid MEDLINE/Embase (without time restriction). Strategies employed to handle non-availability of baseline health-related quality of life data in final QALY estimations were identified and critiqued. A total of 4224 published reports were screened, 19 of which met the study inclusion criteria (mean trial size 1670): 14 (74 %) from the UK, four (21%) from other European countries and one (5%) from India. Twelve studies (63%) were based in emergency departments and seven (37%) in intensive care units. Only one study was able to elicit patient-reported health-related quality of life at baseline. To overcome the lack of baseline data when estimating QALYs, eight studies (42%) assigned a fixed utility weight corresponding to either death, an unconscious health state or a country-specific norm to patients at baseline, four (21%) ignored baseline utilities, three (16%) applied values from another study, one (5%) generated utility values via retrospective recall and one (5%) elicited utilities from experts. A preliminary exploration of these methods shows that incremental QALY estimation is unlikely to be biased if balanced trial allocation is achieved and subsequent collection of health-related quality of life data occurs at the earliest possible opportunity following commencement of treatment, followed by an adequate number of follow-up assessments. Trial-based cost-utility analyses within emergency and critical illness settings have applied different methods for QALY estimation, employing disparate assumptions about the health-related quality of life of patients at baseline. Where baseline measurement is not practical, measurement at the earliest opportunity following commencement of treatment should minimise bias in QALY estimation.

  11. CscoreTool: fast Hi-C compartment analysis at high resolution.

    PubMed

    Zheng, Xiaobin; Zheng, Yixian

    2018-05-01

    Genome-wide chromosome conformation capture (Hi-C) has revealed that the eukaryotic genome can be partitioned into A and B compartments that have distinctive chromatin and transcription features. The current Principal Component Analysis (PCA)-based method for A/B compartment prediction from Hi-C data requires substantial CPU time and memory. We report the development of a method, CscoreTool, which enables fast and memory-efficient determination of A/B compartments at high resolution even in datasets with low sequencing depth. https://github.com/scoutzxb/CscoreTool. xzheng@carnegiescience.edu. Supplementary data are available at Bioinformatics online.
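
    For context, the conventional PCA-based compartment call that CscoreTool is compared against can be sketched in a few lines. The sketch assumes an observed/expected-normalised intra-chromosomal contact matrix and omits the usual step of orienting the eigenvector sign with GC content or gene density.

    ```python
    # Minimal sketch of the conventional PCA-based A/B compartment call
    # (assumes an observed/expected-normalised intra-chromosomal contact
    # matrix; eigenvector sign orientation is omitted).
    import numpy as np

    def ab_compartments(obs_exp):
        """Return A/B labels from the leading eigenvector of the correlation matrix."""
        corr = np.nan_to_num(np.corrcoef(obs_exp))   # bin-by-bin correlation matrix
        eigvals, eigvecs = np.linalg.eigh(corr)      # symmetric eigendecomposition
        pc1 = eigvecs[:, np.argmax(eigvals)]         # leading eigenvector
        return np.where(pc1 >= 0, "A", "B"), pc1

    rng = np.random.default_rng(1)
    blocks = np.kron(np.array([[1.5, 0.5], [0.5, 1.5]]), np.ones((10, 10)))
    matrix = blocks + rng.normal(0, 0.05, blocks.shape)   # toy two-compartment matrix
    labels, pc1 = ab_compartments(matrix)
    print("".join(labels))
    ```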

  12. Comparison of hydraulic conductivities for a sand and gravel aquifer in southeastern Massachusetts, estimated by three methods

    USGS Publications Warehouse

    Warren, L.P.; Church, P.E.; Turtora, Michael

    1996-01-01

    Hydraulic conductivities of a sand and gravel aquifer were estimated by three methods: constant-head multiport-permeameter tests, grain-size analyses (with the Hazen approximation method), and slug tests. Sediment cores from 45 boreholes were undivided or divided into two or three vertical sections to estimate hydraulic conductivity based on permeameter tests and grain-size analyses. The cores were collected from depth intervals in the screened zone of the aquifer in each observation well. Slug tests were performed on 29 observation wells installed in the boreholes. Hydraulic conductivities of 35 sediment cores estimated by use of permeameter tests ranged from 0.9 to 86 meters per day, with a mean of 22.8 meters per day. Hydraulic conductivities of 45 sediment cores estimated by use of grain-size analyses ranged from 0.5 to 206 meters per day, with a mean of 40.7 meters per day. Hydraulic conductivities of aquifer material at 29 observation wells estimated by use of slug tests ranged from 0.6 to 79 meters per day, with a mean of 32.9 meters per day. The repeatability of estimated hydraulic conductivities was estimated to be within 30 percent for the permeameter method, 12 percent for the grain-size method, and 9.5 percent for the slug test method. Statistical tests determined that the medians of estimates resulting from the slug tests and grain-size analyses were not significantly different but were significantly higher than the median of estimates resulting from the permeameter tests. Because the permeameter test is the only method considered that estimates vertical hydraulic conductivity, the difference in estimates may be attributed to vertical or horizontal anisotropy. The difference in the average hydraulic conductivities estimated by use of each method was less than 55 percent when compared to the estimated hydraulic conductivity determined from an aquifer test conducted near the study area.
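
    The Hazen approximation mentioned above relates hydraulic conductivity to the 10th-percentile grain diameter (d10). A commonly used form is K [cm/s] = C·d10², with d10 in centimetres and an empirical coefficient C of roughly 100 for clean sands; the coefficient actually used in the study is not stated in the abstract, so the sketch below is only indicative.

    ```python
    # Hazen approximation sketch (common form K [cm/s] = C * d10^2, d10 in cm,
    # C ~ 100 for clean sands; the study's exact coefficient is not given).
    def hazen_k(d10_mm, c=100.0):
        """Hydraulic conductivity in metres per day from the d10 grain size in mm."""
        d10_cm = d10_mm / 10.0
        k_cm_per_s = c * d10_cm ** 2
        return k_cm_per_s / 100.0 * 86400.0   # cm/s -> m/s -> m/day

    for d10 in (0.1, 0.3, 0.5):               # fine to coarse sand, d10 in mm
        print(f"d10 = {d10} mm  ->  K ~ {hazen_k(d10):.0f} m/d")
    ```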

  13. Lindemann histograms as a new method to analyse nano-patterns and phases

    NASA Astrophysics Data System (ADS)

    Makey, Ghaith; Ilday, Serim; Tokel, Onur; Ibrahim, Muhamet; Yavuz, Ozgun; Pavlov, Ihor; Gulseren, Oguz; Ilday, Omer

    The detection, observation, and analysis of material phases and atomistic patterns are of great importance for understanding systems exhibiting both equilibrium and far-from-equilibrium dynamics. As such, there is intense research on phase transitions and pattern dynamics in soft matter, statistical and nonlinear physics, and polymer physics. The pair correlation function is commonly used to identify phases and nano-patterns; however, this approach is limited in recognizing competing patterns in dynamic systems and lacks visualisation capabilities. To overcome these limitations, we introduce Lindemann histogram quantification as an alternative method to analyse solid, liquid, and gas phases, along with hexagonal, square, and amorphous nano-pattern symmetries. We show that the proposed approach, based on a Lindemann parameter calculated per particle, maps local number densities to material phases or particle patterns. We apply the Lindemann histogram method to experimental data on dynamical colloidal self-assembly and identify competing patterns.
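
    A per-particle Lindemann-type quantity can be sketched as follows; the exact definition used by the authors is not given in the abstract, so the version below (the RMS fluctuation of each particle's distances to nearby particles over time, normalised by the mean distance) is an assumption for illustration. A histogram of these values is then what distinguishes, for example, solid-like from liquid-like regions.

    ```python
    # Sketch of a per-particle Lindemann-type parameter (assumed definition:
    # RMS fluctuation of distances to neighbouring particles over time,
    # normalised by the mean distance).
    import numpy as np

    def lindemann_per_particle(traj, cutoff=1.5):
        """traj: (n_frames, n_particles, dim) positions. Returns one value per particle."""
        n_frames, n_particles, _ = traj.shape
        # Pairwise distances in every frame: (n_frames, n_particles, n_particles)
        diff = traj[:, :, None, :] - traj[:, None, :, :]
        dist = np.linalg.norm(diff, axis=-1)
        mean_d = dist.mean(axis=0)
        std_d = dist.std(axis=0)
        values = np.zeros(n_particles)
        for i in range(n_particles):
            neighbours = (mean_d[i] > 0) & (mean_d[i] < cutoff)
            if neighbours.any():
                values[i] = np.mean(std_d[i, neighbours] / mean_d[i, neighbours])
        return values

    rng = np.random.default_rng(2)
    lattice = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), -1).reshape(-1, 2)
    frames = lattice + rng.normal(0, 0.05, (100, 25, 2))   # thermal-like jitter on a square lattice
    print("mean per-particle Lindemann value:", lindemann_per_particle(frames).mean())
    ```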

  14. Classification of 'Chemlali' accessions according to the geographical area using chemometric methods of phenolic profiles analysed by HPLC-ESI-TOF-MS.

    PubMed

    Taamalli, Amani; Arráez Román, David; Zarrouk, Mokhtar; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2012-05-01

    The present work describes a method for classifying Tunisian 'Chemlali' olive oils based on their phenolic composition and geographical area. For this purpose, data obtained by HPLC-ESI-TOF-MS from 13 samples of extra virgin olive oil, collected from different production areas throughout the country, were used, focusing on the 23 phenolic compounds detected. The quantitative results showed significant variability among the analysed oil samples. Factor analysis based on principal components was applied to the data to reduce the number of factors explaining the variability of the selected compounds. The resulting data matrix was subjected to canonical discriminant analysis (CDA) to classify the oil samples. The results showed that 100% of cross-validated original group cases were correctly classified, which demonstrates the usefulness of the selected variables. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

    This paper proposes a method to measure the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their image of the promoted products. Two kinds of time-series data are generated with the proposed method: the first represents the fluctuation over time of interest in the TV advertisements, and the second represents the fluctuation over time of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in a TV advertisement and their images of the promoted products. By applying the proposed method to an Internet bulletin board dealing with a certain cosmetic brand, we show that the images of the products vary depending on differences in interest in each TV advertisement.
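
    A minimal sketch of the kind of lagged correlation analysis described above follows; the weekly series are synthetic placeholders, whereas the real method would use interest and image scores extracted from bulletin-board posts.

    ```python
    # Sketch: lagged Pearson correlation between weekly advertisement interest
    # and weekly product-image mentions (synthetic placeholder data).
    import numpy as np

    def lagged_correlation(ad_interest, product_image, max_lag=8):
        """Correlation of the two series with the image series shifted `lag` weeks later."""
        n = len(ad_interest)
        results = {}
        for lag in range(max_lag + 1):
            x = ad_interest[: n - lag]
            y = product_image[lag:]
            results[lag] = float(np.corrcoef(x, y)[0, 1])
        return results

    rng = np.random.default_rng(3)
    ad = rng.poisson(20, 52).astype(float)                 # weekly ad-related posts
    image = np.roll(ad, 2) * 0.5 + rng.normal(0, 2, 52)    # image follows ~2 weeks later
    for lag, r in lagged_correlation(ad, image).items():
        print(f"lag {lag} weeks: r = {r:+.2f}")
    ```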

  16. Predicting clinical trial results based on announcements of interim analyses

    PubMed Central

    2014-01-01

    Background Announcements of interim analyses of a clinical trial convey information about the results beyond the trial’s Data Safety Monitoring Board (DSMB). The amount of information conveyed may be minimal, but the fact that none of the trial’s stopping boundaries has been crossed implies that the experimental therapy is neither extremely effective nor hopeless. Predicting success of the ongoing trial is of interest to the trial’s sponsor, the medical community, pharmaceutical companies, and investors. We determine the probability of trial success by quantifying only the publicly available information from interim analyses of an ongoing trial. We illustrate our method in the context of the National Surgical Adjuvant Breast and Bowel (NSABP) trial, C-08. Methods We simulated trials based on the specifics of the NSABP C-08 protocol that were publicly available. We quantified the uncertainty around the treatment effect using prior weights for the various possibilities in light of other colon cancer studies and other studies of the investigational agent, bevacizumab. We considered alternative prior distributions. Results Subsequent to the trial’s third interim analysis, our predictive probabilities were: that the trial would eventually be successful, 48.0%; would stop for futility, 7.4%; and would continue to completion without statistical significance, 44.5%. The actual trial continued to completion without statistical significance. Conclusions Announcements of interim analyses provide information outside the DSMB’s sphere of confidentiality. This information is potentially helpful to clinical trial prognosticators. ‘Information leakage’ from standard interim analyses such as in NSABP C-08 is conventionally viewed as acceptable even though it may be quite revealing. Whether leakage from more aggressive types of adaptations is acceptable should be assessed at the design stage. PMID:24607270
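
    A highly simplified sketch of the simulation idea follows. It assumes a two-arm trial with a binary endpoint, a small discrete prior over the true effect, and a single final analysis; the actual NSABP C-08 setting involves survival endpoints and group-sequential stopping boundaries, which are not reproduced here.

    ```python
    # Toy predictive-probability simulation (assumptions: binary endpoint,
    # discrete prior over the true effect, single final analysis).
    import numpy as np

    rng = np.random.default_rng(4)
    prior = [(0.00, 0.4), (0.05, 0.4), (0.10, 0.2)]   # (true absolute effect, prior weight)
    n_per_arm, n_sims, successes = 1000, 5000, 0

    for _ in range(n_sims):
        idx = rng.choice(len(prior), p=[w for _, w in prior])
        effect = prior[idx][0]
        x_c = rng.binomial(n_per_arm, 0.70)             # control-arm responders
        x_t = rng.binomial(n_per_arm, 0.70 + effect)    # treatment-arm responders
        p_c, p_t = x_c / n_per_arm, x_t / n_per_arm
        pooled = (x_c + x_t) / (2 * n_per_arm)
        se = np.sqrt(2 * pooled * (1 - pooled) / n_per_arm)
        z = (p_t - p_c) / se
        successes += z > 1.96                           # success at the final analysis

    print(f"predictive probability of a positive trial: {successes / n_sims:.2f}")
    ```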

  17. Phospholipid and Respiratory Quinone Analyses From Extreme Environments

    NASA Astrophysics Data System (ADS)

    Pfiffner, S. M.

    2008-12-01

    Extreme environments on Earth have been chosen as surrogate sites to test methods and strategies for the deployment of space craft in the search for extraterrestrial life. Surrogate sites for many of the NASA astrobiology institutes include the South African gold mines, Canadian subpermafrost, Atacama Desert, and acid rock drainage. Soils, sediments, rock cores, fracture waters, biofilms, and service and drill waters represent the types of samples collected from these sites. These samples were analyzed by gas chromatography mass spectrometry for phospholipid fatty acid methyl esters and by high performance liquid chromatography atmospheric pressure chemical ionization tandem mass spectrometry for respiratory quinones. Phospholipid analyses provided estimates of biomass, community composition, and compositional changes related to nutritional limitations or exposure to toxic conditions. Similar to phospholipid analyses, respiratory quinone analyses afforded identification of certain types of microorganisms in the community based on respiration and offered clues to in situ redox conditions. Depending on the number of samples analyzed, selected multivariate statistical methods were applied to relate membrane lipid results with site biogeochemical parameters. Successful detection of life signatures and refinement of methodologies at surrogate sites on Earth will be critical for the recognition of extraterrestrial life. At this time, membrane lipid analyses provide useful information not easily obtained by other molecular techniques.

  18. Evaluation of different approaches for identifying optimal sites to predict mean hillslope soil moisture content

    NASA Astrophysics Data System (ADS)

    Liao, Kaihua; Zhou, Zhiwen; Lai, Xiaoming; Zhu, Qing; Feng, Huihui

    2017-04-01

    The identification of representative soil moisture sampling sites is important for the validation of remotely sensed mean soil moisture in a certain area and for ground-based soil moisture measurements in catchment or hillslope hydrological studies. Numerous approaches have been developed to identify optimal sites for predicting mean soil moisture. Each method has certain advantages and disadvantages, but they have rarely been evaluated and compared. In our study, surface (0-20 cm) soil moisture data from January 2013 to March 2016 (a total of 43 sampling days) were collected at 77 sampling sites on a mixed land-use (tea and bamboo) hillslope in the hilly area of Taihu Lake Basin, China. A total of 10 methods (temporal stability (TS) analyses based on 2 indices, K-means clustering based on 6 kinds of inputs, and 2 random sampling strategies) were evaluated for determining optimal sampling sites for mean soil moisture estimation. They were TS analyses based on the smallest index of temporal stability (ITS, a combination of the mean relative difference and standard deviation of relative difference (SDRD)) and based on the smallest SDRD; K-means clustering based on soil properties and terrain indices (EFs), repeated soil moisture measurements (Theta), EFs plus one-time soil moisture data (EFsTheta), and the principal components derived from EFs (EFs-PCA), Theta (Theta-PCA), and EFsTheta (EFsTheta-PCA); and global and stratified random sampling strategies. Results showed that the TS based on the smallest ITS was better (RMSE = 0.023 m3 m-3) than that based on the smallest SDRD (RMSE = 0.034 m3 m-3). The K-means clustering based on EFsTheta (-PCA) was better (RMSE <0.020 m3 m-3) than those based on EFs (-PCA) and Theta (-PCA). The sampling design stratified by land use was more efficient than the global random method. Forty and 60 sampling sites were needed for stratified and global sampling, respectively, to make their performances comparable to the best K-means method (EFsTheta-PCA). Overall, TS required only one site, but its accuracy was limited. The best K-means method required <8 sites and yielded high accuracy, but extra soil and terrain information is necessary when using this method. The stratified sampling strategy should only be used if no prior knowledge of soil moisture variation is available. This information will help in selecting the optimal methods for estimating the area mean soil moisture.
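
    The K-means idea described above can be sketched as follows: candidate sites are clustered on standardised features (e.g., terrain indices, soil properties and/or repeated moisture readings), and the site closest to each cluster centroid is taken as a representative sampling location. The feature matrix below is a random placeholder.

    ```python
    # Sketch: selecting representative sampling sites as the members closest to
    # each K-means centroid (placeholder features for 77 candidate sites).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    features = rng.normal(size=(77, 4))        # 77 candidate sites, 4 site descriptors

    X = StandardScaler().fit_transform(features)
    km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)

    representatives = []
    for k in range(km.n_clusters):
        members = np.where(km.labels_ == k)[0]
        dists = np.linalg.norm(X[members] - km.cluster_centers_[k], axis=1)
        representatives.append(int(members[np.argmin(dists)]))

    print("representative site indices:", sorted(representatives))
    ```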

  19. Chemical modification of magnetite nanoparticles and preparation of acrylic-base magnetic nanocomposite particles via miniemulsion polymerization

    NASA Astrophysics Data System (ADS)

    Mahdieh, Athar; Mahdavian, Ali Reza; Salehi-Mobarakeh, Hamid

    2017-03-01

    Magnetic nanocomposite particles have attracted much interest because of their versatile applications. A new method for the chemical modification of Fe3O4 nanoparticles with polymerizable groups is presented here. After synthesis of Fe3O4 nanoparticles by the co-precipitation method, they were modified sequentially with 3-aminopropyl triethoxysilane (APTES), acryloyl chloride (AC) and benzoyl chloride (BC), and all products were characterized by FTIR, XRD, SEM and TGA analyses. The modified magnetite nanoparticles bearing unsaturated acrylic groups were then copolymerized with methyl methacrylate (MMA), butyl acrylate (BA) and acrylic acid (AA) through miniemulsion polymerization. Although several reports exist on the preparation of magnetite-based polymer particles, the efficiency of magnetite encapsulation at reasonable content and obtaining final stable latexes with limited aggregation of Fe3O4 are still important issues. These were addressed here by controlling the reaction parameters. Hence, a series of magnetic nanocomposite latex particles containing different amounts of Fe3O4 nanoparticles (0-10 wt%) were prepared with core-shell morphology and diameters below 200 nm, and were characterized by FT-IR, DSC and TGA analyses. Their morphology and size distribution were also studied by SEM, TEM and DLS analyses. The magnetic properties of all products were measured by VSM analysis, and the results revealed almost superparamagnetic behaviour for the obtained nanocomposite particles.

  20. Measuring stigma after spinal cord injury: Development and psychometric characteristics of the SCI-QOL Stigma item bank and short form.

    PubMed

    Kisala, Pamela A; Tulsky, David S; Pace, Natalie; Victorson, David; Choi, Seung W; Heinemann, Allen W

    2015-05-01

    To develop a calibrated item bank and computer adaptive test (CAT) to assess the effects of stigma on health-related quality of life in individuals with spinal cord injury (SCI). Grounded-theory based qualitative item development methods, large-scale item calibration field testing, confirmatory factor analysis, and item response theory (IRT)-based psychometric analyses. Five SCI Model System centers and one Department of Veterans Affairs medical center in the United States. Adults with traumatic SCI. SCI-QOL Stigma Item Bank. A sample of 611 individuals with traumatic SCI completed 30 items assessing SCI-related stigma. After 7 items were iteratively removed, factor analyses confirmed a unidimensional pool of items. Graded Response Model IRT analyses were used to estimate slopes and thresholds for the final 23 items. The SCI-QOL Stigma item bank is unique not only in the assessment of SCI-related stigma but also in the inclusion of individuals with SCI in all phases of its development. Use of confirmatory factor analytic and IRT methods provides flexibility and precision of measurement. The item bank may be administered as a CAT or as a 10-item fixed-length short form and can be used for research and clinical applications.

  1. Spatio-temporal scaling effects on longshore sediment transport pattern along the nearshore zone

    NASA Astrophysics Data System (ADS)

    Khorram, Saeed; Ergil, Mustafa

    2018-03-01

    Entropy, a measure of uncertainty, has been employed in applications as diverse as probability inference in coastal engineering. Theories integrating entropy with sediment transport offer novel perspectives for coastal analysis and modelling, and their application and development still have far to go. In the present paper we propose a method based on an entropy-power index for analysing spatio-temporal patterns. Results from a beach-area case study show that the index is suitable for analysing components of marine and hydrological ecosystems. The method makes use of six sets of monthly Makran coastal data (1970-2015) and studies time-dependent variables such as spatio-temporal patterns, long-shore sediment transport rate (LSTR), wind speed, and wave height, all of which play considerable roles in terrestrial coastal investigations; these variables show meaningful spatio-temporal variability most of the time, but their combined behaviour is not easy to explain. Accordingly, an entropy-power index can reveal considerable signals that facilitate the evaluation of water resources and provide insight into the interactions of hydrological parameters at scales as large as beach areas. Results reveal that an entropy-based spatio-temporal disorder dynamics power index (STDDPI) can simulate wave, long-shore sediment transport rate, and wind behaviour when granulometry, concentration, and flow conditions vary.

  2. Measuring stigma after spinal cord injury: Development and psychometric characteristics of the SCI-QOL Stigma item bank and short form

    PubMed Central

    Kisala, Pamela A.; Tulsky, David S.; Pace, Natalie; Victorson, David; Choi, Seung W.; Heinemann, Allen W.

    2015-01-01

    Objective To develop a calibrated item bank and computer adaptive test (CAT) to assess the effects of stigma on health-related quality of life in individuals with spinal cord injury (SCI). Design Grounded-theory based qualitative item development methods, large-scale item calibration field testing, confirmatory factor analysis, and item response theory (IRT)-based psychometric analyses. Setting Five SCI Model System centers and one Department of Veterans Affairs medical center in the United States. Participants Adults with traumatic SCI. Main Outcome Measures SCI-QOL Stigma Item Bank. Results A sample of 611 individuals with traumatic SCI completed 30 items assessing SCI-related stigma. After 7 items were iteratively removed, factor analyses confirmed a unidimensional pool of items. Graded Response Model IRT analyses were used to estimate slopes and thresholds for the final 23 items. Conclusions The SCI-QOL Stigma item bank is unique not only in the assessment of SCI-related stigma but also in the inclusion of individuals with SCI in all phases of its development. Use of confirmatory factor analytic and IRT methods provides flexibility and precision of measurement. The item bank may be administered as a CAT or as a 10-item fixed-length short form and can be used for research and clinical applications. PMID:26010973

  3. Interim analyses in 2 x 2 crossover trials.

    PubMed

    Cook, R J

    1995-09-01

    A method is presented for performing interim analyses in long term 2 x 2 crossover trials with serial patient entry. The analyses are based on a linear statistic that combines data from individuals observed for one treatment period with data from individuals observed for both periods. The coefficients in this linear combination can be chosen quite arbitrarily, but we focus on variance-based weights to maximize power for tests regarding direct treatment effects. The type I error rate of this procedure is controlled by utilizing the joint distribution of the linear statistics over analysis stages. Methods for performing power and sample size calculations are indicated. A two-stage sequential design involving simultaneous patient entry and a single between-period interim analysis is considered in detail. The power and average number of measurements required for this design are compared to those of the usual crossover trial. The results indicate that, while there is minimal loss in power relative to the usual crossover design in the absence of differential carry-over effects, the proposed design can have substantially greater power when differential carry-over effects are present. The two-stage crossover design can also lead to more economical studies in terms of the expected number of measurements required, due to the potential for early stopping. Attention is directed toward normally distributed responses.
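
    A tiny numerical illustration of the variance-based weighting mentioned above: an inverse-variance combination of a one-period estimate and a two-period estimate of the direct treatment effect (the numbers are made up, and the paper's joint-distribution machinery for controlling the type I error across analysis stages is not shown).

    ```python
    # Inverse-variance weighting of two estimates of the direct treatment
    # effect (hypothetical numbers, purely illustrative).
    def combine(est1, var1, est2, var2):
        w1, w2 = 1.0 / var1, 1.0 / var2
        estimate = (w1 * est1 + w2 * est2) / (w1 + w2)
        variance = 1.0 / (w1 + w2)
        return estimate, variance

    est, var = combine(est1=1.8, var1=0.50, est2=2.2, var2=0.20)
    print(f"combined effect {est:.2f}, variance {var:.3f}")
    ```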

  4. [Methylmercury: existing recommendations; methods of analysing and interpreting the results; economic evaluation].

    PubMed

    González-Estecha, Montserrat; Bodas-Pinedo, Andrés; Martínez-García, María José; Trasobares-Iglesias, Elena M; Bermejo-Barrera, Pilar; Ordóñez-Iriarte, José María; Llorente-Ballesteros, María Teresa; Prieto-Menchero, Santiago; Guillén-Pérez, José Jesús; Martell-Claros, Nieves; Cuadrado-Cenzual, María Ángeles; Rubio-Herrera, Miguel Ángel; Martínez-Álvarez, Jesús Román; Calvo-Manuel, Elpidio; Farré-Rovira, Rosaura; Herráiz-Martínez, Miguel Ángel; Bretón Lesmes, Irene; García-Donaire, José Antonio; Sáinz-Martín, María; Martínez-Astorquiza, Txantón; Gallardo-Pino, Carmen; Moreno-Rojas, Rafael; Salas-Salvadó, Jordi; Blanco Fuentes, María; Arroyo-Fernández, Manuel; Calle Pascual, Alfonso

    2014-11-04

    The beneficial effects of fish consumption are well known. Nevertheless, there is worldwide concern regarding methylmercury concentrations in fish, which is why many countries such as the United States, Australia, New Zealand, Canada and numerous European countries have made fish consumption recommendations for their populations, particularly vulnerable groups, in order to limit methylmercury intake. Blood and hair are the best biological samples for measuring methylmercury. The most widely used method to analyse methylmercury is cold vapor atomic absorption spectrometry, although there are also direct methods based on the thermal decomposition of the sample. In recent years, the number of laboratories that measure mercury by inductively coupled plasma mass spectrometry has increased. In addition, the different kinds of mercury can be distinguished by coupling chromatographic separation methods. Laboratories that analyse mercury in biological samples need to participate in external quality control programmes. Even if mercury emissions are reduced, mercury may remain in the environment for many years, so dietary recommendations are fundamental in order to reduce exposure. It is necessary to propose public health measures aimed at decreasing mercury exposure and to evaluate the benefits of such measures from the economic and social standpoints. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  5. Implementation of novel statistical procedures and other advanced approaches to improve analysis of CASA data.

    PubMed

    Ramón, M; Martínez-Pastor, F

    2018-04-23

    Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the internal heterogeneity of sperm samples and its relevance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods, based on machine learning. The former has allowed exploration of subpopulation patterns in many species, whereas the latter offers further possibilities, especially for functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although clustering analyses of the data provided by CASA systems yield valuable information on sperm samples, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.

  6. Method variation in the impact of missing data on response shift detection.

    PubMed

    Schwartz, Carolyn E; Sajobi, Tolulope T; Verdam, Mathilde G E; Sebille, Veronique; Lix, Lisa M; Guilleux, Alice; Sprangers, Mirjam A G

    2015-03-01

    Missing data due to attrition or item non-response can result in biased estimates and loss of power in longitudinal quality-of-life (QOL) research. The impact of missing data on response shift (RS) detection is relatively unknown. This overview article synthesizes the findings of three methods tested in this special section regarding the impact of missing data patterns on RS detection in incomplete longitudinal data. The RS detection methods investigated include: (1) Relative importance analysis to detect reprioritization RS in stroke caregivers; (2) Oort's structural equation modeling (SEM) to detect recalibration, reprioritization, and reconceptualization RS in cancer patients; and (3) Rasch-based item-response theory-based (IRT) models as compared to SEM models to detect recalibration and reprioritization RS in hospitalized chronic disease patients. Each method dealt with missing data differently, either with imputation (1), attrition-based multi-group analysis (2), or probabilistic analysis that is robust to missingness due to the specific objectivity property (3). Relative importance analyses were sensitive to the type and amount of missing data and imputation method, with multiple imputation showing the largest RS effects. The attrition-based multi-group SEM revealed differential effects of both the changes in health-related QOL and the occurrence of response shift by attrition stratum, and enabled a more complete interpretation of findings. The IRT RS algorithm found evidence of small recalibration and reprioritization effects in General Health, whereas SEM mostly evidenced small recalibration effects. These differences may be due to differences between the two methods in handling of missing data. Missing data imputation techniques result in different conclusions about the presence of reprioritization RS using the relative importance method, while the attrition-based SEM approach highlighted different recalibration and reprioritization RS effects by attrition group. The IRT analyses detected more recalibration and reprioritization RS effects than SEM, presumably due to IRT's robustness to missing data. Future research should apply simulation techniques in order to make conclusive statements about the impacts of missing data according to the type and amount of RS.

  7. Similarity Measures for Protein Ensembles

    PubMed Central

    Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper

    2009-01-01

    Analyses of similarities and changes in protein conformation can provide important information regarding protein function and evolution. Many scores, including the commonly used root mean square deviation, have therefore been developed to quantify the similarities of different protein conformations. However, instead of examining individual conformations it is in many cases more relevant to analyse ensembles of conformations that have been obtained either through experiments or from methods such as molecular dynamics simulations. We here present three approaches that can be used to compare conformational ensembles in the same way as the root mean square deviation is used to compare individual pairs of structures. The methods are based on the estimation of the probability distributions underlying the ensembles and subsequent comparison of these distributions. We first validate the methods using a synthetic example from molecular dynamics simulations. We then apply the algorithms to revisit the problem of ensemble averaging during structure determination of proteins, and find that an ensemble refinement method is able to recover the correct distribution of conformations better than standard single-molecule refinement. PMID:19145244
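
    As a sketch in the spirit of this approach, one can summarise each ensemble by a multivariate Gaussian over some low-dimensional coordinate projection and compare the two Gaussians with the Bhattacharyya distance; the scores proposed in the paper are more general than this, so the code below is only an illustration of comparing ensembles through their estimated probability distributions.

    ```python
    # Sketch: compare two conformational ensembles via Gaussian estimates of
    # their distributions over a few collective coordinates (an assumption;
    # not the paper's actual scores).
    import numpy as np

    def bhattacharyya_gaussian(x, y):
        """Bhattacharyya distance between Gaussian fits to two samples (rows = frames)."""
        mu_x, mu_y = x.mean(axis=0), y.mean(axis=0)
        cov_x, cov_y = np.cov(x, rowvar=False), np.cov(y, rowvar=False)
        cov = 0.5 * (cov_x + cov_y)
        diff = mu_x - mu_y
        term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
        term2 = 0.5 * np.log(np.linalg.det(cov) /
                             np.sqrt(np.linalg.det(cov_x) * np.linalg.det(cov_y)))
        return term1 + term2

    rng = np.random.default_rng(6)
    ensemble_a = rng.normal(0.0, 1.0, size=(500, 3))   # e.g. 3 collective coordinates
    ensemble_b = rng.normal(0.3, 1.2, size=(500, 3))
    print("Bhattacharyya distance:",
          round(float(bhattacharyya_gaussian(ensemble_a, ensemble_b)), 3))
    ```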

  8. Upgrading Existing Buildings to Universal Design. What Cost-Benefit Analyses Can Tell Us.

    PubMed

    Aslaksen, Finn

    2016-01-01

    This article is based on a project aimed at identifying the benefits of different measures to upgrade existing public buildings and outdoor areas so that they are accessible for all. The study was initiated by the Ministry of Children and Equality, which asked for a study of benefits based on a stated preferences (SP) method and an easy-to-complete calculation tool for cost-benefit analysis (CBA). In the project, 18 commonly used measures and their typical costs were identified, and the benefits of each measure were analysed in a stated preference study. The SP analyses included 9 multiple choices presented in 4 different sequences in an Internet-based survey with 800 respondents. The project concluded that it is possible to use a stated preferences survey to identify respondents' valuation of measures to improve accessibility in existing buildings. Some of the measures have a high benefit-cost ratio. The project report, including the calculation manual, is based on the average valuation for each measure, but the background analyses (not referred to in the report) also include analyses of valuations by target group for the various measures. The target groups were defined for each measure based on information about the respondents' abilities and use of technical aids. The analyses presented in this paper indicate how valuation varies between the target groups and the average population; this is termed the measure's profile. Some measures have benefits for the target group that are only twice as high as for the average citizen, while other measures have high benefits only for the target group. The first type, which has a wide profile, often has high overall socioeconomic benefits, while the second, with a narrow profile, more often has low overall socioeconomic benefits but may be very important for certain user groups and therefore essential for the elimination of discrimination and exclusion of those groups.

  9. Comparative Proteomic Analyses of Human Adipose Extracellular Matrices Decellularized Using Alternative Procedures.

    PubMed

    Thomas-Porch, Caasy; Li, Jie; Zanata, Fabiana; Martin, Elizabeth C; Pashos, Nicholas; Genemaras, Kaylynn; Poche, J Nicholas; Totaro, Nicholas P; Bratton, Melyssa R; Gaupp, Dina; Frazier, Trivia; Wu, Xiying; Ferreira, Lydia Masako; Tian, Weidong; Wang, Guangdi; Bunnell, Bruce A; Flynn, Lauren; Hayes, Daniel; Gimble, Jeffrey M

    2018-04-25

    Decellularized human adipose tissue has potential clinical utility as a processed biological scaffold for soft tissue cosmesis, grafting and reconstruction. Adipose tissue decellularization has been accomplished using enzymatic-, detergent-, and/or solvent-based methods. To examine the hypothesis that distinct decellularization processes may yield scaffolds with differing compositions, the current study employed mass spectrometry to compare the proteomes of human adipose-derived matrices generated through three independent methods combining enzymatic-, detergent-, and/or solvent-based steps. In addition to protein content, bioscaffolds were evaluated for DNA depletion, ECM composition, and physical structure using optical density, histochemical staining, and scanning electron microscopy (SEM). Mass spectrometry (MS) based proteomic analyses identified 25 proteins (having at least two peptide sequences detected) in the scaffolds generated with an enzymatic approach, 143 with the detergent approach, and 102 with the solvent approach, as compared to 155 detected in unprocessed native human fat. Immunohistochemical detection confirmed the presence of the structural proteins actin, collagen type VI, fibrillin, laminin, and vimentin. Subsequent in vivo analysis of the predominantly enzymatic- and detergent-based decellularized scaffolds following subcutaneous implantation in GFP + transgenic mice demonstrated that the matrices generated with both approaches supported the ingrowth of host-derived adipocyte progenitors and vasculature in a time dependent manner. Together, these results determine that decellularization methods influence the protein composition of adipose tissue-derived bioscaffolds. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.

  10. A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.

    PubMed

    Tipton, Elizabeth; Shuster, Jonathan

    2017-10-15

    Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of the differences between the paired measurements is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
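
    For readers unfamiliar with the basic quantities, the sketch below computes the bias and 95% limits of agreement for a single method-comparison study from paired measurements; the meta-analytic pooling, repeated-measures adjustments and robust variance estimation described in the paper are not shown.

    ```python
    # Single-study Bland-Altman quantities: bias and 95% limits of agreement
    # (synthetic paired measurements; no meta-analytic pooling shown).
    import numpy as np

    def limits_of_agreement(new, gold):
        d = np.asarray(new) - np.asarray(gold)
        bias, sd = d.mean(), d.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    rng = np.random.default_rng(7)
    gold = rng.normal(100, 10, 40)
    new = gold + rng.normal(1.5, 4, 40)       # new method: small bias, extra noise
    bias, loa = limits_of_agreement(new, gold)
    print(f"bias {bias:.1f}, 95% limits of agreement {loa[0]:.1f} to {loa[1]:.1f}")
    ```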

  11. Alkaloid profiles of Mimosa tenuiflora and associated methods of analysis

    USDA-ARS?s Scientific Manuscript database

    The alkaloid contents of the leaves and seeds of M. tenuiflora collected from northeastern Brazil were studied. Alkaloids were isolated by classical acid/base extraction procedures and by cation exchange solid phase extraction. The crude alkaloid fractions were then analysed by thin layer chromatogr...

  12. RECOVERING FILTER-BASED MICROARRAY DATA FOR PATHWAYS ANALYSIS USING A MULTIPOINT ALIGNMENT STRATEGY

    EPA Science Inventory

    The use of commercial microarrays are rapidly becoming the method of choice for profiling gene expression and assessing various disease states. Research Genetics has provided a series of well defined biological and software tools to the research community for these analyses. Th...

  13. Socioeconomic Strata, Mobile Technology, and Education: A Comparative Analysis

    ERIC Educational Resources Information Center

    Kim, Paul; Hagashi, Teresita; Carillo, Laura; Gonzales, Irina; Makany, Tamas; Lee, Bommi; Garate, Alberto

    2011-01-01

    Mobile devices are highly portable, easily distributable, substantially affordable, and have the potential to be pedagogically complementary resources in education. This study, incorporating mixed method analyses, discusses the implications of a mobile learning technology-based learning model in two public primary schools near the Mexico-USA…

  14. Analyses of procyanidins in foods using Diol phase HPLC

    USDA-ARS?s Scientific Manuscript database

    Separation of procyanidins using silica-based HPLC suffered from poor resolution for higher oligomers and low sensitivity due to the fluorescence quenching effects of methylene chloride in the mobile phase. Optimization of a published Diol-phase HPLC method resulted in near baseline separation for p...

  15. Molecular diversity of drinking water bacterial communities using 16S rRNA gene sequence analyses

    EPA Science Inventory

    Our understanding of the microbial community structure of drinking water distribution system has relied on culture-based methods. However, recent studies have suggested that the majority of bacteria inhabiting distribution systems are unable to grow on artificial media. The goal ...

  16. 77 FR 40601 - Final Priority: Disability and Rehabilitation Research Projects and Centers Program; Disability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... analyses of data, producing observational findings, and creating other sources of research-based... provide a rationale for the stage of research being proposed and the research methods associated with the... DEPARTMENT OF EDUCATION Final Priority: Disability and Rehabilitation Research Projects and...

  17. Evaluation of ice-tea quality by DART-TOF/MS.

    PubMed

    Rajchl, Aleš; Prchalová, Jana; Kružík, Vojtěch; Ševčík, Rudolf; Čížková, Helena

    2015-11-01

    DART (Direct Analysis in Real Time) coupled with Time-of-Flight Mass Spectrometry (TOF/MS) has been used for analyses of ice-teas. The article focuses on the quality and authenticity of ice-teas, one of the most important tea-based products on the market. Twenty-one samples of ice-teas (black and green) were analysed. Selected constituents of the ice-teas were determined: theobromine, caffeine, total phenolic compounds, total soluble solids, total amino acid concentration, preservatives and saccharides. Fingerprints of DART-TOF/MS spectra were used for comprehensive assessment of the ice-tea samples. The DART-TOF/MS method was used for monitoring the following compounds: citric acid, caffeine, saccharides, artificial sweeteners (saccharin, acesulphame K), preservatives (sorbic and benzoic acid), phosphoric acid and phenolic compounds. The measured data were subjected to a principal components analysis. The HPLC and DART-TOF/MS methods were compared in terms of the determination of selected compounds (caffeine, benzoic acid, sorbic acid and saccharides) in the ice-teas. The DART-TOF/MS technique seems to be a suitable method for fast screening and for testing the quality and authenticity of tea-based products. Copyright © 2015 John Wiley & Sons, Ltd.

  18. A screening method based on UV-Visible spectroscopy and multivariate analysis to assess addition of filler juices and water to pomegranate juices.

    PubMed

    Boggia, Raffaella; Casolino, Maria Chiara; Hysenaj, Vilma; Oliveri, Paolo; Zunin, Paola

    2013-10-15

    Consumer demand for pomegranate juice has grown considerably in recent years because of its potential health benefits. Since it is an expensive functional food, addition of cheaper fruit juices (e.g., grape and apple juices), simple dilution, or subtraction of polyphenols are deceptively used. At present, time-consuming analyses are used to control the quality of this product; furthermore, these analyses are expensive and require well-trained analysts. Thus, the purpose of this study was to propose a high-speed and easy-to-use shortcut. Based on UV-VIS spectroscopy and chemometrics, a screening method is proposed to quickly detect some common fillers of pomegranate juice that could decrease the antiradical scavenging capacity of pure products. The analytical method was applied to laboratory-prepared juices, to commercial juices and to representative experimental mixtures at different levels of water and filler juices. The outcomes were evaluated by means of multivariate exploratory analysis. The results indicate that the proposed strategy can be a useful screening tool to assess the addition of filler juices and water to pomegranate juices. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. ESEA: Discovering the Dysregulated Pathways based on Edge Set Enrichment Analysis

    PubMed Central

    Han, Junwei; Shi, Xinrui; Zhang, Yunpeng; Xu, Yanjun; Jiang, Ying; Zhang, Chunlong; Feng, Li; Yang, Haixiu; Shang, Desi; Sun, Zeguo; Su, Fei; Li, Chunquan; Li, Xia

    2015-01-01

    Pathway analyses are playing an increasingly important role in understanding biological mechanism, cellular function and disease states. Current pathway-identification methods generally focus on only the changes of gene expression levels; however, the biological relationships among genes are also the fundamental components of pathways, and the dysregulated relationships may also alter the pathway activities. We propose a powerful computational method, Edge Set Enrichment Analysis (ESEA), for the identification of dysregulated pathways. This provides a novel way of pathway analysis by investigating the changes of biological relationships of pathways in the context of gene expression data. Simulation studies illustrate the power and performance of ESEA under various simulated conditions. Using real datasets from p53 mutation, Type 2 diabetes and lung cancer, we validate effectiveness of ESEA in identifying dysregulated pathways. We further compare our results with five other pathway enrichment analysis methods. With these analyses, we show that ESEA is able to help uncover dysregulated biological pathways underlying complex traits and human diseases via specific use of the dysregulated biological relationships. We develop a freely available R-based tool of ESEA. Currently, ESEA can support pathway analysis of the seven public databases (KEGG; Reactome; Biocarta; NCI; SPIKE; HumanCyc; Panther). PMID:26267116

  20. GEMINI: Integrative Exploration of Genetic Variation and Genome Annotations

    PubMed Central

    Paila, Umadevi; Chapman, Brad A.; Kirchner, Rory; Quinlan, Aaron R.

    2013-01-01

    Modern DNA sequencing technologies enable geneticists to rapidly identify genetic variation among many human genomes. However, isolating the minority of variants underlying disease remains an important, yet formidable challenge for medical genetics. We have developed GEMINI (GEnome MINIng), a flexible software package for exploring all forms of human genetic variation. Unlike existing tools, GEMINI integrates genetic variation with a diverse and adaptable set of genome annotations (e.g., dbSNP, ENCODE, UCSC, ClinVar, KEGG) into a unified database to facilitate interpretation and data exploration. Whereas other methods provide an inflexible set of variant filters or prioritization methods, GEMINI allows researchers to compose complex queries based on sample genotypes, inheritance patterns, and both pre-installed and custom genome annotations. GEMINI also provides methods for ad hoc queries and data exploration, a simple programming interface for custom analyses that leverage the underlying database, and both command line and graphical tools for common analyses. We demonstrate GEMINI's utility for exploring variation in personal genomes and family based genetic studies, and illustrate its ability to scale to studies involving thousands of human samples. GEMINI is designed for reproducibility and flexibility and our goal is to provide researchers with a standard framework for medical genomics. PMID:23874191

  1. Blade resonance parameter identification based on tip-timing method without the once-per revolution sensor

    NASA Astrophysics Data System (ADS)

    Guo, Haotian; Duan, Fajie; Zhang, Jilong

    2016-01-01

    Blade tip-timing is the most effective method for online measurement of blade vibration in turbomachinery. In this article a tip-timing-based method for measuring synchronous resonance vibration of blades is presented. The method requires no once-per-revolution sensor, which makes it more generally applicable where this sensor is difficult to install, especially for the high-pressure rotors of dual-rotor engines. Only three casing-mounted probes are required to identify the engine order, amplitude, natural frequency and damping coefficient of the blade. A method is developed to identify which blade each tip-timing measurement belongs to without a once-per-revolution sensor. Theoretical analyses of resonance parameter measurement are presented, and the theoretical error of the method is investigated and corrected. Experiments are conducted and the results indicate that blade resonance parameter identification is achieved without a once-per-revolution sensor.

  2. Size exclusion chromatography for analyses of fibroin in silk: optimization of sampling and separation conditions

    NASA Astrophysics Data System (ADS)

    Pawcenis, Dominika; Koperska, Monika A.; Milczarek, Jakub M.; Łojewski, Tomasz; Łojewska, Joanna

    2014-02-01

    The direct goal of this paper was to improve the methods of sample preparation and separation for analyses of the fibroin polypeptide using size exclusion chromatography (SEC). The motivation for the study arises from our interest in natural polymers included in historic textile and paper artifacts, and is a logical response to the urgent need to develop rationale-based methods for materials conservation. The first step is to develop a reliable analytical tool that would give insight into fibroin structure and its changes caused by both natural and artificial ageing. To investigate the influence of preparation conditions, two sets of artificially aged samples were prepared (with and without NaCl in the sample solution) and measured by means of SEC with a multi-angle laser light scattering detector. It was shown that dialysis of fibroin dissolved in LiBr solution allows removal of the salt, which otherwise damages chromatographic column packings and prevents reproducible analyses. Salt-rich (NaCl) aqueous solutions of fibroin improved the quality of the chromatograms.

  3. Triplet supertree heuristics for the tree of life

    PubMed Central

    Lin, Harris T; Burleigh, J Gordon; Eulenstein, Oliver

    2009-01-01

    Background There is much interest in developing fast and accurate supertree methods to infer the tree of life. Supertree methods combine smaller input trees with overlapping sets of taxa to make a comprehensive phylogenetic tree that contains all of the taxa in the input trees. The intrinsically hard triplet supertree problem takes a collection of input species trees and seeks a species tree (supertree) that maximizes the number of triplet subtrees that it shares with the input trees. However, the utility of this supertree problem has been limited by a lack of efficient and effective heuristics. Results We introduce fast hill-climbing heuristics for the triplet supertree problem that perform a step-wise search of the tree space, where each step is guided by an exact solution to an instance of a local search problem. To realize time-efficient heuristics we designed the first nontrivial algorithms for two standard search problems, which improve on the time complexity of the best known (naïve) solutions by factors of n and n², respectively (where n is the number of taxa in the supertree). These algorithms enable large-scale supertree analyses based on the triplet supertree problem that were previously not possible. We implemented hill-climbing heuristics that are based on our new algorithms, and in analyses of two published supertree data sets, we demonstrate that our new heuristics outperform other standard supertree methods in maximizing the number of triplets shared with the input trees. Conclusion With our new heuristics, the triplet supertree problem is now computationally more tractable for large-scale supertree analyses, and it provides a potentially more accurate alternative to existing supertree methods. PMID:19208181
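
    The triplet objective can be made concrete with a small sketch that counts how many rooted triplets two trees induce identically. It assumes rooted trees encoded as nested tuples of leaf names; for each set of three shared leaves, the induced topology is the pair that occurs together in some clade excluding the third leaf. This brute-force count is cubic in the number of leaves and only illustrates the quantity the heuristics maximise.

    ```python
    # Count rooted triplets shared by two trees given as nested tuples of
    # leaf names (brute-force illustration of the triplet objective).
    from itertools import combinations

    def clades(tree):
        """Return (leaf set, list of clade leaf sets) for a nested-tuple tree."""
        if isinstance(tree, str):
            return frozenset([tree]), []
        leaves, found = frozenset(), []
        for child in tree:
            child_leaves, child_clades = clades(child)
            leaves |= child_leaves
            found += child_clades
        return leaves, found + [leaves]

    def triplet_topology(clade_list, a, b, c):
        for pair in ((a, b), (a, c), (b, c)):
            third = ({a, b, c} - set(pair)).pop()
            if any(set(pair) <= cl and third not in cl for cl in clade_list):
                return frozenset(pair)
        return None   # unresolved

    def shared_triplets(t1, t2):
        leaves1, clades1 = clades(t1)
        leaves2, clades2 = clades(t2)
        shared = 0
        for a, b, c in combinations(sorted(leaves1 & leaves2), 3):
            top1 = triplet_topology(clades1, a, b, c)
            top2 = triplet_topology(clades2, a, b, c)
            shared += top1 is not None and top1 == top2
        return shared

    t1 = ((("A", "B"), "C"), ("D", "E"))
    t2 = ((("A", "B"), "D"), ("C", "E"))
    print("shared triplets:", shared_triplets(t1, t2))
    ```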

  4. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results for agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can mainly be described quantitatively from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets based on the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a basis for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analysis and evaluation.
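
    A minimal sketch of the kind of Gaussian-distribution-based adjustment described above follows; the exact form used by the authors is not stated in the abstract, so the sketch simply rescales the coarse-scale reflectance sample so that its mean and standard deviation match those of the fine-scale baseline.

    ```python
    # Sketch: rescale a coarse-scale reflectance sample to the mean and
    # standard deviation of the fine-scale baseline (assumed simplification of
    # the Gaussian-distribution-based correction).
    import numpy as np

    def match_gaussian(coarse, baseline):
        """Shift and scale one reflectance sample to the baseline's mean and std."""
        return (coarse - coarse.mean()) / coarse.std() * baseline.std() + baseline.mean()

    rng = np.random.default_rng(8)
    fine_scale = rng.normal(0.25, 0.03, 10000)     # baseline surface reflectance
    coarse_scale = rng.normal(0.28, 0.05, 2500)    # multi-source, coarser observation
    corrected = match_gaussian(coarse_scale, fine_scale)
    print("corrected mean/std:", round(corrected.mean(), 3), round(corrected.std(), 3))
    ```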

  5. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications that use remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variation of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, a Gaussian-distribution-based correction of the multiple surface reflectance datasets was applied on the basis of the physical characteristics, mathematical distribution properties and spatial variations obtained above. The proposed method was verified with two sets of multiple satellite images acquired over two experimental fields, located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, providing a data basis for further multi-source and multi-scale crop growth monitoring, yield prediction and the corresponding consistency analysis and evaluation. PMID:25405760

  6. A seismic optimization procedure for reinforced concrete framed buildings based on eigenfrequency optimization

    NASA Astrophysics Data System (ADS)

    Arroyo, Orlando; Gutiérrez, Sergio

    2017-07-01

    Several seismic optimization methods have been proposed to improve the performance of reinforced concrete framed (RCF) buildings; however, they have not been widely adopted among practising engineers because they require complex nonlinear models and are computationally expensive. This article presents a procedure to improve the seismic performance of RCF buildings based on eigenfrequency optimization, which is effective, simple to implement and efficient. The method is used to optimize a 10-storey regular building, and its effectiveness is demonstrated by nonlinear time history analyses, which show important reductions in storey drifts and lateral displacements compared to a non-optimized building. A second example for an irregular six-storey building demonstrates that the method provides benefits to a wide range of RCF structures and supports the applicability of the proposed method.

  7. Mass-based design and optimization of wave rotors for gas turbine engine enhancement

    NASA Astrophysics Data System (ADS)

    Chan, S.; Liu, H.

    2017-03-01

    An analytic method aiming at mass properties was developed for the preliminary design and optimization of wave rotors. In the present method, we introduce the mass balance principle into the design and thus can predict and optimize the mass qualities as well as the performance of wave rotors. A dedicated least-square method with artificial weighting coefficients was developed to solve the over-constrained system in the mass-based design. This method and the adoption of the coefficients were validated by numerical simulation. Moreover, the problem of fresh air exhaustion (FAE) was put forward and analyzed, and exhaust gas recirculation (EGR) was investigated. Parameter analyses and optimization elucidated which designs would not only achieve the best performance, but also operate with minimum EGR and no FAE.

  8. A rule-based automatic sleep staging method.

    PubMed

    Liang, Sheng-Fu; Kuo, Chin-En; Hu, Yu-Han; Cheng, Yu-Shian

    2012-03-30

    In this paper, a rule-based automatic sleep staging method is proposed. Twelve features derived from temporal and spectral analyses of the EEG, EOG, and EMG signals were utilized. Normalization was applied to each feature to eliminate individual differences. A hierarchical decision tree with fourteen rules was constructed for sleep stage classification. Finally, a smoothing process considering the temporal contextual information was applied to preserve continuity. The overall agreement and kappa coefficient of the proposed method, applied to the all-night polysomnography (PSG) of seventeen healthy subjects and compared with manual scorings by the R&K rules, reach 86.68% and 0.79, respectively. This method could be integrated with a portable PSG system for at-home sleep evaluation in the near future. Copyright © 2012 Elsevier B.V. All rights reserved.
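
    The overall pipeline (per-subject normalization, rule-based staging, temporal smoothing) can be sketched as below. The rules, thresholds and feature names here are invented for illustration only and are much simpler than the paper's fourteen-rule hierarchical tree over twelve features.

```python
import numpy as np
from collections import Counter

def zscore(x):
    """Per-subject normalisation to suppress individual differences."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def stage_epoch(delta_power, emg_tone, rapid_eye_movement):
    """Toy rule set on normalised features (illustrative only)."""
    if emg_tone > 1.0:
        return "Wake"
    if rapid_eye_movement > 1.0 and emg_tone < -0.5:
        return "REM"
    if delta_power > 1.0:
        return "N3"
    return "N2"

def smooth(stages, window=5):
    """Temporal smoothing: replace each epoch by the majority stage in a window."""
    half = window // 2
    out = []
    for i in range(len(stages)):
        segment = stages[max(0, i - half): i + half + 1]
        out.append(Counter(segment).most_common(1)[0][0])
    return out
```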

  9. Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei

    2018-01-01

    In order to evaluate the development level of the low-voltage distribution network objectively and scientifically, the chromatography analysis method is utilized to construct an evaluation index model of the low-voltage distribution network. Based on principal component analysis and the roughly logarithmic distribution of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The improved algorithm decorrelates and reduces the dimensions of the evaluation model, and the resulting comprehensive score shows a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, and a stratified evaluation of the courts is thereby realized. An example is given to verify the objectivity and scientific soundness of the evaluation method.
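
    A minimal sketch of "logarithmic centralization" before principal component analysis is given below; the index matrix layout, the weighting of components by explained variance, and the function name are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def log_centred_pca(X, n_components=2):
    """Log-transform and centre an index matrix X (courts x indices), then
    project onto the leading principal components and form a weighted score."""
    Xl = np.log(X)                         # indices assumed positive and roughly log-normally distributed
    Xc = Xl - Xl.mean(axis=0)              # logarithmic centralisation
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n_components]
    scores = Xc @ vecs[:, order]           # decorrelated, dimension-reduced component scores
    explained = vals[order] / vals.sum()   # variance explained, used here as weights
    comprehensive = scores @ explained     # one comprehensive score per court
    return comprehensive, explained
```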

  10. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984

  11. Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.

    PubMed

    Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan

    2018-06-05

    Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene understanding algorithms have been based on whole point clouds, which has led to complicated operations, high computational loads and low processing speed. This paper presents novel methods to efficiently extract the location of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted openings. Moreover, a space subdivision method based on the extracted openings and the scanning system trajectory is described. Finally, the opening detection and space subdivision results are saved as point cloud labels which will be used for further investigations. The method has been tested on a real dataset collected by ZEB-REVO. The experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.

  12. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106

  13. The Seepage Simulation of Single Hole and Composite Gas Drainage Based on LB Method

    NASA Astrophysics Data System (ADS)

    Chen, Yanhao; Zhong, Qiu; Gong, Zhenzhao

    2018-01-01

    Gas drainage is the most effective method to prevent and control coal mine gas disasters, so it is very important to study the seepage law of gas in fractured coal. The LB method is a simplified, micro-scale computational model that is particularly suitable for studying seepage problems. Based on a mathematical model of fracture seepage for single-borehole coal gas drainage, this paper uses the LB method to numerically simulate the gas flow during drainage. It maps the gas pressure clouds, flow path diagrams and flow velocity vector diagrams of the drainage area for single-hole drainage, symmetric slotting, asymmetric slotting and combined drainage with different slot widths, and analyses the influence of these working conditions on the gas seepage field. The effectiveness of drainage with a centre hole and slots on both sides is also discussed, and a preliminary exploration of combined gas drainage is carried out as well.

  14. Free vibration analysis of a robotic fish based on a continuous and non-uniform flexible backbone with distributed masses

    NASA Astrophysics Data System (ADS)

    Coral, W.; Rossi, C.; Curet, O. M.

    2015-12-01

    This paper presents a Differential Quadrature Element Method for free transverse vibration of a robotic fish based on a continuous and non-uniform flexible backbone with distributed masses (fish ribs). The proposed method is based on the theory of a Timoshenko cantilever beam. The effects of the masses (number, magnitude and position) on the value of the natural frequencies are investigated. Governing equations, compatibility and boundary conditions are formulated according to the Differential Quadrature rules. The convergence, efficiency and accuracy are compared to other analytical solutions proposed in the literature. Moreover, the proposed method has been validated against a physical prototype of a flexible fish backbone. The main advantages of this method, compared to the exact solutions available in the literature, are twofold: first, a smaller computational cost, and second, it allows analysing free vibration in beams whose cross-section is an arbitrary function, which is normally difficult or even impossible with other analytical methods.

  15. Rapid Detection Method for the Four Most Common CHEK2 Mutations Based on Melting Profile Analysis.

    PubMed

    Borun, Pawel; Salanowski, Kacper; Godlewski, Dariusz; Walkowiak, Jaroslaw; Plawski, Andrzej

    2015-12-01

    CHEK2 is a tumor suppressor gene, and the mutations affecting the functionality of the protein product increase cancer risk in various organs. The elevated risk, in a significant percentage of cases, is determined by the occurrence of one of the four most common mutations in the CHEK2 gene, including c.470T>C (p.I157T), c.444+1G>A (IVS2+1G>A), c.1100delC, and c.1037+1538_1224+328del5395 (del5395). We have developed and validated a rapid and effective method for their detection based on high-resolution melting analysis and comparative-high-resolution melting, a novel approach enabling simultaneous detection of copy number variations. The analysis is performed in two polymerase chain reactions followed by melting analysis, without any additional reagents or handling other than that used in standard high-resolution melting. Validation of the method was conducted in a group of 103 patients with diagnosed breast cancer, a group of 240 unrelated patients with familial history of cancer associated with the CHEK2 gene mutations, and a 100-person control group. The results of the analyses for all three groups were fully consistent with the results from other methods. The method we have developed improves the identification of the CHEK2 mutation carriers, reduces the cost of such analyses, as well as facilitates their implementation. Along with the increased efficiency, the method maintains accuracy and reliability comparable to other more labor-consuming techniques.

  16. The allele combinations of three loci based on liver, stomach cancers, hematencephalon, COPD and normal population: A preliminary study.

    PubMed

    Gai, Liping; Liu, Hui; Cui, Jing-Hui; Yu, Weijian; Ding, Xiao-Dong

    2017-03-20

    The purpose of this study was to examine the specific allele combinations of three loci associated with liver cancer, stomach cancer, hematencephalon and chronic obstructive pulmonary disease (COPD), and to explore the feasibility of the research methods. We explored different mathematical methods for statistical analyses to assess the association between genotype and phenotype, and we also analysed the statistical results for allele combinations of three loci by the difference-value method and the ratio method. DNA blood samples were collected from 50 patients with liver cancer, 75 with stomach cancer, 50 with hematencephalon, 72 with COPD and 200 normal controls. All samples were from Chinese subjects. Alleles at short tandem repeat (STR) loci were determined using the STR Profiler Plus PCR amplification kit (15 STR loci). Previous research was based on combinations of single-locus alleles and on combinations of cross-locus (two-locus) alleles. Allele combinations of three loci were obtained by computer counting, and a stronger genetic signal was obtained. The three-locus allele combination method helps identify statistically significant differences in allele combinations between patients with liver cancer, stomach cancer, hematencephalon or COPD and the normal population. The probability of illness followed different rules and showed apparent specificity. This method can be extended to other diseases and provide a reference for early clinical diagnosis. Copyright © 2016. Published by Elsevier B.V.
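
    The "computer counting" of three-locus allele combinations might look like the sketch below; the data layout and function name are assumptions, and the subsequent difference-value or ratio comparison between groups is only indicated in a comment.

```python
from itertools import combinations, product
from collections import Counter

def three_locus_combinations(individuals):
    """Count allele combinations across every set of three STR loci.

    `individuals` is a list of dicts mapping locus name -> tuple of the two
    alleles carried at that locus (hypothetical layout)."""
    counts = Counter()
    for person in individuals:
        for l1, l2, l3 in combinations(sorted(person), 3):
            for a, b, c in product(person[l1], person[l2], person[l3]):
                counts[(l1, a, l2, b, l3, c)] += 1
    return counts

# Frequencies from a patient group can then be compared with the control group
# by the difference-value or ratio method mentioned in the abstract.
```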

  17. Development and comparison of a real-time PCR assay for detection of Dichelobacter nodosus with culturing and conventional PCR: harmonisation between three laboratories

    PubMed Central

    2012-01-01

    Background Ovine footrot is a contagious disease with worldwide occurrence in sheep. The main causative agent is the fastidious bacterium Dichelobacter nodosus. In Scandinavia, footrot was first diagnosed in Sweden in 2004 and later also in Norway and Denmark. Clinical examination of sheep feet is fundamental to diagnosis of footrot, but D. nodosus should also be detected to confirm the diagnosis. PCR-based detection using conventional PCR has been used at our institutes, but the method was laborious and there was a need for a faster, easier-to-interpret method. The aim of this study was to develop a TaqMan-based real-time PCR assay for detection of D. nodosus and to compare its performance with culturing and conventional PCR. Methods A D. nodosus-specific TaqMan based real-time PCR assay targeting the 16S rRNA gene was designed. The inclusivity and exclusivity (specificity) of the assay was tested using 55 bacterial and two fungal strains. To evaluate the sensitivity and harmonisation of results between different laboratories, aliquots of a single DNA preparation were analysed at three Scandinavian laboratories. The developed real-time PCR assay was compared to culturing by analysing 126 samples, and to a conventional PCR method by analysing 224 samples. A selection of PCR-products was cloned and sequenced in order to verify that they had been identified correctly. Results The developed assay had a detection limit of 3.9 fg of D. nodosus genomic DNA. This result was obtained at all three laboratories and corresponds to approximately three copies of the D. nodosus genome per reaction. The assay showed 100% inclusivity and 100% exclusivity for the strains tested. The real-time PCR assay found 54.8% more positive samples than by culturing and 8% more than conventional PCR. Conclusions The developed real-time PCR assay has good specificity and sensitivity for detection of D. nodosus, and the results are easy to interpret. The method is less time-consuming than either culturing or conventional PCR. PMID:22293440

  18. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to estimate the posteriors whether a variable is directly relevant or its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43(1.2–1.8); p = 3×10−4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong relevance based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035

  19. Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software

    NASA Astrophysics Data System (ADS)

    Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.

    2017-12-01

    Nowadays ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detailed design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold: one is a rule-based methodology and the other a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the conventional package-based analysis with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress and fatigue, following different design bases and approaches, to provide some guidance for further improvements in ship structural design.

  20. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    NASA Technical Reports Server (NTRS)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast of contrail formation over the contiguous United States (CONUS) is created using hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and the Rapid Update Cycle (RUC), together with GOES water vapor channel measurements and surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
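
    A small sketch of the logistic-regression forecasting step follows. The predictors, the synthetic data and the use of scikit-learn are assumptions made for illustration; the study's actual predictor set and fitting procedure differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# X: one row per observation with NWP-derived predictors (e.g. humidity with
# respect to ice, temperature, vertical velocity); y: 1 if a persistent contrail
# was observed, else 0.  Both arrays are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]          # probabilistic contrail forecast
accuracy = ((prob > 0.5) == y_te).mean()        # the kind of accuracy reported above
```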

  1. Comparison of EML 105 and advantage analysers measuring capillary versus venous whole blood glucose in neonates.

    PubMed

    McNamara, P J; Sharief, N

    2001-09-01

    Near-patient blood glucose monitoring is an essential component of neonatal intensive care but the analysers currently used are unreliable and inaccurate. The aim of this study was to compare a new glucose electrode-based analyser (EML 105) and a non-wipe reflectance photometry method (Advantage) as opposed to a recognized laboratory reference method (Hexokinase). We also investigated the effect of sample route and haematocrit on the accuracy of the glucose readings obtained by each method of analysis. Whole blood glucose concentrations ranging from 0 to 3.5 mmol/l were carefully prepared in a laboratory setting and blood samples from each respective solution were then measured by EML 105 and Advantage analysers. The results obtained were then compared with the corresponding plasma glucose reading obtained by the Hexokinase method, using linear regression analysis. An in vivo study was subsequently performed on 103 neonates, over a 1-y period, using capillary and venous whole blood samples. Whole blood glucose concentration was estimated from each sample using both analysers and compared with the corresponding plasma glucose concentration estimated by the Hexokinase method. Venous blood was centrifuged and haematocrit was estimated using standardized curves. The effect of haematocrit on the agreement between whole blood and plasma glucose was investigated, estimating the degree of correlation on a scatterplot of the results and linear regression analysis. Both the EML 105 and Hexokinase methods were highly accurate, in vitro, with small proportional biases of 2% and 5%, respectively. However, in vivo, both study analysers overestimated neonatal plasma glucose, ranging from at best 0.45 mmol/l (EML 105 venous) to 0.69 mmol/l (EML capillary). There was no significant difference in the agreement of capillary (GD = 0.12, 95% CI [-0.32, 0.08], p = 0.2) or venous samples (GD = 0.05, 95% CI [0.09, 0.19], p = 0.49) with plasma glucose when analysed by either study method (GD = glucose difference between study analyser and reference method). However, the venous samples analysed by EML 105 estimated plasma glucose significantly better than capillary samples using the same method of analysis (GD = 0.24, 95% CI [0.09, 0.38], p < 0.01). The relationship between haematocrit and the resultant glucose differences was non-linear with correlation coefficients of r = -0.057 (EML 105 capillary), r = 0.145 (EML 105 venous), r = -0.127 (Advantage capillary) and r = -0.275 (Advantage venous). There was no significant difference in the effect of haematocrit on the performance of EML 105 versus Advantage, regardless of the sample route. Both EML 105 and Advantage overestimated plasma glucose, with no significant difference in the performance of either analyser, regardless of the route of analysis. Agreement with plasma glucose was better for venous samples but this was only statistically significant when EML 105 capillary and venous results were compared. Haematocrit is not a significant confounding factor towards the performance of either EML 105 or Advantage in neonates, regardless of the route of sampling. The margin of overestimation of blood glucose prohibits the recommendation of both EML 105 and Advantage for routine neonatal glucose screening. The consequences include failure accurately to diagnose hypoglycaemia and delays in the instigation of therapeutic measures, both of which may potentially result in an adverse, long-term, neurodevelopmental outcome.
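
    The glucose differences (GD) with 95% confidence intervals reported above could be computed along the lines of the following sketch; it is a generic paired mean-difference calculation, not the authors' statistical code, and the function name is ours.

```python
import numpy as np
from scipy import stats

def glucose_difference(analyser, reference, alpha=0.05):
    """Mean difference (bias) between an analyser and the reference method,
    with a t-based 95% confidence interval for paired measurements."""
    d = np.asarray(analyser, dtype=float) - np.asarray(reference, dtype=float)
    mean, sem = d.mean(), stats.sem(d)
    half = stats.t.ppf(1 - alpha / 2, len(d) - 1) * sem
    return mean, (mean - half, mean + half)
```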

  2. Multiple imputation of missing data in nested case-control and case-cohort studies.

    PubMed

    Keogh, Ruth H; Seaman, Shaun R; Bartlett, Jonathan W; Wood, Angela M

    2018-06-05

    The nested case-control and case-cohort designs are two main approaches for carrying out a substudy within a prospective cohort. This article adapts multiple imputation (MI) methods for handling missing covariates in full-cohort studies for nested case-control and case-cohort studies. We consider data missing by design and data missing by chance. MI analyses that make use of full-cohort data and MI analyses based on substudy data only are described, alongside an intermediate approach in which the imputation uses full-cohort data but the analysis uses only the substudy. We describe adaptations to two imputation methods: the approximate method (MI-approx) of White and Royston () and the "substantive model compatible" (MI-SMC) method of Bartlett et al. (). We also apply the "MI matched set" approach of Seaman and Keogh () to nested case-control studies, which does not require any full-cohort information. The methods are investigated using simulation studies and all perform well when their assumptions hold. Substantial gains in efficiency can be made by imputing data missing by design using the full-cohort approach or by imputing data missing by chance in analyses using the substudy only. The intermediate approach brings greater gains in efficiency relative to the substudy approach and is more robust to imputation model misspecification than the full-cohort approach. The methods are illustrated using the ARIC Study cohort. Supplementary Materials provide R and Stata code. © 2018, The International Biometric Society.

  3. Impact of censoring on learning Bayesian networks in survival modelling.

    PubMed

    Stajduhar, Ivan; Dalbelo-Basić, Bojana; Bogunović, Nikola

    2009-11-01

    Bayesian networks are commonly used for presenting uncertainty and covariate interactions in an easily interpretable way. Because of their efficient inference and ability to represent causal relationships, they are an excellent choice for medical decision support systems in diagnosis, treatment, and prognosis. Although good procedures for learning Bayesian networks from data have been defined, their performance in learning from censored survival data has not been widely studied. In this paper, we explore how to use these procedures to learn about possible interactions between prognostic factors and their influence on the variate of interest. We study how censoring affects the probability of learning correct Bayesian network structures. Additionally, we analyse the potential usefulness of the learnt models for predicting the time-independent probability of an event of interest. We analysed the influence of censoring with a simulation on synthetic data sampled from randomly generated Bayesian networks. We used two well-known methods for learning Bayesian networks from data: a constraint-based method and a score-based method. We compared the performance of each method under different levels of censoring to those of the naive Bayes classifier and the proportional hazards model. We did additional experiments on several datasets from real-world medical domains. The machine-learning methods treated censored cases in the data as event-free. We report and compare results for several commonly used model evaluation metrics. On average, the proportional hazards method outperformed other methods in most censoring setups. As part of the simulation study, we also analysed structural similarities of the learnt networks. Heavy censoring, as opposed to no censoring, produces up to a 5% surplus and up to 10% missing total arcs. It also produces up to 50% missing arcs that should originally be connected to the variate of interest. Presented methods for learning Bayesian networks from data can be used to learn from censored survival data in the presence of light censoring (up to 20%) by treating censored cases as event-free. Given intermediate or heavy censoring, the learnt models become tuned to the majority class and would thus require a different approach.

  4. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338
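
    One common convention for an analytical threshold derived from background noise is the mean plus a multiple of the standard deviation of the noise read counts; the sketch below shows that convention only, whereas the paper proposes its own definition of PCR-MPS background noise. The function name and input layout are assumptions.

```python
import numpy as np

def analytical_threshold(noise_counts, k=3.0):
    """Threshold from background-noise read counts observed at known
    negative positions: mean + k standard deviations."""
    noise = np.asarray(noise_counts, dtype=float)
    return noise.mean() + k * noise.std(ddof=1)

# reads above analytical_threshold(noise_counts) would be treated as signal
```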

  5. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.

  6. Measurement System Analyses - Gauge Repeatability and Reproducibility Methods

    NASA Astrophysics Data System (ADS)

    Cepova, Lenka; Kovacikova, Andrea; Cep, Robert; Klaput, Pavel; Mizera, Ondrej

    2018-02-01

    The submitted article focuses on a detailed explanation of the average and range method (Automotive Industry Action Group, Measurement System Analysis approach) and of the honest Gauge Repeatability and Reproducibility method (Evaluating the Measurement Process approach). The measured data (thickness of plastic parts) were evaluated by both methods and their results were compared on the basis of numerical evaluation. Both methods were additionally compared and their advantages and disadvantages were discussed. One difference between the two methods is the calculation of the variation components. The AIAG method calculates the variation components based on standard deviations (so the sum of the variation components does not give 100 %), whereas the honest GRR study calculates the variation components based on variances, where the sum of all variation components (part-to-part variation, EV and AV) gives the total variation of 100 %. Acceptance of both methods within the professional community, their future use, and their acceptance by the manufacturing industry were also discussed. Nowadays, the AIAG method is the leading method in the industry.
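
    A compact sketch of the variance-based decomposition in which the components sum to 100 % is given below. For brevity the operator-by-part interaction is pooled into repeatability, which a full honest GRR study estimates separately, so this is an illustrative simplification rather than either of the article's two procedures.

```python
import numpy as np

def grr_components(m):
    """Variance components from a crossed gauge study.

    `m[i, j, k]` is repetition k of part i measured by operator j.
    Interaction is pooled into repeatability to keep the sketch short."""
    p, o, r = m.shape
    grand = m.mean()
    cell = m.mean(axis=2)
    ms_within = ((m - cell[:, :, None]) ** 2).sum() / (p * o * (r - 1))
    ms_oper = p * r * ((m.mean(axis=(0, 2)) - grand) ** 2).sum() / (o - 1)
    ms_part = o * r * ((m.mean(axis=(1, 2)) - grand) ** 2).sum() / (p - 1)
    ev = ms_within                                   # repeatability (equipment variation)
    av = max((ms_oper - ms_within) / (p * r), 0.0)   # reproducibility (appraiser variation)
    pv = max((ms_part - ms_within) / (o * r), 0.0)   # part-to-part variation
    total = ev + av + pv
    return {name: 100 * v / total
            for name, v in [("EV", ev), ("AV", av), ("PV", pv)]}  # percentages summing to 100
```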

  7. Automatic load forecasting. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, D.J.; Vemuri, S.

    A method which lends itself to on-line forecasting of hourly electric loads is presented and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite order autoregressive model which in turn is used to obtain a parsimonious autoregressive-moving average model. A procedure is also defined for incorporating temperature as a variable to improve forecasts where loads are temperature dependent. The method presented has several advantages in comparison to the Box-Jenkins method including much less human intervention and improved model identification. The method has been tested using three-hourly data from the Lincoln Electric System, Lincoln, Nebraska. In the exhaustive analyses performed on this data base this method produced significantly better results than the Box-Jenkins method. The method also proved to be more robust in that greater confidence could be placed in the accuracy of models based upon the various measures available at the identification stage.
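
    The autoregressive part of such a forecaster can be illustrated with a short sketch; it uses batch ordinary least squares as a stand-in for the report's sequential least-squares estimator, omits the moving-average refinement and the temperature variable, and the variable names are ours.

```python
import numpy as np

def fit_ar(load, order):
    """Fit an AR(order) model to a 1-D hourly load array by least squares."""
    X = np.column_stack([load[order - k - 1: len(load) - k - 1] for k in range(order)])
    # row t of X holds load[t-1], ..., load[t-order]; the target is load[t]
    coeffs, *_ = np.linalg.lstsq(X, load[order:], rcond=None)
    return coeffs

def forecast(history, coeffs, steps):
    """Iterate the fitted AR model forward to produce `steps` hourly forecasts."""
    h = list(history)
    for _ in range(steps):
        h.append(float(np.dot(coeffs, h[-1: -len(coeffs) - 1: -1])))
    return h[-steps:]
```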

  8. First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation

    DOE PAGES

    Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...

    2016-12-02

    Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific features of fragmentation functions using the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.

  9. Sublinear growth of information in DNA sequences.

    PubMed

    Menconi, Giulia

    2005-07-01

    We introduce a novel method to analyse complete genomes and recognise some distinctive features by means of an adaptive compression algorithm, which is not DNA-oriented, based on the Lempel-Ziv scheme. We study the Information Content as a function of the number of symbols encoded by the algorithm and we analyse the dictionary created by the algorithm. Preliminary results are shown concerning regions showing a sublinear type of information growth, which is strictly connected to the presence of highly repetitive subregions that might be supposed to have a regulatory function within the genome.
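
    The growth curve of a Lempel-Ziv dictionary can be computed with a few lines of Python; the sketch below uses a plain LZ78 phrase parser rather than the paper's specific adaptive compressor, so it only illustrates the idea that repetitive regions yield sublinear growth.

```python
import random

def lz78_phrase_counts(sequence):
    """Return, for each prefix length, the number of LZ78 dictionary phrases
    parsed so far.  Sublinear growth of this curve signals highly repetitive
    (low information) regions of the sequence."""
    dictionary, phrase, counts = set(), "", []
    for symbol in sequence:
        phrase += symbol
        if phrase not in dictionary:
            dictionary.add(phrase)
            phrase = ""
        counts.append(len(dictionary))
    return counts

random.seed(0)
repetitive = "ACGT" * 250                                         # 1000 bases, highly repetitive
shuffled = "".join(random.choice("ACGT") for _ in range(1000))    # 1000 bases, unstructured
print(lz78_phrase_counts(repetitive)[-1], lz78_phrase_counts(shuffled)[-1])
# the repetitive sequence ends with far fewer phrases, i.e. sublinear information growth
```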

  10. Subgroup analyses in confirmatory clinical trials: time to be specific about their purposes.

    PubMed

    Tanniou, Julien; van der Tweel, Ingeborg; Teerenstra, Steven; Roes, Kit C B

    2016-02-18

    It is well recognized that treatment effects may not be homogeneous across the study population. Subgroup analyses constitute a fundamental step in the assessment of evidence from confirmatory (Phase III) clinical trials, where conclusions for the overall study population might not hold. Subgroup analyses can have different and distinct purposes, requiring specific design and analysis solutions. It is relevant to evaluate methodological developments in subgroup analyses against these purposes to guide health care professionals and regulators as well as to identify gaps in current methodology. We defined four purposes for subgroup analyses: (1) Investigate the consistency of treatment effects across subgroups of clinical importance, (2) Explore the treatment effect across different subgroups within an overall non-significant trial, (3) Evaluate safety profiles limited to one or a few subgroup(s), (4) Establish efficacy in the targeted subgroup when included in a confirmatory testing strategy of a single trial. We reviewed the methodology in line with this "purpose-based" framework. The review covered papers published between January 2005 and April 2015 and aimed to classify them in none, one or more of the aforementioned purposes. In total 1857 potentially eligible papers were identified. Forty-eight papers were selected and 20 additional relevant papers were identified from their references, leading to 68 papers in total. Nineteen were dedicated to purpose 1, 16 to purpose 4, one to purpose 2 and none to purpose 3. Seven papers were dedicated to more than one purpose, the 25 remaining could not be classified unambiguously. Purposes of the methods were often not specifically indicated, methods for subgroup analysis for safety purposes were almost absent and a multitude of diverse methods were developed for purpose (1). It is important that researchers developing methodology for subgroup analysis explicitly clarify the objectives of their methods in terms that can be understood from a patient's, health care provider's and/or regulator's perspective. A clear operational definition for consistency of treatment effects across subgroups is lacking, but is needed to improve the usability of subgroup analyses in this setting. Finally, methods to particularly explore benefit-risk systematically across subgroups need more research.

  11. Implications of the Socio-Cultural Context for the Co-Construction of Talk in a Task-Based English as a Foreign Language Classroom in Japan

    ERIC Educational Resources Information Center

    Stone, Paul

    2012-01-01

    In this article I investigate how Japanese students manage interaction together in a task-based English as a foreign language (EFL) classroom. Using methods from conversation analysis and focusing on the contextual dimensions of language, I analyse data from a real classroom task with a view to understanding the ways in which social processes and…

  12. Geographic Object-Based Image Analysis - Towards a new paradigm.

    PubMed

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  13. Geographic Object-Based Image Analysis – Towards a new paradigm

    PubMed Central

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  14. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Slope histogram distribution-based parametrisation of Martian geomorphic features

    NASA Astrophysics Data System (ADS)

    Balint, Zita; Székely, Balázs; Kovács, Gábor

    2014-05-01

    The application of geomorphometric methods to the large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One of the numerous possible methods is the analysis of local slope distributions. To implement this, a visualization program was developed that calculates local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital elevation models (DTMs) derived from HRSC high-resolution stereo camera images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed using colouring to obtain an image map, in which the distributions can be visualized with different colours representing the various classes. Our goal is to create a local-slope-histogram-based classification for large Martian areas in order to obtain information about the general morphological characteristics of a region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research was partly realized within the framework of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high-priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program scholarship support, using Hungarian state and European Union funds and co-financing from the European Social Fund.
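
    The two ingredients of the classification, a local slope histogram per DTM tile and the Kolmogorov distance between tiles, can be sketched as follows; the tile handling, bin choices and function names are assumptions rather than the authors' implementation.

```python
import numpy as np

def slope_cdf(dtm, cellsize, bins=90):
    """Empirical CDF of local slope (degrees) computed from one DTM tile."""
    gy, gx = np.gradient(dtm.astype(float), cellsize)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    hist, _ = np.histogram(slope, bins=bins, range=(0.0, 90.0))
    return np.cumsum(hist) / hist.sum()

def kolmogorov_distance(cdf_a, cdf_b):
    """Maximum absolute difference between two slope CDFs, used to decide
    whether two tiles belong to the same morphometric class."""
    return float(np.max(np.abs(cdf_a - cdf_b)))
```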

  16. Multicomplex-based pharmacophore-guided 3D-QSAR studies of N-substituted 2'-(aminoaryl)benzothiazoles as Aurora-A inhibitors.

    PubMed

    He, Gu; Qiu, Minghua; Li, Rui; Ouyang, Liang; Wu, Fengbo; Song, Xiangrong; Cheng, Li; Xiang, Mingli; Yu, Luoting

    2012-06-01

    Aurora-A has been known as one of the most important targets for cancer therapy, and some Aurora-A inhibitors have entered clinical trials. In this study, a combination of ligand-based and structure-based methods is used to clarify the essential quantitative structure-activity relationship of known Aurora-A inhibitors, and a multicomplex-based pharmacophore-guided method has been suggested to generate a comprehensive pharmacophore of Aurora-A kinase based on a collection of crystal structures of Aurora-A-inhibitor complexes. This model has been successfully used to identify the bioactive conformation and align 37 structurally diverse N-substituted 2'-(aminoaryl)benzothiazole derivatives. The quantitative structure-activity relationship analyses have been performed on these Aurora-A inhibitors based on the multicomplex-based pharmacophore-guided alignment. These results may provide important information for further design and virtual screening of novel Aurora-A inhibitors. © 2012 John Wiley & Sons A/S.

  17. Meta-analysis of pathway enrichment: combining independent and dependent omics data sets.

    PubMed

    Kaever, Alexander; Landesfeind, Manuel; Feussner, Kirstin; Morgenstern, Burkhard; Feussner, Ivo; Meinicke, Peter

    2014-01-01

    A major challenge in current systems biology is the combination and integrative analysis of large data sets obtained from different high-throughput omics platforms, such as mass spectrometry based Metabolomics and Proteomics or DNA microarray or RNA-seq-based Transcriptomics. Especially in the case of non-targeted Metabolomics experiments, where it is often impossible to unambiguously map ion features from mass spectrometry analysis to metabolites, the integration of more reliable omics technologies is highly desirable. A popular method for the knowledge-based interpretation of single data sets is the (Gene) Set Enrichment Analysis. In order to combine the results from different analyses, we introduce a methodical framework for the meta-analysis of p-values obtained from Pathway Enrichment Analysis (Set Enrichment Analysis based on pathways) of multiple dependent or independent data sets from different omics platforms. For dependent data sets, e.g. obtained from the same biological samples, the framework utilizes a covariance estimation procedure based on the nonsignificant pathways in single data set enrichment analysis. The framework is evaluated and applied in the joint analysis of Metabolomics mass spectrometry and Transcriptomics DNA microarray data in the context of plant wounding. In extensive studies of simulated data set dependence, the introduced correlation could be fully reconstructed by means of the covariance estimation based on pathway enrichment. By restricting the range of p-values of pathways considered in the estimation, the overestimation of correlation, which is introduced by the significant pathways, could be reduced. When applying the proposed methods to the real data sets, the meta-analysis was shown not only to be a powerful tool to investigate the correlation between different data sets and summarize the results of multiple analyses but also to distinguish experiment-specific key pathways.
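
    A minimal sketch of combining one pathway's p-values across platforms with a correlation-adjusted Stouffer method is shown below. The correlation `rho` is assumed to have been estimated elsewhere (for example from non-significant pathways, as the paper proposes), and the function name is ours rather than part of the described framework.

```python
import numpy as np
from scipy.stats import norm

def combine_pathway_pvalues(pvals, rho):
    """Combine one pathway's p-values from k data sets by Stouffer's method,
    inflating the variance of the summed z-scores by the estimated
    inter-dataset correlation `rho`."""
    z = norm.isf(np.asarray(pvals, dtype=float))   # one-sided z-scores
    k = z.size
    var = k + rho * k * (k - 1)                    # sum of a k x k covariance matrix with unit variances
    return float(norm.sf(z.sum() / np.sqrt(var)))

# independent data sets: rho = 0 recovers the classical Stouffer combination
print(combine_pathway_pvalues([0.03, 0.20], rho=0.0))
```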

  18. Prediction of the thermal environment and thermal response of simple panels exposed to radiant heat

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Ash, Robert L.

    1989-01-01

    A method of predicting the radiant heat flux distribution produced by a bank of tubular quartz heaters was applied to a radiant system consisting of a single unreflected lamp irradiating a flat metallic incident surface. In this manner, the method was experimentally verified for various radiant system parameter settings and used as a source of input for a finite element thermal analysis. Two finite element thermal analyses were applied to a thermal system consisting of a thin metallic panel exposed to radiant surface heating. A two-dimensional steady-state finite element thermal analysis algorithm, based on Galerkin's Method of Weighted Residuals (GFE), was formulated specifically for this problem and was used in comparison to the thermal analyzers of the Engineering Analysis Language (EAL). Both analyses allow conduction, convection, and radiation boundary conditions. Differences in the respective finite element formulation are discussed in terms of their accuracy and resulting comparison discrepancies. The thermal analyses are shown to perform well for the comparisons presented here with some important precautions about the various boundary condition models. A description of the experiment, corresponding analytical modeling, and resulting comparisons are presented.

  19. Design of suitable carrier buffer for free-flow zone electrophoresis by charge-to-mass ratio and band broadening analysis.

    PubMed

    Kong, Fan-Zhi; Yang, Ying; He, Yu-Chen; Zhang, Qiang; Li, Guo-Qing; Fan, Liu-Yin; Xiao, Hua; Li, Shan; Cao, Cheng-Xi

    2016-09-01

    In this work, charge-to-mass ratio (C/M) and band broadening analyses were combined to provide better guidance for the design of free-flow zone electrophoresis carrier buffer (CB). First, the C/M analyses of hemoglobin and C-phycocyanin (C-PC) under different pH were performed by CLC Protein Workbench software. Second, band dispersion due to the initial bandwidth, diffusion, and hydrodynamic broadening were discussed, respectively. Based on the analyses of the C/M and band broadening, a better guidance for preparation of free-flow zone electrophoresis CB was obtained. Series of experiments were performed to validate the proposed method. The experimental data showed high accordance with our prediction allowing the CB to be prepared easily with our proposed method. To further evaluate this method, C-PC was purified from crude extracts of Spirulina platensis with the selected separation condition. Results showed that C-PC was well separated from other phycobiliproteins that have similar physicochemical properties, and analytical grade product with purity up to 4.5 (A620/A280) was obtained. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Non-invasive pre-lens tear film assessment with high-speed videokeratoscopy.

    PubMed

    Llorens-Quintana, Clara; Mousavi, Maryam; Szczesna-Iskander, Dorota; Iskander, D Robert

    2018-02-01

    To evaluate the effect of two types of daily contact lenses (delefilcon A and omafilcon A) on the tear film and establish whether it is dependent on pre-corneal tear film characteristics using a new method to analyse high-speed videokeratoscopy recordings, as well as to determine the sensitivity of the method in differentiating between contact lens materials on eye. High-speed videokeratoscopy recordings were analysed using a custom made automated algorithm based on a fractal dimension approach that provides a set of parameters directly related to tear film stability. Fifty-four subjects participated in the study. Baseline measurements, in suppressed and natural blinking conditions, were taken before subjects were fitted with two different daily contact lenses and after four hours of contact lens wear. The method for analysing the stability of the tear film provides alternative parameters to the non-invasive break up time to assess the quality of the pre-corneal and pre-lens tear film. Both contact lenses significantly decreased the quality of the tear film in suppressed and natural blinking conditions (p<0.001). The utilised method was able to distinguish between contact lens materials on eye in suppressed blinking conditions. The pre-corneal tear film characteristics were not correlated with the decrease in pre-lens tear film quality. High-speed videokeratoscopy equipped with an automated method to analyse the dynamics of the tear film is able to distinguish between contact lens materials in vivo. Incorporating the assessment of the pre-lens tear film into clinical practice could aid in improving contact lens fitting and in understanding contact lens comfort. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  1. A Modified Kirchhoff plate theory for Free Vibration analysis of functionally graded material plates using meshfree method

    NASA Astrophysics Data System (ADS)

    Nguyen Van Do, Vuong

    2018-04-01

    In this paper, a modified Kirchhoff theory is presented for free vibration analyses of functionally graded material (FGM) plates based on a modified radial point interpolation method (RPIM). Shear deformation effects are taken into account in the modified theory to avoid the shear-locking phenomenon of thin plates. With the proposed refined plate theory, the number of independent unknowns is reduced by one variable, leaving four degrees of freedom per node. The free vibration results computed with the modified RPIM are compared with other analytical solutions to verify the effectiveness and accuracy of the developed mesh-free method. Detailed parametric studies of the proposed method are then conducted, including the effects of thickness ratio, boundary conditions and material inhomogeneity, on sample problems of square plates. The results show that the modified mesh-free RPIM closely reproduces the exact solutions, and the obtained numerical results indicate that the proposed method is stable and gives accurate predictions when evaluated against other published analyses.
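
    As a record-independent illustration of the radial point interpolation idea behind the meshfree method above, the sketch below constructs 1D RPIM shape functions from a multiquadric basis augmented with a linear polynomial; the node layout and the shape parameters c and q are assumptions.

      # 1D radial point interpolation (RPIM) shape functions with a multiquadric basis
      # augmented by a linear polynomial (illustrative parameters only).
      import numpy as np

      def rpim_shape_functions(x_eval, nodes, c=0.1, q=0.5):
          n = len(nodes)
          # Radial basis matrix R and linear polynomial matrix P over the support nodes.
          R = ((nodes[:, None] - nodes[None, :]) ** 2 + c ** 2) ** q
          P = np.column_stack([np.ones(n), nodes])
          G = np.zeros((n + 2, n + 2))
          G[:n, :n], G[:n, n:], G[n:, :n] = R, P, P.T
          r = ((x_eval - nodes) ** 2 + c ** 2) ** q
          p = np.array([1.0, x_eval])
          coeffs = np.linalg.solve(G, np.concatenate([r, p]))
          return coeffs[:n]              # shape function values at x_eval

      nodes = np.linspace(0.0, 1.0, 5)   # assumed support nodes
      phi = rpim_shape_functions(0.37, nodes)
      print(phi, phi.sum())              # partition of unity: values sum to ~1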

  2. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: a new method of clinker characterization; combination of the electron probe technique with cluster analysis; simultaneous assessment of phase abundance, composition and bulk chemistry; experimental validation performed on industrial clinkers.
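
    A sketch of the statistical idea only (not the authors' procedure): cluster a grid of microprobe spot analyses into phases, then read phase compositions from cluster means, phase abundances from cluster sizes, and bulk chemistry from the overall mean; the oxide set and the synthetic data are assumptions.

      # Cluster electron-microprobe spot analyses into phases and derive
      # phase composition, phase abundance and bulk chemistry (synthetic data).
      import numpy as np
      from sklearn.cluster import KMeans

      oxides = ["CaO", "SiO2", "Al2O3", "Fe2O3"]   # assumed oxide set
      rng = np.random.default_rng(1)
      # Two synthetic "phases" sampled around different mean compositions (wt%).
      phase_a = rng.normal([65, 25, 5, 5], 1.0, size=(120, 4))
      phase_b = rng.normal([60, 20, 8, 12], 1.0, size=(80, 4))
      spots = np.vstack([phase_a, phase_b])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spots)
      for k in range(2):
          members = spots[km.labels_ == k]
          abundance = 100.0 * len(members) / len(spots)
          comp = ", ".join(f"{o} {v:.1f}" for o, v in zip(oxides, members.mean(axis=0)))
          print(f"phase {k}: {abundance:.0f}% of spots; mean composition: {comp}")

      bulk = spots.mean(axis=0)
      print("bulk chemistry (wt%):", dict(zip(oxides, bulk.round(1))))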

  3. Statistical methods for convergence detection of multi-objective evolutionary algorithms.

    PubMed

    Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J

    2009-01-01

    In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs and on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, requiring fewer function evaluations while preserving good approximation quality.
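
    A minimal sketch of the online detection idea described above: track a performance indicator per generation and stop when its variance over a sliding window drops below a threshold or its trend stagnates; the indicator trace is simulated, and the window and tolerances are assumptions rather than the authors' settings.

      # Online convergence detection from a per-generation performance indicator.
      import numpy as np

      def converged(history, window=10, var_tol=1e-6, slope_tol=1e-5):
          """Stop when the indicator's recent variance or trend falls below tolerance."""
          if len(history) < window:
              return False
          recent = np.asarray(history[-window:])
          slope, _ = np.polyfit(np.arange(window), recent, 1)
          return recent.var() < var_tol or abs(slope) < slope_tol

      # Simulated hypervolume-like indicator approaching a plateau (not a real MOEA run).
      rng = np.random.default_rng(0)
      indicator = []
      for gen in range(200):
          value = 1.0 - np.exp(-gen / 30.0) + rng.normal(0.0, 1e-4)
          indicator.append(value)
          if converged(indicator):
              print(f"stopping at generation {gen}")
              break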

  4. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
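
    As an illustration of the simulation-interfacing pattern such toolkits rely on (a generic sketch, not DAKOTA's actual file formats or API), the driver below reads parameters from a text file, evaluates a toy model in place of an expensive simulation, and writes the response for the iterative method to collect; the key=value format is an assumption made here for illustration.

      # Generic black-box analysis driver: read parameters, run a model, write responses.
      # The key=value file format below is a simplified assumption, not DAKOTA's syntax.
      import sys

      def read_params(path):
          params = {}
          with open(path) as fh:
              for line in fh:
                  key, sep, value = line.partition("=")
                  if sep:
                      params[key.strip()] = float(value)
          return params

      def model(x1, x2):
          # Toy objective standing in for an expensive simulation code.
          return (x1 - 1.0) ** 2 + 100.0 * (x2 - x1 ** 2) ** 2

      if __name__ == "__main__":
          p = read_params(sys.argv[1])          # parameters file written by the toolkit
          with open(sys.argv[2], "w") as fh:    # responses file read back by the toolkit
              fh.write(f"{model(p['x1'], p['x2'])}\n")

    A toolkit of this kind would launch such a driver once per design point and read back the response file it writes.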

  5. A comparison of three methods of setting prescribing budgets, using data derived from defined daily dose analyses of historic patterns of use.

    PubMed Central

    Maxwell, M; Howie, J G; Pryde, C J

    1998-01-01

    BACKGROUND: Prescribing matters (particularly budget setting and research into prescribing variation between doctors) have been handicapped by the absence of credible measures of the volume of drugs prescribed. AIM: To use the defined daily dose (DDD) method to study variation in the volume and cost of drugs prescribed across the seven main British National Formulary (BNF) chapters with a view to comparing different methods of setting prescribing budgets. METHOD: Study of one year of prescribing statistics from all 129 general practices in Lothian, covering 808,059 patients: analyses of prescribing statistics for 1995 to define volume and cost/volume of prescribing for one year for 10 groups of practices defined by the age and deprivation status of their patients, for seven BNF chapters; creation of prescribing budgets for 1996 for each individual practice based on the use of target volume and cost statistics; comparison of 1996 DDD-based budgets with those set using the conventional historical approach; and comparison of DDD-based budgets with budgets set using a capitation-based formula derived from local cost/patient information. RESULTS: The volume of drugs prescribed was affected by the age structure of the practices in BNF Chapters 1 (gastrointestinal), 2 (cardiovascular), and 6 (endocrine), and by deprivation structure for BNF Chapters 3 (respiratory) and 4 (central nervous system). Costs per DDD in the major BNF chapters were largely independent of age, deprivation structure, or fundholding status. Capitation and DDD-based budgets were similar to each other, but both differed substantially from historic budgets. One practice in seven gained or lost more than 100,000 Pounds per annum using DDD or capitation budgets compared with historic budgets. The DDD-based budget, but not the capitation-based budget, can be used to set volume-specific prescribing targets. CONCLUSIONS: DDD-based and capitation-based prescribing budgets can be set using a simple explanatory model and generalizable methods. In this study, both differed substantially from historic budgets. DDD budgets could be created to accommodate new prescribing strategies and raised or lowered to reflect local intentions to alter overall prescribing volume or cost targets. We recommend that future work on setting budgets and researching prescribing variations should be based on DDD statistics. PMID:10024703
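
    As a record-independent sketch of the arithmetic involved, the snippet below converts dispensed quantities into defined daily doses (DDDs), derives a cost per DDD, and scales target volume and cost figures by list size to form a DDD-based budget; all drug names, DDD values, prices and targets are invented for illustration.

      # Toy DDD-based prescribing budget: volume in defined daily doses, cost per DDD,
      # and a budget scaled to list size.  All numbers are invented for illustration.
      ddd_mg = {"drug_a": 75.0, "drug_b": 20.0}   # assumed WHO-style DDDs (mg)

      # One year of dispensing for a practice: (drug, total mg dispensed, total cost in GBP).
      dispensed = [("drug_a", 5.0e6, 12000.0), ("drug_b", 2.4e6, 30000.0)]

      total_ddds = sum(mg / ddd_mg[d] for d, mg, _ in dispensed)
      total_cost = sum(cost for _, _, cost in dispensed)
      print(f"volume: {total_ddds:,.0f} DDDs, cost/DDD: {total_cost / total_ddds:.3f} GBP")

      # Budget: target DDDs per patient per year, times list size, times target cost per DDD.
      list_size, target_ddd_per_patient, target_cost_per_ddd = 6200, 280.0, 0.20
      budget = list_size * target_ddd_per_patient * target_cost_per_ddd
      print(f"DDD-based budget: {budget:,.0f} GBP")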

  6. Global Methods for Image Motion Analysis

    DTIC Science & Technology

    1992-10-01

    a variant of the same error function as in Adiv [2]. Another related approach was presented by Maybank [46,45]. Nearly all researchers in motion... with an application to stereo vision. In Proc. 7th Intern. Joint Conference on AI, pages 674-679, Vancouver, 1981. [45] S. J. Maybank. Algorithm for... analysing optical flow based on the least-squares method. Image and Vision Computing, 4:38-42, 1986. [46] S. J. Maybank. A Theoretical Study of Optical

  7. Incremental Validity and Informant Effect from a Multi-Method Perspective: Assessing Relations between Parental Acceptance and Children’s Behavioral Problems

    PubMed Central

    Izquierdo-Sotorrío, Eva; Holgado-Tello, Francisco P.; Carrasco, Miguel Á.

    2016-01-01

    This study examines the relationships between perceived parental acceptance and children’s behavioral problems (externalizing and internalizing) from a multi-informant perspective. Using mothers, fathers, and children as sources of information, we explore the informant effect and incremental validity. The sample was composed of 681 participants (227 children, 227 fathers, and 227 mothers). Children’s (40% boys) ages ranged from 9 to 17 years (M = 12.52, SD = 1.81). Parents and children completed both the Parental Acceptance Rejection/Control Questionnaire (PARQ/Control) and the check list of the Achenbach System of Empirically Based Assessment (ASEBA). Statistical analyses were based on the correlated uniqueness multitrait-multimethod matrix (model MTMM) by structural equations and different hierarchical regression analyses. Results showed a significant informant effect and a different incremental validity related to which combination of sources was considered. A multi-informant perspective rather than a single one increased the predictive value. Our results suggest that mother–father or child–father combinations seem to be the best way to optimize the multi-informant method in order to predict children’s behavioral problems based on perceived parental acceptance. PMID:27242582
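
    A sketch of the incremental-validity logic only (not the authors' MTMM or hierarchical regression models): regress the outcome on one informant's ratings, add a second informant, and read incremental validity from the change in R-squared; the data below are simulated.

      # Incremental validity as the R-squared change when adding a second informant.
      import numpy as np

      def r_squared(X, y):
          X1 = np.column_stack([np.ones(len(y)), X])      # add intercept
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          return 1.0 - resid.var() / y.var()

      rng = np.random.default_rng(0)
      n = 227
      mother = rng.normal(size=n)                          # mother-reported acceptance
      father = 0.4 * mother + rng.normal(size=n)           # father-reported acceptance
      problems = 0.5 * mother + 0.3 * father + rng.normal(size=n)  # behavioral problems

      r2_mother = r_squared(mother[:, None], problems)
      r2_both = r_squared(np.column_stack([mother, father]), problems)
      print(f"R2 mother only: {r2_mother:.3f}, adding father: {r2_both:.3f}, "
            f"incremental validity: {r2_both - r2_mother:.3f}")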

  8. GPs' thoughts on prescribing medication and evidence-based knowledge: the benefit aspect is a strong motivator. A descriptive focus group study.

    PubMed

    Skoglund, Ingmarie; Segesten, Kerstin; Björkelund, Cecilia

    2007-06-01

    To describe GPs' thoughts on prescribing medication and on evidence-based knowledge (EBM) concerning drug therapy. Tape-recorded focus-group interviews were transcribed verbatim and analysed using qualitative methods. GPs from the south-eastern part of Västra Götaland, Sweden. A total of 16 GPs, out of 178 from the south-eastern part of the region, were strategically chosen to represent urban and rural settings, male and female GPs, and long and short GP experience. Transcripts were analysed using a descriptive qualitative method. The categories were: benefits, time and space, and expert knowledge. Benefit was a merging of positive elements across all aspects of the GPs' tasks. Time and space were limitations on the GPs' tasks. EBM, as a constituent of expert knowledge, should be more customer-adjusted in order to be usable in practice. Benefit was the most important category, present in every decision-making situation for the GP. The core category was prompt and pragmatic benefit, which was the utmost benefit. GPs' thoughts on evidence-based medicine and prescribing medication were strongly related to reflections on benefit and results. The interviews indicated that prompt and pragmatic benefit is important for understanding their thoughts.

  9. Exploration of cellular reaction systems.

    PubMed

    Kirkilionis, Markus

    2010-01-01

    We discuss and review different ways to map cellular components and their temporal interactions with other such components to different non-spatially explicit mathematical models. The essential choices made in the literature are between discrete and continuous state spaces, between rule- and event-based state updates, and between deterministic and stochastic series of such updates. The temporal modelling of cellular regulatory networks (dynamic network theory) is compared with static network approaches in the first two introductory sections on general network modelling. We then concentrate on deterministic rate-based dynamic regulatory networks and their derivation. In the derivation, we include methods from multiscale analysis and also look at structured large particles, here called macromolecular machines. It is clear that mass-action systems and their derivatives, i.e. networks based on enzyme kinetics, play the most dominant role in the literature. The tools to analyse cellular reaction networks are without doubt most complete for mass-action systems. We devote a long section at the end of the review to a comprehensive overview of related tools and mathematical methods. The emphasis is on showing how cellular reaction networks can be analysed with the help of different associated graphs and by dissection into modules, i.e. sub-networks.
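
    To make the mass-action case concrete (a generic sketch, not a network from the review), the snippet below integrates the deterministic rate equations of a minimal enzyme-substrate-product scheme; the rate constants and initial concentrations are assumptions.

      # Deterministic mass-action kinetics for E + S <-> ES -> E + P (illustrative rates).
      import numpy as np
      from scipy.integrate import solve_ivp

      k_on, k_off, k_cat = 1.0e6, 1.0e-1, 1.0e-2   # assumed rate constants

      def rhs(t, y):
          e, s, es, p = y
          v_bind, v_unbind, v_cat = k_on * e * s, k_off * es, k_cat * es
          return [-v_bind + v_unbind + v_cat,      # dE/dt
                  -v_bind + v_unbind,              # dS/dt
                   v_bind - v_unbind - v_cat,      # dES/dt
                   v_cat]                          # dP/dt

      y0 = [1e-6, 1e-5, 0.0, 0.0]                  # initial concentrations (M), assumed
      sol = solve_ivp(rhs, (0.0, 2000.0), y0, method="LSODA", rtol=1e-8, atol=1e-12)
      print(f"product formed after {sol.t[-1]:.0f} s: {sol.y[3, -1]:.2e} M")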

  10. Incremental Validity and Informant Effect from a Multi-Method Perspective: Assessing Relations between Parental Acceptance and Children's Behavioral Problems.

    PubMed

    Izquierdo-Sotorrío, Eva; Holgado-Tello, Francisco P; Carrasco, Miguel Á

    2016-01-01

    This study examines the relationships between perceived parental acceptance and children's behavioral problems (externalizing and internalizing) from a multi-informant perspective. Using mothers, fathers, and children as sources of information, we explore the informant effect and incremental validity. The sample was composed of 681 participants (227 children, 227 fathers, and 227 mothers). Children's (40% boys) ages ranged from 9 to 17 years (M = 12.52, SD = 1.81). Parents and children completed both the Parental Acceptance Rejection/Control Questionnaire (PARQ/Control) and the check list of the Achenbach System of Empirically Based Assessment (ASEBA). Statistical analyses were based on the correlated uniqueness multitrait-multimethod matrix (model MTMM) by structural equations and different hierarchical regression analyses. Results showed a significant informant effect and a different incremental validity related to which combination of sources was considered. A multi-informant perspective rather than a single one increased the predictive value. Our results suggest that mother-father or child-father combinations seem to be the best way to optimize the multi-informant method in order to predict children's behavioral problems based on perceived parental acceptance.

  11. [Detection of oil spills on water by differential polarization FTIR spectrometry].

    PubMed

    Yuan, Yue-ming; Xiong, Wei; Fang, Yong-hua; Lan, Tian-ge; Li, Da-cheng

    2010-08-01

    Detection of oil spills on water by traditional thermal remote sensing is based on the radiance contrast between the large area of clean water and the polluted area of water, and the categories of oil spills cannot be identified by analysing the thermal infrared image. In order to determine the extent of pollution and identify the oil contaminants, an approach to the passive detection of oil spills on water by differential polarization FTIR spectrometry is proposed. This approach detects the contaminants by obtaining and analysing the difference spectrum of the horizontal and vertical polarization intensity spectra. In the present article, the radiance model of differential polarization FTIR spectrometry is analysed, and an experiment on the detection of No. 0 diesel and an SF96 film on water by this method is presented. The results of this experiment indicate that the method can detect oil contaminants on water without radiance contrast with clean water, and it can also identify oil spills by analysing the spectral characteristics of the differential polarization FTIR spectrum. It thus compensates well for the shortcomings of traditional thermal remote sensing in detecting oil spills on water.
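
    A record-independent numerical sketch of the differential idea: subtract the vertically from the horizontally polarized radiance spectrum so that the unpolarized background cancels and the film signature remains; the synthetic spectra and band position below are placeholders for real FTIR measurements.

      # Differential polarization spectrum: horizontal minus vertical polarized radiance.
      import numpy as np

      wavenumber = np.linspace(700.0, 1400.0, 701)   # cm^-1, assumed band

      def gaussian(x, center, width):
          return np.exp(-0.5 * ((x - center) / width) ** 2)

      # Synthetic polarized radiance: a clean-water background plus an oil-film feature
      # that appears with different strength in the two polarizations (assumption).
      background = 1.0 + 0.05 * np.sin(wavenumber / 80.0)
      film = 0.2 * gaussian(wavenumber, 1260.0, 12.0)
      horizontal = background + 0.8 * film
      vertical = background + 0.3 * film

      difference = horizontal - vertical             # background cancels, film remains
      peak = wavenumber[np.argmax(np.abs(difference))]
      print(f"strongest differential feature near {peak:.0f} cm^-1")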

  12. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario such as vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method that achieves a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes, the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extension of the studied area (nearly 200,000 m2) without compromising the accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, has been applied to delineate the peculiarities of each single feature. The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geo-statistical analyses to derive a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features along with the expected failure modes; b) a high-resolution characterization of the rockslide susceptibility of the whole slope, based on the partitioning of the area according to stability and mechanical conditions which can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.

  13. Application of Observed Precipitation in NCEP Global and Regional Data Assimilation Systems, Including Reanalysis and Land Data Assimilation

    NASA Astrophysics Data System (ADS)

    Mitchell, K. E.

    2006-12-01

    The Environmental Modeling Center (EMC) of the National Centers for Environmental Prediction (NCEP) applies several different analyses of observed precipitation in both the data assimilation and validation components of NCEP's global and regional numerical weather and climate prediction/analysis systems (including in NCEP global and regional reanalysis). This invited talk will survey these data assimilation and validation applications and methodologies, as well as the temporal frequency, spatial domains, spatial resolution, data sources, data density and data quality control in the precipitation analyses that are applied. Some of the precipitation analyses applied by EMC are produced by NCEP's Climate Prediction Center (CPC), while others are produced by the River Forecast Centers (RFCs) of the National Weather Service (NWS), or by automated algorithms of the NWS WSR-88D Radar Product Generator (RPG). Depending on the specific type of application in data assimilation or model forecast validation, the temporal resolution of the precipitation analyses may be hourly, daily, or pentad (5-day) and the domain may be global, continental U.S. (CONUS), or Mexico. The data sources for precipitation include ground-based gauge observations, radar-based estimates, and satellite-based estimates. The precipitation analyses over the CONUS are analyses of either hourly, daily or monthly totals of precipitation, and they are of two distinct types: gauge-only or primarily radar-estimated. The gauge-only CONUS analysis of daily precipitation utilizes an orographic-adjustment technique (based on the well-known PRISM precipitation climatology of Oregon State University) developed by the NWS Office of Hydrologic Development (OHD). The primary NCEP global precipitation analysis is the pentad CPC Merged Analysis of Precipitation (CMAP), which blends both gauge observations and satellite estimates. The presentation will include a brief comparison between the CMAP analysis and other global precipitation analyses by other institutions. Other global precipitation analyses produced by other methodologies are also used by EMC in certain applications, such as CPC's well-known satellite-IR based technique known as "GPI", and satellite-microwave based estimates from NESDIS or NASA. Finally, the presentation will cover the three assimilation methods used by EMC to assimilate precipitation data, including 1) 3D-VAR variational assimilation in NCEP's Global Data Assimilation System (GDAS), 2) direct insertion of precipitation-inferred vertical latent heating profiles in NCEP's N. American Data Assimilation System (NDAS) and its N. American Regional Reanalysis (NARR) counterpart, and 3) direct use of observed precipitation to drive the Noah land model component of NCEP's Global and N. American Land Data Assimilation Systems (GLDAS and NLDAS). In the applications of precipitation analyses in data assimilation at NCEP, the analyses are temporally disaggregated to hourly or less using time-weights calculated from A) either radar-based estimates or an analysis of hourly gauge-observations for the CONUS-domain daily precipitation analyses, or B) global model forecasts of 6-hourly precipitation (followed by linear interpolation to hourly or less) for the global CMAP precipitation analysis.
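
    To illustrate the disaggregation step mentioned at the end of the record (a schematic sketch, not NCEP's operational code), the snippet below splits a daily gauge-analysis precipitation total into hourly amounts using normalized hourly weights taken from an independent estimate such as radar; the numbers are invented.

      # Disaggregate a daily precipitation total to hourly values using radar-based weights.
      import numpy as np

      daily_total_mm = 18.4                          # gauge-analysis daily total (assumed)
      radar_hourly_mm = np.array([0, 0, 0, 0.2, 1.5, 3.0, 4.1, 2.2, 0.8, 0.3, 0, 0,
                                  0, 0, 0, 0, 0.5, 1.9, 2.6, 1.1, 0.4, 0, 0, 0])

      total = radar_hourly_mm.sum()
      # Normalize the radar estimate into time-weights; fall back to uniform weights if dry.
      weights = radar_hourly_mm / total if total > 0 else np.full(24, 1.0 / 24.0)
      hourly_mm = daily_total_mm * weights
      print(hourly_mm.round(2), hourly_mm.sum())     # hourly values sum to the daily total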

  14. ENVIRONMENTAL GOODS AND SERVICES FROM RESTORATION ALTERNATIVES: EMERGY-BASED METHOD OF BENEFIT VALUE

    EPA Science Inventory

    Although economic benefit-cost analyses of environmental regulations have been conducted since the 1970s, an effective methodology has yet to be developed for the integrated assessment of regulatory impacts on the larger system as a whole, including its social and environmental as...

  15. Development and Validation of Cognitive Screening Instruments.

    ERIC Educational Resources Information Center

    Jarman, Ronald F.

    The author suggests that most research on the early detection of learning disabilities is characterized by an ineffective and atheoretical method of selecting and validating tasks. An alternative technique is proposed, based on a neurological theory of cognitive processes, whereby task analysis is a first step, with empirical analyses as…

  16. Design of a Monitoring Program for Northern Spotted Owls

    Treesearch

    Jonathan Bart; Douglas S. Robson

    1995-01-01

    This paper discusses methods for estimating population trends of Northern Spotted Owls (Strix occidentalis caurina) based on point counts. Although the monitoring program will have five distinct components, attention here is restricted to one of these: roadside surveys of territorial birds. Analyses of Breeding Bird Survey data and computer...

  17. Spurious correlations and inference in landscape genetics

    Treesearch

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  18. Evaluating the Diagnostic Validity of a Facet-Based Formative Assessment System

    ERIC Educational Resources Information Center

    DeBarger, Angela Haydel; DiBello, Louis; Minstrell, Jim; Feng, Mingyu; Stout, William; Pellegrino, James; Haertel, Geneva; Harris, Christopher; Ructinger, Liliana

    2011-01-01

    This paper describes methods for an alignment study and psychometric analyses of a formative assessment system, Diagnoser Tools for physics. Diagnoser Tools begin with facet clusters as the interpretive framework for designing questions and instructional activities. Thus each question in the diagnostic assessments includes distractors that…

  19. Policies and Identities in Mandarin Education: The Situated Multilingualism of University-Level "Heritage" Language Learners

    ERIC Educational Resources Information Center

    Kelleher, Ann Marie

    2010-01-01

    This dissertation explores complex positionings of Chinese heritage language (CHL) learners amid several intersecting discourses, including those around globalization, identity development and language policies. Using critical, qualitative methods, the study combines textual and site-based analyses, linking the language development experiences of…

  20. "Methods of Inquiry": Using Critical Thinking to Retain Students

    ERIC Educational Resources Information Center

    Ahuna, Kelly H.; Tinnesz, Christine Gray; VanZile-Tamsen, Carol

    2011-01-01

    In the late 1980s a large northeastern university implemented a critical thinking course for undergraduate students. Combining insights from cognitive psychology and philosophy, this class was designed to give students concrete strategies to promote self-regulated learning and ensure academic success. The analyses in this study are based on…
