Sample records for previously published methods

  1. Reference Accuracy among Research Articles Published in "Research on Social Work Practice"

    ERIC Educational Resources Information Center

    Wilks, Scott E.; Geiger, Jennifer R.; Bates, Samantha M.; Wright, Amy L.

    2017-01-01

    Objective: The objective was to examine reference errors in research articles published in Research on Social Work Practice. High rates of reference errors in other top social work journals have been noted in previous studies. Methods: Via a sampling frame of 22,177 total references among 464 research articles published in the previous decade, a…

  2. Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?

    PubMed

    Li, Tianjing; Dickersin, Kay

    2013-06-01

    Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  3. REMARK checklist elaborated to improve tumor prognostic marker studies

    Cancer.gov

    Experts have elaborated on a previously published checklist of 20 items -- including descriptions of design, methods, and analysis -- that researchers should address when publishing studies of prognostic markers. These markers are indicators that enable d

  4. Standard test method for grindability of coal by the Hardgrove-machine method. ASTM standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-05-01

    This test method is under the jurisdiction of ASTM Committee D-5 on Coal and Coke and is the direct responsibility of Subcommittee D05.07 on Physical Characteristics of Coal. The current edition was approved on November 10, 1997, and published May 1998. It was originally published as D 409-51. The last previous edition was D 409-93a.

  5. A New Method for Obtaining Russell-Saunders Terms

    ERIC Educational Resources Information Center

    Liu, Ying; Liu, Yue; Liu, Bihui

    2011-01-01

    A new method for obtaining Russell-Saunders terms of atomic configurations is reported. This new method is significantly different from, while at the same time complementary to, previously published methods for obtaining atomic terms. This novel procedure is elicited by the method used to determine the splitting of S, P, D terms in weak ligand…

  6. Annotated Bibliography on the Teaching of Psychology: 2007

    ERIC Educational Resources Information Center

    Johnson, David E.; Schroder, Simone I.; Erickson, Jonathan P.; Grimes, Katherine N.

    2008-01-01

    This annotated bibliography is a continuation of those previously published in "Teaching of Psychology" (e.g., Berry & Daniel, 1984; Fulkerson & Wise, 1987; Johnson & Schroder, 1997; Wise & Fulkerson, 1996). In this paper, the authors maintained similar search methods and criteria for inclusion that were used in previous bibliographies. They also…

  7. Washing with contaminated bar soap is unlikely to transfer bacteria.

    PubMed Central

    Heinze, J. E.; Yackovich, F.

    1988-01-01

    Recent reports of the isolation of microorganisms from used soap bars have raised the concern that bacteria may be transferred from contaminated soap bars during handwashing. Since only one study addressing this question has been published, we developed an additional procedure to test this concern. In our new method prewashed and softened commercial deodorant soap bars (0.8% triclocarban) not active against Gram-negative bacteria were inoculated with Escherichia coli and Pseudomonas aeruginosa to give mean total survival levels of 4.4 X 10(5) c.f.u. per bar which was 70-fold higher than those reported on used soap bars. Sixteen panelists were instructed to wash with the inoculated bars using their normal handwashing procedure. After washing, none of the 16 panelists had detectable levels of either test bacterium on their hands. Thus, the results obtained using our new method were in complete agreement with those obtained with the previously published method even though the two methods differ in a number of procedural aspects. These findings, along with other published reports, show that little hazard exists in routine handwashing with previously used soap bars and support the frequent use of soap and water for handwashing to prevent the spread of disease. PMID:3402545

  8. THE CHEMICAL ANALYSIS OF TERNARY ALLOYS OF PLUTONIUM WITH MOLYBDENUM AND URANIUM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, G.; Woodhead, J.; Jenkins, E.N.

    1958-09-01

    It is shown that the absorptiometric determination of molybdenum as thiocyanate may be used in the presence of plutonium. Molybdenum interferes with previously published methods for determining uranium and plutonium but conditions have been established for its complete removal by solvent extraction of the compound with alpha-benzoin oxime. The previous methods for uranium and plutonium are satisfactory when applied to the residual aqueous phase following this solvent extraction. (auth)

  9. Use of Recommended Search Strategies in Systematic Reviews and the Impact of Librarian Involvement: A Cross-Sectional Survey of Recent Authors

    PubMed Central

    Koffel, Jonathan B.

    2015-01-01

    Background Previous research looking at published systematic reviews has shown that their search strategies are often suboptimal and that librarian involvement, though recommended, is low. Confidence in the results, however, is limited due to poor reporting of search strategies in the published articles. Objectives To more accurately measure the use of recommended search methods in systematic reviews, the levels of librarian involvement, and whether librarian involvement predicts the use of recommended methods. Methods A survey was sent to all authors of English-language systematic reviews indexed in the Database of Abstracts of Reviews of Effects (DARE) from January 2012 through January 2014. The survey asked about their use of search methods recommended by the Institute of Medicine, Cochrane Collaboration, and the Agency for Healthcare Research and Quality and if and how a librarian was involved in the systematic review. Rates of use of recommended methods and librarian involvement were summarized. The impact of librarian involvement on use of recommended methods was examined using a multivariate logistic regression. Results 1560 authors completed the survey. Use of recommended search methods ranged widely from 98% for use of keywords to 9% for registration in PROSPERO and were generally higher than in previous studies. 51% of studies involved a librarian, but only 64% acknowledged their assistance. Librarian involvement was significantly associated with the use of 65% of recommended search methods after controlling for other potential predictors. Odds ratios ranged from 1.36 (95% CI 1.06 to 1.75) for including multiple languages to 3.07 (95% CI 2.06 to 4.58) for using controlled vocabulary. Conclusions Use of recommended search strategies is higher than previously reported, but many methods are still under-utilized. Librarian involvement predicts the use of most methods, but their involvement is under-reported within the published article. PMID:25938454

  10. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors.

    PubMed

    Koffel, Jonathan B

    2015-01-01

    Previous research looking at published systematic reviews has shown that their search strategies are often suboptimal and that librarian involvement, though recommended, is low. Confidence in the results, however, is limited due to poor reporting of search strategies in the published articles. To more accurately measure the use of recommended search methods in systematic reviews, the levels of librarian involvement, and whether librarian involvement predicts the use of recommended methods. A survey was sent to all authors of English-language systematic reviews indexed in the Database of Abstracts of Reviews of Effects (DARE) from January 2012 through January 2014. The survey asked about their use of search methods recommended by the Institute of Medicine, Cochrane Collaboration, and the Agency for Healthcare Research and Quality and if and how a librarian was involved in the systematic review. Rates of use of recommended methods and librarian involvement were summarized. The impact of librarian involvement on use of recommended methods was examined using a multivariate logistic regression. 1560 authors completed the survey. Use of recommended search methods ranged widely from 98% for use of keywords to 9% for registration in PROSPERO and were generally higher than in previous studies. 51% of studies involved a librarian, but only 64% acknowledged their assistance. Librarian involvement was significantly associated with the use of 65% of recommended search methods after controlling for other potential predictors. Odds ratios ranged from 1.36 (95% CI 1.06 to 1.75) for including multiple languages to 3.07 (95% CI 2.06 to 4.58) for using controlled vocabulary. Use of recommended search strategies is higher than previously reported, but many methods are still under-utilized. Librarian involvement predicts the use of most methods, but their involvement is under-reported within the published article.
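
Since both copies of this record report odds ratios with 95% confidence intervals from a multivariate logistic regression, the sketch below shows, on synthetic data, how such odds ratios are typically obtained by exponentiating the fitted coefficients. The variables and effect sizes are invented, not taken from the survey dataset.

```python
# Hedged sketch: odds ratios with 95% CIs from a multivariate logistic regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
librarian = rng.integers(0, 2, n)          # hypothetical predictor of interest
experience = rng.integers(1, 20, n)        # hypothetical covariate (years)

# Simulate an outcome whose odds roughly triple when a librarian is involved
logit_p = -0.5 + np.log(3.0) * librarian + 0.02 * experience
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([librarian, experience]))
fit = sm.Logit(y, X).fit(disp=False)
print(np.exp(fit.params))      # odds ratios (intercept, librarian, experience)
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the odds-ratio scale
```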

  11. Child Mortality Estimation 2013: An Overview of Updates in Estimation Methods by the United Nations Inter-Agency Group for Child Mortality Estimation

    PubMed Central

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    Background In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954

  12. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    PubMed

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  13. Evaluation and improvements of a mayfly, Neocloeon (Centroptilum) triangulifer (Ephemeroptera: Baetidae) toxicity test method

    EPA Science Inventory

    A recently published test method for Neocloeon triangulifer assessed the sensitivities of larval mayflies to several reference toxicants (NaCl, KCl, and CuSO4). Subsequent exposures have shown discrepancies from those results previously reported. To identify potential sources of ...

  14. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    PubMed

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging victims upside down and compressing the chest to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper on heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time the guidelines have been repeatedly modified and improved by two leading expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years; currently the 2010 guidelines apply. In this paper the authors attempt to present the history of resuscitation techniques and methods and to assess the influence of earlier lifesaving methods on today's technologies, equipment and guidelines, which make it possible to help those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  15. Indispensable finite time corrections for Fokker-Planck equations from time series data.

    PubMed

    Ragwitz, M; Kantz, H

    2001-12-17

    The reconstruction of Fokker-Planck equations from observed time series data suffers strongly from finite sampling rates. We show that previously published results are degraded considerably by such effects. We present correction terms which yield a robust estimation of the diffusion terms, together with a novel method for one-dimensional problems. We apply these methods to time series data of local surface wind velocities, where the dependence of the diffusion constant on the state variable shows a different behavior than previously suggested.
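
Record 15 concerns correcting finite-sampling bias in Fokker-Planck reconstruction. As context, the sketch below shows the uncorrected, binned Kramers-Moyal estimate of drift and diffusion that such corrections start from; it is an illustration on a simulated Ornstein-Uhlenbeck series, not the authors' corrected estimator.

```python
import numpy as np

def kramers_moyal(x, dt, n_bins=30):
    """Naive binned estimates of drift D1(x) and diffusion D2(x) from a time series."""
    dx = np.diff(x)
    edges = np.histogram_bin_edges(x[:-1], n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    idx = np.digitize(x[:-1], edges) - 1
    d1 = np.full(n_bins, np.nan)
    d2 = np.full(n_bins, np.nan)
    for b in range(n_bins):
        m = idx == b
        if m.sum() > 10:
            d1[b] = dx[m].mean() / dt               # conditional first moment / dt
            d2[b] = (dx[m] ** 2).mean() / (2 * dt)  # conditional second moment / 2 dt
    return centers, d1, d2

# Ornstein-Uhlenbeck test series: dx = -x dt + sqrt(2 D) dW, so D1(x) = -x, D2(x) = D
rng = np.random.default_rng(1)
dt, n, D = 0.01, 100_000, 0.5
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + np.sqrt(2 * D * dt) * rng.standard_normal()

centers, d1, d2 = kramers_moyal(x, dt)
print(d2[len(d2) // 2])   # near D = 0.5, up to the finite-sampling bias the paper corrects
```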

  16. Child mortality estimation 2013: an overview of updates in estimation methods by the United Nations Inter-agency Group for Child Mortality Estimation.

    PubMed

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues.

  17. Kinetic analysis of hyperpolarized data with minimum a priori knowledge: Hybrid maximum entropy and nonlinear least squares method (MEM/NLS).

    PubMed

    Mariotti, Erika; Veronese, Mattia; Dunn, Joel T; Southworth, Richard; Eykyn, Thomas R

    2015-06-01

    To assess the feasibility of using a hybrid Maximum-Entropy/Nonlinear Least Squares (MEM/NLS) method for analyzing the kinetics of hyperpolarized dynamic data with minimum a priori knowledge. A continuous distribution of rates obtained through the Laplace inversion of the data is used as a constraint on the NLS fitting to derive a discrete spectrum of rates. Performance of the MEM/NLS algorithm was assessed through Monte Carlo simulations and validated by fitting the longitudinal relaxation time curves of hyperpolarized [1-(13)C] pyruvate acquired at 9.4 Tesla and at three different flip angles. The method was further used to assess the kinetics of hyperpolarized pyruvate-lactate exchange acquired in vitro in whole blood and to re-analyze the previously published in vitro reaction of hyperpolarized (15)N choline with choline kinase. The MEM/NLS method was found to be adequate for the kinetic characterization of hyperpolarized in vitro time-series. Additional insights were obtained from experimental data in blood as well as from previously published (15)N choline experimental data. The proposed method informs on the compartmental model that best approximates the biological system observed using hyperpolarized (13)C MR, especially when the metabolic pathway assessed is complex or a new hyperpolarized probe is used. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc.
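
For context on the NLS half of the hybrid approach described above, the sketch below fits a mono-exponential decay to a synthetic hyperpolarized-signal time course with SciPy. The maximum-entropy Laplace-inversion constraint, which is the novel part of the MEM/NLS method, is not reproduced; the sampling times and noise level are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, s0, rate):
    return s0 * np.exp(-rate * t)

t = np.linspace(0, 60, 30)                     # s, hypothetical sampling times
rng = np.random.default_rng(2)
signal = mono_exp(t, 100.0, 1 / 25.0) + rng.normal(0, 1.0, t.size)

popt, pcov = curve_fit(mono_exp, t, signal, p0=(signal[0], 0.05))
s0_fit, rate_fit = popt
print(f"apparent decay rate = {rate_fit:.4f} 1/s, effective T1 = {1 / rate_fit:.1f} s")
```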

  18. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continues the development of methods and algorithms for the generation of MRC, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R2, while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRC using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0 written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within the MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
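
The sketch below illustrates the generic idea behind a master recession curve: shift each recession segment horizontally so that its vertex lands on the curve assembled so far, here using plain interpolation. It is a simplified stand-in, not the published trigonometry-based algorithm or the MRCTools VBA code, and the segment values are invented.

```python
import numpy as np

def build_mrc(segments, dt=1.0):
    """Assemble a master recession curve by shifting each segment in time so that
    its vertex (first, highest value) falls on the curve built so far.
    Assumes the merged curve stays monotonically decreasing."""
    master_t = np.arange(len(segments[0])) * dt
    master_v = np.asarray(segments[0], float)
    for seg in segments[1:]:
        seg = np.asarray(seg, float)
        # np.interp needs increasing x, so interpolate over the reversed arrays
        t_start = np.interp(seg[0], master_v[::-1], master_t[::-1])
        seg_t = t_start + np.arange(len(seg)) * dt
        order = np.argsort(np.concatenate([master_t, seg_t]))
        master_t = np.concatenate([master_t, seg_t])[order]
        master_v = np.concatenate([master_v, seg])[order]
    return master_t, master_v

t, v = build_mrc([[10.0, 8.0, 6.5, 5.5], [7.0, 6.0, 5.2, 4.6], [5.0, 4.5, 4.1]])
print(np.round(t, 2), v)
```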

  19. Simulation optimization of PSA-threshold based prostate cancer screening policies

    PubMed Central

    Zhang, Jingyu; Denton, Brian T.; Shah, Nilay D.; Inman, Brant A.

    2013-01-01

    We describe a simulation optimization method to design PSA screening policies based on expected quality adjusted life years (QALYs). Our method integrates a simulation model in a genetic algorithm which uses a probabilistic method for selection of the best policy. We present computational results about the efficiency of our algorithm. The best policy generated by our algorithm is compared to previously recommended screening policies. Using the policies determined by our model, we present evidence that patients should be screened more aggressively but for a shorter length of time than previously published guidelines recommend. PMID:22302420

  20. Measuring fluorescence polarization with a dichrometer.

    PubMed

    Sutherland, John C

    2017-09-01

    A method for obtaining fluorescence polarization data from an instrument designed to measure circular and linear dichroism is compared with a previously reported approach. The new method places a polarizer between the sample and a detector mounted perpendicular to the direction of the incident beam and results in determination of the fluorescence polarization ratio, whereas the previous method does not use a polarizer and yields the fluorescence anisotropy. A similar analysis with the detector located axially with the excitation beam demonstrates that there is no frequency modulated signal due to fluorescence polarization in the absence of a polarizer. Copyright © 2017. Published by Elsevier Inc.
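
For reference, these are the standard textbook definitions that distinguish the two quantities contrasted above, the polarization ratio and the anisotropy; the snippet is illustrative and not taken from the cited instrument work.

```python
def polarization_ratio(i_par, i_perp):
    """P = (I_par - I_perp) / (I_par + I_perp), the quantity the new method yields."""
    return (i_par - i_perp) / (i_par + i_perp)

def anisotropy(i_par, i_perp):
    """r = (I_par - I_perp) / (I_par + 2 * I_perp), the quantity from the earlier approach."""
    return (i_par - i_perp) / (i_par + 2 * i_perp)

# Example: I_par = 3, I_perp = 1 gives P = 0.5 and r = 0.4
print(polarization_ratio(3.0, 1.0), anisotropy(3.0, 1.0))
```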

  1. Examining the Efficacy of Self-Regulated Strategy Development for Students with Emotional or Behavioral Disorders: A Meta-Analysis

    ERIC Educational Resources Information Center

    Losinski, Mickey; Cuenca-Carlino, Yojanna; Zablocki, Mark; Teagarden, James

    2014-01-01

    Two previous reviews have indicated that self-regulated strategy development (SRSD) is an evidence-based practice that can improve the writing skills of students with emotional and behavioral disorders. The purpose of this meta-analysis is to extend the findings and analytic methods of previous reviews by examining published studies regarding…

  2. Underwater psychophysical audiogram of a young male California sea lion (Zalophus californianus).

    PubMed

    Mulsow, Jason; Houser, Dorian S; Finneran, James J

    2012-05-01

    Auditory evoked potential (AEP) data are commonly obtained in air while sea lions are under gas anesthesia; a procedure that precludes the measurement of underwater hearing sensitivity. This is a substantial limitation considering the importance of underwater hearing data in designing criteria aimed at mitigating the effects of anthropogenic noise exposure. To determine if some aspects of underwater hearing sensitivity can be predicted using rapid aerial AEP methods, this study measured underwater psychophysical thresholds for a young male California sea lion (Zalophus californianus) for which previously published aerial AEP thresholds exist. Underwater thresholds were measured in an aboveground pool at frequencies between 1 and 38 kHz. The underwater audiogram was very similar to those previously published for California sea lions, suggesting that the current and previously obtained psychophysical data are representative for this species. The psychophysical and previously measured AEP audiograms were most similar in terms of high-frequency hearing limit (HFHL), although the underwater HFHL was sharper and occurred at a higher frequency. Aerial AEP methods are useful for predicting reductions in the HFHL that are potentially independent of the testing medium, such as those due to age-related sensorineural hearing loss.

  3. Separation and quantification of monothiols and phytochelatins from a wide variety of cell cultures and tissues of trees and other plants using high performance liquid chromatography

    Treesearch

    Rakesh Minocha; P. Thangavel; Om Parkash Dhankher; Stephanie Long

    2008-01-01

    The HPLC method presented here for the quantification of metal-binding thiols is considerably shorter than most previously published methods. It is a sensitive and highly reproducible method that separates monobromobimane tagged monothiols (cysteine, glutathione, γ-glutamylcysteine) along with polythiols (PC2, PC3...

  4. Data-driven mapping of hypoxia-related tumor heterogeneity using DCE-MRI and OE-MRI.

    PubMed

    Featherstone, Adam K; O'Connor, James P B; Little, Ross A; Watson, Yvonne; Cheung, Sue; Babur, Muhammad; Williams, Kaye J; Matthews, Julian C; Parker, Geoff J M

    2018-04-01

    Previous work has shown that combining dynamic contrast-enhanced (DCE)-MRI and oxygen-enhanced (OE)-MRI binary enhancement maps can identify tumor hypoxia. The current work proposes a novel, data-driven method for mapping tissue oxygenation and perfusion heterogeneity, based on clustering DCE/OE-MRI data. DCE-MRI and OE-MRI were performed on nine U87 (glioblastoma) and seven Calu6 (non-small cell lung cancer) murine xenograft tumors. Area under the curve and principal component analysis features were calculated and clustered separately using Gaussian mixture modelling. Evaluation metrics were calculated to determine the optimum feature set and cluster number. Outputs were quantitatively compared with a previous non data-driven approach. The optimum method located six robustly identifiable clusters in the data, yielding tumor region maps with spatially contiguous regions in a rim-core structure, suggesting a biological basis. Mean within-cluster enhancement curves showed physiologically distinct, intuitive kinetics of enhancement. Regions of DCE/OE-MRI enhancement mismatch were located, and voxel categorization agreed well with the previous non data-driven approach (Cohen's kappa = 0.61, proportional agreement = 0.75). The proposed method locates similar regions to the previously published method of binarization of DCE/OE-MRI enhancement, but renders a finer segmentation of intra-tumoral oxygenation and perfusion. This could aid in understanding the tumor microenvironment and its heterogeneity. Magn Reson Med 79:2236-2245, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
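
A hedged sketch of the general workflow this abstract describes, clustering per-voxel DCE/OE feature vectors with a Gaussian mixture model, selecting the number of clusters by an information criterion, and quantifying agreement between two voxel categorizations with Cohen's kappa, is shown below on synthetic data. It is not the study's pipeline or data; feature values and the comparison maps are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
# Synthetic per-voxel feature vectors, e.g. (DCE-AUC, OE-AUC)
features = np.vstack([
    rng.normal([0, 0], 0.3, size=(200, 2)),
    rng.normal([2, 0], 0.3, size=(200, 2)),
    rng.normal([2, 2], 0.3, size=(200, 2)),
])

# Pick the number of mixture components by BIC
bics = {k: GaussianMixture(k, random_state=0).fit(features).bic(features)
        for k in range(2, 8)}
best_k = min(bics, key=bics.get)
labels = GaussianMixture(best_k, random_state=0).fit_predict(features)

# Agreement between two hypothetical voxel categorizations (e.g. cluster-derived
# categories vs. a binary-enhancement map), quantified with Cohen's kappa
map_a = rng.integers(0, 2, len(features))
map_b = np.where(rng.random(len(features)) < 0.8, map_a, 1 - map_a)
print(best_k, round(cohen_kappa_score(map_a, map_b), 2))
```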

  5. Textbook Citations as a Measure of Journal Influence on International Business Education

    ERIC Educational Resources Information Center

    Urbancic, Frank R.

    2006-01-01

    Previously published rankings of journals in relation to international business research are based on a survey method or a journal-based citation method wherein functional discipline journals are excluded from consideration. The narrow focus of these studies has generated criticism for perpetuating an international business silo perspective. In…

  6. Creatinine elevation associated with nitromethane exposure: a marker of potential methanol toxicity.

    PubMed

    Cook, Matthew D; Clark, Richard F

    2007-10-01

    Nitromethane, methanol, and oil are the common components of radio-controlled (R/C) vehicle fuels. Nitromethane can cause a false elevation of serum creatinine concentration as measured by the widely used Jaffe colorimetric method. We gathered data from our poison control system and from previously published case reports to see if a correlation exists between serum methanol concentrations and spuriously elevated serum creatinine concentrations after human exposures to R/C fuel. The California Poison Control System (CPCS) computerized database was queried for all cases of human exposure to R/C vehicle fuel reported between December 1, 2002 and December 1, 2004. Serum creatinine and methanol concentrations were recorded when available, as was the method used to determine serum creatinine. A MEDLINE search was used to obtain previously published cases of human nitromethane exposure associated with falsely elevated creatinine concentrations. During the 2-year period, serum creatinine concentrations were recorded in 7 of 26 R/C fuel exposures (all ingestions), and 6 of these were abnormal (range of 1.9-11.5 mg/dL). In this series, the higher the serum creatinine concentration measured by Jaffe method, the higher the serum methanol concentration. The MEDLINE search yielded data from six previously published case reports on this topic. The data from these case reports seem to follow the trend seen in our case series. These data suggest that a spuriously elevated serum creatinine (by Jaffe method) may have value as an early surrogate marker of methanol poisoning in those who ingest R/C fuel. Also, the degree to which the serum creatinine is elevated may indicate the severity of methanol poisoning.

  7. A method of extracting speed-dependent vector correlations from 2 + 1 REMPI ion images.

    PubMed

    Wei, Wei; Wallace, Colin J; Grubb, Michael P; North, Simon W

    2017-07-07

    We present analytical expressions for extracting Dixon's bipolar moments in the semi-classical limit from experimental anisotropy parameters of sliced or reconstructed non-sliced images. The current method focuses on images generated by 2 + 1 REMPI (Resonance Enhanced Multi-photon Ionization) and is a necessary extension of our previously published 1 + 1 REMPI equations. Two approaches for applying the new equations, direct inversion and forward convolution, are presented. As demonstration of the new method, bipolar moments were extracted from images of carbonyl sulfide (OCS) photodissociation at 230 nm and NO 2 photodissociation at 355 nm, and the results are consistent with previous publications.

  8. Biomarkers of Selenium Action in Prostate Cancer

    DTIC Science & Technology

    2005-01-01

    ...secretory by conventional methods according to published literature. In addition, we have determined the similarities and differences in global gene... ...transition zone tissue of a 42-year-old man according to previously described methods [4]. The pre... ...arrays in the resulting data tables were ordered by their... ...hundred fifteen genes identified by ELISA method. Replicating the conditions used for the SAM analysis showed significant differential expres... ...microarray

  9. Utility-preserving anonymization for health data publishing.

    PubMed

    Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn

    2017-07-11

    Publishing raw electronic health records (EHRs) may be considered a breach of the privacy of individuals because they usually contain sensitive information. A common practice for privacy-preserving data publishing is to anonymize the data before publishing, and thus satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and thus various methods have been proposed to reduce it. However, existing generalization-based data anonymization methods cannot avoid excessive information loss and preserve data utility. We propose a utility-preserving anonymization for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method; it applies a full-domain generalization algorithm. We evaluate our method against an existing method on two aspects: information loss, measured through various quality metrics, and the error rate of analysis results. For all quality metrics, our proposed method shows lower information loss than the existing method. In the real-world EHR analysis, results show only a small error between the data anonymized by the proposed method and the original data. In summary, we propose a new utility-preserving anonymization method and an anonymization algorithm using it. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of EHRs anonymized by previous approaches.
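
The sketch below illustrates only the baseline this paper builds on: generalization of quasi-identifiers and a k-anonymity check on a toy table. The paper's utility-preserving additions (counterfeit record insertion and the counterfeit catalog) are not reproduced, and all column names and values are hypothetical.

```python
import pandas as pd

records = pd.DataFrame({
    "age":       [23, 27, 31, 36, 44, 47, 52, 58],
    "zip":       ["13053", "13068", "13053", "13067", "14850", "14853", "14850", "14853"],
    "diagnosis": ["flu", "flu", "cancer", "asthma", "flu", "cancer", "asthma", "flu"],
})

def generalize(df):
    """Generalize the quasi-identifiers: ages to decades, ZIP codes to 3-digit prefixes."""
    out = df.copy()
    out["age"] = (out["age"] // 10 * 10).astype(str) + "s"   # 23 -> "20s"
    out["zip"] = out["zip"].str[:3] + "**"                   # "13053" -> "130**"
    return out

def is_k_anonymous(df, quasi_ids, k):
    """Every combination of quasi-identifier values must occur at least k times."""
    return df.groupby(quasi_ids).size().min() >= k

anon = generalize(records)
print(is_k_anonymous(anon, ["age", "zip"], k=2))
```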

  10. MBA: Is the Traditional Model Doomed?

    ERIC Educational Resources Information Center

    Lataif, Louis E.; And Others

    1992-01-01

    Presents 13 commentaries on a previously published case study about the value of a Master's of Business Administration to employers today. Critiques center on the case study method, theory-practice gap, and value of practical experience and include international perspectives. (SK)

  11. Parameterizing sorption isotherms using a hybrid global-local fitting procedure.

    PubMed

    Matott, L Shawn; Singh, Anshuman; Rabideau, Alan J

    2017-05-01

    Predictive modeling of the transport and remediation of groundwater contaminants requires an accurate description of the sorption process, which is usually provided by fitting an isotherm model to site-specific laboratory data. Commonly used calibration procedures, listed in order of increasing sophistication, include: trial-and-error, linearization, non-linear regression, global search, and hybrid global-local search. Given the considerable variability in fitting procedures applied in published isotherm studies, we investigated the importance of algorithm selection through a series of numerical experiments involving 13 previously published sorption datasets. These datasets, considered representative of state-of-the-art for isotherm experiments, had been previously analyzed using trial-and-error, linearization, or non-linear regression methods. The isotherm expressions were re-fit using a 3-stage hybrid global-local search procedure (i.e. global search using particle swarm optimization followed by Powell's derivative free local search method and Gauss-Marquardt-Levenberg non-linear regression). The re-fitted expressions were then compared to previously published fits in terms of the optimized weighted sum of squared residuals (WSSR) fitness function, the final estimated parameters, and the influence on contaminant transport predictions - where easily computed concentration-dependent contaminant retardation factors served as a surrogate measure of likely transport behavior. Results suggest that many of the previously published calibrated isotherm parameter sets were local minima. In some cases, the updated hybrid global-local search yielded order-of-magnitude reductions in the fitness function. In particular, of the candidate isotherms, the Polanyi-type models were most likely to benefit from the use of the hybrid fitting procedure. In some cases, improvements in fitness function were associated with slight (<10%) changes in parameter values, but in other cases significant (>50%) changes in parameter values were noted. Despite these differences, the influence of isotherm misspecification on contaminant transport predictions was quite variable and difficult to predict from inspection of the isotherms. Copyright © 2017 Elsevier B.V. All rights reserved.
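
As a rough analogue of the hybrid global-local fitting procedure described above (the study used particle swarm optimization, Powell's method, and Gauss-Marquardt-Levenberg regression), the sketch below seeds a local least-squares refinement with SciPy's differential evolution to fit a Freundlich isotherm by weighted sum of squared residuals. The concentrations, sorbed amounts, and weights are invented.

```python
import numpy as np
from scipy.optimize import differential_evolution, least_squares

C = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 50.0])   # aqueous concentration (hypothetical)
q = np.array([0.9, 1.4, 3.6, 5.1, 12.0, 17.5, 42.0])   # sorbed concentration (hypothetical)
w = 1.0 / q                                             # hypothetical weights

def residuals(p):
    kf, n = p
    return w * (q - kf * C ** n)      # Freundlich isotherm: q = Kf * C**n

def wssr(p):
    return np.sum(residuals(p) ** 2)  # weighted sum of squared residuals

# Stage 1: global search over broad parameter bounds
global_fit = differential_evolution(wssr, bounds=[(0.01, 100.0), (0.1, 1.5)], seed=0)
# Stage 2: local refinement starting from the global optimum
local_fit = least_squares(residuals, global_fit.x)
print(local_fit.x, wssr(local_fit.x))
```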

  12. The Elastic Behaviour of Sintered Metallic Fibre Networks: A Finite Element Study by Beam Theory

    PubMed Central

    Bosbach, Wolfram A.

    2015-01-01

    Background The finite element method has complimented research in the field of network mechanics in the past years in numerous studies about various materials. Numerical predictions and the planning efficiency of experimental procedures are two of the motivational aspects for these numerical studies. The widespread availability of high performance computing facilities has been the enabler for the simulation of sufficiently large systems. Objectives and Motivation In the present study, finite element models were built for sintered, metallic fibre networks and validated by previously published experimental stiffness measurements. The validated models were the basis for predictions about so far unknown properties. Materials and Methods The finite element models were built by transferring previously published skeletons of fibre networks into finite element models. Beam theory was applied as simplification method. Results and Conclusions The obtained material stiffness isn’t a constant but rather a function of variables such as sample size and boundary conditions. Beam theory offers an efficient finite element method for the simulated fibre networks. The experimental results can be approximated by the simulated systems. Two worthwhile aspects for future work will be the influence of size and shape and the mechanical interaction with matrix materials. PMID:26569603

  13. Analysis of fluorescently labeled glycosphingolipid-derived oligosaccharides following ceramide glycanase digestion and anthranilic acid labeling.

    PubMed

    Neville, David C A; Coquard, Virginie; Priestman, David A; te Vruchte, Danielle J M; Sillence, Daniel J; Dwek, Raymond A; Platt, Frances M; Butters, Terry D

    2004-08-15

    Interest in cellular glycosphingolipid (GSL) function has necessitated the development of a rapid and sensitive method to both analyze and characterize the full complement of structures present in various cells and tissues. An optimized method to characterize oligosaccharides released from glycosphingolipids following ceramide glycanase digestion has been developed. The procedure uses the fluorescent compound anthranilic acid (2-aminobenzoic acid; 2-AA) to label oligosaccharides prior to analysis using normal-phase high-performance liquid chromatography. The labeling procedure is rapid, selective, and easy to perform and is based on the published method of Anumula and Dhume [Glycobiology 8 (1998) 685], originally used to analyze N-linked oligosaccharides. It is less time consuming than a previously published 2-aminobenzamide labeling method [Anal. Biochem. 298 (2001) 207] for analyzing GSL-derived oligosaccharides, as the fluorescent labeling is performed on the enzyme reaction mixture. The purification of 2-AA-labeled products has been improved to ensure recovery of oligosaccharides containing one to four monosaccharide units, which was not previously possible using the Anumula and Dhume post-derivatization purification procedure. This new approach may also be used to analyze both N- and O-linked oligosaccharides.

  14. An investigation of the effects of reading and writing text-based messages while driving.

    DOT National Transportation Integrated Search

    2012-08-01

    Previous research, using driving simulation, crash data, and naturalistic methods, has begun to shed light on the dangers of texting while driving. Perhaps because of the dangers, no published work has experimentally investigated the dangers of texti...

  15. Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009

    USGS Publications Warehouse

    Soller, David R.

    2011-01-01

    As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  16. HIV Model Parameter Estimates from Interruption Trial Data including Drug Efficacy and Reservoir Dynamics

    PubMed Central

    Luo, Rutao; Piovoso, Michael J.; Martinez-Picado, Javier; Zurakowski, Ryan

    2012-01-01

    Mathematical models based on ordinary differential equations (ODE) have had significant impact on understanding HIV disease dynamics and optimizing patient treatment. A model that characterizes the essential disease dynamics can be used for prediction only if the model parameters are identifiable from clinical data. Most previous parameter identification studies for HIV have used sparsely sampled data from the decay phase following the introduction of therapy. In this paper, model parameters are identified from frequently sampled viral-load data taken from ten patients enrolled in the previously published AutoVac HAART interruption study, providing between 69 and 114 viral load measurements from 3–5 phases of viral decay and rebound for each patient. This dataset is considerably larger than those used in previously published parameter estimation studies. Furthermore, the measurements come from two separate experimental conditions, which allows for the direct estimation of drug efficacy and reservoir contribution rates, two parameters that cannot be identified from decay-phase data alone. A Markov-Chain Monte-Carlo method is used to estimate the model parameter values, with initial estimates obtained using nonlinear least-squares methods. The posterior distributions of the parameter estimates are reported and compared for all patients. PMID:22815727
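
This record describes estimating ODE model parameters from viral-load data with nonlinear least squares followed by Markov-Chain Monte-Carlo. The sketch below shows that general two-step pattern on a deliberately simplified one-parameter exponential-decay model, not the paper's HIV model; the data, noise level, and priors are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
t = np.linspace(0, 14, 15)                         # days, hypothetical sampling times
true_v0, true_k, sigma = 1e5, 0.5, 0.1
log_v = np.log10(true_v0 * np.exp(-true_k * t)) + rng.normal(0, sigma, t.size)

def log10_model(tt, log10_v0, k):
    return log10_v0 - k * tt / np.log(10)          # log10 of V0 * exp(-k t)

def log_post(theta):
    log10_v0, k = theta
    if not (0 < k < 5):                            # flat prior with a simple bound
        return -np.inf
    resid = log_v - log10_model(t, log10_v0, k)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Step 1: nonlinear least squares supplies the starting point
p0, _ = curve_fit(log10_model, t, log_v, p0=(4.0, 0.3))

# Step 2: random-walk Metropolis sampling around that starting point
theta, lp = np.array(p0), log_post(p0)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])                     # drop burn-in
print(chain.mean(axis=0), chain.std(axis=0))       # posterior mean/sd of (log10 V0, k)
```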

  17. Simultaneous separation and quantitation of amino acids and polyamines of forest tree tissues and cell cultures within a single high-performance liquid chromatography run using dansyl derivatization

    Treesearch

    Rakesh Minocha; Stephanie Long

    2004-01-01

    The objective of the present study was to develop a rapid HPLC method for simultaneous separation and quantitation of dansylated amino acids and common polyamines in the same matrix for analyzing forest tree tissues and cell cultures. The major modifications incorporated into this method as compared to previously published HPLC methods for separation of only dansyl...

  18. A Method for Continuous (239)Pu Determinations in Arctic and Antarctic Ice Cores.

    PubMed

    Arienzo, M M; McConnell, J R; Chellman, N; Criscitiello, A S; Curran, M; Fritzsche, D; Kipfstuhl, S; Mulvaney, R; Nolan, M; Opel, T; Sigl, M; Steffensen, J P

    2016-07-05

    Atmospheric nuclear weapons testing (NWT) resulted in the injection of plutonium (Pu) into the atmosphere and subsequent global deposition. We present a new method for continuous semiquantitative measurement of (239)Pu in ice cores, which was used to develop annual records of fallout from NWT in ten ice cores from Greenland and Antarctica. The (239)Pu was measured directly using an inductively coupled plasma-sector field mass spectrometer, thereby reducing analysis time and increasing depth-resolution with respect to previous methods. To validate this method, we compared our one year averaged results to published (239)Pu records and other records of NWT. The (239)Pu profiles from the Arctic ice cores reflected global trends in NWT and were in agreement with discrete Pu profiles from lower latitude ice cores. The (239)Pu measurements in the Antarctic ice cores tracked low latitude NWT, consistent with previously published discrete records from Antarctica. Advantages of the continuous (239)Pu measurement method are (1) reduced sample preparation and analysis time; (2) no requirement for additional ice samples for NWT fallout determinations; (3) measurements are exactly coregistered with all other chemical, elemental, isotopic, and gas measurements from the continuous analytical system; and (4) the long half-life means the (239)Pu record is stable through time.

  19. Nonparticipatory Stiffness in the Male Perioral Complex

    ERIC Educational Resources Information Center

    Chu, Shin-Ying; Barlow, Steven M.; Lee, Jaehoon

    2009-01-01

    Purpose: The objective of this study was to extend previously published findings in the authors' laboratory using a new automated technology to quantitatively characterize nonparticipatory perioral stiffness in healthy male adults. Method: Quantitative measures of perioral stiffness were sampled during a nonparticipatory task using a…

  20. MOLECULAR SIZE EXCLUSION BY SOIL ORGANIC MATERIALS ESTIMATED FROM THEIR SWELLING IN ORGANIC SOLVENTS

    EPA Science Inventory

    A published method previously developed to measure the swelling characteristics of powdered coal samples has been adapted for swelling measurements on various peat, pollen, chitin, and cellulose samples. The swelling of these macromolecular materials is the volumetric manifestatio...

  1. MOLECULAR SIZE EXCLUSION BY SOIL ORGANIC MATERIALS ESTIMATED FROM THEIR SWELLING IN ORGANIC SOLVENTS

    EPA Science Inventory

    A published method previously developed to measure the swelling characteristics of powdered coal samples has been adapted for swelling measurements on various peat, pollen, chitin, and cellulose samples. The swelling of these macromolecular materials is the volumetric manifestatio...

  2. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  3. Using mixed methods research designs in health psychology: an illustrated discussion from a pragmatist perspective.

    PubMed

    Bishop, Felicity L

    2015-02-01

    To outline some of the challenges of mixed methods research and illustrate how they can be addressed in health psychology research. This study critically reflects on the author's previously published mixed methods research and discusses the philosophical and technical challenges of mixed methods, grounding the discussion in a brief review of methodological literature. Mixed methods research is characterized as having philosophical and technical challenges; the former can be addressed by drawing on pragmatism, the latter by considering formal mixed methods research designs proposed in a number of design typologies. There are important differences among the design typologies which provide diverse examples of designs that health psychologists can adapt for their own mixed methods research. There are also similarities; in particular, many typologies explicitly orient to the technical challenges of deciding on the respective timing of qualitative and quantitative methods and the relative emphasis placed on each method. Characteristics, strengths, and limitations of different sequential and concurrent designs are identified by reviewing five mixed methods projects, each conducted for a different purpose. Adapting formal mixed methods designs can help health psychologists address the technical challenges of mixed methods research and identify the approach that best fits the research questions and purpose. This does not obviate the need to address the philosophical challenges of mixing qualitative and quantitative methods. Statement of contribution What is already known on this subject? Mixed methods research poses philosophical and technical challenges. Pragmatism is a popular approach to the philosophical challenges, while diverse typologies of mixed methods designs can help address the technical challenges. Examples of mixed methods research can be hard to locate when component studies from mixed methods projects are published separately. What does this study add? Critical reflections on the author's previously published mixed methods research illustrate how a range of different mixed methods designs can be adapted and applied to address health psychology research questions. The philosophical and technical challenges of mixed methods research should be considered together and in relation to the broader purpose of the research. © 2014 The British Psychological Society.

  4. A Review of Propensity-Score Methods and Their Use in Cardiovascular Research.

    PubMed

    Deb, Saswata; Austin, Peter C; Tu, Jack V; Ko, Dennis T; Mazer, C David; Kiss, Alex; Fremes, Stephen E

    2016-02-01

    Observational studies using propensity-score methods have been increasing in the cardiovascular literature because randomized controlled trials are not always feasible or ethical. However, propensity-score methods can be confusing, and the general audience may not fully understand the importance of this technique. The objectives of this review are to describe (1) the fundamentals of propensity score methods, (2) the techniques to assess for propensity-score model adequacy, (3) the 4 major methods for using the propensity score (matching, stratification, covariate adjustment, and inverse probability of treatment weighting [IPTW]) using examples from previously published cardiovascular studies, and (4) the strengths and weaknesses of these 4 techniques. Our review suggests that matching or IPTW using the propensity score have shown to be most effective in reducing bias of the treatment effect. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
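
To make one of the four techniques listed above concrete, the sketch below shows inverse probability of treatment weighting (IPTW) on synthetic data: fit a propensity model, weight treated subjects by 1/e(x) and untreated subjects by 1/(1-e(x)), and compare weighted outcome means. The covariates, treatment assignment mechanism, and effect size are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 1000
age = rng.normal(60, 10, n)
ef = rng.normal(50, 8, n)                          # hypothetical ejection fraction
x = np.column_stack([age, ef])

# Treatment assignment depends on the covariates (confounding by indication)
p_treat = 1 / (1 + np.exp(-(-0.05 * (age - 60) + 0.04 * (ef - 50))))
treated = rng.binomial(1, p_treat)
outcome = 2.0 * treated - 0.03 * age + rng.normal(0, 1, n)   # true effect = 2.0

# Estimate the propensity score e(x) and form IPTW weights
e_hat = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
weights = np.where(treated == 1, 1 / e_hat, 1 / (1 - e_hat))

# Weighted difference in mean outcome approximates the average treatment effect
ate = (np.average(outcome[treated == 1], weights=weights[treated == 1]) -
       np.average(outcome[treated == 0], weights=weights[treated == 0]))
print(round(ate, 2))
```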

  5. Ecological Systems Theory in "School Psychology Review"

    ERIC Educational Resources Information Center

    Burns, Matthew K.; Warmbold-Brann, Kristy; Zaslofsky, Anne F.

    2015-01-01

    Ecological systems theory (EST) has been suggested as a framework to provide effective school psychology services, but previous reviews of research found questionable consistency between methods and the principles of EST. The current article reviewed 349 articles published in "School Psychology Review" (SPR) between 2006 and 2015 and…

  6. Using Colaizzi's method of data analysis to explore the experiences of nurse academics teaching on satellite campuses.

    PubMed

    Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy

    2018-03-16

    Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  7. Habitat suitability criteria via parametric distributions: estimation, model selection and uncertainty

    USGS Publications Warehouse

    Som, Nicholas A.; Goodman, Damon H.; Perry, Russell W.; Hardy, Thomas B.

    2016-01-01

    Previous methods for constructing univariate habitat suitability criteria (HSC) curves have ranged from professional judgement to kernel-smoothed density functions or combinations thereof. We present a new method of generating HSC curves that applies probability density functions as the mathematical representation of the curves. Compared with previous approaches, benefits of our method include (1) estimation of probability density function parameters directly from raw data, (2) quantitative methods for selecting among several candidate probability density functions, and (3) concise methods for expressing estimation uncertainty in the HSC curves. We demonstrate our method with a thorough example using data collected on the depth of water used by juvenile Chinook salmon (Oncorhynchus tschawytscha) in the Klamath River of northern California and southern Oregon. All R code needed to implement our example is provided in the appendix. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
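
A rough sketch of this paper's central idea, fitting several candidate probability density functions to habitat-use observations and selecting among them with AIC, is given below on invented depth data. The authors' actual implementation is the R code in their appendix and is not reproduced here.

```python
import numpy as np
from scipy import stats

depths = stats.gamma.rvs(a=3.0, scale=0.25, size=300, random_state=0)  # meters, synthetic

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "weibull": stats.weibull_min}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(depths, floc=0)                 # fix the location parameter at zero
    loglik = np.sum(dist.logpdf(depths, *params))
    aic[name] = 2 * len(params) - 2 * loglik          # each candidate has the same k here

best = min(aic, key=aic.get)
print(best, {k: round(v, 1) for k, v in aic.items()})
```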

  8. Test-Retest Reliability of the Salutogenic Wellness Promotion Scale (SWPS)

    ERIC Educational Resources Information Center

    Anderson, L. M.; Moore, J. B.; Hayden, B. M.; Becker, C. M.

    2014-01-01

    Objective: This study examined the temporal stability (i.e. test-retest reliability) of the Salutogenic Wellness Promotion Scale (SWPS) using intraclass correlation coefficients (ICC). Current intraclass results were also compared to previously published interclass correlations to support the use of the intraclass method for test-retest…

  9. Planar Cubics Through a Point in a Direction

    NASA Technical Reports Server (NTRS)

    Chou, J. J.; Blake, M. W.

    1993-01-01

    It is shown that the planar cubics through three points and the associated tangent directions can be found by solving a cubic equation and a 2 x 2 system of linear equations. The result is combined with a previously published scheme to produce a better curve-fitting method.

  10. Simple technique to treat pupillary capture after transscleral fixation of intraocular lens.

    PubMed

    Jürgens, Ignasi; Rey, Amanda

    2015-01-01

    We describe a simple surgical technique to manage pupillary capture after previous transscleral fixation of an intraocular lens. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  11. An Unbiased Estimate of Global Interrater Agreement

    ERIC Educational Resources Information Center

    Cousineau, Denis; Laurencelle, Louis

    2017-01-01

    Assessing global interrater agreement is difficult as most published indices are affected by the presence of mixtures of agreements and disagreements. A previously proposed method was shown to be specifically sensitive to global agreement, excluding mixtures, but also negatively biased. Here, we propose two alternatives in an attempt to find what…

  12. Antimicrobial activities of bacteriocins E50-52 and B602 against MRSA and other nosocomial infections

    USDA-ARS?s Scientific Manuscript database

    Our objective was to determine the antimicrobial activities of previously published bacteriocins E50-52 and B602 against methicillin-resistant Staphylococcus aureus (MRSA) and other prominent nosocomial bacterial infections. Methods: Several Russian hospitals were enlisted into the study from 2003 ...

  13. Ancient science in a digital age.

    PubMed

    Lehoux, Daryn

    2013-03-01

    Technology is rapidly changing our understanding of ancient science. New methods of visualization are bringing to light important texts we could not previously read; changes in online publishing are allowing unprecedented access to difficult-to-find materials; and online mapping tools are offering new pictures of lost spaces, connectivities, and physical objects.

  14. Innovative Approaches to Fuel-Air Mixing and Combustion in Airbreathing Hypersonic Engines

    NASA Astrophysics Data System (ADS)

    MacLeod, C.

    This paper describes some innovative methods for achieving enhanced fuel-air mixing and combustion in Scramjet-like spaceplane engines. A multimodal approach to the problem is discussed; this involves using several concurrent methods of forced mixing. The paper concentrates on Electromagnetic Activation (EMA) and Electrostatic Attraction as suitable techniques for this purpose - although several other potential methods are also discussed. Previously published empirical data is used to draw conclusions about the likely effectiveness of the system and possible engine topologies are outlined.

  15. Statistical testing of association between menstruation and migraine.

    PubMed

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are wanted, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To our best knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operator characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
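
    For readers unfamiliar with the mid-p correction mentioned above, the test reduces to a one-sided hypergeometric tail probability on the 2 x 2 table of attack days versus perimenstrual-window days, counting only half the probability of the observed table. The sketch below shows one way to compute it; the diary counts are hypothetical and this is not the authors' code.

      from scipy.stats import hypergeom

      def midp_fisher_one_sided(a, b, c, d):
          """Mid-p one-sided Fisher test for the 2x2 table [[a, b], [c, d]].

          a: migraine days inside perimenstrual windows
          b: migraine days outside the windows
          c: headache-free days inside the windows
          d: headache-free days outside the windows
          Tests for a positive association (more attacks inside the windows).
          """
          M = a + b + c + d          # total diary days
          n = a + b                  # total migraine days
          N = a + c                  # total days inside perimenstrual windows
          rv = hypergeom(M, n, N)
          # mid-p: P(X > a) + 0.5 * P(X = a), where X is the count in the (1,1) cell
          return rv.sf(a) + 0.5 * rv.pmf(a)

      # Hypothetical 90-day diary: 12 attacks inside windows, 18 outside,
      # 15 window days and 45 non-window days without attacks.
      print(midp_fisher_one_sided(12, 18, 15, 45))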

  16. Comparing electrochemical performance of transition metal silicate cathodes and chevrel phase Mo6S8 in the analogous rechargeable Mg-ion battery system

    NASA Astrophysics Data System (ADS)

    Chen, Xinzhi; Bleken, Francesca L.; Løvvik, Ole Martin; Vullum-Bruer, Fride

    2016-07-01

    Polyanion-based silicate materials, MgMSiO4 (M = Fe, Mn, Co), previously reported to be promising cathode materials for Mg-ion batteries, have been re-examined. Both the sol-gel and molten salt methods are employed to synthesize MgMSiO4 composites. Mo6S8 is synthesized by a molten salt method combined with Cu leaching and investigated in the equivalent electrochemical system as a benchmark. Electrochemical measurements for Mo6S8 performed using the 2nd generation electrolyte show similar results to those reported in literature. The electrochemical performance of the silicate materials, on the other hand, does not show the promising results previously reported. A thorough study of these published results is presented here and compared with the current experimental data on the same material system. It appears that there are certain inconsistencies in the published results which cannot be explained. To further corroborate the present experimental results, atomic-scale calculations from first principles are performed, demonstrating that diffusion barriers are very high for Mg diffusion in MgMSiO4. In conclusion, MgMSiO4 (M = Fe, Mn, Co) olivine materials do not seem to be such good candidates for cathode materials in Mg-ion batteries as previously reported.

  17. Determining the Best Method for Estimating the Observed Level of Maximum Detrainment Based on Radar Reflectivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carletta, Nicholas D.; Mullendore, Gretchen L.; Starzec, Mariusz

    Convective mass transport is the transport of mass from near the surface up to the upper troposphere and lower stratosphere (UTLS) by a deep convective updraft. This transport can alter the chemical makeup and water vapor balance of the UTLS, which affects cloud formation and the radiative properties of the atmosphere. It is therefore important to understand the exact altitudes at which mass is detrained from convection. The purpose of this study was to improve upon previously published methodologies for estimating the level of maximum detrainment (LMD) within convection using data from a single ground-based radar. Four methods were used to identify the LMD and validated against dual-Doppler derived vertical mass divergence fields for six cases with a variety of storm types. The best method for locating the LMD was determined to be the method that used a reflectivity texture technique to determine convective cores and a multi-layer echo identification to determine anvil locations. Although an improvement over previously published methods, the new methodology still produced unreliable results in certain regimes. The methodology worked best when applied to mature updrafts, as the anvil needs time to grow to a detectable size. Thus, radar reflectivity is found to be valuable in estimating the LMD, but storm maturity must also be considered for best results.

  18. Evaluation of the PCR method for identification of Bifidobacterium species.

    PubMed

    Youn, S Y; Seo, J M; Ji, G E

    2008-01-01

    Bifidobacterium species are known for their beneficial effects on health and their wide use as probiotics. Although various polymerase chain reaction (PCR) methods for the identification of Bifidobacterium species have been published, the reliability of these methods remains open to question. In this study, we evaluated 37 previously reported PCR primer sets designed to amplify 16S rDNA, 23S rDNA, intergenic spacer regions, or repetitive DNA sequences of various Bifidobacterium species. Ten of 37 experimental primer sets showed specificity for B. adolescentis, B. angulatum, B. pseudocatenulatum, B. breve, B. bifidum, B. longum, B. longum biovar infantis and B. dentium. The results suggest that published Bifidobacterium primer sets should be re-evaluated for both reproducibility and specificity for the identification of Bifidobacterium species using PCR. Improvement of existing PCR methods will be needed to facilitate identification of other Bifidobacterium strains, such as B. animalis, B. catenulatum, B. thermophilum and B. subtile.

  19. GStream: Improving SNP and CNV Coverage on Genome-Wide Association Studies

    PubMed Central

    Alonso, Arnald; Marsal, Sara; Tortosa, Raül; Canela-Xandri, Oriol; Julià, Antonio

    2013-01-01

    We present GStream, a method that combines genome-wide SNP and CNV genotyping in the Illumina microarray platform with unprecedented accuracy. This new method outperforms previous well-established SNP genotyping software. More importantly, the CNV calling algorithm of GStream dramatically improves the results obtained by previous state-of-the-art methods and yields an accuracy that is close to that obtained by purely CNV-oriented technologies like Comparative Genomic Hybridization (CGH). We demonstrate the superior performance of GStream using microarray data generated from HapMap samples. Using the reference CNV calls generated by the 1000 Genomes Project (1KGP) and well-known studies on whole genome CNV characterization based either on CGH or genotyping microarray technologies, we show that GStream can increase the number of reliably detected variants up to 25% compared to previously developed methods. Furthermore, the increased genome coverage provided by GStream allows the discovery of CNVs in close linkage disequilibrium with SNPs, previously associated with disease risk in published Genome-Wide Association Studies (GWAS). These results could provide important insights into the biological mechanism underlying the detected disease risk association. With GStream, large-scale GWAS will not only benefit from the combined genotyping of SNPs and CNVs at an unprecedented accuracy, but will also take advantage of the computational efficiency of the method. PMID:23844243

  20. An improved protocol for the preparation of total genomic DNA from isolates of yeast and mould using Whatman FTA filter papers.

    PubMed

    Borman, Andrew M; Fraser, Mark; Linton, Christopher J; Palmer, Michael D; Johnson, Elizabeth M

    2010-06-01

    Here, we present a significantly improved version of our previously published method for the extraction of fungal genomic DNA from pure cultures using Whatman FTA filter paper matrix technology. This modified protocol is extremely rapid, significantly more cost effective than our original method, and importantly, substantially reduces the problem of potential cross-contamination between sequential filters when employing FTA technology.

  1. Integration of Marine Mammal Movement and Behavior into the Effects of Sound on the Marine Environment

    DTIC Science & Technology

    2011-09-30

    capability to emulate the dive and movement behavior of marine mammals provides a significant advantage to modeling environmental impact than do historic...approaches used in Navy environmental assessments (EA) and impact statements (EIS). Many previous methods have been statistical or pseudo-statistical...Siderius. 2011. Comparison of methods used for computing the impact of sound on the marine environment, Marine Environmental Research, 71:342-350. [published

  2. Corpus Callosum Anatomy in Chronically Treated and Stimulant Naive ADHD

    ERIC Educational Resources Information Center

    Schnoebelen, Sarah; Semrud-Clikeman, Margaret; Pliszka, Steven R.

    2010-01-01

    Objective: To determine the effect of chronic stimulant treatment on corpus callosum (CC) size in children with ADHD using volumetric and area measurements. Previously published research indicated possible medication effects on specific areas of the CC. Method: Measurements of the CC from anatomical MRIs were obtained from children aged 9-16 in…

  3. Early Behavioral Intervention Is Associated with Normalized Brain Activity in Young Children with Autism

    ERIC Educational Resources Information Center

    Dawson, Geraldine; Jones, Emily J. H.; Merkle, Kristen; Venema, Kaitlin; Lowy, Rachel; Faja, Susan; Kamara, Dana; Murias, Michael; Greenson, Jessica; Winter, Jamie; Smith, Milani; Rogers, Sally J.; Webb, Sara J.

    2012-01-01

    Objective: A previously published randomized clinical trial indicated that a developmental behavioral intervention, the Early Start Denver Model (ESDM), resulted in gains in IQ, language, and adaptive behavior of children with autism spectrum disorder. This report describes a secondary outcome measurement from this trial, EEG activity. Method:…

  4. Cantharidin, a protein phosphatase inhibitor, strongly upregulates detoxification enzymes in the Arabidopsis proteome

    USDA-ARS?s Scientific Manuscript database

    Cantharidin is a potent natural herbicide. This work was conducted to probe its mode of action. We previously published its effect on transcription of plant genes (mRNA production) with transcriptomic methods. This paper follows up and looks at cantharidin's effects on the translation of mRNA using proteom...

  5. Do Premarital Education Programs Really Work? A Meta-Analytic Study

    ERIC Educational Resources Information Center

    Fawcett, Elizabeth B.; Hawkins, Alan J.; Blanchard, Victoria L.; Carroll, Jason S.

    2010-01-01

    Previous studies (J. S. Carroll & W. J. Doherty, 2003) have asserted that premarital education programs have a positive effect on program participants. Using meta-analytic methods of current best practices to look across the entire body of published and unpublished evaluation research on premarital education, we found a more complex pattern of…

  6. Implementing Cognitive Behavioral Therapy for Chronic Fatigue Syndrome in a Mental Health Center: A Benchmarking Evaluation

    ERIC Educational Resources Information Center

    Scheeres, Korine; Wensing, Michel; Knoop, Hans; Bleijenberg, Gijs

    2008-01-01

    Objective: This study evaluated the success of implementing cognitive behavioral therapy (CBT) for chronic fatigue syndrome (CFS) in a representative clinical practice setting and compared the patient outcomes with those of previously published randomized controlled trials (RCTs) of CBT for CFS. Method: The implementation interventions were the…

  7. The Empirical Review of Meta-Analysis Published in Korea

    ERIC Educational Resources Information Center

    Park, Sunyoung; Hong, Sehee

    2016-01-01

    Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…

  8. The role of moisture content in above-ground leaching

    Treesearch

    Stan Lebow; Patricia Lebow

    2007-01-01

    This paper reviews previous reports on the moisture content of wood exposed above ground and compares those values to moisture contents obtained using simulated rainfall and immersion methods. Laboratory leaching trials with CCA-treated specimens were also conducted and the results compared to published values for leaching of CCA-treated specimens exposed above ground...

  9. A Simple and Novel Method to Attain Retrograde Ureteral Access after Previous Cohen Cross-Trigonal Ureteral Reimplantation

    PubMed Central

    Adam, Ahmed

    2017-01-01

    Objective: To describe a simple, novel method to achieve ureteric access in the Cohen crossed reimplanted ureter, which will allow retrograde working access via the conventional transurethral method. Materials and Methods: Under cystoscopic vision, suprapubic needle puncture was performed. The needle was directed (bevel facing) towards the desired ureteric orifice (UO). A guidewire (with a floppy-tip) was then inserted into the suprapubic needle passing into the bladder, and then easily passed into the crossed-reimplanted UO. The distal end of the guidewire was then removed through the urethra with cystoscopic grasping forceps. The straightened ureter then easily facilitated ureteroscopy access, retrograde pyelogram studies, and JJ stent insertion in a conventional transurethral method. Results: The UO and ureter were aligned in a more conventional orthotopic course, to allow for conventional transurethral working access. Conclusion: A novel method to access the Cohen crossed reimplanted ureter was described. All previously published methods of accessing the crossed ureter were critically appraised. PMID:29463976

  10. QFASAR: Quantitative fatty acid signature analysis with R

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
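
    The qfasar package itself is written in R; as a rough Python illustration of the estimation step QFASA performs, the sketch below finds the prey mixture whose combined fatty acid signature is closest to the predator's signature, with diet proportions constrained to sum to one. The signatures and the Kullback-Leibler-type distance used here are illustrative assumptions, not the package's calibration.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical fatty acid signatures (proportions summing to 1) for three prey
      # types over five fatty acids, plus one predator signature; values are invented.
      prey = np.array([
          [0.40, 0.25, 0.15, 0.10, 0.10],
          [0.10, 0.40, 0.30, 0.10, 0.10],
          [0.20, 0.10, 0.10, 0.40, 0.20],
      ])
      predator = np.array([0.22, 0.27, 0.20, 0.17, 0.14])

      def distance(diet):
          mix = diet @ prey                      # predicted predator signature
          eps = 1e-9
          return np.sum(predator * np.log((predator + eps) / (mix + eps)))

      n_prey = prey.shape[0]
      res = minimize(
          distance,
          x0=np.full(n_prey, 1.0 / n_prey),
          bounds=[(0.0, 1.0)] * n_prey,
          constraints=[{"type": "eq", "fun": lambda d: d.sum() - 1.0}],
          method="SLSQP",
      )
      print("estimated diet proportions:", np.round(res.x, 3))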

  11. Discovery of a z = 7.452 High Equivalent Width Lyα Emitter from the Hubble Space Telescope  Faint Infrared Grism Survey

    NASA Astrophysics Data System (ADS)

    Larson, Rebecca L.; Finkelstein, Steven L.; Pirzkal, Norbert; Ryan, Russell; Tilvi, Vithal; Malhotra, Sangeeta; Rhoads, James; Finkelstein, Keely; Jung, Intae; Christensen, Lise; Cimatti, Andrea; Ferreras, Ignacio; Grogin, Norman; Koekemoer, Anton M.; Hathi, Nimish; O’Connell, Robert; Östlin, Göran; Pasquali, Anna; Pharo, John; Rothberg, Barry; Windhorst, Rogier A.; The FIGS Team

    2018-05-01

    We present the results of an unbiased search for Lyα emission from continuum-selected 5.6 < z < 8.7 galaxies. Our data set consists of 160 orbits of G102 slitless grism spectroscopy obtained with the Hubble Space Telescope (HST)/WFC3 as part of the Faint Infrared Grism Survey (FIGS; PI: Malhotra), which obtains deep slitless spectra of all sources in four fields, and was designed to minimize contamination in observations of previously identified high-redshift galaxy candidates. The FIGS data can potentially spectroscopically confirm the redshifts of galaxies, and as Lyα emission is resonantly scattered by neutral gas, FIGS can also constrain the ionization state of the intergalactic medium during the epoch of reionization. These data have sufficient depth to detect Lyα emission in this epoch, as Tilvi et al. have published the FIGS detection of previously known Lyα emission at z = 7.51. The FIGS data use five separate roll angles of HST to mitigate the contamination by nearby galaxies. We created a method that accounts for and removes the contamination from surrounding galaxies and also removes any dispersed continuum light from each individual spectrum. We searched for significant (>4σ) emission lines using two different automated detection methods, free of any visual inspection biases. Applying these methods to photometrically selected high-redshift candidates between 5.6 < z < 8.7, we find two emission lines, one previously published by Tilvi et al. (2016) and a new line at 1.028 μm, which we identify as Lyα at z = 7.452 ± 0.003. This newly spectroscopically confirmed galaxy has the highest Lyα rest-frame equivalent width (EWLyα) yet published at z > 7 (140.3 ± 19.0 Å).
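
    As a schematic of the automated line search described above, the basic operation is to flag any small group of spectral channels whose summed flux exceeds its propagated uncertainty by more than 4 sigma. The sketch below is a simplified, hypothetical detector of that kind; it is not the FIGS pipeline, and the spectrum is simulated.

      import numpy as np

      def find_emission_lines(wavelength, flux, err, nsigma=4.0, width=3):
          """Flag channels where the summed flux over a small window exceeds
          nsigma times the propagated noise in that window."""
          half = width // 2
          candidates = []
          for i in range(half, len(flux) - half):
              window = slice(i - half, i + half + 1)
              signal = flux[window].sum()
              noise = np.sqrt(np.sum(err[window] ** 2))
              if noise > 0 and signal / noise > nsigma:
                  candidates.append((wavelength[i], signal / noise))
          return candidates

      # Simulated grism-like spectrum with one injected line near 1.028 microns.
      wl = np.linspace(0.9, 1.15, 500)
      rng = np.random.default_rng(0)
      noise_level = 0.02
      flux = rng.normal(0.0, noise_level, wl.size)
      flux += 0.15 * np.exp(-0.5 * ((wl - 1.028) / 0.002) ** 2)
      err = np.full(wl.size, noise_level)

      print(find_emission_lines(wl, flux, err)[:3])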

  12. Characterization of primary standards for use in the HPLC analysis of the procyanidin content of cocoa and chocolate containing products.

    PubMed

    Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A

    2009-10-15

    This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers DP = 2 to decamers DP = 10 for the determination of procyanidins from cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined and these data were used to determine relative response factors. These response factors were compared with other response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicates divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.

  13. A wavelet method for modeling and despiking motion artifacts from resting-state fMRI time series.

    PubMed

    Patel, Ameera X; Kundu, Prantik; Rubinov, Mikail; Jones, P Simon; Vértes, Petra E; Ersche, Karen D; Suckling, John; Bullmore, Edward T

    2014-07-15

    The impact of in-scanner head movement on functional magnetic resonance imaging (fMRI) signals has long been established as undesirable. These effects have been traditionally corrected by methods such as linear regression of head movement parameters. However, a number of recent independent studies have demonstrated that these techniques are insufficient to remove motion confounds, and that even small movements can spuriously bias estimates of functional connectivity. Here we propose a new data-driven, spatially-adaptive, wavelet-based method for identifying, modeling, and removing non-stationary events in fMRI time series, caused by head movement, without the need for data scrubbing. This method involves the addition of just one extra step, the Wavelet Despike, in standard pre-processing pipelines. With this method, we demonstrate robust removal of a range of different motion artifacts and motion-related biases including distance-dependent connectivity artifacts, at a group and single-subject level, using a range of previously published and new diagnostic measures. The Wavelet Despike is able to accommodate the substantial spatial and temporal heterogeneity of motion artifacts and can consequently remove a range of high and low frequency artifacts from fMRI time series, that may be linearly or non-linearly related to physical movements. Our methods are demonstrated by the analysis of three cohorts of resting-state fMRI data, including two high-motion datasets: a previously published dataset on children (N=22) and a new dataset on adults with stimulant drug dependence (N=40). We conclude that there is a real risk of motion-related bias in connectivity analysis of fMRI data, but that this risk is generally manageable, by effective time series denoising strategies designed to attenuate synchronized signal transients induced by abrupt head movements. The Wavelet Despiking software described in this article is freely available for download at www.brainwavelet.org. Copyright © 2014. Published by Elsevier Inc.
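
    The Wavelet Despike itself is distributed at www.brainwavelet.org; the sketch below only illustrates the general family of techniques it belongs to, namely limiting outlying wavelet coefficients of a time series and reconstructing the signal, here with PyWavelets. The wavelet, threshold rule, and data are assumptions made for illustration and do not reproduce the published algorithm.

      import numpy as np
      import pywt

      def wavelet_despike(ts, wavelet="db4", level=4, k=4.0):
          """Clip detail coefficients that deviate from each level's median by more
          than k robust standard deviations, then reconstruct the time series."""
          coeffs = pywt.wavedec(ts, wavelet, level=level, mode="periodization")
          cleaned = [coeffs[0]]                          # keep approximation coefficients
          for detail in coeffs[1:]:
              mad = np.median(np.abs(detail - np.median(detail))) / 0.6745
              thresh = k * mad if mad > 0 else np.inf
              cleaned.append(np.clip(detail, -thresh, thresh))
          return pywt.waverec(cleaned, wavelet, mode="periodization")[: len(ts)]

      # Hypothetical fMRI-like time series with two abrupt motion-induced spikes.
      rng = np.random.default_rng(2)
      ts = rng.normal(0, 1, 256)
      ts[[60, 180]] += 12.0
      despiked = wavelet_despike(ts)
      print(np.abs(ts)[[60, 180]], "->", np.abs(despiked)[[60, 180]])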

  14. Small-volume potentiometric titrations: EPR investigations of Fe-S cluster N2 in mitochondrial complex I.

    PubMed

    Wright, John J; Salvadori, Enrico; Bridges, Hannah R; Hirst, Judy; Roessler, Maxie M

    2016-09-01

    EPR-based potentiometric titrations are a well-established method for determining the reduction potentials of cofactors in large and complex proteins with at least one EPR-active state. However, such titrations require large amounts of protein. Here, we report a new method that requires an order of magnitude less protein than previously described methods, and that provides EPR samples suitable for measurements at both X- and Q-band microwave frequencies. We demonstrate our method by determining the reduction potential of the terminal [4Fe-4S] cluster (N2) in the intramolecular electron-transfer relay in mammalian respiratory complex I. The value determined by our method, Em7 = -158 mV, is precise, reproducible, and consistent with previously reported values. Our small-volume potentiometric titration method will facilitate detailed investigations of EPR-active centres in non-abundant and refractory proteins that can only be prepared in small quantities. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
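
    The reduction potential in a titration of this kind is typically obtained by fitting the EPR signal intensity against applied potential to a one-electron Nernst curve. The sketch below shows that fitting step on invented data chosen to sit near the reported value; it illustrates only the analysis, not the authors' experimental procedure.

      import numpy as np
      from scipy.optimize import curve_fit

      F_OVER_RT = 96485.0 / (8.314 * 298.0)      # 1/V at 25 degrees C

      def nernst_one_electron(E, Em, amplitude):
          """Fraction of the EPR-visible (reduced) species versus applied potential."""
          return amplitude / (1.0 + np.exp((E - Em) * F_OVER_RT))

      # Hypothetical titration: applied potentials (V vs. SHE) and normalized signal.
      E = np.array([-0.30, -0.25, -0.20, -0.17, -0.15, -0.12, -0.08, -0.02])
      signal = np.array([0.99, 0.97, 0.84, 0.62, 0.42, 0.18, 0.05, 0.01])

      popt, _ = curve_fit(nernst_one_electron, E, signal, p0=[-0.15, 1.0])
      print(f"fitted Em ~ {popt[0] * 1000:.0f} mV")   # near -158 mV for this toy data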

  15. Gingival displacement: Survey results of dentists' practice procedures.

    PubMed

    Ahmed, Sumitha N; Donovan, Terry E

    2015-07-01

    A high percentage of fixed prosthodontic restorations require a subgingival margin placement, which requires the practice of gingival displacement or a deflection procedure to replicate the margins in impression. The purpose of this study was to learn the different gingival displacement techniques that are currently used by dentists in their practice and to compare the current concepts of gingival displacement with previously published articles. A survey of questions pertaining to gingival deflection methods was distributed as part of continuing education (CE) course material to dentists attending CE meetings in 7 states in the U.S. and 1 Canadian province. Question topics included initial patient assessment procedures, gingival displacement methods, dentist's knowledge and assessment of systemic manifestations, and brand names of materials used. Ninety-four percent of the participants were general practitioners with 24.11 ± 12.5 years of experience. Ninety-two percent used gingival displacement cords, while 20.2% used a soft tissue laser and 32% used electrosurgery as an adjunct. Sixty percent of the dentists used displacement cords impregnated with a medicament. Of the preimpregnated cords, 29% were impregnated with epinephrine, 13% with aluminum chloride, and 18% with aluminum potassium sulfate. The study showed a steady decrease compared with results of previously published articles in the use of epinephrine as a gingival deflection medicament. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  16. Child Malnutrition in Pakistan: Evidence from Literature

    PubMed Central

    Asim, Muhammad; Nawaz, Yasir

    2018-01-01

    Pakistan has one of the highest prevalences of child malnutrition compared with other developing countries. This narrative review was conducted to examine the published empirical literature on children's nutritional status in Pakistan. The objectives of this review were to describe the methodological approaches used in previous studies, to assess the overall situation of childhood malnutrition, and to identify the areas that have not yet been studied. This study was carried out to collect and synthesize the relevant data from previously published papers through different scholarly database search engines. The most relevant and current papers published between 2000 and 2016 were included in this study. Research papers containing data related to child malnutrition in Pakistan were assessed. A total of 28 articles was reviewed, and broadly similar methodologies were used in all of them. Most of the researchers conducted cross-sectional quantitative and descriptive studies using structured interviews to identify the causes of child malnutrition. Only one study used a mixed-methods technique for acquiring data from the respondents. For the assessment of malnutrition among children, 20 of the 28 papers used the World Health Organization (WHO) weight-for-age, age-for-height, and height-for-weight Z-score methods. Early marriages, large family size, high fertility rates with a lack of birth spacing, low income, and the lack of breastfeeding and exclusive breastfeeding were found to be the themes that repeatedly emerged in the reviewed literature. There is a dire need for qualitative and mixed-methods research to understand and gain insight into the underlying factors of child malnutrition in Pakistan. PMID:29734703

  17. A case-crossover study of transient risk factors influence on occupational injuries: a study protocol based on a review of previous studies.

    PubMed

    Oesterlund, Anna H; Lander, Flemming; Lauritsen, Jens

    2016-10-01

    The occupational injury incidence rate remains relatively high in the European Union. The case-crossover study gives a unique opportunity to study transient risk factors that normally would be very difficult to approach. Studies like this have been carried out in both America and Asia, but so far no relevant research has been conducted in Europe. Case-crossover studies of occupational injuries were collected from PubMed and Embase and read through. Previous experiences concerning method, exposure and outcome, time-related measurements and construction of the questionnaire were taken into account in the preparation of a pilot study. Consequently, experiences from the pilot study were used to design the study protocol. Approximately 2000 patients with an occupational injury will be recruited from the emergency departments in Herning and Odense, Denmark. A standardised questionnaire will be used to collect basic demographic data and information on eight transient risk factors. Based on previous studies and knowledge of occupational injuries, the transient risk factors we chose to examine were: time pressure, performing a task with a different method/using unaccustomed technique, change in working surroundings, using a phone, disagreement, feeling ill, being distracted and using malfunctioning machinery/tools or work material. Exposure time 'just before the injury' will be compared with two control periods, 'previous day at the same time of the injury' (pair match) and the previous work week (usual frequency). This study protocol describes a unique opportunity to calculate the effect of transient risk factors on occupational injuries in a European setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  18. Reporting of methods was better in the Clinical Trials Registry-India than in Indian journal publications.

    PubMed

    Tharyan, Prathap; George, Aneesh Thomas; Kirubakaran, Richard; Barnabas, Jabez Paul

    2013-01-01

    We sought to evaluate if editorial policies and the reporting quality of randomized controlled trials (RCTs) had improved since our 2004-05 survey of 151 RCTs in 65 Indian journals, and to compare reporting quality of protocols in the Clinical Trials Registry-India (CTRI). An observational study of endorsement of Consolidated Standards for the Reporting of Trials (CONSORT) and International Committee of Medical Journal Editors (ICMJE) requirements in the instructions to authors in Indian journals, and compliance with selected requirements in all RCTs published during 2007-08 vs. our previous survey and between all RCT protocols in the CTRI on August 31, 2010 and published RCTs from both surveys. Journal policies endorsing the CONSORT statement (22/67, 33%) and ICMJE requirements (35/67, 52%) remained suboptimal, and only 4 of 13 CONSORT items were reported in more than 50% of the 145 RCTs assessed. Reporting of ethical issues had improved significantly, and that of methods addressing internal validity had not improved. Adequate methods were reported significantly more frequently in 768 protocols in the CTRI, than in the 296 published trials. The CTRI template facilitates the reporting of valid methods in registered trial protocols. The suboptimal compliance with CONSORT and ICMJE requirements in RCTs published in Indian journals reduces credibility in the reliability of their results. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Writing Integrative Reviews of the Literature: Methods and Purposes

    ERIC Educational Resources Information Center

    Torraco, Richard J.

    2016-01-01

    This article discusses the integrative review of the literature as a distinctive form of research that uses existing literature to create new knowledge. As an expansion and update of a previously published article on this topic, it acknowledges the growth and appeal of this form of research to scholars, it identifies the main components of the…

  20. Nonhuman Primates Prefer Slow Tempos but Dislike Music Overall

    ERIC Educational Resources Information Center

    McDermott, Josh; Hauser, Marc D.

    2007-01-01

    Human adults generally find fast tempos more arousing than slow tempos, with tempo frequently manipulated in music to alter tension and emotion. We used a previously published method [McDermott, J., & Hauser, M. (2004). Are consonant intervals music to their ears? Spontaneous acoustic preferences in a nonhuman primate. Cognition, 94(2), B11-B21]…

  1. A Simplified Digestion Protocol for the Analysis of Hg in Fish by Cold Vapor Atomic Absorption Spectroscopy

    ERIC Educational Resources Information Center

    Kristian, Kathleen E.; Friedbauer, Scott; Kabashi, Donika; Ferencz, Kristen M.; Barajas, Jennifer C.; O'Brien, Kelly

    2015-01-01

    Analysis of mercury in fish is an interesting problem with the potential to motivate students in chemistry laboratory courses. The recommended method for mercury analysis in fish is cold vapor atomic absorption spectroscopy (CVAAS), which requires homogeneous analyte solutions, typically prepared by acid digestion. Previously published digestion…

  2. Some Elements of American Indian Pedagogy from an Anishinaabe Perspective

    ERIC Educational Resources Information Center

    Gross, Lawrence W.

    2010-01-01

    In 2005 the author published an article discussing the teaching method teachers used for an introduction to American Indian studies course at Iowa State University. In his previous piece, the author did not delineate the elements that go into an American Indian pedagogy. In this article, the author discusses some elements of American Indian…

  3. High energy PIXE: A tool to characterize multi-layer thick samples

    NASA Astrophysics Data System (ADS)

    Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.

    2018-02-01

    High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility of performing quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumptions about their composition. In this work, an in-depth study of the parameters involved in the previously published method is proposed. Its extension to more complex samples with a repeated layer is also presented. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. The performance and limits of this method are presented and discussed.

  4. Determination of palladium, platinum and rhodium in geologic materials by fire assay and emission spectrography

    USGS Publications Warehouse

    Hapfty, J.; Riley, L.B.

    1968-01-01

    A method is described for the determination of palladium down to 4 ppb (parts per billion, 10^9), platinum down to 10 ppb and rhodium down to 5 ppb in 15 g of sample. Fire-assay techniques are used to preconcentrate the platinum metals into a gold bead, then the bead is dissolved in aqua regia and diluted to volume with 1M hydrochloric acid. The solution is analysed by optical emission spectrography of the residue from 200 μl of it evaporated on a pair of flat-top graphite electrodes. This method requires much less sample handling than most published methods for these elements. Data are presented for G-1, W-1, and six new standard rocks of the U.S. Geological Survey. The values for palladium in W-1 are in reasonable agreement with previously published data. © 1968.

  5. 2011 update to the Society of Thoracic Surgeons and the Society of Cardiovascular Anesthesiologists blood conservation clinical practice guidelines.

    PubMed

    Ferraris, Victor A; Brown, Jeremiah R; Despotis, George J; Hammon, John W; Reece, T Brett; Saha, Sibu P; Song, Howard K; Clough, Ellen R; Shore-Lesserson, Linda J; Goodnough, Lawrence T; Mazer, C David; Shander, Aryeh; Stafford-Smith, Mark; Waters, Jonathan; Baker, Robert A; Dickinson, Timothy A; FitzGerald, Daniel J; Likosky, Donald S; Shann, Kenneth G

    2011-03-01

    Practice guidelines reflect published literature. Because of the ever changing literature base, it is necessary to update and revise guideline recommendations from time to time. The Society of Thoracic Surgeons recommends review and possible update of previously published guidelines at least every three years. This summary is an update of the blood conservation guideline published in 2007. The search methods used in the current version differ compared to the previously published guideline. Literature searches were conducted using standardized MeSH terms from the National Library of Medicine PUBMED database list of search terms. The following terms comprised the standard baseline search terms for all topics and were connected with the logical 'OR' connector--Extracorporeal circulation (MeSH number E04.292), cardiovascular surgical procedures (MeSH number E04.100), and vascular diseases (MeSH number C14.907). Use of these broad search terms allowed specific topics to be added to the search with the logical 'AND' connector. In this 2011 guideline update, areas of major revision include: 1) management of dual anti-platelet therapy before operation, 2) use of drugs that augment red blood cell volume or limit blood loss, 3) use of blood derivatives including fresh frozen plasma, Factor XIII, leukoreduced red blood cells, platelet plasmapheresis, recombinant Factor VII, antithrombin III, and Factor IX concentrates, 4) changes in management of blood salvage, 5) use of minimally invasive procedures to limit perioperative bleeding and blood transfusion, 6) recommendations for blood conservation related to extracorporeal membrane oxygenation and cardiopulmonary perfusion, 7) use of topical hemostatic agents, and 8) new insights into the value of team interventions in blood management. Much has changed since the previously published 2007 STS blood management guidelines and this document contains new and revised recommendations. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Determination of the molar extinction coefficient for the ferric reducing/antioxidant power assay.

    PubMed

    Hayes, William A; Mills, Daniel S; Neville, Rachel F; Kiddie, Jenna; Collins, Lisa M

    2011-09-15

    The FRAP reagent contains 2,4,6-tris(2-pyridyl)-s-triazine, which forms a blue-violet complex ion in the presence of ferrous ions. Although the FRAP (ferric reducing/antioxidant power) assay is popular and has been in use for many years, the correct molar extinction coefficient of this complex ion under FRAP assay conditions has never been published, casting doubt on the validity of previous calibrations. A previously reported value of 19,800 is an underestimate. We determined that the molar extinction coefficient was 21,140. The value of the molar extinction coefficient was also shown to depend on the type of assay and was found to be 22,230 under iron assay conditions, in good agreement with published data. Redox titration indicated that the ferrous sulfate heptahydrate calibrator recommended by Benzie and Strain, the FRAP assay inventors, is prone to efflorescence and, therefore, is unreliable. Ferrous ammonium sulfate hexahydrate in dilute sulfuric acid was a more stable alternative. Few authors publish their calibration data, and this makes comparative analyses impossible. A critical examination of the limited number of examples of calibration data in the published literature reveals only that Benzie and Strain obtained a satisfactory calibration using their method. Copyright © 2011 Elsevier Inc. All rights reserved.
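
    The underlying calculation here is the Beer-Lambert law: with a 1 cm path length, the molar extinction coefficient is the slope of absorbance against the concentration of the coloured complex. A minimal sketch of that regression follows; the absorbance values are invented for illustration and chosen to land near the coefficient reported above, and they are not the paper's calibration data.

      import numpy as np

      # Beer-Lambert: A = epsilon * c * l, with path length l = 1 cm.
      # Hypothetical calibration: complex concentrations (mol/L) and absorbances.
      conc = np.array([5e-6, 1e-5, 2e-5, 4e-5, 8e-5])
      absorbance = np.array([0.106, 0.211, 0.423, 0.846, 1.691])

      slope, intercept = np.polyfit(conc, absorbance, 1)
      print(f"molar extinction coefficient ~ {slope:.0f} L mol^-1 cm^-1")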

  7. Extending the solvent-free MALDI sample preparation method.

    PubMed

    Hanton, Scott D; Parees, David M

    2005-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is an important technique to characterize many different materials, including synthetic polymers. MALDI mass spectral data can be used to determine the polymer average molecular weights, repeat units, and end groups. One of the key issues in traditional MALDI sample preparation is making good solutions of the analyte and the matrix. Solvent-free sample preparation methods have been developed to address these issues. Previous results of solvent-free or dry prepared samples show some advantages over traditional wet sample preparation methods. Although the results of the published solvent-free sample preparation methods produced excellent mass spectra, we found the method to be very time-consuming, with significant tool cleaning, which presents a significant possibility of cross contamination. To address these issues, we developed an extension of the solvent-free method that replaces the mortar and pestle grinding with ball milling the sample in a glass vial with two small steel balls. This new method generates mass spectra with equal quality of the previous methods, but has significant advantages in productivity, eliminates cross contamination, and is applicable to liquid and soft or waxy analytes.

  8. Methods and formulas for calculating the strength of plate and shell constructions as used in airplane design

    NASA Technical Reports Server (NTRS)

    Heck, O S; Ebner, H

    1936-01-01

    This report is a compilation of previously published articles on formulas and methods of calculation for the determination of the strength and stability of plate and shell construction as employed in airplane design. In particular, it treats the problem of isotropic, orthotropic, and stiffened rectangular plates, thin curved panels, and circular cylinders under various loading conditions. The purpose of appending the pertinent literature references following the subjects discussed was to facilitate a comprehensive study of the treated problems.

  9. Practical uncertainty reduction and quantification in shock physics measurements

    DOE PAGES

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.

  10. Organophosphorus pesticide poisonings in humans: determination of residues and metabolites in tissues and urine.

    PubMed

    Lores, E M; Bradway, D E; Moseman, R F

    1978-01-01

    The analyses of four organophosphorus pesticide poisoning cases, three of which resulted in death, are reported. The case histories of the subjects, along with the analysis of tissues, urine, and blood for the levels of pesticides and metabolites are given. The pesticides involved include dicrotophos, chlorpyrifos, malathion, and parathion. The methods of analysis were adapted from previously published methods that provide a very rapid means of identification of organophosphorus pesticides in the tissues or in the blood of poisoned patients.

  11. Morphology vs morphokinetics: a retrospective comparison of inter-observer and intra-observer agreement between embryologists on blastocysts with known implantation outcome.

    PubMed

    Adolfsson, Emma; Andershed, Anna Nowosad

    2018-06-18

    Our primary aim was to compare morphology and morphokinetics with respect to inter- and intra-observer agreement for blastocysts with known implantation outcome. Our secondary aim was to validate the morphokinetic parameters' ability to predict pregnancy using a previously published selection algorithm, and to compare this to standard morphology assessments. Two embryologists made independent blinded annotations on two occasions using time-lapse images and morphology evaluations using the Gardner Schoolcraft criteria of 99 blastocysts with known implantation outcome. Inter- and intra-observer agreement was calculated and compared using the two methods. The embryos were grouped based on their morphological score, and on their morphokinetic class using a previously published selection algorithm. The implantation rates for each group were calculated and compared. There was moderate agreement for morphology, with agreement on the same embryo score in 55 of 99 cases. The highest agreement rate was found for expansion grade, followed by trophectoderm and inner cell mass. Correlation with pregnancy was inconclusive. For morphokinetics, almost perfect agreement was found for early and late embryo development events, and strong agreement for day-2 and day-3 events. When applying the selection algorithm, the embryo distributions were uneven, and correlation with pregnancy was inconclusive. Time-lapse annotation is consistent and accurate, but our external validation of a previously published selection algorithm was unsuccessful.

  12. A method to investigate the diffusion properties of nuclear calcium.

    PubMed

    Queisser, Gillian; Wittum, Gabriel

    2011-10-01

    Modeling biophysical processes in general requires knowledge about underlying biological parameters. The quality of simulation results is strongly influenced by the accuracy of these parameters, hence the identification of parameter values that the model includes is a major part of simulating biophysical processes. In many cases, secondary data can be gathered by experimental setups, which are exploitable by mathematical inverse modeling techniques. Here we describe a method for parameter identification of diffusion properties of calcium in the nuclei of rat hippocampal neurons. The method is based on a Gauss-Newton method for solving a least-squares minimization problem and was formulated in such a way that it is ideally implementable in the simulation platform uG. Making use of independently published space- and time-dependent calcium imaging data, generated from laser-assisted calcium uncaging experiments, here we could identify the diffusion properties of nuclear calcium and were able to validate a previously published model that describes nuclear calcium dynamics as a diffusion process.
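
    As background on the estimation machinery described above, a Gauss-Newton iteration for a least-squares fit repeatedly linearizes the model around the current parameter estimate and solves the normal equations for an update. The sketch below applies a damped version of that iteration to a toy one-parameter problem, recovering a diffusion coefficient D from a noisy concentration profile generated by a point-release diffusion model; the model, the data, and the step control are illustrative assumptions and not the uG implementation used by the authors.

      import numpy as np

      def model(D, r, t):
          """Concentration profile of an instantaneous point release (1D diffusion)."""
          return np.exp(-r ** 2 / (4.0 * D * t)) / np.sqrt(4.0 * np.pi * D * t)

      # Hypothetical "imaging" data: one profile at t = 2 with noise, true D = 0.3.
      r = np.linspace(0.1, 5.0, 40)
      t = 2.0
      rng = np.random.default_rng(3)
      data = model(0.3, r, t) + rng.normal(0, 0.002, r.size)

      D = 0.5                                    # initial guess
      for _ in range(50):
          resid = data - model(D, r, t)
          h = 1e-6 * max(abs(D), 1.0)
          Jf = (model(D + h, r, t) - model(D - h, r, t)) / (2 * h)   # d model / d D
          step = np.sum(Jf * resid) / np.sum(Jf * Jf)                # Gauss-Newton update
          # crude damping: halve the step while it fails to reduce the misfit
          while abs(step) > 1e-12 and np.sum((data - model(max(D + step, 1e-4), r, t)) ** 2) > np.sum(resid ** 2):
              step *= 0.5
          D = max(D + step, 1e-4)                # keep D positive (simple safeguard)
          if abs(step) < 1e-10:
              break

      print(f"identified D ~ {D:.3f} (true value 0.3)")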

  13. Pancreaticoduodenectomy following gastrectomy reconstructed with Billroth II or Roux-en-Y method: Case series and literature review.

    PubMed

    Kawamoto, Yusuke; Ome, Yusuke; Kouda, Yusuke; Saga, Kennichi; Park, Taebum; Kawamoto, Kazuyuki

    2017-01-01

    The ideal reconstruction method for pancreaticoduodenectomy following a gastrectomy with Billroth II or Roux-en-Y reconstruction is unclear. We reviewed a series of seven pancreaticoduodenectomies performed after gastrectomy with the Billroth II or Roux-en-Y method. While preserving the existing gastrojejunostomy or esophagojejunostomy, pancreaticojejunostomy and hepaticojejunostomy were performed by the Roux-en-Y method using a new Roux limb in all cases. Four patients experienced postoperative complications, although the specific complications varied. A review of the literature revealed 13 cases of pancreaticoduodenectomy following gastrectomy with Billroth II or Roux-en-Y reconstruction. Three patients out of six (50%) in whom the past afferent limb was used for the reconstruction of the pancreaticojejunostomy and hepaticojejunostomy experienced afferent loop syndrome, while 14 previous and current patients in whom a new jejeunal limb was used did not experience this complication. The Roux-en-Y method, using the distal intestine of previous gastrojejunostomy or jejunojejunostomy as a new jejunal limb for pancreaticojejunostomy and hepaticojejunostomy, may be a better reconstruction method to avoid the complication of afferent loop syndrome after previous gastrectomy with Billroth II or Roux-en-Y reconstruction if the afferent limb is less than 40cm. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Recent meta-analyses neglect previous systematic reviews and meta-analyses about the same topic: a systematic examination.

    PubMed

    Helfer, Bartosz; Prosser, Aaron; Samara, Myrto T; Geddes, John R; Cipriani, Andrea; Davis, John M; Mavridis, Dimitris; Salanti, Georgia; Leucht, Stefan

    2015-04-14

    As the number of systematic reviews is growing rapidly, we systematically investigate whether meta-analyses published in leading medical journals present an outline of available evidence by referring to previous meta-analyses and systematic reviews. We searched PubMed for recent meta-analyses of pharmacological treatments published in high impact factor journals. Previous systematic reviews and meta-analyses were identified with electronic searches of keywords and by searching reference sections. We analyzed the number of meta-analyses and systematic reviews that were cited, described and discussed in each recent meta-analysis. Moreover, we investigated publication characteristics that potentially influence the referencing practices. We identified 52 recent meta-analyses and 242 previous meta-analyses on the same topics. Of these, 66% of identified previous meta-analyses were cited, 36% described, and only 20% discussed by recent meta-analyses. The probability of citing a previous meta-analysis was positively associated with its publication in a journal with a higher impact factor (odds ratio, 1.49; 95% confidence interval, 1.06 to 2.10) and more recent publication year (odds ratio, 1.19; 95% confidence interval 1.03 to 1.37). Additionally, the probability of a previous study being described by the recent meta-analysis was inversely associated with the concordance of results (odds ratio, 0.38; 95% confidence interval, 0.17 to 0.88), and the probability of being discussed was increased for previous studies that employed meta-analytic methods (odds ratio, 32.36; 95% confidence interval, 2.00 to 522.85). Meta-analyses on pharmacological treatments do not consistently refer to and discuss findings of previous meta-analyses on the same topic. Such neglect can lead to research waste and be confusing for readers. Journals should make the discussion of related meta-analyses mandatory.

  15. A voxel-based approach to gray matter asymmetries.

    PubMed

    Luders, E; Gaser, C; Jancke, L; Schlaug, G

    2004-06-01

    Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.

  16. The National Center on Indigenous Hawaiian Behavioral Health Study of Prevalence of Psychiatric Disorders in Native Hawaiian Adolescents

    ERIC Educational Resources Information Center

    Andrade, Naleen N.; Hishinuma, Earl S.; McDermott, John F., Jr.; Johnson, Ronald C.; Goebert, Deborah A.; Makini, George K., Jr.; Nahulu, Linda B.; Yuen, Noelle Y. C.; McArdle, John J.; Bell, Cathy K.; Carlton, Barry S.; Miyamoto, Robin H.; Nishimura, Stephanie T.; Else, Iwalani R. N.; Guerrero, Anthony P. S.; Darmal, Arsalan; Yates, Alayne; Waldron, Jane A.

    2006-01-01

    Objectives: The prevalence rates of disorders among a community-based sample of Hawaiian youths were determined and compared to previously published epidemiological studies. Method: Using a two-phase design, 7,317 adolescents were surveyed (60% participation rate), from which 619 were selected in a modified random sample during the 1992-1993 to…

  17. Determining a carbohydrate profile for Hansenula polymorpha

    NASA Technical Reports Server (NTRS)

    Petersen, G. R.

    1985-01-01

    The determination of the levels of carbohydrates in the yeast Hansenula polymorpha required the development of new analytical procedures. Existing fractionation and analytical methods were adapted to deal with the problems involved with the lysis of whole cells. Using these new procedures, the complete carbohydrate profiles of H. polymorpha and selected mutant strains were determined and shown to correlate favourably with previously published results.

  18. Gray Matter Correlates of Fluid, Crystallized, and Spatial Intelligence: Testing the P-FIT Model

    ERIC Educational Resources Information Center

    Colom, Roberto; Haier, Richard J.; Head, Kevin; Alvarez-Linera, Juan; Quiroga, Maria Angeles; Shih, Pei Chun; Jung, Rex E.

    2009-01-01

    The parieto-frontal integration theory (P-FIT) nominates several areas distributed throughout the brain as relevant for intelligence. This theory was derived from previously published studies using a variety of both imaging methods and tests of cognitive ability. Here we test this theory in a new sample of young healthy adults (N = 100) using a…

  19. Cancer Control Research among Cambodian Americans in Washington

    PubMed Central

    Taylor, Victoria M.; Jackson, J. Carey; Tu, Shin-Ping

    2006-01-01

    Purpose: We summarized previous and ongoing cancer control research among Cambodian immigrants in Washington. Methods: A literature review of articles and published abstracts was conducted. Findings: Cambodian Americans have a limited understanding of Western biomedical concepts, and low levels of cancer screening participation. Conclusions: Culturally appropriate cancer control interventions for Cambodian Americans should be developed, implemented, and evaluated. PMID:11567509

  20. A Comparison of Multivariable Control Design Techniques for a Turbofan Engine Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Watts, Stephen R.

    1995-01-01

    This paper compares two previously published design procedures for two different multivariable control design techniques for application to a linear engine model of a jet engine. The two multivariable control design techniques compared were the Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR) and the H-Infinity synthesis. The two control design techniques were used with specific previously published design procedures to synthesize controls which would provide equivalent closed loop frequency response for the primary control loops while assuring adequate loop decoupling. The resulting controllers were then reduced in order to minimize the programming and data storage requirements for a typical implementation. The reduced order linear controllers designed by each method were combined with the linear model of an advanced turbofan engine and the system performance was evaluated for the continuous linear system. Included in the performance analysis are the resulting frequency and transient responses as well as actuator usage and rate capability for each design method. The controls were also analyzed for robustness with respect to structured uncertainties in the unmodeled system dynamics. The two controls were then compared for performance capability and hardware implementation issues.

  1. Assessment of a condition-specific quality-of-life measure for patients with developmentally absent teeth: validity and reliability testing.

    PubMed

    Akram, A J; Ireland, A J; Postlethwaite, K C; Sandy, J R; Jerreat, A S

    2013-11-01

    This article describes the process of validity and reliability testing of a condition-specific quality-of-life measure for patients with hypodontia presenting for orthodontic treatment. The development of the instrument is described in a previous article. Setting: Royal Devon and Exeter NHS Foundation Trust & Musgrove Park Hospital, Taunton. The child perception questionnaire was used as a standard against which to test criterion validity. The Bland and Altman method was used to check agreement between the two questionnaires. Construct validity was tested using principal component analysis on the four sections of the questionnaire. Test-retest reliability was tested using the intraclass correlation coefficient and the Bland and Altman method. Cronbach's alpha was used to test internal consistency reliability. Overall the questionnaire showed good reliability, criterion and construct validity. This, together with previous evidence of good face and content validity, suggests that the instrument may prove useful in clinical practice and further research. This study has demonstrated that the newly developed condition-specific quality-of-life questionnaire is both valid and reliable for use in young patients with hypodontia. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  2. Enhancing Breast Cancer Recurrence Algorithms Through Selective Use of Medical Record Data

    PubMed Central

    Chubak, Jessica; Johnson, Lisa; Castillo, Adrienne; Weltzien, Erin; Caan, Bette J.

    2016-01-01

    Background: The utility of data-based algorithms in research has been questioned because of errors in identification of cancer recurrences. We adapted previously published breast cancer recurrence algorithms, selectively using medical record (MR) data to improve classification. Methods: We evaluated second breast cancer event (SBCE) and recurrence-specific algorithms previously published by Chubak and colleagues in 1535 women from the Life After Cancer Epidemiology (LACE) and 225 women from the Women’s Health Initiative cohorts and compared classification statistics to published values. We also sought to improve classification with minimal MR examination. We selected pairs of algorithms, one with high sensitivity/high positive predictive value (PPV) and another with high specificity/high PPV, using MR information to resolve discrepancies between algorithms, properly classifying events based on review; we called this “triangulation.” Finally, in LACE, we compared associations between breast cancer survival risk factors and recurrence using MR data, single Chubak algorithms, and triangulation. Results: The SBCE algorithms performed well in identifying SBCE and recurrences. Recurrence-specific algorithms performed more poorly than published except for the high-specificity/high-PPV algorithm, which performed well. The triangulation method (sensitivity = 81.3%, specificity = 99.7%, PPV = 98.1%, NPV = 96.5%) improved recurrence classification over two single algorithms (sensitivity = 57.1%, specificity = 95.5%, PPV = 71.3%, NPV = 91.9%; and sensitivity = 74.6%, specificity = 97.3%, PPV = 84.7%, NPV = 95.1%), with 10.6% MR review. Triangulation performed well in survival risk factor analyses vs analyses using MR-identified recurrences. Conclusions: Use of multiple recurrence algorithms in administrative data, in combination with selective examination of MR data, may improve recurrence data quality and reduce research costs. PMID:26582243
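    The "triangulation" step lends itself to a short sketch. The following Python fragment is a minimal illustration of the logic described above (accept concordant calls from a high-sensitivity and a high-specificity algorithm, send discordant cases to medical-record review, then compute the usual diagnostics); the function names and the binary encoding are assumptions for illustration, not the authors' implementation.

```python
from typing import Callable, Dict, Sequence

def triangulate(high_sens: Sequence[int],
                high_spec: Sequence[int],
                mr_review: Callable[[int], int]) -> Dict[str, object]:
    """Combine a high-sensitivity and a high-specificity algorithm.

    Concordant calls are accepted as-is; discordant calls are resolved by
    medical-record (MR) review, so only a fraction of charts are pulled.
    """
    final, reviewed = [], 0
    for i, (a, b) in enumerate(zip(high_sens, high_spec)):
        if a == b:                      # both algorithms agree
            final.append(a)
        else:                           # disagreement -> selective MR review
            final.append(mr_review(i))
            reviewed += 1
    return {"classification": final,
            "fraction_reviewed": reviewed / len(final)}

def diagnostics(pred: Sequence[int], truth: Sequence[int]) -> Dict[str, float]:
    """Sensitivity, specificity, PPV and NPV from binary vectors."""
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    fp = sum(p and (not t) for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    return {"sens": tp / (tp + fn), "spec": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}
```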

  3. Detection of no-model input-output pairs in closed-loop systems.

    PubMed

    Potts, Alain Segundo; Alvarado, Christiam Segundo Morales; Garcia, Claudio

    2017-11-01

    The detection of no-model input-output (IO) pairs is important because it can speed up the multivariable system identification process, since all pairs with null transfer functions are discarded beforehand, and it can also improve the quality of the identified model, thus improving the performance of model-based controllers. The available literature focuses only on the open-loop case, where there is no controller effect forcing the main diagonal of the transfer matrix towards one and the off-diagonal terms towards zero. In this paper, a previous method for detecting no-model IO pairs in open-loop systems is modified and adapted to perform this task in closed-loop systems. Tests are performed using both the traditional methods and the proposed one to show its effectiveness. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Paediatric reference interval and biological variation trends of thyrotropin (TSH) and free thyroxine (T4) in an Asian population.

    PubMed

    Loh, Tze Ping; Sethi, Sunil Kumar; Metz, Michael Patrick

    2015-08-01

    To describe the reference intervals and biological variation data for thyrotropin (TSH) and free thyroxine (FT4) in a mixed Asian population using an indirect sampling approach and to compare them with published reports. TSH and FT4 of children measured once or twice over a 7-year period (2008-2014) at primary-care and tertiary-care settings were extracted from the laboratory information system. After excluding outliers, age-related reference intervals were derived using the Lambda-Mu-Sigma (LMS) approach, while age-partitioned biological variation data were obtained according to recommendations by Fraser and Harris. Both TSH and FT4 were very high at birth and declined with age. Similarly within-individual and between-individual biological variations were higher for both TSH and FT4 at birth and also declined with age. Our data were broadly similar to previous studies. Significant heterogeneity in study population and methods prohibited direct numerical comparison between this and previously published studies. This study fills two important gaps in our knowledge of paediatric thyroid function by reporting the centile trends (and reference values) in a mixed Asian population, as well as providing age-partitioned biological variation data. The variation in published reference intervals highlights the difficulty in harmonising paediatric thyroid reference intervals or recommending universal clinical cut-offs. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. A new NIST primary standardization of 18F.

    PubMed

    Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S

    2014-02-01

    A new primary standardization of (18)F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST (3)H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.

  6. MMASS: an optimized array-based method for assessing CpG island methylation.

    PubMed

    Ibrahim, Ashraf E K; Thorne, Natalie P; Baird, Katie; Barbosa-Morais, Nuno L; Tavaré, Simon; Collins, V Peter; Wyllie, Andrew H; Arends, Mark J; Brenton, James D

    2006-01-01

    We describe an optimized microarray method for identifying genome-wide CpG island methylation called microarray-based methylation assessment of single samples (MMASS) which directly compares methylated to unmethylated sequences within a single sample. To improve previous methods we used bioinformatic analysis to predict an optimized combination of methylation-sensitive enzymes that had the highest utility for CpG-island probes and different methods to produce unmethylated representations of test DNA for more sensitive detection of differential methylation by hybridization. Subtraction or methylation-dependent digestion with McrBC was used with optimized (MMASS-v2) or previously described (MMASS-v1, MMASS-sub) methylation-sensitive enzyme combinations and compared with a published McrBC method. Comparison was performed using DNA from the cell line HCT116. We show that the distribution of methylation microarray data is inherently skewed and requires exogenous spiked controls for normalization and that analysis of digestion of methylated and unmethylated control sequences together with linear fit models of replicate data showed superior statistical power for the MMASS-v2 method. Comparison with previous methylation data for HCT116 and validation of CpG islands from PXMP4, SFRP2, DCC, RARB and TSEN2 confirmed the accuracy of MMASS-v2 results. The MMASS-v2 method offers improved sensitivity and statistical power for high-throughput microarray identification of differential methylation.

  7. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS

    PubMed Central

    Chitranshi, Priyanka; da Costa, Gonçalo Gamboa

    2016-01-01

    We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
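    The single-point standard addition used for quantification can be sketched in a few lines. This is a generic illustration under the usual assumptions (linear detector response through the origin, negligible dilution by the spike); the numbers are placeholders, not data from the study.

```python
def single_point_standard_addition(signal_sample: float,
                                   signal_spiked: float,
                                   added_conc: float) -> float:
    """Estimate the analyte concentration by single-point standard addition.

    Assumes a linear detector response through the origin and a spike
    volume small enough that dilution can be neglected (or has already
    been corrected for).  added_conc is the concentration increase
    produced by the spike, in the same units as the returned value.
    """
    if signal_spiked <= signal_sample:
        raise ValueError("Spiked signal must exceed the unspiked signal.")
    return added_conc * signal_sample / (signal_spiked - signal_sample)

# Example with illustrative numbers: unspiked peak area 1.20e5, spiked
# (+10 ug/mL) area 2.10e5  ->  estimated concentration ~ 13.3 ug/mL.
print(single_point_standard_addition(1.20e5, 2.10e5, 10.0))
```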

  8. An improved method of measuring heart rate using a webcam

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Ouyang, Jianfei; Yan, Yonggang

    2014-09-01

    Measuring heart rate traditionally requires special equipment and physical contact with the subject. Reliable non-contact and low-cost measurements are highly desirable for convenient and comfortable physiological self-assessment. Previous work has shown that consumer-grade cameras can provide useful signals for remote heart rate measurements. In this paper, a simple and robust method of measuring the heart rate using a low-cost webcam is proposed. The blood volume pulse is extracted by proper region of interest (ROI) and color channel selection from image sequences of human faces, without complex computation. Heart rate is subsequently quantified by spectrum analysis. The method is successfully applied under natural lighting conditions. Experimental results show that it takes less time, is much simpler, and has accuracy similar to the previously published and widely used method of independent component analysis (ICA). Because it is non-contact, convenient, and low cost, the method holds great promise for home healthcare and can further be applied to biomedical research.
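    A minimal numpy sketch of the kind of pipeline described above (ROI mean of one colour channel, detrending, spectral peak search in the physiological band); the ROI, the choice of the green channel, and the synthetic frames are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def heart_rate_from_frames(frames: np.ndarray, fps: float,
                           roi=(slice(100, 200), slice(150, 250)),
                           low_hz: float = 0.75, high_hz: float = 4.0) -> float:
    """Estimate heart rate (beats/min) from a stack of RGB video frames.

    frames : array of shape (n_frames, height, width, 3), values 0-255.
    The mean green value inside the ROI forms the raw pulse signal;
    its dominant spectral peak in the physiological band gives the rate.
    """
    green = frames[:, roi[0], roi[1], 1].mean(axis=(1, 2))   # ROI mean, G channel
    signal = green - green.mean()                            # remove DC component
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)            # roughly 45-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_freq

# Usage with synthetic data: a 72 bpm oscillation buried in pixel noise.
fps, n = 30.0, 300
t = np.arange(n) / fps
fake = 120 + 2 * np.sin(2 * np.pi * 1.2 * t)                 # 1.2 Hz = 72 bpm
frames = np.random.normal(fake[:, None, None, None], 1.0, (n, 240, 320, 3))
print(round(heart_rate_from_frames(frames, fps)))            # ~72
```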

  9. Discussion on “A Fuzzy Method for Medical Diagnosis of Headache”

    NASA Astrophysics Data System (ADS)

    Hung, Kuo-Chen; Wou, Yu-Wen; Julian, Peterson

    This paper responds to the report of Ahn, Mun, Kim, Oh, and Han published in IEICE Trans. INF. & SYST., Vol.E91-D, No.4, 2008, 1215-1217, in which the authors tried to extend their previous paper published in IEICE Trans. INF. & SYST., Vol.E86-D, No.12, 2003, 2790-2793. We point out that their extension relies on detailed data, namely knowledge of the frequencies of the three types, so their new occurrence information based on intuitionistic fuzzy sets for the medical diagnosis of headache becomes redundant. We advise researchers to use the detailed data directly to decide the diagnosis of headache.

  10. Bolometric Light Curves of Peculiar Type II-P Supernovae

    NASA Astrophysics Data System (ADS)

    Lusk, Jeremy A.; Baron, E.

    2017-04-01

    We examine the bolometric light curves of five Type II-P supernovae (SNe 1998A, 2000cb, 2006V, 2006au, and 2009E), which are thought to originate from blue supergiant progenitors like that of SN 1987A, using a new python package named SuperBoL. With this code, we calculate SNe light curves using three techniques common in the literature: the quasi-bolometric method, which integrates the observed photometry, the direct integration method, which additionally corrects for unobserved flux in the UV and IR, and the bolometric correction method, which uses correlations between observed colors and V-band bolometric corrections. We present here the light curves calculated by SuperBoL, along with previously published light curves, as well as peak luminosities and 56Ni yields. We find that the direct integration and bolometric correction light curves largely agree with previously published light curves, but with what we believe to be more robust error calculations, with 0.2 ≲ δL_bol/L_bol ≲ 0.5. Peak luminosities and 56Ni masses are similarly comparable to previous work. SN 2000cb remains an unusual member of this sub-group, owing to the faster rise and flatter plateau than the other supernovae in the sample. Initial comparisons with the NLTE atmosphere code PHOENIX show that the direct integration technique reproduces the luminosity of a model supernova spectrum to ~5% when given synthetic photometry of the spectrum as input. Our code is publicly available. The ability to produce bolometric light curves from observed sets of broadband light curves should be helpful in the interpretation of other types of supernovae, particularly those that are not well characterized, such as extremely luminous supernovae and faint fast objects.
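    The quasi-bolometric technique, the simplest of the three, can be illustrated with a short sketch: convert broadband magnitudes to flux densities, integrate over wavelength with the trapezoid rule, and scale by 4*pi*d^2. The zero points, effective wavelengths, magnitudes, and distance below are placeholder values, and this is not the SuperBoL implementation itself.

```python
import numpy as np

# Placeholder effective wavelengths (angstrom) and zero-point flux densities
# (erg / s / cm^2 / angstrom) for Johnson-Cousins BVRI; values are illustrative.
BANDS = {
    "B": (4400.0, 6.32e-9),
    "V": (5500.0, 3.63e-9),
    "R": (6400.0, 2.18e-9),
    "I": (7900.0, 1.13e-9),
}

def quasi_bolometric_luminosity(mags: dict, distance_cm: float) -> float:
    """Integrate observed broadband fluxes over wavelength (trapezoid rule).

    mags maps band name -> apparent magnitude at one epoch.  No correction
    is made for flux outside the observed bands, which is why this is only
    the 'quasi-bolometric' luminosity.
    """
    waves, fluxes = [], []
    for band, mag in sorted(mags.items(), key=lambda kv: BANDS[kv[0]][0]):
        lam, f_zero = BANDS[band]
        waves.append(lam)
        fluxes.append(f_zero * 10.0 ** (-0.4 * mag))    # magnitude -> flux density
    integrated = np.trapz(fluxes, waves)                # erg / s / cm^2
    return 4.0 * np.pi * distance_cm ** 2 * integrated  # erg / s

# Example epoch at an assumed distance of 10 Mpc (3.086e25 cm).
print(f"{quasi_bolometric_luminosity({'B': 16.2, 'V': 15.8, 'R': 15.5, 'I': 15.3}, 3.086e25):.3e}")
```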

  11. Bolometric Lightcurves of Peculiar Type II-P Supernovae

    NASA Astrophysics Data System (ADS)

    Lusk, Jeremy A.; Baron, Edward A.

    2017-01-01

    We examine the bolometric lightcurves of five Type II-P supernovae (SNe 1998A, 2000cb, 2006V, 2006au and 2009E) which are thought to originate from blue supergiant progenitors using a new python package named SuperBoL. With this code, we calculate SNe lightcurves using three different techniques common in the literature: the quasi-bolometric method, which integrates the observed photometry, the direct integration method, which additionally corrects for unobserved flux in the UV and IR, and the bolometric correction method, which uses correlations between observed colors and V-band bolometric corrections. We present here the lightcurves calculated by SuperBoL along with previously published lightcurves, as well as peak luminosities and 56Ni yields. We find that the direct integration and bolometric correction lightcurves largely agree with previously published lightcurves, but with what we believe to be more robust error calculations, with 0.2 ≤ δL/L ≤ 0.5. Peak luminosities and 56Ni masses are similarly comparable to previous work. SN 2000cb remains an unusual member of this sub-group, owing to the faster rise and flatter plateau than the other supernovae in the sample. Initial comparisons with the NLTE atmosphere code PHOENIX show that the direct integration technique reproduces the luminosity of a model supernova spectrum to ˜5% when given synthetic photometry of the spectrum as input. Our code is publicly available. The ability to produce bolometric lightcurves from observed sets of broad-band light curves should be helpful in the interpretation of other types of supernovae, particularly those that are not well characterized, such as extremely luminous supernovae and faint fast objects.

  12. Analysis of steady-state salt-water upconing with application at Truro well field, Cape Cod, Massachusetts

    USGS Publications Warehouse

    Reilly, T.E.; Frimpter, M.H.; LeBlanc, D.R.; Goodman, A.S.

    1987-01-01

    Sharp interface methods have been used successfully to describe the physics of upconing. A finite-element model is developed to simulate a sharp interface for determination of the steady-state position of the interface and maximum permissible well discharges. The model developed is compared to previously published electric-analog model results of Bennett and others (1968). -from Authors

  13. Global Ray Tracing Simulations of the SABER Gravity Wave Climatology

    DTIC Science & Technology

    2009-01-01

    atmosphere, the residual temperature profiles are analyzed by a combination of the maximum entropy method (MEM) and harmonic analysis, thus providing the... accepted 24 February 2009; published 30 April 2009. Since February 2002, the SABER (sounding of the atmosphere using broadband emission radiometry... satellite instrument has measured temperatures throughout the entire middle atmosphere. Employing the same techniques as previously used for CRISTA

  14. CeasIng Cpap At standarD criteriA (CICADA): impact on weight gain, time to full feeds and caffeine use.

    PubMed

    Broom, Margaret; Ying, Lei; Wright, Audrey; Stewart, Alice; Abdel-Latif, Mohamed E; Shadbolt, Bruce; Todd, David A

    2014-09-01

    In our previous randomised controlled trial (RCT), we showed in preterm babies (PBs) <30 weeks gestation that CeasIng Cpap At standarD criteriA (CICADA, method 1), compared with cycling off continuous positive airway pressure (CPAP) gradually (method 2) or cycling off CPAP gradually with low-flow air/oxygen during periods off CPAP (method 3), reduces CPAP cessation time. This retrospective study reviewed weight gain, time to reach full feeds and time to cease caffeine in PBs previously enrolled in the RCT. Data were collected from 162 of the 177 PBs, and there was no significant difference in the projected weight gain between the three methods. Based on intention to treat, the time taken to reach full feeds for all three methods showed no significant difference. However, post hoc analysis showed that the CICADA method, compared with cycling off gradually, just failed to reach significance (30.3±1.6 vs 31.1±2.4 (weeks corrected gestational age (Wks CGA±SD)), p=0.077). Analysis of time to cease caffeine showed a significant difference between the methods, with PBs randomised to the CICADA method ceasing caffeine almost a week earlier than those on the cycling-off method (33.6±2.4 vs 34.5±2.8 (Wks CGA±SD), p=0.02). This retrospective study provides evidence to substantiate the optimum method of ceasing CPAP: the CICADA method does not adversely affect weight gain or time to reach full feeds and may reduce the time to cease caffeine in PBs <30 weeks gestation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  15. Metal artifact reduction for CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Martz, Harry; Cosman, Pamela

    2015-01-01

    In aviation security, checked luggage is screened by computed tomography scanning. Metal objects in the bags create artifacts that degrade image quality. Metal artifact reduction (MAR) methods exist, mainly in the medical imaging literature, but they either require knowledge of the materials in the scan or are outlier rejection methods. Our aim was to improve and evaluate a MAR method we previously introduced that does not require knowledge of the materials in the scan and gives good results on data with large quantities and different kinds of metal. We describe in detail an optimization which de-emphasizes metal projections and has a constraint for beam hardening and scatter. This method isolates and reduces artifacts in an intermediate image, which is then fed to a previously published sinogram replacement method. We evaluate the algorithm for luggage data containing multiple and large metal objects. We define measures of artifact reduction, and compare this method against others in the MAR literature. Metal artifacts were reduced in our test images, even for multiple and large metal objects, without much loss of structure or resolution. Our MAR method outperforms the methods with which we compared it. Our approach does not make assumptions about image content, nor does it discard metal projections.

  16. Classifying medical relations in clinical text via convolutional neural networks.

    PubMed

    He, Bin; Guan, Yi; Dai, Rui

    2018-05-16

    Deep learning research on relation classification has achieved solid performance in the general domain. This study proposes a convolutional neural network (CNN) architecture with a multi-pooling operation for medical relation classification on clinical records and explores a loss function with a category-level constraint matrix. Experiments using the 2010 i2b2/VA relation corpus demonstrate that these models, which do not depend on any external features, outperform previous single-model methods, and our best model is competitive with the existing ensemble-based method. Copyright © 2018. Published by Elsevier B.V.

  17. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  18. Distributed synchronization of networked drive-response systems: A nonlinear fixed-time protocol.

    PubMed

    Zhao, Wen; Liu, Gang; Ma, Xi; He, Bing; Dong, Yunfeng

    2017-11-01

    The distributed synchronization of networked drive-response systems is investigated in this paper. A novel nonlinear protocol is proposed to ensure that the tracking errors converge to zero in a fixed time. Compared with previous synchronization methods, the present method considers more practical conditions, and the synchronization time does not depend on arbitrary initial conditions but can be pre-assigned offline according to the task requirements. Finally, the feasibility and validity of the presented protocol are illustrated by a numerical simulation. Copyright © 2017. Published by Elsevier Ltd.
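    The abstract does not give the protocol itself, so the sketch below only simulates a generic fixed-time stabilising law of the kind common in this literature (two signed-power terms, one exponent below 1 and one above 1) applied to a scalar tracking error; the gains, exponents, and the settling-time bound quoted in the comment follow the standard fixed-time stability result, not the authors' specific design.

```python
import numpy as np

def sig(x: float, p: float) -> float:
    """Signed power |x|^p * sign(x)."""
    return np.sign(x) * abs(x) ** p

def settle_time(e0: float, k1=2.0, k2=2.0, a=0.5, b=1.5,
                dt=1e-4, t_max=5.0) -> float:
    """Integrate de/dt = -k1*sig(e,a) - k2*sig(e,b) with forward Euler and
    return the time at which |e| first drops below 1e-6."""
    e, t = e0, 0.0
    while abs(e) > 1e-6 and t < t_max:
        e += dt * (-k1 * sig(e, a) - k2 * sig(e, b))
        t += dt
    return t

# The settling time is bounded independently of the initial condition:
# T <= 1/(k1*(1-a)) + 1/(k2*(b-1))  (= 2.0 s with the gains above).
for e0 in (0.1, 10.0, 1e4):
    print(f"e0 = {e0:>8g}  ->  settles at t = {settle_time(e0):.3f} s")
```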

  19. Determining the Depth of Infinite Horizontal Cylindrical Sources from Spontaneous Polarization Data

    NASA Astrophysics Data System (ADS)

    Cooper, G. R. J.; Stettler, E. H.

    2017-03-01

    Previously published semi-automatic interpretation methods that use ratios of analytic signal amplitudes of orders that differ by one to determine the distance to potential field sources are shown also to apply to self-potential (S.P.) data when the source is a horizontal cylinder. Local minima of the distance (when it becomes closest to zero) give the source depth. The method was applied to an S.P. anomaly from the Bourkes Luck potholes district in Mpumalanga Province, South Africa, and gave results that were confirmed by drilling.

  20. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.

  1. Nose profile morphology and accuracy study of nose profile estimation method in Scottish subadult and Indonesian adult populations.

    PubMed

    Sarilita, Erli; Rynn, Christopher; Mossey, Peter A; Black, Sue; Oscandar, Fahmi

    2018-05-01

    This study investigated nose profile morphology and its relationship to the skull in Scottish subadult and Indonesian adult populations, with the aim of improving the accuracy of forensic craniofacial reconstruction. Samples of 86 lateral head cephalograms from Dundee Dental School (mean age, 11.8 years) and 335 lateral head cephalograms from the Universitas Padjadjaran Dental Hospital, Bandung, Indonesia (mean age 24.2 years), were measured. The method of nose profile estimation based on skull morphology previously proposed by Rynn and colleagues in 2010 (FSMP 6:20-34) was tested in this study. Following this method, three nasal aperture-related craniometrics and six nose profile dimensions were measured from the cephalograms. To assess the accuracy of the method, six nose profile dimensions were estimated from the three craniometric parameters using the published method and then compared to the actual nose profile dimensions. In the Scottish subadult population, no sexual dimorphism was evident in the measured dimensions. In contrast, sexual dimorphism of the Indonesian adult population was evident in all craniometric and nose profile dimensions; notably, males exhibited statistically significantly larger values than females. The published method by Rynn and colleagues (FSMP 6:20-34, 2010) performed better in the Scottish subadult population (mean difference of maximum, 2.35 mm) compared to the Indonesian adult population (mean difference of maximum, 5.42 mm in males and 4.89 mm in females). In addition, regression formulae were derived to estimate nose profile dimensions based on the craniometric measurements for the Indonesian adult population. The published method is not sufficiently accurate for use on the Indonesian population, so the derived method should be used. The accuracy of the published method by Rynn and colleagues (FSMP 6:20-34, 2010) was sufficiently reliable to be applied in the Scottish subadult population.

  2. Increased longitudinal contractility and diastolic function at rest in well-trained amateur Marathon runners: a speckle tracking echocardiography study

    PubMed Central

    2014-01-01

    Background: Regular physical activity reduces cardiovascular risk. There is concern that marathon running might acutely damage the heart. It is unknown to what extent intensive physical endurance activity influences cardiac mechanics at rest. Methods: Eighty-four amateur marathon runners (43 women and 41 men) from the Berlin-Brandenburg area who had completed at least one marathon previously underwent clinical examination and echocardiography at rest at least 10 days before the Berlin Marathon. Standard transthoracic echocardiography and 2D strain and strain rate analysis were performed. The 2D strain and strain rate values were compared to previously published data of healthy untrained individuals. Results: The average global longitudinal peak systolic strain of the left ventricle was -23 +/- 2% with peak systolic strain rate -1.39 +/- 0.21/s, early diastolic strain rate 2.0 +/- 0.40/s and late diastolic strain rate 1.21 +/- 0.31/s. These values are significantly higher compared to the previously published values of normal age-adjusted individuals. In addition, no age-related decline of longitudinal contractility in well-trained athletes was observed. Conclusions: There is increased overall longitudinal myocardial contractility at rest in experienced endurance athletes compared to the published normal values in the literature, indicating a preserved and even supra-normal contractility in the athletes. There is no age-dependent decline of the longitudinal 2D strain values. This underlines the beneficial effects of regular physical exercise even in advanced age. PMID:24571726

  3. A 3D musculoskeletal model of the western lowland gorilla hind limb: moment arms and torque of the hip, knee and ankle.

    PubMed

    Goh, Colleen; Blanchard, Mary L; Crompton, Robin H; Gunther, Michael M; Macaulay, Sophie; Bates, Karl T

    2017-10-01

    Three-dimensional musculoskeletal models have become increasingly common for investigating muscle moment arms in studies of vertebrate locomotion. In this study we present the first musculoskeletal model of a western lowland gorilla hind limb. Moment arms of individual muscles around the hip, knee and ankle were compared with previously published data derived from the experimental tendon travel method. Considerable differences were found, which we attribute to the different methodologies in this specific case. In this instance, we argue that our 3D model provides more accurate and reliable moment arm data than previously published data on the gorilla because our model incorporates more detailed consideration of the 3D geometry of muscles and the geometric constraints that exist on their lines of action about limb joints. Our new data have led us to re-evaluate the previous conclusion that muscle moment arms in the gorilla hind limb are optimised for locomotion with crouched or flexed limb postures. Furthermore, we found that bipedalism and terrestrial quadrupedalism coincided more regularly with higher moment arms and torque around the hip, knee and ankle than did vertical climbing. This indicates that the ability of a gorilla to walk bipedally is not restricted by musculoskeletal adaptations for quadrupedalism and vertical climbing, at least in terms of moment arms and torque about hind limb joints. © 2017 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  4. A novel approach to quantify cybersecurity for electric power systems

    NASA Astrophysics Data System (ADS)

    Kaster, Paul R., Jr.

    Electric power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, but a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must accurately reflect the degree to which a system is secure, determine the level of security in a system simply and using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail to meet one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  5. Occupational Asthma in Antibiotic Manufacturing Workers: Case Reports and Systematic Review

    PubMed Central

    Díaz Angulo, Sara; Szram, Joanna; Welch, Jenny; Cannon, Julie; Cullinan, Paul

    2011-01-01

    Background. The risks of occupational asthma (OA) from antibiotics are uncertain. We report 4 new cases and a systematic review of the literature. Methods. Cases were identified through a specialist clinic; each underwent specific provocation testing (SPT). We subsequently reviewed the published literature. Results. The patients were employed in the manufacture of antibiotics; penicillins were implicated in three cases and, in the fourth, erythromycin, which has not previously been reported to cause OA. In two, there was evidence of specific IgE sensitisation. At SPT each developed a late asthmatic reaction and increased bronchial hyperresponsiveness. Thirty-six case reports have been previously published, 26 of them citing penicillins or cephalosporins. Seven cross-sectional workplace-based surveys found prevalences of 5–8%. Conclusions. OA in antibiotic manufacturers may be more common than is generally recognised. Its pathogenesis remains unclear; immunological tests are of uncertain value and potential cases require confirmation with SPT. Further study of its frequency, mechanisms, and diagnosis is required. PMID:21603168

  6. Complex refractive index measurements for BaF2 and CaF2 via single-angle infrared reflectance spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly-Gorham, Molly Rose K.; DeVetter, Brent M.; Brauer, Carolyn S.

    We have re-investigated the optical constants n and k for the homologous series of inorganic salts barium fluoride (BaF2) and calcium fluoride (CaF2) using a single-angle near-normal incidence reflectance device in combination with a calibrated Fourier transform infrared (FTIR) spectrometer. Our results are in good qualitative agreement with most previous works. However, certain features of the previously published data near the reststrahlen band exhibit distinct differences in spectral characteristics. Notably, our measurements of BaF2 do not include a spectral feature in the ~250 cm-1 reststrahlen band that was previously published. Additionally, CaF2 exhibits a distinct wavelength shift relative to the model derived from previously published data. We confirmed our results with recently published works that use significantly more modern instrumentation and data reduction techniques.

  7. Mean Glandular dose coefficients (DgN) for x-ray spectra used in contemporary breast imaging systems

    PubMed Central

    Nosratieh, Anita; Hernandez, Andrew; Shen, Sam Z.; Yaffe, Martin J.; Seibert, J. Anthony; Boone, John M.

    2015-01-01

    Purpose: To develop tables of normalized glandular dose coefficients DgN for a range of anode–filter combinations and tube voltages used in contemporary breast imaging systems. Methods: Previously published mono-energetic DgN values were used with various spectra to mathematically compute DgN coefficients. The tungsten anode spectra from TASMICS were used; molybdenum and rhodium anode spectra were generated using the MCNPx Monte Carlo code. The spectra were filtered with various thicknesses of Al, Rh, Mo or Cu. An initial HVL calculation was made using the anode and filter material. Small thicknesses of polymethyl methacrylate (PMMA) were then added as a surrogate for the breast compression paddle to produce a range of HVL values at each tube voltage. Using a spectral weighting method, DgN coefficients for the generated spectra were calculated for breast glandular densities of 0%, 12.5%, 25%, 37.5%, 50% and 100% for a range of compressed breast thicknesses from 3 to 8 cm. Results: Eleven tables of normalized glandular dose (DgN) coefficients were produced for the following anode/filter combinations: W + 50 μm Ag, W + 500 μm Al, W + 700 μm Al, W + 200 μm Cu, W + 300 μm Cu, W + 50 μm Rh, Mo + 400 μm Cu, Mo + 30 μm Mo, Mo + 25 μm Rh, Rh + 400 μm Cu and Rh + 25 μm Rh. Where possible, these results were compared to previously published DgN values and were found to differ on average by less than 2% from previously reported values. Conclusion: Over 200 pages of DgN coefficients were computed for modeled x-ray system spectra that are used in a number of new breast imaging applications. The reported values were found to be in excellent agreement when compared to published values. PMID:26348995
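    A compact sketch of one common way such spectral weighting is done: collapse the mono-energetic DgN(E) values to a single polyenergetic coefficient by weighting each energy bin with its contribution to air kerma. The weighting convention and the array inputs below are assumptions for illustration; the paper's exact normalisation may differ.

```python
import numpy as np

def spectrum_weighted_dgn(energies_kev: np.ndarray,
                          fluence: np.ndarray,
                          dgn_mono: np.ndarray,
                          muen_over_rho_air: np.ndarray) -> float:
    """Collapse mono-energetic DgN(E) values to one polyenergetic coefficient.

    Each energy bin is weighted by its air-kerma contribution,
    phi(E) * E * (mu_en/rho)_air(E).  All arrays share the same energy grid.
    The weighting convention is an assumption made for this sketch.
    """
    kerma_per_bin = fluence * energies_kev * muen_over_rho_air
    return float(np.sum(dgn_mono * kerma_per_bin) / np.sum(kerma_per_bin))
```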

  8. Torsional anharmonicity in the conformational thermodynamics of flexible molecules

    NASA Astrophysics Data System (ADS)

    Miller, Thomas F., III; Clary, David C.

    We present an algorithm for calculating the conformational thermodynamics of large, flexible molecules that combines ab initio electronic structure theory calculations with a torsional path integral Monte Carlo (TPIMC) simulation. The new algorithm overcomes the previous limitations of the TPIMC method by including the thermodynamic contributions of non-torsional vibrational modes and by affordably incorporating the ab initio calculation of conformer electronic energies, and it improves the conventional ab initio treatment of conformational thermodynamics by accounting for the anharmonicity of the torsional modes. Using previously published ab initio results and new TPIMC calculations, we apply the algorithm to the conformers of the adrenaline molecule.

  9. A novel genome signature based on inter-nucleotide distances profiles for visualization of metagenomic data

    NASA Astrophysics Data System (ADS)

    Xie, Xian-Hua; Yu, Zu-Guo; Ma, Yuan-Lin; Han, Guo-Sheng; Anh, Vo

    2017-09-01

    There has been a growing interest in the visualization of metagenomic data. The present study focuses on the visualization of metagenomic data using inter-nucleotide distance profiles. We first convert the fragment sequences into inter-nucleotide distance profiles. Then we analyze these profiles by principal component analysis. Finally, the principal components are used to obtain a 2-D scatter plot according to the source species of the fragments. We name our approach the inter-nucleotide distance profile (INP) method. Our method is evaluated on three benchmark data sets used in previously published papers. Our results demonstrate that the INP method is an effective and efficient alternative for the visualization of metagenomic data.
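    One plausible, simplified reading of the INP pipeline is sketched below: each fragment is turned into per-base histograms of distances between successive occurrences of the same nucleotide, and the stacked feature vectors are projected to two dimensions with PCA. The histogram width, the toy fragments, and the use of scikit-learn are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from sklearn.decomposition import PCA

def inp_profile(seq: str, max_dist: int = 30) -> np.ndarray:
    """Inter-nucleotide distance profile of one fragment: for each base,
    a normalised histogram of gaps between consecutive occurrences."""
    feats = []
    for base in "ACGT":
        pos = [i for i, c in enumerate(seq.upper()) if c == base]
        gaps = np.diff(pos) if len(pos) > 1 else np.array([])
        hist, _ = np.histogram(gaps, bins=np.arange(1, max_dist + 2))
        feats.append(hist / max(hist.sum(), 1))
    return np.concatenate(feats)

# Toy fragments from two "species" (uniform vs. AT-rich), projected to 2-D.
rng = np.random.default_rng(0)
frag = lambda p: "".join(rng.choice(list("ACGT"), size=500, p=p))
seqs = [frag([.25, .25, .25, .25]) for _ in range(20)] + \
       [frag([.4, .1, .1, .4]) for _ in range(20)]
X = np.array([inp_profile(s) for s in seqs])
xy = PCA(n_components=2).fit_transform(X)     # coordinates for the scatter plot
print(xy.shape)                               # (40, 2)
```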

  10. Preoptimised VB: a fast method for the ground and excited states of ionic clusters II. Delocalised preoptimisation for He2+, Ar2+, He3+ and Ar3+

    NASA Astrophysics Data System (ADS)

    Archirel, Pierre

    1997-09-01

    We generalise the preoptimisation of orbitals within VB (Part I of this series) by letting the orbitals delocalise on the neighbouring fragments. The method is more accurate than the local preoptimisation. The method is tested on the rare gas clusters He2+, Ar2+, He3+ and Ar3+. The results are in good agreement with previously published data on these systems. We complete these data with higher excited states. The binding energies of (ArCO)+, (ArN2)+ and N4+ are revisited. The simulation of the SCF method is extended to Cu+H2O.

  11. Application of the ratio difference spectrophotometry to the determination of ibuprofen and famotidine in their combined dosage form: comparison with previously published spectrophotometric methods.

    PubMed

    Zaazaa, Hala E; Elzanfaly, Eman S; Soudi, Aya T; Salem, Maissa Y

    2015-05-15

    A ratio difference spectrophotometric (RD) method was developed for the determination of ibuprofen and famotidine in their combined dosage form. Ibuprofen and famotidine were determined in the presence of each other, with linearity obtained from 50 to 600 μg/mL and 2.5 to 25 μg/mL for ibuprofen and famotidine, respectively. The suggested method was validated according to ICH guidelines and successfully applied for the analysis of ibuprofen and famotidine in their pharmaceutical dosage forms without interference from any additives or excipients. Copyright © 2015 Elsevier B.V. All rights reserved.
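    The ratio difference principle can be illustrated with synthetic spectra: the mixture spectrum is divided by a standard spectrum of the interfering component, whose contribution then appears as a constant in the ratio spectrum and cancels when amplitudes at two wavelengths are subtracted. The Gaussian bands, wavelengths, and concentrations below are placeholders, not the real ibuprofen/famotidine data.

```python
import numpy as np

wl = np.linspace(220, 320, 501)                        # wavelength grid, nm
band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

# Synthetic unit-concentration absorptivity curves (illustrative shapes only)
eps_x = band(265, 12)        # analyte of interest
eps_y = band(285, 15)        # interfering component

cx, cy = 3.0, 1.5                                      # true concentrations
mixture = cx * eps_x + cy * eps_y                      # Beer-Lambert, additive

ratio = mixture / eps_y                                # divide by divisor spectrum
# In the ratio spectrum the divisor contributes a constant (cy), so the
# difference between two wavelengths depends only on the analyte:
i1, i2 = np.argmin(abs(wl - 250)), np.argmin(abs(wl - 270))
delta_mix = ratio[i2] - ratio[i1]
delta_std = (eps_x / eps_y)[i2] - (eps_x / eps_y)[i1]  # same for 1-unit standard
print(delta_mix / delta_std)                           # recovers cx = 3.0
```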

  12. Single-shot speckle reduction in numerical reconstruction of digitally recorded holograms: comment.

    PubMed

    Maycock, Jonathan; Hennelly, Bryan; McDonald, John

    2015-09-01

    We comment on a recent Letter by Hincapie et al. [Opt. Lett.40, 1623 (2015)], in which the authors proposed a method to reduce the speckle noise in digital holograms. This method was previously published by us in Maycock ["Improving reconstructions of digital holograms," Ph.D. thesis (National University of Ireland, 2012)] and Maycock and Hennelly [Improving Reconstructions of Digital Holograms: Speckle Reduction and Occlusions in Digital Holography (Lambert Academic, 2014)]. We also wish to highlight an important limitation of the method resulting from the superposition of different perspectives of the object/scene, which was not addressed in their Letter.

  13. Book review: The Wilderness Debate Rages On: Continuing the Great New Wilderness Debate

    Treesearch

    Peter Landres

    2009-01-01

    The Wilderness Debate Rages On is a collection of mostly previously published papers about the meaning, value, and role of wilderness and continues the discussion that was propelled by the editors' previous book The Great New Wilderness Debate (also a collection of papers) published in 1998. The editors state that this sequel to their previous book is mandated...

  14. Giuliano Vanghetti and the innovation of "cineplastic operations".

    PubMed

    Tropea, Peppino; Mazzoni, Alberto; Micera, Silvestro; Corbo, Massimo

    2017-10-10

    Developing functional artificial limbs for amputees has been a centuries-old challenge in medicine. We review the mechanical and neurologic principles of "cineplastic operations" and "plastic motors" used to restore movements in prostheses, with special attention to the work of Giuliano Vanghetti. We evaluated original publications describing cineplastic operations, biographic information, writings, drawings, and unpublished letters from the Vanghetti library, preserved in Empoli, Italy, and performed a bibliographic search and comparison for similar procedures in the literature. Vanghetti's method for cineplastic operations differs from similar previous methods, being the first aimed at exploiting natural movements of the remnant muscles to activate the mechanical prosthesis, and the first to do so by directly connecting the prosthesis to the residual muscles and tendons. This represented a frame-changing innovation for that time and paved the way for current neuroprosthetic approaches. The first description of the method was published in 1898 and human studies started in 1900. The results of these studies were presented in 1905 and published in 1906 in Plastic and Kinematic Prosthesis . A German surgeon, Ferdinand Sauerbruch, often acknowledged as the inventor of the method, published his first results in 1915. Vanghetti was the first to accurately perform and describe cineplastic operations for patients following an upper arm amputation. He considered the neurologic implications of the problem and, perhaps in an effort to provide more appropriate proprioceptive feedback, he intuitively applied the prostheses so that they were functionally activated by the muscles of the proximal stump. Copyright © 2017 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Neurology.

  15. Prediction and analysis of beta-turns in proteins by support vector machine.

    PubMed

    Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao

    2003-01-01

    The tight turn has long been recognized as one of the three important features of proteins, after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We have investigated two aspects of applying SVMs to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts beta-turns of a protein from its sequence. The prediction results on a dataset of 426 non-homologous protein chains, obtained by the sevenfold cross-validation technique, showed that our method is superior to previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
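    A minimal sketch of the kind of residue-window SVM classifier described above, using a one-hot window encoding and scikit-learn; the window size, the random toy sequence, and the placeholder labels are assumptions for illustration, and this is not the BTSVM implementation.

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {a: i for i, a in enumerate(AA)}

def encode_windows(sequence: str, half: int = 4) -> np.ndarray:
    """One-hot encode a sliding window of 2*half+1 residues around each
    position (positions too close to the ends are zero-padded)."""
    n, w = len(sequence), 2 * half + 1
    X = np.zeros((n, w * len(AA)))
    for i in range(n):
        for j in range(-half, half + 1):
            if 0 <= i + j < n:
                X[i, (j + half) * len(AA) + IDX[sequence[i + j]]] = 1.0
    return X

# Toy training data: residue-level beta-turn labels would normally come
# from DSSP-style turn assignments; here they are random placeholders.
rng = np.random.default_rng(1)
seq = "".join(rng.choice(list(AA), size=300))
y = rng.integers(0, 2, size=300)
clf = SVC(kernel="linear", C=1.0).fit(encode_windows(seq), y)
print(clf.predict(encode_windows(seq[:20])).shape)     # per-residue predictions
```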

  16. NCI Helps Children’s Hospital of Philadelphia to Identify and Treat New Target in Pediatric Cancer | Poster

    Cancer.gov

    There may be a new, more effective method for treating high-risk neuroblastoma, according to scientists at the Children’s Hospital of Philadelphia and collaborators in the Cancer and Inflammation Program at NCI at Frederick. Together, the groups published a study describing a previously unrecognized protein on neuroblastoma cells, called GPC2, as well as the creation of a

  17. The cardiac muscle duplex as a method to study myocardial heterogeneity

    PubMed Central

    Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.

    2014-01-01

    This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702

  18. Factors affecting the cost of tractor logging in the California Pine Region

    Treesearch

    M.E. Krueger

    1929-01-01

    The past five years have seen a very rapid expansion in the use of tractors for logging in the pine region of California. In 1923, when a previous bulletin of this series was published, steam donkey yarding, with which that study treated, was the prevailing method of yarding. During the season of 1928 probably not less than 60 percent of the timber output of this...

  19. Cost Modeling for Space Telescope

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, comparing concepts, and justifying technology investments. This paper presents ongoing efforts to develop single-variable and multivariable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive cost estimating relationships (CERs) for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
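    A single-variable CER of the usual power-law form can be derived with a log-log least-squares fit, as sketched below; the diameter/cost pairs are made-up placeholders, not the historical mission data used in the paper.

```python
import numpy as np

# Illustrative (made-up) data: OTA aperture diameter (m) vs. cost ($M).
diameter = np.array([0.3, 0.5, 0.85, 1.1, 2.4])
cost = np.array([12.0, 30.0, 95.0, 160.0, 900.0])

# Single-variable CER of the usual power-law form  cost = a * D**b,
# fitted by ordinary least squares in log-log space.
b, log_a = np.polyfit(np.log(diameter), np.log(cost), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.1f} * D^{b:.2f}   (prediction for 1.5 m: {a * 1.5 ** b:.0f} $M)")
```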

  20. Using Silica Sol as a Nanoglue to Prepare Nanoscale Mesoporous Composite Gel and Aerogels

    DTIC Science & Technology

    2000-03-31

    solution-phase reactants remain unaltered. Furthermore, the composite constitutes a rigid solid architecture, such that the silica aerogel structure...nm) was immobilized in a silica aerogel structure according to the method of the present invention. The optical properties of these materials...Aerogel Preparation. Acid- and base-catalyzed silica aerogels were prepared by procedures similar to those previously published in Russo et al. J. Non

  1. Solitaire salvage: a stent retriever-assisted catheter reduction technical report.

    PubMed

    Parry, Phillip Vaughan; Morales, Alejandro; Jankowitz, Brian Thomas

    2016-07-01

    The endovascular management of giant aneurysms often proves difficult with standard techniques. Obtaining distal access to allow catheter reduction is often key to approaching these aneurysms, but several anatomic challenges can make this task unsafe or infeasible. Obtaining distal anchor points and performing catheter reduction maneuvers using adjunctive devices is not a novel concept; however, using the Solitaire to do so may have some distinct advantages compared with previously described methods. Here we describe our novel Solitaire salvage technique, which allowed successful reduction of a looped catheter within an aneurysm in three cases. While this technique is expensive and therefore best performed after standard maneuvers have failed, in our experience it was effective, safe, and more efficient than other methods. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. Scene-based nonuniformity correction with reduced ghosting using a gated LMS algorithm.

    PubMed

    Hardie, Russell C; Baxley, Frank; Brys, Brandon; Hytla, Patrick

    2009-08-17

    In this paper, we present a scene-based nonuniformity correction (NUC) method using a modified adaptive least mean square (LMS) algorithm with a novel gating operation on the updates. The gating is designed to significantly reduce ghosting artifacts produced by many scene-based NUC algorithms by halting updates when temporal variation is lacking. We define the algorithm and present a number of experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published methods, including other LMS-based and constant-statistics-based methods. The experimental results include simulated imagery and a real infrared image sequence. We show that the proposed method significantly reduces ghosting artifacts, but has a slightly longer convergence time. (c) 2009 Optical Society of America
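    A generic sketch of a gated LMS nonuniformity correction is given below: per-pixel gain and offset are adapted toward a spatially smoothed version of the corrected frame, and the update is switched off wherever frame-to-frame change is small. The smoothing kernel, threshold, and gating rule are illustrative assumptions, not the exact gate proposed in the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def gated_lms_nuc(frames: np.ndarray, mu: float = 1e-6,
                  motion_thresh: float = 2.0):
    """Scene-based nonuniformity correction with a gated LMS update.

    Per-pixel gain g and offset o are adapted so that g*x + o tracks a
    spatially smoothed version of the corrected frame (a common choice of
    desired image in LMS NUC).  Updates are gated off wherever the
    frame-to-frame change is below motion_thresh, which is what limits
    ghosting when the scene is static.
    """
    g = np.ones(frames.shape[1:])
    o = np.zeros(frames.shape[1:])
    prev = frames[0]
    corrected = []
    for x in frames:
        y = g * x + o                              # corrected frame
        desired = uniform_filter(y, size=7)        # local spatial mean
        err = y - desired
        gate = np.abs(x - prev) > motion_thresh    # update only where scene moved
        g -= mu * err * x * gate
        o -= mu * err * gate
        prev = x
        corrected.append(y)
    return np.array(corrected), g, o

# Usage: frames is an array of raw detector frames, shape (n_frames, rows, cols).
```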

  3. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    PubMed

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  4. Predicting crystalline lens fall caused by accommodation from changes in wavefront error

    PubMed Central

    He, Lin; Applegate, Raymond A.

    2011-01-01

    PURPOSE To illustrate and develop a method for estimating crystalline lens decentration as a function of accommodative response using changes in wavefront error, and to show the method and its limitations using previously published data (2004) from 2 iridectomized monkey eyes, so that clinicians understand how spherical aberration can induce coma, in particular in intraocular lens surgery. SETTINGS College of Optometry, University of Houston, Houston, USA. DESIGN Evaluation of diagnostic test or technology. METHODS Lens decentration was estimated by displacing downward the wavefront error of the lens with respect to the limiting aperture (7.0 mm) and ocular first surface wavefront error for each accommodative response (0.00 to 11.00 diopters) until measured values of vertical coma matched previously published experimental data (2007). Lens decentration was also calculated using an approximation formula that only included spherical aberration and vertical coma. RESULTS The change in calculated vertical coma was consistent with downward lens decentration. Calculated downward lens decentration peaked at approximately 0.48 mm of vertical decentration in the right eye and approximately 0.31 mm of decentration in the left eye using all Zernike modes through the 7th radial order. Calculated lens decentration using only coma and spherical aberration formulas peaked at approximately 0.45 mm in the right eye and approximately 0.23 mm in the left eye. CONCLUSIONS Lens fall as a function of accommodation was quantified noninvasively using changes in vertical coma driven principally by the accommodation-induced changes in spherical aberration. The newly developed method was valid for a large pupil only. PMID:21700108

  5. Effects of linking a soil-water-balance model with a groundwater-flow model

    USGS Publications Warehouse

    Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.

    2013-01-01

    A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze effects to groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improvement of simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and estimated or measured target values for the previously published model and linked models were relatively similar and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than the previously published model provided. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.

  6. Performing both propensity score and instrumental variable analyses in observational studies often leads to discrepant results: a systematic review.

    PubMed

    Laborde-Castérot, Hervé; Agrinier, Nelly; Thilly, Nathalie

    2015-10-01

    Propensity score (PS) and instrumental variable (IV) are analytical techniques used to adjust for confounding in observational research. More and more, they seem to be used simultaneously in studies evaluating health interventions. The present review aimed to analyze the agreement between PS and IV results in medical research published to date. Review of all published observational studies that evaluated a clinical intervention using simultaneously PS and IV analyses, as identified in MEDLINE and Web of Science. Thirty-seven studies, most of them published during the previous 5 years, reported 55 comparisons between results from PS and IV analyses. There was a slight/fair agreement between the methods [Cohen's kappa coefficient = 0.21 (95% confidence interval: 0.00, 0.41)]. In 23 cases (42%), results were nonsignificant for one method and significant for the other, and IV analysis results were nonsignificant in most situations (87%). Discrepancies are frequent between PS and IV analyses and can be interpreted in various ways. This suggests that researchers should carefully consider their analytical choices, and readers should be cautious when interpreting results, until further studies clarify the respective roles of the two methods in observational comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.
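    The agreement statistic quoted above is Cohen's kappa over paired study conclusions. A small sketch of how such a figure is computed is given below; the labels are placeholders, not the review's extracted data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-comparison conclusions ("sig" vs "ns") from the two
# adjustment methods; these labels are placeholders, not the review's data.
ps_results = ["sig", "sig", "ns", "sig", "ns", "ns", "sig", "ns"]
iv_results = ["sig", "ns",  "ns", "ns",  "ns", "ns", "sig", "sig"]

kappa = cohen_kappa_score(ps_results, iv_results)
print(f"Cohen's kappa = {kappa:.2f}")   # chance-corrected agreement
```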

  7. Bayesian inference of interaction properties of noisy dynamical systems with time-varying coupling: capabilities and limitations

    NASA Astrophysics Data System (ADS)

    Wilting, Jens; Lehnertz, Klaus

    2015-08-01

    We investigate a recently published analysis framework based on Bayesian inference for the time-resolved characterization of interaction properties of noisy, coupled dynamical systems. It promises wide applicability and a better time resolution than well-established methods. Using representative model systems as examples, we show that the analysis framework has the same weaknesses as previous methods, particularly when investigating interacting, structurally different non-linear oscillators. We also inspect the tracking of time-varying interaction properties and propose a further modification of the algorithm, which improves the reliability of the obtained results. As an example, we investigate the suitability of this algorithm for inferring the strength and direction of interactions between various regions of the human brain during an epileptic seizure. Within the limits of applicability of this analysis tool, we show that the modified algorithm indeed allows a better time resolution through Bayesian inference when compared to previous methods based on least-squares fits.

  8. Independent and combined analyses of sequences from all three genomic compartments converge on the root of flowering plant phylogeny

    PubMed Central

    Barkman, Todd J.; Chenery, Gordon; McNeal, Joel R.; Lyons-Weiler, James; Ellisens, Wayne J.; Moore, Gerry; Wolfe, Andrea D.; dePamphilis, Claude W.

    2000-01-01

    Plant phylogenetic estimates are most likely to be reliable when congruent evidence is obtained independently from the mitochondrial, plastid, and nuclear genomes with all methods of analysis. Here, results are presented from separate and combined genomic analyses of new and previously published data, including six and nine genes (8,911 bp and 12,010 bp, respectively) for different subsets of taxa that suggest Amborella + Nymphaeales (water lilies) are the first-branching angiosperm lineage. Before and after tree-independent noise reduction, most individual genomic compartments and methods of analysis estimated the Amborella + Nymphaeales basal topology with high support. Previous phylogenetic estimates placing Amborella alone as the first extant angiosperm branch may have been misled because of a series of specific problems with paralogy, suboptimal outgroups, long-branch taxa, and method dependence. Ancestral character state reconstructions differ between the two topologies and affect inferences about the features of early angiosperms. PMID:11069280

  9. A test method for determining adhesion forces and Hamaker constants of cementitious materials using atomic force microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lomboy, Gilson; Sundararajan, Sriram, E-mail: srirams@iastate.edu; Wang Kejin

    2011-11-15

    A method for determining Hamaker constant of cementitious materials is presented. The method involved sample preparation, measurement of adhesion force between the tested material and a silicon nitride probe using atomic force microscopy in dry air and in water, and calculating the Hamaker constant using appropriate contact mechanics models. The work of adhesion and Hamaker constant were computed from the pull-off forces using the Johnson-Kendall-Roberts and Derjagin-Muller-Toropov models. Reference materials with known Hamaker constants (mica, silica, calcite) and commercially available cementitious materials (Portland cement (PC), ground granulated blast furnace slag (GGBFS)) were studied. The Hamaker constants of the reference materials obtained are consistent with those published by previous researchers. The results indicate that PC has a higher Hamaker constant than GGBFS. The Hamaker constant of PC in water is close to the previously predicted value for C3S, which is attributed to the short hydration time (≤ 45 min) used in this study.
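
    A minimal sketch of the conversion from AFM pull-off force to work of adhesion and Hamaker constant, assuming the standard JKR and DMT expressions and a conventional cutoff separation of 0.165 nm (the paper's exact contact-mechanics choices and input values may differ):

        import math

        D0 = 0.165e-9  # assumed cutoff separation (m)

        def work_of_adhesion(pull_off_force, tip_radius, model="JKR"):
            # JKR: F_adh = (3/2)*pi*R*W    DMT: F_adh = 2*pi*R*W
            if model == "JKR":
                return 2.0 * pull_off_force / (3.0 * math.pi * tip_radius)
            return pull_off_force / (2.0 * math.pi * tip_radius)

        def hamaker_constant(W):
            # A = 12 * pi * D0^2 * W
            return 12.0 * math.pi * D0 ** 2 * W

        W = work_of_adhesion(pull_off_force=12e-9, tip_radius=30e-9, model="JKR")  # hypothetical inputs
        print(f"W = {W:.3e} J/m^2, A = {hamaker_constant(W):.3e} J")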

  10. Giuliano Vanghetti and the innovation of “cineplastic operations”

    PubMed Central

    Mazzoni, Alberto; Micera, Silvestro; Corbo, Massimo

    2017-01-01

    Objective: Developing functional artificial limbs for amputees has been a centuries-old challenge in medicine. We review the mechanical and neurologic principles of “cineplastic operations” and “plastic motors” used to restore movements in prostheses, with special attention to the work of Giuliano Vanghetti. Methods: We evaluated original publications describing cineplastic operations, biographic information, writings, drawings, and unpublished letters from the Vanghetti library, preserved in Empoli, Italy, and performed a bibliographic search and comparison for similar procedures in the literature. Results: Vanghetti's method for cineplastic operations differs from similar previous methods, being the first aimed at exploiting natural movements of the remnant muscles to activate the mechanical prosthesis, and the first to do so by directly connecting the prosthesis to the residual muscles and tendons. This represented a frame-changing innovation for that time and paved the way for current neuroprosthetic approaches. The first description of the method was published in 1898 and human studies started in 1900. The results of these studies were presented in 1905 and published in 1906 in Plastic and Kinematic Prosthesis. A German surgeon, Ferdinand Sauerbruch, often acknowledged as the inventor of the method, published his first results in 1915. Conclusions: Vanghetti was the first to accurately perform and describe cineplastic operations for patients following an upper arm amputation. He considered the neurologic implications of the problem and, perhaps in an effort to provide more appropriate proprioceptive feedback, he intuitively applied the prostheses so that they were functionally activated by the muscles of the proximal stump. PMID:28993523

  11. Evaluation of Brain Iron Content Based on Magnetic Resonance Imaging (MRI): Comparison among Phase Value, R2* and Magnitude Signal Intensity

    PubMed Central

    Yan, Shen-Qiang; Sun, Jian-Zhong; Yan, Yu-Qing; Wang, He; Lou, Min

    2012-01-01

    Background and Purpose Several magnetic resonance imaging (MRI) techniques are being exploited to measure brain iron levels increasingly as iron deposition has been implicated in some neurodegenerative diseases. However, there remains no unified evaluation of these methods as postmortem measurement isn't commonly available as the reference standard. The purpose of this study was to make a comparison among these methods and try to find a new index of brain iron. Methods We measured both phase values and R2* in twenty-four adults, and performed correlation analysis among the two methods and the previously published iron concentrations. We also proposed a new method using magnitude signal intensity and compared it with R2* and brain iron. Results We found phase value correlated with R2* in substantia nigra (r = −0.723, p<0.001) and putamen (r = −0.514, p = 0.010), while no correlations in red nucleus (r = −0.236, p = 0.268) and globus pallidus (r = −0.111, p = 0.605). And the new magnitude method had significant correlations in red nucleus (r = −0.593, p = 0.002), substantia nigra (r = −0.521, p = 0.009), globus pallidus (r = −0.750, p<0.001) and putamen (r = −0.547, p = 0.006) with R2*. A strong inverse correlation was also found between the new magnitude method and previously published iron concentrations in seven brain regions (r = −0.982, P<0.001). Conclusions Our study indicates that phase value may not be used for assessing the iron content in some brain regions especially globus pallidus. The new magnitude method is highly consistent with R2* especially in globus pallidus, and we assume that this approach may be acceptable as an index of iron content in iron-rich brain regions. PMID:22363719
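
    A minimal sketch of the region-by-region correlation analysis described above (Python/SciPy; the per-subject values are placeholders, not the study data):

        from scipy.stats import pearsonr

        # Hypothetical per-subject measurements for one brain region
        phase_values = [-0.08, -0.12, -0.10, -0.15, -0.09, -0.13]
        r2_star      = [ 28.0,  35.0,  31.0,  40.0,  30.0,  37.0]

        r, p = pearsonr(phase_values, r2_star)
        print(f"r = {r:.3f}, p = {p:.3g}")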

  12. Methodes entropiques appliquees au probleme inverse en magnetoencephalographie

    NASA Astrophysics Data System (ADS)

    Lapalme, Ervig

    2005-07-01

    This thesis is devoted to biomagnetic source localization using magnetoencephalography. This problem is known to have an infinite number of solutions, so methods are required to take anatomical and functional information about the solution into account. The work presented in this thesis uses the maximum entropy on the mean method to constrain the solution. This method originates from statistical mechanics and information theory. The thesis is divided into two main parts containing three chapters each. The first part reviews the magnetoencephalographic inverse problem: the theory needed to understand its context and the hypotheses for simplifying the problem. In the last chapter of this first part, the maximum entropy on the mean method is presented: its origins are explained, as well as how it is applied to our problem. The second part is the original work of this thesis, presenting three articles; one of them is already published and two others are submitted for publication. In the first article, a biomagnetic source model is developed and applied in a theoretical context, while still demonstrating the efficiency of the method. In the second article, we go one step further towards a realistic modeling of cerebral activation. The main priors are estimated using the magnetoencephalographic data. This method proved to be very efficient in realistic simulations. In the third article, the previous method is extended to deal with time signals, thus exploiting the excellent time resolution offered by magnetoencephalography. Compared with our previous work, the temporal method is applied to real magnetoencephalographic data from a somatotopy experiment, and the results agree with previous physiological knowledge about this kind of cognitive process.

  13. A morphometric system to distinguish sheep and goat postcranial bones.

    PubMed

    Salvagno, Lenny; Albarella, Umberto

    2017-01-01

    Distinguishing between the bones of sheep and goat is a notorious challenge in zooarchaeology. Several methodological contributions have been published at different times and by various people to facilitate this task, largely relying on a macro-morphological approach. This is now routinely adopted by zooarchaeologists but, although it certainly has its value, has also been shown to have limitations. Morphological discriminant criteria can vary in different populations and correct identification is highly dependent upon a researcher's experience, availability of appropriate reference collections, and many other factors that are difficult to quantify. There is therefore a need to establish a more objective system, susceptible to scrutiny. In order to fulfil such a requirement, this paper offers a comprehensive morphometric method for the identification of sheep and goat postcranial bones, using a sample of more than 150 modern skeletons as a basis, and building on previous pioneering work. The proposed method is based on measurements-some newly created, others previously published-and its use is recommended in combination with the more traditional morphological approach. Measurement ratios, used to translate morphological traits into biometrical attributes, are demonstrated to have substantial diagnostic potential, with the vast majority of specimens correctly assigned to species. The efficacy of the new method is also tested with Discriminant Analysis, which provides a successful verification of the biometrical indices, a statistical means to select the most promising measurements, and an additional line of analysis to be used in conjunction with the others.
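
    To illustrate the discriminant-analysis step on biometrical indices, a minimal sketch (Python/scikit-learn; the measurement ratios and labels are invented, not the authors' reference collection):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Each row: two hypothetical measurement ratios from a postcranial bone
        X = np.array([[0.62, 1.10], [0.60, 1.08], [0.58, 1.12],   # sheep
                      [0.71, 0.95], [0.73, 0.97], [0.70, 0.93]])  # goat
        y = np.array(["sheep", "sheep", "sheep", "goat", "goat", "goat"])

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(lda.predict([[0.61, 1.09], [0.72, 0.96]]))  # classify new specimens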

  14. Optics-Only Calibration of a Neural-Net Based Optical NDE Method for Structural Health Monitoring

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    2004-01-01

    A calibration process is presented that uses optical measurements alone to calibrate a neural-net based NDE method. The method itself detects small changes in the vibration mode shapes of structures. The optics-only calibration process confirms previous work that the sensitivity to vibration-amplitude changes can be as small as 10 nanometers. A more practical value in an NDE service laboratory is shown to be 50 nanometers. Both model-generated and experimental calibrations are demonstrated using two implementations of the calibration technique. The implementations are based on previously published demonstrations of the NDE method and an alternative calibration procedure that depends on comparing neural-net and point sensor measurements. The optics-only calibration method, unlike the alternative method, does not require modifications of the structure being tested or the creation of calibration objects. The calibration process can be used to test improvements in the NDE process and to develop a vibration-mode independence of damage-detection sensitivity. The calibration effort was intended to support NASA's objective to promote safety in the operation of ground test facilities and in aviation safety in general, by allowing the detection of the gradual onset of structural changes and damage.

  15. Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data

    NASA Astrophysics Data System (ADS)

    Lee, Sanggyun; Kim, Hyun-cheol; Im, Jungho

    2018-05-01

    We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which shows interannual variability of recent sea ice cover during 2011-2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.
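
    The waveform mixture idea can be illustrated as a constrained linear unmixing of each waveform against reference (endmember) waveforms; the sketch below assumes synthetic lead and sea-ice endmembers and is not the authors' implementation:

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical endmember waveforms: specular "lead" and diffuse "ice" returns
        bins = np.arange(20)
        lead_wf = np.exp(-0.5 * ((bins - 8) / 0.8) ** 2)   # sharp, peaky return
        ice_wf  = np.exp(-0.5 * ((bins - 9) / 4.0) ** 2)   # broad, diffuse return
        E = np.column_stack([lead_wf, ice_wf])

        observed = 0.7 * lead_wf + 0.3 * ice_wf + 0.01 * np.random.rand(bins.size)

        fractions, _ = nnls(E, observed)   # non-negative abundance estimates
        fractions /= fractions.sum()       # normalize to mixture fractions
        print(dict(zip(["lead", "ice"], fractions.round(3))))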

  16. Experimental investigation of the Multipoint Ultrasonic Flowmeter

    NASA Astrophysics Data System (ADS)

    Jakub, Filipský

    2018-06-01

    The Multipoint Ultrasonic Flowmeter is a vector tomographic device capable of reconstructing all three components of the velocity field based solely on boundary ultrasonic measurements. Computer simulations have shown the feasibility of such a device and have been published previously. This paper describes an experimental investigation of the achievable accuracy of such a method. The doubled acoustic tripoles used to obtain information on the solenoidal part of the vector field exhibit extremely small differences between the times of flight (TOFs) at individual sensors and are therefore sensitive to parasitic effects in TOF measurement. Sampling at 40 MHz and a correlation method are used to measure the TOF.
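
    A minimal sketch of a cross-correlation TOF estimate at the stated 40 MHz sampling rate (Python/NumPy; the signals are synthetic and the delay is invented):

        import numpy as np

        fs = 40e6                                  # 40 MHz sampling rate
        t = np.arange(0, 200e-6, 1 / fs)
        burst = np.sin(2 * np.pi * 1e6 * t) * np.exp(-((t - 20e-6) / 5e-6) ** 2)

        true_delay = 3.2e-6                        # hypothetical delay to recover (s)
        received = np.roll(burst, int(round(true_delay * fs))) + 0.05 * np.random.randn(t.size)

        xcorr = np.correlate(received, burst, mode="full")
        lag = np.argmax(xcorr) - (burst.size - 1)  # lag of the correlation peak, in samples
        print(f"estimated TOF difference: {lag / fs * 1e6:.2f} microseconds")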

  17. How blockchain-timestamped protocols could improve the trustworthiness of medical science

    PubMed Central

    Irving, Greg; Holden, John

    2017-01-01

    Trust in scientific research is diminished by evidence that data are being manipulated. Outcome switching, data dredging and selective publication are some of the problems that undermine the integrity of published research. Methods for using blockchain to provide proof of pre-specified endpoints in clinical trial protocols were first reported by Carlisle. We wished to empirically test such an approach using a clinical trial protocol where outcome switching has previously been reported. Here we confirm the use of blockchain as a low cost, independently verifiable method to audit and confirm the reliability of scientific studies. PMID:27239273
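
    The proof-of-existence step behind such timestamping can be sketched in a few lines: hash the pre-specified protocol document and anchor that digest (in the published approach, in a blockchain transaction). The snippet below shows only the hashing part and uses a hypothetical file name.

        import hashlib

        def protocol_digest(path):
            # SHA-256 digest of the trial protocol; any later edit (e.g., a switched
            # outcome) produces a different digest and breaks the timestamp proof.
            with open(path, "rb") as fh:
                return hashlib.sha256(fh.read()).hexdigest()

        print(protocol_digest("trial_protocol_v1.pdf"))  # hypothetical file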

  18. The coupled three-dimensional wave packet approach to reactive scattering

    NASA Astrophysics Data System (ADS)

    Marković, Nikola; Billing, Gert D.

    1994-01-01

    A recently developed scheme for time-dependent reactive scattering calculations using three-dimensional wave packets is applied to the D+H2 system. The present method is an extension of a previously published semiclassical formulation of the scattering problem and is based on the use of hyperspherical coordinates. The convergence requirements are investigated by detailed calculations for total angular momentum J equal to zero and the general applicability of the method is demonstrated by solving the J=1 problem. The inclusion of the geometric phase is also discussed and its effect on the reaction probability is demonstrated.

  19. How blockchain-timestamped protocols could improve the trustworthiness of medical science.

    PubMed

    Irving, Greg; Holden, John

    2016-01-01

    Trust in scientific research is diminished by evidence that data are being manipulated. Outcome switching, data dredging and selective publication are some of the problems that undermine the integrity of published research. Methods for using blockchain to provide proof of pre-specified endpoints in clinical trial protocols were first reported by Carlisle. We wished to empirically test such an approach using a clinical trial protocol where outcome switching has previously been reported. Here we confirm the use of blockchain as a low cost, independently verifiable method to audit and confirm the reliability of scientific studies.

  20. New particle formation in the sulfuric acid-dimethylamine-water system: reevaluation of CLOUD chamber measurements and comparison to an aerosol nucleation and growth model

    NASA Astrophysics Data System (ADS)

    Kürten, Andreas; Li, Chenxi; Bianchi, Federico; Curtius, Joachim; Dias, António; Donahue, Neil M.; Duplissy, Jonathan; Flagan, Richard C.; Hakala, Jani; Jokinen, Tuija; Kirkby, Jasper; Kulmala, Markku; Laaksonen, Ari; Lehtipalo, Katrianne; Makhmutov, Vladimir; Onnela, Antti; Rissanen, Matti P.; Simon, Mario; Sipilä, Mikko; Stozhkov, Yuri; Tröstl, Jasmin; Ye, Penglin; McMurry, Peter H.

    2018-01-01

    A recent CLOUD (Cosmics Leaving OUtdoor Droplets) chamber study showed that sulfuric acid and dimethylamine produce new aerosols very efficiently and yield particle formation rates that are compatible with boundary layer observations. These previously published new particle formation (NPF) rates are reanalyzed in the present study with an advanced method. The results show that the NPF rates at 1.7 nm are more than a factor of 10 faster than previously published due to earlier approximations in correcting particle measurements made at a larger detection threshold. The revised NPF rates agree almost perfectly with calculated rates from a kinetic aerosol model at different sizes (1.7 and 4.3 nm mobility diameter). In addition, modeled and measured size distributions show good agreement over a wide range of sizes (up to ca. 30 nm). Furthermore, the aerosol model is modified such that evaporation rates for some clusters can be taken into account; these evaporation rates were previously published from a flow tube study. Using this model, the findings from the present study and the flow tube experiment can be brought into good agreement for the high base-to-acid ratios (~100) relevant for this study. This confirms that nucleation proceeds at rates that are compatible with collision-controlled (a.k.a. kinetically controlled) NPF for the conditions during the CLOUD7 experiment (278 K, 38 % relative humidity, sulfuric acid concentration between 1 × 10^6 and 3 × 10^7 cm^-3, and dimethylamine mixing ratio of ~40 pptv, i.e., 1 × 10^9 cm^-3).

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). This software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The source code implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  2. Battery Calendar Life Estimator Manual Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2012-10-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.
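
    As a rough sketch of fitting a degradation model to calendar-life test data (Python/SciPy): the square-root-of-time fade model and the data points below are assumptions for illustration, not the model prescribed by the BLE manual.

        import numpy as np
        from scipy.optimize import curve_fit

        def fade(t_weeks, a, b):
            # assumed fade model: relative capacity = 1 - a*sqrt(t) - b*t
            return 1.0 - a * np.sqrt(t_weeks) - b * t_weeks

        t = np.array([0, 4, 8, 16, 32, 52], dtype=float)           # weeks on test
        cap = np.array([1.00, 0.985, 0.978, 0.968, 0.952, 0.940])  # hypothetical data

        (a, b), _ = curve_fit(fade, t, cap, p0=(0.01, 0.0001))
        print(f"a = {a:.4f}, b = {b:.5f}; projected capacity at 3 years: {fade(156, a, b):.3f}")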

  3. Battery Life Estimator Manual Linear Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2009-08-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.

  4. A rapid and rational approach to generating isomorphous heavy-atom phasing derivatives.

    PubMed

    Lu, Jinghua; Sun, Peter D

    2014-09-01

    In attempts to replace the conventional trial-and-error heavy-atom derivative search method with a rational approach, we previously defined heavy metal compound reactivity against peptide ligands. Here, we assembled a composite pH- and buffer-dependent peptide reactivity profile for each heavy metal compound to guide rational heavy-atom derivative search. When knowledge of the best-reacting heavy-atom compound is combined with mass spectrometry assisted derivatization, and with a quick-soak method to optimize phasing, it is likely that the traditional heavy-atom compounds could meet the demand of modern high-throughput X-ray crystallography. As an example, we applied this rational heavy-atom phasing approach to determine a previously unknown mouse serum amyloid A2 crystal structure. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  5. NEW MEMBERS OF THE SCORPIUS-CENTAURUS COMPLEX AND AGES OF ITS SUB-REGIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Inseok; Zuckerman, B.; Bessell, M. S.

    2012-07-15

    We have spectroscopically identified ≈100 G-, K-, and M-type members of the Scorpius-Centaurus complex. To deduce the age of these young stars we compare their Li λ6708 absorption line strengths against those of stars in the TW Hydrae association and β Pictoris moving group. These line strengths indicate that Sco-Cen stars are younger than β Pic stars, whose ages of ≈12 Myr have previously been derived from a kinematic traceback analysis. Our derived age, ≈10 Myr, for stars in the Lower Centaurus Crux and Upper Centaurus Lupus subgroups of Sco-Cen is younger than previously published ages based on the moving cluster method and upper main-sequence fitting. The discrepant ages are likely due to an incorrect (or lack of) cross-calibration between model-dependent and model-independent age-dating methods.

  6. In-situ measurement of electroosmotic drag coefficient in Nafion membrane for the PEMFC.

    PubMed

    Peng, Zhe; Morin, Arnaud; Huguet, Patrice; Schott, Pascal; Pauchet, Joël

    2011-11-10

    A new method based on a hydrogen pump has been developed to measure the electroosmotic drag coefficient under representative PEMFC operating conditions. It eliminates the back-flow of water, which leads to errors in the calculation of this coefficient with previously reported electrochemical methods. Measurements have been performed on 50 μm thick Nafion membranes, both extruded and recast. Contrary to what has been described in most previously published works, the electroosmotic drag coefficient decreases as the membrane water content increases. The same trend is observed for temperatures between 25 and 80 °C. For the same membrane water content, the electroosmotic drag coefficient increases with temperature. Under the same conditions, there is no difference in drag coefficient between extruded Nafion N112 and recast Nafion NRE212. These results are discussed on the basis of the two commonly accepted proton transport mechanisms, namely Grotthuss and vehicular.

  7. 15 CFR 10.10 - Review of published standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Review of published standards. 10.10... DEVELOPMENT OF VOLUNTARY PRODUCT STANDARDS § 10.10 Review of published standards. (a) Each standard published... considered until a replacement standard is published. (b) Each standard published under these or previous...

  8. Polynomial dual energy inverse functions for bone Calcium/Phosphorus ratio determination and experimental evaluation.

    PubMed

    Sotiropoulou, P; Fountos, G; Martini, N; Koukou, V; Michail, C; Kandarakis, I; Nikiforidis, G

    2016-12-01

    An X-ray dual energy (XRDE) method was examined, using polynomial nonlinear approximation of inverse functions for the determination of the bone calcium-to-phosphorus (Ca/P) mass ratio. Inverse fitting functions with least-squares estimation were used to determine calcium and phosphate thicknesses. The method was verified by measuring test bone phantoms with a dedicated dual energy system and compared with previously published dual energy data. The accuracy in the determination of the calcium and phosphate thicknesses improved with the polynomial nonlinear inverse function method introduced in this work (ranging from 1.4% to 6.2%), compared to the corresponding linear inverse function method (ranging from 1.4% to 19.5%). Copyright © 2016 Elsevier Ltd. All rights reserved.
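
    A minimal sketch of the inverse-function idea: fit a polynomial mapping from the two measured dual-energy signals to known calibration thicknesses, then apply it to an unknown sample (Python/NumPy; the calibration numbers and the choice of polynomial terms are assumptions, not the authors' basis functions):

        import numpy as np

        # Hypothetical calibration: log-attenuation at low/high energy vs known Ca thickness
        mL   = np.array([0.20, 0.28, 0.35, 0.43, 0.50, 0.60, 0.33, 0.55])
        mH   = np.array([0.12, 0.18, 0.22, 0.28, 0.30, 0.38, 0.20, 0.34])
        t_ca = np.array([0.50, 0.70, 1.00, 1.20, 1.50, 1.80, 0.90, 1.65])  # mm

        # Second-order polynomial basis in (mL, mH); coefficients by least squares
        A = np.column_stack([np.ones_like(mL), mL, mH, mL**2, mH**2, mL*mH])
        coef, *_ = np.linalg.lstsq(A, t_ca, rcond=None)

        def ca_thickness(ml, mh):
            x = np.array([1.0, ml, mh, ml**2, mh**2, ml*mh])
            return float(x @ coef)

        print(ca_thickness(0.40, 0.25))  # estimate for an unknown measurement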

  9. Diesel Exhaust Exposure and the Risk of Lung Cancer—A Review of the Epidemiological Evidence

    PubMed Central

    Sun, Yi; Bochmann, Frank; Nold, Annette; Mattenklott, Markus

    2014-01-01

    To critically evaluate the association between diesel exhaust (DE) exposure and the risk of lung cancer, we conducted a systematic review of the published epidemiological evidence. To comprehensively identify original studies on the association between DE exposure and the risk of lung cancer, literature searches were performed in literature databases for the period between 1970 and 2013, including bibliographies and cross-referencing. In total, 42 cohort studies and 32 case-control studies were identified in which the association between DE exposures and lung cancer was examined. In general, previous studies suffer from a series of methodological limitations, including the study designs, exposure assessment methods, and statistical analyses used. A lack of objective exposure information appears to be the main problem in interpreting epidemiological evidence. To facilitate the interpretation and comparison of previous studies, a job-exposure matrix (JEM) of DE exposures was created based on around 4,000 historical industrial measurements. The values from the JEM were considered during interpretation and comparison of previous studies. Overall, neither cohort nor case-control studies indicate a clear exposure-response relationship between DE exposure and lung cancer. Epidemiological studies published to date do not allow a valid quantification of the association between DE and lung cancer. PMID:24473109

  10. Past speculations of future health technologies: a description of technologies predicted in 15 forecasting studies published between 1986 and 2010

    PubMed Central

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2017-01-01

    Objective To describe and classify health technologies predicted in forecasting studies. Design and methods A portrait describing health technologies predicted in 15 forecasting studies published between 1986 and 2010 that were identified in a previous systematic review. Health technologies are classified according to their type, purpose and clinical use; relating these to the original purpose and timing of the forecasting studies. Data sources All health-related technologies predicted in 15 forecasting studies identified in a previously published systematic review. Main outcome measure Outcomes related to (1) each forecasting study including country, year, intention and forecasting methods used and (2) the predicted technologies including technology type, purpose, targeted clinical area and forecast timeframe. Results Of the 896 identified health-related technologies, 685 (76.5%) were health technologies with an explicit or implied health application and included in our study. Of these, 19.1% were diagnostic or imaging tests, 14.3% devices or biomaterials, 12.6% information technology systems, eHealth or mHealth and 12% drugs. The majority of the technologies were intended to treat or manage disease (38.1%) or diagnose or monitor disease (26.1%). The most frequent targeted clinical areas were infectious diseases followed by cancer, circulatory and nervous system disorders. The most frequent technology types were for: infectious diseases—prophylactic vaccines (45.8%), cancer—drugs (40%), circulatory disease—devices and biomaterials (26.3%), and diseases of the nervous system—equally devices and biomaterials (25%) and regenerative medicine (25%). The mean timeframe for forecasting was 11.6 years (range 0–33 years, median=10, SD=6.6). The forecasting timeframe significantly differed by technology type (p=0.002), the intent of the forecasting group (p<0.001) and the methods used (p<001). Conclusion While description and classification of predicted health-related technologies is crucial in preparing healthcare systems for adopting new innovations, further work is needed to test the accuracy of predictions made. PMID:28760796

  11. Quantifying uncertainty in carbon and nutrient pools of coarse woody debris

    NASA Astrophysics Data System (ADS)

    See, C. R.; Campbell, J. L.; Fraver, S.; Domke, G. M.; Harmon, M. E.; Knoepp, J. D.; Woodall, C. W.

    2016-12-01

    Woody detritus constitutes a major pool of both carbon and nutrients in forested ecosystems. Estimating coarse wood stocks relies on many assumptions, even when full surveys are conducted. Researchers rarely report error in coarse wood pool estimates, despite the importance to ecosystem budgets and modelling efforts. To date, no study has attempted a comprehensive assessment of error rates and uncertainty inherent in the estimation of this pool. Here, we use Monte Carlo analysis to propagate the error associated with the major sources of uncertainty present in the calculation of coarse wood carbon and nutrient (i.e., N, P, K, Ca, Mg, Na) pools. We also evaluate individual sources of error to identify the importance of each source of uncertainty in our estimates. We quantify sampling error by comparing the three most common field methods used to survey coarse wood (two transect methods and a whole-plot survey). We quantify the measurement error associated with length and diameter measurement, and technician error in species identification and decay class using plots surveyed by multiple technicians. We use previously published values of model error for the four most common methods of volume estimation: Smalian's, conical frustum, conic paraboloid, and average-of-ends. We also use previously published values for error in the collapse ratio (cross-sectional height/width) of decayed logs that serves as a surrogate for the volume remaining. We consider sampling error in chemical concentration and density for all decay classes, using distributions from both published and unpublished studies. Analytical uncertainty is calculated using standard reference plant material from the National Institute of Standards. Our results suggest that technician error in decay classification can have a large effect on uncertainty, since many of the error distributions included in the calculation (e.g. density, chemical concentration, volume-model selection, collapse ratio) are decay-class specific.
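
    A minimal sketch of the Monte Carlo error propagation described, combining assumed distributions for volume, collapse ratio, density, and carbon concentration into a per-log carbon pool (Python/NumPy; every distribution parameter below is a placeholder):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Hypothetical inputs for one log, with assumed uncertainty distributions
        volume   = rng.normal(0.85, 0.10, n)    # m^3, includes volume-model error
        collapse = rng.normal(0.80, 0.08, n)    # cross-sectional height/width ratio
        density  = rng.normal(250.0, 40.0, n)   # kg/m^3 for the decay class
        carbon   = rng.normal(0.48, 0.02, n)    # carbon concentration (mass fraction)

        carbon_pool = volume * collapse * density * carbon   # kg C per log
        lo, hi = np.percentile(carbon_pool, [2.5, 97.5])
        print(f"mean = {carbon_pool.mean():.1f} kg C, 95% interval = ({lo:.1f}, {hi:.1f})")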

  12. Systematic review of non-surgical therapies for osteoarthritis of the hand: an update.

    PubMed

    Lue, S; Koppikar, S; Shaikh, K; Mahendira, D; Towheed, T E

    2017-09-01

    To update our earlier systematic reviews which evaluated all published randomized controlled trials (RCTs) evaluating pharmacological and non-pharmacological therapies in patients with hand osteoarthritis (OA). Surgical therapies were not evaluated. RCTs published between March 2008 and December 2015 were added to the previous systematic reviews. A total of 95 RCTs evaluating various pharmacological and non-pharmacological therapies in hand OA were analyzed in this update. Generally, the methodological quality of these RCTs has improved since the last update, with more studies describing their methods for randomization, blinding, and allocation concealment. However, RCTs continue to be weakened by a lack of consistent case definition and a lack of standardized outcome assessments specific to hand OA. The number and location of evaluated hand joints continues to be underreported, and only 25% of RCTs adequately described the method used to ensure allocation concealment. These remain major weaknesses of published RCTs. A meta-analysis could not be performed because of marked study heterogeneity, insufficient statistical data available in the published RCTs, and a small number of identical comparators. Hand OA is a complex area in which to study the efficacy of therapies. There has been an improvement in the overall design and conduct of RCTs, however, additional large RCTs with a more robust methodological approach specific to hand OA are needed in order to make clinically relevant conclusions about the efficacy of the diverse treatment options available. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  13. Development of a consensus method for culture of Clostridium difficile from meat and its use in a survey of U.S. retail meats.

    PubMed

    Limbago, Brandi; Thompson, Angela D; Greene, Sharon A; MacCannell, Duncan; MacGowan, Charles E; Jolbitado, Beverly; Hardin, Henrietta D; Estes, Stephanie R; Weese, J Scott; Songer, J Glenn; Gould, L Hannah

    2012-12-01

    Three previously described methods for culture of Clostridium difficile from meats were evaluated by microbiologists with experience in C. difficile culture and identification. A consensus protocol using BHI broth enrichment followed by ethanol shock and plating to selective and non-selective media was selected for use, and all participating laboratories received hands-on training in the use of this method prior to study initiation. Retail meat products (N = 1755) were cultured for C. difficile over 12 months during 2010-2011 at 9 U.S. FoodNet sites. No C. difficile was recovered, although other clostridia were isolated. Published by Elsevier Ltd.

  14. Study of grid independence of finite element method on MHD free convective casson fluid flow with slip effect

    NASA Astrophysics Data System (ADS)

    Raju, R. Srinivasa; Ramesh, K.

    2018-05-01

    The purpose of this work is to study the grid independence of the finite element method for MHD Casson fluid flow past a vertically inclined plate embedded in a porous medium in the presence of chemical reaction, heat absorption, an external magnetic field, and a slip effect. For this study of grid independence, a mathematical model is developed and analyzed using an appropriate mathematical technique, namely the finite element method. The grid study is discussed with the help of numerical values of the velocity, temperature, and concentration profiles presented in tabular form. Favourable comparisons with previously published work on various special cases of the problem are obtained.

  15. Comparison of DNA preservation methods for environmental bacterial community samples.

    PubMed

    Gray, Michael A; Pratte, Zoe A; Kellogg, Christina A

    2013-02-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard(™), RNAlater(®), DMSO-EDTA-salt (DESS), FTA(®) cards, and FTA Elute(®) cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA(®) cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard(™), RNAlater(®), and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  16. Enhancing Breast Cancer Recurrence Algorithms Through Selective Use of Medical Record Data.

    PubMed

    Kroenke, Candyce H; Chubak, Jessica; Johnson, Lisa; Castillo, Adrienne; Weltzien, Erin; Caan, Bette J

    2016-03-01

    The utility of data-based algorithms in research has been questioned because of errors in identification of cancer recurrences. We adapted previously published breast cancer recurrence algorithms, selectively using medical record (MR) data to improve classification. We evaluated second breast cancer event (SBCE) and recurrence-specific algorithms previously published by Chubak and colleagues in 1535 women from the Life After Cancer Epidemiology (LACE) and 225 women from the Women's Health Initiative cohorts and compared classification statistics to published values. We also sought to improve classification with minimal MR examination. We selected pairs of algorithms-one with high sensitivity/high positive predictive value (PPV) and another with high specificity/high PPV-using MR information to resolve discrepancies between algorithms, properly classifying events based on review; we called this "triangulation." Finally, in LACE, we compared associations between breast cancer survival risk factors and recurrence using MR data, single Chubak algorithms, and triangulation. The SBCE algorithms performed well in identifying SBCE and recurrences. Recurrence-specific algorithms performed more poorly than published except for the high-specificity/high-PPV algorithm, which performed well. The triangulation method (sensitivity = 81.3%, specificity = 99.7%, PPV = 98.1%, NPV = 96.5%) improved recurrence classification over two single algorithms (sensitivity = 57.1%, specificity = 95.5%, PPV = 71.3%, NPV = 91.9%; and sensitivity = 74.6%, specificity = 97.3%, PPV = 84.7%, NPV = 95.1%), with 10.6% MR review. Triangulation performed well in survival risk factor analyses vs analyses using MR-identified recurrences. Use of multiple recurrence algorithms in administrative data, in combination with selective examination of MR data, may improve recurrence data quality and reduce research costs. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
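
    For reference, the classification statistics quoted above follow directly from confusion-matrix counts; a minimal sketch (Python, with invented counts rather than the cohort data):

        def classification_stats(tp, fp, fn, tn):
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "PPV": tp / (tp + fp),
                "NPV": tn / (tn + fn),
            }

        # Hypothetical counts from comparing algorithm calls against medical-record review
        print(classification_stats(tp=52, fp=1, fn=12, tn=380))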

  17. Determination of protein carbonyls in plasma, cell extracts, tissue homogenates, isolated proteins: Focus on sample preparation and derivatization conditions.

    PubMed

    Weber, Daniela; Davies, Michael J; Grune, Tilman

    2015-08-01

    Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. © 2015 Published by Elsevier Ltd.

  18. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials of lifestyle diet and exercise interventions for osteoarthritis.

    PubMed

    Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J

    2015-05-01

    The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  19. Zero echo time MRI-only treatment planning for radiation therapy of brain tumors after resection.

    PubMed

    Boydev, C; Demol, B; Pasquier, D; Saint-Jalmes, H; Delpon, G; Reynaert, N

    2017-10-01

    Using magnetic resonance imaging (MRI) as the sole imaging modality for patient modeling in radiation therapy (RT) is a challenging task due to the need to derive electron density information from MRI and construct a so-called pseudo-computed tomography (pCT) image. We have previously published a new method to derive pCT images from head T1-weighted (T1-w) MR images using a single-atlas propagation scheme followed by a post hoc correction of the mapped CT numbers using local intensity information. The purpose of this study was to investigate the performance of our method with head zero echo time (ZTE) MR images. To evaluate results, the mean absolute error in bins of 20 HU was calculated with respect to the true planning CT scan of the patient. We demonstrated that applying our method using ZTE MR images instead of T1-w improved the correctness of the pCT in case of bone resection surgery prior to RT (that is, an example of large anatomical difference between the atlas and the patient). Copyright © 2017. Published by Elsevier Ltd.
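
    A minimal sketch of the evaluation metric described (mean absolute pCT error grouped in 20 HU bins of the true CT); the voxel values below are placeholders for the co-registered images:

        import numpy as np

        def mae_in_hu_bins(ct, pct, bin_width=20):
            # Mean absolute pseudo-CT error, keyed by the lower edge of each true-CT bin.
            ct, pct = np.ravel(ct).astype(float), np.ravel(pct).astype(float)
            lower_edges = np.floor(ct / bin_width) * bin_width
            return {edge: float(np.abs(pct[lower_edges == edge] - ct[lower_edges == edge]).mean())
                    for edge in np.unique(lower_edges)}

        # Hypothetical co-registered CT and pseudo-CT voxel values (HU)
        ct  = np.array([-950, -100, 0, 40, 300, 900])
        pct = np.array([-930,  -80, 15, 55, 260, 990])
        print(mae_in_hu_bins(ct, pct))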

  20. County-level estimates of nitrogen and phosphorus from animal manure for the conterminous United States, 2007 and 2012

    USGS Publications Warehouse

    Gronberg, JoAnn M.; Arnold, Terri L.

    2017-03-24

    County-level estimates of nitrogen and phosphorus inputs from animal manure for the conterminous United States were calculated from animal population inventories in the 2007 and 2012 Census of Agriculture, using previously published methods. These estimates of non-point nitrogen and phosphorus inputs from animal manure were compiled in support of the U.S. Geological Survey’s National Water-Quality Assessment Project of the National Water Quality Program and are needed to support national-scale investigations of stream and groundwater water quality. The estimates published in this report are comparable with older estimates which can be compared to show changes in nitrogen and phosphorus inputs from manure over time.

  1. Development of cortisol circadian rhythm in infancy.

    PubMed

    de Weerth, Carolina; Zijl, Robbert H; Buitelaar, Jan K

    2003-08-01

    Cortisol is the final product of the hypothalamus-pituitary-adrenal (HPA) axis. It is secreted in a pulsatile fashion that displays a circadian rhythm. Infants are born without a circadian rhythm in cortisol and they acquire it during their first year of life. Studies do not agree on the age of appearance of the circadian rhythm (varying from 2 weeks to 9 months of age) nor on whether it is related to the appearance of the sleep-wake circadian rhythm. The object of the present study was to find evidence of the age of appearance of the diurnal rhythm of cortisol and to compare the results obtained by several different analysis methods on a new data set. Cortisol was determined in salivary samples of 14 normally developing infants who were followed monthly between the ages of 2 and 5 months. The data were analyzed with several previously published analysis methods as well as with Multilevel Analysis (Hierarchical Linear Modeling). The previously published analysis methods each produced different results when applied to the current data set. Moreover, our results indicate striking differences between young infants in both the age of appearance and the stability of the diurnal cortisol rhythm. Also, a link was found between the appearance of the sleep-wake circadian rhythm and the cortisol circadian rhythm. Considerable intraindividual variability in cortisol levels was found even after correcting for the different variables that affect cortisol (i.e., time of sampling, feeding, etc.). Although the choice of analysis method influences the age of appearance obtained, our use of HLM shows that the infants' own variability in the onset and stability of the cortisol circadian rhythm greatly contributes to the different results.

  2. Molecular methods for diagnosis of odontogenic infections.

    PubMed

    Flynn, Thomas R; Paster, Bruce J; Stokes, Lauren N; Susarla, Srinivas M; Shanti, Rabie M

    2012-08-01

    Historically, the identification of microorganisms has been limited to species that could be cultured in the microbiology laboratory. The purpose of the present study was to apply molecular techniques to identify microorganisms in orofacial odontogenic infections (OIs). Specimens were obtained from subjects with clinical evidence of OI. To identify the microorganisms involved, 16S rRNA sequencing methods were used on clinical specimens. The name and number of the clones of each species identified and the combinations of species present were recorded for each subject. Descriptive statistics were computed for the study variables. Specimens of pus or wound fluid were obtained from 9 subjects. A mean of 7.4 ± 3.7 (standard deviation) species per case were identified. The predominant species detected in the present study that have previously been associated with OIs were Fusobacterium spp, Parvimonas micra, Porphyromonas endodontalis, and Prevotella oris. The predominant species detected in our study that have not been previously associated with OIs were Dialister pneumosintes and Eubacterium brachy. Unculturable phylotypes accounted for 24% of the species identified in our study. All species detected were obligate or facultative anaerobes. Streptococci were not detected. Molecular methods have enabled us to detect previously cultivated and not-yet-cultivated species in OIs; these methods could change our understanding of the pathogenic flora of orofacial OIs. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  3. A neural network based reputation bootstrapping approach for service selection

    NASA Astrophysics Data System (ADS)

    Wu, Quanwang; Zhu, Qingsheng; Li, Peng

    2015-10-01

    With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One of the limitations of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers. However, with this kind of method, either newcomers or existing services will be favoured. In this paper, we present a novel reputation bootstrapping approach, where correlations between the features and performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping if available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented at the end.
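
    A minimal sketch of the bootstrapping idea: train a small neural network to map service features to observed reputation for existing services, then apply it to a newcomer (Python/scikit-learn; the feature encoding and scores are invented):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Hypothetical features of existing services: [response time (s), price, QoS level]
        X = np.array([[0.2, 5.0, 3], [1.5, 1.0, 1], [0.4, 4.0, 3],
                      [2.0, 0.5, 1], [0.8, 2.5, 2], [0.3, 4.5, 3]])
        reputation = np.array([0.92, 0.40, 0.88, 0.35, 0.65, 0.90])  # observed scores

        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        ann.fit(X, reputation)

        newcomer = np.array([[0.5, 3.5, 3]])   # a service with no interaction history yet
        print(f"bootstrapped reputation: {ann.predict(newcomer)[0]:.2f}")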

  4. Evaluation of Allele-Specific Somatic Changes of Genome-Wide Association Study Susceptibility Alleles in Human Colorectal Cancers

    PubMed Central

    Gerber, Madelyn M.; Hampel, Heather; Schulz, Nathan P.; Fernandez, Soledad; Wei, Lai; Zhou, Xiao-Ping; de la Chapelle, Albert; Toland, Amanda Ewart

    2012-01-01

    Background Tumors frequently exhibit loss of tumor suppressor genes or allelic gains of activated oncogenes. A significant proportion of cancer susceptibility loci in the mouse show somatic losses or gains consistent with the presence of a tumor susceptibility or resistance allele. Thus, allele-specific somatic gains or losses at loci may demarcate the presence of resistance or susceptibility alleles. The goal of this study was to determine if previously mapped susceptibility loci for colorectal cancer show evidence of allele-specific somatic events in colon tumors. Methods We performed quantitative genotyping of 16 single nucleotide polymorphisms (SNPs) showing statistically significant association with colorectal cancer in published genome-wide association studies (GWAS). We genotyped 194 paired normal and colorectal tumor DNA samples and 296 paired validation samples to investigate these SNPs for allele-specific somatic gains and losses. We combined analysis of our data with published data for seven of these SNPs. Results No statistically significant evidence for allele-specific somatic selection was observed for the tested polymorphisms in the discovery set. The rs6983267 variant, which has shown preferential loss of the non-risk T allele and relative gain of the risk G allele in previous studies, favored relative gain of the G allele in the combined discovery and validation samples (corrected p-value = 0.03). When we combined our data with published allele-specific imbalance data for this SNP, the G allele of rs6983267 showed statistically significant evidence of relative retention (p-value = 2.06×10^-4). Conclusions Our results suggest that the majority of variants identified as colon cancer susceptibility alleles through GWAS do not exhibit somatic allele-specific imbalance in colon tumors. Our data confirm previously published results showing allele-specific imbalance for rs6983267. These results indicate that allele-specific imbalance of cancer susceptibility alleles may not be a common phenomenon in colon cancer. PMID:22629442

  5. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is with difficulty or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a desiderata with our observations and guidelines for improving reproducibility. This has implications not only in reproducing the work of others from published papers, but reproducing work from one's own laboratory.

  6. Improvements in an in vivo neutron activation analysis (NAA) method for the measurement of fluorine in human bone.

    PubMed

    Mostafaei, F; McNeill, F E; Chettle, D R; Prestwich, W V

    2013-10-01

    We previously published a method for the in vivo measurement of bone fluoride using neutron activation analysis (NAA) and demonstrated the utility of the technique in a pilot study of environmentally exposed people. The method involved activation of the hand in an irradiation cavity at the McMaster University Accelerator Laboratory and acquisition of the resultant γ-ray signals in a '4π' NaI(Tl) detector array of nine detectors. In this paper we describe a series of improvements to the method. This was investigated via measurement of hand-simulating phantoms doped with varying levels of fluorine and fixed amounts of sodium, chlorine and calcium. Four improvements to the technique have been tested since our first publication. The previously published detection limit for phantom measurements using this system was 0.66 mg F/g Ca. The accelerator irradiation and detection facilities were relocated to a new section of the laboratory and one more detector was added to the detection system. This was found to reduce the detection limit (possibly because of better detector shielding and the additional detector) to 0.59 mg F/g Ca, a factor of 1.12. A new set of phantoms was developed and in this work we show that they improved the minimum detectable limit for fluoride in phantoms irradiated using neutrons produced by 2.15 MeV protons on lithium by a factor of 1.55. We compared the detection limits previously obtained using a summed signal from the nine detectors with the detection limit obtained by acquiring the spectra in anticoincidence mode for reduction of the disturbing signal from chlorine in bone. This was found to improve the ratio of the detection of fluorine to chlorine (an interfering signal) by a factor of 2.8, and the resultant minimum detection limit was found to be reduced by a factor of 1.2. We studied the effects of changing the timing of γ-ray acquisition. Our previously published data used a series of three 10 s acquisitions followed by a 300 s count. Changing the acquisition to a series of six 5 s acquisitions was found to further improve the detection limit by a factor of 1.4. We also present data showing that if the neutron dose is delivered to the phantom in a shorter time period, i.e., the dose rate is increased and the irradiation shortened, then the detection limit can be reduced by a further factor of 1.35. The overall improvement in detection limit by employing all of these changes was found to be a factor of 3.9. The technique now has an in-phantom detection limit of 0.17 mg F/g Ca compared to a previous detection limit of 0.66 mg F/g Ca. The system can now be tested on human volunteers to see if individuals with diagnosed fluorosis can be distinguished from the general Canadian population using this technique.
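
    As a quick check, the individually reported improvement factors compound to approximately the overall factor and the final detection limit quoted above:

        factors = [1.12, 1.55, 1.2, 1.4, 1.35]   # relocation/extra detector, new phantoms,
        overall = 1.0                            # anticoincidence mode, timing, dose rate
        for f in factors:
            overall *= f
        print(round(overall, 1), round(0.66 / overall, 2))   # about 3.9 and 0.17 mg F/g Ca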

  7. Erratum: Correction of “Pridgeon, J. W., Zhao, L., Becnel, J. J., Strickman, D. A., Clark, G. G., and Linthicum, K. J. 2008. Topically applied AaeIAP1 double-stranded RNA kills female adults of Aedes aegypti."

    USDA-ARS?s Scientific Manuscript database

    The coauthors of previously published work correct details from a 2008 publication. Specifically, it was incorrectly indicated in the methods section for data presented in Tables 2 and 3 that this experiment was the result of three replicates. These data were not the result of three replicate experi...

  8. Measured effects of coolant injection on the performance of a film cooled turbine

    NASA Technical Reports Server (NTRS)

    Mcdonel, J. D.; Eiswerth, J. E.

    1977-01-01

    Tests have been conducted on a 20-inch diameter single-stage air-cooled turbine designed to evaluate the effects of film cooling air on turbine aerodynamic performance. The present paper reports the results of five test configurations, including two different cooling designs and three combinations of cooled and solid airfoils. A comparison is made of the experimental results with a previously published analytical method of evaluating coolant injection effects on turbine performance.

  9. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use. PMID:26254160
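
    AutoVAR itself is a Web application, but the model-selection step it automates can be sketched in a few lines. The snippet below is only an illustration using statsmodels with simulated diary data, not the AutoVAR code; the variable names and search space are hypothetical.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Hypothetical EMA diary data: 90 daily observations of two variables.
        rng = np.random.default_rng(0)
        ema = pd.DataFrame(rng.standard_normal((90, 2)), columns=["activity", "mood"])

        # Fit every lag order in a small search space and keep the model with
        # the lowest Bayesian information criterion.
        best = min((VAR(ema).fit(lag) for lag in range(1, 5)), key=lambda r: r.bic)
        print(best.k_ar, best.aic, best.bic)

        # Granger causality test: does "activity" help predict "mood"?
        print(best.test_causality("mood", ["activity"], kind="f").summary())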

  10. Ochratoxin A in cocoa and chocolate sampled in Canada.

    PubMed

    Turcotte, A-M; Scott, P M

    2011-06-01

    In order to determine the levels of ochratoxin A (OTA) in cocoa and cocoa products available in Canada, a previously published analytical method, with minor modifications to the extraction and immunoaffinity clean-up and inclusion of an evaporation step, was initially used (Method I). To improve the low method recoveries (46-61%), 40% methanol was then included in the aqueous sodium bicarbonate extraction solvent (pH 7.8) (Method II). Clean-up was on an Ochratest™ immunoaffinity column and OTA was determined by liquid chromatography (LC) with fluorescence detection. Recoveries of OTA from spiked cocoa powder (0.5 and 5 ng g(-1)) were 75-84%; while recoveries from chocolate were 93-94%. The optimized method was sensitive (limit of quantification (LOQ) = 0.07-0.08 ng g(-1)), accurate (recovery = 75-94%) and precise (coefficient of variation (CV) < 5%). It is applicable to cocoa and chocolate. Analysis of 32 samples of cocoa powder (16 alkalized and 16 natural) for OTA showed an incidence of 100%, with concentrations ranging from 0.25 to 7.8 ng g(-1); in six samples the OTA level exceeded 2 ng g(-1), the previously considered European Union limit for cocoa. The frequency of detection of OTA in 28 chocolate samples (21 dark or baking chocolate and seven milk chocolate) was also 100% with concentrations ranging from 0.05 to 1.4 ng g(-1); one sample had a level higher than the previously considered European Union limit for chocolate (1 ng g(-1)).

  11. Economic evaluation in collaborative hospital drug evaluation reports.

    PubMed

    Ortega, Ana; Fraga, María Dolores; Marín-Gil, Roberto; Lopez-Briz, Eduardo; Puigventós, Francesc; Dranitsaris, George

    2015-09-01

    Economic evaluation is a fundamental criterion when deciding a drug's place in therapy. The MADRE method (Method for Assistance in making Decisions and Writing Drug Evaluation Reports), developed by the GENESIS group of the Spanish Society of Hospital Pharmacy (SEFH) and including economic evaluation, is widely used for drug evaluation. We intend to improve the economic aspects of this method; to decide the direction to take, we first have to analyze our previous experience with the current methodology and propose the necessary improvements. Economic evaluation sections in drug evaluation reports conducted collaboratively (as the scientific society, SEFH) with the MADRE method were reviewed retrospectively. Thirty-two reports were reviewed; 87.5% of them included an economic evaluation conducted by the authors and 65.6% contained published economic evaluations. In 90.6% of the reports, a budget impact analysis was conducted. The cost per life year gained or per quality-adjusted life year gained was present in 14 reports. Twenty-three reports received public comments regarding the need to improve the economic aspect. Main difficulties: low-quality evidence in the target population, no comparative studies with a relevant comparator, non-final outcomes evaluated, no quality-of-life data, no fixed drug price available, dosing uncertainty, and different prices for the same drug. Proposed improvements: incorporating different forms of aid for non-drug costs, survival estimation and adaptation of published economic evaluations; establishing criteria for drug price selection, decision-making under uncertainty and poor-quality evidence, dose calculation, and cost-effectiveness thresholds for different situations. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  12. A guide to critiquing a research paper. Methodological appraisal of a paper on nurses in abortion care.

    PubMed

    Lipp, Allyson; Fothergill, Anne

    2015-03-01

    In this paper, we have taken a previously published article on nurses' judgements in abortion care and performed a systematic critique of the merits of this research using a recognised critiquing framework. The qualitative paper chosen for the critique is a grounded theory design, and the research terms and terminology associated with this method, such as symbolic interactionism, are defined. The published paper reported on findings from a study exploring the characteristics of nurses in abortion care. A published critiquing tool has been applied. It was chosen because it is pragmatic, clearly laid out and accessible as full text to the people likely to need it. It comprises two stages, the first of which centres on the believability of the research. The second stage is more detailed and examines the research process and establishes the credibility of the research in its application to practice. The aim is to develop critical and analytical skills through methodically appraising the merits of published research. Nursing as an evidence-based profession requires nurses at both pre- and post-registration levels to be able to understand, synthesise and critique research, this being a fundamental part of many nursing curricula. These have become core skills to acquire, since implementing up-to-date evidence is the cornerstone of contemporary nursing practice. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. A wavelet method for modeling and despiking motion artifacts from resting-state fMRI time series

    PubMed Central

    Patel, Ameera X.; Kundu, Prantik; Rubinov, Mikail; Jones, P. Simon; Vértes, Petra E.; Ersche, Karen D.; Suckling, John; Bullmore, Edward T.

    2014-01-01

    The impact of in-scanner head movement on functional magnetic resonance imaging (fMRI) signals has long been established as undesirable. These effects have been traditionally corrected by methods such as linear regression of head movement parameters. However, a number of recent independent studies have demonstrated that these techniques are insufficient to remove motion confounds, and that even small movements can spuriously bias estimates of functional connectivity. Here we propose a new data-driven, spatially-adaptive, wavelet-based method for identifying, modeling, and removing non-stationary events in fMRI time series, caused by head movement, without the need for data scrubbing. This method involves the addition of just one extra step, the Wavelet Despike, in standard pre-processing pipelines. With this method, we demonstrate robust removal of a range of different motion artifacts and motion-related biases including distance-dependent connectivity artifacts, at a group and single-subject level, using a range of previously published and new diagnostic measures. The Wavelet Despike is able to accommodate the substantial spatial and temporal heterogeneity of motion artifacts and can consequently remove a range of high and low frequency artifacts from fMRI time series, that may be linearly or non-linearly related to physical movements. Our methods are demonstrated by the analysis of three cohorts of resting-state fMRI data, including two high-motion datasets: a previously published dataset on children (N = 22) and a new dataset on adults with stimulant drug dependence (N = 40). We conclude that there is a real risk of motion-related bias in connectivity analysis of fMRI data, but that this risk is generally manageable, by effective time series denoising strategies designed to attenuate synchronized signal transients induced by abrupt head movements. The Wavelet Despiking software described in this article is freely available for download at www.brainwavelet.org. PMID:24657353
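
    The published Wavelet Despike software is available at www.brainwavelet.org; the toy snippet below is not that implementation, only an illustration of the general idea of suppressing abrupt, motion-like transients by thresholding wavelet detail coefficients (here with PyWavelets on a simulated single-voxel series).

        import numpy as np
        import pywt

        # Simulated voxel time series with an abrupt, motion-like transient.
        rng = np.random.default_rng(1)
        ts = rng.standard_normal(256)
        ts[100:104] += 8.0

        # Decompose, zero out detail coefficients that exceed a robust
        # (MAD-based) threshold, and reconstruct the despiked series.
        coeffs = pywt.wavedec(ts, "db4", level=4)
        despiked = [coeffs[0]]
        for d in coeffs[1:]:
            sigma = np.median(np.abs(d)) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(ts.size))
            despiked.append(np.where(np.abs(d) > thr, 0.0, d))

        clean = pywt.waverec(despiked, "db4")[: ts.size]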

  14. A method for rapid, targeted CNV genotyping identifies rare variants associated with neurocognitive disease.

    PubMed

    Mefford, Heather C; Cooper, Gregory M; Zerr, Troy; Smith, Joshua D; Baker, Carl; Shafer, Neil; Thorland, Erik C; Skinner, Cindy; Schwartz, Charles E; Nickerson, Deborah A; Eichler, Evan E

    2009-09-01

    Copy-number variants (CNVs) are substantial contributors to human disease. A central challenge in CNV-disease association studies is to characterize the pathogenicity of rare and possibly incompletely penetrant events, which requires the accurate detection of rare CNVs in large numbers of individuals. Cost and throughput issues limit our ability to perform these studies. We have adapted the Illumina BeadXpress SNP genotyping assay and developed an algorithm, SNP-Conditional OUTlier detection (SCOUT), to rapidly and accurately detect both rare and common CNVs in large cohorts. This approach is customizable, cost effective, highly parallelized, and largely automated. We applied this method to screen 69 loci in 1105 children with unexplained intellectual disability, identifying pathogenic variants in 3.1% of these individuals and potentially pathogenic variants in an additional 2.3%. We identified seven individuals (0.7%) with a deletion of 16p11.2, which has been previously associated with autism. Our results widen the phenotypic spectrum of these deletions to include intellectual disability without autism. We also detected 1.65-3.4 Mbp duplications at 16p13.11 in 1.1% of affected individuals and 350 kbp deletions at 15q11.2, near the Prader-Willi/Angelman syndrome critical region, in 0.8% of affected individuals. Compared to published CNVs in controls, they are significantly (P = 4.7 × 10⁻⁵ and 0.003, respectively) enriched in these children, supporting previously published hypotheses that they are neurocognitive disease risk factors. More generally, this approach offers a previously unavailable balance between customization, cost, and throughput for analysis of CNVs and should prove valuable for targeted CNV detection in both research and diagnostic settings.

  15. Comparison of velocity-log data collected using impeller and electromagnetic flowmeters

    USGS Publications Warehouse

    Newhouse, M.W.; Izbicki, J.A.; Smith, G.A.

    2005-01-01

    Previous studies have used flowmeters in environments that are within the expectations of their published ranges. Electromagnetic flowmeters have a published range from 0.1 to 79.0 m/min, and impeller flowmeters have a published range from 1.2 to 61.0 m/min. Velocity-log data collected in five long-screened production wells in the Pleasant Valley area of southern California showed that (1) electromagnetic flowmeter results were comparable within ±2% to results obtained using an impeller flowmeter for comparable depths; (2) the measured velocities from the electromagnetic flowmeter were up to 36% greater than the published maximum range; and (3) both data sets, collected without the use of centralizers or flow diverters, produced comparable and interpretable results. Although either method is acceptable for measuring wellbore velocities and the distribution of flow, the electromagnetic flowmeter enables collection of data over a now greater range of flows. In addition, changes in fluid temperature and fluid resistivity, collected as part of the electromagnetic flowmeter log, are useful in the identification of flow and hydrogeologic interpretation.

  16. Comparison of velocity-log data collected using impeller and electromagnetic flowmeters.

    PubMed

    Newhouse, M W; Izbicki, J A; Smith, G A

    2005-01-01

    Previous studies have used flowmeters in environments that are within the expectations of their published ranges. Electromagnetic flowmeters have a published range from 0.1 to 79.0 m/min, and impeller flowmeters have a published range from 1.2 to 61.0 m/min. Velocity-log data collected in five long-screened production wells in the Pleasant Valley area of southern California showed that (1) electromagnetic flowmeter results were comparable within +/-2% to results obtained using an impeller flowmeter for comparable depths; (2) the measured velocities from the electromagnetic flowmeter were up to 36% greater than the published maximum range; and (3) both data sets, collected without the use of centralizers or flow diverters, produced comparable and interpretable results. Although either method is acceptable for measuring wellbore velocities and the distribution of flow, the electromagnetic flowmeter enables collection of data over a now greater range of flows. In addition, changes in fluid temperature and fluid resistivity, collected as part of the electromagnetic flowmeter log, are useful in the identification of flow and hydrogeologic interpretation.

  17. Gradient-based interpolation method for division-of-focal-plane polarimeters.

    PubMed

    Gao, Shengkui; Gruev, Viktor

    2013-01-14

    Recent advancements in nanotechnology and nanofabrication have allowed for the emergence of the division-of-focal-plane (DoFP) polarization imaging sensors. These sensors capture polarization properties of the optical field at every imaging frame. However, the DoFP polarization imaging sensors suffer from large registration error as well as reduced spatial-resolution output. These drawbacks can be improved by applying proper image interpolation methods for the reconstruction of the polarization results. In this paper, we present a new gradient-based interpolation method for DoFP polarimeters. The performance of the proposed interpolation method is evaluated against several previously published interpolation methods by using visual examples and root mean square error (RMSE) comparison. We found that the proposed gradient-based interpolation method can achieve better visual results while maintaining a lower RMSE than other interpolation methods under various dynamic ranges of a scene ranging from dim to bright conditions.
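
    The gradient-based interpolation algorithm itself is not reproduced here; for context, the snippet below only shows the standard linear-Stokes reconstruction that the four interpolated analyzer channels feed into. The array names are hypothetical and the relations are the usual ones for 0°, 45°, 90° and 135° analyzers, not anything specific to the proposed method.

        import numpy as np

        def stokes_from_dofp(i0, i45, i90, i135):
            """Standard linear Stokes parameters from four analyzer channels."""
            s0 = 0.5 * (i0 + i45 + i90 + i135)                      # total intensity
            s1 = i0 - i90
            s2 = i45 - i135
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)   # degree of linear polarization
            aop = 0.5 * np.arctan2(s2, s1)                          # angle of polarization
            return s0, s1, s2, dolp, aop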

  18. Modal parameter identification using the log decrement method and band-pass filters

    NASA Astrophysics Data System (ADS)

    Liao, Yabin; Wells, Valana

    2011-10-01

    This paper presents a time-domain technique for identifying modal parameters of test specimens based on the log-decrement method. For lightly damped multidegree-of-freedom or continuous systems, the conventional method is usually restricted to identification of fundamental-mode parameters only. Implementation of band-pass filters makes it possible for the proposed technique to extract modal information of higher modes. The method has been applied to a polymethyl methacrylate (PMMA) beam for complex modulus identification in the frequency range 10-1100 Hz. Results compare well with those obtained using the Least Squares method, and with those previously published in the literature. The accuracy of the proposed method was then further verified by experiments performed on a QuietSteel specimen with very low damping. The method is simple and fast. It can be used for a quick estimation of the modal parameters, or as a complementary approach for validation purposes.
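
    A minimal sketch of the general recipe described above (band-pass filtering to isolate one mode of a free-decay record, then applying the logarithmic-decrement formula to successive peaks); the function, signal and parameter names are illustrative and not taken from the paper.

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def modal_damping(x, fs, f_low, f_high):
            """Estimate the damping ratio of one mode from a free-decay record
            x sampled at fs, using a band-pass filter and the log decrement."""
            b, a = butter(4, [f_low, f_high], btype="bandpass", fs=fs)
            y = filtfilt(b, a, x)
            peaks, _ = find_peaks(y)
            amps = y[peaks]
            n = len(amps) - 1
            delta = np.log(amps[0] / amps[-1]) / n             # log decrement per cycle
            zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)    # damping ratio
            return delta, zeta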

  19. Assessment of published models and prognostic variables in epithelial ovarian cancer at Mayo Clinic

    PubMed Central

    Hendrickson, Andrea Wahner; Hawthorne, Kieran M.; Goode, Ellen L.; Kalli, Kimberly R.; Goergen, Krista M.; Bakkum-Gamez, Jamie N.; Cliby, William A.; Keeney, Gary L.; Visscher, Dan W.; Tarabishy, Yaman; Oberg, Ann L.; Hartmann, Lynn C.; Maurer, Matthew J.

    2015-01-01

    Objectives Epithelial ovarian cancer (EOC) is an aggressive disease in which first line therapy consists of a surgical staging/debulking procedure and platinum based chemotherapy. There is significant interest in clinically applicable, easy to use prognostic tools to estimate risk of recurrence and overall survival. In this study we used a large prospectively collected cohort of women with EOC to validate currently published models and assess prognostic variables. Methods Women with invasive ovarian, peritoneal, or fallopian tube cancer diagnosed between 2000-2011 and prospectively enrolled into the Mayo Clinic Ovarian Cancer registry were identified. Demographics and known prognostic markers as well as epidemiologic exposure variables were abstracted from the medical record and collected via questionnaire. Six previously published models of overall and recurrence-free survival were assessed for external validity. In addition, predictors of outcome were assessed in our dataset. Results Previously published models validated with a range of c-statistics (0.587-0.827), though application of models containing variables not part of routine practice were somewhat limited by missing data; utilization of all applicable models and comparison of results is suggested. Examination of prognostic variables identified only the presence of ascites and ASA score to be independent predictors of prognosis in our dataset, albeit with marginal gain in prognostic information, after accounting for stage and debulking. Conclusions Existing prognostic models for newly diagnosed EOC showed acceptable calibration in our cohort for clinical application. However, modeling of prospective variables in our dataset reiterates that stage and debulking remain the most important predictors of prognosis in this setting. PMID:25620544

  20. A method to measure the ozone penetration factor in residences under infiltration conditions: application in a multifamily apartment unit.

    PubMed

    Zhao, H; Stephens, B

    2016-08-01

    Recent experiments have demonstrated that outdoor ozone reacts with materials inside residential building enclosures, potentially reducing indoor exposures to ozone or altering ozone reaction byproducts. However, test methods to measure ozone penetration factors in residences (P) remain limited. We developed a method to measure ozone penetration factors in residences under infiltration conditions and applied it in an unoccupied apartment unit. Twenty-four repeated measurements were made, and results were explored to (i) evaluate the accuracy and repeatability of the new procedure using multiple solution methods, (ii) compare results from 'interference-free' and conventional UV absorbance ozone monitors, and (iii) compare results against those from a previously published test method requiring artificial depressurization. The mean (±s.d.) estimate of P was 0.54 ± 0.10 across a wide range of conditions using the new method with an interference-free monitor; the conventional monitor was unable to yield meaningful results due to relatively high limits of detection. Estimates of P were not clearly influenced by any indoor or outdoor environmental conditions or changes in indoor decay rate constants. This work represents the first known measurements of ozone penetration factors in a residential building operating under natural infiltration conditions and provides a new method for widespread application in buildings. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
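
    For context on how a penetration factor relates to measurable quantities, the snippet below uses a standard single-zone, steady-state mass balance; this is not necessarily the solution method applied in the study, and every input value is hypothetical.

        # Steady-state single-zone ozone balance under infiltration:
        #   C_in = P * lam * C_out / (lam + k)
        # rearranged for the penetration factor P.
        lam = 0.4                  # hypothetical air exchange rate, 1/h
        k = 2.5                    # hypothetical indoor ozone decay rate, 1/h
        c_out, c_in = 40.0, 3.0    # hypothetical outdoor / indoor ozone, ppb

        P = (c_in / c_out) * (lam + k) / lam
        print(round(P, 2))         # ~0.54 with these made-up inputs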

  1. Visual analog rating of mood by people with aphasia.

    PubMed

    Haley, Katarina L; Womack, Jennifer L; Harmon, Tyson G; Williams, Sharon W

    2015-08-01

    Considerable attention has been given to the identification of depression in stroke survivors with aphasia, but there is more limited information about other mood states. Visual analog scales are often used to collect subjective information from people with aphasia. However, the validity of these methods for communicating about mood has not been established in people with moderately to severely impaired language. The dual purposes of this study were to characterize the relative endorsement of negative and positive mood states in people with chronic aphasia after stroke and to examine congruent validity for visual analog rating methods for people with a range of aphasia severity. Twenty-three left-hemisphere stroke survivors with aphasia were asked to indicate their present mood by using two published visual analog rating methods. The congruence between the methods was estimated through correlation analysis, and scores for different moods were compared. Endorsement was significantly stronger for "happy" than for mood states with negative valence. At the same time, several participants displayed pronounced negative mood compared to previously published norms for neurologically healthy adults. Results from the two rating methods were moderately and positively correlated. Positive mood is prominent in people with aphasia who are in the chronic stage of recovery after stroke, but negative moods can also be salient and individual presentations are diverse. Visual analog rating methods are valid methods for discussing mood with people with aphasia; however, design optimization should be explored.

  2. The challenge of Parkinson's disease management in Africa.

    PubMed

    Dotchin, C L; Msuya, O; Walker, R W

    2007-03-01

    Parkinson's disease (PD) is said to be less common in Africa than elsewhere in the world, but previous studies have been based on small numbers. Also, the differences may be due to the diagnostic criteria used, case finding methods and different population age structures. Developing countries have few facilities for chronic disease management and non-communicable diseases, although on the increase, tend to play second fiddle to malaria and HIV/AIDS. Previous reports suggest that, at least from anecdotal information, under-diagnosis of PD is common and long-term availability of medication, follow-up, patient education and multidisciplinary input is lacking. Published literature is scarce and there is a lack of recent information. We are currently conducting a door-to-door prevalence study in northern Tanzania in a population of 161,162. We have reviewed previous literature on PD in Africa and illustrate our personal experience of PD and its management in Africa with three cases.

  3. Method-centered digital communities on protocols.io for fast-paced scientific innovation.

    PubMed

    Kindler, Lori; Stoliartchouk, Alexei; Teytelman, Leonid; Hurwitz, Bonnie L

    2016-01-01

    The Internet has enabled online social interaction for scientists beyond physical meetings and conferences. Yet despite these innovations in communication, dissemination of methods is often relegated to just academic publishing. Further, these methods remain static, with subsequent advances published elsewhere and unlinked. For communities undergoing fast-paced innovation, researchers need new capabilities to share, obtain feedback, and publish methods at the forefront of scientific development. For example, a renaissance in virology is now underway given the new metagenomic methods to sequence viral DNA directly from an environment. Metagenomics makes it possible to "see" natural viral communities that could not be previously studied through culturing methods. Yet, the knowledge of specialized techniques for the production and analysis of viral metagenomes remains in a subset of labs.  This problem is common to any community using and developing emerging technologies and techniques. We developed new capabilities to create virtual communities in protocols.io, an open access platform, for disseminating protocols and knowledge at the forefront of scientific development. To demonstrate these capabilities, we present a virology community forum called VERVENet. These new features allow virology researchers to share protocols and their annotations and optimizations, connect with the broader virtual community to share knowledge, job postings, conference announcements through a common online forum, and discover the current literature through personalized recommendations to promote discussion of cutting edge research. Virtual communities in protocols.io enhance a researcher's ability to: discuss and share protocols, connect with fellow community members, and learn about new and innovative research in the field.  The web-based software for developing virtual communities is free to use on protocols.io. Data are available through public APIs at protocols.io.

  4. When ab ≠ c - c': published errors in the reports of single-mediator models.

    PubMed

    Petrocelli, John V; Clarkson, Joshua J; Whitmire, Melanie B; Moon, Paul E

    2013-06-01

    Accurate reports of mediation analyses are critical to the assessment of inferences related to causality, since these inferences are consequential for both the evaluation of previous research (e.g., meta-analyses) and the progression of future research. However, upon reexamination, approximately 15% of published articles in psychology contain at least one incorrect statistical conclusion (Bakker & Wicherts, Behavior research methods, 43, 666-678 2011), disparities that beget the question of inaccuracy in mediation reports. To quantify this question of inaccuracy, articles reporting standard use of single-mediator models in three high-impact journals in personality and social psychology during 2011 were examined. More than 24% of the 156 models coded failed an equivalence test (i.e., ab = c - c'), suggesting that one or more regression coefficients in mediation analyses are frequently misreported. The authors cite common sources of errors, provide recommendations for enhanced accuracy in reports of single-mediator models, and discuss implications for alternative methods.
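
    A small simulation (with made-up data) illustrates the identity the authors use as their consistency check: for ordinary least squares with complete data, the indirect effect a*b equals the difference between the total and direct effects, c - c'.

        import numpy as np
        import statsmodels.api as sm

        # Simulated single-mediator data: X -> M -> Y plus a direct X -> Y path.
        rng = np.random.default_rng(2)
        n = 500
        x = rng.standard_normal(n)
        m = 0.5 * x + rng.standard_normal(n)
        y = 0.4 * m + 0.3 * x + rng.standard_normal(n)

        c = sm.OLS(y, sm.add_constant(x)).fit().params[1]      # total effect
        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]      # X -> M
        full = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params
        c_prime, b = full[1], full[2]                          # direct effect, M -> Y

        print(np.isclose(a * b, c - c_prime))                  # True: ab == c - c'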

  5. Bloodmeal Identification in Field-Collected Sand Flies From Casa Branca, Brazil, Using the Cytochrome b PCR Method.

    PubMed

    Carvalho, G M L; Rêgo, F D; Tanure, A; Silva, A C P; Dias, T A; Paz, G F; Andrade Filho, J D

    2017-07-01

    PCR-based identification of vertebrate host bloodmeals has been performed on several vectors species with success. In the present study, we used a previously published PCR protocol followed by DNA sequencing based on primers designed from multiple alignments of the mitochondrial cytochrome b gene used to identify avian and mammalian hosts of various hematophagous vectors. The amplification of a fragment encoding a 359 bp sequence of the Cyt b gene yielded recognized amplification products in 192 female sand flies (53%), from a total of 362 females analyzed. In the study area of Casa Branca, Brazil, blood-engorged female sand flies such as Lutzomyia longipalpis (Lutz & Neiva, 1912), Migonemyia migonei (França, 1924), and Nyssomyia whitmani (Antunes & Coutinho, 1939) were analyzed for bloodmeal sources. The PCR-based method identified human, dog, chicken, and domestic rat blood sources. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Externalities and article citations: experience of a national public health journal (Gaceta Sanitaria).

    PubMed

    Ruano-Ravina, Alberto; Álvarez-Dardet, Carlos; Domínguez-Berjón, M Felicitas; Fernández, Esteve; García, Ana M; Borrell, Carme

    2016-01-01

    The purpose of the study was to analyze the determinants of citations such as publication year, article type, article topic, article selected for a press release, number of articles previously published by the corresponding author, and publication language in a Spanish journal of public health. Observational study including all articles published in Gaceta Sanitaria during 2007-2011. We retrieved the number of citations from the ISI Web of Knowledge database in June 2013 and also information on other variables such as number of articles published by the corresponding author in the previous 5 years (searched through PubMed), selection for a press release, publication language, article type and topic, and others. We included 542 articles. Of these, 62.5% were cited in the period considered. We observed an increased odds ratio of citations for articles selected for a press release and also with the number of articles published previously by the corresponding author. Articles published in English do not seem to increase their citations. Certain externalities such as number of articles published by the corresponding author and being selected for a press release seem to influence the number of citations in national journals. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. The magnetic fields of hot subdwarf stars

    NASA Astrophysics Data System (ADS)

    Landstreet, J. D.; Bagnulo, S.; Fossati, L.; Jordan, S.; O'Toole, S. J.

    2012-05-01

    Context. Detection of magnetic fields has been reported in several sdO and sdB stars. Recent literature has cast doubts on the reliability of most of these detections. The situation concerning the occurrence and frequency of magnetic fields in hot subdwarfs is at best confused. Aims: We revisit data previously published in the literature, and we present new observations to clarify the question of how common magnetic fields are in subdwarf stars. Methods: We consider a sample of about 40 hot subdwarf stars. About 30 of them have been observed with the FORS1 and FORS2 instruments of the ESO VLT. Results have been published for only about half of the hot subdwarfs observed with FORS. Here we present new FORS1 field measurements for 17 stars, 14 of which have never been observed for magnetic fields before. We also critically review the measurements already published in the literature, and in particular we try to explain why previous papers based on the same FORS1 data have reported contradictory results. Results: All new and re-reduced measurements obtained with FORS1 are shown to be consistent with non-detection of magnetic fields. We explain previous spurious field detections from data obtained with FORS1 as due to a non-optimal method of wavelength calibration. Field detections in other surveys are found to be uncertain or doubtful, and certainly in need of confirmation. Conclusions: There is presently no strong evidence for the occurrence of a magnetic field in any sdB or sdO star, with typical longitudinal field uncertainties of the order of 2-400 G. It appears that globally simple fields of more than about 1 or 2 kG in strength occur in at most a few percent of hot subdwarfs. Further high-precision surveys, both with high-resolution spectropolarimeters and with instruments similar to FORS1 on large telescopes, would be very valuable. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile under observing programmes 072.D-0290 and 075.D-0352, or obtained from the ESO/ST-ECF Science Archive Facility.

  8. Resistive method for measuring the disintegration speed of Prince Rupert's drops

    NASA Astrophysics Data System (ADS)

    Bochkov, Mark; Gusenkova, Daria; Glushkov, Evgenii; Zotova, Julia; Zhabin, S. N.

    2016-09-01

    We have successfully applied the resistance grid technique to measure the disintegration speed in a special type of glass objects, widely known as Prince Rupert's drops. We use a fast digital oscilloscope and a simple electrical circuit, glued to the surface of the drops, to detect the voltage changes, corresponding to the breaks in the specific parts of the drops. The results obtained using this method are in good qualitative and quantitative agreement with theoretical predictions and previously published data. Moreover, the proposed experimental setup does not include any expensive equipment (such as a high-speed camera) and can therefore be widely used in high schools and universities.

  9. Partial gene sequences for the A subunit of methyl-coenzyme M reductase (mcrI) as a phylogenetic tool for the family Methanosarcinaceae

    NASA Technical Reports Server (NTRS)

    Springer, E.; Sachs, M. S.; Woese, C. R.; Boone, D. R.

    1995-01-01

    Representatives of the family Methanosarcinaceae were analyzed phylogenetically by comparing partial sequences of their methyl-coenzyme M reductase (mcrI) genes. A 490-bp fragment from the A subunit of the gene was selected, amplified by the PCR, cloned, and sequenced for each of 25 strains belonging to the Methanosarcinaceae. The sequences obtained were aligned with the corresponding portions of five previously published sequences, and all of the sequences were compared to determine phylogenetic distances by Fitch distance matrix methods. We prepared analogous trees based on 16S rRNA sequences; these trees corresponded closely to the mcrI trees, although the mcrI sequences of pairs of organisms had 3.01 +/- 0.541 times more changes than the respective pairs of 16S rRNA sequences, suggesting that the mcrI fragment evolved about three times more rapidly than the 16S rRNA gene. The qualitative similarity of the mcrI and 16S rRNA trees suggests that transfer of genetic information between dissimilar organisms has not significantly affected these sequences, although we found inconsistencies between some mcrI distances that we measured and previously published DNA reassociation data. It is unlikely that multiple mcrI isogenes were present in the organisms that we examined, because we found no major discrepancies in multiple determinations of mcrI sequences from the same organism. Our primers for the PCR also match analogous sites in the previously published mcrII sequences, but all of the sequences that we obtained from members of the Methanosarcinaceae were more closely related to mcrI sequences than to mcrII sequences, suggesting that members of the Methanosarcinaceae do not have distinct mcrII genes.

  10. Two dimensional wavefront retrieval using lateral shearing interferometry

    NASA Astrophysics Data System (ADS)

    Mancilla-Escobar, B.; Malacara-Hernández, Z.; Malacara-Hernández, D.

    2018-06-01

    A new zonal two-dimensional method for wavefront retrieval from a surface under test using lateral shearing interferometry is presented. A modified Saunders method and phase shifting techniques are combined to generate a method for wavefront reconstruction. The result is a wavefront with an error below 0.7 λ and without any global high frequency filtering. A zonal analysis over square cells along the surfaces is made, obtaining a polynomial expression for the wavefront deformations over each cell. The main advantage of this method over previously published methods is that a global filtering of high spatial frequencies is not present. Thus, a global smoothing of the wavefront deformations is avoided, allowing the detection of deformations with relatively small extensions, that is, with high spatial frequencies. Additionally, local curvature and low order aberration coefficients are obtained in each cell.

  11. An exact noniterative linear method for locating sources based on measuring receiver arrival times.

    PubMed

    Militello, C; Buenafuente, S R

    2007-06-01

    In this paper an exact, linear solution to the source localization problem based on the time of arrival at the receivers is presented. The method is unique in that the source's position can be obtained by solving a system of linear equations, three for a plane and four for a volume. This simplification means adding an additional receiver to the minimum mathematically required (3+1 in two dimensions and 4+1 in three dimensions). The equations are easily worked out for any receiver configuration and their geometrical interpretation is straightforward. Unlike other methods, the system of reference used to describe the receivers' positions is completely arbitrary. The relationship between this method and previously published ones is discussed, showing how the present, more general, method overcomes nonlinearity and unknown dependency issues.
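
    A minimal sketch of this style of linearization (a textbook formulation, not necessarily the exact equations of the paper): subtracting the squared range equation of a reference receiver from the others cancels the quadratic term in the source coordinates, leaving a system that is linear in the source position and the unknown emission time, which is why one receiver beyond the mathematical minimum is required.

        import numpy as np

        def locate_2d(receivers, toas, c=343.0):
            """Noniterative 2-D source localization from times of arrival.
            receivers: (N, 2) array with N >= 4; toas: (N,) arrival times."""
            r = np.asarray(receivers, dtype=float)
            t = np.asarray(toas, dtype=float)
            r0, t0 = r[0], t[0]
            # Unknowns u = [x, y, tau] (source coordinates and emission time).
            A = np.column_stack([
                -2.0 * (r[1:] - r0),
                2.0 * c**2 * (t[1:] - t0),
            ])
            b = c**2 * (t[1:]**2 - t0**2) - (np.sum(r[1:]**2, axis=1) - np.sum(r0**2))
            u, *_ = np.linalg.lstsq(A, b, rcond=None)
            return u[:2], u[2]

        # Quick check with a synthetic source at (3, 4) emitting at tau = 0.01 s.
        src, tau = np.array([3.0, 4.0]), 0.01
        rec = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, -5.0]])
        toas = tau + np.linalg.norm(rec - src, axis=1) / 343.0
        print(locate_2d(rec, toas))   # recovers ~(3, 4) and ~0.01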

  12. Bas-relief generation using adaptive histogram equalization.

    PubMed

    Sun, Xianfang; Rosin, Paul L; Martin, Ralph R; Langbein, Frank C

    2009-01-01

    An algorithm is presented to automatically generate bas-reliefs based on adaptive histogram equalization (AHE), starting from an input height field. A mesh model may alternatively be provided, in which case a height field is first created via orthogonal or perspective projection. The height field is regularly gridded and treated as an image, enabling a modified AHE method to be used to generate a bas-relief with a user-chosen height range. We modify the original image-contrast-enhancement AHE method to use gradient weights also to enhance the shape features of the bas-relief. To effectively compress the height field, we limit the height-dependent scaling factors used to compute relative height variations in the output from height variations in the input; this prevents any height differences from having too great effect. Results of AHE over different neighborhood sizes are averaged to preserve information at different scales in the resulting bas-relief. Compared to previous approaches, the proposed algorithm is simple and yet largely preserves original shape features. Experiments show that our results are, in general, comparable to and in some cases better than the best previously published methods.
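
    A minimal sketch of the core idea, using scikit-image's CLAHE as a stand-in for the modified AHE described above; the gradient weighting, height-dependent scaling limits and multi-scale averaging of the published algorithm are omitted, and all parameter values are illustrative.

        import numpy as np
        from skimage import exposure

        def bas_relief(height_field, out_range=1.0, kernel_size=64, clip_limit=0.02):
            """Compress a height field into a shallow relief via adaptive
            histogram equalization, then rescale to a user-chosen height range."""
            h = np.asarray(height_field, dtype=float)
            h = (h - h.min()) / (np.ptp(h) + 1e-12)     # normalize to [0, 1]
            eq = exposure.equalize_adapthist(h, kernel_size=kernel_size,
                                             clip_limit=clip_limit)
            return eq * out_range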

  13. Compact determination of hydrogen isotopes

    DOE PAGES

    Robinson, David

    2017-04-06

    Scanning calorimetry of a confined, reversible hydrogen sorbent material has been previously proposed as a method to determine compositions of unknown mixtures of diatomic hydrogen isotopologues and helium. Application of this concept could result in greater process knowledge during the handling of these gases. Previously published studies have focused on mixtures that do not include tritium. This paper focuses on modeling to predict the effect of tritium in mixtures of the isotopologues on a calorimetry scan. Furthermore, the model predicts that tritium can be measured with a sensitivity comparable to that observed for hydrogen-deuterium mixtures, and that under some conditions, it may be possible to determine the atomic fractions of all three isotopes in a gas mixture.

  14. Temperature and heat flux changes at the base of Laurentide ice sheet inferred from geothermal data (evidence from province of Alberta, Canada)

    NASA Astrophysics Data System (ADS)

    Demezhko, Dmitry; Gornostaeva, Anastasia; Majorowicz, Jacek; Šafanda, Jan

    2018-01-01

    Using a previously published temperature log of the 2363-m-deep borehole Hunt well (Alberta, Canada) and the results of its previous interpretation, the new reconstructions of ground surface temperature and surface heat flux histories for the last 30 ka have been obtained. Two ways to adjust the timescale of geothermal reconstructions are discussed, namely the traditional method based on the a priori data on thermal diffusivity value, and the alternative one including the orbital tuning of the surface heat flux and the Earth's insolation changes. It is shown that the second approach provides better agreement between geothermal reconstructions and proxy evidences of deglaciation chronology in the studied region.

  15. The Impact of Guided Notes on Post-Secondary Student Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2013-01-01

    The common practice of using of guided notes in the post-secondary classroom is not fully appreciated or understood. In an effort to add to the existing research about this phenomenon, the current investigation expands on previously published research and one previously published meta-analysis that examined the impact of guided notes on…

  16. Optimization of the design of Gas Cherenkov Detectors for ICF diagnosis

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Hu, Huasi; Han, Hetong; Lv, Huanwen; Li, Lan

    2018-07-01

    A design method, which combines a genetic algorithm (GA) with Monte-Carlo simulation, is established and applied to two different types of Cherenkov detectors, namely, the Gas Cherenkov Detector (GCD) and Gamma Reaction History (GRH). To accelerate the optimization program, Open Message Passing Interface (MPI) is used in the Geant4 simulation. Compared with the traditional optical ray-tracing method, the performance of these detectors has been improved with the optimization method. The efficiency of the GCD system, with a threshold of 6.3 MeV, is enhanced by ∼20% and the time response is improved by ∼7.2%. For the GRH system, with a threshold of 10 MeV, the efficiency is enhanced by ∼76% in comparison with previously published results.

  17. An improved method for estimating capillary pressure from 3D microtomography images and its application to the study of disconnected nonwetting phase

    NASA Astrophysics Data System (ADS)

    Li, Tianyi; Schlüter, Steffen; Dragila, Maria Ines; Wildenschild, Dorthe

    2018-04-01

    We present an improved method for estimating interfacial curvatures from x-ray computed microtomography (CMT) data that significantly advances the potential for this tool to unravel the mechanisms and phenomena associated with multi-phase fluid motion in porous media. CMT data, used to analyze the spatial distribution and capillary pressure-saturation (Pc-S) relationships of liquid phases, requires accurate estimates of interfacial curvature. Our improved method for curvature estimation combines selective interface modification and distance weighting approaches. It was verified against synthetic (analytical computer-generated) and real image data sets, demonstrating a vast improvement over previous methods. Using this new tool on a previously published data set (multiphase flow) yielded important new insights regarding the pressure state of the disconnected nonwetting phase during drainage and imbibition. The trapped and disconnected non-wetting phase delimits its own hysteretic Pc-S curve that inhabits the space within the main hysteretic Pc-S loop of the connected wetting phase. Data suggests that the pressure of the disconnected, non-wetting phase is strongly modified by the pore geometry rather than solely by the bulk liquid phase that surrounds it.
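
    For context, the step that turns a measured interfacial curvature into a capillary pressure is the Young-Laplace relation; the snippet below is a generic illustration with hypothetical numbers and is not part of the curvature-estimation algorithm described above.

        # Young-Laplace: Pc = 2 * sigma * H, with H the mean curvature.
        sigma = 0.072    # air-water interfacial tension, N/m (approximate)
        H = 2.0e4        # hypothetical mean curvature from image analysis, 1/m
        Pc = 2 * sigma * H
        print(Pc)        # capillary pressure in Pa (2880 Pa for these inputs)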

  18. Assessment of In-Place Oil Shale Resources of the Green River Formation, Piceance Basin, Western Colorado

    USGS Publications Warehouse

    Johnson, Ronald C.; Mercier, Tracey J.; Brownfield, Michael E.; Pantea, Michael P.; Self, Jesse G.

    2009-01-01

    The U.S. Geological Survey (USGS) recently completed a reassessment of in-place oil shale resources, regardless of richness, in the Eocene Green River Formation in the Piceance Basin, western Colorado. A considerable amount of oil-yield data has been collected after previous in-place assessments were published, and these data were incorporated into this new assessment. About twice as many oil-yield data points were used, and several additional oil shale intervals were included that were not assessed previously for lack of data. Oil yields are measured using the Fischer assay method. The Fischer assay method is a standardized laboratory test for determining the oil yield from oil shale that has been almost universally used to determine oil yields for Green River Formation oil shales. Fischer assay does not necessarily measure the maximum amount of oil that an oil shale can produce, and there are retorting methods that yield more than the Fischer assay yield. However, the oil yields achieved by other technologies are typically reported as a percentage of the Fischer assay oil yield, and thus Fischer assay is still considered the standard by which other methods are compared.

  19. Parameter identification for nonlinear aerodynamic systems

    NASA Technical Reports Server (NTRS)

    Pearson, Allan E.

    1993-01-01

    This final technical report covers a three and one-half year period preceding February 28, 1993 during which support was provided under NASA Grant NAG-1-1065. Following a general description of the system identification problem and a brief survey of methods to attack it, the basic ideas behind the approach taken in this research effort are presented. The results obtained are described with reference to the published work, including the five semiannual progress reports previously submitted and two interim technical reports.

  20. An Application of Survival Analysis Methods to the Study of Marine Enlisted Attrition

    DTIC Science & Technology

    1990-03-01

    the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government. ... The majority of the findings concerning the effects of the covariates on attrition are consistent with published results from previous military ... however. For instance, it is not exactly clear why high school graduates are better suited for military service. Issues concerning the non-high school

  1. Reply to "Comment on `Optical Imaging of Light-Induced Thermopower in Semiconductors'"

    NASA Astrophysics Data System (ADS)

    Gibelli, François; Lombez, Laurent; Rodière, Jean; Guillemoles, Jean-François

    2018-05-01

    In a Comment [1] on our previously published article [2], Apertet stated, "The definition of the thermopower given in that article seems erroneous due to a confusion between the different physical quantities needed to derive this parameter." We believe some definitions need to be clarified in order to avoid confusion. We here intend to answer the questions of Apertet by detailing the method and by focusing on the definition of the quantities that we optically measured.

  2. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as ... UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work ... (B([15, 0], a0) × [−π, π]) \ V, ∀ qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data of the wingspan of a Boeing KC-135

  3. Hydrodistillation-adsorption method for the isolation of water-soluble, non-soluble and high volatile compounds from plant materials.

    PubMed

    Mastelić, J; Jerković, I; Blazević, I; Radonić, A; Krstulović, L

    2008-08-15

    The proposed method of hydrodistillation-adsorption (HDA) on activated carbon and hydrodistillation (HD) with a solvent trap were compared for the isolation of water-soluble, non-soluble and highly volatile compounds, such as acids, monoterpenes, isothiocyanates and others, from carob (Ceratonia siliqua L.), rosemary (Rosmarinus officinalis L.) and rocket (Eruca sativa L.). Isolated volatiles were analyzed by GC and GC/MS. The main advantages of the HDA method over the ubiquitous HD method were higher yields of volatile compounds and their simultaneous separation into three fractions, which enabled more detailed analyses. This method is particularly suitable for the isolation and analysis of plant volatiles with high amounts of water-soluble compounds. Unlike the previously published adsorption of remaining volatile compounds from distillation water on activated carbon, this method offers simultaneous hydrodistillation and adsorption in the same apparatus.

  4. Single image super-resolution based on approximated Heaviside functions and iterative refinement

    PubMed Central

    Wang, Xin-Yu; Huang, Ting-Zhu; Deng, Liang-Jian

    2018-01-01

    One method of solving the single-image super-resolution problem is to use Heaviside functions. This has been done previously by making a binary classification of image components as “smooth” and “non-smooth”, describing these with approximated Heaviside functions (AHFs), and iteration including l1 regularization. We now introduce a new method in which the binary classification of image components is extended to different degrees of smoothness and non-smoothness, these components being represented by various classes of AHFs. Taking into account the sparsity of the non-smooth components, their coefficients are l1 regularized. In addition, to pick up more image details, the new method uses an iterative refinement for the residuals between the original low-resolution input and the downsampled resulting image. Experimental results showed that the new method is superior to the original AHF method and to four other published methods. PMID:29329298

  5. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    PubMed

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2017-03-17

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.

  7. Active implementation strategy of CONSORT adherence by a dental specialty journal improved randomized clinical trial reporting.

    PubMed

    Pandis, Nikolaos; Shamseer, Larissa; Kokich, Vincent G; Fleming, Padhraig S; Moher, David

    2014-09-01

    To describe a novel CONsolidated Standards of Reporting Trials (CONSORT) adherence strategy implemented by the American Journal of Orthodontics and Dentofacial Orthopedics (AJO-DO) and to report its impact on the completeness of reporting of published trials. The AJO-DO CONSORT adherence strategy, initiated in June 2011, involves active assessment of randomized clinical trial (RCT) reporting during the editorial process. The completeness of reporting CONSORT items was compared between trials submitted and published during the implementation period (July 2011 to September 2013) and trials published between August 2007 and July 2009. Of the 42 RCTs submitted (July 2011 to September 2013), 23 were considered for publication and assessed for completeness of reporting, seven of which were eventually published. For all published RCTs between 2007 and 2009 (n = 20), completeness of reporting by CONSORT item ranged from 0% to 100% (Median = 40%, interquartile range = 60%). All published trials in 2011-2013, reported 33 of 37 CONSORT (sub) items. Four CONSORT 2010 checklist items remained problematic even after implementation of the adherence strategy: changes to methods (3b), changes to outcomes (6b) after the trial commenced, interim analysis (7b), and trial stopping (14b), which are typically only reported when applicable. Trials published following implementation of the AJO-DO CONSORT adherence strategy completely reported more CONSORT items than those published or submitted previously. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Quantitation of ortho-cresyl phosphate adducts to butyrylcholinesterase in human serum by immunomagnetic-UHPLC-MS/MS.

    PubMed

    Johnson, Darryl; Carter, Melissa D; Crow, Brian S; Isenberg, Samantha L; Graham, Leigh Ann; Erol, H Akin; Watson, Caroline M; Pantazides, Brooke G; van der Schans, Marcel J; Langenberg, Jan P; Noort, Daan; Blake, Thomas A; Thomas, Jerry D; Johnson, Rudolph C

    2015-04-01

    Tri-ortho-cresyl phosphate (ToCP) is an anti-wear, flame retardant additive used in industrial lubricants, hydraulic fluids and gasoline. The neurotoxic effects of ToCP arise from the liver-activated metabolite 2-(o-cresyl)-4H-1,3,2-benzodioxaphosphoran-2-one (cresyl saligenin phosphate or CBDP), which inhibits esterase enzymes including butyrylcholinesterase (BChE). Following BChE adduction, CBDP undergoes hydrolysis to form the aged adduct ortho-cresyl phosphoserine (oCP-BChE), thus providing a biomarker of CBDP exposure. Previous studies have identified ToCP in aircraft cabin and cockpit air, but assessing human exposure has been hampered by the lack of a laboratory assay to confirm exposure. This work presents the development of an immunomagnetic-UHPLC-MS/MS method for the quantitation of unadducted BChE and the long-term CBDP biomarker, oCP-BChE, in human serum. The method has a reportable range from 2.0 ng/ml to 150 ng/ml, which is consistent with the sensitivity of methods used to detect organophosphorus nerve agent protein adducts. The assay demonstrated high intraday and interday accuracy (≥85%) and precision (RSD ≤ 15%) across the calibration range. The method was developed for future analyses of potential human exposure to CBDP. Analysis of human serum inhibited in vitro with CBDP demonstrated that the oCP-BChE adduct was stable for at least 72 h at 4, 22 and 37 °C. Compared to a previously reported assay, this method requires 75% less sample volume, reduces analysis time by a factor of 20 and demonstrates a threefold improvement in sensitivity. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  9. Examining Factors That Impact Inpatient Management of Diabetes and the Role of Insulin Pen Devices.

    PubMed

    Smallwood, Chelsea; Lamarche, Danièle; Chevrier, Annie

    2017-02-01

    Insulin administration in the acute care setting is an integral component of inpatient diabetes management. Although some institutions have moved to insulin pen devices, many acute care settings continue to employ the vial and syringe method of insulin administration. The aim of this study was to evaluate the impact of insulin pen implementation in the acute care setting on patients, healthcare workers and health resource utilization. A review of published literature, including guidelines, was conducted to identify how insulin pen devices in the acute care setting may impact inpatient diabetes management. Previously published studies have revealed that insulin pen devices have the potential to improve inpatient management through better glycemic control, increased adherence and improved self-management education. Furthermore, insulin pen devices may result in cost savings and improved safety for healthcare workers. There are benefits to the use of insulin pen devices in acute care and, as such, their implementation should be considered. Copyright © 2016 Becton Dickinson Canada Inc. Published by Elsevier Inc. All rights reserved.

  10. Oxidation Mechanisms of Toluene and Benzene

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1995-01-01

    An expanded and improved version of a previously published benzene oxidation mechanism is presented and shown to model published experimental data fairly successfully. This benzene submodel is coupled to a modified version of a toluene oxidation submodel from the recent literature. This complete mechanism is shown to successfully model published experimental toluene oxidation data for a highly mixed flow reactor and for higher temperature ignition delay times in a shock tube. A comprehensive sensitivity analysis showing the most important reactions is presented for both the benzene and toluene reacting systems. The NASA Lewis toluene mechanism's modeling capability is found to be equivalent to that of the previously published mechanism which contains a somewhat different benzene submodel.

  11. Validating internet research: a test of the psychometric equivalence of internet and in-person samples.

    PubMed

    Meyerson, Paul; Tryon, Warren W

    2003-11-01

    This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants that matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 x 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 x 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.

  12. Comparing Quantitative Values of Two Generations of Laser-Assisted Indocyanine Green Dye Angiography Systems: Can We Predict Necrosis?

    PubMed Central

    Fourman, Mitchell S.; Rivara, Andrew; Dagum, Alexander B.; Huston, Tara L.; Ganz, Jason C.; Bui, Duc T.; Khan, Sami U.

    2014-01-01

    Objective: Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship between the perfusion values of these 2 devices. Methods: Breast reconstruction patients were prospectively enrolled into a clinical trial where skin flap evaluation and excision was based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. Results: A total of 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R2 = 0.744). Previously published values using SPY 2001 (APU 3.7) corresponded to a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated with regions of skin identified by the SPY Elite as having APU values less than 23.8. Conclusion: Intraoperative comparison of LAICGA systems has provided direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8. PMID:25525483
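
    A minimal sketch (not from the original study) of how the reported linear relationship could be used to translate a perfusion threshold from the SPY 2001 scale to the SPY Elite scale; the regression coefficients come from the abstract above, while the function and variable names are illustrative assumptions.

        # Sketch: convert SPY 2001 absolute perfusion units (APU) to SPY Elite APU
        # using the linear relationship reported above (y = 2.9883x + 12.726).
        # Coefficients are from the abstract; names are illustrative only.
        SLOPE = 2.9883
        INTERCEPT = 12.726

        def spy2001_to_elite(apu_2001: float) -> float:
            """Map a SPY 2001 APU value onto the SPY Elite scale."""
            return SLOPE * apu_2001 + INTERCEPT

        legacy_threshold = 3.7  # published SPY 2001 necrosis threshold
        print(f"SPY 2001 APU {legacy_threshold} ~ SPY Elite APU {spy2001_to_elite(legacy_threshold):.1f}")
        # Prints a value of about 23.8, matching the threshold quoted above.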

  13. Optimization of a direct spectrophotometric method to investigate the kinetics and inhibition of sialidases

    PubMed Central

    2012-01-01

    Background Streptococcus pneumoniae expresses three distinct sialidases, NanA, NanB, and NanC, that are believed to be key virulence factors and thus potentially important drug targets. We previously reported that the three enzymes release different products from sialosides, but could share a common catalytic mechanism before the final step of product formation. However, the kinetics of the three sialidases have not been systematically investigated thus far, owing to the lack of an easy and reliable measurement of sialidase reaction rates. Results In this work, we present further kinetic characterization of pneumococcal sialidases by using a direct spectrophotometric method with the chromogenic substrate p-nitrophenyl-N-acetylneuraminic acid (p-NP-Neu5Ac). Using our assay, the measured kinetic parameters of the three purified pneumococcal sialidases, NanA, NanB and NanC, were obtained and were in perfect agreement with previously published data. The major advantage of this alternative method resides in the direct measurement of the released product, allowing initial reaction rates to be readily determined and complete hydrolysis time courses to be recorded. Conclusion We developed an accurate, fast and sensitive spectrophotometric method to investigate the kinetics of sialidase-catalyzed reactions. This fast, sensitive, inexpensive and accurate method could benefit the study of the kinetics and inhibition of sialidases in general. PMID:23031230

  14. Maximal use of kinematic information for the extraction of the mass of the top quark in single-lepton tt̄ events at DØ

    NASA Astrophysics Data System (ADS)

    Estrada Vigil, Juan Cruz

    The mass of the top (t) quark has been measured in the lepton+jets channel of tt̄ final states studied by the DØ and CDF experiments at Fermilab using data from Run I of the Tevatron pp̄ collider. The result published by DØ is 173.3 ± 5.6 (stat) ± 5.5 (syst) GeV. We present a different method to perform this measurement using the existing data. The new technique uses all available kinematic information in an event, and provides a significantly smaller statistical uncertainty than achieved in previous analyses. The preliminary results presented in this thesis indicate a statistical uncertainty for the extracted mass of the top quark of 3.5 GeV, which represents a significant improvement over the previous value of 5.6 GeV. The method of analysis is very general, and may be particularly useful in situations where there is a small signal and a large background.

  15. Battery Test Manual For Plug-In Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey R. Belt

    2010-09-01

    This battery test procedure manual was prepared for the United States Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), Vehicle Technologies Program. It is based on technical targets established for energy storage development projects aimed at meeting system level DOE goals for Plug-in Hybrid Electric Vehicles (PHEV). The specific procedures defined in this manual support the performance and life characterization of advanced battery devices under development for PHEVs. However, it does share some methods described in the previously published battery test manual for power-assist hybrid electric vehicles. Due to the complexity of some of the procedures and supporting analysis, a revision including some modifications and clarifications of these procedures is expected. As in previous battery and capacitor test manuals, this version of the manual defines testing methods for full-size battery systems, along with provisions for scaling these tests for modules, cells or other subscale level devices.

  16. Battery Test Manual For Plug-In Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey R. Belt

    2010-12-01

    This battery test procedure manual was prepared for the United States Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), Vehicle Technologies Program. It is based on technical targets established for energy storage development projects aimed at meeting system level DOE goals for Plug-in Hybrid Electric Vehicles (PHEV). The specific procedures defined in this manual support the performance and life characterization of advanced battery devices under development for PHEVs. However, it does share some methods described in the previously published battery test manual for power-assist hybrid electric vehicles. Due to the complexity of some of the procedures and supporting analysis, a revision including some modifications and clarifications of these procedures is expected. As in previous battery and capacitor test manuals, this version of the manual defines testing methods for full-size battery systems, along with provisions for scaling these tests for modules, cells or other subscale level devices.

  17. Review of Research Reporting Guidelines for Radiology Researchers.

    PubMed

    Cronin, Paul; Rawson, James V

    2016-05-01

    Prior articles have reviewed reporting guidelines and study evaluation tools for clinical research. However, only some of the many available accepted reporting guidelines at the Enhancing the QUAlity and Transparency Of health Research Network have been discussed in previous reports. In this paper, we review the key Enhancing the QUAlity and Transparency Of health Research reporting guidelines that have not been previously discussed. The study types include diagnostic and prognostic studies, reliability and agreement studies, observational studies, analytical and descriptive, experimental studies, quality improvement studies, qualitative research, health informatics, systematic reviews and meta-analyses, economic evaluations, and mixed methods studies. There are also sections on study protocols, and statistical analyses and methods. In each section, there is a brief overview of the study type, and then the reporting guideline(s) that are most applicable to radiology researchers including radiologists involved in health services research are discussed. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  18. Improved determination of particulate absorption from combined filter pad and PSICAM measurements.

    PubMed

    Lefering, Ina; Röttgers, Rüdiger; Weeks, Rebecca; Connor, Derek; Utschig, Christian; Heymann, Kerstin; McKee, David

    2016-10-31

    Filter pad light absorption measurements are subject to two major sources of experimental uncertainty: the so-called pathlength amplification factor, β, and scattering offsets, o, for which previous null-correction approaches are limited by recent observations of non-zero absorption in the near infrared (NIR). A new filter pad absorption correction method is presented here which uses linear regression against point-source integrating cavity absorption meter (PSICAM) absorption data to simultaneously resolve both β and the scattering offset. The PSICAM has previously been shown to provide accurate absorption data, even in highly scattering waters. Comparisons of PSICAM and filter pad particulate absorption data reveal linear relationships that vary on a sample by sample basis. This regression approach provides significantly improved agreement with PSICAM data (3.2% RMS%E) than previously published filter pad absorption corrections. Results show that direct transmittance (T-method) filter pad absorption measurements perform effectively at the same level as more complex geometrical configurations based on integrating cavity measurements (IS-method and QFT-ICAM) because the linear regression correction compensates for the sensitivity to scattering errors in the T-method. This approach produces accurate filter pad particulate absorption data for wavelengths in the blue/UV and in the NIR where sensitivity issues with PSICAM measurements limit performance. The combination of the filter pad absorption and PSICAM is therefore recommended for generating full spectral, best quality particulate absorption data as it enables correction of multiple errors sources across both measurements.

  19. Rapid determination of the isomeric truxillines in illicit cocaine via capillary gas chromatography/flame ionization detection and their use and implication in the determination of cocaine origin and trafficking routes.

    PubMed

    Mallette, Jennifer R; Casale, John F

    2014-10-17

    The isomeric truxillines are a group of minor alkaloids present in all illicit cocaine samples. The relative amount of truxillines in cocaine is indicative of the variety of coca used for cocaine processing and thus is useful in source determination. Previously, the determination of isomeric truxillines in cocaine was performed with a gas chromatography/electron capture detection method. However, due to the tedious sample preparation as well as the expense and maintenance required of electron capture detectors, the protocol was converted to a gas chromatography/flame-ionization detection method. Ten truxilline isomers (alpha-, beta-, delta-, epsilon-, gamma-, omega-, zeta-, peri-, neo-, and epi-) were quantified relative to a structurally related internal standard, 4',4″-dimethyl-α-truxillic acid dimethyl ester. The method was shown to have a linear response from 0.001 to 1.00 mg/mL and a lower detection limit of 0.001 mg/mL. In this method, the truxillines are directly reduced with lithium aluminum hydride and then acylated with heptafluorobutyric anhydride prior to analysis. The analysis of more than 100 cocaine hydrochloride samples is presented and compared to data obtained by the previous methodology. Authentic cocaine samples obtained from the source countries of Colombia, Bolivia, and Peru were also analyzed, and comparative data on more than 23,000 samples analyzed over the past 10 years with the previous methodology are presented. Published by Elsevier B.V.
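
    As a minimal illustration of the internal-standard quantitation scheme described above (analyte peak areas normalized to an internal-standard area and read off a linear calibration), here is a short Python sketch; the calibration points and peak areas are invented placeholders, not data from the study.

        # Sketch: internal-standard quantitation with a linear calibration curve.
        # All numbers below are invented; only the general scheme (analyte peak
        # area normalized to the internal-standard area) follows the abstract.
        import numpy as np

        conc_std = np.array([0.001, 0.01, 0.1, 0.5, 1.00])       # mg/mL, within the stated linear range
        ratio_std = np.array([0.002, 0.021, 0.198, 1.01, 2.02])  # analyte area / IS area (hypothetical)

        slope, intercept = np.polyfit(conc_std, ratio_std, deg=1)

        def quantify(area_analyte: float, area_is: float) -> float:
            """Estimate concentration (mg/mL) from analyte and internal-standard peak areas."""
            return (area_analyte / area_is - intercept) / slope

        print(f"Estimated truxilline concentration: {quantify(1530.0, 2100.0):.3f} mg/mL")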

  20. Third-order elastic constants of diamond determined from experimental data

    DOE PAGES

    Winey, J. M.; Hmiel, A.; Gupta, Y. M.

    2016-06-01

    The pressure derivatives of the second-order elastic constants (SOECs) of diamond were determined by analyzing previous sound velocity measurements under hydrostatic stress [McSkimin and Andreatch, J. Appl. Phys. 43, 294 (1972)]. Our analysis corrects an error in the previously reported results. Using the corrected pressure derivatives, together with published data for the nonlinear elastic response of shock-compressed diamond [Lang and Gupta, Phys. Rev. Lett. 106, 125502 (2011)], we present a complete and corrected set of third-order elastic constants (TOECs); this set differs significantly from the TOECs published previously.

  1. The methodological quality of systematic reviews published in high-impact nursing journals: a review of the literature.

    PubMed

    Pölkki, Tarja; Kanste, Outi; Kääriäinen, Maria; Elo, Satu; Kyngäs, Helvi

    2014-02-01

    To analyse systematic review articles published in the top 10 nursing journals to determine the quality of the methods employed within them. Systematic review is defined as a scientific research method that synthesises high-quality scientific knowledge on a given topic. The number of such reviews in nursing science has increased dramatically during recent years, but their methodological quality has not previously been assessed. A review of the literature using a narrative approach. Ranked impact factor scores for nursing journals were obtained from the Journal Citation Report database of the Institute of Scientific Information (ISI Web of Knowledge). All issues from the years 2009 and 2010 of the top 10 ranked journals were included. CINAHL and MEDLINE databases were searched to locate studies using the search terms 'systematic review' and 'systematic literature review'. A total of 39 eligible studies were identified. Their methodological quality was evaluated through specific quality-assessment criteria, the description of synthesis, and the strengths and weaknesses reported in the included studies. Most of the eligible systematic reviews included several different designs or types of quantitative study. The majority included a quality assessment, and a total of 17 different criteria were identified. The method of synthesis was mentioned in about half of the reviews, the most common being narrative synthesis. The weaknesses of reviews were discussed, while strengths were rarely highlighted. The methodological quality of the systematic reviews examined varied considerably, although they were all published in nursing journals with a high impact factor. Despite the fact that systematic reviews are considered the most robust source of research evidence, they vary in methodological quality. This point is important to consider in clinical practice when applying the results to patient care. © 2013 Blackwell Publishing Ltd.

  2. Using flow cytometry to estimate pollen DNA content: improved methodology and applications

    PubMed Central

    Kron, Paul; Husband, Brian C.

    2012-01-01

    Background and Aims Flow cytometry has been used to measure nuclear DNA content in pollen, mostly to understand pollen development and detect unreduced gametes. Published data have not always met the high-quality standards required for some applications, in part due to difficulties inherent in the extraction of nuclei. Here we describe a simple and relatively novel method for extracting pollen nuclei, involving the bursting of pollen through a nylon mesh, compare it with other methods and demonstrate its broad applicability and utility. Methods The method was tested across 80 species, 64 genera and 33 families, and the data were evaluated using established criteria for estimating genome size and analysing cell cycle. Filter bursting was directly compared with chopping in five species, yields were compared with published values for sonicated samples, and the method was applied by comparing genome size estimates for leaf and pollen nuclei in six species. Key Results Data quality met generally applied standards for estimating genome size in 81 % of species and the higher best practice standards for cell cycle analysis in 51 %. In 41 % of species we met the most stringent criterion of screening 10 000 pollen grains per sample. In direct comparison with two chopping techniques, our method produced better quality histograms with consistently higher nuclei yields, and yields were higher than previously published results for sonication. In three binucleate and three trinucleate species we found that pollen-based genome size estimates differed from leaf tissue estimates by 1·5 % or less when 1C pollen nuclei were used, while estimates from 2C generative nuclei differed from leaf estimates by up to 2·5 %. Conclusions The high success rate, ease of use and wide applicability of the filter bursting method show that this method can facilitate the use of pollen for estimating genome size and dramatically improve unreduced pollen production estimation with flow cytometry. PMID:22875815

  3. Preliminary estimates of annual agricultural pesticide use for counties of the conterminous United States, 2010-11

    USGS Publications Warehouse

    Baker, Nancy T.; Stone, Wesley W.

    2013-01-01

    This report provides preliminary estimates of annual agricultural use of 374 pesticide compounds in counties of the conterminous United States in 2010 and 2011, compiled by means of methods described in Thelin and Stone (2013). U.S. Department of Agriculture (USDA) county-level data for harvested-crop acreage were used in conjunction with proprietary Crop Reporting District (CRD)-level pesticide-use data to estimate county-level pesticide use. Estimated pesticide use (EPest) values were calculated with both the EPest-high and EPest-low methods. The distinction between the EPest-high method and the EPest-low method is that there are more counties with estimated pesticide use for EPest-high compared to EPest-low, owing to differing assumptions about missing survey data (Thelin and Stone, 2013). Preliminary estimates in this report will be revised upon availability of updated crop acreages in the 2012 Agricultural Census, to be published by the USDA in 2014. In addition, estimates for 2008 and 2009 previously published by Stone (2013) will be updated subsequent to the 2012 Agricultural Census release. Estimates of annual agricultural pesticide use are provided as downloadable, tab-delimited files, which are organized by compound, year, state Federal Information Processing Standard (FIPS) code, county FIPS code, and kg (amount in kilograms).

  4. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing, but they require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluation of the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used to estimate the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
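
    A minimal sketch of the quantification step just described: form predicted/observed concentration ratios and report the fraction falling inside an acceptance range such as 0.8-1.2. The concentration values below are fabricated placeholders, not data from the study.

        # Sketch: fraction of predicted/observed concentration ratios inside a range,
        # as a simple summary of predictive performance. Values are placeholders.
        import numpy as np

        observed = np.array([12.1, 8.4, 5.2, 3.3, 2.1])    # measured concentrations
        predicted = np.array([11.0, 9.0, 5.0, 4.2, 1.8])   # model-predicted concentrations
        ratios = predicted / observed

        def fraction_within(r, lo=0.8, hi=1.2):
            """Fraction of prediction/observation ratios inside [lo, hi]."""
            return float(np.mean((r >= lo) & (r <= hi)))

        print("ratios:", np.round(ratios, 2))
        print(f"fraction within 0.8-1.2: {fraction_within(ratios):.2f}")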

  5. Using the Halstead-Reitan Battery to diagnose brain damage: a comparison of the predictive power of traditional techniques to Rohling's Interpretive Method.

    PubMed

    Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L

    2003-11-01

    The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.

  6. Research Techniques Made Simple: Emerging Methods to Elucidate Protein Interactions through Spatial Proximity.

    PubMed

    Che, Yonglu; Khavari, Paul A

    2017-12-01

    Interactions between proteins are essential for fundamental cellular processes, and the diversity of such interactions enables the vast variety of functions essential for life. A persistent goal in biological research is to develop assays that can faithfully capture different types of protein interactions to allow their study. A major step forward in this direction came with a family of methods that delineates spatial proximity of proteins as an indirect measure of protein-protein interaction. A variety of enzyme- and DNA ligation-based methods measure protein co-localization in space, capturing novel interactions that were previously too transient or low affinity to be identified. Here we review some of the methods that have been successfully used to measure spatially proximal protein-protein interactions. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. A new method for tracking poultry litter in the Potomac Basin headwaters of West Virginia.

    PubMed

    Weidhaas, J; Lipscomb, E

    2013-08-01

    To validate the distribution of a poultry litter-specific marker gene in faecally contaminated environmental waters of an intensive poultry-rearing region. A TaqMan®-based qPCR assay for Brevibacterium sp. LA35 16S rRNA (LA35 gene), which was previously shown to be associated with poultry litter and faeces, was tested on 126 nontarget faecal samples and 28 poultry litter and faecal samples. The TaqMan assay was sensitive (76%) and specific (100%) for the LA35 gene and exhibited a detection limit for poultry litter in water samples that is sufficiently low (2.5 × 10⁻² mg litter l⁻¹) to be applicable for environmental monitoring. The LA35 gene was detected in 43% of water samples (n = 30) collected in an intensive poultry-rearing region of West Virginia which drains to the Chesapeake Bay. The poultry-specific TaqMan qPCR method for the LA35 gene is more specific than previously published methods and can be used to identify regions impacted by poultry-rearing activities. The LA35 gene appears to have a broad geographical distribution, as it has been found in poultry litter and faeces from Delaware and West Virginia in this study, and previously from Arkansas, Georgia, Florida, Minnesota, Oklahoma and Utah. © 2013 The Society for Applied Microbiology.

  8. Correlation, evaluation, and extension of linearized theories for tire motion and wheel shimmy

    NASA Technical Reports Server (NTRS)

    Smiley, Robert F

    1957-01-01

    An evaluation is made of the existing linearized theories of tire motion and wheel shimmy. It is demonstrated that most of the previously published theories represent varying degrees of approximation to a summary theory developed in this report, which is a minor modification of the basic theory of Von Schlippe and Dietrich. In most cases where strong differences exist between the previously published theories and the summary theory, the previously published theories are shown to possess certain deficiencies. A series of systematic approximations to the summary theory is developed for the treatment of problems too simple to merit the use of the complete summary theory, and procedures are discussed for applying the summary theory and its systematic approximations to the shimmy of more complex landing-gear structures than have previously been considered. Comparisons of the existing experimental data with the predictions of the summary theory and the systematic approximations provide a fair substantiation of the more detailed approximate theories.

  9. Effect of reverse shoulder design philosophy on muscle moment arms.

    PubMed

    Hamilton, Matthew A; Diep, Phong; Roche, Chris; Flurin, Pierre Henri; Wright, Thomas W; Zuckerman, Joseph D; Routman, Howard

    2015-04-01

    This study analyzes the muscle moment arms of three different reverse shoulder design philosophies using a previously published method. Digital bone models of the shoulder were imported into a 3D modeling software and markers placed for the origin and insertion of relevant muscles. The anatomic model was used as a baseline for moment arm calculations. Subsequently, three different reverse shoulder designs were virtually implanted and moment arms were analyzed in abduction and external rotation. The results indicate that the lateral offset between the joint center and the axis of the humerus specific to one reverse shoulder design increased the external rotation moment arms of the posterior deltoid relative to the other reverse shoulder designs. The other muscles analyzed demonstrated differences in the moment arms, but none of the differences reached statistical significance. This study demonstrated how the combination of variables making up different reverse shoulder designs can affect the moment arms of the muscles in different and statistically significant ways. The role of humeral offset in reverse shoulder design has not been previously reported and could have an impact on external rotation and stability achieved post-operatively. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  10. Portable Wind Energy Harvesters for Low-Power Applications: A Survey

    PubMed Central

    Nabavi, Seyedfakhreddin; Zhang, Lihong

    2016-01-01

    Energy harvesting has become an increasingly important topic thanks to the advantages in renewability and environmental friendliness. In this paper, a comprehensive study on contemporary portable wind energy harvesters has been conducted. The electrical power generation methods of portable wind energy harvesters are surveyed in three major groups, piezoelectric-, electromagnetic-, and electrostatic-based generators. The paper also takes another view of this area by gauging the required mechanisms for trapping wind flow from ambient environment. In this regard, rotational and aeroelastic mechanisms are analyzed for the portable wind energy harvesting devices. The comparison between both mechanisms shows that the aeroelastic mechanism has promising potential in producing an energy harvester in smaller scale although how to maintain the resonator perpendicular to wind flow for collecting the maximum vibration is still a major challenge to overcome for this mechanism. Furthermore, this paper categorizes the previously published portable wind energy harvesters to macro and micro scales in terms of their physical dimensions. The power management systems are also surveyed to explore the possibility of improving energy conversion efficiency. Finally some insights and research trends are pointed out based on an overall analysis of the previously published works along the historical timeline. PMID:27438834

  11. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.; Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  12. CamMedNP: building the Cameroonian 3D structural natural products database for virtual screening.

    PubMed

    Ntie-Kang, Fidele; Mbah, James A; Mbaze, Luc Meva'a; Lifongo, Lydia L; Scharfe, Michael; Hanna, Joelle Ngo; Cho-Ngwa, Fidelis; Onguéné, Pascal Amoa; Owono Owono, Luc C; Megnassan, Eugene; Sippl, Wolfgang; Efange, Simon M N

    2013-04-16

    Computer-aided drug design (CADD) often involves virtual screening (VS) of large compound datasets and the availability of such is vital for drug discovery protocols. We present CamMedNP - a new database beginning with more than 2,500 compounds of natural origin, along with some of their derivatives which were obtained through hemisynthesis. These are pure compounds which have been previously isolated and characterized using modern spectroscopic methods and published by several research teams spread across Cameroon. In the present study, 224 distinct medicinal plant species belonging to 55 plant families from the Cameroonian flora have been considered. About 80 % of these have been previously published and/or referenced in internationally recognized journals. For each compound, the optimized 3D structure, drug-like properties, plant source, collection site and currently known biological activities are given, as well as literature references. We have evaluated the "drug-likeness" of this database using Lipinski's "Rule of Five". A diversity analysis has been carried out in comparison with the ChemBridge diverse database. CamMedNP could be highly useful for database screening and natural product lead generation programs.
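
    As a minimal illustration of the "drug-likeness" evaluation mentioned above, the following Python sketch applies Lipinski's Rule of Five to precomputed molecular properties; the compound names and property values are invented, and a real workflow would derive descriptors from structures with a cheminformatics toolkit.

        # Sketch: Lipinski "Rule of Five" filter over precomputed molecular properties.
        # Property values are invented; the thresholds are the standard Lipinski cut-offs.
        from dataclasses import dataclass

        @dataclass
        class Compound:
            name: str
            mol_weight: float   # Daltons
            logp: float         # octanol-water partition coefficient
            h_donors: int       # hydrogen-bond donors
            h_acceptors: int    # hydrogen-bond acceptors

        def lipinski_violations(c: Compound) -> int:
            """Count Rule-of-Five violations (0 or 1 is usually considered drug-like)."""
            return sum([c.mol_weight > 500, c.logp > 5, c.h_donors > 5, c.h_acceptors > 10])

        library = [Compound("example_alkaloid", 342.4, 2.1, 2, 5),
                   Compound("example_glycoside", 612.6, -0.8, 7, 12)]
        for c in library:
            v = lipinski_violations(c)
            print(f"{c.name}: {v} violation(s), drug-like: {v <= 1}")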

  13. Mapping the core journals of the physical therapy literature*

    PubMed Central

    Fell, Dennis W; Buchanan, Melanie J; Horchen, Heidi A; Scherr, Joel A

    2011-01-01

    Objectives: The purpose of this study was to identify (1) core journals in the literature of physical therapy, (2) currency of references cited in that literature, and (3) online databases providing the highest coverage rate of core journals. Method: Data for each cited reference in each article of four source journals for three years were recorded, including type of literature, year of publication, and journal title. The journal titles were ranked in descending order according to the frequency of citations and divided into three zones using Bradford's Law of Scattering. Four databases were analyzed for coverage rates of articles published in the Zone 1 and Zone 2 journals in 2007. Results: Journal articles were the most frequently cited type of literature, with sixteen journals supplying one-third of the cited journal references. Physical Therapy was the most commonly cited title. There were more cited articles published from 2000 to 2007 than in any previous full decade. Of the databases analyzed, CINAHL provided the highest coverage rate for Zone 1 2007 publications. Conclusions: Results were similar to a previous study, except for changes in the order of Zone 1 journals. Results can help physical therapists and librarians determine important journals in this discipline. PMID:21753912
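
    A minimal sketch of the zoning step described above: rank journals by citation frequency and divide them into three Bradford zones holding roughly equal shares of the cited references. The journal names and counts are placeholders, not the study's data.

        # Sketch: split journals ranked by citation frequency into three Bradford zones,
        # each holding roughly one third of all cited references. Counts are placeholders.
        citation_counts = {
            "Journal A": 420, "Journal B": 260, "Journal C": 150, "Journal D": 90,
            "Journal E": 70, "Journal F": 55, "Journal G": 40, "Journal H": 30,
            "Journal I": 25, "Journal J": 20, "Journal K": 15, "Journal L": 10,
        }
        ranked = sorted(citation_counts.items(), key=lambda kv: kv[1], reverse=True)
        cutoff = sum(citation_counts.values()) / 3
        zones, running = {1: [], 2: [], 3: []}, 0
        for title, count in ranked:
            zone = min(3, int(running // cutoff) + 1)   # zone 1, 2, or 3
            zones[zone].append(title)
            running += count
        for z, titles in zones.items():
            print(f"Zone {z}: {len(titles)} journals -> {titles}")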

  14. Portable Wind Energy Harvesters for Low-Power Applications: A Survey.

    PubMed

    Nabavi, Seyedfakhreddin; Zhang, Lihong

    2016-07-16

    Energy harvesting has become an increasingly important topic thanks to the advantages in renewability and environmental friendliness. In this paper, a comprehensive study on contemporary portable wind energy harvesters has been conducted. The electrical power generation methods of portable wind energy harvesters are surveyed in three major groups, piezoelectric-, electromagnetic-, and electrostatic-based generators. The paper also takes another view of this area by gauging the required mechanisms for trapping wind flow from ambient environment. In this regard, rotational and aeroelastic mechanisms are analyzed for the portable wind energy harvesting devices. The comparison between both mechanisms shows that the aeroelastic mechanism has promising potential in producing an energy harvester in smaller scale although how to maintain the resonator perpendicular to wind flow for collecting the maximum vibration is still a major challenge to overcome for this mechanism. Furthermore, this paper categorizes the previously published portable wind energy harvesters to macro and micro scales in terms of their physical dimensions. The power management systems are also surveyed to explore the possibility of improving energy conversion efficiency. Finally some insights and research trends are pointed out based on an overall analysis of the previously published works along the historical timeline.

  15. Epithelial-mesenchymal transition spectrum quantification and its efficacy in deciphering survival and drug responses of cancer patients.

    PubMed

    Tan, Tuan Zea; Miow, Qing Hao; Miki, Yoshio; Noda, Tetsuo; Mori, Seiichi; Huang, Ruby Yun-Ju; Thiery, Jean Paul

    2014-10-01

    Epithelial-mesenchymal transition (EMT) is a reversible and dynamic process hypothesized to be co-opted by carcinoma during invasion and metastasis. Yet, there is still no quantitative measure to assess the interplay between EMT and cancer progression. Here, we derived a method for universal EMT scoring from cancer-specific transcriptomic EMT signatures of ovarian, breast, bladder, lung, colorectal and gastric cancers. We show that EMT scoring exhibits good correlation with previously published, cancer-specific EMT signatures. This universal and quantitative EMT scoring was used to establish an EMT spectrum across various cancers, with good correlation noted between cell lines and tumours. We show correlations between EMT and poorer disease-free survival in ovarian and colorectal, but not breast, carcinomas, despite previous notions. Importantly, we found distinct responses between epithelial- and mesenchymal-like ovarian cancers to therapeutic regimes administered with or without paclitaxel in vivo and demonstrated that mesenchymal-like tumours do not always show resistance to chemotherapy. EMT scoring is thus a promising, versatile tool for the objective and systematic investigation of EMT roles and dynamics in cancer progression, treatment response and survival. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  16. Rapid analysis of aminoglycoside antibiotics in bovine tissues using disposable pipette extraction and ultrahigh performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Lehotay, Steven J; Mastovska, Katerina; Lightfield, Alan R; Nuñez, Alberto; Dutko, Terry; Ng, Chilton; Bluhm, Louis

    2013-10-25

    A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pipette extraction, and analysis by a 3 min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. The drug analytes include neomycin, streptomycin, dihydrostreptomycin, and spectinomycin, which have residue tolerances in bovine in the US, and kanamycin, gentamicin, apramycin, amikacin, and hygromycin, which do not have US tolerances established in bovine tissues. Tobramycin was used as an internal standard. An additional drug, paromomycin, was also validated in the method, but it was dropped during implementation due to conversion of neomycin into paromomycin. Proposed fragmentation patterns for the monitored ions of each analyte were elucidated with the aid of high resolution MS using a quadrupole-time-of-flight instrument. Recoveries from spiking experiments at regulatory levels of concern showed that all analytes averaged 70-120% recoveries in all tissues, except hygromycin, which averaged 61% recovery. Lowest calibrated levels were as low as 0.005 μg/g in matrix extracts, which approximately corresponded to the limit of detection for screening purposes. Drug identifications at levels <0.05 μg/g were made in spiked and/or real samples for all analytes and tissues tested. Analyses of 60 samples from 20 slaughtered cattle previously screened positive for aminoglycosides showed that this method worked well in practice. The UHPLC-MS/MS method has several advantages compared to the previous microbial inhibition screening assay, especially for distinguishing individual drugs from a mixture and improving identification of gentamicin in tissue samples. Published by Elsevier B.V.

  17. The application of prototype point processes for the summary and description of California wildfires

    USGS Publications Warehouse

    Nichols, K.; Schoenberg, F.P.; Keeley, J.E.; Bray, A.; Diez, D.

    2011-01-01

    A method for summarizing repeated realizations of a space-time marked point process, known as prototyping, is discussed and applied to catalogues of wildfires in California. Prototype summaries are constructed for varying time intervals using California wildfire data from 1990 to 2006. Previous work on prototypes for temporal and space-time point processes is extended here to include methods for computing prototypes with marks and the incorporation of prototype summaries into hierarchical clustering algorithms, the latter of which is used to delineate fire seasons in California. Other results include summaries of patterns in the spatial-temporal distribution of wildfires within each wildfire season. ?? 2011 Blackwell Publishing Ltd.

  18. Adolescent Male Conduct-Disordered Patients in Substance Use Disorder Treatment: Examining the "Limited Prosocial Emotions" Specifier.

    PubMed

    Sakai, Joseph T; Mikulich-Gilbertson, Susan K; Young, Susan E; Rhee, Soo Hyun; McWilliams, Shannon K; Dunn, Robin; Salomonsen-Sautel, Stacy; Thurstone, Christian; Hopfer, Christian J

    2016-01-01

    To our knowledge, this is the first study to examine the DSM-5-defined conduct disorder (CD) with limited prosocial emotions (LPE) among adolescents in substance use disorder (SUD) treatment, despite the high rates of CD in this population. We tested previously published methods of LPE categorization in a sample of male conduct-disordered patients in SUD treatment (n=196). CD with LPE patients did not demonstrate a distinct pattern in terms of demographics or co-morbidity regardless of the categorization method utilized. In conclusion, LPE, as operationalized here, does not identify a distinct subgroup of patients based on psychiatric comorbidity, SUD diagnoses, or demographics.

  19. Adding results to a meta-analysis: Theory and example

    NASA Astrophysics Data System (ADS)

    Willson, Victor L.

    Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function often overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research on the correlation between attitude and achievement in science education has been published since a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
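
    A minimal sketch of the Bayesian updating idea described above: treat the existing meta-analytic effect estimate as a normal prior and combine it with a newly published study's estimate by precision weighting (for simplicity the sketch skips the Fisher-z transform that a careful correlation analysis would use). All effect sizes and standard errors are invented placeholders.

        # Sketch: Bayesian (precision-weighted) update of a meta-analytic effect size
        # with one new study, under normal approximations. Numbers are placeholders.
        def bayesian_update(prior_mean, prior_se, new_mean, new_se):
            """Combine a prior estimate with new evidence via precision weighting."""
            w_prior, w_new = 1.0 / prior_se**2, 1.0 / new_se**2
            post_mean = (w_prior * prior_mean + w_new * new_mean) / (w_prior + w_new)
            post_se = (w_prior + w_new) ** -0.5
            return post_mean, post_se

        # Prior: pooled attitude-achievement correlation from an earlier meta-analysis.
        # New: correlation reported by a study published after that meta-analysis.
        post_r, post_se = bayesian_update(prior_mean=0.30, prior_se=0.04,
                                          new_mean=0.22, new_se=0.08)
        print(f"updated correlation: {post_r:.3f} (SE {post_se:.3f})")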

  20. Generalisation of the identity method for determination of high-order moments of multiplicity distributions with a software implementation

    NASA Astrophysics Data System (ADS)

    Maćkowiak-Pawłowska, Maja; Przybyła, Piotr

    2018-05-01

    Incomplete particle identification limits the experimentally available phase-space region for identified-particle analysis. This problem affects ongoing fluctuation and correlation studies, including the search for the critical point of strongly interacting matter performed at the SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth-order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified-particle multiplicity distributions from the measured ones, provided the response function of the detector is known.

  1. Time-resolved gamma spectroscopy of single events

    NASA Astrophysics Data System (ADS)

    Wolszczak, W.; Dorenbos, P.

    2018-04-01

    In this article we present a method of characterizing scintillating materials by digitization of each individual scintillation pulse followed by digital signal processing. With this technique it is possible to measure the pulse shape and the energy of an absorbed gamma photon on an event-by-event basis. In contrast to the time-correlated single-photon counting technique, the digital approach provides a faster measurement, active noise suppression, and characterization of scintillation pulses simultaneously in two domains: time and energy. We applied this method to study the change in pulse shape of a CsI(Tl) scintillator with the energy of gamma excitation. We confirmed previously published results and revealed new details of the phenomenon.

  2. An investigation into exoplanet transits and uncertainties

    NASA Astrophysics Data System (ADS)

    Ji, Y.; Banks, T.; Budding, E.; Rhodes, M. D.

    2017-06-01

    A simple transit model is described along with tests of this model against published results for 4 exoplanet systems (Kepler-1, 2, 8, and 77). Data from the Kepler mission are used. The Markov Chain Monte Carlo (MCMC) method is applied to obtain realistic error estimates. Optimisation of limb darkening coefficients is subject to data quality. It is more likely for MCMC to derive an empirical limb darkening coefficient for light curves with S/N (signal to noise) above 15. Finally, the model is applied to Kepler data for 4 Kepler candidate systems (KOI 760.01, 767.01, 802.01, and 824.01) with previously unpublished results. Error estimates for these systems are obtained via the MCMC method.
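
    A minimal sketch of the MCMC idea used above, fitting a toy box-shaped transit depth to synthetic data with a hand-rolled Metropolis-Hastings sampler to obtain a parameter estimate with an uncertainty; the light curve, noise level, and step size are invented, and a real analysis would use a limb-darkened transit model and the Kepler photometry.

        # Sketch: Metropolis-Hastings MCMC for a single toy transit-depth parameter.
        # Synthetic data and a box-shaped transit; real analyses use full transit models.
        import numpy as np

        rng = np.random.default_rng(42)
        t = np.linspace(-0.1, 0.1, 200)
        true_depth, noise = 0.01, 0.002
        in_transit = np.abs(t) < 0.04
        flux = 1.0 - true_depth * in_transit + rng.normal(0.0, noise, t.size)

        def log_like(depth):
            model = 1.0 - depth * in_transit
            return -0.5 * np.sum((flux - model) ** 2) / noise**2

        n_steps, step = 20000, 5e-4
        chain = np.empty(n_steps)
        depth, ll = 0.005, log_like(0.005)
        for i in range(n_steps):
            prop = depth + rng.normal(0.0, step)
            ll_prop = log_like(prop)
            if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis acceptance
                depth, ll = prop, ll_prop
            chain[i] = depth

        burned = chain[n_steps // 2:]                  # discard burn-in
        print(f"depth = {burned.mean():.4f} +/- {burned.std():.4f}")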

  3. Empirical Observations on the Sensitivity of Hot Cathode Ionization Type Vacuum Gages

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1969-01-01

    A study of empirical methods of predicting the relative sensitivities of hot cathode ionization gages is presented. Using previously published gage sensitivities, several rules for predicting relative sensitivity are tested. The relative sensitivity to different gases is shown to be invariant with gage type in the linear range of gage operation. The total ionization cross section, molecular and molar polarizability, and refractive index are demonstrated to be useful parameters for predicting relative gage sensitivity. Using data from the literature, the probable error of predictions of relative gage sensitivity based on these molecular properties is found to be about 10 percent. A comprehensive table of predicted relative sensitivities, based on empirical methods, is presented.

  4. Repair for mitral stenosis due to pannus formation after Duran ring annuloplasty.

    PubMed

    Song, Seunghwan; Cho, Seong Ho; Yang, Ji-Hyuk; Park, Pyo Won

    2010-12-01

    Mitral stenosis after mitral repair with using an annuloplasty ring is not common and it is almost always due to pannus formation. Mitral valve replacement was required in most of the previous cases of pannus covering the mitral valve leaflet, which could not be stripped off without damaging the valve leaflets. In two cases, we removed the previous annuloplasty ring and pannus without leaflet injury, and we successfully repaired the mitral valve. During the follow-up of 4 months and 39 months respectively, we observed improvement of the patients' symptoms and good valvular function. Redo mitral repair may be a possible method for treating mitral stenosis due to pannus formation after ring annuloplasty. Copyright © 2010 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  5. Efficacy and Safety of Bendamustine and Ibrutinib in Previously Untreated Patients With Chronic Lymphocytic Leukemia: Indirect Comparison.

    PubMed

    Andrasiak, Iga; Rybka, Justyna; Knopinska-Posluszny, Wanda; Wrobel, Tomasz

    2017-05-01

    Bendamustine and ibrutinib are commonly used in the treatment of patients suffering from chronic lymphocytic leukemia (CLL). In this study we compare the efficacy and safety of bendamustine versus ibrutinib therapy in previously untreated patients with CLL. Because there are no head-to-head comparisons between bendamustine and ibrutinib, we performed an indirect comparison using the Bucher method. A systematic literature review was performed and 2 studies published before June 2016 were included in the analysis. Treatment with ibrutinib significantly improves investigator-assessed PFS (HR of 0.3; P = .01) and OS (HR of 0.21; P < .001). Our study indicates that ibrutinib therapy improves PFS and OS and is superior in terms of safety compared with bendamustine therapy in CLL patients. Copyright © 2017 Elsevier Inc. All rights reserved.
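
    A minimal sketch of the Bucher adjusted indirect comparison on the hazard-ratio scale: with a common comparator C, log HR(A vs B) = log HR(A vs C) - log HR(B vs C), and the variances of the two log hazard ratios add. The hazard ratios and confidence intervals in the example are invented placeholders, not the trial results summarized above.

        # Sketch: Bucher adjusted indirect comparison of treatments A and B
        # through a common comparator C, on the log hazard-ratio scale.
        # Input HRs and 95% CIs are invented placeholders.
        import math

        def se_from_ci(lo, hi):
            """Approximate SE of log(HR) from a 95% confidence interval."""
            return (math.log(hi) - math.log(lo)) / (2 * 1.96)

        def bucher(hr_ac, ci_ac, hr_bc, ci_bc):
            """Indirect HR of A vs B given A-vs-C and B-vs-C results."""
            log_hr = math.log(hr_ac) - math.log(hr_bc)
            se = math.hypot(se_from_ci(*ci_ac), se_from_ci(*ci_bc))
            return math.exp(log_hr), (math.exp(log_hr - 1.96 * se), math.exp(log_hr + 1.96 * se))

        hr, ci = bucher(hr_ac=0.25, ci_ac=(0.14, 0.45),   # A vs common comparator
                        hr_bc=0.70, ci_bc=(0.50, 0.98))   # B vs common comparator
        print(f"indirect HR (A vs B): {hr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")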

  6. Participant comprehension of research for which they volunteer: a systematic review.

    PubMed

    Montalvo, Wanda; Larson, Elaine

    2014-11-01

    Evidence indicates that research participants often do not fully understand the studies for which they have volunteered. The aim of this systematic review was to examine the relationship between the process of obtaining informed consent for research and participant comprehension and satisfaction with the research. Systematic review of published research on informed consent and participant comprehension of research for which they volunteer, using the Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) Statement as a guide. PubMed, Cumulative Index for Nursing and Allied Health Literature, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were used to search the literature for studies meeting the following inclusion criteria: (a) published between January 1, 2006, and December 31, 2013, (b) interventional or descriptive quantitative design, (c) published in a peer-reviewed journal, (d) written in English, and (e) assessed participant comprehension or satisfaction with the research process. Studies were assessed for quality using seven indicators: sampling method, use of controls or comparison groups, response rate, description of intervention, description of outcome, statistical method, and health literacy assessment. Of 176 studies identified, 27 met inclusion criteria: 13 (48%) were randomized interventional designs and 14 (52%) were descriptive. Three categories of studies included projects assessing (a) an enhanced consent process or form, (b) multimedia methods, and (c) education to improve participant understanding. Most (78%) used investigator-developed tools to assess participant comprehension, did not assess participant health literacy (74%), or did not assess the readability level of the consent form (89%). Researchers found participants lacked basic understanding of research elements: randomization, placebo, risks, and therapeutic misconception. Findings indicate (a) inconsistent assessment of participant reading or health literacy level, (b) measurement variation associated with use of nonstandardized tools, and (c) continued therapeutic misconception and lack of understanding among research participants of randomization, placebo, benefit, and risk. While the Agency for Healthcare Research and Quality and the National Quality Forum have published informed consent and authorization toolkits, previously published validated tools are underutilized. Informed consent requires the assessment of health literacy, reading level, and comprehension of research participants using validated assessment tools and methods. © 2014 Sigma Theta Tau International.

  7. Stability of Nanobubbles Formed at the Interface between Cold Water and Hot Highly Oriented Pyrolytic Graphite.

    PubMed

    An, Hongjie; Tan, Beng Hau; Zeng, Qingyun; Ohl, Claus-Dieter

    2016-11-01

    For the wider application of nanobubbles, a simple and reproducible nucleation process is not readily available. Here we describe a method for nucleating nanobubbles using only the most basic of conditions: depositing cold water at 4 °C on heated highly oriented pyrolytic graphite substrates. This method thus avoids the need, as in previous studies, to use secondary liquids, salts, or electrolysis to nucleate the nanobubbles and provides a pure system in which the properties of nanobubbles can be studied. The nanobubbles generated with this method are observed to survive for at least 5 days, barely changing their contact angles or heights after the first few hours. The stability of the nanobubbles in our system is discussed within the framework of some recently published theories.

  8. Adaptive sliding mode control for finite-time stability of quad-rotor UAVs with parametric uncertainties.

    PubMed

    Mofid, Omid; Mobayen, Saleh

    2018-01-01

    Adaptive control methods are developed for the stability and tracking control of flight systems in the presence of parametric uncertainties. This paper offers a design technique of adaptive sliding mode control (ASMC) for finite-time stabilization of unmanned aerial vehicle (UAV) systems with parametric uncertainties. Applying the Lyapunov stability concept and the finite-time convergence idea, the proposed control method guarantees that the states of the quad-rotor UAV converge to the origin with a finite-time convergence rate. Furthermore, an adaptive tuning scheme is proposed to estimate the unknown parameters of the quad-rotor UAV at any moment. Finally, simulation results are presented to demonstrate the effectiveness of the proposed technique compared with previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
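
    A minimal sketch, not the paper's controller, of the adaptive sliding mode idea on a scalar uncertain plant: the sliding variable drives a switching term whose gain is adapted online from |s|. The plant, gains, and adaptation rate are chosen purely for illustration.

        # Sketch: adaptive sliding mode control of a scalar uncertain plant
        #   x_dot = a*x + d(t) + u, with bounded unknown disturbance d(t).
        # Sliding surface s = x (regulation to the origin); the switching gain k
        # is adapted as k_dot = gamma*|s|. Parameters are illustrative only.
        import numpy as np

        dt, T = 1e-3, 5.0
        a, gamma = 0.5, 2.0          # plant parameter and adaptation rate
        x, k = 1.0, 0.0              # initial state and adaptive gain

        for i in range(int(T / dt)):
            t = i * dt
            d = 0.3 * np.sin(2 * np.pi * t)   # unknown bounded disturbance
            s = x                             # sliding variable
            u = -a * x - k * np.sign(s)       # equivalent control + switching term
            k += gamma * abs(s) * dt          # adaptive gain update
            x += (a * x + d + u) * dt         # Euler integration of the plant

        print(f"final |x| = {abs(x):.4f}, adapted gain k = {k:.3f}")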

  9. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE PAGES

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-03-03

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z_ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z_ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z_ave of a validation point was within model uncertainty of the measured value.

  10. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions.

    PubMed

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-05-30

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z ave ). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z ave . The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. Compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z ave of a validation point was within model uncertainty of the measured value. Copyright © 2018 Elsevier B.V. All rights reserved.
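
    The model-fitting step described above can be sketched generically as a Scheffé-type mixture model fitted by least squares and screened with leave-one-out cross-validation; the component proportions and Z ave values below are invented placeholders, not the paper's data, and the model form is only quadratic for brevity.

      import numpy as np

      # Hypothetical mixture data: proportions of ITZ, TW20, E5 (each row sums to 1)
      # and placeholder Z_ave responses (nm); not the data from the paper.
      X = np.array([[0.60, 0.30, 0.10], [0.50, 0.40, 0.10], [0.70, 0.20, 0.10],
                    [0.60, 0.20, 0.20], [0.50, 0.30, 0.20], [0.40, 0.40, 0.20],
                    [0.70, 0.10, 0.20], [0.55, 0.35, 0.10], [0.45, 0.25, 0.30]])
      y = np.array([310., 290., 350., 330., 300., 280., 360., 305., 295.])

      def scheffe_quadratic(X):
          """Design matrix for a Scheffé quadratic mixture model:
          x1, x2, x3, x1*x2, x1*x3, x2*x3 (no intercept, since proportions sum to 1)."""
          x1, x2, x3 = X.T
          return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

      def loo_rmse(X, y, design):
          """Leave-one-out cross-validation root-mean-square error."""
          errs = []
          for i in range(len(y)):
              keep = np.arange(len(y)) != i
              A = design(X[keep])
              coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
              pred = design(X[i:i + 1]) @ coef
              errs.append((pred[0] - y[i]) ** 2)
          return np.sqrt(np.mean(errs))

      print("LOO RMSE (quadratic Scheffé model):", loo_rmse(X, y, scheffe_quadratic))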

  11. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z ave of a validation point was within model uncertainty of the measured value.

  12. Bounding the moment deficit rate on crustal faults using geodetic data: Methods

    DOE PAGES

    Maurer, Jeremy; Segall, Paul; Bradley, Andrew Michael

    2017-08-19

    Here, the geodetically derived interseismic moment deficit rate (MDR) provides a first-order constraint on earthquake potential and can play an important role in seismic hazard assessment, but quantifying uncertainty in MDR is a challenging problem that has not been fully addressed. We establish criteria for reliable MDR estimators, evaluate existing methods for determining the probability density of MDR, and propose and evaluate new methods. Geodetic measurements moderately far from the fault provide tighter constraints on MDR than those nearby. Previously used methods can fail catastrophically under predictable circumstances. The bootstrap method works well with strong data constraints on MDR, but can be strongly biased when network geometry is poor. We propose two new methods: the Constrained Optimization Bounding Estimator (COBE) assumes uniform priors on slip rate (from geologic information) and MDR, and can be shown through synthetic tests to be a useful, albeit conservative estimator; the Constrained Optimization Bounding Linear Estimator (COBLE) is the corresponding linear estimator with Gaussian priors rather than point-wise bounds on slip rates. COBE matches COBLE with strong data constraints on MDR. We compare results from COBE and COBLE to previously published results for the interseismic MDR at Parkfield, on the San Andreas Fault, and find similar results; thus, the apparent discrepancy between MDR and the total moment release (seismic and afterslip) in the 2004 Parkfield earthquake remains.
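
    COBE and COBLE themselves are not reproduced here, but the general idea of bounding slip rates with box constraints in a linear geodetic inverse problem can be sketched with a constrained least-squares solve; the Green's function matrix, noise level, bounds, and moment weights below are synthetic assumptions.

      import numpy as np
      from scipy.optimize import lsq_linear

      rng = np.random.default_rng(0)

      # Synthetic linear problem d = G s + noise: geodetic velocities d from slip rates s
      # on 4 fault patches, observed at 12 stations (G is a made-up "Green's function" matrix).
      n_obs, n_patch = 12, 4
      G = rng.normal(size=(n_obs, n_patch))
      s_true = np.array([5.0, 12.0, 20.0, 8.0])            # mm/yr, used only to generate data
      d = G @ s_true + rng.normal(scale=0.5, size=n_obs)

      # Box constraints on slip rates, e.g. from geologic bounds (assumed values).
      res = lsq_linear(G, d, bounds=(np.zeros(n_patch), np.full(n_patch, 25.0)))

      # A moment-deficit-rate-like quantity as a weighted sum of slip rates; the weights
      # (shear modulus times patch area) are placeholders.
      weights = np.full(n_patch, 3e16)                      # N per (mm/yr), arbitrary
      print("estimated slip rates (mm/yr):", np.round(res.x, 2))
      print("implied moment deficit rate (N*m/yr): %.3e" % (weights @ res.x))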

  13. Bounding the moment deficit rate on crustal faults using geodetic data: Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurer, Jeremy; Segall, Paul; Bradley, Andrew Michael

    Here, the geodetically derived interseismic moment deficit rate (MDR) provides a first-order constraint on earthquake potential and can play an important role in seismic hazard assessment, but quantifying uncertainty in MDR is a challenging problem that has not been fully addressed. We establish criteria for reliable MDR estimators, evaluate existing methods for determining the probability density of MDR, and propose and evaluate new methods. Geodetic measurements moderately far from the fault provide tighter constraints on MDR than those nearby. Previously used methods can fail catastrophically under predictable circumstances. The bootstrap method works well with strong data constraints on MDR, but can be strongly biased when network geometry is poor. We propose two new methods: the Constrained Optimization Bounding Estimator (COBE) assumes uniform priors on slip rate (from geologic information) and MDR, and can be shown through synthetic tests to be a useful, albeit conservative estimator; the Constrained Optimization Bounding Linear Estimator (COBLE) is the corresponding linear estimator with Gaussian priors rather than point-wise bounds on slip rates. COBE matches COBLE with strong data constraints on MDR. We compare results from COBE and COBLE to previously published results for the interseismic MDR at Parkfield, on the San Andreas Fault, and find similar results; thus, the apparent discrepancy between MDR and the total moment release (seismic and afterslip) in the 2004 Parkfield earthquake remains.

  14. Quantitation of peptides from non-invasive skin tapings using isotope dilution and tandem mass spectrometry.

    PubMed

    Reisdorph, Nichole; Armstrong, Michael; Powell, Roger; Quinn, Kevin; Legg, Kevin; Leung, Donald; Reisdorph, Rick

    2018-05-01

    Previous work from our laboratories utilized a novel skin taping method and mass spectrometry-based proteomics to discover clinical biomarkers of skin conditions; these included atopic dermatitis, Staphylococcus aureus colonization, and eczema herpeticum. While suitable for discovery purposes, semi-quantitative proteomics is generally time-consuming and expensive. Furthermore, depending on the method used, discovery-based proteomics can result in high variation and inadequate sensitivity to detect low abundant peptides. Therefore, we strove to develop a rapid, sensitive, and reproducible method to quantitate disease-related proteins from skin tapings. We utilized isotopically-labeled peptides and tandem mass spectrometry to obtain absolute quantitation values on 14 peptides from 7 proteins; these proteins had shown previous importance in skin disease. The method demonstrated good reproducibility, dynamic range, and linearity (R² > 0.993) when n = 3 standards were analyzed across 0.05-2.5 pmol. The method was used to determine if differences exist between skin proteins in a small group of atopic versus non-atopic individuals (n = 12). While only minimal differences were found, peptides were detected in all samples and exhibited good correlation between peptides for 5 of the 7 proteins (R² = 0.71-0.98). This method can be applied to larger cohorts to further establish the relationships of these proteins to skin disease. Copyright © 2017. Published by Elsevier B.V.
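
    As a hedged illustration of the isotope-dilution quantitation principle (a calibration of light/heavy peak-area ratio against known standard amounts), the sketch below uses invented areas and amounts rather than values from the study.

      import numpy as np

      # Calibration: known amounts of unlabeled peptide (pmol) spiked with a fixed amount
      # of the isotopically labeled internal standard; response = light/heavy area ratio.
      std_amount = np.array([0.05, 0.10, 0.25, 0.50, 1.00, 2.50])        # pmol (placeholder)
      std_ratio = np.array([0.051, 0.098, 0.26, 0.49, 1.03, 2.47])       # L/H ratios (placeholder)

      slope, intercept = np.polyfit(std_ratio, std_amount, deg=1)
      r2 = np.corrcoef(std_ratio, std_amount)[0, 1] ** 2
      print(f"calibration: amount = {slope:.3f} * ratio + {intercept:.3f}, R^2 = {r2:.4f}")

      # Quantify an unknown sample from its observed light/heavy ratio.
      sample_ratio = 0.74
      print("estimated peptide amount: %.3f pmol" % (slope * sample_ratio + intercept))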

  15. High risk of false positive results in a widely used diagnostic test for detection of the porcine reproductive and respiratory syndrome virus (PRRSV).

    PubMed

    Fetzer, C; Pesch, S; Ohlinger, V F

    2006-06-15

    During 2003 and 2004, increasing numbers of positive PRRSV RT-PCR results were reported from herds negative for PRRSV infection. Interestingly, three herds represent nucleus herds with no animal contacts from outside and without clinical symptoms of PRRS until now. Since these positive results that were obtained using a PCR protocol adapted to routine laboratory conditions could not be reproduced with other PRRSV specific RT-PCRs, controlled negative and positive samples were used to examine this phenomenon. A RT-PCR assay for detection and differential diagnosis of the European and North American genotypes of the porcine reproductive and respiratory syndrome virus (PRRSV) according to the method previously published by Oleksiewicz et al. [Oleksiewicz, M.B., Botner, A., Madsen, K.G., Storgaard, T., 1998. Sensitive detection and typing of porcine reproductive and respiratory syndrome virus by RT-PCR amplification of whole viral genes. Vet. Microbiol. 64, 7-22] was investigated in parallel to another recently published method [Pesch, S., 2003. Etablierung einer Nachweismethode für die zwei Genotypen von dem porcine reproductive and respiratory syndrome virus (PRRSV) und ein Beitrag zu seiner molekularen Epidemiologie. Thesis, Institute of Virology, Faculty of Veterinary Medicine, University of Leipzig]. A panel of 228 clinical samples sent in for PRRSV routine diagnostics served as test panel. It was found that both methods have similar analytical sensitivity. However, the primers published by Oleksiewicz were shown to yield a very high proportion of false positive results under routine diagnostic laboratory conditions, i.e. they resulted in RT-PCR products with non-PRRSV sequences, that were indistinguishable from truly positive reagents in standard gel electrophoresis settings. The reason for and possible implications of this finding as well as the risk of modifying published methods without control are discussed.

  16. Retrospective analysis of natural products provides insights for future discovery trends.

    PubMed

    Pye, Cameron R; Bertin, Matthew J; Lokey, R Scott; Gerwick, William H; Linington, Roger G

    2017-05-30

    Understanding of the capacity of the natural world to produce secondary metabolites is important to a broad range of fields, including drug discovery, ecology, biosynthesis, and chemical biology, among others. Both the absolute number and the rate of discovery of natural products have increased significantly in recent years. However, there is a perception and concern that the fundamental novelty of these discoveries is decreasing relative to previously known natural products. This study presents a quantitative examination of the field from the perspective of both number of compounds and compound novelty using a dataset of all published microbial and marine-derived natural products. This analysis aimed to explore a number of key questions, such as how the rate of discovery of new natural products has changed over the past decades, how the average natural product structural novelty has changed as a function of time, whether exploring novel taxonomic space affords an advantage in terms of novel compound discovery, and whether it is possible to estimate how close we are to having described all of the chemical space covered by natural products. Our analyses demonstrate that most natural products being published today bear structural similarity to previously published compounds, and that the range of scaffolds readily accessible from nature is limited. However, the analysis also shows that the field continues to discover appreciable numbers of natural products with no structural precedent. Together, these results suggest that the development of innovative discovery methods will continue to yield compounds with unique structural and biological properties.

  17. Y-Chromosome Haplogroups in the Bosnian-Herzegovinian Population Based on 23 Y-STR Loci.

    PubMed

    Doğan, Serkan; Ašić, Adna; Doğan, Gulsen; Besic, Larisa; Marjanovic, Damir

    2016-07-01

    In a study of the Bosnian-Herzegovinian (B&H) population, Y-chromosome marker frequencies for 100 individuals, generated using the PowerPlex Y23 kit, were used to perform Y-chromosome haplogroup assignment via Whit Athey's Haplogroup Predictor. This algorithm determines Y-chromosome haplogroups from Y-chromosome short tandem repeat (Y-STR) data using a Bayesian probability-based approach. The most frequent haplogroup appeared to be I2a, with a prevalence of 49%, followed by R1a and E1b1b, each accounting for 17% of all haplogroups within the population. Remaining haplogroups were J2a (5%), I1 (4%), R1b (4%), J2b (2%), G2a (1%), and N (1%). These results confirm preliminary B&H population data published over 10 years ago, especially the prediction about the B&H population being a part of the Western Balkan area, which served as the Last Glacial Maximum refuge for the Paleolithic human European population. Furthermore, the results corroborate the hypothesis that this area was a significant stopping point on the "Middle East-Europe highway" during the Neolithic farmer migrations. Finally, since these results are almost completely in accordance with previously published data on B&H and neighboring populations generated by Y-chromosome single nucleotide polymorphism analysis, it can be concluded that in silico analysis of Y-STRs is a reliable method for approximation of the Y-chromosome haplogroup diversity of an examined population.

  18. Analysis of stimulant drugs in the wastewater of five Nordic capitals.

    PubMed

    Löve, Arndís Sue Ching; Baz-Lomba, Jose Antonio; Reid, Malcolm J; Kankaanpää, Aino; Gunnar, Teemu; Dam, Maria; Ólafsdóttir, Kristín; Thomas, Kevin V

    2018-06-15

    Wastewater-based epidemiology is an efficient way to assess illicit drug use, complementing currently used methods retrieved from different data sources. The aim of this study is to compare stimulant drug use in five Nordic capital cities that include for the first time wastewater samples from Torshavn in the Faroe Islands. Currently there are no published reports that compare stimulant drug use in these Nordic capitals. All wastewater samples were analyzed using solid phase extraction and ultra-high performance liquid chromatography coupled to tandem mass spectrometry. The results were compared with data published by the European Monitoring Centre for Drugs and Drug Addiction based on illicit drugs in wastewater from over 50 European cities. Confirming previous reports, the results showed high amphetamine loads compared with other European countries. Very little apparent abuse of stimulant drugs was detected in Torshavn. Methamphetamine loads were the highest from Helsinki of the Nordic countries, indicating substantial fluctuations in the availability of the drug compared with previous studies. Methamphetamine loads from Oslo confirmed that the use continues to be high. Estimated cocaine use was found to be in the lower range compared with other cities in the southern and western part of Europe. Ecstasy and cocaine showed clear variations between weekdays and weekends, indicating recreational use. This study further demonstrates geographical trends in the stimulant drug market in five Nordic capitals, which enables a better comparison with other areas of the continent. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
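
    The back-calculation commonly used in wastewater-based epidemiology normalizes a measured concentration by daily flow and population served; the sketch below is generic, with invented numbers and a placeholder correction factor rather than values from this study.

      # Population-normalized drug load (mg/day/1000 inhabitants), as commonly computed
      # in wastewater-based epidemiology. All numbers below are illustrative only.
      concentration_ng_per_l = 250.0      # measured concentration of a drug residue
      flow_l_per_day = 120_000_000.0      # daily wastewater flow at the treatment plant
      population = 200_000                # population served by the catchment
      correction = 1.0                    # excretion/stability correction (compound-specific)

      load_mg_per_day = concentration_ng_per_l * flow_l_per_day * 1e-6 * correction
      load_per_1000 = load_mg_per_day / (population / 1000.0)
      print(f"load: {load_mg_per_day:.1f} mg/day -> {load_per_1000:.2f} mg/day/1000 inhabitants")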

  19. First Nuclear DNA Amounts in more than 300 Angiosperms

    PubMed Central

    ZONNEVELD, B. J. M.; LEITCH, I. J.; BENNETT, M. D.

    2005-01-01

    • Background and Aims Genome size (DNA C-value) data are key biodiversity characters of fundamental significance used in a wide variety of biological fields. Since 1976, Bennett and colleagues have made scattered published and unpublished genome size data more widely accessible by assembling them into user-friendly compilations. Initially these were published as hard copy lists, but since 1997 they have also been made available electronically (see the Plant DNA C-values database www.kew.org/cval/homepage.html). Nevertheless, at the Second Plant Genome Size Meeting in 2003, Bennett noted that as many as 1000 DNA C-value estimates were still unpublished and hence unavailable. Scientists were strongly encouraged to communicate such unpublished data. The present work combines the databasing experience of the Kew-based authors with the unpublished C-values produced by Zonneveld to make a large body of valuable genome size data available to the scientific community. • Methods C-values for angiosperm species, selected primarily for their horticultural interest, were estimated by flow cytometry using the fluorochrome propidium iodide. The data were compiled into a table whose form is similar to previously published lists of DNA amounts by Bennett and colleagues. • Key Results and Conclusions The present work contains C-values for 411 taxa including first values for 308 species not listed previously by Bennett and colleagues. Based on a recent estimate of the global published output of angiosperm DNA C-value data (i.e. 200 first C-value estimates per annum) the present work equals 1·5 years of average global published output; and constitutes over 12 % of the latest 5-year global target set by the Second Plant Genome Size Workshop (see www.kew.org/cval/workshopreport.html). Hopefully, the present example will encourage others to unveil further valuable data which otherwise may lie forever unpublished and unavailable for comparative analyses. PMID:15905300
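
    Genome size estimation by propidium iodide flow cytometry is usually a simple ratio against an internal standard of known DNA amount; a minimal sketch follows, with a hypothetical standard value and arbitrary fluorescence peaks (not data from this compilation).

      def c_value_pg(sample_fluorescence, standard_fluorescence, standard_2c_pg):
          """2C DNA amount of the sample, scaled from mean PI fluorescence relative to an
          internal standard of known 2C value (assumes a linear fluorescence response)."""
          return standard_2c_pg * sample_fluorescence / standard_fluorescence

      # Illustrative numbers only: a hypothetical internal standard with 2C = 34.5 pg
      # and arbitrary fluorescence peak positions.
      sample_2c = c_value_pg(sample_fluorescence=182.0,
                             standard_fluorescence=405.0,
                             standard_2c_pg=34.5)
      print(f"estimated sample 2C value: {sample_2c:.2f} pg")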

  20. The use of genetic programming to develop a predictor of swash excursion on sandy beaches

    NASA Astrophysics Data System (ADS)

    Passarella, Marinella; Goldstein, Evan B.; De Muro, Sandro; Coco, Giovanni

    2018-02-01

    We use genetic programming (GP), a type of machine learning (ML) approach, to predict the total and infragravity swash excursion using previously published data sets that have been used extensively in swash prediction studies. Three previously published works with a range of new conditions are added to this data set to extend the range of measured swash conditions. Using this newly compiled data set we demonstrate that a ML approach can reduce the prediction errors compared to well-established parameterizations and therefore it may improve coastal hazards assessment (e.g. coastal inundation). Predictors obtained using GP can also be physically sound and replicate the functionality and dependencies of previously published formulas. Overall, we show that ML techniques are capable of both improving predictability (compared to classical regression approaches) and providing physical insight into coastal processes.
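
    A hedged sketch of GP-style symbolic regression is shown below, assuming the third-party gplearn package is available; the predictors, target, and function set are arbitrary placeholders and do not reproduce the swash data sets or predictors used in the paper.

      import numpy as np
      from gplearn.genetic import SymbolicRegressor  # third-party package (assumed installed)

      rng = np.random.default_rng(1)

      # Placeholder predictors (e.g. offshore wave height, period, beach slope) and a
      # synthetic "swash excursion" target; not the data sets used in the paper.
      X = rng.uniform(low=[0.5, 4.0, 0.01], high=[4.0, 16.0, 0.2], size=(200, 3))
      y = 0.8 * np.sqrt(X[:, 0]) * X[:, 1] * X[:, 2] + rng.normal(scale=0.05, size=200)

      gp = SymbolicRegressor(population_size=500, generations=10,
                             function_set=('add', 'sub', 'mul', 'div', 'sqrt'),
                             random_state=0)
      gp.fit(X, y)
      print(gp._program)                         # evolved symbolic expression
      print("R^2 on training data:", gp.score(X, y))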

  1. Publication rates from the All India Ophthalmic Conference 2010 compared to 2000: Are we improving?

    PubMed Central

    Kumaragurupari, R; Sengupta, Sabyasachi; Bhandari, Sahil

    2016-01-01

    Purpose: To determine the publication rates of free papers and posters presented at the All India Ophthalmic Conference (AIOC) 2010 in peer-reviewed journals up to December 2015 and compare this with publication rates from AIOC2000 published previously. Methods: A thorough literature search was conducted using PubMed, Google Scholar, and the general Google search engine by two independent investigators. The title of the paper, keywords and author names were used to “match” the AIOC free-paper with the published paper. In addition, the “purpose,” “methods,” and “outcome measures” between the two were studied to determine the “match.” Results: A total of 58 out of 394 free-papers (14.7%) from AIOC2010 were published till December 2015 compared to 16.5% from AIOC2000. Out of these, 52 (90%) were published in PubMed indexed journals. Maximum publications were seen in pediatric ophthalmology (50%) followed by glaucoma (24.4%) and cornea (23.8%). Fifteen out of 272 posters (5.5%) were published; orbit/oculoplastics had the highest poster publications (13%). Excluding papers in nonindexed journals and those by authors with international affiliations, the publication rate was approximately 12%. Conclusion: The publication rate of free papers from AIOC2010 has marginally reduced compared to AIOC2000. Various causes for this such as lack of adequate training, motivation, and lack of incentives for research in the Indian scenario have been explored, and measures to improve this paradigm have been discussed. It will be prudent to repeat this exercise every decade to compare publication rates between periodic AIOC, stimulate young minds for quality research and educate policy makers toward the need for developing dedicated research departments across the country. PMID:27905332

  2. A supertree pipeline for summarizing phylogenetic and taxonomic information for millions of species

    PubMed Central

    Redelings, Benjamin D.

    2017-01-01

    We present a new supertree method that enables rapid estimation of a summary tree on the scale of millions of leaves. This supertree method summarizes a collection of input phylogenies and an input taxonomy. We introduce formal goals and criteria for such a supertree to satisfy in order to transparently and justifiably represent the input trees. In addition to producing a supertree, our method computes annotations that describe which groupings in the input trees support and conflict with each group in the supertree. We compare our supertree construction method to a previously published supertree construction method by assessing their performance on input trees used to construct the Open Tree of Life version 4, and find that our method increases the number of displayed input splits from 35,518 to 39,639 and decreases the number of conflicting input splits from 2,760 to 1,357. The new supertree method also improves on the previous supertree construction method in that it produces no unsupported branches and avoids unnecessary polytomies. This pipeline is currently used by the Open Tree of Life project to produce all of the versions of the project’s “synthetic tree” starting at version 5. This software pipeline is called “propinquity”. It relies heavily on “otcetera”—a set of C++ tools to perform most of the steps of the pipeline. All of the components are free software and are available on GitHub. PMID:28265520

  3. Defining clinically important perioperative blood loss and transfusion for the Standardised Endpoints for Perioperative Medicine (StEP) collaborative: a protocol for a scoping review.

    PubMed

    Bartoszko, Justyna; Vorobeichik, Leon; Jayarajah, Mohandas; Karkouti, Keyvan; Klein, Andrew A; Lamy, Andre; Mazer, C David; Murphy, Mike; Richards, Toby; Englesakis, Marina; Myles, Paul S; Wijeysundera, Duminda N

    2017-06-30

    'Standardised Endpoints for Perioperative Medicine' (StEP) is an international collaboration undertaking development of consensus-based consistent definitions for endpoints in perioperative clinical trials. Inconsistency in endpoint definitions can make interpretation of trial results more difficult, especially if conflicting evidence is present. Furthermore, this inconsistency impedes evidence synthesis and meta-analyses. The goals of StEP are to harmonise definitions for clinically meaningful endpoints and specify standards for endpoint reporting in clinical trials. To help inform this endeavour, we aim to conduct a scoping review to systematically characterise the definitions of clinically important endpoints in the existing published literature on perioperative blood loss and transfusion. The scoping review will be conducted using the widely adopted framework developed by Arksey and O'Malley, with modifications from Levac. We refined our methods with guidance from research librarians as well as researchers and clinicians with content expertise. The electronic literature search will involve several databases including Medline, PubMed-not-Medline and Embase. Our review has three objectives, namely to (1) identify definitions of significant blood loss and transfusion used in previously published large perioperative randomised trials; (2) identify previously developed consensus-based definitions for significant blood loss and transfusion in perioperative medicine and related fields; and (3) describe the association between different magnitudes of blood loss and transfusion with postoperative outcomes. The multistage review process for each question will involve two reviewers screening abstracts, reading full-text articles and performing data extraction. The abstracted data will be organised and subsequently analysed in an iterative process. This scoping review of the previously published literature does not require research ethics approval. The results will be used to inform a consensus-based process to develop definitions of clinically important perioperative blood loss and transfusion. The results of the scoping review will be published in a peer-reviewed scientific journal. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Preliminary Multivariable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts, and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of previously published models is tested, cost-estimating relationships that are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  5. Optoelectronic scanning system upgrade by energy center localization methods

    NASA Astrophysics Data System (ADS)

    Flores-Fuentes, W.; Sergiyenko, O.; Rodriguez-Quiñonez, J. C.; Rivas-López, M.; Hernández-Balbuena, D.; Básaca-Preciado, L. C.; Lindner, L.; González-Navarro, F. F.

    2016-11-01

    A problem of upgrading an optoelectronic scanning system with digital post-processing of the signal based on adequate methods of energy center localization is considered. An improved dynamic triangulation analysis technique is proposed by an example of industrial infrastructure damage detection. A modification of our previously published method aimed at searching for the energy center of an optoelectronic signal is described. Application of the artificial intelligence algorithm of compensation for the error of determining the angular coordinate in calculating the spatial coordinate through dynamic triangulation is demonstrated. Five energy center localization methods are developed and tested to select the best method. After implementation of these methods, digital compensation for the measurement error, and statistical data analysis, a non-parametric behavior of the data is identified. The Wilcoxon signed rank test is applied to improve the result further. For optical scanning systems, it is necessary to detect a light emitter mounted on the infrastructure being investigated to calculate its spatial coordinate by the energy center localization method.
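
    One common energy-center localization approach is the intensity-weighted centroid; the sketch below applies it to a synthetic Gaussian pulse with noise and is only a generic illustration, not one of the five methods developed in the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic optoelectronic scan: Gaussian pulse centered at 62.3 (arbitrary units) + noise.
      t = np.arange(0, 128, 1.0)
      signal = np.exp(-0.5 * ((t - 62.3) / 4.0) ** 2) + rng.normal(scale=0.02, size=t.size)

      # Energy-center (centroid) localization above a crude noise floor.
      s = np.clip(signal - 3 * 0.02, 0.0, None)
      centroid = np.sum(t * s) / np.sum(s)
      print(f"energy center estimate: {centroid:.2f} (true 62.30)")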

  6. Proceedings of the Nautical Almanac Office Sesquicentennial Symposium Held in Washington, The District of Columbia on March 3-4, 1999

    DTIC Science & Technology

    1999-03-01

    Laboratory. While I was at USNO I was allocated a roll-top desk (previously used by H. R. Morgan) in the Library, and so I did not interact with the NAO...Tables de la lune construites d'après le principe newtonien de la gravitation universelle would not be published until 1857. They were, on the other...the methodical calculations for the astronomical tables, Newcomb had much free time in his daily schedule of work. He allocated some of the hours to

  7. Australian Tri-Service Anthropometric Survey, 1977. Part 1. Survey Planning, Conduct, Data Handling and Methods of Analysis,

    DTIC Science & Technology

    1979-07-01

    Descriptors: Anthropometry; Surveys; Biomedical data; Military personnel; Australia. Report ARL/SYS-15, K. C. Hendy, July 1979. Abstract (fragment): …was known about the anthropometry of Australian military personnel. For example, there have been no previously published attempts to compare the…

  8. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

    The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  9. Ecologic regression analysis and the study of the influence of air quality on mortality.

    PubMed Central

    Selvin, S; Merrill, D; Wong, L; Sacks, S T

    1984-01-01

    This presentation focuses entirely on the use and evaluation of regression analysis applied to ecologic data as a method to study the effects of ambient air pollution on mortality rates. Using extensive national data on mortality, air quality, and socio-economic status, regression analyses are used to study the influence of air quality on mortality. The analytic methods and data are selected in such a way that direct comparisons can be made with other ecologic regression studies of mortality and air quality. Analyses are performed using two types of geographic areas, age-specific mortality for both males and females, and three pollutants (total suspended particulates, sulfur dioxide and nitrogen dioxide). The overall results indicate that no persuasive evidence exists of a link between air quality and general mortality levels. Additionally, a lack of consistency between the present results and previously published work is noted. Overall, it is concluded that linear regression analysis applied to nationally collected ecologic data cannot be used to usefully infer a causal relationship between air quality and mortality, which is in direct contradiction to other major published studies. PMID:6734568
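
    An ecologic regression of this kind reduces to an ordinary least-squares fit of area-level mortality rates on pollutant levels and socio-economic covariates; the sketch below uses randomly generated area data with no true pollution effect, not the national data analyzed in the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      n_areas = 60

      # Synthetic area-level data: TSP, SO2, NO2 levels plus a socio-economic index.
      tsp = rng.uniform(40, 120, n_areas)
      so2 = rng.uniform(5, 60, n_areas)
      no2 = rng.uniform(10, 80, n_areas)
      ses = rng.normal(0, 1, n_areas)
      mortality = 800 + 5.0 * ses + rng.normal(scale=30, size=n_areas)  # no pollution effect

      X = np.column_stack([np.ones(n_areas), tsp, so2, no2, ses])
      coef, *_ = np.linalg.lstsq(X, mortality, rcond=None)
      for name, b in zip(["intercept", "TSP", "SO2", "NO2", "SES"], coef):
          print(f"{name:>9}: {b: .3f}")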

  10. Particle Swarm Optimization Based Feature Enhancement and Feature Selection for Improved Emotion Recognition in Speech and Glottal Signals

    PubMed Central

    Muthusamy, Hariharan; Polat, Kemal; Yaacob, Sazali

    2015-01-01

    In recent years, many research works have been published using speech-related features for speech emotion recognition; however, recent studies show that there is a strong correlation between emotional states and glottal features. In this work, Mel-frequency cepstral coefficients (MFCCs), linear predictive cepstral coefficients (LPCCs), perceptual linear predictive (PLP) features, gammatone filter outputs, timbral texture features, stationary wavelet transform based timbral texture features and relative wavelet packet energy and entropy features were extracted from the emotional speech (ES) signals and their glottal waveforms (GW). Particle swarm optimization based clustering (PSOC) and wrapper based particle swarm optimization (WPSO) were proposed to enhance the discerning ability of the features and to select the discriminating features respectively. Three different emotional speech databases were utilized to gauge the proposed method. Extreme learning machine (ELM) was employed to classify the different types of emotions. Different experiments were conducted and the results show that the proposed method significantly improves the speech emotion recognition performance compared to previous works published in the literature. PMID:25799141

  11. Expediting Combinatorial Data Set Analysis by Combining Human and Algorithmic Analysis.

    PubMed

    Stein, Helge Sören; Jiao, Sally; Ludwig, Alfred

    2017-01-09

    A challenge in combinatorial materials science remains the efficient analysis of X-ray diffraction (XRD) data and its correlation to functional properties. Rapid identification of phase regions and proper assignment of corresponding crystal structures are necessary to keep pace with the improved methods for synthesizing and characterizing materials libraries. Therefore, a new modular software package called htAx (high-throughput analysis of X-ray and functional properties data) is presented that couples human intelligence tasks used for "ground-truth" phase-region identification with subsequent unbiased verification by an algorithm to efficiently analyze which phases are present in a materials library. Identified phases and phase regions may then be correlated to functional properties in an expedited manner. To demonstrate the functionality of htAx, two previously published XRD benchmark data sets of the materials systems Al-Cr-Fe-O and Ni-Ti-Cu are analyzed with htAx. The analysis of ∼1000 XRD patterns takes less than 1 day with htAx. The proposed method reliably identifies phase-region boundaries and robustly identifies multiphase structures. The method also addresses the problem of identifying regions with previously unpublished crystal structures using a special daisy ternary plot.

  12. Obtaining reliable phase-gradient delays from otoacoustic emission data.

    PubMed

    Shera, Christopher A; Bergevin, Christopher

    2012-08-01

    Reflection-source otoacoustic emission phase-gradient delays are widely used to obtain noninvasive estimates of cochlear function and properties, such as the sharpness of mechanical tuning and its variation along the length of the cochlear partition. Although different data-processing strategies are known to yield different delay estimates and trends, their relative reliability has not been established. This paper uses in silico experiments to evaluate six methods for extracting delay trends from reflection-source otoacoustic emissions (OAEs). The six methods include both previously published procedures (e.g., phase smoothing, energy-weighting, data exclusion based on signal-to-noise ratio) and novel strategies (e.g., peak-picking, all-pass factorization). Although some of the methods perform well (e.g., peak-picking), others introduce substantial bias (e.g., phase smoothing) and are not recommended. In addition, since standing waves caused by multiple internal reflection can complicate the interpretation and compromise the application of OAE delays, this paper develops and evaluates two promising signal-processing strategies, the first based on time-frequency filtering using the continuous wavelet transform and the second on cepstral analysis, for separating the direct emission from its subsequent reflections. Altogether, the results help to resolve previous disagreements about the frequency dependence of human OAE delays and the sharpness of cochlear tuning while providing useful analysis methods for future studies.
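
    The phase-gradient delay itself is the negative derivative of unwrapped emission phase with respect to frequency; the minimal sketch below recovers a known 10 ms delay from a synthetic phase curve and is not one of the six trend-extraction methods compared in the paper.

      import numpy as np

      # Synthetic reflection-source emission with a constant 10 ms group delay.
      f = np.linspace(1000.0, 4000.0, 301)          # Hz
      tau_true = 0.010                              # s
      phase = -2 * np.pi * f * tau_true             # radians

      # Phase-gradient delay: tau(f) = -(1 / (2*pi)) * d(phase)/df, after unwrapping.
      phase_unwrapped = np.unwrap(phase)
      tau = -np.gradient(phase_unwrapped, f) / (2 * np.pi)
      print(f"mean estimated delay: {tau.mean() * 1e3:.2f} ms (true 10.00 ms)")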

  13. Segmentation of cortical bone using fast level sets

    NASA Astrophysics Data System (ADS)

    Chowdhury, Manish; Jörgens, Daniel; Wang, Chunliang; Smedby, Örjan; Moreno, Rodrigo

    2017-02-01

    Cortical bone plays a major role in the mechanical competence of bone. The analysis of cortical bone requires accurate segmentation methods. Level set methods are among the state of the art for segmenting medical images. However, traditional implementations of this method are computationally expensive. This drawback was recently tackled through the so-called coherent propagation extension of the classical algorithm, which has decreased computation times dramatically. In this study, we assess the potential of this technique for segmenting cortical bone in interactive time in 3D images acquired through High Resolution peripheral Quantitative Computed Tomography (HR-pQCT). The obtained segmentations are used to estimate cortical thickness and cortical porosity of the investigated images. Cortical thickness and cortical porosity are computed using sphere fitting and mathematical morphological operations, respectively. Qualitative comparison between the segmentations of our proposed algorithm and a previously published approach on six image volumes reveals superior smoothness properties of the level set approach. While the proposed method yields results similar to previous approaches in regions where the boundary between trabecular and cortical bone is well defined, it yields more stable segmentations in challenging regions. This results in more stable estimation of the parameters of cortical bone. The proposed technique takes a few seconds to compute, which makes it suitable for clinical settings.

  14. Distortion correction of echo planar images applying the concept of finite rate of innovation to point spread function mapping (FRIP).

    PubMed

    Nunes, Rita G; Hajnal, Joseph V

    2018-06-01

    Point spread function (PSF) mapping enables estimating the displacement fields required for distortion correction of echo planar images. Recently, a highly accelerated approach was introduced for estimating displacements from the phase slope of under-sampled PSF mapping data. Sampling schemes with varying spacing were proposed requiring stepwise phase unwrapping. To avoid unwrapping errors, an alternative approach applying the concept of finite rate of innovation to PSF mapping (FRIP) is introduced, using a pattern search strategy to locate the PSF peak, and the two methods are compared. Fully sampled PSF data was acquired in six subjects at 3.0 T, and distortion maps were estimated after retrospective under-sampling. The two methods were compared for both previously published and newly optimized sampling patterns. Prospectively under-sampled data were also acquired. Shift maps were estimated and deviations relative to the fully sampled reference map were calculated. The best performance was achieved when using FRIP with a previously proposed sampling scheme. The two methods were comparable for the remaining schemes. The displacement field errors tended to be lower as the number of samples or their spacing increased. A robust method for estimating the position of the PSF peak has been introduced.

  15. A comprehensive assessment of collision likelihood in Geosynchronous Earth Orbit

    NASA Astrophysics Data System (ADS)

    Oltrogge, D. L.; Alfano, S.; Law, C.; Cacioni, A.; Kelso, T. S.

    2018-06-01

    Knowing the likelihood of collision for satellites operating in Geosynchronous Earth Orbit (GEO) is of extreme importance and interest to the global community and the operators of GEO spacecraft. Yet for all of its importance, a comprehensive assessment of GEO collision likelihood is difficult to do and has never been done. In this paper, we employ six independent and diverse assessment methods to estimate GEO collision likelihood. Taken in aggregate, this comprehensive assessment offers estimates of GEO collision likelihood that are within a factor of 3.5 of each other. These results are then compared to four collision and seven encounter rate estimates previously published. Collectively, these new findings indicate that collision likelihood in GEO is as much as four orders of magnitude higher than previously published by other researchers. Results indicate that a collision is likely to occur every 4 years for one satellite out of the entire GEO active satellite population against a 1 cm RSO catalogue, and every 50 years against a 20 cm RSO catalogue. Further, previous assertions that collision relative velocities are low (i.e., <1 km/s) in GEO are disproven, with some GEO relative velocities as high as 4 km/s identified. These new findings indicate that unless operators successfully mitigate this collision risk, the GEO orbital arc is and will remain at high risk of collision, with the potential for serious follow-on collision threats from post-collision debris when a substantial GEO collision occurs.
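
    Converting an assumed encounter or collision rate into a probability over a time horizon follows a simple Poisson model; the rate below is an invented placeholder, not one of the paper's estimates.

      import math

      def collision_probability(rate_per_year, years):
          """Poisson probability of at least one collision in the given interval."""
          return 1.0 - math.exp(-rate_per_year * years)

      # Illustrative only: a hypothetical rate of one collision per 50 years.
      rate = 1.0 / 50.0
      for horizon in (1, 10, 25, 50):
          p = collision_probability(rate, horizon)
          print(f"P(>=1 collision within {horizon:>2} yr) = {p:.3f}")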

  16. A prospective gating method to acquire a diverse set of free-breathing CT images for model-based 4DCT

    NASA Astrophysics Data System (ADS)

    O'Connell, D.; Ruan, D.; Thomas, D. H.; Dou, T. H.; Lewis, J. H.; Santhanam, A.; Lee, P.; Low, D. A.

    2018-02-01

    Breathing motion modeling requires observation of tissues at sufficiently distinct respiratory states for proper 4D characterization. This work proposes a method to improve sampling of the breathing cycle with limited imaging dose. We designed and tested a prospective free-breathing acquisition protocol with a simulation using datasets from five patients imaged with a model-based 4DCT technique. Each dataset contained 25 free-breathing fast helical CT scans with simultaneous breathing surrogate measurements. Tissue displacements were measured using deformable image registration. A correspondence model related tissue displacement to the surrogate. Model residual was computed by comparing predicted displacements to image registration results. To determine a stopping criteria for the prospective protocol, i.e. when the breathing cycle had been sufficiently sampled, subsets of N scans where 5  ⩽  N  ⩽  9 were used to fit reduced models for each patient. A previously published metric was employed to describe the phase coverage, or ‘spread’, of the respiratory trajectories of each subset. Minimum phase coverage necessary to achieve mean model residual within 0.5 mm of the full 25-scan model was determined and used as the stopping criteria. Using the patient breathing traces, a prospective acquisition protocol was simulated. In all patients, phase coverage greater than the threshold necessary for model accuracy within 0.5 mm of the 25 scan model was achieved in six or fewer scans. The prospectively selected respiratory trajectories ranked in the (97.5  ±  4.2)th percentile among subsets of the originally sampled scans on average. Simulation results suggest that the proposed prospective method provides an effective means to sample the breathing cycle with limited free-breathing scans. One application of the method is to reduce the imaging dose of a previously published model-based 4DCT protocol to 25% of its original value while achieving mean model residual within 0.5 mm.
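
    A correspondence model of the kind described can be as simple as a linear fit of tissue displacement against the breathing surrogate and its rate; the sketch below fits such a model to synthetic samples and reports the residual, with all numbers invented.

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic surrogate amplitude v and its rate dv for 25 "scans", and a tissue
      # displacement (mm) generated from a hidden linear model plus noise.
      v = rng.uniform(0.0, 1.0, 25)
      dv = rng.normal(0.0, 0.3, 25)
      disp = 8.0 * v + 1.5 * dv + rng.normal(scale=0.3, size=25)

      # Fit displacement = a*v + b*dv + c and evaluate the mean absolute residual.
      A = np.column_stack([v, dv, np.ones_like(v)])
      coef, *_ = np.linalg.lstsq(A, disp, rcond=None)
      residual = np.abs(A @ coef - disp)
      print("model coefficients:", np.round(coef, 2))
      print(f"mean |residual| = {residual.mean():.2f} mm")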

  17. Improving the large scale purification of the HIV microbicide, griffithsin.

    PubMed

    Fuqua, Joshua L; Wanga, Valentine; Palmer, Kenneth E

    2015-02-22

    Griffithsin is a broad spectrum antiviral lectin that inhibits viral entry and maturation processes through binding clusters of oligomannose glycans on viral envelope glycoproteins. An efficient, scalable manufacturing process for griffithsin active pharmaceutical ingredient (API) is essential for particularly cost-sensitive products such as griffithsin-based topical microbicides for HIV-1 prevention in resource poor settings. Our previously published purification method used ceramic filtration followed by two chromatography steps, resulting in a protein recovery of 30%. Our objective was to develop a scalable purification method for griffithsin expressed in Nicotiana benthamiana plants that would increase yield, reduce production costs, and simplify manufacturing techniques. Considering the future need to transfer griffithsin manufacturing technology to resource poor areas, we chose to focus on modifying the purification process, paying particular attention to introducing simple, low-cost, and scalable procedures such as use of temperature, pH, ion concentration, and filtration to enhance product recovery. We achieved >99% pure griffithsin API by generating the initial green juice extract in pH 4 buffer, heating the extract to 55°C, incubating overnight with a bentonite MgCl2 mixture, and final purification with Capto™ multimodal chromatography. Griffithsin extracted with this protocol maintains activity comparable to griffithsin purified by the previously published method and we are able to recover a substantially higher yield: 88 ± 5% of griffithsin from the initial extract. The method was scaled to produce gram quantities of griffithsin with high yields, low endotoxin levels, and low purification costs maintained. The methodology developed to purify griffithsin introduces and develops multiple tools for purification of recombinant proteins from plants at an industrial scale. These tools allow for robust cost-effective production and purification of griffithsin. The methodology can be readily scaled to the bench top or industry and process components can be used for purification of additional proteins based on biophysical characteristics.

  18. Writing and Publishing Handbook.

    ERIC Educational Resources Information Center

    Hansen, William F., Ed.

    Intended to provide guidance in academic publishing to faculty members, especially younger faculty members, this handbook is a compilation of four previously published essays by different authors. Following a preface and an introduction, the four essays and their authors are as follows: (1) "One Writer's Secrets" (Donald M. Murray); (2)…

  19. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study

    PubMed Central

    Tam, Wilson W S; Lo, Kenneth K H; Khalechelvam, Parames

    2017-01-01

    Objective Systematic reviews (SRs) often poorly report key information, thereby diminishing their usefulness. Previous studies evaluated published SRs and determined that they failed to meet explicit criteria or characteristics. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was recommended as a reporting guideline for SR and meta-analysis (MA), but previous studies showed that adherence to the statement was not high for SRs published in different medical fields. Thus, the aims of this study are twofold: (1) to investigate the number of nursing journals that have required or recommended the use of the PRISMA statement for reporting SR, and (2) to examine the adherence of SRs and/or meta-analyses to the PRISMA statement published in nursing journals. Design A cross-sectional study. Methods Nursing journals listed in the ISI journal citation report were divided into 2 groups based on the recommendation of PRISMA statement in their ‘Instruction for Authors’. SRs and meta-analyses published in 2014 were searched in 3 databases. 37 SRs and meta-analyses were randomly selected in each group. The adherence of each item to the PRISMA was examined and summarised using descriptive statistics. The quality of the SRs was assessed by Assessing the Methodological Quality of Systematic Reviews. The differences between the 2 groups were compared using the Mann-Whitney U test. Results Out of 107 nursing journals, 30 (28.0%) recommended or required authors to follow the PRISMA statement when they submit SRs or meta-analyses. The median rates of adherence to the PRISMA statement for reviews published in journals with and without PRISMA endorsement were 64.9% (IQR: 17.6–92.3%) and 73.0% (IQR: 59.5–94.6%), respectively. No significant difference was observed in any of the items between the 2 groups. Conclusions The median adherence of SRs and meta-analyses in nursing journals to PRISMA is low at 64.9% and 73.0%, respectively. Nonetheless, the adherence level of nursing journals to the PRISMA statement does not significantly vary whether they endorse or recommend such a guideline. PMID:28174224
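
    Comparing per-review adherence between endorsing and non-endorsing journals reduces to a Mann-Whitney U test on two sets of adherence percentages; the values below are placeholders, not the study's data.

      import numpy as np
      from scipy.stats import mannwhitneyu

      # Placeholder adherence rates (% of PRISMA items reported) for reviews published in
      # journals that do / do not endorse PRISMA; not the study's data.
      endorsing = np.array([64.9, 55.0, 72.0, 81.1, 40.5, 91.9, 67.6, 59.5])
      non_endorsing = np.array([73.0, 62.2, 78.4, 86.5, 59.5, 94.6, 70.3, 64.9])

      stat, p = mannwhitneyu(endorsing, non_endorsing, alternative='two-sided')
      print(f"medians: {np.median(endorsing):.1f}% vs {np.median(non_endorsing):.1f}%")
      print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")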

  20. Airfoil self-noise and prediction

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Pope, D. Stuart; Marcolini, Michael A.

    1989-01-01

    A prediction method is developed for the self-generated noise of an airfoil blade encountering smooth flow. The prediction methods for the individual self-noise mechanisms are semiempirical and are based on previous theoretical studies and data obtained from tests of two- and three-dimensional airfoil blade sections. The self-noise mechanisms are due to specific boundary-layer phenomena, that is, the boundary-layer turbulence passing the trailing edge, separated-boundary-layer and stalled flow over an airfoil, vortex shedding due to laminar boundary layer instabilities, vortex shedding from blunt trailing edges, and the turbulent vortex flow existing near the tip of lifting blades. The predictions are compared successfully with published data from three self-noise studies of different airfoil shapes. An application of the prediction method is reported for a large scale-model helicopter rotor, and the predictions compared well with experimental broadband noise measurements. A computer code of the method is given.

  1. What Makes a Successful Survey? A Systematic Review of Surveys Used in Anterior Cruciate Ligament Reconstruction.

    PubMed

    Ekhtiari, Seper; Kay, Jeffrey; de Sa, Darren; Simunovic, Nicole; Musahl, Volker; Peterson, Devin C; Ayeni, Olufemi R

    2017-05-01

    To characterize and assess the methodological quality of patient and physician surveys related to anterior cruciate ligament reconstruction, and to analyze the factors influencing response rate. The databases MEDLINE, Embase, and PubMed were searched from database inception to search date and screened in duplicate for relevant studies. Data regarding survey characteristics, response rates, and distribution methods were extracted. A previously published list of recommendations for high-quality surveys in orthopaedics was used as a scale to assess survey quality (12 items scored 0, 1, or 2; maximum score = 24). Of the initial 1,276 studies, 53 studies published between 1986 and 2016 met the inclusion criteria. Sixty-four percent of studies were distributed to physicians, compared with 32% distributed to patients and less than 4% to coaches. The median number of items in each survey was 10.5, and the average response rate was 73% (range: 18% to 100%). In-person distribution was the most common method (40%), followed by web-based methods (28%) and mail (25%). Response rates were highest for surveys targeted at patients (77%, P < .0001) and those delivered in-person (94%, P < .0001). The median quality score was 12/24 (range = 8.5/24 to 21/24). There was high inter-rater agreement using the quality scale (intraclass correlation coefficient = 0.92), but there was no correlation with the response rate (Rho = -0.01, P = .97). Response rates vary based on target audience and distribution methods, with patients responding at a significantly higher rate than physicians and in-person distribution yielding significantly higher response rates than web or mail surveys. Level IV, systematic review of Level IV studies. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  2. Assessment of interchangeability rate between 2 methods of measurements: An example with a cardiac output comparison study.

    PubMed

    Lorne, Emmanuel; Diouf, Momar; de Wilde, Robert B P; Fischer, Marc-Olivier

    2018-02-01

    The Bland-Altman (BA) and percentage error (PE) methods have been previously described to assess the agreement between 2 methods of medical or laboratory measurements. This type of approach raises several problems: the BA methodology constitutes a subjective approach to interchangeability, whereas the PE approach does not take into account the distribution of values over a range. We describe a new methodology that defines an interchangeability rate between 2 methods of measurement and cutoff values that determine the range of interchangeable values. We used simulated data and a previously published data set to demonstrate the concept of the method. The interchangeability rate of 5 different cardiac output (CO) pulse contour techniques (Wesseling method, LiDCO, PiCCO, Hemac method, and Modelflow) was calculated, in comparison with the reference pulmonary artery thermodilution CO, using our new method. In our example, Modelflow, with a good interchangeability rate of 93% and a cutoff value of 4.8 L/min, was found to be interchangeable with the thermodilution method for >95% of measurements. Modelflow had a higher interchangeability rate compared to Hemac (93% vs 86%; P = .022) or other monitors (Wesseling cZ = 76%, LiDCO = 73%, and PiCCO = 62%; P < .0001). Simulated data and reanalysis of a data set comparing 5 CO monitors against thermodilution CO showed that, depending on the repeatability of the reference method, the interchangeability rate combined with a cutoff value could be used to define the range of values over which interchangeability remains acceptable.
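
    For context, the two conventional agreement measures the paper builds on, Bland-Altman limits of agreement and the percentage error, can be computed as follows; the paired cardiac output values are invented and the new interchangeability-rate method itself is not reproduced here.

      import numpy as np

      # Placeholder paired cardiac output measurements (L/min): reference thermodilution
      # vs. a pulse-contour method. Not data from the study.
      ref = np.array([4.1, 5.3, 6.0, 3.8, 7.2, 5.6, 4.9, 6.4])
      test = np.array([4.4, 5.0, 6.3, 4.1, 6.8, 5.9, 5.2, 6.1])

      diff = test - ref
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)            # half-width of the limits of agreement
      pct_error = 100.0 * loa / ref.mean()     # percentage error relative to mean reference CO

      print(f"bias = {bias:.2f} L/min, limits of agreement = {bias - loa:.2f} to {bias + loa:.2f}")
      print(f"percentage error = {pct_error:.1f}% (often judged against a 30% threshold)")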

  3. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    NASA Astrophysics Data System (ADS)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrological relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized relief closed contour method may be the most capable method to date, but more development is required.

  4. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    PubMed

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392) or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses when compared with Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work based email address may be a more valid means of contact than a Hotmail address.

  5. An approximation of herd effect due to vaccinating children against seasonal influenza – a potential solution to the incorporation of indirect effects into static models

    PubMed Central

    2013-01-01

    Background Indirect herd effect from vaccination of children offers potential for improving the effectiveness of influenza prevention in the remaining unvaccinated population. Static models used in cost-effectiveness analyses cannot dynamically capture herd effects. The objective of this study was to develop a methodology to allow herd effect associated with vaccinating children against seasonal influenza to be incorporated into static models evaluating the cost-effectiveness of influenza vaccination. Methods Two previously published linear equations for approximation of herd effects in general were compared with the results of a structured literature review undertaken using PubMed searches to identify data on herd effects specific to influenza vaccination. A linear function was fitted to point estimates from the literature using the sum of squared residuals. Results The literature review identified 21 publications on 20 studies for inclusion. Six studies provided data on a mathematical relationship between effective vaccine coverage in subgroups and reduction of influenza infection in a larger unvaccinated population. These supported a linear relationship when effective vaccine coverage in a subgroup population was between 20% and 80%. Three studies evaluating herd effect at a community level, specifically induced by vaccinating children, provided point estimates for fitting linear equations. The fitted linear equation for herd protection in the target population for vaccination (children) was slightly less conservative than a previously published equation for herd effects in general. The fitted linear equation for herd protection in the non-target population was considerably less conservative than the previously published equation. Conclusions This method of approximating herd effect requires simple adjustments to the annual baseline risk of influenza in static models: (1) for the age group targeted by the childhood vaccination strategy (i.e. children); and (2) for other age groups not targeted (e.g. adults and/or elderly). Two approximations provide a linear relationship between effective coverage and reduction in the risk of infection. The first is a conservative approximation, recommended as a base-case for cost-effectiveness evaluations. The second, fitted to data extracted from a structured literature review, provides a less conservative estimate of herd effect, recommended for sensitivity analyses. PMID:23339290
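
    A minimal sketch of the fitting step described above, with hypothetical point estimates standing in for the values extracted in the literature review: a straight line relating effective vaccine coverage in children to the relative reduction in infection risk in a larger population is fitted by least squares (minimizing the sum of squared residuals).

        import numpy as np

        # Hypothetical point estimates: (effective coverage in children,
        # relative reduction in influenza risk in the unvaccinated population).
        coverage  = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
        reduction = np.array([0.10, 0.22, 0.33, 0.45, 0.55])

        # Least-squares linear fit.
        slope, intercept = np.polyfit(coverage, reduction, deg=1)
        residuals = reduction - (slope * coverage + intercept)
        print(f"reduction ~= {slope:.2f} * coverage + {intercept:.2f}, "
              f"SSR = {np.sum(residuals ** 2):.4f}")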

  6. Birth month affects lifetime disease risk: a phenome-wide method.

    PubMed

    Boland, Mary Regina; Shahn, Zachary; Madigan, David; Hripcsak, George; Tatonetti, Nicholas P

    2015-09-01

    An individual's birth month has a significant impact on the diseases they develop during their lifetime. Previous studies reveal relationships between birth month and several diseases including atherothrombosis, asthma, attention deficit hyperactivity disorder, and myopia, leaving most diseases completely unexplored. This retrospective population study systematically explores the relationship between seasonal effects at birth and lifetime disease risk for 1688 conditions. We developed a hypothesis-free method that minimizes publication and disease selection biases by systematically investigating disease-birth month patterns across all conditions. Our dataset includes 1 749 400 individuals with records at New York-Presbyterian/Columbia University Medical Center born between 1900 and 2000 inclusive. We modeled associations between birth month and 1688 diseases using logistic regression. Significance was tested using a chi-squared test with multiplicity correction. We found 55 diseases that were significantly dependent on birth month. Of these, 19 were previously reported in the literature (P < .001), 20 were for conditions with close relationships to those reported, and 16 were previously unreported. We found distinct incidence patterns across disease categories. Lifetime disease risk is affected by birth month. Seasonally dependent early developmental mechanisms may play a role in increasing lifetime risk of disease. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
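
    As a simplified stand-in for the hypothesis-free screen described above (the study modeled 1688 conditions with logistic regression; here a plain chi-squared test of a birth-month-by-disease contingency table is shown, with Bonferroni correction for multiplicity), using synthetic counts:

        import numpy as np
        from scipy.stats import chi2_contingency

        rng = np.random.default_rng(0)
        n_diseases, n_months, alpha = 100, 12, 0.05

        p_values = []
        for d in range(n_diseases):
            # Synthetic 2 x 12 table: cases and non-cases per birth month.
            cases     = rng.poisson(50, n_months)
            non_cases = rng.poisson(5000, n_months)
            table = np.vstack([cases, non_cases])
            _, p, _, _ = chi2_contingency(table)
            p_values.append(p)

        # Bonferroni correction across all conditions tested.
        significant = [d for d, p in enumerate(p_values) if p < alpha / n_diseases]
        print(f"{len(significant)} of {n_diseases} synthetic conditions flagged")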

  7. Fast detection and characterization of organic and inorganic gunshot residues on the hands of suspects by CMV-GC-MS and LIBS.

    PubMed

    Tarifa, Anamary; Almirall, José R

    2015-05-01

    A rapid method for the characterization of both organic and inorganic components of gunshot residues (GSR) is proposed as an alternative tool to facilitate the identification of a suspected shooter. In this study, two fast screening methods were developed and optimized for the detection of organic compounds and inorganic components indicative of GSR presence on the hands of shooters and non-shooters. The proposed methods consist of headspace extraction of volatile organic compounds using a capillary microextraction of volatiles (CMV) device previously reported as a high-efficiency sampler, followed by detection by GC-MS. This novel sampling technique has the potential to yield fast results (<2 min sampling) and high sensitivity, capable of detecting 3 ng of diphenylamine (DPA) and 8 ng of nitroglycerine (NG). Direct analysis of the headspace of over 50 swabs collected from the hands of suspected shooters (and non-shooters) provides information regarding VOCs present on their hands. In addition, a fast laser induced breakdown spectroscopy (LIBS) screening method for the detection of the inorganic components indicative of the presence of GSR (Sb, Pb and Ba) is described. The sampling method for the inorganics consists of liquid extraction of the target elements from the same cotton swabs (previously analyzed for VOCs) and an additional 30 swab samples, followed by spiking 1 μL of the extract solution onto a Teflon disk, which is then analyzed by LIBS. Advantages of LIBS include fast analysis (~12 s per sample) and high selectivity and sensitivity, with expected LODs of 0.1-18 ng for each of the target elements after sampling. The analytical performance of the LIBS method is also compared to previously reported methods (inductively coupled plasma-optical emission spectroscopy). The combination of fast CMV sampling, unambiguous organic compound identification with GC-MS and fast LIBS analysis provides the basis for a new comprehensive screening method for GSR. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Clustering of Farsi sub-word images for whole-book recognition

    NASA Astrophysics Data System (ADS)

    Soheili, Mohammad Reza; Kabir, Ehsanollah; Stricker, Didier

    2015-01-01

    Redundancy of word and sub-word occurrences in large documents can be effectively utilized in an OCR system to improve recognition results. Most OCR systems employ language modeling techniques as a post-processing step; however, these techniques do not use the important pictorial information that exists in the text image. In the case of large-scale recognition of degraded documents, this information is even more valuable. In our previous work, we proposed a sub-word image clustering method for applications dealing with large printed documents. In our clustering method, the ideal case is when all equivalent sub-word images lie in one cluster. To overcome the issues of low print quality, the clustering method uses an image matching algorithm for measuring the distance between two sub-word images. The measured distance, together with a set of simple shape features, was used to cluster all sub-word images. In this paper, we analyze the effects of adding more shape features on processing time, purity of clustering, and the final recognition rate. Previously published experiments have shown the efficiency of our method on a book. Here we present extended experimental results and evaluate our method on another book with a totally different font face. We also show that the number of newly created clusters in a page can be used as a criterion for assessing the quality of print and evaluating preprocessing phases.
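
    A minimal sketch of the clustering idea, assuming a precomputed pairwise distance that blends an image-matching distance with a shape-feature distance (the weights, the random distance matrices, and the clustering threshold are hypothetical placeholders, not the authors' algorithm):

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def combined_distance(img_dist, feat_dist, w=0.7):
            # Weighted blend of an image-matching distance matrix and a
            # shape-feature distance matrix (both n x n, symmetric, zero diagonal).
            return w * img_dist + (1.0 - w) * feat_dist

        rng = np.random.default_rng(1)
        n = 20
        a = rng.random((n, n))
        img_dist = (a + a.T) / 2
        np.fill_diagonal(img_dist, 0)
        b = rng.random((n, n))
        feat_dist = (b + b.T) / 2
        np.fill_diagonal(feat_dist, 0)

        dist = combined_distance(img_dist, feat_dist)
        labels = fcluster(linkage(squareform(dist), method="average"),
                          t=0.4, criterion="distance")
        print("cluster labels:", labels)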

  9. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    PubMed

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.

  10. Image Quality Ranking Method for Microscopy

    PubMed Central

    Koho, Sami; Fazeli, Elnaz; Eriksson, John E.; Hänninen, Pekka E.

    2016-01-01

    Automated analysis of microscope images is necessitated by the increased need for high-resolution follow up of events in time. Manually finding the right images to analyze, or to eliminate from data analysis, is a common day-to-day problem in microscopy research today, and the constantly growing size of image datasets does not help the matter. We propose a simple method and a software tool for sorting images within a dataset, according to their relative quality. We demonstrate the applicability of our method in finding good quality images in a STED microscope sample preparation optimization image dataset. The results are validated by comparisons to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images, by extensive simulations, and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703

  11. New method for solving inductive electric fields in the non-uniformly conducting ionosphere

    NASA Astrophysics Data System (ADS)

    Vanhamäki, H.; Amm, O.; Viljanen, A.

    2006-10-01

    We present a new calculation method for solving inductive electric fields in the ionosphere. The time series of the potential part of the ionospheric electric field, together with the Hall and Pedersen conductances serves as the input to this method. The output is the time series of the induced rotational part of the ionospheric electric field. The calculation method works in the time-domain and can be used with non-uniform, time-dependent conductances. In addition, no particular symmetry requirements are imposed on the input potential electric field. The presented method makes use of special non-local vector basis functions called the Cartesian Elementary Current Systems (CECS). This vector basis offers a convenient way of representing curl-free and divergence-free parts of 2-dimensional vector fields and makes it possible to solve the induction problem using simple linear algebra. The new calculation method is validated by comparing it with previously published results for Alfvén wave reflection from a uniformly conducting ionosphere.

  12. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study.

    PubMed

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-08-07

    Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use.
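
    A minimal sketch of the kind of automation described above, using the statsmodels VAR implementation on synthetic EMA-like series (the variable names, coupling, and search space are illustrative; AutoVAR itself evaluates and summarizes many more model variants):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(2)
        n = 120
        mood, activity = np.zeros(n), np.zeros(n)
        for t in range(1, n):  # simple coupled synthetic series
            activity[t] = 0.5 * activity[t - 1] + rng.normal(scale=1.0)
            mood[t] = 0.4 * mood[t - 1] + 0.3 * activity[t - 1] + rng.normal(scale=1.0)

        data = pd.DataFrame({"mood": mood, "activity": activity})
        fit = VAR(data).fit(maxlags=5, ic="aic")  # lag order selected by AIC
        print("selected lag order:", fit.k_ar,
              " AIC:", round(fit.aic, 2), " BIC:", round(fit.bic, 2))

        # Granger causality: does 'activity' help predict 'mood'?
        gc = fit.test_causality("mood", ["activity"], kind="f")
        print("Granger F-test p-value:", round(gc.pvalue, 4))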

  13. An Improved Pansharpening Method for Misaligned Panchromatic and Multispectral Data

    PubMed Central

    Jing, Linhai; Tang, Yunwei; Ding, Haifeng

    2018-01-01

    Numerous pansharpening methods have been proposed in recent decades for fusing low-spatial-resolution multispectral (MS) images with high-spatial-resolution (HSR) panchromatic (PAN) bands to produce fused HSR MS images, which are widely used in various remote sensing tasks. The effect of misregistration between MS and PAN bands on the quality of fused products has gained much attention in recent years. An improved method for misaligned MS and PAN imagery is proposed, through two improvements made on a previously published method named RMI (reduce misalignment impact). The performance of the proposed method was assessed by comparing it with some outstanding fusion methods, such as adaptive Gram-Schmidt and the generalized Laplacian pyramid. Experimental results show that the improved version can reduce spectral distortions of fused dark pixels and sharpen boundaries between different image objects, while obtaining quality indexes similar to the original RMI method. In addition, the proposed method was evaluated with respect to its sensitivity to misalignments between MS and PAN bands. It is shown that the proposed method is more robust to misalignments between MS and PAN bands than the other methods. PMID:29439502

  14. An Improved Pansharpening Method for Misaligned Panchromatic and Multispectral Data.

    PubMed

    Li, Hui; Jing, Linhai; Tang, Yunwei; Ding, Haifeng

    2018-02-11

    Numerous pansharpening methods have been proposed in recent decades for fusing low-spatial-resolution multispectral (MS) images with high-spatial-resolution (HSR) panchromatic (PAN) bands to produce fused HSR MS images, which are widely used in various remote sensing tasks. The effect of misregistration between MS and PAN bands on the quality of fused products has gained much attention in recent years. An improved method for misaligned MS and PAN imagery is proposed, through two improvements made on a previously published method named RMI (reduce misalignment impact). The performance of the proposed method was assessed by comparing it with some outstanding fusion methods, such as adaptive Gram-Schmidt and the generalized Laplacian pyramid. Experimental results show that the improved version can reduce spectral distortions of fused dark pixels and sharpen boundaries between different image objects, while obtaining quality indexes similar to the original RMI method. In addition, the proposed method was evaluated with respect to its sensitivity to misalignments between MS and PAN bands. It is shown that the proposed method is more robust to misalignments between MS and PAN bands than the other methods.

  15. PrePhyloPro: phylogenetic profile-based prediction of whole proteome linkages

    PubMed Central

    Niu, Yulong; Liu, Chengcheng; Moghimyfiroozabad, Shayan; Yang, Yi

    2017-01-01

    Direct and indirect functional links between proteins as well as their interactions as part of larger protein complexes or common signaling pathways may be predicted by analyzing the correlation of their evolutionary patterns. Based on phylogenetic profiling, here we present a highly scalable and time-efficient computational framework for predicting linkages within the whole human proteome. We have validated this method through analysis of 3,697 human pathways and molecular complexes and a comparison of our results with the prediction outcomes of previously published co-occurrency model-based and normalization methods. Here we also introduce PrePhyloPro, a web-based software that uses our method for accurately predicting proteome-wide linkages. We present data on interactions of human mitochondrial proteins, verifying the performance of this software. PrePhyloPro is freely available at http://prephylopro.org/phyloprofile/. PMID:28875072
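
    A minimal sketch of the phylogenetic profiling idea (not the PrePhyloPro algorithm itself): binary presence/absence profiles of proteins across genomes are correlated, and the most strongly correlated pairs are proposed as candidate functional linkages. The profiles below are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)
        n_proteins, n_genomes = 8, 40

        # Synthetic presence/absence profiles; proteins 0 and 1 co-occur with noise.
        profiles = rng.integers(0, 2, size=(n_proteins, n_genomes)).astype(float)
        profiles[1] = profiles[0]
        profiles[1, :3] = 1 - profiles[1, :3]

        corr = np.corrcoef(profiles)  # protein-by-protein profile correlation
        pairs = [(i, j, corr[i, j])
                 for i in range(n_proteins) for j in range(i + 1, n_proteins)]
        pairs.sort(key=lambda t: -t[2])
        for i, j, r in pairs[:3]:
            print(f"candidate linkage: protein {i} - protein {j}, r = {r:.2f}")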

  16. Mean glandular dose coefficients (D(g)N) for x-ray spectra used in contemporary breast imaging systems.

    PubMed

    Nosratieh, Anita; Hernandez, Andrew; Shen, Sam Z; Yaffe, Martin J; Seibert, J Anthony; Boone, John M

    2015-09-21

    To develop tables of normalized glandular dose coefficients D(g)N for a range of anode-filter combinations and tube voltages used in contemporary breast imaging systems. Previously published mono-energetic D(g)N values were used with various spectra to mathematically compute D(g)N coefficients. The tungsten anode spectra from TASMICS were used; molybdenum and rhodium anode spectra were generated using the MCNPX Monte Carlo code. The spectra were filtered with various thicknesses of Al, Rh, Mo or Cu. An initial half value layer (HVL) calculation was made using the anode and filter material. A range of HVL values was produced at each tube voltage with the addition of small thicknesses of polymethyl methacrylate (PMMA) as a surrogate for the breast compression paddle. Using a spectral weighting method, D(g)N coefficients for the generated spectra were calculated for breast glandular densities of 0%, 12.5%, 25%, 37.5%, 50% and 100% for a range of compressed breast thicknesses from 3 to 8 cm. Eleven tables of normalized glandular dose (D(g)N) coefficients were produced for the following anode/filter combinations: W + 50 μm Ag, W + 500 μm Al, W + 700 μm Al, W + 200 μm Cu, W + 300 μm Cu, W + 50 μm Rh, Mo + 400 μm Cu, Mo + 30 μm Mo, Mo + 25 μm Rh, Rh + 400 μm Cu and Rh + 25 μm Rh. Where possible, these results were compared to previously published D(g)N values and were found to be on average less than 2% different from previously reported values. Over 200 pages of D(g)N coefficients were computed for modeled x-ray system spectra that are used in a number of new breast imaging applications. The reported values were found to be in excellent agreement when compared to published values.
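
    A minimal sketch of one common spectral weighting convention, weighting the monoenergetic coefficients by each energy bin's contribution to air kerma; whether this matches the authors' exact scheme is an assumption, and the spectrum, coefficient, and attenuation values below are hypothetical placeholders, not tabulated data.

        import numpy as np

        # Hypothetical spectrum (relative photon fluence per energy bin) and
        # monoenergetic D(g)N coefficients; real values come from TASMICS /
        # Monte Carlo tables.
        energy_keV   = np.array([10, 14, 18, 22, 26, 30], dtype=float)
        fluence      = np.array([0.0, 0.3, 1.0, 0.8, 0.4, 0.1])
        dgn_mono     = np.array([0.05, 0.20, 0.45, 0.65, 0.80, 0.90])   # illustrative
        muen_rho_air = np.array([4.6, 1.7, 0.8, 0.5, 0.35, 0.25])       # cm^2/g, illustrative

        # Weight each bin by its contribution to air kerma:
        # fluence * energy * (mu_en/rho)_air.
        kerma_weight = fluence * energy_keV * muen_rho_air
        dgn_spectrum = np.sum(dgn_mono * kerma_weight) / np.sum(kerma_weight)
        print(f"spectrum-weighted DgN ~ {dgn_spectrum:.3f} (illustrative units)")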

  17. How many novel eukaryotic 'kingdoms'? Pitfalls and limitations of environmental DNA surveys

    PubMed Central

    Berney, Cédric; Fahrni, José; Pawlowski, Jan

    2004-01-01

    Background Over the past few years, the use of molecular techniques to detect cultivation-independent, eukaryotic diversity has proven to be a powerful approach. Based on small-subunit ribosomal RNA (SSU rRNA) gene analyses, these studies have revealed the existence of an unexpected variety of new phylotypes. Some of them represent novel diversity in known eukaryotic groups, mainly stramenopiles and alveolates. Others do not seem to be related to any molecularly described lineage, and have been proposed to represent novel eukaryotic kingdoms. In order to review the evolutionary importance of this novel high-level eukaryotic diversity critically, and to test the potential technical and analytical pitfalls and limitations of eukaryotic environmental DNA surveys (EES), we analysed 484 environmental SSU rRNA gene sequences, including 81 new sequences from sediments of the small river, the Seymaz (Geneva, Switzerland). Results Based on a detailed screening of an exhaustive alignment of eukaryotic SSU rRNA gene sequences and the phylogenetic re-analysis of previously published environmental sequences using Bayesian methods, our results suggest that the number of novel higher-level taxa revealed by previously published EES was overestimated. Three main sources of errors are responsible for this situation: (1) the presence of undetected chimeric sequences; (2) the misplacement of several fast-evolving sequences; and (3) the incomplete sampling of described, but yet unsequenced eukaryotes. Additionally, EES give a biased view of the diversity present in a given biotope because of the difficult amplification of SSU rRNA genes in some taxonomic groups. Conclusions Environmental DNA surveys undoubtedly contribute to reveal many novel eukaryotic lineages, but there is no clear evidence for a spectacular increase of the diversity at the kingdom level. After re-analysis of previously published data, we found only five candidate lineages of possible novel high-level eukaryotic taxa, two of which comprise several phylotypes that were found independently in different studies. To ascertain their taxonomic status, however, the organisms themselves have now to be identified. PMID:15176975

  18. SU-D-209-02: Percent Depth Dose Curves for Fluoroscopic X-Ray Beam Qualities Incorporating Copper Filtration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wunderle, K; Wayne State University School of Medicine, Detroit, MI; Godley, A

    Purpose: The purpose of this investigation was to quantify percent depth dose (PDD) curves for fluoroscopic x-ray beam qualities incorporating added copper filtration. Methods: A PTW (Freiburg, Germany) MP3 water tank was used with a Standard Imaging (Middleton, WI) Exradin Model 11 Spokas Chamber to measure PDD curves for 60, 80, 100 and 120 kVp x-ray beams with copper filtration ranging from 0.0–0.9 mm at 22 cm and 42 cm fields of view from 0 to 150 mm of water. A free-in-air monitor chamber was used to normalize the water tank data to fluctuations in output from the fluoroscope. The measurements were acquired on a Siemens (Erlangen, Germany) Artis ZeeGo fluoroscope. The fluoroscope was inverted from the typical orientation, providing an x-ray beam originating from above the water tank. The water tank was positioned so that the water level was located at 60 cm from the focal spot, which also represents the focal spot to interventional reference plane distance for that fluoroscope. Results: PDDs for 60, 80, 100, and 120 kVp with 0 mm of copper filtration compared well to previously published data by Fetterly et al. [Med Phys, 28, 205 (2001)] for those beam qualities, given differences in fluoroscopes, geometric orientation, type of ionization chamber, and the water tank used for data collection. PDDs for 60, 80, 100, and 120 kVp with copper filtration, which had not been previously investigated and published, were obtained and are presented. Conclusion: The equipment and processes used to acquire the reported data were sound and compared well with previously published data for PDDs without copper filtration. PDD data for the fluoroscopic x-ray beams incorporating copper filtration can be used as reference data for estimating organ or soft tissue dose at depth involving similar beam qualities or for comparison with mathematical models.

  19. PCM-SABRE: a platform for benchmarking and comparing outcome prediction methods in precision cancer medicine.

    PubMed

    Eyal-Altman, Noah; Last, Mark; Rubin, Eitan

    2017-01-17

    Numerous publications attempt to predict cancer survival outcome from gene expression data using machine-learning methods. A direct comparison of these works is challenging for the following reasons: (1) inconsistent measures used to evaluate the performance of different models, and (2) incomplete specification of critical stages in the process of knowledge discovery. There is a need for a platform that would allow researchers to replicate previous works and to test the impact of changes in the knowledge discovery process on the accuracy of the induced models. We developed the PCM-SABRE platform, which supports the entire knowledge discovery process for cancer outcome analysis. PCM-SABRE was developed using KNIME. By using PCM-SABRE to reproduce the results of previously published works on breast cancer survival, we define a baseline for evaluating future attempts to predict cancer outcome with machine learning. We used PCM-SABRE to replicate previous work that describes predictive models of breast cancer recurrence, and tested the performance of all possible combinations of the feature selection methods and data mining algorithms that were used in either of the works. We reconstructed the work of Chou et al., observing similar trends: superior performance of the Probabilistic Neural Network (PNN) and logistic regression (LR) algorithms, and an inconclusive impact of feature pre-selection with the decision tree algorithm on subsequent analysis. PCM-SABRE is a software tool that provides an intuitive environment for rapid development of predictive models in cancer precision medicine.
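
    A minimal sketch of the kind of combinatorial evaluation such a platform supports (PCM-SABRE itself is built on KNIME; the scikit-learn pipeline below, on synthetic data, is only an illustration of pairing feature selection methods with classifiers):

        from itertools import product
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.pipeline import Pipeline
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=200, n_features=500,
                                   n_informative=10, random_state=0)

        selectors = {"anova": f_classif, "mutual_info": mutual_info_classif}
        models = {"logistic_regression": LogisticRegression(max_iter=1000),
                  "decision_tree": DecisionTreeClassifier(random_state=0)}

        # Evaluate every feature-selection / classifier combination by cross-validated AUC.
        for (sel_name, score_fn), (mdl_name, mdl) in product(selectors.items(), models.items()):
            pipe = Pipeline([("select", SelectKBest(score_fn, k=20)), ("model", mdl)])
            auc = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
            print(f"{sel_name:>12} + {mdl_name:<20} AUC = {auc:.3f}")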

  20. Stratospheric aerosol particle size distribution based on multi-color polarization measurements of the twilight sky

    NASA Astrophysics Data System (ADS)

    Ugolnikov, Oleg S.; Maslov, Igor A.

    2018-03-01

    Polarization measurements of the twilight background with Wide-Angle Polarization Camera (WAPC) are used to detect the depolarization effect caused by stratospheric aerosol near the altitude of 20 km. Based on a number of observations in central Russia in spring and summer 2016, we found the parameters of lognormal size distribution of aerosol particles. This confirmed the previously published results of the colorimetric method as applied to the same twilights. The mean particle radius (about 0.1 micrometers) and size distribution are also in agreement with the recent data of in situ and space-based remote sensing of stratospheric aerosol. Methods considered here provide two independent techniques of the stratospheric aerosol study based on the twilight sky analysis.

  1. Finite difference time domain calculation of transients in antennas with nonlinear loads

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Beggs, John H.; Kunz, Karl S.; Chamberlin, Kent

    1991-01-01

    In this paper, transient fields for antennas with more general geometries are calculated directly using Finite Difference Time Domain methods. In each FDTD cell that contains a nonlinear load, a nonlinear equation is solved at each time step. As a test case, the transient current in a long dipole antenna with a nonlinear load excited by a pulsed plane wave is computed using this approach. The results agree well with both calculated and measured results previously published. The approach given here extends the applicability of the FDTD method to problems involving scattering from targets including nonlinear loads and materials, and to coupling between antennas containing nonlinear loads. It may also be extended to propagation through nonlinear materials.
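
    To illustrate the per-cell nonlinear solve mentioned above, the sketch below uses Newton iteration to update the voltage across a lumped diode-capacitor load once per time step; this is a generic example under assumed element values, not the paper's FDTD formulation.

        import numpy as np

        def newton_load_voltage(v_prev, i_drive, dt, C=1e-12, Is=1e-14, Vt=0.0259):
            # Solve C*(v - v_prev)/dt + Is*(exp(v/Vt) - 1) - i_drive = 0 for v.
            v = v_prev
            for _ in range(50):
                f = C * (v - v_prev) / dt + Is * (np.exp(v / Vt) - 1.0) - i_drive
                df = C / dt + (Is / Vt) * np.exp(v / Vt)
                step = f / df
                v -= step
                if abs(step) < 1e-12:
                    break
            return v

        # Drive the load with a short Gaussian current pulse, one solve per time step.
        dt, n_steps = 1e-12, 400
        v = 0.0
        for n in range(n_steps):
            t = n * dt
            i_drive = 1e-3 * np.exp(-((t - 100e-12) / 20e-12) ** 2)
            v = newton_load_voltage(v, i_drive, dt)
        print(f"final load voltage: {v:.4f} V")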

  2. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    PubMed

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water, and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU, which, combined with our small sample sizes, corresponds to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.

  3. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer.

    PubMed

    Tighe, D; Sassoon, I; McGurk, M

    2017-04-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regression equation predicting for all complications, previously validated internally at sites A-C, was tested on a fourth external validation sample (site D, 172 operations) using receiver operating characteristic curves, Hosmer-Lemeshow goodness of fit analysis and Brier scores. RESULTS Thirty-day complication rates varied widely (34-51%) between the centres. The predictive score allowed imperfect risk adjustment (area under the curve: 0.70), with Hosmer-Lemeshow analysis suggesting good calibration. The Brier score changed from 0.19 for sites A-C to 0.23 when site D was also included, suggesting poor accuracy overall. CONCLUSIONS Marked differences in operative risk and patient case mix captured by the risk adjustment score do not explain all the differences in observed outcomes. Further investigation with different methods is recommended to improve modelling of risk. Morbidity is common, and usually has a major impact on patient recovery, ward occupancy, hospital finances and patient perception of quality of care. We hope comparative audit will highlight good performance and challenge underperformance where it exists.
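
    A minimal sketch of the validation statistics named above (AUC from a receiver operating characteristic curve, a Hosmer-Lemeshow-style goodness-of-fit statistic, and the Brier score), computed on hypothetical predicted risks and outcomes rather than the study's data:

        import numpy as np
        from scipy.stats import chi2
        from sklearn.metrics import roc_auc_score, brier_score_loss

        rng = np.random.default_rng(4)
        pred_risk = rng.uniform(0.05, 0.8, 300)                # hypothetical predicted risks
        observed = (rng.random(300) < pred_risk).astype(int)   # hypothetical complications

        auc = roc_auc_score(observed, pred_risk)
        brier = brier_score_loss(observed, pred_risk)

        # Hosmer-Lemeshow-style statistic: observed vs expected events per risk decile.
        edges = np.quantile(pred_risk, np.linspace(0, 1, 11))
        group = np.digitize(pred_risk, edges[1:-1])            # decile index 0..9
        hl = 0.0
        for g in range(10):
            in_bin = group == g
            n = in_bin.sum()
            obs, exp = observed[in_bin].sum(), pred_risk[in_bin].sum()
            hl += (obs - exp) ** 2 / (exp * (1 - exp / n))
        p_value = 1 - chi2.cdf(hl, df=8)
        print(f"AUC={auc:.2f}  Brier={brier:.3f}  HL chi2={hl:.1f} (p={p_value:.2f})")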

  4. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer

    PubMed Central

    Sassoon, I; McGurk, M

    2017-01-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regression equation predicting for all complications, previously validated internally at sites A–C, was tested on a fourth external validation sample (site D, 172 operations) using receiver operating characteristic curves, Hosmer–Lemeshow goodness of fit analysis and Brier scores. RESULTS Thirty-day complication rates varied widely (34–51%) between the centres. The predictive score allowed imperfect risk adjustment (area under the curve: 0.70), with Hosmer–Lemeshow analysis suggesting good calibration. The Brier score changed from 0.19 for sites A–C to 0.23 when site D was also included, suggesting poor accuracy overall. CONCLUSIONS Marked differences in operative risk and patient case mix captured by the risk adjustment score do not explain all the differences in observed outcomes. Further investigation with different methods is recommended to improve modelling of risk. Morbidity is common, and usually has a major impact on patient recovery, ward occupancy, hospital finances and patient perception of quality of care. We hope comparative audit will highlight good performance and challenge underperformance where it exists. PMID:27917662

  5. Residual γH2AX foci after ex vivo irradiation of patient samples with known tumour-type specific differences in radio-responsiveness.

    PubMed

    Menegakis, Apostolos; De Colle, Chiara; Yaromina, Ala; Hennenlotter, Joerg; Stenzl, Arnulf; Scharpf, Marcus; Fend, Falko; Noell, Susan; Tatagiba, Marcos; Brucker, Sara; Wallwiener, Diethelm; Boeke, Simon; Ricardi, Umberto; Baumann, Michael; Zips, Daniel

    2015-09-01

    To apply our previously published residual ex vivo γH2AX foci method to patient-derived tumour specimens covering a spectrum of tumour-types with known differences in radiation response. In addition, the data were used to simulate different experimental scenarios to simplify the method. Evaluation of residual γH2AX foci in well-oxygenated tumour areas of ex vivo irradiated patient-derived tumour specimens with graded single doses was performed. Immediately after surgical resection, the samples were cultivated for 24h in culture medium prior to irradiation and fixed 24h post-irradiation for γH2AX foci evaluation. Specimens from a total of 25 patients (including 7 previously published) with 10 different tumour types were included. Linear dose response of residual γH2AX foci was observed in all specimens with highly variable slopes among different tumour types ranging from 0.69 (95% CI: 1.14-0.24) to 3.26 (95% CI: 4.13-2.62) for chondrosarcomas (radioresistant) and classical seminomas (radiosensitive) respectively. Simulations suggest that omitting dose levels might simplify the assay without compromising robustness. Here we confirm clinical feasibility of the assay. The slopes of the residual foci number are well in line with the expected differences in radio-responsiveness of different tumour types implying that intrinsic radiation sensitivity contributes to tumour radiation response. Thus, this assay has a promising potential for individualized radiation therapy and prospective validation is warranted. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Catch-up validation study of an in vitro skin irritation test method based on an open source reconstructed epidermis (phase II).

    PubMed

    Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R

    2016-10-01

    To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and were adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies. Thus, the availability of these test models is dependent on the commercial interest of the producer. To overcome this limitation and thus to increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Multiplex real-time PCR monitoring of intestinal helminths in humans reveals widespread polyparasitism in Northern Samar, the Philippines.

    PubMed

    Gordon, Catherine A; McManus, Donald P; Acosta, Luz P; Olveda, Remigio M; Williams, Gail M; Ross, Allen G; Gray, Darren J; Gobert, Geoffrey N

    2015-06-01

    The global socioeconomic importance of helminth parasitic disease is underpinned by the considerable clinical impact on millions of people. While helminth polyparasitism is considered common in the Philippines, little has been done to survey its extent in endemic communities. High morphological similarity of eggs between related species complicates conventional microscopic diagnostic methods which are known to lack sensitivity, particularly in low intensity infections. Multiplex quantitative PCR diagnostic methods can provide rapid, simultaneous identification of multiple helminth species from a single stool sample. We describe a multiplex assay for the differentiation of Ascaris lumbricoides, Necator americanus, Ancylostoma, Taenia saginata and Taenia solium, building on our previously published findings for Schistosoma japonicum. Of 545 human faecal samples examined, 46.6% were positive for at least three different parasite species. High prevalences of S. japonicum (90.64%), A. lumbricoides (58.17%), T. saginata (42.57%) and A. duodenale (48.07%) were recorded. Neither T. solium nor N. americanus were found to be present. The utility of molecular diagnostic methods for monitoring helminth parasite prevalence provides new information on the extent of polyparasitism in the Philippines municipality of Palapag. These methods and findings have potential global implications for the monitoring of neglected tropical diseases and control measures. Copyright © 2015 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  8. A method to validate quantitative high-frequency power doppler ultrasound with fluorescence in vivo video microscopy.

    PubMed

    Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C

    2014-08-01

    Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy system (IVVM) is combined with high-frequency power Doppler ultrasound to provide a method for validation of CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter to validate power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selection of a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to help adapt cutoff velocity to current blood flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  9. Assessment of masticatory performance by means of a color-changeable chewing gum.

    PubMed

    Tarkowska, Agnieszka; Katzer, Lukasz; Ahlers, Marcus Oliver

    2017-01-01

    Previous research has determined the relevance of masticatory performance with regard to nutritional status, cognitive functions, and stress management. In addition, the measurement of masticatory efficiency contributes to the evaluation of therapeutic success within the stomatognathic system. However, the question remains unanswered as to what extent modern techniques are able to reproduce the subtle differences in masticatory efficiency within various patient groups. The purpose of this review is to provide an extensive summary of the evaluation of masticatory performance by means of a color-changeable chewing gum with regard to its clinical relevance and applicability. A general overview describing the various methods available for this task has already been published. This review focuses in depth on the research findings available on the technique of measuring masticatory performance by means of color-changeable chewing gum. The mechanism and differentiability of the color change are described, along with methods to evaluate the color changes. Subsequently, research on masticatory performance is reviewed with regard to patient age groups, the impact of general diseases, and the effect of prosthetic and surgical treatment. The studies indicate that color-changeable chewing gum is a valid and reliable method for the evaluation of masticatory function. Apart from other methods, in clinical practice this technique can enhance dental diagnostics as well as the assessment of therapy outcomes. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  10. A Network View on Psychiatric Disorders: Network Clusters of Symptoms as Elementary Syndromes of Psychopathology

    PubMed Central

    Goekoop, Rutger; Goekoop, Jaap G.

    2014-01-01

    Introduction The vast number of psychopathological syndromes that can be observed in clinical practice can be described in terms of a limited number of elementary syndromes that are differentially expressed. Previous attempts to identify elementary syndromes have shown limitations that have slowed progress in the taxonomy of psychiatric disorders. Aim To examine the ability of network community detection (NCD) to identify elementary syndromes of psychopathology and move beyond the limitations of current classification methods in psychiatry. Methods 192 patients with unselected mental disorders were tested on the Comprehensive Psychopathological Rating Scale (CPRS). Principal component analysis (PCA) was performed on the bootstrapped correlation matrix of symptom scores to extract the principal component structure (PCS). An undirected and weighted network graph was constructed from the same matrix. Network community structure (NCS) was optimized using a previously published technique. Results In the optimal network structure, network clusters showed an 89% match with principal components of psychopathology. Six network clusters were found, including "DEPRESSION", "MANIA", "ANXIETY", "PSYCHOSIS", "RETARDATION", and "BEHAVIORAL DISORGANIZATION". Network metrics were used to quantify the continuities between the elementary syndromes. Conclusion We present the first comprehensive network graph of psychopathology that is free from the biases of previous classifications: a 'Psychopathology Web'. Clusters within this network represent elementary syndromes that are connected via a limited number of bridge symptoms. Many problems of previous classifications can be overcome by using a network approach to psychopathology. PMID:25427156
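
    A minimal sketch of the general approach, assuming synthetic symptom scores and using networkx's greedy modularity communities as a stand-in for the community-detection technique actually used in the study: a correlation-weighted symptom graph is built and partitioned into clusters.

        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(5)
        n_symptoms, n_patients = 12, 200

        # Synthetic symptom scores with two correlated blocks (two "syndromes").
        latent = rng.normal(size=(n_patients, 2))
        scores = np.hstack([latent[:, [0]] + 0.5 * rng.normal(size=(n_patients, 6)),
                            latent[:, [1]] + 0.5 * rng.normal(size=(n_patients, 6))])

        corr = np.corrcoef(scores, rowvar=False)

        # Build an undirected weighted graph from positive correlations.
        G = nx.Graph()
        for i in range(n_symptoms):
            for j in range(i + 1, n_symptoms):
                if corr[i, j] > 0:
                    G.add_edge(i, j, weight=corr[i, j])

        communities = greedy_modularity_communities(G, weight="weight")
        print([sorted(c) for c in communities])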

  11. Use of conserved key amino acid positions to morph protein folds.

    PubMed

    Reddy, Boojala V B; Li, Wilfred W; Bourne, Philip E

    2002-07-15

    By using three-dimensional (3D) structure alignments and a previously published method to determine Conserved Key Amino Acid Positions (CKAAPs), we propose a theoretical method to design mutations that can be used to morph protein folds. The original Paracelsus challenge, met by several groups, called for the engineering of a stable but different structure by modifying less than 50% of the amino acid residues. We have used the sequences from the Protein Data Bank (PDB) identifiers 1ROP and 2CRO, which were previously used in the Paracelsus challenge by those groups, and suggest mutations at CKAAPs to morph the protein fold. The total number of mutations suggested is less than 40% of the starting sequence, theoretically improving on the challenge results. From secondary structure prediction experiments on the proposed mutant sequences, we observe that each of the suggested mutant protein sequences likely folds to a different, non-native, potentially stable target structure. These results are an early indicator that analyses using structure alignments leading to CKAAPs of a given structure are of value in protein engineering experiments. Copyright 2002 Wiley Periodicals, Inc.

  12. Analytical Study of 90Sr Betavoltaic Nuclear Battery Performance Based on p-n Junction Silicon

    NASA Astrophysics Data System (ADS)

    Rahastama, Swastya; Waris, Abdul

    2016-08-01

    Previously, an analytical calculation of a 63Ni p-n junction betavoltaic battery has been published. As a basic approach, we reproduced the analytical simulation of the 63Ni betavoltaic battery and then compared it to the previous results using the same battery design. Furthermore, we calculated its maximum power output and radiation-to-electricity conversion efficiency using a semiconductor analysis method. Then, the same method was applied to calculate and analyse the performance of a 90Sr betavoltaic battery. The aim of this project is to compare the analytical performance results of the 90Sr betavoltaic battery to those of the 63Ni betavoltaic battery, and to assess the influence of source activity on performance. Since it has a higher power density, the 90Sr betavoltaic battery yields more power than the 63Ni betavoltaic battery but a lower radiation-to-electricity conversion efficiency. However, beta particles emitted from the 90Sr source can travel further inside the silicon, corresponding to the stopping range of the beta particles; thus the 90Sr betavoltaic battery could be designed thicker than the 63Ni betavoltaic battery to achieve higher conversion efficiency.

  13. Feasibility assessment of yttrium-90 liver radioembolization imaging using amplitude-based gated PET/CT

    PubMed Central

    Acuff, Shelley N.; Neveu, Melissa L.; Syed, Mumtaz; Kaman, Austin D.; Fu, Yitong

    2018-01-01

    Purpose The usage of PET/computed tomography (CT) to monitor hepatocellular carcinoma patients following yttrium-90 (90Y) radioembolization has increased. Respiratory motion causes liver movement, which can be corrected using gating techniques at the expense of added noise. This work examines the use of amplitude-based gating on 90Y-PET/CT and its potential impact on diagnostic integrity. Patients and methods Patients were imaged using PET/CT following 90Y radioembolization. A respiratory band was used to collect respiratory cycle data. Patient data were processed as both standard and motion-corrected images. Regions of interest were drawn and compared using three methods. Activity concentrations were calculated and converted into dose estimates using previously determined and published scaling factors. Diagnostic assessments were performed using a binary scale created from published 90Y-PET/CT image interpretation guidelines. Results Estimates of radiation dose were increased (P<0.05) when using amplitude-gating methods with 90Y PET/CT imaging. Motion-corrected images show increased noise, but the diagnostic determination of success, using the Kao criteria, did not change between static and motion-corrected data. Conclusion Amplitude-gated PET/CT following 90Y radioembolization is feasible and may improve 90Y dose estimates while maintaining diagnostic assessment integrity. PMID:29351124

  14. Transmission loss of double panels filled with porogranular materials.

    PubMed

    Chazot, Jean-Daniel; Guyader, Jean-Louis

    2009-12-01

    Sound transmission through hollow structures is of interest in several industrial domains such as building acoustics, the automotive industry, and aeronautics. In practice, however, hollow structures are often filled with porous materials to improve acoustic properties without adding excessive mass. Recently, considerable interest has arisen in low-density granular materials as an alternative to standard absorbing materials. This paper aims to predict the vibro-acoustic behavior of double panels filled with porogranular materials by using the recently published patch-mobility method. Biot's theory is a basic tool for the description of porous materials but is quite difficult to use in practice, mostly because of the solid-phase characterization. The original simplified Biot model (fluid-fluid model) for porogranular materials, permitting a considerable reduction in the data necessary for calculation, has recently been published. The aim of the present paper is to propose a model to predict sound transmission through a double panel filled with a porogranular material. The method is an extension of a previous paper to take into account the porogranular material through the fluid-fluid Biot model. After a global overview of the method, the case of a double panel filled with expanded polystyrene beads is studied and a comparison with measurements is presented.

  15. Validation of a high-throughput real-time polymerase chain reaction assay for the detection of capripoxviral DNA.

    PubMed

    Stubbs, Samuel; Oura, Chris A L; Henstock, Mark; Bowden, Timothy R; King, Donald P; Tuppurainen, Eeva S M

    2012-02-01

    Capripoxviruses, which are endemic in much of Africa and Asia, are the aetiological agents of economically devastating poxviral diseases in cattle, sheep and goats. The aim of this study was to validate a high-throughput real-time PCR assay for routine diagnostic use in a capripoxvirus reference laboratory. The performance of two previously published real-time PCR methods was compared using commercially available reagents, including the amplification kits recommended in the original publication. Furthermore, both manual and robotic extraction methods used to prepare template nucleic acid were evaluated using samples collected from experimentally infected animals. The optimised assay had an analytical sensitivity of at least 63 target DNA copies per reaction, displayed a greater diagnostic sensitivity compared to conventional gel-based PCR, detected capripoxviruses isolated from outbreaks around the world and did not amplify DNA from related viruses in the genera Orthopoxvirus or Parapoxvirus. The high-throughput robotic DNA extraction procedure did not adversely affect the sensitivity of the assay compared to manual preparation of PCR templates. This laboratory-based assay provides a rapid and robust method to detect capripoxviruses following suspicion of disease in endemic or disease-free countries. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  16. Retracted articles in surgery journals. What are surgeons doing wrong?

    PubMed

    Cassão, Bruna Dell'Acqua; Herbella, Fernando A M; Schlottmann, Francisco; Patti, Marco G

    2018-06-01

    Retraction of previously published scientific articles is an important mechanism to preserve the integrity of scientific work. This study analyzed retractions of previously published articles from surgery journals. We searched for retracted articles in the 100 surgery journals with the highest SJR2 indicator grades. We found 130 retracted articles in 49 journals (49%). Five or more retracted articles were published in 8 journals (8%). The mean time between publication and retraction was 26 months (range 1 to 158 months). The United States, China, Germany, Japan, and the United Kingdom accounted for more than 3 out of 4 of the retracted articles. The greatest number of retractions came from manuscripts about orthopedics and traumatology, general surgery, anesthesiology, cardiothoracic surgery, and plastic surgery. Nonsurgeons were responsible for 16% of retractions in these surgery journals. The main reasons for retraction were duplicate publication (42%), plagiarism (16%), absence of proven integrity of the study (14%), incorrect data (13%), data published without authorization (12%), violation of research ethics (11%), documented fraud (11%), request of an author(s) (5%), and unknown (3%). In 25% of the retracted articles, other publications by the same authors also had been retracted. Retraction of published articles does not occur frequently in surgery journals. Some form of scientific misconduct was present in the majority of retractions, especially duplication of publication and plagiarism. Retractions of previously published articles were most frequent from countries with the greatest number of publications; some authors showed recidivism. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. The Histochemistry and Cell Biology compendium: a review of 2012.

    PubMed

    Taatjes, Douglas J; Roth, Jürgen

    2013-06-01

    The year 2012 was another exciting year for Histochemistry and Cell Biology. Innovations in immunohistochemical techniques and microscopy-based imaging have provided the means for advances in the field of cell biology. Over 130 manuscripts were published in the journal during 2012, representing methodological advancements, pathobiology of disease, and cell and tissue biology. This annual review of the manuscripts published in the previous year in Histochemistry and Cell Biology serves as an abbreviated reference for the readership to quickly peruse and discern trends in the field over the past year. The review has been broadly divided into multiple sections encompassing topics such as method advancements, subcellular components, extracellular matrix, and organ systems. We hope that the creation of this subdivision will serve to guide the reader to a specific topic of interest, while simultaneously providing a concise and easily accessible encapsulation of other topics in the broad area of Histochemistry and Cell Biology.

  18. Application of photoacoustic infrared spectroscopy in the forensic analysis of artists' inorganic pigments.

    PubMed

    von Aderkas, Eleanor L; Barsan, Mirela M; Gilson, Denis F R; Butler, Ian S

    2010-12-01

    Fourier-transform photoacoustic infrared (PAIR) spectroscopy has been used in the analysis of 12 inorganic pigments commonly in use by artists today, viz., cobalt blue, ultramarine blue, Prussian blue, azurite, malachite, chromium oxide, viridian, cadmium yellow, chrome yellow, iron oxide, yellow ochre and Mars orange. The authenticity of these 12 commercial pigments was first established by recording their Raman spectra. The subsequent PAIR spectra were highly reproducible and matched well in the mid-IR region with previously published data for these pigments. A number of additional overtone and combination bands were also detected that will prove useful in the identification of the pigments in the future. The PAIR technique is a promising and reliable method for the analysis of inorganic pigments, especially since it involves much simpler preparation than is required for conventional IR measurements. Copyright © 2010. Published by Elsevier B.V.

  19. Reading Level and Comprehension of Research Consent Forms: An Integrative Review.

    PubMed

    Foe, Gabriella; Larson, Elaine L

    2016-02-01

    Consent forms continue to be written at a higher reading level than the recommended sixth to eighth grade, making it difficult for participants to comprehend information before enrolling in research. To assess and address the extent of the problem regarding the literacy level of consent forms, and to update previously published reports, we conducted an integrative literature review of English-language research published between January 1, 2000, and December 31, 2013; 35 descriptive and eight intervention studies met the inclusion criteria. Results confirmed that developing forms at an eighth-grade level was attainable though not practiced. The risks of participation were found to be the most poorly understood section. There was also a lack of consensus regarding the most effective method to increase comprehension. Further research using standardized tools is needed to determine the best approach for improving consent forms and processes. © The Author(s) 2016.

  20. Biomechanical validation of an artificial tooth–periodontal ligament–bone complex for in vitro orthodontic load measurement

    PubMed Central

    Xia, Zeyang; Chen, Jie

    2014-01-01

    Objectives To develop an artificial tooth–periodontal ligament (PDL)–bone complex (ATPBC) that simulates clinical crown displacement. Material and Methods An ATPBC was created. It had a socket hosting a tooth, with a thin layer of silicone mixture in between to simulate the PDL. The complex was attached to a device that allows a controlled force to be applied to the crown and the resulting crown displacement to be measured. Crown displacements were compared with previously published data for validation. Results The ATPBC with a PDL made of two types of silicones, 50% gasket sealant No. 2 and 50% RTV 587 silicone, with a thickness of 0.3 mm, simulated the PDL well. The mechanical behaviors, namely (1) force-displacement relationship, (2) stress relaxation, (3) creep, and (4) hysteresis, were validated against the published results. Conclusion The ATPBC simulated well the crown displacement behavior reported in biological studies. PMID:22970752

  1. Checklist of bees (Hymenoptera: Apoidea) from managed emergent wetlands in the lower Mississippi Alluvial Valley of Arkansas

    PubMed Central

    2018-01-01

    Abstract Background Here we present the results from a two-year bee survey conducted on 18 managed emergent wetlands in the lower Mississippi Alluvial Valley of Arkansas, USA. Sample methods included pan traps, sweep netting and blue-vane traps. We document 83 bee species and morphospecies in 5 families and 31 genera, of which 37 species represent first published state records for Arkansas. The majority of species were opportunistic wetland species; only a small number were wetland-dependent species or species largely restricted to alluvial plains. New information We present new distributional records for bee species not previously recorded in managed emergent wetlands and report specimens of thirty-seven species for which no published Arkansas records exist, expanding the known ranges of Ceratina cockerelli, Diadasia enavata, Lasioglossum creberrimum, Svastra cressonii and Dieunomia triangulifera. We also distinguish opportunistic wetland bee species from wetland-dependent and alluvial plain-restricted species. PMID:29773960

  2. PISA: Federated Search in P2P Networks with Uncooperative Peers

    NASA Astrophysics Data System (ADS)

    Ren, Zujie; Shou, Lidan; Chen, Gang; Chen, Chun; Bei, Yijun

    Recently, federated search in P2P networks has received much attention. Most of the previous work assumed a cooperative environment where each peer can actively participate in information publishing and distributed document indexing. However, little work has addressed the problem of incorporating uncooperative peers, which do not publish their own corpus statistics, into a network. This paper presents a P2P-based federated search framework called PISA which incorporates uncooperative peers as well as the normal ones. In order to address the indexing needs for uncooperative peers, we propose a novel heuristic query-based sampling approach which can obtain high-quality resource descriptions from uncooperative peers at relatively low communication cost. We also propose an effective method called RISE to merge the results returned by uncooperative peers. Our experimental results indicate that PISA can provide quality search results, while utilizing the uncooperative peers at a low cost.

  3. Checklist of bees (Hymenoptera: Apoidea) from managed emergent wetlands in the lower Mississippi Alluvial Valley of Arkansas.

    PubMed

    Stephenson, Phillip L; Griswold, Terry L; Arduser, Michael S; Dowling, Ashley P G; Krementz, David G

    2018-01-01

    Here we present the results from a two-year bee survey conducted on 18 managed emergent wetlands in the lower Mississippi Alluvial Valley of Arkansas, USA. Sample methods included pan traps, sweep netting and blue-vane traps. We document 83 bee species and morphospecies in 5 families and 31 genera, of which 37 species represent first published state records for Arkansas. The majority of species were opportunistic wetland species; only a small number were wetland-dependent species or species largely restricted to alluvial plains. We present new distributional records for bee species not previously recorded in managed emergent wetlands and report specimens of thirty-seven species for which no published Arkansas records exist, expanding the known ranges of Ceratina cockerelli, Diadasia enavata, Lasioglossum creberrimum, Svastra cressonii and Dieunomia triangulifera. We also distinguish opportunistic wetland bee species from wetland-dependent and alluvial plain-restricted species.

  4. A Multi-Start Evolutionary Local Search for the Two-Echelon Location Routing Problem

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Phuong; Prins, Christian; Prodhon, Caroline

    This paper presents a new hybrid metaheuristic combining a greedy randomized adaptive search procedure (GRASP) with an evolutionary/iterated local search (ELS/ILS), using a tabu list, to solve the two-echelon location routing problem (LRP-2E). The GRASP phase uses, in turn, three constructive heuristics followed by local search to generate the initial solutions. Starting from a GRASP solution, an intensification strategy is carried out by dynamically alternating between ELS and ILS. In this phase, each child is obtained by mutation and evaluated through a giant-tour splitting procedure followed by a local search. The tabu list, defined by two characteristics of a solution (total cost and number of trips), is used to avoid searching a space already explored. The results show that our metaheuristic clearly outperforms all previously published methods on LRP-2E benchmark instances. Furthermore, it is competitive with the best metaheuristic published for the single-echelon LRP.
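
    For readers less familiar with this family of metaheuristics, the sketch below outlines the control flow in Python: a GRASP start, repeated generation of children by mutation, giant-tour splitting and local search, and a tabu list keyed on (total cost, number of trips). It is only an illustration under stated assumptions; the problem-specific callables are hypothetical placeholders, and the paper's dynamic ELS/ILS alternation is not reproduced.

```python
# Rough control-flow sketch of a GRASP start followed by evolutionary local
# search with a tabu list keyed on (total cost, number of trips).  All
# problem-specific callables (construct, local_search, mutate, split, cost,
# num_trips) are hypothetical placeholders, not the authors' implementation,
# and the dynamic ELS/ILS alternation of the paper is not reproduced here.
import random

def grasp_els(construct, local_search, mutate, split, cost, num_trips,
              n_starts=3, n_iters=50, n_children=4, seed=0):
    rng = random.Random(seed)
    tabu = set()
    best = None
    for _ in range(n_starts):                    # GRASP: randomized construction
        current = local_search(construct(rng))   # ... improved by local search
        for _ in range(n_iters):                 # intensification phase
            children = []
            for _ in range(n_children):          # mutate, split the giant tour,
                child = local_search(split(mutate(current, rng)))  # then improve
                key = (round(cost(child), 3), num_trips(child))
                if key not in tabu:              # skip already-explored regions
                    tabu.add(key)
                    children.append(child)
            if children:
                current = min(children, key=cost)
            if best is None or cost(current) < cost(best):
                best = current
    return best
```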

  5. Anatomy of open access publishing: a study of longitudinal development and internal structure

    PubMed Central

    2012-01-01

    Background Open access (OA) is a revolutionary way of providing access to the scholarly journal literature made possible by the Internet. The primary aim of this study was to measure the volume of scientific articles published in full immediate OA journals from 2000 to 2011, while observing longitudinal internal shifts in the structure of OA publishing concerning revenue models, publisher types and relative distribution among scientific disciplines. The secondary aim was to measure the share of OA articles of all journal articles, including articles made OA by publishers with a delay and individual author-paid OA articles in subscription journals (hybrid OA), as these subsets of OA publishing have mostly been ignored in previous studies. Methods Stratified random sampling of journals in the Directory of Open Access Journals (n = 787) was performed. The annual publication volumes spanning 2000 to 2011 were retrieved from major publication indexes and through manual data collection. Results An estimated 340,000 articles were published by 6,713 full immediate OA journals during 2011. OA journals requiring article-processing charges have become increasingly common, publishing 166,700 articles in 2011 (49% of all OA articles). This growth is related to the growth of commercial publishers, who, despite only a marginal presence a decade ago, have grown to become key actors on the OA scene, responsible for 120,000 of the articles published in 2011. Publication volume has grown within all major scientific disciplines, however, biomedicine has seen a particularly rapid 16-fold growth between 2000 (7,400 articles) and 2011 (120,900 articles). Over the past decade, OA journal publishing has steadily increased its relative share of all scholarly journal articles by about 1% annually. Approximately 17% of the 1.66 million articles published during 2011 and indexed in the most comprehensive article-level index of scholarly articles (Scopus) are available OA through journal publishers, most articles immediately (12%) but some within 12 months of publication (5%). Conclusions OA journal publishing is disrupting the dominant subscription-based model of scientific publishing, having rapidly grown in relative annual share of published journal articles during the last decade. PMID:23088823

  6. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  7. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  8. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...

  9. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  10. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...

  11. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  12. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  13. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  14. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  15. AptRank: an adaptive PageRank model for protein function prediction on   bi-relational graphs.

    PubMed

    Jiang, Biaobin; Kloster, Kyle; Gleich, David F; Gribskov, Michael

    2017-06-15

    Diffusion-based network models are widely used for protein function prediction using protein network data and have been shown to outperform neighborhood-based and module-based methods. Recent studies have shown that integrating the hierarchical structure of the Gene Ontology (GO) data dramatically improves prediction accuracy. However, previous methods usually either used the GO hierarchy to refine the prediction results of multiple classifiers, or flattened the hierarchy into a function-function similarity kernel. No study has taken the GO hierarchy into account together with the protein network as a two-layer network model. We first construct a Bi-relational graph (Birg) model comprising both protein-protein association and function-function hierarchical networks. We then propose two diffusion-based methods, BirgRank and AptRank, both of which use PageRank to diffuse information on this two-layer graph model. BirgRank is a direct application of traditional PageRank with fixed decay parameters. In contrast, AptRank utilizes an adaptive diffusion mechanism to improve the performance of BirgRank. We evaluate the ability of both methods to predict protein function on yeast, fly and human protein datasets, and compare with four previous methods: GeneMANIA, TMC, ProteinRank and clusDCA. We design four different validation strategies (missing function prediction, de novo function prediction, guided function prediction and newly discovered function prediction) to comprehensively evaluate the predictability of all six methods. We find that both BirgRank and AptRank outperform the previous methods, especially in missing function prediction when using only 10% of the data for training. The MATLAB code is available at https://github.rcac.purdue.edu/mgribsko/aptrank. gribskov@purdue.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
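
    To make the diffusion step concrete, the sketch below runs a generic personalized PageRank by power iteration on a toy two-layer adjacency matrix (protein-protein edges plus a GO edge and one annotation edge). The graph construction, parameter values and function names are illustrative assumptions, not the authors' MATLAB implementation of BirgRank or AptRank.

```python
# Generic personalized PageRank by power iteration, the kind of diffusion
# BirgRank applies on a combined protein-protein / GO-hierarchy graph.
# The toy graph below is an illustrative assumption, not the authors' code.
import numpy as np

def personalized_pagerank(A, restart, alpha=0.85, tol=1e-8, max_iter=1000):
    """A: (n x n) adjacency matrix; restart: personalization vector (sums to 1)."""
    out_deg = A.sum(axis=1)
    out_deg[out_deg == 0] = 1.0               # avoid division by zero for sinks
    P = A / out_deg[:, None]                  # row-normalized transition matrix
    r = np.full(A.shape[0], 1.0 / A.shape[0])
    for _ in range(max_iter):
        r_new = alpha * P.T @ r + (1 - alpha) * restart
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

# Toy two-layer graph: proteins 0-2 (PPI edges) and functions 3-4 (GO edge),
# with protein 0 annotated to function 3.
A = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (3, 4), (0, 3)]:
    A[i, j] = A[j, i] = 1.0
restart = np.zeros(5); restart[0] = 1.0       # diffuse from protein 0
print(personalized_pagerank(A, restart))
```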

  16. Signature of chaos in the 4 f -core-excited states for highly-charged tungsten ions

    NASA Astrophysics Data System (ADS)

    Safronova, Ulyana; Safronova, Alla

    2014-05-01

    We evaluate radiative and autoionizing transition rates in highly charged W ions in search of a signature of chaos. In particular, we consider previously published results for Ag-like W27+, Tm-like W5+, and Yb-like W4+ ions, as well as newly obtained results for I-like W21+, Xe-like W20+, Cs-like W19+, and La-like W17+ ions (with ground configurations [Kr] 4d^10 4f^k, k = 7, 8, 9, and 11, respectively), all calculated using the multiconfiguration relativistic Hebrew University Lawrence Livermore Atomic Code (HULLAC code) and the Hartree-Fock-Relativistic method (COWAN code). The main emphasis was on verification of Gaussian statistics of the rates as a function of transition energy. There was no evidence of such statistics for the previously published results mentioned above, nor for the transitions between the excited and autoionizing states in the newly calculated results. However, we did find a Gaussian profile for transitions between excited states, such as the [Kr] 4d^10 4f^k - [Kr] 4d^10 4f^(k-1) 5d transitions, in the newly calculated W ions. This work is supported in part by DOE under NNSA Cooperative Agreement DE-NA0001984.

  17. Population models and simulation methods: The case of the Spearman rank correlation.

    PubMed

    Astivia, Oscar L Olvera; Zumbo, Bruno D

    2017-11-01

    The purpose of this paper is to highlight the importance of a population model in guiding the design and interpretation of simulation studies used to investigate the Spearman rank correlation. The Spearman rank correlation has been known for over a hundred years to applied researchers and methodologists alike and is one of the most widely used non-parametric statistics. Still, certain misconceptions can be found, either explicitly or implicitly, in the published literature because a population definition for this statistic is rarely discussed within the social and behavioural sciences. By relying on copula distribution theory, a population model is presented for the Spearman rank correlation, and its properties are explored both theoretically and in a simulation study. Through the use of the Iman-Conover algorithm (which allows the user to specify the rank correlation as a population parameter), simulation studies from previously published articles are explored, and it is found that many of the conclusions purported in them regarding the nature of the Spearman correlation would change if the data-generation mechanism better matched the simulation design. More specifically, issues such as small sample bias and lack of power of the t-test and r-to-z Fisher transformation disappear when the rank correlation is calculated from data sampled where the rank correlation is the population parameter. A proof for the consistency of the sample estimate of the rank correlation is shown as well as the flexibility of the copula model to encompass results previously published in the mathematical literature. © 2017 The British Psychological Society.
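
    As a minimal illustration of the paper's central point, that the sample Spearman correlation recovers the population parameter when the data-generating mechanism actually has that rank correlation, the sketch below samples from a Gaussian copula using the bivariate-normal relation rho_S = (6/pi)*arcsin(r/2). It does not reproduce the Iman-Conover algorithm used in the paper, and the function names and parameters are assumptions.

```python
# Sketch: generate data whose *population* Spearman correlation is the target,
# then check that the sample estimate recovers it. Uses a Gaussian copula and
# the bivariate-normal relation rho_S = (6/pi)*arcsin(r/2); the paper itself
# relies on the Iman-Conover algorithm, which is not reproduced here.
import numpy as np
from scipy import stats

def sample_with_spearman(rho_s, n, rng):
    r = 2.0 * np.sin(np.pi * rho_s / 6.0)        # Pearson r giving Spearman rho_s
    cov = [[1.0, r], [r, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                         # copula scale (uniform margins)
    return u[:, 0], u[:, 1]

rng = np.random.default_rng(1)
x, y = sample_with_spearman(rho_s=0.5, n=20000, rng=rng)
print(stats.spearmanr(x, y).correlation)          # ~0.50
```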

  18. Retrospective analysis of natural products provides insights for future discovery trends

    PubMed Central

    Pye, Cameron R.; Bertin, Matthew J.; Lokey, R. Scott; Gerwick, William H.

    2017-01-01

    Understanding of the capacity of the natural world to produce secondary metabolites is important to a broad range of fields, including drug discovery, ecology, biosynthesis, and chemical biology, among others. Both the absolute number and the rate of discovery of natural products have increased significantly in recent years. However, there is a perception and concern that the fundamental novelty of these discoveries is decreasing relative to previously known natural products. This study presents a quantitative examination of the field from the perspective of both number of compounds and compound novelty using a dataset of all published microbial and marine-derived natural products. This analysis aimed to explore a number of key questions, such as how the rate of discovery of new natural products has changed over the past decades, how the average natural product structural novelty has changed as a function of time, whether exploring novel taxonomic space affords an advantage in terms of novel compound discovery, and whether it is possible to estimate how close we are to having described all of the chemical space covered by natural products. Our analyses demonstrate that most natural products being published today bear structural similarity to previously published compounds, and that the range of scaffolds readily accessible from nature is limited. However, the analysis also shows that the field continues to discover appreciable numbers of natural products with no structural precedent. Together, these results suggest that the development of innovative discovery methods will continue to yield compounds with unique structural and biological properties. PMID:28461474

  19. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    PubMed

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. On the one hand, in-line data were correlated with the real API concentration present in the sample, quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighed amounts of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will fall within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its ability to be used in routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. On finding bicliques in bipartite graphs: a novel algorithm and its application to the integration of diverse biological data types

    PubMed Central

    2014-01-01

    Background Integrating and analyzing heterogeneous genome-scale data is a huge algorithmic challenge for modern systems biology. Bipartite graphs can be useful for representing relationships across pairs of disparate data types, with the interpretation of these relationships accomplished through an enumeration of maximal bicliques. Most previously-known techniques are generally ill-suited to this foundational task, because they are relatively inefficient and without effective scaling. In this paper, a powerful new algorithm is described that produces all maximal bicliques in a bipartite graph. Unlike most previous approaches, the new method neither places undue restrictions on its input nor inflates the problem size. Efficiency is achieved through an innovative exploitation of bipartite graph structure, and through computational reductions that rapidly eliminate non-maximal candidates from the search space. An iterative selection of vertices for consideration based on non-decreasing common neighborhood sizes boosts efficiency and leads to more balanced recursion trees. Results The new technique is implemented and compared to previously published approaches from graph theory and data mining. Formal time and space bounds are derived. Experiments are performed on both random graphs and graphs constructed from functional genomics data. It is shown that the new method substantially outperforms the best previous alternatives. Conclusions The new method is streamlined, efficient, and particularly well-suited to the study of huge and diverse biological data. A robust implementation has been incorporated into GeneWeaver, an online tool for integrating and analyzing functional genomics experiments, available at http://geneweaver.org. The enormous increase in scalability it provides empowers users to study complex and previously unassailable gene-set associations between genes and their biological functions in a hierarchical fashion and on a genome-wide scale. This practical computational resource is adaptable to almost any applications environment in which bipartite graphs can be used to model relationships between pairs of heterogeneous entities. PMID:24731198
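
    To make the object being enumerated concrete, here is a naive closure-based enumeration of maximal bicliques on a toy bipartite graph. It is a brute-force illustration only and does not reproduce the efficient algorithm described above; the data and names are assumptions.

```python
# Naive enumeration of maximal bicliques in a small bipartite graph.  A pair
# (S, T) is a maximal biclique exactly when T is the common neighborhood of S
# and S is the common neighborhood of T.  This exponential closure check is
# illustrative only; the paper's algorithm avoids such brute-force enumeration.
from itertools import combinations

def maximal_bicliques(left, adj):
    """adj maps each left vertex to the set of right vertices it touches."""
    found = set()
    for k in range(1, len(left) + 1):
        for S in combinations(sorted(left), k):
            T = set.intersection(*(adj[v] for v in S))   # common right neighbors
            if not T:
                continue
            S_closed = {v for v in left if T <= adj[v]}  # closure on the left
            if S_closed == set(S):
                found.add((tuple(sorted(S)), tuple(sorted(T))))
    return sorted(found)

# Toy graph: genes a-c versus functions 1-3.
adj = {'a': {1, 2}, 'b': {1, 2, 3}, 'c': {2, 3}}
for S, T in maximal_bicliques(adj.keys(), adj):
    print(S, '<->', T)
# ('a', 'b') <-> (1, 2); ('a', 'b', 'c') <-> (2,); ('b',) <-> (1, 2, 3); ('b', 'c') <-> (2, 3)
```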

  1. An open-label, single arm, phase III clinical study to evaluate the efficacy and safety of CJ smallpox vaccine in previously vaccinated healthy adults.

    PubMed

    Kim, Nak-Hyun; Kang, Yu Min; Kim, Gayeon; Choe, Pyoeng Gyun; Song, Jin Su; Lee, Kwang-Hee; Seong, Baik-Lin; Park, Wan Beom; Kim, Nam Joong; Oh, Myoung-don

    2013-10-25

    The increased possibility of bioterrorism has led to reinitiation of smallpox vaccination. In Korea, more than 30 years have passed since the last smallpox vaccinations, and even people who were previously vaccinated are not regarded as adequately protected against smallpox. We evaluated the efficacy and safety of CJ-50300, a newly developed cell culture-derived smallpox vaccine, in healthy adults previously vaccinated against smallpox. We conducted an open-label, single-arm, phase III clinical trial to evaluate the efficacy and safety of CJ-50300. Healthy volunteers born between 1950 and 1978 who had previously been vaccinated against smallpox were enrolled. CJ-50300 was administered with a bifurcated needle over the deltoid muscle according to the recommended method. The rate of the cutaneous take reaction, humoral immunogenicity, and the safety of the vaccine were assessed. Of 145 individuals enrolled for vaccination, 139 completed the study. The overall rates of cutaneous take reactions and humoral immunogenicity were 95.0% (132/139) and 88.5% (123/139), respectively. Although 95.9% (139/145) reported adverse events related to vaccination, no serious adverse reactions were observed. CJ-50300 can be used safely and effectively in healthy adults previously vaccinated against smallpox. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Weighted analysis of paired microarray experiments.

    PubMed

    Kristiansson, Erik; Sjögren, Anders; Rudemo, Mats; Nerman, Olle

    2005-01-01

    In microarray experiments quality often varies, for example between samples and between arrays. The need for quality control is therefore strong. A statistical model and a corresponding analysis method are suggested for experiments with pairing, including designs with individuals observed before and after treatment and many experiments with two-colour spotted arrays. The model is of mixed type with some parameters estimated by an empirical Bayes method. Differences in quality are modelled by individual variances and correlations between repetitions. The method is applied to three real and several simulated datasets. Two of the real datasets are of Affymetrix type with patients profiled before and after treatment, and the third dataset is of two-colour spotted cDNA type. In all cases, the patients or arrays had different estimated variances, leading to distinctly unequal weights in the analysis. We also suggest plots that illustrate the variances and correlations that affect the weights computed by our analysis method. For simulated data, the improvement relative to previously published methods without weighting is shown to be substantial.

  3. The Fluorescent-Oil Film Method and Other Techniques for Boundary-Layer Flow Visualization

    NASA Technical Reports Server (NTRS)

    Loving, Donald L.; Katzoff, S.

    1959-01-01

    A flow-visualization technique, known as the fluorescent-oil film method, has been developed which appears to be generally simpler and to require less experience and development of technique than previously published methods. The method is especially adapted to use in the large high-powered wind tunnels which require considerable time to reach the desired test conditions. The method consists of smearing a film of fluorescent oil over a surface and observing where the thickness is affected by the shearing action of the boundary layer. These films are detected and identified, and their relative thicknesses are determined by use of ultraviolet light. Examples are given of the use of this technique. Other methods that show promise in the study of boundary-layer conditions are described. These methods include the use of a temperature-sensitive fluorescent paint and the use of a radiometer that is sensitive to the heat radiation from a surface. Some attention is also given to methods that can be used with a spray apparatus in front of the test model.

  4. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  5. Comment on "Hydrogen Balmer beta: The separation between line peaks for plasma electron density diagnostics and self-absorption test"

    NASA Astrophysics Data System (ADS)

    Gautam, Ghaneshwar; Surmick, David M.; Parigger, Christian G.

    2015-07-01

    In this letter, we present a brief comment regarding the recently published paper by Ivković et al., J Quant Spectrosc Radiat Transf 2015;154:1-8. Reference is made to previous experimental results to indicate that self-absorption must have occurred; however, when carefully considering error propagation, both widths and peak separation predict electron densities within the error margins. Yet the diagnostic method and the presented details on the use of the hydrogen beta peak separation are viewed as a welcome contribution to studies of laser-induced plasma.

  6. Comparison of Theoretical Stresses and Deflections of Multicell Wings with Experimental Results Obtained from Plastic Models

    NASA Technical Reports Server (NTRS)

    Zender, George W

    1956-01-01

    The experimental deflections and stresses of six plastic multicell-wing models of unswept, delta, and swept plan form are presented and compared with previously published theoretical results obtained by the electrical analog method. The comparisons indicate that the theory is reliable except for the evaluation of stresses in the vicinity of the leading edge of delta wings and the leading and trailing edges of swept wings. The stresses in these regions are questionable, apparently because of simplifications employed in idealizing the actual structure for theoretical purposes and because of local effects of concentrated loads.

  7. Advanced tools for the analysis of protein phosphorylation in yeast mitochondria.

    PubMed

    Walter, Corvin; Gonczarowska-Jorge, Humberto; Sickmann, Albert; Zahedi, René P; Meisinger, Chris; Schmidt, Oliver

    2018-05-24

    The biochemical analysis of protein phosphorylation in mitochondria lags behind that of cytosolic signaling events. One reason is the poor stability of many phosphorylation sites during common isolation procedures for mitochondria. We present here an optimized, fast protocol for the purification of yeast mitochondria that greatly increases recovery of phosphorylated mitochondrial proteins. Moreover, we describe improved protocols for the biochemical analysis of mitochondrial protein phosphorylation by Zn2+-Phos-tag electrophoresis under both denaturing and, for the first time, native conditions, and demonstrate that they outperform previously applied methods. Copyright © 2018. Published by Elsevier Inc.

  8. NCI Helps Children’s Hospital of Philadelphia to Identify and Treat New Target in Pediatric Cancer | Poster

    Cancer.gov

    There may be a new, more effective method for treating high-risk neuroblastoma, according to scientists at the Children’s Hospital of Philadelphia and collaborators in the Cancer and Inflammation Program at NCI at Frederick. Together, the groups published a study describing a previously unrecognized protein on neuroblastoma cells, called GPC2, as well as the creation of a novel antibody-drug conjugate, a combination of a human antibody and a naturally occurring anticancer drug, that locates and binds to GPC2 in a highly efficient way.

  9. Metabolism: The Physiological Power-Generating Process: A History of Methods to Test Human Beings' "Vital Capacity" [Retrospectroscope].

    PubMed

    Johnston, Richard; Valentinuzzi, Max E

    2016-01-01

    A previous "Retrospectroscope" note, published early in 2014, dealt with spirometry: it described many apparatuses used to measure the volume of inhaled and exhaled air that results from breathing [1]. Such machines, when adequately modified, are also able to measure the rate at which work is produced (specifically by an animal or a human being). Metabolism in that sense is the term used by physiologists and physicians, a word that in Greek, metabolismos, means "change" or "overthrow," in the sense of breaking down material, as in burning some stuff.

  10. High pressure liquid chromatographic determination of aflatoxins in spices.

    PubMed

    Awe, M J; Schranz, J L

    1981-11-01

    High pressure liquid chromatography with fluorescence detection is used to determine aflatoxin in 5 common spices. A 10 micrometer microparticulate silica gel column is used with a dichloromethane-cyclohexane-acetonitrile solvent system to resolve aflatoxins B1, G1, B2, and G2. The fluorescence detector contained a silica gel-packed flowcell. Samples of black, white, and red pepper, ginger, and nutmeg were extracted according to a previously published method. Recoveries from aflatoxin-free samples of white pepper, ginger, and red pepper spiked with 1-50 micrograms aflatoxin/kg ranged from 64 to 92%.

  11. Organocatalyzed asymmetric alpha-oxidation, alpha-aminoxylation and alpha-amination of carbonyl compounds.

    PubMed

    Vilaivan, Tirayut; Bhanthumnavin, Worawan

    2010-02-11

    Organocatalytic asymmetric alpha-oxidation and amination reactions of carbonyl compounds are highly useful synthetic methodologies, especially in generating chiral building blocks that previously have not been easily accessible by traditional methods. The concept is relatively new and therefore the list of new catalysts, oxidizing and aminating reagents, as well as new substrates, are expanding at an amazing rate. The scope of this review includes new reactions and catalysts, mechanistic aspects and synthetic applications of alpha-oxidation, hydroxylation, aminoxylation, amination, hydrazination, hydroxyamination and related alpha-heteroatom functionalization of aldehydes, ketones and related active methylene compounds published during 2005-2009.

  12. Solution methods for one-dimensional viscoelastic problems

    NASA Technical Reports Server (NTRS)

    Stubstad, John M.; Simitses, George J.

    1987-01-01

    A recently developed differential methodology for solution of one-dimensional nonlinear viscoelastic problems is presented. Using the example of an eccentrically loaded cantilever beam-column, the results from the differential formulation are compared to results generated using a previously published integral solution technique. It is shown that the results obtained from these distinct methodologies exhibit a surprisingly high degree of correlation with one another. A discussion of the various factors affecting the numerical accuracy and rate of convergence of these two procedures is also included. Finally, the influences of some 'higher order' effects, such as straining along the centroidal axis are discussed.

  13. Assessment of composite motif discovery methods.

    PubMed

    Klepper, Kjetil; Sandve, Geir K; Abul, Osman; Johansen, Jostein; Drablos, Finn

    2008-02-26

    Computational discovery of regulatory elements is an important area of bioinformatics research and more than a hundred motif discovery methods have been published. Traditionally, most of these methods have addressed the problem of single motif discovery: discovering binding motifs for individual transcription factors. In higher organisms, however, transcription factors usually act in combination with nearby bound factors to induce specific regulatory behaviours. Hence, recent focus has shifted from single motifs to the discovery of sets of motifs bound by multiple cooperating transcription factors, so-called composite motifs or cis-regulatory modules. Given the large number and diversity of methods available, independent assessment of methods becomes important. Although there have been several benchmark studies of single motif discovery, no similar studies have previously been conducted concerning composite motif discovery. We have developed a benchmarking framework for composite motif discovery and used it to evaluate the performance of eight published module discovery tools. Benchmark datasets were constructed based on real genomic sequences containing experimentally verified regulatory modules, and the module discovery programs were asked both to predict the locations of these modules and to specify the single motifs involved. To aid the programs in their search, we provided position weight matrices corresponding to the binding motifs of the transcription factors involved. In addition, selections of decoy matrices were mixed with the genuine matrices on one dataset to test the response of programs to varying levels of noise. Although some of the methods tested tended to score somewhat better than others overall, there were still large variations between individual datasets and no single method performed consistently better than the rest in all situations. The variation in performance on individual datasets also shows that the new benchmark datasets represent a suitable variety of challenges to most methods for module discovery.

  14. Estimating the energetic cost of feeding excess dietary nitrogen to dairy cows.

    PubMed

    Reed, K F; Bonfá, H C; Dijkstra, J; Casper, D P; Kebreab, E

    2017-09-01

    Feeding N in excess of requirement could require the use of additional energy to metabolize excess protein, and to synthesize and excrete urea; however, the amount and fate of this energy are unknown. Little progress has been made on this topic in recent decades, so an extension of work published in 1970 was conducted to quantify the effect of excess N on ruminant energetics. In part 1 of this study, the results of previous work were replicated using a simple linear regression to estimate the effect of excess N on energy balance. In part 2, mixed model methodology and a larger data set were used to improve upon the previously reported linear regression methods. In part 3, heat production, retained energy, and milk energy replaced the composite energy balance variable previously proposed as the dependent variable to narrow the effect of excess N. In addition, rumen degradable and undegradable protein intakes were estimated using table values and included as covariates in part 3. Excess N had opposite and approximately equal effects on heat production (+4.1 to +7.6 kcal/g of excess N) and retained energy (-4.2 to -6.6 kcal/g of excess N) but had a larger negative effect on milk gross energy (-52 to -68 kcal/g of excess N). The results suggest that feeding excess N increases heat production, but more investigation is required to determine why excess N has such a large effect on milk gross energy production. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  15. A BIBLIOGRAPHY OF BIOLOGICAL APPLICATIONS OF AUTORADIOGRAPHY, 1958 THROUGH 1959

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, M.E.

    1959-08-01

    This bibliography of 281 reports and published literature references on biological applications of autoradiography is a supplement to the one published July 1958 as UCRL-8400. References previously omitted are included. (J.E. D.)

  16. A second chance for authors of hijacked journals to publish in legitimate journals.

    PubMed

    Jalalian, Mehrdad

    2015-01-01

    This article proposes the republication of articles that have previously been published in counterfeit websites of hijacked journals. The paper also discusses the technical and ethical aspects of republishing such articles.

  17. Publisher Correction: Measuring progress from nationally determined contributions to mid-century strategies

    NASA Astrophysics Data System (ADS)

    Iyer, Gokul; Ledna, Catherine; Clarke, Leon; Edmonds, James; McJeon, Haewon; Kyle, Page; Williams, James H.

    2018-03-01

    In the version of this Article previously published, technical problems led to the wrong summary appearing on the homepage, and an incorrect Supplementary Information file being uploaded. Both errors have now been corrected.

  18. Chronic cuffing of cervical vagus nerve inhibits efferent fiber integrity in rat model

    NASA Astrophysics Data System (ADS)

    Somann, Jesse P.; Albors, Gabriel O.; Neihouser, Kaitlyn V.; Lu, Kun-Han; Liu, Zhongming; Ward, Matthew P.; Durkes, Abigail; Robinson, J. Paul; Powley, Terry L.; Irazoqui, Pedro P.

    2018-06-01

    Objective. Numerous studies of vagal nerve stimulation (VNS) have been published showing it to be a potential treatment for chronic inflammation and other related diseases and disorders. Studies in recent years have shown that electrical stimulation of the vagal efferent fibers can artificially modulate cytokine levels and reduce systemic inflammation. Most VNS research on the treatment of inflammation has consisted of acute studies on rodent subjects. Our study tested VNS on freely moving animals by stimulating and recording from the cervical vagus with nerve cuff electrodes over an extended period of time. Approach. We used methods of electrical stimulation, retrograde tracing (using Fluorogold) and post-necropsy histological analysis of nerve tissue, flow cytometry to measure plasma cytokine levels, and MRI scanning of gastric emptying. This novel combination of methods allowed examination of physiological aspects of VNS previously unexplored. Main results. Through our study of 53 rat subjects, we found that chronically cuffing the left cervical vagus nerve suppressed efferent Fluorogold transport in 43 of 44 animals (36 showed complete suppression). Measured cytokine levels and gastric emptying rates concurrently showed nominal differences between chronically cuffed rats and those tested with similar acute methods. Meanwhile, results of electrophysiological and histological tests of the cuffed nerves revealed them to be otherwise healthy, consistent with previous literature. Significance. We hypothesize that, because of these unforeseen and unexplored physiological consequences of chronically cuffing the vagus nerve in a rat, inflammatory modulation and other vagal effects of VNS may become unreliable in chronic studies. Given our findings, we submit that it would benefit the VNS community to re-examine methods used in previous literature to verify the efficacy of the rat model for chronic VNS studies.

  19. Influence of critical closing pressure on systemic vascular resistance and total arterial compliance: A clinical invasive study.

    PubMed

    Chemla, Denis; Lau, Edmund M T; Hervé, Philippe; Millasseau, Sandrine; Brahimi, Mabrouk; Zhu, Kaixian; Sattler, Caroline; Garcia, Gilles; Attal, Pierre; Nitenberg, Alain

    2017-12-01

    Systemic vascular resistance (SVR) and total arterial compliance (TAC) modulate systemic arterial load, and their product is the time constant (Tau) of the Windkessel. Previous studies have assumed that aortic pressure decays towards a pressure asymptote (P∞) close to 0 mmHg, as right atrial pressure is considered the outflow pressure. Using these assumptions, aortic Tau values of ∼1.5 seconds have been documented. However, a zero P∞ may not be physiological because of the high critical closing pressure previously documented in vivo. Our aim was to calculate precisely the Tau and P∞ of the Windkessel and to determine the implications for the indices of systemic arterial load. Aortic pressure decay was analysed using high-fidelity recordings in 16 subjects. Tau was calculated assuming P∞ = 0 mmHg, and by two methods that make no assumptions regarding P∞ (the derivative and best-fit methods). Assuming P∞ = 0 mmHg, we documented a Tau value of 1372 ± 308 ms, with only 29% of Windkessel function manifested by end-diastole. In contrast, Tau values of 306 ± 109 and 353 ± 106 ms were found with the derivative and best-fit methods, with P∞ values of 75 ± 12 and 71 ± 12 mmHg, and with ∼80% completion of Windkessel function. The "effective" resistance and compliance were ∼70% and ∼40% less than SVR and TAC (area method), respectively. We did not challenge the Windkessel model, but rather the estimation technique of model variables (Tau, SVR, TAC) that assumes P∞ = 0. The study favoured a shorter Tau of the Windkessel and a higher P∞ compared with previous studies. This calls for a reappraisal of the quantification of systemic arterial load. Crown Copyright © 2017. Published by Elsevier Masson SAS. All rights reserved.
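
    The analysis summarized above rests on the mono-exponential Windkessel decay of diastolic aortic pressure; in generic notation (the symbols below are not taken from the paper itself):

```latex
% Mono-exponential Windkessel decay of diastolic aortic pressure
% (P_0: pressure at the start of diastole; tau: time constant of the decay).
P(t) = P_\infty + \left(P_0 - P_\infty\right) e^{-t/\tau},
\qquad
\frac{dP}{dt} = -\,\frac{P(t) - P_\infty}{\tau}
```

    Presumably the two asymptote-free approaches exploit this form: plotting dP/dt against P during diastole gives a straight line whose slope is -1/Tau and whose intercept on the pressure axis yields P∞ (the derivative method), while fitting the exponential directly estimates both parameters at once (the best-fit method).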

  20. Discriminating between stabilizing and destabilizing protein design mutations via recombination and simulation.

    PubMed

    Johnson, Lucas B; Gintner, Lucas P; Park, Sehoo; Snow, Christopher D

    2015-08-01

    Accuracy of current computational protein design (CPD) methods is limited by inherent approximations in energy potentials and sampling. These limitations are often used to qualitatively explain design failures; however, relatively few studies provide specific examples or quantitative details that can be used to improve future CPD methods. Expanding the design method to include a library of sequences provides data that is well suited for discriminating between stabilizing and destabilizing design elements. Using thermophilic endoglucanase E1 from Acidothermus cellulolyticus as a model enzyme, we computationally designed a sequence with 60 mutations. The design sequence was rationally divided into structural blocks and recombined with the wild-type sequence. Resulting chimeras were assessed for activity and thermostability. Surprisingly, unlike previous chimera libraries, regression analysis based on one- and two-body effects was not sufficient for predicting chimera stability. Analysis of molecular dynamics simulations proved helpful in distinguishing stabilizing and destabilizing mutations. Reverting to the wild-type amino acid at destabilized sites partially regained design stability, and introducing predicted stabilizing mutations in wild-type E1 significantly enhanced thermostability. The ability to isolate stabilizing and destabilizing elements in computational design offers an opportunity to interpret previous design failures and improve future CPD methods. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Robust alignment of chromatograms by statistically analyzing the shifts matrix generated by moving window fast Fourier transform cross-correlation.

    PubMed

    Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian

    2015-03-01

    Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can then be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size has been selected. The window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography-mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
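
    A minimal sketch of the core step described above, estimating each window's shift by FFT cross-correlation and taking the mode of the candidate shifts, is given below. It is an illustration only, not the authors' MWFFT2 package; window and step sizes are arbitrary assumptions.

```python
# Minimal sketch: estimate the shift of each window of a sample chromatogram
# against a reference via FFT cross-correlation, then take the mode of the
# candidate shifts for robustness.  Not the authors' MWFFT2 implementation.
from collections import Counter
import numpy as np

def window_shift(ref_win, sample_win):
    """Lag (in points) by which sample_win is shifted relative to ref_win."""
    n = len(ref_win)
    xcorr = np.fft.ifft(np.fft.fft(sample_win, 2 * n) *
                        np.conj(np.fft.fft(ref_win, 2 * n))).real
    lags = np.concatenate([np.arange(n), np.arange(-n, 0)])
    return int(lags[np.argmax(xcorr)])

def robust_shift(ref, sample, window=200, step=50):
    shifts = [window_shift(ref[i:i + window], sample[i:i + window])
              for i in range(0, len(ref) - window + 1, step)]
    return Counter(shifts).most_common(1)[0][0]   # mode of the candidate shifts

# Toy chromatogram: five Gaussian peaks; the sample is the reference delayed by 7 points.
t = np.arange(1000)
ref = sum(np.exp(-0.5 * ((t - c) / 8.0) ** 2) for c in (100, 300, 500, 700, 900))
sample = np.roll(ref, 7)
print(robust_shift(ref, sample))   # 7
```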

  2. Postauthorization safety surveillance of ADVATE [antihaemophilic factor (recombinant), plasma/albumin-free method] demonstrates efficacy, safety and low-risk for immunogenicity in routine clinical practice.

    PubMed

    Oldenburg, J; Goudemand, J; Valentino, L; Richards, M; Luu, H; Kriukov, A; Gajek, H; Spotts, G; Ewenstein, B

    2010-11-01

      Postauthorization safety surveillance of factor VIII (FVIII) concentrates is essential for assessing rare adverse event incidence. We determined safety and efficacy of ADVATE [antihaemophilic factor (recombinant), plasma/albumin-free method, (rAHF-PFM)] during routine clinical practice. Subjects with differing haemophilia A severities and medical histories were monitored during 12 months of prophylactic and/or on-demand therapy. Among 408 evaluable subjects, 386 (95%) received excellent/good efficacy ratings for all on-demand assessments; the corresponding number for subjects with previous FVIII inhibitors was 36/41 (88%). Among 276 evaluable subjects receiving prophylaxis continuously in the study, 255 (92%) had excellent/good ratings for all prophylactic assessments; the corresponding number for subjects with previous FVIII inhibitors was 41/46 (89%). Efficacy of surgical prophylaxis was excellent/good in 16/16 evaluable procedures. Among previously treated patients (PTPs) with >50 exposure days (EDs) and FVIII≤2%, three (0.75%) developed low-titre inhibitors. Two of these subjects had a positive inhibitor history; thus, the incidence of de novo inhibitor formation in PTPs with FVIII≤2% and no inhibitor history was 1/348 (0.29%; 95% CI, 0.01-1.59%). A PTP with moderate haemophilia developed a low-titre inhibitor. High-titre inhibitors were reported in a PTP with mild disease (following surgery), a previously untreated patient (PUP) with moderate disease (following surgery) and a PUP with severe disease. The favourable benefit/risk profile of rAHF-PFM previously documented in prospective clinical trials has been extended to include a broader range of haemophilia patients, many of whom would have been ineligible for registration studies. © 2010 Blackwell Publishing Ltd.

  3. Network meta-analyses could be improved by searching more sources and by involving a librarian.

    PubMed

    Li, Lun; Tian, Jinhui; Tian, Hongliang; Moher, David; Liang, Fuxiang; Jiang, Tongxiao; Yao, Liang; Yang, Kehu

    2014-09-01

    Network meta-analyses (NMAs) aim to rank the benefits (or harms) of interventions, based on all available randomized controlled trials. Thus, the identification of relevant data is critical. We assessed the conduct of the literature searches in NMAs. Published NMAs were retrieved by searching electronic bibliographic databases and other sources. Two independent reviewers selected studies and five trained reviewers abstracted data regarding literature searches, in duplicate. Search method details were examined using descriptive statistics. Two hundred forty-nine NMAs were included. Eight used previous systematic reviews to identify primary studies without further searching, and five did not report any literature searches. In the 236 studies that used electronic databases to identify primary studies, the median number of databases was 3 (interquartile range: 3-5). MEDLINE, EMBASE, and Cochrane Central Register of Controlled Trials were the most commonly used databases. The most common supplemental search methods included reference lists of included studies (48%), reference lists of previous systematic reviews (40%), and clinical trial registries (32%). None of these supplemental methods was conducted in more than 50% of the NMAs. Literature searches in NMAs could be improved by searching more sources, and by involving a librarian or information specialist. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. High Sensitivity Analysis of Nanoliter Volumes of Volatile and Nonvolatile Compounds using Matrix Assisted Ionization (MAI) Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Hoang, Khoa; Pophristic, Milan; Horan, Andrew J.; Johnston, Murray V.; McEwen, Charles N.

    2016-10-01

    First results are reported using a simple, fast, and reproducible matrix-assisted ionization (MAI) sample introduction method that provides substantial improvements relative to previously published MAI methods. The sensitivity of the new MAI method, which requires no laser, high voltage, or nebulizing gas, is comparable to that reported for MALDI-TOF and n-ESI. High resolution full acquisition mass spectra having low chemical background are acquired from low nanoliters of solution using only a few femtomoles of analyte. The limit of detection for angiotensin II is less than 50 amol on an Orbitrap Exactive mass spectrometer. Analyses of peptides, including a bovine serum albumin digest, and of drugs, including drugs in urine without a purification step, are reported using a 1 μL zero dead volume syringe in which only the analyte solution wetting the walls of the syringe needle is used in the analysis.

  5. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  6. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  7. Artifact formation during smoke trapping: An improved method for determination of N-nitrosamines in cigarette smoke

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caldwell, W.S.; Conner, J.M.

    Studies in our laboratory revealed artifactual formation of N-nitrosamines during trapping of mainstream and sidestream tobacco smoke by the method of Hoffmann and coworkers. Both volatile and tobacco-specific N-nitrosamines were produced. This artifact formation took place on the Cambridge filter, which is part of the collection train used in the previously published procedure. When the filter was treated with ascorbic acid before smoke collection, artifact formation was inhibited. The improved method resulting from these studies was applied to a comparative analysis of N-nitrosamines in smoke from cigarettes that heat, but do not burn, tobacco (the test cigarette) and several reference cigarettes. Concentrations of volatile and tobacco-specific N-nitrosamines in both mainstream and sidestream smoke from the test cigarette were substantially lower than in the reference cigarettes.

  8. 37Cl/35Cl isotope ratio analysis in perchlorate by ion chromatography/multi-collector ICPMS: Analytical performance and implication for biodegradation studies.

    PubMed

    Zakon, Yevgeni; Ronen, Zeev; Halicz, Ludwik; Gelman, Faina

    2017-10-01

    In the present study we propose a new analytical method for 37Cl/35Cl analysis in perchlorate by Ion Chromatography (IC) coupled to Multicollector Inductively Coupled Plasma Mass Spectrometry (MC-ICPMS). The accuracy of the analytical method was validated by analysis of the international perchlorate standard materials USGS-37 and USGS-38; analytical precision better than ±0.4‰ was achieved. 37Cl/35Cl isotope ratio analysis in perchlorate during a laboratory biodegradation experiment with microbial cultures enriched from contaminated soil in Israel resulted in an isotope enrichment factor ε37Cl = -13.3 ± 1‰, which falls in the range reported previously for perchlorate biodegradation by pure microbial cultures. The proposed analytical method may significantly simplify the procedure for isotope analysis of perchlorate currently applied in environmental studies. Copyright © 2017. Published by Elsevier Ltd.
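
    A note on the calculation implied above: an isotope enrichment factor such as ε37Cl is conventionally estimated from a Rayleigh-type regression of the residual substrate's isotope composition against the fraction of substrate remaining. The sketch below illustrates only that standard regression; the data values are hypothetical and the code is not from the cited study.

        import numpy as np

        # Hypothetical biodegradation series: fraction of perchlorate remaining (f)
        # and delta-37Cl (per mil) of the residual perchlorate; values are illustrative only.
        f = np.array([1.00, 0.75, 0.50, 0.25, 0.10])
        delta37 = np.array([0.0, 3.8, 9.1, 18.6, 30.9])

        # Rayleigh model: ln((delta + 1000) / (delta0 + 1000)) = (epsilon / 1000) * ln(f)
        y = np.log((delta37 + 1000.0) / (delta37[0] + 1000.0))
        x = np.log(f)

        # Least-squares slope through the origin gives epsilon in per mil.
        epsilon = 1000.0 * np.sum(x * y) / np.sum(x * x)
        print(f"estimated enrichment factor: {epsilon:.1f} per mil")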

  9. Comparative analysis of the modified enclosed energy metric for self-focusing holograms from digital lensless holographic microscopy.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2015-06-01

    A comparative analysis of the performance of the modified enclosed energy (MEE) method for self-focusing holograms recorded with digital lensless holographic microscopy is presented. Although the MEE method has been published previously, no extended analysis of its performance has been reported. We have tested the MEE in terms of the minimum axial distance allowed between the set of reconstructed holograms when searching for the focal plane, and the elapsed time to obtain the focused image. These parameters have been compared with those of some of the methods already reported in the literature. The MEE achieves better results in terms of self-focusing quality, but at a higher computational cost. Despite its longer processing time, the method remains within a time frame that keeps it technologically attractive. Modeled and experimental holograms have been utilized in this work to perform the comparative study.

  10. Translation of Genotype to Phenotype by a Hierarchy of Cell Subsystems.

    PubMed

    Yu, Michael Ku; Kramer, Michael; Dutkowski, Janusz; Srivas, Rohith; Licon, Katherine; Kreisberg, Jason; Ng, Cherie T; Krogan, Nevan; Sharan, Roded; Ideker, Trey

    2016-02-24

    Accurately translating genotype to phenotype requires accounting for the functional impact of genetic variation at many biological scales. Here we present a strategy for genotype-phenotype reasoning based on existing knowledge of cellular subsystems. These subsystems and their hierarchical organization are defined by the Gene Ontology or a complementary ontology inferred directly from previously published datasets. Guided by the ontology's hierarchical structure, we organize genotype data into an "ontotype," that is, a hierarchy of perturbations representing the effects of genetic variation at multiple cellular scales. The ontotype is then interpreted using logical rules generated by machine learning to predict phenotype. This approach substantially outperforms previous, non-hierarchical methods for translating yeast genotype to cell growth phenotype, and it accurately predicts the growth outcomes of two new screens of 2,503 double gene knockouts impacting DNA repair or nuclear lumen. Ontotypes also generalize to larger knockout combinations, setting the stage for interpreting the complex genetics of disease.
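
    As a rough illustration of the "ontotype" idea described above, the sketch below propagates gene-level knockouts up a toy subsystem hierarchy so that each term's value counts the perturbed genes beneath it. The ontology, gene annotations, and genotype are hypothetical, and the machine-learned rules that map the ontotype to a phenotype are not shown.

        from collections import defaultdict

        # Hypothetical ontology: child term -> parent terms, and term -> directly annotated genes.
        parents = {"DNA repair": ["nucleus"], "nuclear lumen": ["nucleus"], "nucleus": ["cell"]}
        annotations = {
            "DNA repair": {"RAD51", "RAD52"},
            "nuclear lumen": {"NUP1"},
            "nucleus": set(),
            "cell": set(),
        }

        def ontotype(knocked_out_genes):
            """Count perturbed genes per term, propagating counts to all ancestor terms."""
            scores = defaultdict(int)
            for term, genes in annotations.items():
                hits = genes & knocked_out_genes
                if not hits:
                    continue
                stack, seen = [term], set()
                while stack:
                    t = stack.pop()
                    if t in seen:
                        continue
                    seen.add(t)
                    scores[t] += len(hits)
                    stack.extend(parents.get(t, []))
            return dict(scores)

        print(ontotype({"RAD51", "NUP1"}))
        # {'DNA repair': 1, 'nucleus': 2, 'cell': 2, 'nuclear lumen': 1}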

  11. Mortality of aircraft maintenance workers exposed to trichloroethylene and other hydrocarbons and chemicals: extended follow up

    PubMed Central

    Radican, Larry; Blair, Aaron; Stewart, Patricia; Wartenberg, Daniel

    2009-01-01

    Objective To extend follow-up of 14,455 workers from 1990 to 2000, and evaluate mortality risk from exposure to trichloroethylene (TCE) and other chemicals. Methods Multivariable Cox models were used to estimate relative risk for exposed vs. unexposed workers based on previously developed exposure surrogates. Results Among TCE exposed workers, there was no statistically significant increased risk of all-cause mortality (RR=1.04) or death from all cancers (RR=1.03). Exposure-response gradients for TCE were relatively flat and did not materially change since 1990. Statistically significant excesses were found for several chemical exposure subgroups and causes, and were generally consistent with the previous follow up. Conclusions Patterns of mortality have not changed substantially since 1990. While positive associations with several cancers were observed, and are consistent with the published literature, interpretation is limited due to the small numbers of events for specific exposures. PMID:19001957

  12. Fluorescence spectroscopy and molecular weight distribution of extracellular polymers from full-scale activated sludge biomass.

    PubMed

    Esparza-Soto, M; Westerhoff, P K

    2001-01-01

    Two fractions of extracellular polymer substances (EPSs), soluble and readily extractable (RE), were characterised in terms of their molecular weight distributions (MWD) and 3-D excitation-emission-matrix (EEM) fluorescence spectroscopy signatures. The EPS fractions were different: the soluble EPSs were composed mainly of high molecular weight compounds, while the RE EPSs were composed of small molecular weight compounds. Contrary to previous thought, EPS may not be considered only as macromolecular because most organic matter present in both fractions had low molecular weight. Three different fluorophore peaks were identified in the EEM fluorescence spectra. Two peaks were attributed to protein-like fluorophores, and the third to a humic-like fluorophore. Fluorescence signatures were different from other previously published signatures for marine and riverine environments. EEM spectroscopy proved to be a suitable method that may be used to characterise and trace organic matter of bacterial origin in wastewater treatment operations.

  13. Numerical comparison of grid pattern diffraction effects through measurement and modeling with OptiScan software

    NASA Astrophysics Data System (ADS)

    Murray, Ian B.; Densmore, Victor; Bora, Vaibhav; Pieratt, Matthew W.; Hibbard, Douglas L.; Milster, Tom D.

    2011-06-01

    Coatings of various metalized patterns are used for heating and electromagnetic interference (EMI) shielding applications. Previous work has focused on macro differences between different types of grids, and has shown good correlation between measurements and analyses of grid diffraction. To advance this work, we have utilized the University of Arizona's OptiScan software, which has been optimized for this application by using the Babinet Principle. When operating on an appropriate computer system, this algorithm produces results hundreds of times faster than standard Fourier-based methods, and allows realistic cases to be modeled for the first time. By using previously published derivations by Exotic Electro-Optics, we compare diffraction performance of repeating and randomized grid patterns with equivalent sheet resistance using numerical performance metrics. Grid patterns of each type are printed on optical substrates and measured energy is compared against modeled energy.

  14. A parameterization method and application in breast tomosynthesis dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2013-09-15

    Purpose: To present a parameterization method based on singular value decomposition (SVD), and to provide analytical parameterization of the mean glandular dose (MGD) conversion factors from eight references for evaluating breast tomosynthesis dose in the Mammography Quality Standards Act (MQSA) protocol and in the UK, European, and IAEA dosimetry protocols. Methods: The MGD conversion factor is usually listed in lookup tables for factors such as beam quality, breast thickness, breast glandularity, and projection angle. The authors analyzed multiple sets of MGD conversion factors from the Hologic Selenia Dimensions quality control manual and seven previous papers. Each data set was parameterized using a one- to three-dimensional polynomial function of 2–16 terms. Variable substitution was used to improve accuracy. A least-squares fit was conducted using the SVD. Results: The differences between the originally tabulated MGD conversion factors and the results computed using the parameterization algorithms were (a) 0.08%–0.18% on average and 1.31% maximum for the Selenia Dimensions quality control manual, (b) 0.09%–0.66% on average and 2.97% maximum for the published data by Dance et al. [Phys. Med. Biol. 35, 1211–1219 (1990); ibid. 45, 3225–3240 (2000); ibid. 54, 4361–4372 (2009); ibid. 56, 453–471 (2011)], (c) 0.74%–0.99% on average and 3.94% maximum for the published data by Sechopoulos et al. [Med. Phys. 34, 221–232 (2007); J. Appl. Clin. Med. Phys. 9, 161–171 (2008)], and (d) 0.66%–1.33% on average and 2.72% maximum for the published data by Feng and Sechopoulos [Radiology 263, 35–42 (2012)], excluding one sample in (d) that does not follow the trends in the published data table. Conclusions: A flexible parameterization method is presented in this paper, and was applied to breast tomosynthesis dosimetry. The resultant data offer easy and accurate computations of MGD conversion factors for evaluating mean glandular breast dose in the MQSA protocol and in the UK, European, and IAEA dosimetry protocols. Microsoft Excel™ spreadsheets are provided for the convenience of readers.
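
    The fitting step described above reduces to an ordinary least-squares fit of a low-order polynomial (optionally after a variable substitution), solved via the SVD. The sketch below shows that generic step on an illustrative one-dimensional table; the tabulated values and the chosen polynomial terms are assumptions, not the coefficients published for any of the referenced protocols.

        import numpy as np

        # Illustrative lookup table: breast thickness (cm) vs. a conversion factor (made-up values).
        thickness = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
        factor    = np.array([0.40, 0.34, 0.29, 0.25, 0.22, 0.20, 0.18])

        # Variable substitution u = 1/thickness, then a 3-term polynomial: factor ~ c0 + c1*u + c2*u**2.
        u = 1.0 / thickness
        A = np.column_stack([np.ones_like(u), u, u**2])

        # np.linalg.lstsq solves the least-squares problem with an SVD-based routine.
        coeffs, residuals, rank, sing_vals = np.linalg.lstsq(A, factor, rcond=None)

        fitted = A @ coeffs
        max_rel_err = np.max(np.abs(fitted - factor) / factor) * 100.0
        print(coeffs, f"max relative error: {max_rel_err:.2f}%")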

  15. Discrete choice experiments of pharmacy services: a systematic review.

    PubMed

    Vass, Caroline; Gray, Ewan; Payne, Katherine

    2016-06-01

    Background Two previous systematic reviews have summarised the application of discrete choice experiments to value preferences for pharmacy services. These reviews identified a total of twelve studies and described how discrete choice experiments have been used to value pharmacy services but did not describe or discuss the application of methods used in the design or analysis. Aims (1) To update the most recent systematic review and critically appraise current discrete choice experiments of pharmacy services in line with published reporting criteria; and (2) to provide an overview of key methodological developments in the design and analysis of discrete choice experiments. Methods The review used a comprehensive strategy to identify eligible studies (published between 1990 and 2015) by searching electronic databases for key terms related to discrete choice and best-worst scaling (BWS) experiments. All healthcare choice experiments were then hand-searched for key terms relating to pharmacy. Data were extracted using a published checklist. Results A total of 17 discrete choice experiments eliciting preferences for pharmacy services were identified for inclusion in the review. No BWS studies were identified. The studies elicited preferences from a variety of populations (pharmacists, patients, students) for a range of pharmacy services. Most studies were from a United Kingdom setting, although examples from Europe, Australia and North America were also identified. Discrete choice experiments for pharmacy services tended to include more attributes than non-pharmacy choice experiments. Few studies reported the use of qualitative research methods in the design and interpretation of the experiments (n = 9) or use of new methods of analysis to identify and quantify preference and scale heterogeneity (n = 4). No studies reported the use of Bayesian methods in their experimental design. Conclusion Incorporating more sophisticated methods in the design of pharmacy-related discrete choice experiments could help researchers produce more efficient experiments which are better suited to valuing complex pharmacy services. Pharmacy-related discrete choice experiments could also benefit from more sophisticated analytical techniques such as investigations into scale and preference heterogeneity. Employing these sophisticated methods for both design and analysis could extend the usefulness of discrete choice experiments to inform health and pharmacy policy.

  16. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  17. Gene discovery by chemical mutagenesis and whole-genome sequencing in Dictyostelium.

    PubMed

    Li, Cheng-Lin Frank; Santhanam, Balaji; Webb, Amanda Nicole; Zupan, Blaž; Shaulsky, Gad

    2016-09-01

    Whole-genome sequencing is a useful approach for identification of chemically induced lesions, but previous applications involved tedious genetic mapping to pinpoint the causative mutations. We propose that saturation mutagenesis under low mutagenic loads, followed by whole-genome sequencing, should allow direct implication of genes by identifying multiple independent alleles of each relevant gene. We tested the hypothesis by performing three genetic screens with chemical mutagenesis in the social soil amoeba Dictyostelium discoideum. Through genome sequencing, we successfully identified mutant genes with multiple alleles in near-saturation screens, including resistance to intense illumination and strong suppressors of defects in an allorecognition pathway. We tested the causality of the mutations by comparison to published data and by direct complementation tests, finding both dominant and recessive causative mutations. Therefore, our strategy provides a cost- and time-efficient approach to gene discovery by integrating chemical mutagenesis and whole-genome sequencing. The method should be applicable to many microbial systems, and it is expected to revolutionize the field of functional genomics in Dictyostelium by greatly expanding the mutation spectrum relative to other common mutagenesis methods. © 2016 Li et al.; Published by Cold Spring Harbor Laboratory Press.
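
    The gene-implication logic described above can be reduced to a simple tally: a gene is implicated when candidate mutations fall in it in two or more independently derived mutants. The sketch below illustrates that tally on hypothetical variant calls; it is not the authors' pipeline.

        from collections import defaultdict

        # Hypothetical candidate variants from whole-genome sequencing: (mutant isolate, gene) pairs.
        variants = [
            ("mutant_01", "geneA"),
            ("mutant_02", "geneA"),
            ("mutant_02", "geneB"),
            ("mutant_03", "geneC"),
            ("mutant_04", "geneA"),
            ("mutant_05", "geneC"),
        ]

        alleles_per_gene = defaultdict(set)
        for isolate, gene in variants:
            alleles_per_gene[gene].add(isolate)

        # Implicate genes hit in at least two independent mutants.
        implicated = {g: sorted(hits) for g, hits in alleles_per_gene.items() if len(hits) >= 2}
        print(implicated)
        # {'geneA': ['mutant_01', 'mutant_02', 'mutant_04'], 'geneC': ['mutant_03', 'mutant_05']}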

  18. 77 FR 7609 - Policy Letter 11-01, Performance of Inherently Governmental and Critical Functions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-13

    ... Regulation. The corrections below should be used in place of text previously published in the September 12, 2011 notice. All other information from the published Final Policy remains unchanged. The full text of...

  19. Device SEE Susceptibility Update: 1996-1998

    NASA Technical Reports Server (NTRS)

    Coss, J. R.; Miyahira, T. F.; Swift, G. M.

    1998-01-01

    This eighth Compendium continues the previous work of Nichols et al. on single event effects (SEE), first published in 1985. Because the compendium has grown so voluminous, this update only presents data not published in previous compendia.

  20. Plant proteome analysis: a 2006 update.

    PubMed

    Jorrín, Jesús V; Maldonado, Ana M; Castillejo, Ma Angeles

    2007-08-01

    This 2006 'Plant Proteomics Update' is a continuation of the two previously published in 'Proteomics' by 2004 (Canovas et al., Proteomics 2004, 4, 285-298) and 2006 (Rossignol et al., Proteomics 2006, 6, 5529-5548) and it aims to bring up-to-date the contribution of proteomics to plant biology on the basis of the original research papers published throughout 2006, with references to those appearing last year. According to the published papers and topics addressed, we can conclude that, as observed for the three previous years, there has been a quantitative, but not qualitative leap in plant proteomics. The full potential of proteomics is far from being exploited in plant biology research, especially if compared to other organisms, mainly yeast and humans, and a number of challenges, mainly technological, remain to be tackled. The original papers published last year numbered nearly 100 and deal with the proteome of at least 26 plant species, with a high percentage for Arabidopsis thaliana (28) and rice (11). Scientific objectives ranged from proteomic analysis of organs/tissues/cell suspensions (57) or subcellular fractions (29), to the study of plant development (12), the effect of hormones and signalling molecules (8) and response to symbionts (4) and stresses (27). A small number of contributions have covered PTMs (8) and protein interactions (4). 2-DE (specifically IEF-SDS-PAGE) coupled to MS still constitutes the almost unique platform utilized in plant proteome analysis. The application of gel-free protein separation methods and 'second generation' proteomic techniques such as multidimensional protein identification technology (MudPIT), and those for quantitative proteomics including DIGE, isotope-coded affinity tags (ICAT), iTRAQ and stable isotope labelling by amino acids in cell culture (SILAC) still remains anecdotal. This review is divided into seven sections: Introduction, Methodology, Subcellular proteomes, Development, Responses to biotic and abiotic stresses, PTMs and Protein interactions. Section 8 summarizes the major pitfalls and challenges of plant proteomics.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mead, H; St. Jude Children’s Research Hospital, Memphis, TN; Brady, S

    Purpose: To discover if a previously published methodology for estimating patient-specific organ dose in a pediatric population (5–55 kg) is translatable to the adult-sized patient population (> 55 kg). Methods: An adult male anthropomorphic phantom was scanned with metal oxide semiconductor field effect transistor (MOSFET) dosimeters placed at 23 organ locations in the chest and abdominopelvic regions to determine absolute organ dose. Organ-dose-to-SSDE correlation factors were developed by dividing individual phantom organ doses by the SSDE of the phantom, where SSDE was calculated at the center of the scan volume of the chest and abdomen/pelvis separately. Organ dose correlation factors developed in phantom were multiplied by 28 chest and 22 abdominopelvic patient SSDE values to estimate organ dose. The median patient weight from the CT examinations was 68.9 kg (range 57–87 kg) and median age was 17 years (range 13–28 years). Calculated organ dose estimates were compared to published Monte Carlo simulated patient and phantom results. Results: Organ-dose-to-SSDE correlation was determined for a total of 23 organs in the chest and abdominopelvic regions. For organs fully covered by the scan volume, correlation in the chest (median 1.3; range 1.1–1.5) and abdominopelvic (median 0.9; range 0.7–1.0) was 1.0 ± 10%. For organs that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface) correlation was determined to be a median of 0.3 (range 0.1–0.4). Calculated patient organ dose using patient SSDE agreed to within 6% (chest) and 15% (abdominopelvic) of published values. Conclusion: This study demonstrated that our previously published methodology for calculating organ dose using patient-specific SSDE for the chest and abdominopelvic regions is translatable to adult-sized patients for organs fully covered by the scan volume.
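
    The dose-estimation arithmetic described above is a multiplication of a phantom-derived organ-dose-to-SSDE correlation factor by the patient's SSDE. The sketch below illustrates that step; the correlation factors and the SSDE value are hypothetical, not the study's measured data.

        # Hypothetical organ-dose-to-SSDE correlation factors (organ dose / SSDE) from phantom measurements.
        chest_correlation = {"lung": 1.3, "heart": 1.2, "breast": 1.1}

        def estimate_organ_doses(patient_ssde_mgy, correlation_factors):
            """Estimate patient organ doses (mGy) as correlation_factor * patient SSDE."""
            return {organ: round(f * patient_ssde_mgy, 2) for organ, f in correlation_factors.items()}

        print(estimate_organ_doses(8.5, chest_correlation))
        # {'lung': 11.05, 'heart': 10.2, 'breast': 9.35}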

  2. Optimization of direct fibroblast reprogramming to cardiomyocytes using calcium activity as a functional measure of success.

    PubMed

    Addis, Russell C; Ifkovits, Jamie L; Pinto, Filipa; Kellam, Lori D; Esteso, Paul; Rentschler, Stacey; Christoforou, Nicolas; Epstein, Jonathan A; Gearhart, John D

    2013-07-01

    Direct conversion of fibroblasts to induced cardiomyocytes (iCMs) has great potential for regenerative medicine. Recent publications have reported significant progress, but the evaluation of reprogramming has relied upon non-functional measures such as flow cytometry for cardiomyocyte markers or GFP expression driven by a cardiomyocyte-specific promoter. The issue is one of practicality: the most stringent measures - electrophysiology to detect cell excitation and the presence of spontaneously contracting myocytes - are not readily quantifiable in the large numbers of cells screened in reprogramming experiments. However, excitation and contraction are linked by a third functional characteristic of cardiomyocytes: the rhythmic oscillation of intracellular calcium levels. We set out to optimize direct conversion of fibroblasts to iCMs with a quantifiable calcium reporter to rapidly assess functional transdifferentiation. We constructed a reporter system in which the calcium indicator GCaMP is driven by the cardiomyocyte-specific Troponin T promoter. Using calcium activity as our primary outcome measure, we compared several published combinations of transcription factors along with novel combinations in mouse embryonic fibroblasts. The most effective combination consisted of Hand2, Nkx2.5, Gata4, Mef2c, and Tbx5 (HNGMT). This combination is >50-fold more efficient than GMT alone and produces iCMs with cardiomyocyte marker expression, robust calcium oscillation, and spontaneous beating that persist for weeks following inactivation of reprogramming factors. HNGMT is also significantly more effective than previously published factor combinations for the transdifferentiation of adult mouse cardiac fibroblasts to iCMs. Quantification of calcium function is a convenient and effective means for the identification and evaluation of cardiomyocytes generated by direct reprogramming. Using this stringent outcome measure, we conclude that HNGMT produces iCMs more efficiently than previously published methods. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.

  3. Multiple sequence alignment in HTML: colored, possibly hyperlinked, compact representations.

    PubMed

    Campagne, F; Maigret, B

    1998-02-01

    Protein sequence alignments are widely used in protein structure prediction, protein engineering, modeling of proteins, etc. This type of representation is useful at different stages of scientific activity: looking at previous results, working on a research project, and presenting the results. There is a need to make it available through a network (intranet or WWW), in a way that allows biologists, chemists, and noncomputer specialists to look at the data and carry on research, possibly collaboratively. Previous methods (text-based, Java-based) are reported and their advantages are discussed. We have developed two novel approaches to represent the alignments as colored, hyper-linked HTML pages. The first method creates an HTML page that makes efficient use of the image cache mechanism of a WWW browser, thereby allowing the user to browse different alignments without waiting for the images to be loaded through the network, except for the first viewed alignment. The generated pages can be browsed with any HTML2.0-compliant browser. The second method that we propose uses W3C-CSS1 style sheets to render alignments. This new method generates pages that require recent browsers to be viewed. We implemented these methods in the Viseur program and made a WWW service available that allows a user to convert an MSF alignment file to HTML for WWW publishing. The latter service is available at http://www.lctn.u-nancy.fr/viseur/services.html.
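
    In the spirit of the second (style-sheet based) approach described above, the sketch below wraps each residue in a classed span and colors it with a small embedded CSS rule set. The color scheme and the toy alignment are arbitrary illustrations and do not reproduce the Viseur program's output format.

        # Render a toy multiple sequence alignment as colored HTML using CSS classes.
        alignment = {"seq1": "MKT-LLV", "seq2": "MRTALLV", "seq3": "MKTALIV"}

        css = """
        <style>
          .aln { font-family: monospace; white-space: pre; }
          .hydrophobic { background: #ffd27f; }
          .polar       { background: #9fd3ff; }
          .gap         { background: #dddddd; }
        </style>
        """

        def residue_class(res):
            # Crude two-class coloring scheme chosen purely for illustration.
            if res == "-":
                return "gap"
            return "hydrophobic" if res in "AVLIMFWP" else "polar"

        rows = []
        for name, seq in alignment.items():
            spans = "".join(f'<span class="{residue_class(r)}">{r}</span>' for r in seq)
            rows.append(f'<div class="aln">{name:6s} {spans}</div>')

        html = "<html><head>" + css + "</head><body>" + "\n".join(rows) + "</body></html>"
        with open("alignment.html", "w") as fh:
            fh.write(html)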

  4. Localization microscopy of DNA in situ using Vybrant® DyeCycle™ Violet fluorescent probe: A new approach to study nuclear nanostructure at single molecule resolution.

    PubMed

    Żurek-Biesiada, Dominika; Szczurek, Aleksander T; Prakash, Kirti; Mohana, Giriram K; Lee, Hyun-Keun; Roignant, Jean-Yves; Birk, Udo J; Dobrucki, Jurek W; Cremer, Christoph

    2016-05-01

    Higher order chromatin structure is not only required to compact and spatially arrange long chromatids within a nucleus, but also has important functional roles, including control of gene expression and DNA processing. However, studies of chromatin nanostructures cannot be performed using conventional widefield and confocal microscopy because of the limited optical resolution. Various methods of superresolution microscopy have been described to overcome this difficulty, such as structured illumination and single molecule localization microscopy. We report here that the standard DNA dye Vybrant® DyeCycle™ Violet can be used to provide single molecule localization microscopy (SMLM) images of DNA in nuclei of fixed mammalian cells. This SMLM method enabled optical isolation and localization of large numbers of DNA-bound molecules, usually in excess of 10^6 signals in one cell nucleus. The technique yielded high-quality images of nuclear DNA density, revealing subdiffraction chromatin structures on the order of 100 nm in size; the interchromatin compartment was visualized at unprecedented optical resolution. The approach offers several advantages over previously described high resolution DNA imaging methods, including high specificity, an ability to record images using a single wavelength excitation, and a higher density of single molecule signals than reported in previous SMLM studies. The method is compatible with DNA/multicolor SMLM imaging which employs simple staining methods suited also for conventional optical microscopy. Copyright © 2016. Published by Elsevier Inc.

  5. Osteoarthritis year in review 2016: imaging.

    PubMed

    Boesen, M; Ellegaard, K; Henriksen, M; Gudbergsen, H; Hansen, P; Bliddal, H; Bartels, E M; Riis, R G

    2017-02-01

    The current narrative review covers original research related to imaging in osteoarthritis (OA) in humans published in English between April 1st 2015 and March 31st 2016, in peer reviewed journals available in Medline via PubMed (http://www.ncbi.nlm.nih.gov/pubmed/). Relevant studies in humans, subjectively decided by the authors, contributing significantly to the OA imaging field, were selected from an extensive Medline search using the terms "Osteoarthritis" in combination with "MRI", "Imaging", "Radiography", "X-rays", "Ultrasound", "Computed tomography", "Nuclear medicine", "PET-CT", "PET-MRI", "Scintigraphy", "SPECT". Publications were sorted according to relevance for the OA imaging research community with an emphasis on high impact special interest journals using the software for systematic reviews www.covidence.org. An overview of newly published studies compared to studies reported previous years is presented, followed by a review of selected imaging studies of primarily knee, hip and hand OA focussing on (1) results for detection of OA and OA-related pathology (2) studies dealing with treatments and (3) studies focussing on prognosis of disease progression or joint replacement. A record high number of 1420 articles were published, among others, of new technologies and tools for improved morphological and pathophysiological understanding of OA-related changes in joints. Also, imaging data were presented of monitoring treatment effect and prognosis of OA progression, primarily using established radiographic, magnetic resonance imaging (MRI), and ultrasound (US) methods. Imaging continues to play an important role in OA research, where several exciting new technologies and computer aided analysis methods are emerging to complement the conventional imaging approaches. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  6. Surgery for disc-associated wobbler syndrome in the dog--an examination of the controversy.

    PubMed

    Jeffery, N D; McKee, W M

    2001-12-01

    Controversy surrounds treatment of disc-associated 'wobbler' syndrome in the dog, centring on the choice of method of surgical decompression used. In this review, details of previously published case series are summarised and critically examined in an attempt to compare success rates and complications of different types of surgery. Unequivocally accurate comparisons were difficult because of differences in methods of case recording between series. Short-term success rates were high (approximately 80 per cent), but there was a high rate of recurrence (around 20 per cent) after any surgical treatment, suggesting the possibility that the syndrome should be considered a multifocal disease of the caudal cervical region. Statistical analysis revealed no significant differences in success rates between the various reported decompressive surgical techniques.

  7. Estimation of multiple accelerated motions using chirp-Fourier transform and clustering.

    PubMed

    Alexiadis, Dimitrios S; Sergiadis, George D

    2007-01-01

    Motion estimation in the spatiotemporal domain has been extensively studied and many methodologies have been proposed, which, however, cannot handle both time-varying and multiple motions. Extending previously published ideas, we present an efficient method for estimating multiple, linearly time-varying motions. It is shown that the estimation of accelerated motions is equivalent to the parameter estimation of superpositioned chirp signals. From this viewpoint, one can exploit established signal processing tools such as the chirp-Fourier transform. It is shown that accelerated motion results in energy concentration along planes in the 4-D space: spatial frequencies-temporal frequency-chirp rate. Using fuzzy c-planes clustering, we estimate the plane/motion parameters. The effectiveness of our method is verified on both synthetic as well as real sequences and its advantages are highlighted.

  8. Validation of engineering methods for predicting hypersonic vehicle controls forces and moments

    NASA Technical Reports Server (NTRS)

    Maughmer, M.; Straussfogel, D.; Long, L.; Ozoroski, L.

    1991-01-01

    This work examines the ability of the aerodynamic analysis methods contained in an industry standard conceptual design code, the Aerodynamic Preliminary Analysis System (APAS II), to estimate the forces and moments generated through control surface deflections from low subsonic to high hypersonic speeds. Predicted control forces and moments generated by various control effectors are compared with previously published wind-tunnel and flight-test data for three vehicles: the North American X-15, a hypersonic research airplane concept, and the Space Shuttle Orbiter. Qualitative summaries of the results are given for each force and moment coefficient and each control derivative in the various speed ranges. Results show that all predictions of longitudinal stability and control derivatives are acceptable for use at the conceptual design stage.

  9. Medical auditing of whole-breast screening ultrasonography

    PubMed Central

    2017-01-01

    Since breast ultrasonography (US) has been used as an adjunctive screening modality in women with dense breasts, the need has arisen to evaluate and monitor its possible harm and benefits in comparison with other screening modalities such as mammography. Recently, the fifth edition of the Breast Imaging Reporting and Data System published by the American College of Radiology has suggested auditing methods for screening breast US. However, the method proposed therein is slightly different from how diagnostic performance was calculated in previous studies on screening breast US. In this article, the background and core aspects of medical audits of breast cancer screening will be reviewed to provide an introduction to the medical auditing of screening breast US, with the goal of helping radiologists to understand and identify potential ways to improve outcomes. PMID:28322034

  10. Selected field and analytical methods and analytical results in the Dutch Flats area, western Nebraska, 1995-99

    USGS Publications Warehouse

    Verstraeten, Ingrid M.; Steele, G.V.; Cannia, J.C.; Bohlke, J.K.; Kraemer, T.E.; Hitch, D.E.; Wilson, K.E.; Carnes, A.E.

    2001-01-01

    A study of the water resources of the Dutch Flats area in the western part of the North Platte Natural Resources District, western Nebraska, was conducted from 1995 through 1999 to describe the surface water and hydrogeology, the spatial distribution of selected water-quality constituents in surface and ground water, and the surface-water/ground-water interaction in selected areas. This report describes the selected field and analytical methods used in the study and selected analytical results from the study not previously published. Specifically, dissolved gases, age-dating data, and other isotopes collected as part of an intensive sampling effort in August and November 1998 and all uranium and uranium isotope data collected through the course of this study are included in the report.

  11. Determination of the spin and recovery characteristics of a typical low-wing general aviation design

    NASA Technical Reports Server (NTRS)

    Tischler, M. B.; Barlow, J. B.

    1980-01-01

    The equilibrium spin technique implemented in a graphical form for obtaining spin and recovery characteristics from rotary balance data is outlined. Results of its application to recent rotary balance tests of the NASA Low-Wing General Aviation Aircraft are discussed. The present results, which are an extension of previously published findings, indicate the ability of the equilibrium method to accurately evaluate spin modes and recovery control effectiveness. A comparison of the calculated results with available spin tunnel and full scale findings is presented. The technique is suitable for preliminary design applications as determined from the available results and data base requirements. A full discussion of implementation considerations and a summary of the results obtained from this method to date are presented.

  12. The development of a primary dental care outreach course.

    PubMed

    Waterhouse, P; Maguire, A; Tabari, D; Hind, V; Lloyd, J

    2008-02-01

    The aim of this work was to develop the first north-east based primary dental care outreach (PDCO) course for clinical dental undergraduate students at Newcastle University. The process of course design is described; it involved a review of the existing Bachelor of Dental Surgery (BDS) degree course in relation to previously published learning outcomes. Areas were identified where the existing BDS course did not fully meet these outcomes. This was followed by setting the PDCO course aims and objectives, intended learning outcomes, curriculum and structure. The educational strategy and methods of teaching and learning were subsequently developed together with a strategy for overall quality control of the teaching and learning experience. The newly developed curriculum was aligned with appropriate student assessment methods, including summative, formative and ipsative elements.

  13. Medical auditing of whole-breast screening ultrasonography.

    PubMed

    Kim, Min Jung

    2017-07-01

    Since breast ultrasonography (US) has been used as an adjunctive screening modality in women with dense breasts, the need has arisen to evaluate and monitor its possible harm and benefits in comparison with other screening modalities such as mammography. Recently, the fifth edition of the Breast Imaging Reporting and Data System published by the American College of Radiology has suggested auditing methods for screening breast US. However, the method proposed therein is slightly different from how diagnostic performance was calculated in previous studies on screening breast US. In this article, the background and core aspects of medical audits of breast cancer screening will be reviewed to provide an introduction to the medical auditing of screening breast US, with the goal of helping radiologists to understand and identify potential ways to improve outcomes.

  14. High throughput photo-oxidations in a packed bed reactor system.

    PubMed

    Kong, Caleb J; Fisher, Daniel; Desai, Bimbisar K; Yang, Yuan; Ahmad, Saeed; Belecki, Katherine; Gupton, B Frank

    2017-12-01

    The efficiency gains produced by continuous-flow systems in conducting photochemical transformations have been extensively demonstrated. Recently, these systems have been used in developing safe and efficient methods for photo-oxidations using singlet oxygen generated by photosensitizers. Much of the previous work has focused on the use of homogeneous photocatalysts. The development of a unique, packed-bed photoreactor system using immobilized rose bengal expands these capabilities, as this robust photocatalyst allows access to and elaboration from these highly useful building blocks without the need for further purification. With this platform we were able to demonstrate a wide scope of singlet oxygen ene reactions, [4+2] cycloadditions and heteroatom oxidations. Furthermore, we applied this method as a strategic element in the synthesis of the high-volume antimalarial artemisinin. Copyright © 2017. Published by Elsevier Ltd.

  15. IMPROVED Ti II log(gf) VALUES AND ABUNDANCE DETERMINATIONS IN THE PHOTOSPHERES OF THE SUN AND METAL-POOR STAR HD 84937

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, M. P.; Lawler, J. E.; Sneden, C.

    2013-10-01

    Atomic transition probability measurements for 364 lines of Ti II in the UV through near-IR are reported. Branching fractions from data recorded using a Fourier transform spectrometer (FTS) and a new echelle spectrometer are combined with published radiative lifetimes to determine these transition probabilities. The new results are in generally good agreement with previously reported FTS measurements. Use of the new echelle spectrometer, independent radiometric calibration methods, and independent data analysis routines enables a reduction of systematic errors and overall improvement in transition probability accuracy over previous measurements. The new Ti II data are applied to high-resolution visible and UV spectra of the Sun and metal-poor star HD 84937 to derive new, more accurate Ti abundances. Lines covering a range of wavelength and excitation potential are used to search for non-LTE effects. The Ti abundances derived using Ti II for these two stars match those derived using Ti I and support the relative Ti/Fe abundance ratio versus metallicity seen in previous studies.

  16. RNA-Seq based phylogeny recapitulates previous phylogeny of the genus Flaveria (Asteraceae) with some modifications.

    PubMed

    Lyu, Ming-Ju Amy; Gowik, Udo; Kelly, Steve; Covshoff, Sarah; Mallmann, Julia; Westhoff, Peter; Hibberd, Julian M; Stata, Matt; Sage, Rowan F; Lu, Haorong; Wei, Xiaofeng; Wong, Gane Ka-Shu; Zhu, Xin-Guang

    2015-06-18

    The genus Flaveria has been extensively used as a model to study the evolution of C4 photosynthesis as it contains C3 and C4 species as well as a number of species that exhibit intermediate types of photosynthesis. The current phylogenetic tree of the genus Flaveria contains 21 of the 23 known Flaveria species and has been previously constructed using a combination of morphological data and three non-coding DNA sequences (nuclear encoded ETS, ITS and chloroplast encoded trnL-F). Here we developed a new strategy to update the phylogenetic tree of 16 Flaveria species based on RNA-Seq data. The updated phylogeny is largely congruent with the previously published tree but with some modifications. We propose that the data collection method provided in this study can be used as a generic method for phylogenetic tree reconstruction if the target species has no genomic information. We also showed that an "F. pringlei" genotype recently used in a number of labs may be a hybrid between F. pringlei (C3) and F. angustifolia (C3-C4). We propose that the new strategy of obtaining phylogenetic sequences outlined in this study can be used to construct robust trees in a larger number of taxa. The updated Flaveria phylogenetic tree also supports a hypothesis of stepwise and parallel evolution of C4 photosynthesis in the Flaveria clade.

  17. Mapping the Binding Interface of VEGF and a Monoclonal Antibody Fab-1 Fragment with Fast Photochemical Oxidation of Proteins (FPOP) and Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wecksler, Aaron T.; Molina, Patricia; Deperalta, Galahad; Gross, Michael L.

    2017-05-01

    We previously analyzed the Fab-1:VEGF (vascular endothelial growth factor) system described in this work, with both native top-down mass spectrometry and bottom-up mass spectrometry (carboxyl-group or GEE footprinting) techniques. This work continues bottom-up mass spectrometry analysis using a fast photochemical oxidation of proteins (FPOP) platform to map the solution binding interface of VEGF and a fragment antigen binding region of an antibody (Fab-1). In this study, we use FPOP to compare the changes in solvent accessibility by quantitating the extent of oxidative modification in the unbound versus bound states. Determining the changes in solvent accessibility enables the inference of the protein binding sites (epitope and paratopes) and a comparison to the previously published Fab-1:VEGF crystal structure, adding to the top-down and bottom-up data. Using this method, we investigated peptide-level and residue-level changes in solvent accessibility between the unbound proteins and bound complex. Mapping these data onto the Fab-1:VEGF crystal structure enabled successful characterization of both the binding region and regions of remote conformation changes. These data, coupled with our previous higher order structure (HOS) studies, demonstrate the value of a comprehensive toolbox of methods for identifying the putative epitopes and paratopes for biotherapeutic antibodies.

  18. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment methods for urban natural gas pipeline network are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline network, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both of the two methods can be applied to practical application, and the choice of the methods depends on the actual basic data of the gas pipelines and the precision requirements of risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
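
    As a rough illustration of the qualitative method's structure, the sketch below combines a causation index, an inherent risk index, and a consequence index with weights into a single qualitative risk value for a pipeline segment. The index values, weights, and scale are illustrative assumptions, not the calibrated values from the paper.

        # Illustrative weighted-index scoring for one pipeline segment.
        # All index values (0-10 scale) and weights below are hypothetical.
        weights = {"causation": 0.3, "inherent": 0.3, "consequence": 0.4}
        segment = {"causation": 6.0, "inherent": 4.5, "consequence": 7.0}

        qualitative_risk = sum(weights[k] * segment[k] for k in weights)
        print(f"qualitative risk value: {qualitative_risk:.2f}")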

  19. Top 100 Most-cited Articles on Pituitary Adenoma: A Bibliometric Analysis.

    PubMed

    Guo, Xiaopeng; Gao, Lu; Wang, Zihao; Feng, Chenzhe; Xing, Bing

    2018-06-02

    Many articles have been published on pituitary adenomas. Bibliometric analyses are helpful for determining the most impactful studies within a field. To identify the top 100 most-cited articles on pituitary adenomas using the bibliometric analysis method. We searched the Thomson Reuters Web of Science on March 31, 2018. Articles were listed in descending order by the total citation (TC) number, and the most-cited articles on pituitary adenomas were identified and analyzed. The most-cited articles were published between 1970 and 2014, with 1999 as the most prolific year. Growth hormone-secreting pituitary adenoma was the most commonly studied tumor subtype (43%), and in clinical studies, treatment options and follow-up were the most important research focuses (62%). The average number of TCs was 326, and the average number of annual citations (ACs) was 17. More review articles were published in the last decade, and the average number of ACs was higher for this decade than for previous decades. Twenty-one articles were recognized as "Citation Classics" with a TC number > 400. Twenty-five journals published the top 100 works; the Journal of Clinical Endocrinology and Metabolism published the most articles (25%). The largest number of articles (43%) was published in the United States. S. Melmed authored the greatest number of publications (14%). Departments of Medicine (32%) and Endocrinology (32%) contributed to the largest number of articles. This study identified the research focuses and trends regarding pituitary adenoma and provides key references for investigators in guiding future pituitary adenoma research. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006–2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extracted from 64 articles measuring a variety of constructs, published throughout the peer-reviewed medical education literature, indicate significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize this gap in best practices and subsequently to promote instrument development research that is more consistent through the peer-review process.

  1. Evolution of Toxoplasma-PCR methods and practices: a French national survey and proposal for technical guidelines.

    PubMed

    Roux, Guillaume; Varlet-Marie, Emmanuelle; Bastien, Patrick; Sterkers, Yvon

    2018-06-08

    The molecular diagnosis of toxoplasmosis lacks standardisation due to the use of numerous methods with variable performance. This diversity of methods also impairs robust performance comparisons between laboratories. The harmonisation of practices by diffusion of technical guidelines is a useful way to improve these performances. The knowledge of methods and practices used for this molecular diagnosis is an essential step to provide guidelines for Toxoplasma-PCR. In the present study, we aimed (i) to describe the methods and practices of Toxoplasma-PCR used by clinical microbiology laboratories in France and (ii) to propose technical guidelines to improve molecular diagnosis of toxoplasmosis. To do so, a yearly self-administered questionnaire-based survey was undertaken in proficient French laboratories from 2008 to 2015, and guidelines were proposed based on the results of those as well as previously published work. This period saw the progressive abandonment of conventional PCR methods, of Toxoplasma-PCR targeting the B1 gene and of the use of two concomitant molecular methods for this diagnosis. The diversity of practices persisted during the study, in spite of the increasing use of commercial kits such as PCR kits, DNA extraction controls and PCR inhibition controls. We also observed a tendency towards the automation of DNA extraction. The evolution of practices did not always go together with an improvement in those, as reported notably by the declining use of Uracil-DNA Glycosylase to avoid carry-over contamination. We here propose technical recommendations which correspond to items explored during the survey, with respect to DNA extraction, Toxoplasma-PCR and good PCR practices. Copyright © 2018 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.

  2. Determination of residues of three triphenylmethane dyes and their metabolites (malachite green, leuco malachite green, crystal violet, leuco crystal violet, and brilliant green) in aquaculture products by LC/MS/MS: first action 2012.25.

    PubMed

    Hurtaud-Pessel, Dominique; Couëdor, Pierrick; Verdon, Eric; Dowell, Dawn

    2013-01-01

    During the AOAC Annual Meeting held from September 30 to October 3, 2012 in Las Vegas, NV, the Expert Review Panel (ERP) on Veterinary Drug Residues reviewed data for the method for determination of residues of three triphenylmethane dyes and their metabolites (malachite green, leuco malachite green, crystal violet, leuco crystal violet, and brilliant green) in aquaculture products by LC/MS/MS, previously published in the Journal of Chromatography A 1218, 1632-1645 (2006). The method data were reviewed and compared to the standard method performance requirements (SMPRs) found in SMPR 2009.001, published in AOAC's Official Methods of Analysis, 19th Ed. (2012). The ERP determined that the data were acceptable, and the method was approved as AOAC Official First Action. The method uses acetonitrile to isolate the analyte from the matrix. Determination is then conducted by LC/MS/MS with positive electrospray ionization. Accuracy ranged from 100.1 to 109.8% for samples fortified at levels of 0.5, 0.75, 1.0, and 2.0 microg/kg. Precision ranged from 2.0 to 10.3% RSD for the intraday samples and 1.9 to 10.6% for the interday samples analyzed over 3 days. The described method is designed to accurately operate in the analytical range from 0.5 to 2 microg/kg, where the minimum required performance limit for laboratories has been fixed in the European Union at 2.0 microg/kg for these banned substances and their metabolites. Upper levels of concentrations (1-100 microg/kg) can be analyzed depending on the different optional calibrations used.

  3. Variance estimation when using inverse probability of treatment weighting (IPTW) with survival analysis.

    PubMed

    Austin, Peter C

    2016-12-30

    Propensity score methods are used to reduce the effects of observed confounding when using observational data to estimate the effects of treatments or exposures. A popular method of using the propensity score is inverse probability of treatment weighting (IPTW). When using this method, a weight is calculated for each subject that is equal to the inverse of the probability of receiving the treatment that was actually received. These weights are then incorporated into the analyses to minimize the effects of observed confounding. Previous research has found that these methods result in unbiased estimation when estimating the effect of treatment on survival outcomes. However, conventional methods of variance estimation were shown to result in biased estimates of standard error. In this study, we conducted an extensive set of Monte Carlo simulations to examine different methods of variance estimation when using a weighted Cox proportional hazards model to estimate the effect of treatment. We considered three variance estimation methods: (i) a naïve model-based variance estimator; (ii) a robust sandwich-type variance estimator; and (iii) a bootstrap variance estimator. We considered estimation of both the average treatment effect and the average treatment effect in the treated. We found that the use of a bootstrap estimator resulted in approximately correct estimates of standard errors and confidence intervals with the correct coverage rates. The other estimators resulted in biased estimates of standard errors and confidence intervals with incorrect coverage rates. Our simulations were informed by a case study examining the effect of statin prescribing on mortality. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
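
    A minimal sketch of the recommended bootstrap approach, under stated assumptions: propensity scores from a logistic model, ATE-type inverse probability of treatment weights, a weighted Cox fit, and a bootstrap of the whole procedure to obtain the standard error. It assumes scikit-learn and a recent version of the lifelines package (whose CoxPHFitter accepts a weights column and exposes fitted coefficients as params_); the data are simulated, and this is not the code used in the cited study.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from lifelines import CoxPHFitter   # assumed dependency and API

        rng = np.random.default_rng(0)

        # Simulated observational data: confounder x, binary treatment z, survival time and event flag.
        n = 500
        x = rng.normal(size=n)
        z = rng.binomial(1, 1.0 / (1.0 + np.exp(-x)))              # treatment assignment depends on x
        time = rng.exponential(1.0 / np.exp(0.5 * z + 0.3 * x))    # outcome depends on z and x
        event = rng.binomial(1, 0.8, size=n)                       # non-informative censoring flag
        df = pd.DataFrame({"x": x, "z": z, "time": time, "event": event})

        def iptw_log_hr(data):
            """IPTW estimate of the treatment log-hazard ratio (average treatment effect weights)."""
            ps = LogisticRegression().fit(data[["x"]], data["z"]).predict_proba(data[["x"]])[:, 1]
            w = np.where(data["z"] == 1, 1.0 / ps, 1.0 / (1.0 - ps))
            cox_df = data.assign(w=w)[["time", "event", "z", "w"]]
            cph = CoxPHFitter()
            cph.fit(cox_df, duration_col="time", event_col="event", weights_col="w", robust=True)
            return cph.params_["z"]

        # Bootstrap the entire procedure (propensity model plus weighted Cox fit) to estimate the SE.
        boot = [
            iptw_log_hr(df.sample(n=len(df), replace=True, random_state=b).reset_index(drop=True))
            for b in range(200)
        ]
        print("log HR:", round(iptw_log_hr(df), 3), "bootstrap SE:", round(float(np.std(boot, ddof=1)), 3))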

  4. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being χ² (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  5. Dietary intake and food sources of added sugar in the Australian population.

    PubMed

    Lei, Linggang; Rangan, Anna; Flood, Victoria M; Louie, Jimmy Chun Yu

    2016-03-14

    Previous studies in Australian children/adolescents and adults examining added sugar (AS) intake were based on now out-of-date national surveys. We aimed to examine the AS and free sugar (FS) intakes and the main food sources of AS among Australians, using plausible dietary data collected by a multiple-pass, 24-h recall, from the 2011-12 Australian Health Survey respondents (n 8202). AS and FS intakes were estimated using a previously published method, and as defined by the WHO, respectively. Food groups contributing to the AS intake were described and compared by age group and sex by one-way ANOVA. Linear regression was used to test for trends across age groups. Usual intake of FS (as percentage energy (%EFS)) was computed using a published method and compared with the WHO cut-off of <10%EFS. The mean AS intake of the participants was 60·3 (SD 52·6) g/d. Sugar-sweetened beverages accounted for the greatest proportion of the AS intake of the Australian population (21·4 (sd 30·1)%), followed by sugar and sweet spreads (16·3 (SD 24·5)%) and cakes, biscuits, pastries and batter-based products (15·7 (sd 24·4)%). More than half of the study population exceeded the WHO's cut-off for FS, especially children and adolescents. Overall, 80-90% of the daily AS intake came from high-sugar energy-dense and/or nutrient-poor foods. To conclude, the majority of Australian adults and children exceed the WHO recommendation for FS intake. Efforts to reduce AS intake should focus on energy-dense and/or nutrient-poor foods.

  6. Mixed Model Association with Family-Biased Case-Control Ascertainment.

    PubMed

    Hayeck, Tristan J; Loh, Po-Ru; Pollack, Samuela; Gusev, Alexander; Patterson, Nick; Zaitlen, Noah A; Price, Alkes L

    2017-01-05

    Mixed models have become the tool of choice for genetic association studies; however, standard mixed model methods may be poorly calibrated or underpowered under family sampling bias and/or case-control ascertainment. Previously, we introduced a liability threshold-based mixed model association statistic (LTMLM) to address case-control ascertainment in unrelated samples. Here, we consider family-biased case-control ascertainment, where case and control subjects are ascertained non-randomly with respect to family relatedness. Previous work has shown that this type of ascertainment can severely bias heritability estimates; we show here that it also impacts mixed model association statistics. We introduce a family-based association statistic (LT-Fam) that is robust to this problem. Similar to LTMLM, LT-Fam is computed from posterior mean liabilities (PML) under a liability threshold model; however, LT-Fam uses published narrow-sense heritability estimates to avoid the problem of biased heritability estimation, enabling correct calibration. In simulations with family-biased case-control ascertainment, LT-Fam was correctly calibrated (average χ² = 1.00-1.02 for null SNPs), whereas the Armitage trend test (ATT), standard mixed model association (MLM), and case-control retrospective association test (CARAT) were mis-calibrated (e.g., average χ² = 0.50-1.22 for MLM, 0.89-2.65 for CARAT). LT-Fam also attained higher power than other methods in some settings. In 1,259 type 2 diabetes-affected case subjects and 5,765 control subjects from the CARe cohort, downsampled to induce family-biased ascertainment, LT-Fam was correctly calibrated whereas ATT, MLM, and CARAT were again mis-calibrated. Our results highlight the importance of modeling family sampling bias in case-control datasets with related samples. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
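    The exact LT-Fam statistic is not reproduced in the abstract; the sketch below only illustrates, in schematic form, a score-type test that correlates posterior mean liabilities with genotype dosages and checks calibration of the resulting χ² statistics. Variable names and data are illustrative assumptions, not the authors' implementation.

```python
# Schematic sketch (not the authors' LT-Fam implementation): a 1-df score-type
# association test per SNP, chi2 = n * r^2 between posterior mean liabilities
# (PML) and genotype dosage. All data below are simulated stand-ins.
import numpy as np
from scipy import stats

def score_chi2(pml, genotypes):
    n, n_snps = genotypes.shape
    chi2 = np.empty(n_snps)
    for j in range(n_snps):
        r = np.corrcoef(pml, genotypes[:, j])[0, 1]
        chi2[j] = n * r ** 2
    return chi2, stats.chi2.sf(chi2, df=1)      # statistics and p-values

rng = np.random.default_rng(0)
pml = rng.normal(size=2000)                                    # stand-in PMLs
geno = rng.binomial(2, 0.3, size=(2000, 5000)).astype(float)   # null SNP dosages
chi2, pvals = score_chi2(pml, geno)
print(round(chi2.mean(), 3))                                   # ~1.0 when well calibrated
```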

  7. A boundary-representation method for designing whole-body radiation dosimetry models: pregnant females at the ends of three gestational periods—RPI-P3, -P6 and -P9

    NASA Astrophysics Data System (ADS)

    Xu, X. George; Taranenko, Valery; Zhang, Juying; Shi, Chengyu

    2007-12-01

    Fetuses are extremely radiosensitive and the protection of pregnant females against ionizing radiation is of particular interest in many health and medical physics applications. Existing models of pregnant females relied on simplified anatomical shapes or partial-body images of low resolutions. This paper reviews two general types of solid geometry modeling: constructive solid geometry (CSG) and boundary representation (BREP). It presents in detail a project to adopt the BREP modeling approach to systematically design whole-body radiation dosimetry models: a pregnant female and her fetus at the ends of three gestational periods of 3, 6 and 9 months. Based on previously published CT images of a 7-month pregnant female, the VIP-Man model and mesh organ models, this new set of pregnant female models was constructed using 3D surface modeling technologies instead of voxels. The organ masses were adjusted to agree with the reference data provided by the International Commission on Radiological Protection (ICRP) and previously published papers within 0.5%. The models were then voxelized for the purpose of performing dose calculations in identically implemented EGS4 and MCNPX Monte Carlo codes. The agreements of the fetal doses obtained from these two codes for this set of models were found to be within 2% for the majority of the external photon irradiation geometries of AP, PA, LAT, ROT and ISO at various energies. It is concluded that the so-called RPI-P3, RPI-P6 and RPI-P9 models have been reliably defined for Monte Carlo calculations. The paper also discusses the needs for future research and the possibility for the BREP method to become a major tool in the anatomical modeling for radiation dosimetry.

  8. Characteristics of small areas with high rates of hospital-treated self-harm: deprived, fragmented and urban or just close to hospital? A national registry study.

    PubMed

    O'Farrell, I B; Corcoran, P; Perry, I J

    2015-02-01

    Previous research has shown an inconsistent relationship between the spatial distribution of hospital treated self-harm and area-level factors such as deprivation and social fragmentation. However, many of these studies have been confined to urban centres, with few focusing on rural settings and even fewer studies carried out at a national level. Furthermore, no previous research has investigated if travel time to hospital services can explain the area-level variation in the incidence of hospital treated self-harm. From 2009 to 2011, the Irish National Registry of Deliberate Self Harm collected data on self-harm presentations to all hospital emergency departments in the country. The Registry uses standard methods of case ascertainment and also geocodes patient addresses to small area geographical level. Negative binomial regression was used to explore the ecological relationship between area-level self-harm rates and various area-level factors. Deprivation, social fragmentation and population density had a positive linear association with self-harm, with deprivation having the strongest independent effect. Furthermore, self-harm incidence was found to be elevated in areas that had shorter journey times to hospital. However, while this association became attenuated after controlling for other area-level factors it still remained statistically significant. A subgroup analysis examining the effect of travel time on specific methods of self-harm, found that this effect was most marked for self-harm acts involving minor self-cutting. Self-harm incidence was influenced by proximity to hospital services, population density and social fragmentation; however, the strongest area-level predictor of self-harm was deprivation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
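    As a hedged illustration of the analysis described (not the registry's actual code or variable names), the sketch below fits an area-level negative binomial regression of self-harm counts on deprivation, fragmentation, population density and travel time, with population as an exposure offset so that rates rather than raw counts are modelled. The data are simulated and the column names are assumptions.

```python
# Minimal sketch of an area-level negative binomial regression with an exposure offset;
# data are simulated and the dispersion parameter is fixed here for simplicity.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_areas = 500
df = pd.DataFrame({
    "deprivation": rng.normal(size=n_areas),
    "fragmentation": rng.normal(size=n_areas),
    "pop_density": rng.normal(size=n_areas),
    "travel_time_min": rng.normal(size=n_areas),
    "population": rng.integers(500, 5000, size=n_areas),
})
rate = np.exp(-6 + 0.4 * df["deprivation"] + 0.2 * df["fragmentation"])
df["self_harm_count"] = rng.poisson(rate * df["population"])

X = sm.add_constant(df[["deprivation", "fragmentation", "pop_density", "travel_time_min"]])
model = sm.GLM(df["self_harm_count"], X,
               family=sm.families.NegativeBinomial(alpha=1.0),
               offset=np.log(df["population"]))          # person-denominator offset -> rates
print(np.exp(model.fit().params))                        # incidence rate ratios
```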

  9. The Top 50 Most-Cited Articles on Acoustic Neuroma.

    PubMed

    Alfaifi, Abrar; AlMutairi, Othman; Allhaidan, Maha; Alsaleh, Saad; Ajlan, Abdulrazag

    2018-03-01

    Acoustic neuroma is the most common extra-axial primary cerebellopontine angle tumor in adults. A plethora of studies have been published on acoustic neuroma, but none of the previous works have highlighted the most influential articles. Our objective was to perform a bibliometric analysis of the 50 most-cited articles on acoustic neuroma. We performed a title-specific search on the Scopus database using the following search terms: "acoustic neuroma," "vestibular schwannoma," and "cerebellopontine angle." We recorded the 50 most-cited articles and reviewed them. The 50 most-cited articles had an average of 175 citations per article. All articles were published between 1980 and 2006, with 1997 the most prolific year, when 7 articles were published. The journals Neurosurgery and Laryngoscope published 10 and 8 of these articles, respectively. The most common study categories were nonsurgical management (17/50) and surgical management (13/50). Studies were predominantly published by otolaryngologists (22/50) and neurosurgeons (14/50). Douglas Kondziolka was the author with the highest number of contributions, with 7 publications. The majority of the articles were produced in the United States (64%). Identifying articles on acoustic neuroma with the most impact provides an important overview of the historical development of treatment methods and publication trends related to this condition. A finalized, comprehensive list of the most important works represents an excellent tool that can serve as a guide for evidence-based clinical practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Orbital Plotting of WDS 04545-0314 and WDS 04478+5318

    NASA Astrophysics Data System (ADS)

    Smith, Nick; Foster, Chris; Myers, Blake; Sepulveda, Barbel; Genet, Russell

    2016-01-01

    Students at Lincoln High School used the PlateSolve 3 program to obtain the position angle and separation of two double stars, WDS 04545-0314 and WDS 04478+5318. Both stars were observed at Kitt Peak on October 20, 2013. A java-based program developed by the team was used to plot the new data on the previously published orbital paths. It was determined that WDS 04545-0314 is maintaining the previously published orbital solution but that the orbit of WDS 04478+5318 may need to be revised.

  11. The forecasting of menstruation based on a state-space modeling of basal body temperature time series.

    PubMed

    Fukaya, Keiichi; Kawamori, Ai; Osada, Yutaka; Kitazawa, Masumi; Ishiguro, Makio

    2017-09-20

    Women's basal body temperature (BBT) shows a periodic pattern that is associated with the menstrual cycle. Although this suggests that daily BBT time series can be useful for estimating the underlying phase state as well as for predicting the length of the current menstrual cycle, little attention has been paid to modelling BBT time series. In this study, we propose a state-space model that involves the menstrual phase as a latent state variable to explain the daily fluctuation of BBT and the menstrual cycle length. Conditional distributions of the phase are obtained by using sequential Bayesian filtering techniques. A predictive distribution of the next menstruation day can be derived based on this conditional distribution and the model, leading to a novel statistical framework that provides a sequentially updated prediction of the upcoming menstruation day. We applied this framework to a real data set of women's BBT and menstruation days and compared the prediction accuracy of the proposed method with that of previous methods, showing that the proposed method generally provides a better prediction. Because BBT can be obtained with relatively small cost and effort, the proposed method can be useful for women's health management. Potential extensions of this framework as the basis of modelling and predicting events that are associated with the menstrual cycle are discussed. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
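    The authors' exact state-space formulation is not given in the abstract; as a hedged illustration of the general idea, the sketch below runs a simple particle filter over a latent cyclic phase, with an assumed observation model (raised BBT in the luteal half of the cycle) and assumed noise levels, and summarises the predictive distribution of the days remaining until the next menstruation onset.

```python
# Schematic particle filter over a latent menstrual phase (not the published model);
# the observation model, noise levels and 28-day mean cycle length are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 5000
MEAN_ADVANCE = 2 * np.pi / 28           # assumed mean cycle length of 28 days
PHASE_SD, OBS_SD = 0.08, 0.15           # assumed process / observation noise

def bbt_mean(phase, baseline=36.3, amplitude=0.4):
    # Assumed observation model: temperature is raised in the luteal half of the cycle.
    return baseline + amplitude * (np.sin(phase) < 0)

phases = rng.uniform(0, 2 * np.pi, N)               # initial particle cloud
for bbt in [36.31, 36.28, 36.33, 36.71, 36.74]:     # toy daily BBT readings (deg C)
    phases = (phases + MEAN_ADVANCE + rng.normal(0, PHASE_SD, N)) % (2 * np.pi)  # predict
    weights = np.exp(-0.5 * ((bbt - bbt_mean(phases)) / OBS_SD) ** 2)            # update
    phases = rng.choice(phases, size=N, p=weights / weights.sum())               # resample

days_left = ((2 * np.pi - phases) % (2 * np.pi)) / MEAN_ADVANCE
print(round(float(np.median(days_left)), 1), "days (median of the predictive distribution)")
```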

  12. An open-source method to analyze optokinetic reflex responses in larval zebrafish.

    PubMed

    Scheetz, Seth D; Shao, Enhua; Zhou, Yangzhong; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2018-01-01

    Optokinetic reflex (OKR) responses provide a convenient means to evaluate oculomotor, integrative and afferent visual function in larval zebrafish models, which are commonly used to elucidate molecular mechanisms underlying development, disease and repair of the vertebrate nervous system. We developed an open-source MATLAB-based solution for automated quantitative analysis of OKR responses in larval zebrafish. The package includes applications to: (i) generate sinusoidally-transformed animated grating patterns suitable for projection onto a cylindrical screen to elicit the OKR; (ii) determine and record the angular orientations of the eyes in each frame of a video recording showing the OKR response; and (iii) analyze angular orientation data from the tracking program to yield a set of parameters that quantify essential elements of the OKR. The method can be employed without modification using the operating manual provided. In addition, annotated source code is included, allowing users to modify or adapt the software for other applications. We validated the algorithms and measured OKR responses in normal larval zebrafish, showing good agreement with published quantitative data, where available. We provide the first open-source method to elicit and analyze the OKR in larval zebrafish. The wide range of parameters that are automatically quantified by our algorithms significantly expands the scope of quantitative analysis previously reported. Our method for quantifying OKR responses will be useful for numerous applications in neuroscience using the genetically- and chemically-tractable zebrafish model. Published by Elsevier B.V.
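    The published package is MATLAB-based; purely as an illustration of one metric such an analysis typically produces, the Python sketch below estimates slow-phase OKR gain (median eye velocity during slow phases divided by stimulus velocity) from an eye-orientation trace, using an assumed velocity threshold to discard resetting saccades.

```python
# Hedged sketch of one typical OKR metric (slow-phase gain); the saccade-rejection
# threshold and the toy sawtooth trace below are illustrative assumptions.
import numpy as np

def okr_gain(eye_angle_deg, fps, stimulus_velocity_deg_s, saccade_thresh_deg_s=50.0):
    velocity = np.gradient(eye_angle_deg) * fps            # deg/s from per-frame angles
    slow_phase = np.abs(velocity) < saccade_thresh_deg_s   # drop fast resetting saccades
    return np.median(velocity[slow_phase]) / stimulus_velocity_deg_s

t = np.arange(0, 10, 1 / 30)                 # 10 s recorded at 30 frames per second
angle = (10 * t) % 15 - 7.5                  # 10 deg/s slow drift with periodic resets
print(round(okr_gain(angle, fps=30, stimulus_velocity_deg_s=10.0), 2))   # ~1.0
```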

  13. A nanobiosensor composed of Exfoliated Graphene Oxide and Gold Nano-Urchins, for detection of GMO products.

    PubMed

    Aghili, Zahra; Nasirizadeh, Navid; Divsalar, Adeleh; Shoeibi, Shahram; Yaghmaei, Parichehreh

    2017-09-15

    Genetically modified organisms (GMOs) have entered our food chain, and detection of these organisms in market products remains a major challenge for scientists. Among the detection/quantification methods developed for these organisms, electrochemical nanobiosensors have received the most attention because they combine the advantages of nanomaterials, electrochemical methods and biosensors. In this research, a novel and sensitive electrochemical nanobiosensor for the detection/quantification of GMOs was developed using the nanomaterials Exfoliated Graphene Oxide and Gold Nano-Urchins to modify a screen-printed carbon electrode, together with a specific DNA probe and hematoxylin as the electrochemical indicator. The application times and concentrations of the components were optimized, and several established methods, e.g. field emission scanning electron microscopy, cyclic voltammetry and electrochemical impedance spectroscopy, were used to confirm the correct assembly of the nanobiosensor. The results showed that the linear range of the sensor was 40.0-1100.0 femtomolar, with a calculated limit of detection of 13.0 femtomolar. The biosensor also showed good selectivity for the target DNA over non-specific sequences, was cost- and time-effective, and could be applied to DNA extracted from real samples of GMO products. Taken together, these characteristics demonstrate the advantages of this nanobiosensor over previously published methods. Copyright © 2017. Published by Elsevier B.V.
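    The abstract reports a 40.0-1100.0 fM linear range and a 13.0 fM limit of detection; as a hedged illustration only, the sketch below applies a common calibration-curve convention (LOD ≈ 3.3·σ/slope). The paper may well use a different definition (e.g. 3·σ of the blank), and the numbers below are invented, not the published measurements.

```python
# Hedged sketch of an ICH-style calibration-curve LOD estimate (3.3 * sigma / slope);
# concentrations and signals below are illustrative, not the published data.
import numpy as np

conc_fM = np.array([40, 100, 300, 500, 800, 1100], dtype=float)   # target DNA, fM
signal = np.array([0.21, 0.48, 1.40, 2.31, 3.65, 5.02])           # e.g. peak current, uA

slope, intercept = np.polyfit(conc_fM, signal, 1)
residual_sd = np.std(signal - (slope * conc_fM + intercept), ddof=2)
lod_fM = 3.3 * residual_sd / slope
print(f"slope = {slope:.4f} uA/fM, LOD ~ {lod_fM:.1f} fM")
```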

  14. Optimal economic order quantity for buyer-distributor-vendor supply chain with backlogging derived without derivatives

    NASA Astrophysics Data System (ADS)

    Teng, Jinn-Tsair; Cárdenas-Barrón, Leopoldo Eduardo; Lou, Kuo-Ren; Wee, Hui Ming

    2013-05-01

    In this article, we first correct a mathematical error in the total cost of the previously published paper by Chung and Wee [2007, 'Optimal the Economic Lot Size of a Three-stage Supply Chain With Backlogging Derived Without Derivatives', European Journal of Operational Research, 183, 933-943] on a buyer-distributor-vendor three-stage supply chain with backlogging derived without derivatives. Then, an arithmetic-geometric inequality method is proposed not only to simplify the algebraic method of completing perfect squares, but also to remedy its shortcomings. In addition, we provide a closed-form solution for the integral number of deliveries for the distributor and the vendor without using complex derivatives. Furthermore, our method can solve many cases in which their method cannot, because they did not consider that the square root of a negative number does not exist. Finally, we use numerical examples to show that our proposed optimal solution is cheaper to operate than theirs.
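    The article's three-stage cost function is not reproduced in the abstract; as a simplified stand-in, the sketch below applies the same arithmetic-geometric inequality idea to the classic single-stage lot-size cost D·S/Q + h·Q/2, for which the inequality gives the minimiser Q* = sqrt(2DS/h) without any calculus.

```python
# Derivative-free minimisation of a simplified stand-in cost TC(Q) = D*S/Q + h*Q/2.
# By the arithmetic-geometric mean inequality, D*S/Q + h*Q/2 >= 2*sqrt(D*S*h/2),
# with equality when D*S/Q = h*Q/2, i.e. Q* = sqrt(2*D*S/h).
import math

D, S, h = 12000.0, 50.0, 3.0        # illustrative demand, ordering cost, holding cost

def total_cost(Q):
    return D * S / Q + h * Q / 2

q_star = math.sqrt(2 * D * S / h)
print(round(q_star, 2), round(total_cost(q_star), 2))
# Grid check that the AM-GM point really is the minimiser.
grid = [q / 10 for q in range(1, 20001)]
print(round(min(grid, key=total_cost), 1))
```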

  15. Forensic Applicability of Femur Subtrochanteric Shape to Ancestry Assessment in Thai and White American Males.

    PubMed

    Tallman, Sean D; Winburn, Allysha P

    2015-09-01

    Ancestry assessment from the postcranial skeleton presents a significant challenge to forensic anthropologists. However, metric dimensions of the femur subtrochanteric region are believed to distinguish between individuals of Asian and non-Asian descent. This study tests the discriminatory power of subtrochanteric shape using modern samples of 128 Thai and 77 White American males. Results indicate that the samples' platymeric index distributions are significantly different (p≤0.001), with the Thai platymeric index range generally lower and the White American range generally higher. While the application of ancestry assessment methods developed from Native American subtrochanteric data results in low correct classification rates for the Thai sample (50.8-57.8%), adapting these methods to the current samples leads to better classification. The Thai data may be more useful in forensic analysis than previously published subtrochanteric data derived from Native American samples. Adapting methods to include appropriate geographic and contemporaneous populations increases the accuracy of femur subtrochanteric ancestry methods. © 2015 American Academy of Forensic Sciences.
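    As a hedged illustration, the sketch below computes the platymeric index in its conventional form, 100 × anteroposterior / mediolateral subtrochanteric diameter, and applies a single sectioning point; the cutoff shown is purely illustrative and is not the population-specific value derived in the study.

```python
# Conventional platymeric index with an assumed, illustrative sectioning point
# (the study's adapted, sample-specific cutoffs are not reproduced here).
def platymeric_index(ap_diameter_mm: float, ml_diameter_mm: float) -> float:
    return 100.0 * ap_diameter_mm / ml_diameter_mm

def classify(index: float, sectioning_point: float = 84.0) -> str:
    # Lower (more platymeric) indices lean toward the Thai sample in this sketch,
    # higher indices toward the White American sample; 84.0 is an assumption.
    return "Thai-like (platymeric)" if index < sectioning_point else "White American-like"

idx = platymeric_index(24.0, 31.0)
print(round(idx, 1), classify(idx))
```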

  16. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services.

    PubMed

    Aarons, Gregory A; Fettes, Danielle L; Sommerfeld, David H; Palinkas, Lawrence A

    2012-02-01

    Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This article describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. The authors integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research.

  17. Mixed Methods for Implementation Research: Application to Evidence-Based Practice Implementation and Staff Turnover in Community Based Organizations Providing Child Welfare Services

    PubMed Central

    Aarons, Gregory A.; Fettes, Danielle L.; Sommerfeld, David H.; Palinkas, Lawrence

    2013-01-01

    Many public sector services systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well-suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This paper describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. We integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research. PMID:22146861

  18. Standardisation of the (129)I, (151)Sm and (166m)Ho activity concentration using the CIEMAT/NIST efficiency tracing method.

    PubMed

    Altzitzoglou, Timotheos; Rožkov, Andrej

    2016-03-01

    The (129)I, (151)Sm and (166m)Ho standardisations using the CIEMAT/NIST efficiency tracing method, which have been carried out in the frame of the European Metrology Research Program project "Metrology for Radioactive Waste Management", are described. The radionuclide beta counting efficiencies were calculated using two computer codes, CN2005 and MICELLE2. A sensitivity analysis of the effect of the code input parameters (ionization quenching factor, beta shape factor) on the calculated efficiencies was performed, and the results are discussed. The combined relative standard uncertainties of the standardisations of the (129)I, (151)Sm and (166m)Ho solutions were 0.4%, 0.5% and 0.4%, respectively. The stated precision obtained using the CIEMAT/NIST method is better than that previously reported in the literature obtained by the TDCR ((129)I), the 4πγ-NaI ((166m)Ho) counting or the CIEMAT/NIST method ((151)Sm). Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Iron oxide nanoparticle-based magnetic resonance method to monitor release kinetics from polymeric particles with high resolution.

    PubMed

    Chan, Minnie; Schopf, Eric; Sankaranarayanan, Jagadis; Almutairi, Adah

    2012-09-18

    A new method to precisely monitor rapid release kinetics from polymeric particles using superparamagnetic iron oxide nanoparticles, specifically by measuring spin-spin relaxation time (T(2)), is reported. We previously published the formulation of logic gate particles from an acid-sensitive poly-β-aminoester ketal-2 polymer. Here, a series of poly-β-aminoester ketal-2 polymers with varying hydrophobicities were synthesized and used to formulate particles. We attempted to measure fluorescence of released Nile red to determine whether the structural adjustments could finely tune the release kinetics in the range of minutes to hours; however, this standard technique did not differentiate the release rates within our series. Thus, a new method based on encapsulation of iron oxide nanoparticles was developed, which enabled us to resolve the release kinetics of our particles. Moreover, the kinetics matched the relative hydrophobicity order determined by octanol-water partition coefficients. To the best of our knowledge, this method provides the highest resolution of release kinetics to date.

  20. A sequential bioequivalence design with a potential ethical advantage.

    PubMed

    Fuglsang, Anders

    2014-07-01

    This paper introduces a two-stage approach for evaluation of bioequivalence, where, in contrast to the designs of Diane Potvin and co-workers, two stages are mandatory regardless of the data obtained at stage 1. The approach is derived from Potvin's method C. It is shown that under circumstances with relatively high variability and relatively low initial sample size, this method has an advantage over Potvin's approaches in terms of sample sizes while controlling type I error rates at or below 5% with a minute occasional trade-off in power. Ethically and economically, the method may thus be an attractive alternative to the Potvin designs. It is also shown that when using the method introduced here, average total sample sizes are rather independent of initial sample size. Finally, it is shown that when a futility rule in terms of sample size for stage 2 is incorporated into this method, i.e., when a second stage can be abolished due to sample size considerations, there is often an advantage in terms of power or sample size as compared to the previously published methods.
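    The full decision rules of the design are not given in the abstract; the sketch below shows only the TOST building block that such two-stage approaches apply to the pooled data, i.e. two one-sided t-tests on within-subject log(Test/Reference) ratios at an adjusted alpha. The 0.0294 level is the Pocock-style adjustment used in Potvin-type designs, and the data and paired-ratio simplification are assumptions.

```python
# Hedged sketch of the TOST step only (not the paper's full sequential algorithm):
# bioequivalence is declared if both one-sided tests reject at the adjusted alpha.
import numpy as np
from scipy import stats

def tost_be(log_ratios, alpha=0.0294, limits=(np.log(0.80), np.log(1.25))):
    n = len(log_ratios)
    m = np.mean(log_ratios)
    se = np.std(log_ratios, ddof=1) / np.sqrt(n)
    p_lower = stats.t.sf((m - limits[0]) / se, df=n - 1)   # H0: true ratio <= 0.80
    p_upper = stats.t.cdf((m - limits[1]) / se, df=n - 1)  # H0: true ratio >= 1.25
    return max(p_lower, p_upper) < alpha

rng = np.random.default_rng(2)
stage1 = rng.normal(loc=0.02, scale=0.25, size=12)     # stage-1 within-subject log-ratios
stage2 = rng.normal(loc=0.02, scale=0.25, size=24)     # mandatory stage-2 subjects
print(tost_be(np.concatenate([stage1, stage2])))       # pooled two-stage decision
```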

  1. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.
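    As a hedged, miniature illustration of the estimation step being compared (not the authors' code), the sketch below recovers diet proportions by minimising the Aitchison distance between a predator's calibration-adjusted fatty acid signature and a mixture of prey signatures; swapping in a Kullback-Leibler distance would give one of the alternative variants. The signatures and dimensions are invented.

```python
# Minimal QFASA-style sketch: estimate diet proportions p (non-negative, summing to 1)
# that minimise the Aitchison distance between the predator signature and the prey mix.
import numpy as np
from scipy.optimize import minimize

def aitchison(x, y, eps=1e-9):
    x, y = x + eps, y + eps
    clr_x = np.log(x) - np.mean(np.log(x))     # centred log-ratio transforms
    clr_y = np.log(y) - np.mean(np.log(y))
    return np.sqrt(np.sum((clr_x - clr_y) ** 2))

prey_sigs = np.array([[0.30, 0.40, 0.20, 0.10],     # prey type A (each signature sums to 1)
                      [0.10, 0.20, 0.30, 0.40],     # prey type B
                      [0.25, 0.25, 0.25, 0.25]])    # prey type C
predator_sig = np.array([0.18, 0.27, 0.27, 0.28])   # calibration-adjusted predator signature

def objective(p):
    mix = p @ prey_sigs
    return aitchison(predator_sig, mix / mix.sum())

res = minimize(objective, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1})
print(np.round(res.x, 3))     # estimated diet proportions for prey A, B, C
```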

  2. Functional classification of protein structures by local structure matching in graph representation.

    PubMed

    Mills, Caitlyn L; Garg, Rohan; Lee, Joslynn S; Tian, Liang; Suciu, Alexandru; Cooperman, Gene; Beuning, Penny J; Ondrechen, Mary Jo

    2018-03-31

    As a result of high-throughput protein structure initiatives, over 14,400 protein structures have been solved by structural genomics (SG) centers and participating research groups. While the totality of SG data represents a tremendous contribution to genomics and structural biology, reliable functional information for these proteins is generally lacking. Better functional predictions for SG proteins will add substantial value to the structural information already obtained. Our method described herein, Graph Representation of Active Sites for Prediction of Function (GRASP-Func), predicts quickly and accurately the biochemical function of proteins by representing residues at the predicted local active site as graphs rather than in Cartesian coordinates. We compare the GRASP-Func method to our previously reported method, structurally aligned local sites of activity (SALSA), using the ribulose phosphate binding barrel (RPBB), 6-hairpin glycosidase (6-HG), and Concanavalin A-like Lectins/Glucanase (CAL/G) superfamilies as test cases. In each of the superfamilies, SALSA and the much faster method GRASP-Func yield similar correct classification of previously characterized proteins, providing a validated benchmark for the new method. In addition, we analyzed SG proteins using our SALSA and GRASP-Func methods to predict function. Forty-one SG proteins in the RPBB superfamily, nine SG proteins in the 6-HG superfamily, and one SG protein in the CAL/G superfamily were successfully classified into one of the functional families in their respective superfamily by both methods. This improved, faster, validated computational method can yield more reliable predictions of function that can be used for a wide variety of applications by the community. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  3. Self-management programmes for people living with chronic obstructive pulmonary disease: a call for a reconceptualisation.

    PubMed

    Jonsdottir, Helga

    2013-03-01

    To synthesise findings from previously published studies on the effectiveness of self-management programmes for people with chronic obstructive pulmonary disease. Self-management is a widely valued concept to address contemporary issues of chronic health problems. Yet, findings of self-management programmes for people with chronic obstructive pulmonary disease are indecisive. Literature review of (1) previously published systematic reviews and (2) an integrative literature review. Synthesis of findings from previously published systematic reviews (n = 4) of the effectiveness of self-management programmes for people with chronic obstructive pulmonary disease and an integrated review that was performed on papers published between January 2007-June 2012 (n = 9). Findings demonstrate that there are few studies on the effectiveness of self-management programmes on people with chronic obstructive pulmonary disease despite more than a decade of research activities. Outcomes of the studies reveal some increase in health-related quality of life and reduction in use of healthcare resources. The methodological approaches vary, and the sample size is primarily small. Families are not acknowledged. Features of patient-centredness exist in self-management programmes, particularly in the more recent articles. The effectiveness of self-management programmes for people with chronic obstructive pulmonary disease remains indecisive. A reconceptualisation of self-management programmes is called for with attention to a family-centred, holistic and relational care focusing on living with and minimising the handicapping consequences of the health problems in their entirety. © 2013 Blackwell Publishing Ltd.

  4. Electromagnetic stirring in a microbioreactor with non-conventional chamber morphology and implementation of multiplexed mixing.

    PubMed

    Tan, Christabel Kl; Davies, Matthew J; McCluskey, Daniel K; Munro, Ian R; Nweke, Mauryn C; Tracey, Mark C; Szita, Nicolas

    2015-10-01

    Microbioreactors have emerged as novel tools for early bioprocess development. Mixing lies at the heart of bioreactor operation (at all scales). The successful implementation of micro-stirring methods is thus central to the further advancement of microbioreactor technology. The aim of this study was to develop a micro-stirring method that aids robust microbioreactor operation and facilitates cost-effective parallelization. A microbioreactor was developed with a novel micro-stirring method involving the movement of a magnetic bead by sequenced activation of a ring of electromagnets. The micro-stirring method offers flexibility in chamber designs, and mixing is demonstrated in cylindrical, diamond and triangular shaped reactor chambers. Mixing was analyzed for different electromagnet on/off sequences; mixing times of 4.5 s, 2.9 s, and 2.5 s were achieved for cylindrical, diamond and triangular shaped chambers, respectively. Ease of micro-bubble free priming, a typical challenge of cylindrical shaped microbioreactor chambers, was obtained with a diamond-shaped chamber. Consistent mixing behavior was observed between the constituent reactors in a duplex system. A novel stirring method using electromagnetic actuation offering rapid mixing and easy integration with microbioreactors was characterized. The design flexibility gained enables fabrication of chambers suitable for microfluidic operation, and a duplex demonstrator highlights potential for cost-effective parallelization. Combined with a previously published cassette-like fabrication of microbioreactors, these advances will facilitate the development of robust and parallelized microbioreactors. © 2015 The Authors. Journal of Chemical Technology & Biotechnology published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.

  5. Randomized controlled trials in pediatric complementary and alternative medicine: Where can they be found?

    PubMed Central

    Sampson, Margaret; Campbell, Kaitryn; Ajiferuke, Isola; Moher, David

    2003-01-01

    Background The safety and effectiveness of CAM interventions are of great relevance to pediatric health care providers. The objective of this study is to identify sources of reported randomized controlled trials (RCTs) in the field of pediatric complementary and alternative medicine (CAM). Methods Reports of RCTs were identified by searching Medline and 12 additional bibliographic databases and by reviewing the reference lists of previously identified pediatric CAM systematic reviews. Results We identified 908 reports of RCTs that included children under 18 and investigated a CAM therapy. Since 1965, there has been a steady growth in the number of these trials that are being published. The four journals that published the most reported RCTs are The American Journal of Clinical Nutrition, Pediatrics, Journal of Pediatrics, and Lancet. Medline, CAB Health, and Embase were the best database sources for identifying these studies; they indexed 93.2%, 58.4% and 42.2 % respectively of the journals publishing reports of pediatric CAM RCTs. Conclusions Those working or interested in the field of pediatric CAM should routinely search Medline, CAB Health and Embase for literature in the field. The four core journals identified above should be included in their collection. PMID:12589711

  6. Optimization of Direct Fibroblast Reprogramming to Cardiomyocytes Using Calcium Activity as a Functional Measure of Success

    PubMed Central

    Addis, Russell C.; Ifkovits, Jamie L.; Pinto, Filipa; Kellam, Lori D.; Esteso, Paul; Rentschler, Stacey; Christoforou, Nicolas; Epstein, Jonathan A.; Gearhart, John D.

    2013-01-01

    Direct conversion of fibroblasts to induced cardiomyocytes (iCMs) has great potential for regenerative medicine. Recent publications have reported significant progress, but the evaluation of reprogramming has relied upon non-functional measures such as flow cytometry for cardiomyocyte markers or GFP expression driven by a cardiomyocyte-specific promoter. The issue is one of practicality: the most stringent measures - electrophysiology to detect cell excitation and the presence of spontaneously contracting myocytes - are not readily quantifiable in the large numbers of cells screened in reprogramming experiments. However, excitation and contraction are linked by a third functional characteristic of cardiomyocytes: the rhythmic oscillation of intracellular calcium levels. We set out to optimize direct conversion of fibroblasts to iCMs with a quantifiable calcium reporter to rapidly assess functional transdifferentiation. We constructed a reporter system in which the calcium indicator GCaMP is driven by the cardiomyocyte-specific Troponin T promoter. Using calcium activity as our primary outcome measure, we compared several published combinations of transcription factors along with novel combinations in mouse embryonic fibroblasts. The most effective combination consisted of Hand2, Nkx2.5, Gata4, Mef2c, and Tbx5 (HNGMT). This combination is >50-fold more efficient than GMT alone and produces iCMs with cardiomyocyte marker expression, robust calcium oscillation, and spontaneous beating that persists for weeks following inactivation of reprogramming factors. HNGMT is also significantly more effective than previously published factor combinations for the transdifferentiation of adult mouse cardiac fibroblasts to iCMs. Quantification of calcium function is a convenient and effective means for the identification and evaluation of cardiomyocytes generated by direct reprogramming. Using this stringent outcome measure, we conclude that HNGMT produces iCMs more efficiently than previously published methods. PMID:23591016
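    The authors score reprogramming by GCaMP calcium activity; as a hedged illustration of how rhythmic calcium transients can be flagged in a fluorescence trace (not their actual pipeline), the sketch below uses a simple peak-finding criterion with assumed thresholds.

```python
# Hedged sketch: call a GCaMP deltaF/F trace "oscillating" if it contains several
# prominent, well-separated calcium transients; all thresholds are assumptions.
import numpy as np
from scipy.signal import find_peaks

def is_oscillating(dff_trace, fps=5.0, min_peaks=3, prominence=0.5):
    peaks, _ = find_peaks(dff_trace, prominence=prominence, distance=int(fps))  # >=1 s apart
    return len(peaks) >= min_peaks

t = np.arange(0, 60, 0.2)                                    # 60 s sampled at 5 Hz
beating_cell = 1.0 * (np.sin(2 * np.pi * 0.5 * t) > 0.95)    # periodic transients
quiet_cell = 0.05 * np.random.default_rng(0).normal(size=t.size)
print(is_oscillating(beating_cell), is_oscillating(quiet_cell))   # True False
```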

  7. Evaluation of Occupational and Environmental Factors in the Assessment of Chronic Cough in Adults: A Systematic Review.

    PubMed

    Tarlo, Susan M; Altman, Kenneth W; French, Cynthia T; Diekemper, Rebecca L; Irwin, Richard S

    2016-01-01

    Several recent cough guidelines have advised consideration of occupational or environmental causes for chronic cough, but it is unclear how frequently this recommendation has been routinely applied. Therefore, we undertook a systematic review to address this aspect. Cough guidelines and protocols were reviewed to identify recommendations for assessment of occupational and environmental aspects of chronic cough. The systematic search previously used to identify intervention fidelity to the use of protocols for diagnosis and management of chronic cough in adults was used for this review after extension to June 2015. PubMed, Scopus, and the Cochrane Library were searched using the same search terms and inclusion criteria as previously. Papers that met our criteria were then reviewed to identify methods used to assess occupational and environmental aspects of chronic cough and the outcomes of these assessments. Among the 10 general chronic cough guidelines and protocols identified, only the three published since 2006 included details advising detailed occupational and environmental assessments. One additional cough statement focused entirely on occupational cough. Of the 28 cohort studies of patients with chronic cough that specifically noted that they followed guidelines or protocols, none provided details of occupational and environmental assessments. Despite published recommendations, it is not apparent that occupational and environmental causes for chronic cough are addressed in detail during assessments of patients with chronic cough. This leaves open to speculation whether lack of recognition of an occupational cause may delay important preventive measures, put additional workers at risk, and/or be the reason why a chronic cough may remain unexplained. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  8. Profilometry of three-dimensional discontinuous solids by combining two-steps temporal phase unwrapping, co-phased profilometry and phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Servin, Manuel; Padilla, Moises; Garnica, Guillermo; Gonzalez, Adonai

    2016-12-01

    In this work we review and combine two techniques that have recently been published for three-dimensional (3D) fringe projection profilometry and phase unwrapping, namely co-phased profilometry and two-step temporal phase unwrapping. By combining these two methods we obtain a more accurate, higher signal-to-noise 3D profilometer for discontinuous industrial objects. In single-camera single-projector (standard) profilometry, the camera and the projector must form an angle between them. The phase sensitivity of the profilometer depends on this angle, so it cannot be avoided. This angle produces regions with self-occluding shadows and glare from the solid as viewed from the camera's perspective, making demodulation of the fringe pattern there impossible. In other words, the phase data are undefined in those shadow regions. As published recently, this limitation can be solved by using several co-phased fringe projectors and a single camera. These co-phased projectors are positioned at different directions towards the object, and as a consequence most shadows are compensated. In addition to this, most industrial objects are highly discontinuous, which precludes the use of spatial phase unwrappers. One way to avoid spatial unwrapping is to decrease the phase sensitivity to a point where the demodulated phase is bounded within one lambda, so the need for phase unwrapping disappears. By doing this, however, the recovered non-wrapped phase contains too much harmonic distortion and noise. Using our recently proposed two-step temporal phase-unwrapping technique, the high-sensitivity phase is unwrapped using the low-frequency one as an initial gross estimation. This two-step unwrapping technique resolves the 3D object discontinuities while keeping the accuracy of the high-frequency profilometry data. In scientific research, new methods are derived as a logical and consistent result of previous efforts in the same direction. Here we present a new 3D profilometer combining these two recently published methods: co-phased profilometry and two-step temporal phase unwrapping. By doing this, we obtain a new and more powerful 3D profilometry technique which overcomes the two main limitations of previous fringe-projection profilometers, namely high phase-sensitivity digitization of discontinuous objects and minimization of the solid's self-generated shadows. This new 3D profilometer is demonstrated by an experiment digitizing a discontinuous 3D industrial solid, where the advantages of this new profilometer with respect to previous art are clearly shown.
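    As a hedged sketch of the two-step temporal unwrapping relation described above (with synthetic data standing in for real fringe measurements), the low-sensitivity, non-wrapped phase is scaled by the fringe-frequency ratio to supply the integer fringe order that unwraps the wrapped high-sensitivity phase, even across object discontinuities.

```python
# Two-step temporal phase unwrapping sketch: the scaled low-frequency phase gives the
# fringe order k, and phi_high = phi_high_wrapped + 2*pi*k. Data below are synthetic.
import numpy as np

def two_step_unwrap(phi_low, phi_high_wrapped, freq_ratio):
    fringe_order = np.round((freq_ratio * phi_low - phi_high_wrapped) / (2 * np.pi))
    return phi_high_wrapped + 2 * np.pi * fringe_order

true_phase = np.concatenate([np.linspace(0, 30, 200),      # discontinuous "step" object
                             np.linspace(60, 90, 200)])
ratio = 20                                                  # high/low fringe-frequency ratio
phi_low = true_phase / ratio                                # stays within one fringe (no wrap)
phi_high_wrapped = np.angle(np.exp(1j * true_phase))        # wrapped high-sensitivity phase
recovered = two_step_unwrap(phi_low, phi_high_wrapped, ratio)
print(bool(np.allclose(recovered, true_phase)))             # True
```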

  9. BinSanity: unsupervised clustering of environmental microbial assemblies using coverage and affinity propagation

    PubMed Central

    Heidelberg, John F.; Tully, Benjamin J.

    2017-01-01

    Metagenomics has become an integral part of defining microbial diversity in various environments. Many ecosystems have characteristically low biomass and few cultured representatives. Linking potential metabolisms to phylogeny in environmental microorganisms is important for interpreting microbial community functions and the impacts these communities have on geochemical cycles. However, with metagenomic studies there is the computational hurdle of ‘binning’ contigs into phylogenetically related units or putative genomes. Binning methods have been implemented with varying approaches such as k-means clustering, Gaussian mixture models, hierarchical clustering, neural networks, and two-way clustering; however, many of these suffer from biases against low coverage/abundance organisms and closely related taxa/strains. We are introducing a new binning method, BinSanity, that utilizes the clustering algorithm affinity propagation (AP), to cluster assemblies using coverage with compositional based refinement (tetranucleotide frequency and percent GC content) to optimize bins containing multiple source organisms. This separation of composition and coverage based clustering reduces bias for closely related taxa. BinSanity was developed and tested on artificial metagenomes varying in size and complexity. Results indicate that BinSanity has a higher precision, recall, and Adjusted Rand Index compared to five commonly implemented methods. When tested on a previously published environmental metagenome, BinSanity generated high completion and low redundancy bins corresponding with the published metagenome-assembled genomes. PMID:28289564
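    As a hedged sketch of the coverage-based clustering step only (composition-based refinement omitted, with toy data rather than real contigs), the example below clusters contigs by their per-sample coverage profiles using scikit-learn's affinity propagation implementation.

```python
# Coverage-only affinity propagation sketch (not the BinSanity code); the toy coverage
# matrix is drawn from three simulated "source genomes". Real workflows transform and
# scale coverage and tune the AP preference value, which is left at its default here.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(3)
profiles = np.vstack([rng.normal(mu, 1.0, size=(20, 4))      # 60 contigs x 4 samples
                      for mu in ([5, 30, 10, 2], [40, 5, 5, 20], [15, 15, 35, 8])])

ap = AffinityPropagation(damping=0.9, random_state=0)
bins = ap.fit_predict(profiles)
print(len(set(bins)), "putative genome bins")                 # expected: about 3
```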

  10. Is "end of life" a special case? Connecting Q with survey methods to measure societal support for views on the value of life-extending treatments.

    PubMed

    Mason, Helen; Collins, Marissa; McHugh, Neil; Godwin, Jon; Van Exel, Job; Donaldson, Cam; Baker, Rachel

    2018-05-01

    Preference elicitation studies reporting societal views on the relative value of end-of-life treatments have produced equivocal results. This paper presents an alternative method, combining Q methodology and survey techniques (Q2S), to determine the distribution of 3 viewpoints on the relative value of end-of-life treatments identified in a previous, published, phase of this work. These were Viewpoint 1, "A population perspective: value for money, no special cases"; Viewpoint 2, "Life is precious: valuing life-extension and patient choice"; and Viewpoint 3, "Valuing wider benefits and opportunity cost: the quality of life and death." A Q2S survey of 4,902 respondents across the United Kingdom measured agreement with these viewpoints; 37% most agreed with Viewpoint 1, 49% with Viewpoint 2, and 9% with Viewpoint 3. Regression analysis showed associations of viewpoints with gender, level of education, religion, voting preferences, and satisfaction with the NHS. The Q2S approach provides a promising means to investigate how in-depth views and opinions are represented in the wider population. As demonstrated in this study, there is often more than 1 viewpoint on a topic, and methods that seek to estimate an average view may not provide the best guidance for societal decision-making. © 2018 The Authors. Health Economics Published by John Wiley & Sons Ltd.

  11. SANSparallel: interactive homology search against Uniprot.

    PubMed

    Somervuo, Panu; Holm, Liisa

    2015-07-01

    Proteins evolve by mutations and natural selection. The network of sequence similarities is a rich source for mining homologous relationships that inform on protein structure and function. There are many servers available to browse the network of homology relationships but one has to wait up to a minute for results. The SANSparallel webserver provides protein sequence database searches with immediate response and professional alignment visualization by third-party software. The output is a list, pairwise alignment or stacked alignment of sequence-similar proteins from Uniprot, UniRef90/50, Swissprot or Protein Data Bank. The stacked alignments are viewed in Jalview or as sequence logos. The database search uses the suffix array neighborhood search (SANS) method, which has been re-implemented as a client-server, improved and parallelized. The method is extremely fast and as sensitive as BLAST above 50% sequence identity. Benchmarks show that the method is highly competitive compared to previously published fast database search programs: UBLAST, DIAMOND, LAST, LAMBDA, RAPSEARCH2 and BLAT. The web server can be accessed interactively or programmatically at http://ekhidna2.biocenter.helsinki.fi/cgi-bin/sans/sans.cgi. It can be used to make protein functional annotation pipelines more efficient, and it is useful in interactive exploration of the detailed evidence supporting the annotation of particular proteins of interest. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. TU-FG-209-08: Distribution of the Deviation Index (DI) in Digital Radiography Practices Across the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A; Shepard, S; Dave, J

    Purpose: To characterize the distribution of the deviation index (DI) in digital radiography practices across the United States. Methods: DI data was obtained from 10 collaborating institutions in the United States between 2012 and 2015. Each institution complied with the requirements of the Institutional Review Board at their site. DI data from radiographs of the body parts chest, abdomen, pelvis and extremity were analyzed for anteroposterior, posteroanterior, lateral, and decubitus views. The DI data was analyzed both in aggregate and stratified by exposure control method, image receptor technology, patient age, and participating site for each body part and view. The number of exposures with DI falling within previously published control limits for DI and descriptive statistics were calculated. Results: DI data from 505,930 radiographic exposures was analyzed. The number of exposures with DI falling within published control limits for DI varied from 10 to 20% for adult patients and 10 to 23% for pediatric patients for different body parts and views. Mean DI values averaged over other parameters for radiographs of the abdomen, chest, pelvis, and extremities ranged from 0.3 to 1.0, −0.6 to 0.5, 0.8, and −0.9 to 0.5 for the different adult views and ranged from −1.6 to −0.1, −0.3 to 0.5, −0.1, −0.2 to 1.4 for the different pediatric views, respectively (DI data was solicited only for anteroposterior view of pelvis). Standard deviation values of DI from individual sites ranged from 1.3 to 3.6 and 1.3 to 3.0 for the different adult and pediatric views, respectively. Also of interest was that target exposure indicators varied by up to a factor of 6 between sites for certain body parts and views. Conclusion: Previously published DI control limits do not reflect the state of clinical practice in digital radiography. Mean DI and target exposure indicators are targets for quality improvement efforts in radiography.
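    For context, the deviation index in digital radiography is defined (IEC 62494-1) as DI = 10·log10(EI/EI_T), where EI is the exposure index of an image and EI_T the target exposure index. The sketch below computes DI for a few hypothetical exposures and flags values outside a ±1 band; that band is illustrative and is not the set of published control limits referred to above.

```python
# DI = 10 * log10(EI / EI_T) per IEC 62494-1; the +/-1 control band and the
# (EI, EI_T) pairs below are illustrative assumptions.
import math

def deviation_index(ei: float, ei_target: float) -> float:
    return 10.0 * math.log10(ei / ei_target)

for ei, ei_t in [(250, 300), (300, 300), (480, 300), (150, 300)]:
    di = deviation_index(ei, ei_t)
    status = "within band" if abs(di) <= 1.0 else "outside band"
    print(f"EI={ei:4d}  DI={di:+.2f}  {status}")
```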

  13. Over-reassurance and undersupport after a 'false alarm': a systematic review of the impact on subsequent cancer symptom attribution and help seeking.

    PubMed

    Renzi, Cristina; Whitaker, Katriina L; Wardle, Jane

    2015-02-04

    This literature review examined research into the impact of a previous 'all-clear' or non-cancer diagnosis following symptomatic presentation ('false alarm') on symptom attribution and delays in help seeking for subsequent possible cancer symptoms. The comprehensive literature review included original research based on quantitative, qualitative and mixed data collection methods. We used a combination of search strategies, including in-depth searches of electronic databases (PubMed, EMBASE, PsychInfo), searching key authors and articles listed as 'related' in PubMed, and reference lists. We performed a narrative synthesis of key themes shared across studies. The review included studies published after 1990 and before February 2014 reporting information on adult patients having experienced a false alarm following symptomatic presentation. We excluded false alarms in the context of screening. We evaluated the effect of a 'false alarm' on symptom attribution and help seeking for new or recurrent possible cancer symptoms. Overall, 1442 papers were screened and 121 retrieved for full-text evaluation. Among them, 19 reported on false alarms and subsequent symptom attribution or help seeking. They used qualitative (n=14), quantitative (n=3) and mixed methods (n=2). Breast (n=7), gynaecological (n=3), colorectal (n=2), testicular (n=2), and head and neck cancers (n=2) were the most studied. Two broad themes emerged underlying delays in help seeking: (1) over-reassurance from the previous 'all-clear' diagnosis leading to subsequent symptoms being interpreted as benign, and (2) unsupportive healthcare experiences in which symptoms were dismissed, leaving patients concerned about appearing hypochondriacal or uncertain about the appropriate next actions. The evidence suggested that the effect of a false alarm can persist for months and even years. In conclusion, over-reassurance and undersupport of patients after a false alarm can undermine help seeking in the case of new or recurrent potential cancer symptoms, highlighting the need for appropriate patient information when investigations rule out cancer. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  14. Past speculations of future health technologies: a description of technologies predicted in 15 forecasting studies published between 1986 and 2010.

    PubMed

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2017-07-31

    To describe and classify health technologies predicted in forecasting studies. A portrait describing health technologies predicted in 15 forecasting studies published between 1986 and 2010 that were identified in a previous systematic review. Health technologies are classified according to their type, purpose and clinical use, relating these to the original purpose and timing of the forecasting studies. All health-related technologies predicted in 15 forecasting studies identified in a previously published systematic review. Outcomes related to (1) each forecasting study, including country, year, intention and forecasting methods used, and (2) the predicted technologies, including technology type, purpose, targeted clinical area and forecast timeframe. Of the 896 identified health-related technologies, 685 (76.5%) were health technologies with an explicit or implied health application and were included in our study. Of these, 19.1% were diagnostic or imaging tests, 14.3% devices or biomaterials, 12.6% information technology systems, eHealth or mHealth and 12% drugs. The majority of the technologies were intended to treat or manage disease (38.1%) or diagnose or monitor disease (26.1%). The most frequent targeted clinical areas were infectious diseases, followed by cancer, circulatory and nervous system disorders. The most frequent technology types were, for infectious diseases, prophylactic vaccines (45.8%); for cancer, drugs (40%); for circulatory disease, devices and biomaterials (26.3%); and for diseases of the nervous system, equally devices and biomaterials (25%) and regenerative medicine (25%). The mean timeframe for forecasting was 11.6 years (range 0-33 years, median=10, SD=6.6). The forecasting timeframe significantly differed by technology type (p=0.002), the intent of the forecasting group (p<0.001) and the methods used (p<0.001). While description and classification of predicted health-related technologies is crucial in preparing healthcare systems for adopting new innovations, further work is needed to test the accuracy of predictions made. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Highly polygenic architecture of antidepressant treatment response: Comparative analysis of SSRI and NRI treatment in an animal model of depression.

    PubMed

    Malki, Karim; Tosto, Maria Grazia; Mouriño-Talín, Héctor; Rodríguez-Lorenzo, Sabela; Pain, Oliver; Jumhaboy, Irfan; Liu, Tina; Parpas, Panos; Newman, Stuart; Malykh, Artem; Carboni, Lucia; Uher, Rudolf; McGuffin, Peter; Schalkwyk, Leonard C; Bryson, Kevin; Herbster, Mark

    2017-04-01

    Response to antidepressant (AD) treatment may be a more polygenic trait than previously hypothesized, with many genetic variants interacting in yet unclear ways. In this study we used methods that can automatically learn to detect patterns of statistical regularity from a sparsely distributed signal across hippocampal transcriptome measurements in a large-scale animal pharmacogenomic study to uncover genomic variations associated with AD. The study used four inbred mouse strains of both sexes, two drug treatments, and a control group (escitalopram, nortriptyline, and saline). Multi-class and binary classification using Machine Learning (ML) and regularization algorithms using iterative and univariate feature selection methods, including InfoGain, mRMR, ANOVA, and Chi Square, were used to uncover genomic markers associated with AD response. Relevant genes were selected based on Jaccard distance and carried forward for gene-network analysis. Linear association methods uncovered only one gene associated with drug treatment response. The implementation of ML algorithms, together with feature reduction methods, revealed a set of 204 genes associated with SSRI and 241 genes associated with NRI response. Although only 10% of genes overlapped across the two drugs, network analysis shows that both drugs modulated the CREB pathway, through different molecular mechanisms. Through careful implementation and optimisations, the algorithms detected a weak signal used to predict whether an animal was treated with nortriptyline (77%) or escitalopram (67%) on an independent testing set. The results from this study indicate that the molecular signature of AD treatment may include a much broader range of genomic markers than previously hypothesized, suggesting that response to medication may be as complex as the pathology. The search for biomarkers of antidepressant treatment response could therefore consider a higher number of genetic markers and their interactions. Through predominately different molecular targets and mechanisms of action, the two drugs modulate the same Creb1 pathway which plays a key role in neurotrophic responses and in inflammatory processes. © 2016 The Authors. American Journal of Medical Genetics Part B: Neuropsychiatric Genetics Published by Wiley Periodicals, Inc. © 2016 The Authors. American Journal of Medical Genetics Part B: Neuropsychiatric Genetics Published by Wiley Periodicals, Inc.
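    As a hedged sketch of the general workflow only (a univariate feature screen followed by a cross-validated classifier on transcriptome features), the example below uses scikit-learn; the authors' specific selectors (InfoGain, mRMR) and data are not reproduced, and the toy data are random so cross-validated accuracy should sit near chance.

```python
# Hedged sketch of feature selection + classification on transcriptome data; the
# selector is kept inside the cross-validation pipeline to avoid selection bias.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 5000))            # animals x hippocampal transcripts (toy)
y = rng.integers(0, 2, size=120)            # 0 = saline, 1 = drug-treated (toy labels)

pipe = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=200),          # ANOVA-style univariate screen
    LogisticRegression(penalty="l2", C=0.1, max_iter=1000),
)
scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
print(round(scores.mean(), 2))              # ~0.5 on random labels; higher on real signal
```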

  16. Aprotinin; friend or foe? A review of recent medical literature.

    PubMed

    Royston, D; van Haaften, N; De Vooght, P

    2007-01-01

    Recent articles published in peer review journals have questioned the safety of using aprotinin in patients having heart surgery. Also, evidence has been published to suggest an increase in renal events in patients given aprotinin when compared to those where tranexamic acid was used. The present review will focus principally on the first of these articles in relation to previously published data and experience.

  17. Trans-catheter aortic valve implantation after previous aortic homograft surgery.

    PubMed

    Drews, Thorsten; Pasic, Miralem; Buz, Semih; Unbehaun, Axel

    2011-12-01

    In patients with previous heart surgery, the operative risk is elevated during conventional aortic valve re-operations. Trans-catheter aortic valve implantation is a new method for the treatment of high-risk patients. Nevertheless, this new procedure carries potential risks in patients with previous homograft implantation in aortic position. Between April 2008 and February 2011, 345 consecutive patients (mean EuroSCORE (European System for Cardiac Operative Risk Evaluation): 38 ± 20%; mean Society of Thoracic Surgeons (STS) Mortality Score: 19 ± 16%; mean age: 80 ± 8 years; 111 men and 234 women) underwent trans-apical aortic valve implantation. In three patients, previous aortic homograft implantation had been performed. Homograft degeneration causing combined valve stenosis and incompetence made re-operation necessary. In all three patients, the aortic valve could be implanted using the trans-apical approach, and the procedure was successful. In two patients, there was slight paravalvular leakage of the aortic prosthesis and the other patient had slight central leakage. Neither ostium obstruction nor mitral valve damage was observed. Trans-catheter valve implantation can be performed successfully after previous homograft implantation. Particular care should be taken to achieve optimal valve positioning, not to obstruct the ostium of the coronary vessels due to the changed anatomic situation and not to cause annulus rupture. Copyright © 2011 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.

  18. Lack of Association between Human Plasma Oxytocin and Interpersonal Trust in a Prisoner’s Dilemma Paradigm

    PubMed Central

    Christensen, James C.; Shiyanov, Pavel A.; Estepp, Justin R.; Schlager, John J.

    2014-01-01

    Expanding interest in oxytocin, particularly the role of endogenous oxytocin in human social behavior, has created a pressing need for replication of results and verification of assay methods. In this study, we sought to replicate and extend previous results correlating plasma oxytocin with trust and trustworthy behavior. As a necessary first step, the two most commonly used commercial assays were compared in human plasma via the addition of a known quantity of exogenous oxytocin, with and without sample extraction. Plasma sample extraction was found to be critical in obtaining repeatable concentrations of oxytocin. In the subsequent trust experiment, twelve samples in duplicate, from each of 82 participants, were collected over approximately six hours during the performance of a Prisoner’s Dilemma task paradigm that stressed human interpersonal trust. We found no significant relationship between plasma oxytocin concentrations and trusting or trustworthy behavior. In light of these findings, previously published work that used oxytocin immunoassays without sample extraction should be reexamined, and future research exploring links between endogenous human oxytocin and trust or social behavior should proceed with careful consideration of methods and appropriate biofluids for analysis. PMID:25549255

  19. Genotype-dependent sulphite tolerance of Australian Dekkera (Brettanomyces) bruxellensis wine isolates.

    PubMed

    Curtin, C; Kennedy, E; Henschke, P A

    2012-07-01

    The aim of this study was to determine sulphite tolerance for a large number of Dekkera bruxellensis isolates and evaluate the relationship between this phenotype and previously assigned genotype markers. A published microplate-based method for evaluation of yeast growth in the presence of sulphite was benchmarked against culturability following sulphite treatment, for the D. bruxellensis type strain (CBS 74) and a reference wine isolate (AWRI 1499). This method was used to estimate maximal sulphite tolerance for 41 D. bruxellensis isolates, which was found to vary over a fivefold range. Significant differences in sulphite tolerance were observed when isolates were grouped according to previously assigned genotypes and ribotypes. Variable sulphite tolerance for the wine spoilage yeast D. bruxellensis can be linked to genotype markers. Strategies to minimize risk of wine spoilage by D. bruxellensis must take into account at least a threefold range in effective sulphite concentration that is dependent upon the genotype group(s) present. The isolates characterized in this study will be a useful resource for establishing the mechanisms conferring sulphite tolerance for this industrially important yeast species. © 2012 The Authors. Letters in Applied Microbiology © 2012 The Society for Applied Microbiology.

  20. Confirmation of the immunoreactivity of monoclonal anti-human C-terminal EGFR antibodies in bronze Corydoras Corydoras aeneus (Callichthyidae Teleostei) by Western Blot method.

    PubMed

    Mytych, Jennifer; Satora, Leszek; Kozioł, Katarzyna

    2018-02-01

    Bronze corydoras (Corydoras aeneus) uses the distal part of the intestine as accessory respiratory organ. Our previous study showed the presence of epidermal growth factor receptor (EGFR) cytoplasmic domain in the digestive tract of the bronze corydoras. In this study, using Western Blot method, we validated the results presented in the previous research. In detail, results of Western Blot analysis on digestive and respiratory part of bronze corydoras intestine homogenates confirmed the immunoreactivity of anti-cytoplasmic domain (C-terminal) human EGFR antibodies with protein band of approximately 180kDa (EGFR molecular weight). This indicates a high homology of EGFR domain between these species and the possibility of such antibody use in bronze corydoras. Statistically significantly higher EGFR expression was observed in the respiratory part of intestine when compared to the digestive part. This implies higher proliferation activity and angiogenesis of epithelium in this part of intestine, creating conditions for air respiration. Therefore, Corydoras aeneus may be considered as a model organism for the molecular studies of the mechanisms of epithelial proliferation initiation and inhibition depending on hypoxia and normoxia. Copyright © 2017. Published by Elsevier GmbH.

  1. Crowdsourcing Broad Absorption Line Properties and Other Features of Quasar Outflow Using Zooniverse Citizen Science Project Platform

    NASA Astrophysics Data System (ADS)

    Crowe, Cassie; Lundgren, Britt; Grier, Catherine

    2018-01-01

    The Sloan Digital Sky Survey (SDSS) regularly publishes vast catalogs of quasars and other astronomical objects. Previously, the SDSS collaboration has used visual inspection to check quasar redshift validity and flag instances of broad absorption lines (BALs). This information helps researchers to easily single out the quasars with BAL properties and study their outflows and other intervening gas clouds. Due to the ever-growing number of new SDSS quasar observations, visual inspections are no longer possible using previous methods. Currently, BAL information is being determined entirely computationally, and the accuracy of that information is not precisely known. This project uses the Zooniverse citizen science platform to visually inspect quasar spectra for BAL properties, to check the accuracy of the current autonomous methods, and to flag multi-phase outflows and find candidates for in-falling gas into the quasar central engine. The layout and format of a Zooniverse project provides an easier way to inspect and record data on each spectrum and share the workload via crowdsourcing. Work done by the SDSS collaboration members is serving as a beta test for a public project upon the official release of the DR14 quasar catalog by SDSS.

  2. Trends of Gallbladder Cancer in Jordan Over 2 Decades: Where Are We?

    PubMed Central

    Al Manasra, Abdel Rahman; Bani Hani, Mohammed; Qandeel, Haitham; Al Asmar, Samer; Alqudah, Mohammad; Al-Zoubi, Nabil; Nadig, Satish; Hamouri, Shadi; Obeidat, Khaled; Al-Muqaimi, Nada

    2018-01-01

    Background and Study Aims: The prevalence of gallbladder cancer (GBC) varies between different parts of the world. This study is a review of literature and an update of a previously published study conducted in our university and aims to reassess the incidence of GBC over the past 2 decades. Patients and Methods: We conducted a retrospective study between 2002 and 2016. Data regarding demographics, clinical presentation, risk factors, histopathology, investigations, and treatments were obtained. A diagnosis of GBC established during surgery or primarily detected in the surgical specimen was classified as incidental. Results: Of 11 391 cholecystectomies performed, 31 cases (0.27%) of GBC were found. The mean age of patients with GBC was 68 years (43-103 years); 74% were women. The annual incidence of GBC was 0.2/100 000 (men: 0.1/100 000; women: 0.3/100 000). Biliary colic and acute cholecystitis were the main presentations. Diagnosis of GBC was “incidental” in 67% of cases. About 75% of patients with GBC had gallstones, 13% had polyps, and 3% had porcelain gallbladder. Adenocarcinoma was the dominant (87%) histologic type. Conclusions: The GBC rate in our region, similar to other parts of the world, is still low and has not changed over the past 2 decades. This study consolidates the previously published recommendations regarding the high index of suspicion of GBC in the elderly with cholelithiasis. PMID:29760576

  3. Rationing critical care medicine: recent studies and current trends.

    PubMed

    Ward, Nicholas S

    2005-12-01

    This paper reviews the literature on the rationing of critical care resources. Although much has been written about the concept of rationing, there have been few scientific studies as to its prevalence. A recent meta-analysis reviewed all previously published studies on rationing access to intensive care units but little is known about practices within the intensive care unit. Much literature in the past few years has focused on the growing use of critical care resources and projections for the future. Several authors suggest there may be a crisis in financial or personnel resources if some rationing does not take place. Other papers have argued that the methods of rationing critical care previously proposed, such as limiting the care of dying patients or using cost-effectiveness analysis to determine care, may not be effective or viewed as ethical by some. Finally, several recent papers review how critical care is practiced and allocated in India and Asian countries that already practice open rationing in their health care systems. There is currently no published evidence that overt rationing is taking place in critical care medicine. There is growing evidence that in the future, the need for critical care may outstrip financial resources unless some form of rationing takes place. It is also clear from the literature that choosing how to ration critical care will be a difficult task.

  4. Critical Appraisal of Emergency Medicine Education Research: The Best Publications of 2014.

    PubMed

    Yarris, Lalena M; Juve, Amy Miller; Coates, Wendy C; Fisher, Jonathan; Heitz, Corey; Shayne, Philip; Farrell, Susan E

    2015-11-01

    The objective was to critically appraise and highlight rigorous education research study articles published in 2014 whose outcomes advance the science of emergency medicine (EM) education. A search of the English language literature in 2014 querying Education Resources Information Center (ERIC), PsychINFO, PubMed, and Scopus identified 243 EM-related articles using either quantitative (hypothesis-testing or observational investigations of educational interventions) or qualitative (exploring important phenomena in EM education) methods. Two reviewers independently screened all of the publications using previously established exclusion criteria. Six reviewers then independently scored the 25 selected publications using either a qualitative or a quantitative scoring system. Each scoring system consisted of nine criteria. Selected criteria were based on accepted educational review literature and chosen a priori. Both scoring systems use parallel scoring metrics and have been used previously within this annual review. Twenty-five medical education research papers (22 quantitative, three qualitative) met the criteria for inclusion and were reviewed. Five quantitative and two qualitative studies were ranked most highly by the reviewers as exemplary and are summarized in this article. This annual critical appraisal series highlights seven excellent EM education research studies, meeting a priori criteria and published in 2014. Methodologic strengths in the 2014 papers are noted, and current trends in medical education research in EM are discussed. © 2015 by the Society for Academic Emergency Medicine.

  5. Trends in biomedical informatics: automated topic analysis of JAMIA articles.

    PubMed

    Han, Dong; Wang, Shuang; Jiang, Chao; Jiang, Xiaoqian; Kim, Hyeon-Eui; Sun, Jimeng; Ohno-Machado, Lucila

    2015-11-01

    Biomedical Informatics is a growing interdisciplinary field in which research topics and citation trends have been evolving rapidly in recent years. To analyze these data in a fast, reproducible manner, automation of certain processes is needed. JAMIA is a "generalist" journal for biomedical informatics. Its articles reflect the wide range of topics in informatics. In this study, we retrieved Medical Subject Headings (MeSH) terms and citations of JAMIA articles published between 2009 and 2014. We use tensors (i.e., multidimensional arrays) to represent the interaction among topics, time and citations, and applied tensor decomposition to automate the analysis. The trends represented by tensors were then carefully interpreted and the results were compared with previous findings based on manual topic analysis. A list of most cited JAMIA articles, their topics, and publication trends over recent years is presented. The analyses confirmed previous studies and showed that, from 2012 to 2014, the number of articles related to MeSH terms Methods, Organization & Administration, and Algorithms increased significantly both in number of publications and citations. Citation trends varied widely by topic, with Natural Language Processing having a large number of citations in particular years, and Medical Record Systems, Computerized remaining a very popular topic in all years. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
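
    As a rough illustration of the topic-by-time-by-citation tensor idea described above, the sketch below builds a small synthetic three-way count array and extracts a single dominant trend with a rank-1 CP approximation computed by power iteration; the axis sizes, the synthetic counts, and the rank-1 simplification are assumptions for illustration only and do not reproduce the study's decomposition.

      # Illustrative sketch only: a topic x year x metric count tensor and a rank-1
      # CP approximation via higher-order power iteration. The study's actual tensor
      # construction and decomposition are not reproduced here.
      import numpy as np

      rng = np.random.default_rng(1)
      # axes: 50 MeSH topics, 6 publication years (2009-2014), 2 metrics (papers, citations)
      T = rng.poisson(lam=3.0, size=(50, 6, 2)).astype(float)

      def rank1_cp(T, iters=100):
          """Return weight w and unit factors a, b, c with T ~ w * outer(a, b, c)."""
          a = np.ones(T.shape[0]); b = np.ones(T.shape[1]); c = np.ones(T.shape[2])
          for _ in range(iters):
              a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
              b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
              c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
          w = np.einsum('ijk,i,j,k->', T, a, b, c)
          return w, a, b, c

      w, topic_loadings, year_trend, metric_mix = rank1_cp(T)
      print("dominant temporal trend across years:", np.round(year_trend, 3))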

  6. AMS 4.0: consensus prediction of post-translational modifications in protein sequences.

    PubMed

    Plewczynski, Dariusz; Basu, Subhadip; Saha, Indrajit

    2012-08-01

    We present here the 2011 update of the AutoMotif Service (AMS 4.0), which predicts a wide selection of 88 different types of single amino acid post-translational modifications (PTM) in protein sequences. The selection of experimentally confirmed modifications is acquired from the latest UniProt and Phospho.ELM databases for training. The sequence vicinity of each modified residue is represented using amino acid physico-chemical features encoded using high quality indices (HQI) obtained by automatic clustering of known indices extracted from the AAindex database. For each type of numerical representation, the method builds an ensemble of Multi-Layer Perceptron (MLP) pattern classifiers, each optimising a different objective during training (for example the recall, precision or area under the ROC curve (AUC)). The consensus is built using brainstorming technology, which combines multi-objective instances of the machine learning algorithm and the data fusion of different training object representations in order to boost the overall prediction accuracy of conserved short sequence motifs. The performance of AMS 4.0 is compared with the accuracy of previous versions, which were constructed using single machine learning methods (artificial neural networks, support vector machines). Our software improves the average AUC score of the earlier version by close to 7 % as calculated on the test datasets of all 88 PTM types. Moreover, for the selected most difficult sequence motif types it is able to improve the prediction performance by almost 32 % when compared with previously used single machine learning methods. In summary, the brainstorming consensus meta-learning methodology boosts the AUC score to around 89 % on average over all 88 PTM types. Detailed results for single machine learning methods and the consensus methodology are also provided, together with a comparison to previously published methods and state-of-the-art software tools. The source code and precompiled binaries of the brainstorming tool are available at http://code.google.com/p/automotifserver/ under Apache 2.0 licensing.
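
    The following sketch illustrates only the general idea of fusing several differently configured MLP classifiers into a consensus by soft voting; it is not the AMS 4.0 brainstorming implementation, and the window encoding, synthetic data, and scikit-learn estimators are invented for the example.

      # Sketch of combining several differently-configured MLP classifiers by soft voting.
      # This only illustrates the general idea of an ensemble consensus; it is not the
      # AMS 4.0 "brainstorming" implementation, and the window encoding is invented.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.ensemble import VotingClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = rng.normal(size=(500, 63))      # e.g. 9-residue windows x 7 physico-chemical indices
      y = rng.integers(0, 2, size=500)    # 1 = modified site, 0 = background (synthetic)

      members = [
          ("mlp_small", MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
          ("mlp_wide",  MLPClassifier(hidden_layer_sizes=(100,), max_iter=2000, random_state=1)),
          ("mlp_deep",  MLPClassifier(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=2)),
      ]
      consensus = VotingClassifier(estimators=members, voting="soft")
      print("cross-validated AUC:", cross_val_score(consensus, X, y, cv=3, scoring="roc_auc").mean())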

  7. A tool for design of primers for microRNA-specific quantitative RT-qPCR.

    PubMed

    Busk, Peter K

    2014-01-28

    MicroRNAs are small but biologically important RNA molecules. Although different methods can be used for quantification of microRNAs, quantitative PCR is regarded as the reference that is used to validate other methods. Several commercial qPCR assays are available but they often come at a high price and the sequences of the primers are not disclosed. An alternative to commercial assays is to manually design primers but this work is tedious and, hence, not practical for the design of primers for a larger number of targets. I have developed the software miRprimer for automatic design of primers for the method miR-specific RT-qPCR, which is one of the best performing microRNA qPCR methods available. The algorithm is based on an implementation of the previously published rules for manual design of miR-specific primers with the additional feature of evaluating the propensity of formation of secondary structures and primer dimers. Testing of the primers showed that 76 out of 79 primers (96%) worked for quantification of microRNAs by miR-specific RT-qPCR of mammalian RNA samples. This success rate corresponds to the success rate of manual primer design. Furthermore, primers designed by this method have been distributed to several labs and used successfully in published studies. The software miRprimer is an automatic and easy method for design of functional primers for miR-specific RT-qPCR. The application is available as stand-alone software that will work on the MS Windows platform and in a developer version written in the Ruby programming language.
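
    As a hedged illustration of the kind of rule checks involved in primer evaluation, the sketch below computes GC content, a rough Wallace-rule melting temperature, and a crude 3'-end self-complementarity test; these are generic rules of thumb with an invented example sequence, not the published miR-specific design rules implemented in miRprimer.

      # Illustrative primer sanity checks only (GC content, a rough Wallace-rule Tm,
      # and a crude 3'-end self-dimer test). These are generic rules of thumb, not
      # the published miR-specific design rules implemented in miRprimer.
      def gc_fraction(seq):
          seq = seq.upper()
          return (seq.count("G") + seq.count("C")) / len(seq)

      def wallace_tm(seq):
          """Rule-of-thumb melting temperature: 2*(A+T) + 4*(G+C)."""
          seq = seq.upper()
          return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

      COMPLEMENT = str.maketrans("ACGT", "TGCA")

      def three_prime_self_dimer(seq, tail=5):
          """True if the last `tail` bases are reverse-complementary to any site in the primer."""
          tail_rc = seq[-tail:].upper().translate(COMPLEMENT)[::-1]
          return tail_rc in seq.upper()

      primer = "GCAGTCTTTCGAGGCCTATT"   # hypothetical sequence, for illustration only
      print(gc_fraction(primer), wallace_tm(primer), three_prime_self_dimer(primer))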

  8. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    PubMed

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, which utilises more information and reduces the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
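
    For orientation, one standard way to write the multivariate random-effects structure sketched above is the following (the notation is ours, and the paper's exact parameterization may differ). For study $i$ reporting a vector of prognostic effect estimates $\hat{\boldsymbol{\theta}}_i$ (e.g., log hazard ratios at several cut-points),

      $$ \hat{\boldsymbol{\theta}}_i \sim N(\boldsymbol{\theta}_i, \mathbf{S}_i), \qquad \boldsymbol{\theta}_i \sim N(\boldsymbol{\theta}, \boldsymbol{\Sigma}), $$

    where $\mathbf{S}_i$ is the within-study covariance matrix and $\boldsymbol{\Sigma}$ the between-study covariance matrix; cut-points or measurement methods not reported by a study enter as missing components of $\hat{\boldsymbol{\theta}}_i$ and borrow strength through these correlations. A 95% prediction interval for the effect at a given cut-point in a new population then takes the familiar form $\hat{\theta} \pm t_{k-2}\sqrt{\hat{\tau}^2 + \widehat{\mathrm{Var}}(\hat{\theta})}$.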

  9. Fast and accurate inference of local ancestry in Latino populations

    PubMed Central

    Baran, Yael; Pasaniuc, Bogdan; Sankararaman, Sriram; Torgerson, Dara G.; Gignoux, Christopher; Eng, Celeste; Rodriguez-Cintron, William; Chapela, Rocio; Ford, Jean G.; Avila, Pedro C.; Rodriguez-Santana, Jose; Burchard, Esteban Gonzàlez; Halperin, Eran

    2012-01-01

    Motivation: It is becoming increasingly evident that the analysis of genotype data from recently admixed populations is providing important insights into medical genetics and population history. Such analyses have been used to identify novel disease loci, to understand recombination rate variation and to detect recent selection events. The utility of such studies crucially depends on accurate and unbiased estimation of the ancestry at every genomic locus in recently admixed populations. Although various methods have been proposed and shown to be extremely accurate in two-way admixtures (e.g. African Americans), only a few approaches have been proposed and thoroughly benchmarked on multi-way admixtures (e.g. Latino populations of the Americas). Results: To address these challenges we introduce here methods for local ancestry inference which leverage the structure of linkage disequilibrium in the ancestral population (LAMP-LD), and incorporate the constraint of Mendelian segregation when inferring local ancestry in nuclear family trios (LAMP-HAP). Our algorithms uniquely combine hidden Markov models (HMMs) of haplotype diversity within a novel window-based framework to achieve superior accuracy as compared with published methods. Further, unlike previous methods, the structure of our HMM does not depend on the number of reference haplotypes but on a fixed constant, and it is thereby capable of utilizing large datasets while remaining highly efficient and robust to over-fitting. Through simulations and analysis of real data from 489 nuclear trio families from the mainland US, Puerto Rico and Mexico, we demonstrate that our methods achieve superior accuracy compared with published methods for local ancestry inference in Latinos. Availability: http://lamp.icsi.berkeley.edu/lamp/lampld/ Contact: bpasaniu@hsph.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22495753

  10. Fourier Magnitude-Based Privacy-Preserving Clustering on Time-Series Data

    NASA Astrophysics Data System (ADS)

    Kim, Hea-Suk; Moon, Yang-Sae

    Privacy-preserving clustering (PPC in short) is important in publishing sensitive time-series data. Previous PPC solutions, however, have a problem of not preserving distance orders or incurring privacy breach. To solve this problem, we propose a new PPC approach that exploits Fourier magnitudes of time-series. Our magnitude-based method does not cause privacy breach even though its techniques or related parameters are publicly revealed. Using magnitudes only, however, incurs the distance order problem, and we thus present magnitude selection strategies to preserve as many Euclidean distance orders as possible. Through extensive experiments, we showcase the superiority of our magnitude-based approach.
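
    A minimal sketch of the underlying idea, publishing only Fourier magnitudes of each time-series and clustering on them, is given below; the lowest-frequency coefficient selection, the synthetic series, and the use of k-means are simplifying assumptions and do not reproduce the paper's magnitude-selection strategies.

      # Minimal sketch: publish only the magnitudes of selected DFT coefficients of each
      # time-series (phase withheld) and cluster on those. Selecting the lowest
      # frequencies is a simplification of the paper's magnitude-selection strategies.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      series = rng.normal(size=(40, 128)).cumsum(axis=1)   # 40 synthetic random-walk series

      def published_features(x, n_coeffs=8):
          """Magnitudes of the first n_coeffs DFT coefficients of one series."""
          return np.abs(np.fft.rfft(x))[:n_coeffs]

      features = np.array([published_features(x) for x in series])
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
      print("cluster sizes:", np.bincount(labels))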

  11. Vibrations of cantilevered circular cylindrical shells: Shallow versus deep shell theory

    NASA Technical Reports Server (NTRS)

    Lee, J. K.; Leissa, A. W.; Wang, A. J.

    1983-01-01

    Free vibrations of cantilevered circular cylindrical shells having rectangular planforms are studied in this paper by means of the Ritz method. The deep shell theory of Novozhilov and Goldenveizer is used and compared with the usual shallow shell theory for a wide range of shell parameters. A thorough convergence study is presented along with comparisons to previously published finite element solutions and experimental results. Accurately computed frequency parameters and mode shapes for various shell configurations are presented. The present paper appears to be the first comprehensive study presenting rigorous comparisons between the two shell theories in dealing with free vibrations of cantilevered cylindrical shells.

  12. Potential Vaccines and Post-Exposure Treatments for Filovirus Infections

    PubMed Central

    Friedrich, Brian M.; Trefry, John C.; Biggins, Julia E.; Hensley, Lisa E.; Honko, Anna N.; Smith, Darci R.; Olinger, Gene G.

    2012-01-01

    Viruses of the family Filoviridae represent significant health risks as emerging infectious diseases as well as potentially engineered biothreats. While many research efforts have been published offering possibilities toward the mitigation of filoviral infection, there remain no sanctioned therapeutic or vaccine strategies. Current progress in the development of filovirus therapeutics and vaccines is outlined herein with respect to their current level of testing, evaluation, and proximity toward human implementation, specifically with regard to human clinical trials, nonhuman primate studies, small animal studies, and in vitro development. Contemporary methods of supportive care and previous treatment approaches for human patients are also discussed. PMID:23170176

  13. Diagnosing vocal cord dysfunction in young athletes.

    PubMed

    Rhodes, Rea Kae

    2008-12-01

    To provide an overview of the pathophysiology, steps in making a diagnosis, differential diagnosis, and treatment methods for vocal cord dysfunction (VCD) in young athletes. Review of published literature about VCD and exercise-induced asthma (EIA) and a case study. The clinical presentation of VCD is often confusing. A young athlete who is having difficulty "catching his breath" may have more than EIA. Young athletes who have been previously diagnosed with EIA may actually have VCD. The ability to correctly differentiate VCD from other causes of respiratory distress can lead to accurate interventions, save precious time in an acute situation, and promote long-term control of this condition.

  14. A fast, parallel algorithm for distant-dependent calculation of crystal properties

    NASA Astrophysics Data System (ADS)

    Stein, Matthew

    2017-12-01

    A fast, parallel algorithm for distant-dependent calculation and simulation of crystal properties is presented along with speedup results and methods of application. An illustrative example is used to compute the Lennard-Jones lattice constants up to 32 significant figures for 4 ≤ p ≤ 30 in the simple cubic, face-centered cubic, body-centered cubic, hexagonal-close-pack, and diamond lattices. In most cases, the known precision of these constants is more than doubled, and in some cases, corrected from previously published figures. The tools and strategies to make this computation possible are detailed along with application to other potentials, including those that model defects.
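
    For orientation, the quantities involved are lattice sums of the form L_p = sum of |n|^(-p) over all non-zero lattice vectors n; the brute-force truncated sum below for the simple cubic lattice shows what is being computed, but it converges slowly and does not reproduce the paper's fast, parallel, arbitrary-precision algorithm.

      # Brute-force illustration of the lattice sums behind the Lennard-Jones constants:
      # L_p = sum over all nonzero simple-cubic lattice vectors n of 1/|n|^p. This direct
      # truncated sum only reaches limited precision; the paper's algorithm is not shown.
      import numpy as np

      def simple_cubic_lattice_sum(p, radius=40):
          idx = np.arange(-radius, radius + 1)
          nx, ny, nz = np.meshgrid(idx, idx, idx, indexing="ij")
          r2 = (nx**2 + ny**2 + nz**2).astype(float)
          r2[radius, radius, radius] = np.inf      # exclude the origin term
          return np.sum(r2 ** (-p / 2.0))

      for p in (6, 12):
          print(f"L_{p} (simple cubic, truncated) ~ {simple_cubic_lattice_sum(p):.6f}")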

  15. Devices that can identify positive vs. negative charge

    NASA Astrophysics Data System (ADS)

    Lincoln, James

    2017-10-01

    When your clothes come out of the dryer, covered with static, do you know whether they are positively or negatively charged? In this article, I discuss a variety of devices that can determine the sign of the charge on an insulator or conductor. Purposefully, none of these methods utilize comparison with a known charge. Some of these ideas have been previously published, and I am extending them, but many are original. These demonstrations provide students and teachers with an opportunity to contrast the actual flow of charge with conventional current and to compare the behavior of positive and negative charges with what we expect from protons and electrons.

  16. Practical in-situ determination of ortho-para hydrogen ratios via fiber-optic based Raman spectroscopy

    DOE PAGES

    Sutherland, Liese-Marie; Knudson, James N.; Mocko, Michal; ...

    2015-12-17

    An experiment was designed and developed to prototype a fiber-optic-based laser system, which measures the ratio of ortho-hydrogen to para-hydrogen in an operating neutron moderator system at the Los Alamos Neutron Science Center (LANSCE) spallation neutron source. Preliminary measurements resulted in an ortho to para ratio of 3.06:1, which is within acceptable agreement with the previously published ratio. As a result, the successful demonstration of Raman Spectroscopy for this measurement is expected to lead to a practical method that can be applied for similar in-situ measurements at operating neutron spallation sources.

  17. Practical in-situ determination of ortho-para hydrogen ratios via fiber-optic based Raman spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutherland, Liese-Marie; Knudson, James N.; Mocko, Michal

    An experiment was designed and developed to prototype a fiber-optic-based laser system, which measures the ratio of ortho-hydrogen to para-hydrogen in an operating neutron moderator system at the Los Alamos Neutron Science Center (LANSCE) spallation neutron source. Preliminary measurements resulted in an ortho to para ratio of 3.06:1, which is within acceptable agreement with the previously published ratio. As a result, the successful demonstration of Raman Spectroscopy for this measurement is expected to lead to a practical method that can be applied for similar in-situ measurements at operating neutron spallation sources.

  18. Assessment of NASA Dual Microstructure Heat Treatment Method for Multiple Forging Batch Heat Treatment

    NASA Technical Reports Server (NTRS)

    Gayda, John (Technical Monitor); Lemsky, Joe

    2004-01-01

    NASA dual microstructure heat treatment technology, previously demonstrated on single-forging heat treat batches of a generic disk shape, was successfully demonstrated on a multiple-disk batch of a production-shape component. A group of four Rolls-Royce Corporation 3rd Stage AE2100 forgings produced from alloy ME209 was successfully dual microstructure heat treated as a single heat treat batch. The forgings responded uniformly, as evidenced by consistent part-to-part thermocouple recordings, the resultant macrostructures, and ultrasonic examination. Multiple-disk DMHT processing offers a low-cost alternative to other published dual microstructure processing techniques.

  19. Preeclampsia: A review of the pathogenesis and possible management strategies based on its pathophysiological derangements.

    PubMed

    El-Sayed, Amel A F

    2017-10-01

    This review is divided into three parts. The first part briefly describes the pathogenesis of preeclampsia. This is followed by reviewing previously reported management strategies of the disease based on its pathophysiological derangements. Finally, the author defines the safe and acceptable methods/medications that may be used to 'prevent' preeclampsia (in high risk patients) and those that may be used to 'treat' preeclampsia (meant to prolong the pregnancy in patients with established preeclampsia). The review concludes that multi-center trials are required to include multiple drugs in the same management protocol. Copyright © 2017. Published by Elsevier B.V.

  20. 75 FR 56608 - Agency Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-16

    ... SECURITIES AND EXCHANGE COMMISSION Agency Meeting Federal Register Citation of Previous Announcement: [To be published] Status: Open Meeting. Place: 100 F. Street, NE., Washington, DC. Date and Time of Previously Announced Meeting: September 15, 2010. Change In the Meeting: Room Change. The Joint...

  1. Genetic analysis shows that morphology alone cannot distinguish Asian carp eggs from those of other cyprinid species

    USGS Publications Warehouse

    Larson, James H.; McCalla, S. Grace; Chapman, Duane C.; Rees, Christopher B.; Knights, Brent C.; Vallazza, Jon; George, Amy E.; Richardson, William B.; Amberg, Jon J.

    2016-01-01

    Fish eggs and embryos (hereafter collectively referred to as “eggs”) were collected in the upper Mississippi River main stem (~300 km upstream of previously reported spawning by invasive Asian carp) during summer 2013. Based on previously published morphological characteristics, the eggs were identified as belonging to Asian carp. A subsample of the eggs was subsequently analyzed by using molecular methods to determine species identity. Genetic identification using the cytochrome-c oxidase 1 gene was attempted for a total of 41 eggs. Due to the preservation technique used (formalin) and the resulting DNA degradation, sequences were recovered from only 17 individual eggs. In all 17 cases, cyprinids other than Asian carp (usually Notropis sp.) were identified as the most likely species. In previously published reports, a key characteristic that distinguished Asian carp eggs from those of other cyprinids was size: Asian carp eggs exhibited diameters ranging from 4.0 to 6.0 mm and were thought to be much larger than the otherwise similar eggs of native species. Eggs from endemic cyprinids were believed to rarely reach 3.0 mm and had not been observed to exceed 3.3 mm. However, many of the eggs that were genetically identified as originating from native cyprinids were as large as 4.0 mm in diameter (at early developmental stages) and were therefore large enough to overlap with the lower end of the size range observed for Asian carp eggs. Researchers studying the egg stages of Asian carp and other cyprinids should plan on preserving subsets of eggs for genetic analysis to confirm morphological identifications.

  2. Mendelism: New Insights from Gregor Mendel's Lectures in Brno.

    PubMed

    Zhang, Hui; Chen, Wen; Sun, Kun

    2017-09-01

    Interpretation of Gregor Mendel's work has previously been based on study of his published paper "Experiments in Plant Hybridization." In contrast, the lectures that he gave preceding publication of this work have been largely neglected for more than 150 years. Here, we report on and interpret the content of Mendel's previous two lectures, as they were reported in a local newspaper. We comprehensively reference both the text of his paper and the historical background of his experiments. Our analysis shows that while Mendel had inherited the traditional research program on interspecific hybridization in plants, he introduced the novel method of ratio analysis for representing the variation of unit-characters among offspring of hybrids. His aim was to characterize and explain the developmental features of the distributional pattern of unit-characters in two series of hybrid experiments, using self-crosses and backcrosses with parents. In doing so, he not only answered the question of what the unit-characters were and the nature of their hierarchical classification, but also successfully inferred the numerical principle of unit-character transmission from generation to generation. He also established the nature of the composition and behaviors of reproductive cells from one generation to the next. Here we highlight the evidence from Mendel's lectures, clearly announcing that he had discovered the general law of cross-generation transmission of unit-characters through reproductive cells containing unit-factors. The recovered content of these previous lectures more accurately describes the work he performed with his garden peas than his published paper and shows how he first presented it in Brno. It is thus an invaluable resource for understanding the origin of the science of genetics. Copyright © 2017 by the Genetics Society of America.

  3. Use of a Granulocyte Immunofluorescence Assay Designed for Humans for Detection of Antineutrophil Cytoplasmic Antibodies in Dogs with Chronic Enteropathies.

    PubMed

    Florey, J; Viall, A; Streu, S; DiMuro, V; Riddle, A; Kirk, J; Perazzotti, L; Affeldt, K; Wagner, R; Vaden, S; Harris, T; Allenspach, K

    2017-07-01

    Perinuclear antineutrophil cytoplasmic antibodies (pANCA) previously have been shown to be serum markers in dogs with chronic enteropathies, with dogs that have food-responsive disease (FRD) having higher frequencies of seropositivity than dogs with steroid-responsive disease (SRD). The indirect immunofluorescence (IIF) assay used in previous publications is time-consuming to perform, with low interobserver agreement. We hypothesized that a commercially available granulocyte IIF assay designed for humans could be used to detect perinuclear antineutrophil cytoplasmic antibodies in dogs. Forty-four dogs with FRD, 20 dogs with SRD, 20 control dogs, and 38 soft-coated wheaten terrier (SCWT) or SCWT-cross dogs. A granulocyte assay designed for humans was used to detect pANCA, cANCA, and antinuclear antibodies (ANA), as well as antibodies against proteinase-3 protein (PR-3) and myeloperoxidase protein (MPO) in archived serum samples. Sensitivity of the granulocyte assay to predict FRD in dogs was 0.61 (95% confidence interval (CI), 0.45, 0.75), and specificity was 1.00 (95% CI, 0.91, 1.00). A significant association was identified between positive pANCA or cANCA result and diagnosis of FRD (P < 0.0001). Agreement between the two assays to detect ANCA in the same serum samples from SCWT with protein-losing enteropathy/protein-losing nephropathy (PLE/PLN) was substantial (kappa, 0.77; 95% CI, 0.53, 1.00). Eight ANCA-positive cases were positive for MPO or PR-3 antibodies. The granulocyte immunofluorescence assay used in our pilot study was easy and quick to perform. Agreement with the previously published method was good. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  4. Optimized formulas for the gravitational field of a tesseroid

    NASA Astrophysics Data System (ADS)

    Grombein, Thomas; Seitz, Kurt; Heck, Bernhard

    2013-07-01

    Various tasks in geodesy, geophysics, and related geosciences require precise information on the impact of mass distributions on gravity field-related quantities, such as the gravitational potential and its partial derivatives. Using forward modeling based on Newton's integral, mass distributions are generally decomposed into regular elementary bodies. In classical approaches, prisms or point mass approximations are mostly utilized. Considering the effect of the sphericity of the Earth, alternative mass modeling methods based on tesseroid bodies (spherical prisms) should be taken into account, particularly in regional and global applications. Expressions for the gravitational field of a point mass are relatively simple when formulated in Cartesian coordinates. In the case of integrating over a tesseroid volume bounded by geocentric spherical coordinates, it will be shown that it is also beneficial to represent the integral kernel in terms of Cartesian coordinates. This considerably simplifies the determination of the tesseroid's potential derivatives in comparison with previously published methodologies that make use of integral kernels expressed in spherical coordinates. Based on this idea, optimized formulas for the gravitational potential of a homogeneous tesseroid and its derivatives up to second-order are elaborated in this paper. These new formulas do not suffer from the polar singularity of the spherical coordinate system and can, therefore, be evaluated for any position on the globe. Since integrals over tesseroid volumes cannot be solved analytically, the numerical evaluation is achieved by means of expanding the integral kernel in a Taylor series with fourth-order error in the spatial coordinates of the integration point. As the structure of the Cartesian integral kernel is substantially simplified, Taylor coefficients can be represented in a compact and computationally attractive form. Thus, the use of the optimized tesseroid formulas particularly benefits from a significant decrease in computation time by about 45 % compared to previously used algorithms. In order to show the computational efficiency and to validate the mathematical derivations, the new tesseroid formulas are applied to two realistic numerical experiments and are compared to previously published tesseroid methods and the conventional prism approach.
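
    For orientation, the underlying Newton integral for the gravitational potential of a homogeneous tesseroid can be written as follows (notation ours; the paper's contribution, expressing the kernel in Cartesian coordinates and Taylor-expanding it, is not reproduced here):

      $$ V(P) = G\rho \int_{\lambda_1}^{\lambda_2} \int_{\varphi_1}^{\varphi_2} \int_{r_1}^{r_2} \frac{r'^2 \cos\varphi'}{\ell}\, \mathrm{d}r'\, \mathrm{d}\varphi'\, \mathrm{d}\lambda', \qquad \ell = \lVert \mathbf{x}(P) - \mathbf{x}(r',\varphi',\lambda') \rVert, $$

    with the tesseroid bounded by $[r_1, r_2] \times [\varphi_1, \varphi_2] \times [\lambda_1, \lambda_2]$ in geocentric spherical coordinates; the first- and second-order derivatives follow by differentiating the kernel with respect to the coordinates of the computation point $P$.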

  5. Reconnaissance geologic map of part of the San Isidro Quadrangle, Baja California Sur, Mexico

    USGS Publications Warehouse

    McLean, Hugh; Hausback, B.P.; Knapp, J.H.

    1985-01-01

    Mapping was done on aerial photographs and transferred, where possible, to 1:50,000-scale topographic base maps. Areas with roads were field checked; however, in the northeast part of the map area, lack of roads prevented field checks. Previous geologic surveys of parts of the map area were made by horseback in the early 1920's; reports were published by Darton (1921), Heim (1922), and Beal (1948). Subsurface data from petroleum exploration and a geologic map were incorporated in a regional study by Mina (1957). The first radiometric ages of rocks from the map area were published by Gastil and others (1979). Recently determined radiometric ages and chemical analysis of volcanic rocks were reported by Hausback (1984) and by Sawlan and Smith (1984). Our study incorporates geologic mapping with age control based on new radiometric ages as well as paleontology. Flows and tuffs were dated by the K-Ar method. Fossil ages are based on diatom and mollusk assemblages.

  6. Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results

    PubMed Central

    Wicherts, Jelte M.; Bakker, Marjan; Molenaar, Dylan

    2011-01-01

    Background: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. Methods and Findings: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance. Conclusions: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies. PMID:22073203

  7. MS2PIP prediction server: compute and visualize MS2 peak intensity predictions for CID and HCD fragmentation.

    PubMed

    Degroeve, Sven; Maddelein, Davy; Martens, Lennart

    2015-07-01

    We present an MS2 peak intensity prediction server that computes MS2 charge 2+ and 3+ spectra from peptide sequences for the most common fragment ions. The server integrates the Unimod public domain post-translational modification database for modified peptides. The prediction model is an improvement of the previously published MS2PIP model for Orbitrap-LTQ CID spectra. Predicted MS2 spectra can be downloaded as a spectrum file and can be visualized in the browser for comparisons with observations. In addition, we added prediction models for HCD fragmentation (Q-Exactive Orbitrap) and show that these models compute accurate intensity predictions on par with CID performance. We also show that training prediction models for CID and HCD separately improves the accuracy for each fragmentation method. The MS2PIP prediction server is accessible from http://iomics.ugent.be/ms2pip. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Body mass index and dental caries in children and adolescents: a systematic review of literature published 2004 to 2011

    PubMed Central

    2012-01-01

    Objective: The authors undertook an updated systematic review of the relationship between body mass index and dental caries in children and adolescents. Method: The authors searched Medline, ISI, Cochrane, Scopus, Global Health and CINAHL databases and conducted lateral searches from reference lists for papers published from 2004 to 2011, inclusive. All empirical papers that tested associations between body mass index and dental caries in child and adolescent populations (aged 0 to 18 years) were included. Results: Dental caries is associated with both high and low body mass index. Conclusion: A non-linear association between body mass index and dental caries may account for inconsistent findings in previous research. We recommend future research investigate the nature of the association between body mass index and dental caries in samples that include a full range of body mass index scores, and explore how factors such as socioeconomic status mediate the association between body mass index and dental caries. PMID:23171603

  9. Validation of cryo-EM structure of IP₃R1 channel.

    PubMed

    Murray, Stephen C; Flanagan, John; Popova, Olga B; Chiu, Wah; Ludtke, Steven J; Serysheva, Irina I

    2013-06-04

    About a decade ago, three electron cryomicroscopy (cryo-EM) single-particle reconstructions of IP3R1 were reported at low resolution. It was disturbing that these structures bore little similarity to one another, even at the level of quaternary structure. Recently, we published an improved structure of IP3R1 at ∼1 nm resolution. However, this structure did not bear any resemblance to any of the three previously published structures, leading to the question of why the structure should be considered more reliable than the original three. Here, we apply several methods, including class-average/map comparisons, tilt-pair validation, and use of multiple refinement software packages, to give strong evidence for the reliability of our recent structure. The map resolution and feature resolvability are assessed with the gold standard criterion. This approach is generally applicable to assessing the validity of cryo-EM maps of other molecular machines. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. On-column trypsinization allows for re-use of matrix in modified multiplexed inhibitor beads assay.

    PubMed

    Petrovic, Voin; Olaisen, Camilla; Sharma, Animesh; Nepal, Anala; Bugge, Steffen; Sundby, Eirik; Hoff, Bård Helge; Slupphaug, Geir; Otterlei, Marit

    2017-04-15

    The Multiplexed Inhibitor Bead (MIB) assay is a previously published quantitative proteomic MS-based approach to study cellular kinomes. A rather extensive procedure, need for multiple custom-made kinase inhibitors and an inability to re-use the MIB-columns, has limited its applicability. Here we present a modified MIB assay in which elution of bound proteins is facilitated by on-column trypsinization. We tested the modified MIB assay by analyzing extract from three human cancer cell lines treated with the cytotoxic drugs cisplatin or docetaxel. Using only three immobilized kinase inhibitors, we were able to detect about 6000 proteins, including ∼40% of the kinome, as well as other signaling, metabolic and structural proteins. The method is reproducible and the MIB-columns are re-usable without loss of performance. This makes the MIB assay a simple, affordable, and rapid assay for monitoring changes in cellular signaling. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  11. An approximation of herd effect due to vaccinating children against seasonal influenza - a potential solution to the incorporation of indirect effects into static models.

    PubMed

    Van Vlaenderen, Ilse; Van Bellinghen, Laure-Anne; Meier, Genevieve; Nautrup, Barbara Poulsen

    2013-01-22

    Indirect herd effect from vaccination of children offers potential for improving the effectiveness of influenza prevention in the remaining unvaccinated population. Static models used in cost-effectiveness analyses cannot dynamically capture herd effects. The objective of this study was to develop a methodology to allow herd effect associated with vaccinating children against seasonal influenza to be incorporated into static models evaluating the cost-effectiveness of influenza vaccination. Two previously published linear equations for approximation of herd effects in general were compared with the results of a structured literature review undertaken using PubMed searches to identify data on herd effects specific to influenza vaccination. A linear function was fitted to point estimates from the literature using the sum of squared residuals. The literature review identified 21 publications on 20 studies for inclusion. Six studies provided data on a mathematical relationship between effective vaccine coverage in subgroups and reduction of influenza infection in a larger unvaccinated population. These supported a linear relationship when effective vaccine coverage in a subgroup population was between 20% and 80%. Three studies evaluating herd effect at a community level, specifically induced by vaccinating children, provided point estimates for fitting linear equations. The fitted linear equation for herd protection in the target population for vaccination (children) was slightly less conservative than a previously published equation for herd effects in general. The fitted linear equation for herd protection in the non-target population was considerably less conservative than the previously published equation. This method of approximating herd effect requires simple adjustments to the annual baseline risk of influenza in static models: (1) for the age group targeted by the childhood vaccination strategy (i.e. children); and (2) for other age groups not targeted (e.g. adults and/or elderly). Two approximations provide a linear relationship between effective coverage and reduction in the risk of infection. The first is a conservative approximation, recommended as a base-case for cost-effectiveness evaluations. The second, fitted to data extracted from a structured literature review, provides a less conservative estimate of herd effect, recommended for sensitivity analyses.
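
    A minimal sketch of how such a linear approximation could be folded into a static model's annual baseline risk is shown below; the slope values, coverage figures, and baseline risks are illustrative placeholders, not the coefficients fitted in the study.

      # Sketch of folding a linear herd-effect approximation into a static model:
      # the annual baseline risk of influenza is scaled down in proportion to the
      # effective vaccine coverage achieved in children. The slopes below are
      # illustrative placeholders, not the coefficients fitted in the study.
      def adjusted_risk(baseline_risk, effective_coverage, slope):
          """Linear approximation, intended for effective coverage roughly in [0.2, 0.8]."""
          reduction = min(1.0, slope * effective_coverage)
          return baseline_risk * (1.0 - reduction)

      effective_coverage = 0.6 * 0.55          # e.g. 60% uptake x 55% vaccine effectiveness
      print("children (targeted):   ", adjusted_risk(0.20, effective_coverage, slope=0.9))
      print("adults (not targeted): ", adjusted_risk(0.10, effective_coverage, slope=0.5))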

  12. Data reuse and the open data citation advantage

    PubMed Central

    Vision, Todd J.

    2013-01-01

    Background. Attribution to the original contributor upon reuse of published data is important both as a reward for data creators and to document the provenance of research findings. Previous studies have found that papers with publicly available datasets receive a higher number of citations than similar studies without available data. However, few previous analyses have had the statistical power to control for the many variables known to predict citation rate, which has led to uncertain estimates of the “citation benefit”. Furthermore, little is known about patterns in data reuse over time and across datasets. Method and Results. Here, we look at citation rates while controlling for many known citation predictors and investigate the variability of data reuse. In a multivariate regression on 10,555 studies that created gene expression microarray data, we found that studies that made data available in a public repository received 9% (95% confidence interval: 5% to 13%) more citations than similar studies for which the data was not made available. Date of publication, journal impact factor, open access status, number of authors, first and last author publication history, corresponding author country, institution citation history, and study topic were included as covariates. The citation benefit varied with date of dataset deposition: a citation benefit was most clear for papers published in 2004 and 2005, at about 30%. Authors published most papers using their own datasets within two years of their first publication on the dataset, whereas data reuse papers published by third-party investigators continued to accumulate for at least six years. To study patterns of data reuse directly, we compiled 9,724 instances of third party data reuse via mention of GEO or ArrayExpress accession numbers in the full text of papers. The level of third-party data use was high: for 100 datasets deposited in year 0, we estimated that 40 papers in PubMed reused a dataset by year 2, 100 by year 4, and more than 150 data reuse papers had been published by year 5. Data reuse was distributed across a broad base of datasets: a very conservative estimate found that 20% of the datasets deposited between 2003 and 2007 had been reused at least once by third parties. Conclusion. After accounting for other factors affecting citation rate, we find a robust citation benefit from open data, although a smaller one than previously reported. We conclude there is a direct effect of third-party data reuse that persists for years beyond the time when researchers have published most of the papers reusing their own data. Other factors that may also contribute to the citation benefit are considered. We further conclude that, at least for gene expression microarray data, a substantial fraction of archived datasets are reused, and that the intensity of dataset reuse has been steadily increasing since 2003. PMID:24109559

  13. Exact vibration analysis of a double-nanobeam-system embedded in an elastic medium by a Hamiltonian-based method

    NASA Astrophysics Data System (ADS)

    Zhou, Zhenhuan; Li, Yuejie; Fan, Junhai; Rong, Dalun; Sui, Guohao; Xu, Chenghui

    2018-05-01

    A new Hamiltonian-based approach is presented for finding exact solutions for transverse vibrations of double-nanobeam-systems embedded in an elastic medium. The continuum model is established within the frameworks of the symplectic methodology and the nonlocal Euler-Bernoulli and Timoshenko beam theories. The symplectic eigenfunctions are obtained after expressing the governing equations in a Hamiltonian form. Exact frequency equations, vibration modes and displacement amplitudes are obtained by using symplectic eigenfunctions and end conditions. Comparisons with previously published work are presented to illustrate the accuracy and reliability of the proposed method. The comprehensive results for arbitrary boundary conditions could serve as benchmark results for verifying numerically obtained solutions. In addition, a study on the difference between the nonlocal beam and the nonlocal plate is also included.

  14. Fault detection and isolation in the challenging Tennessee Eastman process by using image processing techniques.

    PubMed

    Hajihosseini, Payman; Anzehaee, Mohammad Mousavi; Behnam, Behzad

    2018-05-22

    Early fault detection and isolation in industrial systems is a critical factor in preventing equipment damage. In the proposed method, instead of using the time signals of the sensors directly, the 2D image obtained by placing these signals next to each other in a matrix is used, and a novel fault detection and isolation procedure is then carried out based on image processing techniques. Different features, including texture, wavelet transform, and the mean and standard deviation of the image, combined with MLP and RBF neural network based classifiers, have been used for this purpose. The obtained results indicate the notable efficacy of the proposed method in detecting and isolating faults of the Tennessee Eastman benchmark process and its superiority over previous techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
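
    The sketch below illustrates only the signals-as-image idea: sensor time signals are stacked row-wise into a 2D array, a few simple image-level features (mean, standard deviation, and one-level Haar wavelet energies) are extracted, and an MLP is trained on them; the synthetic data, feature set, and classifier settings are assumptions for illustration and do not reproduce the paper's configuration.

      # Sketch of the signals-as-image idea: stack the sensor time signals row-wise into
      # a 2D array, extract a few simple image-level features and feed them to an MLP.
      # The feature set and data are illustrative, not the paper's exact configuration.
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      def window_to_image(signals):
          """signals: array of shape (n_sensors, n_samples) -> 2D 'image'."""
          return np.asarray(signals, dtype=float)

      def image_features(img):
          cA, (cH, cV, cD) = pywt.dwt2(img, "haar")      # one-level 2D Haar transform
          return np.array([img.mean(), img.std(),
                           np.sum(cA**2), np.sum(cH**2), np.sum(cV**2), np.sum(cD**2)])

      rng = np.random.default_rng(4)
      windows = rng.normal(size=(200, 52, 64))             # 200 windows x 52 sensors x 64 samples
      labels = rng.integers(0, 2, size=200)                # 0 = normal, 1 = faulty (synthetic)
      X = np.array([image_features(window_to_image(w)) for w in windows])
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, labels)
      print("training accuracy:", clf.score(X, labels))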

  15. Method of measuring blood oxygenation based on spectroscopy of diffusely scattered light

    NASA Astrophysics Data System (ADS)

    Kleshnin, M. S.; Orlova, A. G.; Kirillin, M. Yu.; Golubyatnikov, G. Yu.; Turchin, I. V.

    2017-05-01

    A new approach to the measurement of blood oxygenation is developed and implemented, based on an original two-step algorithm reconstructing the relative concentration of biological chromophores (haemoglobin, water, lipids) from the measured spectra of diffusely scattered light at different distances from the radiation source. The numerical experiments and approbation of the proposed approach using a biological phantom have shown the high accuracy of the reconstruction of optical properties of the object in question, as well as the possibility of correct calculation of the haemoglobin oxygenation in the presence of additive noises without calibration of the measuring device. The results of the experimental studies in animals agree with the previously published results obtained by other research groups and demonstrate the possibility of applying the developed method to the monitoring of blood oxygenation in tumour tissues.
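
    As a highly simplified illustration of the unmixing step, the sketch below recovers relative chromophore concentrations from a single synthetic attenuation spectrum by non-negative least squares and reports oxygen saturation as HbO2/(HbO2 + Hb); the extinction spectra and noise level are invented placeholders, and the paper's two-step multi-distance reconstruction is not reproduced.

      # Highly simplified illustration of the unmixing idea: given extinction spectra of
      # the chromophores and a measured attenuation spectrum, recover relative
      # concentrations by non-negative least squares and report oxygen saturation as
      # HbO2 / (HbO2 + Hb). All spectra and numbers below are synthetic placeholders.
      import numpy as np
      from scipy.optimize import nnls

      wavelengths = np.linspace(650, 900, 26)
      # Columns: synthetic extinction spectra for "HbO2", "Hb", "water" (placeholders).
      E = np.column_stack([
          0.3 + 0.002 * (wavelengths - 650),
          0.8 - 0.002 * (wavelengths - 650),
          0.05 + 0.001 * (wavelengths - 650),
      ])
      true_c = np.array([0.7, 0.3, 0.5])
      attenuation = E @ true_c + 0.01 * np.random.default_rng(5).normal(size=wavelengths.size)

      c_hat, _ = nnls(E, attenuation)
      sto2 = c_hat[0] / (c_hat[0] + c_hat[1])
      print("recovered concentrations:", np.round(c_hat, 3), " StO2 ~", round(sto2, 3))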

  16. Nonparametric spirometry reference values for Hispanic Americans.

    PubMed

    Glenn, Nancy L; Brown, Vanessa M

    2011-02-01

    Recent literature cites ethnic origin as a major factor in developing pulmonary function reference values. Extensive studies have established reference values for European and African Americans, but not for Hispanic Americans. The Third National Health and Nutrition Examination Survey defines Hispanic as individuals of Spanish-speaking cultures. While no group was excluded from the target population, sample size requirements only allowed inclusion of individuals who identified themselves as Mexican Americans. This research constructs nonparametric confidence intervals for Hispanic American pulmonary function reference values. The method is applicable to all ethnicities. We use empirical likelihood confidence intervals to establish normal ranges for reference values. Their major advantage is that they are model-free while sharing the asymptotic properties of model-based methods. Statistical comparisons indicate that empirical likelihood interval lengths are comparable to those of normal-theory intervals. Power and efficiency studies agree with previously published theoretical results.
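
    As a simpler nonparametric stand-in for the empirical likelihood intervals used in the paper (which are not reproduced here), a minimal percentile-bootstrap confidence interval sketch on simulated spirometry values; the data and the choice of statistic are assumptions.

    ```python
    import numpy as np

    def bootstrap_percentile_ci(values, stat=np.mean, level=0.95,
                                n_boot=10000, seed=0):
        """Nonparametric percentile-bootstrap confidence interval for a
        statistic (a simpler stand-in for empirical likelihood intervals)."""
        rng = np.random.default_rng(seed)
        values = np.asarray(values, dtype=float)
        boots = np.array([stat(rng.choice(values, size=values.size, replace=True))
                          for _ in range(n_boot)])
        alpha = (1.0 - level) / 2.0
        return np.quantile(boots, [alpha, 1.0 - alpha])

    # Toy example with simulated FEV1 values (litres); not real spirometry data.
    fev1 = np.random.default_rng(3).normal(3.2, 0.5, 250)
    lo, hi = bootstrap_percentile_ci(fev1)
    print(f"95% CI for mean FEV1: ({lo:.2f}, {hi:.2f}) L")
    ```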

  17. Quantitative photoabsorption and fluorescence spectroscopy of benzene, naphthalene, and some derivatives at 106-295 nm

    NASA Technical Reports Server (NTRS)

    Suto, Masako; Wang, Xiuyan; Shan, Jun; Lee, L. C.

    1992-01-01

    Photoabsorption and fluorescence cross sections of benzene, (o-, m-, p-) xylenes, naphthalene, 1-methylnaphthalene, and 2-ethylnaphthalene in the gas phase are measured at 106-295 nm using synchrotron radiation as a light source. Fluorescence is observed from the photoexcitation of benzene and xylenes at 230-280 nm and from naphthalene and its derivatives at 190-295 nm. The absolute fluorescence cross section is determined by calibration against the emission intensity of the NO(A-X) system, for which the fluorescence quantum yield is equal to 1. To cross-check the calibration method, the quantum yield of the SO2(C-X) system at 220-230 nm was measured, since it is also approximately equal to 1. The current quantum-yield data are compared with previously published values measured by different methods.

  18. An efficient formulation and implementation of the analytic energy gradient method to the single and double excitation coupled-cluster wave function - Application to Cl2O2

    NASA Technical Reports Server (NTRS)

    Rendell, Alistair P.; Lee, Timothy J.

    1991-01-01

    The analytic energy gradient for the single and double excitation coupled-cluster (CCSD) wave function has been reformulated and implemented in a new set of programs. The reformulated set of gradient equations has a lower computational cost than any previously published. The iterative solution of the linear equations and the construction of the effective density matrices are fully vectorized, being based on matrix multiplications. The new method has been used to investigate the Cl2O2 molecule, which has recently been postulated as an important intermediate in the destruction of ozone in the stratosphere. In addition to reporting computational timings, the CCSD equilibrium geometries, harmonic vibrational frequencies, infrared intensities, and relative energetics of three isomers of Cl2O2 are presented.

  19. Solution of the one-dimensional consolidation theory equation with a pseudospectral method

    USGS Publications Warehouse

    Sepulveda, N.

    1991-01-01

    The one-dimensional consolidation theory equation is solved for an aquifer system using a pseudospectral method. The spatial derivatives are computed using Fast Fourier Transforms and the time derivative is solved using a fourth-order Runge-Kutta scheme. The computer model calculates compaction based on the void ratio changes accumulated during the simulated periods of time. Compactions and expansions resulting from groundwater withdrawals and recharges are simulated for two observation wells in Santa Clara Valley and two in San Joaquin Valley, California. Field data previously published are used to obtain mean values for the soil grain density and the compression index and to generate depth-dependent profiles for hydraulic conductivity and initial void ratio. The water-level plots for the wells studied were digitized and used to obtain the time dependent profiles of effective stress.
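
    A minimal sketch of the numerical scheme described above, under simplifying assumptions: the 1D consolidation equation is treated as a diffusion equation with a constant coefficient, spatial derivatives are computed with FFTs (which implies periodic boundary conditions), and time stepping uses fourth-order Runge-Kutta. The coefficient value, grid, and initial profile are illustrative, and the void-ratio bookkeeping of the cited model is omitted.

    ```python
    import numpy as np

    # Pseudospectral solution of du/dt = c_v * d2u/dz2 (excess pore pressure u)
    # with FFT-based spatial derivatives and RK4 time stepping.
    N, L = 128, 100.0                   # grid points, domain depth [m]
    c_v = 1.0e-2                        # consolidation coefficient [m^2/s] (illustrative)
    z = np.linspace(0.0, L, N, endpoint=False)
    k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular wavenumbers

    def rhs(u):
        """Diffusion term evaluated spectrally: c_v * d2u/dz2."""
        return c_v * np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(u)))

    def rk4_step(u, dt):
        """One fourth-order Runge-Kutta step."""
        k1 = rhs(u)
        k2 = rhs(u + 0.5 * dt * k1)
        k3 = rhs(u + 0.5 * dt * k2)
        k4 = rhs(u + dt * k3)
        return u + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

    # Initial excess pore pressure: a smooth bump at mid-depth (illustrative).
    u = np.exp(-((z - L / 2) ** 2) / 50.0)
    dt, n_steps = 0.5, 2000
    for _ in range(n_steps):
        u = rk4_step(u, dt)
    print("peak excess pore pressure after %.0f s: %.4f" % (dt * n_steps, u.max()))
    ```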

  20. Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures

    NASA Technical Reports Server (NTRS)

    Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.

    2004-01-01

    The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.
