Sample records for previously developed set

  1. Assessing Discriminative Performance at External Validation of Clinical Prediction Models

    PubMed Central

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.

    2016-01-01

    Introduction External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures for judging changes in the c-statistic from the development to the external validation setting. Methods We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population.
To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients. PMID:26881753
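The permutation logic discussed above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the c-statistic is computed by pairwise comparison of events and non-events, and subjects are reshuffled between the development and validation sets; the exact permutation scheme of the original test may differ, and the data are synthetic.

```python
import numpy as np

def c_statistic(y, p):
    """c-statistic (AUC): chance that a random event receives a higher
    predicted risk than a random non-event (ties count half)."""
    pos, neg = p[y == 1], p[y == 0]
    gt = (pos[:, None] > neg[None, :]).sum()
    eq = (pos[:, None] == neg[None, :]).sum()
    return (gt + 0.5 * eq) / (len(pos) * len(neg))

def permutation_test(y_dev, p_dev, y_val, p_val, n_perm=500, seed=0):
    """Two-sided permutation p-value for the development-vs-validation
    c-statistic gap, obtained by reshuffling subjects between the sets."""
    rng = np.random.default_rng(seed)
    observed = c_statistic(y_dev, p_dev) - c_statistic(y_val, p_val)
    y = np.concatenate([y_dev, y_val])
    p = np.concatenate([p_dev, p_val])
    n_dev, hits = len(y_dev), 0
    for _ in range(n_perm):
        idx = rng.permutation(len(y))
        d = (c_statistic(y[idx[:n_dev]], p[idx[:n_dev]])
             - c_statistic(y[idx[n_dev:]], p[idx[n_dev:]]))
        hits += abs(d) >= abs(observed)
    return observed, hits / n_perm
```

Under case-mix differences, such a test compares c-statistics that are not exchangeable between sets, which is the pitfall the abstract describes.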

  2. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures for judging changes in the c-statistic from the development to the external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  3. Verification and implementation of set-up empirical models in pile design : research project capsule.

    DOT National Transportation Integrated Search

    2016-08-01

    The primary objectives of this research include: performing static and dynamic load tests on newly instrumented test piles to better understand the set-up mechanism for individual soil layers, verifying or recalibrating previously developed empir...

  4. The use of genetic programming to develop a predictor of swash excursion on sandy beaches

    NASA Astrophysics Data System (ADS)

    Passarella, Marinella; Goldstein, Evan B.; De Muro, Sandro; Coco, Giovanni

    2018-02-01

    We use genetic programming (GP), a type of machine learning (ML) approach, to predict the total and infragravity swash excursion using previously published data sets that have been used extensively in swash prediction studies. Three previously published works with a range of new conditions are added to this data set to extend the range of measured swash conditions. Using this newly compiled data set we demonstrate that a ML approach can reduce the prediction errors compared to well-established parameterizations and therefore may improve coastal hazard assessment (e.g. coastal inundation). Predictors obtained using GP can also be physically sound and replicate the functionality and dependencies of previously published formulas. Overall, we show that ML techniques are capable of both improving predictability (compared to classical regression approaches) and providing physical insight into coastal processes.

  5. ImSET: Impact of Sector Energy Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roop, Joseph M.; Scott, Michael J.; Schultz, Robert W.

    2005-07-19

    This version of the Impact of Sector Energy Technologies (ImSET) model represents the "next generation" of the previously developed Visual Basic model (ImBUILD 2.0) that was developed in 2003 to estimate the macroeconomic impacts of energy-efficient technology in buildings. More specifically, a special-purpose version of the 1997 benchmark national Input-Output (I-O) model was designed specifically to estimate the national employment and income effects of the deployment of Office of Energy Efficiency and Renewable Energy (EERE)-developed energy-saving technologies. In comparison with the previous versions of the model, this version allows for more complete and automated analysis of the essential features of energy efficiency investments in buildings, industry, transportation, and the electric power sectors. This version also incorporates improvements in the treatment of operations and maintenance costs, and improves the treatment of financing of investment options. ImSET is also easier to use than extant macroeconomic simulation models and incorporates information developed by each of the EERE offices as part of the requirements of the Government Performance and Results Act.

  6. Visionary Leadership

    DTIC Science & Technology

    1993-06-04

    SHARED PURPOSE + EMPOWERED PEOPLE + APPROPRIATE ORGANIZATIONAL CHANGES + STRATEGIC THINKING = SUCCESSFUL VISIONARY LEADERSHIP. Nanus stated that the...energized and motivated people, acknowledged work well done, counseled and developed subordinates, always set the example, and listened to the soldiers...been given their positions of responsibility based on their previous successes and their potential for future performance. Of course, previous successes

  7. Militant Extremist Mind-Set: Proviolence, Vile World, and Divine Power

    ERIC Educational Resources Information Center

    Stankov, Lazar; Saucier, Gerard; Knezevic, Goran

    2010-01-01

    In the present article, the authors report on the development of a scale for the measurement of the militant extremist mind-set. A previous pilot study identified 56 statements selected from writings of various terrorist groups as well as from psychological, historical, and political texts on terrorism. These statements, together with measures of…

  8. Developing a social enterprise through right to request.

    PubMed

    Parker, Nicola

    2010-10-01

    The 'right to request' the authority to run healthcare services was introduced by the previous government so that staff can respond to the needs of local communities by setting up social enterprises. This article explains how the right to request works in practice by describing how a social enterprise was set up in Bromley, Kent.

  9. Development and external validation of new ultrasound-based mathematical models for preoperative prediction of high-risk endometrial cancer.

    PubMed

    Van Holsbeke, C; Ameye, L; Testa, A C; Mascilini, F; Lindqvist, P; Fischerova, D; Frühauf, F; Fransis, S; de Jonge, E; Timmerman, D; Epstein, E

    2014-05-01

    To develop and validate strategies, using new ultrasound-based mathematical models, for the prediction of high-risk endometrial cancer and compare them with strategies using previously developed models or the use of preoperative grading only. Women with endometrial cancer were prospectively examined using two-dimensional (2D) and three-dimensional (3D) gray-scale and color Doppler ultrasound imaging. More than 25 ultrasound, demographic and histological variables were analyzed. Two logistic regression models were developed: one 'objective' model using mainly objective variables; and one 'subjective' model including subjective variables (i.e. subjective impression of myometrial and cervical invasion, preoperative grade and demographic variables). The following strategies were validated: a one-step strategy using only preoperative grading and two-step strategies using preoperative grading as the first step and one of the new models, subjective assessment or previously developed models as a second step. One-hundred and twenty-five patients were included in the development set and 211 were included in the validation set. The 'objective' model retained preoperative grade and minimal tumor-free myometrium as variables. The 'subjective' model retained preoperative grade and subjective assessment of myometrial invasion. On external validation, the performance of the new models was similar to that on the development set. Sensitivity for the two-step strategy with the 'objective' model was 78% (95% CI, 69-84%) at a cut-off of 0.50, 82% (95% CI, 74-88%) for the strategy with the 'subjective' model and 83% (95% CI, 75-88%) for that with subjective assessment. Specificity was 68% (95% CI, 58-77%), 72% (95% CI, 62-80%) and 71% (95% CI, 61-79%) respectively. The two-step strategies detected up to twice as many high-risk cases as preoperative grading only. The new models had a significantly higher sensitivity than did previously developed models, at the same specificity. 
Two-step strategies with 'new' ultrasound-based models predict high-risk endometrial cancers with good accuracy and do this better than do previously developed models. Copyright © 2013 ISUOG. Published by John Wiley & Sons Ltd.

  10. Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation.

    PubMed

    Jimeno-Yepes, Antonio J; McInnes, Bridget T; Aronson, Alan R

    2011-06-02

    Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH heading to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS CUI linked to the MeSH heading. Each instance has been assigned a UMLS Concept Unique Identifier (CUI). We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 which are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to that of the results previously obtained by these algorithms on the pre-existing data set, NLM WSD. We show that the knowledge-based methods achieve different results but keep their relative performance except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. 
Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest set of UMLS Semantic Types. Furthermore, the MSH WSD data set has been generated automatically reusing already existing annotations and, therefore, can be regenerated from subsequent UMLS versions.

  11. Estimation of Carcinogenicity using Hierarchical Clustering and Nearest Neighbor Methodologies

    EPA Science Inventory

    Previously a hierarchical clustering (HC) approach and a nearest neighbor (NN) approach were developed to model acute aquatic toxicity end points. These approaches were developed to correlate the toxicity for large, noncongeneric data sets. In this study these approaches applie...

  12. Context-specific attentional sampling: Intentional control as a pre-requisite for contextual control.

    PubMed

    Brosowsky, Nicholaus P; Crump, Matthew J C

    2016-08-01

    Recent work suggests that environmental cues associated with previous attentional control settings can rapidly and involuntarily adjust attentional priorities. The current study tests predictions from adaptive-learning and memory-based theories of contextual control about the role of intentions in setting attentional priorities. To extend the empirical boundaries of contextual control phenomena, and to determine whether theoretical principles of contextual control are generalizable, we used a novel bi-dimensional stimulus sampling task. Subjects viewed briefly presented arrays of letters and colors presented above or below fixation, and identified specific stimuli according to a dimensional (letter or color) and positional cue. Location was predictive of the cued dimension, but not the position or identity. In contrast to previous findings, contextual control failed to develop through automatic, adaptive-learning processes. Instead, previous experience with intentionally changing attentional sampling priorities between different contexts was required for contextual control to develop. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Defining Outcome Measures for Psoriasis: The IDEOM Report from the GRAPPA 2016 Annual Meeting.

    PubMed

    Callis Duffin, Kristina; Gottlieb, Alice B; Merola, Joseph F; Latella, John; Garg, Amit; Armstrong, April W

    2017-05-01

    The International Dermatology Outcome Measures (IDEOM) psoriasis working group was established to develop core domains and measurements sets for psoriasis clinical trials and ultimately clinical practice. At the 2016 annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis, the IDEOM psoriasis group presented an overview of its progress toward developing this psoriasis core domain set. First, it summarized the February 2016 meeting of all involved with the IDEOM, highlighting patient and payer perspectives on outcome measures. Second, the group presented an overview of the consensus process for developing the core domain set for psoriasis, including previous literature reviews, nominal group exercises, and meeting discussions. Future plans include the development of working groups to review candidate measures for at least 2 of the domains, including primary pathophysiologic manifestations and patient-reported outcomes, and Delphi surveys to gain consensus on the final psoriasis core domain set.

  14. Fast Multiscale Algorithms for Information Representation and Fusion

    DTIC Science & Technology

    2011-07-01

    We are also developing convenient command-line invocation tools in addition to the previously developed APIs. Various real-world data sets...This knowledge is important in geolocation applications where knowing whether a received signal is line-of-sight or not is necessary for the

  15. Detections of Propellers in Saturn's Rings using Machine Learning: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Gordon, Mitchell K.; Showalter, Mark R.; Odess, Jennifer; Del Villar, Ambi; LaMora, Andy; Paik, Jin; Lakhani, Karim; Sergeev, Rinat; Erickson, Kristen; Galica, Carol; Grayzeck, Edwin; Morgan, Thomas; Knopf, William

    2015-11-01

    We report on the initial analysis of the output of a tool designed to identify persistent, non-axisymmetric features in the rings of Saturn. This project introduces a new paradigm for scientific software development. The preliminary results include what appear to be new detections of propellers in the rings of Saturn. The Planetary Data System (PDS), working with the NASA Tournament Lab (NTL), Crowd Innovation Lab at Harvard University, and the Topcoder community at Appirio, Inc., under the umbrella “Cassini Rings Challenge”, sponsored a set of competitions employing crowd sourcing and machine learning to develop a tool which could be made available to the community at large. The Challenge was tackled by running a series of separate contests to solve individual tasks prior to the major machine learning challenge. Each contest comprised a set of requirements, a timeline, one or more prizes, and other incentives, and was posted by Appirio to the Topcoder Community. In the case of the machine learning challenge (a “Marathon Challenge” on the Topcoder platform), members competed against each other by submitting solutions that were scored in real time and posted to a public leader-board by a scoring algorithm developed by Appirio for this contest. The current version of the algorithm was run against ~30,000 of the highest resolution Cassini ISS images. That set included 668 images with a total of 786 features previously identified as propellers in the main rings. The tool identified 81% of those previously identified propellers. In a preliminary, close examination of 130 detections identified by the tool, we determined that of the 130 detections, 11 were previously identified propeller detections, 5 appear to be new detections of known propellers, and 4 appear to be detections of propellers which have not been seen previously. 
A total of 20 valid detections from 130 candidates implies a relatively high false positive rate which we hope to reduce by further algorithm development. The machine learning aspect of the algorithm means that as our set of verified detections increases so does the pool of “ground-truth” data used to train the algorithm for future use.

  16. Protein Models Docking Benchmark 2

    PubMed Central

    Anishchenko, Ivan; Kundrotas, Petras J.; Tuzikov, Alexander V.; Vakser, Ilya A.

    2015-01-01

    Structural characterization of protein-protein interactions is essential for our ability to understand life processes. However, only a fraction of known proteins have experimentally determined structures. Such structures provide templates for modeling of a large part of the proteome, where individual proteins can be docked by template-free or template-based techniques. Still, the sensitivity of the docking methods to the inherent inaccuracies of protein models, as opposed to the experimentally determined high-resolution structures, remains largely untested, primarily due to the absence of appropriate benchmark set(s). Structures in such a set should have pre-defined inaccuracy levels and, at the same time, resemble actual protein models in terms of structural motifs/packing. The set should also be large enough to ensure statistical reliability of the benchmarking results. We present a major update of the previously developed benchmark set of protein models. For each interactor, six models were generated with the model-to-native Cα RMSD in the 1 to 6 Å range. The models in the set were generated by a new approach, which corresponds to the actual modeling of new protein structures in the “real case scenario,” as opposed to the previous set, where a significant number of structures were model-like only. In addition, the larger number of complexes (165 vs. 63 in the previous set) increases the statistical reliability of the benchmarking. We estimated the highest accuracy of the predicted complexes (according to CAPRI criteria), which can be attained using the benchmark structures. The set is available at http://dockground.bioinformatics.ku.edu. PMID:25712716
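The model-to-native Cα RMSD used above to grade the benchmark models is conventionally computed after optimal superposition via the Kabsch algorithm. A minimal NumPy sketch (not the benchmark's own pipeline; coordinates are assumed to be pre-matched Cα atoms):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Cα RMSD between matched (N, 3) coordinate sets after optimal
    rigid-body superposition (Kabsch algorithm)."""
    # Remove translation by centering both point sets
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    # Optimal rotation from the SVD of the covariance matrix
    H = P.T @ Q
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))
```

A model that is a rigid transform of the native structure scores 0 Å; the benchmark's six model tiers would score roughly 1 to 6 Å against their natives.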

  17. Prediction of Fat-Free Mass in Kidney Transplant Recipients.

    PubMed

    Størset, Elisabet; von Düring, Marit Elizabeth; Godang, Kristin; Bergan, Stein; Midtvedt, Karsten; Åsberg, Anders

    2016-08-01

    Individualization of drug doses is essential in kidney transplant recipients. For many drugs, the individual dose is better predicted when using fat-free mass (FFM) as a scaling factor. Multiple equations have been developed to predict FFM based on healthy subjects. These equations have not been evaluated in kidney transplant recipients. The objectives of this study were to develop a kidney transplant specific equation for FFM prediction and to evaluate its predictive performance compared with previously published equations. Ten weeks after transplantation, FFM was measured by dual-energy X-ray absorptiometry. Data from a consecutive cohort of 369 kidney transplant recipients were randomly assigned to an equation development data set (n = 245) or an evaluation data set (n = 124). Prediction equations were developed using linear and nonlinear regression analysis. The predictive performance of the developed equation and previously published equations in the evaluation data set was assessed. The following equation was developed: FFM (kg) = {FFMmax × body weight (kg)/[81.3 + body weight (kg)]} × [1 + height (cm) × 0.052] × [1-age (years) × 0.0007], where FFMmax was estimated to be 11.4 in males and 10.2 in females. This equation provided an unbiased, precise prediction of FFM in the evaluation data set: mean error (ME) (95% CI), -0.71 kg (-1.60 to 0.19 kg) in males and -0.36 kg (-1.52 to 0.80 kg) in females, root mean squared error 4.21 kg (1.65-6.77 kg) in males and 3.49 kg (1.15-5.84 kg) in females. Using previously published equations, FFM was systematically overpredicted in kidney-transplanted males [ME +1.33 kg (0.40-2.25 kg) to +5.01 kg (4.06-5.95 kg)], but not in females [ME -2.99 kg (-4.07 to -1.90 kg) to +3.45 kg (2.29-4.61) kg]. A new equation for FFM prediction in kidney transplant recipients has been developed. The equation may be used for population pharmacokinetic modeling and clinical dose selection in kidney transplant recipients.
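The published equation can be transcribed directly; a minimal sketch (function and variable names are ours, the coefficients are those quoted in the abstract):

```python
def predict_ffm(weight_kg, height_cm, age_years, male):
    """Fat-free mass (kg) from the kidney-transplant-specific equation
    quoted in the abstract: FFM = {FFMmax * W / (81.3 + W)}
    * [1 + H * 0.052] * [1 - A * 0.0007]."""
    ffm_max = 11.4 if male else 10.2  # sex-specific estimate from the study
    return (ffm_max * weight_kg / (81.3 + weight_kg)) \
        * (1 + height_cm * 0.052) * (1 - age_years * 0.0007)
```

For example, an 80 kg, 180 cm, 50-year-old male recipient is predicted to have roughly 56.5 kg of fat-free mass.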

  18. Low Back Pain in 17 Countries, a Rasch Analysis of the ICF Core Set for Low Back Pain

    ERIC Educational Resources Information Center

    Roe, Cecilie; Bautz-Holter, Erik; Cieza, Alarcos

    2013-01-01

    Previous studies indicate that a worldwide measurement tool may be developed based on the International Classification of Functioning Disability and Health (ICF) Core Sets for chronic conditions. The aim of the present study was to explore the possibility of constructing a cross-cultural measurement of functioning for patients with low back pain…

  19. Developing Enhanced Blood–Brain Barrier Permeability Models: Integrating External Bio-Assay Data in QSAR Modeling

    PubMed Central

    Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander

    2015-01-01

    Purpose Experimental Blood–Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods We compiled a BBB permeability database consisting of 439 unique compounds from various resources. The database was split into a modeling set of 341 compounds and a validation set of 98 compounds. Consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models. A five-fold cross-validation approach was used to validate the developed models, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate relevant transporter profiles for target compounds. The transporter profiles were used as additional biological descriptors to develop hybrid QSAR BBB models. Results The consensus QSAR models have R2=0.638 for fivefold cross-validation and R2=0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R2=0.646 for five-fold cross-validation and R2=0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions The BBB permeability models developed in this study can be useful for early evaluation of new compounds (e.g., new drug candidates). The combination of chemical and biological descriptors shows a promising direction to improve the current traditional QSAR models. PMID:25862462
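The hybrid-descriptor idea above (pooling chemical descriptors with predicted transporter-profile descriptors before cross-validated fitting) can be sketched as follows. This is an illustration only: the descriptors and labels are random stand-ins, and a random forest with 5-fold cross-validation stands in for the authors' consensus QSAR workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120
chem = rng.normal(size=(n, 20))         # chemical descriptors (stand-in)
transporters = rng.normal(size=(n, 5))  # transporter/bio-assay profile (stand-in)
# Synthetic response depending on both descriptor types
y = chem[:, 0] + 0.5 * transporters[:, 0] + rng.normal(scale=0.3, size=n)

# Hybrid model: concatenate chemical and biological descriptors
X_hybrid = np.hstack([chem, transporters])
scores = cross_val_score(RandomForestRegressor(random_state=0),
                         X_hybrid, y, cv=5, scoring="r2")
```

The abstract's comparison (R2 = 0.638 vs. 0.646 in cross-validation) corresponds to running the same workflow with and without the transporter columns.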

  20. Continued development of a detailed model of arc discharge dynamics

    NASA Technical Reports Server (NTRS)

    Beers, B. L.; Pine, V. W.; Ives, S. T.

    1982-01-01

    Using a previously developed set of codes (SEMC, CASCAD, ACORN), a parametric study was performed to quantify the parameters which describe the development of a single-electron-initiated avalanche into a negative tip streamer. The electron distribution function in Teflon is presented for values of the electric field in the range of four hundred million volts/meter to four billion volts/meter. A formulation of the scattering parameters is developed which shows that the transport can be represented by three independent variables. The distribution of ionization sites is used to initiate an avalanche. The self-consistent evolution of the avalanche is computed over the parameter range of the scattering set.

  1. A study on the application of topic models to motif finding algorithms.

    PubMed

    Basha Gutierrez, Josep; Nakai, Kenta

    2016-12-22

    Topic models are statistical algorithms which try to discover the structure of a set of documents according to the abstract topics contained in them. Here we try to apply this approach to the discovery of the structure of the transcription factor binding sites (TFBS) contained in a set of biological sequences, which is a fundamental problem in molecular biology research for the understanding of transcriptional regulation. We present two methods that make use of topic models for motif finding. First, we developed an algorithm in which a set of biological sequences is treated as text documents, and the k-mers contained in them as words, in order to build a correlated topic model (CTM) and iteratively reduce its perplexity. We also used the perplexity measurement of CTMs to improve our previous algorithm, based on a genetic algorithm and several statistical coefficients. The algorithms were tested with 56 data sets from four different species and compared to 14 other methods by the use of several coefficients, both at nucleotide and site level. The results of our first approach showed a performance comparable to the other methods studied, especially at site level and in sensitivity scores, in which it scored better than any of the 14 existing tools. In the case of our previous algorithm, the new approach with the addition of the perplexity measurement clearly outperformed all of the other methods in sensitivity, both at nucleotide and site level, and in overall performance at site level. The statistics obtained show that the performance of a motif finding method based on the use of a CTM is satisfying enough to conclude that the application of topic models is a valid method for developing motif finding algorithms. Moreover, the addition of topic models to a previously developed method dramatically increased its performance, suggesting that this combined algorithm can be a useful tool to successfully predict motifs in different kinds of sets of DNA sequences.

  2. Developing Nitrogen Load-Eelgrass Response Relationships for New England Estuaries

    EPA Science Inventory

    We have accumulated and analyzed eelgrass areal extent data for 67 estuaries from three New England states. To our knowledge this is the largest data set of its kind. Previous comparative studies have utilized data from a far smaller number of estuaries (ten or less) to develop e...

  3. Reading, Writing, and Animation in Character Learning in Chinese as a Foreign Language

    ERIC Educational Resources Information Center

    Xu, Yi; Chang, Li-Yun; Zhang, Juan; Perfetti, Charles A.

    2013-01-01

    Previous studies suggest that writing helps reading development in Chinese in both first and second language settings by enabling higher-quality orthographic representation of the characters. This study investigated the comparative effectiveness of reading, animation, and writing in developing foreign language learners' orthographic knowledge…

  4. Being Influenced or Being an Influence: New Teachers' Induction Experiences

    ERIC Educational Resources Information Center

    Keay, Jeanne

    2009-01-01

    This article draws on and develops the outcomes of previous research which concluded that school subject departments provide the setting for influential professional development and that experienced teachers strongly influence their newly qualified colleagues. The findings of two subsequent research projects, which used this as a starting point,…

  5. Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation

    PubMed Central

    2011-01-01

    Background Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. Methods In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH heading to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS CUI linked to the MeSH heading. Each instance has been assigned a UMLS Concept Unique Identifier (CUI). We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. Results The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 which are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to that of the results previously obtained by these algorithms on the pre-existing data set, NLM WSD. We show that the knowledge-based methods achieve different results but keep their relative performance except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. Conclusions The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. 
Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest set of UMLS Semantic Types. Furthermore, the MSH WSD data set has been generated automatically reusing already existing annotations and, therefore, can be regenerated from subsequent UMLS versions. PMID:21635749
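
    The citation-extraction step described in the Methods can be sketched as follows. This is a minimal illustration of the idea only: the citations, MeSH headings, and CUIs below are made-up stand-ins, not MSH WSD data, and the real pipeline works against the full Metathesaurus and MEDLINE indexing.

```python
from collections import defaultdict

def build_wsd_instances(term, sense_headings, citations, max_per_sense=100):
    """Collect labeled WSD instances for one ambiguous term.

    sense_headings: dict mapping candidate MeSH heading -> UMLS CUI.
    citations: iterable of (text, assigned_mesh_headings) pairs.
    A citation yields an instance only when the term occurs in its text
    and exactly one of the candidate headings was assigned by indexers;
    the instance is labeled with that heading's CUI.
    """
    instances = defaultdict(list)
    for text, headings in citations:
        if term.lower() not in text.lower():
            continue
        matched = [h for h in sense_headings if h in headings]
        if len(matched) != 1:  # zero or multiple candidate senses: skip
            continue
        cui = sense_headings[matched[0]]
        if len(instances[cui]) < max_per_sense:  # cap at 100 per sense
            instances[cui].append(text)
    return dict(instances)

# Hypothetical example for the ambiguous term "cold"
citations = [
    ("Cold exposure reduced core temperature.", {"Cold Temperature"}),
    ("The patient presented with a common cold.", {"Common Cold"}),
    ("Cold weather and cold symptoms overlapped.", {"Cold Temperature", "Common Cold"}),
]
senses = {"Cold Temperature": "C0009264", "Common Cold": "C0009443"}
result = build_wsd_instances("cold", senses, citations)
```

    The third citation is discarded because both candidate headings co-occur, which mirrors how the method keeps only unambiguously labeled instances.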

  6. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, was developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability against this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP, which solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving-boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.

  7. Multiscale intensity homogeneity transformation method and its application to computer-aided detection of pulmonary embolism in computed tomographic pulmonary angiography (CTPA)

    NASA Astrophysics Data System (ADS)

    Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Wei, Jun; Chughtai, Aamer; Sundaram, Baskaran; Hadjiiski, Lubomir M.; Patel, Smita; Kazerooni, Ella A.

    2013-04-01

    A 3D multiscale intensity homogeneity transformation (MIHT) method was developed to reduce false positives (FPs) in our previously developed CAD system for pulmonary embolism (PE) detection. In MIHT, the voxel intensity of a PE candidate region was transformed to an intensity homogeneity value (IHV) with respect to the local median intensity. The IHVs were calculated at multiple scales (MIHVs) to measure intensity homogeneity while taking into account vessels of different sizes and different degrees of occlusion. Seven new features, including the entropy, gradient, and moments that characterized the intensity distributions of the candidate regions, were derived from the MIHVs and combined with the previously designed shape and intensity features of PE candidates to train a linear classifier for FP reduction. With IRB approval, 59 CTPA PE cases were collected from our patient files (UM set), and 69 cases were obtained from the PIOPED II data set with access permission. In total, 595 and 800 PEs were identified as the reference standard by experienced thoracic radiologists in the UM and PIOPED sets, respectively. FROC analysis was used for performance evaluation. Compared with our previous CAD system, at a test sensitivity of 80%, the new method reduced the FP rate from 18.9 to 14.1/scan for the PIOPED set when the classifier was trained with the UM set, and from 22.6 to 16.0/scan when training and testing were reversed. The improvement was statistically significant (p<0.05) by JAFROC analysis. This study demonstrated that the MIHT method is effective in reducing FPs and improving the performance of the CAD system.
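
    The core transformation can be sketched in a few lines. The abstract only says voxel intensities are transformed "with respect to the local median intensity" at several scales; the exact formula is not given, so a signed difference from the local median is assumed here, and the brute-force median is a stand-in for an efficient filter.

```python
import numpy as np

def local_median(vol, r):
    """Median of the (2r+1)^3 neighborhood around each voxel (brute force)."""
    out = np.empty(vol.shape, dtype=float)
    for idx in np.ndindex(vol.shape):
        sl = tuple(slice(max(i - r, 0), i + r + 1) for i in idx)
        out[idx] = np.median(vol[sl])
    return out

def multiscale_ihv(vol, radii=(1, 2, 4)):
    """Stack of assumed IHVs: signed difference from the local median,
    computed at one neighborhood radius per scale."""
    vol = np.asarray(vol, dtype=float)
    return np.stack([vol - local_median(vol, r) for r in radii])
```

    A homogeneous region maps to zero at every scale, while a voxel that is brighter than its surroundings (e.g., contrast-filled vessel next to an occlusion) gets a positive value, which is the kind of signal the derived features summarize.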

  8. Electronic Principles VI, 7-10. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This sixth of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  9. The Development, Test, and Evaluation of Three Pilot Performance Reference Scales.

    ERIC Educational Resources Information Center

    Horner, Walter R.; And Others

    A set of pilot performance reference scales was developed based upon airborne Audio-Video Recording (AVR) of student performance in T-37 undergraduate Pilot Training. After selection of the training maneuvers to be studied, video tape recordings of the maneuvers were selected from video tape recordings already available from a previous research…

  10. Electronic Principles IV, 7-8. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This fourth of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  11. Protandry of western corn rootworm (Coleoptera: Chrysomelidae) beetle emergence partially due to earlier egg hatch of males

    USDA-ARS?s Scientific Manuscript database

    The western corn rootworm, Diabrotica virgifera virgifera LeConte, exhibits protandry. The contribution of pre-hatch development to protandry in western corn rootworm was previously investigated with a small set of data from one population. To verify the contribution of pre-hatch development to prot...

  12. "More Writing than Welding": Learning in Worker Writer Groups

    ERIC Educational Resources Information Center

    Woodin, Tom

    2005-01-01

    The Federation of Worker Writers and Community Publishers was set up in 1976 by a number of independent writing and publishing groups to support and develop the writing of working class and other marginalized people. Focusing on the development of individuals within a collective organization over the previous three decades provides important…

  13. Electronic Principles VIII, 7-12. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This eighth of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  14. Electronic Principles III, 7-7. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This third of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronics principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  15. Electronic Principles IX, 7-13. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This ninth of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  16. Measuring Attending Behavior and Short-Term Memory with Knox's Cube Test.

    ERIC Educational Resources Information Center

    Stone, Mark H.; Wright, Benjamin D.

    1983-01-01

    A new revision was developed using Rasch psychometric techniques to build a Knox's Cube Test (KCT) variable and item bank using the tapping series from all previous editions. The report forms developed give a clear picture of the subject's performance, set in a context that is both normative and criterion-referenced. (Author/BW)

  17. Electronic Principles X, 7-14. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This tenth of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on…

  18. PUBERTAL DEVELOPMENT IN FEMALE WISTAR RATS FOLLOWING EXPOSURE TO PROPAZINE AND ATRAZINE BIOTRANSFORMATION BY-PRODUCTS, DIAMINO-S-CHLOROTRIAZINE AND HYDROXYATRAZINE

    EPA Science Inventory

    We have shown previously that the chlorotriazine herbicide, atrazine (ATR), delays the onset of pubertal development in female rats. ATR and its by-products of microbial degradation are present in soil and groundwater. Since current maximum contaminant levels are set only for ATR...

  19. Transcriptomic Signature of the SHATTERPROOF2 Expression Domain Reveals the Meristematic Nature of Arabidopsis Gynoecial Medial Domain

    PubMed Central

    Villarino, Gonzalo H.; Hu, Qiwen; Flores-Vergara, Miguel; Sehra, Bhupinder; Brumos, Javier; Stepanova, Anna N.; Sundberg, Eva; Heber, Steffen

    2016-01-01

    Plant meristems, like animal stem cell niches, maintain a pool of multipotent, undifferentiated cells that divide and differentiate to give rise to organs. In Arabidopsis (Arabidopsis thaliana), the carpel margin meristem is a vital meristematic structure that generates ovules from the medial domain of the gynoecium, the female floral reproductive structure. The molecular mechanisms that specify this meristematic region and regulate its organogenic potential are poorly understood. Here, we present a novel approach to analyze the transcriptional signature of the medial domain of the Arabidopsis gynoecium, highlighting the developmental stages that immediately precede ovule initiation, the earliest stages of seed development. Using a floral synchronization system and a SHATTERPROOF2 (SHP2) domain-specific reporter, paired with FACS and RNA sequencing, we assayed the transcriptome of the gynoecial medial domain with temporal and spatial precision. This analysis reveals a set of genes that are differentially expressed within the SHP2 expression domain, including genes that have been shown previously to function during the development of medial domain-derived structures, including the ovules, thus validating our approach. Global analyses of the transcriptomic data set indicate a similarity of the pSHP2-expressing cell population to previously characterized meristematic domains, further supporting the meristematic nature of this gynoecial tissue. Our method identifies additional genes including novel isoforms, cis-natural antisense transcripts, and a previously unrecognized member of the REPRODUCTIVE MERISTEM family of transcriptional regulators that are potential novel regulators of medial domain development. This data set provides genome-wide transcriptional insight into the development of the carpel margin meristem in Arabidopsis. PMID:26983993

  20. Comparison of Magnetic Resonance Imaging-based vocal tract area functions obtained from the same speaker in 1994 and 2002

    PubMed Central

    Story, Brad H.

    2008-01-01

    A new set of area functions for vowels has been obtained with Magnetic Resonance Imaging (MRI) from the same speaker as that previously reported in 1996 [Story, Titze, & Hoffman, JASA, 100, 537–554 (1996)]. The new area functions were derived from image data collected in 2002, whereas the previously reported area functions were based on MR images obtained in 1994. When compared, the new area function sets indicated a tendency toward a constricted pharyngeal region and expanded oral cavity relative to the previous set. Based on calculated formant frequencies and sensitivity functions, these morphological differences were shown to have the primary acoustic effect of systematically shifting the second formant (F2) downward in frequency. Multiple instances of target vocal tract shapes from a specific speaker provide additional sampling of the possible area functions that may be produced during speech production. This may be of benefit for understanding intra-speaker variability in vowel production and for further development of speech synthesizers and speech models that utilize area function information. PMID:18177162

  1. Developing an Enhanced Lightning Jump Algorithm for Operational Use

    NASA Technical Reports Server (NTRS)

    Schultz, Christopher J.; Petersen, Walter A.; Carey, Lawrence D.

    2009-01-01

    Overall Goals: 1. Build on the lightning jump framework set through previous studies. 2. Understand what typically occurs in nonsevere convection with respect to increases in lightning. 3. Ultimately develop a lightning jump algorithm for use on the Geostationary Lightning Mapper (GLM). Four lightning jump algorithm configurations were developed (2-sigma, 3-sigma, Threshold 10, and Threshold 8). Five algorithms were tested on a population of 47 nonsevere and 38 severe thunderstorms. Results indicate that the 2-sigma algorithm performed best over the entire thunderstorm sample set, with a POD of 87%, a FAR of 35%, a CSI of 59%, and an HSS of 75%.
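
    A minimal sketch of a 2-sigma-style jump test, assuming a simple formulation in which the latest change in total flash rate is compared against the mean plus two standard deviations of the preceding changes; the analysis period length, activity thresholds, and reset logic of the cited studies are omitted.

```python
import numpy as np

def two_sigma_jump(flash_rates, window=5):
    """Flag a lightning jump in a per-period total flash-rate series.

    flash_rates: total flash rate per analysis period (e.g., per minute).
    A jump is flagged when the latest rate change (DFRDT) exceeds the
    mean + 2 standard deviations of the preceding `window` changes.
    """
    dfrdt = np.diff(np.asarray(flash_rates, dtype=float))
    if len(dfrdt) < window + 1:  # not enough history to form a baseline
        return False
    history = dfrdt[-(window + 1):-1]
    return dfrdt[-1] > history.mean() + 2.0 * history.std()
```

    A slowly fluctuating storm produces no flag, while a sudden surge in flash rate relative to its own recent variability does, which is the behavior the 2-sigma configuration exploits.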

  2. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights Into Developmental Theories From Adult Data

    PubMed Central

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately-following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children. PMID:23539267

  3. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights Into Developmental Theories From Adult Data.

    PubMed

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately-following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children.

  4. Children’s Numerical Equivalence Judgments: Crossmapping Effects

    PubMed Central

    Mix, Kelly S.

    2009-01-01

    Preschoolers made numerical comparisons between sets with varying degrees of shared surface similarity. When surface similarity was pitted against numerical equivalence (i.e., crossmapping), children made fewer number matches than when surface similarity was neutral (i.e., all sets contained the same objects). Only children who understood the number words for the target sets performed above chance in the crossmapping condition. These findings are consistent with previous research on children’s non-numerical comparisons (e.g., Rattermann & Gentner, 1998; Smith, 1993) and suggest that the same mechanisms may underlie numerical development. PMID:19655027

  5. Do cybernetics, system science and fuzzy sets share some epistemological problems? I. An analysis of cybernetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamburrini, G.; Termini, S.

    1982-01-01

    The general thesis underlying the present paper is that there are very strong methodological relations among cybernetics, system science, artificial intelligence, fuzzy sets and many other related fields. Then, in order to understand better both the achievements and the weak points of all the previous disciplines, one should look for common features that allow viewing them in this general frame. What will be done is to present a brief analysis of the primitive program of cybernetics, presenting it as a case study useful for developing the previous thesis. Among the discussed points are the problems of interdisciplinarity and of the unity of cybernetics. Some implications of this analysis for a new reading of general system theory and fuzzy sets are briefly outlined at the end of the paper. 3 references.

  6. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets

    PubMed Central

    2012-01-01

    Background While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. 
While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
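
    The three-step structure (per-pixel scoring, candidate filtering, automatic seeding) can be sketched as a small pipeline. The stand-ins below, thresholding for the random-forest patch classifier and a size filter for contour-pair classification, are illustrative placeholders only; just the pipeline shape follows the abstract.

```python
import numpy as np
from collections import deque

def classify_patches(img, thresh):
    """Step 1 stand-in: per-pixel score thresholding (replaces the
    random-forest patch classifier)."""
    return img > thresh

def components(mask):
    """4-connected components of a 2D boolean mask via BFS."""
    seen = np.zeros(mask.shape, dtype=bool)
    comps = []
    rows, cols = mask.shape
    for i in range(rows):
        for j in range(cols):
            if mask[i, j] and not seen[i, j]:
                queue, comp = deque([(i, j)]), []
                seen[i, j] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                comps.append(comp)
    return comps

def seeds(img, thresh=0.5, min_size=3):
    """Chain the steps: classify, filter candidates, emit seed points
    (centroids) that would initialize the level set operation."""
    mask = classify_patches(img, thresh)                       # step 1
    kept = [c for c in components(mask) if len(c) >= min_size] # step 2 stand-in
    return [tuple(np.mean(c, axis=0)) for c in kept]           # step 3
```

    On a toy image with one 2x2 blob and one isolated false-positive pixel, the size filter drops the lone pixel and a single seed at the blob centroid survives, mirroring how the later stages prune the patch classifier's false positives.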

  7. Marker-assisted NIL development of an Oryza sativa x Oryza rufipogon cross using SSRs, InDels and SNPs

    USDA-ARS?s Scientific Manuscript database

    A set of near isogenic lines (NILs) with introgressions from O. rufipogon (IRGC 105491) in the genetic background of an elite US variety, cv Jefferson, were developed to confirm the performance of six yield-enhancing QTLs identified in a previous study. Approximately 200 SSRs were used to evaluate ...

  8. Electronic Principles V, 7-9. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This fifth of 10 blocks of student and teacher materials for a postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Prerequisites are the previous blocks. This block on solid state…

  9. Does Research Degree Supervisor Training Work? The Impact of a Professional Development Induction Workshop on Supervision Practice

    ERIC Educational Resources Information Center

    McCulloch, Alistair; Loeser, Cassandra

    2016-01-01

    Supervisor induction and continued professional development programmes constitute good practice and are enshrined in institutional policies and national codes of practice. However, there is little evidence about whether they have an impact on either supervisors' learning or day-to-day practice. Set in a discussion of previous literature, this…

  10. Electronic Principles II, 7-6. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This second of 10 blocks of student and teacher materials for a secondary/postsecondary level course in electronic principles comprises one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. A prerequisite is the previous block. This block on AC…

  11. Choosing Important Health Outcomes for Comparative Effectiveness Research: An Updated Review and Identification of Gaps.

    PubMed

    Gorst, Sarah L; Gargon, Elizabeth; Clarke, Mike; Smith, Valerie; Williamson, Paula R

    2016-01-01

    The COMET (Core Outcome Measures in Effectiveness Trials) Initiative promotes the development and application of core outcome sets (COS), including relevant studies in an online database. In order to keep the database current, an annual search of the literature is undertaken. This study aimed to update a previous systematic review, in order to identify any further studies where a COS has been developed. Furthermore, no prioritization for COS development has previously been undertaken, therefore this study also aimed to identify COS relevant to the world's most prevalent health conditions. The methods used in this updated review followed the same approach used in the original review and the previous update. A survey was also sent to the corresponding authors of COS identified for inclusion in this review, to ascertain what lessons they had learnt from developing their COS. Additionally, the COMET database was searched to identify COS that might be relevant to the conditions with the highest global prevalence. Twenty-five reports relating to 22 new studies were eligible for inclusion in the review. Further improvements were identified in relation to the description of the scope of the COS, use of the Delphi technique, and the inclusion of patient participants within the development process. Additionally, 33 published and ongoing COS were identified for 13 of the world's most prevalent conditions. The development of a reporting guideline and minimum standards should contribute towards future improvements in development and reporting of COS. This study has also described a first approach to identifying gaps in existing COS, and to priority setting in this area. Important gaps have been identified, on the basis of global burden of disease, and the development and application of COS in these areas should be considered a priority.

  12. Development of a set of process and structure indicators for palliative care: the Europall project

    PubMed Central

    2012-01-01

    Background By measuring the quality of the organisation of palliative care with process and structure quality indicators (QIs), patients, caregivers and policy makers are able to monitor the extent to which recommendations and guidelines are met, such as the recommendations of the WHO on palliative care. This will support the implementation of public programmes and enable comparisons between organisations or countries. Methods As no European set of indicators for the organisation of palliative care existed, such a set of QIs was developed. An update of a previous systematic review was made and extended with more databases and grey literature. In two project meetings with practitioners and experts in palliative care, the development process of the QI set was finalised and the QIs were categorized in a framework covering the recommendations of the Council of Europe. Results The searches resulted in 151 structure and process indicators, which were discussed in steering group meetings. Of those QIs, 110 were eligible for the final framework. Conclusions We developed the first set of QIs for the organisation of palliative care. This article is the first step in a multi-step project to identify, validate and pilot QIs. PMID:23122255

  13. Comparison of taxon-specific versus general locus sets for targeted sequence capture in plant phylogenomics.

    PubMed

    Chau, John H; Rahfeldt, Wolfgang A; Olmstead, Richard G

    2018-03-01

    Targeted sequence capture can be used to efficiently gather sequence data for large numbers of loci, such as single-copy nuclear loci. Most published studies in plants have used taxon-specific locus sets developed individually for a clade using multiple genomic and transcriptomic resources. General locus sets can also be developed from loci that have been identified as single-copy and have orthologs in large clades of plants. We identify and compare a taxon-specific locus set and three general locus sets (conserved ortholog set [COSII], shared single-copy nuclear [APVO SSC] genes, and pentatricopeptide repeat [PPR] genes) for targeted sequence capture in Buddleja (Scrophulariaceae) and outgroups. We evaluate their performance in terms of assembly success, sequence variability, and resolution and support of inferred phylogenetic trees. The taxon-specific locus set had the most target loci. Assembly success was high for all locus sets in Buddleja samples. For outgroups, general locus sets had greater assembly success. Taxon-specific and PPR loci had the highest average variability. The taxon-specific data set produced the best-supported tree, but all data sets showed improved resolution over previous non-sequence capture data sets. General locus sets can be a useful source of sequence capture targets, especially if multiple genomic resources are not available for a taxon.

  14. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity for any point on Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, as in the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. New changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available to the scientific community.
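
    The direct problem the abstract mentions reduces to the rigid-rotation relation v = ω × r. A minimal sketch (in Python, not PEM2's Matlab code) might look like the following; a spherical Earth of mean radius is assumed and ECEF velocity components are returned.

```python
import numpy as np

EARTH_RADIUS_M = 6.371e6  # mean Earth radius, assumed spherical

def unit_vector(lat_deg, lon_deg):
    """ECEF unit vector for a geocentric latitude/longitude in degrees."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity_mm_yr(pole_lat, pole_lon, rate_deg_per_myr,
                         site_lat, site_lon):
    """Direct problem: v = omega x r for a site on a rigid block."""
    omega = np.radians(rate_deg_per_myr) * unit_vector(pole_lat, pole_lon)  # rad/Myr
    r = EARTH_RADIUS_M * unit_vector(site_lat, site_lon)                    # m
    return np.cross(omega, r) / 1.0e3  # m/Myr -> mm/yr
```

    As a sanity check, a pole at the north pole rotating at 1 deg/Myr moves an equatorial site eastward at roughly 111 mm/yr, the familiar "one degree is about 111 km" scale.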

  15. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods are still to be explored systematically. Built upon a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods, focusing particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and thereby to further validate the previously presented agility indicators with additional cases. A multiple case study research method is employed. The findings of the study reveal that teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. Possible reasons include a previous organizational plan-driven culture, resistance towards the Scrum roles and changing resources.

  16. Solution and reasoning reuse in space planning and scheduling applications

    NASA Technical Reports Server (NTRS)

    Verfaillie, Gerard; Schiex, Thomas

    1994-01-01

    In the space domain, as in other domains, CSP (Constraint Satisfaction Problem) techniques are increasingly used to represent and solve planning and scheduling problems. But these techniques have been developed to solve CSPs composed of fixed sets of variables and constraints, whereas many planning and scheduling problems are dynamic. It is therefore important to develop methods that allow a new solution to be found rapidly, as close as possible to the previous one, when some variables or constraints are added or removed. After presenting some existing approaches, this paper proposes a simple and efficient method developed on the basis of the dynamic backtracking algorithm. This method allows a previous solution and the reasoning that produced it to be reused in the framework of a CSP that is close to the previous one. Some experimental results on general random CSPs and on operation scheduling problems for remote sensing satellites are given.
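
    The idea of reusing a previous solution when the problem changes can be illustrated with a generic min-conflicts local repair, which starts from the old assignment instead of solving from scratch. This is a sketch of the general concept only, not the paper's dynamic-backtracking method, and the toy variables and constraints are invented for the example.

```python
import random

def repair(assignment, domains, constraints, max_steps=1000, seed=0):
    """Min-conflicts repair starting from a previous solution.

    constraints: list of (variable_names, predicate) pairs.
    Repeatedly picks a conflicted variable and moves it to the value in
    its domain that minimizes its conflicts, so the result tends to stay
    close to the starting assignment.
    """
    rng = random.Random(seed)
    assign = dict(assignment)

    def conflicts(var, val):
        trial = dict(assign)
        trial[var] = val
        return sum(1 for names, pred in constraints
                   if var in names and not pred(*(trial[n] for n in names)))

    for _ in range(max_steps):
        bad = [v for v in assign if conflicts(v, assign[v]) > 0]
        if not bad:
            return assign  # all constraints satisfied
        var = rng.choice(bad)
        assign[var] = min(domains[var], key=lambda val: conflicts(var, val))
    return None  # gave up

# Example: the previous solution {x: 1, y: 1} violates a newly added
# constraint x + y == 3 (and x != y); repair it rather than re-solve.
constraints = [(("x", "y"), lambda a, b: a != b),
               (("x", "y"), lambda a, b: a + b == 3)]
solution = repair({"x": 1, "y": 1},
                  {"x": [0, 1, 2, 3], "y": [0, 1, 2, 3]},
                  constraints)
```

    Only one variable needs to move to restore consistency here, which is the payoff of reuse: the repaired solution stays near the previous one instead of being recomputed from an empty assignment.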

  17. Tutoring Online Tutors: Using Digital Badges to Encourage the Development of Online Tutoring Skills

    ERIC Educational Resources Information Center

    Hrastinski, Stefan; Cleveland-Innes, Martha; Stenbom, Stefan

    2018-01-01

    Online tutors play a critical role in e-learning and need to have an appropriate set of skills in addition to subject matter expertise. This paper explores how digital badges can be used to encourage the development of online tutoring skills. Based on previous research, we defined three digital badges, which are examples of essential tutoring…

  18. Less Developed Countries (LDCs) Facing Higher Education Curricula Reform Challenges in a "New World (Dis)Order"

    ERIC Educational Resources Information Center

    Gilder, Eric

    2011-01-01

    In a previous article for "EJHE," I detailed Curricula Reform (CR) efforts in Higher Education (HE) in four (relatively) well developed regional and national settings (The EU, the USA, Hong Kong SAR China, and Singapore). I detailed the backdrop motivating the moves by policymakers to reform the curricula in such "world class"…

  19. Maternal Psychopathology and Infant Development at 18 Months: The Impact of Maternal Personality Disorder and Depression

    ERIC Educational Resources Information Center

    Conroy, Susan; Pariante, Carmine M.; Marks, Maureen N.; Davies, Helen A.; Farrelly, Simone; Schacht, Robin; Moran, Paul

    2012-01-01

    Objective: No previous longitudinal study has examined the impact of comorbid maternal personality disorder (PD) and depression on child development. We set out to examine whether maternal PD and depression assessed at 2 months post partum would be independently associated with adverse developmental outcomes at 18 months of age. Method: Women were…

  20. Delirium in the geriatric unit: proton-pump inhibitors and other risk factors.

    PubMed

    Otremba, Iwona; Wilczyński, Krzysztof; Szewieczek, Jan

    2016-01-01

    Delirium remains a major nosocomial complication in hospitalized elderly patients. Predictive models for delirium may be useful for identifying high-risk patients for implementation of preventive strategies. The objective was to evaluate specific factors for development of delirium in a geriatric ward setting. This prospective cross-sectional study comprised 675 consecutive patients aged 79.2±7.7 years (66% women and 34% men), admitted to the subacute geriatric ward of a multiprofile university hospital, after exclusion of 113 patients treated with antipsychotic medication because of behavioral disorders before admission. Comprehensive geriatric assessment was performed, including a structured interview, physical examination, geriatric functional assessment, blood sampling, ECG, abdominal ultrasound, chest X-ray, the Confusion Assessment Method for diagnosis of delirium, the Delirium-O-Meter to assess delirium severity, the Richmond Agitation-Sedation Scale to assess sedation or agitation, and the visual analog scale and Doloplus-2 scale to assess pain level. Multivariate logistic regression analysis revealed five independent factors associated with development of delirium in geriatric inpatients: transfer between hospital wards (odds ratio [OR] =2.78; confidence interval [CI] =1.54-5.01; P=0.001), preexisting dementia (OR =2.29; CI =1.44-3.65; P<0.001), previous delirium incidents (OR =2.23; CI =1.47-3.38; P<0.001), previous fall incidents (OR =1.76; CI =1.17-2.64; P=0.006), and use of proton-pump inhibitors (OR =1.67; CI =1.11-2.53; P=0.014). Transfer between hospital wards, preexisting dementia, previous delirium incidents, previous fall incidents, and use of proton-pump inhibitors are predictive of development of delirium in the geriatric inpatient setting.
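
    A sketch of how the reported odds ratios combine in a logistic risk score. The odds ratios are taken from the abstract; the intercept (baseline log-odds) is hypothetical, since the model's intercept is not reported here, so the absolute probabilities are illustrative only.

```python
import math

# Combining the study's reported odds ratios into a logistic risk score.
# BASELINE_LOG_ODDS is a hypothetical intercept for illustration only; the
# paper's actual intercept is not given in the abstract.
ODDS_RATIOS = {
    "ward_transfer": 2.78,
    "dementia": 2.29,
    "previous_delirium": 2.23,
    "previous_falls": 1.76,
    "proton_pump_inhibitor": 1.67,
}
BASELINE_LOG_ODDS = -2.0  # hypothetical

def delirium_risk(**risk_factors):
    """Predicted probability of delirium given binary risk factors."""
    log_odds = BASELINE_LOG_ODDS
    for factor, present in risk_factors.items():
        if present:
            log_odds += math.log(ODDS_RATIOS[factor])
    return 1.0 / (1.0 + math.exp(-log_odds))

low = delirium_risk()
high = delirium_risk(ward_transfer=True, dementia=True, previous_delirium=True)
```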

  1. The Development of the Speaker Independent ARM Continuous Speech Recognition System

    DTIC Science & Technology

    1992-01-01

    spoken airborne reconnaissance reports using a speech recognition system based on phoneme-level hidden Markov models (HMMs). Previous versions of the ARM...will involve automatic selection from multiple model sets, corresponding to different speaker types, and that the most rudimentary partition of a...The vocabulary size for the ARM task is 497 words. These words are related to the phoneme-level symbols corresponding to the models in the model set

  2. Algorithmic tools for interpreting vital signs.

    PubMed

    Rathbun, Melina C; Ruth-Sahd, Lisa A

    2009-07-01

    Today's complex world of nursing practice challenges nurse educators to develop teaching methods that promote critical thinking skills and foster quick problem solving in the novice nurse. Traditional pedagogies previously used in the classroom and clinical setting are no longer adequate to prepare nursing students for entry into practice. In addition, educators have expressed frustration when encouraging students to apply newly learned theoretical content to direct the care of assigned patients in the clinical setting. This article presents algorithms as an innovative teaching strategy to guide novice student nurses in the interpretation and decision making related to vital sign assessment in an acute care setting.
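
    A hypothetical, simplified example of the kind of vital-sign interpretation algorithm the article advocates as a teaching aid. The thresholds are illustrative textbook adult ranges, not clinical guidance, and the function name is invented.

```python
# Hypothetical teaching-aid algorithm for vital-sign interpretation.
# Thresholds are illustrative adult textbook ranges, NOT clinical guidance.

def assess_vital_signs(heart_rate, systolic_bp, resp_rate, temp_c):
    findings = []
    if heart_rate < 60:
        findings.append("bradycardia")
    elif heart_rate > 100:
        findings.append("tachycardia")
    if systolic_bp < 90:
        findings.append("hypotension")
    if resp_rate > 20:
        findings.append("tachypnea")
    if temp_c >= 38.0:
        findings.append("fever")
    if not findings:
        return "within normal limits"
    return "escalate: " + ", ".join(findings)

result = assess_vital_signs(heart_rate=118, systolic_bp=86, resp_rate=24, temp_c=38.4)
```

    Walking novices through branch points like these, step by step, is the decision-making practice the article describes.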

  3. Validation of the Contextual Assessment Inventory for Problem Behavior

    ERIC Educational Resources Information Center

    Carr, Edward G.; Ladd, Mara V.; Schulte, Christine F.

    2008-01-01

    Problem behavior is a major barrier to successful community integration for people with developmental disabilities. Recently, there has been increased interest in identifying contextual factors involving setting events and discriminative stimuli that impact the display of problem behavior. The authors previously developed the "Contextual…

  4. Landfill and Wastewater Treatment RNG Chemical and Physical Profiling: Increasing the Database Set

    DOT National Transportation Integrated Search

    2011-08-15

    The purpose of this USDOT PHMSA sponsored research project was to address the continued development of a draft guidance document for the safe introduction of renewable gas into natural gas pipelines. This project was designed to build upon previous s...

  5. Dysaerobic trace fossils and ichnofabrics in the upper Jurassic Kimmeridge Clay of southern England

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wignall, P.B.

    The trace fossil suite from the Kimmeridge Clay is calibrated against an oxygen gradient derived from previous geochemical, lithological and shelly macrofaunal studies. Several soft-bodied trace makers appear to have tolerated lower oxygen tensions than even the hardiest shelly benthic macrofauna, a common occurrence in both recent and ancient dysaerobic settings. Lowest diversity trace fossil assemblages consist of Astacimorphichnus etchesi (new ichnotaxon), a small endostratal pascichnial trace attributed to pioneering polychaete populations. Ekdale and Mason's (1988) contention that fodinichnia dominate the lowest diversity and lowest oxygen settings is not substantiated, as the only example of this feeding strategy, Rhizocorallium irregulare, is encountered in moderately diverse trace fossil assemblages associated with a low diversity shelly macrofauna. Upper dysaerobic conditions are characterized by the development of a surface mixed layer and the consequent destruction of fine lamination. Tiering is only developed under normal oxygen conditions, with Chondrites as the deepest trace. In contrast to many previous studies, Chondrites is never found in dysaerobic facies.

  6. Women in science, engineering and technology (SET): a report on the Indonesian experience.

    PubMed

    Hermawati, W; Luhulima, A S

    2000-01-01

    This paper presents the preliminary results of a study by the Gender Working Group, Indonesian Institute of Sciences, on women's contribution to, and benefits to women from, science, engineering and technology (SET), specifically the benefits accruing to disadvantaged women in urban and rural areas in Indonesia. Previous studies on the participation of women in SET have shown the under-representation of women in all SET activities, including decision-making and advisory positions. However, some studies have shown that if gender perspectives are included in the design and implementation of development activities, disadvantaged women in urban and rural areas could greatly benefit from SET in development projects. The two case studies in North Sulawesi and Central Lombok provinces show that the projects have enabled the expansion of employment opportunities for women and thus increased their technical skills and income. In addition, the projects have also contributed to enhancing women's self-confidence, self-reliance and communication skills.

  7. 40 CFR 35.910-5 - Additional allotments of previously withheld sums.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ($11 billion) and subtracting the previously allotted sums, formerly set forth in § 35.910-3(c). (c... Pub. L. 93-243; and, finally, by subtracting the previously allotted sums set forth in § 35.910-4(c). (d) Based upon the computations set forth in paragraphs (b) and (c) of this section, the total...

  8. Early Examples from the Integrated Multi-Satellite Retrievals for GPM (IMERG)

    NASA Astrophysics Data System (ADS)

    Huffman, George; Bolvin, David; Braithwaite, Daniel; Hsu, Kuolin; Joyce, Robert; Kidd, Christopher; Sorooshian, Soroosh; Xie, Pingping

    2014-05-01

    The U.S. GPM Science Team's Day-1 algorithm for computing combined precipitation estimates as part of GPM is the Integrated Multi-satellitE Retrievals for GPM (IMERG). The goal is to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG is being developed as a unified U.S. algorithm drawing on strengths in the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS). We review the IMERG design and development, plans for testing, and current status. Some of the lessons learned in running and reprocessing the previous data sets include the importance of quality-controlling input data sets, strategies for coping with transitions in the various input data sets, and practical approaches to retrospective analysis of multiple output products (namely the real- and post-real-time data streams). IMERG output will be illustrated using early test data, including the variety of supporting fields, such as the merged-microwave and infrared estimates, and the precipitation type. We end by considering recent changes in input data specifications, the transition from TRMM-based calibration to GPM-based, and further "Day 2" development.

  9. Cognitive Support for Transportation Planners: A Collaborative Course of Action Exploration Tool

    DTIC Science & Technology

    2011-06-01

    smaller problem. We chose to work with MIDAS, for the pragmatic reason that MIDAS developers were at Raytheon BBN Technologies, and were accessible to...overall framework we built up to let the planner interact with MIDAS. In [Scott, 2009b], the problem under discussion is the design of a “Joint...previously been used for and what we now want to use it for - we are using MIDAS for a set of problems for which it has not previously been used. It

  10. Development of the global sea ice 6.0 CICE configuration for the Met Office global coupled model

    DOE PAGES

    Rae, J. G. L.; Hewitt, H. T.; Keen, A. B.; ...

    2015-07-24

    The new sea ice configuration GSI6.0, used in the Met Office global coupled configuration GC2.0, is described and the sea ice extent, thickness and volume are compared with the previous configuration and with observationally based data sets. In the Arctic, the sea ice is thicker in all seasons than in the previous configuration, and there is now better agreement of the modelled concentration and extent with the HadISST data set. In the Antarctic, a warm bias in the ocean model has been exacerbated at the higher resolution of GC2.0, leading to a large reduction in ice extent and volume; further work is required to rectify this in future configurations.
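
    A sketch of how sea ice extent is conventionally computed from a gridded concentration field in comparisons like the one with HadISST: sum the area of all cells whose concentration exceeds 15%, the threshold commonly used in such data sets. The tiny grid below is made up for illustration; real configurations use full model grids with varying cell areas.

```python
import numpy as np

# Conventional sea ice extent diagnostic: total area of grid cells with
# concentration above the usual 15% threshold. Toy 3x3 grid, invented values.
concentration = np.array([
    [0.95, 0.80, 0.10],
    [0.60, 0.20, 0.05],
    [0.14, 0.00, 0.00],
])
cell_area_km2 = np.full_like(concentration, 1000.0)  # equal-area cells

extent_km2 = cell_area_km2[concentration > 0.15].sum()
# Sea ice *area* weights each qualifying cell by its concentration instead:
area_km2 = (cell_area_km2 * concentration)[concentration > 0.15].sum()
```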

  12. Children's Gendered Drawings of Play Behaviours

    ERIC Educational Resources Information Center

    Akseer, Tabasum; Lao, Mary Grace; Bosacki, Sandra

    2012-01-01

    According to child psychologists, vital links exist between children's drawings and their emotional, social, and cognitive development. Previous research has explored the important relations between drawings and play in educational settings. Given the vast research that explores the ambiguous topic of children's play, according to Richer (1990),…

  13. Ecoenzymatic Stoichiometry of Stream Sediments with Comparison to Terrestrial Soils

    EPA Science Inventory

    In this study, we extend the development of ecoenzymatic stoichiometry to the surface sediments of stream ecosystems using data collected in a nationwide survey. The data set is larger and more comprehensive than those used in our previous studies. The data include the first broa...

  14. Career Education for Mental Health Workers. Integrative Seminar in Human Service. Human Service Instructional Series. Module No. 5.

    ERIC Educational Resources Information Center

    Redcay, Shirley

    This module on an integrative seminar in human service is one of a set of six developed to prepare human services workers for the changing mental health service delivery system. A total of eight objectives are included to help students integrate previously learned knowledge and skills into a process of assessing service need, developing treatment…

  15. A Cross-Validation Study of Sequential-Simultaneous Processing at Ages 2 1/2-12 1/2 Using the Kaufman Assessment Battery for Children (K-ABC).

    ERIC Educational Resources Information Center

    Kamphaus, Randy W.; And Others

    The development of two types of mental processing (sequential and simultaneous) in preschool and elementary children was examined in this study. Specifically, the aims of the study were to develop a revised set of tasks based upon previous findings (Naglieri, Kaufman, Kaufman, & Kamphaus, 1981; Kaufman, Kaufman, Kamphaus, & Naglieri, in…

  16. Multi-level trellis coded modulation and multi-stage decoding

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.; Wu, Jiantian; Lin, Shu

    1990-01-01

    Several constructions for multi-level trellis codes are presented and many codes with better performance than previously known codes are found. These codes provide a flexible trade-off between coding gain, decoding complexity, and decoding delay. New multi-level trellis coded modulation schemes using generalized set partitioning methods are developed for Quadrature Amplitude Modulation (QAM) and Phase Shift Keying (PSK) signal sets. New rotationally invariant multi-level trellis codes which can be combined with differential encoding to resolve phase ambiguity are presented.

  17. The Therapeutic Alliance: Clients' Categorization of Client-Identified Factors

    ERIC Educational Resources Information Center

    Simpson, Arlene J.; Bedi, Robinder P.

    2012-01-01

    Clients' perspectives on the therapeutic alliance were examined using written descriptions of factors that clients believed to be helpful in developing a strong alliance. Fifty participants sorted previously collected statements into thematically similar piles and then gave each set of statements a title. Multivariate concept mapping statistical…
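
    The first computational step of concept mapping from pile-sort data can be sketched as follows: count how often each pair of statements lands in the same pile, yielding the similarity matrix that the multidimensional scaling and clustering then operate on. The pile sorts below are invented for illustration and do not reproduce the study's data.

```python
# Turning participants' pile sorts into a statement co-occurrence similarity
# matrix, the standard input to concept-mapping MDS/clustering.
# Pile sorts below are hypothetical.
pile_sorts = [
    [{"s1", "s2"}, {"s3", "s4"}],   # participant 1's piles
    [{"s1", "s2", "s3"}, {"s4"}],   # participant 2's piles
    [{"s1"}, {"s2", "s3", "s4"}],   # participant 3's piles
]
statements = ["s1", "s2", "s3", "s4"]

def similarity_matrix(sorts, items):
    """Fraction of participants who placed each pair in the same pile."""
    n = len(sorts)
    sim = {a: {b: 0.0 for b in items} for a in items}
    for piles in sorts:
        for pile in piles:
            for a in pile:
                for b in pile:
                    sim[a][b] += 1.0 / n
    return sim

sim = similarity_matrix(pile_sorts, statements)
```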

  18. Parents' Perceptions of Preschool Activities: Exploring Outdoor Play

    ERIC Educational Resources Information Center

    Jayasuriya, Avanthi; Williams, Marcia; Edwards, Todd; Tandon, Pooja

    2016-01-01

    Research Findings: Outdoor play is important for children's health and development, yet many preschool-age children in child care settings do not receive the recommended 60 min/day of outdoor play. Child care providers have previously described parent-related barriers to increasing outdoor playtime, including parents not providing appropriate…

  19. Exploring Engaging Gamification Mechanics in Massive Online Open Courses

    ERIC Educational Resources Information Center

    Chang, Jen-Wei; Wei, Hung-Yu

    2016-01-01

    Massive open online courses (MOOCs) have developed rapidly and become tremendously popular because of their plentiful gamification designs, such as reputation points, rewards, and goal setting. Although previous studies have mentioned a broad range of gamification designs that might influence MOOC learner engagement, most gamified MOOCs fail to…

  20. Formal Mentoring Relationships and Attachment Theory: Implications for Human Resource Development

    ERIC Educational Resources Information Center

    Germain, Marie-Line

    2011-01-01

    An attachment theory perspective of mentoring is presented to explain the degree of functionality of a mentor-protege formal match in an organizational setting. By focusing on Bowlby's (1969/1982) behavioral system of attachment and its triarchic taxonomy of secure, avoidant, and anxious-ambivalent attachment, previous conceptualizations are…

  1. Multiple Mentoring in Academe: Developing the Professorial Network

    ERIC Educational Resources Information Center

    de Janasz, Suzanne C.; Sullivan, Sherry E.

    2004-01-01

    Previous studies in business organizations have shown that mentoring provides numerous benefits for both individuals and organizations. Most of this mentoring research has been based on traditional, hierarchical mentor-protege relationships in non-academic settings. We discuss why there is little empirical research on faculty mentoring and review…

  2. Development of estrogen receptor beta binding prediction model using large sets of chemicals.

    PubMed

    Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao

    2017-11-03

    We developed an ERβ binding prediction model to facilitate identification of chemicals that specifically bind ERβ or ERα, complementing our previously developed ERα binding model. Decision Forest was used to train the ERβ binding prediction model on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross-validation. Prediction confidence was analyzed using predictions from the cross-validations. Informative chemical features for ERβ binding were identified by analyzing the frequency of the chemical descriptors used in the models across the 5-fold cross-validations. 1000 permutations were conducted to assess chance correlation. The average accuracy of the 5-fold cross-validations was 93.14% with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the prediction. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ERβ binding prediction. Application of the prediction model to data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrate that ERβ binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ERα prediction model, this model is expected to facilitate drug development through identification of chemicals that specifically bind ERβ or ERα.
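
    The evaluation protocol described (k-fold cross-validation plus label-permutation testing) can be sketched as follows. A toy nearest-centroid classifier on synthetic data stands in for the authors' Decision Forest on EADB compounds; everything below is illustrative, not the paper's pipeline.

```python
import random

# Sketch of k-fold cross-validation with a permutation check. A toy
# nearest-centroid classifier replaces the paper's Decision Forest, and
# synthetic 1-D data replace the EADB compounds.
random.seed(0)

def make_data(n=100):
    data = []
    for _ in range(n):
        label = random.choice([0, 1])
        x = random.gauss(float(label) * 2.0, 1.0)  # partially separable feature
        data.append((x, label))
    return data

def nearest_centroid_accuracy(train, test):
    centroids = {}
    for cls in (0, 1):
        vals = [x for x, y in train if y == cls]
        centroids[cls] = sum(vals) / len(vals)
    correct = sum(
        1 for x, y in test
        if min(centroids, key=lambda c: abs(x - centroids[c])) == y
    )
    return correct / len(test)

def cross_validate(data, k=5):
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        accs.append(nearest_centroid_accuracy(train, test))
    return sum(accs) / k

data = make_data()
true_acc = cross_validate(data)
# Permutation test: shuffling the labels should destroy the signal, so
# permuted accuracy near chance indicates the model is not a fluke.
labels = [y for _, y in data]
shuffled = [(x, y) for (x, _), y in zip(data, random.sample(labels, len(labels)))]
perm_acc = cross_validate(shuffled)
```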

  3. A computer-aided detection (CAD) system with a 3D algorithm for small acute intracranial hemorrhage

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Fernandez, James; Deshpande, Ruchi; Lee, Joon K.; Chan, Tao; Liu, Brent

    2012-02-01

    Acute intracranial hemorrhage (AIH) requires urgent diagnosis in the emergency setting to mitigate eventual sequelae. However, experienced radiologists may not always be available to make a timely diagnosis. This is especially true for small AIH, defined as a lesion smaller than 10 mm in size. A computer-aided detection (CAD) system for the detection of small AIH would facilitate timely diagnosis. A previously developed 2D algorithm shows high false positive rates in the evaluation based on LAC/USC cases, due to the limitation of setting up a correct coordinate system for the knowledge-based classification system. To achieve higher sensitivity and specificity, a new 3D algorithm was developed. The algorithm utilizes a top-hat transformation and a dynamic threshold map to detect small AIH lesions. Several key structures of the brain are detected and used to set up a 3D anatomical coordinate system. A rule-based classification of the detected lesions is applied based on the anatomical coordinate system. For convenient evaluation in the clinical environment, the CAD module is integrated with a stand-alone system. The CAD was evaluated on small AIH cases and matched normal cases collected at LAC/USC, and the results of the 3D CAD were compared with those of the previous 2D CAD.
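
    The white top-hat transform mentioned above can be illustrated in one dimension: subtract a morphological opening (erosion followed by dilation) from the signal, so that only bright features narrower than the structuring element survive. This is a generic sketch of the transform, not the paper's 3D implementation, and the toy signal is invented.

```python
import numpy as np

# White top-hat transform sketch: signal minus its morphological opening.
# Only narrow bright features (the "small lesion" analogue) remain.
# 1-D toy signal; a real CAD system would operate on 3-D CT volumes.

def erode(signal, size):
    pad = size // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + size].min() for i in range(len(signal))])

def dilate(signal, size):
    pad = size // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + size].max() for i in range(len(signal))])

def white_tophat(signal, size):
    opening = dilate(erode(signal, size), size)
    return signal - opening

# Broad background ramp with a narrow bright "lesion" at index 10.
x = np.linspace(0.0, 1.0, 21)
signal = x.copy()
signal[10] += 5.0
residue = white_tophat(signal, size=5)
peak = int(residue.argmax())  # the ramp is suppressed, the spike survives
```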

  4. Design of Phoneme MIDI Codes Using the MIDI Encoding Tool “Auto-F” and Realizing Voice Synthesizing Functions Based on Musical Sounds

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    Using our previously developed audio-to-MIDI code converter tool “Auto-F”, we can create MIDI data from given vocal acoustic signals, enabling playback of voice-like signals with a standard MIDI synthesizer. Applying this tool, we are constructing a MIDI database consisting of simple harmonic-structured MIDI codes previously converted from a set of 71 recorded Japanese male and female syllable signals. We are also developing a novel voice synthesizing system based on harmonically synthesizing musical sounds, which can generate MIDI data and play back voice signals with a MIDI synthesizer from Japanese plain (kana) texts, referring to the syllable MIDI code database. In this paper, we propose an improved MIDI converter tool, which can produce temporally higher-resolution MIDI codes. We then propose an algorithm that separates a set of 20 consonant and vowel phoneme MIDI codes from the 71 converted syllable MIDI codes in order to construct a voice synthesizing system. Finally, we present evaluation results comparing the voice synthesis quality of the separated phoneme MIDI codes and their original syllable MIDI codes, using our 4-syllable word listening tests.
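
    As background for any audio-to-MIDI conversion like the one described, the standard equal-temperament mapping between a detected harmonic's frequency and a MIDI note number (A4 = 440 Hz = MIDI note 69) is shown below. Whether Auto-F uses exactly this mapping is an assumption; the formula itself is the MIDI convention.

```python
import math

# Standard equal-temperament frequency <-> MIDI note mapping (A4 = 440 Hz
# = MIDI note 69). Auto-F's internal mapping is assumed to be equivalent.

def frequency_to_midi(freq_hz):
    """Nearest MIDI note number for a given frequency."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_frequency(note):
    """Center frequency of a MIDI note number."""
    return 440.0 * 2 ** ((note - 69) / 12)

a4 = frequency_to_midi(440.0)   # A4
c4 = frequency_to_midi(261.63)  # middle C
```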

  5. The cardiac muscle duplex as a method to study myocardial heterogeneity

    PubMed Central

    Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.

    2014-01-01

    This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702

  6. Automated embolic signal detection using Deep Convolutional Neural Network.

    PubMed

    Sombune, Praotasna; Phienphanich, Phongphan; Phuechpanpaisal, Sutanya; Muengtaweepongsa, Sombat; Ruamthanthong, Anuchit; Tantibundhit, Charturong

    2017-07-01

    This work investigated the potential of deep neural networks for detection of cerebral embolic signals (ES) from transcranial Doppler ultrasound (TCD). The resulting system is intended to be coupled with TCD devices to diagnose stroke risk in real time with high accuracy. The Adaptive Gain Control (AGC) approach developed in our previous study is employed to capture suspected ESs in real time. Using spectrograms of the same TCD signal dataset as our previous work as inputs, and the same experimental setup, a Deep Convolutional Neural Network (CNN), which can learn features while training, was investigated for its ability to bypass the traditional handcrafted feature extraction and selection process. Extracted feature vectors from the suspected ESs are then classified as an ES, artifact (AF) or normal (NR) interval. The effectiveness of the developed system was evaluated on 19 subjects undergoing procedures that generate emboli. The CNN-based system achieved on average 83.0% sensitivity, 80.1% specificity, and 81.4% accuracy, with considerably less development time. A growing set of training samples and computational resources should contribute to even higher performance. Besides having potential use in various clinical ES monitoring settings, continuation of this promising study will benefit the development of wearable applications by leveraging learnable features to serve demographic differentials.
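
    The spectrogram front end that feeds the CNN can be sketched as a short-time Fourier transform. The synthetic tone, window length and hop size below are illustrative assumptions, not the study's actual TCD preprocessing parameters.

```python
import numpy as np

# Sketch of a spectrogram front end (short-time Fourier transform) of the
# kind used as CNN input. Signal and parameters are illustrative only.

def spectrogram(signal, frame_len=128, hop=64):
    window = np.hanning(frame_len)
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * window
        spectrum = np.abs(np.fft.rfft(frame)) ** 2  # power spectrum
        frames.append(spectrum)
    return np.array(frames).T  # shape: (freq_bins, time_frames)

fs = 8000
t = np.arange(0, 0.25, 1.0 / fs)
signal = np.sin(2 * np.pi * 1000 * t)  # steady 1 kHz test tone
spec = spectrogram(signal)
# The dominant frequency bin of the first frame should sit at 1 kHz.
peak_bin = int(spec[:, 0].argmax())
peak_hz = peak_bin * fs / 128
```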

  7. Modelling the impact of liner shipping network perturbations on container cargo routing: Southeast Asia to Europe application.

    PubMed

    Achurra-Gonzalez, Pablo E; Novati, Matteo; Foulser-Piggott, Roxane; Graham, Daniel J; Bowman, Gary; Bell, Michael G H; Angeloudis, Panagiotis

    2016-06-03

    Understanding how container routing stands to be impacted by different scenarios of liner shipping network perturbations, such as natural disasters or new major infrastructure developments, is of key importance for decision-making in the liner shipping industry. The variety of actors and processes within modern supply chains and the complexity of their relationships have previously led to the development of simulation-based models, whose application has been largely compromised by their dependency on extensive and often confidential sets of data. This study proposes the application of optimisation techniques less dependent on complex data sets in order to develop a quantitative framework for assessing the impacts of disruptive events on liner shipping networks. We provide a categorization of liner network perturbations, differentiating between systemic and external perturbations, and formulate a container assignment model that minimises routing costs, extending previous implementations to allow feasible solutions when routing capacity is reduced below transport demand. We develop a base case network for the Southeast Asia to Europe liner shipping trade and a review of accidents related to port disruptions for two scenarios of seismic and political conflict hazards. Numerical results identify alternative routing paths and costs in the aftermath of port disruption scenarios and suggest higher vulnerability of intra-regional connectivity.

  8. A Mapmark method of standard setting as implemented for the National Assessment Governing Board.

    PubMed

    Schulz, E Matthew; Mitzel, Howard C

    2011-01-01

    This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.
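
    The bookmark logic that Mapmark builds on can be sketched as follows: items are ordered by difficulty, a panelist places a bookmark, and the cut score is the ability at which a borderline examinee answers the bookmarked item correctly with a chosen response probability (RP67 is common). A Rasch model is assumed here, and the item difficulties are hypothetical; Mapmark's actual computations may differ in detail.

```python
import math

# Bookmark cut-score sketch under a Rasch model. RP67 and the item
# difficulties below are illustrative assumptions.

def rasch_probability(theta, difficulty):
    """P(correct) for ability theta on an item of given difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def bookmark_cut_score(difficulties, bookmark_index, rp=0.67):
    """Ability theta such that P(correct) = rp on the bookmarked item."""
    b = sorted(difficulties)[bookmark_index]
    return b + math.log(rp / (1.0 - rp))

difficulties = [-1.2, -0.4, 0.3, 0.9, 1.6]  # ordered item map (logits)
cut = bookmark_cut_score(difficulties, bookmark_index=2)  # bookmark on 3rd item
```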

  9. Colon Reference Set Application: Mary Disis - University of Washington (2008) — EDRN Public Portal

    Cancer.gov

    The proposed study aims to validate the diagnostic value of a panel of serum antibodies for the early detection of colorectal cancer (CRC). We have developed a serum antibody based assay that shows promise in discriminating sera from CRC patients from healthy donors. We have evaluated two separate sample sets of sera that were available either commercially or were comprised of left-over samples from previous studies by our group. Both sample sets showed concordance in discriminatory power. We have not been able to identify investigators with a larger, well-defined sample set of early-stage colon cancer sera and request assistance from the EDRN in obtaining such samples to help assess the potential diagnostic value of our autoantibody panel.
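
    Discriminatory power of a panel score like the one described is commonly quantified by the area under the ROC curve, computed here via the rank-sum (Mann-Whitney) formulation. The panel scores below are invented for illustration, not the study's data.

```python
# ROC AUC by the Mann-Whitney formulation: the probability that a randomly
# chosen case scores higher than a randomly chosen control (ties count half).
# Panel scores below are hypothetical.

def auc(case_scores, control_scores):
    wins = 0.0
    for c in case_scores:
        for h in control_scores:
            if c > h:
                wins += 1.0
            elif c == h:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

crc_patients = [0.9, 0.8, 0.75, 0.6, 0.55]  # hypothetical panel scores
healthy_donors = [0.5, 0.4, 0.6, 0.3, 0.2]

panel_auc = auc(crc_patients, healthy_donors)
```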

  10. Mechanisms of fine extinction band development in vein quartz: new insights from correlative light and electron microscopy

    NASA Astrophysics Data System (ADS)

    Derez, Tine; Van Der Donck, Tom; Plümper, Oliver; Muchez, Philippe; Pennock, Gill; Drury, Martyn R.; Sintubin, Manuel

    2017-07-01

    Fine extinction bands (FEBs) (also known as deformation lamellae) visible with polarized light microscopy in quartz consist of a range of nanostructures, implying different formation processes. Previous transmission electron microscopy studies have shown that most FEB nanostructures in naturally deformed quartz are elongated subgrains formed by recovery of dislocation slip bands. Here we show that three types of FEB nanostructure occur in naturally deformed vein quartz from the low-grade metamorphic High-Ardenne slate belt (Belgium). Prismatic oriented FEBs are defined by bands of dislocation walls. Dauphiné twin boundaries present along the FEB boundaries probably formed after FEB formation. In an example of two sub-rhombohedral oriented FEBs, developed as two sets in one grain, the finer FEB set consists of elongated subgrains, similar to FEBs described in previous transmission electron microscopy studies. The second, wider FEB set consists of bands with different dislocation density and fluid-inclusion content. The wider FEB set is interpreted as bands with different plastic strain associated with the primary growth banding of the vein quartz grain. The nanometre-scale fluid inclusions are interpreted to have formed from structurally bounded hydroxyl groups that moreover facilitated formation of the elongate subgrains. Larger fluid inclusions aligned along FEBs are explained by fluid-inclusion redistribution along dislocation cores. The prismatic FEB nanostructure and the relation between FEBs and growth bands have not been recognized before, although related structures have been reported in experimentally deformed quartz.

  11. Achieving Consensus on Total Joint Replacement Trial Outcome Reporting Using the OMERACT Filter: Endorsement of the Final Core Domain Set for Total Hip and Total Knee Replacement Trials for Endstage Arthritis.

    PubMed

    Singh, Jasvinder A; Dowsey, Michelle M; Dohm, Michael; Goodman, Susan M; Leong, Amye L; Scholte Voshaar, Marieke M J H; Choong, Peter F

    2017-11-01

    Discussion and endorsement of the OMERACT total joint replacement (TJR) core domain set for total hip replacement (THR) and total knee replacement (TKR) for endstage arthritis; and next steps for selection of instruments. The OMERACT TJR working group met at the 2016 meeting at Whistler, British Columbia, Canada. We summarized the previous systematic reviews, the preliminary OMERACT TJR core domain set and results from previous surveys. We discussed preliminary core domains for TJR clinical trials, made modifications, and identified challenges with domain measurement. Working group participants (n = 26) reviewed, clarified, and endorsed each of the inner and middle circle domains and added a range of motion domain to the research agenda. TJR were limited to THR and TKR but included all endstage hip and knee arthritis refractory to medical treatment. Participants overwhelmingly endorsed identification and evaluation of top instruments mapping to the core domains (100%) and use of subscales of validated multidimensional instruments to measure core domains for the TJR clinical trial core measurement set (92%). An OMERACT core domain set for hip/knee TJR trials has been defined and we are selecting instruments to develop the TJR clinical trial core measurement set to serve as a common foundation for harmonizing measures in TJR clinical trials.

  12. DIRAC in Large Particle Physics Experiments

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC

    2017-10-01

    The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, while others migrated from previous solutions, either ad hoc or based on different middleware. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet their specific requirements. Each experiment has a heterogeneous set of computing and storage resources at its disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience, using command line tools, Python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.

  13. Stream Kriging: Incremental and recursive ordinary Kriging over spatiotemporal data streams

    NASA Astrophysics Data System (ADS)

    Zhong, Xu; Kealy, Allison; Duckham, Matt

    2016-05-01

    Ordinary Kriging is widely used for geospatial interpolation and estimation. Due to the O(n³) time complexity of solving the system of linear equations, ordinary Kriging for a large set of source points is computationally intensive. Conducting real-time Kriging interpolation over continuously varying spatiotemporal data streams can therefore be especially challenging. This paper develops and tests two new strategies for improving the performance of an ordinary Kriging interpolator adapted to a stream-processing environment. These strategies rely on the expectation that, over time, source data points will frequently refer to the same spatial locations (for example, where static sensor nodes are generating repeated observations of a dynamic field). First, an incremental strategy improves efficiency in cases where a relatively small proportion of previously processed spatial locations are absent from the source points at any given iteration. Second, a recursive strategy improves efficiency in cases where there is substantial overlap between the sets of spatial locations of source points at the current and previous iterations. These two strategies are evaluated in terms of their computational efficiency in comparison to the standard ordinary Kriging algorithm. The results show that these two strategies can reduce the time taken to perform the interpolation by up to 90%, and approach an average-case time complexity of O(n²) when most but not all source points refer to the same locations over time. By combining the approaches developed in this paper with existing heuristic ordinary Kriging algorithms, further efficiency gains could potentially be accrued. The work ultimately contributes to the development of online ordinary Kriging interpolation algorithms, capable of real-time spatial interpolation with large streaming data sets.
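    The O(n³) bottleneck described above comes from solving a dense linear system per estimate. A minimal sketch of the baseline ordinary Kriging estimator (1-D points, user-supplied semivariogram; this is the standard textbook formulation, not the paper's incremental or recursive variants):

```python
import numpy as np

def ordinary_kriging(xs, ys, x0, gamma):
    """Ordinary Kriging estimate at x0 from 1-D source points xs with
    observed values ys, using semivariogram function gamma. Building and
    solving the dense (n+1)x(n+1) system below is the O(n^3) step that
    stream-oriented strategies try to avoid redoing from scratch."""
    n = len(xs)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = gamma(np.abs(xs[:, None] - xs[None, :]))
    K[n, n] = 0.0                       # Lagrange-multiplier corner
    rhs = np.ones(n + 1)
    rhs[:n] = gamma(np.abs(xs - x0))
    w = np.linalg.solve(K, rhs)         # the O(n^3) dense solve
    return float(w[:n] @ ys)            # unbiased weighted estimate

# with a linear variogram and evenly spaced points, this reduces to
# linear interpolation between the neighbours of x0
est = ordinary_kriging(np.array([0.0, 1.0, 2.0]),
                       np.array([10.0, 12.0, 14.0]),
                       1.5, lambda h: h)
```

    The incremental and recursive strategies of the paper amortize this solve across iterations when source locations repeat; the sketch only shows the cost they are avoiding.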

  14. A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring

    NASA Technical Reports Server (NTRS)

    Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.

    1992-01-01

    In this system we develop a fuzzy case based reasoner that can build a case representation for several past anomalies detected, and we develop case retrieval methods that can be used to index a relevant case when a new problem (case) is presented using fuzzy sets. The choice of fuzzy sets is justified by the uncertain data. The new problem can be solved using knowledge of the model along with the old cases. This system can then be used to generalize the knowledge from previous cases and use this generalization to refine the existing model definition. This in turn can help to detect failures using the model based algorithms.

  15. Transport Test Problems for Hybrid Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  16. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  17. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  18. Estimating Single-Event Logic Cross Sections in Advanced Technologies

    NASA Astrophysics Data System (ADS)

    Harrington, R. C.; Kauppila, J. S.; Warren, K. M.; Chen, Y. P.; Maharrey, J. A.; Haeffner, T. D.; Loveless, T. D.; Bhuva, B. L.; Bounasser, M.; Lilja, K.; Massengill, L. W.

    2017-08-01

    Reliable estimation of logic single-event upset (SEU) cross section is becoming increasingly important for predicting the overall soft error rate. As technology scales and single-event transient (SET) pulse widths shrink to the order of the setup-and-hold time of flip-flops, the probability of latching an SET as an SEU must be reevaluated. In this paper, previous assumptions about the relationship of SET pulse width to the probability of latching an SET are reconsidered, and a model for transient latching probability is developed for advanced technologies. A method using the improved transient latching probability and SET data is used to predict logic SEU cross section. The presented model has been used to estimate combinational logic SEU cross sections in 32-nm partially depleted silicon-on-insulator (SOI) technology given experimental heavy-ion SET data. Experimental SEU data show good agreement with the model presented in this paper.
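    The older assumption being reconsidered is usually stated as a window-of-vulnerability model: a transient is latched only if it spans the flip-flop's setup-and-hold window when a clock edge arrives. A sketch of that classic model (an illustration of the assumption under scrutiny, not the improved model developed in the paper):

```python
def latch_probability(set_width, window, clock_period):
    """Classic window-of-vulnerability estimate of the probability that a
    single-event transient of width set_width is captured as an upset by a
    flip-flop with setup-and-hold window `window`, clocked with period
    clock_period. Transients narrower than the window are never latched;
    wider ones are latched in proportion to the leftover overlap."""
    overlap = set_width - window
    return min(max(overlap / clock_period, 0.0), 1.0)

# e.g. a 200 ps transient, 50 ps setup-and-hold window, 1 ns clock:
p = latch_probability(200e-12, 50e-12, 1e-9)
```

    As pulse widths shrink toward the window itself, this linear model predicts latching probabilities near zero, which is exactly the regime the paper argues needs reevaluation.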

  19. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    from the tools being used. For example, while Coq proves properties it does not dump an explanation of the proofs in any currently supported form. ... Hotel room locks and card keys use a simple protocol to manage the transition of rooms from one guest to the next. The lock ... retains that guest key's code. A new guest checks in and gets a card with a new current code, and the previous code set to the previous guest's current...

  20. Use of Web-Based Portfolios as Tools for Reflection in Preservice Teacher Education

    ERIC Educational Resources Information Center

    Oner, Diler; Adadan, Emine

    2011-01-01

    This mixed-methods study examined the use of web-based portfolios for developing preservice teachers' reflective skills. Building on the work of previous research, the authors proposed a set of reflection-based tasks to enrich preservice teachers' internship experiences. Their purpose was to identify (a) whether preservice teachers demonstrated…

  1. Application of a long-established molecular marker in larval teleosts to evaluate estrogenic potential in surface waters and wastewater effluents

    EPA Science Inventory

    In recent years molecular indicators, diagnostic for exposure in aquatic systems, have been developed using teleostean models in laboratory and field settings. Our laboratory has previously shown that the gene for vitellogenin, a protein precursor of egg yolk in oviparous animals...

  2. Promoting Positive Academic Dispositions Using a Web-Based PBL Environment: The GlobalEd 2 Project

    ERIC Educational Resources Information Center

    Brown, Scott W.; Lawless, Kimberly A.; Boyer, Mark A.

    2013-01-01

    Problem-based learning (PBL) is an instructional design approach for promoting student learning, understanding and knowledge development in context rich settings. Previous PBL research has primarily focused on face-to-face learning environments, but current technologies afford PBL designers the opportunities to create online, virtual, PBL…

  3. Developing Computerized Tests for Classroom Teachers: A Pilot Study.

    ERIC Educational Resources Information Center

    Glowacki, Margaret L.; And Others

    Two types of computerized testing have been defined: (1) computer-based testing, using a computer to administer conventional tests in which all examinees take the same set of items; and (2) adaptive tests, in which items are selected for administration by the computer, based on examinee's previous responses. This paper discusses an option for…

  4. Economic benefits of using adaptive predictive models of reproductive toxicity in the context of a tiered testing program

    EPA Science Inventory

    A predictive model of reproductive toxicity, as observed in rat multigeneration reproductive (MGR) studies, was previously developed using high throughput screening (HTS) data from 36 in vitro assays mapped to 8 genes or gene-sets from Phase I of USEPA ToxCast research program, t...

  5. Canopy cover and leaf area index relationships for wheat, triticale, and corn

    USDA-ARS?s Scientific Manuscript database

    The AquaCrop model requires canopy cover (CC) measurements to define crop growth and development. Some previously collected data sets that would be useful for calibrating and validating AquaCrop contain only leaf area index (LAI) data, but could be used if relationships were available relating LAI t...

  6. The Evolution, Design and Implementation of the Minds in Motion Curriculum

    ERIC Educational Resources Information Center

    Cottone, Elizabeth; Chen, Wei-Bing; Brock, Laura

    2013-01-01

    Building on the empirical work of the previous two studies, this paper describes the development of the Minds In Motion curriculum (MIM), as well as the setting and circumstances of a randomized controlled trial conducted to evaluate this intervention. Throughout this paper the authors emphasize the benefits and challenges of assembling an…

  7. Antecedents of Identity Development in a Structured Recreation Setting: A Qualitative Inquiry

    ERIC Educational Resources Information Center

    Duerden, Mat D.; Taniguchi, Stacy; Widmer, Mark

    2012-01-01

    Identity research has focused primarily on outcomes associated with identity formation. Far less attention, however, has been given to understanding the facilitating contextual elements of this process. This qualitative study examined a context, a 2-week adventure recreation program for youth, quantitatively shown in previous research to have…

  8. School Innovation: The Mutual Impacts of Organizational Learning and Creativity

    ERIC Educational Resources Information Center

    McCharen, Belinda; Song, JiHoon; Martens, Jon

    2011-01-01

    The primary aim of this research is to identify cultural determinants of organizational learning and knowledge creation practices, which could be the driving factors for the innovation process in school settings (Mulford, 1998; Silins et al., 2002). A conceptual process model for school innovation was developed. In contrast to previous approaches,…

  9. Context and Occasion Setting in "Drosophila" Visual Learning

    ERIC Educational Resources Information Center

    Brembs, Bjorn; Wiener, Jan

    2006-01-01

    In a permanently changing environment, it is by no means an easy task to distinguish potentially important events from negligible ones. Yet, to survive, every animal has to continuously face that challenge. How does the brain accomplish this feat? Building on previous work in "Drosophila melanogaster" visual learning, we have developed an…

  10. Patterns of Student Growth in Reasoning about Multivariate Correlational Problems.

    ERIC Educational Resources Information Center

    Ross, John A.; Cousins, J. Bradley

    Previous studies of the development of correlational reasoning have focused on the interpretation of relatively simple data sets contained in 2 X 2 tables. In contrast, this study examined age trends in subjects' responses to problems involving more than two continuous variables. The research is part of a multi-year project to conceptualize…

  11. Measuring the diffusion of innovative health promotion programs.

    PubMed

    Steckler, A; Goodman, R M; McLeroy, K R; Davis, S; Koch, G

    1992-01-01

    Once a health promotion program has proven to be effective in one or two initial settings, attempts may be made to transfer the program to new settings. One way to conceptualize the transference of health promotion programs from one locale to another is by considering the programs to be innovations that are being diffused. In this way, diffusion of innovation theory can be applied to guide the process of program transference. This article reports on the development of six questionnaires to measure the extent to which health promotion programs are successfully disseminated: Organizational Climate, Awareness-Concern, Rogers's Adoption Variables, Level of Use, Level of Success, and Level of Institutionalization. The instruments are being successfully used in a study of the diffusion of health promotion/tobacco prevention curricula to junior high schools in North Carolina. The instruments, which measure the four steps of the diffusion process, have construct validity since they were developed within existing theories and are derived from the work of previous researchers. No previous research has attempted to use instruments like these to measure sequentially the stages of the diffusion process.

  12. PDF4LHC recommendations for LHC Run II

    DOE PAGES

    Butterworth, Jon; Carrazza, Stefano; Cooper-Sarkar, Amanda; ...

    2016-01-06

    We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+αs uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. Finally, we discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.

  13. Day 1 for the Integrated Multi-Satellite Retrievals for GPM (IMERG) Data Sets

    NASA Astrophysics Data System (ADS)

    Huffman, G. J.; Bolvin, D. T.; Braithwaite, D.; Hsu, K. L.; Joyce, R.; Kidd, C.; Sorooshian, S.; Xie, P.

    2014-12-01

    The Integrated Multi-satellitE Retrievals for GPM (IMERG) is designed to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG was developed to use GPM Core Observatory data as a reference for the international constellation of satellites of opportunity that constitute the GPM virtual constellation. Computationally, IMERG is a unified U.S. algorithm drawing on strengths in the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS). We review the IMERG design, development, testing, and current status. IMERG provides 0.1°x0.1° half-hourly data, and will be run at multiple times, providing successively more accurate estimates: 4 hours, 8 hours, and 2 months after observation time. In Day 1 the spatial extent is 60°N-S, for the period March 2014 to the present. In subsequent reprocessing the data will extend to fully global, covering the period 1998 to the present. Both the set of input data set retrievals and the IMERG system are substantially different than those used in previous U.S. products. The input passive microwave data are all being produced with GPROF2014, which is substantially upgraded compared to previous versions. For the first time, this includes microwave sounders. Accordingly, there is a strong need to carefully check the initial test data sets for performance. IMERG output will be illustrated using pre-operational test data, including the variety of supporting fields, such as the merged-microwave and infrared estimates, and the precipitation type. Finally, we will summarize the expected release of various output products, and the subsequent reprocessing sequence.

  14. Speech recognition features for EEG signal description in detection of neonatal seizures.

    PubMed

    Temko, A; Boylan, G; Marnane, W; Lightbody, G

    2010-01-01

    In this work, features which are usually employed in automatic speech recognition (ASR) are used for the detection of neonatal seizures in newborn EEG. Three conventional ASR feature sets are compared to the feature set which has been previously developed for this task. The results indicate that the thoroughly-studied spectral envelope based ASR features perform reasonably well on their own. Additionally, the SVM Recursive Feature Elimination routine is applied to all extracted features pooled together. It is shown that ASR features consistently appear among the top-rank features.
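    The spectral-envelope features mentioned above can be illustrated with a generic log filter-bank front end (frame, window, FFT, band energies). This is a minimal sketch of that family of features under simplifying assumptions (rectangular bands rather than mel-spaced filters), not a reproduction of the paper's exact feature sets or classifier:

```python
import numpy as np

def filterbank_features(signal, frame_len=256, n_bands=8):
    """Per-frame log filter-bank energies: window each frame, take the
    power spectrum, and pool it into coarse bands. Captures the spectral
    envelope, which is the information ASR front ends rely on."""
    n_frames = len(signal) // frame_len
    window = np.hanning(frame_len)
    feats = np.empty((n_frames, n_bands))
    for i in range(n_frames):
        frame = signal[i * frame_len:(i + 1) * frame_len] * window
        power = np.abs(np.fft.rfft(frame)) ** 2
        bands = np.array_split(power, n_bands)   # crude rectangular bands
        feats[i] = np.log([b.sum() + 1e-12 for b in bands])
    return feats                                  # shape (n_frames, n_bands)
```

    For EEG, the same pipeline applies with sampling rates and frame lengths chosen for the much lower signal bandwidth.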

  15. Stability issues of nonlocal gravity during primordial inflation

    NASA Astrophysics Data System (ADS)

    Belgacem, Enis; Cusin, Giulia; Foffa, Stefano; Maggiore, Michele; Mancarella, Michele

    2018-01-01

    We study the cosmological evolution of some nonlocal gravity models, when the initial conditions are set during a phase of primordial inflation. We examine in particular three models, the so-called RT, RR and Δ4 models, previously introduced by our group. We find that, during inflation, the RT model has a viable background evolution, but at the level of cosmological perturbations develops instabilities that make it nonviable. In contrast, the RR and Δ4 models have a viable evolution even when their initial conditions are set during a phase of primordial inflation.

  16. Evaluation and comparison of predictive individual-level general surrogates.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Halloran, M Elizabeth

    2018-07-01

    An intermediate response measure that accurately predicts efficacy in a new setting at the individual level could be used both for prediction and personalized medical decisions. In this article, we define a predictive individual-level general surrogate (PIGS), which is an individual-level intermediate response that can be used to accurately predict individual efficacy in a new setting. While methods for evaluating trial-level general surrogates, which are predictors of trial-level efficacy, have been developed previously, few, if any, methods have been developed to evaluate individual-level general surrogates, and no methods have formalized the use of cross-validation to quantify the expected prediction error. Our proposed method uses existing methods of individual-level surrogate evaluation within a given clinical trial setting in combination with cross-validation over a set of clinical trials to evaluate surrogate quality and to estimate the absolute prediction error that is expected in a new trial setting when using a PIGS. Simulations show that our method performs well across a variety of scenarios. We use our method to evaluate and to compare candidate individual-level general surrogates over a set of multi-national trials of a pentavalent rotavirus vaccine.
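    The cross-validation-over-trials idea can be sketched with a deliberately simplified stand-in for the surrogate evaluation step: fit a linear map from a trial-level surrogate summary to efficacy on all trials but one, and score the held-out trial. This is a generic leave-one-trial-out illustration under that linear assumption; the paper's individual-level surrogate machinery is substantially richer:

```python
import numpy as np

def loto_prediction_error(surrogate_means, efficacy):
    """Leave-one-trial-out cross-validation over a set of trials: for each
    held-out trial, fit a line from surrogate summary to efficacy on the
    remaining trials, predict the held-out trial's efficacy, and average
    the absolute prediction errors. Estimates the error expected when the
    surrogate is carried to a new trial setting."""
    errors = []
    idx = np.arange(len(efficacy))
    for k in idx:
        keep = idx != k
        slope, intercept = np.polyfit(surrogate_means[keep], efficacy[keep], 1)
        pred = slope * surrogate_means[k] + intercept
        errors.append(abs(pred - efficacy[k]))
    return float(np.mean(errors))
```

    A surrogate whose held-out errors stay small across trials is behaving like a useful general surrogate; large or erratic errors flag poor transportability.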

  17. Quantifying Impacts of Urban Growth Potential on Army Training Capabilities

    DTIC Science & Technology

    2017-09-12

    Capacity” (ERDC/CERL TR-17-34). Building on previous studies of urban growth and population effects on U.S. military installations and ... combat team studies. CAA has developed an iterative process that builds on Military Value Analysis (MVA) models that include a set of attributes that ... Methods and tools were developed to support a nationwide analysis. This study focused on installations operating training areas that were high...

  18. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed

    Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.

  19. Comparison of Methods for Determining Boundary Layer Edge Conditions for Transition Correlations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Berry, Scott A.; Hollis, Brian R.; Horvath, Thomas J.

    2003-01-01

    Data previously obtained for the X-33 in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel have been reanalyzed to compare methods for determining boundary layer edge conditions for use in transition correlations. The experimental results were previously obtained utilizing the phosphor thermography technique to monitor the status of the boundary layer downstream of discrete roughness elements via global heat transfer images of the X-33 windward surface. A boundary layer transition correlation was previously developed for this data set using boundary layer edge conditions calculated using an inviscid/integral boundary layer approach. An algorithm was written in the present study to extract boundary layer edge quantities from higher fidelity viscous computational fluid dynamic solutions to develop transition correlations that account for viscous effects on vehicles of arbitrary complexity. The boundary layer transition correlation developed for the X-33 from the viscous solutions are compared to the previous boundary layer transition correlations. It is shown that the boundary layer edge conditions calculated using an inviscid/integral boundary layer approach are significantly different than those extracted from viscous computational fluid dynamic solutions. The present results demonstrate the differences obtained in correlating transition data using different computational methods.

  20. Development of multiplex microsatellite PCR panels for the seagrass Thalassia hemprichii (Hydrocharitaceae).

    PubMed

    van Dijk, Kor-Jent; Mellors, Jane; Waycott, Michelle

    2014-11-01

    New microsatellites were developed for the seagrass Thalassia hemprichii (Hydrocharitaceae), a long-lived seagrass species found throughout the shallow waters of the tropical and subtropical Indo-West Pacific. Three multiplex PCR panels were designed utilizing new and previously developed markers, resulting in a toolkit for generating a 16-locus genotype. • Through the use of microsatellite enrichment and next-generation sequencing, 16 new, validated, polymorphic microsatellite markers were isolated. Diversity was between two and four alleles per locus, totaling 36 alleles. These markers, plus previously developed microsatellite markers for T. hemprichii and T. testudinum, were tested for suitability in multiplex PCR panels. • The generation of an easily replicated suite of multiplex panels of codominant molecular markers will allow for high-resolution and detailed genetic structure analysis and clonality assessment with minimal genotyping costs. We suggest the establishment of a T. hemprichii primer convention for the unification of future data sets.

  1. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    PubMed

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

    A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. To develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and a NNR of nine. A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus. © 2017 Crown copyright. Health Information and Libraries Journal © 2017 Health Libraries Group. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
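    The three reported metrics are simple ratios. A short sketch of how recall, precision, and NNR relate; the counts below are hypothetical, chosen only so the outputs mirror the case study's figures (100% recall, ~11.4% precision, NNR of nine):

```python
import math

def search_metrics(relevant_retrieved, total_retrieved, relevant_total):
    """Recall, precision and number-needed-to-read (NNR) for a search
    filter. NNR is the reciprocal of precision, rounded up: how many
    retrieved records a reviewer reads per relevant record found."""
    recall = relevant_retrieved / relevant_total
    precision = relevant_retrieved / total_retrieved
    return recall, precision, math.ceil(1.0 / precision)

# hypothetical: 8 relevant records, all retrieved, among 70 results
recall, precision, nnr = search_metrics(8, 70, 8)
```

    Low precision is tolerable in filter design when recall is the priority, since NNR quantifies the reading burden that precision trades away.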

  2. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a Parallel Digital Forensics infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets only expected to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics. This report documents the parallel digital forensics (PDF) infrastructure architecture and implementation.

  3. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
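
TRAM's core idea, pointing an analyst at the variables most associated with failures, can be illustrated with a much simpler stand-in: rank each Monte Carlo input by the standardized shift of its distribution between failed and passed runs. This is our own sketch, not the algorithm TRAM actually implements:

```python
import statistics

def rank_driving_variables(runs, failed):
    """Rank Monte Carlo input variables by how strongly their distribution
    shifts between failed and passed runs (standardized mean difference).
    `runs` maps a variable name to its sampled values across all runs;
    `failed` is a parallel list of booleans marking the failed runs."""
    scores = {}
    for name, values in runs.items():
        fail_vals = [v for v, f in zip(values, failed) if f]
        pass_vals = [v for v, f in zip(values, failed) if not f]
        spread = statistics.pstdev(values) or 1.0  # guard against zero spread
        scores[name] = abs(statistics.mean(fail_vals) - statistics.mean(pass_vals)) / spread
    return sorted(scores, key=scores.get, reverse=True)
```

Variables at the front of the returned list are the candidates to inspect first.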

  4. High Temperature Joining and Characterization of Joint Properties in Silicon Carbide-Based Composite Materials

    NASA Technical Reports Server (NTRS)

    Halbig, Michael C.; Singh, Mrityunjay

    2015-01-01

    Advanced silicon carbide-based ceramics and composites are being developed for a wide variety of high temperature extreme environment applications. Robust high temperature joining and integration technologies are enabling for the fabrication and manufacturing of large and complex shaped components. The development of a new joining approach called SET (Single-step Elevated Temperature) joining will be described, along with an overview of previously developed joining approaches including high temperature brazing, ARCJoinT (Affordable, Robust Ceramic Joining Technology), diffusion bonding, and REABOND (Refractory Eutectic Assisted Bonding). Unlike other approaches, SET joining does not have any lower temperature phases and will therefore have a use temperature above 1315 °C. Optimization of the composition for full conversion to silicon carbide will be discussed. The goal is to find a composition with no remaining carbon or free silicon. Green tape interlayers were developed for joining. Microstructural analysis and preliminary mechanical tests of the joints will be presented.

  5. Setting priorities for space research: An experiment in methodology

    NASA Technical Reports Server (NTRS)

    1995-01-01

    In 1989, the Space Studies Board created the Task Group on Priorities in Space Research to determine whether scientists should take a role in recommending priorities for long-term space research initiatives and, if so, to analyze the priority-setting problem in this context and develop a method by which such priorities could be established. After answering the first question in the affirmative in a previous report, the task group set out to accomplish the second task. The basic assumption in developing a priority-setting process is that a reasoned and structured approach for ordering competing initiatives will yield better results than other ways of proceeding. The task group proceeded from the principle that the central criterion for evaluating a research initiative must be its scientific merit -- the value of the initiative to the proposing discipline and to science generally. The group developed a two-stage methodology for priority setting and constructed a procedure and format to support the methodology. The first of two instruments developed was a standard format for structuring proposals for space research initiatives. The second instrument was a formal, semiquantitative appraisal procedure for evaluating competing proposals. This report makes available complete templates for the methodology, including the advocacy statement and evaluation forms, as well as an 11-step schema for a priority-setting process. From the beginning of its work, the task group was mindful that the issue of priority setting increasingly pervades all of federally supported science and that its work would have implications extending beyond space research. Thus, although the present report makes no recommendations for action by NASA or other government agencies, it provides the results of the task group's work for the use of others who may study priority-setting procedures or take up the challenge of implementing them in the future.

  6. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.
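
The accuracy and speed comparisons quoted above reduce to two elementary calculations; a trivial sketch (function names are ours):

```python
def percent_diff(model_value, reference_value):
    """Relative difference of a simulated quantity against a reference,
    e.g. a SimSET count rate versus the experimental measurement."""
    return 100.0 * (model_value - reference_value) / reference_value

def speedup(t_slow, t_fast):
    """Run-time ratio between two simulators, e.g. GATE versus SimSET."""
    return t_slow / t_fast
```

For example, a simulated count rate 11% above the measured one gives a percent difference of about 11, at the low end of the 11-23% overestimation reported for the count rate tests.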

  7. 1000 Genomes-based meta-analysis identifies 10 novel loci for kidney function

    PubMed Central

    Gorski, Mathias; van der Most, Peter J.; Teumer, Alexander; Chu, Audrey Y.; Li, Man; Mijatovic, Vladan; Nolte, Ilja M.; Cocca, Massimiliano; Taliun, Daniel; Gomez, Felicia; Li, Yong; Tayo, Bamidele; Tin, Adrienne; Feitosa, Mary F.; Aspelund, Thor; Attia, John; Biffar, Reiner; Bochud, Murielle; Boerwinkle, Eric; Borecki, Ingrid; Bottinger, Erwin P.; Chen, Ming-Huei; Chouraki, Vincent; Ciullo, Marina; Coresh, Josef; Cornelis, Marilyn C.; Curhan, Gary C.; d’Adamo, Adamo Pio; Dehghan, Abbas; Dengler, Laura; Ding, Jingzhong; Eiriksdottir, Gudny; Endlich, Karlhans; Enroth, Stefan; Esko, Tõnu; Franco, Oscar H.; Gasparini, Paolo; Gieger, Christian; Girotto, Giorgia; Gottesman, Omri; Gudnason, Vilmundur; Gyllensten, Ulf; Hancock, Stephen J.; Harris, Tamara B.; Helmer, Catherine; Höllerer, Simon; Hofer, Edith; Hofman, Albert; Holliday, Elizabeth G.; Homuth, Georg; Hu, Frank B.; Huth, Cornelia; Hutri-Kähönen, Nina; Hwang, Shih-Jen; Imboden, Medea; Johansson, Åsa; Kähönen, Mika; König, Wolfgang; Kramer, Holly; Krämer, Bernhard K.; Kumar, Ashish; Kutalik, Zoltan; Lambert, Jean-Charles; Launer, Lenore J.; Lehtimäki, Terho; de Borst, Martin; Navis, Gerjan; Swertz, Morris; Liu, Yongmei; Lohman, Kurt; Loos, Ruth J. 
F.; Lu, Yingchang; Lyytikäinen, Leo-Pekka; McEvoy, Mark A.; Meisinger, Christa; Meitinger, Thomas; Metspalu, Andres; Metzger, Marie; Mihailov, Evelin; Mitchell, Paul; Nauck, Matthias; Oldehinkel, Albertine J.; Olden, Matthias; WJH Penninx, Brenda; Pistis, Giorgio; Pramstaller, Peter P.; Probst-Hensch, Nicole; Raitakari, Olli T.; Rettig, Rainer; Ridker, Paul M.; Rivadeneira, Fernando; Robino, Antonietta; Rosas, Sylvia E.; Ruderfer, Douglas; Ruggiero, Daniela; Saba, Yasaman; Sala, Cinzia; Schmidt, Helena; Schmidt, Reinhold; Scott, Rodney J.; Sedaghat, Sanaz; Smith, Albert V.; Sorice, Rossella; Stengel, Benedicte; Stracke, Sylvia; Strauch, Konstantin; Toniolo, Daniela; Uitterlinden, Andre G.; Ulivi, Sheila; Viikari, Jorma S.; Völker, Uwe; Vollenweider, Peter; Völzke, Henry; Vuckovic, Dragana; Waldenberger, Melanie; Jin Wang, Jie; Yang, Qiong; Chasman, Daniel I.; Tromp, Gerard; Snieder, Harold; Heid, Iris M.; Fox, Caroline S.; Köttgen, Anna; Pattaro, Cristian; Böger, Carsten A.; Fuchsberger, Christian

    2017-01-01

    HapMap imputed genome-wide association studies (GWAS) have revealed >50 loci at which common variants with minor allele frequency >5% are associated with kidney function. GWAS using more complete reference sets for imputation, such as those from The 1000 Genomes project, promise to identify novel loci that have been missed by previous efforts. To investigate the value of such a more complete variant catalog, we conducted a GWAS meta-analysis of kidney function based on the estimated glomerular filtration rate (eGFR) in 110,517 European ancestry participants using 1000 Genomes imputed data. We identified 10 novel loci with p-value < 5 × 10−8 previously missed by HapMap-based GWAS. Six of these loci (HOXD8, ARL15, PIK3R1, EYA4, ASTN2, and EPB41L3) are tagged by common SNPs unique to the 1000 Genomes reference panel. Using pathway analysis, we identified 39 significant (FDR < 0.05) genes and 127 significantly (FDR < 0.05) enriched gene sets, which were missed by our previous analyses. Among those, the 10 identified novel genes are part of pathways of kidney development, carbohydrate metabolism, cardiac septum development and glucose metabolism. These results highlight the utility of re-imputing from denser reference panels, until whole-genome sequencing becomes feasible in large samples. PMID:28452372
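
Two multiple-testing rules appear in this record: a fixed genome-wide significance cutoff (p < 5 × 10−8) for single variants, and FDR < 0.05 for the gene and gene-set analyses. FDR control is conventionally done with the Benjamini-Hochberg step-up procedure; the record does not name the method, so treating it as Benjamini-Hochberg is our assumption:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return the indices of
    hypotheses that pass at false discovery rate `alpha`."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # size of the largest passing prefix of the sorted p-values
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

# Genome-wide significance, by contrast, is a fixed per-variant cutoff:
def genome_wide_significant(p):
    return p < 5e-8
```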

  9. [Optimization of the parameters of microcirculatory structural adaptation model based on improved quantum-behaved particle swarm optimization algorithm].

    PubMed

    Pan, Qing; Yao, Jialiang; Wang, Ruofan; Cao, Ping; Ning, Gangmin; Fang, Luping

    2017-08-01

    The vessels in the microcirculation keep adjusting their structure to meet the functional requirements of the different tissues. A previously developed theoretical model can reproduce the process of vascular structural adaptation to help the study of microcirculatory physiology. Until now, however, the model has lacked an appropriate method for setting its parameter values, limiting its further application. This study proposed an improved quantum-behaved particle swarm optimization (QPSO) algorithm for setting the parameter values in this model. The optimization was performed on a real mesenteric microvascular network of the rat. The results showed that the improved QPSO was superior to the standard particle swarm optimization, the standard QPSO and the previously reported Downhill algorithm. We conclude that the improved QPSO leads to better agreement between mathematical simulation and animal experiment, rendering the model more reliable for future physiological studies.
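
For readers unfamiliar with QPSO: in the standard (unimproved) form, each particle is drawn toward a stochastic attractor between its personal best and the global best, with a step scaled by its distance from the mean of all personal bests. A minimal sketch of standard QPSO, not the paper's improved variant (objective, bounds and parameters here are illustrative):

```python
import math
import random

def qpso(objective, dim, n_particles=20, iters=200, beta=0.75, bounds=(-5.0, 5.0)):
    """Standard quantum-behaved PSO, minimizing `objective` over a box."""
    lo, hi = bounds
    swarm = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [list(x) for x in swarm]           # personal best positions
    pcost = [objective(x) for x in swarm]      # personal best costs
    g = min(range(n_particles), key=pcost.__getitem__)
    gbest, gcost = list(pbest[g]), pcost[g]    # global best
    for _ in range(iters):
        # mean of all personal bests (the "mean best" position)
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i, x in enumerate(swarm):
            for d in range(dim):
                phi = random.random()
                # stochastic attractor between personal and global best
                attractor = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
                u = 1.0 - random.random()      # u in (0, 1], so log is finite
                step = beta * abs(mbest[d] - x[d]) * math.log(1.0 / u)
                x[d] = attractor + step if random.random() < 0.5 else attractor - step
            cost = objective(x)
            if cost < pcost[i]:
                pbest[i], pcost[i] = list(x), cost
                if cost < gcost:
                    gbest, gcost = list(x), cost
    return gbest, gcost
```

Fixing the random seed makes runs reproducible; on a smooth test function such as the sphere, the swarm typically collapses onto the optimum within a few hundred iterations.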

  10. The role of socio-communicative rearing environments in the development of social and physical cognition in apes.

    PubMed

    Russell, Jamie L; Lyn, Heidi; Schaeffer, Jennifer A; Hopkins, William D

    2011-11-01

    The cultural intelligence hypothesis (CIH) claims that humans' advanced cognition is a direct result of human culture and that children are uniquely specialized to absorb and utilize this cultural experience (Tomasello, 2000). Comparative data demonstrating that 2.5-year-old human children outperform apes on measures of social cognition but not on measures of physical cognition support this claim (Herrmann et al., 2007). However, the previous study failed to control for rearing when comparing these two species. Specifically, the human children were raised in a human culture whereas the apes were raised in standard sanctuary settings. To further explore the CIH, here we compared the performance on multiple measures of social and physical cognition in a group of standard reared apes raised in conditions typical of zoo and biomedical laboratory settings to that of apes reared in an enculturated socio-communicatively rich environment. Overall, the enculturated apes significantly outperformed their standard reared counterparts on the cognitive tasks and this was particularly true for measures of communication. Furthermore, the performance of the enculturated apes was very similar to previously reported data from 2.5-year-old children. We conclude that apes who are reared in a human-like socio-communicatively rich environment develop superior communicative abilities compared to apes reared in standard laboratory settings, which supports some assumptions of the cultural intelligence hypothesis. 2011 Blackwell Publishing Ltd.

  11. Support vector regression scoring of receptor-ligand complexes for rank-ordering and virtual screening of chemical libraries.

    PubMed

    Li, Liwei; Wang, Bo; Meroueh, Samy O

    2011-09-26

    The Community Structure-Activity Resource (CSAR) data sets are used to develop and test a support vector machine-based scoring function in regression mode (SVR). Two scoring functions (SVR-KB and SVR-EP) are derived with the objective of reproducing the trend of the experimental binding affinities provided within the two CSAR data sets. The features used to train SVR-KB are knowledge-based pairwise potentials, while SVR-EP is based on physicochemical properties. SVR-KB and SVR-EP were compared to seven other widely used scoring functions, including Glide, X-score, GoldScore, ChemScore, Vina, Dock, and PMF. Results showed that SVR-KB trained with features obtained from three-dimensional complexes of the PDBbind data set outperformed all other scoring functions, including the best-performing X-score, by nearly 0.1 across three correlation coefficients, namely Pearson, Spearman, and Kendall. Interestingly, higher performance in rank ordering did not translate into greater enrichment in virtual screening, assessed using the 40 targets of the Directory of Useful Decoys (DUD). To remedy this situation, a variant of SVR-KB (SVR-KBD) was developed by following a target-specific tailoring strategy that we had previously employed to derive SVM-SP. SVR-KBD showed much higher enrichment, outperforming all other scoring functions tested, and was comparable in performance to our previously derived scoring function SVM-SP.
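
The three correlation coefficients used to compare the scoring functions are standard; for reference, all three can be computed in a few lines (a sketch without the tie corrections that production statistics libraries apply):

```python
def pearson(x, y):
    """Pearson's r: linear correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def _ranks(v):
    # Rank positions 1..n; ties are not handled in this sketch.
    order = sorted(range(len(v)), key=lambda i: v[i])
    ranks = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        ranks[i] = float(rank)
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson's r computed on ranks."""
    return pearson(_ranks(x), _ranks(y))

def kendall(x, y):
    """Kendall's tau: normalized excess of concordant over discordant pairs."""
    n = len(x)
    s = sum(
        (d > 0) - (d < 0)
        for i in range(n) for j in range(i + 1, n)
        for d in [(x[i] - x[j]) * (y[i] - y[j])]
    )
    return 2.0 * s / (n * (n - 1))
```

Pearson rewards a linear trend, while Spearman and Kendall reward rank ordering only, which is why they are the natural metrics for the rank-ordering task described above.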

  12. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    PubMed

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.
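
The cooperativity constraint described above, that a network only counts if every buried polar group is satisfied, is easy to state as a predicate over a candidate set of bonds (a schematic sketch of the criterion, not Rosetta's actual scoring):

```python
def network_satisfied(buried_polar_atoms, hbonds):
    """True if every buried polar atom participates in at least one
    hydrogen bond of the candidate network - the all-or-nothing
    cooperativity criterion for buried networks."""
    partnered = {atom for bond in hbonds for atom in bond}
    return all(atom in partnered for atom in buried_polar_atoms)
```

Under this criterion a network with even one orphaned buried polar atom is rejected outright rather than partially credited, which is why pairwise-focused sampling struggles to find such networks.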

  13. Exploring Genetic Attributions Underlying Radiotherapy-Induced Fatigue in Prostate Cancer Patients.

    PubMed

    Hashemi, Sepehr; Fernandez Martinez, Juan Luis; Saligan, Leorey; Sonis, Stephen

    2017-09-01

    Despite numerous proposed mechanisms, no definitive pathophysiology underlying radiotherapy-induced fatigue (RIF) has been established. However, the dysregulation of a set of 35 genes was recently validated to predict development of fatigue in prostate cancer patients receiving radiotherapy. The objective was to hypothesize novel pathways, and provide genetic targets for currently proposed pathways implicated in RIF development, through analysis of the previously validated gene set. The gene set was analyzed for all phenotypic attributions implicated in the phenotype of fatigue. Initially, a "directed" approach was used by querying specific fatigue-related sub-phenotypes against all known phenotypic attributions of the gene set. Then, an "undirected" approach, reviewing the entirety of the literature referencing the 35 genes, was used to increase analysis sensitivity. The dysregulated genes attribute to neural, immunological, mitochondrial, muscular, and metabolic pathways. In addition, certain genes suggest phenotypes not previously emphasized in the context of RIF, such as ionizing radiation sensitivity, DNA damage, and altered DNA repair frequency. Several genes were also associated with depression in prostate cancer, possibly pointing to variable radiosensitivity in RIF-prone patients, which may have palliative care implications. Despite these relevant findings, many of the 35 RIF-predictive genes are poorly characterized, warranting their investigation. The implications of the RIF pathways presented here are purely theoretical until specific end-point-driven experiments are conducted in more congruent contexts. Nevertheless, the presented attributions are informative, directing future investigation to definitively elucidate RIF's pathoetiology. This study demonstrates an arguably comprehensive method of approaching known differential expression underlying a complex phenotype to correlate feasible pathophysiology. Copyright © 2017 American Academy of Hospice and Palliative Medicine. All rights reserved.

  14. On fixed-area plot sampling for downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Paul C. Van Deusen

    2011-01-01

    The use of fixed-area plots for sampling down coarse woody debris is reviewed. A set of clearly defined protocols for two previously described methods is established and a new method, which we call the 'sausage' method, is developed. All methods (protocols) are shown to be unbiased for volume estimation, but not necessarily for estimation of population...

  15. Cognitive Training for Children: Effects on Inductive Reasoning, Deductive Reasoning, and Mathematics Achievement in an Australian School Setting

    ERIC Educational Resources Information Center

    Barkl, Sophie; Porter, Amy; Ginns, Paul

    2012-01-01

    Inductive reasoning is a core cognitive process of fluid intelligence, predicting a variety of educational outcomes. The Cognitive Training for Children (CTC) program is an educational intervention designed to develop children's inductive reasoning skills, with previous investigations finding substantial effects of the program on both inductive…

  16. Causal Attribution: A New Scale Developed to Minimize Existing Methodological Problems.

    ERIC Educational Resources Information Center

    Bull, Kay Sather; Feuquay, Jeffrey P.

    In order to facilitate research on the construct of causal attribution, this paper details developmental procedures used to minimize previous deficiencies and proposes a new scale. The first version of the scale was in ipsative form and provided two basic sets of indices: (1) ability, effort, luck, and task difficulty indices in success and…

  17. Great Expectations: The Relationship between Future Time Perspective, Learning from Others, and Employability

    ERIC Educational Resources Information Center

    Froehlich, Dominik E.; Beausaert, Simon A. J.; Segers, Mien S. R.

    2015-01-01

    Employees in countries with advanced industrial economies need to continuously develop their competences to sustain their employability--that is, to have a set of competences that enables them to maintain or find an adequate job. But how should efforts to enhance employability progress in the context of the demographic shift? Previous research…

  18. Reducing Stress in Young Children's Lives.

    ERIC Educational Resources Information Center

    McCracken, Janet Brown, Ed.

    Few adults deliberately set out to cause children stress or to teach them how to deal with it, yet adults do just that with every word, action, and reaction. This book collects work in the field of human development on how adults can help children learn to cope with stress. Each of the 30 chapters previously appeared in "Young Children,"…

  19. Early School Outcomes for Children of Postpartum Depressed Mothers: Comparison with a Community Sample

    ERIC Educational Resources Information Center

    Kersten-Alvarez, Laura E.; Hosman, Clemens M. H.; Riksen-Walraven, J. Marianne; van Doesum, Karin T. M.; Smeekens, Sanny; Hoefnagels, Cees

    2012-01-01

    Previous studies of the long-term effects of maternal postpartum depression (PPD) on child development have mostly focused on a limited set of outcomes, and have often not controlled for risk factors associated with maternal depression. The present study compared children of postpartum depressed mothers (n = 29) with children from a community…

  20. Preliminary Study on the Role of Alternative Educational Pathways in Promoting the Use of Problem-Focused Coping Strategies

    ERIC Educational Resources Information Center

    Shankland, Rebecca; Franca, Lionel Riou; Genolini, Christophe M.; Guelfi, Julien-Daniel; Ionescu, Serban

    2009-01-01

    Coping styles are generally considered to be environmentally driven. Up to now, research has mainly focused on family influences. However, some studies underline the effect of educational settings on the development of problem-focused coping strategies. Consistently with previous reports on the enhancement of autonomy and problem-solving in…

  1. Science Framework for the 2009 National Assessment of Educational Progress

    ERIC Educational Resources Information Center

    National Assessment Governing Board, 2008

    2008-01-01

    This document sets forth recommendations for the design of a new science assessment. The assessment resulting from this framework will start a new NAEP science trend (i.e., measure of student progress in science) beginning in 2009. This framework represents a unique opportunity to build on previous NAEP science work as well as key developments in…

  2. Collisional disruptions of rotating targets

    NASA Astrophysics Data System (ADS)

    Ševeček, Pavel; Broz, Miroslav

    2017-10-01

    Collisions are key processes in the evolution of the Main Asteroid Belt, and impact events - i.e. target fragmentation and gravitational reaccumulation - are commonly studied by numerical simulations, namely by SPH and N-body methods. In our work, we extend the previous studies by assuming rotating targets, and we study the dependence of the resulting size distributions on the pre-impact rotation of the target. To obtain stable initial conditions, it is also necessary to include self-gravity already in the fragmentation phase, which was previously neglected. To tackle this problem, we developed an SPH code, accelerated by SSE/AVX instruction sets and parallelized. The code solves the standard set of hydrodynamic equations, using the Tillotson equation of state, the von Mises criterion for plastic yielding and the scalar Grady-Kipp model for fragmentation. We further modified the velocity gradient by a correction tensor (Schäfer et al. 2007) to ensure first-order conservation of the total angular momentum. As the intact target is a spherical body, its gravity can be approximated by the potential of a homogeneous sphere, making it easy to set up initial conditions. This is, however, infeasible for later stages of the disruption; to this end, we included the Barnes-Hut algorithm to compute the gravitational accelerations, using a multipole expansion of distant particles up to hexadecapole order. We tested the code carefully, comparing the results to our previous computations obtained with the SPH5 code (Benz and Asphaug 1994). Finally, we ran a set of simulations, and we discuss the differences between the synthetic families created by rotating and static targets.
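
The Barnes-Hut step mentioned above rests on a single geometric decision: a tree node far enough away (small opening angle) is replaced by its multipole expansion, whose lowest-order term is a point mass at the node's center of mass. A schematic sketch of that decision and the monopole term (the abstract's code expands up to hexadecapole order; names and the accuracy parameter here are illustrative):

```python
import math

G = 6.674e-11  # gravitational constant (SI units)

def accept_node(node_size, distance, theta=0.5):
    """Opening-angle test: approximate the node by its multipole expansion
    when it subtends an angle smaller than the accuracy parameter theta."""
    return node_size / distance < theta

def monopole_accel(node_mass, dx, dy, dz, eps=0.0):
    """Lowest-order (monopole) term: acceleration toward the node's center
    of mass at offset (dx, dy, dz), with optional softening eps."""
    r2 = dx * dx + dy * dy + dz * dz + eps * eps
    f = G * node_mass / (r2 * math.sqrt(r2))
    return (f * dx, f * dy, f * dz)
```

Nodes that fail the test are opened and their children are examined recursively, which is what brings the cost of the gravity sum down from O(N²) toward O(N log N).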

  3. Resonance vibrations in intake and exhaust pipes of in-line engines III : the inlet process of a four-stroke-cycle engine

    NASA Technical Reports Server (NTRS)

    Lutz, O

    1940-01-01

    Using a previously developed method, the boundary processes of four-stroke-cycle engines are set up. The results deviate considerably from those obtained under the assumption that the velocity fluctuation is proportional to the cylinder piston motion. The deviation is smaller near the resonance frequencies. By the method developed, the effect of the resonance vibrations on the volumetric efficiency can be demonstrated.

  4. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resulting data were compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  5. Developing, implementing and disseminating a core outcome set for neonatal medicine.

    PubMed

    Webbe, James; Brunton, Ginny; Ali, Shohaib; Duffy, James Mn; Modi, Neena; Gale, Chris

    2017-01-01

    In high resource settings, 1 in 10 newborn babies require admission to a neonatal unit. Research evaluating neonatal care involves recording and reporting many different outcomes and outcome measures. Such variation limits the usefulness of research as studies cannot be compared or combined. To address these limitations, we aim to develop, disseminate and implement a core outcome set for neonatal medicine. A steering group that includes parents and former patients, healthcare professionals and researchers has been formed to guide the development of the core outcome set. We will review neonatal trials systematically to identify previously reported outcomes. Additionally, we will specifically identify outcomes of importance to parents, former patients and healthcare professionals through a systematic review of qualitative studies. Outcomes identified will be entered into an international, multi-perspective eDelphi survey. All key stakeholders will be invited to participate. The Delphi method will encourage individual and group stakeholder consensus to identify a core outcome set. The core outcome set will be mapped to existing, routinely recorded data where these exist. Use of a core set will ensure outcomes of importance to key stakeholders, including former patients and parents, are recorded and reported in a standard fashion in future research. Embedding the core outcome set within future clinical studies will extend the usefulness of research to inform practice, enhance patient care and ultimately improve outcomes. Using routinely recorded electronic data will facilitate implementation with minimal additional burden. Core Outcome Measures in Effectiveness Trials (COMET) database: 842 (www.comet-initiative.org/studies/details/842).

  6. The Importance of Integration of Stakeholder Views in Core Outcome Set Development: Otitis Media with Effusion in Children with Cleft Palate

    PubMed Central

    Harman, Nicola L.; Bruce, Iain A.; Kirkham, Jamie J.; Tierney, Stephanie; Callery, Peter; O'Brien, Kevin; Williamson, Paula R.

    2015-01-01

Background Approximately 75% of children with cleft palate (CP) have a history of Otitis Media with Effusion (OME). Evidence for the effective management of OME in these children is lacking. The inconsistency of outcome measurement in previous studies has led to a call for the development of a Core Outcome Set (COS). Despite the increase in the number of published COS, involvement of patients in the COS development process, and methods to integrate the views of patients and health professionals, have to date been limited. Methods and Findings A list of outcomes measured in previous research was identified through a review of the literature. Opinion on the importance of each of these outcomes was then sought from key stakeholders: Ear, Nose and Throat (ENT) surgeons, audiologists, cleft surgeons, speech and language therapists, specialist cleft nurses, psychologists, parents and children. The opinion of health professionals was sought in a three-round Delphi survey in which participants were asked to score each outcome using a bespoke online system. Parents and children were also asked to score outcomes in a survey and provided in-depth insight into having OME through semi-structured interviews. The results of the Delphi survey, interviews and parent/patient survey were brought together in a final consensus meeting with representation from all stakeholders. A final set of eleven outcomes met the definition of “consensus in” to form the recommended COS: hearing; chronic otitis media (COM); OME; receptive language skills; speech development; psychosocial development; acute otitis media (AOM); cholesteatoma; side effects of treatment; listening skills; otalgia. Conclusions We have produced a recommendation about the outcomes that should be measured, as a minimum, in studies of the management of OME in children with CP. 
The development process included input from key stakeholders and used novel methodology to integrate the opinion of healthcare professionals, parents and children. PMID:26115172

  7. The Importance of Integration of Stakeholder Views in Core Outcome Set Development: Otitis Media with Effusion in Children with Cleft Palate.

    PubMed

    Harman, Nicola L; Bruce, Iain A; Kirkham, Jamie J; Tierney, Stephanie; Callery, Peter; O'Brien, Kevin; Bennett, Alex M D; Chorbachi, Raouf; Hall, Per N; Harding-Bell, Anne; Parfect, Victoria H; Rumsey, Nichola; Sell, Debbie; Sharma, Ravi; Williamson, Paula R

    2015-01-01

Approximately 75% of children with cleft palate (CP) have a history of Otitis Media with Effusion (OME). Evidence for the effective management of OME in these children is lacking. The inconsistency of outcome measurement in previous studies has led to a call for the development of a Core Outcome Set (COS). Despite the increase in the number of published COS, involvement of patients in the COS development process, and methods to integrate the views of patients and health professionals, have to date been limited. A list of outcomes measured in previous research was identified through a review of the literature. Opinion on the importance of each of these outcomes was then sought from key stakeholders: Ear, Nose and Throat (ENT) surgeons, audiologists, cleft surgeons, speech and language therapists, specialist cleft nurses, psychologists, parents and children. The opinion of health professionals was sought in a three-round Delphi survey in which participants were asked to score each outcome using a bespoke online system. Parents and children were also asked to score outcomes in a survey and provided in-depth insight into having OME through semi-structured interviews. The results of the Delphi survey, interviews and parent/patient survey were brought together in a final consensus meeting with representation from all stakeholders. A final set of eleven outcomes met the definition of "consensus in" to form the recommended COS: hearing; chronic otitis media (COM); OME; receptive language skills; speech development; psychosocial development; acute otitis media (AOM); cholesteatoma; side effects of treatment; listening skills; otalgia. We have produced a recommendation about the outcomes that should be measured, as a minimum, in studies of the management of OME in children with CP. The development process included input from key stakeholders and used novel methodology to integrate the opinion of healthcare professionals, parents and children.

  8. Detection of mumps virus genotype H in two previously vaccinated patients from Mexico City.

    PubMed

    Del Valle, Alberto; García, Alí A; Barrón, Blanca L

    2016-06-01

    Infections caused by mumps virus (MuV) have been successfully prevented through vaccination; however, in recent years, an increasing number of mumps outbreaks have been reported within vaccinated populations. In this study, MuV was genotyped for the first time in Mexico. Saliva samples were obtained from two previously vaccinated patients in Mexico City who had developed parotitis. Viral isolation was carried out in Vero cells, and the SH and HN genes were amplified by RT-PCR. Amplicons were sequenced and compared to a set of reference sequences to identify the MuV genotype.

  9. dSet1 Is the Main H3K4 Di- and Tri-Methyltransferase Throughout Drosophila Development

    PubMed Central

    Hallson, Graham; Hollebakken, Robert E.; Li, Taosui; Syrzycka, Monika; Kim, Inho; Cotsworth, Shawn; Fitzpatrick, Kathleen A.; Sinclair, Donald A. R.; Honda, Barry M.

    2012-01-01

    In eukaryotes, the post-translational addition of methyl groups to histone H3 lysine 4 (H3K4) plays key roles in maintenance and establishment of appropriate gene expression patterns and chromatin states. We report here that an essential locus within chromosome 3L centric heterochromatin encodes the previously uncharacterized Drosophila melanogaster ortholog (dSet1, CG40351) of the Set1 H3K4 histone methyltransferase (HMT). Our results suggest that dSet1 acts as a “global” or general H3K4 di- and trimethyl HMT in Drosophila. Levels of H3K4 di- and trimethylation are significantly reduced in dSet1 mutants during late larval and post-larval stages, but not in animals carrying mutations in genes encoding other well-characterized H3K4 HMTs such as trr, trx, and ash1. The latter results suggest that Trr, Trx, and Ash1 may play more specific roles in regulating key cellular targets and pathways and/or act as global H3K4 HMTs earlier in development. In yeast and mammalian cells, the HMT activity of Set1 proteins is mediated through an evolutionarily conserved protein complex known as Complex of Proteins Associated with Set1 (COMPASS). We present biochemical evidence that dSet1 interacts with members of a putative Drosophila COMPASS complex and genetic evidence that these members are functionally required for H3K4 methylation. Taken together, our results suggest that dSet1 is responsible for the bulk of H3K4 di- and trimethylation throughout Drosophila development, thus providing a model system for better understanding the requirements for and functions of these modifications in metazoans. PMID:22048023

  10. Endoscopic third ventriculostomy in the treatment of childhood hydrocephalus.

    PubMed

    Kulkarni, Abhaya V; Drake, James M; Mallucci, Conor L; Sgouros, Spyros; Roth, Jonathan; Constantini, Shlomi

    2009-08-01

To develop a model to predict the probability of endoscopic third ventriculostomy (ETV) success in the treatment of hydrocephalus on the basis of a child's individual characteristics, we analyzed 618 ETVs performed consecutively on children at 12 international institutions to identify predictors of ETV success at 6 months. A multivariable logistic regression model was developed on 70% of the dataset (training set) and validated on the remaining 30% (validation set). In the training set, 305/455 ETVs (67.0%) were successful. The regression model (containing patient age, cause of hydrocephalus, and previous cerebrospinal fluid shunt) demonstrated good fit (Hosmer-Lemeshow, P = .78) and discrimination (C statistic = 0.70). In the validation set, 105/163 ETVs (64.4%) were successful, and the model maintained good fit (Hosmer-Lemeshow, P = .45), discrimination (C statistic = 0.68), and calibration (calibration slope = 0.88). A simplified ETV Success Score was devised that closely approximates the predicted probability of ETV success. Children most likely to succeed with ETV can now be accurately identified and spared the long-term complications of CSF shunting.
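The development/validation workflow described in this abstract (a 70/30 split, a multivariable logistic regression, and discrimination assessed with the c-statistic) can be sketched as follows. The data below are synthetic stand-ins, not the study's cohort; the predictor names and effect sizes are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 618  # same number of ETVs as the study, but simulated

# Hypothetical predictors standing in for age, etiology, and prior shunt
X = np.column_stack([
    rng.integers(0, 18, n),   # age in years
    rng.integers(0, 5, n),    # cause-of-hydrocephalus code
    rng.integers(0, 2, n),    # previous CSF shunt (0/1)
])
# Invented true relationship used only to generate labels
logit = -0.5 + 0.15 * X[:, 0] - 0.3 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# 70% development / 30% validation split, as in the abstract
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# The c-statistic equals the area under the ROC curve
c_stat = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
```

Validating on data held out from model fitting, as here, is what allows the abstract's reported drop from C = 0.70 to C = 0.68 to be interpreted as genuine out-of-sample discrimination.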

  11. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials of lifestyle diet and exercise interventions for osteoarthritis.

    PubMed

    Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J

    2015-05-01

    The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  12. Development of intelligent model to determine favorable wheelchair tilt and recline angles for people with spinal cord injury.

    PubMed

    Fu, Jicheng; Jan, Yih-Kuen; Jones, Maria

    2011-01-01

Machine-learning techniques have found widespread applications in bioinformatics. Such techniques provide invaluable insight into complex biomedical mechanisms and can predict the optimal individualized intervention for patients. In our case, we are particularly interested in developing an individualized clinical guideline on wheelchair tilt and recline usage for people with spinal cord injury (SCI). Current clinical practice suggests uniform settings for all patients. However, our previous study revealed that the response of skin blood flow to wheelchair tilt and recline settings varied largely among patients. This finding suggests that an individualized setting is needed for people with SCI to maximally utilize residual neurological function to reduce pressure ulcer risk. To achieve this goal, we intend to develop an intelligent model to determine favorable wheelchair usage to reduce pressure ulcer risk for wheelchair users with SCI. In this study, we use artificial neural networks (ANNs) to construct an intelligent model that can predict whether a given tilt and recline setting will be favorable to people with SCI based on neurological function and SCI injury history. Our results indicate that the intelligent model significantly outperforms the traditional statistical approach in accurately classifying favorable wheelchair tilt and recline settings. To the best of our knowledge, this is the first study using intelligent models to predict favorable wheelchair tilt and recline angles. Our methods demonstrate the feasibility of using ANNs to develop individualized wheelchair tilt and recline guidance for people with SCI.
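As an illustration of the kind of classifier described (not the authors' actual network, features, or data), a small feed-forward ANN can be trained to label tilt/recline settings as favorable or unfavorable. The labeling rule and all thresholds below are invented:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 300
tilt = rng.uniform(0, 45, n)              # tilt angle, degrees
recline = rng.uniform(90, 150, n)         # recline angle, degrees
years_since_injury = rng.uniform(0, 20, n)
X = np.column_stack([tilt, recline, years_since_injury])

# Invented rule standing in for the skin blood flow response:
# deeper tilt and recline tend to be "favorable" (label 1)
y = (tilt + 0.5 * (recline - 90) + rng.normal(0, 5, n) > 35).astype(int)

# Scaling matters for neural networks, so it is included in the pipeline
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
accuracy = cross_val_score(clf, X, y, cv=5).mean()
```

Cross-validated accuracy, as computed here, is one way to support the abstract's comparison between an ANN and a traditional statistical classifier on the same data.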

  13. Application of Raman spectroscopy for cervical dysplasia diagnosis

    PubMed Central

    Kanter, Elizabeth M.; Vargis, Elizabeth; Majumder, Shovan; Keller, Matthew D.; Woeste, Emily; Rao, Gautam G.; Mahadevan-Jansen, Anita

    2014-01-01

Cervical cancer is the second most common malignancy among women worldwide, with over 490,000 cases diagnosed and 274,000 deaths each year. Although current screening methods have dramatically reduced cervical cancer incidence and mortality in developed countries, a “See and Treat” method would be preferred, especially in developing countries. Results from our previous work suggested that Raman spectroscopy can be used to detect cervical precancers; however, with a classification accuracy of 88%, it was not clinically applicable. In this paper, we describe how incorporating a woman's hormonal status, particularly the point in the menstrual cycle and menopausal state, into our previously developed classification algorithm improves the accuracy of our method to 94%. The results of this paper bring Raman spectroscopy one step closer to being utilized in a clinical setting to diagnose cervical dysplasia. PMID:19343687

  14. MAVTgsa: An R Package for Gene Set (Enrichment) Analysis

    DOE PAGES

    Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; ...

    2014-01-01

Gene set analysis methods aim to determine whether an a priori defined set of genes shows a statistically significant difference in expression on either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.
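The random-forests idea in point (3), scoring a gene set by how well its member genes predict the experimental condition, can be sketched in Python (MAVTgsa itself is an R package; this is an illustrative analogue on simulated expression data, with invented set sizes and effect sizes):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_samples, n_genes = 60, 200
expr = rng.normal(size=(n_samples, n_genes))      # simulated expression matrix
labels = np.repeat([0, 1], n_samples // 2)        # two experimental conditions
expr[labels == 1, :10] += 1.0                     # genes 0-9 carry real signal

gene_set = np.arange(10)       # an a priori defined set containing the signal
background = np.arange(10, 20) # a same-sized set with no signal

def set_score(member_genes):
    """Cross-validated accuracy of a random forest restricted to one gene set."""
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(rf, expr[:, member_genes], labels, cv=5).mean()

informative = set_score(gene_set)
uninformative = set_score(background)
```

A gene set whose members classify the samples well above chance, as `gene_set` does here relative to `background`, is the kind of set this procedure flags as significant (MAVTgsa additionally converts such scores into P values and FDR q-values).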

  15. An Alternate Set of Basis Functions for the Electromagnetic Solution of Arbitrarily-Shaped, Three-Dimensional, Closed, Conducting Bodies Using Method of Moments

    NASA Technical Reports Server (NTRS)

    Mackenzie, Anne I.; Baginski, Michael E.; Rao, Sadasiva M.

    2008-01-01

In this work, we present an alternate set of basis functions, each defined over a pair of planar triangular patches, for the method of moments solution of electromagnetic scattering and radiation problems associated with arbitrarily-shaped, closed, conducting surfaces. The present basis functions are point-wise orthogonal to the pulse basis functions previously defined. The prime motivation for developing the present set of basis functions is to utilize them for the electromagnetic solution of dielectric bodies using a surface integral equation formulation which involves both electric and magnetic currents. However, in the present work, only the conducting body solution is presented and compared with other data.

  16. The undergraduate research fellows program: a unique model to promote engagement in research.

    PubMed

    Vessey, Judith A; DeMarco, Rosanna F

    2008-01-01

Well-educated nurses with research expertise are needed to advance evidence-based nursing practice. A primary goal of undergraduate nursing curricula is to create meaningful participatory experiences that help students develop a research skill set, supporting rapid career advancement of gifted young graduates interested in nursing research and faculty careers. Three research enrichment models (undergraduate honors programs, research assistant work-for-hire programs, and research work/mentorship programs), to be used in conjunction with standard research content, are reviewed. The development and implementation of one research work/mentorship program, the Boston College undergraduate research fellows program (UGRF), is explicated. This process included surveying previous UGRFs, followed by creating a retreat and seminars to address specific research skill sets. The research skill sets included (a) how to develop a research team, (b) accurate data retrieval, (c) ethical considerations, (d) the research process, (e) data management, (f) successful writing of abstracts, and (g) creating effective poster presentations. Outcomes include evidence of involvement in research productivity and valuing of evidence-based practice through the UGRF mentorship process with faculty partners.

  17. An improved MCNP version of the NORMAN voxel phantom for dosimetry studies.

    PubMed

    Ferrari, P; Gualdrini, G

    2005-09-21

In recent years, voxel phantoms have been developed on the basis of tomographic data of real individuals, allowing new sets of conversion coefficients for effective dose to be calculated. Progress in radiation studies led ICRP to revise its recommendations, and a new report, already circulated in draft form, is expected to change the current effective dose evaluation method. In the present paper the voxel phantom NORMAN, developed at HPA (formerly NRPB), was employed with the MCNP Monte Carlo code. A modified version of the phantom, NORMAN-05, was developed to take into account the new set of tissues and weighting factors proposed in the cited ICRP draft. Air kerma to organ equivalent dose and effective dose conversion coefficients for antero-posterior and postero-anterior parallel photon beam irradiations, from 20 keV to 10 MeV, have been calculated and compared with data obtained in other laboratories using different numerical phantoms. The results are in good agreement with published data, with some differences for the effective dose calculated using the proposed new tissue weighting factor set in comparison with previous evaluations based on the ICRP 60 report.

  18. An inventory of undiscovered Canadian mineral resources

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Griffiths, J. C.

    1982-01-01

    Unit regional value (URV) and unit regional weight are area standardized measures of the expected value and quantity, respectively, of the mineral resources of a region. Estimation and manipulation of the URV statistic is the basis of an approach to mineral resource evaluation. Estimates of the kind and value of exploitable mineral resources yet to be discovered in the provinces of Canada are used as an illustration of the procedure. The URV statistic is set within a previously developed model wherein geology, as measured by point counting geologic maps, is related to the historical record of mineral resource production of well-developed regions of the world, such as the 50 states of the U.S.A.; these may be considered the training set. The Canadian provinces are related to this training set using geological information obtained in the same way from geologic maps of the provinces. The desired predictions of yet to be discovered mineral resources in the Canadian provinces arise as a consequence. The implicit assumption is that regions of similar geology, if equally well developed, will produce similar weights and values of mineral resources.
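The training-set logic described here (relate point-counted geologic composition to recorded production value in well-developed regions, then carry the fitted relation over to new regions) can be sketched as a simple regression. All compositions, dollar values, and region counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_regions, n_rock_types = 50, 4

# Fraction of each map unit per training region (rows sum to 1),
# standing in for point counts of geologic maps
geology = rng.dirichlet(np.ones(n_rock_types), size=n_regions)

# Invented "value per unit area" contribution of each rock type
true_w = np.array([120.0, 40.0, 5.0, 300.0])
urv = geology @ true_w + rng.normal(0, 10, n_regions)  # observed URV, noisy

# Fit the geology-to-URV relation on the well-developed training regions
w_hat, *_ = np.linalg.lstsq(geology, urv, rcond=None)

# Predict URV for a province with known geology but unknown production
province = np.array([0.3, 0.2, 0.4, 0.1])
predicted_urv = province @ w_hat
```

The implicit assumption, stated at the end of the abstract, is baked into this sketch: the fitted weights only transfer if regions of similar geology would, when equally well developed, yield similar production.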

  19. On the development of HSCT tail sizing criteria using linear matrix inequalities

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac

    1995-01-01

    This report presents the results of a study to extend existing high speed civil transport (HSCT) tail sizing criteria using linear matrix inequalities (LMI). In particular, the effects of feedback specifications, such as MIL STD 1797 Level 1 and 2 flying qualities requirements, and actuator amplitude and rate constraints on the maximum allowable cg travel for a given set of tail sizes are considered. Results comparing previously developed industry criteria and the LMI methodology on an HSCT concept airplane are presented.

  20. A landscape inventory framework: scenic analyses of the Northern Great Plains

    Treesearch

    Litton R. Burton Jr.; Robert J. Tetlow

    1978-01-01

    A set of four visual inventories are proposed. They are designed to document scenic resources for varied scales of application, from regional and general to local and specific. The Northern Great Plains is used as a case study. Scenic analysis and identification of criteria extend earlier work. The inventory is based on (1) study of previously developed landscape...

  1. Charting a Course through CORAL: Texas A&M University Libraries' Experience Implementing an Open-Source Electronic Resources Management System

    ERIC Educational Resources Information Center

    Hartnett, Eric; Beh, Eugenia; Resnick, Taryn; Ugaz, Ana; Tabacaru, Simona

    2013-01-01

    In 2010, after two previous unsuccessful attempts at electronic resources management system (ERMS) implementation, Texas A&M University (TAMU) Libraries set out once again to find an ERMS that would fit its needs. After surveying the field, TAMU Libraries selected the University of Notre Dame Hesburgh Libraries-developed, open-source ERMS,…

  2. Application of the One-Minute Preceptor Technique by Novice Teachers in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Chan, Lap Ki; Yang, Jian; Irby, David M.

    2015-01-01

    The one-minute preceptor (OMP) was originally developed in the ambulatory care setting as a time-efficient teaching technique for learner-centered clinical training. There are also possible advantages of using the OMP in the gross anatomy laboratory. However, in a previous study it was found that providing training to experienced gross anatomy…

  3. A Language Support Program for English-Medium Instruction Courses: Its Development and Evaluation in an EFL Setting

    ERIC Educational Resources Information Center

    Chang, Ji-Yeon; Kim, Wooyeon; Lee, Heewon

    2017-01-01

    Many English as a foreign language universities have increased the number of English-medium instruction (EMI) courses regardless of their students' preparedness for them. As a result, previous studies have reflected the necessity of additional language assistance to students who have to take EMI courses with limited English proficiency. Drawing…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricafort, Juliet

    A model was developed to determine the forces exerted by several flexor and extensor muscles of the human knee under static conditions. The following muscles were studied: the gastrocnemius, biceps femoris, semitendinosus, semimembranosus, and the set of quadricep muscles. The tibia and fibula were each modeled as rigid bodies; muscles were modeled by their functional lines of action in space. Assumptions based on previous data were used to resolve the indeterminacy.

  5. Drop and Give Us 20, Seifried: A Practical Response to "Defending the Use of Punishment by Coaches"

    ERIC Educational Resources Information Center

    Albrecht, Rick

    2009-01-01

    In a recent issue of "Quest," Seifried (2008) explicitly depicted his work as an "attempt to present some rationale for supporting the use by coaches of corporal punishment in the sport setting . . . [and] to develop a defense, not previously offered, for those coaches who thoughtfully employ punishment strategies to manage their players and…

  6. Estimating tuberculosis incidence from primary survey data: a mathematical modeling approach.

    PubMed

    Pandey, S; Chadha, V K; Laxminarayan, R; Arinaminpathy, N

    2017-04-01

There is an urgent need for improved estimates of the burden of tuberculosis (TB). We aimed to develop a new quantitative method based on mathematical modelling, and to demonstrate its application to TB in India. We developed a simple model of TB transmission dynamics to estimate the annual incidence of TB disease from the annual risk of tuberculous infection and the prevalence of smear-positive TB. We first compared model estimates of annual infections per smear-positive TB case against previous empirical estimates from China, Korea and the Philippines. We then applied the model to estimate TB incidence in India, stratified by urban and rural settings. The model estimates show agreement with previous empirical estimates. Applied to India, the model suggests an annual incidence of smear-positive TB of 89.8 per 100 000 population (95%CI 56.8-156.3). Results show differences between urban and rural TB: while an urban TB case infects more individuals per year, a rural TB case remains infectious for appreciably longer, suggesting the need for interventions tailored to these different settings. Simple models of TB transmission, in conjunction with the necessary data, can offer approaches to burden estimation that complement those currently in use.
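As a toy illustration of the bookkeeping such a model involves (this is not the authors' model, and every figure below is invented rather than taken from the Indian data), the annual risk of infection and the prevalence of smear-positive TB jointly pin down infections per case and, with an assumed infectious duration, incidence:

```python
# Invented illustrative figures, not the study's estimates
population = 100_000
arti = 0.015                 # annual risk of tuberculous infection (1.5%)
prev_per_100k = 250.0        # smear-positive prevalence per 100 000
mean_duration_years = 2.0    # assumed mean duration of infectiousness

# Infections generated per prevalent smear-positive case per year
annual_infections = arti * population
prevalent_cases = prev_per_100k * population / 100_000
infections_per_case = annual_infections / prevalent_cases   # 6.0 here

# At steady state, incidence is roughly prevalence divided by duration
incidence_per_100k = prev_per_100k / mean_duration_years    # 125.0 here
```

The urban/rural contrast reported in the abstract maps directly onto these two quantities: a higher `infections_per_case` in urban settings versus a longer `mean_duration_years` in rural ones can produce similar burdens that call for different interventions.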

  7. GUIDANCE2: accurate detection of unreliable alignment regions accounting for the uncertainty of multiple parameters.

    PubMed

    Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal

    2015-07-01

    Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. GSA-PCA: gene set generation by principal component analysis of the Laplacian matrix of a metabolic network

    PubMed Central

    2012-01-01

    Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
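The core construction (take the Laplacian of the metabolic network and read candidate gene sets off its principal components) can be sketched on a toy graph. This is an illustrative numpy version, not the authors' implementation, and the six-node network is invented:

```python
import numpy as np

# Toy "metabolic network": two triangles of genes joined by a single edge
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # graph Laplacian

# Principal components of the Laplacian: eigenvectors ordered by eigenvalue.
# The smallest eigenvalue is 0 (constant vector) for a connected graph.
vals, vecs = np.linalg.eigh(L)

# Genes loading with the same sign on the second eigenvector (the Fiedler
# vector) form one candidate set; it separates the two clusters here.
component = vecs[:, 1]
gene_set = set(np.where(component > 0)[0])
```

Thresholding eigenvector loadings like this yields sets that follow the network's topology rather than isolated textbook pathways, which is the property the abstract argues makes the resulting GSA more complete.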

  9. Use of Early Inhaled Nitric Oxide Therapy in Fat Embolism Syndrome to Prevent Right Heart Failure

    PubMed Central

    Koyfman, Leonid; Kutz, Ruslan; Frenkel, Amit; Gruenbaum, Shaun E.; Zlotnik, Alexander; Klein, Moti

    2014-01-01

    Fat embolism syndrome (FES) is a life-threatening condition in which multiorgan dysfunction manifests 48–72 hours after long bone or pelvis fractures. Right ventricular (RV) failure, especially in the setting of pulmonary hypertension, is a frequent feature of FES. We report our experience treating 2 young, previously healthy trauma patients who developed severe hypoxemia in the setting of FES. Neither patient had evidence of RV dysfunction on echocardiogram. The patients were treated with inhaled nitric oxide (NO), and their oxygenation significantly improved over the subsequent few days. Neither patient developed any cardiovascular compromise. Patients with FES that have severe hypoxemia and evidence of adult respiratory distress syndrome (ARDS) are likely at risk for developing RV failure. We recommend that these patients with FES and severe refractory hypoxemia should be treated with inhaled NO therapy prior to the onset of RV dysfunction. PMID:25180103

  10. Through the eyes of a child: preschoolers' identification of emotional expressions from the child affective facial expression (CAFE) set.

    PubMed

    LoBue, Vanessa; Baker, Lewis; Thrasher, Cat

    2017-08-10

    Researchers have been interested in the perception of human emotional expressions for decades. Importantly, most empirical work in this domain has relied on controlled stimulus sets of adults posing for various emotional expressions. Recently, the Child Affective Facial Expression (CAFE) set was introduced to the scientific community, featuring a large validated set of photographs of preschool aged children posing for seven different emotional expressions. Although the CAFE set was extensively validated using adult participants, the set was designed for use with children. It is therefore necessary to verify that adult validation applies to child performance. In the current study, we examined 3- to 4-year-olds' identification of a subset of children's faces in the CAFE set, and compared it to adult ratings cited in previous research. Our results demonstrate an exceptionally strong relationship between adult ratings of the CAFE photos and children's ratings, suggesting that the adult validation of the set can be applied to preschool-aged participants. The results are discussed in terms of methodological implications for the use of the CAFE set with children, and theoretical implications for using the set to study the development of emotion perception in early childhood.
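The comparison the abstract reports, relating children's identification of CAFE photographs to the published adult validation ratings, comes down to correlating two rating vectors. The numbers below are invented placeholders, not CAFE data, and serve only to show the computation:

```python
import numpy as np

# Hypothetical proportion-correct ratings for ten photographs:
# one vector standing in for adult validation, one for child performance
adult = np.array([0.95, 0.88, 0.91, 0.70, 0.85, 0.78, 0.92, 0.66, 0.81, 0.74])
child = np.array([0.90, 0.84, 0.89, 0.62, 0.80, 0.71, 0.88, 0.60, 0.77, 0.69])

# Pearson correlation between the two sets of item-level ratings
r = np.corrcoef(adult, child)[0, 1]
```

A high item-level correlation of this kind is what licenses the abstract's conclusion that the adult validation of the stimulus set transfers to preschool-aged participants.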

  11. Required Assets for a Nuclear Energy Applied R&D Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harold F. McFarlane; Craig L. Jacobson

    2009-03-01

    This report is one of a set of three documents that have collectively identified and recommended research and development capabilities that will be required to advance nuclear energy in the next 20 to 50 years. The first report, Nuclear Energy for the Future: Required Research and Development Capabilities—An Industry Perspective, was produced by Battelle Memorial Institute at the request of the Assistant Secretary of Nuclear Energy. That report, drawn from input by industry, academia, and Department of Energy laboratories, can be found in Appendix 5.1. This Idaho National Laboratory report maps the nuclear-specific capabilities from the Battelle report onto facility requirements, identifying options from the set of national laboratory, university, industry, and international facilities. It also identifies significant gaps in the required facility capabilities. The third document, Executive Recommendations for Nuclear R&D Capabilities, is a letter report containing a set of recommendations made by a team of senior executives representing nuclear vendors, utilities, academia, and the national laboratories (at Battelle’s request). That third report can be found in Appendix 5.2. The three reports should be considered as a set in order to have a more complete picture. The basis of this report was drawn from three sources: previous Department of Energy reports, workshops and committee meetings, and expert opinion. The facilities discussed were winnowed from several hundred facilities that had previously been catalogued and several additional facilities that had been overlooked in past exercises. The scope of this report is limited to commercial nuclear energy and those things the federal government, or more specifically the Office of Nuclear Energy, should do to support its expanded deployment in order to increase energy security and reduce carbon emissions. In the context of this report, capabilities mean innovative, well-structured research and development programs, a viable work force, and well-equipped specialized facilities.

  12. Deterministic protein inference for shotgun proteomics data provides new insights into Arabidopsis pollen development and function

    PubMed Central

    Grobei, Monica A.; Qeli, Ermir; Brunner, Erich; Rehrauer, Hubert; Zhang, Runxuan; Roschitzki, Bernd; Basler, Konrad; Ahrens, Christian H.; Grossniklaus, Ueli

    2009-01-01

    Pollen, the male gametophyte of flowering plants, represents an ideal biological system to study developmental processes, such as cell polarity, tip growth, and morphogenesis. Upon hydration, the metabolically quiescent pollen rapidly switches to an active state, exhibiting extremely fast growth. This rapid switch requires relevant proteins to be stored in the mature pollen, where they have to retain functionality in a desiccated environment. Using a shotgun proteomics approach, we unambiguously identified ∼3500 proteins in Arabidopsis pollen, including 537 proteins that were not identified in genetic or transcriptomic studies. To generate this comprehensive reference data set, which extends the previously reported pollen proteome by a factor of 13, we developed a novel deterministic peptide classification scheme for protein inference. This generally applicable approach considers the gene model–protein sequence–protein accession relationships. It allowed us to classify and eliminate ambiguities inherently associated with any shotgun proteomics data set, to report a conservative list of protein identifications, and to seamlessly integrate data from previous transcriptomics studies. Manual validation of proteins unambiguously identified by a single, information-rich peptide enabled us to significantly reduce the false discovery rate, while keeping valuable identifications of shorter and less abundant proteins. Bioinformatic analyses revealed a higher stability of pollen proteins compared to those of other tissues and implicated a protein family of previously unknown function in vesicle trafficking. Interestingly, the pollen proteome is most similar to that of seeds, indicating physiological similarities between these developmentally distinct tissues. PMID:19546170

  13. A novel semi-transductive learning framework for efficient atypicality detection in chest radiographs

    NASA Astrophysics Data System (ADS)

    Alzubaidi, Mohammad; Balasubramanian, Vineeth; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.

    2012-03-01

    Inductive learning refers to machine learning algorithms that learn a model from a set of training data instances. Any test instance is then classified by comparing it to the learned model. When the set of training instances lends itself well to modeling, the use of a model substantially reduces the computation cost of classification. However, some training data sets are complex, and do not lend themselves well to modeling. Transductive learning refers to machine learning algorithms that classify test instances by comparing them to all of the training instances, without creating an explicit model. This can produce better classification performance, but at a much higher computational cost. Medical images vary greatly across human populations, constituting a data set that does not lend itself well to modeling. Our previous work showed that the wide variations seen across training sets of "normal" chest radiographs make it difficult to successfully classify test radiographs with an inductive (modeling) approach, and that a transductive approach leads to much better performance in detecting atypical regions. The problem with the transductive approach is its high computational cost. This paper develops and demonstrates a novel semi-transductive framework that can address the unique challenges of atypicality detection in chest radiographs. The proposed framework combines the superior performance of transductive methods with the reduced computational cost of inductive methods. Our results show that the proposed semi-transductive approach provides both effective and efficient detection of atypical regions within a set of chest radiographs previously labeled by Mayo Clinic expert thoracic radiologists.
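
    The inductive/transductive contrast described above can be made concrete with a minimal sketch. Nearest-centroid and 1-nearest-neighbour classifiers are generic stand-ins chosen for illustration; they are not the atypicality detectors used in the paper.

```python
import math

def fit_inductive(X, y):
    """Inductive learning: compress the training data into a model
    (here, one centroid per class); a test instance is compared only
    to the model, so classification cost is independent of the
    training-set size."""
    centroids = {}
    for c in set(y):
        pts = [x for x, label in zip(X, y) if label == c]
        centroids[c] = [sum(col) / len(pts) for col in zip(*pts)]
    return lambda x: min(centroids, key=lambda c: math.dist(x, centroids[c]))

def fit_transductive(X, y):
    """Transductive learning: no explicit model; every test instance is
    compared against all training instances (1-nearest-neighbour here),
    which is costlier but copes with data that resists modeling."""
    return lambda x: min(zip(X, y), key=lambda xy: math.dist(x, xy[0]))[1]
```

In this toy setting the inductive classifier compares a test point to two centroids regardless of training-set size, while the transductive one scans every training instance; a semi-transductive scheme would fall back to the exhaustive comparison only where the cheap model is unreliable.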

  14. Development of quality indicators to evaluate antibiotic treatment of patients with community-acquired pneumonia in Indonesia.

    PubMed

    Farida, Helmia; Rondags, Angelique; Gasem, M Hussein; Leong, Katharina; Adityana, A; van den Broek, Peterhans J; Keuter, Monique; Natsch, Stephanie

    2015-04-01

    To develop an instrument for evaluating the quality of antibiotic management of patients with community-acquired pneumonia (CAP) applicable in a middle-income developing country. A previous study and Indonesian guidelines were reviewed to derive potential quality of care indicators (QIs). An expert panel performed a two-round Delphi consensus procedure on the QIs' relevance to patient recovery, reduction of antimicrobial resistance and cost containment. Applicability in practice, including reliability, feasibility and opportunity for improvement, was determined in a data set of 128 patients hospitalised with CAP in Semarang, Indonesia. Fifteen QIs were selected by the consensus procedure. Five QIs did not pass feasibility criteria, because of inappropriate documentation, inefficient laboratory services or patient factors. Three QIs provided minor opportunity for improvement. Two QIs contradicted each other; one of these was considered not valid and excluded. A final set of six QIs was defined for use in the Indonesian setting. Using the Delphi method, we defined a list of QIs for assessing the quality of care, in particular antibiotic treatment, for CAP in Indonesia. For further improvement, a modified Delphi method that includes discussion, a sound medical documentation system, improvement of microbiology laboratory services, and multi-center applicability tests are needed to develop a valid and applicable QI list for the Indonesian setting. © 2014 John Wiley & Sons Ltd.

  15. Development of multiplex microsatellite PCR panels for the seagrass Thalassia hemprichii (Hydrocharitaceae)1

    PubMed Central

    van Dijk, Kor-jent; Mellors, Jane; Waycott, Michelle

    2014-01-01

    • Premise of the study: New microsatellites were developed for the seagrass Thalassia hemprichii (Hydrocharitaceae), a long-lived seagrass species found throughout the shallow waters of the tropical and subtropical Indo-West Pacific. Three multiplex PCR panels were designed utilizing new and previously developed markers, resulting in a toolkit for generating a 16-locus genotype. • Methods and Results: Through the use of microsatellite enrichment and next-generation sequencing, 16 new, validated, polymorphic microsatellite markers were isolated. Diversity ranged from two to four alleles per locus, totaling 36 alleles. These markers, plus previously developed microsatellite markers for T. hemprichii and T. testudinum, were tested for suitability in multiplex PCR panels. • Conclusions: The generation of an easily replicated suite of multiplex panels of codominant molecular markers will allow for high-resolution and detailed genetic structure analysis and clonality assessment with minimal genotyping costs. We suggest the establishment of a T. hemprichii primer convention for the unification of future data sets. PMID:25383269

  16. A New Z Score Curve of the Coronary Arterial Internal Diameter Using the Lambda-Mu-Sigma Method in a Pediatric Population.

    PubMed

    Kobayashi, Tohru; Fuse, Shigeto; Sakamoto, Naoko; Mikami, Masashi; Ogawa, Shunichi; Hamaoka, Kenji; Arakaki, Yoshio; Nakamura, Tsuneyuki; Nagasawa, Hiroyuki; Kato, Taichi; Jibiki, Toshiaki; Iwashima, Satoru; Yamakawa, Masaru; Ohkubo, Takashi; Shimoyama, Shinya; Aso, Kentaro; Sato, Seiichi; Saji, Tsutomu

    2016-08-01

    Several coronary artery Z score models have been developed. However, a Z score model derived by the lambda-mu-sigma (LMS) method has not been established. Echocardiographic measurements of the proximal right coronary artery, left main coronary artery, proximal left anterior descending coronary artery, and proximal left circumflex artery were prospectively collected in 3,851 healthy children ≤18 years of age and divided into developmental and validation data sets. In the developmental data set, smooth curves were fitted for each coronary artery using linear, logarithmic, square-root, and LMS methods for both sexes. The relative goodness of fit of these models was compared using the Bayesian information criterion. The best-fitting model was tested for reproducibility using the validation data set. The goodness of fit of the selected model was visually compared with that of the previously reported regression models using a Q-Q plot. Because the internal diameter of each coronary artery was not similar between sexes, sex-specific Z score models were developed. The LMS model with body surface area as the independent variable showed the best goodness of fit; therefore, the internal diameter of each coronary artery was transformed into a sex-specific Z score on the basis of body surface area using the LMS method. In the validation data set, a Q-Q plot of each model indicated that the distribution of Z scores in the LMS models was closer to the normal distribution compared with previously reported regression models. Finally, the final models for each coronary artery in both sexes were developed using the developmental and validation data sets. A Microsoft Excel-based Z score calculator was also created, which is freely available online (http://raise.umin.jp/zsp/calculator/). Novel LMS models with which to estimate the sex-specific Z score of each internal coronary artery diameter were generated and validated using a large pediatric population. 
Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
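
    The LMS transform behind such Z score curves is standard: with lambda (L), mu (M), and sigma (S) known, a measurement is converted to a Z score via a Box-Cox normalization. A minimal sketch follows; the parameter values in the usage line are invented for illustration, whereas the study fits L, M, and S as sex-specific smooth functions of body surface area for each coronary artery.

```python
import math

def lms_z_score(measurement, L, M, S):
    """Z score under the LMS method: L is the Box-Cox power (lambda),
    M the median (mu), and S the coefficient of variation (sigma).
    When L == 0 the Box-Cox transform reduces to a logarithm."""
    if L != 0:
        return ((measurement / M) ** L - 1.0) / (L * S)
    return math.log(measurement / M) / S
```

For example, with L = 1, M = 2.0 mm, and S = 0.1, a 2.2 mm diameter maps to Z = 1.0.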

  17. Predicting factors for malaria re-introduction: an applied model in an elimination setting to prevent malaria outbreaks.

    PubMed

    Ranjbar, Mansour; Shoghli, Alireza; Kolifarhood, Goodarz; Tabatabaei, Seyed Mehdi; Amlashi, Morteza; Mohammadi, Mahdi

    2016-03-02

    Malaria re-introduction is a challenge in elimination settings. To prevent re-introduction, receptivity, vulnerability, and health system capacity of foci should be monitored using appropriate tools. This study aimed to design an applicable model to monitor predicting factors of re-introduction of malaria in highly prone areas. This exploratory, descriptive study was conducted in a pre-elimination setting with a high risk of malaria transmission re-introduction. By using the nominal group technique and a literature review, a list of predicting indicators for malaria re-introduction and outbreak was defined. Accordingly, a checklist was developed and completed in the field for foci affected by re-introduction and for cleared-up foci as a control group, for a period of 12 weeks before re-introduction and for the same period in the previous year. Using field data and the analytic hierarchy process (AHP), each variable and its sub-categories were weighted; by calculating geometric means for each sub-category, scores for the corresponding cells of the interaction matrices were obtained, and lower and upper thresholds of the risk strata, including low and mild risk of re-introduction and moderate and high risk of malaria outbreaks, were determined. The developed predictive model was calibrated through resampling with different sets of explanatory variables using R software. Sensitivity and specificity of the model were calculated based on new samples. Twenty explanatory predictive variables of malaria re-introduction were identified and a predictive model was developed. Unpermitted immigrants from endemic neighbouring countries were determined as a pivotal factor (AHP score: 0.181). Moreover, quality of population movement (0.114), following malaria transmission season (0.088), average daily minimum temperature in the previous 8 weeks (0.062), an outdoor resting shelter for vectors (0.045), and rainfall (0.042) were determined.
    Positive and negative predictive values of the model were 81.8% and 100%, respectively. This study introduced a new, simple, yet reliable model to forecast malaria re-introduction and outbreaks eight weeks in advance in pre-elimination and elimination settings. The model incorporates comprehensive deterministic factors that can easily be measured in the field, thereby facilitating preventive measures.
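
    The geometric-mean weighting step of the AHP mentioned above can be sketched generically. The pairwise-comparison matrix in the usage example is hypothetical, not the study's expert judgments.

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    using the row geometric-mean method (a common AHP approximation
    to the principal-eigenvector weights)."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]
```

For two factors where the first is judged three times as important as the second, `ahp_weights([[1, 3], [1/3, 1]])` yields weights of 0.75 and 0.25.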

  18. Identification of consensus biomarkers for predicting non-genotoxic hepatocarcinogens

    PubMed Central

    Huang, Shan-Han; Tung, Chun-Wei

    2017-01-01

    The assessment of non-genotoxic hepatocarcinogens (NGHCs) currently relies on two-year rodent bioassays. Toxicogenomics biomarkers provide a potential alternative method for the prioritization of NGHCs that could be useful for risk assessment. However, previous studies using inconsistently classified chemicals as the training set and a single microarray dataset identified no consensus biomarkers. In this study, four consensus biomarkers (A2m, Ca3, Cxcl1, and Cyp8b1) were identified from four large-scale microarray datasets of the one-day single maximum tolerated dose and a large set of chemicals without inconsistent classifications. Machine learning techniques were subsequently applied to develop prediction models for NGHCs. The final bagging decision tree models were constructed with an average AUC performance of 0.803 for an independent test. A set of 16 chemicals with controversial classifications were reclassified according to the consensus biomarkers. The developed prediction models and identified consensus biomarkers are expected to be potential alternative methods for prioritization of NGHCs for further experimental validation. PMID:28117354
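
    Bagging (bootstrap aggregation) of tree-based learners, as used for the final models, can be sketched in miniature. Decision stumps stand in here for full decision trees, and the one-feature data in the usage example is synthetic, not the toxicogenomics biomarker features.

```python
import random
from collections import Counter

def train_stump(X, y):
    """Fit a one-feature threshold stump by exhaustive search over
    features, thresholds, and label orientation."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for hi_label in (0, 1):
                preds = [hi_label if row[f] > t else 1 - hi_label for row in X]
                err = sum(p != label for p, label in zip(preds, y))
                if best is None or err < best[0]:
                    best = (err, f, t, hi_label)
    _, f, t, hi_label = best
    return lambda row: hi_label if row[f] > t else 1 - hi_label

def bagging_fit(X, y, n_estimators=25, seed=0):
    """Bagging: train each base learner on a bootstrap resample and
    classify new instances by majority vote over the ensemble."""
    rng = random.Random(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]  # sample with replacement
        models.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda row: Counter(m(row) for m in models).most_common(1)[0][0]
```

Each stump sees a different bootstrap resample, and the ensemble's majority vote smooths out the instability of the individual base learners.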

  19. Development and application of a phylogenomic toolkit: Resolving the evolutionary history of Madagascar’s lemurs

    PubMed Central

    Horvath, Julie E.; Weisrock, David W.; Embry, Stephanie L.; Fiorentino, Isabella; Balhoff, James P.; Kappeler, Peter; Wray, Gregory A.; Willard, Huntington F.; Yoder, Anne D.

    2008-01-01

    Lemurs and the other strepsirrhine primates are of great interest to the primate genomics community due to their phylogenetic placement as the sister lineage to all other primates. Previous attempts to resolve the phylogeny of lemurs employed limited mitochondrial or small nuclear data sets, with many relationships poorly supported or entirely unresolved. We used genomic resources to develop 11 novel markers from nine chromosomes, representing ∼9 kb of nuclear sequence data. In combination with previously published nuclear and mitochondrial loci, this yields a data set of more than 16 kb and adds ∼275 kb of DNA sequence to current databases. Our phylogenetic analyses confirm hypotheses of lemuriform monophyly and provide robust resolution of the phylogenetic relationships among the five lemuriform families. We verify that the genus Daubentonia is the sister lineage to all other lemurs. The Cheirogaleidae and Lepilemuridae are sister taxa and together form the sister lineage to the Indriidae; this clade is the sister lineage to the Lemuridae. Divergence time estimates indicate that lemurs are an ancient group, with their initial diversification occurring around the Cretaceous-Tertiary boundary. Given the power of this data set to resolve branches in a notoriously problematic area of primate phylogeny, we anticipate that our phylogenomic toolkit will be of value to other studies of primate phylogeny and diversification. Moreover, the methods applied will be broadly applicable to other taxonomic groups where phylogenetic relationships have been notoriously difficult to resolve. PMID:18245770

  20. The Role of Socio-Communicative Rearing Environments on the Development of Social and Physical Cognition in Apes

    PubMed Central

    Russell, J. L.; Lyn, H.; Schaeffer, J. A.; Hopkins, W. D.

    2011-01-01

    The cultural intelligence hypothesis (CIH) claims that humans' advanced cognition is a direct result of human culture and that children are uniquely specialized to absorb and utilize this cultural experience (Tomasello, 2000). Comparative data demonstrating that 2.5-year-old human children outperform apes on measures of social cognition but not on measures of physical cognition support this claim (E. Herrmann, J. Call, M. V. Hernandez-Lloreda, B. Hare, & M. Tomasello, 2007). However, the previous study failed to control for rearing when comparing these two species. Specifically, the human children were raised in a human culture whereas the apes were raised in standard sanctuary settings. To further explore the CIH, here we compared the performance on multiple measures of social and physical cognition in a group of standard-reared apes raised in conditions typical of zoo and biomedical laboratory settings to that of apes reared in an enculturated, socio-communicatively rich environment. Overall, the enculturated apes significantly outperformed their standard-reared counterparts on the cognitive tasks, and this was particularly true for measures of communication. Furthermore, the performance of the enculturated apes was very similar to previously reported data from 2.5-year-old children. We conclude that apes who are reared in a human-like, socio-communicatively rich environment develop superior communicative abilities compared to apes reared in standard laboratory settings, which supports some assumptions of the cultural intelligence hypothesis. PMID:22010903

  1. Ensembl Genomes 2013: scaling up access to genome-wide data.

    PubMed

    Kersey, Paul Julian; Allen, James E; Christensen, Mikkel; Davis, Paul; Falin, Lee J; Grabmueller, Christoph; Hughes, Daniel Seth Toney; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Langridge, Nicholas; McDowall, Mark D; Maheswari, Uma; Maslen, Gareth; Nuhn, Michael; Ong, Chuang Kee; Paulini, Michael; Pedro, Helder; Toneva, Iliana; Tuli, Mary Ann; Walts, Brandon; Williams, Gareth; Wilson, Derek; Youens-Clark, Ken; Monaco, Marcela K; Stein, Joshua; Wei, Xuehong; Ware, Doreen; Bolser, Daniel M; Howe, Kevin Lee; Kulesha, Eugene; Lawson, Daniel; Staines, Daniel Michael

    2014-01-01

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species. The project exploits and extends technologies for genome annotation, analysis and dissemination, developed in the context of the vertebrate-focused Ensembl project, and provides a complementary set of resources for non-vertebrate species through a consistent set of programmatic and interactive interfaces. These provide access to data including reference sequence, gene models, transcriptional data, polymorphisms and comparative analysis. This article provides an update to the previous publications about the resource, with a focus on recent developments. These include the addition of important new genomes (and related data sets) including crop plants, vectors of human disease and eukaryotic pathogens. In addition, the resource has scaled up its representation of bacterial genomes, and now includes the genomes of over 9000 bacteria. Specific extensions to the web and programmatic interfaces have been developed to support users in navigating these large data sets. Looking forward, analytic tools to allow targeted selection of data for visualization and download are likely to become increasingly important in future as the number of available genomes increases within all domains of life, and some of the challenges faced in representing bacterial data are likely to become commonplace for eukaryotes in future.

  2. Development and field testing of a consumer shared decision-making training program for adults with low literacy.

    PubMed

    Muscat, Danielle M; Morony, Suzanne; Shepherd, Heather L; Smith, Sian K; Dhillon, Haryana M; Trevena, Lyndal; Hayen, Andrew; Luxford, Karen; Nutbeam, Don; McCaffery, Kirsten

    2015-10-01

    Given the scarcity of shared decision-making (SDM) interventions for adults with low literacy, we created a SDM training program tailored to this population to be delivered in adult education settings. Formative evaluation during program development included a review of the problem and previous efforts to address it, qualitative interviews with the target population, program planning and field testing. A comprehensive SDM training program was developed incorporating core SDM elements. The program aimed to improve students' understanding of SDM and to provide them with the necessary skills (understanding probabilistic risks and benefits, personal values and preferences) and self-efficacy to use an existing set of questions (the AskShareKnow questions) as a means to engage in SDM during healthcare interactions. There is an ethical imperative to develop SDM interventions for adults with lower literacy. Generic training programs delivered direct-to-consumers in adult education settings offer promise in a national and international environment where too few initiatives exist. Formative evaluation of the program offers practical insights into developing consumer-focused SDM training. The content of the program can be used as a guide for future efforts to engage consumers in SDM. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Inferring Mechanisms of Compensation from E-MAP and SGA Data Using Local Search Algorithms for Max Cut

    NASA Astrophysics Data System (ADS)

    Leiserson, Mark D. M.; Tatar, Diana; Cowen, Lenore J.; Hescott, Benjamin J.

    A new method based on a mathematically natural local search framework for max cut is developed to uncover functionally coherent module and BPM motifs in high-throughput genetic interaction data. Unlike previous methods, which also consider physical protein-protein interaction data, our method utilizes genetic interaction data only; this becomes increasingly important as high-throughput genetic interaction data is becoming available in settings where less is known about physical interaction data. We compare modules and BPMs obtained to previous methods and across different datasets. Despite needing no physical interaction information, the BPMs produced by our method are competitive with previous methods. Biological findings include a suggested global role for the prefoldin complex and a SWR subcomplex in pathway buffering in the budding yeast interactome.

  4. Inferring mechanisms of compensation from E-MAP and SGA data using local search algorithms for max cut.

    PubMed

    Leiserson, Mark D M; Tatar, Diana; Cowen, Lenore J; Hescott, Benjamin J

    2011-11-01

    A new method based on a mathematically natural local search framework for max cut is developed to uncover functionally coherent module and BPM motifs in high-throughput genetic interaction data. Unlike previous methods, which also consider physical protein-protein interaction data, our method utilizes genetic interaction data only; this becomes increasingly important as high-throughput genetic interaction data is becoming available in settings where less is known about physical interaction data. We compare modules and BPMs obtained to previous methods and across different datasets. Despite needing no physical interaction information, the BPMs produced by our method are competitive with previous methods. Biological findings include a suggested global role for the prefoldin complex and a SWR subcomplex in pathway buffering in the budding yeast interactome.
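
    The local-search framework for max cut referenced in both records can be illustrated in its simplest unweighted form. This is the textbook heuristic only, a generic stand-in for the authors' BPM-specific variant.

```python
def local_search_max_cut(n, edges):
    """Local search for max cut on vertices 0..n-1: start from an
    arbitrary bipartition and flip any vertex with more uncut than cut
    incident edges. Each flip strictly enlarges the cut, so the loop
    terminates at a local optimum cutting at least half of all edges."""
    side = [0] * n
    improved = True
    while improved:
        improved = False
        for v in range(n):
            incident = [(a, b) for a, b in edges if v in (a, b)]
            uncut = sum(1 for a, b in incident if side[a] == side[b])
            if uncut > len(incident) - uncut:  # flipping v gains uncut - cut edges
                side[v] = 1 - side[v]
                improved = True
    return side, sum(1 for a, b in edges if side[a] != side[b])
```

On a triangle the local optimum cuts 2 of 3 edges, which is also the true maximum, illustrating the at-least-half guarantee.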

  5. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user-supplied or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results is slightly better than that of the standard processing, and the tool set also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Global Survey of Protein Expression during Gonadal Sex Determination in Mice*

    PubMed Central

    Ewen, Katherine; Baker, Mark; Wilhelm, Dagmar; Aitken, R. John; Koopman, Peter

    2009-01-01

    The development of an embryo as male or female depends on differentiation of the gonads as either testes or ovaries. A number of genes are known to be important for gonadal differentiation, but our understanding of the regulatory networks underpinning sex determination remains fragmentary. To advance our understanding of sexual development beyond the transcriptome level, we performed the first global survey of the mouse gonad proteome at the time of sex determination by using two-dimensional nanoflow LC-MS/MS. The resulting data set contains a total of 1037 gene products (154 non-redundant and 883 redundant proteins) identified from 620 peptides. Functional classification and biological network construction suggested that the identified proteins primarily serve in RNA post-transcriptional modification and trafficking, protein synthesis and folding, and post-translational modification. The data set contains potential novel regulators of gonad development and sex determination not revealed previously by transcriptomics and proteomics studies and more than 60 proteins with potential links to human disorders of sexual development. PMID:19617587

  7. Exploring the epididymis: a personal perspective on careers in science.

    PubMed

    Turner, Terry T

    2015-01-01

    Science is a profession of inquiry. We ask ourselves what it is we see and why our observations happen the way they do. Answering those two questions puts us in the company of those early explorers, who from Europe found the New World, and from Asia reached west to encounter Europe. Vasco Núñez de Balboa of Spain was such an explorer. He was the first European to see or "discover" the Pacific Ocean. One can imagine his amazement, his excitement when he first saw from a mountain top that vast ocean previously unknown to his culture. A career in science sends each of us seeking our own "Balboa Moments," those observations or results that surprise or even amaze us, those discoveries that open our eyes to new views of nature and medicine. Scientists aim to do what those early explorers did: discover what has previously been unknown, see what has previously been unseen, and reveal what has previously been hidden. Science requires the scientist to discover the facts from among many fictions and to separate the important facts from the trivial so that knowledge can be properly developed. It is only with knowledge that old dogmas can be challenged and corrected. Careers in science produce specific sets of knowledge. When pooled with other knowledge sets, they eventually contribute to wisdom, and it is wisdom, we hope, that will improve the human condition.

  8. Overcoming residual interference in mental set switching: Neural correlates and developmental trajectory

    PubMed Central

    Witt, Suzanne T.; Stevens, Michael C.

    2012-01-01

    Mental set switching is a key facet of executive control measured behaviorally through reaction time or accuracy (i.e., ‘switch costs’) when shifting among task types. One of several experimentally-dissociable influences on switch costs is ‘task set inertia’, conceptualized as the residual interference conferred when a previous stimulus-response tendency interferes with subsequent stimulus processing on a new task. Task set inertia is thought to represent the passive decay of the previous stimulus-response set from working memory, and its effects decrease with increased interstimulus interval. Closely spaced trials confer high task set inertia, while sparsely spaced trials confer low task set inertia. This functional magnetic resonance imaging (fMRI) study characterized, for the first time, two opposing brain systems engaged to resolve task set inertia: 1) a frontoparietal ‘cortical control’ network for overcoming high task set inertia interference and 2) a subcortical-motor network more active during trials with low task set inertia. These networks were distinct from brain regions showing general switching effects (i.e., switch > non-switch) and from other previously-characterized interference effects. Moreover, there were ongoing maturational effects throughout adolescence for the brain regions engaged to overcome high task set inertia not seen for generalized switching effects. These novel findings represent a new avenue of exploration of cognitive set switching neural function. PMID:22584223

  9. Informal 'Sorter' Houses: A qualitative insight of the 'shooting gallery' phenomenon in a UK setting.

    PubMed

    Parkin, Stephen; Coomber, Ross

    2009-12-01

    This paper considers the 'shooting gallery' phenomenon and presents findings from a sample of injecting drug users with experience of attending such premises in the South-West of England (UK). Due to the reciprocal relationship within these settings, involving the provision of drugs for place, the term Informal Sorter House has been coined by the authors. The social organisation and associated health risks within Informal Sorter Houses were found to have striking similarities with those previously identified within American settings. However, several differences were also noted. Namely, Informal Sorter Houses appear to be located within a continuum of control that contains regulated, unregulated, and restored injecting environments and, accordingly, it is suggested that such environments are in constant flux. A further difference relates to drug-user activism identified within such settings. This involves the establishment of an informal, street-based harm reduction practice that provides potential for future service development.

  10. Deep epistasis in human metabolism

    NASA Astrophysics Data System (ADS)

    Imielinski, Marcin; Belta, Calin

    2010-06-01

    We extend and apply a method that we have developed for deriving high-order epistatic relationships in large biochemical networks to a published genome-scale model of human metabolism. In our analysis we compute 33 328 reaction sets whose knockout synergistically disables one or more of 43 important metabolic functions. We also design minimal knockouts that remove flux through fumarase, an enzyme that has previously been shown to play an important role in human cancer. Most of these knockout sets employ more than eight mutually buffering reactions, spanning multiple cellular compartments and metabolic subsystems. These reaction sets suggest that human metabolic pathways possess a striking degree of parallelism, inducing "deep" epistasis between diversely annotated genes. Our results prompt specific chemical and genetic perturbation follow-up experiments that could be used to query in vivo pathway redundancy. They also suggest directions for future statistical studies of epistasis in genetic variation data sets.

  11. Quantitative Computed Tomography (QCT) derived Bone Mineral Density (BMD) in finite element studies: a review of the literature.

    PubMed

    Knowles, Nikolas K; Reeves, Jacob M; Ferreira, Louis M

    2016-12-01

    Finite element modeling of human bone provides a powerful tool to evaluate a wide variety of outcomes in a highly repeatable and parametric manner. These models are most often derived from computed tomography data, with mechanical properties related to bone mineral density (BMD) from the x-ray energy attenuation provided from this data. To increase accuracy, many researchers report the use of quantitative computed tomography (QCT), in which a calibration phantom is used during image acquisition to improve the estimation of BMD. Since model accuracy is dependent on the methods used in the calculation of BMD and density-mechanical property relationships, it is important to use relationships developed for the same anatomical location and using the same scanner settings, as these may impact model accuracy. The purpose of this literature review is to report the relationships used in the conversion of QCT equivalent density measures to ash, apparent, and/or tissue densities in recent finite element (FE) studies used in common density-modulus relationships. For studies reporting experimental validation, the validation metrics and results are presented. Of the studies reviewed, 29% reported the use of a dipotassium phosphate (K2HPO4) phantom, 47% a hydroxyapatite (HA) phantom, 13% did not report phantom type, 7% reported use of both K2HPO4 and HA phantoms, and 4% alternate phantom types. Scanner type and/or settings were omitted or partially reported in 31% of studies. The majority of studies used densitometric and/or density-modulus relationships derived from different anatomical locations scanned in different scanners with different scanner settings. The methods used to derive various densitometric relationships are reported and recommendations are provided toward the standardization of reporting metrics. This review assessed the current state of QCT-based FE modeling with use of clinical scanners. It was found that previously developed densitometric relationships vary by anatomical location, scanner type and settings. Reporting of all parameters used when referring to previously developed relationships, or in the development of new relationships, may increase the accuracy and repeatability of future FE models.
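    The conversion chain surveyed by this review can be sketched in a few lines. The sketch below is illustrative only: the phantom calibration slope and intercept, the equivalent-to-apparent density mapping, and the density-modulus coefficients are placeholder values, since, as the review stresses, these relationships are scanner-, phantom-, and site-specific.

```python
# Illustrative only: all coefficients below are placeholders, not values
# endorsed by the review.

def equivalent_density(hu, slope=0.0008, intercept=0.0):
    """Map a CT attenuation value (HU) to phantom-equivalent density
    (g/cm^3) via a linear QCT phantom calibration (placeholder slope)."""
    return slope * hu + intercept

def apparent_density(rho_eq, a=0.0, b=1.2):
    """Convert equivalent density to apparent density (placeholder
    linear relationship)."""
    return a + b * rho_eq

def elastic_modulus(rho_app, c=6850.0, d=1.49):
    """Density-modulus power law E = c * rho**d (MPa); the coefficient
    and exponent vary by anatomical site and study."""
    return c * rho_app ** d

# Chain the three relationships for a single voxel attenuation value.
E = elastic_modulus(apparent_density(equivalent_density(400)))
```

    Anyone implementing such a pipeline should substitute coefficients reported for the matching anatomical site, phantom type, and scanner settings, which is precisely the reporting gap this review identifies.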

  12. Transferring genomics to the clinic: distinguishing Burkitt and diffuse large B cell lymphomas.

    PubMed

    Sha, Chulin; Barrans, Sharon; Care, Matthew A; Cunningham, David; Tooze, Reuben M; Jack, Andrew; Westhead, David R

    2015-01-01

    Classifiers based on molecular criteria such as gene expression signatures have been developed to distinguish Burkitt lymphoma and diffuse large B cell lymphoma, which help to explore the intermediate cases where traditional diagnosis is difficult. Transfer of these research classifiers into a clinical setting is challenging because there are competing classifiers in the literature based on different methodology and gene sets with no clear best choice; classifiers based on one expression measurement platform may not transfer effectively to another; and classifiers developed using fresh-frozen samples may not work effectively with the commonly used and more convenient formalin-fixed paraffin-embedded samples used in routine diagnosis. Here we thoroughly compared two published high-profile classifiers developed on data from different Affymetrix array platforms and fresh-frozen tissue, examining their transferability and concordance. Based on this analysis, a new Burkitt and diffuse large B cell lymphoma classifier (BDC) was developed and employed on Illumina DASL data from our own paraffin-embedded samples, allowing comparison with the diagnosis made in a central haematopathology laboratory and evaluation of clinical relevance. We show that both previous classifiers can be recapitulated using very much smaller gene sets than originally employed, and that the classification result is closely dependent on the Burkitt lymphoma criteria applied in the training set. The BDC classification on our data exhibits high agreement (~95%) with the original diagnosis. A simple outcome comparison in the patients presenting intermediate features on conventional criteria suggests that the cases classified as Burkitt lymphoma by BDC have worse response to standard diffuse large B cell lymphoma treatment than those classified as diffuse large B cell lymphoma.
In this study, we comprehensively investigate two previous Burkitt lymphoma molecular classifiers, and implement a new gene expression classifier, BDC, that works effectively on paraffin-embedded samples and provides useful information for treatment decisions. The classifier is available as a free software package under the GNU public licence within the R statistical software environment through the link http://www.bioinformatics.leeds.ac.uk/labpages/softwares/ or on github https://github.com/Sharlene/BDC.

  13. Validating the Malheur model for predicting ponderosa pine post-fire mortality using 24 fires in the Pacific Northwest, USA

    Treesearch

    Walter G. Thies; Douglas J. Westlind

    2012-01-01

    Fires, whether intentionally or accidentally set, commonly occur in western interior forests of the US. Following fire, managers need the ability to predict mortality of individual trees based on easily observed characteristics. Previously, a two-factor model using crown scorch and bole scorch proportions was developed with data from 3415 trees for predicting the...

  14. Organized Out-of-School Activities and Peer Relationships: Theoretical Perspectives and Previous Research

    ERIC Educational Resources Information Center

    Fredricks, Jennifer A.; Simpkins, Sandra D.

    2013-01-01

    The goal of this volume is to show how organized activities provide an ideal setting for developing a deeper understanding of peer relations, as well as offering a context for a more positive study of peers. The chapters in this volume focus on youth 10 to 18 years of age. In this introductory chapter we first describe the reasons why organized…

  15. High frequency data acquisition system for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Lewallen, Pat

    1987-01-01

    The high frequency data acquisition system developed for the Space Shuttle Main Engine (SSME) single engine test facility at the National Space Technology Laboratories is discussed. The real time system will provide engineering data for a complete set of SSME instrumentation (approx. 100 measurements) within 4 hours following engine cutoff, a decrease of over 48 hours from the previous analog tape based system.

  16. Supply Chain Management: How the Curricula of the Top Ten Undergraduate Universities Meet the Practitioners' Knowledge Set

    ERIC Educational Resources Information Center

    Bahouth, Saba; Hartmann, David; Willis, Geoff

    2014-01-01

    The disciplines of logistics and supply chain management have the potential of having many areas of emphasis. Universities that have some kind of emphasis in this field have developed programs that depend on the need of potential employers and their own faculty mix. Several studies have previously looked at how universities deal with this field at…

  17. The Effect of Dynamic Assessment on Adult Learners of Arabic: A Mixed-Method Study at the Defense Language Institute Foreign Language Center

    ERIC Educational Resources Information Center

    Fahmy, Mohsen M.

    2013-01-01

    Dynamic assessment (DA) is based on Vygotsky's (1978) sociocultural theory and his Zone of Proximal Development (ZPD). ZPD is the range of abilities bordered by the learner's assisted and independent performances. Previous studies showed promising results for DA in tutoring settings. However, they did not use proficiency-based rubrics to measure…

  18. Developing a Staff Physical Activity Program at Your School: Implementing the Lesser-Used Component of the CSPAP Model

    ERIC Educational Resources Information Center

    Langley, Katherine; Kulinna, Pamela Hodges

    2018-01-01

    The purpose of this article is to explore staff physical activity programs in the school setting, describe a viable option for a staff walking program in an elementary school, and determine elementary school staff members' participation and perceptions in one such program. Previous research has shown that placing a focus on staff involvement and…

  19. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed Central

    Dugan, J. M.; Berrios, D. C.; Liu, X.; Kim, D. K.; Kaizer, H.; Fagan, L. M.

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models. PMID:10566457

  20. 40 CFR 141.707 - Grandfathering previously collected data.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... determines that a previously collected data set submitted for grandfathering was generated during source... additional source water monitoring data, as determined by the State, to ensure that the data set used under... data. 141.707 Section 141.707 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED...

  1. 40 CFR 141.707 - Grandfathering previously collected data.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... determines that a previously collected data set submitted for grandfathering was generated during source... additional source water monitoring data, as determined by the State, to ensure that the data set used under... data. 141.707 Section 141.707 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED...

  2. 40 CFR 141.707 - Grandfathering previously collected data.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... determines that a previously collected data set submitted for grandfathering was generated during source... additional source water monitoring data, as determined by the State, to ensure that the data set used under... data. 141.707 Section 141.707 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED...

  3. Formal methods for test case generation

    NASA Technical Reports Server (NTRS)

    Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)

    2011-01-01

    The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.

  4. Novel ITS1 Fungal Primers for Characterization of the Mycobiome

    PubMed Central

    Usyk, Mykhaylo; Zolnik, Christine P.; Patel, Hitesh; Levi, Michael H.

    2017-01-01

    ABSTRACT Studies of the human microbiome frequently omit characterization of fungal communities (the mycobiome), which limits our ability to investigate how fungal communities influence human health. The internal transcribed spacer 1 (ITS1) region of the eukaryotic ribosomal cluster has features allowing for wide taxonomic coverage and has been recognized as a suitable barcode region for species-level identification of fungal organisms. We developed custom ITS1 primer sets using iterative alignment refinement. Primer performance was evaluated using in silico testing and experimental testing of fungal cultures and human samples. Using an expanded novel reference database, SIS (18S-ITS1-5.8S), the newly designed primers showed an average in silico taxonomic coverage of 79.9% ± 7.1% compared to a coverage of 44.6% ± 13.2% using previously published primers (P = 0.05). The newly described primer sets recovered an average of 21,830 ± 225 fungal reads from fungal isolate culture samples, whereas the previously published primers had an average of 3,305 ± 1,621 reads (P = 0.03). Of note was an increase in the taxonomic coverage of the Candida genus, which went from a mean coverage of 59.5% ± 13% to 100.0% ± 0.0% (P = 0.0015) when comparing the previously described primers to the new primers. The newly developed ITS1 primer sets significantly improve general taxonomic coverage of fungal communities infecting humans and increase read depth by an order of magnitude over the best-performing published primer set tested. The overall best-performing primer pair in terms of taxonomic coverage and read recovery, ITS1-30F/ITS1-217R, will aid in advancing research in the area of the human mycobiome. IMPORTANCE The mycobiome constitutes all the fungal organisms within an environment or biological niche. The fungi are eukaryotes, are extremely heterogeneous, and include yeasts and molds that colonize humans as part of the microbiome.
In addition, fungi can also infect humans and cause disease. Characterization of the bacterial component of the microbiome was revolutionized by 16S rRNA gene fragment amplification, next-generation sequencing technologies, and bioinformatics pipelines. Characterization of the mycobiome has often not been included in microbiome studies because of limitations in amplification systems. This report revisited the selection of PCR primers that amplify the fungal ITS1 region. We have identified primers with superior identification of fungi present in the database. We have compared the new primer sets against those previously used in the literature and show a significant improvement in read count and taxon identification. These primers should facilitate the study of fungi in human physiology and disease states. PMID:29242834

  5. Atomic and vibrational origins of mechanical toughness in bioactive cement during setting

    PubMed Central

    Tian, Kun V.; Yang, Bin; Yue, Yuanzheng; Bowron, Daniel T.; Mayers, Jerry; Donnan, Robert S.; Dobó-Nagy, Csaba; Nicholson, John W.; Fang, De-Cai; Greer, A. Lindsay; Chass, Gregory A.; Greaves, G. Neville

    2015-01-01

    Bioactive glass ionomer cements (GICs) have been in widespread use for ∼40 years in dentistry and medicine. However, these composites fall short of the toughness needed for permanent implants. A significant impediment to improvement has been the requisite use of conventional destructive mechanical testing, which is necessarily retrospective. Here we show quantitatively, through the novel use of calorimetry, terahertz (THz) spectroscopy and neutron scattering, how the developing fracture toughness of GICs during setting is related to interfacial THz dynamics, changing atomic cohesion and fluctuating interfacial configurations. Contrary to convention, we find setting is non-monotonic, characterized by abrupt features not previously detected, including a glass–polymer coupling point, an early setting point, where decreasing toughness unexpectedly recovers, followed by stress-induced weakening of interfaces. Subsequently, toughness declines asymptotically to long-term fracture test values. We expect the insight afforded by these in situ non-destructive techniques will assist in raising understanding of the setting mechanisms and associated dynamics of cementitious materials. PMID:26548704

  6. The GENCODE exome: sequencing the complete human exome

    PubMed Central

    Coffey, Alison J; Kokocinski, Felix; Calafato, Maria S; Scott, Carol E; Palta, Priit; Drury, Eleanor; Joyce, Christopher J; LeProust, Emily M; Harrow, Jen; Hunt, Sarah; Lehesjoki, Anna-Elina; Turner, Daniel J; Hubbard, Tim J; Palotie, Aarno

    2011-01-01

    Sequencing the coding regions, the exome, of the human genome is one of the major current strategies to identify low frequency and rare variants associated with human disease traits. So far, the most widely used commercial exome capture reagents have mainly targeted the consensus coding sequence (CCDS) database. We report the design of an extended set of targets for capturing the complete human exome, based on annotation from the GENCODE consortium. The extended set covers an additional 5594 genes and 10.3 Mb compared with the current CCDS-based sets. The additional regions include potential disease genes previously inaccessible to exome resequencing studies, such as 43 genes linked to ion channel activity and 70 genes linked to protein kinase activity. In total, the new GENCODE exome set developed here covers 47.9 Mb and performed well in sequence capture experiments. In the sample set used in this study, we identified over 5000 additional SNP variants (24% more) in the GENCODE exome target than in CCDS-based exome sequencing. PMID:21364695

  7. Measuring surface topography with scanning electron microscopy. I. EZEImage: a program to obtain 3D surface data.

    PubMed

    Ponz, Ezequiel; Ladaga, Juan Luis; Bonetto, Rita Dominga

    2006-04-01

    Scanning electron microscopy (SEM) is widely used in materials science, and different parameters have been developed to characterize surface roughness. In a previous work, we studied surface topography with the fractal dimension at low scale and two parameters at high scale by using the variogram, that is, the variance vs. step log-log graph, of an SEM image. Those studies were carried out with the FERImage program, previously developed by us. To verify the hypotheses previously accepted when working with only one image, it is indispensable to have reliable three-dimensional (3D) surface data. In this work, a new program (EZEImage) to characterize 3D surface topography in SEM has been developed. It uses fast cross correlation and dynamic programming to obtain reliable dense height maps in a few seconds, which can be displayed as an image where each gray level represents a height value. This image can be used by the FERImage program or any other software to obtain surface topography characteristics. EZEImage also generates anaglyph images and characterizes 3D surface topography by means of a parameter set describing amplitude properties and three functional indices for characterizing bearing and fluid properties.
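    The variogram referred to above is simply the variance of height differences as a function of step size, typically inspected on a log-log plot. A minimal sketch on a synthetic 1-D height profile (invented data, not from the paper):

```python
# Minimal variogram sketch for surface-roughness analysis: for each step
# size s, compute the variance of height differences h[i+s] - h[i].
def variogram(heights, max_step):
    """Return [(step, variance of (h[i+step] - h[i]))] for step = 1..max_step."""
    out = []
    for s in range(1, max_step + 1):
        diffs = [heights[i + s] - heights[i] for i in range(len(heights) - s)]
        mean = sum(diffs) / len(diffs)
        var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
        out.append((s, var))
    return out

# Synthetic height profile standing in for one row of a dense height map.
profile = [((i * 37) % 11) / 10.0 for i in range(200)]
vg = variogram(profile, 10)
```

    On real data, the low-scale slope of log(variance) vs. log(step) relates to the fractal dimension the abstract mentions, while the high-scale plateau yields the amplitude parameters.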

  8. Self-Reported Physical Activity in Medically Underserved Adults With Type 2 Diabetes in Clinical and Community Settings.

    PubMed

    Cooper, John; Stetson, Barbara; Bonner, Jason; Spille, Sean; Krishnasamy, Sathya; Mokshagundam, Sri Prakash

    2015-07-01

    This study assessed physical activity (PA) in community-dwelling adults with Type 2 diabetes, using multiple instruments reflecting internationally normed PA and diabetes-specific self-care behaviors. Two hundred and fifty-three Black (44.8%) and White (55.2%) Americans [mean age = 57.93; 39.5% male] were recruited at low-income clinic and community health settings. Participants completed validated PA self-report measures developed for international comparisons (International Physical Activity Questionnaire Short Form), characterization of diabetes self-care (Summary of Diabetes Self-Care Activities Measure; SDSCA) and exercise-related domains including provider recommendations and PA behaviors and barriers (Personal Diabetes Questionnaire; PDQ). Self-reported PA and PA correlates differed by instrument. BMI was negatively correlated with PA level assessed by the PDQ in both genders, and assessed with SDSCA activity items in females. PA levels were low, comparable to previous research with community and diabetes samples. Pain was the most frequently reported barrier; females reported more frequent PA barriers overall. When using self-report PA measures for PA evaluation of adults with diabetes in clinical settings, it is critical to consider population and setting in selecting appropriate tools. PA barriers may be an important consideration when interpreting PA levels and developing interventions. Recommendations for incorporating these measures in clinical and research settings are discussed.

  9. Functional cohesion of gene sets determined by latent semantic indexing of PubMed abstracts.

    PubMed

    Xu, Lijing; Furlotte, Nicholas; Lin, Yunyue; Heinrich, Kevin; Berry, Michael W; George, Ebenezer O; Homayouni, Ramin

    2011-04-14

    High-throughput genomic technologies enable researchers to identify genes that are co-regulated with respect to specific experimental conditions. Numerous statistical approaches have been developed to identify differentially expressed genes. Because each approach can produce distinct gene sets, it is difficult for biologists to determine which statistical approach yields biologically relevant gene sets and is appropriate for their study. To address this issue, we implemented Latent Semantic Indexing (LSI) to determine the functional coherence of gene sets. An LSI model was built using over 1 million Medline abstracts for over 20,000 mouse and human genes annotated in Entrez Gene. The gene-to-gene LSI-derived similarities were used to calculate a literature cohesion p-value (LPv) for a given gene set using a Fisher's exact test. We tested this method against genes in more than 6,000 functional pathways annotated in Gene Ontology (GO) and found that approximately 75% of gene sets in GO biological process category and 90% of the gene sets in GO molecular function and cellular component categories were functionally cohesive (LPv<0.05). These results indicate that the LPv methodology is both robust and accurate. Application of this method to previously published microarray datasets demonstrated that LPv can be helpful in selecting the appropriate feature extraction methods. To enable real-time calculation of LPv for mouse or human gene sets, we developed a web tool called Gene-set Cohesion Analysis Tool (GCAT). GCAT can complement other gene set enrichment approaches by determining the overall functional cohesion of data sets, taking into account both explicit and implicit gene interactions reported in the biomedical literature. GCAT is freely available at http://binf1.memphis.edu/gcat.
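    The cohesion p-value described above can be illustrated with a minimal one-sided Fisher's exact (hypergeometric) calculation over counts of literature-linked gene pairs. The counts below, and the idea of a fixed similarity threshold defining a "linked" pair, are hypothetical simplifications, not details taken from the paper:

```python
# Hedged sketch in the spirit of a literature-cohesion p-value: given that
# K of N possible gene pairs are "linked" (LSI similarity above some
# threshold) overall, and k of the n pairs inside a candidate gene set are
# linked, a one-sided Fisher's exact test asks whether the set is enriched
# for linked pairs. Implemented directly from the hypergeometric pmf.
from math import comb

def fisher_one_sided(k, K, n, N):
    """P(X >= k) for X ~ Hypergeometric(N, K, n):
    N total pairs, K linked pairs overall, n pairs drawn (in the set),
    k linked pairs observed in the set."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical counts: 10 of 15 pairs in the set are literature-linked,
# against 100 linked pairs among 1000 pairs overall.
p = fisher_one_sided(10, 100, 15, 1000)
```

    A small p here would indicate that far more of the set's gene pairs co-occur in the literature than chance predicts, i.e. the set is functionally cohesive in the sense the abstract describes.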

  10. Rehearsal development as development of iterative recall processes.

    PubMed

    Lehmann, Martin

    2015-01-01

    Although much is known about the critical importance of active verbal rehearsal for successful recall, knowledge about the mechanisms of rehearsal and their respective development in children is very limited. To be able to rehearse several items together, these items have to be available, or, if presented and rehearsed previously, retrieved from memory. Therefore, joint rehearsal of several items may itself be considered recall. Accordingly, by analyzing free recall, one cannot only gain insight into how recall and rehearsal unfold, but also into how principles that govern children's recall govern children's rehearsal. Over a period of three and a half years (beginning at grade 3) 54 children were longitudinally assessed seven times on several overt rehearsal free recall trials. A first set of analyses on recall revealed significant age-related increases in the primacy effect and an age-invariant recency effect. In the middle portion of the list, wave-shaped recall characteristics emerged and increased with age, indicating grouping of the list into subsequences. In a second set of analyses, overt rehearsal behavior was decomposed into distinct rehearsal sets. Analyses of these sets revealed that the distribution of rehearsals within each set resembled the serial position curves with one- or two-item primacy and recency effects and wave-shaped rehearsal patterns in between. In addition, rehearsal behavior throughout the list was characterized by a decreasing tendency to begin rehearsal sets with the first list item. This result parallels the phenomenon of beginning recall with the first item on short lists and with the last item on longer lists.

  11. Rehearsal development as development of iterative recall processes

    PubMed Central

    Lehmann, Martin

    2015-01-01

    Although much is known about the critical importance of active verbal rehearsal for successful recall, knowledge about the mechanisms of rehearsal and their respective development in children is very limited. To be able to rehearse several items together, these items have to be available, or, if presented and rehearsed previously, retrieved from memory. Therefore, joint rehearsal of several items may itself be considered recall. Accordingly, by analyzing free recall, one cannot only gain insight into how recall and rehearsal unfold, but also into how principles that govern children’s recall govern children’s rehearsal. Over a period of three and a half years (beginning at grade 3) 54 children were longitudinally assessed seven times on several overt rehearsal free recall trials. A first set of analyses on recall revealed significant age-related increases in the primacy effect and an age-invariant recency effect. In the middle portion of the list, wave-shaped recall characteristics emerged and increased with age, indicating grouping of the list into subsequences. In a second set of analyses, overt rehearsal behavior was decomposed into distinct rehearsal sets. Analyses of these sets revealed that the distribution of rehearsals within each set resembled the serial position curves with one- or two-item primacy and recency effects and wave-shaped rehearsal patterns in between. In addition, rehearsal behavior throughout the list was characterized by a decreasing tendency to begin rehearsal sets with the first list item. This result parallels the phenomenon of beginning recall with the first item on short lists and with the last item on longer lists. PMID:25870569

  12. GLEAM v3: satellite-based land evaporation and root-zone soil moisture

    NASA Astrophysics Data System (ADS)

    Martens, Brecht; Miralles, Diego G.; Lievens, Hans; van der Schalie, Robin; de Jeu, Richard A. M.; Fernández-Prieto, Diego; Beck, Hylke E.; Dorigo, Wouter A.; Verhoest, Niko E. C.

    2017-05-01

    The Global Land Evaporation Amsterdam Model (GLEAM) is a set of algorithms dedicated to the estimation of terrestrial evaporation and root-zone soil moisture from satellite data. Ever since its development in 2011, the model has been regularly revised, aiming at the optimal incorporation of new satellite-observed geophysical variables, and improving the representation of physical processes. In this study, the next version of this model (v3) is presented. Key changes relative to the previous version include (1) a revised formulation of the evaporative stress, (2) an optimized drainage algorithm, and (3) a new soil moisture data assimilation system. GLEAM v3 is used to produce three new data sets of terrestrial evaporation and root-zone soil moisture, including a 36-year data set spanning 1980-2015, referred to as v3a (based on satellite-observed soil moisture, vegetation optical depth and snow-water equivalent, reanalysis air temperature and radiation, and a multi-source precipitation product), and two satellite-based data sets. The latter share most of their forcing, except for the vegetation optical depth and soil moisture, which are based on observations from different passive and active C- and L-band microwave sensors (European Space Agency Climate Change Initiative, ESA CCI) for the v3b data set (spanning 2003-2015) and observations from the Soil Moisture and Ocean Salinity (SMOS) satellite in the v3c data set (spanning 2011-2015). Here, these three data sets are described in detail, compared against analogous data sets generated using the previous version of GLEAM (v2), and validated against measurements from 91 eddy-covariance towers and 2325 soil moisture sensors across a broad range of ecosystems. 
Results indicate that the quality of the v3 soil moisture is consistently better than that of v2: average correlations against in situ surface soil moisture measurements increase from 0.61 to 0.64 for the v3a data set, and the representation of soil moisture in the second layer improves as well, with correlations increasing from 0.47 to 0.53. Similar improvements are observed for the v3b and v3c data sets. Despite regional differences, the quality of the evaporation fluxes remains overall similar to that obtained with the previous version of GLEAM, with average correlations against eddy-covariance measurements ranging between 0.78 and 0.81 for the different data sets. These global data sets of terrestrial evaporation and root-zone soil moisture are now openly available at www.GLEAM.eu and may be used for large-scale hydrological applications, climate studies, or research on land-atmosphere feedbacks.

  13. Lexical Entrainment and Lexical Differentiation in Reference Phrase Choice

    ERIC Educational Resources Information Center

    Van Der Wege, Mija M.

    2009-01-01

    Speakers reuse prior references to objects when choosing reference phrases, a phenomenon known as lexical entrainment. One explanation is that speakers want to maintain a set of previously established referential precedents. Speakers may also contrast any new referents against this previously established set, thereby avoiding applying the same…

  14. [siRNAs with high specificity to the target: a systematic design by CRM algorithm].

    PubMed

    Alsheddi, T; Vasin, L; Meduri, R; Randhawa, M; Glazko, G; Baranova, A

    2008-01-01

    The 'off-target' silencing effect hinders the development of siRNA-based therapeutic and research applications. A common solution to this problem is the use of BLAST, which may miss significant alignments, or of the exhaustive Smith-Waterman algorithm, which is very time-consuming. We have developed a Comprehensive Redundancy Minimizer (CRM) approach for mapping all unique sequences ("targets") 9-to-15 nt in size within large sets of sequences (e.g. transcriptomes). CRM outputs a list of potential siRNA candidates for every transcript of the particular species. These candidates can be further analyzed by traditional "set-of-rules" siRNA design tools. For human, 91% of transcripts are covered by candidate siRNAs with kernel targets of N = 15. We tested our approach on a collection of previously described, experimentally assessed siRNAs and found a significant correlation between efficacy and presence in the CRM-approved set (r = 0.215, p-value = 0.0001). An interactive database that contains a precompiled set of all human siRNA candidates with minimized redundancy is available at http://129.174.194.243. Application of CRM-based filtering minimizes potential "off-target" silencing effects and could improve routine siRNA applications.
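    The core redundancy-minimisation idea, retaining only k-mer targets that occur in a single transcript, can be sketched as follows (toy made-up sequences and a simplified uniqueness criterion; not the published CRM implementation):

```python
from collections import defaultdict

def unique_targets(transcripts, k=15):
    """Return, per transcript, the k-mers that occur in no other transcript.

    A toy sketch of the CRM idea: a k-mer unique to one transcript is a
    candidate siRNA target with minimal off-target redundancy.
    """
    owners = defaultdict(set)  # k-mer -> ids of transcripts containing it
    for tid, seq in transcripts.items():
        for i in range(len(seq) - k + 1):
            owners[seq[i:i + k]].add(tid)
    result = {tid: [] for tid in transcripts}
    for kmer, tids in owners.items():
        if len(tids) == 1:
            (tid,) = tids
            result[tid].append(kmer)
    return result

# Two hypothetical transcripts sharing a repeated motif:
demo = {"t1": "ACGTACGTACGTACGTAC", "t2": "TTTTACGTACGTACGTAA"}
targets = unique_targets(demo, k=9)
```

    Every 9-mer of "t1" here also occurs in "t2", so "t1" yields no unique targets, while "t2" keeps only the k-mers touching its distinctive ends.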

  15. An extended harmonic balance method based on incremental nonlinear control parameters

    NASA Astrophysics Data System (ADS)

    Khodaparast, Hamed Haddad; Madinei, Hadi; Friswell, Michael I.; Adhikari, Sondipon; Coggon, Simon; Cooper, Jonathan E.

    2017-02-01

    A new formulation for calculating the steady-state responses of multiple-degree-of-freedom (MDOF) nonlinear dynamic systems under harmonic excitation is developed. It is aimed at solving multi-dimensional nonlinear systems using linear equations. Nonlinearity is parameterised by a set of 'nonlinear control parameters' such that the dynamic system is effectively linear for zero values of these parameters and nonlinearity increases with increasing values of these parameters. Two sets of linear equations, formed from a first-order truncated Taylor series expansion, are developed. The first set provides the summation of sensitivities of the linear system responses with respect to the nonlinear control parameters, and the second set consists of recursive equations that use the previous responses to update the sensitivities. The obtained sensitivities of the steady-state responses are then used to calculate the steady-state responses of the nonlinear dynamic system in an iterative process. The application and verification of the method are illustrated using a nonlinear Micro-Electro-Mechanical System (MEMS) subject to a base harmonic excitation. The nonlinear control parameters in these examples are the DC voltages applied to the electrodes of the MEMS devices.
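    The incremental-parameter strategy can be illustrated on a single-DOF Duffing oscillator under one-term harmonic balance (a minimal sketch with assumed toy parameters, not the paper's MDOF formulation): the cubic coefficient eps plays the role of the nonlinear control parameter, the system is exactly linear at eps = 0, and the response amplitude is marched to the target nonlinearity using the first-order sensitivity at each increment.

```python
import numpy as np

# Single-harmonic balance for x'' + w0^2 x + eps x^3 = F cos(w t),
# with x = A cos(w t): residual (w0^2 - w^2) A + 0.75 eps A^3 - F = 0.
w0, w, F = 1.0, 0.5, 0.3    # assumed toy parameters
lin = w0**2 - w**2          # linear dynamic stiffness at frequency w

def residual(A, eps):
    return lin * A + 0.75 * eps * A**3 - F

A = F / lin                 # exact linear response at eps = 0
eps_steps = np.linspace(0.0, 2.0, 41)
for e0, e1 in zip(eps_steps[:-1], eps_steps[1:]):
    # First-order sensitivity dA/deps from implicit differentiation:
    dA_deps = -0.75 * A**3 / (lin + 2.25 * e0 * A**2)
    A += dA_deps * (e1 - e0)                            # predictor step
    A -= residual(A, e1) / (lin + 2.25 * e1 * A**2)     # Newton corrector
```

    After the march, A satisfies the nonlinear balance equation at the target eps even though each step only solved linear(ised) equations.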

  16. Goal Development Practices of Physical Therapists Working in Educational Environments.

    PubMed

    Wynarczuk, Kimberly D; Chiarello, Lisa A; Gohrband, Catherine L

    2017-11-01

    The aims of this study were to (1) describe the practices that school-based physical therapists use in developing student goals, and (2) identify facilitators and barriers to development of goals that are specific to participation in the context of the school setting. 46 school-based physical therapists who participated in a previous study on school-based physical therapy practice (PT COUNTS) completed a questionnaire on goal development. Frequencies and cross tabulations were generated for quantitative data. Open-ended questions were analyzed using an iterative qualitative analysis process. A majority of therapists reported that they frequently develop goals collaboratively with other educational team members. Input from teachers, related services personnel, and parents has the most influence on goal development. Qualitative analysis identified five themes that influence development of participation-based goals: (1) school-based philosophy and practice; (2) the educational environment, settings, and routines; (3) student strengths, needs, and personal characteristics; (4) support from and collaboration with members of the educational team; and (5) therapist practice and motivation. Goal development is a complex process that involves multiple members of the educational team and is influenced by many different aspects of practice, the school environment, and student characteristics.

  17. Are developments in mental scanning and mental rotation related?

    PubMed Central

    Wimmer, Marina C.; Robinson, Elizabeth J.; Doherty, Martin J.

    2017-01-01

    The development and relation of mental scanning and mental rotation were examined in 4-, 6-, 8-, and 10-year-old children and adults (N = 102). Based on previous findings from adult and ageing populations, the key question was whether these abilities develop as a related set that becomes increasingly differentiated or whether they are unrelated abilities per se. Findings revealed that both mental scanning and rotation abilities develop between 4 and 6 years of age. Specifically, 4-year-olds showed no difference in accuracy between scanning and no-scanning trials, whereas all older children and adults made more errors on scanning trials. Additionally, only a minority of 4-year-olds showed a linear increase in response time with increasing angular difference between two stimuli, in contrast to all older participants. Despite similar developmental trajectories, mental scanning and rotation performances were unrelated. Thus, adding to research findings from adults, mental scanning and rotation appear to develop as a set of unrelated abilities from the outset. Different underlying abilities, such as visual working memory and spatial coding versus representing past and future events, are discussed. PMID:28207810

  18. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. To provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.

  19. Development of SNP Genotyping Assays for Seed Composition Traits in Soybean

    PubMed Central

    Patil, Gunvant; Chaudhary, Juhi; Vuong, Tri D.; Jenkins, Brian; Qiu, Dan; Kadam, Suhas; Shannon, Grover J.

    2017-01-01

    Seed composition is one of the most important determinants of economic value in soybean. The quality and quantity of different seed components, such as oil, protein, and carbohydrates, are crucial ingredients in food, feed, and numerous industrial products. Soybean researchers have successfully developed and utilized a diverse set of molecular markers for seed trait improvement in soybean breeding programs. It is imperative to design and develop molecular assays that are accurate, robust, high-throughput, cost-effective, and available on a common genotyping platform. In the present study, we developed and validated KASP (Kompetitive allele-specific polymerase chain reaction) genotyping assays based on previously known functional mutant alleles for seed composition traits, including fatty acids, oligosaccharides, trypsin inhibitor, and lipoxygenase. These assays were validated on mutant sources as well as mapping populations and precisely distinguish the homozygotes and heterozygotes of the mutant genes. Given these advantages, the KASP assays developed in this study can replace genotyping assays previously developed for marker-assisted selection (MAS). This functional gene-based assay resource, built on a common genotyping platform, will help accelerate efforts to improve soybean seed composition traits. PMID:28630621

  20. A high-performance seizure detection algorithm based on Discrete Wavelet Transform (DWT) and EEG

    PubMed Central

    Chen, Duo; Wan, Suiren; Xiang, Jing; Bao, Forrest Sheng

    2017-01-01

    In the past decade, the Discrete Wavelet Transform (DWT), a powerful time-frequency tool, has been widely used in computer-aided signal analysis of epileptic electroencephalography (EEG), such as the detection of seizures. One important hurdle in applications of the DWT is its settings, which were chosen empirically or arbitrarily in previous works. The objective of this study was to develop a framework for automatically searching the optimal DWT settings to improve accuracy and to reduce the computational cost of seizure detection. To address this, we decomposed EEG data with 7 commonly used wavelet families, down to the maximum theoretical level of each mother wavelet. Wavelets and decomposition levels providing the highest accuracy in each wavelet family were then searched in an exhaustive selection of frequency bands, which showed optimal accuracy and low computational cost. The selection of frequency bands and features removed approximately 40% of redundancies. The developed algorithm achieved promising performance on two well-tested EEG datasets (accuracy >90% for both datasets). The experimental results demonstrate that the settings of the DWT substantially affect its performance on seizure detection. Compared with existing wavelet-based seizure detection methods, the new approach is more accurate and transferable among datasets. PMID:28278203
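    The role of the decomposition level can be illustrated with a hand-rolled Haar DWT (a minimal sketch: the study searches 7 wavelet families with seizure-specific features, whereas only Haar and per-subband energies are shown here):

```python
import numpy as np

def haar_dwt(x, level):
    """Decompose x into [cA_level, cD_level, ..., cD_1] Haar subbands."""
    coeffs, approx = [], np.asarray(x, dtype=float)
    for _ in range(level):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)  # approximation
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)  # detail
        coeffs.insert(0, d)
        approx = a
    coeffs.insert(0, approx)
    return coeffs

def subband_energies(x, level):
    # One simple feature per subband; deeper levels give finer bands.
    return [float(np.sum(c**2)) for c in haar_dwt(x, level)]

rng = np.random.default_rng(0)
x = rng.standard_normal(256)                # stand-in for an EEG segment
max_level = int(np.log2(len(x)))            # deepest usable Haar level
energies = subband_energies(x, max_level)
```

    Because the Haar transform is orthonormal, the subband energies sum to the signal energy, so the level setting changes how that energy is partitioned into candidate features, not its total.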

  1. Inferring Mechanisms of Compensation from E-MAP and SGA Data Using Local Search Algorithms for Max Cut

    PubMed Central

    Leiserson, Mark D.M.; Tatar, Diana; Cowen, Lenore J.

    2011-01-01

    A new method based on a mathematically natural local search framework for max cut is developed to uncover functionally coherent modules and between-pathway module (BPM) motifs in high-throughput genetic interaction data. Unlike previous methods, which also consider physical protein-protein interaction data, our method utilizes genetic interaction data only; this becomes increasingly important as high-throughput genetic interaction data become available in settings where less is known about physical interactions. We compare the modules and BPMs obtained with those of previous methods and across different datasets. Despite needing no physical interaction information, the BPMs produced by our method are competitive with previous methods. Biological findings include a suggested global role for the prefoldin complex and a SWR subcomplex in pathway buffering in the budding yeast interactome. PMID:21882903
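    The combinatorial core named in the abstract, local search for max cut, can be sketched as a 1-swap hill climb (a toy weighted graph, not E-MAP data; the paper's framework adds further machinery on top of this):

```python
def local_search_max_cut(weights, side):
    """Move vertices across the cut until no single move improves it.

    weights: dict {(u, v): w} over undirected edges; side: {vertex: 0 or 1}.
    """
    def gain(v):
        # Flipping v gains (same-side incident weight) - (cross weight).
        g = 0.0
        for (a, b), w in weights.items():
            if v in (a, b):
                u = b if a == v else a
                g += w if side[u] == side[v] else -w
        return g

    improved = True
    while improved:
        improved = False
        for v in side:
            if gain(v) > 0:
                side[v] ^= 1          # move v to the other side
                improved = True
    return side

def cut_value(weights, side):
    return sum(w for (u, v), w in weights.items() if side[u] != side[v])

weights = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (3, 0): 1.0}  # 4-cycle
side = local_search_max_cut(weights, {v: 0 for v in range(4)})
```

    On the 4-cycle the hill climb reaches the optimal bipartition with all four edges cut; in general, 1-swap local search only guarantees a local optimum.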

  2. Using an epiphytic moss to identify previously unknown sources of atmospheric cadmium pollution

    Treesearch

    Geoffrey H. Donovan; Sarah E. Jovan; Demetrios Gatziolis; Igor Burstyn; Yvonne L. Michael; Michael C. Amacher; Vicente J. Monleon

    2016-01-01

    Urban networks of air-quality monitors are often too widely spaced to identify sources of air pollutants, especially if they do not disperse far from emission sources. The objectives of this study were to test the use of moss bio-indicators to develop a fine-scale map of atmospherically-derived cadmium and to identify the sources of cadmium in a complex urban setting....

  3. Estimating tuberculosis incidence from primary survey data: a mathematical modeling approach

    PubMed Central

    Chadha, V. K.; Laxminarayan, R.; Arinaminpathy, N.

    2017-01-01

    BACKGROUND: There is an urgent need for improved estimations of the burden of tuberculosis (TB). OBJECTIVE: To develop a new quantitative method based on mathematical modelling, and to demonstrate its application to TB in India. DESIGN: We developed a simple model of TB transmission dynamics to estimate the annual incidence of TB disease from the annual risk of tuberculous infection and the prevalence of smear-positive TB. We first compared model estimates for annual infections per smear-positive TB case using previous empirical estimates from China, Korea and the Philippines. We then applied the model to estimate TB incidence in India, stratified by urban and rural settings. RESULTS: The model estimates agree with previous empirical estimates. Applied to India, the model suggests an annual incidence of smear-positive TB of 89.8 per 100 000 population (95%CI 56.8-156.3). Results show differences in urban and rural TB: while an urban TB case infects more individuals per year, a rural TB case remains infectious for appreciably longer, suggesting the need for interventions tailored to these different settings. CONCLUSIONS: Simple models of TB transmission, in conjunction with the necessary data, can offer approaches to burden estimation that complement those currently being used. PMID:28284250

  4. Screening of missing proteins in the human liver proteome by improved MRM-approach-based targeted proteomics.

    PubMed

    Chen, Chen; Liu, Xiaohui; Zheng, Weimin; Zhang, Lei; Yao, Jun; Yang, Pengyuan

    2014-04-04

    To completely annotate the human genome, the task of identifying and characterizing proteins that currently lack mass spectrometry (MS) evidence is inevitable and urgent. In this study, as a first effort to screen missing proteins at large scale, we developed an approach based on SDS-PAGE followed by liquid chromatography-multiple reaction monitoring (LC-MRM) for screening those missing proteins with only a single peptide hit in the previous liver proteome data set. Proteins extracted from normal human liver were separated by SDS-PAGE and digested in split gel slices, and the resulting digests were then subjected to LC-scheduled MRM analysis. The MRM assays were developed using synthesized crude peptides for the target peptides. In total, the expression of 57 target proteins was confirmed from 185 MRM assays in normal human liver tissues. Among the 57 confirmed one-hit wonders, 50 proteins belong to the minimally redundant set in the PeptideAtlas database, and 7 proteins, involved in various biological processes, previously had no MS-based information at all. We conclude that our SDS-PAGE-MRM workflow can be a powerful approach to screen missing or poorly characterized proteins in different samples and to provide their quantity if detected. The MRM raw data have been uploaded to ISB/SRM Atlas/PASSEL (PXD000648).

  5. Development and confirmation of potential gene classifiers of human clear cell renal cell carcinoma using next-generation RNA sequencing.

    PubMed

    Eikrem, Oystein S; Strauss, Philipp; Beisland, Christian; Scherer, Andreas; Landolt, Lea; Flatberg, Arnar; Leh, Sabine; Beisvag, Vidar; Skogstrand, Trude; Hjelle, Karin; Shresta, Anjana; Marti, Hans-Peter

    2016-12-01

    A previous study by this group demonstrated the feasibility of RNA sequencing (RNAseq) technology for capturing the disease biology of clear cell renal cell carcinoma (ccRCC), and presented initial results for carbonic anhydrase-9 (CA9) and tumor necrosis factor-α-induced protein-6 (TNFAIP6) as possible biomarkers of ccRCC (discovery set) [Eikrem et al. PLoS One 2016;11:e0149743]. To confirm these results, the previous study is expanded, and RNAseq data from additional matched ccRCC and normal renal biopsies are analyzed (confirmation set). Two core biopsies from patients (n = 12) undergoing partial or full nephrectomy were obtained with a 16-gauge needle. RNA sequencing libraries were generated with the Illumina TruSeq® Access library preparation protocol. Comparative analysis was done using linear modeling (voom/limma; R Bioconductor). The formalin-fixed and paraffin-embedded discovery and confirmation data yielded 8957 and 11,047 detected transcripts, respectively. The two data sets shared 1193 differentially expressed genes. The average expression and the log2-fold changes of differentially expressed transcripts in the two data sets correlated, with R² = .95 and R² = .94, respectively. Among the transcripts with the highest fold changes were CA9, neuronal pentraxin-2 and uromodulin. Epithelial-mesenchymal transition was highlighted by differential expression of, for example, transforming growth factor-β1 and delta-like ligand-4. The diagnostic accuracy of CA9 was 100% and 93.9% when using the discovery set as the training set and the confirmation data as the test set, and vice versa, respectively. These data further support TNFAIP6 as a novel biomarker of ccRCC. TNFAIP6 had a combined accuracy of 98.5% in the two data sets. This study provides confirmatory data on the potential use of CA9 and TNFAIP6 as biomarkers of ccRCC. Thus, next-generation sequencing expands the clinical application of tissue analyses.

  6. Sample classification for improved performance of PLS models applied to the quality control of deep-frying oils of different botanic origins analyzed using ATR-FTIR spectroscopy.

    PubMed

    Kuligowski, Julia; Carrión, David; Quintás, Guillermo; Garrigues, Salvador; de la Guardia, Miguel

    2011-01-01

    The selection of an appropriate calibration set is a critical step in multivariate method development. In this work, the effect of using different calibration sets, based on a previous classification of unknown samples, on the performance of partial least squares (PLS) regression models is discussed. As an example, attenuated total reflection (ATR) mid-infrared spectra of deep-fried vegetable oil samples from three botanical origins (olive, sunflower, and corn oil), with increasing polymerized triacylglyceride (PTG) content induced by a deep-frying process, were employed. The use of a one-class-classifier partial least squares-discriminant analysis (PLS-DA) and a rooted binary directed acyclic graph tree provided accurate oil classification. Oil samples fried without foodstuff could be classified correctly, independent of their PTG content. However, class separation of oil samples fried with foodstuff was less evident. The combined use of double-cross model validation with permutation testing was used to validate the obtained PLS-DA classification models, confirming the results. To demonstrate the usefulness of selecting an appropriate PLS calibration set, the PTG content was determined with a PLS model based on the previously selected classes. In comparison to a PLS model calculated using a pooled calibration set containing samples from all classes, the root mean square error of prediction could be improved significantly, to between 1.06 and 2.91% (w/w), using PLS models based on the calibration sets selected by PLS-DA.
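    The benefit of class-specific calibration can be sketched with synthetic data (all numbers are made up, and ordinary least squares stands in for PLS to keep the sketch short): when two classes follow different linear relations between signal and analyte content, per-class models yield a lower prediction error than one pooled model.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit(x, y):
    """Least-squares line y = a*x + b (stand-in for a PLS calibration)."""
    a, b = np.polyfit(x, y, 1)
    return a, b

def rmsep(x, y, model):
    a, b = model
    return float(np.sqrt(np.mean((y - (a * x + b)) ** 2)))

# Two hypothetical 'classes' with different signal-to-content slopes:
x1 = rng.uniform(0, 1, 50); y1 = 2.0 * x1 + rng.normal(0, 0.05, 50)
x2 = rng.uniform(0, 1, 50); y2 = 0.5 * x2 + rng.normal(0, 0.05, 50)

pooled = fit(np.concatenate([x1, x2]), np.concatenate([y1, y2]))
per_class = [fit(x1, y1), fit(x2, y2)]

err_pooled = rmsep(x1, y1, pooled) + rmsep(x2, y2, pooled)
err_class = rmsep(x1, y1, per_class[0]) + rmsep(x2, y2, per_class[1])
```

    The pooled model is forced to compromise between the two slopes, so classifying first and calibrating per class drives the prediction error down, which is the design logic the abstract describes.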

  7. Simulating Category Learning and Set Shifting Deficits in Patients Weight-Restored from Anorexia Nervosa

    DTIC Science & Technology

    2014-01-01

    Neuropsychology, in press. Objective: To examine set shifting in a group of women previously diagnosed with anorexia nervosa (AN) who are now weight-restored (AN-WR)...participant fails to switch to the new rule but rather persists with the previously correct rule. Adult patients with anorexia nervosa (AN) are often impaired...

  8. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions

    PubMed Central

    Nico, Magalí; Mantese, Anita I.; Miralles, Daniel J.; Kantolic, Adriana G.

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod’s chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated. PMID:26512057

  9. Validation and Improvement of SRTM Performance over Rugged Terrain

    NASA Technical Reports Server (NTRS)

    Zebker, Howard A.

    2004-01-01

    We have previously reported work related to basic technique development in phase unwrapping and the generation of digital elevation models (DEMs). In the final year of this work we applied these techniques to the improvement of DEMs produced by SRTM. In particular, we developed a rigorous mathematical algorithm to fill in missing data over rough terrain from other data sets. We illustrate this method by using a higher-resolution, but globally less accurate, DEM produced by the TOPSAR airborne instrument over the Galapagos Islands to augment the SRTM data set in this area. We combine this data set with SRTM so that each set fills in holes left by the other imaging system. The infilling is done by first interpolating each data set using a prediction error filter that reproduces the same statistical characterization as exhibited by the entire data set within the interpolated region. After this procedure is implemented on each data set, the two are combined on a point-by-point basis with weights that reflect the accuracy of each data point in its original image. In areas that are better covered by SRTM, TOPSAR data are weighted down but still retain TOPSAR statistics. The reverse is true for regions better covered by TOPSAR. The resulting DEM passes statistical tests and looks plausible to the eye, but as this DEM is the best available for the region we cannot fully verify its accuracy. Spot checks with GPS points show that locally the technique results in a more comprehensive and accurate map than either data set alone.
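    The point-by-point weighted combination step can be sketched as follows (the weights here are hypothetical, and the prediction-error-filter interpolation that precedes it in the paper is omitted):

```python
import numpy as np

def merge_dems(dem_a, dem_b, w_a=0.7, w_b=0.3):
    """Combine two DEMs (NaN marks missing data) with accuracy weights.

    Where both DEMs have data, take the weighted average; where one has
    a hole, fill it from the other, as in the SRTM/TOPSAR combination.
    """
    a_ok = ~np.isnan(dem_a)
    b_ok = ~np.isnan(dem_b)
    out = np.full(dem_a.shape, np.nan)
    both = a_ok & b_ok
    out[both] = w_a * dem_a[both] + w_b * dem_b[both]
    out[a_ok & ~b_ok] = dem_a[a_ok & ~b_ok]   # B's holes filled from A
    out[b_ok & ~a_ok] = dem_b[b_ok & ~a_ok]   # A's holes filled from B
    return out

# Tiny 2x2 elevation grids (metres) with one hole each:
a = np.array([[100.0, np.nan], [120.0, 130.0]])
b = np.array([[104.0, 110.0], [np.nan, 131.0]])
merged = merge_dems(a, b)
```

    In practice the weights would vary per point with the estimated accuracy of each instrument rather than being two fixed constants.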

  10. Merge of Five Previous Catalogues Into the Ground Truth Catalogue and Registration Based on MOLA Data with THEMIS-DIR, MDIM and MOC Data-Sets

    NASA Astrophysics Data System (ADS)

    Salamuniccar, G.; Loncaric, S.

    2008-03-01

    The catalogue from our previous work was merged with the data of Barlow, Rodionova, Boyce, and Kuzmin. The resulting ground truth catalogue of 57,633 craters was registered, using MOLA data, with the THEMIS-DIR, MDIM, and MOC data-sets.

  11. Developing a workplace resilience instrument.

    PubMed

    Mallak, Larry A; Yildiz, Mustafa

    2016-05-27

    Resilience benefits from the use of protective factors, as opposed to risk factors, which are associated with vulnerability. Considerable research and instrument development has been conducted in clinical settings for patients. The need existed for an instrument developed in a workplace setting to measure the resilience of employees. This study developed and tested a resilience instrument for employees in the workplace. The research instrument was distributed to executives and nurses working in hospital settings in the United States. Five hundred forty completed and usable responses were obtained. The instrument contained an inventory of workplace resilience, a job stress questionnaire, and relevant demographics. The resilience items were written based on previous work by the lead author and inspired by Weick's [1] sense-making theory. A four-factor model yielded an instrument having psychometric properties showing good model fit. Twenty items were retained for the resulting Workplace Resilience Instrument (WRI). Parallel analysis was conducted with successive iterations of exploratory and confirmatory factor analyses. Respondents were classified based on their employment with either a rural or an urban hospital. Executives had significantly higher WRI scores than nurses, controlling for gender. WRI scores were positively and significantly correlated with years of experience and the Brief Job Stress Questionnaire. An instrument to measure individual resilience in the workplace (WRI) was developed. The WRI's four factors identify dimensions of workplace resilience for use in subsequent investigations: Active Problem-Solving, Team Efficacy, Confident Sense-Making, and Bricolage.

  12. One- and two-center ETF-integrals of first order in relativistic calculation of NMR parameters

    NASA Astrophysics Data System (ADS)

    Slevinsky, R. M.; Temga, T.; Mouattamid, M.; Safouhi, H.

    2010-06-01

    The present work focuses on the analytical and numerical development of the first-order integrals involved in the relativistic calculation of the shielding tensor using exponential-type functions as a basis set of atomic orbitals. For the analytical development, we use the Fourier integral transformation, practical properties of spherical harmonics, and the Rayleigh expansion of plane wavefunctions. The Fourier transforms of the operators, derived in previous work, are used throughout. In both the one- and two-center integrals, Cauchy's residue theorem is used in the final derivation of the analytical expressions, which are shown to be accurate to machine precision.

  13. Net present value approaches for drug discovery.

    PubMed

    Svennebring, Andreas M; Wikberg, Jarl Es

    2013-12-01

    Three dedicated approaches to the calculation of the risk-adjusted net present value (rNPV) in drug discovery projects under different assumptions are suggested. The probability of finding a candidate drug suitable for clinical development and the time to the initiation of clinical development are assumed to be flexible, in contrast to previously used models. The rNPV of the post-discovery cash flows is calculated as the probability-weighted average of the rNPV at each potential time of initiation of clinical development. Practical considerations on how to set probability rates, in particular at the initiation and termination of a project, are discussed.
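    The probability-weighted averaging over initiation times can be sketched as follows (all cash flows, probabilities, and discount rates are hypothetical toy values, not figures from the paper):

```python
def rnpv_fixed_start(cash_flows, p_success, rate, start_year):
    """Risk- and time-discounted NPV for development starting at start_year."""
    return sum(p_success * cf / (1 + rate) ** (start_year + t)
               for t, cf in enumerate(cash_flows))

def rnpv_flexible(cash_flows, p_success, rate, start_probs):
    """Probability-weighted average over possible initiation times.

    start_probs: {start_year: probability that development starts then}.
    """
    return sum(p * rnpv_fixed_start(cash_flows, p_success, rate, yr)
               for yr, p in start_probs.items())

cf = [-10.0, -20.0, 150.0]          # toy development costs, then payoff
v = rnpv_flexible(cf, p_success=0.1, rate=0.1,
                  start_probs={2: 0.5, 3: 0.3, 4: 0.2})
```

    Because later initiation discounts the (net-positive) cash flows more heavily, the flexible-timing value lies between the fixed-start values for the earliest and latest candidate years.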

  14. Model fitting for small skin permeability data sets: hyperparameter optimisation in Gaussian Process Regression.

    PubMed

    Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick

    2018-03-01

    The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size while retaining the range, or 'chemical space', of the key descriptors, to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel results in the best models for the majority of data sets, and these exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, the different optimisation methods generally retained their statistical quality, whereas benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to validation of the model, is critical in the development of improved models. The size of the data set, if carefully controlled, was not generally a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets. © 2018 Royal Pharmaceutical Society.
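    Of the optimisers compared, Grid Search is the simplest to sketch (a from-scratch toy on 1-D synthetic data, not the skin-permeability descriptors): candidate hyperparameters are scored by the Gaussian Process log marginal likelihood and the best grid point is kept.

```python
import numpy as np

def rbf(x1, x2, ell):
    """Squared-exponential kernel with lengthscale ell."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def log_marginal_likelihood(x, y, ell, noise):
    """GP log evidence for hyperparameters (ell, noise), via Cholesky."""
    K = rbf(x, x, ell) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return float(-0.5 * y @ alpha - np.sum(np.log(np.diag(L)))
                 - 0.5 * len(x) * np.log(2 * np.pi))

rng = np.random.default_rng(2)
x = np.linspace(0, 5, 30)
y = np.sin(x) + rng.normal(0, 0.1, 30)   # toy stand-in for permeability data

grid = [(ell, noise) for ell in (0.1, 0.5, 1.0, 2.0)
        for noise in (1e-3, 1e-2, 1e-1)]
best = max(grid, key=lambda h: log_marginal_likelihood(x, y, *h))
```

    The other optimisers in the study differ only in how they propose candidate (ell, noise) pairs; the marginal-likelihood scoring stays the same.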

  15. Developing an OMERACT Core Outcome Set for Assessing Safety Components in Rheumatology Trials: The OMERACT Safety Working Group.

    PubMed

    Klokker, Louise; Tugwell, Peter; Furst, Daniel E; Devoe, Dan; Williamson, Paula; Terwee, Caroline B; Suarez-Almazor, Maria E; Strand, Vibeke; Woodworth, Thasia; Leong, Amye L; Goel, Niti; Boers, Maarten; Brooks, Peter M; Simon, Lee S; Christensen, Robin

    2017-12-01

    Failure to report harmful outcomes in clinical research can introduce bias favoring a potentially harmful intervention. While core outcome sets (COS) are available for benefits in randomized controlled trials in many rheumatic conditions, less attention has been paid to safety in such COS. The Outcome Measures in Rheumatology (OMERACT) Filter 2.0 emphasizes the importance of measuring harms. The Safety Working Group was reestablished at the OMERACT 2016 with the objective to develop a COS for assessing safety components in trials across rheumatologic conditions. The safety issue has previously been discussed at OMERACT, but without a consistent approach to ensure harms were included in COS. Our methods include (1) identifying harmful outcomes in trials of interventions studied in patients with rheumatic diseases by a systematic literature review, (2) identifying components of safety that should be measured in such trials by use of a patient-driven approach including qualitative data collection and statistical organization of data, and (3) developing a COS through consensus processes including everyone involved. Members of OMERACT including patients, clinicians, researchers, methodologists, and industry representatives reached consensus on the need to continue the efforts on developing a COS for safety in rheumatology trials. There was a general agreement about the need to identify safety-related outcomes that are meaningful to patients, framed in terms that patients consider relevant so that they will be able to make informed decisions. The OMERACT Safety Working Group will advance the work previously done within OMERACT using a new patient-driven approach.

  16. Development of Survey Scales for Measuring Exposure and Behavioral Responses to Disruptive Intraoperative Behavior.

    PubMed

    Villafranca, Alexander; Hamlin, Colin; Rodebaugh, Thomas L; Robinson, Sandra; Jacobsohn, Eric

    2017-09-10

    Disruptive intraoperative behavior has detrimental effects on clinicians, institutions, and patients. How clinicians respond to this behavior can either exacerbate or attenuate its effects. Previous investigations of disruptive behavior have used survey scales with significant limitations. The study objective was to develop appropriate scales to measure exposure and responses to disruptive behavior. We obtained ethics approval. The scales were developed in a sequence of steps. They were pretested using expert reviews, computational linguistic analysis, and cognitive interviews. The scales were then piloted on Canadian operating room clinicians. Factor analysis was applied to half of the data set for question reduction and grouping. Item response analysis and theoretical reviews ensured that important questions were not eliminated. Internal consistency was evaluated using Cronbach α. Model fit was examined on the second half of the data set using confirmatory factor analysis. Content validity of the final scales was re-evaluated. Consistency between observed relationships and theoretical predictions was assessed. Temporal stability was evaluated on a subsample of 38 respondents. A total of 1433 and 746 clinicians completed the exposure and response scales, respectively. Content validity indices were excellent (exposure = 0.96, responses = 1.0). Internal consistency was good (exposure = 0.93, responses = 0.87). Correlations between the exposure scale and secondary measures were consistent with expectations based on theory. Temporal stability was acceptable (exposure = 0.77, responses = 0.73). We have developed scales measuring exposure and responses to disruptive behavior. They generate valid and reliable scores when surveying operating room clinicians, and they overcome the limitations of previous tools. These survey scales are freely available.
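
    The internal-consistency figures quoted in this record are Cronbach α values. A minimal sketch of the computation on synthetic item responses (illustrative only, not the authors' data or code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic responses: 8 items driven by one latent trait plus noise,
# which should yield a high alpha.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 8))
print(round(cronbach_alpha(items), 2))
```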

  17. Using level set based inversion of arrival times to recover shear wave speed in transient elastography and supersonic imaging

    NASA Astrophysics Data System (ADS)

    McLaughlin, Joyce; Renzi, Daniel

    2006-04-01

    Transient elastography and supersonic imaging are promising new techniques for characterizing the elasticity of soft tissues. In these techniques, an 'ultrafast imaging' system (up to 10,000 frames/s) follows in real time the propagation of a low-frequency shear wave. The displacement of the propagating shear wave is measured as a function of time and space. Here we develop a fast level set based algorithm for finding the shear wave speed from the interior positions of the propagating front. We compare the performance of the level curve methods developed here and our previously developed distance methods (McLaughlin J and Renzi D 2006 Shear wave speed recovery in transient elastography and supersonic imaging using propagating fronts Inverse Problems 22 681-706). We give reconstruction examples from synthetic data and from data obtained from a phantom experiment performed by Mathias Fink's group (Laboratoire Ondes et Acoustique, ESPCI, Université Paris VII).
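
    The relation such arrival-time methods exploit is the Eikonal equation: the local wave speed is the reciprocal of the arrival-time gradient, c = 1/|∇T|. A minimal 1D sketch on synthetic data (not the authors' level set algorithm):

```python
import numpy as np

# A front moving at constant speed c_true produces arrival times
# T(x) = x / c_true; differentiating the measured times recovers the speed
# via the Eikonal relation c = 1 / |dT/dx|.
c_true = 2.5                      # m/s, a plausible soft-tissue shear speed
x = np.linspace(0.0, 0.05, 101)   # 5 cm of lateral positions
T = x / c_true                    # arrival time of the front at each x

slowness = np.gradient(T, x)      # finite-difference estimate of dT/dx
c_rec = 1.0 / np.abs(slowness)
print(c_rec.mean())
```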

  18. Development of distinct control networks through segregation and integration

    PubMed Central

    Fair, Damien A.; Dosenbach, Nico U. F.; Church, Jessica A.; Cohen, Alexander L.; Brahmbhatt, Shefali; Miezin, Francis M.; Barch, Deanna M.; Raichle, Marcus E.; Petersen, Steven E.; Schlaggar, Bradley L.

    2007-01-01

    Human attentional control is unrivaled. We recently proposed that adults depend on distinct frontoparietal and cingulo-opercular networks for adaptive online task control versus more stable set control, respectively. During development, both experience-dependent evoked activity and spontaneous waves of synchronized cortical activity are thought to support the formation and maintenance of neural networks. Such mechanisms may encourage tighter “integration” of some regions into networks over time while “segregating” other sets of regions into separate networks. Here we use resting state functional connectivity MRI, which measures correlations in spontaneous blood oxygenation level-dependent signal fluctuations between brain regions, to compare previously identified control networks between children and adults. We find that development of the proposed adult control networks involves both segregation (i.e., decreased short-range connections) and integration (i.e., increased long-range connections) of the brain regions that comprise them. Delay/disruption in the developmental processes of segregation and integration may play a role in disorders of control, such as autism, attention deficit hyperactivity disorder, and Tourette's syndrome. PMID:17679691

  19. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
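
    The controller described above minimizes a cost combining squared tracking error with weighted control effort. A minimal one-step sketch with a hypothetical scalar plant (numbers invented, not the paper's crop models or controller):

```python
# Quadratic cost of the kind described above:
#   J(u) = (y(u) - r)**2 + lam * u**2,  with a linear response y(u) = a + b*u.
# Setting dJ/du = 0 gives the closed form u* = b*(r - a) / (b**2 + lam),
# trading tracking error against weighted control effort.

def optimal_control(a, b, r, lam):
    return b * (r - a) / (b * b + lam)

a, b = 20.0, 0.8   # current state (e.g. air temperature) and actuator gain
r = 24.0           # reference set point
for lam in (0.0, 0.5, 2.0):
    u = optimal_control(a, b, r, lam)
    print(f"lam={lam}: u*={u:.2f}, y={a + b * u:.2f}")
```

    As the effort weight lam grows, the optimal control action shrinks and the response falls short of the reference, which is the trade-off the cost function encodes.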

  20. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    2000-01-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.

  1. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    PubMed Central

    Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576

  2. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    PubMed

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  3. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  4. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE PAGES

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...

    2017-12-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  5. Automatic telangiectasia analysis in dermoscopy images using adaptive critic design.

    PubMed

    Cheng, B; Stanley, R J; Stoecker, W V; Hinton, K

    2012-11-01

    Telangiectasia, tiny skin vessels, are important dermoscopy structures used to discriminate basal cell carcinoma (BCC) from benign skin lesions. This research builds on previously developed image analysis techniques that identify vessels automatically in order to discriminate benign lesions from BCCs. A biologically inspired reinforcement learning approach is investigated in an adaptive critic design framework, applying action-dependent heuristic dynamic programming (ADHDP) for discrimination based on features computed under different skin lesion contrast variations to promote the discrimination process. Lesion discrimination results for ADHDP are compared with multilayer perceptron backpropagation artificial neural networks. This study uses a data set of 498 dermoscopy skin lesion images of 263 BCCs and 226 competitive benign images as the input sets. This data set is extended from previous research [Cheng et al., Skin Research and Technology, 2011, 17: 278]. Experimental results yielded a diagnostic accuracy as high as 84.6% using the ADHDP approach, an 8.03% improvement over a standard multilayer perceptron method. We have chosen BCC detection rather than vessel detection as the endpoint. Although vessel detection is inherently easier, BCC detection has potential direct clinical applications. Small BCCs are detectable early by dermoscopy and potentially detectable by the automated methods described in this research. © 2011 John Wiley & Sons A/S.

  6. Coherent soft X-ray diffraction imaging of coliphage PR772 at the Linac coherent light source

    PubMed Central

    Reddy, Hemanth K.N.; Yoon, Chun Hong; Aquila, Andrew; Awel, Salah; Ayyer, Kartik; Barty, Anton; Berntsen, Peter; Bielecki, Johan; Bobkov, Sergey; Bucher, Maximilian; Carini, Gabriella A.; Carron, Sebastian; Chapman, Henry; Daurer, Benedikt; DeMirci, Hasan; Ekeberg, Tomas; Fromme, Petra; Hajdu, Janos; Hanke, Max Felix; Hart, Philip; Hogue, Brenda G.; Hosseinizadeh, Ahmad; Kim, Yoonhee; Kirian, Richard A.; Kurta, Ruslan P.; Larsson, Daniel S.D.; Duane Loh, N.; Maia, Filipe R.N.C.; Mancuso, Adrian P.; Mühlig, Kerstin; Munke, Anna; Nam, Daewoong; Nettelblad, Carl; Ourmazd, Abbas; Rose, Max; Schwander, Peter; Seibert, Marvin; Sellberg, Jonas A.; Song, Changyong; Spence, John C.H.; Svenda, Martin; Van der Schot, Gijs; Vartanyants, Ivan A.; Williams, Garth J.; Xavier, P. Lourdu

    2017-01-01

    Single-particle diffraction from X-ray Free Electron Lasers offers the potential for molecular structure determination without the need for crystallization. In an effort to further develop the technique, we present a dataset of coherent soft X-ray diffraction images of Coliphage PR772 virus, collected at the Atomic Molecular Optics (AMO) beamline with pnCCD detectors in the LAMP instrument at the Linac Coherent Light Source. The diameter of PR772 ranges from 65–70 nm, which is considerably smaller than the previously reported ~600 nm diameter Mimivirus. This reflects continued progress in XFEL-based single-particle imaging towards the single molecular imaging regime. The data set contains significantly more single particle hits than collected in previous experiments, enabling the development of improved statistical analysis, reconstruction algorithms, and quantitative metrics to determine resolution and self-consistency. PMID:28654088

  7. Coherent soft X-ray diffraction imaging of coliphage PR772 at the Linac coherent light source

    DOE PAGES

    Reddy, Hemanth K. N.; Yoon, Chun Hong; Aquila, Andrew; ...

    2017-06-27

    Single-particle diffraction from X-ray Free Electron Lasers offers the potential for molecular structure determination without the need for crystallization. In an effort to further develop the technique, we present a dataset of coherent soft X-ray diffraction images of Coliphage PR772 virus, collected at the Atomic Molecular Optics (AMO) beamline with pnCCD detectors in the LAMP instrument at the Linac Coherent Light Source. The diameter of PR772 ranges from 65–70 nm, which is considerably smaller than the previously reported ~600 nm diameter Mimivirus. This reflects continued progress in XFEL-based single-particle imaging towards the single molecular imaging regime. As a result, the data set contains significantly more single particle hits than collected in previous experiments, enabling the development of improved statistical analysis, reconstruction algorithms, and quantitative metrics to determine resolution and self-consistency.

  8. Non-linear quantitative structure-activity relationship for adenine derivatives as competitive inhibitors of adenosine deaminase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadat Hayatshahi, Sayyed Hamed; Abdolmaleki, Parviz; Safarian, Shahrokh

    2005-12-16

    Logistic regression and artificial neural networks have been developed as two non-linear models to establish quantitative structure-activity relationships between structural descriptors and the biochemical activity of adenosine based competitive inhibitors toward adenosine deaminase. The training set included 24 compounds with known K_i values. The models were trained to solve two-class problems. Unlike the previous work in which multiple linear regression was used, the highest positive charge on the molecules was found to be closely related to their inhibition activity, while the electric charge on atom N1 of adenosine was found to be a poor descriptor. Consequently, the previously developed equation was improved, and the newly formed one could predict the class of 91.66% of compounds correctly. Also, optimized 2-3-1 and 3-4-1 neural networks could increase this rate to 95.83%.
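
    A minimal two-class logistic regression fit by gradient descent, in the spirit of the classification task above. The descriptors and activity labels here are synthetic stand-ins, not the paper's compounds:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(24, 2))                      # 24 compounds, 2 descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # hypothetical activity class

w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # sigmoid probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)           # gradient of the log-loss
    b -= 0.5 * (p - y).mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = ((p > 0.5).astype(float) == y).mean()
print(f"training accuracy: {acc:.0%}")
```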

  9. Coherent soft X-ray diffraction imaging of coliphage PR772 at the Linac coherent light source.

    PubMed

    Reddy, Hemanth K N; Yoon, Chun Hong; Aquila, Andrew; Awel, Salah; Ayyer, Kartik; Barty, Anton; Berntsen, Peter; Bielecki, Johan; Bobkov, Sergey; Bucher, Maximilian; Carini, Gabriella A; Carron, Sebastian; Chapman, Henry; Daurer, Benedikt; DeMirci, Hasan; Ekeberg, Tomas; Fromme, Petra; Hajdu, Janos; Hanke, Max Felix; Hart, Philip; Hogue, Brenda G; Hosseinizadeh, Ahmad; Kim, Yoonhee; Kirian, Richard A; Kurta, Ruslan P; Larsson, Daniel S D; Duane Loh, N; Maia, Filipe R N C; Mancuso, Adrian P; Mühlig, Kerstin; Munke, Anna; Nam, Daewoong; Nettelblad, Carl; Ourmazd, Abbas; Rose, Max; Schwander, Peter; Seibert, Marvin; Sellberg, Jonas A; Song, Changyong; Spence, John C H; Svenda, Martin; Van der Schot, Gijs; Vartanyants, Ivan A; Williams, Garth J; Xavier, P Lourdu

    2017-06-27

    Single-particle diffraction from X-ray Free Electron Lasers offers the potential for molecular structure determination without the need for crystallization. In an effort to further develop the technique, we present a dataset of coherent soft X-ray diffraction images of Coliphage PR772 virus, collected at the Atomic Molecular Optics (AMO) beamline with pnCCD detectors in the LAMP instrument at the Linac Coherent Light Source. The diameter of PR772 ranges from 65-70 nm, which is considerably smaller than the previously reported ~600 nm diameter Mimivirus. This reflects continued progress in XFEL-based single-particle imaging towards the single molecular imaging regime. The data set contains significantly more single particle hits than collected in previous experiments, enabling the development of improved statistical analysis, reconstruction algorithms, and quantitative metrics to determine resolution and self-consistency.

  10. Generating perfect fluid spheres in general relativity

    NASA Astrophysics Data System (ADS)

    Boonserm, Petarpa; Visser, Matt; Weinfurtner, Silke

    2005-06-01

    Ever since Karl Schwarzschild’s 1916 discovery of the spacetime geometry describing the interior of a particular idealized general relativistic star—a static spherically symmetric blob of fluid with position-independent density—the general relativity community has continued to devote considerable time and energy to understanding the general-relativistic static perfect fluid sphere. Over the last 90 years a tangle of specific perfect fluid spheres has been discovered, with most of these specific examples seemingly independent from each other. To bring some order to this collection, in this article we develop several new transformation theorems that map perfect fluid spheres into perfect fluid spheres. These transformation theorems sometimes lead to unexpected connections between previously known perfect fluid spheres, sometimes lead to new previously unknown perfect fluid spheres, and in general can be used to develop a systematic way of classifying the set of all perfect fluid spheres.

  11. Investigational Antimicrobial Agents of 2013

    PubMed Central

    Pucci, Michael J.

    2013-01-01

    SUMMARY New antimicrobial agents are always needed to counteract the resistant pathogens that continue to be selected by current therapeutic regimens. This review provides a survey of antimicrobial agents that were in clinical development in the fall of 2012 and spring of 2013. Data were collected from published literature primarily from 2010 to 2012, meeting abstracts (2011 to 2012), government websites, and company websites when appropriate. Compared to what was reported in previous surveys, a surprising number of new agents are currently in company pipelines, particularly in phase 3 clinical development. Familiar antibacterial classes of the quinolones, tetracyclines, oxazolidinones, glycopeptides, and cephalosporins are represented by entities with enhanced antimicrobial or pharmacological properties. More importantly, compounds of novel chemical structures targeting bacterial pathways not previously exploited are under development. Some of the most promising compounds include novel β-lactamase inhibitor combinations that target many multidrug-resistant Gram-negative bacteria, a critical medical need. Although new antimicrobial agents will continue to be needed to address increasing antibiotic resistance, there are novel agents in development to tackle at least some of the more worrisome pathogens in the current nosocomial setting. PMID:24092856

  12. Towards a characterization of information automation systems on the flight deck

    NASA Astrophysics Data System (ADS)

    Dudley, Rachel Feddersen

    This thesis summarizes research investigating the characteristics that define information automation systems used on aircraft flight decks and the significant impacts these characteristics have on pilot performance. Major accomplishments of the work include the development of a set of characteristics that describe information automation systems on the flight deck and an experiment designed to study a subset of these characteristics. Information automation systems on the flight deck are responsible for the collection, processing, analysis, and presentation of data to the flightcrew. These systems pose human factors issues and challenges that must be considered by their designers. Based on a previously developed formal definition of information automation for aircraft flight deck systems, an analysis process was developed and conducted to reach a refined set of information automation characteristics. In this work, characteristics are defined as a set of properties or attributes that describe an information automation system's operation or behavior, which can be used to identify and assess potential human factors issues. Hypotheses were formed for a subset of the characteristics: Automation Visibility, Information Quality, and Display Complexity. An experimental investigation was developed to measure performance impacts related to these characteristics; it yielded a mix of expected and surprising findings, with many interactions. A set of recommendations was then developed based on the experimental observations. Ensuring that the right information is presented to pilots at the right time and in the appropriate manner is the job of flight deck system designers. This work provides a foundation for developing recommendations and guidelines specific to information automation on the flight deck with the goal of improving the design and evaluation of information automation systems before they are implemented.

  13. Electronegativity equalization method: parameterization and validation for organic molecules using the Merz-Kollman-Singh charge distribution scheme.

    PubMed

    Jirousková, Zuzana; Vareková, Radka Svobodová; Vanek, Jakub; Koca, Jaroslav

    2009-05-01

    The electronegativity equalization method (EEM) was developed by Mortier et al. as a semiempirical method based on density-functional theory. After parameterization, in which the EEM parameters A_i and B_i and the adjusting factor kappa are obtained, this approach can be used to calculate the average electronegativity and charge distribution in a molecule. The aim of this work is to perform the EEM parameterization using the Merz-Kollman-Singh (MK) charge distribution scheme obtained from B3LYP/6-31G* and HF/6-31G* calculations. To achieve this goal, we selected a set of 380 organic molecules from the Cambridge Structural Database (CSD) and used the methodology that was recently applied successfully to EEM parameterization for HF/STO-3G Mulliken charges on large sets of molecules. In the case of B3LYP/6-31G* MK charges, we have improved the EEM parameters for already parameterized elements, specifically C, H, N, O, and F. Moreover, we have developed EEM parameters for S, Br, Cl, and Zn, which had not yet been parameterized for this level of theory and basis set. In the case of HF/6-31G* MK charges, we have developed EEM parameters for C, H, N, O, S, Br, Cl, F, and Zn, none of which had previously been parameterized for this level of theory and basis set. The obtained EEM parameters were verified by a previously developed validation procedure and used for charge calculation on a different set of 116 organic molecules from the CSD. The calculated EEM charges are in very good agreement with the quantum mechanically obtained ab initio charges. © 2008 Wiley Periodicals, Inc.
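
    Once parameters are fitted, EEM reduces to a linear system: equalize the effective electronegativities across atoms subject to a fixed total charge. A sketch with invented parameters for a toy diatomic (not the fitted values from the paper):

```python
import numpy as np

# Equalizing chi_i = A_i + B_i*q_i + kappa * sum_{j != i} q_j / R_ij across
# atoms, with the constraint sum(q) = Q, is linear in the charges q and the
# molecular electronegativity chi_bar.
def eem_charges(A, B, R, kappa, Q=0.0):
    n = len(A)
    M = np.zeros((n + 1, n + 1))
    rhs = np.zeros(n + 1)
    for i in range(n):
        M[i, i] = B[i]
        for j in range(n):
            if j != i:
                M[i, j] = kappa / R[i, j]
        M[i, n] = -1.0        # coefficient of the unknown chi_bar
        rhs[i] = -A[i]
    M[n, :n] = 1.0            # charge conservation: sum(q) = Q
    rhs[n] = Q
    sol = np.linalg.solve(M, rhs)
    return sol[:n], sol[n]

# Toy diatomic with made-up parameters; the atom with the lower A ends up
# carrying the positive partial charge.
q, chi_bar = eem_charges(A=[2.0, 3.0], B=[9.0, 8.0],
                         R=np.array([[0.0, 1.1], [1.1, 0.0]]), kappa=0.5)
print(q, chi_bar)
```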

  14. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method

    PubMed Central

    2011-01-01

    Background The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. Results A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. Conclusions The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim. PMID:21586134

  15. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method.

    PubMed

    Luna, Augustin; Karac, Evrim I; Sunshine, Margot; Chang, Lucas; Nussinov, Ruth; Aladjem, Mirit I; Kohn, Kurt W

    2011-05-17

    The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim.

  16. Synthesis of linear regression coefficients by recovering the within-study covariance matrix from summary statistics.

    PubMed

    Yoneoka, Daisuke; Henmi, Masayuki

    2017-06-01

    Recently, the number of regression models has dramatically increased in several academic fields. However, within the context of meta-analysis, synthesis methods for such models have not developed at a commensurate pace. One of the difficulties hindering this development is the disparity in covariate sets among published models. If the sets of covariates differ across models, the interpretation of coefficients differs, making it difficult to synthesize them. Moreover, previous synthesis methods for regression models, such as multivariate meta-analysis, often face problems because the covariance matrix of coefficients (i.e. within-study correlations) or individual patient data are not necessarily available. This study therefore proposes a method to synthesize linear regression models under different covariate sets by using a generalized least squares method involving bias correction terms. In particular, we also propose an approach to recover (at most) three correlations of covariates, which is required for the calculation of the bias term without individual patient data. Copyright © 2016 John Wiley & Sons, Ltd.
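
    The generalized least squares step at the heart of such a synthesis pools per-study estimates through their covariance. A minimal sketch with invented numbers (the paper's bias correction terms and covariance recovery are omitted):

```python
import numpy as np

# Given per-study estimates y with covariance V and a design X mapping the
# common parameters beta to those estimates, the GLS estimator is
#   beta_hat = (X' V^-1 X)^-1 X' V^-1 y.
def gls_pool(X, y, V):
    Vinv = np.linalg.inv(V)
    cov = np.linalg.inv(X.T @ Vinv @ X)
    return cov @ X.T @ Vinv @ y, cov

# Three studies estimating the same single coefficient with differing
# precision; GLS reduces to the inverse-variance weighted mean.
X = np.ones((3, 1))
y = np.array([0.9, 1.1, 1.3])
V = np.diag([0.1, 0.2, 0.4])
beta, cov = gls_pool(X, y, V)
print(beta)
```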

  17. Cardiac Involvement with Parasitic Infections

    PubMed Central

    Hidron, Alicia; Vogenthaler, Nicholas; Santos-Preciado, José I.; Rodriguez-Morales, Alfonso J.; Franco-Paredes, Carlos; Rassi, Anis

    2010-01-01

    Summary: Parasitic infections previously seen only in developing tropical settings can now be diagnosed worldwide due to travel and population migration. Some parasites may directly or indirectly affect various anatomical structures of the heart, with infections manifested as myocarditis, pericarditis, pancarditis, or pulmonary hypertension. Thus, it has become quite relevant for clinicians in developed settings to consider parasitic infections in the differential diagnosis of myocardial and pericardial disease anywhere around the globe. Chagas' disease is by far the most important parasitic infection of the heart, and one that is currently considered a global parasitic infection due to the growing migration of populations from areas where these infections are highly endemic to settings where they are not endemic. Current advances in the treatment of African trypanosomiasis offer hope of preventing not only the neurological complications but also the frequently identified cardiac manifestations of this life-threatening parasitic infection. The lack of effective vaccines, optimal chemoprophylaxis, or evidence-based pharmacological therapies to control many of the parasitic diseases of the heart, in particular Chagas' disease, makes this disease one of the most important public health challenges of our time. PMID:20375355

  18. Examination of factor structure for the consumers' responses to the Value Consciousness Scale.

    PubMed

    Conrad, C A; Williams, J R

    2000-12-01

    The psychometric properties of the Value Consciousness Scale developed by Lichtenstein, Netemeyer, and Burton in 1990 were examined in a retail grocery study (N = 497). Original assessment of scale properties was undertaken using two convenience samples in a nonretail setting and additional scale performance has been documented by the scale authors. This study furthers previous research by (1) examining performance on the items in the retail grocery setting and (2) utilizing an appropriately rigorous sampling procedure. A confirmatory factor analysis indicated that the Value Consciousness Scale does not exhibit unidimensional properties, and one must be cautious if this scale is used in applications of market segmentation until further clarification can be provided.

  19. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation of the CDRL no. D001 modular digital computer system, developed during the logic system design, is documented. This effort followed the previously completed and documented architecture study, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during that study. The system has a microprogrammed, 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas are covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  20. Development and application of a unified balancing approach with multiple constraints

    NASA Technical Reports Server (NTRS)

    Zorzi, E. S.; Lee, C. C.; Giordano, J. C.

    1985-01-01

    The development of a general analytic approach to constrained balancing that is consistent with past influence coefficient methods is described. The approach uses Lagrange multipliers to impose orbit and/or weight constraints; these constraints are combined with the least squares minimization process to provide a set of coupled equations that yield a single solution form for determining correction weights. Proper selection of constraints provides the capability to: (1) balance higher speeds without disturbing previously balanced modes, through the use of modal trial weight sets; (2) balance off-critical speeds; and (3) balance decoupled modes by use of a single balance plane. If no constraints are imposed, this solution form reduces to the general weighted least squares influence coefficient method. A test facility used to examine the use of the general constrained balancing procedure and the application of modal trial weight ratios is also described.
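    The single solution form described above, least squares minimization with Lagrange-multiplier constraints, can be illustrated with a small numeric sketch. The snippet below is the generic constrained least-squares (KKT) solution under assumed names and toy data, not the authors' implementation.

```python
import numpy as np

def constrained_balance(A, v, C, d):
    """Least-squares correction weights w minimizing residual vibration
    ||A w + v||^2 subject to linear constraints C w = d, solved via the
    Lagrange-multiplier (KKT) system.

    A: influence-coefficient matrix, v: measured vibration vector,
    C, d: constraint rows (e.g. zero weight in an already-balanced plane).
    """
    n = A.shape[1]
    m = C.shape[0]
    # KKT system: [A^T A  C^T; C  0] [w; lam] = [-A^T v; d]
    K = np.block([[A.T @ A, C.T],
                  [C, np.zeros((m, m))]])
    rhs = np.concatenate([-A.T @ v, d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]

# Hypothetical two-plane example: cancel the measured vibration while
# forcing zero correction weight in plane 1 (preserving a balanced mode).
A = np.eye(2)                     # influence coefficients
v = np.array([1.0, 2.0])          # measured vibration
w = constrained_balance(A, v, np.array([[1.0, 0.0]]), np.array([0.0]))
```

    With the constraint rows removed (m = 0), the same system collapses to the ordinary least squares influence coefficient solution, mirroring the reduction noted in the abstract.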

  1. Upper bound of abutment scour in laboratory and field data

    USGS Publications Warehouse

    Benedict, Stephen

    2016-01-01

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a field investigation of abutment scour in South Carolina and used those data to develop envelope curves that define the upper bound of abutment scour. To expand on this previous work, an additional cooperative investigation was initiated to combine the South Carolina data with abutment scour data from other sources and evaluate upper bound patterns with this larger data set. To facilitate this analysis, 446 laboratory and 331 field measurements of abutment scour were compiled into a digital database. This extensive database was used to evaluate the South Carolina abutment scour envelope curves and to develop additional envelope curves that reflected the upper bound of abutment scour depth for the laboratory and field data. The envelope curves provide simple but useful supplementary tools for assessing the potential maximum abutment scour depth in the field setting.

  2. Moral Distress Scale for Occupational Therapists: Part 1. Instrument Development and Content Validity.

    PubMed

    Penny, Neil H; Bires, Samantha J; Bonn, Elizabeth A; Dockery, Alisha N; Pettit, Nicole L

    2016-01-01

    We describe the development of an instrument to measure moral distress experienced by occupational therapists and show how its content validity was established. Written comments (n = 78) from a previous survey using the Moral Distress Scale-Revised-Other Health Provider Adult were used to modify that instrument and create the Moral Distress Scale-Revised-Occupational Therapy-Adult Settings (MDS-R-OT[A]). The MDS-R-OT[A] was distributed to a nationwide random sample of 400 occupational therapists who rated the relevance of each item to their clinical practice. A scale content validity index of 81.8% was found (geriatric = 81.5%, physical disability = 80.8%, combination of the two = 85.7%). The MDS-R-OT[A] possesses acceptable content validity and is appropriate for use with occupational therapists working in geriatric or physical disability settings. Copyright © 2016 by the American Occupational Therapy Association, Inc.

  3. Mass detection in digital breast tomosynthesis: Deep convolutional neural network with transfer learning from mammography.

    PubMed

    Samala, Ravi K; Chan, Heang-Ping; Hadjiiski, Lubomir; Helvie, Mark A; Wei, Jun; Cha, Kenny

    2016-12-01

    Develop a computer-aided detection (CAD) system for masses in digital breast tomosynthesis (DBT) volume using a deep convolutional neural network (DCNN) with transfer learning from mammograms. A data set containing 2282 digitized film and digital mammograms and 324 DBT volumes was collected with IRB approval. The mass of interest on the images was marked by an experienced breast radiologist as reference standard. The data set was partitioned into a training set (2282 mammograms with 2461 masses and 230 DBT views with 228 masses) and an independent test set (94 DBT views with 89 masses). For DCNN training, the region of interest (ROI) containing the mass (true positive) was extracted from each image. False positive (FP) ROIs were identified at prescreening by the authors' previously developed CAD system. After data augmentation, a total of 45 072 mammographic ROIs and 37 450 DBT ROIs were obtained. Data normalization and reduction of non-uniformity in the ROIs across heterogeneous data were achieved using a background correction method applied to each ROI. A DCNN with four convolutional layers and three fully connected (FC) layers was first trained on the mammography data. Jittering and dropout techniques were used to reduce overfitting. After training with the mammographic ROIs, all weights in the first three convolutional layers were frozen, and only the last convolutional layer and the FC layers were randomly initialized again and trained using the DBT training ROIs. The authors compared the performances of two CAD systems for mass detection in DBT: one used the DCNN-based approach and the other used their previously developed feature-based approach for FP reduction. The prescreening stage was identical in both systems, passing the same set of mass candidates to the FP reduction stage. 
For the feature-based CAD system, a 3D clustering and active contour method was used for segmentation; morphological, gray level, and texture features were extracted and merged with a linear discriminant classifier to score the detected masses. For the DCNN-based CAD system, ROIs from five consecutive slices centered at each candidate were passed through the trained DCNN and a mass likelihood score was generated. The performances of the CAD systems were evaluated using free-response ROC curves and the performance difference was analyzed using a non-parametric method. Before transfer learning, the DCNN trained only on mammograms with an AUC of 0.99 classified DBT masses with an AUC of 0.81 in the DBT training set. After transfer learning with DBT, the AUC improved to 0.90. For breast-based CAD detection in the test set, the sensitivity for the feature-based and the DCNN-based CAD systems was 83% and 91%, respectively, at 1 FP/DBT volume. The difference between the performances for the two systems was statistically significant (p-value < 0.05). The image patterns learned from the mammograms were transferred to the mass detection on DBT slices through the DCNN. This study demonstrated that large data sets collected from mammography are useful for developing new CAD systems for DBT, alleviating the effort of collecting entirely new large data sets for the new modality.
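    The transfer-learning step described above (freeze the first three convolutional layers, re-initialize and retrain the last convolutional layer and the FC layers) can be sketched in PyTorch. The architecture below is a hypothetical stand-in with the stated layer counts; channel widths, kernel sizes, and the 64 × 64 input size are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class MassDCNN(nn.Module):
    """Hypothetical 4-conv / 3-FC network in the spirit of the abstract."""
    def __init__(self):
        super().__init__()
        self.convs = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.fcs = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1),
        )

    def forward(self, x):          # x: (batch, 1, 64, 64)
        return self.fcs(self.convs(x))

def prepare_for_transfer(model):
    """Freeze the first three conv layers; re-initialize (and leave
    trainable) the last conv layer and all FC layers."""
    conv_layers = [m for m in model.convs if isinstance(m, nn.Conv2d)]
    for conv in conv_layers[:3]:          # frozen mammography features
        for p in conv.parameters():
            p.requires_grad = False
    conv_layers[3].reset_parameters()     # retrained on DBT ROIs
    for m in model.fcs:
        if isinstance(m, nn.Linear):
            m.reset_parameters()
    return model
```

    An optimizer for the DBT stage would then be built only over parameters with `requires_grad=True`, so gradient updates touch just the re-initialized layers.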

  4. Development and validation of a socioculturally competent trust in physician scale for a developing country setting.

    PubMed

    Gopichandran, Vijayaprasad; Wouters, Edwin; Chetlapalli, Satish Kumar

    2015-05-03

    Trust in physicians is the unwritten covenant between the patient and the physician that the physician will do what is in the best interest of the patient. This forms the undercurrent of all healthcare relationships. Several scales exist for assessment of trust in physicians in developed healthcare settings, but to our knowledge none of these have been developed in a developing country context. To develop and validate a new trust in physician scale for a developing country setting. Dimensions of trust in physicians, which were identified in a previous qualitative study in the same setting, were used to develop a scale. This scale was administered among 616 adults selected from urban and rural areas of Tamil Nadu, south India, using a multistage sampling cross sectional survey method. The individual items were analysed using a classical test approach as well as item response theory. Cronbach's α was calculated and the item to total correlation of each item was assessed. After testing for unidimensionality and absence of local dependence, a two-parameter logistic Samejima graded response model was fit and item characteristics assessed. Competence, assurance of treatment, respect for the physician and loyalty to the physician were important dimensions of trust. A total of 31 items were developed using these dimensions. Of these, 22 were selected for final analysis. The Cronbach's α was 0.928. The item to total correlations were acceptable for all the 22 items. The item response analysis revealed good item characteristic curves and item information for all the items. Based on the item parameters and item information, a final 12 item scale was developed. The scale performs optimally in the low to moderate trust range. The final 12 item trust in physician scale has good construct validity and internal consistency. Published by the BMJ Publishing Group Limited. 
For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
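    The internal-consistency figure reported above (Cronbach's α = 0.928) follows directly from the item and total-score variances. A minimal sketch, with a hypothetical function name and illustrative Likert-type data rather than the study's responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses: 5 respondents rating 3 items on a 1-5 scale
responses = np.array([[4, 4, 5],
                      [3, 4, 3],
                      [5, 5, 5],
                      [2, 3, 2],
                      [4, 5, 4]])
alpha = cronbach_alpha(responses)
```

    When items are perfectly correlated the statistic reaches its ceiling of 1; values near the study's 0.928 indicate that the 22 retained items covary strongly.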

  5. Development and validation of a socioculturally competent trust in physician scale for a developing country setting

    PubMed Central

    Gopichandran, Vijayaprasad; Wouters, Edwin; Chetlapalli, Satish Kumar

    2015-01-01

    Trust in physicians is the unwritten covenant between the patient and the physician that the physician will do what is in the best interest of the patient. This forms the undercurrent of all healthcare relationships. Several scales exist for assessment of trust in physicians in developed healthcare settings, but to our knowledge none of these have been developed in a developing country context. Objectives To develop and validate a new trust in physician scale for a developing country setting. Methods Dimensions of trust in physicians, which were identified in a previous qualitative study in the same setting, were used to develop a scale. This scale was administered among 616 adults selected from urban and rural areas of Tamil Nadu, south India, using a multistage sampling cross sectional survey method. The individual items were analysed using a classical test approach as well as item response theory. Cronbach's α was calculated and the item to total correlation of each item was assessed. After testing for unidimensionality and absence of local dependence, a two-parameter logistic Samejima graded response model was fit and item characteristics assessed. Results Competence, assurance of treatment, respect for the physician and loyalty to the physician were important dimensions of trust. A total of 31 items were developed using these dimensions. Of these, 22 were selected for final analysis. The Cronbach's α was 0.928. The item to total correlations were acceptable for all the 22 items. The item response analysis revealed good item characteristic curves and item information for all the items. Based on the item parameters and item information, a final 12 item scale was developed. The scale performs optimally in the low to moderate trust range. Conclusions The final 12 item trust in physician scale has good construct validity and internal consistency. PMID:25941182

  6. A cross-sectional evaluation of community pharmacists' perceptions of intermediate care and medicines management across the healthcare interface.

    PubMed

    Millar, Anna; Hughes, Carmel; Devlin, Maria; Ryan, Cristín

    2016-12-01

    Background Despite the importance placed on the concept of the multidisciplinary team in relation to intermediate care (IC), little is known about community pharmacists' (CPs) involvement. Objective To determine CPs' awareness of and involvement with IC services, perceptions of the transfer of patients' medication information between healthcare settings and views of the development of a CP-IC service. Setting Community pharmacies in Northern Ireland. Methods A postal questionnaire, informed by previous qualitative work, was developed and piloted. Main outcome measure CPs' awareness of and involvement with IC. Results The response rate was 35.3 % (190/539). Under half (47.4 %) of CPs 'agreed/strongly agreed' that they understood the term 'intermediate care'. Three quarters of respondents were either not involved or unsure if they were involved with providing services to IC. A small minority (1.2 %) of CPs reported that they received communication regarding medication changes made in hospital or IC settings 'all of the time'. Only 9.5 and 0.5 % of respondents 'strongly agreed' that communication from hospital and IC, respectively, was sufficiently detailed. In total, 155 (81.6 %) CPs indicated that they would like to have greater involvement with IC services. 'Current workload' was ranked as the most important barrier to service development. Conclusion CPs had little awareness of, or involvement with, IC. Communication of information relating to patients' medicines between settings was perceived as insufficient, especially between IC and community pharmacy settings. CPs demonstrated willingness to be involved with IC and services aimed at bridging the communication gap between healthcare settings.

  7. Validated Outcomes in the Grafting of Autologous Fat to the Breast: The VOGUE Study. Development of a Core Outcome Set for Research and Audit.

    PubMed

    Agha, Riaz A; Pidgeon, Thomas E; Borrelli, Mimi R; Dowlut, Naeem; Orkar, Ter-Er K; Ahmed, Maziyah; Pujji, Ojas; Orgill, Dennis P

    2018-05-01

    Autologous fat grafting is an important part of the reconstructive surgeon's toolbox when treating women affected by breast cancer and subsequent tumor extirpation. The debate over safety and efficacy of autologous fat grafting continues within the literature. However, work performed by the authors' group has shown significant heterogeneity in outcome reporting. Core outcome sets have been shown to reduce heterogeneity in outcome reporting. The authors' goal was to develop a core outcome set for autologous fat grafting in breast reconstruction. The authors published their protocol a priori. A Delphi consensus exercise among key stakeholders was conducted using a list of outcomes generated from their previous work. These outcomes were divided into six domains: oncologic, clinical, aesthetic and functional, patient-reported, process, and radiologic. In the first round, 55 of 78 participants (71 percent) completed the Delphi consensus exercise. Consensus was reached on nine of the 13 outcomes. The clarity of the results and the lack of additional suggested outcomes made further rounds unnecessary. The VOGUE Study has led to the development of a much-needed core outcome set in the active research front and clinical area of autologous fat grafting. The authors hope that clinicians will use this core outcome set to audit their practice, and that researchers will implement these outcomes in their study design and reporting of autologous fat grafting outcomes. The authors encourage journals and surgical societies to endorse and encourage use of this core outcome set to help refine the scientific quality of the debate, the discourse, and the literature. Therapeutic, V.

  8. Developing a set of consensus indicators to support maternity service quality improvement: using Core Outcome Set methodology including a Delphi process.

    PubMed

    Bunch, K J; Allin, B; Jolly, M; Hardie, T; Knight, M

    2018-05-16

    To develop a core metric set to monitor the quality of maternity care. Delphi process followed by a face-to-face consensus meeting. English maternity units. Three representative expert panels: service designers, providers and users. Maternity care metrics judged important by participants. Participants were asked to complete a two-phase Delphi process, scoring metrics from existing local maternity dashboards. A consensus meeting discussed the results and re-scored the metrics. In all, 125 distinct metrics across six domains were identified from existing dashboards. Following the consensus meeting, 14 metrics met the inclusion criteria for the final core set: smoking rate at booking; rate of birth without intervention; caesarean section delivery rate in Robson group 1 women; caesarean section delivery rate in Robson group 2 women; caesarean section delivery rate in Robson group 5 women; third- and fourth-degree tear rate among women delivering vaginally; rate of postpartum haemorrhage of ≥1500 ml; rate of successful vaginal birth after a single previous caesarean section; smoking rate at delivery; proportion of babies born at term with an Apgar score <7 at 5 minutes; proportion of babies born at term admitted to the neonatal intensive care unit; proportion of babies readmitted to hospital at <30 days of age; breastfeeding initiation rate; and breastfeeding rate at 6-8 weeks. Core outcome set methodology can be used to incorporate the views of key stakeholders in developing a core metric set to monitor the quality of care in maternity units, thus enabling improvement. Achieving consensus on core metrics for monitoring the quality of maternity care. © 2018 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.

  9. Development of a new model to engage patients and clinicians in setting research priorities.

    PubMed

    Pollock, Alex; St George, Bridget; Fenton, Mark; Crowe, Sally; Firkins, Lester

    2014-01-01

    Equitable involvement of patients and clinicians in setting research and funding priorities is ethically desirable and can improve the quality, relevance and implementation of research. Survey methods used in previous priority setting projects to gather treatment uncertainties may not be sufficient to facilitate responses from patients and their lay carers for some health care topics. We aimed to develop a new model to engage patients and clinicians in setting research priorities relating to life after stroke, and to explore the use of this model within a James Lind Alliance (JLA) priority setting project. We developed a model to facilitate involvement through targeted engagement and assisted involvement (FREE TEA model). We implemented both standard surveys and the FREE TEA model to gather research priorities (treatment uncertainties) from people affected by stroke living in Scotland. We explored and compared the number of treatment uncertainties elicited from different groups by the two approaches. We gathered 516 treatment uncertainties from stroke survivors, carers and health professionals. We achieved approximately equal numbers of contributions; 281 (54%) from stroke survivors/carers; 235 (46%) from health professionals. For stroke survivors and carers, 98 (35%) treatment uncertainties were elicited from the standard survey and 183 (65%) at FREE TEA face-to-face visits. This contrasted with the health professionals, for whom 198 (84%) were elicited from the standard survey and only 37 (16%) from FREE TEA visits. The FREE TEA model has implications for future priority setting projects and user-involvement relating to populations of people with complex health needs. Our results imply that reliance on standard surveys may result in poor and unrepresentative involvement of patients, thereby favouring the views of health professionals.

  10. Charting a course to competency: an approach to mapping public health core competencies to existing trainings.

    PubMed

    Neiworth, Latrissa L; Allan, Susan; D'Ambrosio, Luann; Coplen-Abrahamson, Marlene

    2014-03-01

    Consistent with other professional fields, the goals of public health training have moved from a focus on knowledge transfer to the development of skills or competencies. At least six national competency sets have been developed in the past decade pertaining to public health professionals. State and local public health agencies are increasingly using competency sets as frameworks for staff development and assessment. Mapping competencies to training has potential for enhancing the value of public health training during resource-constrained times by directly linking training content to the desired skills. For existing public health trainings, the challenge is how to identify competencies addressed in those courses in a manner that is not burdensome and that produces valid results. This article describes a process for mapping competencies to the learning objectives, assignments, and assessments of existing trainings. The process presented could be used by any training center or organization that seeks to connect public health workforce competencies to previously developed instruction. Public health practice can be strengthened more effectively if trainings can be selected for the desired practice skills or competencies.

  11. Framework for Smart Electronic Health Record- Linked Predictive Models to Optimize Care for Complex Digestive Diseases

    DTIC Science & Technology

    2015-03-01

    data against previous published outcomes in AP and Chronic Pancreatitis (CP). This served as useful validation of our data set before entering the...These patients can develop multiple complications from their disease. In addition, the treatments for CD (both medical and surgical ) can impose...years of diagnosis. The treatment for CD can sometimes involve very expensive medications with potentially serious side effects, as well as surgical

  12. Knowledge-Assisted Approach to Identify Pathways with Differential Dependencies | Office of Cancer Genomics

    Cancer.gov

    We have previously developed a statistical method to identify gene sets enriched with condition-specific genetic dependencies. The method constructs gene dependency networks from bootstrapped samples in one condition and computes the divergence between distributions of network likelihood scores from different conditions. It was shown to be capable of sensitive and specific identification of pathways with phenotype-specific dysregulation, i.e., rewiring of dependencies between genes in different conditions.

  13. Tank waste remediation system baseline tank waste inventory estimates for fiscal year 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shelton, L.W., Westinghouse Hanford

    1996-12-06

    A set of tank-by-tank waste inventories is derived from historical waste models, flowsheet records, and analytical data to support the Tank Waste Remediation System flowsheet and retrieval sequence studies. Enabling assumptions and methodologies used to develop the inventories are discussed. These provisional inventories conform to previously established baseline inventories and are meant to serve as an interim basis until standardized inventory estimates are made available.

  14. Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages

    DTIC Science & Technology

    2011-01-01

    important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth...describe the extensions of this framework to model drug- induced growth inhibition of M. tuberculosis in macrophages.39 Mathematical framework Fig. 1 shows...starting point, we used the previously developed iNJ661v model to represent the metabolic Fig. 1 Mathematical framework: a set of coupled models used to

  15. The LARSYS Educational Package: Instructor's Notes for Use with the Data 100

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J. D.

    1977-01-01

    The LARSYS Educational Package is a set of instructional materials developed to train people to analyze remotely sensed multispectral data using LARSYS, a computer software system. The materials included in this volume have been designed to assist LARSYS instructors as they guide students through the LARSYS Educational Package. All of the materials have been updated from the previous version to reflect the use of a Data 100 Remote Terminal.

  16. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. 
The results indicate that the nipple on mammograms can be detected accurately. This will be an important step towards automatic multiple image analysis for CAD techniques.

  17. Core Outcomes in Ventilation Trials (COVenT): protocol for a core outcome set using a Delphi survey with a nested randomised trial and observational cohort study.

    PubMed

    Blackwood, Bronagh; Ringrow, Suzanne; Clarke, Mike; Marshall, John; Rose, Louise; Williamson, Paula; McAuley, Danny

    2015-08-20

    Among clinical trials of interventions that aim to modify time spent on mechanical ventilation for critically ill patients there is considerable inconsistency in chosen outcomes and how they are measured. The Core Outcomes in Ventilation Trials (COVenT) study aims to develop a set of core outcomes for use in future ventilation trials in mechanically ventilated adults and children. We will use a mixed methods approach that incorporates a randomised trial nested within a Delphi study and a consensus meeting. Additionally, we will conduct an observational cohort study to evaluate uptake of the core outcome set in published studies at 5 and 10 years following core outcome set publication. The three-round online Delphi study will use a list of outcomes that have been reported previously in a review of ventilation trials. The Delphi panel will include a range of stakeholder groups including patient support groups. The panel will be randomised to one of three feedback methods to assess the impact of the feedback mechanism on subsequent ranking of outcomes. A final consensus meeting will be held with stakeholder representatives to review outcomes. The COVenT study aims to develop a core outcome set for ventilation trials in critical care, explore the best Delphi feedback mechanism for achieving consensus and determine if participation increases use of the core outcome set in the long term.

  18. Performance measures for a dialysis setting.

    PubMed

    Gu, Xiuzhu; Itoh, Kenji

    2018-03-01

    This study from Japan extracted performance measures for dialysis unit management and investigated their characteristics from professional views. Two surveys were conducted using self-administered questionnaires, in which dialysis managers/staff were asked to rate the usefulness of 44 performance indicators. A total of 255 managers and 2,097 staff responded. Eight performance measures were elicited from dialysis manager and staff responses: these were safety, operational efficiency, quality of working life, financial effectiveness, employee development, mortality, patient/employee satisfaction and patient-centred health care. These performance measures were almost compatible with those extracted in overall healthcare settings in a previous study. Internal reliability, content and construct validity of the performance measures for the dialysis setting were ensured to some extent. As a general trend, both dialysis managers and staff perceived performance measures as highly useful, especially for safety, mortality, operational efficiency and patient/employee satisfaction, but showed relatively low concern for patient-centred health care and employee development. However, dialysis managers' usefulness perceptions were significantly higher than those of staff. The study yielded important guidelines for designing a holistic hospital/clinic management system. Performance measures must be balanced between outcomes and performance shaping factors (PSF); a common set of performance measures could be applied to all healthcare settings, although the performance indicators of each measure should be composed based on the application field and setting; in addition, sound causal relationships between PSF and outcome measures/indicators should be explored for further improvement. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.
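
    The internal reliability mentioned above is conventionally quantified with Cronbach's alpha, the standard internal-consistency statistic for a multi-indicator measure. Below is a minimal sketch using hypothetical usefulness ratings (the survey's actual items and data are not available here):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) rating matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each indicator
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical usefulness ratings: rows = respondents, columns = indicators
ratings = np.array([
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 5],
    [2, 3, 2],
])
alpha = cronbach_alpha(ratings)
```

    Values above roughly 0.7 are usually read as acceptable internal consistency for a scale.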

  19. Validation of quality indicators for the organization of palliative care: a modified RAND Delphi study in seven European countries (the Europall project).

    PubMed

    Woitha, Kathrin; Van Beek, Karen; Ahmed, Nisar; Jaspers, Birgit; Mollard, Jean M; Ahmedzai, Sam H; Hasselaar, Jeroen; Menten, Johan; Vissers, Kris; Engels, Yvonne

    2014-02-01

    Validated quality indicators can help health-care professionals to evaluate their medical practices in a comparative manner to deliver optimal clinical care. No international set of quality indicators to measure the organizational aspects of palliative care settings exists. To develop and validate a set of structure and process indicators for palliative care settings in Europe, a two-round modified RAND Delphi process was conducted to rate the clarity and usefulness of a previously developed set of 110 quality indicators. In total, 20 multi-professional palliative care teams from centers of excellence in seven European countries participated. In total, 56 quality indicators were rated as useful. These valid quality indicators concerned the following domains: the definition of a palliative care service (2 quality indicators), accessibility to palliative care (16 quality indicators), specific infrastructure to deliver palliative care (8 quality indicators), symptom assessment tools (1 quality indicator), specific personnel in palliative care services (9 quality indicators), documentation methodology of clinical data (14 quality indicators), evaluation of quality and safety procedures (1 quality indicator), reporting of clinical activities (1 quality indicator), and education in palliative care (4 quality indicators). The modified RAND Delphi process resulted in 56 international face-validated quality indicators to measure and compare organizational aspects of palliative care. These quality indicators, aimed at assessing and improving the organization of palliative care, will be pilot tested in palliative care settings all over Europe and used in the EU FP7 funded IMPACT project.

  20. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2012-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data, where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.
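
    The data-parallelism property described above, the same computation applied independently to every element, can be illustrated on a hypothetical stack of thermographic frames. This NumPy sketch runs on the CPU; on a GPU, each pixel's time series would map onto its own thread:

```python
import numpy as np

# Hypothetical stack of thermographic frames: (n_frames, height, width),
# synthetic temperatures in kelvin (illustrative only).
rng = np.random.default_rng(0)
frames = rng.normal(loc=300.0, scale=5.0, size=(64, 128, 128))

# A data-parallel computation: each output pixel depends only on its own
# time series, so every pixel can be processed independently. This is the
# property that lets such reductions scale across thousands of GPU cores.
pixel_mean = frames.mean(axis=0)
pixel_peak = frames.max(axis=0) - pixel_mean  # peak temporal contrast per pixel
```

    The per-pixel reduction has no cross-pixel data dependencies, which is exactly the structure the abstract identifies as GPU-friendly.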

  1. Dynamical Causal Modeling from a Quantum Dynamical Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demiralp, Emre; Demiralp, Metin

    Recent research suggests that any set of first order linear vector ODEs can be converted to a set of specific vector ODEs adhering to what we have called ''Quantum Harmonical Form (QHF)''. QHF has been developed using a virtual quantum multi harmonic oscillator system where mass and force constants are considered to be time variant and the Hamiltonian is defined as a conic structure over positions and momenta to conserve the Hermiticity. As described in previous works, the conversion to QHF requires the matrix coefficient of the first set of ODEs to be a normal matrix. In this paper, this limitation is circumvented using a space extension approach expanding the potential applicability of this method. Overall, conversion to QHF allows the investigation of a set of ODEs using mathematical tools available to the investigation of the physical concepts underlying quantum harmonic oscillators. The utility of QHF in the context of dynamical systems and dynamical causal modeling in behavioral and cognitive neuroscience is briefly discussed.
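
    The normality condition on the coefficient matrix (the limitation this paper circumvents) is easy to check numerically: a matrix A is normal if and only if it commutes with its conjugate transpose, A A† = A† A. A small sketch:

```python
import numpy as np

def is_normal(a: np.ndarray, tol: float = 1e-10) -> bool:
    """A matrix is normal iff it commutes with its conjugate transpose."""
    a = np.asarray(a)
    return np.allclose(a @ a.conj().T, a.conj().T @ a, atol=tol)

hermitian = np.array([[2.0, 1.0], [1.0, 3.0]])  # Hermitian, hence normal
shear = np.array([[1.0, 1.0], [0.0, 1.0]])      # a shear is not normal
```

    Hermitian, skew-Hermitian, and unitary matrices all pass this test; a generic coefficient matrix need not, which is why the space extension approach broadens applicability.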

  2. High-precision horizontally directed force measurements for high dead loads based on a differential electromagnetic force compensation system

    NASA Astrophysics Data System (ADS)

    Vasilyan, Suren; Rivero, Michel; Schleichert, Jan; Halbedel, Bernd; Fröhlich, Thomas

    2016-04-01

    In this paper, we present an application for realizing high-precision horizontally directed force measurements on the order of several tens of nN in combination with high dead loads of about 10 N. The set-up is developed on the basis of two identical state-of-the-art electromagnetic force compensation (EMFC) high precision balances. The measurement resolution of horizontally directed single-axis quasi-dynamic forces is 20 nN over the working range of ±100 μN. The set-up operates in two different measurement modes: in the open-loop mode the mechanical deflection of the proportional lever is an indication of the acting force, whereas in the closed-loop mode it is the electric current applied to the coil inside the EMFC balance that compensates the deflection of the lever back to the offset zero position. The estimated loading frequency (cutoff frequency) of the set-up is about 0.18 Hz in the open-loop mode and 0.7 Hz in the closed-loop mode. One practical application that the set-up is suitable for is flow rate measurement of electrolytes of low electrical conductivity by applying the contactless technique of Lorentz force velocimetry. Based on a previously developed set-up which uses a single EMFC balance, experimental, theoretical and numerical analyses of the thermo-mechanical properties of the supporting structure are presented.

  3. Unsupervised classification of variable stars

    NASA Astrophysics Data System (ADS)

    Valenzuela, Lucas; Pichara, Karim

    2018-03-01

    During the past 10 years, a considerable amount of effort has been made to develop algorithms for automatic classification of variable stars. That has been primarily achieved by applying machine learning methods to photometric data sets where objects are represented as light curves. Classifiers require training sets to learn the underlying patterns that allow the separation among classes. Unfortunately, building training sets is an expensive process that demands a lot of human effort. Every time data come from new surveys, the only available training instances are the ones that have a cross-match with previously labelled objects, consequently generating insufficient training sets compared with the large amounts of unlabelled sources. In this work, we present an algorithm that performs unsupervised classification of variable stars, relying only on the similarity among light curves. We tackle the unsupervised classification problem by proposing an untraditional approach. Instead of trying to match classes of stars with clusters found by a clustering algorithm, we propose a query-based method where astronomers can find groups of variable stars ranked by similarity. We also develop a fast similarity function specific for light curves, based on a novel data structure that allows scaling the search over the entire data set of unlabelled objects. Experiments show that our unsupervised model achieves high accuracy in the classification of different types of variable stars and that the proposed algorithm scales up to massive amounts of light curves.
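
    The query-based idea, ranking unlabelled light curves by similarity to a query curve, can be sketched simply. The paper's fast, specialised similarity function and index structure are not reproduced here; this stand-in resamples each curve onto a common normalised-time grid and uses Euclidean distance:

```python
import numpy as np

def curve_distance(t1, m1, t2, m2, n_grid=64):
    """Crude light-curve similarity: resample both curves onto a common
    normalised-time grid and compare magnitudes (a stand-in for the
    paper's specialised similarity function)."""
    grid = np.linspace(0.0, 1.0, n_grid)
    f1 = np.interp(grid, (t1 - t1.min()) / np.ptp(t1), m1)
    f2 = np.interp(grid, (t2 - t2.min()) / np.ptp(t2), m2)
    return float(np.linalg.norm(f1 - f2) / np.sqrt(n_grid))

def query(target, catalogue, k=3):
    """Rank unlabelled curves by similarity to the query curve."""
    scored = sorted(catalogue, key=lambda c: curve_distance(*target, *c[1]))
    return [name for name, _ in scored[:k]]

# Toy catalogue: one flat curve and one near-duplicate of the query.
t = np.linspace(0.0, 10.0, 50)
catalogue = [("flat", (t, np.zeros(50))),
             ("sine_like", (t, np.sin(t) + 0.01))]
nearest = query((t, np.sin(t)), catalogue, k=1)
```

    A production version would replace the brute-force `sorted` scan with the paper's indexing structure so that queries scale to millions of unlabelled sources.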

  4. Equilibrium Conformations of Concentric-tube Continuum Robots

    PubMed Central

    Rucker, D. Caleb; Webster, Robert J.; Chirikjian, Gregory S.; Cowan, Noah J.

    2013-01-01

    Robots consisting of several concentric, preshaped, elastic tubes can work dexterously in narrow, constrained, and/or winding spaces, as are commonly found in minimally invasive surgery. Previous models of these “active cannulas” assume piecewise constant precurvature of component tubes and neglect torsion in curved sections of the device. In this paper we develop a new coordinate-free energy formulation that accounts for general preshaping of an arbitrary number of component tubes, and which explicitly includes both bending and torsion throughout the device. We show that previously reported models are special cases of our formulation, and then explore in detail the implications of torsional flexibility for the special case of two tubes. Experiments demonstrate that this framework is more descriptive of physical prototype behavior than previous models; it reduces model prediction error by 82% over the calibrated bending-only model, and 17% over the calibrated transmissional torsion model in a set of experiments. PMID:25125773

  5. Generating a taxonomy of spatially cued attention for visual discrimination: Effects of judgment precision and set size on attention

    PubMed Central

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-01-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy. PMID:24939234

  6. Generating a taxonomy of spatially cued attention for visual discrimination: effects of judgment precision and set size on attention.

    PubMed

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-11-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy.

  7. Quantitative reconstruction of cross-sectional dimensions and hydrological parameters of gravelly fluvial channels developed in a forearc basin setting under a temperate climatic condition, central Japan

    NASA Astrophysics Data System (ADS)

    Shibata, Kenichiro; Adhiperdana, Billy G.; Ito, Makoto

    2018-01-01

    Reconstructions of the dimensions and hydrological features of ancient fluvial channels, such as bankfull depth, bankfull width, and water discharges, have used empirical equations developed from compiled data-sets, mainly from modern meandering rivers, in various tectonic and climatic settings. However, the application of the proposed empirical equations to an ancient fluvial succession should be carefully examined with respect to the tectonic and climatic settings of the objective deposits. In this study, we developed empirical relationships among the mean bankfull channel depth, bankfull channel depth, drainage area, bankfull channel width, mean discharge, and bankfull discharge using data from 24 observation sites of modern gravelly rivers in the Kanto region, central Japan. Some of the equations among these parameters are different from those proposed by previous studies. The discrepancies are considered to reflect tectonic and climatic settings of the present river systems, which are characterized by relatively steeper valley slope, active supply of volcaniclastic sediments, and seasonal precipitation in the Kanto region. The empirical relationships derived from the present study can be applied to modern and ancient gravelly fluvial channels with multiple and alternate bars, developed in convergent margin settings under a temperate climatic condition. The developed empirical equations were applied to a transgressive gravelly fluvial succession of the Paleogene Iwaki Formation, Northeast Japan as a case study. Stratigraphic thicknesses of bar deposits were used for estimation of the bankfull channel depth. In addition, some other geomorphological and hydrological parameters were calculated using the empirical equations developed by the present study. 
The results indicate that the Iwaki Formation fluvial deposits were formed by a fluvial system that was represented by the dimensions and discharges of channels similar to those of the middle to lower reaches of the modern Kuji River, northern Kanto region. In addition, no distinct temporal changes in paleochannel dimensions and discharges were observed in an overall transgressive Iwaki Formation fluvial system. This implies that a rise in relative sea level did not affect the paleochannel dimensions within a sequence stratigraphic framework.
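
    Empirical hydraulic-geometry relationships of the kind developed in this study are conventionally power laws fitted in log-log space. The sketch below uses illustrative numbers, not the study's 24-site data set, to show the fitting step:

```python
import numpy as np

# Hypothetical observations: bankfull discharge Qb (m^3/s) and bankfull
# width W (m) at a handful of gravelly river sites (illustrative values).
Qb = np.array([12.0, 35.0, 80.0, 150.0, 420.0, 900.0])
W = np.array([9.0, 16.0, 26.0, 37.0, 66.0, 100.0])

# A power-law relation W = a * Qb**b becomes linear after a log transform:
# log W = log a + b log Qb, so ordinary least squares recovers (a, b).
b, log_a = np.polyfit(np.log(Qb), np.log(W), 1)
a = np.exp(log_a)

def predict_width(q):
    """Predicted bankfull width (m) for a bankfull discharge q (m^3/s)."""
    return a * q ** b
```

    The study's point is precisely that the fitted (a, b) pair differs between tectonic and climatic settings, so coefficients calibrated on temperate meandering rivers should not be transferred uncritically to forearc-basin gravelly systems.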

  8. Predicting the chance of vaginal delivery after one cesarean section: validation and elaboration of a published prediction model.

    PubMed

    Fagerberg, Marie C; Maršál, Karel; Källén, Karin

    2015-05-01

    We aimed to validate a widely used US prediction model for vaginal birth after cesarean (Grobman et al. [8]) and modify it to suit Swedish conditions. Women having experienced one cesarean section and at least one subsequent delivery (n=49,472) in the Swedish Medical Birth Registry 1992-2011 were randomly divided into two data sets. In the development data set, variables associated with successful trial of labor were identified using multiple logistic regression. The predictive ability of the estimates previously published by Grobman et al., and of our modified and new estimates, respectively, was then evaluated using the validation data set. The accuracy of the models for prediction of vaginal birth after cesarean was measured by the area under the receiver operating characteristic curve. For maternal age, body mass index, prior vaginal delivery, and prior labor arrest, the odds ratio estimates for vaginal birth after cesarean were similar to those previously published. The prediction accuracy increased when information on the indication for the previous cesarean section was added (the area under the receiver operating characteristic curve increased from 0.69 to 0.71), and increased further when maternal height and delivery unit cesarean section rates were included (area under the curve=0.74). The correlation between the individual predicted vaginal birth after cesarean probability and the observed trial of labor success rate was high in all of the respective predicted probability deciles. Customization of prediction models for vaginal birth after cesarean is of considerable value. Choosing relevant indicators for a Swedish setting made it possible to achieve excellent prediction accuracy for success in trial of labor after cesarean.
During the delicate process of counseling about preferred delivery mode after one cesarean section, considering the results of our study may facilitate the choice between a trial of labor or an elective repeat cesarean section. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
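
    The area under the receiver operating characteristic curve reported above has an equivalent probabilistic form, the Mann-Whitney statistic: the probability that a randomly chosen success receives a higher predicted probability than a randomly chosen failure. A sketch with hypothetical predictions and outcomes (not the registry data):

```python
import numpy as np

def roc_auc(y_true, scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random positive outscores a random negative
    (ties count half)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    diff = pos[:, None] - neg[None, :]
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

# Hypothetical predicted VBAC probabilities and observed outcomes
# (1 = successful vaginal birth after cesarean).
outcome = [1, 1, 1, 0, 0, 1, 0, 0]
p_hat = [0.85, 0.70, 0.60, 0.65, 0.40, 0.75, 0.30, 0.50]
auc = roc_auc(outcome, p_hat)
```

    On this toy sample, 15 of the 16 success/failure pairs are correctly ordered, so the AUC is 0.9375; the study's 0.69 to 0.74 range corresponds to much weaker, but clinically useful, separation.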

  9. Computational correction of copy number effect improves specificity of CRISPR-Cas9 essentiality screens in cancer cells.

    PubMed

    Meyers, Robin M; Bryan, Jordan G; McFarland, James M; Weir, Barbara A; Sizemore, Ann E; Xu, Han; Dharia, Neekesh V; Montgomery, Phillip G; Cowley, Glenn S; Pantel, Sasha; Goodale, Amy; Lee, Yenarae; Ali, Levi D; Jiang, Guozhi; Lubonja, Rakela; Harrington, William F; Strickland, Matthew; Wu, Ting; Hawes, Derek C; Zhivich, Victor A; Wyatt, Meghan R; Kalani, Zohra; Chang, Jaime J; Okamoto, Michael; Stegmaier, Kimberly; Golub, Todd R; Boehm, Jesse S; Vazquez, Francisca; Root, David E; Hahn, William C; Tsherniak, Aviad

    2017-12-01

    The CRISPR-Cas9 system has revolutionized gene editing both at single genes and in multiplexed loss-of-function screens, thus enabling precise genome-scale identification of genes essential for proliferation and survival of cancer cells. However, previous studies have reported that a gene-independent antiproliferative effect of Cas9-mediated DNA cleavage confounds such measurement of genetic dependency, thereby leading to false-positive results in copy number-amplified regions. We developed CERES, a computational method to estimate gene-dependency levels from CRISPR-Cas9 essentiality screens while accounting for the copy number-specific effect. In our efforts to define a cancer dependency map, we performed genome-scale CRISPR-Cas9 essentiality screens across 342 cancer cell lines and applied CERES to this data set. We found that CERES decreased false-positive results and estimated sgRNA activity for both this data set and previously published screens performed with different sgRNA libraries. We further demonstrate the utility of this collection of screens, after CERES correction, for identifying cancer-type-specific vulnerabilities.
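
    CERES itself jointly infers gene effects, copy-number effects, and sgRNA activities; the core intuition, though, is removing the component of guide depletion explained by copy number. The following is a deliberately simplified toy version (plain linear regression on synthetic data, not the CERES model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: per-gene copy number and observed CRISPR "depletion" scores.
# The cutting-toxicity artifact makes depletion scale with copy number
# even for genes with no true fitness effect.
copy_number = rng.uniform(1.0, 8.0, size=200)
true_effect = np.zeros(200)
true_effect[:10] = -1.5  # a few genuinely essential genes
observed = true_effect - 0.2 * copy_number + rng.normal(0, 0.05, 200)

# Naive correction: regress observed scores on copy number and keep the
# residual as the copy-number-adjusted dependency estimate.
slope, intercept = np.polyfit(copy_number, observed, 1)
corrected = observed - (slope * copy_number + intercept)
```

    After the correction, the genuinely essential genes separate from the copy-number artifact; CERES additionally models nonlinear cutting effects and per-guide efficacy, which this sketch omits.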

  10. Intestinal mucositis: mechanisms and management.

    PubMed

    Keefe, Dorothy M

    2007-07-01

    To describe the advances in the rapidly evolving field of intestinal (or alimentary) mucositis during the past year. Major advances have been made in both the clinical and preclinical setting, with the publication of a suite of articles regarding the pathobiology and management of mucositis, as well as several articles on important basic research in the area. The mechanism of mucositis development is now understood to be much more complex than previously thought, with an interplay of host and drug factors leading to overt damage, and variation in manifestation of that damage depending on the specific region of the gut. The MASCC/ISOO management guidelines for mucositis have been updated: a recommendation for the use of palifermin in the hematology transplant setting has been added, and a couple of previous recommendations have been revoked. This marks an important milestone in mucositis, as it is the first time a drug has been available that substantially reduces the occurrence and severity of mucositis. There is still much to be done to abolish the severe toxicity of chemotherapy and radiotherapy; however, progress is accelerating, and new targeted drugs are becoming available.

  11. Factor analysis in optimization of formulation of high content uniformity tablets containing low dose active substance.

    PubMed

    Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David

    2017-11-15

    Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluation of tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The obtained results prove the suitability of factor analysis to optimize the composition with respect to batches manufactured previously, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
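
    The process capability index mentioned above has a simple closed form: Cpk = min(USL - mean, mean - LSL) / (3 sigma), the distance from the process mean to the nearer specification limit in units of three standard deviations. A sketch with hypothetical batch statistics and EP-style 85-115% content limits:

```python
def cpk(mean: float, std: float, lsl: float, usl: float) -> float:
    """Process capability index: distance from the mean to the nearer
    specification limit, in units of three standard deviations."""
    return min(usl - mean, mean - lsl) / (3.0 * std)

# Hypothetical content-uniformity statistics for a low-dose tablet batch,
# expressed as percent of label claim (illustrative values only).
mean_content = 99.0
std_content = 2.0
index = cpk(mean_content, std_content, lsl=85.0, usl=115.0)
```

    An index above about 1.33 is commonly taken to indicate a capable process; the hypothetical batch here scores well above that because its mean sits near the centre of wide limits.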

  12. Urinary bladder segmentation in CT urography using deep-learning convolutional neural network and level sets

    PubMed Central

    Cha, Kenny H.; Hadjiiski, Lubomir; Samala, Ravi K.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.

    2016-01-01

    Purpose: The authors are developing a computerized system for bladder segmentation in CT urography (CTU) as a critical component for computer-aided detection of bladder cancer. Methods: A deep-learning convolutional neural network (DL-CNN) was trained to distinguish between the inside and the outside of the bladder using 160 000 regions of interest (ROI) from CTU images. The trained DL-CNN was used to estimate the likelihood of an ROI being inside the bladder for ROIs centered at each voxel in a CTU case, resulting in a likelihood map. Thresholding and hole-filling were applied to the map to generate the initial contour for the bladder, which was then refined by 3D and 2D level sets. The segmentation performance was evaluated using 173 cases: 81 cases in the training set (42 lesions, 21 wall thickenings, and 18 normal bladders) and 92 cases in the test set (43 lesions, 36 wall thickenings, and 13 normal bladders). The computerized segmentation accuracy using the DL likelihood map was compared to that using a likelihood map generated by Haar features and a random forest classifier, and that using our previous conjoint level set analysis and segmentation system (CLASS) without using a likelihood map. All methods were evaluated relative to the 3D hand-segmented reference contours. Results: With DL-CNN-based likelihood map and level sets, the average volume intersection ratio, average percent volume error, average absolute volume error, average minimum distance, and the Jaccard index for the test set were 81.9% ± 12.1%, 10.2% ± 16.2%, 14.0% ± 13.0%, 3.6 ± 2.0 mm, and 76.2% ± 11.8%, respectively. With the Haar-feature-based likelihood map and level sets, the corresponding values were 74.3% ± 12.7%, 13.0% ± 22.3%, 20.5% ± 15.7%, 5.7 ± 2.6 mm, and 66.7% ± 12.6%, respectively. With our previous CLASS with local contour refinement (LCR) method, the corresponding values were 78.0% ± 14.7%, 16.5% ± 16.8%, 18.2% ± 15.0%, 3.8 ± 2.3 mm, and 73.9% ± 13.5%, respectively. 
Conclusions: The authors demonstrated that the DL-CNN can overcome the strong boundary between two regions that have large difference in gray levels and provides a seamless mask to guide level set segmentation, which has been a problem for many gradient-based segmentation methods. Compared to our previous CLASS with LCR method, which required two user inputs to initialize the segmentation, DL-CNN with level sets achieved better segmentation performance while using a single user input. Compared to the Haar-feature-based likelihood map, the DL-CNN-based likelihood map could guide the level sets to achieve better segmentation. The results demonstrate the feasibility of our new approach of using DL-CNN in combination with level sets for segmentation of the bladder. PMID:27036584
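
    The overlap metrics reported above (volume intersection ratio, percent volume error, Jaccard index) are simple functions of two binary masks. The sketch below uses a tiny synthetic slice; sign and normalisation conventions for percent volume error vary between papers, so the one used here is an assumption:

```python
import numpy as np

def volume_metrics(seg: np.ndarray, ref: np.ndarray):
    """Overlap metrics between a binary segmentation and a reference mask."""
    seg = seg.astype(bool)
    ref = ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    union = np.logical_or(seg, ref).sum()
    return {
        "volume_intersection_ratio": inter / ref.sum(),
        # Assumed convention: signed error relative to the reference volume.
        "percent_volume_error": 100.0 * (ref.sum() - seg.sum()) / ref.sum(),
        "jaccard": inter / union,
    }

# Tiny synthetic example: a 4x4 "slice" with partially overlapping masks.
ref = np.zeros((4, 4), dtype=bool)
ref[1:3, 1:3] = True        # reference: 4 voxels
seg = np.zeros((4, 4), dtype=bool)
seg[1:3, 1:4] = True        # segmentation: 6 voxels, 4 overlapping
m = volume_metrics(seg, ref)
```

    In the study these metrics are computed on 3D voxel masks against hand-segmented reference contours; the formulas are identical, only the arrays are larger.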

  13. Development of a Human Brain Diffusion Tensor Template

    PubMed Central

    Peng, Huiling; Orlichenko, Anton; Dawe, Robert J.; Agam, Gady; Zhang, Shengwei; Arfanakis, Konstantinos

    2009-01-01

    The development of a brain template for diffusion tensor imaging (DTI) is crucial for comparisons of neuronal structural integrity and brain connectivity across populations, as well as for the development of a white matter atlas. Previous efforts to produce a DTI brain template have been compromised by factors related to image quality, the effectiveness of the image registration approach, the appropriateness of subject inclusion criteria, and the completeness and accuracy of the information summarized in the final template. The purpose of this work was to develop a DTI human brain template using techniques that address the shortcomings of previous efforts. Therefore, data containing minimal artifacts were first obtained on 67 healthy human subjects selected from an age-group with relatively similar diffusion characteristics (20–40 years of age), using an appropriate DTI acquisition protocol. Non-linear image registration based on mean diffusion-weighted and fractional anisotropy images was employed. DTI brain templates containing median and mean tensors were produced in ICBM-152 space and made publicly available. The resulting set of DTI templates is characterized by higher image sharpness, provides the ability to distinguish smaller white matter fiber structures, and contains fewer image artifacts than previously developed templates; to our knowledge, it is one of only two templates produced based on a relatively large number of subjects. Furthermore, median tensors were shown to better preserve the diffusion characteristics at the group level than mean tensors. Finally, white matter fiber tractography was applied on the template and several fiber-bundles were traced. PMID:19341801

  14. Development of a human brain diffusion tensor template.

    PubMed

    Peng, Huiling; Orlichenko, Anton; Dawe, Robert J; Agam, Gady; Zhang, Shengwei; Arfanakis, Konstantinos

    2009-07-15

    The development of a brain template for diffusion tensor imaging (DTI) is crucial for comparisons of neuronal structural integrity and brain connectivity across populations, as well as for the development of a white matter atlas. Previous efforts to produce a DTI brain template have been compromised by factors related to image quality, the effectiveness of the image registration approach, the appropriateness of subject inclusion criteria, and the completeness and accuracy of the information summarized in the final template. The purpose of this work was to develop a DTI human brain template using techniques that address the shortcomings of previous efforts. Therefore, data containing minimal artifacts were first obtained on 67 healthy human subjects selected from an age-group with relatively similar diffusion characteristics (20-40 years of age), using an appropriate DTI acquisition protocol. Non-linear image registration based on mean diffusion-weighted and fractional anisotropy images was employed. DTI brain templates containing median and mean tensors were produced in ICBM-152 space and made publicly available. The resulting set of DTI templates is characterized by higher image sharpness, provides the ability to distinguish smaller white matter fiber structures, and contains fewer image artifacts than previously developed templates; to our knowledge, it is one of only two templates produced based on a relatively large number of subjects. Furthermore, median tensors were shown to better preserve the diffusion characteristics at the group level than mean tensors. Finally, white matter fiber tractography was applied on the template and several fiber-bundles were traced.
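
    The fractional anisotropy images used here for registration are computed voxel-wise from the eigenvalues of the diffusion tensor: FA = sqrt(3/2 · Σ(λi − λ̄)² / Σλi²). A sketch on two illustrative tensors (isotropic versus fibre-like diffusion):

```python
import numpy as np

def fractional_anisotropy(tensor: np.ndarray) -> float:
    """FA computed from the eigenvalues of a 3x3 diffusion tensor."""
    lam = np.linalg.eigvalsh(tensor)          # three eigenvalues
    md = lam.mean()                           # mean diffusivity
    num = ((lam - md) ** 2).sum()
    den = (lam ** 2).sum()
    return float(np.sqrt(1.5 * num / den))

# Illustrative tensors (units of mm^2/s, typical magnitudes for brain tissue):
isotropic = np.diag([0.7e-3, 0.7e-3, 0.7e-3])   # e.g. gray matter / CSF-like
fibre_like = np.diag([1.7e-3, 0.3e-3, 0.3e-3])  # strongly directional, WM-like
```

    FA ranges from 0 (perfectly isotropic diffusion) to 1 (diffusion along a single axis), which is why FA maps highlight white matter fiber structure well enough to drive registration.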

  15. DArT whole genome profiling provides insights on the evolution and taxonomy of edible Banana (Musa spp.)

    PubMed Central

    Sardos, J.; Perrier, X.; Doležel, J.; Hřibová, E.; Christelová, P.; Van den houwe, I.; Kilian, A.; Roux, N.

    2016-01-01

    Background and Aims Dessert and cooking bananas are vegetatively propagated crops of great importance for both the subsistence and the livelihood of people in developing countries. A wide diversity of diploid and triploid cultivars including AA, AB, AS, AT, AAA, AAB, ABB, AAS and AAT genomic constitutions exists. Within each of these genome groups, cultivars are classified into subgroups that are reported to correspond to varieties clonally derived from each other after a single sexual event. The number of those founding events at the basis of the diversity of bananas is a matter of debate. Methods We analysed a large panel of 575 accessions (94 wild relatives and 481 cultivated accessions) belonging to the section Musa with a set of 498 DArT markers previously developed. Key Results DArT proved successful and accurate in describing Musa diversity, helped resolve cultivated banana genome constitution and taxonomy, and highlighted discrepancies in the acknowledged classification of some accessions. This study also argues for at least two centres of domestication corresponding to South-East Asia and New Guinea, respectively. Banana domestication in New Guinea probably followed different schemes from those previously reported, in which hybridization underpins the emergence of edible bananas. In addition, our results suggest that not all wild ancestors of bananas are known, especially in M. acuminata subspecies. We also estimate the extent of the two consecutive bottlenecks in edible bananas by evaluating the number of sexual founding events underlying our sets of edible diploids and triploids, respectively. Conclusions The attribution of clone identity to each sample of the sets allowed the detection of subgroups represented by several sets of clones. Although morphological characterization of some of the accessions is needed to correct potentially erroneous classifications, some of the subgroups seem polyclonal. PMID:27590334
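
    Clone identity in a dominant, presence/absence marker system such as DArT can be approximated with a simple matching score over marker profiles: identical (or near-identical) profiles flag putative clones. A toy sketch with 10 hypothetical markers (the study used 498), using illustrative cultivar names:

```python
import numpy as np

def simple_matching(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of markers scored identically in two accessions."""
    return float(np.mean(a == b))

# Hypothetical presence/absence (1/0) DArT profiles for three accessions.
acc = {
    "cv_A":  np.array([1, 0, 1, 1, 0, 0, 1, 1, 0, 1]),
    "cv_A2": np.array([1, 0, 1, 1, 0, 0, 1, 1, 0, 1]),  # putative clone of cv_A
    "cv_B":  np.array([0, 1, 1, 0, 0, 1, 1, 0, 1, 0]),
}
same = simple_matching(acc["cv_A"], acc["cv_A2"])
diff = simple_matching(acc["cv_A"], acc["cv_B"])
```

    In practice a small tolerance is needed because scoring errors and somatic mutations make true clones slightly non-identical, which is one reason morphological checks are still required before declaring a subgroup polyclonal.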

  16. DArT whole genome profiling provides insights on the evolution and taxonomy of edible Banana (Musa spp.).

    PubMed

    Sardos, J; Perrier, X; Doležel, J; Hřibová, E; Christelová, P; Van den Houwe, I; Kilian, A; Roux, N

    2016-12-01

    Dessert and cooking bananas are vegetatively propagated crops of great importance for both the subsistence and the livelihood of people in developing countries. A wide diversity of diploid and triploid cultivars including AA, AB, AS, AT, AAA, AAB, ABB, AAS and AAT genomic constitutions exists. Within each of these genome groups, cultivars are classified into subgroups that are reported to correspond to varieties clonally derived from each other after a single sexual event. The number of those founding events at the basis of the diversity of bananas is a matter of debate. We analysed a large panel of 575 accessions (94 wild relatives and 481 cultivated accessions) belonging to the section Musa, with a set of 498 previously developed DArT markers. DArT proved successful and accurate in describing Musa diversity, helped resolve the genome constitution and taxonomy of cultivated bananas, and highlighted discrepancies in the acknowledged classification of some accessions. This study also argues for at least two centres of domestication, corresponding to South-East Asia and New Guinea, respectively. Banana domestication in New Guinea probably followed different schemes than those previously reported, in which hybridization underpins the emergence of edible bananas. In addition, our results suggest that not all wild ancestors of bananas are known, especially among M. acuminata subspecies. We also estimate the extent of the two consecutive bottlenecks in edible bananas by evaluating the number of sexual founding events underlying our sets of edible diploids and triploids, respectively. The attribution of a clone identity to each sample allowed the detection of subgroups represented by several sets of clones. Although morphological characterization of some of the accessions is needed to correct potentially erroneous classifications, some of the subgroups seem polyclonal. © The Author 2016. 
Published by Oxford University Press on behalf of the Annals of Botany Company.

  17. Two-Photon Excitation, Fluorescence Microscopy, and Quantitative Measurement of Two-Photon Absorption Cross Sections

    NASA Astrophysics Data System (ADS)

    DeArmond, Fredrick Michael

    As optical microscopy techniques continue to improve, most notably the development of super-resolution optical microscopy, which garnered the Nobel Prize in Chemistry in 2014, renewed emphasis has been placed on the development and use of fluorescence microscopy techniques. Of particular note is a renewed interest in multiphoton excitation due to a number of inherent properties of the technique, including simplified optical filtering, increased sample penetration, and inherently confocal operation. With this renewed interest in multiphoton fluorescence microscopy comes an increased demand for robust non-linear fluorescent markers, and characterization of the associated tool set. These factors led to an experimental setup allowing a systematized approach to identifying and characterizing properties of fluorescent probes, in the hope that the tool set will provide researchers with additional information to guide their efforts in developing novel fluorophores suitable for use in advanced optical microscopy techniques, as well as in identifying trends for their synthesis. Hardware was set up around a previously developed software control system. Three experimental tool sets were set up, characterized, and applied over the course of this work: a scanning multiphoton fluorescence microscope with single-molecule sensitivity, an interferometric autocorrelator for precise determination of the bandwidth and pulse width of the ultrafast titanium-sapphire excitation source, and a simplified fluorescence microscope for the measurement of two-photon absorption cross sections. Resulting values for two-photon absorption cross sections and two-photon absorption action cross sections for two standardized fluorophores, four commercially available fluorophores, and ten novel fluorophores are presented, as well as absorption and emission spectra.

  18. View-invariant object recognition ability develops after discrimination, not mere exposure, at several viewing angles.

    PubMed

    Yamashita, Wakayo; Wang, Gang; Tanaka, Keiji

    2010-01-01

    One usually fails to recognize an unfamiliar object across changes in viewing angle when it has to be discriminated from similar distractor objects. Previous work has demonstrated that after long-term experience in discriminating among a set of objects seen from the same viewing angle, immediate recognition of the objects across 30-60 degrees changes in viewing angle becomes possible. The capability for view-invariant object recognition should develop during the within-viewing-angle discrimination, which includes two kinds of experience: seeing individual views and discriminating among the objects. The aim of the present study was to determine the relative contribution of each factor to the development of view-invariant object recognition capability. Monkeys were first extensively trained in a task that required view-invariant object recognition (Object task) with several sets of objects. The animals were then exposed to a new set of objects over 26 days in one of two preparatory tasks: one in which each object view was seen individually, and a second that required discrimination among the objects at each of four viewing angles. After the preparatory period, we measured the monkeys' ability to recognize the objects across changes in viewing angle, by introducing the object set to the Object task. Results indicated significant view-invariant recognition after the second but not first preparatory task. These results suggest that discrimination of objects from distractors at each of several viewing angles is required for the development of view-invariant recognition of the objects when the distractors are similar to the objects.

  19. Application of Artificial Neural Network to Optical Fluid Analyzer

    NASA Astrophysics Data System (ADS)

    Kimura, Makoto; Nishida, Katsuhiko

    1994-04-01

    A three-layer artificial neural network has been applied to the presentation of optical fluid analyzer (OFA) raw data, and the accuracy of oil fraction determination has been significantly improved compared to previous approaches. To apply the artificial neural network approach to solving a problem, the first step is training to determine the appropriate weight set for calculating the target values. This involves using a series of data sets (each comprising a set of input values and an associated set of output values that the artificial neural network is required to determine) to tune artificial neural network weighting parameters so that the output of the neural network for a given set of input values is as close as possible to the required output. The physical model used to generate the series of learning data sets was the effective flow stream model, developed for OFA data presentation. The effectiveness of the training was verified by reprocessing the same input data as were used to determine the weighting parameters and then comparing the results of the artificial neural network to the expected output values. The standard deviation between the expected and obtained values was approximately 10% (two sigma).
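
The training procedure the abstract describes, tuning a three-layer network's weights so that its outputs approach target values generated by a physical model, can be sketched in a few lines. This is a minimal illustrative backpropagation example in Python/NumPy; the input features, target function, and network size are invented stand-ins, not the actual OFA data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a training series: 3 input channels mapped to
# one target value (e.g. an oil fraction). Purely synthetic data.
X = rng.uniform(size=(200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2]).reshape(-1, 1)

# Three-layer network: input -> hidden (sigmoid) -> output (linear).
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # hidden activations
    return h, h @ W2 + b2                      # network prediction

lr = 0.5
losses = []
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                             # gradient of mean squared error
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the error and update the weight set (gradient descent).
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)            # sigmoid derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

After training, the abstract's verification step corresponds to running `forward` on the training inputs again and comparing predictions against the known targets.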

  20. An in vivo MRI Template Set for Morphometry, Tissue Segmentation, and fMRI Localization in Rats

    PubMed Central

    Valdés-Hernández, Pedro Antonio; Sumiyoshi, Akira; Nonaka, Hiroi; Haga, Risa; Aubert-Vásquez, Eduardo; Ogawa, Takeshi; Iturria-Medina, Yasser; Riera, Jorge J.; Kawashima, Ryuta

    2011-01-01

    Over the last decade, several papers have focused on the construction of highly detailed mouse high field magnetic resonance image (MRI) templates via non-linear registration to unbiased reference spaces, allowing for a variety of neuroimaging applications such as robust morphometric analyses. However, work in rats has only provided medium field MRI averages based on linear registration to biased spaces, with the sole purpose of approximate functional MRI (fMRI) localization. This precludes any morphometric analysis, despite the need to explore in detail the neuroanatomical substrates of disease in recently developed rat models. In this paper we present a new in vivo rat T2 MRI template set, comprising average images of both intensity and shape, obtained via non-linear registration. Also, unlike previous rat template sets, we include white and gray matter probabilistic segmentations, expanding its use to those applications demanding prior-based tissue segmentation, e.g., statistical parametric mapping (SPM) voxel-based morphometry. We also provide a preliminary digitalization of the latest Paxinos and Watson atlas for anatomical and functional interpretations within the cerebral cortex. We confirmed that, as with previous templates, forepaw and hindpaw fMRI activations can be correctly localized in the expected atlas structure. To exemplify the use of our new MRI template set, we report the volumes of brain tissues and cortical structures and probe their relationships with ontogenetic development. Other in vivo applications in the near future can be tensor-, deformation-, or voxel-based morphometry, morphological connectivity, and diffusion tensor-based anatomical connectivity. Our template set, freely available through the SPM extension website, could be an important tool for future longitudinal and/or functional extensive preclinical studies. PMID:22275894

  1. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.

  2. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2013-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.
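
The data-parallel pattern described in these two records, the same computation applied independently to every data element, can be illustrated without GPU hardware. The sketch below (Python/NumPy, with a made-up per-pixel compositing function) shows an explicit per-pixel loop and the equivalent single vectorized map over the whole image; on a GPU, each pixel would simply be assigned its own thread.

```python
import numpy as np

# A toy thermographic frame stack: each output pixel depends only on the
# same pixel across frames, so every pixel can be computed independently.
frames = np.random.default_rng(1).normal(size=(64, 128, 128))  # (time, y, x)

def composite_pixel(series):
    # Per-pixel computation: peak contrast over time (illustrative choice).
    return series.max() - series.mean()

# Serial reference: an explicit loop over the independent pixels...
out_loop = np.empty((128, 128))
for i in range(128):
    for j in range(128):
        out_loop[i, j] = composite_pixel(frames[:, i, j])

# ...and the same map expressed as one vectorized (parallelizable) operation.
out_vec = frames.max(axis=0) - frames.mean(axis=0)

assert np.allclose(out_loop, out_vec)
```

Because no pixel's result depends on any other pixel, the loop body can be dispatched to thousands of GPU threads at once, which is exactly the property both abstracts call data parallelism.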

  3. Low- and high-spin iron (II) complexes studied by effective crystal field method combined with molecular mechanics.

    PubMed

    Darkhovskii, M B; Pletnev, I V; Tchougréeff, A L

    2003-11-15

    A computational method targeted to Werner-type complexes is developed on the basis of the quantum mechanical effective Hamiltonian crystal field (EHCF) methodology (previously proposed for describing the electronic structure of transition metal complexes) combined with the Gillespie-Kepert version of molecular mechanics (MM). It is a special version of the hybrid quantum/MM approach. The MM part is responsible for representing the whole molecule, including ligand atoms and the metal ion coordination sphere, but leaving out the effects of the d-shell. The quantum mechanical EHCF part is limited to the metal ion d-shell. The method reproduces with reasonable accuracy the geometry and spin states of Fe(II) complexes with monodentate and polydentate aromatic ligands with nitrogen donor atoms. In this setting a single set of MM parameters is shown to be sufficient for handling all spin states of the complexes under consideration. Copyright 2003 Wiley Periodicals, Inc.

  4. Computer-aided detection of bladder masses in CT urography (CTU)

    NASA Astrophysics Data System (ADS)

    Cha, Kenny H.; Hadjiiski, Lubomir M.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon; Samala, Ravi K.

    2017-03-01

    We are developing a computer-aided detection system for bladder cancer in CT urography (CTU). We have previously developed methods for detection of bladder masses within the contrast-enhanced and the non-contrastenhanced regions of the bladder individually. In this study, we investigated methods for detection of bladder masses within the entire bladder. The bladder was segmented using our method that combined deep-learning convolutional neural network with level sets. The non-contrast-enhanced region was separated from the contrast-enhanced region with a maximum-intensity-projection-based method. The non-contrast region was smoothed and gray level threshold was applied to the contrast and non-contrast regions separately to extract the bladder wall and potential masses. The bladder wall was transformed into a straightened thickness profile, which was analyzed to identify lesion candidates in a prescreening step. The candidates were mapped back to the 3D CT volume and segmented using our auto-initialized cascaded level set (AI-CALS) segmentation method. Twenty-seven morphological features were extracted for each candidate. A data set of 57 patients with 71 biopsy-proven bladder lesions was used, which was split into independent training and test sets: 42 training cases with 52 lesions, and 15 test cases with 19 lesions. Using the training set, feature selection was performed and a linear discriminant (LDA) classifier was designed to merge the selected features for classification of bladder lesions and false positives. The trained classifier was evaluated with the test set. FROC analysis showed that the system achieved a sensitivity of 86.5% at 3.3 FPs/case for the training set, and 84.2% at 3.7 FPs/case for the test set.
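
The final classification step described above, merging selected candidate features with a linear discriminant to separate lesions from false positives, can be sketched with Fisher's linear discriminant on synthetic data. The feature distributions, sample counts, and operating point below are invented for illustration; this is not the authors' trained classifier or their data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical candidate features (stand-ins for morphological descriptors):
# true lesions vs false positives, drawn from shifted Gaussians.
n_feat = 5
lesions = rng.normal(loc=1.0, size=(60, n_feat))
false_pos = rng.normal(loc=0.0, size=(200, n_feat))

# Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0) merges the feature
# vector into a single scalar score, the role LDA plays in the pipeline.
mu1, mu0 = lesions.mean(axis=0), false_pos.mean(axis=0)
Sw = np.cov(lesions, rowvar=False) + np.cov(false_pos, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)

scores_l = lesions @ w       # scores for true lesions
scores_f = false_pos @ w     # scores for false positives

# One operating point of an FROC-style analysis: fix the tolerated false
# positive rate via a score threshold, then read off the sensitivity.
thresh = np.quantile(scores_f, 0.95)          # let 5% of FPs through
sensitivity = float(np.mean(scores_l > thresh))
```

Sweeping the threshold instead of fixing it would trace out the full sensitivity versus false-positives-per-case curve that the abstract reports.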

  5. Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System

    PubMed Central

    Park, Hyeoun-Ae; Jeon, Eunjoo; Chung, Eunja

    2012-01-01

    Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed clinical models and clinical practice guidelines; and 5) evaluating the prototype system. Results We first generated 799 nursing statements describing nursing assessments, diagnoses, interventions, and outcomes using entities, attributes, and value sets of detailed clinical models for perinatal care which we developed in a previous study. We then extracted 506 recommendations from nine clinical practice guidelines and created sets of nursing statements to be used for nursing documentation by grouping nursing statements according to these recommendations. Finally, we developed and evaluated a prototype electronic nursing record system that can provide nurses with recommendations for nursing practice and sets of nursing statements based on the recommendations for guiding nursing documentation. Conclusions The prototype system was found to be sufficiently complete, relevant, useful, and applicable in terms of content, and easy to use and useful in terms of system user interface. This study has revealed the feasibility of developing such an ENR system. PMID:22844649

  6. Soybean fruit development and set at the node level under combined photoperiod and radiation conditions.

    PubMed

    Nico, Magalí; Mantese, Anita I; Miralles, Daniel J; Kantolic, Adriana G

    2016-01-01

    In soybean, long days during post-flowering increase seed number. This positive photoperiodic effect on seed number has been previously associated with increments in the amount of radiation accumulated during the crop cycle because long days extend the duration of the crop cycle. However, evidence of intra-nodal processes independent of the availability of assimilates suggests that photoperiodic effects at the node level might also contribute to pod set. This work aims to identify the main mechanisms responsible for the increase in pod number per node in response to long days; including the dynamics of flowering, pod development, growth and set at the node level. Long days increased pods per node on the main stems, by increasing pods on lateral racemes (usually dominated positions) at some main stem nodes. Long days lengthened the flowering period and thereby increased the number of opened flowers on lateral racemes. The flowering period was prolonged under long days because effective seed filling was delayed on primary racemes (dominant positions). Long days also delayed the development of flowers into pods with filling seeds, delaying the initiation of pod elongation without modifying pod elongation rate. The embryo development matched the external pod length irrespective of the pod's chronological age. These results suggest that long days during post-flowering enhance pod number per node through a relief of the competition between pods of different hierarchy within the node. The photoperiodic effect on the development of dominant pods, delaying their elongation and therefore postponing their active growth, extends flowering and allows pod set at positions that are usually dominated. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  7. Primary care physicians’ experiences with electronic medical records

    PubMed Central

    Ludwick, Dave; Manca, Donna; Doucette, John

    2010-01-01

    OBJECTIVE To understand how remuneration and care setting affect the implementation of electronic medical records (EMRs). DESIGN Semistructured interviews were used to elicit descriptions from community-based family physicians (paid on a fee-for-service basis) and from urban, hospital, and academic family physicians (remunerated via alternative payment models or sessional pay for activities pertaining to EMR implementation). SETTING Small suburban community and large urban-, hospital-, and academic-based family medicine clinics in Alberta. All participants were supported by a jurisdictional EMR certification funding mechanism. PARTICIPANTS Physicians who practised in 1 or a combination of the above settings and had experience implementing and using EMRs. METHODS Purposive and maximum variation sampling was used to obtain descriptive data from key informants through individually conducted semistructured interviews. The interview guide, which was developed from key findings of our previous literature review, was used in a previous study of community-based family physicians on this same topic. Field notes were analyzed to generate themes through a comparative immersion approach. MAIN FINDINGS Physicians in urban, hospital, and academic settings leverage professional working relationships to investigate EMRs, a resource not available to community physicians. Physicians in urban, hospital, and academic settings work in larger interdisciplinary teams with a greater need for interdisciplinary care coordination, EMR training, and technical support. These practices were able to support the cost of project management or technical support resources. These physicians followed a planned system rollout approach compared with community physicians who installed their systems quickly and required users to transition to the new system immediately. Electronic medical records did not increase, or decrease, patient throughput. 
Physicians developed ways of including patients in the note-taking process. CONCLUSION We studied physicians’ procurement approaches under various payment models. Our findings do not suggest that one remuneration approach supports EMR adoption any more than another. Rather, this study suggests that stronger physician professional networks used in information gathering, more complete training, and in-house technical support might be more influential than remuneration in facilitating the EMR adoption experience. PMID:20090083

  8. On the Relation of Setting and Early-Age Strength Development to Porosity and Hydration in Cement-Based Materials

    PubMed Central

    Lootens, Didier; Bentz, Dale P.

    2016-01-01

    Previous research has demonstrated a linear relationship between compressive strength (mortar cubes and concrete cylinders) and cumulative heat release normalized per unit volume of (mixing) water for a wide variety of cement-based mixtures at ages of 1 d and beyond. This paper utilizes concurrent ultrasonic reflection and calorimetry measurements to further explore this relationship from the time of specimen casting to 3 d. The ultrasonic measurements permit a continuous evaluation of thickening, setting, and strength development during this time period for comparison with the ongoing chemical reactions, as characterized by isothermal calorimetry measurements. Initially, the ultrasonic strength-heat release relation depends strongly on water-to-cement ratio, as well as admixture additions, with no universal behavior. Still, each individual strength-heat release curve is consistent with a percolation-based view of the cement setting process. However, beyond about 8 h for the systems investigated in the present study, the various strength-heat release curves merge towards a single relationship that broadly characterizes the development of strength as a function of heat released (fractional space filled), demonstrating that mortar and/or concrete strength at early ages can be effectively monitored using either ultrasonic or calorimetry measurements on small paste or mortar specimens. PMID:27046956

  9. Intermittent explosive disorder: development of integrated research criteria for Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition.

    PubMed

    Coccaro, Emil F

    2011-01-01

    This study was designed to develop a revised diagnostic criteria set for intermittent explosive disorder (IED) for consideration for inclusion in Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-V). This revised criteria set was developed by integrating previous research criteria with elements from the current DSM-IV set of diagnostic criteria. Evidence supporting the reliability and validity of IED-IR ("IED Integrated Criteria") in a new and well-characterized group of subjects with personality disorder is presented. Clinical, phenomenologic, and diagnostic data from 201 individuals with personality disorder were reviewed. All IED diagnoses were assigned using a best-estimate process (eg, kappa for IED-IR >0.85). In addition, subjects meeting IED-IR criteria had higher scores on dimensional measures of aggression and had lower global functioning scores than non-IED-IR subjects, even when related variables were controlled. The IED-IR criteria were more sensitive than the DSM-IV criteria in identifying subjects with significant impulsive-aggressive behavior, by a factor of 16. We conclude that the IED-IR criteria can be reliably applied and have sufficient validity to warrant consideration as DSM-V criteria for IED. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. A comprehensive custom panel design for routine hereditary cancer testing: preserving control, improving diagnostics and revealing a complex variation landscape.

    PubMed

    Castellanos, Elisabeth; Gel, Bernat; Rosas, Inma; Tornero, Eva; Santín, Sheila; Pluvinet, Raquel; Velasco, Juan; Sumoy, Lauro; Del Valle, Jesús; Perucho, Manuel; Blanco, Ignacio; Navarro, Matilde; Brunet, Joan; Pineda, Marta; Feliubadaló, Lidia; Capellá, Gabi; Lázaro, Conxi; Serra, Eduard

    2017-01-04

    We wanted to implement an NGS strategy to globally analyze hereditary cancer with diagnostic quality while retaining the same degree of understanding and control we had in pre-NGS strategies. To do this, we developed the I2HCP panel, a custom bait library covering 122 hereditary cancer genes. We improved bait design, tested different NGS platforms and created a clinically driven custom data analysis pipeline. The I2HCP panel was developed using a training set of hereditary colorectal cancer, hereditary breast and ovarian cancer and neurofibromatosis patients and reached an accuracy, analytical sensitivity and specificity greater than 99%, which was maintained in a validation set. I2HCP changed our diagnostic approach, involving clinicians and a genetic diagnostics team from panel design to reporting. The new strategy improved diagnostic sensitivity, solved uncertain clinical diagnoses and identified mutations in new genes. We assessed the genetic variation in the complete set of hereditary cancer genes, revealing a complex variation landscape that coexists with the disease-causing mutation. We developed, validated and implemented a custom NGS-based strategy for hereditary cancer diagnostics that improved our previous workflows. Additionally, the existence of a rich genetic variation in hereditary cancer genes favors the use of this panel to investigate their role in cancer risk.

  11. On the Relation of Setting and Early-Age Strength Development to Porosity and Hydration in Cement-Based Materials.

    PubMed

    Lootens, Didier; Bentz, Dale P

    2016-04-01

    Previous research has demonstrated a linear relationship between compressive strength (mortar cubes and concrete cylinders) and cumulative heat release normalized per unit volume of (mixing) water for a wide variety of cement-based mixtures at ages of 1 d and beyond. This paper utilizes concurrent ultrasonic reflection and calorimetry measurements to further explore this relationship from the time of specimen casting to 3 d. The ultrasonic measurements permit a continuous evaluation of thickening, setting, and strength development during this time period for comparison with the ongoing chemical reactions, as characterized by isothermal calorimetry measurements. Initially, the ultrasonic strength-heat release relation depends strongly on water-to-cement ratio, as well as admixture additions, with no universal behavior. Still, each individual strength-heat release curve is consistent with a percolation-based view of the cement setting process. However, beyond about 8 h for the systems investigated in the present study, the various strength-heat release curves merge towards a single relationship that broadly characterizes the development of strength as a function of heat released (fractional space filled), demonstrating that mortar and/or concrete strength at early ages can be effectively monitored using either ultrasonic or calorimetry measurements on small paste or mortar specimens.
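
The linear strength-heat release relationship this record builds on can be illustrated with an ordinary least-squares fit. The numbers below are fabricated for the sketch; only the form of the relation (strength roughly linear in cumulative heat release per unit volume of mixing water) comes from the abstract.

```python
import numpy as np

# Synthetic illustration: compressive strength (MPa) vs cumulative heat
# release per unit mixing water (values invented, not measurements).
heat = np.array([150., 200., 250., 300., 350., 400.])
strength = 0.12 * heat - 10.0 + np.array([0.4, -0.3, 0.2, -0.1, 0.3, -0.2])

# Ordinary least squares recovers the slope and intercept of the relation,
# which is how such a calibration would let small-specimen calorimetry
# stand in for direct strength measurements at early ages.
slope, intercept = np.polyfit(heat, strength, 1)
predicted = slope * heat + intercept
```

Once calibrated, the fitted line converts a measured cumulative heat release into an estimated early-age strength.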

  12. [The development of reagents set in the format of DNA-chip for genetic typing of strains of Vibrio cholerae].

    PubMed

    Pudova, E A; Markelov, M L; Dedkov, V G; Tchekanova, T A; Sadjin, A I; Kirdiyashkina, N P; Bekova, M V; Deviyatkin, A A

    2014-05-01

    The need for genetic diagnostic methods for cholera stems from the ongoing seventh cholera pandemic, the taxonomic variability of the Vibrio cholerae strains involved in it, and the constant risk of importation of the disease into the territory of the Russian Federation. Genetic diagnostic methods make it possible to characterize strains isolated from patients or from their environment in maximal detail within a comparatively short time. The article describes the development of a reagent set for genetic typing of cholera agents using a DNA chip. The DNA chip carries oligonucleotide probes that make it possible to differentiate V. cholerae strains by serogroup and biovar and to determine their pathogenicity. A single DNA chip can genotype up to 12 samples concurrently, with an analysis time of up to 5 hours excluding the DNA extraction step. In the course of this work, 23 cholera and non-cholera strains were analyzed; the DNA-chip typing results fully matched the previously known characteristics of the strains. These results support further development of the reagent set and its application in regional-level laboratories and reference centers.

  13. General form of a cooperative gradual maximal covering location problem

    NASA Astrophysics Data System (ADS)

    Bagherinejad, Jafar; Bashiri, Mahdi; Nikzad, Hamideh

    2018-07-01

    Cooperative and gradual covering are two new methods for developing covering location models. In this paper, a cooperative maximal covering location-allocation model is developed (CMCLAP). In addition, both cooperative and gradual covering concepts are applied to the maximal covering location simultaneously (CGMCLP). Then, we develop an integrated form of a cooperative gradual maximal covering location problem, which is called a general CGMCLP. By setting the model parameters, the proposed general model can easily be transformed into other existing models, facilitating general comparisons. The proposed models are developed without allocation for physical signals and with allocation for non-physical signals in discrete location space. Comparison of the previously introduced gradual maximal covering location problem (GMCLP) and cooperative maximal covering location problem (CMCLP) models with our proposed CGMCLP model in similar data sets shows that the proposed model can cover more demands and acts more efficiently. Sensitivity analyses are performed to show the effect of related parameters and the model's validity. Simulated annealing (SA) and a tabu search (TS) are proposed as solution algorithms for the developed models for large-sized instances. The results show that the proposed algorithms are efficient solution approaches, considering solution quality and running time.
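
The two covering concepts that the CGMCLP combines can be sketched numerically: gradual coverage decays with distance between a full-coverage radius and a zero-coverage radius, and cooperative coverage sums the partial signals from all facilities before applying a coverage threshold. The radii, threshold, weights, and coordinates below are arbitrary illustration values, not the paper's formulation or data.

```python
import numpy as np

def gradual_cov(d, r1=1.0, r2=3.0):
    # Gradual covering: full coverage within r1, none beyond r2,
    # linear decay in between (one common decay choice).
    return np.clip((r2 - d) / (r2 - r1), 0.0, 1.0)

def covered_demand(facilities, demands, weights, theta=0.8):
    # Cooperative rule: per-demand signals from all facilities are summed;
    # a demand point counts as covered once the total reaches theta.
    d = np.linalg.norm(demands[:, None, :] - facilities[None, :, :], axis=2)
    signal = gradual_cov(d).sum(axis=1)
    return float(weights[signal >= theta].sum())

demands = np.array([[0., 0.], [2., 0.], [4., 0.]])
weights = np.array([5., 3., 2.])

# Two candidate layouts with the same number of facilities: spread out
# vs clustered. The middle demand point is covered in the first layout
# only because two partial signals cooperate to clear the threshold.
a = covered_demand(np.array([[0., 0.], [4., 0.]]), demands, weights)
b = covered_demand(np.array([[0., 0.], [0.5, 0.]]), demands, weights)
```

Maximizing `covered_demand` over the choice of facility sites is the (toy) optimization the CGMCLP generalizes; the metaheuristics mentioned in the abstract search the space of layouts for large instances.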

  14. Neuroimaging Field Methods Using Functional Near Infrared Spectroscopy (NIRS) Neuroimaging to Study Global Child Development: Rural Sub-Saharan Africa.

    PubMed

    Jasińska, Kaja K; Guei, Sosthène

    2018-02-02

    Portable neuroimaging approaches provide new advances for studying brain function and brain development with previously inaccessible populations and in remote locations. This paper describes the application of field functional near-infrared spectroscopy (fNIRS) imaging to the study of child language, reading, and cognitive development in a rural village setting of Côte d'Ivoire. Innovation in methods and the development of culturally appropriate neuroimaging protocols allow a first look into brain development and children's learning outcomes in understudied environments. This paper demonstrates protocols for transporting and setting up a mobile laboratory, discusses considerations for field versus laboratory neuroimaging, and presents a guide for developing neuroimaging consent procedures and building meaningful long-term collaborations with local government and science partners. Portable neuroimaging methods can be used to study complex child development contexts, including the impact of significant poverty and adversity on brain development. The protocol presented here was developed for use in Côte d'Ivoire, the world's primary source of cocoa, where reports of child labor in the cocoa sector are common, yet little is known about the impact of child labor on brain development and learning. Field neuroimaging methods have the potential to yield new insights into such urgent issues and into children's development globally.

  15. The association of preterm birth and small birthweight for gestational age on childhood disability screening using the Ten Questions Plus tool in rural Sarlahi district, southern Nepal.

    PubMed

    Wu, L A; Katz, J; Mullany, L C; Khatry, S K; Darmstadt, G L; LeClerq, S C; Tielsch, J M

    2012-05-01

    The Ten Questions tool was developed in 1984 as a low-cost, simple screen for childhood disability and referral for diagnosis in low-resource settings, and its use in Nepal has not been previously evaluated. Preterm birth and intrauterine growth restriction are potential risk factors for child disability and loss of developmental potential, but few studies have examined this relationship in developing settings. Our aim was to examine the associations of small for gestational age and preterm birth as predictors of Ten Questions Plus positivity. The Ten Questions Plus questionnaire was administered to caregivers of 680 children between 2 and 5 years of age from August 2007 to March 2008 in rural Sarlahi, southern Nepal. Participants had previously been enrolled in a randomized trial of chlorhexidine cleansing at birth; at 1 month of age, children were then enrolled into a randomized 2 × 2 factorial trial of daily iron and zinc supplementation conducted between October 2001 and January 2006. No intervention was applied in the present study. The main outcome measure was a positive screen on the Ten Questions Plus tool, defined as a positive response to one or more questions. Of preterm children, 37 (33.6%) had a positive response to at least one question on the Ten Questions Plus and were considered at risk for disability; 170 term children (29.8%) were at risk for disability. The Ten Questions Plus tool can be used in this rural Nepali setting to identify children at increased risk for mental and physical disability, to be targeted for further examination. The prevalence of parent-reported disabilities is high in this population (almost one-third of children); children who are both preterm and small for gestational age are at increased risk for motor milestone delay, reported learning difficulty, and speech and behavioural problems. Intrauterine growth restriction may affect child development and result in disabilities later in childhood. © 2011 Blackwell Publishing Ltd.

  16. Estimated incidence rate and distribution of tumours in 4,653 cases of archival submissions derived from the Dutch golden retriever population

    PubMed Central

    2014-01-01

    Background A genetic predisposition for certain tumour types has been proven for some dog breeds, and some studies have suggested that this may also be true for the Golden retriever breed. The present study aimed to examine the possible existence of a tumour (type) predisposition in the Dutch population of Golden retrievers by evaluating annual estimated incidence rates compared to incidence rates from previous publications. A second aim was to evaluate whether the incidences of various tumours differed according to the diagnostic method chosen, either cytology or histology. Results Tumours submitted to Utrecht University during the period 1998–2004 and diagnosed either by cytology (n = 2,529) or histology (n = 2,124) were related to an average annual Dutch kennel club population of 29,304 Golden retrievers. Combining individual tumours from both the cytological and the histopathological data sets resulted in an annual estimated incidence rate of 2,242 per 100,000 dog-years at risk for tumour development in general. The most common cytological tumour diagnoses were ‘fat, possibly lipoma’ (35%), mast cell tumour (21%) and non-Hodgkin lymphoma (10%). The most commonly diagnosed tumours by histology were mast cell tumour (26%), soft tissue sarcomas (11%) and melanoma (8%). Both the cytological and histopathological data sets showed variation in patient age distribution, age of onset and incidence of various tumours. Conclusion Comparing our data with previous reports in non-breed-specified dog populations, the Golden retriever breed shows an increased risk for the development of tumours in general, as well as an increased risk for the development of specific tumour types, including the group of soft tissue sarcomas. Variations in age, location and incidence of various tumours were observed between the two data sets, indicating a selection bias related to the diagnostic procedure. PMID:24484635
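The dog-years-at-risk arithmetic behind such estimates can be sketched as follows. Note that tumours diagnosed by both cytology and histology are presumably counted once in the published figure, so this naive sum of the 4,653 submissions does not exactly reproduce the reported 2,242:

```python
def incidence_per_100k(cases, avg_annual_population, years):
    """Annual estimated incidence rate per 100,000 animal-years at risk:
    cases observed over the study period divided by population-time."""
    animal_years_at_risk = avg_annual_population * years
    return cases / animal_years_at_risk * 100_000

# 4,653 archival submissions, average Dutch kennel club population of
# 29,304 Golden retrievers, study period 1998-2004 (7 years)
rate = incidence_per_100k(4_653, 29_304, 7)
```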

  17. Top 10 metrics for life science software good practices.

    PubMed

    Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach in which experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices but also assess their feasibility for implementation, and we publish them here.

  18. Top 10 metrics for life science software good practices

    PubMed Central

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach in which experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices but also assess their feasibility for implementation, and we publish them here. PMID:27635232

  19. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept, since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable. Another widely used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare, and contrast the three methods, we apply them to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have thus been presented in this paper.
As method development of structural identifiability techniques for mixed-effects models has received very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was not previously possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Research in Persistent Simulation: Development of the Persistent ModSim Object-Oriented Programming Language

    DTIC Science & Technology

    1993-07-01

    version tree is formed that permits users to go back to any previous version. There are methods for traversing the version tree of a particular...workspace. Workspace objects are linked (or nested) hierarchically into a workspace tree . Applications can set the access privileges to parts of this...workspace tree to control access (and hence change). There must be a default global workspace. Workspace objects are then allocated within the context

  1. Generalization of Figure-Ground Segmentation from Binocular to Monocular Vision in an Embodied Biological Brain Model

    DTIC Science & Technology

    2011-08-01

    Intelligence (AGI). For example, it promises to unlock vast sets of training data, such as Google Images, which have previously been inaccessible to...development of this skill holds great promise for efforts, like Emer, that aim to create an Artificial General Intelligence (AGI). For example, it promises to...instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send

  2. Detailed Validation of the Bidirectional Effect in Various Case 1 and Case 2 Waters

    DTIC Science & Technology

    2012-03-26

    of the viewing direction, i.e., they assumed a completely diffuse BRDF. Previous efforts to model/understand the actual BRDF [4-10] have produced...places. Second, the MAG2002 BRDF tables were developed from a radiative transfer (RT) model that used scattering particle phase functions that...situ measurements from just 3 locations to validate their model; here we used a much larger data set across a wide variety of inherent optical

  3. Gunshot identification system by integration of open source consumer electronics

    NASA Astrophysics Data System (ADS)

    López R., Juan Manuel; Marulanda B., Jose Ignacio

    2014-05-01

    This work presents a prototype of a low-cost gunshot identification system that uses consumer electronics to detect gunshots and then classify them against a previously established database. Deploying this tool in urban areas establishes records that support forensic work, thereby improving law enforcement, including in developing countries. An analysis of its effectiveness is presented in comparison with theoretical results obtained from numerical simulations.

  4. Visual memory effects on intraoperator study design: determining a minimum time gap between case reviews to reduce recall bias.

    PubMed

    Campbell, W Scott; Talmon, Geoffrey A; Foster, Kirk W; Baker, John J; Smith, Lynette M; Hinrichs, Steven H

    2015-03-01

    The objective of this research was to determine test intervals between intraoperator case reviews to minimize the impact of recall. Three pathologists were presented with a group of 120 slides and subsequently challenged with a study set of 120 slides after 2-week and 4-week intervals. The challenge set consisted of 60 slides seen during the initial review and 60 slides previously unseen within the study. Pathologists rendered a diagnosis for each slide and indicated whether they recalled seeing the slide previously (yes/no). Two weeks after having been shown 60 cases from a challenge set of 120 cases, the pathologists correctly remembered 26, 22, and 24 cases or 40% overall. After 4 weeks, the pathologists correctly recalled 31% of cases previously seen. Pathologists were capable of recalling from memory cases seen previously at 2 and 4 weeks. Recall rates may be sufficiently high to affect intraobserver study design. Copyright© by the American Society for Clinical Pathology.

  5. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometric-Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform comparably to a traditional wired system in a realistic scenario?

  6. Creating peer groups for assessing and comparing nursing home performance.

    PubMed

    Byrne, Margaret M; Daw, Christina; Pietz, Ken; Reis, Brian; Petersen, Laura A

    2013-11-01

    Publicly reported performance data for hospitals and nursing homes are becoming ubiquitous. For such comparisons to be fair, facilities must be compared with their peers. Our objectives were to adapt a previously published methodology for developing hospital peer groupings so that it is applicable to nursing homes, and to explore the characteristics of "nearest-neighbor" peer groupings. We analyzed Department of Veterans Affairs administrative databases and nursing home facility characteristics. The nearest-neighbor methodology for developing peer groupings involves calculating the Euclidean distance between facilities based on facility characteristics. We describe our steps in the selection of facility characteristics, describe the characteristics of nearest-neighbor peer groups, and compare them with peer groups derived through classical cluster analysis. The facility characteristics most pertinent to nursing home groupings were found to differ from those most relevant for hospitals. Unlike classical cluster groups, nearest-neighbor groups are not mutually exclusive, and the nearest-neighbor methodology resulted in nursing home peer groupings that were substantially less diffuse than those created using traditional cluster analysis. It is essential that healthcare policy makers and administrators have a means of fairly grouping facilities for the purposes of quality, cost, or efficiency comparisons. In this research, we show that a previously published methodology can be successfully applied to a nursing home setting. The same approach could be applied in other clinical settings such as primary care.
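The nearest-neighbor grouping step can be sketched as follows: represent each facility by a vector of characteristics, compute Euclidean distances between vectors, and take each facility's k closest facilities as its peer group. The facility names, characteristic vectors, and k below are hypothetical; the published method's choice and standardization of characteristics are not reproduced here.

```python
import math

def peer_group(profiles, target, k=2):
    """Return the k facilities nearest to `target` by Euclidean distance
    over characteristic vectors. Groups are centered on each facility,
    so they need not be mutually exclusive."""
    dists = {name: math.dist(vec, profiles[target])
             for name, vec in profiles.items() if name != target}
    return sorted(dists, key=dists.get)[:k]

# Hypothetical facilities described by (beds, case-mix index) vectors
profiles = {"A": (50, 1.0), "B": (55, 1.1), "C": (200, 1.9), "D": (60, 1.0)}
```

Because each group is built around one facility, facility A may appear in B's peer group while B's nearest neighbors exclude A, which is why nearest-neighbor peer groups, unlike classical clusters, are not mutually exclusive.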

  7. The Functional Genetics of Handedness and Language Lateralization: Insights from Gene Ontology, Pathway and Disease Association Analyses.

    PubMed

    Schmitz, Judith; Lor, Stephanie; Klose, Rena; Güntürkün, Onur; Ocklenburg, Sebastian

    2017-01-01

    Handedness and language lateralization are partially determined by genetic influences. It has been estimated that at least 40 (and potentially more) possibly interacting genes may influence the ontogenesis of hemispheric asymmetries. Recently, it has been suggested that analyzing the genetics of hemispheric asymmetries on the level of gene ontology sets, rather than at the level of individual genes, might be more informative for understanding the underlying functional cascades. Here, we performed gene ontology, pathway and disease association analyses on genes that have previously been associated with handedness and language lateralization. Significant gene ontology sets for handedness were anatomical structure development, pattern specification (especially asymmetry formation) and biological regulation. Pathway analysis highlighted the importance of the TGF-beta signaling pathway for handedness ontogenesis. Significant gene ontology sets for language lateralization were responses to different stimuli, nervous system development, transport, signaling, and biological regulation. Despite the fact that some authors assume that handedness and language lateralization share a common ontogenetic basis, gene ontology sets barely overlap between phenotypes. Compared to genes involved in handedness, which mostly contribute to structural development, genes involved in language lateralization rather contribute to activity-dependent cognitive processes. Disease association analysis revealed associations of genes involved in handedness with diseases affecting the whole body, while genes involved in language lateralization were specifically engaged in mental and neurological diseases. These findings further support the idea that handedness and language lateralization are ontogenetically independent, complex phenotypes.

  8. AlzhCPI: A knowledge base for predicting chemical-protein interactions towards Alzheimer's disease.

    PubMed

    Fang, Jiansong; Wang, Ling; Li, Yecheng; Lian, Wenwen; Pang, Xiaocong; Wang, Hong; Yuan, Dongsheng; Wang, Qi; Liu, Ai-Lin; Du, Guan-Hua

    2017-01-01

    Alzheimer's disease (AD) is a complicated progressive neurodegenerative disorder. To confront AD, scientists are searching for multi-target-directed ligands (MTDLs) to delay disease progression. The in silico prediction of chemical-protein interactions (CPI) can accelerate target identification and drug discovery. Previously, we developed 100 binary classifiers to predict the CPI for 25 key targets against AD using the multi-target quantitative structure-activity relationship (mt-QSAR) method. In this investigation, we aimed to apply the mt-QSAR method to enlarge the model library to predict CPI towards AD. Another 104 binary classifiers were constructed to predict the CPI for 26 preclinical AD targets based on the naive Bayesian (NB) and recursive partitioning (RP) algorithms. Internal 5-fold cross-validation and external test set validation were applied to evaluate the performance of the training sets and test sets, respectively. The area under the receiver operating characteristic (ROC) curve for the test sets ranged from 0.629 to 1.0, with an average of 0.903. In addition, we developed a web server named AlzhCPI to integrate the comprehensive information of the 204 binary classifiers, which has potential applications in network pharmacology and drug repositioning. AlzhCPI is available online at http://rcidm.org/AlzhCPI/index.html. To illustrate its applicability, AlzhCPI was employed in a systems pharmacology-based investigation of shichangpu against AD, to enhance understanding of the mechanisms of action of shichangpu from a holistic perspective.
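The reported performance metric, area under the ROC curve, can be computed directly from classifier scores via the rank-sum identity: AUC equals the probability that a randomly chosen active compound scores higher than a randomly chosen inactive one, with ties counting one half. A minimal sketch with made-up scores (not the study's data):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney identity: fraction of (positive, negative)
    score pairs where the positive wins; ties count 0.5."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC of 1.0 corresponds to perfect ranking of actives above inactives, and 0.5 to chance-level ranking, which is the scale on which the reported 0.629-1.0 range should be read.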

  9. The Functional Genetics of Handedness and Language Lateralization: Insights from Gene Ontology, Pathway and Disease Association Analyses

    PubMed Central

    Schmitz, Judith; Lor, Stephanie; Klose, Rena; Güntürkün, Onur; Ocklenburg, Sebastian

    2017-01-01

    Handedness and language lateralization are partially determined by genetic influences. It has been estimated that at least 40 (and potentially more) possibly interacting genes may influence the ontogenesis of hemispheric asymmetries. Recently, it has been suggested that analyzing the genetics of hemispheric asymmetries on the level of gene ontology sets, rather than at the level of individual genes, might be more informative for understanding the underlying functional cascades. Here, we performed gene ontology, pathway and disease association analyses on genes that have previously been associated with handedness and language lateralization. Significant gene ontology sets for handedness were anatomical structure development, pattern specification (especially asymmetry formation) and biological regulation. Pathway analysis highlighted the importance of the TGF-beta signaling pathway for handedness ontogenesis. Significant gene ontology sets for language lateralization were responses to different stimuli, nervous system development, transport, signaling, and biological regulation. Despite the fact that some authors assume that handedness and language lateralization share a common ontogenetic basis, gene ontology sets barely overlap between phenotypes. Compared to genes involved in handedness, which mostly contribute to structural development, genes involved in language lateralization rather contribute to activity-dependent cognitive processes. Disease association analysis revealed associations of genes involved in handedness with diseases affecting the whole body, while genes involved in language lateralization were specifically engaged in mental and neurological diseases. These findings further support the idea that handedness and language lateralization are ontogenetically independent, complex phenotypes. PMID:28729848

  10. Development of Infrared Library Search Prefilters for Automotive Clear Coats from Simulated Attenuated Total Reflection (ATR) Spectra.

    PubMed

    Perera, Undugodage Don Nuwan; Nishikida, Koichi; Lavine, Barry K

    2018-06-01

    A previously published study featuring an attenuated total reflection (ATR) simulation algorithm that mitigated distortions in ATR spectra was further investigated to evaluate its efficacy in enhancing searches of infrared (IR) transmission libraries. In the present study, search prefilters were developed from transformed ATR spectra to identify the assembly plant of a vehicle from ATR spectra of the clear coat layer. A total of 456 IR transmission spectra from the Paint Data Query (PDQ) database that spanned 22 General Motors assembly plants and served as a training set cohort were transformed into ATR spectra by the simulation algorithm. The search prefilters were formulated using the fingerprint region (1500 cm-1 to 500 cm-1). Both the transformed ATR spectra (training set) and the experimental ATR spectra (validation set) were preprocessed for pattern recognition analysis using the discrete wavelet transform, which increased the signal-to-noise ratio of the ATR spectra by concentrating the signal in specific wavelet coefficients. ATR spectra of 14 clear coat samples (validation set) measured with a Nicolet iS50 Fourier transform IR spectrometer were correctly classified as to the assembly plant(s) of the automotive vehicle from which the paint sample originated, using search prefilters developed from the 456 simulated ATR spectra. The ATR simulation (transformation) algorithm thus successfully facilitated spectral library matching of ATR spectra against IR transmission spectra of automotive clear coats in the PDQ database.
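The wavelet preprocessing step works by concentrating signal energy in a small number of coefficients. A minimal sketch of one level of the Haar discrete wavelet transform illustrates the idea; the study's actual wavelet family and decomposition depth are not specified here, so Haar is an illustrative assumption:

```python
def haar_dwt_level(x):
    """One level of the orthonormal Haar DWT on an even-length signal:
    scaled pairwise sums (approximation) and differences (detail).
    A smooth spectrum puts most of its energy in the approximation half."""
    s = 2 ** -0.5
    approx = [s * (a + b) for a, b in zip(x[0::2], x[1::2])]
    detail = [s * (a - b) for a, b in zip(x[0::2], x[1::2])]
    return approx, detail
```

Because the transform is orthonormal, total energy is preserved across the two halves; discarding or down-weighting the small detail coefficients is what raises the effective signal-to-noise ratio of the spectra.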

  11. The Tip of the Iceberg: The Quest for Innovation at the Base of the Pyramid

    NASA Astrophysics Data System (ADS)

    Gordon, M. D.; Awad, N. F.

    Much of the world in Asia, Latin America, and Africa is at an early stage of economic development similar to what the United States and other developed countries experienced many decades ago. Yet, much as their needs for hard and soft infrastructure, effective business practices, and an educated workforce parallel the needs that underlay earlier development in the West, replicating Western development would overlook the hallmarks of the current century: widely available information and communications technology, a set of electronic linkages spanning the world, and a global business environment, to name just a few. Consequently, it should be possible for developing countries to use "leapfrog" technologies that were inconceivable decades ago to support their development. One means of identifying these opportunities is matching traditional development needs with novel support by connecting previously unrelated literatures.

  12. Phylogenetics and evolution of Trx SET genes in fully sequenced land plants.

    PubMed

    Zhu, Xinyu; Chen, Caoyi; Wang, Baohua

    2012-04-01

    Plant Trx SET proteins are involved in H3K4 methylation and play a key role in plant floral development. Genes encoding Trx SET proteins constitute a multigene family in which the copy number varies among plant species and functional divergence appears to have occurred repeatedly. To investigate the evolutionary history of the Trx SET gene family, we performed a comprehensive evolutionary analysis of this gene family in 13 major representatives of green plants. A novel cluster (here named the cpTrx clade), comprising the previously resolved III-1, III-2, and III-4 orthologous groups, was identified. Our analysis showed that plant Trx proteins possess a variety of domain organizations and gene structures among paralogs. Additional domains such as PHD, PWWP, and FYR were integrated early into the primordial SET-PostSET domain organization of the cpTrx clade. We suggest that the PostSET domain was lost in some members of the III-4 orthologous group during the evolution of land plants. At least four classes of gene structures had formed at the early evolutionary stage of land plants. Three intronless orphan Trx SET genes from Physcomitrella patens (moss) were identified; their parental genes have presumably been eliminated from the genome. The structural differences among evolutionary groups of plant Trx SET genes with different functions are described, contributing to the design of further experimental studies.

  13. Approximations to complete basis set-extrapolated, highly correlated non-covalent interaction energies.

    PubMed

    Mackie, Iain D; DiLabio, Gino A

    2011-10-07

    The first-principles calculation of non-covalent (particularly dispersion) interactions between molecules is a considerable challenge. In this work we studied the binding energies for ten small non-covalently bonded dimers with several combinations of correlation methods (MP2, coupled cluster singles and doubles (CCSD), and coupled cluster singles and doubles with perturbative triples (CCSD(T))), correlation-consistent basis sets (aug-cc-pVXZ, X = D, T, Q), two-point complete basis set energy extrapolations, and counterpoise corrections. For this work, complete basis set results were estimated from averaged counterpoise and non-counterpoise-corrected CCSD(T) binding energies obtained from extrapolations with aug-cc-pVQZ and aug-cc-pVTZ basis sets. It is demonstrated that, in almost all cases, binding energies converge more rapidly to the basis set limit by averaging the counterpoise and non-counterpoise corrected values than by using either method alone. Examination of the effect of basis set size and electron correlation shows that the triples contribution to the CCSD(T) binding energies is fairly constant with basis set size, with a slight underestimation at CCSD(T)/aug-cc-pVDZ compared to the value at the (estimated) complete basis set limit, and that contributions to the binding energies obtained by MP2 generally overestimate the analogous CCSD(T) contributions. Taking these factors together, we conclude that the binding energies for non-covalently bonded systems can be accurately determined using a composite method that combines CCSD(T)/aug-cc-pVDZ with energy corrections obtained using basis set extrapolated MP2 (utilizing aug-cc-pVQZ and aug-cc-pVTZ basis sets), provided all of the components are obtained by averaging the counterpoise and non-counterpoise energies.
With such an approach, binding energies for the set of ten dimers are predicted with a mean absolute deviation of 0.02 kcal/mol, a maximum absolute deviation of 0.05 kcal/mol, and a mean percent absolute deviation of only 1.7%, relative to the (estimated) complete basis set CCSD(T) results. Application of this composite approach to an additional set of eight dimers gave binding energies within 1% of previously published high-level data. It is also shown that binding within the parallel and parallel-crossed conformations of the naphthalene dimer is predicted by the composite approach to be 9% greater than previously reported in the literature. The ability of some recently developed dispersion-corrected density functional theory methods to predict the binding energies of the set of ten small dimers was also examined. © 2011 American Institute of Physics.
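The averaging recipe can be sketched as follows. The two-point inverse-cube extrapolation shown is the common Helgaker-style formula for correlation-type energies from consecutive cardinal numbers; whether the paper used exactly this form is an assumption, and the energy values in the test are placeholders, not the paper's data:

```python
def cbs_two_point(e_small, e_large, x=3, y=4):
    """Two-point X^-3 extrapolation from cardinal numbers x (e.g.,
    aug-cc-pVTZ) and y (e.g., aug-cc-pVQZ) toward the estimated
    complete-basis-set limit."""
    return (y**3 * e_large - x**3 * e_small) / (y**3 - x**3)

def averaged_cbs_binding(cp_tz, cp_qz, nocp_tz, nocp_qz):
    # extrapolate the counterpoise-corrected and uncorrected binding
    # energies separately, then average the two basis-set limits
    return 0.5 * (cbs_two_point(cp_tz, cp_qz)
                  + cbs_two_point(nocp_tz, nocp_qz))
```

Since counterpoise correction typically underbinds and the uncorrected energies overbind at finite basis, averaging the two extrapolated limits tends to land between them, which is the convergence behaviour the abstract reports.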

  14. Toward instructional design principles: Inducing Faraday's law with contrasting cases

    NASA Astrophysics Data System (ADS)

    Kuo, Eric; Wieman, Carl E.

    2016-06-01

    Although physics education research (PER) has improved instructional practices, there are no agreed-upon principles for designing effective instructional materials. Here, we illustrate how close comparison of instructional materials can support the development of such principles. Specifically, in discussion sections of a large introductory physics course, a pair of studies compares two instructional strategies for teaching a physics concept: having students (i) explain a set of contrasting cases or (ii) apply and build on previously learned concepts. We compare these strategies for the teaching of Faraday's law, showing that explaining a set of related contrasting cases not only improves student performance on Faraday's law questions over building on a previously learned concept (i.e., the Lorentz force), but also prepares students to better learn subsequent topics, such as Lenz's law. These differences persist to the final exam. We argue that early exposure to contrasting cases better focuses student attention on a key feature related to both concepts: the change in magnetic flux. Importantly, the benefits of contrasting cases for both learning and enjoyment are enhanced for students who did not first attend a Faraday's law lecture, consistent with previous research suggesting that being told a solution can circumvent the benefits of its discovery. These studies illustrate an experimental approach to understanding how the structure of activities affects learning and performance outcomes, a first step toward design principles for effective instructional materials.

  15. Electronegativity Equalization Method: Parameterization and Validation for Large Sets of Organic, Organohalogen and Organometal Molecules

    PubMed Central

    Vařeková, Radka Svobodová; Jiroušková, Zuzana; Vaněk, Jakub; Suchomel, Šimon; Koča, Jaroslav

    2007-01-01

    The Electronegativity Equalization Method (EEM) is a fast approach for charge calculation. A challenging part of the EEM is the parameterization, which is performed using ab initio charges obtained for a set of molecules. The goal of our work was to perform the EEM parameterization for selected sets of organic, organohalogen and organometal molecules. We have performed the most robust parameterization published so far. The EEM parameterization was based on 12 training sets selected from a database of predicted 3D structures (NCI DIS) and from a database of crystallographic structures (CSD). Each set contained from 2000 to 6000 molecules. We have shown that the number of molecules in the training set is very important for the quality of the parameters. We have improved the EEM parameters (STO-3G MPA charges) for elements that were already parameterized, specifically C, O, N, H, S, F and Cl. The new parameters provide more accurate charges than those published previously. We have also developed new parameters for elements that had not yet been parameterized, specifically Br, I, Fe and Zn. Finally, we performed crossover validation of all obtained parameters using all training sets that included the relevant elements and confirmed that the calculated parameters provide accurate charges.
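EEM reduces charge calculation to a single linear solve: each atom's effective electronegativity, a linear function of its own charge plus Coulomb-like terms from the other atoms, is equalized across the molecule, subject to total-charge conservation. A minimal sketch follows; the parameters A, B, and kappa below are hypothetical placeholders, and supplying calibrated values for them is exactly what a parameterization like this work provides:

```python
def solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def eem_charges(A, B, R, kappa=1.0, total_charge=0.0):
    """Solve the EEM linear system for atomic charges. A[i] and B[i] are
    per-element electronegativity parameters, R[i][j] the interatomic
    distances; the last unknown is the equalized electronegativity."""
    n = len(A)
    M = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] * (n + 1)
    for i in range(n):
        M[i][i] = B[i]
        for j in range(n):
            if j != i:
                M[i][j] = kappa / R[i][j]   # Coulomb coupling term
        M[i][n] = -1.0                      # unknown equalized chi
        rhs[i] = -A[i]
    for j in range(n):                      # charge conservation row
        M[n][j] = 1.0
    rhs[n] = total_charge
    return solve(M, rhs)[:n]
```

For a symmetric diatomic the charges vanish; for a heteronuclear pair the atom with the larger A parameter (more electronegative) draws a negative charge, with the total always summing to the molecular charge.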

  16. Does overgeneral autobiographical memory result from poor memory for task instructions?

    PubMed

    Yanes, Paula K; Roberts, John E; Carlos, Erica L

    2008-10-01

    Considerable previous research has shown that retrieval of overgeneral autobiographical memories (OGM) is elevated among individuals suffering from various emotional disorders and those with a history of trauma. Although previous theories suggest that OGM serves the function of regulating acute negative affect, it is also possible that OGM results from difficulties in keeping the instruction set for the Autobiographical Memory Test (AMT) in working memory, or what has been coined "secondary goal neglect" (Dalgleish, 2004). The present study tested whether OGM is associated with poor memory for the task's instruction set, and whether an instruction set reminder would improve memory specificity over repeated trials. Multilevel modelling data-analytic techniques demonstrated a significant relationship between poor recall of instruction set and probability of retrieving OGMs. Providing an instruction set reminder for the AMT relative to a control task's instruction set improved memory specificity immediately afterward.

  17. Finite Element Analysis of Poroelastic Composites Undergoing Thermal and Gas Diffusion

    NASA Technical Reports Server (NTRS)

    Salamon, N. J. (Principal Investigator); Sullivan, Roy M.; Lee, Sunpyo

    1995-01-01

    A theory for time-dependent thermal and gas diffusion in mechanically time-rate-independent anisotropic poroelastic composites has been developed. This theory advances previous work by the latter two authors by providing for critical transverse shear through a three-dimensional axisymmetric formulation and using it in a new hypothesis for determining the Biot fluid pressure-solid stress coupling factor. The derived governing equations couple material deformation with temperature and internal pore pressure, and they couple gas diffusion and heat transfer more strongly than the previous theory. Hence, the theory accounts for the interactions between conductive heat transfer in the porous body and convective heat carried by the mass flux through the pores. The Bubnov-Galerkin finite element method is applied to the governing equations to transform them into a semidiscrete finite element system. A numerical procedure is developed to solve the coupled equations in the space and time domains. The method is used to simulate two high-temperature tests involving thermal-chemical decomposition of carbon-phenolic composites. In comparison with measured data, the results are accurate. Moreover, unlike previous work, a single set of poroelastic parameters yields results consistent with both measurements in a restrained thermal growth test.
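The Bubnov-Galerkin semidiscretization step mentioned above can be illustrated on a far simpler problem than the paper's coupled thermo-poroelastic system: 1D transient diffusion with linear elements, a lumped mass matrix, and explicit Euler time stepping. Everything in this sketch is generic textbook material, not the authors' formulation.

```python
# Toy Bubnov-Galerkin semidiscrete system: u_t = u_xx on (0,1) with
# u(0)=u(1)=0, linear elements on a uniform mesh. Lumped mass M_ii = h,
# stiffness row K_i = (1/h)[-1, 2, -1]; advance M du/dt = -K u by
# explicit Euler (stable here since dt/h^2 = 0.2 < 0.5).

def simulate(n=20, dt=5e-4, steps=2000):
    h = 1.0 / n
    # interior nodal values; initial condition: u = 1 on the middle half
    u = [1.0 if abs((i + 1) * h - 0.5) < 0.25 else 0.0 for i in range(n - 1)]
    for _ in range(steps):
        new = u[:]
        for i in range(n - 1):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 2 else 0.0
            new[i] = u[i] - dt / h * (1.0 / h) * (2 * u[i] - left - right)
        u = new
    return u

u = simulate()
print(max(u))  # heat has diffused out: peak far below the initial 1.0
```

The paper's system replaces this single scalar equation with coupled deformation, temperature, and pore-pressure fields, but the assemble-then-time-march structure is the same.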

  18. Improvement of the GERDA Ge Detectors Energy Resolution by an Optimized Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Benato, G.; D'Andrea, V.; Cattadori, C.; Riboldi, S.

    GERDA is a new-generation experiment searching for the neutrinoless double beta decay of 76Ge, operating at the INFN Gran Sasso Laboratories (LNGS) since 2010. Coaxial and Broad Energy Germanium (BEGe) detectors were operated in liquid argon (LAr) in GERDA Phase I. For the second GERDA experimental phase, the contacting technique and both the connection to and the location of the front-end readout devices are novel compared with those previously adopted, and several tests have been performed. In this work, starting from considerations on the energy scale stability of the GERDA Phase I calibration and physics data sets, an optimized pulse filtering method has been developed and applied to the Phase II pilot test data sets and to a few GERDA Phase I data sets. The detector performance in terms of energy resolution and time stability is presented. The improvement of the energy resolution, compared to the standard Gaussian shaping adopted for Phase I data analysis, is discussed and related to the optimized noise filtering capability. The result is an energy resolution better than 0.1% at 2.6 MeV for the BEGe detectors operated in the Phase II pilot tests, and an improvement of about 8% in the energy resolution in LAr on the GERDA Phase I calibration runs compared to previous analysis algorithms.
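The record does not spell out the optimized filter itself. As a hedged illustration of the kind of digital pulse shaping used for germanium-detector energy estimation, here is the generic trapezoidal (moving-window) filter applied to an ideal step pulse; this is textbook material, not the GERDA-optimized algorithm.

```python
# Generic trapezoidal shaping filter (rise k samples, flat top m samples).
# For an ideal step of amplitude A the output ramps to a flat top of
# height k*A, which estimates the pulse amplitude. Real HPGe pulses decay
# exponentially and need a pole-zero correction stage, omitted here.

def trapezoidal(v, k, m):
    out, acc = [], 0.0
    g = lambda n: v[n] if 0 <= n < len(v) else 0.0
    for n in range(len(v)):
        d = g(n) - g(n - k) - g(n - k - m) + g(n - 2 * k - m)
        acc += d
        out.append(acc)
    return out

# Ideal step of amplitude A at sample 100 (no decay, so no pole-zero stage)
A, t0 = 1.0, 100
pulse = [0.0] * t0 + [A] * 300
shaped = trapezoidal(pulse, k=50, m=20)
print(max(shaped))  # flat top at k * A = 50.0
```

Optimizing k and m against the measured noise spectrum is the sort of tuning that a filter-optimization study like the one above performs; the optimum depends on the series/parallel noise balance of the readout.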

  19. Real-time defect detection on highly reflective curved surfaces

    NASA Astrophysics Data System (ADS)

    Rosati, G.; Boschetti, G.; Biondi, A.; Rossi, A.

    2009-03-01

    This paper presents an automated defect detection system for coated plastic components for the automotive industry. This research activity arose as an evolution of a previous study which employed a non-flat mirror to illuminate and inspect highly reflective curved surfaces. According to this method, the rays emitted from a light source are conveyed onto the surface under investigation by means of a suitably curved mirror. After reflection on the surface, the light rays are collected by a CCD camera, in which the coating defects appear as shadows of various shapes and dimensions. In this paper we present an evolution of the above-mentioned method, introducing a simplified mirror set-up in order to reduce the cost and complexity of the defect detection system: a set of plane mirrors is employed instead of the curved one. Moreover, the inspection of parts with multiple bend radii is investigated. A prototype of the machine vision system has been developed in order to test this simplified method. The device is made up of a light projector, a set of plane mirrors for light-ray reflection, a conveyor belt for handling components, a CCD camera and a desktop PC which performs image acquisition and processing. As in the previous system, the defects are identified as shadows inside a high-brightness image. At the end of the paper, first experimental results are presented.
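The core image-processing step described above, finding defects as dark shadows inside a bright image, can be sketched as thresholding followed by connected-component grouping. The threshold value and 4-connectivity below are illustrative choices, not the authors' actual pipeline.

```python
# Sketch of shadow-based defect identification: threshold dark pixels in
# a high-brightness image, then group them into connected components
# whose area and centroid characterize each defect candidate.
from collections import deque

def find_defects(img, thresh=50):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if img[y][x] < thresh and not seen[y][x]:
                q, pix = deque([(y, x)]), []
                seen[y][x] = True
                while q:  # BFS over 4-connected dark neighbours
                    cy, cx = q.popleft()
                    pix.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and img[ny][nx] < thresh and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                cy = sum(p[0] for p in pix) / len(pix)
                cx = sum(p[1] for p in pix) / len(pix)
                blobs.append({"area": len(pix), "centroid": (cy, cx)})
    return blobs

# Bright 6x6 frame with two dark "defects"
img = [[255] * 6 for _ in range(6)]
img[1][1] = img[1][2] = 10          # 2-pixel defect
img[4][4] = 5                       # 1-pixel defect
print(find_defects(img))            # two blobs, areas 2 and 1
```

A production system would add size/shape classification of each blob against acceptance criteria, which is essentially what the reported prototype does on the acquired shadow images.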

  20. Clinical Audits in Outpatient Clinics for Chronic Obstructive Pulmonary Disease: Methodological Considerations and Workflow.

    PubMed

    López-Campos, Jose Luis; Abad Arranz, María; Calero Acuña, Carmen; Romero Valero, Fernando; Ayerbe García, Ruth; Hidalgo Molina, Antonio; Aguilar Pérez-Grovas, Ricardo Ismael; García Gil, Francisco; Casas Maldonado, Francisco; Caballero Ballesteros, Laura; Sánchez Palop, María; Pérez-Tejero, Dolores; Segado, Alejandro; Calvo Bonachera, Jose; Hernández Sierra, Bárbara; Doménech, Adolfo; Arroyo Varela, Macarena; González Vargas, Francisco; Cruz Rueda, Juan Jose

    2015-01-01

    Previous clinical audits for chronic obstructive pulmonary disease (COPD) have provided valuable information on the clinical care delivered to patients admitted to medical wards because of COPD exacerbations. However, clinical audits of COPD in an outpatient setting are scarce and no methodological guidelines are currently available. Based on our previous experience, herein we describe a clinical audit for COPD patients in specialized outpatient clinics with the overall goal of establishing a potential methodological workflow. A pilot clinical audit of COPD patients referred to respiratory outpatient clinics in the region of Andalusia, Spain (over 8 million inhabitants), was performed. The audit took place between October 2013 and September 2014, and 10 centers (20% of all public hospitals) were invited to participate. Cases with an established diagnosis of COPD based on risk factors, clinical symptoms, and a post-bronchodilator FEV1/FVC ratio of less than 0.70 were deemed eligible. The usefulness of formally scheduled regular follow-up visits was assessed. Two different databases (resources and clinical database) were constructed. Assessments were planned over a year divided by 4 three-month periods, with the goal of determining seasonal-related changes. Exacerbations and survival served as the main endpoints. This paper describes a methodological framework for conducting a clinical audit of COPD patients in an outpatient setting. Results from such audits can guide health information systems development and implementation in real-world settings.

  1. Statistical Models for Predicting Automobile Driving Postures for Men and Women Including Effects of Age.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-03-01

    Previously published statistical models of driving posture have been effective for vehicle design but have not taken into account the effects of age. The present study developed new statistical models for predicting driving posture. Driving postures of 90 U.S. drivers with a wide range of age and body size were measured in a laboratory mockup in nine package conditions. Posture-prediction models for female and male drivers were developed separately by employing a stepwise regression technique using age, body dimensions, vehicle package conditions, and two-way interactions, among other variables. Driving posture was significantly associated with age, and the effects of other variables depended on age. A set of posture-prediction models is presented for women and men. The results are compared with a previously developed model. The present study is the first study of driver posture to include a large cohort of older drivers and the first to report a significant effect of age. The posture-prediction models can be used to position computational human models or crash-test dummies for vehicle design and assessment. © 2015, Human Factors and Ergonomics Society.
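The stepwise technique named above builds a model by adding predictors one at a time according to how much each improves the fit. A minimal greedy sketch (a simplified forward-stagewise variant, not the paper's full stepwise regression with interaction terms) with synthetic data:

```python
# Greedy forward selection sketch: repeatedly add the predictor whose
# univariate fit most reduces the residual sum of squares, then update
# the residual. Simplified stand-in for stepwise regression; the data
# and column meanings below are synthetic.

def forward_select(X, y, n_steps=2):
    resid = y[:]
    chosen = []
    for _ in range(n_steps):
        best, best_j, best_b = 0.0, None, 0.0
        for j in range(len(X[0])):
            if j in chosen:
                continue
            xj = [row[j] for row in X]
            sxx = sum(v * v for v in xj)
            sxy = sum(v * r for v, r in zip(xj, resid))
            if sxx == 0:
                continue
            b = sxy / sxx
            drop = b * sxy  # RSS reduction achieved by this univariate fit
            if drop > best:
                best, best_j, best_b = drop, j, b
        if best_j is None:
            break
        chosen.append(best_j)
        resid = [r - best_b * row[best_j] for row, r in zip(X, resid)]
    return chosen, resid

# Outcome depends only on column 1 (say, a stature-like dimension);
# column 0 carries no real signal.
X = [[1.0, 150.0], [2.0, 160.0], [1.5, 170.0], [0.5, 180.0]]
y = [0.03 * row[1] for row in X]
chosen, resid = forward_select(X, y, n_steps=1)
print(chosen)  # [1] -- the informative predictor is picked first
```

Statistical packages additionally apply entry/removal significance tests at each step; this sketch keeps only the greedy selection core.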

  2. Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Parsani, Matteo; Fisher, Travis C.; Nielsen, Eric J.

    2016-01-01

    Entropy stable (SS) discontinuous spectral collocation formulations of any order are developed for the compressible Navier-Stokes equations on hexahedral elements. Recent progress on two complementary efforts is presented. The first effort is a generalization of previous SS spectral collocation work that extends the applicable set of points from tensor-product Legendre-Gauss-Lobatto (LGL) to tensor-product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Although more costly to implement, the LG operators are shown to be significantly more accurate on comparable grids. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort generalizes previous SS work to include the possibility of p-refinement at non-conforming interfaces. A generalization of existing entropy stability machinery is developed to accommodate the nuances of fully multi-dimensional summation-by-parts (SBP) operators. The entropy stability of the compressible Euler equations on non-conforming interfaces is demonstrated using the newly developed LG operators and multi-dimensional interface interpolation operators.
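The summation-by-parts (SBP) property on which these entropy-stability proofs rest can be checked numerically for the smallest LGL operator. With quadrature matrix P and differentiation matrix D, SBP requires Q + Q^T = B where Q = P D and B = diag(-1, 0, ..., 0, 1). The 3-point example below is standard; the paper's operators are of arbitrary order.

```python
# Numerical check of the SBP property for the 3-point Legendre-Gauss-
# Lobatto operator on [-1, 1]: nodes {-1, 0, 1}, Simpson-type weights,
# and the exact differentiation matrix for quadratic interpolants.

P = [1 / 3, 4 / 3, 1 / 3]          # LGL quadrature weights
D = [[-1.5,  2.0, -0.5],           # D[i][j] = L_j'(x_i)
     [-0.5,  0.0,  0.5],
     [ 0.5, -2.0,  1.5]]
B = [[-1, 0, 0], [0, 0, 0], [0, 0, 1]]  # boundary matrix

Q = [[P[i] * D[i][j] for j in range(3)] for i in range(3)]
S = [[Q[i][j] + Q[j][i] for j in range(3)] for i in range(3)]

err = max(abs(S[i][j] - B[i][j]) for i in range(3) for j in range(3))
print(err)  # Q + Q^T = B holds to rounding: discrete integration by parts
```

This mimicry of integration by parts at the discrete level is what lets the schemes above bound the entropy without relying on exact integration.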

  3. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs.

    PubMed

    McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F

    2015-01-01

    Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
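The link-frequency/link-ratio thresholding described above can be sketched as simple co-occurrence counting. The definitions used below (link frequency = number of records in which a problem and medication co-occur; link ratio = that count divided by the medication's total record count) are a plausible reading for illustration, not a quotation of the paper's exact formulas, and the records are hypothetical.

```python
# Sketch of crowdsourced knowledge-base construction from clinician-
# entered (problem, medication) links: count pair co-occurrences, then
# keep pairs whose link ratio clears a clinician-reviewed threshold.
from collections import Counter

def build_knowledge_base(records, ratio_threshold=0.5):
    pair_freq, med_freq = Counter(), Counter()
    for problem, medication in records:
        pair_freq[(problem, medication)] += 1
        med_freq[medication] += 1
    kb = {}
    for (problem, med), f in pair_freq.items():
        ratio = f / med_freq[med]
        if ratio >= ratio_threshold:
            kb[(problem, med)] = {"link_frequency": f, "link_ratio": ratio}
    return kb

# Hypothetical EHR extracts: (problem, medication)
records = [
    ("hypertension", "lisinopril"),
    ("hypertension", "lisinopril"),
    ("hypertension", "lisinopril"),
    ("cough", "lisinopril"),          # rare co-occurrence, filtered out
    ("diabetes", "metformin"),
]
kb = build_knowledge_base(records)
print(sorted(kb))
```

The "crowdsourcing" is implicit: every routine clinician entry contributes a vote, and the threshold separates plausible indications from incidental co-documentation.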

  4. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs

    PubMed Central

    Wright, A.; Krousel-Wood, M.; Thomas, E. J.; McCoy, J. A.; Sittig, D. F.

    2015-01-01

    Summary Background Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. Objective We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. Methods We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. Results The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. Conclusions We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes. PMID:26171079

  5. Characterizing the Grape Transcriptome. Analysis of Expressed Sequence Tags from Multiple Vitis Species and Development of a Compendium of Gene Expression during Berry Development

    PubMed Central

    Silva, Francisco Goes da; Iandolino, Alberto; Al-Kayal, Fadi; Bohlmann, Marlene C.; Cushman, Mary Ann; Lim, Hyunju; Ergul, Ali; Figueroa, Rubi; Kabuloglu, Elif K.; Osborne, Craig; Rowe, Joan; Tattersall, Elizabeth; Leslie, Anna; Xu, Jane; Baek, JongMin; Cramer, Grant R.; Cushman, John C.; Cook, Douglas R.

    2005-01-01

    We report the analysis and annotation of 146,075 expressed sequence tags from Vitis species. The majority of these sequences were derived from different cultivars of Vitis vinifera, comprising an estimated 25,746 unique contig and singleton sequences that survey transcription in various tissues and developmental stages and during biotic and abiotic stress. Putatively homologous proteins were identified for over 17,752 of the transcripts, with 1,962 transcripts further subdivided into one or more Gene Ontology categories. A simple structured vocabulary, with modules for plant genotype, plant development, and stress, was developed to describe the relationship between individual expressed sequence tags and cDNA libraries; the resulting vocabulary provides query terms to facilitate data mining within the context of a relational database. As a measure of the extent to which characterized metabolic pathways were encompassed by the data set, we searched for homologs of the enzymes leading from glycolysis, through the oxidative/nonoxidative pentose phosphate pathway, and into the general phenylpropanoid pathway. Homologs were identified for 65 of these 77 enzymes, with 86% of enzymatic steps represented by paralogous genes. Differentially expressed transcripts were identified by means of a stringent believability index cutoff of ≥98.4%. Correlation analysis and two-dimensional hierarchical clustering grouped these transcripts according to similarity of expression. In the broadest analysis, 665 differentially expressed transcripts were identified across 29 cDNA libraries, representing a range of developmental and stress conditions. The groupings revealed expected associations between plant developmental stages and tissue types, with the notable exception of abiotic stress treatments. 
A more focused analysis of flower and berry development identified 87 differentially expressed transcripts and provides the basis for a compendium that relates gene expression and annotation to previously characterized aspects of berry development and physiology. Comparison with published results for select genes, as well as correlation analysis between independent data sets, suggests that the inferred in silico patterns of expression are likely to be an accurate representation of transcript abundance for the conditions surveyed. Thus, the combined data set reveals the in silico expression patterns for hundreds of genes in V. vinifera, the majority of which have not been previously studied within this species. PMID:16219919

  6. Automated spot defect characterization in a field portable night vision goggle test set

    NASA Astrophysics Data System (ADS)

    Scopatz, Stephen; Ozten, Metehan; Aubry, Gilles; Arquetoux, Guillaume

    2018-05-01

    This paper discusses a new capability developed for, and results from, a field-portable test set for Gen 2 and Gen 3 Image Intensifier (I²) tube-based Night Vision Goggles (NVG). A previous paper described the test set and the automated and semi-automated tests supported for NVGs, including a knife-edge MTF test to replace the operator's interpretation of the USAF 1951 resolution chart. The major improvement and innovation detailed in this paper is the use of image analysis algorithms to automate the characterization of spot defects of I² tubes with the same test set hardware previously presented. The original and still common Spot Defect Test requires the operator to look through the NVGs at a target of concentric rings, compare the size of the defects to a chart, and manually enter the results into a table based on the size and location of each defect; this is tedious and subjective. The prior semi-automated improvement captures and displays an image of the defects and the rings, allowing the operator to determine the defects with less eyestrain while electronically storing the image and the resulting table. The advanced Automated Spot Defect Test utilizes machine vision algorithms to determine the size and location of the defects, generates the result table automatically, and then records the image and the results in a computer-generated report easily usable for verification. This is an inherently more repeatable process that ensures consistent spot detection independent of the operator. Results across several NVGs are presented.

  7. A selection model for accounting for publication bias in a full network meta-analysis.

    PubMed

    Mavridis, Dimitris; Welton, Nicky J; Sutton, Alex; Salanti, Georgia

    2014-12-30

    Copas and Shi suggested a selection model to explore the potential impact of publication bias via sensitivity analysis based on assumptions for the probability of publication of trials conditional on the precision of their results. Chootrakool et al. extended this model to three-arm trials but did not fully account for the implications of the consistency assumption, and their model is difficult to generalize for complex network structures with more than three treatments. Fitting these selection models within a frequentist setting requires maximization of a complex likelihood function, and identification problems are common. We have previously presented a Bayesian implementation of the selection model when multiple treatments are compared with a common reference treatment. We now present a general model suitable for complex, full network meta-analysis that accounts for consistency when adjusting results for publication bias. We developed a design-by-treatment selection model to describe the mechanism by which studies with different designs (sets of treatments compared in a trial) and precision may be selected for publication. We fit the model in a Bayesian setting because it avoids the numerical problems encountered in the frequentist setting, it is generalizable with respect to the number of treatments and study arms, and it provides a flexible framework for sensitivity analysis using external knowledge. Our model accounts for the additional uncertainty arising from publication bias more successfully compared to the standard Copas model or its previous extensions. We illustrate the methodology using a published triangular network for the failure of vascular graft or arterial patency. Copyright © 2014 John Wiley & Sons, Ltd.
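The selection mechanism underlying this family of models can be written compactly. The block below is the standard Copas-type formulation for a single comparison (generic notation, illustrative only); the paper's contribution extends this mechanism to a design-by-treatment structure over a full network.

```latex
% Standard Copas selection model (generic form). Study i reports effect
% y_i with standard error s_i; a latent variable z_i governs whether the
% study is published, with selection favouring precise studies.
\begin{align}
  y_i &= \theta_i + \varepsilon_i, & \varepsilon_i &\sim N(0,\, s_i^2),\\
  z_i &= \gamma_0 + \frac{\gamma_1}{s_i} + \delta_i, & \delta_i &\sim N(0,\, 1),\\
  \operatorname{corr}(\varepsilon_i, \delta_i) &= \rho, & \text{study $i$ published} &\iff z_i > 0.
\end{align}
```

The correlation $\rho$ is what transmits the selection effect into the observed estimates: when $\rho \neq 0$, published $y_i$ are systematically shifted, and the sensitivity analysis varies $(\gamma_0, \gamma_1)$ to probe how strongly conclusions depend on assumed publication probabilities.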

  8. Consumer perspectives about weight management services in a community pharmacy setting in NSW, Australia

    PubMed Central

    Um, Irene S.; Armour, Carol; Krass, Ines; Gill, Timothy; Chaar, Betty B.

    2012-01-01

    Abstract Background  Obesity is a public health challenge faced worldwide. Community pharmacists may be well placed to manage Australia’s obesity problem owing to their training, accessibility and trustworthiness. However, determining consumers’ needs is vital to the development of any new services or the evaluation of existing services. Objective  To explore Australian consumers’ perspectives regarding weight management services in the community pharmacy setting, including their past experiences and willingness to pay for a specific pharmacy‐based service. Design  An online cross‐sectional consumer survey was distributed through a marketing research company. The survey instrument comprised open‐ended and closed questions exploring consumers’ experiences of and preferences for weight management services in pharmacy. It also included an attitudinal measure, the Consumer Attitude to Pharmacy Weight Management Services (CAPWMS) scale. Setting and participants  A total of 403 consumers from New South Wales, Australia, completed the survey. Results  The majority of respondents had previously not sought a pharmacist’s advice regarding weight management. Those who had previously consulted a pharmacist were more willing to pay for and support pharmacy‐based services in the future. Most consumers considered pharmacists’ motivations to provide advice related to gaining profit from selling a product and expressed concerns about the perceived conflicts of interest. Participants also perceived pharmacists as lacking expertise and time. Conclusion  Although Australian consumers were willing to seek pharmacists’ advice about weight management, they perceived several barriers to the provision of weight management services in community pharmacy. If barriers are addressed, community pharmacies could be a viable and accessible setting to manage obesity. PMID:22646843

  9. Molecular characterization of the apical organ of the anthozoan Nematostella vectensis

    PubMed Central

    Sinigaglia, Chiara; Busengdal, Henriette; Lerner, Avi; Oliveri, Paola; Rentzsch, Fabian

    2015-01-01

    Apical organs are sensory structures present in many marine invertebrate larvae, where they are thought to be involved in settlement, metamorphosis and locomotion. In bilaterians they are characterised by a tuft of long cilia and receptor cells and are associated with groups of neurons, but their relatively low morphological complexity and dispersed phylogenetic distribution have left their evolutionary relationship unresolved. Moreover, since apical organs are not present in the standard model organisms, their development and function are not well understood. To provide a foundation for a better understanding of this structure, we have characterised the molecular composition of the apical organ of the sea anemone Nematostella vectensis. In a microarray-based comparison of the gene expression profiles of planulae with either a wildtype or an experimentally expanded apical organ, we identified 78 evolutionarily conserved genes, which are predominantly or specifically expressed in the apical organ of Nematostella. This gene set comprises signalling molecules, transcription factors, and structural and metabolic genes. The majority of these genes, including several conserved but previously uncharacterized ones, are potentially involved in different aspects of the development or function of the long cilia of the apical organ. To demonstrate the utility of this gene set for comparative analyses, we further analysed the expression of a subset of previously uncharacterized putative orthologs in sea urchin larvae and detected expression for twelve out of eighteen of them in the apical domain. Our study provides a molecular characterization of the apical organ of Nematostella and represents an informative tool for future studies addressing the development, function and evolutionary history of apical organ cells. PMID:25478911

  10. Twin-Twin Transfusion Syndrome: study protocol for developing, disseminating, and implementing a core outcome set.

    PubMed

    Khalil, Asma; Perry, Helen; Duffy, James; Reed, Keith; Baschat, Ahmet; Deprest, Jan; Hecher, Kurt; Lewi, Liesbeth; Lopriore, Enrico; Oepkes, Dick

    2017-07-14

    Twin-Twin Transfusion Syndrome (TTTS) is associated with an increased risk of perinatal mortality and morbidity. Several treatment interventions have been described for TTTS, including fetoscopic laser surgery, amnioreduction, septostomy, expectant management, and pregnancy termination. Over the last decade, fetoscopic laser surgery has become the primary treatment. The literature to date reports on many different outcomes, making it difficult to compare results or combine data from individual studies, limiting the value of research to guide clinical practice. With the advent and ongoing development of new therapeutic techniques, this is more important than ever. The development and use of a core outcome set has been proposed to address these issues, prioritising outcomes important to the key stakeholders, including patients. We aim to produce, disseminate, and implement a core outcome set for TTTS. An international steering group has been established to oversee the development of this core outcome set. This group includes healthcare professionals, researchers and patients. A systematic review is planned to identify previously reported outcomes following treatment for TTTS. Following completion, the identified outcomes will be evaluated by stakeholders using an international, multi-perspective online modified Delphi method to build consensus on core outcomes. This method encourages the participants towards consensus 'core' outcomes. All key stakeholders will be invited to participate. The steering group will then hold a consensus meeting to discuss results and form a core outcome set to be introduced and measured. Once core outcomes have been agreed, the next step will be to determine how they should be measured, disseminated, and implemented within an international context. The development, dissemination, and implementation of a core outcome set in TTTS will enable its use in future clinical trials, systematic reviews and clinical practice guidelines. 
This is likely to advance the quality of research studies and their effective use in order to guide clinical practice and improve patient care, maternal and short-term perinatal outcomes, and long-term neurodevelopmental outcomes. Core Outcome Measures in Effectiveness Trials (COMET): 921. Registered in July 2016. International Prospective Register of Systematic Reviews (PROSPERO): CRD42016043999. Registered on 2 August 2016.

  11. Image Algebra Matlab language version 2.3 for image processing and compression research

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric

    2010-08-01

    Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida for over 15 years, beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with implementations in FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources that Matlab provides to the algorithm designer means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of iac++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1.
In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
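The image-template convolution operation central to image algebra combines an image with a template (a kernel of weighted neighbourhood offsets). IAM itself is written in Matlab; the Python rendering below is a language-neutral illustration of the operation, not the IAM API, and the zero-padding border rule is an illustrative choice.

```python
# Language-neutral sketch of image-template convolution (correlation
# form): each output pixel is the weighted sum of the image values under
# the template, with out-of-bounds neighbours treated as zero.

def template_convolve(image, template):
    h, w = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    oy, ox = th // 2, tw // 2          # template origin at its centre
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for ty in range(th):
                for tx in range(tw):
                    iy, ix = y + ty - oy, x + tx - ox
                    if 0 <= iy < h and 0 <= ix < w:
                        s += image[iy][ix] * template[ty][tx]
            out[y][x] = s
    return out

# 3x3 averaging template applied to an impulse image: the impulse energy
# is spread uniformly across the 3x3 output.
image = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
box = [[1 / 9] * 3 for _ in range(3)]
print(template_convolve(image, box))
```

In image algebra the same operation generalizes beyond sum-of-products: substituting other reduce/combine operators (max/plus, min/plus) yields the nonlinear neighbourhood operations the notation unifies.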

  12. Mass detection in digital breast tomosynthesis: Deep convolutional neural network with transfer learning from mammography

    PubMed Central

    Chan, Heang-Ping; Hadjiiski, Lubomir; Helvie, Mark A.; Wei, Jun; Cha, Kenny

    2016-01-01

    Purpose: Develop a computer-aided detection (CAD) system for masses in digital breast tomosynthesis (DBT) volumes using a deep convolutional neural network (DCNN) with transfer learning from mammograms. Methods: A data set containing 2282 digitized film and digital mammograms and 324 DBT volumes was collected with IRB approval. The mass of interest on the images was marked by an experienced breast radiologist as the reference standard. The data set was partitioned into a training set (2282 mammograms with 2461 masses and 230 DBT views with 228 masses) and an independent test set (94 DBT views with 89 masses). For DCNN training, the region of interest (ROI) containing the mass (true positive) was extracted from each image. False positive (FP) ROIs were identified at prescreening by the authors' previously developed CAD systems. After data augmentation, a total of 45,072 mammographic ROIs and 37,450 DBT ROIs were obtained. Data normalization and reduction of non-uniformity in the ROIs across the heterogeneous data were achieved using a background correction method applied to each ROI. A DCNN with four convolutional layers and three fully connected (FC) layers was first trained on the mammography data. Jittering and dropout techniques were used to reduce overfitting. After training with the mammographic ROIs, all weights in the first three convolutional layers were frozen, and only the last convolutional layer and the FC layers were randomly reinitialized and trained using the DBT training ROIs. The authors compared the performances of two CAD systems for mass detection in DBT: one used the DCNN-based approach and the other used their previously developed feature-based approach for FP reduction. The prescreening stage was identical in both systems, passing the same set of mass candidates to the FP reduction stage.
For the feature-based CAD system, a 3D clustering and active contour method was used for segmentation; morphological, gray level, and texture features were extracted and merged with a linear discriminant classifier to score the detected masses. For the DCNN-based CAD system, ROIs from five consecutive slices centered at each candidate were passed through the trained DCNN and a mass likelihood score was generated. The performances of the CAD systems were evaluated using free-response ROC curves, and the performance difference was analyzed using a non-parametric method. Results: Before transfer learning, the DCNN trained only on mammograms (AUC of 0.99 on mammography) classified DBT masses in the DBT training set with an AUC of 0.81. After transfer learning with DBT, the AUC improved to 0.90. For breast-based CAD detection in the test set, the sensitivities of the feature-based and the DCNN-based CAD systems were 83% and 91%, respectively, at 1 FP/DBT volume. The difference between the performances of the two systems was statistically significant (p-value < 0.05). Conclusions: The image patterns learned from the mammograms were transferred to mass detection on DBT slices through the DCNN. This study demonstrated that large data sets collected from mammography are useful for developing new CAD systems for DBT, alleviating the effort of collecting entirely new large data sets for the new modality. PMID:27908154
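    As a hedged sketch of the layer-freezing scheme the abstract describes (not the authors' code), the update below skips frozen layers so that only the re-initialized last convolutional layer and the FC layers learn from the DBT data; layer names, weights, and gradient values are invented for illustration.

```python
def sgd_step(params, grads, lr=0.01):
    """Update only trainable layers; frozen layers keep their weights."""
    return {
        name: (w - lr * grads[name] if trainable else w)
        for name, (w, trainable) in params.items()
    }

# (weight, trainable) per layer; conv1-conv3 frozen after mammography training
params = {
    "conv1": (1.0, False),
    "conv2": (1.0, False),
    "conv3": (1.0, False),
    "conv4": (1.0, True),   # re-initialized and re-trained on DBT ROIs
    "fc1":   (1.0, True),
    "fc2":   (1.0, True),
}
grads = {name: 0.5 for name in params}   # illustrative gradient values

updated = sgd_step(params, grads, lr=0.1)
# frozen layers stay at their pre-trained weights; trainable layers move by lr * grad
```

In a real framework, this corresponds to setting `requires_grad = False` (PyTorch) or `trainable = False` (Keras) on the pre-trained layers before fine-tuning.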

  13. Query-based biclustering of gene expression data using Probabilistic Relational Models.

    PubMed

    Zhao, Hui; Cloots, Lore; Van den Bulcke, Tim; Wu, Yan; De Smet, Riet; Storms, Valerie; Meysman, Pieter; Engelen, Kristof; Marchal, Kathleen

    2011-02-15

    With the availability of large-scale expression compendia, it is now possible to view one's own findings in the light of what is already available and to retrieve genes with an expression profile similar to a set of genes of interest (i.e., a query or seed set) for a subset of conditions. To that end, a query-based strategy is needed that maximally exploits the coexpression behaviour of the seed genes to guide the biclustering, but that at the same time is robust against the presence of noisy genes in the seed set, as seed genes are often assumed, but not guaranteed, to be coexpressed in the queried compendium. Therefore, we developed ProBic, a query-based biclustering strategy based on Probabilistic Relational Models (PRMs) that exploits the use of prior distributions to extract the information contained within the seed set. We applied ProBic to a large-scale Escherichia coli compendium to extend partially described regulons with potentially novel members. We compared ProBic's performance with previously published query-based biclustering algorithms, namely ISA and QDB, from the perspective of bicluster expression quality, robustness of the outcome against noisy seed sets, and biological relevance. This comparison shows that ProBic is able to retrieve biologically relevant, high-quality biclusters that retain their seed genes and that it is particularly strong in handling noisy seeds. ProBic is a query-based biclustering algorithm developed in a flexible framework, designed to detect biologically relevant, high-quality biclusters that retain relevant seed genes even in the presence of noise or when dealing with low-quality seed sets.
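    ProBic itself is built on Probabilistic Relational Models; as a loose, hypothetical illustration of the query-based retrieval goal only, the toy below ranks candidate genes by Pearson correlation with the mean expression profile of a seed set (all gene names and profiles invented):

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length, non-constant profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# invented expression profiles over four conditions
expr = {
    "seedA": [1.0, 2.0, 3.0, 4.0],
    "seedB": [1.1, 2.1, 2.9, 4.2],
    "geneX": [0.9, 2.0, 3.1, 3.8],   # co-expressed with the seed set
    "geneY": [4.0, 3.0, 2.0, 1.0],   # anti-correlated
}
seed_mean = [(a + b) / 2 for a, b in zip(expr["seedA"], expr["seedB"])]
ranked = sorted(["geneX", "geneY"], key=lambda g: -pearson(expr[g], seed_mean))
```

Unlike this toy, the PRM formulation also handles condition selection and is robust to noisy seed genes, which simple correlation ranking is not.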

  14. Development, refinement, and testing of a short term solar flare prediction algorithm

    NASA Technical Reports Server (NTRS)

    Smith, Jesse B., Jr.

    1993-01-01

    During the period covered by this report, time and effort toward the tasks and goals set forth in the two-year research grant proposal consisted primarily of calibration and analysis of selected data sets. The heliographic limits of 30 degrees from central meridian were retained. As previously reported, all analyses are interactive and are performed by the Principal Investigator. It should also be noted that the analysis time available to the Principal Investigator during this reporting period was limited, partially due to illness and partially resulting from other uncontrollable factors. The calibration technique (as developed by MSFC solar scientists) incorporates sets of constants which vary according to the wavelength of the observation data set. One input constant is then varied interactively to correct for observing conditions, etc., so as to yield a maximum magnetic field strength (in the calibrated data) based on a separate analysis. There is some uncertainty in the methodology and in the selection of variables that yield the most self-consistent results for variable maximum field strengths and for variable observing/atmospheric conditions. Several data sets were analyzed using differing constant sets and separate analyses to differing maximum field strengths, toward standardizing methodology and technique for the most self-consistent results over the large number of cases. It may be necessary to recalibrate some of the analyses, but the sc analyses are retained on the optical disks and can still be used with recalibration where necessary. Only the extracted parameters will be changed.

  15. Quantification of print, radio and television exposure among previous blood donors in Kenya: an opportunity for encouraging repeat donation in a resource-limited setting?

    PubMed

    Basavaraju, S V; Mwangi, J; Kellogg, T A; Odawo, L; Marum, L H

    2010-10-01

    Blood services in sub-Saharan Africa experience blood shortages and low retention of voluntary, non-remunerated donors. To boost collections by encouraging repeat donations, the Kenya National Blood Transfusion Service is exploring the likelihood of reaching previous donors through targeted print, radio and television advertising. We analysed data from a national AIDS Indicator Survey to determine whether previous donors have significant exposure to media. Respondents reporting history of blood donation had significantly higher exposure to print, radio and television media than those without history of blood donation. Targeted media campaigns encouraging repeat donation are likely to reach previous donors even in resource-limited settings.

  16. [Prediction of the total Japanese cedar pollen counts based on male flower-setting conditions of standard trees].

    PubMed

    Yuta, Atsushi; Ukai, Kotaro; Sakakura, Yasuo; Tani, Hideshi; Matsuda, Fukiko; Yang, Tian-qun; Majima, Yuichi

    2002-07-01

    We made a prediction of the Japanese cedar (Cryptomeria japonica) pollen counts at Tsu city based on male flower-setting conditions of standard trees. Sixty-nine standard trees from 23 kinds of clones, planted at the Mie Prefecture Science and Technology Promotion Center (Hakusan, Mie) in 1964, were selected. Male flower-setting conditions for 276 faces (69 trees x 4 points of the compass) were scored from 0 to 3. The average scores and total pollen counts from 1988 to 2000 were analyzed. As a result, the average scores from the standard trees and the total pollen counts, excluding the two mass pollen-scattering years of 1995 and 2000, had a positive linear correlation (r = 0.914). In the mass pollen-scattering years, pollen counts were influenced by the previous year. Therefore, the score of the present year minus that of the previous year was used for analysis. The average scores from male flower-setting conditions and pollen counts had a strong positive correlation (r = 0.994) when positive scores taking account of the previous year were analyzed. We conclude that prediction of pollen counts is possible based on the male flower-setting conditions of standard trees.
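    The differenced-score analysis amounts to a simple linear regression of total pollen counts on the year-over-year change in mean male-flower score. A minimal sketch with invented numbers (the paper's actual scores and counts are not reproduced here):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

mean_scores = [1.2, 0.8, 2.1, 1.5, 2.8]   # invented mean flower scores, by year
diff_scores = [b - a for a, b in zip(mean_scores, mean_scores[1:])]
pollen = [1200, 4600, 800, 4600]          # invented counts aligned with diff_scores

slope, intercept = fit_line(diff_scores, pollen)
predicted = slope * 0.5 + intercept   # forecast for a hypothetical score change of +0.5
```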

  17. Prediction of lipoprotein signal peptides in Gram-negative bacteria.

    PubMed

    Juncker, Agnieszka S; Willenbrock, Hanni; Von Heijne, Gunnar; Brunak, Søren; Nielsen, Henrik; Krogh, Anders

    2003-08-01

    A method to predict lipoprotein signal peptides in Gram-negative Eubacteria, LipoP, has been developed. The hidden Markov model (HMM) was able to distinguish between lipoproteins (SPaseII-cleaved proteins), SPaseI-cleaved proteins, cytoplasmic proteins, and transmembrane proteins. This predictor was able to predict 96.8% of the lipoproteins correctly with only 0.3% false positives in a set of SPaseI-cleaved, cytoplasmic, and transmembrane proteins. The results obtained were significantly better than those of previously developed methods. Even though Gram-positive lipoprotein signal peptides differ from Gram-negative ones, the HMM was able to identify 92.9% of the lipoproteins included in a Gram-positive test set. A genome search was carried out for 12 Gram-negative genomes and one Gram-positive genome. The results for Escherichia coli K12 were compared with new experimental data, and the predictions by the HMM agree well with the experimentally verified lipoproteins. A neural network-based predictor was developed for comparison, and it gave very similar results. LipoP is available as a Web server at www.cbs.dtu.dk/services/LipoP/.
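    LipoP's classifier is an HMM (with a neural network variant for comparison); as a deliberately crude stand-in, the sketch below only scans the N-terminal region for a lipobox-like motif ([LVI][ASTVI][GAS]C), the sequence hallmark of SPaseII cleavage. The window size, motif definition, and sequences are simplifications, not the published model:

```python
import re

# consensus lipobox preceding the SPaseII cleavage site (invariant Cys)
LIPOBOX = re.compile(r"[LVI][ASTVI][GAS]C")

def looks_like_lipoprotein(seq, window=40):
    """Crude stand-in for the HMM: is a lipobox present in the N-terminal window?"""
    return bool(LIPOBOX.search(seq[:window]))

# hypothetical sequences: the first carries a lipobox (...LVAS-C...), the second does not
is_lipo = looks_like_lipoprotein("MKKLLIAGLVASC" + "A" * 30)
not_lipo = looks_like_lipoprotein("MKKLLIAGAAAA" + "A" * 30)
```

A real HMM additionally models the charged n-region, hydrophobic h-region, and cleavage-site context jointly, which is what gives LipoP its low false-positive rate.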

  18. Prediction of lipoprotein signal peptides in Gram-negative bacteria

    PubMed Central

    Juncker, Agnieszka S.; Willenbrock, Hanni; von Heijne, Gunnar; Brunak, Søren; Nielsen, Henrik; Krogh, Anders

    2003-01-01

    A method to predict lipoprotein signal peptides in Gram-negative Eubacteria, LipoP, has been developed. The hidden Markov model (HMM) was able to distinguish between lipoproteins (SPaseII-cleaved proteins), SPaseI-cleaved proteins, cytoplasmic proteins, and transmembrane proteins. This predictor was able to predict 96.8% of the lipoproteins correctly with only 0.3% false positives in a set of SPaseI-cleaved, cytoplasmic, and transmembrane proteins. The results obtained were significantly better than those of previously developed methods. Even though Gram-positive lipoprotein signal peptides differ from Gram-negative ones, the HMM was able to identify 92.9% of the lipoproteins included in a Gram-positive test set. A genome search was carried out for 12 Gram-negative genomes and one Gram-positive genome. The results for Escherichia coli K12 were compared with new experimental data, and the predictions by the HMM agree well with the experimentally verified lipoproteins. A neural network-based predictor was developed for comparison, and it gave very similar results. LipoP is available as a Web server at www.cbs.dtu.dk/services/LipoP/. PMID:12876315

  19. Jet-Surface Interaction Test: Flow Measurements Results

    NASA Technical Reports Server (NTRS)

    Brown, Cliff; Wernet, Mark

    2014-01-01

    Modern aircraft design often puts the engine exhaust in close proximity to the airframe surfaces. Aircraft noise prediction tools must continue to develop in order to meet the challenges these aircraft present. The Jet-Surface Interaction Tests have been conducted to provide a comprehensive, high-quality set of experimental data suitable for development and validation of these exhaust noise prediction methods. Flow measurements have been acquired using streamwise and cross-stream particle image velocimetry (PIV), and fluctuating surface pressure data have been acquired using flush mounted pressure transducers near the surface trailing edge. These data, combined with previously reported far-field and phased array noise measurements, represent the first step toward the experimental database. These flow data are particularly applicable to development of noise prediction methods which rely on computational fluid dynamics to uncover the flow physics. A representative sample of the large flow data set acquired is presented here to show how a surface near a jet affects the turbulent kinetic energy in the plume, the spatial relationship between the jet plume and surface needed to generate surface trailing-edge noise, and differences between heated and unheated jet flows with respect to surfaces.

  20. Artificial neural network models for prediction of cardiovascular autonomic dysfunction in general Chinese population

    PubMed Central

    2013-01-01

    Background: The present study aimed to develop an artificial neural network (ANN) based prediction model for cardiovascular autonomic (CA) dysfunction in the general population. Methods: We analyzed a previous dataset based on a population sample consisting of 2,092 individuals aged 30–80 years. The prediction models were derived from an exploratory set using ANN analysis. Performances of these prediction models were evaluated in the validation set. Results: Univariate analysis indicated that 14 risk factors showed statistically significant association with CA dysfunction (P < 0.05). The mean area under the receiver-operating curve was 0.762 (95% CI 0.732–0.793) for the prediction model developed using ANN analysis. The mean sensitivity, specificity, and positive and negative predictive values of the prediction models were 0.751, 0.665, 0.330, and 0.924, respectively. All Hosmer-Lemeshow (HL) statistics were less than 15.0. Conclusion: ANN is an effective tool for developing prediction models with high value for predicting CA dysfunction among the general population. PMID:23902963
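    The discrimination figure reported above is an area under the ROC curve. A minimal, generic sketch of computing AUC from predicted risks via the rank-sum (Mann-Whitney) formulation, with invented scores and labels (not the study's data):

```python
def auc(scores, labels):
    """Rank-sum AUC: fraction of (positive, negative) pairs ranked correctly."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # invented predicted risks
labels = [1,   1,   0,   1,   0,   0]     # invented outcomes
value = auc(scores, labels)
```

An AUC of 0.5 means the model ranks cases no better than chance; 1.0 means every case scores above every non-case.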

  1. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    PubMed Central

    Djurfeldt, Mikael; Davison, Andrew P.; Eppler, Jochen M.

    2014-01-01

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface. PMID:24795620
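    The published interface is specified for C++ and Python simulator integration; the hypothetical Python sketch below (class and method names invented, not the actual API) only illustrates the core idea of a library-agnostic iterator of (source, target, weight) tuples that a simulator consumes without knowing which connectivity library produced it:

```python
class ConnectionGenerator:
    """Interface: any object iterable over (source, target, weight) tuples."""
    def connections(self):
        raise NotImplementedError

class AllToAll(ConnectionGenerator):
    """One possible connectivity library behind the interface."""
    def __init__(self, sources, targets, weight=1.0):
        self.sources, self.targets, self.weight = sources, targets, weight
    def connections(self):
        for s in self.sources:
            for t in self.targets:
                yield (s, t, self.weight)

def build_network(generator):
    """Simulator side: consume connections without knowing their origin."""
    return list(generator.connections())

net = build_network(AllToAll([0, 1], [2, 3], weight=0.5))
```

Swapping in a different `ConnectionGenerator` subclass changes the connectivity library without touching the simulator-side `build_network` code, which is the decoupling the interface provides.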

  2. Pistil Starch Reserves at Anthesis Correlate with Final Flower Fate in Avocado (Persea americana)

    PubMed Central

    Alcaraz, María Librada; Hormaza, José Ignacio; Rodrigo, Javier

    2013-01-01

    A common observation in different plant species is a massive abscission of flowers and fruitlets even after adequate pollination, but little is known as to the reason for this drop. Previous research has shown the importance of nutritive reserves accumulated in the flower on fertilization success and initial fruit development, but direct evidence has been elusive. Avocado (Persea americana) is an extreme case of a species with a very low fruit to flower ratio. In this work, the implications of starch content in the avocado flower on the subsequent fruit set are explored. Firstly, starch content in individual ovaries was analysed from two populations of flowers with a different fruit set capacity, showing that the flowers from the population that resulted in a higher percentage of fruit set contained significantly more starch. Secondly, in a different set of flowers, the style of each flower was excised one day after pollination, once the pollen tubes had reached the base of the style, and individually fixed for starch content analysis under the microscope once the fate of its corresponding ovary (which remained on the tree) was known. A high variability in starch content in the style was found among flowers, with some flowers having starch content up to 1,000 times higher than others, and the flowers that successfully developed into fruits presented significantly higher starch content in the style at anthesis than those that abscised. The relationship between starch content in the ovary and the fruit set capacity of the flower, together with the correlation found between the starch content in the style and the fate of the ovary, support the hypothesis that the carbohydrate reserves accumulated in the flower at anthesis are related to subsequent abscission or retention of the developing fruit. PMID:24167627

  3. Pistil starch reserves at anthesis correlate with final flower fate in avocado (Persea americana).

    PubMed

    Alcaraz, María Librada; Hormaza, José Ignacio; Rodrigo, Javier

    2013-01-01

    A common observation in different plant species is a massive abscission of flowers and fruitlets even after adequate pollination, but little is known as to the reason for this drop. Previous research has shown the importance of nutritive reserves accumulated in the flower on fertilization success and initial fruit development, but direct evidence has been elusive. Avocado (Persea americana) is an extreme case of a species with a very low fruit to flower ratio. In this work, the implications of starch content in the avocado flower on the subsequent fruit set are explored. Firstly, starch content in individual ovaries was analysed from two populations of flowers with a different fruit set capacity, showing that the flowers from the population that resulted in a higher percentage of fruit set contained significantly more starch. Secondly, in a different set of flowers, the style of each flower was excised one day after pollination, once the pollen tubes had reached the base of the style, and individually fixed for starch content analysis under the microscope once the fate of its corresponding ovary (which remained on the tree) was known. A high variability in starch content in the style was found among flowers, with some flowers having starch content up to 1,000 times higher than others, and the flowers that successfully developed into fruits presented significantly higher starch content in the style at anthesis than those that abscised. The relationship between starch content in the ovary and the fruit set capacity of the flower, together with the correlation found between the starch content in the style and the fate of the ovary, support the hypothesis that the carbohydrate reserves accumulated in the flower at anthesis are related to subsequent abscission or retention of the developing fruit.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unseren, M.A.

    The report reviews a method for modeling and controlling two serial link manipulators which mutually lift and transport a rigid body object in a three dimensional workspace. A new vector variable is introduced which parameterizes the internal contact force controlled degrees of freedom. A technique for dynamically distributing the payload between the manipulators is suggested which yields a family of solutions for the contact forces and torques the manipulators impart to the object. A set of rigid body kinematic constraints which restricts the values of the joint velocities of both manipulators is derived. A rigid body dynamical model for the closed chain system is first developed in the joint space. The model is obtained by generalizing the previous methods for deriving the model. The joint velocity and acceleration variables in the model are expressed in terms of independent pseudovariables. The pseudospace model is transformed to obtain reduced order equations of motion and a separate set of equations governing the internal components of the contact forces and torques. A theoretic control architecture is suggested which explicitly decouples the two sets of equations comprising the model. The controller enables the designer to develop independent, non-interacting control laws for the position control and internal force control of the system.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unseren, M.A.

    The paper reviews a method for modeling and controlling two serial link manipulators which mutually lift and transport a rigid body object in a three dimensional workspace. A new vector variable is introduced which parameterizes the internal contact force controlled degrees of freedom. A technique for dynamically distributing the payload between the manipulators is suggested which yields a family of solutions for the contact forces and torques the manipulators impart to the object. A set of rigid body kinematic constraints which restrict the values of the joint velocities of both manipulators is derived. A rigid body dynamical model for the closed chain system is first developed in the joint space. The model is obtained by generalizing the previous methods for deriving the model. The joint velocity and acceleration variables in the model are expressed in terms of independent pseudovariables. The pseudospace model is transformed to obtain reduced order equations of motion and a separate set of equations governing the internal components of the contact forces and torques. A theoretic control architecture is suggested which explicitly decouples the two sets of equations comprising the model. The controller enables the designer to develop independent, non-interacting control laws for the position control and internal force control of the system.

  6. The upper bound of abutment scour defined by selected laboratory and field data

    USGS Publications Warehouse

    Benedict, Stephen; Caldwell, Andral W.

    2015-01-01

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a field investigation of abutment scour in South Carolina and used that data to develop envelope curves defining the upper bound of abutment scour. To expand upon this previous work, an additional cooperative investigation was initiated to combine the South Carolina data with abutment-scour data from other sources and evaluate the upper bound of abutment scour with the larger data set. To facilitate this analysis, a literature review was made to identify potential sources of published abutment-scour data, and selected data, consisting of 446 laboratory and 331 field measurements, were compiled for the analysis. These data encompassed a wide range of laboratory and field conditions and represent field data from 6 states within the United States. The data set was used to evaluate the South Carolina abutment-scour envelope curves. Additionally, the data were used to evaluate a dimensionless abutment-scour envelope curve developed by Melville (1992), highlighting the distinct difference in the upper bound for laboratory and field data. The envelope curves evaluated in this investigation provide simple but useful tools for assessing the potential maximum abutment-scour depth in the field setting.
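    An envelope curve of this kind can be thought of as the maximum observed scour within bins of some explanatory variable. The sketch below is a generic illustration with invented variable names and values, not the publication's actual curves or data:

```python
def envelope(xs, ys, bin_width=1.0):
    """Maximum observed y within each x bin -> piecewise-constant upper bound."""
    bounds = {}
    for x, y in zip(xs, ys):
        b = int(x // bin_width)
        bounds[b] = max(bounds.get(b, float("-inf")), y)
    return dict(sorted(bounds.items()))

velocity = [0.3, 0.7, 1.2, 1.4, 2.1, 2.8]   # hypothetical explanatory variable
scour    = [0.5, 1.1, 1.8, 1.2, 2.9, 2.4]   # hypothetical scour depths
env = envelope(velocity, scour)
```

A curve drawn through (or just above) these per-bin maxima bounds every observation in the data set, which is what makes envelope curves useful as conservative design checks.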

  7. Developing Electronic Health Record (EHR) Strategies Related to Health Center Patients' Social Determinants of Health.

    PubMed

    Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo

    2017-01-01

    "Social determinants of heath" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.

  8. Changing Habits of Practice

    PubMed Central

    Bowen, Judith L; Salerno, Stephen M; Chamberlain, John K; Eckstrom, Elizabeth; Chen, Helen L; Brandenburg, Suzanne

    2005-01-01

    Purpose: The majority of health care, both for acute and chronic conditions, is delivered in the ambulatory setting. Despite repeated proposals for change, the majority of internal medicine residency training still occurs in the inpatient setting. Substantial changes in ambulatory education are needed to correct the current imbalance. To assist educators and policy makers in this process, this paper reviews the literature on ambulatory education and makes recommendations for change. Methods: The authors searched the Medline, Psychlit, and ERIC databases from 2000 to 2004 for studies that focused specifically on curriculum, teaching, and evaluation of internal medicine residents in the ambulatory setting to update previous reviews. Studies had to contain primary data and were reviewed for methodological rigor and relevance. Results: Fifty-five studies met criteria for review. Thirty-five of the studies focused on specific curricular areas and 11 on ambulatory teaching methods. Five involved evaluating performance and 4 focused on structural issues. No study evaluated the overall effectiveness of ambulatory training or investigated the effects of current resident continuity clinic microsystems on education. Conclusion: This updated review continues to identify key deficiencies in ambulatory training curriculum and faculty skills. The authors make several recommendations: (1) Make training in the ambulatory setting a priority. (2) Address systems problems in practice environments. (3) Create learning experiences appropriate to the resident's level of development. (4) Teach and evaluate in the examination room. (5) Expand subspecialty-based training to the ambulatory setting. (6) Make faculty development a priority. (7) Create and fund multi-institutional educational research consortia. PMID:16423112

  9. Discovering relationships between nuclear receptor signaling pathways, genes, and tissues in Transcriptomine.

    PubMed

    Becnel, Lauren B; Ochsner, Scott A; Darlington, Yolanda F; McOwiti, Apollo; Kankanamge, Wasula H; Dehart, Michael; Naumov, Alexey; McKenna, Neil J

    2017-04-25

    We previously developed a web tool, Transcriptomine, to explore expression profiling data sets involving small-molecule or genetic manipulations of nuclear receptor signaling pathways. We describe advances in biocuration, query interface design, and data visualization that enhance the discovery of uncharacterized biology in these pathways using this tool. Transcriptomine currently contains about 45 million data points encompassing more than 2000 experiments in a reference library of nearly 550 data sets retrieved from public archives and systematically curated. To make the underlying data points more accessible to bench biologists, we classified experimental small molecules and gene manipulations into signaling pathways and experimental tissues and cell lines into physiological systems and organs. Incorporation of these mappings into Transcriptomine enables the user to readily evaluate tissue-specific regulation of gene expression by nuclear receptor signaling pathways. Data points from animal and cell model experiments and from clinical data sets elucidate the roles of nuclear receptor pathways in gene expression events accompanying various normal and pathological cellular processes. In addition, data sets targeting non-nuclear receptor signaling pathways highlight transcriptional cross-talk between nuclear receptors and other signaling pathways. We demonstrate with specific examples how data points that exist in isolation in individual data sets validate each other when connected and made accessible to the user in a single interface. In summary, Transcriptomine allows bench biologists to routinely develop research hypotheses, validate experimental data, or model relationships between signaling pathways, genes, and tissues. Copyright © 2017, American Association for the Advancement of Science.

  10. Improving record linkage performance in the presence of missing linkage data.

    PubMed

    Ong, Toan C; Mannino, Michael V; Schilling, Lisa M; Kahn, Michael G

    2014-12-01

    Existing record linkage methods do not handle missing linking field values in an efficient and effective manner. The objective of this study is to investigate three novel methods for improving the accuracy and efficiency of record linkage when record linkage fields have missing values. By extending the Fellegi-Sunter scoring implementations available in the open-source Fine-grained Record Linkage (FRIL) software system, we developed three novel methods to solve the missing data problem in record linkage, which we refer to as: Weight Redistribution, Distance Imputation, and Linkage Expansion. Weight Redistribution removes fields with missing data from the set of quasi-identifiers and redistributes the weight from the missing attribute based on relative proportions across the remaining available linkage fields. Distance Imputation imputes the distance between the missing data fields rather than imputing the missing data value. Linkage Expansion adds previously considered non-linkage fields to the linkage field set to compensate for the missing information in a linkage field. We tested the linkage methods using simulated data sets with varying field value corruption rates. The methods developed had sensitivity ranging from .895 to .992 and positive predictive values (PPV) ranging from .865 to 1 in data sets with low corruption rates. Increased corruption rates led to decreased sensitivity for all methods. These new record linkage algorithms show promise in terms of accuracy and efficiency and may be valuable for combining large data sets at the patient level to support biomedical and clinical research. Copyright © 2014 Elsevier Inc. All rights reserved.
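    As a hedged illustration of the Weight Redistribution idea (field names and weights invented, and disagreement penalties omitted for brevity): when a linkage field is missing, it is dropped and the remaining fields' weights are scaled up proportionally, so the total attainable weight is preserved.

```python
def match_score(weights, agreements):
    """Simplified Fellegi-Sunter-style score; agreements[f] is True/False/None (missing)."""
    present = {f: w for f, w in weights.items() if agreements[f] is not None}
    if not present:
        return 0.0
    scale = sum(weights.values()) / sum(present.values())   # redistribute missing weight
    return sum(w * scale for f, w in present.items() if agreements[f])

weights = {"last_name": 4.0, "birth_date": 3.0, "zip": 1.0}   # invented field weights
# birth_date missing: its weight is spread over last_name and zip
score = match_score(weights, {"last_name": True, "birth_date": None, "zip": False})
```

Here the missing birth_date's weight of 3.0 is spread over the two remaining fields (scale 8/5), so an agreeing last_name contributes 6.4 rather than 4.0.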

  11. Plasmodium falciparum PfSET7: enzymatic characterization and cellular localization of a novel protein methyltransferase in sporozoite, liver and erythrocytic stage parasites

    PubMed Central

    Chen, Patty B.; Ding, Shuai; Zanghì, Gigliola; Soulard, Valérie; DiMaggio, Peter A.; Fuchter, Matthew J.; Mecheri, Salah; Mazier, Dominique; Scherf, Artur; Malmquist, Nicholas A.

    2016-01-01

    Epigenetic control via reversible histone methylation regulates transcriptional activation throughout the malaria parasite genome, controls the repression of multi-copy virulence gene families and determines sexual stage commitment. Plasmodium falciparum encodes ten predicted SET domain-containing protein methyltransferases, six of which have been shown to be refractory to knock-out in blood stage parasites. We have expressed and purified the first recombinant malaria methyltransferase in sufficient quantities to perform a full enzymatic characterization and reveal that the ill-defined PfSET7 is an AdoMet-dependent histone H3 lysine methyltransferase with highest activity towards lysines 4 and 9. Steady-state kinetics of the PfSET7 enzyme are similar to previously characterized histone methyltransferase enzymes from other organisms; however, PfSET7 displays specific protein substrate preference towards nucleosomes with pre-existing histone H3 lysine 14 acetylation. Interestingly, PfSET7 localizes to distinct cytoplasmic foci adjacent to the nucleus in erythrocytic and liver stage parasites, and throughout the cytoplasm in salivary gland sporozoites. Characterized recombinant PfSET7 now allows for target-based inhibitor discovery. Specific PfSET7 inhibitors can aid in further investigating the biological role of this specific methyltransferase in transmission, hepatic and blood stage parasites, and may ultimately lead to the development of suitable antimalarial drug candidates against this novel class of essential parasite enzymes. PMID:26902486

  12. A model of strategic marketing alliances for hospices: vertical, internal, osmotic alliances and the complete model.

    PubMed

    Starnes, B J; Self, D R

    1999-01-01

    This article builds on two previous research efforts. William J. Winston (1994, 1995) has proposed a set of strategies by which health care organizations can benefit from forging strategic alliances. Raadt and Self (1997) have proposed a classification model of alliances including horizontal, vertical, internal, and osmotic. As the second of two articles, this paper presents a model of vertical, internal, and osmotic alliances. Advantages and disadvantages of each are discussed. Finally, the complete alliance system model is presented.

  13. Practical uncertainty reduction and quantification in shock physics measurements

    DOE PAGES

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.
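    The abstract does not spell out the sampling method, but the core task of locating inflection points in sampled data can be illustrated generically. The numpy sketch below flags sign changes of the discrete second difference; it is a textbook device offered for orientation, not the authors' technique.

```python
import numpy as np

def inflection_indices(y):
    """Sample indices where the discrete second difference changes sign."""
    d2 = np.diff(y, n=2)
    sign = np.sign(d2)
    # A sign flip between consecutive second differences marks a candidate
    # inflection point; +1 re-centers on the middle sample of the stencil.
    return np.where(sign[:-1] * sign[1:] < 0)[0] + 1

# A cubic with a single inflection point at x = 0.
x = np.linspace(-2, 2, 200)   # grid chosen so no sample falls exactly on 0
y = x**3 - 3 * x
idx = inflection_indices(y)
```

    On noisy experimental data one would smooth or fit the samples before differencing, since the second difference amplifies noise.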

  14. Military Construction Naval Reserve Justification Data Submitted to Congress FY 1985.

    DTIC Science & Technology

    1984-02-01

    FOR FOOT TRAFFIC. THE STRUCTURAL PILES ARE REDUCED TO A STALACTITE-STALAGMITE CONFIGURATION. THIS STATION HAS NO PIER TO SUPPORT SHIPS. THERE IS NO...structure, object or setting listed in the National Register of Historic Places except as noted on DD Form 1391. Environmental Protection In accordance...ESTABLISHED OR DEVELOPED UNDER THIS CHAPTER WHICH ARE NOT OTHERWISE AUTHORIZED BY LAW.

  15. Wood industrial application for quality control using image processing

    NASA Astrophysics Data System (ADS)

    Ferreira, M. J. O.; Neves, J. A. C.

    1994-11-01

    This paper describes an application of image processing for the furniture industry. It uses as input data images acquired directly from wood planks on which defects were previously marked by an operator. A set of image processing algorithms separates and codes each defect and derives a polygonal approximation of the line representing it. For this purpose we developed a pattern classification algorithm and a new technique for segmenting defects by carving the convex hull of the binary shape representing each isolated defect.
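    The paper does not name its polygonal approximation algorithm, so as a plausible stand-in the sketch below implements the classic Ramer-Douglas-Peucker simplification, which reduces a traced outline to a few vertices within a distance tolerance. The outline coordinates are invented.

```python
import numpy as np

def rdp(points, eps):
    """Ramer-Douglas-Peucker simplification of a polyline."""
    pts = np.asarray(points, dtype=float)
    start, end = pts[0], pts[-1]
    seg = end - start
    norm = np.hypot(seg[0], seg[1])
    diff = pts - start
    if norm == 0.0:
        dists = np.hypot(diff[:, 0], diff[:, 1])
    else:
        # Perpendicular distance of every point to the start-end chord.
        dists = np.abs(seg[0] * diff[:, 1] - seg[1] * diff[:, 0]) / norm
    i = int(np.argmax(dists))
    if dists[i] > eps:
        # Keep the farthest point and simplify both halves recursively.
        left = rdp(pts[: i + 1], eps)
        right = rdp(pts[i:], eps)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])

# Hypothetical defect outline: nearly flat except for one spike.
outline = [(0, 0), (1, 0.05), (2, -0.04), (3, 0.02), (4, 3), (5, 0)]
simplified = rdp(outline, eps=0.1)
```

    Small wiggles below the tolerance are collapsed while the spike at (4, 3) survives as a vertex of the simplified polygon.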

  16. Sediment and Hydraulic Measurements with Computed Bed Load on the Missouri River, Sioux City to Hermann, 2014

    DTIC Science & Technology

    2017-05-01

    large sand bed river, with seven sites representing increasingly larger flows along the river length. The data set will be very useful for additional...quantity, quality, and types of data that can be obtained for the study of natural phenomena. The study of riverine sedimentation is no exception...detail than in previous years. Additionally, new methodologies have been developed that allow the computation of bed-load transport in large sand bed

  17. Human Myoblast Fusion Requires Expression of Functional Inward Rectifier Kir2.1 Channels

    PubMed Central

    Fischer-Lougheed, Jacqueline; Liu, Jian-Hui; Espinos, Estelle; Mordasini, David; Bader, Charles R.; Belin, Dominique; Bernheim, Laurent

    2001-01-01

    Myoblast fusion is essential to skeletal muscle development and repair. We have demonstrated previously that human myoblasts hyperpolarize, before fusion, through the sequential expression of two K+ channels: an ether-à-go-go and an inward rectifier. This hyperpolarization is a prerequisite for fusion, as it sets the resting membrane potential in a range at which Ca2+ can enter myoblasts and thereby trigger fusion via a window current through α1H T channels. PMID:11352930

  18. The integration of a mesh reflector to a 15-foot box truss structure. Task 3: Box truss analysis and technology development

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Thiemet, W. F.; Morosow, G.

    1987-01-01

    To demonstrate the design and integration of a reflective mesh surface to a deployable truss structure, a mesh reflector was installed on a 15 foot box truss cube. The specific features demonstrated include: (1) sewing seams in reflective mesh; (2) mesh stretching to desired preload; (3) installation of surface tie cords; (4) installation of reflective surface on truss; (5) setting of reflective surface; (6) verification of surface shape/accuracy; (7) storage and deployment; (8) repeatability of reflector surface; and (9) comparison of surface with predicted shape using analytical methods developed under a previous task.

  19. Development of year 2020 goals for the National HIV/AIDS Strategy for the United States.

    PubMed

    Holtgrave, David R

    2014-04-01

    In July, 2010, President Barack Obama released the National HIV/AIDS Strategy (NHAS). The NHAS set forth ambitious goals for the year 2015. These goals were potentially achievable had the appropriate level of resources been invested; however, investment at the necessary scale has not been made and the 2015 goals now may well be out of reach. Therefore, we propose that an updated NHAS be developed with goals for the year 2020 clearly articulated. For the purposes of fostering discussion on this important topic, we propose bold yet achievable quantitative 2020 goals based on previously published economic and mathematical modeling analyses.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, Robert G.; Forsythe, James C.

    Adaptive Thinking has been defined here as the capacity to recognize when a course of action that may have previously been effective is no longer effective and there is need to adjust strategy. Research was undertaken with human test subjects to identify the factors that contribute to adaptive thinking. It was discovered that those most effective in settings that call for adaptive thinking tend to possess a superior capacity to quickly and effectively generate possible courses of action, as measured using the Category Generation test. Software developed for this research has been applied to develop capabilities enabling analysts to identify crucial factors that are predictive of outcomes in force-on-force simulation exercises.

  1. Squamous cell carcinoma arising in Hailey-Hailey disease of the vulva.

    PubMed

    Cockayne, S E; Rassl, D M; Thomas, S E

    2000-03-01

    A 61-year-old woman, who was known to have Hailey-Hailey disease, presented with increasing vulval soreness. Biopsy showed vulval intraepithelial neoplasia (VIN) 3 and subsequent histology from a vulvectomy specimen showed extensive VIN with early invasive squamous cell carcinoma. This may be another example of chronic inflammation of the vulval area leading to the development of squamous cell carcinoma. However, in this case, chronic human papillomavirus infection may also have played a part, leading to VIN and reactivation of the Hailey-Hailey disease. We can find no previous reports of squamous cell carcinoma developing in the setting of Hailey-Hailey disease.

  2. Automated discovery of local search heuristics for satisfiability testing.

    PubMed

    Fukunaga, Alex S

    2008-01-01

    The development of successful metaheuristic algorithms such as local search for a difficult problem such as satisfiability testing (SAT) is a challenging task. We investigate an evolutionary approach to automating the discovery of new local search heuristics for SAT. We show that several well-known SAT local search algorithms such as Walksat and Novelty are composite heuristics that are derived from novel combinations of a set of building blocks. Based on this observation, we developed CLASS, a genetic programming system that uses a simple composition operator to automatically discover SAT local search heuristics. New heuristics discovered by CLASS are shown to be competitive with the best Walksat variants, including Novelty+. Evolutionary algorithms have previously been applied to directly evolve a solution for a particular SAT instance. We show that the heuristics discovered by CLASS are also competitive with these previous, direct evolutionary approaches for SAT. We also analyze the local search behavior of the learned heuristics using the depth, mobility, and coverage metrics proposed by Schuurmans and Southey.
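    For readers unfamiliar with the building blocks being composed, a minimal WalkSAT-style local search can be sketched as follows. This is a bare illustration of the classic noise-parameter scheme, not Fukunaga's CLASS system or one of its evolved heuristics; the example formula is invented.

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=10_000, seed=0):
    """Minimal WalkSAT-style sketch. A literal v > 0 means variable v
    is true; -v means variable v is false."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars + 1)]

    def sat(lit):
        return assign[abs(lit)] == (lit > 0)

    def breaks(v):
        # Number of clauses left unsatisfied if variable v were flipped.
        assign[v] = not assign[v]
        count = sum(not any(sat(l) for l in c) for c in clauses)
        assign[v] = not assign[v]
        return count

    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                    # satisfying assignment found
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = abs(rng.choice(clause))    # noise: random walk move
        else:                                # greedy move
            var = min((abs(l) for l in clause), key=breaks)
        assign[var] = not assign[var]
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
model = walksat(clauses, n_vars=3)
```

    With probability p the heuristic takes a random walk step; otherwise it greedily flips the variable in the chosen unsatisfied clause that breaks the fewest clauses. Variants such as Novelty differ mainly in this variable-selection rule, which is exactly the component CLASS recombines.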

  3. Terminal Area Productivity Airport Wind Analysis and Chicago O'Hare Model Description

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Shapiro, Gerald

    1998-01-01

    This paper describes two results from a continuing effort to provide accurate cost-benefit analyses of the NASA Terminal Area Productivity (TAP) program technologies. Previous tasks have developed airport capacity and delay models and completed preliminary cost benefit estimates for TAP technologies at 10 U.S. airports. This task covers two improvements to the capacity and delay models. The first improvement is the completion of a detailed model set for the Chicago O'Hare (ORD) airport. Previous analyses used a more general model to estimate the benefits for ORD. This paper contains a description of the model details with results corresponding to current conditions. The second improvement is the development of specific wind speed and direction criteria for use in the delay models to predict when the Aircraft Vortex Spacing System (AVOSS) will allow use of reduced landing separations. This paper includes a description of the criteria and an estimate of AVOSS utility for 10 airports based on analysis of 35 years of weather data.

  4. Disruption of endosperm development: an inbreeding effect in almond (Prunus dulcis).

    PubMed

    Ortega, Encarnación; Martínez-García, Pedro J; Dicenta, Federico; Egea, José

    2010-06-01

    A homozygous self-compatible almond, originated from self-fertilization of a self-compatible genotype and producing a reasonable yield following open pollination, exhibited a very high fruit drop rate when self-pollinated. To investigate whether fruit dropping in this individual is related to an abnormal development of the embryo sac following self-fertilization, histological sections of ovaries from self and cross-pollinated flowers were observed by light microscopy. Additionally, the presence of pollen tubes in the ovary and fruit set were determined for both types of pollination. Although pollen tubes reached the ovary after both pollinations, differences in embryo sac and endosperm development after fertilization were found. Thus, while for cross-fertilized ovules a pro-embryo and an endosperm with abundant nuclei were generally observed, most self-fertilized ovules remained in a previous developmental stage in which the embryo sac was not elongated and endosperm nuclei were absent. Although 30 days after pollination fruit set was similar for both pollination types, at 60 days it was significantly reduced for self-pollination. These results provide evidence that the high fruit drop in this genotype is the consequence of a disrupted development of the endosperm, which could be an expression of its high level of inbreeding.

  5. The role of collaborative ontology development in the knowledge negotiation process

    NASA Astrophysics Data System (ADS)

    Rivera, Norma

    Interdisciplinary research (IDR) collaboration can be defined as the process of integrating experts' knowledge, perspectives, and resources to advance scientific discovery. The flourishing of more complex research problems, together with the growth of scientific and technical knowledge has resulted in the need for researchers from diverse fields to provide different expertise and points of view to tackle these problems. These collaborations, however, introduce a new set of "culture" barriers as participating experts are trained to communicate in discipline-specific languages, theories, and research practices. We propose that building a common knowledge base for research using ontology development techniques can provide a starting point for interdisciplinary knowledge exchange, negotiation, and integration. The goal of this work is to extend ontology development techniques to support the knowledge negotiation process in IDR groups. Towards this goal, this work presents a methodology that extends previous work in collaborative ontology development and integrates learning strategies and tools to enhance interdisciplinary research practices. We evaluate the effectiveness of applying such methodology in three different scenarios that cover educational and research settings. The results of this evaluation confirm that integrating learning strategies can, in fact, be advantageous to overall collaborative practices in IDR groups.

  6. Long-Term Abstract Learning of Attentional Set

    ERIC Educational Resources Information Center

    Leber, Andrew B.; Kawahara, Jun-Ichiro; Gabari, Yuji

    2009-01-01

    How does past experience influence visual search strategy (i.e., attentional set)? Recent reports have shown that, when given the option to use 1 of 2 attentional sets, observers persist with the set previously required in a training phase. Here, 2 related questions are addressed. First, does the training effect result only from perseveration with…

  7. Validation of a Multimarker Model for Assessing Risk of Type 2 Diabetes from a Five-Year Prospective Study of 6784 Danish People (Inter99)

    PubMed Central

    Urdea, Mickey; Kolberg, Janice; Wilber, Judith; Gerwien, Robert; Moler, Edward; Rowe, Michael; Jorgensen, Paul; Hansen, Torben; Pedersen, Oluf; Jørgensen, Torben; Borch-Johnsen, Knut

    2009-01-01

    Background Improved identification of subjects at high risk for development of type 2 diabetes would allow preventive interventions to be targeted toward individuals most likely to benefit. In previous research, predictive biomarkers were identified and used to develop multivariate models to assess an individual's risk of developing diabetes. Here we describe the training and validation of the PreDx™ Diabetes Risk Score (DRS) model in a clinical laboratory setting using baseline serum samples from subjects in the Inter99 cohort, a population-based primary prevention study of cardiovascular disease. Methods Among 6784 subjects free of diabetes at baseline, 215 subjects progressed to diabetes (converters) during five years of follow-up. A nested case-control study was performed using serum samples from 202 converters and 597 randomly selected nonconverters. Samples were randomly assigned to equally sized training and validation sets. Seven biomarkers were measured using assays developed for use in a clinical reference laboratory. Results The PreDx DRS model performed better on the training set (area under the curve [AUC] = 0.837) than fasting plasma glucose alone (AUC = 0.779). When applied to the sequestered validation set, the PreDx DRS showed the same performance (AUC = 0.838), thus validating the model. This model had a better AUC than any other single measure from a fasting sample. Moreover, the model provided further risk stratification among high-risk subpopulations with impaired fasting glucose or metabolic syndrome. Conclusions The PreDx DRS provides the absolute risk of diabetes conversion in five years for subjects identified to be “at risk” using the clinical factors. PMID:20144324
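    The model comparisons above rest on the c-statistic (AUC). As an illustration of the underlying computation only, the numpy sketch below scores wholly synthetic data with a single marker versus a fixed linear multimarker combination; the marker names, effect sizes, and weights are invented and bear no relation to the actual PreDx assays or coefficients.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    # Fraction of (positive, negative) pairs ranked correctly; ties
    # count one half.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)
n = 400
converter = rng.random(n) < 0.25                     # outcome indicator
glucose = rng.normal(5.5, 0.6, n) + 0.8 * converter  # single marker
marker2 = rng.normal(0.0, 1.0, n) + 0.6 * converter  # hypothetical biomarker
multimarker = 2.2 * glucose + 0.6 * marker2          # fixed linear score

auc_single = auc(glucose, converter)
auc_multi = auc(multimarker, converter)
```

    On such synthetic data the combined score will typically, though not on every random draw, attain a somewhat higher AUC than the single marker, mirroring the qualitative comparison reported above.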

  8. Validation of a multimarker model for assessing risk of type 2 diabetes from a five-year prospective study of 6784 Danish people (Inter99).

    PubMed

    Urdea, Mickey; Kolberg, Janice; Wilber, Judith; Gerwien, Robert; Moler, Edward; Rowe, Michael; Jorgensen, Paul; Hansen, Torben; Pedersen, Oluf; Jørgensen, Torben; Borch-Johnsen, Knut

    2009-07-01

    Improved identification of subjects at high risk for development of type 2 diabetes would allow preventive interventions to be targeted toward individuals most likely to benefit. In previous research, predictive biomarkers were identified and used to develop multivariate models to assess an individual's risk of developing diabetes. Here we describe the training and validation of the PreDx Diabetes Risk Score (DRS) model in a clinical laboratory setting using baseline serum samples from subjects in the Inter99 cohort, a population-based primary prevention study of cardiovascular disease. Among 6784 subjects free of diabetes at baseline, 215 subjects progressed to diabetes (converters) during five years of follow-up. A nested case-control study was performed using serum samples from 202 converters and 597 randomly selected nonconverters. Samples were randomly assigned to equally sized training and validation sets. Seven biomarkers were measured using assays developed for use in a clinical reference laboratory. The PreDx DRS model performed better on the training set (area under the curve [AUC] = 0.837) than fasting plasma glucose alone (AUC = 0.779). When applied to the sequestered validation set, the PreDx DRS showed the same performance (AUC = 0.838), thus validating the model. This model had a better AUC than any other single measure from a fasting sample. Moreover, the model provided further risk stratification among high-risk subpopulations with impaired fasting glucose or metabolic syndrome. The PreDx DRS provides the absolute risk of diabetes conversion in five years for subjects identified to be "at risk" using the clinical factors. Copyright 2009 Diabetes Technology Society.

  9. Emulating RRTMG Radiation with Deep Neural Networks for the Accelerated Model for Climate and Energy

    NASA Astrophysics Data System (ADS)

    Pal, A.; Norman, M. R.

    2017-12-01

    The RRTMG radiation scheme in the Accelerated Model for Climate and Energy Multi-scale Model Framework (ACME-MMF) is a bottleneck that consumes approximately 50% of the computational time. Simulating a case with the RRTMG radiation scheme in ACME-MMF at high throughput and high resolution will therefore require a speed-up of this calculation while retaining physical fidelity. In this study, RRTMG radiation is emulated with Deep Neural Networks (DNNs). The first step towards this goal is to run a case with ACME-MMF and generate input data sets for the DNNs. A principal component analysis of these input data sets is carried out. Artificial data sets are created from the previous data sets to cover a wider input space. These artificial data sets are fed to a standalone RRTMG radiation scheme to generate outputs in a cost-effective manner. The resulting input-output pairs are used to train DNNs of multiple architectures (1). Another DNN (2) is trained on the inputs to predict the emulation error. A reverse emulation is trained to map the outputs back to the inputs. An error-controlled code is developed with the two DNNs (1 and 2) to determine when, or if, the original parameterization needs to be used.
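    The pairs-then-train workflow can be illustrated at toy scale. The numpy sketch below substitutes a cheap analytic function for the expensive radiation call, generates input-output pairs from it, and fits a one-hidden-layer network by plain gradient descent. The function, network size, and learning rate are arbitrary choices; neither RRTMG itself nor the error-predicting second DNN is modeled.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_scheme(x):
    """Stand-in for the costly parameterization being emulated."""
    return np.sin(3 * x) + 0.5 * x

# Step 1: generate input-output training pairs by running the scheme.
X = rng.uniform(-1, 1, (512, 1))
Y = expensive_scheme(X)

# Step 2: fit a one-hidden-layer tanh network to the pairs.
H, lr = 16, 0.02
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

losses = []
for _ in range(4000):
    h, pred = forward(X)
    err = pred - Y
    losses.append(float((err ** 2).mean()))
    g2 = 2 * err / len(X)              # dLoss / dpred
    gW2, gb2 = h.T @ g2, g2.sum(0)
    gh = g2 @ W2.T * (1 - h ** 2)      # backprop through tanh
    gW1, gb1 = X.T @ gh, gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

    Once trained, the forward pass replaces the expensive call at a fraction of its cost; the production-scale version would add the second, error-predicting network to decide when to fall back on the original parameterization.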

  10. Combining the role of convenience and consideration set size in explaining fish consumption in Norway.

    PubMed

    Rortveit, Asbjorn Warvik; Olsen, Svein Ottar

    2009-04-01

    The purpose of this study is to explore how convenience orientation, perceived product inconvenience and consideration set size are related to attitudes towards fish and fish consumption. The authors present a structural equation model (SEM) based on the integration of two previous studies. The results of a SEM analysis using Lisrel 8.72 on data from a Norwegian consumer survey (n=1630) suggest that convenience orientation and perceived product inconvenience have a negative effect on both consideration set size and consumption frequency. Attitude towards fish has the greatest impact on consumption frequency. The results also indicate that perceived product inconvenience is a key variable since it has a significant impact on attitude, and on consideration set size and consumption frequency. Further, the analyses confirm earlier findings suggesting that the effect of convenience orientation on consumption is partially mediated through perceived product inconvenience. The study also confirms earlier findings suggesting that the consideration set size affects consumption frequency. Practical implications drawn from this research are that the seafood industry would benefit from developing and positioning products that change beliefs about fish as an inconvenient product. Future research for other food categories should be done to enhance the external validity.

  11. Surflex-Dock: Docking benchmarks and real-world application

    NASA Astrophysics Data System (ADS)

    Spitzer, Russell; Jain, Ajay N.

    2012-06-01

    Benchmarks for molecular docking have historically focused on re-docking the cognate ligand of a well-determined protein-ligand complex to measure geometric pose prediction accuracy, and measurement of virtual screening performance has been focused on increasingly large and diverse sets of target protein structures, cognate ligands, and various types of decoy sets. Here, pose prediction is reported on the Astex Diverse set of 85 protein ligand complexes, and virtual screening performance is reported on the DUD set of 40 protein targets. In both cases, prepared structures of targets and ligands were provided by symposium organizers. The re-prepared data sets yielded results not significantly different than previous reports of Surflex-Dock on the two benchmarks. Minor changes to protein coordinates resulting from complex pre-optimization had large effects on observed performance, highlighting the limitations of cognate ligand re-docking for pose prediction assessment. Docking protocols developed for cross-docking, which address protein flexibility and produce discrete families of predicted poses, produced substantially better performance for pose prediction. Virtual screening performance was shown to benefit from employing and combining multiple screening methods: docking, 2D molecular similarity, and 3D molecular similarity. In addition, use of multiple protein conformations significantly improved screening enrichment.

  12. The development of a primary dental care outreach course.

    PubMed

    Waterhouse, P; Maguire, A; Tabari, D; Hind, V; Lloyd, J

    2008-02-01

    The aim of this work was to develop the first north-east based primary dental care outreach (PDCO) course for clinical dental undergraduate students at Newcastle University. The process of course design will be described and involved review of the existing Bachelor of Dental Surgery (BDS) degree course in relation to previously published learning outcomes. Areas were identified where the existing BDS course did not meet fully these outcomes. This was followed by setting the PDCO course aims and objectives, intended learning outcomes, curriculum and structure. The educational strategy and methods of teaching and learning were subsequently developed together with a strategy for overall quality control of the teaching and learning experience. The newly developed curriculum was aligned with appropriate student assessment methods, including summative, formative and ipsative elements.

  13. Genetic basis of climatic adaptation in scots pine by bayesian quantitative trait locus analysis.

    PubMed Central

    Hurme, P; Sillanpää, M J; Arjas, E; Repo, T; Savolainen, O

    2000-01-01

    We examined the genetic basis of large adaptive differences in timing of bud set and frost hardiness between natural populations of Scots pine. As a mapping population, we considered an "open-pollinated backcross" progeny by collecting seeds of a single F(1) tree (cross between trees from southern and northern Finland) growing in southern Finland. Due to the special features of the design (no marker information available on grandparents or the father), we applied a Bayesian quantitative trait locus (QTL) mapping method developed previously for outcrossed offspring. We found four potential QTL for timing of bud set and seven for frost hardiness. Bayesian analyses detected more QTL than ANOVA for frost hardiness, but the opposite was true for bud set. These QTL included alleles with rather large effects, and additionally smaller QTL were supported. The largest QTL for bud set date accounted for about a fourth of the mean difference between populations. Thus, natural selection during adaptation has resulted in selection of at least some alleles of rather large effect. PMID:11063704

  14. Automatic threshold selection for multi-class open set recognition

    NASA Astrophysics Data System (ADS)

    Scherreik, Matthew; Rigling, Brian

    2017-05-01

    Multi-class open set recognition is the problem of supervised classification with additional unknown classes encountered after a model has been trained. An open set classifier often has two core components. The first component is a base classifier which estimates the most likely class of a given example. The second component consists of open set logic which estimates if the example is truly a member of the candidate class. Such a system is operated in a feed-forward fashion. That is, a candidate label is first estimated by the base classifier, and the true membership of the example to the candidate class is estimated afterward. Previous works have developed an iterative threshold selection algorithm for rejecting examples from classes which were not present at training time. In those studies, a Platt-calibrated SVM was used as the base classifier, and the thresholds were applied to class posterior probabilities for rejection. In this work, we investigate the effectiveness of other base classifiers when paired with the threshold selection algorithm and compare their performance with the original SVM solution.
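    The feed-forward decision rule described above (candidate class first, membership test second) reduces to a few lines once per-class thresholds are given. The sketch below is a generic illustration with invented posterior probabilities and a uniform threshold, not the iterative threshold selection algorithm itself.

```python
import numpy as np

def open_set_predict(probs, thresholds, unknown=-1):
    """Feed-forward open set decision: take the base classifier's most
    likely class, then reject as 'unknown' if its posterior probability
    falls below that class's threshold."""
    probs = np.asarray(probs, float)
    cand = probs.argmax(axis=1)                      # candidate labels
    conf = probs[np.arange(len(probs)), cand]        # their posteriors
    accept = conf >= np.asarray(thresholds)[cand]
    return np.where(accept, cand, unknown)

# Per-class posteriors from some calibrated base classifier (invented).
probs = [[0.92, 0.05, 0.03],   # confident class 0
         [0.40, 0.35, 0.25],   # low confidence
         [0.10, 0.75, 0.15]]   # confident class 1
thresholds = [0.6, 0.6, 0.6]
labels = open_set_predict(probs, thresholds)
```

    Here labels comes out as [0, -1, 1]: the middle example's top posterior (0.40) falls below its class threshold, so it is rejected as unknown. The cited studies tune a separate threshold per class rather than a uniform one.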

  15. A low dimensional dynamical system for the wall layer

    NASA Technical Reports Server (NTRS)

    Aubry, N.; Keefe, L. R.

    1987-01-01

    Low dimensional dynamical systems which model a fully developed turbulent wall layer were derived. The model is based on the optimally fast convergent proper orthogonal decomposition, or Karhunen-Loeve expansion. This decomposition provides a set of eigenfunctions which are derived from the autocorrelation tensor at zero time lag. Via Galerkin projection, low dimensional sets of ordinary differential equations in time, for the coefficients of the expansion, were derived from the Navier-Stokes equations. The energy loss to the unresolved modes was modeled by an eddy viscosity representation, analogous to Heisenberg's spectral model. A set of eigenfunctions and eigenvalues was obtained from direct numerical simulation of a plane channel flow at a Reynolds number of 6600, based on the mean centerline velocity and the channel width, and compared with previous work by Herzog. Using the new eigenvalues and eigenfunctions, a new ten-dimensional set of ordinary differential equations was derived using five non-zero cross-stream Fourier modes with a periodic length of 377 wall units. The dynamical system was integrated for a range of the eddy viscosity parameter alpha. This work is encouraging.
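    The decomposition step can be sketched compactly: for discrete snapshot data, the Karhunen-Loeve eigenfunctions can be obtained from an SVD of the snapshot matrix, which is equivalent to diagonalizing the zero-time-lag autocorrelation. The numpy fragment below applies this to a small synthetic field; it illustrates the decomposition only, not the channel-flow computation or the Galerkin projection.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 64)                      # spatial grid
t = np.linspace(0, 10, 200)                    # snapshot times
# Synthetic field: two spatial structures with time-varying coefficients,
# plus weak noise.
field = (np.outer(np.sin(2 * np.pi * t), np.sin(np.pi * x))
         + 0.3 * np.outer(np.cos(5 * t), np.sin(2 * np.pi * x))
         + 0.01 * rng.normal(size=(200, 64)))

snapshots = field - field.mean(axis=0)         # fluctuations about the mean
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = s**2 / np.sum(s**2)                   # energy fraction per mode
modes = Vt                                     # spatial eigenfunctions
coeffs = U * s                                 # temporal expansion coefficients
```

    Because the expansion is optimally fast convergent in energy, truncating to the first few rows of modes retains nearly all of the fluctuation energy; a Galerkin projection onto such a truncated basis is what yields the low dimensional ODE systems described above.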

  16. Thrust imbalance of solid rocket motor pairs on Space Shuttle flights

    NASA Technical Reports Server (NTRS)

    Foster, W. A., Jr.; Shu, P. H.; Sforzini, R. H.

    1986-01-01

    This analysis extends the investigation presented at the 17th Joint Propulsion Conference in 1981 to include fifteen sets of Space Shuttle flight data. The previous report dealt only with static test data and the first flight pair. The objective is to compare the authors' previous theoretical analysis of thrust imbalance with actual Space Shuttle performance. The theoretical prediction method, which involves a Monte Carlo technique, is reviewed briefly as are salient features of the flight instrumentation system and the statistical analysis. A scheme for smoothing flight data is discussed. The effects of changes in design parameters are discussed with special emphasis on the filament wound motor case being developed to replace the steel case. Good agreement between the predictions and the flight data is demonstrated.

  17. Developing Urban Environment Indicators for Neighborhood Sustainability Assessment in Tripoli-Libya

    NASA Astrophysics Data System (ADS)

    Elgadi, Ahmed. A.; Hakim Ismail, Lokman; Abass, Fatma; Ali, Abdelmuniem

    2016-11-01

    Sustainability assessment frameworks are becoming increasingly important to assist in the transition towards a sustainable urban environment. The urban environment is a complex system and requires regular monitoring and evaluation through a set of relevant indicators. An indicator provides information about the state of the environment through a quantitative value, and sustainability assessment requires indicators at all spatial scales to provide reliable information on urban environmental sustainability in Tripoli, Libya. Detailed data are necessary to assess environmental change in the urban environment on a local scale and to ease the transfer of this information to the national and global stages. This paper proposes a set of key indicators to monitor the urban environmental sustainability of Libyan residential neighborhoods. The proposed environmental indicator framework measures the sustainability performance of an urban environment through 13 sub-categories comprising 21 indicators. This paper also explains the theoretical foundations for the selection of all indicators with reference to previous studies.

  18. Improving the Effectiveness of Electronic Health Record-Based Referral Processes

    PubMed Central

    2012-01-01

    Electronic health records are increasingly being used to facilitate referral communication in the outpatient setting. However, despite support by technology, referral communication between primary care providers and specialists is often unsatisfactory and is unable to eliminate care delays. This may be in part due to lack of attention to how information and communication technology fits within the social environment of health care. Making electronic referral communication effective requires a multifaceted “socio-technical” approach. Using an 8-dimensional socio-technical model for health information technology as a framework, we describe ten recommendations that represent good clinical practices to design, develop, implement, improve, and monitor electronic referral communication in the outpatient setting. These recommendations were developed on the basis of our previous work, current literature, sound clinical practice, and a systems-based approach to understanding and implementing health information technology solutions. Recommendations are relevant to system designers, practicing clinicians, and other stakeholders considering use of electronic health records to support referral communication. PMID:22973874

  19. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions.

    PubMed

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

    There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association to intervention effect size. However, there are a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop.

  20. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions

    PubMed Central

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association with intervention effect size. However, there is a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop. PMID:25793484

  1. A network of epigenetic regulators guides developmental haematopoiesis in vivo.

    PubMed

    Huang, Hsuan-Ting; Kathrein, Katie L; Barton, Abby; Gitlin, Zachary; Huang, Yue-Hua; Ward, Thomas P; Hofmann, Oliver; Dibiase, Anthony; Song, Anhua; Tyekucheva, Svitlana; Hide, Winston; Zhou, Yi; Zon, Leonard I

    2013-12-01

    The initiation of cellular programs is orchestrated by key transcription factors and chromatin regulators that activate or inhibit target gene expression. To generate a compendium of chromatin factors that establish the epigenetic code during developmental haematopoiesis, a large-scale reverse genetic screen was conducted targeting orthologues of 425 human chromatin factors in zebrafish. A set of chromatin regulators was identified that target different stages of primitive and definitive blood formation, including factors not previously implicated in haematopoiesis. We identified 15 factors that regulate development of primitive erythroid progenitors and 29 factors that regulate development of definitive haematopoietic stem and progenitor cells. These chromatin factors are associated with SWI/SNF and ISWI chromatin remodelling, SET1 methyltransferase, CBP-p300-HBO1-NuA4 acetyltransferase, HDAC-NuRD deacetylase, and Polycomb repressive complexes. Our work provides a comprehensive view of how specific chromatin factors and their associated complexes play a major role in the establishment of haematopoietic cells in vivo.

  2. Applications of Deep Learning and Reinforcement Learning to Biological Data.

    PubMed

    Mahmud, Mufti; Kaiser, Mohammed Shamim; Hussain, Amir; Vassanelli, Stefano

    2018-06-01

Rapid advances in hardware-based technologies during the past decades have opened up new possibilities for life scientists to gather multimodal data in various application domains, such as omics, bioimaging, medical imaging, and (brain/body)-machine interfaces. These have generated novel opportunities for development of dedicated data-intensive machine learning techniques. In particular, recent research in deep learning (DL), reinforcement learning (RL), and their combination (deep RL) promises to revolutionize the future of artificial intelligence. The growth in computational power, accompanied by faster and larger data storage and declining computing costs, has already allowed scientists in various fields to apply these techniques on data sets that were previously intractable owing to their size and complexity. This paper provides a comprehensive survey on the application of DL, RL, and deep RL techniques in mining biological data. In addition, we compare the performances of DL techniques when applied to different data sets across various application domains. Finally, we outline open issues in this challenging research area and discuss future development perspectives.

  3. Automatic script identification from images using cluster-based templates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hochberg, J.; Kerns, L.; Kelly, P.

We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
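The classification step described above (choose the script whose templates best match the document's symbols) can be sketched as a nearest-centroid comparison. The 2-D feature vectors and the two scripts below are invented stand-ins; the paper's actual templates are centroids of scaled bit images:

```python
import math

# Invented 2-D feature vectors standing in for per-script cluster
# centroids ("templates"); real templates are scaled bit-image centroids.
TEMPLATES = {
    "Cyrillic": [(0.42, 1.1), (0.55, 0.9)],
    "Roman":    [(0.30, 1.4), (0.38, 1.6)],
}

def best_match(symbol):
    """Distance from one symbol to the closest template of each script."""
    return {script: min(math.dist(symbol, t) for t in templates)
            for script, templates in TEMPLATES.items()}

def identify_script(symbols):
    """Choose the script whose templates give the smallest total
    distance over a sample of symbols from the document."""
    totals = {script: 0.0 for script in TEMPLATES}
    for sym in symbols:
        for script, d in best_match(sym).items():
            totals[script] += d
    return min(totals, key=totals.get)
```

For a document whose symbols sit near the Cyrillic centroids, `identify_script([(0.41, 1.0), (0.5, 0.95)])` returns `"Cyrillic"`.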

  4. Multi-Sensor Historical Climatology of Satellite-Derived Global Land Surface Moisture

    NASA Technical Reports Server (NTRS)

    Owe, Manfred; deJeu, Richard; Holmes, Thomas

    2007-01-01

    A historical climatology of continuous satellite derived global land surface soil moisture is being developed. The data set consists of surface soil moisture retrievals from observations of both historical and currently active satellite microwave sensors, including Nimbus-7 SMMR, DMSP SSM/I, TRMM TMI, and AQUA AMSR-E. The data sets span the period from November 1978 through the end of 2006. The soil moisture retrievals are made with the Land Parameter Retrieval Model, a physically-based model which was developed jointly by researchers from the above institutions. These data are significant in that they are the longest continuous data record of observational surface soil moisture at a global scale. Furthermore, while previous reports have intimated that higher frequency sensors such as on SSM/I are unable to provide meaningful information on soil moisture, our results indicate that these sensors do provide highly useful soil moisture data over significant parts of the globe, and especially in critical areas located within the Earth's many arid and semi-arid regions.

  5. Development of an electronic seepage chamber for extended use in a river.

    PubMed

    Fritz, Brad G; Mendoza, Donaldo P; Gilmore, Tyler J

    2009-01-01

    Seepage chambers have been used to characterize the flux of water across the water-sediment interface in a variety of settings. In this work, an electronic seepage chamber was developed specifically for long-term use in a large river where hydraulic gradient reversals occur frequently with river-stage variations. A bidirectional electronic flowmeter coupled with a seepage chamber was used to measure temporal changes in the magnitude and direction of water flux across the water-sediment interface over an 8-week period. The specific discharge measured from the seepage chamber compared favorably with measurements of vertical hydraulic gradient and previous specific discharge calculations. This, as well as other supporting data, demonstrates the effectiveness of the electronic seepage chamber to accurately quantify water flux in two directions over a multimonth period in this setting. The ability to conduct multimonth measurements of water flux at a subhourly frequency in a river system is a critical capability for a seepage chamber in a system where hydraulic gradients change on a daily and seasonal basis.

  6. Bengali-English Relevant Cross Lingual Information Access Using Finite Automata

    NASA Astrophysics Data System (ADS)

    Banerjee, Avishek; Bhattacharyya, Swapan; Hazra, Simanta; Mondal, Shatabdi

    2010-10-01

CLIR techniques search unrestricted texts, typically extracting terms and relationships from bilingual electronic dictionaries or bilingual text collections and using them to translate query and/or document representations into a compatible set of representations with a common feature set. In this paper we focus on a dictionary-based approach, combining a bilingual data dictionary with statistics-based methods to avoid the problem of ambiguity; developing the human-computer interface aspects of NLP (natural language processing) is a further aim of this paper. Intelligent web search in a regional language such as Bengali depends upon two major aspects: CLIA (cross-language information access) and NLP. In our previous work with IIT Kharagpur we developed content-based CLIA, in which content-based searching is trained on a Bengali corpus with the help of a Bengali data dictionary. Here we introduce intelligent search, which recognizes the sense of a sentence and thereby offers a more realistic approach to human-computer interaction.
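The dictionary-based query translation with statistical disambiguation that the abstract outlines can be sketched as follows; the romanised Bengali terms, translation candidates, and frequency counts are all invented for illustration:

```python
# Toy bilingual dictionary (romanised Bengali -> English candidates);
# all entries and counts below are invented for illustration.
BN_EN = {
    "boi":   ["book"],
    "jol":   ["water", "rain"],   # an ambiguous term
    "khela": ["game", "play"],
}

# Corpus frequency counts used as a crude statistical disambiguator.
EN_FREQ = {"book": 120, "water": 300, "rain": 80, "game": 150, "play": 90}

def translate_query(terms):
    """Translate each query term to its highest-frequency candidate;
    unknown terms pass through untranslated."""
    translated = []
    for term in terms:
        candidates = BN_EN.get(term, [term])
        translated.append(max(candidates, key=lambda c: EN_FREQ.get(c, 0)))
    return translated
```

So `translate_query(["jol", "boi"])` yields `["water", "book"]`, resolving the ambiguous term by corpus statistics rather than dictionary order.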

  7. The stress response system of proteins: Implications for bioreactor scaleup

    NASA Technical Reports Server (NTRS)

    Goochee, Charles F.

    1988-01-01

Animal cells face a variety of environmental stresses in large scale bioreactors, including periodic variations in shear stress and dissolved oxygen concentration. Diagnostic techniques were developed for identifying the particular sources of environmental stresses for animal cells in a given bioreactor configuration. The mechanisms by which cells cope with such stresses were examined. The individual concentrations and synthesis rates of hundreds of intracellular proteins are affected by the extracellular environment (medium composition, dissolved oxygen concentration, pH, and level of surface shear stress). Techniques are currently being developed for quantifying the synthesis rates and concentrations of the intracellular proteins which are most sensitive to environmental stress. Previous research has demonstrated that a particular set of stress response proteins is synthesized by mammalian cells in response to temperature fluctuations, dissolved oxygen deprivation, and glucose deprivation. Recently, it was demonstrated that exposure of human kidney cells to high shear stress results in expression of a completely distinct set of intracellular proteins.

  8. Language, culture and international exchange of virtual patients.

    PubMed

    Muntean, Valentin; Calinici, Tudor; Tigan, Stefan; Fors, Uno G H

    2013-02-11

Language and cultural differences could be a limiting factor for the international exchange of Virtual Patients (VPs), especially for small countries and languages of limited circulation. Our research evaluated whether it would be feasible to develop a VP based educational program in our Romanian institution, with cases in English and developed in a non-Romanian setting. The participants in the research comprised 4th year Romanian medical students from the Faculty of Medicine in Cluj-Napoca, Romania, with previous training exclusively in Romanian, good English proficiency and no experience with VPs. The students worked on eight VPs in two identical versions, Romanian and English. The first group (2010) of 136 students worked with four VPs developed in Cluj and the second group (2011) of 144 students with four VPs originally developed at a US university. Every student was randomly assigned two different VPs, one in Romanian and another in English. Student activity throughout the case, the diagnosis, therapeutic plan and diagnosis justification were recorded. We also compared student performance on the two VP versions, Romanian and English, and on the two sets of cases, developed in Romania and in the USA, respectively. We found no significant differences between the students' performance on the Romanian vs. English version of VPs. Regarding the students' performance on the two sets of cases, originally developed in Romania and in the USA, respectively, we found a number of statistically significant differences in the students' activity through the cases. There were no statistically significant differences in the students' ability to reach the correct diagnosis and therapeutic plan. The development of our program with VPs in English would be feasible, cost-effective and in accordance with the globalization of medical education.

  9. Targeted intervention strategies to optimise diversion of BMW in the Dublin, Ireland region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purcell, M., E-mail: mary.purcell@cit.ie; Centre for Water Resources Research, School of Architecture, Landscape and Civil Engineering, University College Dublin, Newstead, Belfield, Dublin 4; Magette, W.L.

    Highlights: > Previous research indicates that targeted strategies designed for specific areas should lead to improved diversion. > Survey responses and GIS model predictions from previous research were the basis for goal setting. > Then logic modelling and behavioural research were employed to develop site-specific management intervention strategies. > Waste management initiatives can be tailored to the specific needs of areas rather than the one-size-fits-all means currently used. - Abstract: Urgent transformation is required in Ireland to divert biodegradable municipal waste (BMW) from landfill and prevent increases in overall waste generation. When BMW is optimally managed, it becomes a resource with value instead of an unwanted by-product requiring disposal. An analysis of survey responses from commercial and residential sectors for the Dublin region in previous research by the authors proved that attitudes towards and behaviour regarding municipal solid waste are spatially variable. This finding indicates that targeted intervention strategies designed for specific geographic areas should lead to improved diversion rates of BMW from landfill, a requirement of the Landfill Directive 1999/31/EC. In the research described in this paper, survey responses and GIS model predictions from previous research were the basis for goal setting, after which logic modelling and behavioural research were employed to develop site-specific waste management intervention strategies. The main strategies devised include (a) roll out of the Brown Bin (Organics) Collection and Community Workshops in Dun Laoghaire Rathdown, (b) initiation of a Community Composting Project in Dublin City, (c) implementation of a Waste Promotion and Motivation Scheme in South Dublin, (d) development and distribution of a Waste Booklet to promote waste reduction activities in Fingal, (e) region-wide distribution of a Waste Booklet to the commercial sector and (f) the Greening Irish Pubs Initiative. 
Each of these strategies was devised after interviews with both the residential and commercial sectors to help make optimal waste management the norm for both sectors. Strategies (b), (e) and (f) are detailed in this paper. By integrating a human element into accepted waste management approaches, these strategies will make optimal waste behaviour easier to achieve. Ultimately this will help divert waste from landfill and improve waste management practice as a whole for the region. This method of devising targeted intervention strategies can be adapted for many other regions.

  10. Physical Examination Reporting System

    PubMed Central

    Rowley, B.A.; Cameron, J.M.; Anderson, D.E.; Nicholas, T.A.; Hogue, R.L.; Hutcheson, J.L.; Peralta, V.H.; Johansen, B.; Walston, D.

    1978-01-01

    The following is a description of a Physical Examination Reporting System which was developed in cooperation with physicians from the Department of Family Practice and the Department of Preventive Medicine, Texas Tech University School of Medicine, Lubbock, Texas. This system was designed to evaluate what effect such a report would have on the practice of medicine in underserved areas with regard to health benefit, medical impact, and economic impact. The set of observations was assembled over a three month period by utilizing techniques previously developed in this area as well as the expertise of the TTUSM faculty. This system was developed for impact evaluation purposes. Its actual daily use may require changes in the form, in method of entry, and in format of the report.

  11. The Development of NASA's Low Thrust Trajectory Tool Set

    NASA Technical Reports Server (NTRS)

    Sims, Jon; Artis, Gwen; Kos, Larry

    2006-01-01

Highly efficient electric propulsion systems can enable interesting classes of missions; unfortunately, they provide only a limited amount of thrust. Low-thrust (LT) trajectories are much more difficult to design than impulsive (chemical propulsion) trajectories. Previous LT trajectory optimization software was often difficult to use, had difficulty converging, and was somewhat limited in the types of missions it could support. A new state-of-the-art suite (toolbox) of LT tools, along with improved algorithms and methods, was developed by NASA's MSFC, JPL, JSC, and GRC to address the needs of our customers, to help foster technology development in the area of advanced LT propulsion systems, and to facilitate generation of similar results by different analysts.

  12. The development of a VBHOM-based outcome model for lower limb amputation performed for critical ischaemia.

    PubMed

    Tang, T Y; Prytherch, D R; Walsh, S R; Athanassoglou, V; Seppi, V; Sadat, U; Lees, T A; Varty, K; Boyle, J R

    2009-01-01

VBHOM (Vascular Biochemistry and Haematology Outcome Models) adopts the approach of using a minimum data set to model outcome and has previously been shown to be feasible after index arterial operations. This study attempts to model mortality following lower limb amputation for critical limb ischaemia using the VBHOM concept. A binary logistic regression model of risk of mortality was built using National Vascular Database (NVD) items that contained the complete data required by the model from 269 admissions for lower limb amputation. The subset of NVD data items used comprised urea, creatinine, sodium, potassium, haemoglobin, white cell count, age on admission, and mode of admission. This model was applied prospectively to a test set of data (n=269) that was not part of the original training set used to develop the predictor equation. Outcome following lower limb amputation could be described accurately using the same model. The overall mean predicted risk of mortality was 32%, predicting 86 deaths. The actual number of deaths was 86 (chi-squared = 8.05, 8 d.f., p = 0.429; no evidence of lack of fit). The model demonstrated adequate discrimination (c-index = 0.704). VBHOM provides a single unified model that allows good prediction of surgical mortality in this high-risk group of individuals. It uses a small, simple and objective clinical data set that may also simplify comparative audit within vascular surgery.
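The model form is a standard binary logistic regression: predicted risk = 1 / (1 + exp(-(b0 + Σ bi·xi))), and the expected number of deaths is the sum of individual predicted risks. The coefficients and the reduced predictor set below are illustrative only, not the published VBHOM equation:

```python
import math

# Illustrative coefficients only -- NOT the published VBHOM equation.
COEFS = {"intercept": -4.0, "urea": 0.05, "age": 0.04, "emergency": 0.8}

def predicted_risk(urea, age, emergency):
    """Predicted probability of death for one admission."""
    z = (COEFS["intercept"] + COEFS["urea"] * urea
         + COEFS["age"] * age + COEFS["emergency"] * emergency)
    return 1.0 / (1.0 + math.exp(-z))

def expected_deaths(patients):
    """Sum of individual predicted risks; compared with observed deaths
    when checking calibration on a test set, as in the abstract."""
    return sum(predicted_risk(**p) for p in patients)
```

Summing `predicted_risk` over a test set and comparing with the observed death count is exactly the calibration check reported above (86 predicted vs. 86 actual deaths).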

  13. Developing open source, self-contained disease surveillance software applications for use in resource-limited settings

    PubMed Central

    2012-01-01

Background Emerging public health threats often originate in resource-limited countries. In recognition of this fact, the World Health Organization issued revised International Health Regulations in 2005, which call for significantly increased reporting and response capabilities for all signatory nations. Electronic biosurveillance systems can improve the timeliness of public health data collection, aid in the early detection of and response to disease outbreaks, and enhance situational awareness. Methods As components of its Suite for Automated Global bioSurveillance (SAGES) program, The Johns Hopkins University Applied Physics Laboratory developed two open-source, electronic biosurveillance systems for use in resource-limited settings. OpenESSENCE provides web-based data entry, analysis, and reporting. ESSENCE Desktop Edition provides similar capabilities for settings without internet access. Both systems may be configured to collect data using locally available cell phone technologies. Results ESSENCE Desktop Edition has been deployed for two years in the Republic of the Philippines. Local health clinics have rapidly adopted the new technology to provide daily reporting, thus eliminating the two-to-three week data lag of the previous paper-based system. Conclusions OpenESSENCE and ESSENCE Desktop Edition are two open-source software products with the capability of significantly improving disease surveillance in a wide range of resource-limited settings. These products, and other emerging surveillance technologies, can assist resource-limited countries in complying with the revised International Health Regulations. PMID:22950686

  14. Informed consent comprehension in African research settings.

    PubMed

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. 29 studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies had understanding about placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. 
There is a vital need to develop a uniform definition for informed consent comprehension in low literacy research settings in Africa. This will be an essential step towards developing appropriate tools that can adequately measure informed consent comprehension. This may consequently suggest adequate measures to improve the informed consent procedure. © 2014 John Wiley & Sons Ltd.
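The pooled comprehension estimates quoted above were presumably produced with standard meta-analysis software; as a minimal illustration of the idea, a fixed-effect inverse-variance pooling of raw proportions (study counts invented) might look like:

```python
import math

def pool_proportions(studies):
    """Fixed-effect inverse-variance pooling of raw proportions.
    `studies` is a list of (events, n) pairs; assumes 0 < events < n
    in every study (otherwise the variance weight is undefined)."""
    num = wsum = 0.0
    for events, n in studies:
        p = events / n
        var = p * (1 - p) / n        # binomial variance of a proportion
        w = 1.0 / var                # inverse-variance weight
        num += w * p
        wsum += w
    pooled = num / wsum
    se = math.sqrt(1.0 / wsum)       # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

A random-effects model (likely what the review used, given its very wide confidence intervals) would additionally add a between-study variance term to each study's weight.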

  15. Development of an intensive care unit resource assessment survey for the care of critically ill patients in resource-limited settings.

    PubMed

    Leligdowicz, Aleksandra; Bhagwanjee, Satish; Diaz, Janet V; Xiong, Wei; Marshall, John C; Fowler, Robert A; Adhikari, Neill Kj

    2017-04-01

Capacity to provide critical care in resource-limited settings is poorly understood because of lack of data about resources available to manage critically ill patients. Our objective was to develop a survey to address this issue. We developed and piloted a cross-sectional self-administered survey in 9 resource-limited countries. The survey consisted of 8 domains; specific items within domains were modified from previously developed survey tools. We distributed the survey by e-mail to a convenience sample of health care providers responsible for providing care to critically ill patients. We assessed clinical sensibility and test-retest reliability. Nine of 15 health care providers responded to the survey on 2 separate occasions, separated by 2 to 4 weeks. Clinical sensibility was high (3.9-4.9/5 on the assessment tool). Test-retest reliability for questions related to resource availability was acceptable (intraclass correlation coefficient, 0.94; 95% confidence interval, 0.75-0.99; mean [SD] of weighted κ values, 0.67 [0.19]). The mean (SD) time for survey completion was 21 (16) minutes. A reliable cross-sectional survey of available resources to manage critically ill patients can be feasibly administered to health care providers in resource-limited settings. The survey will inform future research focusing on access to critical care where it is poorly described but urgently needed. Copyright © 2016 Elsevier Inc. All rights reserved.
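Of the two reliability statistics reported, the weighted κ is the easier to sketch. A linearly weighted Cohen's kappa between two sets of ordinal ratings (the ratings in the usage example are invented) can be computed as:

```python
from collections import Counter

def weighted_kappa(r1, r2, categories):
    """Linearly weighted Cohen's kappa between two sets of ordinal ratings.
    `categories` lists the ordinal levels in order; r1 and r2 are the two
    rating vectors (e.g. test and retest responses)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    obs = Counter(zip(r1, r2))           # joint rating counts
    m1, m2 = Counter(r1), Counter(r2)    # marginal counts
    disagree_obs = disagree_exp = 0.0
    for a in categories:
        for b in categories:
            w = abs(idx[a] - idx[b]) / (k - 1)    # linear disagreement weight
            disagree_obs += w * obs.get((a, b), 0) / n
            disagree_exp += w * (m1[a] / n) * (m2[b] / n)
    # kappa = 1 - (observed weighted disagreement / chance-expected)
    return 1.0 - disagree_obs / disagree_exp
```

Perfect test-retest agreement gives κ = 1; agreement no better than chance gives κ = 0.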

  16. Increased phylogenetic resolution within the ecologically important Rhizopogon subgenus Amylopogon using 10 anonymous nuclear loci.

    PubMed

    Dowie, Nicholas J; Grubisha, Lisa C; Burton, Brent A; Klooster, Matthew R; Miller, Steven L

    2017-01-01

    Rhizopogon species are ecologically significant ectomycorrhizal fungi in conifer ecosystems. The importance of this system merits the development and utilization of a more robust set of molecular markers specifically designed to evaluate their evolutionary ecology. Anonymous nuclear loci (ANL) were developed for R. subgenus Amylopogon. Members of this subgenus occur throughout the United States and are exclusive fungal symbionts associated with Pterospora andromedea, a threatened mycoheterotrophic plant endemic to disjunct eastern and western regions of North America. Candidate ANL were developed from 454 shotgun pyrosequencing and assessed for positive amplification across targeted species, sequencing success, and recovery of phylogenetically informative sites. Ten ANL were successfully developed and were subsequently used to sequence representative taxa, herbaria holotype and paratype specimens in R. subgenus Amylopogon. Phylogenetic reconstructions were performed on individual and concatenated data sets by Bayesian inference and maximum likelihood methods. Phylogenetic analyses of these 10 ANL were compared with a phylogeny traditionally constructed using the universal fungal barcode nuc rDNA ITS1-5.8S-ITS2 region (ITS). The resulting ANL phylogeny was consistent with most of the species designations delineated by ITS. However, the ANL phylogeny provided much greater phylogenetic resolution, yielding new evidence for cryptic species within previously defined species of R. subgenus Amylopogon. Additionally, the rooted ANL phylogeny provided an alternate topology to the ITS phylogeny, which inferred a novel set of evolutionary relationships not identified in prior phylogenetic studies.

  17. Apatinib for metastatic breast cancer in non-clinical trial setting: Satisfying efficacy regardless of previous anti-angiogenic treatment.

    PubMed

    Lin, Ying; Wu, Zheng; Zhang, Jian; Hu, Xichun; Wang, Zhonghua; Wang, Biyun; Cao, Jun; Wang, Leiping

    2017-06-01

Apatinib is a novel tyrosine kinase inhibitor targeting vascular endothelial growth factor receptor 2. This study aimed to evaluate the efficacy and safety of apatinib in metastatic breast cancer (MBC) in a non-clinical trial setting, and to study the impact of previous antiangiogenic treatment on the efficacy of apatinib. 52 MBC patients treated with apatinib in a non-clinical trial setting in Fudan University Shanghai Cancer Center between January 1st 2015 and October 1st 2016 were included. All patients were included in the time-to-treatment failure (TTF) analysis, while 45 patients were enrolled for progression-free survival (PFS) and overall survival (OS) analysis, because 7 of the patients who discontinued treatment due to intolerable toxicities had too short a treatment duration for efficacy assessment. The impact of previous exposure to antiangiogenic treatment and of other factors on patients' survival was analyzed by log-rank analysis and Cox multivariate analysis. The median PFS, median OS, and median TTF were 4.90 (95% confidence interval [CI] 3.44 - 6.36), 10.3 (95% CI not calculable), and 3.93 (95% CI 1.96 - 5.90) months, respectively. Previous treatment with bevacizumab did not affect the efficacy of apatinib. Previous exposure to anthracycline, age of 60 years or older and palmar-plantar erythrodysesthesia syndrome were independent predictors of prolonged PFS. Discontinuation of treatment was more common in the age group of 60 years or older than in the younger group, although the difference was not significant. Although toxicities were generally manageable, a previously unrecorded grade 3-4 adverse event of dyspnea was observed. This study confirmed the encouraging efficacy and manageable safety of apatinib in pretreated MBC patients in a non-clinical trial setting. For the first time to our knowledge, this study found that previous treatment with bevacizumab did not affect the efficacy of apatinib, and reported an undocumented severe adverse effect of dyspnea.
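The median PFS/OS/TTF figures above come from standard survival analysis; the study itself presumably used dedicated statistical software. A minimal Kaplan-Meier median (the first time the survival estimate drops to 0.5 or below, with invented follow-up data in the test) can be sketched as:

```python
def km_median(times, events):
    """Kaplan-Meier median survival time: first time at which the
    survival estimate S(t) drops to 0.5 or below.
    `events[i]` is 1 for an observed event, 0 for censoring."""
    # At tied times, process events before censorings (the usual convention).
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk = len(data)
    s = 1.0
    for t, event in data:
        if event:
            s *= (at_risk - 1) / at_risk   # step the survival curve down
            if s <= 0.5:
                return t
        at_risk -= 1
    return None  # median not reached (too much censoring)
```

Returning `None` when the curve never falls to 0.5 mirrors situations like the OS estimate above, where a confidence bound could not be calculated from the available follow-up.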

  18. Quality indicators for pharmaceutical care: a comprehensive set with national scores for Dutch community pharmacies.

    PubMed

    Teichert, Martina; Schoenmakers, Tim; Kylstra, Nico; Mosk, Berend; Bouvy, Marcel L; van de Vaart, Frans; De Smet, Peter A G M; Wensing, Michel

    2016-08-01

Background The quality of pharmaceutical care in community pharmacies in the Netherlands has been assessed annually since 2008. The initial set has been further developed with pharmacists and patient organizations, the healthcare inspectorate, the government and health insurance companies. The set over 2012 was the first set of quality indicators for community pharmacies that was validated and supported by all major stakeholders. The aims of this study were to describe the validated set of quality indicators for community pharmacies and to report their scores over 2012. In subanalyses the score development over 5 years was described for those indicators that had been surveyed before and remained unchanged. Methods Community pharmacists in the Netherlands were invited in 2013 to provide information for the set of 2012. Quality indicators were mapped by categories relevant for pharmaceutical care and defined for structures, processes and dispensing outcomes. Scores for categorically-measured quality indicators were presented as the percentage of pharmacies reporting the presence of a quality aspect. For numerical quality indicators, the mean of all reported scores was expressed. In subanalyses for those indicators that had been questioned previously, scores were collected from earlier measurements for pharmacies providing their scores in 2012. Multilevel analysis was used to assess the consistency of scores within one pharmacy over time by the intra-class correlation coefficient (ICC). Results For the set in 2012, 1739 Dutch community pharmacies (88% of the total) provided information for 66 quality indicators in 10 categories. Indicator scores on the presence of quality structures showed relatively high quality levels. Scores for processes and dispensing outcomes were lower. Subanalyses showed that overall indicator scores improved within pharmacies, but this development differed between pharmacies. 
Conclusions A set of validated quality indicators provided insight into the quality of pharmaceutical care in the Netherlands. The quality of pharmaceutical care improved over time. As of 2012 quality structures were present in at least 80 % of the community pharmacies. Variation in scores on care processes and outcomes between individual pharmacies and over time can initiate future research to better understand and facilitate quality improvement in community pharmacies.

  19. Predicting chemical bioavailability using microarray gene expression data and regression modeling: A tale of three explosive compounds.

    PubMed

    Gong, Ping; Nan, Xiaofei; Barker, Natalie D; Boyd, Robert E; Chen, Yixin; Wilkins, Dawn E; Johnson, David R; Suedel, Burton C; Perkins, Edward J

    2016-03-08

    Chemical bioavailability is an important dose metric in environmental risk assessment. Although many approaches have been used to evaluate bioavailability, no single approach is free from limitations. Previously, we developed a new genomics-based approach that integrated microarray technology and regression modeling for predicting bioavailability (tissue residue) of explosive compounds in exposed earthworms. In the present study, we further compared 18 different regression models and performed variable selection simultaneously with parameter estimation. This refined approach was applied to both previously collected and newly acquired earthworm microarray gene expression datasets for three explosive compounds. Our results demonstrate that a prediction accuracy of R² = 0.71-0.82 was achievable at a relatively low model complexity, with as few as 3-10 predictor genes per model. These results are much more encouraging than our previous ones. This study demonstrates that our approach is promising for bioavailability measurement, which warrants further studies of mixed contamination scenarios in field settings.
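
    Performing "variable selection simultaneously with parameter estimation" is characteristic of L1-penalised (lasso-type) regression, where the penalty drives uninformative coefficients exactly to zero during the fit. A minimal coordinate-descent sketch on fabricated data; this shows one plausible member of the model family, not the authors' 18 models:

```python
# L1-penalised linear regression (lasso) by coordinate descent:
# selection and estimation happen in a single fit.
# Toy data; the study's gene-expression inputs are not reproduced.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso(X, y, lam, iters=200):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / norm
    return beta

# y depends mainly on the first two "genes"; the third is noise
X = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1], [1, 1, 1]]
y = [2.0, 1.0, 3.0, 0.1, 3.1]
coef = lasso(X, y, lam=0.5)
print([round(b, 2) for b in coef])
```

    The third coefficient is shrunk to (near) zero, which is the "few predictor genes per model" behaviour the abstract reports.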

  20. General practices as emergent research organizations: a qualitative study into organizational development.

    PubMed

    Macfarlane, Fraser; Shaw, Sara; Greenhalgh, Trisha; Carter, Yvonne H

    2005-06-01

    An increasing proportion of research in primary care is locally undertaken in designated research practices. Capacity building to support high quality research at these grass roots is urgently needed and is a government priority. There is little previously published research on the process by which GP practices develop as research organizations or on their specific support needs at organizational level. Using in-depth qualitative interviews with 28 key informants in 11 research practices across the UK, we explored their historical accounts of the development of research activity. We analysed the data with reference to contemporary theories of organizational development. Participants identified a number of key events and processes, which allowed us to produce a five-phase model of practice development in relation to research activity (creative energy, concrete planning, transformation/differentiation, consolidation and collaboration). Movement between these phases was not linear or continuous, but showed emergent and adaptive properties in which specific triggers and setbacks were often critical. This developmental model challenges previous categorical taxonomies of research practices. It forms a theory-driven framework for providing appropriate support at the grass roots of primary care research, based on the practice's phase of development and the nature of external triggers and potential setbacks. Our findings have important implications for the strategic development of practice-based research in the UK, and could serve as a model for the wider international community.

  1. Ancient Laurentian detrital zircon in the closing Iapetus Ocean, Southern Uplands terrane, Scotland

    NASA Astrophysics Data System (ADS)

    Waldron, John W. F.; Floyd, James D.; Simonetti, Antonio; Heaman, Larry M.

    2008-07-01

    Early Paleozoic sandstones in the Southern Uplands terrane of Scotland were deposited during closure of the Iapetus Ocean between Laurentia and Avalonia. Their tectonic setting and sources are controversial, and different authors have supported subduction-accretion, extensional continental-margin development, or back-arc basin settings. We report new U-Pb detrital zircon ages from five Late Ordovician sandstones from the Northern Belt of the Southern Uplands and test models of their tectonic setting. The U-Pb zircon age distributions are dominated by peaks characteristic of sources in Laurentia and include grains as old as 3.6 Ga, older than any previously recorded in the British Caledonides SE of the Laurentian foreland. Discordant grains in one sample suggest derivation via erosion of metasedimentary rocks incorporated in the Grampian-Taconian orogen. Rare Neoproterozoic grains, previously interpreted as originating from a peri-Gondwanan terrane, may be derived from igneous rocks associated with Iapetan rifting. Only rare zircons are contemporary with the depositional ages. The results are difficult to reconcile with extensional continental-margin and back-arc models, but they support an active continental-margin subduction-accretion model. Close similarities with distributions from the Newfoundland Appalachians are consistent with sinistral transpression during closing of the Iapetus Ocean.

  2. Color analysis and image rendering of woodblock prints with oil-based ink

    NASA Astrophysics Data System (ADS)

    Horiuchi, Takahiko; Tanimoto, Tetsushi; Tominaga, Shoji

    2012-01-01

    This paper proposes a method for analyzing the color characteristics of woodblock prints made with oil-based ink and for rendering realistic images from camera data. The analysis of woodblock prints reveals some characteristic features in comparison with oil paintings: 1) a woodblock print can be divided into several cluster areas, each with similar surface spectral reflectance; and 2) strong specular reflection, caused by overlapping paint layers, arises only in specific cluster areas. By exploiting these properties, we develop an effective rendering algorithm by modifying our previous algorithm for oil paintings. A set of surface spectral reflectances of a woodblock print is represented using only a small number of average surface spectral reflectances and the registered scaling coefficients, whereas the previous algorithm for oil paintings required high-dimensional surface spectral reflectances at all pixels. In the rendering process, to reproduce the strong specular reflection in specific cluster areas, we use two sets of parameters in the Torrance-Sparrow model for cluster areas with and without strong specular reflection. An experiment on a woodblock print with oil-based ink demonstrates the feasibility of the proposed method.
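
    The two-parameter-set idea can be sketched with a simplified Torrance-Sparrow form: a diffuse term plus a Gaussian specular lobe, evaluated with different specular strengths for "strong-specular" and matte cluster areas. All parameter values below are invented for illustration:

```python
import math

# Simplified Torrance-Sparrow shading: diffuse term plus a Gaussian
# specular lobe whose strength differs per cluster area.
# Parameter values are invented, not taken from the paper.

def torrance_sparrow(cos_i, cos_r, alpha, kd, ks, sigma):
    """cos_i / cos_r: cosines of incident / viewing angles;
    alpha: angle (rad) between surface normal and half-vector;
    kd, ks: diffuse / specular strengths; sigma: surface roughness."""
    diffuse = kd * max(cos_i, 0.0)
    specular = ks * math.exp(-alpha ** 2 / (2 * sigma ** 2)) / max(cos_r, 1e-6)
    return diffuse + specular

# Two cluster types: one with strong specular reflection, one matte
strong = torrance_sparrow(0.8, 0.9, 0.05, kd=0.5, ks=0.6, sigma=0.1)
matte  = torrance_sparrow(0.8, 0.9, 0.05, kd=0.5, ks=0.05, sigma=0.1)
print(round(strong, 3), round(matte, 3))
```

    Switching only ks (and, in general, sigma) between cluster areas reproduces the paper's idea of per-cluster reflection parameters.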

  3. Modification of the Integrated Sasang Constitutional Diagnostic Model

    PubMed Central

    Nam, Jiho

    2017-01-01

    In 2012, the Korea Institute of Oriental Medicine proposed an objective and comprehensive physical diagnostic model to address quantification problems in the existing Sasang constitutional diagnostic method. However, certain issues have been raised regarding a revision of the proposed diagnostic model. In this paper, we propose various methodological approaches to address the problems of the previous diagnostic model. Firstly, more useful variables are selected in each component. Secondly, the least absolute shrinkage and selection operator is used to reduce multicollinearity without the modification of explanatory variables. Thirdly, proportions of SC types and age are considered to construct individual diagnostic models and to classify the training set and the test set so that they reflect the characteristics of the entire dataset. Finally, an integrated model is constructed with explanatory variables of the individual diagnostic models. The proposed integrated diagnostic model significantly improves the sensitivities for both the male SY type (36.4% → 62.0%) and the female SE type (43.7% → 64.5%), which were areas of limitation of the previous integrated diagnostic model. The ideas of these new algorithms are expected to contribute not only to the scientific development of Sasang constitutional medicine in Korea but also to that of other diagnostic methods for traditional medicine. PMID:29317897
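
    The third modification, splitting training and test sets so that both reflect the proportions of constitution types and age in the whole dataset, amounts to a stratified split. A minimal sketch with invented strata labels (the actual type/age-band definitions are not given here):

```python
import random

# Stratified train/test split: each (constitution type, age band) stratum
# keeps the same proportion in both sets, so the split reflects the
# whole dataset. Labels and bands are invented for illustration.

def stratified_split(records, key, test_frac=0.3, seed=0):
    rng = random.Random(seed)
    strata = {}
    for rec in records:
        strata.setdefault(key(rec), []).append(rec)
    train, test = [], []
    for group in strata.values():
        rng.shuffle(group)
        cut = int(len(group) * test_frac)
        test.extend(group[:cut])
        train.extend(group[cut:])
    return train, test

# (type, age_band) pairs standing in for subjects
subjects = [("SY", "young")] * 10 + [("SY", "old")] * 10 + [("SE", "young")] * 20
train, test = stratified_split(subjects, key=lambda r: r, test_frac=0.3)
print(len(train), len(test))
```

    Each stratum contributes the same 30% to the test set, so a type that is rare overall cannot be accidentally over- or under-represented.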

  4. Parenting an overweight or obese teen; issues and advice from parents

    PubMed Central

    Boutelle, Kerri N.; Feldman, Shira; Neumark-Sztainer, Dianne

    2013-01-01

    Objective This qualitative study addresses: 1) What challenges do parents of overweight adolescents face? 2) What advice do parents of overweight adolescents have for other parents? Design One-on-one interviews were conducted with 27 parents of overweight or previously overweight adolescents. Setting Medical clinic at the University of Minnesota. Participants 27 parents of adolescents (12-19 years) who were either currently or previously overweight, recruited from the community. Main Outcome Measures Qualitative interviews related to parenting overweight adolescents. Analysis Content analysis was used to identify themes regarding parental experiences. Results Issues most frequently mentioned: 1) uncertainty regarding effective communication with the adolescent about weight-related topics, 2) inability to control the adolescent’s decisions around healthy eating and activity behaviors, 3) concern for the adolescent’s well-being, 4) parental feelings of responsibility/guilt. Parental advice most often provided included: 1) setting up a healthy home environment, 2) parental role modeling of healthy behaviors, and 3) providing support/encouragement for positive efforts. Conclusions Topics for potential intervention development include communication and motivation of adolescents regarding weight-related topics, appropriate autonomy, and addressing negative emotions concerning the adolescent’s weight status. Targeting these topics could potentially improve acceptability and outcomes for treatments. PMID:22770833

  5. Perceiving while producing: Modeling the dynamics of phonological planning

    PubMed Central

    Roon, Kevin D.; Gafos, Adamantios I.

    2016-01-01

    We offer a dynamical model of phonological planning that provides a formal instantiation of how the speech production and perception systems interact during online processing. The model is developed on the basis of evidence from an experimental task that requires concurrent use of both systems, the so-called response-distractor task in which speakers hear distractor syllables while they are preparing to produce required responses. The model formalizes how ongoing response planning is affected by perception and accounts for a range of results reported across previous studies. It does so by explicitly addressing the setting of parameter values in representations. The key unit of the model is that of the dynamic field, a distribution of activation over the range of values associated with each representational parameter. The setting of parameter values takes place by the attainment of a stable distribution of activation over the entire field, stable in the sense that it persists even after the response cue in the above experiments has been removed. This and other properties of representations that have been taken as axiomatic in previous work are derived by the dynamics of the proposed model. PMID:27440947
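
    The "dynamic field" described here can be sketched as a discretised 1-D activation field with local excitation and global inhibition: an input drives up a localized peak of activation, and the peak remains stable after the input is removed, which is the persistence property the abstract emphasises. Every constant below is illustrative, not taken from the model:

```python
import math

# One-dimensional dynamic field (Amari-style) sketch: activation over a
# parameter axis relaxes toward resting level H, is pushed up by input,
# and interacts via local excitation / global inhibition. A peak that
# outlives the input models "setting" a parameter value.
# All constants are illustrative, not the paper's.

N, H, TAU, DT = 50, -2.0, 10.0, 1.0

def kernel(d, c_exc=3.0, width=3.0, c_inh=0.5):
    # local Gaussian excitation minus constant global inhibition
    return c_exc * math.exp(-d * d / (2 * width * width)) - c_inh

def f(u):
    # sigmoid output nonlinearity
    return 1.0 / (1.0 + math.exp(-u))

def step(u, stimulus):
    interact = [sum(kernel(x - y) * f(u[y]) for y in range(N)) for x in range(N)]
    return [u[x] + DT / TAU * (-u[x] + H + stimulus[x] + interact[x])
            for x in range(N)]

u = [H] * N
stim_on = [6.0 if 20 <= x < 25 else 0.0 for x in range(N)]
for _ in range(100):            # drive a peak with the stimulus...
    u = step(u, stim_on)
for _ in range(100):            # ...then remove the input entirely
    u = step(u, [0.0] * N)
print(max(u) > 0)               # the peak self-sustains after input removal
```

    The self-sustained peak after input removal is the formal counterpart of a parameter value that "persists even after the response cue has been removed".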

  6. Malaria-Related Anemia in Patients from Unstable Transmission Areas in Colombia

    PubMed Central

    Lopez-Perez, Mary; Álvarez, Álvaro; Gutierrez, Juan B.; Moreno, Alberto; Herrera, Sócrates; Arévalo-Herrera, Myriam

    2015-01-01

    Information about the prevalence of malarial anemia in areas of low malaria transmission intensity, such as Latin America, is scarce. To characterize malaria-related anemia, we evaluated 929 malaria patients from three sites in Colombia during 2011–2013. Plasmodium vivax was found to be the most prevalent species in Tierralta (92%), whereas P. falciparum was predominant in Tumaco (84%) and Quibdó (70%). Although severe anemia (hemoglobin < 7 g/dL) was almost absent (0.3%), variable degrees of non-severe anemia were observed in 36.9% of patients. In Tierralta, hemoglobin levels were negatively associated with days of illness. Moreover, in Tierralta and Quibdó, the number of previous malaria episodes and hemoglobin levels were positively associated. Both Plasmodium species seem to have similar potential to induce malarial anemia, with distinct cofactors at each endemic setting. The target age in these low-transmission settings seems to be shifting toward adolescents and young adults. In addition, previous malaria experience seems to induce protection against anemia development. Altogether, these data suggest that early diagnosis and prompt treatment are likely preventing more frequent and serious malaria-related anemia in Colombia. PMID:25510719

  7. Primary care-led commissioning: applying lessons from the past to the early development of clinical commissioning groups in England

    PubMed Central

    Checkland, Kath; Coleman, Anna; McDermott, Imelda; Segar, Julia; Miller, Rosalind; Petsoulas, Christina; Wallace, Andrew; Harrison, Stephen; Peckham, Stephen

    2013-01-01

    Background The current reorganisation of the English NHS is one of the most comprehensive ever seen. This study reports early evidence from the development of clinical commissioning groups (CCGs), a key element in the new structures. Aim To explore the development of CCGs in the context of what is known from previous studies of GP involvement in commissioning. Design and setting Case study analysis from sites chosen to provide maximum variety across a number of dimensions, from September 2011 to June 2012. Method A case study analysis was conducted using eight detailed qualitative case studies supplemented by descriptive information from web surveys at two points in time. Data collection involved observation of a variety of meetings, and interviews with key participants. Results Previous research shows that clinical involvement in commissioning is most effective when GPs feel able to act autonomously. Complicated internal structures, alongside developing external accountability relationships, mean that CCGs’ freedom to act may be subject to considerable constraint. Effective GP engagement is also important in determining outcomes of clinical commissioning, and there are a number of outstanding issues for CCGs, including: who feels ‘ownership’ of the CCG; how internal communication is conceptualised and realised; and the role and remit of locality groups. Previous incarnations of GP-led commissioning have tended to focus on local and primary care services. CCGs are keen to act to improve quality in their constituent practices, using approaches that many developed under practice-based commissioning. Constrained managerial support and the need to maintain GP engagement may have an impact. Conclusion CCGs are new organisations, faced with significant new responsibilities. This study provides early evidence of issues that CCGs and those responsible for CCG development may wish to address. PMID:23998841

  8. Employing the International Classification of Functioning, Disability and Health framework to capture user feedback in the design and testing stage of development of home-based arm rehabilitation technology.

    PubMed

    Sivan, Manoj; Gallagher, Justin; Holt, Ray; Weightman, Andrew; O'Connor, Rory; Levesley, Martin

    2016-01-01

    The purpose of this study was to evaluate the International Classification of Functioning, Disability and Health (ICF) as a framework to ensure that key aspects of user feedback are identified in the design and testing stages of development of a home-based upper limb rehabilitation system. Seventeen stroke survivors with residual upper limb weakness, and seven healthcare professionals with expertise in stroke rehabilitation, were enrolled in the user-centred design process. Through semi-structured interviews, they provided feedback on the hardware, software and impact of a home-based rehabilitation device to facilitate self-managed arm exercise. Members of the multidisciplinary clinical and engineering research team developed the topic list for the interviews, based on previous experience and the existing literature on user-centred design. Meaningful concepts were extracted from participants' interviews using existing ICF linking rules and matched to categories within the ICF Comprehensive Core Set for stroke. Most of the interview concepts (except personal factors) matched the existing ICF Comprehensive Core Set categories. Personal factors that emerged from the interviews (e.g., gender, age, interest, compliance, motivation, choice and convenience) that might determine device usability are not yet categorised within the ICF framework and hence could not be matched to a specific Core Set category.

  9. Electronic health record training in undergraduate medical education: bridging theory to practice with curricula for empowering patient- and relationship-centered care in the computerized setting.

    PubMed

    Wald, Hedy S; George, Paul; Reis, Shmuel P; Taylor, Julie Scott

    2014-03-01

    While electronic health record (EHR) use is becoming state-of-the-art, deliberate teaching of health care information technology (HCIT) competencies is not keeping pace with burgeoning use. Medical students require training to become skilled users of HCIT, but formal pedagogy within undergraduate medical education (UME) is sparse. How can medical educators best meet the needs of learners while integrating EHRs into medical education and practice? How can they help learners preserve and foster effective communication skills within the computerized setting? In general, how can UME curricula be devised for skilled use of EHRs to enhance rather than hinder provision of effective, humanistic health care? Within this Perspective, the authors build on recent publications that "set the stage" for next steps: EHR curricula innovation and implementation as concrete embodiments of theoretical underpinnings. They elaborate on previous calls for maximizing benefits and minimizing risks of EHR use with sufficient focus on physician-patient communication skills and for developing core competencies within medical education. The authors describe bridging theory into practice with systematic longitudinal curriculum development for EHR training in UME at their institution, informed by Kern and colleagues' curriculum development framework, narrative medicine, and reflective practice. They consider this innovation within a broader perspective: the overarching goal of empowering undergraduate medical students' patient- and relationship-centered skills while effectively demonstrating HCIT-related skills.

  10. Neural Substrates of View-Invariant Object Recognition Developed without Experiencing Rotations of the Objects

    PubMed Central

    Okamura, Jun-ya; Yamaguchi, Reona; Honda, Kazunari; Tanaka, Keiji

    2014-01-01

    One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn underlies the monkeys' emergent capability to discriminate the objects across changes in viewing angle. PMID:25378169

  11. Allocating limited resources in a time of fiscal constraints: a priority setting case study from Dalhousie University Faculty of Medicine.

    PubMed

    Mitton, Craig; Levy, Adrian; Gorsky, Diane; MacNeil, Christina; Dionne, Francois; Marrie, Tom

    2013-07-01

    Facing a projected $1.4M deficit on a $35M operating budget for fiscal year 2011/2012, members of the Dalhousie University Faculty of Medicine developed and implemented an explicit, transparent, criteria-based priority setting process for resource reallocation. A task group that included representatives from across the Faculty of Medicine used a program budgeting and marginal analysis (PBMA) framework, which provided an alternative to the typical public-sector approaches to addressing a budget deficit of across-the-board spending cuts and political negotiation. Key steps to the PBMA process included training staff members and department heads on priority setting and resource reallocation, establishing process guidelines to meet immediate and longer-term fiscal needs, developing a reporting structure and forming key working groups, creating assessment criteria to guide resource reallocation decisions, assessing disinvestment proposals from all departments, and providing proposal implementation recommendations to the dean. All departments were required to submit proposals for consideration. The task group approved 27 service reduction proposals and 28 efficiency gains proposals, totaling approximately $2.7M in savings across two years. During this process, the task group faced a number of challenges, including a tight timeline for development and implementation (January to April 2011), a culture that historically supported decentralized planning, at times competing interests (e.g., research versus teaching objectives), and reductions in overall health care and postsecondary education government funding. Overall, faculty and staff preferred the PBMA approach to previous practices. Other institutions should use this example to set priorities in times of fiscal constraints.

  12. Wavelet-based identification of DNA focal genomic aberrations from single nucleotide polymorphism arrays

    PubMed Central

    2011-01-01

    Background Copy number aberrations (CNAs) are an important molecular signature in cancer initiation, development, and progression. However, these aberrations span a wide range of chromosomes, making it hard to distinguish cancer-related genes from other genes that are not closely related to cancer but are located in broadly aberrant regions. With the current availability of high-resolution data sets such as single nucleotide polymorphism (SNP) microarrays, it has become an important issue to develop a computational method to detect driving genes related to cancer development located in the focal regions of CNAs. Results In this study, we introduce a novel method referred to as the wavelet-based identification of focal genomic aberrations (WIFA). The wavelet analysis, because it is a multi-resolution approach, makes it possible to effectively identify focal genomic aberrations in broadly aberrant regions. The proposed method integrates multiple cancer samples so that it enables the detection of consistent aberrations across multiple samples. We then apply this method to glioblastoma multiforme and lung cancer data sets from the SNP microarray platform. Through this process, we confirm the ability to detect previously known cancer-related genes from both cancer types with high accuracy. Also, the application of this approach to a lung cancer data set identifies focal amplification regions that contain known oncogenes, though these regions are not reported by a recent CNA-detection algorithm, GISTIC: SMAD7 (chr18q21.1) and FGF10 (chr5p12). Conclusions Our results suggest that WIFA can be used to reveal cancer-related genes in various cancer data sets. PMID:21569311
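
    The multi-resolution intuition behind WIFA can be sketched with the simplest wavelet, a Haar-style dyadic average: a focal aberration stands out against its own coarse-scale smoothing, while a broad aberration is preserved by it. The toy copy-number profile below is invented, and this is not the published algorithm:

```python
# Multi-resolution view of a copy-number profile via Haar-style dyadic
# averaging: a broad aberration survives coarse smoothing, a focal one
# does not, so comparing scales isolates focal regions.
# Toy profile; not the WIFA implementation.

def dyadic_smooth(signal, levels):
    """Repeatedly average adjacent pairs (Haar approximation coefficients),
    then expand back to the original length for comparison."""
    s = list(signal)
    for _ in range(levels):
        s = [(s[i] + s[i + 1]) / 2 for i in range(0, len(s) - 1, 2)]
    out = []
    for v in s:
        out.extend([v] * (2 ** levels))
    return out[:len(signal)]

# 16 probes: broad gain over probes 0-7, focal spike at probes 10-11
profile = [1.0] * 8 + [0.0] * 8
profile[10] = profile[11] = 3.0

coarse = dyadic_smooth(profile, levels=3)      # 8-probe resolution
focal_score = [p - c for p, c in zip(profile, coarse)]
peak = max(range(16), key=lambda i: focal_score[i])
print(peak)   # an index inside the focal spike
```

    The broad gain cancels out in `focal_score` (fine and coarse scales agree there), while the focal spike survives, which is the scale-comparison idea behind detecting focal aberrations inside broadly aberrant regions.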

  13. Human rotavirus vaccine (Rotarix): focus on effectiveness and impact 6 years after first introduction in Africa.

    PubMed

    O'Ryan, Miguel; Giaquinto, Carlo; Benninghoff, Bernd

    2015-01-01

    A decade after licensure of the human rotavirus vaccine (HRV), a wealth of evidence supports a reduction of rotavirus (RV) gastroenteritis-associated mortality and hospitalizations following HRV inclusion in national immunization programs. Nevertheless, the majority of real-world data has been generated in high- or middle-income settings. Clinical efficacy trials previously indicated RV vaccine performance may be lower in less-developed countries compared with wealthier counterparts. Using recently published data from Africa, we examine the effectiveness and impact of HRV in resource-deprived areas, exploring whether vaccine performance differs by socioeconomic setting and the potential underlying factors. HRV vaccine effectiveness in early adopting African countries has proven to be similar or even superior to the efficacy results observed in pre-licensure studies.

  14. Remanent magnetization and three-dimensional density model of the Kentucky anomaly region

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Existing software was modified to handle 3-D density and magnetization models of the Kentucky body and is being tested. Gravity and magnetic anomaly data sets are ready for use. A preliminary block model is under construction using the 1:1,000,000 maps. An x-y grid to overlay the 1:2,500,000 Albers maps and keyed to the 1:1,000,000 scale block models was created. Software was developed to generate a smoothed MAGSAT data set over this grid; this is to be input to an inversion program for generating the regional magnetization map. The regional scale 1:2,500,000 map mosaic is being digitized using previous magnetization models, the U.S. magnetic anomaly map, and regional tectonic maps as a guide.

  15. Building an ACT-R Reader for Eye-Tracking Corpus Data.

    PubMed

    Dotlačil, Jakub

    2018-01-01

    Cognitive architectures have often been applied to data from individual experiments. In this paper, I develop an ACT-R reader that can model a much larger set of data: eye-tracking corpus data. It is shown that the resulting model has a good fit to the data for the considered low-level processes. Unlike previous related work (most prominently, Engelmann, Vasishth, Engbert & Kliegl), the model achieves the fit by estimating free parameters of ACT-R using Bayesian estimation and Markov chain Monte Carlo (MCMC) techniques, rather than by relying on a mix of manual selection and default values. The method used in the paper is generalizable beyond this particular model and data set and could be used on other ACT-R models. Copyright © 2017 Cognitive Science Society, Inc.
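
    The estimation strategy, Bayesian estimation of free parameters via MCMC rather than manually chosen defaults, can be sketched with a bare-bones Metropolis sampler. The Gaussian likelihood, flat prior, and data below are illustrative stand-ins, not the paper's ACT-R model:

```python
import math, random

# Minimal Metropolis sampler: estimate a single "latency" parameter
# from observed reading times instead of fixing it at a default value.
# Likelihood, prior, and data are fabricated simplifications.

def log_lik(theta, data, sd=0.05):
    # Gaussian log-likelihood (up to a constant), flat prior assumed
    return sum(-0.5 * ((x - theta) / sd) ** 2 for x in data)

def metropolis(data, start=1.0, step=0.05, n=5000, seed=1):
    rng = random.Random(seed)
    theta, samples = start, []
    ll = log_lik(theta, data)
    for _ in range(n):
        prop = theta + rng.gauss(0, step)
        ll_prop = log_lik(prop, data)
        if math.log(rng.random()) < ll_prop - ll:   # accept/reject step
            theta, ll = prop, ll_prop
        samples.append(theta)
    return samples

data = [0.21, 0.25, 0.23, 0.22, 0.24]       # fake fixation durations (s)
post = metropolis(data)[1000:]              # drop burn-in samples
print(round(sum(post) / len(post), 2))      # posterior mean estimate
```

    The posterior mean settles near the data mean even though the chain starts far away, illustrating how a free parameter is learned from the corpus rather than hand-picked.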

  16. Treatment and outcome of patients with chest wall recurrence after mastectomy and breast reconstruction.

    PubMed

    Chagpar, Anees; Langstein, Howard N; Kronowitz, Steven J; Singletary, S Eva; Ross, Merrick I; Buchholz, Thomas A; Hunt, Kelly K; Kuerer, Henry M

    2004-02-01

    Chest wall recurrence (CWR) in the setting of previous mastectomy and breast reconstruction can pose complex management dilemmas for clinicians. We examined the impact of breast reconstruction on the treatment and outcomes of patients who subsequently developed a CWR. Between 1988 and 1998, 155 breast cancer patients with CWR after mastectomy were evaluated at our center. Of these patients, 27 had previously undergone breast reconstruction (immediate in 20; delayed in 7). Clinicopathologic features, treatment decisions, and outcomes were compared between the patients with and without previous breast reconstruction. Nonparametric statistics were used to analyse the data. There were no significant differences between the reconstruction and no-reconstruction groups in time to CWR, size of the CWR, number of nodules, ulceration, erythema, and association of CWR with nodal metastases. In patients with previous breast reconstruction, surgical resection of the CWR and repair of the resulting defect tended to be more complex and was more likely to require chest wall reconstruction by the plastic surgery team rather than simple excision or resection with primary closure (26% [7 of 27] versus 8% [10 of 128], P = 0.013). Risk of a second CWR, risk of distant metastases, median overall survival after CWR, and distant-metastasis-free survival after CWR did not differ significantly between patients with and without previous breast reconstruction. Breast reconstruction after mastectomy does not influence the clinical presentation or prognosis of women who subsequently develop a CWR. Collaboration with a plastic surgery team may be beneficial in the surgical management of these patients.

  17. Discovery of Transcription Factors Novel to Mouse Cerebellar Granule Cell Development Through Laser-Capture Microdissection.

    PubMed

    Zhang, Peter G Y; Yeung, Joanna; Gupta, Ishita; Ramirez, Miguel; Ha, Thomas; Swanson, Douglas J; Nagao-Sato, Sayaka; Itoh, Masayoshi; Kawaji, Hideya; Lassmann, Timo; Daub, Carsten O; Arner, Erik; de Hoon, Michiel; Carninci, Piero; Forrest, Alistair R R; Hayashizaki, Yoshihide; Goldowitz, Dan

    2018-06-01

    Laser-capture microdissection was used to isolate external germinal layer tissue from three developmental periods of mouse cerebellar development: embryonic days 13, 15, and 18. The cerebellar granule cell-enriched mRNA library was generated with next-generation sequencing using the Helicos technology. Our objective was to discover transcriptional regulators that could be important for the development of cerebellar granule cells, the most numerous neuron type in the central nervous system. Through differential expression analysis, we have identified 82 differentially expressed transcription factors (TFs) from a total of 1311 differentially expressed genes. In addition, with TF-binding sequence analysis, we have identified 46 TF candidates that could be key regulators responsible for the variation in the granule cell transcriptome between developmental stages. Altogether, we identified 125 potential TFs (82 from differential expression analysis, 46 from motif analysis, with 3 overlaps between the two sets). From this gene set, 37 TFs are considered novel due to the lack of previous knowledge about their roles in cerebellar development. The results from transcriptome-wide analyses were validated with existing online databases, qRT-PCR, and in situ hybridization. This study provides an initial insight into the TFs of cerebellar granule cells that might be important for development and provides valuable information for further functional studies on these transcriptional regulators.
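
    The differential-expression step can be sketched as ranking genes by a two-sample test statistic between developmental stages. Below, a Welch t-statistic on fabricated replicate values; this is a stand-in for, not a reproduction of, the study's pipeline:

```python
import math

# Rank genes by a Welch t-statistic between two developmental stages,
# a simple stand-in for the differential-expression step.
# Expression values are fabricated for illustration.

def welch_t(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# gene -> (stage E13 replicates, stage E18 replicates); names hypothetical
genes = {
    "TF_A": ([5.1, 5.3, 5.0], [8.9, 9.2, 9.0]),   # strongly up at E18
    "TF_B": ([4.0, 4.2, 4.1], [4.1, 4.0, 4.2]),   # essentially unchanged
}
ranked = sorted(genes, key=lambda g: abs(welch_t(*genes[g])), reverse=True)
print(ranked[0])
```

    In practice one would also convert the statistics to multiple-testing-corrected p-values before calling a gene differentially expressed.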

  18. Transcriptional responses in thyroid tissues from rats treated with a tumorigenic and a non-tumorigenic triazole conazole fungicide.

    PubMed

    Hester, Susan D; Nesnow, Stephen

    2008-03-15

    Conazoles are azole-containing fungicides that are used in agriculture and medicine. Conazoles can induce follicular cell adenomas of the thyroid in rats after chronic bioassay. The goal of this study was to identify pathways and networks of genes that were associated with thyroid tumorigenesis through transcriptional analyses. To this end, we compared transcriptional profiles from tissues of rats treated with a tumorigenic and a non-tumorigenic conazole. Triadimefon, a rat thyroid tumorigen, and myclobutanil, which was not tumorigenic in rats after a 2-year bioassay, were administered in the feed to male Wistar/Han rats for 30 or 90 days, similar to the treatment conditions previously used in their chronic bioassays. Thyroid gene expression was determined using high density Affymetrix GeneChips (Rat 230_2). Gene expression was analyzed by the Gene Set Expression Analyses method, which clearly separated the tumorigenic treatments (tumorigenic response group (TRG)) from the non-tumorigenic treatments (non-tumorigenic response group (NRG)). Core genes from these gene sets were mapped to canonical, metabolic, and GeneGo processes, and these processes were compared across groups and treatment times. Extensive analyses were performed on the 30-day gene sets as they represented the major perturbations. Gene sets in the 30-day TRG group showed overrepresentation of fatty acid metabolism, oxidation, and degradation processes (including PPARgamma and CYP involvement), and of cell proliferation responses. Core genes from these gene sets were combined into networks and found to possess signaling interactions. In addition, the core genes in each gene set were compared with genes known to be associated with human thyroid cancer. Among the genes that appeared in both rat and human data sets were: Acaca, Asns, Cebpg, Crem, Ddit3, Gja1, Grn, Jun, Junb, and Vegf. These genes were major contributors in the previously developed network from triadimefon-treated rat thyroids. 
It is postulated that triadimefon induces oxidative response genes and activates the nuclear receptor, Ppargamma, initiating transcription of gene products and signaling to a series of genes involved in cell proliferation.

  19. A reexamination of age-related variation in body weight and morphometry of Maryland nutria

    USGS Publications Warehouse

    Sherfy, M.H.; Mollett, T.A.; McGowan, K.R.; Daugherty, S.L.

    2006-01-01

    Age-related variation in morphometry has been documented for many species. Knowledge of growth patterns can be useful for modeling energetics, detecting physiological influences on populations, and predicting age. These benefits have shown value in understanding population dynamics of invasive species, particularly in developing efficient control and eradication programs. However, development and evaluation of descriptive and predictive models is a critical initial step in this process. Accordingly, we used data from necropsies of 1,544 nutria (Myocastor coypus) collected in Maryland, USA, to evaluate the accuracy of previously published models for prediction of nutria age from body weight. Published models underestimated body weights of our animals, especially for ages <3. We used cross-validation procedures to develop and evaluate models for describing nutria growth patterns and for predicting nutria age. We derived models from a randomly selected model-building data set (n = 192-193 M, 217-222 F) and evaluated them with the remaining animals (n = 487-488 M, 642-647 F). We used nonlinear regression to develop Gompertz growth-curve models relating morphometric variables to age. Predicted values of morphometric variables fell within the 95% confidence limits of their true values for most age classes. We also developed predictive models for estimating nutria age from morphometry, using linear regression of log-transformed age on morphometric variables. The evaluation data set corresponded with 95% prediction intervals from the new models. Predictive models for body weight and length provided greater accuracy and less bias than models for foot length and axillary girth. Our growth models accurately described age-related variation in nutria morphometry, and our predictive models provided accurate estimates of ages from morphometry that will be useful for live-captured individuals. 
Our models offer better accuracy and precision than previously published models, providing a capacity for modeling energetics and growth patterns of Maryland nutria as well as an empirical basis for determining population age structure from live-captured animals.
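
    The Gompertz growth-curve approach described in this record can be sketched numerically. The parameter values below are hypothetical placeholders, not the published Maryland estimates; the point is that the Gompertz curve has a closed-form inverse, which is what makes predicting age from a morphometric measurement tractable.

```python
import numpy as np

def gompertz(age, A, b, k):
    """Gompertz growth curve: asymptotic size A, displacement b, growth rate k."""
    return A * np.exp(-b * np.exp(-k * age))

def age_from_weight(w, A, b, k):
    """Closed-form inverse of the Gompertz curve: estimate age from body weight."""
    return -np.log(-np.log(w / A) / b) / k

# Hypothetical parameters (NOT the published nutria estimates)
A, b, k = 8.0, 2.5, 0.9   # asymptotic weight ~8 kg

ages = np.array([0.5, 1.0, 2.0, 4.0])
weights = gompertz(ages, A, b, k)
recovered = age_from_weight(weights, A, b, k)
print(np.allclose(recovered, ages))  # round-trip check
```

    In practice the parameters would be estimated by nonlinear regression on the model-building data set, and prediction intervals checked against the held-out evaluation animals, as the authors describe.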

  20. A Bayesian compound stochastic process for modeling nonstationary and nonhomogeneous sequence evolution.

    PubMed

    Blanquart, Samuel; Lartillot, Nicolas

    2006-11-01

    Variations of nucleotidic composition affect phylogenetic inference conducted under stationary models of evolution. In particular, they may cause unrelated taxa sharing similar base composition to be grouped together in the resulting phylogeny. To address this problem, we developed a nonstationary and nonhomogeneous model accounting for compositional biases. Unlike previous nonstationary models, which are branchwise, that is, assume that base composition only changes at the nodes of the tree, in our model, the process of compositional drift is totally uncoupled from the speciation events. In addition, the total number of events of compositional drift distributed across the tree is directly inferred from the data. We implemented the method in a Bayesian framework, relying on Markov Chain Monte Carlo algorithms, and applied it to several nucleotidic data sets. In most cases, the stationarity assumption was rejected in favor of our nonstationary model. In addition, we show that our method is able to resolve a well-known artifact. By Bayes factor evaluation, we compared our model with 2 previously developed nonstationary models. We show that the coupling between speciations and compositional shifts inherent to branchwise models may lead to an overparameterization, resulting in a lesser fit. In some cases, this leads to incorrect conclusions, concerning the nature of the compositional biases. In contrast, our compound model more flexibly adapts its effective number of parameters to the data sets under investigation. Altogether, our results show that accounting for nonstationary sequence evolution may require more elaborate and more flexible models than those currently used.

  1. Signal processing for the detection of explosive residues on varying substrates using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie

    2011-05-01

    Laser-induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive chemical analysis of substances with little to no sample preparation. LIBS is therefore a viable technology for detecting substances of interest in near-real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection; however, the classifiers developed on such data perform best against residue/substrate pairs that were included in model training and do not perform well when the residue/substrate pairs are absent from the training set. Specifically, residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model the LIBS spectra resulting from the residue and the substrate to separate the responses of the two components. This separation is performed jointly with classifier design to ensure that the resulting classifier detects residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to substrate variation compared with standard chemometric techniques for residue detection.

  2. MASCC/ISOO clinical practice guidelines for the management of mucositis secondary to cancer therapy.

    PubMed

    Lalla, Rajesh V; Bowen, Joanne; Barasch, Andrei; Elting, Linda; Epstein, Joel; Keefe, Dorothy M; McGuire, Deborah B; Migliorati, Cesar; Nicolatou-Galitis, Ourania; Peterson, Douglas E; Raber-Durlacher, Judith E; Sonis, Stephen T; Elad, Sharon

    2014-05-15

    Mucositis is a highly significant, and sometimes dose-limiting, toxicity of cancer therapy. The goal of this systematic review was to update the Multinational Association of Supportive Care in Cancer and International Society of Oral Oncology (MASCC/ISOO) Clinical Practice Guidelines for mucositis. A literature search was conducted to identify eligible published articles, based on predefined inclusion/exclusion criteria. Each article was independently reviewed by 2 reviewers. Studies were rated according to the presence of major and minor flaws as per previously published criteria. The body of evidence for each intervention, in each treatment setting, was assigned a level of evidence, based on previously published criteria. Guidelines were developed based on the level of evidence, with 3 possible guideline determinations: recommendation, suggestion, or no guideline possible. The literature search identified 8279 papers, 1032 of which were retrieved for detailed evaluation based on titles and abstracts. Of these, 570 qualified for final inclusion in the systematic reviews. Sixteen new guidelines were developed for or against the use of various interventions in specific treatment settings. In total, the MASCC/ISOO Mucositis Guidelines now include 32 guidelines: 22 for oral mucositis and 10 for gastrointestinal mucositis. This article describes these updated guidelines. The updated MASCC/ISOO Clinical Practice Guidelines for mucositis will help clinicians provide evidence-based management of mucositis secondary to cancer therapy. © 2014 The Authors. Cancer published by Wiley Periodicals, Inc. on behalf of American Cancer Society.

  3. Goal setting, using goal attainment scaling, as a method to identify patient selected items for measuring arm function.

    PubMed

    Ashford, Stephen; Jackson, Diana; Turner-Stokes, Lynne

    2015-03-01

    Following stroke or brain injury, goals for rehabilitation of the hemiparetic upper limb include restoring active function if there is return of motor control or, if none is possible, improving passive function and facilitating care for the limb. To inform development of a new patient-reported outcome measure (PROM) of active and passive function in the hemiparetic upper limb, the Arm Activity measure, we examined functional goals for the upper limb identified during goal setting for spasticity intervention (physical therapy and concomitant botulinum toxin A interventions). Using secondary analysis of a prospective observational cohort study, functional goals determined between patients, their carers, and the clinical team were assigned into categories by two raters. Goal category identification, followed by assignment of goals to a category, was undertaken and then confirmed by a second reviewer. Participants comprised nine males and seven females of mean (SD) age 54.5 (15.7) years, and their carers. Fifteen had sustained a stroke and one a traumatic brain injury. Goals were used to identify five categories: passive function, active function, symptoms, cosmesis, and impairment. Two passive function items not identified in a previous systematic review emerged. Analysis of goals important to patients and carers revealed items for inclusion in a new measure of arm function and provided a useful alternative method of involving patients and carers in standardised measure development. Copyright © 2014 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  4. Respiratory rate estimation during triage of children in hospitals.

    PubMed

    Shah, Syed Ahmar; Fleming, Susannah; Thompson, Matthew; Tarassenko, Lionel

    2015-01-01

    Accurate assessment of a child's health is critical for appropriate allocation of medical resources and timely delivery of healthcare in Emergency Departments. The accurate measurement of vital signs is a key step in the determination of the severity of illness, and respiratory rate is currently the most difficult vital sign to measure accurately. Several previous studies have attempted to extract respiratory rate from photoplethysmogram (PPG) recordings. However, the majority have been conducted in controlled settings using PPG recordings from healthy subjects. In many studies, manual selection of clean sections of PPG recordings was undertaken before assessing the accuracy of the signal processing algorithms developed. Such selection procedures are not appropriate in clinical settings. A major limitation of AR modelling, previously applied to respiratory rate estimation, is the appropriate selection of model order. This study developed a novel algorithm that automatically estimates respiratory rate from a median spectrum constructed by applying multiple AR models to processed PPG segments acquired with pulse oximetry using a finger probe. Good-quality sections were identified using a dynamic template-matching technique to assess PPG signal quality. The algorithm was validated on 205 children presenting to the Emergency Department at the John Radcliffe Hospital, Oxford, UK, with reference respiratory rates up to 50 breaths per minute estimated by paediatric nurses. At the time of writing, the authors are not aware of any other study that has validated respiratory rate estimation using data collected from over 200 children in hospitals during routine triage.
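
    A minimal sketch of the median-spectrum idea follows, assuming a synthetic 0.4 Hz (24 breaths/min) respiratory signal and a simple Yule-Walker AR fit; the paper's actual pipeline (PPG preprocessing, dynamic template-matching quality assessment) is not reproduced here. Taking the median of spectra across several model orders sidesteps the need to pick a single "correct" order.

```python
import numpy as np

def ar_spectrum(x, order, freqs, fs):
    """Fit an AR model by the Yule-Walker equations, then evaluate its
    power spectrum (up to a constant) on a frequency grid."""
    x = x - x.mean()
    r = np.correlate(x, x, mode='full')[len(x) - 1:] / len(x)   # autocovariance
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])                      # AR coefficients
    z = np.exp(-2j * np.pi * freqs / fs)
    denom = 1 - sum(a[k] * z**(k + 1) for k in range(order))
    return 1.0 / np.abs(denom)**2

fs = 4.0                                     # Hz, hypothetical resampling rate
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.4 * t) + 0.2 * np.random.default_rng(0).normal(size=t.size)

freqs = np.linspace(0.1, 1.0, 200)           # plausible breathing band
spectra = [ar_spectrum(x, p, freqs, fs) for p in range(4, 12)]
median_spec = np.median(spectra, axis=0)     # median across model orders
rr = 60 * freqs[np.argmax(median_spec)]      # breaths per minute
print(round(rr))
```

    The peak of the median spectrum lands near the true respiratory frequency even though no single model order was selected.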

  5. Gas Migration Project: Risk Assessment Tool and Computational Analyses to Investigate Wellbore/Mine Interactions, Secretary's Potash Area, Southeastern New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobolik, Steven R.; Hadgu, Teklu; Rechard, Robert P.

    The Bureau of Land Management (BLM), US Department of the Interior has asked Sandia National Laboratories (SNL) to perform scientific studies relevant to technical issues that arise in the development of co-located resources of potash and petroleum in southeastern New Mexico in the Secretary’s Potash Area. The BLM manages resource development, issues permits and interacts with the State of New Mexico in the process of developing regulations, in an environment where many issues are disputed by industry stakeholders. The present report is a deliverable of the study of the potential for gas migration from a wellbore to a mine opening in the event of wellbore leakage, a risk scenario about which there is disagreement among stakeholders and little previous site specific analysis. One goal of this study was to develop a framework that required collaboratively developed inputs and analytical approaches in order to encourage stakeholder participation and to employ ranges of data values and scenarios. SNL presents here a description of a basic risk assessment (RA) framework that will fulfill the initial steps of meeting that goal. SNL used the gas migration problem to set up example conceptual models, parameter sets and computer models and as a foundation for future development of RA to support BLM resource development.

  6. Prediction of three social cognitive-motivational structure types.

    PubMed

    Malerstein, A J; Ahern, M M; Pulos, S

    2001-10-01

    Previously, using interviews from Baumrind's longitudinal study, three cognitive-motivational structures (CMSs) were predicted in 68 adolescents from caregiving settings and from the CMS types of their mothers, based on the mothers' interviews elicited six years earlier. CMS theory proposes that during Piaget's Concrete Operational Period care-receiving influences the child's adoption of a social cognitive style, which corresponds to one of Piaget's stages of cognitive development. One who is classified as an Operational experiences the caregiving setting as tuned to the child's long-term interests, becomes focused on function and control of function and grasps the distinctions between and gradations of social attributes. One classified as future Intuitive experiences the caregiving as insufficient or unreliable and becomes focused on getting and having, and assesses social situations based on current striking dimensions. A person classified as being future Symbolic experiences the caregiving as out of tune with the self or the world, becomes focused on identity and emotional closeness, and may define self or object by a single attribute. This previous study did not distinguish between the influence of caregiving (including mothers' CMS) on the formation of adolescent CMS type and the possible constancy of CMS type from ages 9 to 15 years. The current study was designed to distinguish between these two possibilities, using data from 67 of the same mothers. Mothers' interviews were purged of descriptions of her child's behavior. Another interview was composed of the purged descriptions of child behavior. This was also done for interviews held when the child was 4 and 15 as well as at 9. From interviews with descriptions of child behavior purged, mother's CMS type at the child's age of 4 and 9 yr. agreed with her adolescent's previously assigned CMS type (p<.05), and caregiving setting at 9 years predicted the adolescent's CMS type (p<.05). 
From interviews composed of descriptions of only the child's behavior, adolescent CMS type agreed with previously assigned adolescent CMS type (p<.01). Findings were consonant with the idea that CMS type formation is influenced at about Age 9 and sufficiently established to be recognized at Age 15.

  7. Ensemble representations: effects of set size and item heterogeneity on average size perception.

    PubMed

    Marchant, Alexander P; Simons, Daniel J; de Fockert, Jan W

    2013-02-01

    Observers can accurately perceive and evaluate the statistical properties of a set of objects, forming what is now known as an ensemble representation. The accuracy and speed with which people can judge the mean size of a set of objects have led to the proposal that ensemble representations of average size can be computed in parallel when attention is distributed across the display. Consistent with this idea, judgments of mean size show little or no decrement in accuracy when the number of objects in the set increases. However, the lack of a set size effect might result from the regularity of the item sizes used in previous studies. Here, we replicate these previous findings, but show that judgments of mean set size become less accurate when set size increases and the heterogeneity of the item sizes increases. This pattern can be explained by assuming that average size judgments are computed using a limited capacity sampling strategy, and it does not necessitate an ensemble representation computed in parallel across all items in a display. Copyright © 2012 Elsevier B.V. All rights reserved.
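
    The limited-capacity sampling account can be illustrated with a toy simulation: an "observer" who averages only k randomly sampled items (with replacement, for simplicity) makes larger errors as item heterogeneity grows, with no parallel ensemble computation required. All numbers here are illustrative, not fits to the reported data.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_estimate_error(set_size, spread, k=2, trials=20000):
    """Mean absolute error when the 'observer' averages only k sampled items
    instead of computing the true mean of the whole set."""
    sizes = rng.uniform(1 - spread, 1 + spread, size=(trials, set_size))
    true_mean = sizes.mean(axis=1)
    idx = rng.integers(0, set_size, size=(trials, k))          # sample k items
    sample_mean = np.take_along_axis(sizes, idx, axis=1).mean(axis=1)
    return np.abs(sample_mean - true_mean).mean()

low_het = mean_estimate_error(set_size=8, spread=0.1)
high_het = mean_estimate_error(set_size=8, spread=0.5)
print(high_het > low_het)   # error grows with item heterogeneity
```

    With homogeneous item sizes the sampling strategy looks deceptively accurate, which is consistent with the authors' point that earlier studies may have masked the set size effect.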

  8. ECHO: A reference-free short-read error correction algorithm

    PubMed Central

    Kao, Wei-Chun; Chan, Andrew H.; Song, Yun S.

    2011-01-01

    Developing accurate, scalable algorithms to improve data quality is an important computational challenge associated with recent advances in high-throughput sequencing technology. In this study, a novel error-correction algorithm, called ECHO, is introduced for correcting base-call errors in short-reads, without the need of a reference genome. Unlike most previous methods, ECHO does not require the user to specify parameters whose optimal values are typically unknown a priori. ECHO automatically sets the parameters in the assumed model and estimates error characteristics specific to each sequencing run, while maintaining a running time that is within the range of practical use. ECHO is based on a probabilistic model and is able to assign a quality score to each corrected base. Furthermore, it explicitly models heterozygosity in diploid genomes and provides a reference-free method for detecting bases that originated from heterozygous sites. On both real and simulated data, ECHO is able to improve the accuracy of previous error-correction methods by severalfold to an order of magnitude, depending on the sequence coverage depth and the position in the read. The improvement is most pronounced toward the end of the read, where previous methods become noticeably less effective. Using a whole-genome yeast data set, it is demonstrated here that ECHO is capable of coping with nonuniform coverage. Also, it is shown that using ECHO to perform error correction as a preprocessing step considerably facilitates de novo assembly, particularly in the case of low-to-moderate sequence coverage depth. PMID:21482625

  9. Inhalability for aerosols at ultra-low windspeeds

    NASA Astrophysics Data System (ADS)

    Sleeth, Darrah K.; Vincent, James H.

    2009-02-01

    Most previous experimental studies of aerosol inhalability were conducted in wind tunnels for windspeeds greater than 0.5 m s-1. While that body of work was used to establish a convention for the inhalable fraction, results from studies in calm air chambers (for essentially zero windspeed) are being discussed as the basis of a modified criterion. However, information is lacking for windspeeds in the intermediate range, which - it so happens - pertain to most actual workplaces. With this in mind, we have developed a new experimental system to assess inhalability - and, ultimately, personal sampler performance - for aerosols with particle aerodynamic diameter within the range from about 9 to 90 μm for ultra-low windspeed environments from about 0.1 to 0.5 m s-1. This new system contains an aerosol test facility, fully described elsewhere, that combines the physical attributes and performance characteristics of moving air wind tunnels and calm air chambers, both of which have featured individually in previous research. It also contains a specially-designed breathing, heated, life-sized mannequin that allows for accurate recovery of test particulate material that has been inhaled. Procedures have been developed that employ test aerosols of well-defined particle size distribution generated mechanically from narrowly-graded powders of fused alumina. Using this new system, we have conducted an extensive set of new experiments to measure the inhalability of a human subject (as represented by the mannequin), aimed at filling the current knowledge gap for conditions that are more realistic than those embodied in most previous research. These data reveal that inhalability throughout the range of interest is significantly different based on windspeed, indicating a rise in aspiration efficiency as windspeed decreases. Breathing flowrate and mode of breathing (i.e. nose versus mouth breathing) did not show significant differences for the inhalability of aerosols. 
On the whole however, the data obtained here are within the range of inhalability data that exist from the large body of the previous experimental work performed at the higher windspeeds. These latest findings are an important contribution to the ongoing discussion in international standards-setting bodies about the possible adjustment of the quantitative definition of what constitutes the inhalable fraction.

  10. Differential Binding between Volatile Ligands and Major Urinary Proteins Due to Genetic Variation in Mice

    DTIC Science & Technology

    2012-06-20

    Previous studies have examined only one of the classes at a time. No study has analyzed these two sets simultaneously, and consequently binding …

  11. Migration of Dust Particles and Their Collisions with the Terrestrial Planets

    NASA Technical Reports Server (NTRS)

    Ipatov, S. I.; Mather, J. C.

    2004-01-01

    Our review of previously published papers on dust migration can be found in [1], where we also present different distributions of migrating dust particles. We considered a different set of initial orbits for the dust particles than those in the previous papers. Below we focus on the probabilities of collisions of migrating dust particles with the planets, based on sets of orbital elements obtained during their evolution. Such probabilities have not been calculated previously.

  12. Bilinguals Use Language-Specific Articulatory Settings

    ERIC Educational Resources Information Center

    Wilson, Ian; Gick, Bryan

    2014-01-01

    Purpose: Previous work has shown that monolingual French and English speakers use distinct articulatory settings, the underlying articulatory posture of a language. In the present article, the authors report on an experiment in which they investigated articulatory settings in bilingual speakers. The authors first tested the hypothesis that in…

  13. Evaluating a scalable model for implementing electronic health records in resource-limited settings.

    PubMed

    Were, Martin C; Emenyonu, Nneka; Achieng, Marion; Shen, Changyu; Ssali, John; Masaba, John P M; Tierney, William M

    2010-01-01

    Current models for implementing electronic health records (EHRs) in resource-limited settings may not be scalable because they fail to address human-resource and cost constraints. This paper describes an implementation model which relies on shared responsibility between local sites and an external three-pronged support infrastructure consisting of: (1) a national technical expertise center, (2) an implementer's community, and (3) a developer's community. This model was used to implement an open-source EHR in three Ugandan HIV-clinics. Pre-post time-motion study at one site revealed that Primary Care Providers spent a third less time in direct and indirect care of patients (p<0.001) and 40% more time on personal activities (p=0.09) after EHRs implementation. Time spent by previously enrolled patients with non-clinician staff fell by half (p=0.004) and with pharmacy by 63% (p<0.001). Surveyed providers were highly satisfied with the EHRs and its support infrastructure. This model offers a viable approach for broadly implementing EHRs in resource-limited settings.

  14. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computed tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied for normal and fat-accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.
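
    A two-class LDA posterior on toy multi-channel "voxels" sketches how probability maps of the kind described might be produced; the class means, noise level, and channel count are invented for illustration, and the paper's multiclass formulation, region growing, and prior knowledge steps are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-channel MR data: 3 "weightings" per voxel, two tissue classes
liver = rng.normal([1.0, 0.5, -0.5], 0.3, size=(500, 3))
other = rng.normal([-0.5, -0.2, 0.8], 0.3, size=(500, 3))
X = np.vstack([liver, other])
y = np.r_[np.ones(500), np.zeros(500)]

# Two-class LDA: class means plus a shared covariance yield a linear discriminant
mu1, mu0 = X[y == 1].mean(0), X[y == 0].mean(0)
Xc = np.vstack([X[y == 1] - mu1, X[y == 0] - mu0])
S = Xc.T @ Xc / (len(X) - 2)              # pooled within-class covariance
w = np.linalg.solve(S, mu1 - mu0)         # discriminant direction
b = -0.5 * (mu1 + mu0) @ w                # threshold, equal priors assumed

prob = 1 / (1 + np.exp(-(X @ w + b)))     # per-voxel liver probability "map"
print((prob[y == 1] > 0.5).mean() > 0.95)
```

    In the actual pipeline such probabilities, computed per voxel, would be assembled into 3D maps that seed the region-growing and thresholding steps.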

  15. Machine learning for epigenetics and future medical applications.

    PubMed

    Holder, Lawrence B; Haque, M Muksitul; Skinner, Michael K

    2017-07-03

    Understanding epigenetic processes holds immense promise for medical applications. Advances in Machine Learning (ML) are critical to realize this promise. Previous studies used epigenetic data sets associated with the germline transmission of epigenetic transgenerational inheritance of disease and novel ML approaches to predict genome-wide locations of critical epimutations. A combination of Active Learning (ACL) and Imbalanced Class Learning (ICL) was used to address past problems with ML, yielding a more efficient feature-selection process and handling the class imbalance present in all genomic data sets. These results suggest the power of this novel ML approach to predict epigenetic phenomena and associated disease. The current approach requires extensive computation of features over the genome. A promising new approach is to introduce Deep Learning (DL) for the generation and simultaneous computation of novel genomic features tuned to the classification task. This approach can be used with any genomic or biological data set applied to medicine. The application of molecular epigenetic data in advanced machine learning analysis to medicine is the focus of this review.

  16. Veterinary software application for comparison of thermograms for pathology evaluation

    NASA Astrophysics Data System (ADS)

    Pant, Gita; Umbaugh, Scott E.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-09-01

    The bilateral symmetry property in mammals allows for the detection of pathology by comparison of opposing sides. For any pathological disorder, thermal patterns differ compared to the normal body part. A software application for veterinary clinics has been under development to input two thermograms of body parts on both sides, one normal and the other unknown; the application compares them based on extracted features and appropriate similarity and difference measures and outputs the likelihood of pathology. Here, thermographic image data from 19 °C to 40 °C were linearly remapped to create images with 256 gray-level values. Features were extracted from these images, including histogram, texture and spectral features. The comparison metrics used are the vector inner product, Tanimoto, Euclidean, city block, Minkowski and maximum value metrics. Previous research with the anterior cruciate ligament (ACL) pathology in dogs suggested that any thermogram variation below a threshold of 40% of the Euclidean distance is normal and above 40% is abnormal. Here, the 40% threshold was applied to a new ACL image set and achieved a sensitivity of 75%, an improvement from the 55% sensitivity of the previous work. With the new data set it was determined that using a threshold of 20% provided a much improved sensitivity of 92%. However, further research is required to determine the corresponding specificity. Additionally, it was found that the anterior view provided better results than the lateral view. It was also determined that better results were obtained with all three feature sets than with just the histogram and texture sets. Further experiments are ongoing with larger image datasets and additional pathologies, new features, and evaluation of comparison metrics to determine more accurate threshold values for separating normal and abnormal images.
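
    The thresholded Euclidean comparison could look roughly like the following; the percentage normalization and the feature values are assumptions for illustration, since the abstract does not give the exact formula.

```python
import numpy as np

def percent_euclidean_difference(f_normal, f_unknown):
    """Euclidean distance between feature vectors, expressed as a percentage
    of the normal side's feature magnitude (one plausible normalization)."""
    return 100 * np.linalg.norm(f_unknown - f_normal) / np.linalg.norm(f_normal)

def classify(f_normal, f_unknown, threshold=20.0):
    """Flag the unknown side as abnormal when the difference exceeds the threshold."""
    if percent_euclidean_difference(f_normal, f_unknown) > threshold:
        return "abnormal"
    return "normal"

# Hypothetical histogram/texture feature vectors for two opposing limbs
normal_side  = np.array([0.42, 1.30, 7.5, 0.11])
similar_side = np.array([0.40, 1.28, 7.6, 0.12])   # close to the normal side
hot_side     = np.array([0.60, 1.90, 9.8, 0.25])   # thermally asymmetric

print(classify(normal_side, similar_side))  # -> normal
print(classify(normal_side, hot_side))      # -> abnormal
```

    Lowering the threshold from 40% to 20%, as the study reports, trades specificity for sensitivity: more genuinely abnormal limbs cross the line, but so do more benign asymmetries.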

  17. Optic disc segmentation for glaucoma screening system using fundus images.

    PubMed

    Almazroa, Ahmed; Sun, Weiwei; Alodhayb, Sami; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2017-01-01

    Segmenting the optic disc (OD) is an important and essential step in creating a frame of reference for diagnosing optic nerve head pathologies such as glaucoma. Therefore, a reliable OD segmentation technique is necessary for automatic screening of optic nerve head abnormalities. The main contribution of this paper is in presenting a novel OD segmentation algorithm based on applying a level set method on a localized OD image. To prevent the blood vessels from interfering with the level set process, an inpainting technique was applied. A further contribution was to incorporate the variation in opinion among ophthalmologists in detecting the disc boundaries and diagnosing glaucoma. Most previous studies were trained and tested based on only one opinion, which can be assumed to be biased toward that ophthalmologist. In addition, the accuracy was calculated based on the number of images that coincided with the ophthalmologists' agreed-upon images, and not only on the overlapping images as in previous studies. The ultimate goal of this project is to develop an automated image processing system for glaucoma screening. The disc algorithm is evaluated using a new retinal fundus image dataset called RIGA (retinal images for glaucoma analysis). In the case of low-quality images, a double level set was applied, in which the first level set was considered to be a localization for the OD. Five hundred and fifty images were used to test the algorithm accuracy as well as the agreement among the manual markings of six ophthalmologists. The accuracy of the algorithm in marking the optic disc area and centroid was 83.9%, and the best agreement was observed between the results of the algorithm and manual markings in 379 images.

  18. A point-charge force field for molecular mechanics simulations of proteins based on condensed-phase quantum mechanical calculations.

    PubMed

    Duan, Yong; Wu, Chun; Chowdhury, Shibasish; Lee, Mathew C; Xiong, Guoming; Zhang, Wei; Yang, Rong; Cieplak, Piotr; Luo, Ray; Lee, Taisung; Caldwell, James; Wang, Junmei; Kollman, Peter

    2003-12-01

    Molecular mechanics models have been applied extensively to study the dynamics of proteins and nucleic acids. Here we report the development of a third-generation point-charge all-atom force field for proteins. Following the earlier approach of Cornell et al., the charge set was obtained by fitting to the electrostatic potentials of dipeptides calculated using B3LYP/cc-pVTZ//HF/6-31G** quantum mechanical methods. The main-chain torsion parameters were obtained by fitting to the energy profiles of Ace-Ala-Nme and Ace-Gly-Nme dipeptides calculated using MP2/cc-pVTZ//HF/6-31G** quantum mechanical methods. All other parameters were taken from the existing AMBER database. The major departure from previous force fields is that all quantum mechanical calculations were done in the condensed phase with continuum solvent models and an effective dielectric constant of epsilon = 4. We anticipate that this force field parameter set will address certain critical shortcomings of previous force fields in condensed-phase simulations of proteins. Initial tests on peptides demonstrated a high degree of similarity between the calculated and the statistically measured Ramachandran maps for both Ace-Gly-Nme and Ace-Ala-Nme dipeptides. Some highlights of our results include (1) a well-preserved balance between the extended and helical region distributions, and (2) a favorable type-II poly-proline helical region in agreement with recent experiments. Backward compatibility between the new and Cornell et al. charge sets, as judged by overall agreement between dipole moments, allows a smooth transition to the new force field in the area of ligand-binding calculations. Test simulations on a large set of proteins are also discussed. Copyright 2003 Wiley Periodicals, Inc. J Comput Chem 24: 1999-2012, 2003
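    The charge-fitting step mentioned above can be viewed as a constrained least-squares problem: choose point charges that best reproduce the quantum-mechanical electrostatic potential sampled on a grid, subject to the charges summing to the molecule's total charge. The sketch below is a minimal pure-Python illustration of that idea (not the actual RESP machinery used for AMBER); atom positions, the grid, and units (atomic units, potential of a unit charge = 1/r) are toy assumptions.

```python
import math

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_charges(atoms, grid, esp, qtot=0.0):
    """Least-squares point charges reproducing an ESP sampled on a grid,
    with a Lagrange multiplier enforcing sum(q) = qtot."""
    n = len(atoms)
    # D[g][i]: potential at grid point g from a unit charge on atom i
    D = [[1.0 / math.dist(a, g) for a in atoms] for g in grid]
    # Normal equations (D^T D) q + lam*1 = D^T esp, plus 1^T q = qtot
    A = [[sum(D[g][i] * D[g][k] for g in range(len(grid))) for k in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [sum(D[g][i] * esp[g] for g in range(len(grid))) for i in range(n)] + [qtot]
    return solve(A, b)[:n]
```

    With an ESP generated from known charges, the fit recovers them exactly, which is a useful sanity check before fitting real quantum-mechanical potentials.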

  19. ISAAC - InterSpecies Analysing Application using Containers.

    PubMed

    Baier, Herbert; Schultz, Jörg

    2014-01-15

    Information about genes, transcripts and proteins is spread over a wide variety of databases. Different tools have been developed using these databases to identify biological signals in gene lists from large-scale analyses. Mostly, they search for enrichment of specific features. However, these tools do not allow an exploratory walk through different views, nor modification of the gene lists as new hypotheses emerge. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point in the analysis. Detailed history and snapshot information allows tracing each action. Furthermore, one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionalities. Here, sets as well as results of analyses can be exchanged between members of groups. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, JavaEE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface including tools for the integration of third-party data. Thus, a local installation is easily feasible. In summary, ISAAC is tailor-made for highly explorative interactive analyses of gene, transcript and protein sets in a collaborative environment.
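    The history/snapshot behaviour described above (every modification recorded, any previous state restorable) can be sketched with a small container class. This is an illustrative data structure, not ISAAC's JavaEE API; all names are hypothetical.

```python
class SetContainer:
    """A gene/transcript/protein set whose every modification is recorded,
    so any previous state can be restored."""

    def __init__(self, ids=()):
        self.current = set(ids)
        self.history = [("init", set(self.current))]  # (label, snapshot)

    def apply(self, label, add=(), remove=()):
        """Modify the set and record a labelled snapshot of the result."""
        self.current |= set(add)
        self.current -= set(remove)
        self.history.append((label, set(self.current)))

    def revert(self, index):
        """Switch back to an earlier state; the revert itself is recorded."""
        label, state = self.history[index]
        self.current = set(state)
        self.history.append(("revert:" + label, set(state)))
```

    Because snapshots are copies, reverting never mutates the recorded history, which is what makes the "switch back and branch off a new analysis" workflow safe.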

  20. International Life Science Institute North America Cronobacter (Formerly Enterobacter sakazakii) isolate set.

    PubMed

    Ivy, Reid A; Farber, Jeffrey M; Pagotto, Franco; Wiedmann, Martin

    2013-01-01

    Foodborne pathogen isolate collections are important for the development of detection methods, for validation of intervention strategies, and to develop an understanding of pathogenesis and virulence. We have assembled a publicly available Cronobacter (formerly Enterobacter sakazakii) isolate set that consists of (i) 25 Cronobacter sakazakii isolates, (ii) two Cronobacter malonaticus isolates, (iii) one Cronobacter muytjensii isolate, which displays some atypical phenotypic characteristics, biochemical profiles, and colony color on selected differential media, and (iv) two nonclinical Enterobacter asburiae isolates, which show some phenotypic characteristics similar to those of Cronobacter spp. The set consists of human (n = 10), food (n = 11), and environmental (n = 9) isolates. Analysis of partial 16S rDNA sequence and seven-gene multilocus sequence typing data allowed for reliable identification of these isolates to species and identification of 14 isolates as sequence type 4, which had previously been shown to be the most common C. sakazakii sequence type associated with neonatal meningitis. Phenotypic characterization was carried out with API 20E and API 32E test strips and streaking on two selective chromogenic agars; isolates were also assessed for sorbitol fermentation and growth at 45°C. Although these strategies typically produced the same classification as sequence-based strategies, based on a panel of four biochemical tests, one C. sakazakii isolate yielded inconclusive data and one was classified as C. malonaticus. EcoRI automated ribotyping and pulsed-field gel electrophoresis (PFGE) with XbaI separated the set into 23 unique ribotypes and 30 unique PFGE types, respectively, indicating subtype diversity within the set. Subtype and source data for the collection are publicly available in the PathogenTracker database (www.pathogentracker.net), which allows for continuous updating of information on the set, including links to publications that include information on isolates from this collection.

  1. Advances in iterative non-uniformity correction techniques for infrared scene projection

    NASA Astrophysics Data System (ADS)

    Danielson, Tom; Franks, Greg; LaVeigne, Joe; Prewarski, Marcus; Nehring, Brian

    2015-05-01

    Santa Barbara Infrared (SBIR) is continually developing improved methods for non-uniformity correction (NUC) of its Infrared Scene Projectors (IRSPs) as part of its comprehensive efforts to achieve the best possible projector performance. The most recent step forward, Advanced Iterative NUC (AI-NUC), improves upon previous NUC approaches in several ways. The key to NUC performance is achieving the most accurate possible input drive-to-radiance output mapping for each emitter pixel. This requires many highly-accurate radiance measurements of emitter output, as well as sophisticated manipulation of the resulting data set. AI-NUC expands the available radiance data set to include all measurements made of emitter output at any point. In addition, it allows the user to efficiently manage that data for use in the construction of a new NUC table that is generated from an improved fit of the emitter response curve. Not only does this improve the overall NUC by offering more statistics for interpolation than previous approaches, it also simplifies the removal of erroneous data from the set so that it does not propagate into the correction tables. AI-NUC is implemented by SBIR's IRWindows4 automated test software as part of its advanced turnkey IRSP product (the Calibration Radiometry System or CRS), which incorporates all necessary measurement, calibration and NUC table generation capabilities. By employing AI-NUC on the CRS, SBIR has demonstrated the best uniformity results on resistive emitter arrays to date.
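    The core idea above, fit each pixel's drive-to-radiance response from the pooled measurement set and then invert the fit to build a correction table, can be sketched as follows. A linear response is assumed purely for illustration (real emitter curves are nonlinear, and this is not SBIR's actual fitting procedure).

```python
def fit_response(drives, radiances):
    """Least-squares line radiance ~ a*drive + b, pooling all radiance
    measurements ever made for the pixel (the AI-NUC idea of keeping the
    whole data set rather than a single calibration run)."""
    n = len(drives)
    sx, sy = sum(drives), sum(radiances)
    sxx = sum(d * d for d in drives)
    sxy = sum(d * r for d, r in zip(drives, radiances))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def nuc_table(per_pixel_fits, targets):
    """Invert each pixel's fitted response: the drive to command so that
    the pixel emits each target radiance."""
    return [[(t - b) / a for t in targets] for a, b in per_pixel_fits]
```

    Erroneous measurements can simply be dropped from the pooled lists before refitting, which is the "removal of erroneous data" step described above.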

  2. Investigation of Lithium Metal Hydride Materials for Mitigation of Deep Space Radiation

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Atwell, William

    2016-01-01

    Radiation exposure to crew, electronics, and non-metallic materials is one of many concerns with long-term, deep space travel. Mitigating this exposure is approached via a multi-faceted methodology focusing on multi-functional materials, vehicle configuration, and operational or mission constraints. In this research, we focus on new multi-functional materials that may have advantages over traditional shielding materials, such as polyethylene. Metal hydride materials are of particular interest for deep space radiation shielding due to their ability to store hydrogen, a low-Z material known to be an excellent radiation mitigator and a potential fuel source. We have previously investigated 41 different metal hydrides for their radiation mitigation potential. Of these metal hydrides, we found a set of lithium hydrides to be of particular interest due to their excellent shielding of galactic cosmic radiation. Given these results, we will continue our investigation of lithium hydrides by expanding our data set to include dose equivalent and to further understand why these materials outperformed polyethylene in a heavy ion environment. For this study, we used HZETRN 2010, a one-dimensional transport code developed by NASA Langley Research Center, to simulate radiation transport through the lithium hydrides. We focused on the 1977 solar minimum Galactic Cosmic Radiation environment and thicknesses of 1, 5, 10, 20, 30, 50, and 100 g/cm2 to stay consistent with our previous studies. The details of this work and the subsequent results will be discussed in this paper.
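    The comparison the study performs, dose behind increasing areal densities of different shield materials, can be illustrated with a toy model. The exponential attenuation below is NOT what HZETRN computes (HZETRN solves the full Boltzmann transport problem, including secondary particles); it only shows the shape of a shielding comparison at the study's standard thickness set, with hypothetical attenuation coefficients.

```python
import math

THICKNESSES = [1, 5, 10, 20, 30, 50, 100]  # g/cm2, the set used in the study

def dose_curve(unshielded_dose, mu):
    """Toy exponential attenuation: dose behind each areal density t for an
    assumed effective attenuation coefficient mu (cm2/g, hypothetical)."""
    return [(t, unshielded_dose * math.exp(-mu * t)) for t in THICKNESSES]
```

    Comparing two materials then amounts to comparing their curves point by point over the same thickness grid.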

  3. 17 to 23: A novel complementary mini Y-STR panel to extend the Y-STR databases from 17 to 23 markers for forensic purposes.

    PubMed

    Núñez, Carolina; Baeta, Miriam; Ibarbia, Nerea; Ortueta, Urko; Jiménez-Moreno, Susana; Blazquez-Caeiro, José Luis; Builes, Juan José; Herrera, Rene J; Martínez-Jarreta, Begoña; de Pancorbo, Marian M

    2017-04-01

    A Y-STR multiplex system has been developed with the purpose of complementing the widely used 17 Y-STR haplotyping (AmpFlSTR Y Filer® PCR Amplification kit) routinely employed in forensic and population genetic studies. This new multiplex system includes six additional STR loci (DYS576, DYS481, DYS549, DYS533, DYS570, and DYS643) to reach the 23 Y-STR of the PowerPlex® Y23 System. In addition, this kit includes the DYS456 and DYS385 loci for traceability purposes. Male samples from 625 individuals from ten worldwide populations were genotyped, including three sample sets from populations previously published with the 17 Y-STR system to expand their current data. Validation studies demonstrated good performance of the panel set in terms of concordance, sensitivity, and stability in the presence of inhibitors and artificially degraded DNA. The results obtained for haplotype diversity and discrimination capacity with this multiplex system were considerably high, providing further evidence of the suitability of this novel Y-STR system for forensic purposes. Thus, the use of this multiplex for samples previously genotyped with 17 Y-STRs will be an efficient and low-cost alternative to complete the set of 23 Y-STRs and improve allele databases for population and forensic purposes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
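    The two summary statistics reported above have standard definitions: haplotype diversity is Nei's estimator, HD = n/(n-1) * (1 - sum of squared haplotype frequencies), and discrimination capacity is the number of distinct haplotypes divided by the sample size. A minimal sketch:

```python
from collections import Counter

def haplotype_stats(haplotypes):
    """Nei's haplotype diversity and discrimination capacity for a sample
    of Y-STR haplotypes (any hashable representation, e.g. tuples of alleles)."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    diversity = (n / (n - 1)) * (1 - sum((c / n) ** 2 for c in counts.values()))
    capacity = len(counts) / n
    return diversity, capacity
```

    Both statistics approach 1 as every sampled male carries a unique haplotype, which is why adding the six loci raises the forensic value of the panel.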

  4. Youth–adult partnership: exploring contributions to empowerment, agency and community connections in Malaysian youth programs.

    PubMed

    Krauss, Steven Eric; Collura, Jessica; Zeldin, Shepherd; Ortega, Adriana; Abdullah, Haslinda; Sulaiman, Abdul Hadi

    2014-09-01

    Youth–adult partnership (Y–AP) has emerged as a key practice for enacting two features of effective developmental settings: supportive adult relationships and support for efficacy and mattering. Previous studies have shown that when youth, supported by adults, actively participate in organizational and community decision making they are likely to show greater confidence and agency, empowerment and critical consciousness, and community connections. Most of the extant research on Y–AP is limited to qualitative studies and the identification of organizational best practices. Almost all research focuses on Western sociocultural settings. To address these gaps, 299 youth, age 15 to 24, were sampled from established afterschool and community programs in Malaysia to explore the contribution of Y–AP (operationalized as having two components: youth voice in decision-making and supportive adult relationships) to empowerment, agency and community connections. As hypothesized, hierarchical regressions indicated that program quality (Y–AP, safe environment and program engagement) contributed to agency, empowerment and community connections beyond the contribution of family, school and religion. Additionally, the Y–AP measures contributed substantially more variance than the other measures of program quality on each outcome. Interaction effects indicated differences by age for empowerment and agency but not for community connections. The primary findings in this inquiry replicate those found in previous interview- and observation-oriented studies. The data suggest fertile ground for future research while demonstrating that Y–AP may be an effective practice for positive youth development outside of Western settings.
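    Hierarchical regression, as used above, enters predictor blocks in stages and attributes the increment in R² to the newly entered block. The sketch below illustrates the logic with one predictor per step using the standard two-predictor R² formula; the study's actual blocks contained multiple measures, and the data here are invented.

```python
import math

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def hierarchical_r2(y, x1, x2):
    """Step 1 enters x1 alone; step 2 adds x2. Returns the R^2 of each
    step and the increment attributable to x2."""
    r1, r2, r12 = pearson(y, x1), pearson(y, x2), pearson(x1, x2)
    step1 = r1 ** 2
    step2 = (r1 ** 2 + r2 ** 2 - 2 * r1 * r2 * r12) / (1 - r12 ** 2)
    return step1, step2, step2 - step1
```

    When the two predictors are uncorrelated the increment reduces to the second predictor's squared correlation with the outcome, which is the pattern behind "Y–AP contributed variance beyond the other program-quality measures."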

  5. Summary of the Third AIAA CFD Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Vassberg, John C.; Tinoco, Edward N.; Mani, Mori; Brodersen, Olaf P.; Eisfeld, Bernhard; Wahls, Richard A.; Morrison, Joseph H.; Zickuhr, Tom; Laflin, Kelly R.; Mavriplis, DImitri J.

    2007-01-01

    The workshop focused on the prediction of both absolute and differential drag levels for wing-body and wing-alone configurations that are representative of transonic transport aircraft. The baseline DLR-F6 wing-body geometry, previously utilized in DPW-II, is also augmented with a side-body fairing to help reduce the complexity of the flow physics in the wing-body juncture region. In addition, two new wing-alone geometries have been developed for DPW-III. Numerical calculations are performed using industry-relevant test cases that include lift-specific and fixed-alpha flight conditions, as well as full drag polars. Drag, lift, and pitching moment predictions from numerous Reynolds-averaged Navier-Stokes computational fluid dynamics methods are presented, focused on fully turbulent flows. Solutions are performed on structured, unstructured, and hybrid grid systems. The structured grid sets include point-matched multi-block meshes and overset grid systems. The unstructured and hybrid grid sets are comprised of tetrahedral, pyramid, and prismatic elements. Effort was made to provide a high-quality and parametrically consistent family of grids for each grid type about each configuration under study. The wing-body families are comprised of a coarse, medium, and fine grid, while the wing-alone families also include an extra-fine mesh. These mesh sequences are utilized to help determine how the provided flow solutions fare with respect to asymptotic grid convergence, and are used to estimate the absolute drag of each configuration.
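    The coarse/medium/fine mesh sequences mentioned above support the classic Richardson-extrapolation estimate of grid convergence: from three solutions on uniformly refined grids one can infer the observed order of convergence and an extrapolated continuum value. A minimal sketch (generic numerics, not the workshop's specific post-processing; the test data are manufactured):

```python
import math

def richardson(f_fine, f_medium, f_coarse, r):
    """Observed order of convergence p and extrapolated continuum value
    from solutions on three uniformly refined grids with refinement ratio r."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)
    return p, f_exact
```

    Applied to a drag coefficient, the gap between the fine-grid value and the extrapolated value indicates how far the solutions are from the asymptotic range.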

  6. User interfaces in space science instrumentation

    NASA Astrophysics Data System (ADS)

    McCalden, Alec John

    This thesis examines user interaction with instrumentation in the specific context of space science. It gathers together existing practice in machine interfaces with a look at potential future usage and recommends a new approach to space science projects with the intention of maximising their science return. It first takes a historical perspective on user interfaces and ways of defining and measuring the science return of a space instrument. Choices of research methodology are considered. Implementation details such as the concepts of usability, mental models, affordance and presentation of information are described, and examples of existing interfaces in space science are given. A set of parameters for use in analysing and synthesizing a user interface is derived by using a set of case studies of diverse failures and from previous work. A general space science user analysis is made by looking at typical practice, and an interview plus persona technique is used to group users with interface designs. An examination is made of designs in the field of astronomical instrumentation interfaces, showing the evolution of current concepts and including ideas capable of sustaining progress in the future. The parameters developed earlier are then tested against several established interfaces in the space science context to give a degree of confidence in their use. The concept of a simulator that is used to guide the development of an instrument over the whole lifecycle is described, and the idea is proposed that better instrumentation would result from more efficient use of the resources available. The previous ideas in this thesis are then brought together to describe a proposed new approach to a typical development programme, with an emphasis on user interaction. The conclusion shows that there is significant room for improvement in the science return from space instrumentation by attention to the user interface.

  7. Development of key indicators of hospital resilience: a modified Delphi study.

    PubMed

    Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard

    2015-04-01

    Hospital resilience is an emerging concept, which can be defined as 'a hospital's ability to resist, absorb, and respond to the shock of disasters while maintaining its critical health care functions, and then recover to its original state or adapt to a new one'. Our aim was to develop a comprehensive framework of key indicators of hospital resilience. A panel of 33 Chinese experts was invited to participate in a three-round, modified Delphi study to develop a set of potential measures previously derived from a literature review. In the first round, these potential measures were modified to cover the comprehensive domains of hospital resilience. The importance of proposed measures was scored by experts on a five-point Likert scale. Subsequently, the experts reconsidered their voting in light of the previous aggregated results. Agreement on measures was defined as at least 70% of respondents agreeing or strongly agreeing to the inclusion of a measure. A large proportion of preliminary measures (89.5%) were identified as having good potential for assessing hospital resilience. These measures were categorized into eight domains, 17 subdomains, and 43 indicators. The highest rated indicators (mean score) were: equipment for on-site rescue (4.7), plan initiation (4.6), equipment for referral of patients with complex care needs (4.5), plan execution (4.4), medication management strategies (4.4), emergency medical treatment conditions (4.4), disaster committee (4.4), stock types and quantities for essential medications (4.4), surge capacity of emergency beds (4.4), and mass-casualty triage protocols (4.4). This framework identifies a comprehensive set of indicators of hospital resilience. It can be used for hospital assessment, as well as informing priority practices to better address future disasters. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
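    The consensus rule stated above (at least 70% of respondents rating a measure "agree" or "strongly agree") reduces to a one-line check over the panel's Likert ratings. A minimal sketch:

```python
def consensus(ratings, threshold=0.70):
    """The study's inclusion rule: at least `threshold` of panellists rate
    the measure 4 ('agree') or 5 ('strongly agree') on the 5-point scale."""
    return sum(r >= 4 for r in ratings) / len(ratings) >= threshold
```

    Between Delphi rounds, measures failing the rule are revised or dropped and the aggregated results are fed back to the panel before re-voting.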

  8. Relating coccidioidomycosis (valley fever) incidence to soil moisture conditions.

    PubMed

    Coopersmith, E J; Bell, J E; Benedict, K; Shriber, J; McCotter, O; Cosh, M H

    2017-04-17

    Coccidioidomycosis (also called Valley fever) is caused by a soilborne fungus, Coccidioides spp. , in arid regions of the southwestern United States. Though some who develop infections from this fungus remain asymptomatic, others develop respiratory disease as a consequence. Less commonly, severe illness and death can occur when the infection spreads to other regions of the body. Previous analyses have attempted to connect the incidence of coccidioidomycosis to broadly available climatic measurements, such as precipitation or temperature. However, with the limited availability of long-term, in situ soil moisture data sets, it has not been feasible to perform a direct analysis of the relationships between soil moisture levels and coccidioidomycosis incidence on a larger temporal and spatial scale. Utilizing in situ soil moisture gauges throughout the southwest from the U.S. Climate Reference Network and a model with which to extend those estimates, this work connects periods of higher and lower soil moisture in Arizona and California between 2002 and 2014 to the reported incidence of coccidioidomycosis. The results indicate that in both states, coccidioidomycosis incidence is related to soil moisture levels from previous summers and falls. Stated differently, a higher number of coccidioidomycosis cases are likely to be reported if previous bands of months have been atypically wet or dry, depending on the location.
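    The relationship reported above, incidence correlated with soil moisture from previous seasons, is a lagged association. A minimal sketch of a lagged Pearson correlation (illustrative only; the study's analysis involved modeled soil moisture and reported case counts, and the data here are invented):

```python
import math

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def lagged_pearson(x, y, lag):
    """Correlation of y[t] with x[t - lag]: e.g. incidence this month
    against soil moisture from `lag` months earlier."""
    return pearson(x[:len(x) - lag] if lag else x, y[lag:])
```

    Scanning over a range of lags identifies which earlier months' moisture best tracks reported cases.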

  9. Myxococcus xanthus Developmental Cell Fate Production: Heterogeneous Accumulation of Developmental Regulatory Proteins and Reexamination of the Role of MazF in Developmental Lysis

    PubMed Central

    Lee, Bongsoo; Holkenbrink, Carina; Treuner-Lange, Anke

    2012-01-01

    Myxococcus xanthus undergoes a starvation-induced multicellular developmental program during which cells partition into three known fates: (i) aggregation into fruiting bodies followed by differentiation into spores, (ii) lysis, or (iii) differentiation into nonaggregating persister-like cells, termed peripheral rods. As a first step to characterize cell fate segregation, we enumerated total, aggregating, and nonaggregating cells throughout the developmental program. We demonstrate that both cell lysis and cell aggregation begin with similar timing at approximately 24 h after induction of development. Examination of several known regulatory proteins in the separated aggregated and nonaggregated cell fractions revealed previously unknown heterogeneity in the accumulation patterns of proteins involved in type IV pilus (T4P)-mediated motility (PilC and PilA) and regulation of development (MrpC, FruA, and C-signal). As part of our characterization of the cell lysis fate, we set out to investigate the unorthodox MazF-MrpC toxin-antitoxin system which was previously proposed to induce programmed cell death (PCD). We demonstrate that deletion of mazF in two different wild-type M. xanthus laboratory strains does not significantly reduce developmental cell lysis, suggesting that MazF's role in promoting PCD is an adaptation to the mutant background strain used previously. PMID:22493014

  10. Relating coccidioidomycosis (valley fever) incidence to soil moisture conditions

    PubMed Central

    Coopersmith, E. J.; Bell, J. E.; Benedict, K.; Shriber, J.; McCotter, O.; Cosh, M. H.

    2017-01-01

    Coccidioidomycosis (also called Valley fever) is caused by a soilborne fungus, Coccidioides spp., in arid regions of the southwestern United States. Though some who develop infections from this fungus remain asymptomatic, others develop respiratory disease as a consequence. Less commonly, severe illness and death can occur when the infection spreads to other regions of the body. Previous analyses have attempted to connect the incidence of coccidioidomycosis to broadly available climatic measurements, such as precipitation or temperature. However, with the limited availability of long-term, in situ soil moisture data sets, it has not been feasible to perform a direct analysis of the relationships between soil moisture levels and coccidioidomycosis incidence on a larger temporal and spatial scale. Utilizing in situ soil moisture gauges throughout the southwest from the U.S. Climate Reference Network and a model with which to extend those estimates, this work connects periods of higher and lower soil moisture in Arizona and California between 2002 and 2014 to the reported incidence of coccidioidomycosis. The results indicate that in both states, coccidioidomycosis incidence is related to soil moisture levels from previous summers and falls. Stated differently, a higher number of coccidioidomycosis cases are likely to be reported if previous bands of months have been atypically wet or dry, depending on the location. PMID:29124249

  11. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    PubMed

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A; Natale, Darren A; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.
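    One building block of a semi-automated ontology workflow like the one described above is machine-proposed term matching with human curation of the results. The sketch below uses simple string similarity as a stand-in for OMIT's actual algorithms, which are not specified here; all names are hypothetical.

```python
from difflib import SequenceMatcher

def suggest_matches(candidate_terms, ontology_terms, cutoff=0.8):
    """Propose the closest existing ontology term for each candidate term;
    matches above the cutoff are queued for curator review rather than
    inserted automatically (the semi-automated step)."""
    suggestions = {}
    for term in candidate_terms:
        score, best = max(
            (SequenceMatcher(None, term.lower(), o.lower()).ratio(), o)
            for o in ontology_terms)
        if score >= cutoff:
            suggestions[term] = best
    return suggestions
```

    Raising the cutoff shifts work from the curator to the machine at the cost of missed matches, which is exactly the labor trade-off the paper targets.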

  12. OMIT: Dynamic, Semi-Automated Ontology Development for the microRNA Domain

    PubMed Central

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M.; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A.; Natale, Darren A.; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology. PMID:25025130

  13. Quality indicators for the detection and management of chronic kidney disease in primary care in Canada derived from a modified Delphi panel approach.

    PubMed

    Tu, Karen; Bevan, Lindsay; Hunter, Katie; Rogers, Jess; Young, Jacqueline; Nesrallah, Gihad

    2017-01-01

    The detection and management of chronic kidney disease lies within primary care; however, performance measures applicable in the Canadian context are lacking. We sought to develop a set of primary care quality indicators for chronic kidney disease in the Canadian setting and to assess the current state of the disease's detection and management in primary care. We used a modified Delphi panel approach, involving 20 panel members from across Canada (10 family physicians, 7 nephrologists, 1 patient, 1 primary care nurse and 1 pharmacist). Indicators identified from peer-reviewed and grey literature sources were subjected to 3 rounds of voting to develop a set of quality indicators for the detection and management of chronic kidney disease in the primary care setting. The final indicators were applied to primary care electronic medical records in the Electronic Medical Record Administrative data Linked Database (EMRALD) to assess the current state of primary care detection and management of chronic kidney disease in Ontario. Seventeen indicators made up the final list, with 1 under the category Prevalence, Incidence and Mortality; 4 under Screening, Diagnosis and Risk Factors; 11 under Management; and 1 under Referral to a Specialist. In a sample of 139 993 adult patients not on dialysis, 6848 (4.9%) had stage 3 or higher chronic kidney disease, with the average age of patients being 76.1 years (standard deviation [SD] 11.0); 62.9% of patients were female. Diagnosis and screening for chronic kidney disease were poorly performed. Only 27.1% of patients with stage 3 or higher disease had their diagnosis documented in their cumulative patient profile. Albumin-creatinine ratio testing was only performed for 16.3% of patients with a low estimated glomerular filtration rate (eGFR) and for 28.5% of patients with risk factors for chronic kidney disease. 
Family physicians performed relatively better with the management of chronic kidney disease, with 90.4% of patients with stage 3 or higher disease having an eGFR performed in the previous 18 months and 83.1% having a blood pressure recorded in the previous 9 months. We propose a set of measurable indicators to evaluate the quality of the management of chronic kidney disease in primary care. These indicators may be used to identify opportunities to improve current practice in Canada.
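    Indicators like those reported above are computed as simple proportions over patient records. The sketch below evaluates two of them over a list of dictionaries; the field names are hypothetical (not EMRALD's schema), and eGFR < 60 mL/min/1.73 m2 is used as a simple proxy for stage 3 or higher disease.

```python
def indicator_rates(patients):
    """Proportion of stage 3+ patients with a documented diagnosis, and
    with an eGFR recorded in the previous 18 months."""
    ckd = [p for p in patients if p["egfr"] < 60]  # stage 3+ proxy
    if not ckd:
        return None
    n = len(ckd)
    return {
        "documented_dx": sum(p["dx_in_profile"] for p in ckd) / n,
        "egfr_within_18mo": sum(p["months_since_egfr"] <= 18 for p in ckd) / n,
    }
```

    The same pattern (define the denominator cohort, then count who meets the indicator) extends to the other 15 indicators in the final list.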

  14. Developmental toxicity of dextromethorphan in zebrafish embryos/larvae.

    PubMed

    Xu, Zheng; Williams, Frederick E; Liu, Ming-Cheh

    2011-03-01

    Dextromethorphan is widely used in over-the-counter cough and cold medications. Its efficacy and safety for infants and young children remains to be clarified. The present study was designed to use zebrafish as a model to investigate the potential toxicity of dextromethorphan during embryonic and larval development. Three sets of zebrafish embryos/larvae were exposed to dextromethorphan at 24, 48 and 72 h post fertilization (hpf), respectively, during the embryonic/larval development. Compared with the 48 and 72 hpf exposure sets, the embryos/larvae in the 24 hpf exposure set showed much higher mortality rates which increased in a dose-dependent manner. Bradycardia and reduced blood flow were observed for the embryos/larvae treated with increasing concentrations of dextromethorphan. Morphological effects of dextromethorphan exposure, including yolk sac and cardiac edema, craniofacial malformation, lordosis, non-inflated swim bladder and missing gill, were also more frequent and severe among zebrafish embryos/larvae exposed to dextromethorphan at 24 hpf. Whether the more frequent and severe developmental toxicity of dextromethorphan observed among the embryos/larvae in the 24 hpf exposure set, as compared with the 48 and 72 hpf exposure sets, is due to the developmental expression of the phase I and phase II enzymes involved in the metabolism of dextromethorphan remains to be clarified. A reverse transcription-polymerase chain reaction analysis, nevertheless, revealed developmental stage-dependent expression of mRNAs encoding SULT3 ST1 and SULT3 ST3, two enzymes previously shown to be capable of sulfating dextrorphan, an active metabolite of dextromethorphan. Copyright © 2010 John Wiley & Sons, Ltd.
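    Dose-dependent mortality of the kind reported above is commonly summarized by the dose at 50% mortality (LC50). A minimal sketch estimating it by linear interpolation between the bracketing observations (a simple illustration, not the statistical method used in the study; the data are invented):

```python
def lc50(doses, mortality):
    """Dose at 50% mortality by linear interpolation between the two
    observations that bracket 0.5 (doses ascending, mortality as fractions)."""
    pairs = list(zip(doses, mortality))
    for (d0, m0), (d1, m1) in zip(pairs, pairs[1:]):
        if m0 <= 0.5 <= m1:
            return d0 + (0.5 - m0) * (d1 - d0) / (m1 - m0)
    return None  # 50% mortality never reached in the tested range
```

    Comparing LC50 values across the 24, 48, and 72 hpf exposure sets would quantify the stage-dependent sensitivity the abstract describes.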

  15. Developmental Toxicity of Dextromethorphan in Zebrafish Embryos/Larvae

    PubMed Central

    Xu, Zheng; Williams, Frederick E.; Liu, Ming-Cheh

    2012-01-01

    Dextromethorphan is widely used in over-the-counter cough and cold medications. Its efficacy and safety for infants and young children remain to be clarified. The present study was designed to use the zebrafish as a model to investigate the potential toxicity of dextromethorphan during the embryonic and larval development. Three sets of zebrafish embryos/larvae were exposed to dextromethorphan at 24 hours post fertilization (hpf), 48 hpf, and 72 hpf, respectively, during the embryonic/larval development. Compared with the 48 and 72 hpf exposure sets, the embryos/larvae in the 24 hpf exposure set showed much higher mortality rates, which increased in a dose-dependent manner. Bradycardia and reduced blood flow were observed for the embryos/larvae treated with increasing concentrations of dextromethorphan. Morphological effects of dextromethorphan exposure, including yolk sac and cardiac edema, craniofacial malformation, lordosis, non-inflated swim bladder, and missing gill, were also more frequent and severe among zebrafish embryos/larvae exposed to dextromethorphan at 24 hpf. Whether the more frequent and severe developmental toxicity of dextromethorphan observed among the embryos/larvae in the 24 hpf exposure set, as compared with the 48 and 72 hpf exposure sets, is due to the developmental expression of the Phase I and Phase II enzymes involved in the metabolism of dextromethorphan remains to be clarified. A reverse transcription-polymerase chain reaction (RT-PCR) analysis, nevertheless, revealed developmental stage-dependent expression of mRNAs encoding SULT3 ST1 and SULT3 ST3, two enzymes previously shown to be capable of sulfating dextrorphan, an active metabolite of dextromethorphan. PMID:20737414

  16. Predictions of CD4 lymphocytes’ count in HIV patients from complete blood count

    PubMed Central

    2013-01-01

    Background HIV diagnosis, prognosis and treatment require the T CD4 lymphocyte count from flow cytometry, an expensive technique often not available to people in developing countries. The aim of this work is to apply a previously developed methodology that predicts the T CD4 lymphocyte value based on the total white blood cell (WBC) count and lymphocyte count by applying set theory, using information taken from the Complete Blood Count (CBC). Methods Set theory was used to classify into groups named A, B, C and D the number of leucocytes/mm3, lymphocytes/mm3, and the CD4/μL subpopulation per flow cytometry of 800 HIV-diagnosed patients. Unions between sets A and C, and B and D were assessed, and the intersection between both unions was described in order to establish the belonging percentage to these sets. Results were classified into eight ranges of 1000 leucocytes/mm3 each, calculating the belonging percentage of each range with respect to the whole sample. Results The intersection (A ∪ C) ∩ (B ∪ D) showed a prediction effectiveness of 81.44% for the range between 4000 and 4999 leukocytes, 91.89% for the range between 3000 and 3999, and 100% for the range below 3000. Conclusions The usefulness and clinical applicability of a methodology based on set theory were confirmed for predicting the T CD4 lymphocyte value, beginning with the WBC and lymphocyte counts from the CBC. This methodology is new, objective, and has lower costs than flow cytometry, which is currently considered the gold standard. PMID:24034560
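
    The set-theoretic prediction idea described above can be sketched in a few lines: patients are classified into sets by their CBC values, and membership in the intersection of unions flags a predicted CD4 range. The thresholds, set definitions, and patient values below are illustrative placeholders, not the authors' published cut-offs:

    ```python
    # Toy patient records: CBC-derived counts plus the flow-cytometry CD4 value.
    patients = {
        "p1": {"wbc": 3500, "lymph": 1200, "cd4": 180},
        "p2": {"wbc": 4500, "lymph": 2100, "cd4": 420},
        "p3": {"wbc": 2800, "lymph": 900,  "cd4": 150},
    }

    # Hypothetical classification sets (thresholds are for illustration only).
    A = {p for p, v in patients.items() if v["wbc"] < 4000}    # low WBC
    B = {p for p, v in patients.items() if v["lymph"] < 1500}  # low lymphocytes
    C = {p for p, v in patients.items() if v["wbc"] < 3000}    # very low WBC
    D = {p for p, v in patients.items() if v["lymph"] < 1000}  # very low lymphocytes

    # The paper's construction: intersection of the two unions.
    predicted_low_cd4 = (A | C) & (B | D)
    actual_low_cd4 = {p for p, v in patients.items() if v["cd4"] < 200}
    ```

    On this toy data the prediction agrees with the flow-cytometry classification; the paper's contribution is showing how well such agreement holds, per leukocyte range, on 800 real patients.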

  17. Greedy Algorithms for Nonnegativity-Constrained Simultaneous Sparse Recovery

    PubMed Central

    Kim, Daeun; Haldar, Justin P.

    2016-01-01

    This work proposes a family of greedy algorithms to jointly reconstruct a set of vectors that are (i) nonnegative and (ii) simultaneously sparse with a shared support set. The proposed algorithms generalize previous approaches that were designed to impose these constraints individually. Similar to previous greedy algorithms for sparse recovery, the proposed algorithms iteratively identify promising support indices. In contrast to previous approaches, the support index selection procedure has been adapted to prioritize indices that are consistent with both the nonnegativity and shared support constraints. Empirical results demonstrate for the first time that the combined use of simultaneous sparsity and nonnegativity constraints can substantially improve recovery performance relative to existing greedy algorithms that impose less signal structure. PMID:26973368
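
    The general idea, greedily growing a shared support while enforcing nonnegativity on the coefficients, can be sketched as a simultaneous nonnegative OMP variant. This is a minimal illustration under assumed conventions, not the authors' exact algorithm:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def simultaneous_nnomp(A, Y, k):
        """Greedily recover X >= 0 with columns sharing one support of size k.

        A: (m, n) dictionary with unit-norm columns; Y: (m, L) measurements.
        Sketch of a simultaneous nonnegative OMP; names are illustrative.
        """
        n, L = A.shape[1], Y.shape[1]
        support = []
        R = Y.copy()
        for _ in range(k):
            # Score indices by summed *positive* residual correlation, so
            # atoms inconsistent with nonnegativity rank low.
            scores = np.maximum(A.T @ R, 0.0).sum(axis=1)
            scores[support] = -np.inf          # never re-pick an index
            support.append(int(np.argmax(scores)))
            As = A[:, support]
            # Nonnegative least squares on the shared support, per column.
            X_s = np.column_stack([nnls(As, Y[:, j])[0] for j in range(L)])
            R = Y - As @ X_s                   # update the joint residual
        X = np.zeros((n, L))
        X[support] = X_s
        return X, sorted(support)
    ```

    On a well-conditioned random dictionary with a nonnegative jointly sparse signal, a few greedy passes typically recover the shared support and drive the residual to numerical zero.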

  18. Gasoline sniffing multifocal neuropathy.

    PubMed

    Burns, T M; Shneker, B F; Juel, V C

    2001-11-01

    The polyneuropathy caused by chronic gasoline inhalation is reported to be a gradually progressive, symmetric, sensorimotor polyneuropathy. We report unleaded gasoline sniffing by a female 14 years of age that precipitated peripheral neuropathy. In contrast with the previously reported presentation of peripheral neuropathy in gasoline inhalation, our patient developed multiple mononeuropathies superimposed on a background of sensorimotor polyneuropathy. The patient illustrates that gasoline sniffing neuropathy may present with acute multiple mononeuropathies resembling mononeuritis multiplex, possibly related to increased peripheral nerve susceptibility to pressure in the setting of neurotoxic components of gasoline. The presence of tetraethyl lead, which is no longer present in modern gasoline mixtures, is apparently not a necessary factor in the development of gasoline sniffer's neuropathy.

  19. Applied Meteorology Unit (AMU) Quarterly Report Fourth Quarter FY-14

    NASA Technical Reports Server (NTRS)

    Bauman, William H.; Crawford, Winifred C.; Watson, Leela R.; Shafer, Jaclyn

    2014-01-01

    Ms. Crawford completed the final report for the dual-Doppler wind field task. Dr. Bauman completed transitioning the 915-MHz and 50-MHz Doppler Radar Wind Profiler (DRWP) splicing algorithm developed at Marshall Space Flight Center (MSFC) into the AMU Upper Winds Tool. Dr. Watson completed work to assimilate data into model configurations for Wallops Flight Facility (WFF) and Kennedy Space Center/Cape Canaveral Air Force Station (KSC/CCAFS). Ms. Shafer began evaluating a local high-resolution model she had set up previously for its ability to forecast weather elements that affect launches at KSC/CCAFS. Dr. Watson began a task to optimize the data-assimilated model she had just developed to run in real time.

  20. Development of Translational Methods in Spectral Analysis of Human Infant Crying and Rat Pup Ultrasonic Vocalizations for Early Neurobehavioral Assessment

    PubMed Central

    Zeskind, Philip Sanford; McMurray, Matthew S.; Garber, Kristin A.; Neuspiel, Juliana M.; Cox, Elizabeth T.; Grewen, Karen M.; Mayes, Linda C.; Johns, Josephine M.

    2011-01-01

    The purpose of this article is to describe the development of translational methods by which spectrum analysis of human infant crying and rat pup ultrasonic vocalizations (USVs) can be used to assess potentially adverse effects of various prenatal conditions on early neurobehavioral development. The study of human infant crying has resulted in a rich set of measures that has long been used to assess early neurobehavioral insult due to non-optimal prenatal environments, even among seemingly healthy newborn and young infants. In another domain of study, the analysis of rat pup USVs has been conducted via paradigms that allow for better experimental control over correlated prenatal conditions that may confound findings and conclusions regarding the effects of specific prenatal experiences. The development of translational methods by which cry vocalizations of both species can be analyzed may provide the opportunity for findings from the two approaches of inquiry to inform one another through their respective strengths. To this end, we present an enhanced taxonomy of a novel set of common measures of cry vocalizations of both human infants and rat pups based on a conceptual framework that emphasizes infant crying as a graded and dynamic acoustic signal. This set includes latency to vocalization onset, duration and repetition rate of expiratory components, duration of inter-vocalization-intervals and spectral features of the sound, including the frequency and amplitude of the fundamental and dominant frequencies. We also present a new set of classifications of rat pup USV waveforms that include qualitative shifts in fundamental frequency, similar to the presence of qualitative shifts in fundamental frequency that have previously been related to insults to neurobehavioral integrity in human infants.
Challenges to the development of translational analyses, including the use of different terminologies, methods of recording, and spectral analyses are discussed, as well as descriptions of automated processes, software solutions, and pitfalls. PMID:22028695

  1. Using social media to enhance career development opportunities for health promotion professionals.

    PubMed

    Roman, Leah A

    2014-07-01

    For health promotion professionals, social media offers many ways to engage with a broader range of colleagues; participate in professional development events; promote expertise, products, or services; and learn about career-enhancing opportunities such as funding and fellowships. Previous work has recommended "building networking into what you are already doing." This article provides updated and new social media resources, as well as practical examples and strategies to promote effective use of social media. Social media offers health promotion professionals cost-effective opportunities to enhance their career by building communities of practice, participating in professional development events, and enriching classroom learning. Developing the skills necessary to use social media for networking is important in the public health workforce, especially as social media is increasingly used in academic and practice settings. © 2014 Society for Public Health Education.

  2. Feasibility of using the International Classification of Functioning, Disability and Health Core Set for evaluation of fall-related risk factors in acute rehabilitation settings.

    PubMed

    Huang, Shih W; Lin, Li F; Chou, Lin C; Wu, Mei J; Liao, Chun D; Liou, Tsan H

    2016-04-01

    Previously, we reported the use of an International Classification of Functioning, Disability and Health (ICF) core set that can provide a holistic framework for evaluating the risk factors of falls; however, data on the feasibility of applying this core set are lacking. To investigate the feasibility of applying the fall-related ICF risk-factor core set in the case of patients in an acute-rehabilitation setting. A cross-sectional and descriptive correlational design. Acute-rehabilitation ward. A total of 273 patients who experienced a fall in an acute-rehabilitation ward. The data on falls were collected from the hospital's Nursing Information System (NIS) and the fall-reporting system (Adverse Event Reporting System, AERS) between 2010 and 2013. The relationship of both systems to the fall-related ICF core set was analyzed to assess the feasibility of their clinical application. We evaluated the feasibility of using the fall-related ICF risk-factor core set by using the frequency and percentage of the fall patients in each of the listed categories. The fall-related ICF risk-factor core set category b735 (muscle tone functions) exhibited a high feasibility (85.95%) for clinical application, and the category b730 (muscle power functions) covered 77.11% of the patients. The feasibility of application of the category d410 (change basic body position) was also high in the case of all fall patients (81.69%). In the acute-rehabilitation setting, the feasibility of application of the fall-related ICF risk-factor core set is high. The fall-related ICF risk-factor core set can help multidisciplinary teams develop fall-prevention strategies in acute rehabilitation wards.

  3. Alcohol Pharmacology Education Partnership: Using Chemistry and Biology Concepts To Educate High School Students about Alcohol

    PubMed Central

    2015-01-01

    We developed the Alcohol Pharmacology Education Partnership (APEP), a set of modules designed to integrate a topic of interest (alcohol) with concepts in chemistry and biology for high school students. Chemistry and biology teachers (n = 156) were recruited nationally to field-test APEP in a controlled study. Teachers obtained professional development either at a conference-based workshop (NSTA or NCSTA) or via distance learning to learn how to incorporate the APEP modules into their teaching. They field-tested the modules in their classes during the following year. Teacher knowledge of chemistry and biology concepts increased significantly following professional development, and was maintained for at least a year. Their students (n = 14 014) demonstrated significantly higher scores when assessed for knowledge of both basic and advanced chemistry and biology concepts compared to students not using APEP modules in their classes the previous year. Higher scores were achieved as the number of modules used increased. These findings are consistent with our previous studies, demonstrating higher scores in chemistry and biology after students use modules that integrate topics interesting to them, such as drugs (the Pharmacology Education Partnership). PMID:24803686

  4. De novo interstitial tandem duplication of chromosome 4(q21-q28)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Navarro, E.G.; Ramon, F.J.H.; Jimenez, R.D.

    1996-03-29

    We describe a girl with a previously unreported de novo duplication of chromosome 4q involving segment q21-q28. Clinical manifestations included growth and psychomotor retardation, facial asymmetry, hypotelorism, epicanthic folds, mongoloid slant of palpebral fissures, apparently low-set auricles, high nasal bridge, long philtrum, small mouth, short neck, low-set thumbs, and bilateral club foot. This phenotype is compared with that of previously reported cases of duplication 4q. 12 refs., 3 figs., 1 tab.

  5. Robust Statistics and Regularization for Feature Extraction and UXO Discrimination

    DTIC Science & Technology

    2011-07-01

    July 11, 2011 real data we find that this technique has an improved probability of finding all ordnance in a test data set, relative to previously...many sites. Tests on larger data sets should still be carried out. In previous work we considered a bootstrapping approach to selecting the operating...Marginalizing over x we obtain the probability that the ith order statistic in the test data belongs to the T class (55) P(T | x(i)) = ∫_{-∞}^{∞} P(T | x) p(x

  6. Component-Based Visualization System

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco

    2005-01-01

    A software system has been developed that gives engineers and operations personnel with no "formal" programming expertise, but who are familiar with the Microsoft Windows operating system, the ability to create visualization displays to monitor the health and performance of aircraft/spacecraft. This software system is currently supporting the X38 V201 spacecraft component/system testing and is intended to give users the ability to create, test, deploy, and certify their subsystem displays in a fraction of the time that it would take to do so using previous software and programming methods. Within the visualization system there are three major components: the developer, the deployer, and the widget set. The developer is a blank canvas with widget menu items that give users the ability to easily create displays. The deployer is an application that allows for the deployment of the displays created using the developer application. The deployer has additional functionality that the developer does not have, such as printing of displays, screen captures to files, windowing of displays, and also serves as the interface into the documentation archive and help system. The third major component is the widget set. The widgets are the visual representation of the items that will make up the display (i.e., meters, dials, buttons, numerical indicators, string indicators, and the like). This software was developed using Visual C++ and uses COTS (commercial off-the-shelf) software where possible.

  7. Faculty development to enhance humanistic teaching and role modeling: a collaborative study at eight institutions.

    PubMed

    Branch, William T; Chou, Calvin L; Farber, Neil J; Hatem, David; Keenan, Craig; Makoul, Gregory; Quinn, Mariah; Salazar, William; Sillman, Jane; Stuber, Margaret; Wilkerson, LuAnn; Mathew, George; Fost, Michael

    2014-09-01

    There is increased emphasis on practicing humanism in medicine but explicit methods for faculty development in humanism are rare. We sought to demonstrate improved faculty teaching and role modeling of humanistic and professional values by participants in a multi-institutional faculty development program as rated by their learners in clinical settings compared to contemporaneous controls. Blinded learners in clinical settings rated their clinical teachers, either participants or controls, on the previously validated 10-item Humanistic Teaching Practices Effectiveness (HTPE) questionnaire. Groups of 7-9 participants at 8 academic medical centers completed an 18-month faculty development program. Participating faculty were chosen by program facilitators at each institution on the basis of being promising teachers, willing to participate in the longitudinal faculty development program. Our 18-month curriculum combined experiential learning of teaching skills with critical reflection using appreciative inquiry narratives about their experiences as teachers and other reflective discussions. The main outcome was the aggregate score of the ten items on the questionnaire at all institutions. The aggregate score favored participants over controls (P = 0.019) independently of gender, experience on faculty, specialty area, and/or overall teaching skills. Longitudinal, intensive faculty development that employs experiential learning and critical reflection likely enhances humanistic teaching and role modeling. Almost all participants completed the program. Results are generalizable to other schools.

  8. Parallel approaches to composite production: interfaces that behave contrary to expectation.

    PubMed

    Frowd, Charlie D; Bruce, Vicki; Ness, Hayley; Bowie, Leslie; Paterson, Jenny; Thomson-Bogner, Claire; McIntyre, Alexander; Hancock, Peter J B

    2007-04-01

    This paper examines two facial composite systems that present multiple faces during construction to more closely resemble natural face processing. A 'parallel' version of PRO-fit was evaluated, which presents facial features in sets of six or twelve, and EvoFIT, a system in development, which contains a holistic face model and an evolutionary interface. The PRO-fit parallel interface turned out not to be quite as good as the 'serial' version as it appeared to interfere with holistic face processing. Composites from EvoFIT were named almost three times better than PRO-fit, but a benefit emerged under feature encoding, suggesting that recall has a greater role for EvoFIT than was previously thought. In general, an advantage was found for feature encoding, replicating a previous finding in this area, and also for a novel 'holistic' interview.

  9. A gyrokinetic one-dimensional scrape-off layer model of an edge-localized mode heat pulse

    DOE PAGES

    Shi, E. L.; Hakim, A. H.; Hammett, G. W.

    2015-02-03

    An electrostatic gyrokinetic-based model is applied to simulate parallel plasma transport in the scrape-off layer to a divertor plate. We focus on a test problem that has been studied previously, using parameters chosen to model a heat pulse driven by an edge-localized mode in JET. Previous work has used direct particle-in-cell equations with full dynamics, or Vlasov or fluid equations with only parallel dynamics. With the use of the gyrokinetic quasineutrality equation and logical sheath boundary conditions, spatial and temporal resolution requirements are no longer set by the electron Debye length and plasma frequency, respectively. Finally, this test problem also helps illustrate some of the physics contained in the Hamiltonian form of the gyrokinetic equations and some of the numerical challenges in developing an edge gyrokinetic code.

  10. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids by Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
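
    The complex-variable trick at the heart of this approach is easy to demonstrate in isolation: perturb the input along the imaginary axis and read the derivative off the imaginary part, with no subtractive cancellation, so the step size can be made vanishingly small. A generic illustration of the complex-step derivative (not the solver code described in the abstract):

    ```python
    import numpy as np

    def complex_step_derivative(f, x, h=1e-30):
        """Derivative of a real-valued f at x via the complex-step method.

        Unlike finite differences there is no subtraction of nearly equal
        numbers, so h can be tiny and accuracy approaches machine precision.
        """
        return np.imag(f(x + 1j * h)) / h

    # A nontrivial real-valued function; numpy's sin/cos/exp/sqrt all accept
    # complex arguments, which is what makes the trick "straightforward".
    f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)
    d = complex_step_derivative(f, 1.5)
    ```

    The same property is what lets an automated scripting pass convert an existing real-valued residual evaluation into an exact linearization with minimal hand-coding.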

  11. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.

  12. Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer

    PubMed Central

    Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro

    2015-01-01

    We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs. PMID:26819909

  13. Miniature MMIC Low Mass/Power Radiometer Modules for the 180 GHz GeoSTAR Array

    NASA Technical Reports Server (NTRS)

    Kangaslahti, Pekka; Tanner, Alan; Pukala, David; Lambrigtsen, Bjorn; Lim, Boon; Mei, Xiaobing; Lai, Richard

    2010-01-01

    We have developed and demonstrated miniature 180 GHz Monolithic Microwave Integrated Circuit (MMIC) radiometer modules that have low noise temperature, low mass and low power consumption. These modules will enable the Geostationary Synthetic Thinned Aperture Radiometer (GeoSTAR) of the Precipitation and All-weather Temperature and Humidity (PATH) Mission for atmospheric temperature and humidity profiling. The GeoSTAR instrument has an array of hundreds of receivers. Technology that was developed included Indium Phosphide (InP) MMIC Low Noise Amplifiers (LNAs) and second harmonic MMIC mixers and I-Q mixers, surface mount Multi-Chip Module (MCM) packages at 180 GHz, and interferometric array at 180 GHz. A complete MMIC chip set for the 180 GHz receiver modules (LNAs and I-Q Second harmonic mixer) was developed. The MMIC LNAs had more than 50% lower noise temperature (NT=300K) than previous state-of-art and MMIC I-Q mixers demonstrated low LO power (3 dBm). Two lots of MMIC wafers were processed with very high DC transconductance of up to 2800 mS/mm for the 35 nm gate length devices. Based on these MMICs a 180 GHz Multichip Module was developed that had a factor of 100 lower mass/volume (16x18x4.5 mm3, 3g) than previous generation 180 GHz receivers.

  14. Many-Body Theory for Positronium-Atom Interactions

    NASA Astrophysics Data System (ADS)

    Green, D. G.; Swann, A. R.; Gribakin, G. F.

    2018-05-01

    A many-body-theory approach has been developed to study positronium-atom interactions. As first applications, we calculate the elastic scattering and momentum-transfer cross sections and the pickoff annihilation rate 1Zeff for Ps collisions with He and Ne. For He the cross section is in agreement with previous coupled-state calculations, while comparison with experiment for both atoms highlights discrepancies between various sets of measured data. In contrast, the calculated 1Zeff (0.13 and 0.26 for He and Ne, respectively) are in excellent agreement with the measured values.

  15. On singlet s-wave electron-hydrogen scattering.

    NASA Technical Reports Server (NTRS)

    Madan, R. N.

    1973-01-01

    Discussion of various zeroth-order approximations to s-wave scattering of electrons by hydrogen atoms below the first excitation threshold. The formalism previously developed by the author (1967, 1968) is applied to Feshbach operators to derive integro-differential equations, with the optical-potential set equal to zero, for the singlet and triplet cases. Phase shifts of s-wave scattering are computed in the zeroth-order approximation of the Feshbach operator method and in the static-exchange approximation. It is found that the convergence of numerical computations is faster in the former approximation than in the latter.

  16. Chronic calcific constrictive pericarditis complicating Churg-Strauss syndrome: first reported case.

    PubMed

    Aboukhoudir, Falah; Pansieri, Michel; Rekik, Sofiene

    2014-10-01

    Churg-Strauss syndrome is a necrotizing systemic vasculitis characterized by extravascular granulomas and eosinophilic infiltrates of small vessels. Although cardiac complications are considered to be relatively common, no case of constrictive calcified pericarditis has ever been previously described in this setting. In this report, we present the case of a 46-year-old man with Churg-Strauss syndrome, in whom we were able to document the development of symptomatic calcific constrictive pericarditis during a 10-year period despite long-term corticosteroid therapy. Georg Thieme Verlag KG Stuttgart · New York.

  17. New global hydrography derived from spaceborne elevation data

    USGS Publications Warehouse

    Lehner, B.; Verdin, K.; Jarvis, A.

    2008-01-01

    In response to these limitations, a team of scientists has developed data and created maps of the world's rivers that provide the research community with more reliable information about where streams and watersheds occur on the Earth's surface and how water drains the landscape. The new product, known as HydroSHEDS (Hydrological Data and Maps Based on Shuttle Elevation Derivatives at Multiple Scales), provides this information at a resolution and quality unachieved by previous global data sets, such as HYDRO1k [U.S. Geological Survey (USGS), 2000].

  18. Scenario Decomposition for 0-1 Stochastic Programs: Improvements and Asynchronous Implementation

    DOE PAGES

    Ryan, Kevin; Rajan, Deepak; Ahmed, Shabbir

    2016-05-01

    A recently proposed scenario decomposition algorithm for stochastic 0-1 programs finds an optimal solution by evaluating and removing individual solutions that are discovered by solving scenario subproblems. In our work, we develop an asynchronous, distributed implementation of the algorithm which has computational advantages over existing synchronous implementations. Improvements to both the synchronous and asynchronous algorithms are proposed. We also test the algorithms on well-known stochastic 0-1 programs from the SIPLIB test library and are able to solve one previously unsolved instance from the test set.

  19. The method of lines in three dimensional fracture mechanics

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J.; Berke, L.

    1980-01-01

    A review of recent developments in the calculation of design parameters for fracture mechanics by the method of lines (MOL) is presented. Three dimensional elastic and elasto-plastic formulations are examined and results from previous and current research activities are reported. The application of MOL to the appropriate partial differential equations of equilibrium leads to coupled sets of simultaneous ordinary differential equations. Solutions of these equations are obtained by the Peano-Baker and by the recurrence relations methods. The advantages and limitations of both solution methods from the computational standpoint are summarized.
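
    The core step of the method of lines, reducing a PDE to a coupled ODE system by discretizing only the spatial derivatives, can be shown on a toy heat-conduction problem (not the three-dimensional fracture formulation discussed above):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Method of lines for u_t = u_xx on [0, 1] with u(0, t) = u(1, t) = 0:
    # discretize x, keep t continuous, and hand the resulting ODE system to
    # a standard stiff integrator.
    N = 50
    x = np.linspace(0.0, 1.0, N + 2)      # grid including boundary points
    dx = x[1] - x[0]
    u0 = np.sin(np.pi * x[1:-1])          # initial condition at interior nodes

    def rhs(t, u):
        # Second-order central difference for u_xx; boundary values are zero.
        upad = np.concatenate(([0.0], u, [0.0]))
        return (upad[2:] - 2 * upad[1:-1] + upad[:-2]) / dx**2

    sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", rtol=1e-8, atol=1e-10)

    # Exact solution for this initial condition: u(x, t) = exp(-pi^2 t) sin(pi x)
    u_exact = np.exp(-np.pi**2 * 0.1) * np.sin(np.pi * x[1:-1])
    ```

    The papers reviewed here integrate the resulting ODE systems analytically (Peano-Baker series or recurrence relations) rather than numerically, but the reduction from PDE to ODE system is the same.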

  20. Determination of human body burden baseline data of platinum through autopsy tissue analysis.

    PubMed Central

    Vandiver, F; Duffield, F V; Yoakum, A; Bumgarner, J; Moran, J

    1976-01-01

    Results of analysis for platinum in 97 autopsy sets are presented. Analysis was performed by a specially developed emission spectrochemical method. Almost half of the individuals studied were found to have detectable platinum in one or more tissue samples. Platinum was found to be deposited in 13 of 21 tissue types investigated. Surprisingly high values were observed in subcutaneous fat, previously not considered to be a target site for platinum deposition. These data will serve as a human tissue platinum burden baseline in EPA's Catalyst Research Program. PMID:1001291
