Sample records for ALE meta-analysis workflows

  1. Behavior, sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation.

    PubMed

    Eickhoff, Simon B; Nichols, Thomas E; Laird, Angela R; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T; Bzdok, Danilo; Eickhoff, Claudia R

    2016-08-15

    Given the increasing number of neuroimaging publications, automated knowledge extraction of brain-behavior associations by quantitative meta-analysis has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and thus to formulate recommendations for future ALE studies. As a first consequence, we showed that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery-rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments in an ALE meta-analysis to achieve sufficient power for moderate effects. We note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis.
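
    The core ALE computation evaluated above is compact enough to sketch: each experiment's foci are smoothed into a modeled activation (MA) map, and the ALE score at a voxel is the probability that at least one experiment activates it. Below is a minimal numpy sketch under stated assumptions: a toy grid, invented foci, and an illustrative kernel width (GingerALE derives its FWHM empirically from sample size, and significance requires the null/thresholding machinery the abstract evaluates, which is omitted here).

      import numpy as np

      def ma_map(shape, foci, fwhm_vox):
          """Modeled activation map: voxel-wise max Gaussian over one experiment's foci."""
          sigma = fwhm_vox / (2.0 * np.sqrt(2.0 * np.log(2.0)))
          grid = np.indices(shape).reshape(3, -1).T          # all voxel coordinates
          ma = np.zeros(grid.shape[0])
          for f in foci:
              d2 = ((grid - f) ** 2).sum(axis=1)
              ma = np.maximum(ma, np.exp(-d2 / (2.0 * sigma ** 2)))
          return ma.reshape(shape)

      # Three toy "experiments", each a list of (x, y, z) voxel foci
      experiments = [[(10, 10, 10), (12, 11, 10)], [(11, 10, 9)], [(30, 30, 30)]]
      mas = [ma_map((40, 40, 40), np.array(f), fwhm_vox=3.0) for f in experiments]

      # ALE: probability that at least one experiment activates each voxel
      ale = 1.0 - np.prod([1.0 - ma for ma in mas], axis=0)
      print(ale.max())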

  2. Behavior, sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation

    PubMed Central

    Eickhoff, Simon B.; Nichols, Thomas E.; Laird, Angela R.; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T.

    2016-01-01

    Given the increasing number of neuroimaging publications, automated knowledge extraction of brain-behavior associations by quantitative meta-analysis has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and thus to formulate recommendations for future ALE studies. As a first consequence, we showed that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery-rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments in an ALE meta-analysis to achieve sufficient power for moderate effects. We note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. PMID:27179606

  3. The neuronal correlates of intranasal trigeminal function – An ALE meta-analysis of human functional brain imaging data

    PubMed Central

    Albrecht, Jessica; Kopietz, Rainer; Frasnelli, Johannes; Wiesmann, Martin; Hummel, Thomas; Lundström, Johan N.

    2009-01-01

    Almost every odor we encounter in daily life has the capacity to produce a trigeminal sensation. Surprisingly, few functional imaging studies exploring human neuronal correlates of intranasal trigeminal function exist, and results are to some degree inconsistent. We utilized activation likelihood estimation (ALE), a quantitative voxel-based meta-analysis tool, to analyze functional imaging data (fMRI/PET) following intranasal trigeminal stimulation with carbon dioxide (CO2), a stimulus known to exclusively activate the trigeminal system. Meta-analysis tools are able to identify activations common across studies, thereby enabling activation mapping with higher certainty. Activation foci of nine studies utilizing trigeminal stimulation were included in the meta-analysis. We found significant ALE scores, thus indicating consistent activation across studies, in the brainstem, ventrolateral posterior thalamic nucleus, anterior cingulate cortex, insula, precentral gyrus, as well as in primary and secondary somatosensory cortices – a network known for the processing of intranasal nociceptive stimuli. Significant ALE values were also observed in the piriform cortex, insula, and the orbitofrontal cortex, areas known to process chemosensory stimuli, and in association cortices. Additionally, the trigeminal ALE statistics were directly compared with ALE statistics originating from olfactory stimulation, demonstrating considerable overlap in activation. In conclusion, the results of this meta-analysis map the human neuronal correlates of intranasal trigeminal stimulation with high statistical certainty and demonstrate that the cortical areas recruited during the processing of intranasal CO2 stimuli include those outside traditional trigeminal areas. Moreover, by illustrating the considerable overlap between brain areas that process trigeminal and olfactory information, these results demonstrate the interconnectivity of flavor processing. PMID:19913573

  4. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    ERIC Educational Resources Information Center

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  5. Detecting social-cognitive deficits after traumatic brain injury: An ALE meta-analysis of fMRI studies.

    PubMed

    Xiao, Hui; Jacobsen, Andre; Chen, Ziqian; Wang, Yang

    2017-01-01

    Traumatic brain injury (TBI) can result in significant social dysfunction, which is represented by impairment to social-cognitive abilities (i.e. social cognition, social attention/executive function and communication). This study aimed to explore the brain networks mediating social dysfunction after TBI and its underlying mechanisms. We performed a quantitative meta-analysis using the activation likelihood estimation (ALE) approach on functional magnetic resonance imaging (fMRI) studies of social-cognitive abilities following TBI. Sixteen studies fulfilled the inclusion criteria, resulting in a total of 190 patients with TBI and 206 controls enrolled in the ALE meta-analysis. The temporoparietal junction (TPJ) and the medial prefrontal cortex (mPFC) were the specific regions that social cognition predominantly engaged. The cingulate gyrus, frontal gyrus and inferior parietal lobule were the main regions related to social attention/executive functions. Communication dysfunction, especially related to language deficits, was found to show greater activation of the temporal gyrus and fusiform gyrus in TBI. The current ALE meta-analytic findings provide evidence that patients have significant social-cognitive disabilities following TBI. The relatively limited pool of literature and the varied fMRI results from published studies indicate that social-cognitive ability following TBI is an area that would greatly benefit from further investigation.

  6. Voxel-Based Morphometry ALE meta-analysis of Bipolar Disorder

    NASA Astrophysics Data System (ADS)

    Magana, Omar; Laird, Robert

    2012-03-01

    A meta-analysis was performed to examine changes in gray matter (GM) in patients with bipolar disorder (BP). The meta-analysis was conducted in Talairach space using GingerALE, which determines significant voxels by permutation testing. For data acquisition, published experiments and similar research studies were gathered from the online voxel-based morphometry (VBM) database, and coordinates of activation locations were extracted from bipolar-disorder-related journals using Sleuth. Once the coordinates of the selected experiments were imported into GingerALE, a Gaussian blur was applied to all foci to model the spatial concentration of GM differences in BP patients. The results included volume reductions and variations of GM between normal healthy controls and patients with bipolar disorder. A significant number of GM clusters were obtained for normal healthy controls relative to BP patients in the right precentral gyrus, right anterior cingulate, and left inferior frontal gyrus. In future research, more published journals could be added to the database and another VBM meta-analysis could be performed including more activation coordinates or a variation of age groups.
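
    The Sleuth-to-GingerALE hand-off described here is a plain text file of foci. A hedged reader sketch, assuming the commonly used layout ("//" lines for reference space and experiment metadata, whitespace-separated x y z rows, blank lines between experiments); verify against an actual Sleuth export before relying on it:

      def parse_sleuth(path):
          """Parse a Sleuth-style foci export into a list of experiments (format assumed)."""
          experiments, current = [], {"meta": [], "foci": []}
          for line in open(path):
              line = line.strip()
              if not line:                              # blank line ends an experiment
                  if current["foci"]:
                      experiments.append(current)
                      current = {"meta": [], "foci": []}
              elif line.startswith("//"):               # metadata, e.g. "// Reference=Talairach"
                  current["meta"].append(line[2:].strip())
              else:                                     # coordinate row: x y z
                  x, y, z = map(float, line.split())
                  current["foci"].append((x, y, z))
          if current["foci"]:
              experiments.append(current)
          return experiments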

  7. Grey Matter Alterations Co-Localize with Functional Abnormalities in Developmental Dyslexia: An ALE Meta-Analysis

    PubMed Central

    Linkersdörfer, Janosch; Lonnemann, Jan; Lindberg, Sven; Hasselhorn, Marcus; Fiebach, Christian J.

    2012-01-01

    The neural correlates of developmental dyslexia have been investigated intensively over the last two decades and reliable evidence for a dysfunction of left-hemispheric reading systems in dyslexic readers has been found in functional neuroimaging studies. In addition, structural imaging studies using voxel-based morphometry (VBM) demonstrated grey matter reductions in dyslexics in several brain regions. To objectively assess the consistency of these findings, we performed activation likelihood estimation (ALE) meta-analysis on nine published VBM studies reporting 62 foci of grey matter reduction in dyslexic readers. We found six significant clusters of convergence in bilateral temporo-parietal and left occipito-temporal cortical regions and in the cerebellum bilaterally. To identify possible overlaps between structural and functional deviations in dyslexic readers, we conducted additional ALE meta-analyses of imaging studies reporting functional underactivations (125 foci from 24 studies) or overactivations (95 foci from 11 studies) in dyslexics. Subsequent conjunction analyses revealed overlaps between the results of the VBM meta-analysis and the meta-analysis of functional underactivations in the fusiform and supramarginal gyri of the left hemisphere. An overlap between VBM results and the meta-analysis of functional overactivations was found in the left cerebellum. The results of our study provide evidence for consistent grey matter variations bilaterally in the dyslexic brain and substantial overlap of these structural variations with functional abnormalities in left hemispheric regions. PMID:22916214

  8. Different patterns and development characteristics of processing written logographic characters and alphabetic words: an ALE meta-analysis.

    PubMed

    Zhu, Linlin; Nie, Yaoxin; Chang, Chunqi; Gao, Jia-Hong; Niu, Zhendong

    2014-06-01

    The neural systems for phonological processing of written language have been well identified by now, but models based on these neural systems differ across language systems and age groups. Although each such model is largely concordant across experiments, the results are sensitive to experiment design and intersubject variability. Activation likelihood estimation (ALE) meta-analysis can quantitatively synthesize data from multiple studies and minimize interstudy and intersubject differences. In this study, we performed two ALE meta-analysis experiments: one examined the neural activation patterns underlying phonological processing of two different types of written language, and the other examined the developmental characteristics of these activation patterns based on both alphabetic and logographic language data. The results of our first experiment were consistent with a previous meta-analysis based on studies published before 2005. Our second experiment yielded new findings: both adult and child groups showed strong activation in the left frontal lobe, the left superior/middle temporal gyrus, and the bilateral middle/superior occipital gyrus. However, activation of the left middle/inferior frontal gyrus was found to increase with development, while activation decreased in the following areas: the right claustrum and inferior frontal gyrus, the left inferior/medial frontal gyrus, the left middle/superior temporal gyrus, the right cerebellum, and the bilateral fusiform gyrus. It seems that adults recruit more phonological areas, whereas children recruit more orthographic and semantic areas.

  9. ALE Meta-Analysis of Schizophrenics Performing the N-Back Task

    NASA Astrophysics Data System (ADS)

    Harrell, Zachary

    2010-10-01

    MRI/fMRI has already proven itself as a valuable tool in the diagnosis and treatment of many illnesses of the brain, including cognitive problems. By exploiting the differences in magnetic susceptibility between oxygenated and deoxygenated hemoglobin, fMRI can measure blood flow in various regions of interest within the brain. This can determine the level of brain activity in relation to motor or cognitive functions and provide a metric for tissue damage or illness symptoms. Structural imaging techniques have shown lesions or deficiencies in tissue volumes in schizophrenics corresponding to areas primarily in the frontal and temporal lobes. These areas are currently known to be involved in working memory and attention, which many schizophrenics have trouble with. The ALE (Activation Likelihood Estimation) Meta-Analysis is able to statistically determine the significance of brain area activations based on the post-hoc combination of multiple studies. This process is useful for giving a general model of brain function in relation to a particular task designed to engage the affected areas (such as working memory for the n-back task). The advantages of the ALE Meta-Analysis include elimination of single subject anomalies, elimination of false/extremely weak activations, and verification of function/location hypotheses.

  10. Hypnosis and pain perception: An Activation Likelihood Estimation (ALE) meta-analysis of functional neuroimaging studies.

    PubMed

    Del Casale, Antonio; Ferracuti, Stefano; Rapinesi, Chiara; De Rossi, Pietro; Angeletti, Gloria; Sani, Gabriele; Kotzalidis, Georgios D; Girardi, Paolo

    2015-12-01

    Several studies have reported that hypnosis can modulate pain perception and tolerance by affecting cortical and subcortical activity in brain regions involved in these processes. We conducted an Activation Likelihood Estimation (ALE) meta-analysis of functional neuroimaging studies of pain perception under hypnosis to identify brain activation-deactivation patterns occurring during hypnotic suggestions aimed at pain reduction, including hypnotic analgesic, pleasant, or depersonalization suggestions (HASs). We searched the PubMed, Embase and PsycInfo databases and included papers published in peer-reviewed journals dealing with functional neuroimaging and hypnosis-modulated pain perception. The ALE meta-analysis encompassed data from 75 healthy volunteers reported in 8 functional neuroimaging studies. HASs during experimentally induced pain, compared to control conditions, correlated with significant activations of the right anterior cingulate cortex (Brodmann's Area [BA] 32), left superior frontal gyrus (BA 6), and right insula, and deactivation of the right midline nuclei of the thalamus. HASs during experimental pain thus affect both cortical and subcortical brain activity. Activation increases in the anterior cingulate, left superior frontal, and right insular cortices could induce thalamic deactivation (top-down inhibition), which may correlate with reductions in pain intensity.

  11. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones.

    PubMed

    Kwok, Veronica P Y; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be due to the limited number of studies available for comparison. Although the current study lacks evidence for a lexical-tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in greater depth.

  12. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones

    PubMed Central

    Kwok, Veronica P. Y.; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T.; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be due to the limited number of studies available for comparison. Although the current study lacks evidence for a lexical-tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in greater depth. PMID:28798670

  13. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    PubMed Central

    Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs

    2018-01-01

    Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly together with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome of a coordinate-based meta-analysis. In particular, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which uses only peak locations, versus fixed- and random-effects meta-analysis, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme to a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
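
    The distinction the authors draw, ALE using peak locations only versus meta-analyses that also weight peak height, is easiest to see in the effect-size case. A sketch of standard DerSimonian-Laird random-effects pooling with toy numbers (not the paper's data or code):

      import numpy as np

      y = np.array([0.42, 0.31, 0.55, 0.12, 0.38])   # per-study effect sizes (toy)
      v = np.array([0.02, 0.04, 0.03, 0.05, 0.02])   # within-study variances (toy)

      w = 1.0 / v                                     # fixed-effect weights
      y_fixed = (w * y).sum() / w.sum()
      Q = (w * (y - y_fixed) ** 2).sum()              # Cochran's Q heterogeneity statistic
      k = len(y)
      tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

      w_re = 1.0 / (v + tau2)                         # random-effects weights
      y_re = (w_re * y).sum() / w_re.sum()
      se_re = np.sqrt(1.0 / w_re.sum())
      print(f"pooled effect = {y_re:.3f} +/- {1.96 * se_re:.3f}, tau^2 = {tau2:.4f}")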

  14. Meditation-related activations are modulated by the practices needed to obtain it and by the expertise: an ALE meta-analysis study

    PubMed Central

    Tomasino, Barbara; Fregona, Sara; Skrap, Miran; Fabbro, Franco

    2013-01-01

    The brain network governing meditation has been studied using a variety of meditation practices and techniques eliciting different cognitive processes (e.g., silence, attention to one's own body, sense of joy, mantras, etc.). It is very possible that different practices of meditation are subserved by largely, if not entirely, disparate brain networks. This assumption was tested by conducting an activation likelihood estimation (ALE) meta-analysis of meditation neuroimaging studies, which assessed 150 activation foci from 24 experiments. Different ALE meta-analyses were carried out. One involved the subset of studies in which meditation was induced through exercising focused attention (FA). The network included clusters bilaterally in the medial gyrus, the left superior parietal lobe, the left insula and the right supramarginal gyrus (SMG). A second analysis addressed the studies involving meditation states induced by chanting or by repetition of words or phrases, known as "mantra." This type of practice elicited a cluster of activity in the right SMG, the SMA bilaterally and the left postcentral gyrus. Furthermore, the last analyses addressed the effect of meditation experience (i.e., short- vs. long-term meditators). We found that frontal activation was present for short-term as compared with long-term meditators, confirming that experts are better able to sustain attentional focus, instead recruiting the right SMG and concentrating on aspects involving disembodiment. PMID:23316154

  15. Imitation and speech: commonalities within Broca's area.

    PubMed

    Kühn, Simone; Brass, Marcel; Gallinat, Jürgen

    2013-11-01

    The so-called embodiment of communication has attracted considerable interest. Recently, a growing number of studies have proposed a link between Broca's area's involvement in action processing and its involvement in speech. The present quantitative meta-analysis set out to test whether neuroimaging studies on imitation and overt speech show overlap within the inferior frontal gyrus. By means of activation likelihood estimation (ALE), we investigated the concurrence of brain regions activated by object-free hand imitation studies as well as overt speech studies, including simple syllable and more complex word production. We found direct overlap between imitation and speech in bilateral pars opercularis (BA 44) within Broca's area. Subtraction analyses revealed no unique localization for either speech or imitation. To verify the potential of ALE subtraction analysis to detect unique involvement within Broca's area, we contrasted the results of a meta-analysis on motor inhibition with those on imitation and found separable regions involved in imitation. This is the first meta-analysis to compare the neural correlates of imitation and overt speech. The results are in line with the proposed evolutionary roots of speech in imitation.

  16. The role of the right temporoparietal junction in attention and social interaction as revealed by ALE meta-analysis

    PubMed Central

    Rottschy, C.; Oberwelland, E.; Bzdok, D.; Fox, P. T.; Eickhoff, S. B.; Fink, G. R.; Konrad, K.

    2016-01-01

    The right temporoparietal junction (rTPJ) is frequently associated with two distinct capacities: shifting attention to unexpected stimuli (reorienting of attention) and understanding others' (false) mental states [theory of mind (ToM), typically represented by false belief tasks]. Competing hypotheses suggest either that the rTPJ is a unitary region involved in separate cognitive functions or that it consists of subregions subserving distinct processes. We conducted activation likelihood estimation (ALE) meta-analyses to test these hypotheses. A conjunction analysis across ALE meta-analyses delineating regions consistently recruited by reorienting of attention and false belief studies revealed the anterior rTPJ, suggesting an overarching role of this specific region. Moreover, the difference analysis revealed higher convergence in the posterior rTPJ for false belief than for reorienting-of-attention tasks. This supports the concept of an exclusive role of the posterior rTPJ in the social domain. These results were complemented by meta-analytic connectivity mapping (MACM) and resting-state functional connectivity (RSFC) analyses to investigate whole-brain connectivity patterns in task-constrained and task-free brain states, allowing a more detailed account of the functional separation of the anterior and posterior rTPJ. The combination of MACM and RSFC mapping showed that the posterior rTPJ shares connectivity patterns with typical ToM regions, whereas the anterior part of the rTPJ co-activates with the attentional network. Taken together, our data suggest that the rTPJ contains two functionally fractionated subregions: while the posterior rTPJ seems exclusively involved in the social domain, the anterior rTPJ is involved in both attention and ToM, conceivably indicating an attentional-shifting role of this region. PMID:24915964

  17. Phenotypic regional fMRI activation patterns during memory encoding in MCI and AD

    PubMed Central

    Browndyke, Jeffrey N.; Giovanello, Kelly; Petrella, Jeffrey; Hayden, Kathleen; Chiba-Falek, Ornit; Tucker, Karen A.; Burke, James R.; Welsh-Bohmer, Kathleen A.

    2014-01-01

    Background Reliable blood-oxygen-level-dependent (BOLD) fMRI phenotypic biomarkers of Alzheimer's disease (AD) or mild cognitive impairment (MCI) are likely to emerge only from a systematic, quantitative, and aggregate examination of the functional neuroimaging research literature. Methods A series of random-effects, activation likelihood estimation (ALE) meta-analyses were conducted on studies of episodic memory encoding operations in AD and MCI samples relative to normal controls. ALE analyses were based upon a thorough literature search for all task-based functional neuroimaging studies in AD and MCI published up to January 2010. Analyses covered 16 fMRI studies, which yielded 144 distinct foci for ALE meta-analysis. Results ALE results indicated several regional task-based BOLD consistencies in MCI and AD patients relative to normal controls across the aggregate BOLD functional neuroimaging research literature. Patients with AD and those at significant risk (MCI) showed statistically significant, consistent activation differences during episodic memory encoding in the medial temporal lobe (MTL), specifically the parahippocampal gyrus, as well as the superior frontal gyrus, precuneus, and cuneus, relative to normal controls. Conclusions ALE consistencies broadly support the presence of frontal compensatory activity, MTL activity alteration, and posterior midline "default mode" hyperactivation during episodic memory encoding attempts in the diseased or prospective pre-disease condition. Taken together, these robust commonalities may form the foundation for a task-based fMRI phenotype of memory encoding in AD. PMID:22841497

  18. A meta-analysis of neurofunctional imaging studies of emotion and cognition in major depression.

    PubMed

    Diener, Carsten; Kuehner, Christine; Brusniak, Wencke; Ubl, Bettina; Wessa, Michèle; Flor, Herta

    2012-07-02

    Major depressive disorder (MDD) is characterized by altered emotional and cognitive functioning. We performed a voxel-based whole-brain meta-analysis of functional neuroimaging data on altered emotion and cognition in MDD. Forty peer-reviewed English-language studies published between 1998 and 2010 were included, which used functional neuroimaging during cognitive-emotional challenge in adult individuals with MDD and healthy controls. All studies reported between-groups differences for whole-brain analyses in standardized neuroanatomical space and were subjected to Activation Likelihood Estimation (ALE) of brain clusters showing altered responsivity in MDD. ALE yielded thresholded, false-discovery-rate-corrected hypo- and hyperactive brain regions. Against the background of a complex neural activation pattern, studies converged on predominantly hypoactive clusters in the anterior insular and rostral anterior cingulate cortex linked to affectively biased information processing and poor cognitive control. Frontal areas showed not only similar under- but also over-activation during cognitive-emotional challenge. On the subcortical level, we identified activation alterations in the thalamus and striatum involved in biased valence processing of emotional stimuli in MDD. These results for active conditions extend findings from ALE meta-analyses of resting-state and antidepressant-treatment studies and emphasize the key role of the anterior insular and rostral anterior cingulate cortex in altered emotion and cognition in MDD.

  19. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base for implementing models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted, as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net , provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, as well as the ability to run different tools simultaneously using pre-defined and user-created custom workflows.
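
    Most of the listed functions are variations on one linear program: maximize an objective flux subject to steady-state stoichiometry S·v = 0 and per-reaction flux bounds. A self-contained sketch on an invented three-reaction toy network (not MetaNET's code or API); a knock-out analysis of the kind in (i) amounts to pinning one reaction's bounds to zero and re-solving:

      import numpy as np
      from scipy.optimize import linprog

      # Stoichiometric matrix S (metabolites x reactions); steady state: S @ v = 0
      S = np.array([[1, -1,  0],     # metabolite A: produced by r1, consumed by r2
                    [0,  1, -1]])    # metabolite B: produced by r2, consumed by r3
      bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds per reaction
      c = np.array([0, 0, -1])               # maximize v3 -> minimize -v3

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print("optimal objective flux:", -res.fun, "flux vector:", res.x)

      # Knock-out of reaction r2: pin its bounds to zero and re-solve
      ko = linprog(c, A_eq=S, b_eq=np.zeros(2),
                   bounds=[(0, 10), (0, 0), (0, 10)], method="highs")
      print("objective flux after knock-out:", -ko.fun)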

  20. Where do bright ideas occur in our brain? Meta-analytic evidence from neuroimaging studies of domain-specific creativity

    PubMed Central

    Boccia, Maddalena; Piccardi, Laura; Palermo, Liana; Nori, Raffaella; Palmiero, Massimiliano

    2015-01-01

    Many studies have assessed the neural underpinnings of creativity, failing to find a clear anatomical localization. We aimed to provide evidence for a multi-componential neural system for creativity. We applied a general activation likelihood estimation (ALE) meta-analysis to 45 fMRI studies. Three individual ALE analyses were performed to assess creativity in different cognitive domains (Musical, Verbal, and Visuo-spatial). The general ALE revealed that creativity relies on clusters of activations in the bilateral occipital, parietal, frontal, and temporal lobes. The individual ALE revealed different maximal activation in different domains. Musical creativity yields activations in the bilateral medial frontal gyrus, in the left cingulate gyrus, middle frontal gyrus, and inferior parietal lobule and in the right postcentral and fusiform gyri. Verbal creativity yields activations mainly located in the left hemisphere, in the prefrontal cortex, middle and superior temporal gyri, inferior parietal lobule, postcentral and supramarginal gyri, middle occipital gyrus, and insula. The right inferior frontal gyrus and the lingual gyrus were also activated. Visuo-spatial creativity activates the right middle and inferior frontal gyri, the bilateral thalamus and the left precentral gyrus. This evidence suggests that creativity relies on multi-componential neural networks and that different creativity domains depend on different brain regions. PMID:26322002

  21. A meta-analysis of neuroimaging studies on divergent thinking using activation likelihood estimation.

    PubMed

    Wu, Xin; Yang, Wenjing; Tong, Dandan; Sun, Jiangzhou; Chen, Qunlin; Wei, Dongtao; Zhang, Qinglin; Zhang, Meng; Qiu, Jiang

    2015-07-01

    In this study, an activation likelihood estimation (ALE) meta-analysis was used to conduct a quantitative investigation of neuroimaging studies on divergent thinking. Based on the ALE results, the functional magnetic resonance imaging (fMRI) studies showed that distributed brain regions were more active under divergent thinking tasks (DTTs) than under control tasks, but a large portion of brain regions were deactivated. The ALE results indicated that the brain networks of creative idea generation in DTTs may be composed of the lateral prefrontal cortex, posterior parietal cortex [such as the inferior parietal lobule (BA 40) and precuneus (BA 7)], anterior cingulate cortex (ACC) (BA 32), and several regions in the temporal cortex [such as the left middle temporal gyrus (BA 39) and left fusiform gyrus (BA 37)]. The left dorsolateral prefrontal cortex (BA 46) was related to selecting loosely and remotely associated concepts and organizing them into creative ideas, whereas the ACC (BA 32) was related to observing and forming distant semantic associations in performing DTTs. The posterior parietal cortex may be involved in the retrieval and buffering of semantic information related to the formed creative ideas, and several regions in the temporal cortex may be related to stored long-term memory. In addition, the ALE results of the structural studies showed that divergent thinking was related to the dopaminergic system (e.g., left caudate and claustrum). Based on the ALE results, both fMRI and structural MRI studies could uncover the neural basis of divergent thinking from different aspects (e.g., specific cognitive processing and stable individual differences in cognitive capability).

  22. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    PubMed

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In quantum chemistry, many tasks recur frequently, e.g., geometry optimizations or benchmarking series. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex, monolithic entities that can be used only for particular scientific experiments. Hence, their modification is not straightforward, and this makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing within and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in quantum chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the quantum chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical abstract: meta-workflows and embedded workflows in the template representation.
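
    The atomic/meta-workflow split can be made concrete in a few lines: an atomic workflow is a named, well-defined function, and a meta-workflow orchestrates embedded workflows of either kind. A Python sketch with invented names (the paper's formal specification and the SHIWA repository schema are not reproduced here):

      from dataclasses import dataclass, field
      from typing import Callable, Dict, List, Union

      @dataclass
      class AtomicWorkflow:
          name: str
          run: Callable[[Dict], Dict]       # one well-defined, domain-specific function

      @dataclass
      class MetaWorkflow:
          name: str
          steps: List[Union["AtomicWorkflow", "MetaWorkflow"]] = field(default_factory=list)

          def run(self, data: Dict) -> Dict:
              for step in self.steps:       # orchestration: thread data through embedded workflows
                  data = step.run(data)
              return data

      # Example: embed two atomic steps in a meta-workflow
      opt = AtomicWorkflow("geometry_optimization", lambda d: {**d, "geometry": "optimized"})
      bench = AtomicWorkflow("benchmark", lambda d: {**d, "energy": -42.0})
      meta = MetaWorkflow("opt_then_benchmark", [opt, bench])
      print(meta.run({"molecule": "Cu2O2"}))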

  23. How Acute Total Sleep Loss Affects the Attending Brain: A Meta-Analysis of Neuroimaging Studies

    PubMed Central

    Ma, Ning; Dinges, David F.; Basner, Mathias; Rao, Hengyi

    2015-01-01

    Study Objectives: Attention is a cognitive domain that can be severely affected by sleep deprivation. Previous neuroimaging studies have used different attention paradigms and reported both increased and reduced brain activation after sleep deprivation. However, due to large variability in sleep deprivation protocols, task paradigms, experimental designs, characteristics of subject populations, and imaging techniques, there is no consensus regarding the effects of sleep loss on the attending brain. The aim of this meta-analysis was to identify brain activations that are commonly altered by acute total sleep deprivation across different attention tasks. Design: Coordinate-based meta-analysis of neuroimaging studies of performance on attention tasks during experimental sleep deprivation. Methods: The current version of the activation likelihood estimation (ALE) approach was used for meta-analysis. The authors searched published articles and identified 11 sleep deprivation neuroimaging studies using different attention tasks with a total of 185 participants, equaling 81 foci for ALE analysis. Results: The meta-analysis revealed significantly reduced brain activation in multiple regions following sleep deprivation compared to rested wakefulness, including bilateral intraparietal sulcus, bilateral insula, right prefrontal cortex, medial frontal cortex, and right parahippocampal gyrus. Increased activation was found only in bilateral thalamus after sleep deprivation compared to rested wakefulness. Conclusion: Acute total sleep deprivation decreases brain activation in the fronto-parietal attention network (prefrontal cortex and intraparietal sulcus) and in the salience network (insula and medial frontal cortex). Increased thalamic activation after sleep deprivation may reflect a complex interaction between the de-arousing effects of sleep loss and the arousing effects of task performance on thalamic activity. Citation: Ma N, Dinges DF, Basner M, Rao H. How acute total sleep loss affects the attending brain: a meta-analysis of neuroimaging studies. SLEEP 2015;38(2):233–240. PMID:25409102

  24. The neural basis of audiomotor entrainment: an ALE meta-analysis

    PubMed Central

    Chauvigné, Léa A. S.; Gitau, Kevin M.; Brown, Steven

    2014-01-01

    Synchronization of body movement to an acoustic rhythm is a major form of entrainment, such as occurs in dance. This is exemplified in experimental studies of finger tapping. Entrainment to a beat is contrasted with movement that is internally driven and is therefore self-paced. In order to examine brain areas important for entrainment to an acoustic beat, we meta-analyzed the functional neuroimaging literature on finger tapping (43 studies) using activation likelihood estimation (ALE) meta-analysis with a focus on the contrast between externally-paced and self-paced tapping. The results demonstrated a dissociation between two subcortical systems involved in timing, namely the cerebellum and the basal ganglia. Externally-paced tapping highlighted the importance of the spinocerebellum, most especially the vermis, which was not activated at all by self-paced tapping. In contrast, the basal ganglia, including the putamen and globus pallidus, were active during both types of tapping, but preferentially during self-paced tapping. These results suggest a central role for the spinocerebellum in audiomotor entrainment. We conclude with a theoretical discussion about the various forms of entrainment in humans and other animals. PMID:25324765

  25. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (graphical user interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; with these operators properly defined, it is possible to use and parameterize them as generic, re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and the integration of heterogeneous analytical tools.
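
    The algebraic operators cataloged here (sequence, conditional, iteration) correspond to ordinary combinators in a host language. A toy Python illustration with invented names, not GPIPE's API or its XML representation:

      def seq(*steps):
          """Sequence operator: run steps left to right, threading the data."""
          def run(x):
              for s in steps:
                  x = s(x)
              return x
          return run

      def cond(pred, then_step, else_step):
          """Conditional operator: branch on a predicate over the data."""
          return lambda x: then_step(x) if pred(x) else else_step(x)

      def iterate(step, times):
          """Iteration operator: apply one step a fixed number of times."""
          return seq(*([step] * times))

      # Example: a tiny pipeline over a list of sequences
      upper = lambda seqs: [s.upper() for s in seqs]
      dedup = lambda seqs: sorted(set(seqs))
      pipeline = seq(upper, cond(lambda s: len(s) > 2, dedup, lambda s: s))
      print(pipeline(["acgt", "ACGT", "ggcc"]))       # -> ['ACGT', 'GGCC']
      print(iterate(lambda n: n + 1, 3)(0))           # -> 3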

  26. How acute total sleep loss affects the attending brain: a meta-analysis of neuroimaging studies.

    PubMed

    Ma, Ning; Dinges, David F; Basner, Mathias; Rao, Hengyi

    2015-02-01

    Attention is a cognitive domain that can be severely affected by sleep deprivation. Previous neuroimaging studies have used different attention paradigms and reported both increased and reduced brain activation after sleep deprivation. However, due to large variability in sleep deprivation protocols, task paradigms, experimental designs, characteristics of subject populations, and imaging techniques, there is no consensus regarding the effects of sleep loss on the attending brain. The aim of this meta-analysis was to identify brain activations that are commonly altered by acute total sleep deprivation across different attention tasks. We performed a coordinate-based meta-analysis of neuroimaging studies of performance on attention tasks during experimental sleep deprivation, using the current version of the activation likelihood estimation (ALE) approach. We searched published articles and identified 11 sleep deprivation neuroimaging studies using different attention tasks with a total of 185 participants, equaling 81 foci for ALE analysis. The meta-analysis revealed significantly reduced brain activation in multiple regions following sleep deprivation compared to rested wakefulness, including bilateral intraparietal sulcus, bilateral insula, right prefrontal cortex, medial frontal cortex, and right parahippocampal gyrus. Increased activation was found only in bilateral thalamus after sleep deprivation compared to rested wakefulness. Acute total sleep deprivation decreases brain activation in the fronto-parietal attention network (prefrontal cortex and intraparietal sulcus) and in the salience network (insula and medial frontal cortex). Increased thalamic activation after sleep deprivation may reflect a complex interaction between the de-arousing effects of sleep loss and the arousing effects of task performance on thalamic activity.

  27. Functional magnetic resonance imaging during emotion recognition in social anxiety disorder: an activation likelihood meta-analysis

    PubMed Central

    Hattingh, Coenraad J.; Ipser, J.; Tromp, S. A.; Syal, S.; Lochner, C.; Brooks, S. J.; Stein, D. J.

    2012-01-01

    Background: Social anxiety disorder (SAD) is characterized by abnormal fear and anxiety in social situations. Functional magnetic resonance imaging (fMRI) is a brain imaging technique that can be used to demonstrate neural activation to emotionally salient stimuli. However, no attempt has yet been made to statistically collate fMRI studies of brain activation, using the activation likelihood estimation (ALE) technique, in response to emotion recognition tasks in individuals with SAD. Methods: A systematic search of fMRI studies of neural responses to socially emotive cues in SAD was undertaken. ALE meta-analysis, a voxel-based meta-analytic technique, was used to estimate the most significant activations during emotion recognition. Results: Seven studies were eligible for inclusion in the meta-analysis, constituting a total of 91 subjects with SAD and 93 healthy controls. The most significant areas of activation during emotional vs. neutral stimuli in individuals with SAD compared to controls were: bilateral amygdala, left medial temporal lobe encompassing the entorhinal cortex, left medial aspect of the inferior temporal lobe encompassing the perirhinal cortex and parahippocampus, right anterior cingulate, right globus pallidus, and the distal tip of the right postcentral gyrus. Conclusion: The results are consistent with neuroanatomic models of the role of the amygdala in fear conditioning and the importance of the limbic circuitry in mediating anxiety symptoms. PMID:23335892

  28. Implementation errors in the GingerALE software: Description and recommendations.

    PubMed

    Eickhoff, Simon B; Laird, Angela R; Fox, P Mickle; Lancaster, Jack L; Fox, Peter T

    2017-01-01

    Neuroscience imaging is a burgeoning, highly sophisticated field whose growth has been fostered by grant-funded, freely distributed software libraries that perform voxel-wise analyses in anatomically standardized three-dimensional space on multi-subject, whole-brain, primary datasets. Despite the ongoing advances made using these non-commercial computational tools, the replicability of individual studies is an acknowledged limitation. Coordinate-based meta-analysis offers a practical solution to this limitation and, consequently, plays an important role in filtering and consolidating the enormous corpus of functional and structural neuroimaging results reported in the peer-reviewed literature. In both primary-data and meta-analytic neuroimaging analyses, correction for multiple comparisons is a complex but critical step for ensuring statistical rigor. Reports of errors in multiple-comparison corrections in primary-data analyses have recently appeared. Here, we report two such errors in GingerALE, a widely used, US National Institutes of Health (NIH)-funded, freely distributed software package for coordinate-based meta-analysis. These errors have given rise to published reports with more liberal statistical inferences than were specified by the authors. The intent of this technical report is threefold. First, we inform authors who used GingerALE of these errors so that they can take appropriate actions, including re-analyses and corrective publications. Second, we seek to exemplify and promote an open approach to error management. Third, we discuss the implications of these and similar errors in a scientific environment dependent on third-party software. Hum Brain Mapp 38:7-11, 2017.
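
    The cluster-level correction at issue is easy to state in code: threshold the map, record the maximum suprathreshold cluster size in each null realization, and keep observed clusters larger than the 95th percentile of that null distribution. A toy 1-D sketch with simulated Gaussian noise (GingerALE builds its null from the foci data themselves; none of its code is reproduced here):

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)

      def max_cluster_size(stat_map, thresh):
          """Size of the largest contiguous suprathreshold cluster."""
          labels, n = ndimage.label(stat_map > thresh)
          return max((np.sum(labels == i) for i in range(1, n + 1)), default=0)

      observed = rng.normal(size=1000)
      observed[400:420] += 2.5                 # inject a 20-voxel "signal"
      thresh = 1.64                            # cluster-forming threshold (illustrative)

      # Null distribution of the maximum cluster size under pure noise
      null_max = [max_cluster_size(rng.normal(size=1000), thresh) for _ in range(2000)]
      crit = np.percentile(null_max, 95)       # FWE-corrected cluster-size cutoff

      labels, n = ndimage.label(observed > thresh)
      sig = [i for i in range(1, n + 1) if np.sum(labels == i) > crit]
      print(f"cutoff = {crit}, significant clusters = {len(sig)}")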

  29. White matter and schizophrenia: A meta-analysis of voxel-based morphometry and diffusion tensor imaging studies.

    PubMed

    Vitolo, Enrico; Tatu, Mona Karina; Pignolo, Claudia; Cauda, Franco; Costa, Tommaso; Andò, Agata; Zennaro, Alessandro

    2017-12-30

    Voxel-based morphometry (VBM) and diffusion tensor imaging (DTI) are the most widely implemented methodologies for detecting alterations of both gray and white matter (WM). However, the role of WM in mental disorders is still not well defined. We aimed to clarify the role of WM disruption in schizophrenia and to identify the most frequently involved brain networks. A systematic literature search was conducted to identify VBM and DTI studies focusing on WM alterations in patients with schizophrenia compared to control subjects. We selected studies reporting the coordinates of WM reductions and performed an anatomical likelihood estimation (ALE) analysis. Moreover, we labeled the WM bundles with an anatomical atlas and compared the VBM and DTI ALE scores of each significant WM tract. A total of 59 studies were eligible for the meta-analysis. WM alterations were reported in 31 and 34 foci with VBM and DTI methods, respectively. The WM bundles occurring most often in both VBM and DTI studies, and most involved in schizophrenia, were long projection fibers, callosal and commissural fibers, part of the motor descending fibers, and fronto-temporal-limbic pathways. The meta-analysis showed widespread WM disruption in schizophrenia involving specific cerebral circuits rather than well-defined regions.

  30. An Activation Likelihood Estimation Meta-Analysis Study of Simple Motor Movements in Older and Young Adults

    PubMed Central

    Turesky, Ted K.; Turkeltaub, Peter E.; Eden, Guinevere F.

    2016-01-01

    The functional neuroanatomy of finger movements has been characterized with neuroimaging in young adults. However, less is known about the aging motor system. Several studies have contrasted movement-related activity in older versus young adults, but there is inconsistency among their findings. To address this, we conducted an activation likelihood estimation (ALE) meta-analysis on within-group data from older adults and young adults performing regularly paced right-hand finger movement tasks in response to external stimuli. We hypothesized that older adults would show a greater likelihood of activation in right cortical motor areas (i.e., ipsilateral to the side of movement) compared to young adults. ALE maps were examined for conjunction and between-group differences. Older adults showed overlapping likelihoods of activation with young adults in left primary sensorimotor cortex (SM1), bilateral supplementary motor area, bilateral insula, left thalamus, and right anterior cerebellum. Their ALE map differed from that of the young adults in right SM1 (extending into dorsal premotor cortex), right supramarginal gyrus, medial premotor cortex, and right posterior cerebellum. The finding that older adults uniquely use ipsilateral regions for right-hand finger movements and show age-dependent modulations in regions recruited by both age groups provides a foundation by which to understand age-related motor decline and motor disorders. PMID:27799910

  31. Neuroanatomical and neurofunctional markers of social cognition in autism spectrum disorder.

    PubMed

    Patriquin, Michelle A; DeRamus, Thomas; Libero, Lauren E; Laird, Angela; Kana, Rajesh K

    2016-11-01

    Social impairments in autism spectrum disorder (ASD), a hallmark feature of its diagnosis, may underlie specific neural signatures that can aid in differentiating between those with and without ASD. To assess common and consistent patterns of differences in brain responses underlying social cognition in ASD, this study applied an activation likelihood estimation (ALE) meta-analysis to results from 50 neuroimaging studies of social cognition in children and adults with ASD. In addition, the group ALE clusters of activation obtained from this analysis were used as a social brain mask to perform surface-based cortical morphometry (SBM) on an empirical structural MRI dataset collected from 55 ASD and 60 typically developing (TD) control participants. Overall, the ALE meta-analysis revealed consistent differences in activation in the posterior superior temporal sulcus at the temporoparietal junction, middle frontal gyrus, fusiform face area (FFA), inferior frontal gyrus (IFG), amygdala, insula, and cingulate cortex between ASD and TD individuals. SBM analysis showed alterations in thickness, volume, and surface area in individuals with ASD in the STS, insula, and FFA. Increased cortical thickness in individuals with ASD was found in the IFG. The results of this study provide functional and anatomical bases of social cognition abnormalities in ASD by identifying common signatures from a large pool of neuroimaging studies. These findings provide new insights into the quest for a neuroimaging-based marker for ASD. Hum Brain Mapp 37:3957-3978, 2016.

  32. Identifying environmental sounds: a multimodal mapping study

    PubMed Central

    Tomasino, Barbara; Canderan, Cinzia; Marin, Dario; Maieron, Marta; Gremese, Michele; D'Agostini, Serena; Fabbro, Franco; Skrap, Miran

    2015-01-01

    Our environment is full of auditory events such as warnings or hazards, and their correct recognition is essential. We explored environmental sounds (ES) recognition in a series of studies. In study 1 we performed an Activation Likelihood Estimation (ALE) meta-analysis of neuroimaging experiments addressing ES processing to delineate the network of areas consistently involved in ES processing. Areas consistently activated in the ALE meta-analysis were the STG/MTG, insula/rolandic operculum, parahippocampal gyrus and inferior frontal gyrus bilaterally. Some of these areas truly reflect ES processing, whereas others are related to design choices, e.g., type of task, type of control condition, type of stimulus. In study 2 we report on 7 neurosurgical patients with lesions involving the areas found to be activated in the ALE meta-analysis. We tested their ES recognition abilities and found an impairment of ES recognition. These results indicate that deficits of ES recognition do not exclusively reflect lesions to the right or to the left hemisphere; both hemispheres are involved. The most frequently lesioned area is the hippocampus/insula/STG. We made sure that any impairment in ES recognition was not related to language problems but reflected impaired ES processing. In study 3 we carried out an fMRI study on patients (vs. healthy controls) to investigate how the areas involved in ES processing might be functionally deregulated because of a lesion. The fMRI results showed that controls activated the right IFG, the STG bilaterally and the left insula. We applied a multimodal mapping approach and found that, although the meta-analysis showed that part of the left and right STG/MTG activation during ES processing might in part be related to design choices, this area was one of the most frequently lesioned in our patients, highlighting its causal role in ES processing. We found that the ROIs we drew on the two clusters of activation found in the left and right STG overlapped with the lesions of at least 4 of the 7 patients, indicating that the lack of STG activation found for patients is related to brain damage and is crucial for explaining the ES deficit. PMID:26539096

  13. Factors Affecting Medial Temporal Lobe Engagement for Past and Future Episodic Events: An ALE Meta-Analysis of Neuroimaging Studies

    ERIC Educational Resources Information Center

    Viard, Armelle; Desgranges, Beatrice; Eustache, Francis; Piolino, Pascale

    2012-01-01

    Remembering the past and envisioning the future are at the core of one's sense of identity. Neuroimaging studies investigating the neural substrates underlying past and future episodic events have been growing in number. However, the experimental paradigms used to select and elicit episodic events vary greatly, leading to disparate results,…

  14. The neuroplastic effect of working memory training in healthy volunteers and patients with schizophrenia: Implications for cognitive rehabilitation.

    PubMed

    Li, Xu; Xiao, Ya-hui; Zhao, Qing; Leung, Ada W W; Cheung, Eric F C; Chan, Raymond C K

    2015-08-01

    We conducted an activation likelihood estimation (ALE) meta-analysis to quantitatively review the existing working memory (WM) training studies that investigated neural activation changes both in healthy individuals and patients with schizophrenia. ALE analysis of studies in healthy individuals indicates a widespread distribution of activation changes with WM training in the frontal and parietal regions, especially the dorsolateral prefrontal cortex, the medial frontal cortex and the precuneus, as well as subcortical regions such as the insula and the striatum. WM training is also accompanied by activation changes in patients with schizophrenia, mainly in the dorsolateral prefrontal cortex, the precuneus and the fusiform gyrus. Our results demonstrate that WM training is accompanied by changes in neural activation patterns in healthy individuals, which may provide the basis for understanding neuroplastic changes in patients with schizophrenia. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open-source environment in which to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open-source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
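
    Kepler itself composes Java-based actors in a graphical dataflow, but the integration idea — each tool runs only after its upstream dependency succeeds, with files as the data channel — can be sketched in a few lines of Python. Script names and arguments below are hypothetical placeholders, not the actual Kepler or AltAnalyze invocations:

        import subprocess

        # Hypothetical stages standing in for Kepler actors: Bioconductor
        # preprocessing, AltAnalyze, and an R-based comparison step.
        stages = [
            ["Rscript", "preprocess_bioconductor.R", "raw/", "normalized.txt"],
            ["python", "run_altanalyze.py", "normalized.txt", "alt_out/"],
            ["Rscript", "compare_results.R", "alt_out/", "meta_summary.txt"],
        ]

        for cmd in stages:
            # check=True aborts the chain on failure, mimicking the dataflow
            # dependencies a workflow engine enforces between actors.
            subprocess.run(cmd, check=True)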

  16. A coordinate-based ALE functional MRI meta-analysis of brain activation during verbal fluency tasks in healthy control subjects

    PubMed Central

    2014-01-01

    Background The processing of verbal fluency tasks relies on the coordinated activity of a number of brain areas, particularly in the frontal and temporal lobes of the left hemisphere. Recent studies using functional magnetic resonance imaging (fMRI) to study the neural networks subserving verbal fluency functions have yielded divergent results, especially with respect to a parcellation of the inferior frontal gyrus for phonemic and semantic verbal fluency. We conducted a coordinate-based activation likelihood estimation (ALE) meta-analysis on brain activation during the processing of phonemic and semantic verbal fluency tasks involving 28 individual studies with 490 healthy volunteers. Results For phonemic as well as for semantic verbal fluency, the most prominent clusters of brain activation were found in the left inferior/middle frontal gyrus (LIFG/LMFG) and the anterior cingulate gyrus. BA 44 was only involved in the processing of phonemic verbal fluency tasks; BA 45 and 47 were involved in the processing of both phonemic and semantic fluency tasks. Conclusions Our comparison of brain activation during the execution of either phonemic or semantic verbal fluency tasks revealed evidence for spatially different activation in BA 44, but not in other regions of the LIFG/LMFG (BA 9, 45, 47), during phonemic and semantic verbal fluency processing. PMID:24456150

  17. Neurological soft signs are not "soft" in brain structure and functional networks: evidence from ALE meta-analysis.

    PubMed

    Zhao, Qing; Li, Zhi; Huang, Jia; Yan, Chao; Dazzan, Paola; Pantelis, Christos; Cheung, Eric F C; Lui, Simon S Y; Chan, Raymond C K

    2014-05-01

    Neurological soft signs (NSS) are associated with schizophrenia and related psychotic disorders. NSS have been conventionally considered as clinical neurological signs without localized brain regions. However, recent brain imaging studies suggest that NSS are partly localizable and may be associated with deficits in specific brain areas. We conducted an activation likelihood estimation meta-analysis to quantitatively review structural and functional imaging studies that evaluated the brain correlates of NSS in patients with schizophrenia and other psychotic disorders. Six structural magnetic resonance imaging (sMRI) and 15 functional magnetic resonance imaging (fMRI) studies were included. The results from meta-analysis of the sMRI studies indicated that NSS were associated with atrophy of the precentral gyrus, the cerebellum, the inferior frontal gyrus, and the thalamus. The results from meta-analysis of the fMRI studies demonstrated that the NSS-related task was significantly associated with altered brain activation in the inferior frontal gyrus, bilateral putamen, the cerebellum, and the superior temporal gyrus. Our findings from both sMRI and fMRI meta-analyses further support the conceptualization of NSS as a manifestation of the "cerebello-thalamo-prefrontal" brain network model of schizophrenia and related psychotic disorders.

  18. Drawing and writing: An ALE meta-analysis of sensorimotor activations.

    PubMed

    Yuan, Ye; Brown, Steven

    2015-08-01

    Drawing and writing are the two major means of creating what are referred to as "images", namely visual patterns on flat surfaces. They share many sensorimotor processes related to visual guidance of hand movement, resulting in the formation of visual shapes associated with pictures and words. However, while the human capacity to draw is tens of thousands of years old, the capacity for writing is only a few thousand years old, and widespread literacy is quite recent. In order to compare the neural activations for drawing and writing, we conducted two activation likelihood estimation (ALE) meta-analyses for these two bodies of neuroimaging literature. The results showed strong overlap in the activation profiles, especially in motor areas (motor cortex, frontal eye fields, supplementary motor area, cerebellum, putamen) and several parts of the posterior parietal cortex. A distinction was found in the left posterior parietal cortex, with drawing showing a preference for a ventral region and writing a dorsal region. These results demonstrate that drawing and writing employ the same basic sensorimotor networks but that some differences exist in parietal areas involved in spatial processing. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Implicit timing activates the left inferior parietal cortex.

    PubMed

    Wiener, Martin; Turkeltaub, Peter E; Coslett, H Branch

    2010-11-01

    Coull and Nobre (2008) suggested that tasks that employ temporal cues might be divided on the basis of whether these cues are explicitly or implicitly processed. Furthermore, they suggested that implicit timing preferentially engages the left cerebral hemisphere. We tested this hypothesis by conducting a quantitative meta-analysis of eleven neuroimaging studies of implicit timing using the activation-likelihood estimation (ALE) algorithm (Turkeltaub, Eden, Jones, & Zeffiro, 2002). Our analysis revealed a single but robust cluster of activation-likelihood in the left inferior parietal cortex (supramarginal gyrus). This result is in accord with the hypothesis that the left hemisphere subserves implicit timing mechanisms. Furthermore, in conjunction with a previously reported meta-analysis of explicit timing tasks, our data support the claim that implicit and explicit timing are supported by at least partially distinct neural structures. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Specifying the core network supporting episodic simulation and episodic memory by activation likelihood estimation

    PubMed Central

    Benoit, Roland G.; Schacter, Daniel L.

    2015-01-01

    It has been suggested that the simulation of hypothetical episodes and the recollection of past episodes are supported by fundamentally the same set of brain regions. The present article specifies this core network via Activation Likelihood Estimation (ALE). Specifically, a first meta-analysis revealed joint engagement of core network regions during episodic memory and episodic simulation. These include parts of the medial surface, the hippocampus and parahippocampal cortex within the medial temporal lobes, and the lateral temporal and inferior posterior parietal cortices on the lateral surface. Both capacities also jointly recruited additional regions such as parts of the bilateral dorsolateral prefrontal cortex. All of these core regions overlapped with the default network. Moreover, it has further been suggested that episodic simulation may require a stronger engagement of some of the core network's nodes as well as the recruitment of additional brain regions supporting control functions. A second ALE meta-analysis indeed identified such regions that were consistently more strongly engaged during episodic simulation than episodic memory. These comprised the core-network clusters located in the left dorsolateral prefrontal cortex and posterior inferior parietal lobe and other structures distributed broadly across the default and fronto-parietal control networks. Together, the analyses determine the set of brain regions that allow us to experience past and hypothetical episodes, thus providing an important foundation for studying the regions' specialized contributions and interactions. PMID:26142352

  1. Modeling motor connectivity using TMS/PET and structural equation modeling

    PubMed Central

    Laird, Angela R.; Robbins, Jacob M.; Li, Karl; Price, Larry R.; Cykowski, Matthew D.; Narayana, Shalini; Laird, Robert W.; Franklin, Crystal; Fox, Peter T.

    2010-01-01

    Structural equation modeling (SEM) was applied to positron emission tomographic (PET) images acquired during transcranial magnetic stimulation (TMS) of the primary motor cortex (M1hand). TMS was applied across a range of intensities, and responses both at the stimulation site and in remotely connected brain regions covaried with stimulus intensity. Regions of interest (ROIs) were identified through an activation likelihood estimation (ALE) meta-analysis of TMS studies. That these ROIs represented the network engaged by motor planning and execution was confirmed by an ALE meta-analysis of finger movement studies. Rather than postulate connections in the form of an a priori model (confirmatory approach), effective connectivity models were developed using a model-generating strategy based on improving tentatively specified models. This strategy exploited the experimentally imposed causal relations: (1) that response variations were caused by stimulation variations, (2) that stimulation was unidirectionally applied to the M1hand region, and (3) that remote effects must be caused, either directly or indirectly, by the M1hand excitation. The path model thus derived exhibited an exceptional level of goodness of fit (χ² = 22.150, df = 38, P = 0.981, TLI = 1.0). The regions and connections derived were in good agreement with the known anatomy of the human and primate motor system. The model-generating SEM strategy thus proved highly effective and successfully identified a complex set of causal relationships of motor connectivity. PMID:18387823
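
    For readers unfamiliar with path modeling, the computation behind each arrow in such a model can be approximated by per-equation standardized regression: every endogenous region is regressed on its hypothesized causes. The sketch below, with simulated data and hypothetical region names, is a simplified stand-in for full SEM estimation (it ignores latent variables and simultaneous fitting):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200

        # Toy TMS/PET-like data: stimulus intensity drives M1, and M1 drives a
        # downstream "premotor" response (hypothetical causal ordering).
        intensity = rng.normal(size=n)
        m1 = 0.8 * intensity + rng.normal(scale=0.5, size=n)
        premotor = 0.6 * m1 + rng.normal(scale=0.5, size=n)

        def path_coefficient(y, x):
            """Standardized OLS slope: a per-equation stand-in for a SEM path.
            With a single predictor it equals the Pearson correlation."""
            return float(np.corrcoef(x, y)[0, 1])

        print("intensity -> M1:", round(path_coefficient(m1, intensity), 2))
        print("M1 -> premotor: ", round(path_coefficient(premotor, m1), 2))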

  2. Grey matter alterations in migraine: A systematic review and meta-analysis.

    PubMed

    Jia, Zhihua; Yu, Shengyuan

    2017-01-01

    To summarize and meta-analyze studies on changes in grey matter (GM) in patients with migraine, we aimed to determine whether there are concordant structural changes in the foci, whether structural changes are concordant with functional changes, and to provide further understanding of the anatomy and biology of migraine. We searched PubMed and Embase for relevant articles published between January 1985 and November 2015, and examined the references within relevant primary articles. Following exclusion of unsuitable studies, meta-analyses were performed using activation likelihood estimation (ALE). Eight clinical studies were analyzed for structural changes, containing a total of 390 subjects (191 patients and 199 controls). Five functional studies were included, containing 93 patients and 96 controls. ALE showed that migraineurs had concordant decreases in GM volume (GMV) in the bilateral inferior frontal gyri, the right precentral gyrus, the left middle frontal gyrus and the left cingulate gyrus. GMV decreases in the right claustrum, left cingulate gyrus, right anterior cingulate, amygdala and left parahippocampal gyrus were related to the estimated frequency of headache attacks. Activation was found in the somatosensory cortex, cingulate cortex, limbic lobe, basal ganglia and midbrain in migraine patients. GM changes in migraineurs may indicate the mechanism of pain processing and associated symptoms. Changes in the frontal gyri may predispose a person to pain conditions. The limbic regions may accumulate damage due to the repetitive occurrence of pain-related processes. Increased activation in the precentral gyrus and cingulate cortex, as opposed to the GMV decreases there, might suggest increased effort due to disorganization of these areas and/or the use of compensatory strategies involving pain processing in migraine. Knowledge of these structural and functional changes may be useful for monitoring disease progression as well as for therapeutic interventions.

  3. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for the 4 datasets (34 total microarrays) finished in approximately one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
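
    The entry point of such a workflow is the user-editable csv sample sheet. A minimal sketch of parsing one and grouping arrays by experimental condition follows, with a hypothetical file layout (the column names are illustrative, not MAAMD's actual schema):

        import csv

        # Hypothetical samples.csv:
        #   sample_id,geo_accession,species,group
        #   s1,GSM000001,mouse,hypoxia
        #   s2,GSM000002,mouse,control

        def load_sample_sheet(path):
            with open(path, newline="") as fh:
                return list(csv.DictReader(fh))

        def group_samples(rows, key="group"):
            groups = {}
            for row in rows:
                groups.setdefault(row[key], []).append(row["sample_id"])
            return groups

        samples = load_sample_sheet("samples.csv")
        print(group_samples(samples))  # {'hypoxia': ['s1'], 'control': ['s2']}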

  4. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis

    PubMed Central

    Morelli, Sylvia A.; Sacchet, Matthew D.; Zaki, Jamil

    2015-01-01

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially the ventromedial prefrontal cortex, VMPFC) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and preferential engagement of mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the rewards that individuals experience directly and through their social connections. PMID:25554428
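
    Conjunction and contrast are the two workhorse operations in such comparisons. Here is a toy numpy sketch using the minimum-statistic conjunction (a voxel counts only if significant in both maps) and a simple subtraction as the contrast's point estimate; in practice, ALE contrasts are assessed by permuting study labels:

        import numpy as np

        # Toy thresholded z-maps in a common space (illustrative values).
        personal = np.array([3.5, 0.0, 2.9, 4.1])
        vicarious = np.array([3.1, 2.8, 0.0, 3.6])

        # Conjunction: minimum statistic across the two analyses.
        conjunction = np.minimum(personal, vicarious)

        # Contrast: where one condition exceeds the other (point estimate only).
        contrast = personal - vicarious

        print("conjunction:", conjunction)  # nonzero only where both activate
        print("contrast:   ", contrast)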

  5. Specifying the core network supporting episodic simulation and episodic memory by activation likelihood estimation.

    PubMed

    Benoit, Roland G; Schacter, Daniel L

    2015-08-01

    It has been suggested that the simulation of hypothetical episodes and the recollection of past episodes are supported by fundamentally the same set of brain regions. The present article specifies this core network via Activation Likelihood Estimation (ALE). Specifically, a first meta-analysis revealed joint engagement of expected core-network regions during episodic memory and episodic simulation. These include parts of the medial surface, the hippocampus and parahippocampal cortex within the medial temporal lobes, and the temporal and inferior posterior parietal cortices on the lateral surface. Both capacities also jointly recruited additional regions such as parts of the bilateral dorsolateral prefrontal cortex. All of these core regions overlapped with the default network. Moreover, it has further been suggested that episodic simulation may require a stronger engagement of some of the core network's nodes as well as the recruitment of additional brain regions supporting control functions. A second ALE meta-analysis indeed identified such regions that were consistently more strongly engaged during episodic simulation than episodic memory. These comprised the core-network clusters located in the left dorsolateral prefrontal cortex and posterior inferior parietal lobe and other structures distributed broadly across the default and fronto-parietal control networks. Together, the analyses determine the set of brain regions that allow us to experience past and hypothetical episodes, thus providing an important foundation for studying the regions' specialized contributions and interactions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Task modulated brain connectivity of the amygdala: a meta-analysis of psychophysiological interactions.

    PubMed

    Di, Xin; Huang, Jia; Biswal, Bharat B

    2017-01-01

    Understanding the functional connectivity of the amygdala with other brain regions, especially task-modulated connectivity, is a critical step toward understanding the role of the amygdala in emotional processes and in the interactions between emotion and cognition. The present study performed a coordinate-based meta-analysis on studies of task-modulated connectivity of the amygdala that used psychophysiological interaction (PPI) analysis. We first analyzed 49 PPI studies on different types of tasks using activation likelihood estimation (ALE) meta-analysis. Widespread cortical and subcortical regions showed consistent task-modulated connectivity with the amygdala, including the medial frontal cortex, bilateral insula, anterior cingulate, fusiform gyrus, parahippocampal gyrus, thalamus, and basal ganglia. These regions largely overlapped with those showing coactivation with the amygdala, suggesting that these regions and the amygdala are not only activated together, but also show different levels of interaction during tasks. Further analyses of subsets of PPI studies revealed task-specific functional connectivity with the amygdala modulated by fear processing, face processing, and emotion regulation. These results suggest a dynamic modulation of connectivity upon task demands, and provide new insights into the functions of the amygdala in different affective and cognitive processes. The meta-analytic approach to PPI studies may offer a framework for the systematic examination of task-modulated connectivity.

  7. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to weaknesses in experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools for combining results across studies to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them are specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering Hardy-Weinberg testing, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
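
    The statistical core of such a workflow is inverse-variance pooling of per-study effects plus heterogeneity diagnostics. A self-contained sketch with invented odds ratios follows (the numbers are illustrative only, and MetaGenyo's own implementation may differ):

        import math

        # Toy per-study odds ratios with 95% CIs for one variant.
        studies = [(1.42, 1.05, 1.92), (1.18, 0.88, 1.58), (1.60, 1.10, 2.33)]

        log_ors, weights = [], []
        for or_, lo, hi in studies:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
            log_ors.append(math.log(or_))
            weights.append(1 / se**2)                        # inverse variance

        pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
        se_pooled = math.sqrt(1 / sum(weights))

        # Cochran's Q and I^2 quantify between-study heterogeneity.
        q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, log_ors))
        i2 = max(0.0, (q - (len(studies) - 1)) / q) * 100 if q > 0 else 0.0

        print(f"pooled OR = {math.exp(pooled):.2f}, I^2 = {i2:.0f}%")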

  8. Meta-Analysis of Functional Neuroimaging and Cognitive Control Studies in Schizophrenia: Preliminary Elucidation of a Core Dysfunctional Timing Network

    PubMed Central

    Alústiza, Irene; Radua, Joaquim; Albajes-Eizagirre, Anton; Domínguez, Manuel; Aubá, Enrique; Ortuño, Felipe

    2016-01-01

    Timing and other cognitive processes demanding cognitive control become interlinked when there is an increase in the level of difficulty or effort required. Both functions are interrelated and share neuroanatomical bases. A previous meta-analysis of neuroimaging studies found that people with schizophrenia had significantly lower activation, relative to normal controls, of most right hemisphere regions of the time circuit. This finding suggests that a pattern of disconnectivity of this circuit, particularly in the supplementary motor area, is a trait of this mental disease. We hypothesize that a dysfunctional temporal/cognitive control network underlies both cognitive and psychiatric symptoms of schizophrenia and that timing dysfunction is at the root of the cognitive deficits observed. The goal of our study was to look, in schizophrenia patients, for brain structures activated both by execution of cognitive tasks requiring increased effort and by performance of time perception tasks. We conducted a signed differential mapping (SDM) meta-analysis of functional neuroimaging studies in schizophrenia patients assessing the brain response to increasing levels of cognitive difficulty. Then, we performed a multimodal meta-analysis to identify common brain regions in the findings of that SDM meta-analysis and our previously-published activation likelihood estimate (ALE) meta-analysis of neuroimaging of time perception in schizophrenia patients. The current study supports the hypothesis that there exists an overlap between neural structures engaged by both timing tasks and non-temporal cognitive tasks of escalating difficulty in schizophrenia. The implication is that a deficit in timing can be considered as a trait marker of the schizophrenia cognitive profile. PMID:26925013

  9. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools that deal with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs, derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, obtained from the meta-propagated qualities without running the workflow [6], is then provided, together with a visualization pointing out where the workflow graph needs better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK
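
    The meta-propagation idea — estimating output quality from component metadata without executing the workflow — can be illustrated with a toy recursion over a workflow graph. The combination rule (product of input qualities) and all scores below are illustrative assumptions, not the method of [5]:

        # Toy workflow graph: each derived node lists its input nodes.
        workflow = {
            "land_cover_map": ["satellite_scene", "classifier"],
            "flood_model": ["land_cover_map", "rainfall_data"],
        }

        # Quality scores in [0, 1] read from (hypothetical) metadata records.
        quality = {"satellite_scene": 0.9, "classifier": 0.8, "rainfall_data": 0.7}

        def propagate(node):
            if node in quality:
                return quality[node]
            q = 1.0
            for parent in workflow[node]:
                q *= propagate(parent)  # degradation accumulates along the graph
            quality[node] = q
            return q

        print(f"a priori quality of flood_model: {propagate('flood_model'):.2f}")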

  10. MPA Portable: A Stand-Alone Software Package for Analyzing Metaproteome Samples on the Go.

    PubMed

    Muth, Thilo; Kohrs, Fabian; Heyer, Robert; Benndorf, Dirk; Rapp, Erdmann; Reichl, Udo; Martens, Lennart; Renard, Bernhard Y

    2018-01-02

    Metaproteomics, the mass spectrometry-based analysis of proteins from multispecies samples, faces severe challenges concerning data analysis and the interpretation of results. To overcome these shortcomings, we here introduce the MetaProteomeAnalyzer (MPA) Portable software. In contrast to the original server-based MPA application, this newly developed tool no longer requires computational expertise for installation and is now independent of any relational database system. In addition, MPA Portable now supports state-of-the-art database search engines and a convenient command line interface for high-performance data processing tasks. While search engine results can easily be combined to increase the protein identification yield, an additional two-step workflow is implemented to provide sufficient analysis resolution for further postprocessing steps, such as protein grouping as well as taxonomic and functional annotation. Our new application has been developed with a focus on intuitive usability, adherence to data standards, and adaptation to Web-based workflow platforms. The open source software package can be found at https://github.com/compomics/meta-proteome-analyzer.

  11. Localising semantic and syntactic processing in spoken and written language comprehension: an Activation Likelihood Estimation meta-analysis.

    PubMed

    Rodd, Jennifer M; Vitello, Sylvia; Woollams, Anna M; Adank, Patti

    2015-02-01

    We conducted an Activation Likelihood Estimation (ALE) meta-analysis to identify brain regions that are recruited by linguistic stimuli requiring relatively demanding semantic or syntactic processing. We included 54 functional MRI studies that explicitly varied the semantic or syntactic processing load, while holding constant demands on earlier stages of processing. We included studies that introduced a syntactic/semantic ambiguity or anomaly, used a priming manipulation that specifically reduced the load on semantic/syntactic processing, or varied the level of syntactic complexity. The results confirmed the critical role of the posterior left inferior frontal gyrus (LIFG) in semantic and syntactic processing. These results challenge models of sentence comprehension that highlight the role of the anterior LIFG in semantic processing. In addition, the results emphasise the role of the posterior (but not anterior) temporal lobe in both semantic and syntactic processing. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  12. Decoding fMRI events in sensorimotor motor network using sparse paradigm free mapping and activation likelihood estimates.

    PubMed

    Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A

    2017-11-01

    Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long durations and random interstimulus intervals. The decoding scores were considerably lower for eye movements relative to the other movement types tested. The average success rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with sensitivity ranging between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc.
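
    The decoding step amounts to scoring each candidate movement type by the overlap between its normalized meta-map and the detected event. A simplified numpy reading of that summation-overlap score follows (toy 1-D "maps"; the paper's exact normalization may differ):

        import numpy as np

        def decode_event(spfm_map, meta_maps):
            """Label an SPFM event with the best-overlapping ALE meta-map."""
            scores = {}
            for label, meta in meta_maps.items():
                norm = meta / meta.sum()                  # normalize meta-map
                scores[label] = float((norm * (spfm_map > 0)).sum())
            return max(scores, key=scores.get)

        meta_maps = {"fingers": np.array([0.0, 2.0, 3.0, 0.0]),
                     "toes":    np.array([3.0, 0.0, 0.0, 2.0])}
        event = np.array([0.0, 1.2, 0.8, 0.0])           # toy detected event
        print(decode_event(event, meta_maps))            # -> "fingers"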

  13. Decoding fMRI events in Sensorimotor Motor Network using Sparse Paradigm Free Mapping and Activation Likelihood Estimates

    PubMed Central

    Tan, Francisca M.; Caballero-Gaudes, César; Mullinger, Karen J.; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L.; Francis, Susan T.; Gowland, Penny A.

    2017-01-01

    Most fMRI studies map task-driven brain activity using a block or event-related paradigm. Sparse Paradigm Free Mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of Activation Likelihood Estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the Sensorimotor Network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing and eye blinks). We validated the framework using simultaneous Electromyography-fMRI experiments and motor tasks with short and long durations and random inter-stimulus intervals. The decoding scores were considerably lower for eye movements relative to the other movement types tested. The average success rates for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with sensitivity ranging between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this paper discusses methodological implications and improvements to increase the decoding performance. PMID:28815863

  14. Introspective Minds: Using ALE Meta-Analyses to Study Commonalities in the Neural Correlates of Emotional Processing, Social & Unconstrained Cognition

    PubMed Central

    Schilbach, Leonhard; Bzdok, Danilo; Timmermans, Bert; Fox, Peter T.; Laird, Angela R.; Vogeley, Kai; Eickhoff, Simon B.

    2012-01-01

    Previous research suggests overlap between brain regions that show task-induced deactivations and those activated during the performance of social-cognitive tasks. Here, we present results of quantitative meta-analyses of neuroimaging studies, which confirm a statistical convergence in the neural correlates of social and resting state cognition. Based on the idea that both social and unconstrained cognition might be characterized by introspective processes, which are also thought to be highly relevant for emotional experiences, a third meta-analysis was performed investigating studies on emotional processing. By using conjunction analyses across all three sets of studies, we can demonstrate significant overlap of task-related signal change in dorso-medial prefrontal and medial parietal cortex, brain regions that have, indeed, recently been linked to introspective abilities. Our findings, therefore, provide evidence for the existence of a core neural network, which shows task-related signal change during socio-emotional tasks and during resting states. PMID:22319593

  15. Customized workflow development and data modularization concepts for RNA-Sequencing and metatranscriptome experiments.

    PubMed

    Lott, Steffen C; Wolfien, Markus; Riege, Konstantin; Bagnacani, Andrea; Wolkenhauer, Olaf; Hoffmann, Steve; Hess, Wolfgang R

    2017-11-10

    RNA-Sequencing (RNA-Seq) has become a widely used approach to study quantitative and qualitative aspects of transcriptome data. The variety of RNA-Seq protocols, experimental study designs and the characteristic properties of the organisms under investigation greatly affect downstream and comparative analyses. In this review, we aim to explain the impact of structured pre-selection, classification and integration of best-performing tools within modularized data analysis workflows and ready-to-use computing infrastructures on experimental data analysis. We highlight examples of workflows and use cases for prokaryotic, eukaryotic and mixed dual RNA-Seq (meta-transcriptomics) experiments. In addition, we summarize the expertise of the laboratories participating in the project consortium "Structured Analysis and Integration of RNA-Seq experiments" (de.STAIR) and its integration with the Galaxy workbench of the RNA Bioinformatics Center (RBC). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Oxytocin and brain activity in humans: A systematic review and coordinate-based meta-analysis of functional MRI studies.

    PubMed

    Grace, Sally A; Rossell, Susan L; Heinrichs, Markus; Kordsachia, Catarina; Labuschagne, Izelle

    2018-05-24

    Oxytocin (OXT) is a neuropeptide that has a critical role in human social behaviour and cognition. Research investigating the role of OXT in functional brain changes in humans has often used task paradigms that probe socioemotional processes. Preliminary evidence suggests a central role of the amygdala in the social-cognitive effects of intranasal OXT (IN-OXT); however, inconsistencies in task design and analysis methods have led to inconclusive findings regarding a cohesive model of the neural mechanisms underlying OXT's actions. The aim of this meta-analysis was to systematically investigate these findings. A systematic search of the PubMed, PsycINFO, and Scopus databases was conducted for fMRI studies that compared IN-OXT to placebo in humans. First, we systematically reviewed functional magnetic resonance imaging (fMRI) studies of IN-OXT, including studies of healthy humans, those with clinical disorders, and studies examining resting-state fMRI (rsfMRI). Second, we employed a coordinate-based meta-analysis of the task-based neuroimaging literature using activation likelihood estimation (ALE), whereby coordinates were extracted from clusters with significant differences in IN-OXT versus placebo in healthy adults. Data were included for 39 fMRI studies that reported a total of 374 distinct foci. The meta-analysis identified task-related IN-OXT increases in activity within a cluster in the left superior temporal gyrus during tasks of emotion processing. These findings are important as they implicate regions beyond the amygdala in the neural effects of IN-OXT. The outcomes from this meta-analysis can guide a priori predictions for future OXT research, and provide an avenue for targeted treatment interventions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents' reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults, with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit a greater likelihood of activation in the insula during anticipation relative to outcome, and a greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show an increased likelihood of activation in the PCC and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals an increased likelihood of activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587

  18. What you see is what you eat: an ALE meta-analysis of the neural correlates of food viewing in children and adolescents.

    PubMed

    van Meer, Floor; van der Laan, Laura N; Adan, Roger A H; Viergever, Max A; Smeets, Paul A M

    2015-01-01

    Food cues are omnipresent and may enhance overconsumption. In the last two decades the prevalence of childhood obesity has increased dramatically all over the world, largely due to overconsumption. Understanding children's neural responses to food may help to develop better interventions for preventing or reducing overconsumption. We aimed to determine which brain regions are concurrently activated in children/adolescents in response to viewing food pictures, and how these relate to adult findings. Two activation likelihood estimation (ALE) meta-analyses were performed: one with studies in normal-weight children/adolescents (aged 8-18, 8 studies, 137 foci) and one with studies in normal-weight adults (aged 18-45, 16 studies, 178 foci). A contrast analysis was performed for children/adolescents vs. adults. In children/adolescents, the most concurrent clusters were in the left lateral orbitofrontal cortex (OFC), the bilateral fusiform gyrus, and the right superior parietal lobule. In adults, clusters in similar areas were found. Although the number of studies available for a direct statistical comparison between the groups was relatively low, there were indications that children/adolescents may not activate areas important for cognitive control. Overall, the number of studies that contributed to the significant clusters was moderate (6-75%). In summary, the brain areas most consistently activated in children/adolescents by food viewing are part of the appetitive brain network and overlap with those found in adults. However, the age range of the children studied was rather broad. This study offers important recommendations for future research: studies making a direct comparison between adults and children in a sufficiently narrow age range would further elucidate how neural responses to food cues change during development. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Perception of affective and linguistic prosody: an ALE meta-analysis of neuroimaging studies

    PubMed Central

    Belyk, Michel; Brown, Steven

    2014-01-01

    Prosody refers to the melodic and rhythmic aspects of speech. Two forms of prosody are typically distinguished: ‘affective prosody’ refers to the expression of emotion in speech, whereas ‘linguistic prosody’ relates to the intonation of sentences, including the specification of focus within sentences and stress within polysyllabic words. While these two processes are united by their use of vocal pitch modulation, they are functionally distinct. In order to examine the localization and lateralization of speech prosody in the brain, we performed two voxel-based meta-analyses of neuroimaging studies of the perception of affective and linguistic prosody. There was substantial sharing of brain activations between analyses, particularly in right-hemisphere auditory areas. However, a major point of divergence was observed in the inferior frontal gyrus: affective prosody was more likely to activate Brodmann area 47, while linguistic prosody was more likely to activate the ventral part of area 44. PMID:23934416

  20. Anti-inflammatory evaluation and characterization of leaf extract of Ananas comosus.

    PubMed

    Kargutkar, Samira; Brijesh, S

    2018-04-01

    Ananas comosus (L.) Merr (pineapple) is a tropical plant with an edible fruit. In the present study, the potential anti-inflammatory activity of A. comosus leaf extract (ALE) was studied. ALE prepared using a Soxhlet apparatus was subjected to preliminary qualitative phytochemical analysis and quantitative estimation of flavonoids and tannins. The components present in ALE were identified using liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS). Inhibitory effects of ALE on protein denaturation and proteinase activity were assessed. Its effect on the secretion of pro-inflammatory cytokines and inflammatory mediators by lipopolysaccharide-stimulated macrophages was also analyzed. Further, its anti-inflammatory activity in a carrageenan-induced inflammatory rat model was examined. The preliminary qualitative phytochemical analysis revealed the presence of flavonoids, phenols, tannins, carbohydrates, glycosides, and proteins in the extract. Total flavonoids and total tannins were 0.17 ± 0.006 mg equivalent of quercetin/g of ALE and 4.04 ± 0.56 mg equivalent of gallic acid/g of ALE, respectively. LC-MS analysis identified the presence of 4-hydroxy pelargonic acid, 3,4,5-trimethoxycinnamic acid and 4-methoxycinnamic acid, whereas GC-MS analysis identified the presence of campesterol and ethyl isoallocholate, which have previously been reported to have anti-inflammatory activity. ALE showed significant inhibition of protein denaturation and proteinase activity and also controlled the secretion of tumour necrosis factor-α, interleukin-1β and prostaglandins, as well as the generation of reactive oxygen species by activated macrophages. ALE also significantly decreased carrageenan-induced acute paw edema. The study, therefore, identified the components present in ALE that may be responsible for its anti-inflammatory activity and thus demonstrated its potential use against acute inflammatory diseases.

  1. Social cognition and the cerebellum: a meta-analysis of over 350 fMRI studies.

    PubMed

    Van Overwalle, Frank; Baetens, Kris; Mariën, Peter; Vandekerckhove, Marie

    2014-02-01

    This meta-analysis explores the role of the cerebellum in social cognition. Recent meta-analyses of neuroimaging studies since 2008 demonstrate that the cerebellum is only marginally involved in social cognition and emotionality, with a few meta-analyses pointing to an involvement in at most 54% of the individual studies. In this study, novel meta-analyses of over 350 fMRI studies, dividing up the domain of social cognition into homogeneous subdomains, confirmed this low involvement of the cerebellum in conditions that trigger the mirror network (e.g., when familiar movements of body parts are observed) and the mentalizing network (when no moving body parts or unfamiliar movements are present). There is, however, one set of mentalizing conditions that strongly involves the cerebellum in 50-100% of the individual studies: conditions in which the level of abstraction is high, such as when behaviors are described in terms of traits or permanent characteristics, in terms of groups rather than individuals, in terms of the past (episodic autobiographic memory) or the future rather than the present, or in terms of hypothetical events that may happen. An activation likelihood estimation (ALE) meta-analysis conducted in this study reveals that the cerebellum is critically implicated in social cognition and that the areas of the cerebellum which are consistently involved in social cognitive processes show extensive overlap with the areas involved in sensorimotor functioning (during mirror and self-judgment tasks) as well as in executive functioning (across all tasks). We discuss the role of the cerebellum in social cognition in general and in higher-abstraction mentalizing in particular. We also point out a number of methodological limitations of some available studies on the social brain that hamper the detection of cerebellar activity. © 2013 Elsevier Inc. All rights reserved.

  2. Meta-learning framework applied in bioinformatics inference system design.

    PubMed

    Arredondo, Tomás; Ormazábal, Wladimir

    2015-01-01

    This paper describes a meta-learner inference system development framework which is applied and tested in the implementation of bioinformatic inference systems. These inference systems are used for the systematic classification of the best candidates for inclusion in bacterial metabolic pathway maps. This meta-learner-based approach utilises a workflow in which the user provides feedback in the form of final classification decisions, which are stored in conjunction with the analysed genetic sequences for periodic inference system training. The inference systems were trained and tested with three different data sets related to the bacterial degradation of aromatic compounds. The analysis of the meta-learner-based framework involved contrasting several different optimisation methods with various parameter settings. The obtained inference systems were also contrasted with other standard classification methods, and accurate prediction capabilities were observed.

  3. Neuroimaging of Reading Intervention: A Systematic Review and Activation Likelihood Estimate Meta-Analysis

    PubMed Central

    Barquero, Laura A.; Davis, Nicole; Cutting, Laurie E.

    2014-01-01

    A growing number of studies examine instructional training and brain activity. The purpose of this paper is to review the literature regarding neuroimaging of reading intervention, with a particular focus on reading difficulties (RD). To locate relevant studies, searches of the peer-reviewed literature were conducted using electronic databases for studies from the imaging modalities of fMRI and MEG (including MSI) that explored reading intervention. Of the 96 identified studies, 22 met the inclusion criteria for descriptive analysis. A subset of these (8 fMRI experiments with post-intervention data) was subjected to activation likelihood estimate (ALE) meta-analysis to investigate differences in functional activation following reading intervention. Findings from the literature review suggest differences in functional activation of numerous brain regions associated with reading intervention, including bilateral inferior frontal, superior temporal, middle temporal, middle frontal, superior frontal, and postcentral gyri, as well as bilateral occipital cortex, inferior parietal lobules, thalami, and insulae. Findings from the meta-analysis indicate change in functional activation following reading intervention in the left thalamus, right insula/inferior frontal, left inferior frontal, right posterior cingulate, and left middle occipital gyri. Though these findings should be interpreted with caution due to the small number of studies and the disparate methodologies used, this paper is an effort to synthesize across studies and to guide future exploration of neuroimaging and reading intervention. PMID:24427278

  4. The first taste is always with the eyes: a meta-analysis on the neural correlates of processing visual food cues.

    PubMed

    van der Laan, L N; de Ridder, D T D; Viergever, M A; Smeets, P A M

    2011-03-01

    Food selection is primarily guided by the visual system. Multiple functional neuroimaging studies have examined the brain responses to visual food stimuli. However, the results of these studies are heterogeneous and there is still uncertainty about the core brain regions involved in the neural processing of viewing food pictures. The aims of the present study were to determine the concurrence in the brain regions activated in response to viewing pictures of food and to assess the modulating effects of hunger state and the food's energy content. We performed three Activation Likelihood Estimation (ALE) meta-analyses on data from healthy normal-weight subjects in which we examined: 1) the contrast between viewing food and nonfood pictures (17 studies, 189 foci), 2) the modulation by hunger state (five studies, 48 foci), and 3) the modulation by energy content (seven studies, 86 foci). The most concurrent brain regions activated in response to viewing food pictures, both in terms of ALE values and the number of contributing experiments, were the bilateral posterior fusiform gyrus, the left lateral orbitofrontal cortex (OFC) and the left middle insula. Hunger modulated the response to food pictures in the right amygdala and left lateral OFC, and energy content modulated the response in the hypothalamus/ventral striatum. Overall, the concurrence between studies was moderate: at best 41% of the experiments contributed to the clusters for the contrast between food and nonfood. Therefore, future research should further elucidate the separate effects of methodological and physiological factors on between-study variations. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers when deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available through the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics.
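
    The co-occurrence integration can be sketched in a few lines: keep taxa reported by a majority of tools and average their abundances. This is a simplified reading of the idea, not MetaMeta's actual scoring (tool names and numbers are toy data):

        from collections import Counter

        profiles = {  # taxon -> relative abundance, per (hypothetical) tool
            "tool_a": {"E. coli": 0.5, "B. subtilis": 0.3, "S. aureus": 0.2},
            "tool_b": {"E. coli": 0.6, "B. subtilis": 0.4},
            "tool_c": {"E. coli": 0.4, "S. aureus": 0.1, "P. putida": 0.5},
        }

        # Support = number of tools reporting each taxon.
        support = Counter(t for prof in profiles.values() for t in prof)

        merged = {  # keep majority-supported taxa, average their abundances
            taxon: sum(p.get(taxon, 0.0) for p in profiles.values()) / count
            for taxon, count in support.items() if count >= 2
        }
        print(merged)  # P. putida (reported by a single tool) is dropped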

  6. Are numbers grounded in a general magnitude processing system? A functional neuroimaging meta-analysis.

    PubMed

    Sokolowski, H Moriah; Fias, Wim; Bosah Ononye, Chuka; Ansari, Daniel

    2017-10-01

    It is currently debated whether numbers are processed using a number-specific system or a general magnitude processing system, also used for non-numerical magnitudes such as physical size, duration, or luminance. Activation likelihood estimation (ALE) was used to conduct the first quantitative meta-analysis of 93 empirical neuroimaging papers examining neural activation during numerical and non-numerical magnitude processing. Foci were compiled to generate probabilistic maps of activation for non-numerical magnitudes (e.g. physical size), symbolic numerical magnitudes (e.g. Arabic digits), and nonsymbolic numerical magnitudes (e.g. dot arrays). Conjunction analyses revealed overlapping activation for symbolic, nonsymbolic and non-numerical magnitudes in frontal and parietal lobes. Contrast analyses revealed specific activation in the left superior parietal lobule for symbolic numerical magnitudes. In contrast, small regions in the bilateral precuneus were specifically activated for nonsymbolic numerical magnitudes. No regions in the parietal lobes were activated for non-numerical magnitudes that were not also activated for numerical magnitudes. Therefore, numbers are processed using both a generalized magnitude system and format specific number regions. Copyright © 2017 Elsevier Ltd. All rights reserved.
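
    The conjunction and contrast logic used in this kind of meta-analysis can be illustrated with a minimum-statistic overlap on toy maps (random arrays stand in for thresholded ALE maps; the thresholds are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        symbolic, nonsymbolic = rng.random((2, 10, 10, 10))  # stand-in statistic maps

        conjunction = np.minimum(symbolic, nonsymbolic) > 0.8  # activation shared by both
        contrast = (symbolic - nonsymbolic) > 0.5              # symbolic-specific voxels
        print(conjunction.sum(), "shared voxels;", contrast.sum(), "symbolic-specific voxels")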

  7. Perception of affective and linguistic prosody: an ALE meta-analysis of neuroimaging studies.

    PubMed

    Belyk, Michel; Brown, Steven

    2014-09-01

    Prosody refers to the melodic and rhythmic aspects of speech. Two forms of prosody are typically distinguished: 'affective prosody' refers to the expression of emotion in speech, whereas 'linguistic prosody' relates to the intonation of sentences, including the specification of focus within sentences and stress within polysyllabic words. While these two processes are united by their use of vocal pitch modulation, they are functionally distinct. In order to examine the localization and lateralization of speech prosody in the brain, we performed two voxel-based meta-analyses of neuroimaging studies of the perception of affective and linguistic prosody. There was substantial sharing of brain activations between analyses, particularly in right-hemisphere auditory areas. However, a major point of divergence was observed in the inferior frontal gyrus: affective prosody was more likely to activate Brodmann area 47, while linguistic prosody was more likely to activate the ventral part of area 44. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. What do results from coordinate-based meta-analyses tell us?

    PubMed

    Albajes-Eizagirre, Anton; Radua, Joaquim

    2018-08-01

    Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and all have the same probability of containing a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods account for these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, and lower in the presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.
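
    The "spatial convergence" test described here can be made concrete with a toy Monte Carlo experiment: count the reported peaks near a location and compare against the same number of peaks scattered uniformly at random (a 1-D sketch with hypothetical numbers; real CBMA methods work in 3-D and correct across the whole brain).

        import numpy as np

        rng = np.random.default_rng(0)
        n_peaks, n_voxels = 60, 1000  # 60 reported peaks on a 1000-voxel line
        observed = rng.normal(500, 20, n_peaks).astype(int)  # peaks clustering near voxel 500

        def count_near(peaks, center, radius=25):
            return int(np.sum(np.abs(peaks - center) <= radius))

        obs = count_near(observed, center=500)
        null = [count_near(rng.integers(0, n_voxels, n_peaks), 500) for _ in range(5000)]
        p = (1 + sum(n >= obs for n in null)) / (1 + len(null))
        print(f"peaks within radius: {obs}, Monte Carlo p = {p:.4f}")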

  9. Stuttering as a trait or state - an ALE meta-analysis of neuroimaging studies.

    PubMed

    Belyk, Michel; Kraft, Shelly Jo; Brown, Steven

    2015-01-01

    Stuttering is a speech disorder characterised by repetitions, prolongations and blocks that disrupt the forward movement of speech. An earlier meta-analysis of brain imaging studies of stuttering (Brown et al., 2005) revealed a general trend towards rightward lateralization of brain activations and hyperactivity in the larynx motor cortex bilaterally. The present study sought not only to update that meta-analysis with recent work but to introduce an important distinction not present in the first study, namely the difference between 'trait' and 'state' stuttering. The analysis of trait stuttering compares people who stutter (PWS) with people who do not stutter when behaviour is controlled for, i.e., when speech is fluent in both groups. In contrast, the analysis of state stuttering examines PWS during episodes of stuttered speech compared with episodes of fluent speech. Seventeen studies were analysed using activation likelihood estimation. Trait stuttering was characterised by the well-known rightward shift in lateralization for language and speech areas. State stuttering revealed a more diverse pattern. Abnormal activation of larynx and lip motor cortex was common to the two analyses. State stuttering was associated with overactivation in the right hemisphere larynx and lip motor cortex. Trait stuttering was associated with overactivation of lip motor cortex in the right hemisphere but underactivation of larynx motor cortex in the left hemisphere. These results support a large literature highlighting laryngeal and lip involvement in the symptomatology of stuttering, and disambiguate two possible sources of activation in neuroimaging studies of persistent developmental stuttering. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  10. Neural signatures of lexical tone reading.

    PubMed

    Kwok, Veronica P Y; Wang, Tianfu; Chen, Siping; Yakpo, Kofi; Zhu, Linlin; Fox, Peter T; Tan, Li Hai

    2015-01-01

    Research on how lexical tone is neuroanatomically represented in the human brain is central to our understanding of cortical regions subserving language. Past studies have focused exclusively on tone perception in spoken language, and little is known about lexical tone processing in reading visual words and its associated brain mechanisms. In this study, we performed two experiments to identify neural substrates in Chinese tone reading. First, we used a tone judgment paradigm to investigate tone processing of visually presented Chinese characters. We found that, relative to baseline, tone perception of printed Chinese characters was mediated by strong brain activation in bilateral frontal regions, left inferior parietal lobule, left posterior middle/medial temporal gyrus, left inferior temporal region, bilateral visual systems, and cerebellum. Surprisingly, no activation was found in superior temporal regions, brain sites well known for speech tone processing. In an activation likelihood estimation (ALE) meta-analysis combining results of relevant published studies, we then examined whether the left temporal cortex activity identified in Experiment 1 was consistent with that found in previous studies of auditory lexical tone perception. ALE results showed that only the left superior temporal gyrus and putamen were critical in auditory lexical tone processing. These findings suggest that activation in the superior temporal cortex associated with lexical tone perception is modality-dependent. © 2014 Wiley Periodicals, Inc.

  11. Rostral and caudal prefrontal contribution to creativity: a meta-analysis of functional imaging data

    PubMed Central

    Gonen-Yaacovi, Gil; de Souza, Leonardo Cruz; Levy, Richard; Urbanski, Marika; Josse, Goulven; Volle, Emmanuelle

    2013-01-01

    Creativity is of central importance for human civilization, yet its neurocognitive bases are poorly understood. The aim of the present study was to integrate existing functional imaging data by using the meta-analysis approach. We reviewed 34 functional imaging studies that reported activation foci during tasks assumed to engage creative thinking in healthy adults. A coordinate-based meta-analysis using Activation Likelihood Estimation (ALE) first showed a set of predominantly left-hemispheric regions shared by the various creativity tasks examined. These regions included the caudal lateral prefrontal cortex (PFC), the medial and lateral rostral PFC, and the inferior parietal and posterior temporal cortices. Further analyses showed that tasks involving the combination of remote information (combination tasks) activated more anterior areas of the lateral PFC than tasks involving the free generation of unusual responses (unusual generation tasks), although both types of tasks shared caudal prefrontal areas. In addition, verbal and non-verbal tasks involved the same regions in the left caudal prefrontal, temporal, and parietal areas, but also distinct domain-oriented areas. Taken together, these findings suggest that several frontal and parieto-temporal regions may support cognitive processes shared by diverse creativity tasks, and that some regions may be specialized for distinct types of processes. In particular, the lateral PFC appeared to be organized along a rostro-caudal axis, with rostral regions involved in combining ideas creatively and more posterior regions involved in freely generating novel ideas. PMID:23966927

  12. Sex differences in emotional perception: Meta-analysis of divergent activation.

    PubMed

    Filkowski, Megan M; Olsen, Rachel M; Duda, Bryant; Wanger, Timothy J; Sabatinelli, Dean

    2017-02-15

    Behavioral and physiological sex differences in emotional reactivity are well documented, yet comparatively few neural differences have been identified. Here we apply quantitative activation likelihood estimation (ALE) meta-analysis across functional brain imaging studies that each reported clusters of activity differentiating men and women as they participated in emotion-evoking tasks in the visual modality. This approach requires the experimental paradigm to be balanced across the sexes, and thus may provide greater clarity than previous efforts. Results across 56 emotion-eliciting studies (n=1907) reveal distinct activation in the medial prefrontal cortex, anterior cingulate cortex, frontal pole, and mediodorsal nucleus of the thalamus in men relative to women. Women show distinct activation in bilateral amygdala, hippocampus, and regions of the dorsal midbrain including the periaqueductal gray/superior colliculus and locus coeruleus. While some clusters are consistent with prevailing perspectives on the foundations of sex differences in emotional reactivity, thalamic and brainstem regions have not previously been highlighted as sexually divergent. These data strongly support the need to include sex as a factor in functional brain imaging studies of emotion, and to extend our investigative focus beyond the cortex. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. ALE meta-analysis on facial judgments of trustworthiness and attractiveness.

    PubMed

    Bzdok, D; Langner, R; Caspers, S; Kurth, F; Habel, U; Zilles, K; Laird, A; Eickhoff, Simon B

    2011-01-01

    Faces convey a multitude of information in social interaction, among which are trustworthiness and attractiveness. Humans process and evaluate these two dimensions very quickly due to their great adaptive importance. Trustworthiness evaluation is crucial for modulating behavior toward strangers; attractiveness evaluation is a crucial factor for mate selection, possibly providing cues for reproductive success. As both dimensions rapidly guide social behavior, this study tests the hypothesis that both judgments may be subserved by overlapping brain networks. To this end, we conducted an activation likelihood estimation meta-analysis on 16 functional magnetic resonance imaging studies pertaining to facial judgments of trustworthiness and attractiveness. Across combined, individual, and conjunction analyses of those two facial judgments, we observed consistent maxima in the amygdala, which corroborates our initial hypothesis. This finding supports the contemporary paradigm shift extending the amygdala's role from dominantly processing negative emotional stimuli to processing socially relevant ones. We speculate that the amygdala filters sensory information with evolutionarily conserved relevance. Our data suggest that such a role includes not only "fight-or-flight" decisions but also social behaviors with longer-term pay-off schedules, e.g., trustworthiness and attractiveness evaluation. © Springer-Verlag 2010

  14. Aloe vera extract functionalized zinc oxide nanoparticles as nanoantibiotics against multi-drug resistant clinical bacterial isolates.

    PubMed

    Ali, Khursheed; Dwivedi, Sourabh; Azam, Ameer; Saquib, Quaiser; Al-Said, Mansour S; Alkhedhairy, Abdulaziz A; Musarrat, Javed

    2016-06-15

    ZnO nanoparticles (ZnONPs) were synthesised through a simple and efficient biogenic synthesis approach, exploiting the reducing and capping potential of Aloe barbadensis Miller (A. vera) leaf extract (ALE). ALE-capped ZnO nanoparticles (ALE-ZnONPs) were characterized using UV-Vis spectroscopy, X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDX), and transmission electron microscopy (TEM) analyses. XRD analysis gave an average ZnONP size of 15 nm. FTIR spectral analysis suggested the role of phenolic compounds, terpenoids and proteins present in ALE in the nucleation and stability of ZnONPs. Flow cytometry and atomic absorption spectrophotometry (AAS) data analyses revealed the surface binding and internalization of ZnONPs in Gram-positive (Staphylococcus aureus) and Gram-negative (Escherichia coli) cells, respectively. Significant antibacterial activity of ALE-ZnONPs was observed against extended-spectrum beta-lactamase (ESBL)-positive E. coli, Pseudomonas aeruginosa, and methicillin-resistant S. aureus (MRSA) clinical isolates, with MIC and MBC values of 2200, 2400 μg/ml and 2300, 2700 μg/ml, respectively. Substantial inhibitory effects of ALE-ZnONPs on bacterial growth kinetics, exopolysaccharides and biofilm formation unequivocally suggested their antibiotic and anti-biofilm potential. Overall, the results demonstrated a rapid, environmentally benign, cost-effective, and convenient method for ALE-ZnONPs synthesis, with possible applications as nanoantibiotics or drug carriers. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. The "handwriting brain": a meta-analysis of neuroimaging studies of motor versus orthographic processes.

    PubMed

    Planton, Samuel; Jucla, Mélanie; Roux, Franck-Emmanuel; Démonet, Jean-François

    2013-01-01

    Handwriting is a modality of language production whose cerebral substrates remain poorly known although the existence of specific regions is postulated. The description of brain damaged patients with agraphia and, more recently, several neuroimaging studies suggest the involvement of different brain regions. However, results vary with the methodological choices made and may not always discriminate between "writing-specific" and motor or linguistic processes shared with other abilities. We used the "Activation Likelihood Estimate" (ALE) meta-analytical method to identify the cerebral network of areas commonly activated during handwriting in 18 neuroimaging studies published in the literature. Included contrasts were also classified according to the control tasks used, whether non-specific motor/output-control or linguistic/input-control. These data were included in two secondary meta-analyses in order to reveal the functional role of the different areas of this network. An extensive, mainly left-hemisphere network of 12 cortical and sub-cortical areas was obtained; three of which were considered as primarily writing-specific (left superior frontal sulcus/middle frontal gyrus area, left intraparietal sulcus/superior parietal area, right cerebellum) while others related rather to non-specific motor (primary motor and sensorimotor cortex, supplementary motor area, thalamus and putamen) or linguistic processes (ventral premotor cortex, posterior/inferior temporal cortex). This meta-analysis provides a description of the cerebral network of handwriting as revealed by various types of neuroimaging experiments and confirms the crucial involvement of the left frontal and superior parietal regions. These findings provide new insights into cognitive processes involved in handwriting and their cerebral substrates. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. The handyman's brain: a neuroimaging meta-analysis describing the similarities and differences between grip type and pattern in humans.

    PubMed

    King, M; Rauch, H G; Stein, D J; Brooks, S J

    2014-11-15

    Handgrip is a ubiquitous human movement that was critical in our evolution. However, the differences in brain activity between grip type (i.e. power or precision) and pattern (i.e. dynamic or static) are not fully understood. In order to address this, we performed Activation Likelihood Estimation (ALE) analysis between grip type and grip pattern using functional magnetic resonance imaging (fMRI) data. ALE provides a probabilistic summary of the BOLD response in hundreds of subjects, which is often beyond the scope of a single fMRI experiment. We collected data from 28 functional magnetic resonance data sets, which included a total of 398 male and female subjects. Using ALE, we analyzed the BOLD response during power, precision, static and dynamic grip across a range of forces and ages in right-handed healthy individuals without physical impairment, cardiovascular or neurological dysfunction, using a variety of grip tools, feedback and experimental training. Power grip generates unique activation in the postcentral gyrus (areas 1 and 3b) and precision grip generates unique activation in the supplementary motor area (SMA, area 6) and precentral gyrus (area 4a). Dynamic handgrip generates unique activation in the precentral gyrus (area 4p) and SMA (area 6), and, of particular interest, both dynamic and static grip share activation in area 2 of the postcentral gyrus, an area implicated in the evolution of handgrip. According to effect size analysis, precision and dynamic grip generate stronger activity than power and static grip, respectively. Our study demonstrates specific differences between grip type and pattern. However, there was a large degree of overlap in the pre- and postcentral gyri, SMA and areas of the frontal-parietal-cerebellar network, which indicates that other mechanisms are potentially involved in regulating handgrip. Further, our study provides empirically based regions of interest, which can be downloaded herein, that can be used to more effectively study power grip in a range of populations and conditions. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. The Meditative Mind: A Comprehensive Meta-Analysis of MRI Studies

    PubMed Central

    2015-01-01

    Over the past decade mind and body practices, such as yoga and meditation, have raised interest in different scientific fields; in particular, the physiological mechanisms underlying the beneficial effects observed in meditators have been investigated. Neuroimaging studies have examined the effects of meditation on brain structure and function, and their findings have helped clarify the biological underpinnings of the positive effects of meditation practice and the possible integration of this technique into standard therapy. The large amount of data collected thus far allows drawing some conclusions about the neural effects of meditation practice. In the present study we used activation likelihood estimation (ALE) analysis to perform a coordinate-based meta-analysis of neuroimaging data on the effects of meditation on brain structure and function. Results indicate that meditation leads to activation in brain areas involved in processing self-relevant information, self-regulation, focused problem-solving, adaptive behavior, and interoception. Results also show that meditation practice induces functional and structural brain modifications in expert meditators, especially in areas involved in self-referential processes such as self-awareness and self-regulation. These results demonstrate that a biological substrate underlies the positive pervasive effect of meditation practice and suggest that meditation techniques could be adopted in clinical populations and for disease prevention. PMID:26146618

  18. Exposure to subliminal arousing stimuli induces robust activation in the amygdala, hippocampus, anterior cingulate, insular cortex and primary visual cortex: a systematic meta-analysis of fMRI studies.

    PubMed

    Brooks, S J; Savov, V; Allzén, E; Benedict, C; Fredriksson, R; Schiöth, H B

    2012-02-01

    Functional Magnetic Resonance Imaging (fMRI) demonstrates that the subliminal presentation of arousing stimuli can activate subcortical brain regions independently of consciousness-generating top-down cortical modulation loops. Delineating these processes may elucidate mechanisms for arousal, aberration in which may underlie some psychiatric conditions. Here we are the first to review and discuss four Activation Likelihood Estimation (ALE) meta-analyses of fMRI studies using subliminal paradigms. We find that a maximum of 9 out of 12 studies using subliminal presentation of faces contribute to activation of the amygdala, and a significantly high number of studies also report activation in the bilateral anterior cingulate, bilateral insular cortex, hippocampus and primary visual cortex. Subliminal faces are the strongest modality, whereas lexical stimuli are the weakest. Meta-analyses independent of studies using Regions of Interest (ROI) revealed no biasing effect. Core neuronal arousal in the brain, which may be at first independent of conscious processing, potentially involves a network incorporating primary visual areas, somatosensory, implicit memory and conflict monitoring regions. These data could provide candidate brain regions for the study of psychiatric disorders associated with aberrant automatic emotional processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Common and distinct networks underlying reward valence and processing stages: A meta-analysis of functional neuroimaging studies

    PubMed Central

    Liu, Xun; Hairston, Jacqueline; Schrier, Madeleine; Fan, Jin

    2011-01-01

    To better understand the reward circuitry in the human brain, we conducted activation likelihood estimation (ALE) and parametric voxel-based meta-analyses (PVM) on 142 neuroimaging studies that examined brain activation in reward-related tasks in healthy adults. We observed several core brain areas that participated in reward-related decision making, including the nucleus accumbens (NAcc), caudate, putamen, thalamus, orbitofrontal cortex (OFC), bilateral anterior insula, anterior (ACC) and posterior (PCC) cingulate cortex, as well as cognitive control regions in the inferior parietal lobule and prefrontal cortex (PFC). The NAcc was commonly activated by both positive and negative rewards across various stages of reward processing (e.g., anticipation, outcome, and evaluation). In addition, the medial OFC and PCC preferentially responded to positive rewards, whereas the ACC, bilateral anterior insula, and lateral PFC selectively responded to negative rewards. Reward anticipation activated the ACC, bilateral anterior insula, and brain stem, whereas reward outcome more significantly activated the NAcc, medial OFC, and amygdala. Neurobiological theories of reward-related decision making should therefore take distributed and interrelated representations of reward valuation and valence assessment into account. PMID:21185861

  20. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
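
    The backward-chaining idea behind BETSY can be sketched in a few lines: rules map input data types to output data types, and the planner works backwards from the requested result to the available inputs. The rules and data types below are hypothetical; this is not BETSY's actual knowledge base or engine, and the toy planner does no backtracking or cycle detection.

        RULES = [
            # (produces, requires, tool)
            ("aligned_reads", ["fastq", "reference_index"], "align"),
            ("reference_index", ["reference_fasta"], "index"),
            ("expression_matrix", ["aligned_reads"], "count"),
        ]

        def plan(goal, available, steps=None):
            """Chain backwards from a goal data type to the available inputs."""
            steps = [] if steps is None else steps
            if goal in available:
                return steps
            for produces, requires, tool in RULES:
                if produces != goal:
                    continue
                if all(plan(req, available, steps) is not None for req in requires):
                    steps.append(tool)
                    return steps
            return None  # the goal cannot be derived from the available data

        print(plan("expression_matrix", {"fastq", "reference_fasta"}))
        # -> ['index', 'align', 'count']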

  2. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies.

    PubMed

    Santos, Sara; Almeida, Inês; Oliveiros, Bárbara; Castelo-Branco, Miguel

    2016-01-01

    Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. This review aimed to assess the role of the amygdala in the processing of facial trustworthiness by analyzing the polarity of its amplitude BOLD response to untrustworthy versus trustworthy facial signals in fMRI tasks, through a meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Articles were retrieved from MEDLINE, ScienceDirect and Web-of-Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in English language using the search string "(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI" was conducted. The MA concerned amygdala responses to facial trustworthiness for the contrast untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when at the single-study level they did not survive correction for multiple comparisons or provided non-significant results. ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was finally used to get further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Twenty fMRI studies were considered for systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p < .0001; I² = 95.86%, 95% CI 94.20% to 97.05%]. Random-effects analysis [RE(183) = 0.851, 95% CI .422 to .969] supported the evidence that the (right) amygdala responds preferentially to untrustworthy faces. Moreover, two ALE analyses performed with 6 articles (7 studies) identified the amygdala, insula and medial dorsal nuclei of the thalamus as structures whose response correlates negatively with trustworthiness. Six articles/studies showed that the posterior cingulate and medial frontal gyrus present positive correlations with increasing facial trustworthiness levels. In subgroup analyses based on methodological criteria, significant effects were found for experiments using spatial smoothing, categorization of trustworthiness into 2 or 3 categories, and paradigms involving both explicit and implicit tasks. Significant heterogeneity between studies was found in the MA, which might have arisen from inclusion of studies with smaller sample sizes and differences in methodological options. Studies using ROI analysis/small-volume correction methods were more often devoted specifically to the amygdala region, with some results reporting uncorrected p-values based mainly on a priori clinical evidence of amygdala involvement in these processes. Nevertheless, we did not find significant evidence for publication bias. Our results support the role of the amygdala in facial trustworthiness judgment, emphasizing its predominant role during processing of negative social signals in (untrustworthy) faces. This systematic review suggests that little consistency exists among studies' methodology, and that larger sample sizes should be preferred.
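
    The heterogeneity statistics quoted above (Cochran's Q and I²) follow directly from inverse-variance weighting; a sketch with hypothetical per-study effect sizes and variances:

        import numpy as np

        effects = np.array([0.9, 1.1, 0.2, 1.4, 0.7])        # hypothetical study effects
        variances = np.array([0.04, 0.05, 0.03, 0.06, 0.04])

        w = 1.0 / variances                                  # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)             # fixed-effect pooled estimate
        Q = np.sum(w * (effects - pooled) ** 2)              # Cochran's Q
        df = len(effects) - 1
        I2 = max(0.0, (Q - df) / Q) * 100                    # % variability from heterogeneity
        print(f"pooled = {pooled:.3f}, Q({df}) = {Q:.2f}, I^2 = {I2:.1f}%")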

  3. Metaworkflows and Workflow Interoperability for Heliophysics

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; such service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts: that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They implement Science Cases (the definition of a scientific challenge) by composing different Basic Workflows. The third and last layer, Iterative Science Workflows, is developed in WS-PGRADE. It executes sub-workflows (either Basic or Science Workflows) as parameter sweep jobs to investigate Science Cases on large multiple data sets. So far, this approach has proven fruitful for three Science Cases, of which one has been completed and two are still being tested.
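
    The layering described above (basic workflows composed into science workflows, swept over datasets by an iterative layer) can be mimicked in a few lines of Python. All names and services here are hypothetical stand-ins; the actual platform uses TAVERNA and WS-PGRADE engines.

        from concurrent.futures import ThreadPoolExecutor

        def basic_workflow(event_id):
            # Stand-in for a single-service step, e.g. querying a CME catalogue.
            return {"event": event_id, "speed_km_s": 400 + 10 * event_id}

        def science_workflow(event_id):
            # Composes basic workflows to answer one science case.
            record = basic_workflow(event_id)
            record["fast"] = record["speed_km_s"] > 450
            return record

        # Iterative layer: run the science workflow as a parameter sweep.
        with ThreadPoolExecutor(max_workers=4) as pool:
            results = list(pool.map(science_workflow, range(10)))
        print([r["event"] for r in results if r["fast"]])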

  4. Blast Fragmentation Modeling and Analysis

    DTIC Science & Technology

    2010-10-31

    weapons device containing a multiphase blast explosive (MBX). 1. INTRODUCTION The ARL Survivability Lethality and Analysis Directorate (SLAD) is...velocity. In order to simulate the highly complex phenomenon, the exploding cylinder is modeled with the hydrodynamics code ALE3D, an arbitrary Lagrangian-Eulerian multiphysics code developed at Lawrence Livermore National Laboratory. ALE3D includes physical properties, constitutive models for

  5. Neural foundation of human moral reasoning: an ALE meta-analysis about the role of personal perspective.

    PubMed

    Boccia, M; Dacquino, C; Piccardi, L; Cordellieri, P; Guariglia, C; Ferlazzo, F; Ferracuti, S; Giannini, A M

    2017-02-01

    Moral sense is defined as a feeling of the rightness or wrongness of an action that knowingly causes harm to people other than the agent. The large amount of data collected over the past decade allows drawing some definite conclusions about the neurobiological foundations of moral reasoning, as well as a systematic investigation of methodological variables in fMRI studies. Here, we verified the existence of converging and consistent evidence in the current literature by means of a meta-analysis of fMRI studies of moral reasoning, using activation likelihood estimation meta-analysis. We also tested for a possible neural segregation as a function of the perspective used during moral reasoning, i.e., first-person (1PP) or third-person (3PP) perspectives. Results demonstrate the existence of a wide network of areas underpinning moral reasoning, including orbitofrontal cortex, insula, amygdala, anterior cingulate cortex as well as precuneus and posterior cingulate cortex. Within this network we found a neural segregation as a function of the personal perspective, with 1PP eliciting higher activation in the bilateral insula and superior temporal gyrus as well as in the anterior cingulate cortex, lingual and fusiform gyri, middle temporal gyrus and precentral gyrus in the left hemisphere, and 3PP eliciting higher activation in the bilateral amygdala, the posterior cingulate cortex, insula and supramarginal gyrus in the left hemisphere as well as the medial and ventromedial prefrontal cortex in the right hemisphere. These results shed some more light on the contribution of these areas to moral reasoning, strongly supporting a functional specialization as a function of the perspective used during moral reasoning.

  6. A quantitative meta-analysis and review of motor learning in the human brain

    PubMed Central

    Hardwick, Robert M.; Rottschy, Claudia; Miall, R. Chris; Eickhoff, Simon B.

    2013-01-01

    Neuroimaging studies have improved our understanding of which brain structures are involved in motor learning. Despite this, questions remain regarding the areas that contribute consistently across paradigms with different task demands. For instance, sensorimotor tasks focus on learning novel movement kinematics and dynamics, while serial response time task (SRTT) variants focus on sequence learning. These differing task demands are likely to elicit quantifiably different patterns of neural activity on top of a potentially consistent core network. The current study identified consistent activations across 70 motor learning experiments using activation likelihood estimation (ALE) meta-analysis. A global analysis of all tasks revealed a bilateral cortical–subcortical network consistently underlying motor learning across tasks. Converging activations were revealed in the dorsal premotor cortex, supplementary motor cortex, primary motor cortex, primary somatosensory cortex, superior parietal lobule, thalamus, putamen and cerebellum. These activations were broadly consistent across task specific analyses that separated sensorimotor tasks and SRTT variants. Contrast analysis indicated that activity in the basal ganglia and cerebellum was significantly stronger for sensorimotor tasks, while activity in cortical structures and the thalamus was significantly stronger for SRTT variants. Additional conjunction analyses then indicated that the left dorsal premotor cortex was activated across all analyses considered, even when controlling for potential motor confounds. The highly consistent activation of the left dorsal premotor cortex suggests it is a critical node in the motor learning network. PMID:23194819

  7. Event-related fMRI studies of episodic encoding and retrieval: meta-analyses using activation likelihood estimation.

    PubMed

    Spaniol, Julia; Davidson, Patrick S R; Kim, Alice S N; Han, Hua; Moscovitch, Morris; Grady, Cheryl L

    2009-07-01

    The recent surge in event-related fMRI studies of episodic memory has generated a wealth of information about the neural correlates of encoding and retrieval processes. However, interpretation of individual studies is hampered by methodological differences, and by the fact that sample sizes are typically small. We submitted results from studies of episodic memory in healthy young adults, published between 1998 and 2007, to a voxel-wise quantitative meta-analysis using activation likelihood estimation [Laird, A. R., McMillan, K. M., Lancaster, J. L., Kochunov, P., Turkeltaub, P. E., & Pardo, J. V., et al. (2005). A comparison of label-based review and ALE meta-analysis in the stroop task. Human Brain Mapping, 25, 6-21]. We conducted separate meta-analyses for four contrasts of interest: episodic encoding success as measured in the subsequent-memory paradigm (subsequent Hit vs. Miss), episodic retrieval success (Hit vs. Correct Rejection), objective recollection (e.g., Source Hit vs. Item Hit), and subjective recollection (e.g., Remember vs. Know). Concordance maps revealed significant cross-study overlap for each contrast. In each case, the left hemisphere showed greater concordance than the right hemisphere. Both encoding and retrieval success were associated with activation in medial-temporal, prefrontal, and parietal regions. Left ventrolateral prefrontal cortex (PFC) and medial-temporal regions were more strongly involved in encoding, whereas left superior parietal and dorsolateral and anterior PFC regions were more strongly involved in retrieval. Objective recollection was associated with activation in multiple PFC regions, as well as multiple posterior parietal and medial-temporal areas, but not hippocampus. Subjective recollection, in contrast, showed left hippocampal involvement. In summary, these results identify broadly consistent activation patterns associated with episodic encoding and retrieval, and subjective and objective recollection, but also subtle differences among these processes.

  8. DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.

    PubMed

    Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G

    2013-07-01

    MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the on-line web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.

  9. DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows

    PubMed Central

    Paraskevopoulou, Maria D.; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S.; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A.G.

    2013-01-01

    MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA–gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the on-line web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines. PMID:23680784
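
    The multi-step pipeline mentioned above (score-thresholded target prediction followed by pathway enrichment) reduces to a filter plus a hypergeometric test. A self-contained sketch with hypothetical gene sets and scores, not the server's actual API:

        from scipy.stats import hypergeom

        predictions = {"GENE1": 0.95, "GENE2": 0.70, "GENE3": 0.88, "GENE4": 0.40}
        targets = {g for g, score in predictions.items() if score >= 0.8}

        pathway = {"GENE1", "GENE3", "GENE7"}
        universe = 20000                       # genome-wide background size
        overlap = len(targets & pathway)

        # P(overlap >= observed) when len(targets) genes are drawn at random.
        p = hypergeom.sf(overlap - 1, universe, len(pathway), len(targets))
        print(f"overlap = {overlap}, enrichment p = {p:.3g}")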

  10. Characterization and functional analysis of the MAL and MPH Loci for maltose utilization in some ale and lager yeast strains.

    PubMed

    Vidgren, Virve; Ruohonen, Laura; Londesborough, John

    2005-12-01

    Maltose and maltotriose are the major sugars in brewer's wort. Brewer's yeasts contain multiple genes for maltose transporters. It is not known which of these express functional transporters. We correlated maltose transport kinetics with the genotypes of some ale and lager yeasts. Maltose transport by two ale strains was strongly inhibited by other alpha-glucosides, suggesting the use of broad substrate specificity transporters, such as Agt1p. Maltose transport by three lager strains was weakly inhibited by other alpha-glucosides, suggesting the use of narrow substrate specificity transporters. Hybridization studies showed that all five strains contained complete MAL1, MAL2, MAL3, and MAL4 loci, except for one ale strain, which lacked a MAL2 locus. All five strains also contained both AGT1 (coding a broad specificity alpha-glucoside transporter) and MAL11 alleles. MPH genes (maltose permease homologues) were present in the lager but not in the ale strains. During growth on maltose, the lager strains expressed AGT1 at low levels and MALx1 genes at high levels, whereas the ale strains expressed AGT1 at high levels and MALx1 genes at low levels. MPHx expression was negligible in all strains. The AGT1 sequences from the ale strains encoded full-length (616 amino acid) polypeptides, but those from both sequenced lager strains encoded truncated (394 amino acid) polypeptides that are unlikely to be functional transporters. Thus, despite the apparently similar genotypes of these ale and lager strains revealed by hybridization, maltose is predominantly carried by AGT1-encoded transporters in the ale strains and by MALx1-encoded transporters in the lager strains.

  11. Effects of cue focality on the neural mechanisms of prospective memory: A meta-analysis of neuroimaging studies.

    PubMed

    Cona, Giorgia; Bisiacchi, Patrizia Silvia; Sartori, Giuseppe; Scarpazza, Cristina

    2016-05-17

    Remembering to execute pre-defined intentions at the appropriate time in the future is typically referred to as Prospective Memory (PM). Studies of PM have shown that distinct cognitive processes underlie the execution of delayed intentions depending on whether the cue associated with such intentions is focal to ongoing activity processing or not (i.e., cue focality). The present activation likelihood estimation (ALE) meta-analysis revealed several differences in brain activity as a function of focality of the PM cue. The retrieval of intentions is supported mainly by the left anterior prefrontal cortex (Brodmann Area, BA 10) in nonfocal tasks, and by the cerebellum and ventral parietal regions in focal tasks. Furthermore, the precuneus showed increased activation during the maintenance phase of intentions compared to the retrieval phase in nonfocal tasks, whereas the inferior parietal lobule showed increased activation during the retrieval of intentions compared to the maintenance phase in focal tasks. Finally, the retrieval of intentions relies more on activity in the anterior cingulate cortex for nonfocal tasks, and on the posterior cingulate cortex for focal tasks. This focality-related pattern of activations suggests that prospective remembering is mediated mainly by top-down and stimulus-independent processes in nonfocal tasks, and by more automatic, bottom-up processes in focal tasks.

  12. Effects of cue focality on the neural mechanisms of prospective memory: A meta-analysis of neuroimaging studies

    PubMed Central

    Cona, Giorgia; Bisiacchi, Patrizia Silvia; Sartori, Giuseppe; Scarpazza, Cristina

    2016-01-01

    Remembering to execute pre-defined intentions at the appropriate time in the future is typically referred to as Prospective Memory (PM). Studies of PM have shown that distinct cognitive processes underlie the execution of delayed intentions depending on whether the cue associated with such intentions is focal to ongoing activity processing or not (i.e., cue focality). The present activation likelihood estimation (ALE) meta-analysis revealed several differences in brain activity as a function of focality of the PM cue. The retrieval of intentions is supported mainly by the left anterior prefrontal cortex (Brodmann Area, BA 10) in nonfocal tasks, and by the cerebellum and ventral parietal regions in focal tasks. Furthermore, the precuneus showed increased activation during the maintenance phase of intentions compared to the retrieval phase in nonfocal tasks, whereas the inferior parietal lobule showed increased activation during the retrieval of intentions compared to the maintenance phase in focal tasks. Finally, the retrieval of intentions relies more on activity in the anterior cingulate cortex for nonfocal tasks, and on the posterior cingulate cortex for focal tasks. This focality-related pattern of activations suggests that prospective remembering is mediated mainly by top-down and stimulus-independent processes in nonfocal tasks, and by more automatic, bottom-up processes in focal tasks. PMID:27185531

  13. Gray matter atrophy in narcolepsy: An activation likelihood estimation meta-analysis.

    PubMed

    Weng, Hsu-Huei; Chen, Chih-Feng; Tsai, Yuan-Hsiung; Wu, Chih-Ying; Lee, Meng; Lin, Yu-Ching; Yang, Cheng-Ta; Tsai, Ying-Huang; Yang, Chun-Yuh

    2015-12-01

    The authors reviewed the literature on the use of voxel-based morphometry (VBM) in magnetic resonance imaging (MRI) studies of narcolepsy via a meta-analysis of neuroimaging findings to identify concordant and specific structural deficits in patients with narcolepsy as compared with healthy subjects. We used PubMed to retrieve articles published between January 2000 and March 2014. The authors included all VBM research on narcolepsy and compared the findings of the studies by using gray matter volume (GMV) or gray matter concentration (GMC) to index differences in gray matter. Stereotactic data were extracted from 8 VBM studies of 149 narcoleptic patients and 162 control subjects. We applied the activation likelihood estimation (ALE) technique and found significant regional gray matter reduction in the bilateral hypothalamus, thalamus, globus pallidus, extending to the nucleus accumbens (NAcc) and anterior cingulate cortex (ACC), left mid orbital and rectal gyri (BAs 10 and 11), right inferior frontal gyrus (BA 47), and the right superior temporal gyrus (BA 41) in patients with narcolepsy. The significant gray matter deficits in narcoleptic patients occurred in the bilateral hypothalamus and frontotemporal regions, which may be related to the emotional processing abnormalities and orexin/hypocretin pathway common among populations of patients with narcolepsy. Copyright © 2015. Published by Elsevier Ltd.

  14. Do you see what I see? Optical morphology and visual capability of ‘disco’ clams (Ctenoides ales)

    PubMed Central

    Dubielzig, Richard R.; Schobert, Charles S.; Teixeira, Leandro B.; Li, Jingchun

    2017-01-01

    The ‘disco’ clam Ctenoides ales (Finlay, 1927) is a marine bivalve that has a unique, vivid flashing display that is a result of light scattering by silica nanospheres and rapid mantle movement. The eyes of C. ales were examined to determine their visual capabilities and whether the clams can see the flashing of conspecifics. Similar to the congener C. scaber, C. ales exhibits an off-response (shadow reflex) and an on-response (light reflex). In field observations, a shadow caused a significant increase in flash rate from a mean of 3.9 Hz to 4.7 Hz (P=0.0016). In laboratory trials, a looming stimulus, which increased light intensity, caused a significant increase in flash rate from a median of 1.8 Hz to 2.2 Hz (P=0.0001). Morphological analysis of the eyes of C. ales revealed coarsely-packed photoreceptors lacking sophisticated structure, resulting in visual resolution that is likely too low to detect the flashing of conspecifics. As the eyes of C. ales are incapable of perceiving conspecific flashing, it is likely that their vision is instead used to detect predators. PMID:28396488

  15. Flexible workflow sharing and execution services for e-scientists

    NASA Astrophysics Data System (ADS)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and meta-data about workflows can be stored; the database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, but runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal; other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The Portal, via third-party workflow engines, provides support for the most widely used academic workflow engines and can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows, even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, lone scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.

  16. Is there a neuroanatomical basis of the vulnerability to suicidal behavior? A coordinate-based meta-analysis of structural and functional MRI studies

    PubMed Central

    van Heeringen, Kees; Bijttebier, Stijn; Desmyter, Stefanie; Vervaet, Myriam; Baeken, Chris

    2014-01-01

    Objective: We conducted meta-analyses of functional and structural neuroimaging studies comparing adolescent and adult individuals with a history of suicidal behavior and a psychiatric disorder to psychiatric controls, in order to objectify changes in brain structure and function associated with a vulnerability to suicidal behavior. Methods: Magnetic resonance imaging studies published up to July 2013 investigating structural or functional brain correlates of suicidal behavior were identified through computerized and manual literature searches. Activation foci from 12 studies encompassing 475 individuals, i.e., 213 suicide attempters and 262 psychiatric controls, were subjected to meta-analytical study using anatomic or activation likelihood estimation (ALE). Results: Activation likelihood estimation revealed structural deficits and functional changes in association with a history of suicidal behavior. Structural findings included reduced volumes of the rectal gyrus, superior temporal gyrus and caudate nucleus. Functional differences between study groups included an increased reactivity of the anterior and posterior cingulate cortices. Discussion: A history of suicidal behavior appears to be associated with (probably interrelated) structural deficits and functional overactivation in brain areas that contribute to a decision-making network. The findings suggest that a vulnerability to suicidal behavior can be defined in terms of reduced motivational control over the intentional behavioral reaction to salient negative stimuli. PMID:25374525

  17. Meta-manager: a requirements analysis.

    PubMed

    Cook, J F; Rozenblit, J W; Chacko, A K; Martinez, R; Timboe, H L

    1999-05-01

    The digital imaging network-picture-archiving and communications system (DIN-PACS) will be implemented in ten sites within the Great Plains Regional Medical Command (GPRMC). This network of PACS and teleradiology technology over a shared T1 network has opened the door for round-the-clock radiology coverage of all sites. However, the concept of a virtual radiology environment poses new issues for military medicine, and a new workflow management system must be developed. This workflow management system will allow us to efficiently resolve these issues, including quality of care, availability, severe capitation, and quality of the workforce. The design process for this management system must employ existing technology, operate over various telecommunication networks and protocols, be independent of platform operating systems, be flexible and scalable, and involve the end user in the design process from the outset. Using the Unified Modeling Language (UML), the specifications for this new business management system were created jointly by the University of Arizona and the GPRMC. These specifications detail a management system operating in a Common Object Request Broker Architecture (CORBA) environment. In this presentation, we characterize the Meta-Manager management system, including aspects of intelligence, interfacility routing, fail-safe operations, and expected improvements in patient care and efficiency.

  18. Informing the Structure of Executive Function in Children: A Meta-Analysis of Functional Neuroimaging Data

    PubMed Central

    McKenna, Róisín; Rushe, T.; Woodcock, Kate A.

    2017-01-01

    The structure of executive function (EF) has been the focus of much debate for decades. What is more, the complexity and diversity introduced by the developmental period only add to this contention. The development of executive function plays an integral part in the expression of children's behavioral, cognitive, social, and emotional capabilities. Understanding how these processes are constructed during development allows for effective measurement of EF in this population. This meta-analysis aims to contribute to a better understanding of the structure of executive function in children. A coordinate-based meta-analysis was conducted (using BrainMap GingerALE 2.3), which incorporated studies administering functional magnetic resonance imaging (fMRI) during inhibition, switching, and working memory updating tasks in typical children (aged 6–18 years). The neural activation common across all executive tasks was compared to that shared by tasks pertaining only to inhibition, switching or updating, which are commonly considered to be fundamental executive processes. Results support the existence of partially separable but partially overlapping inhibition, switching, and updating executive processes at a neural level, in children over 6 years. Further, the shared neural activation across all tasks (associated with a proposed “unitary” component of executive function) overlapped to different degrees with the activation associated with each individual executive process. These findings provide evidence to support the suggestion that one of the most influential structural models of executive functioning in adults can also be applied to children of this age. However, the findings also call for careful consideration and measurement of both specific executive processes, and unitary executive function in this population. Furthermore, a need is highlighted for a new systematic developmental model, which captures the integrative nature of executive function in children. PMID:28439231
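
    The analysis pipeline described above (pooled peak coordinates in, thresholded ALE maps out) can also be scripted. The study itself used the BrainMap GingerALE 2.3 GUI; the sketch below shows what the equivalent steps might look like with the open-source NiMARE Python package, under the assumption that its current API matches the calls used here, and with a hypothetical input file name.

    ```python
    # A minimal scripted counterpart to the GingerALE analysis described above,
    # using the open-source NiMARE package (an assumption: the study used the
    # GingerALE GUI, not NiMARE). "ef_children.json" is a hypothetical
    # NiMARE-format file of peak coordinates, one experiment per contrast.
    from nimare.dataset import Dataset
    from nimare.meta.cbma import ALE
    from nimare.correct import FWECorrector

    dset = Dataset("ef_children.json")   # peak coordinates + sample sizes
    ale = ALE()                          # Gaussian-kernel modeled activation
    results = ale.fit(dset)

    # Cluster-level FWE correction via Monte Carlo permutations, the inference
    # approach generally recommended for ALE meta-analyses.
    corr = FWECorrector(method="montecarlo", voxel_thresh=0.001, n_iters=10000)
    corrected = corr.transform(results)
    corrected.save_maps(output_dir="ale_results", prefix="ef_children")
    ```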

  19. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of stock market price series. In this paper, we describe a workflow and tool that allow a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps for frequent quantitative and text-oriented data using an existing Apriori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An Apriori method supports the discovery of such sequential temporal patterns. Then, various text features, such as the degree of sentence nesting, noun phrase complexity, and vocabulary richness, are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differs from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence-visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains, such as data analysis for smart grids, cyber-physical systems or the security of critical infrastructure, where the data consist of a combination of quantitative and textual time series data.
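
    The first step, heuristic extraction of interesting intervals, can be sketched in a few lines. The windowed z-score rule below is an illustrative assumption, not the paper's actual heuristic or parameter choice.

    ```python
    # Toy interval-extraction heuristic: flag spans where the windowed
    # log-return deviates strongly from its mean. Window length and the
    # z threshold are illustrative assumptions.
    import numpy as np

    def interesting_intervals(prices, window=5, z=2.0):
        """Flag (start, end) spans whose windowed log-return is extreme."""
        returns = np.diff(np.log(prices))
        windowed = np.convolve(returns, np.ones(window), mode="valid")
        score = np.abs(windowed - windowed.mean()) / windowed.std()
        intervals = []
        for i in np.flatnonzero(score > z):
            start, end = int(i), int(i) + window - 1   # span in return indices
            if intervals and start <= intervals[-1][1] + 1:
                intervals[-1] = (intervals[-1][0], end)  # merge adjacent hits
            else:
                intervals.append((start, end))
        return intervals

    prices = 100 * np.cumprod(1 + np.random.default_rng(0).normal(0, 0.01, 500))
    print(interesting_intervals(prices))
    ```

    In the workflow above, each flagged interval would then be handed to the Apriori-based mining step to find co-occurring news patterns.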

  20. Effect of low-magnitude whole-body vibration combined with alendronate in ovariectomized rats: a random controlled osteoporosis prevention study.

    PubMed

    Chen, Guo-Xian; Zheng, Shuai; Qin, Shuai; Zhong, Zhao-Ming; Wu, Xiu-Hua; Huang, Zhi-Ping; Li, Wei; Ding, Ruo-Ting; Yu, Hui; Chen, Jian-Ting

    2014-01-01

    Alendronate (ALE) is a conventional drug used to treat osteoporosis. Low-magnitude whole-body vibration (WBV) exercise has been developed as a potential treatment for osteoporosis. The aim of this study was to investigate whether low-magnitude WBV could enhance the protective effect of ALE on bone properties in ovariectomized rats. A total of 128 Sprague-Dawley rats were randomly divided into five groups (SHAM, OVX+VEH, OVX+WBV, OVX+ALE, OVX+WBV+ALE). The level of WBV applied was 0.3 g at 45-55 Hz for 20 min/day, 5 days/week, for 3 months. ALE was administered at a dose of 1 mg/kg once a week. Every four weeks, eight rats from each group were sacrificed and their blood and both tibiae were harvested. The levels of osteocalcin and CTX in serum were measured by enzyme-linked immunosorbent assay (ELISA), and the tibiae were subjected to metaphyseal three-point bending and μCT analysis. Osteocalcin rose after ovariectomy and was not appreciably changed by either alendronate or WBV, alone or in combination. Alendronate treatment significantly prevented an increase in CTX; WBV alone did not alter this effect. Compared with the OVX+WBV group, nearly all tested indices, such as BV/TV, TV apparent, Tb.N, Tb.Th, and Conn.D, were higher in the OVX+ALE group at week 12. Compared with the OVX+WBV group, certain tested indices, such as BV/TV, TV apparent, Tb.N, and Conn.D, were higher in the OVX+WBV+ALE group at week 12. At week 12, tibiae treated with WBV+ALE exhibited a significantly higher Fmax compared to the OVX+VEH group, and a significant difference was also found in energy absorption between the OVX+WBV+ALE and OVX+VEH groups. Compared with WBV, ALE was more effective at preventing bone loss and improving the trabecular architecture. However, WBV enhanced the effect of alendronate in ovariectomized rats by inducing further improvements in trabecular architecture.

  1. Effect of Low-Magnitude Whole-Body Vibration Combined with Alendronate in Ovariectomized Rats: A Random Controlled Osteoporosis Prevention Study

    PubMed Central

    Zhong, Zhao-Ming; Wu, Xiu-Hua; Huang, Zhi-Ping; Li, Wei; Ding, Ruo-Ting; Yu, Hui; Chen, Jian-Ting

    2014-01-01

    Background Alendronate (ALE) is a conventional drug used to treat osteoporosis. Low-magnitude whole-body vibration (WBV) exercise has been developed as a potential treatment for osteoporosis. The aim of this study was to investigate whether low-magnitude WBV could enhance the protective effect of ALE on bone properties in ovariectomized rats. Methods A total of 128 Sprague-Dawley rats were randomly divided into five groups (SHAM, OVX+VEH, OVX+WBV, OVX+ALE, OVX+WBV+ALE). The level of WBV applied was 0.3 g at 45–55 Hz for 20 min/day, 5 days/week, for 3 months. ALE was administered at a dose of 1 mg/kg once a week. Every four weeks, eight rats from each group were sacrificed and their blood and both tibiae were harvested. The levels of osteocalcin and CTX in serum were measured by enzyme-linked immunosorbent assay (ELISA), and the tibiae were subjected to metaphyseal three-point bending and μCT analysis. Results Osteocalcin rose after ovariectomy and was not appreciably changed by either alendronate or WBV, alone or in combination. Alendronate treatment significantly prevented an increase in CTX; WBV alone did not alter this effect. Compared with the OVX+WBV group, nearly all tested indices, such as BV/TV, TV apparent, Tb.N, Tb.Th, and Conn.D, were higher in the OVX+ALE group at week 12. Compared with the OVX+WBV group, certain tested indices, such as BV/TV, TV apparent, Tb.N, and Conn.D, were higher in the OVX+WBV+ALE group at week 12. At week 12, tibiae treated with WBV+ALE exhibited a significantly higher Fmax compared to the OVX+VEH group, and a significant difference was also found in energy absorption between the OVX+WBV+ALE and OVX+VEH groups. Conclusions Compared with WBV, ALE was more effective at preventing bone loss and improving the trabecular architecture. However, WBV enhanced the effect of alendronate in ovariectomized rats by inducing further improvements in trabecular architecture. PMID:24796785

  2. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies

    PubMed Central

    Oliveiros, Bárbara

    2016-01-01

    Background Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. Objectives This review aimed to assess the role of the amygdala in the processing of facial trustworthiness by analyzing the polarity of its amplitude BOLD response to untrustworthy versus trustworthy facial signals in fMRI tasks through a meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Data sources Articles were retrieved from MEDLINE, ScienceDirect and Web of Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in the English language was conducted using the search string “(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI”. Study selection and data extraction The MA concerned amygdala responses to facial trustworthiness for the contrast untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when, at the single-study level, they did not survive correction for multiple comparisons or were non-significant. The ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was finally used to gain further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Data synthesis Twenty fMRI studies were considered for the systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p < .0001; I2 = 95.86%, 94.20% to 97.05%, with 95% confidence interval, CI]. Random effects analysis [RE(183) = 0.851, .422 to .969, 95% CI] supported the evidence that the (right) amygdala responds preferentially to untrustworthy faces. Moreover, two ALE analyses performed with 6 articles (7 studies) identified the amygdala, insula and medial dorsal nuclei of the thalamus as structures whose activity correlates negatively with trustworthiness. Six articles/studies showed that the posterior cingulate and medial frontal gyrus present positive correlations with increasing facial trustworthiness levels. Significant effects in the subgroup analysis based on methodological criteria were found for experiments using spatial smoothing, categorization of trustworthiness into 2 or 3 categories, and paradigms involving both explicit and implicit tasks. Limitations Significant heterogeneity between studies was found in the MA, which might have arisen from the inclusion of studies with smaller sample sizes and differences in methodological options. Studies using ROI analysis / small-volume correction methods were more often devoted specifically to the amygdala region, with some results reporting uncorrected p-values based mainly on a priori clinical evidence of amygdala involvement in these processes. Nevertheless, we did not find significant evidence for publication bias. Conclusions and implications of key findings Our results support the role of the amygdala in facial trustworthiness judgment, emphasizing its predominant role during the processing of negative social signals in (untrustworthy) faces. This systematic review also suggests that little consistency exists among studies’ methodologies, and that larger sample sizes should be preferred. PMID:27898705

  3. Finite Element Simulation of a Space Shuttle Solid Rocket Booster Aft Skirt Splashdown Using an Arbitrary Lagrangian-Eulerian Approach

    NASA Astrophysics Data System (ADS)

    Melis, Matthew E.

    2003-01-01

    Explicit finite element techniques employing an Arbitrary Lagrangian-Eulerian (ALE) methodology, within the transient dynamic code LS-DYNA, are used to predict splashdown loads on a proposed replacement/upgrade of the hydrazine tanks on the thrust vector control system housed within the aft skirt of a Space Shuttle Solid Rocket Booster. Two preliminary studies are performed prior to the full aft skirt analysis: An analysis of the proposed tank impacting water without supporting aft skirt structure, and an analysis of space capsule water drop tests conducted at NASA's Langley Research Center. Results from the preliminary studies provide confidence that useful predictions can be made by applying the ALE methodology to a detailed analysis of a 26-degree section of the skirt with proposed tank attached. Results for all three studies are presented and compared to limited experimental data. The challenges of using the LS-DYNA ALE capability for this type of analysis are discussed.

  4. Finite Element Simulation of a Space Shuttle Solid Rocket Booster Aft Skirt Splashdown Using an Arbitrary Lagrangian-Eulerian Approach

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.

    2003-01-01

    Explicit finite element techniques employing an Arbitrary Lagrangian-Eulerian (ALE) methodology, within the transient dynamic code LS-DYNA, are used to predict splashdown loads on a proposed replacement/upgrade of the hydrazine tanks on the thrust vector control system housed within the aft skirt of a Space Shuttle Solid Rocket Booster. Two preliminary studies are performed prior to the full aft skirt analysis: An analysis of the proposed tank impacting water without supporting aft skirt structure, and an analysis of space capsule water drop tests conducted at NASA's Langley Research Center. Results from the preliminary studies provide confidence that useful predictions can be made by applying the ALE methodology to a detailed analysis of a 26-degree section of the skirt with proposed tank attached. Results for all three studies are presented and compared to limited experimental data. The challenges of using the LS-DYNA ALE capability for this type of analysis are discussed.

  5. Active life expectancy from annual follow-up data with missing responses.

    PubMed

    Izmirlian, G; Brock, D; Ferrucci, L; Phillips, C

    2000-03-01

    Active life expectancy (ALE) at a given age is defined as the expected remaining years free of disability. In this study, three categories of health status are defined according to the ability to perform activities of daily living independently. Several studies have used increment-decrement life tables to estimate ALE, without error analysis, from only a baseline and one follow-up interview. The present work conducts an individual-level covariate analysis using a three-state Markov chain model for multiple follow-up data. Using a logistic link, the model estimates single-year transition probabilities among states of health, accounting for missing interviews. This approach has the advantages of smoothing the resulting estimates and of increased power from using all follow-ups. We compute ALE and total life expectancy from these estimated single-year transition probabilities. Variance estimates are computed using the delta method. Data from the Iowa Established Population for the Epidemiologic Study of the Elderly are used to test the effects of smoking on ALE in all 5-year age groups past 65 years, controlling for sex and education.
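
    Once the single-year transition probabilities are estimated, computing ALE and total life expectancy is a direct Markov chain iteration. The sketch below uses an illustrative, age-constant transition matrix; in the study the probabilities are age- and covariate-specific and come from the fitted logistic model.

    ```python
    # Computing active life expectancy from single-year transition
    # probabilities. The matrix below is illustrative only.
    import numpy as np

    # States: 0 = active (disability-free), 1 = disabled, 2 = dead (absorbing).
    P = np.array([
        [0.88, 0.08, 0.04],   # from active
        [0.20, 0.65, 0.15],   # from disabled
        [0.00, 0.00, 1.00],   # death is absorbing
    ])

    def life_expectancies(P, start_state=0, horizon=60):
        """Expected remaining years active (ALE) and alive (TLE)."""
        dist = np.zeros(P.shape[0])
        dist[start_state] = 1.0
        ale = tle = 0.0
        for _ in range(horizon):
            dist = dist @ P           # advance one year
            ale += dist[0]            # expected time spent active
            tle += dist[0] + dist[1]  # expected time spent alive
        return ale, tle

    ale, tle = life_expectancies(P)
    print(f"ALE = {ale:.1f} of {tle:.1f} total remaining years")
    ```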

  6. Task Management in the New ATLAS Production System

    NASA Astrophysics Data System (ADS)

    De, K.; Golubkov, D.; Klimentov, A.; Potekhin, M.; Vaniachine, A.; Atlas Collaboration

    2014-06-01

    This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top-level workflow manager, which translates physicists' needs for production-level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload has increased in volume and complexity in recent years (the ATLAS production task count is above one million, with each task containing hundreds or thousands of jobs), there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.
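
    The two-level split can be pictured with a toy sketch: a task-level component expands a request into dependent tasks, and an execution-level component translates each task into right-sized jobs. The names and numbers below are illustrative, not the actual DEFT or JEDI interfaces.

    ```python
    # Toy illustration of a two-level production system (names hypothetical):
    # task definition on one level, dynamic job generation on the other.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Task:
        name: str
        n_events: int
        depends_on: List[str]

    def define_meta_task(sample: str) -> List[Task]:
        """Task level (DEFT-like): build the dependency chain for one request."""
        return [
            Task(f"{sample}.simulate", 1_000_000, []),
            Task(f"{sample}.reconstruct", 1_000_000, [f"{sample}.simulate"]),
        ]

    def split_into_jobs(task: Task, events_per_job: int = 50_000) -> List[str]:
        """Execution level (JEDI-like): translate a task into workload jobs."""
        n_jobs = -(-task.n_events // events_per_job)   # ceiling division
        return [f"{task.name}.job{j:04d}" for j in range(n_jobs)]

    for task in define_meta_task("mc16_ttbar"):
        jobs = split_into_jobs(task)
        print(f"{task.name}: {len(jobs)} jobs, first = {jobs[0]}")
    ```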

  7. Neural correlates of conversion disorder: overview and meta-analysis of neuroimaging studies on motor conversion disorder.

    PubMed

    Boeckle, Markus; Liegl, Gregor; Jank, Robert; Pieh, Christoph

    2016-06-10

    Conversion Disorders (CD) are prevalent functional disorders. Although the pathogenesis is still not completely understood, an interaction of genetic, neurobiological, and psychosocial factors is quite likely. The aim of this study is to provide a systematic overview of imaging studies on CD and to investigate the neuronal areas involved in Motor Conversion Disorders (MCD). A systematic literature search was conducted on CD. Subsequently, a meta-analysis of functional neuroimaging studies on MCD was implemented using Activation Likelihood Estimation (ALE). We calculated differences between patients and healthy controls, as well as between affected versus unaffected sides, in addition to an overall analysis, in order to identify neuronal areas related to MCD. Patients with MCD differ from healthy controls in the amygdala, superior temporal lobe, retrosplenial area, primary motor cortex, insula, red nucleus, thalamus, and anterior as well as dorsolateral prefrontal and frontal cortex. When comparing affected versus unaffected sides, the temporal cortex, dorsal anterior cingulate cortex, supramarginal gyrus, dorsal temporal lobe, anterior insula, primary somatosensory cortex, superior frontal gyrus, and anterior prefrontal as well as frontal cortex show significant differences. These neuronal areas appear to be involved in the pathogenesis or maintenance of MCD, or to change as a consequence of it. Areas that are important for motor planning, motor selection and autonomic response seem to be especially relevant. Our results support the emotional unawareness theory but also underline the need for further imaging studies on both CD and MCD.

  8. Apocynum venetum Attenuates Acetaminophen-Induced Liver Injury in Mice.

    PubMed

    Xie, Wenyan; Chen, Chen; Jiang, Zhihui; Wang, Jian; Melzig, Matthias F; Zhang, Xiaoying

    2015-01-01

    Apocynum venetum L. (A. venetum) has long been used in oriental folk medicine for the treatment of some liver diseases; however, the underlying mechanisms remain to be fully elucidated. Acetaminophen (APAP) is a widely used analgesic drug that can cause acute liver injury in overdose situations. In this study, we investigated the potential protective effect of A. venetum leaf extract (ALE) against APAP-induced hepatotoxicity. Mice were intragastrically administered ALE once daily for 3 consecutive days prior to receiving a single intraperitoneal injection of APAP. The APAP group showed severe liver injury characterized by marked changes in the following parameters: serum aminotransferases; hepatic malondialdehyde (MDA), 3-nitrotyrosine (3-NT), superoxide dismutase (SOD), glutathione peroxidase (GPx), glutathione reductase (GR) and glutathione (GSH). This APAP-induced liver damage was significantly attenuated by ALE pretreatment. A collective analysis of histopathological examination, DNA laddering and western blots for caspase-3 and cytochrome c indicated that ALE is also capable of preventing APAP-induced hepatocyte death. Hyperoside, isoquercitrin and their derivatives were identified as the major components of ALE using HPLC-MS/MS. Taken together, A. venetum possesses hepatoprotective effects, partially due to its antioxidant action.

  9. Connectivity and functional profiling of abnormal brain structures in pedophilia

    PubMed Central

    Poeppl, Timm B.; Eickhoff, Simon B.; Fox, Peter T.; Laird, Angela R.; Rupprecht, Rainer; Langguth, Berthold; Bzdok, Danilo

    2015-01-01

    Despite its 0.5–1% lifetime prevalence in men and its general societal relevance, neuroimaging investigations in pedophilia are scarce. Preliminary findings indicate abnormal brain structure and function. However, no study has yet linked structural alterations in pedophiles to both connectional and functional properties of the aberrant hotspots. The relationship between morphological alterations and brain function in pedophilia as well as their contribution to its psychopathology thus remain unclear. First, we assessed bimodal connectivity of structurally altered candidate regions using meta-analytic connectivity modeling (MACM) and resting-state correlations employing openly accessible data. We compared the ensuing connectivity maps to the activation likelihood estimation (ALE) maps of a recent quantitative meta-analysis of brain activity during processing of sexual stimuli. Second, we functionally characterized the structurally altered regions employing meta-data of a large-scale neuroimaging database. Candidate regions were functionally connected to key areas for processing of sexual stimuli. Moreover, we found that the functional role of structurally altered brain regions in pedophilia relates to nonsexual emotional as well as neurocognitive and executive functions, previously reported to be impaired in pedophiles. Our results suggest that structural brain alterations affect neural networks for sexual processing by way of disrupted functional connectivity, which may entail abnormal sexual arousal patterns. The findings moreover indicate that structural alterations account for common affective and neurocognitive impairments in pedophilia. The present multi-modal integration of brain structure and function analyses links sexual and nonsexual psychopathology in pedophilia. PMID:25733379

  10. Connectivity and functional profiling of abnormal brain structures in pedophilia.

    PubMed

    Poeppl, Timm B; Eickhoff, Simon B; Fox, Peter T; Laird, Angela R; Rupprecht, Rainer; Langguth, Berthold; Bzdok, Danilo

    2015-06-01

    Despite its 0.5-1% lifetime prevalence in men and its general societal relevance, neuroimaging investigations in pedophilia are scarce. Preliminary findings indicate abnormal brain structure and function. However, no study has yet linked structural alterations in pedophiles to both connectional and functional properties of the aberrant hotspots. The relationship between morphological alterations and brain function in pedophilia as well as their contribution to its psychopathology thus remain unclear. First, we assessed bimodal connectivity of structurally altered candidate regions using meta-analytic connectivity modeling (MACM) and resting-state correlations employing openly accessible data. We compared the ensuing connectivity maps to the activation likelihood estimation (ALE) maps of a recent quantitative meta-analysis of brain activity during processing of sexual stimuli. Second, we functionally characterized the structurally altered regions employing meta-data of a large-scale neuroimaging database. Candidate regions were functionally connected to key areas for processing of sexual stimuli. Moreover, we found that the functional role of structurally altered brain regions in pedophilia relates to nonsexual emotional as well as neurocognitive and executive functions, previously reported to be impaired in pedophiles. Our results suggest that structural brain alterations affect neural networks for sexual processing by way of disrupted functional connectivity, which may entail abnormal sexual arousal patterns. The findings moreover indicate that structural alterations account for common affective and neurocognitive impairments in pedophilia. The present multimodal integration of brain structure and function analyses links sexual and nonsexual psychopathology in pedophilia. © 2015 Wiley Periodicals, Inc.

  11. A Model for Designing Adaptive Laboratory Evolution Experiments.

    PubMed

    LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M

    2017-04-15

    The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized fashion and can design experiments to generate greater fitness in an accelerated time frame, thereby pushing the limits of what adaptive laboratory evolution can achieve. Copyright © 2017 American Society for Microbiology.
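
    The core trade-off the paper quantifies, passage size versus the chance of carrying a new beneficial mutant through the bottleneck, can be reproduced with a toy serial-passage simulation. The sketch below is not the published ALEsim code; the population sizes and fitness advantage are illustrative assumptions, with only the mutation rate taken from the paper's derived range.

    ```python
    # Toy serial-passage ALE simulation: a beneficial mutant that arises late
    # in a batch is likely lost at a small bottleneck. Not ALEsim code.
    import numpy as np

    rng = np.random.default_rng(1)
    MU = 10 ** -7.5      # beneficial mutations per cell division, inside the
                         # 10^-6.9 to 10^-8.4 range derived in the paper
    S = 0.10             # assumed relative fitness advantage of a mutant
    N_FINAL = 1e9        # assumed cells at the end of each batch

    def passage_experiment(n_passage, n_batches=300):
        """Return the batch at which mutants exceed 50%, or -1 if never."""
        frac = 0.0
        for batch in range(n_batches):
            gens = np.log2(N_FINAL / n_passage)       # generations per batch
            # Existing mutants outgrow the ancestor during regrowth...
            if 0 < frac < 1:
                odds = frac / (1 - frac) * 2 ** (S * gens)
                frac = odds / (1 + odds)
            # ...and new mutants arise in proportion to cell divisions.
            frac += (1 - frac) * MU * (N_FINAL - n_passage) / N_FINAL
            # Bottleneck: binomial sampling can lose rare mutants entirely.
            frac = rng.binomial(int(n_passage), frac) / n_passage
            if frac > 0.5:
                return batch
        return -1

    for n in (1e5, 1e7):
        print(f"passage size {n:.0e}: mutant majority at batch {passage_experiment(n)}")
    ```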

  12. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected, falling into seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools that check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.

  13. A meta-model for computer executable dynamic clinical safety checklists.

    PubMed

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

    A safety checklist is a cognitive tool that supports the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to the patient context are increasingly being developed. However, the current hard-coded approach to implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort for informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies, in which the meta-model was mapped to specific modeling languages according to the requirements of the hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We have thus proposed a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists, and its feasibility was validated by implementing a use case in the system.
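
    What such a meta-model buys over hard-coding can be illustrated with a small sketch: checklist items carry machine-evaluable trigger and completion conditions against a patient context, so new checklists become data rather than program logic. The class and field names below are illustrative, not the concepts of the published meta-model.

    ```python
    # Illustrative dynamic-checklist structure: items declare when they apply
    # and when they are satisfied, evaluated against a patient context.
    from dataclasses import dataclass, field
    from typing import Any, Callable, Dict, List

    Context = Dict[str, Any]   # patient/workflow context, e.g. from the EHR

    @dataclass
    class ChecklistItem:
        text: str
        applies: Callable[[Context], bool] = lambda ctx: True    # trigger rule
        satisfied: Callable[[Context], bool] = lambda ctx: False # completion rule

    @dataclass
    class Checklist:
        name: str
        stage: str                      # e.g. "pre-operative"
        items: List[ChecklistItem] = field(default_factory=list)

        def render(self, ctx: Context) -> List[str]:
            """Instantiate the dynamic checklist for one concrete patient."""
            out = []
            for item in self.items:
                if item.applies(ctx):
                    mark = "x" if item.satisfied(ctx) else " "
                    out.append(f"[{mark}] {item.text}")
            return out

    cabg = Checklist("CABG peri-operative", "pre-operative", [
        ChecklistItem("Blood type and crossmatch available",
                      satisfied=lambda c: c.get("crossmatch_done", False)),
        ChecklistItem("Glycemic protocol ordered for diabetic patient",
                      applies=lambda c: c.get("diabetic", False)),
    ])
    print("\n".join(cabg.render({"diabetic": True, "crossmatch_done": True})))
    ```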

  14. Comparison of biological activities of selenium and silver nanoparticles attached with bioactive phytoconstituents: green synthesized using Spermacoce hispida extract

    NASA Astrophysics Data System (ADS)

    Vennila, Krishnan; Chitra, Loganathan; Balagurunathan, Rama; Palvannan, Thayumanavan

    2018-03-01

    Selenium and silver nanoparticles (NPs) were synthesized using Spermacoce hispida aqueous leaf extract (Sh-ALE). The optimum condition for the synthesis of Sh-SeNPs was found to be 30 mM selenious acid solution added to Sh-ALE at a ratio of 4:46, pH 9, incubated at 40 °C for 10 min. For Sh-AgNPs, the optimum condition was found to be 1 mM AgNO3 added to the Sh-ALE solution at a ratio of 4:46, pH 8, incubated at 40 °C for 10 min. SEM analysis revealed that both Sh-AgNPs and Sh-SeNPs are predominantly rod-shaped. Sh-SeNPs and Sh-AgNPs were found to possess concentration-dependent antioxidant activity. However, Sh-SeNPs showed more potent anti-inflammatory, antibacterial and anticancer activity against human cervical cancer cells in comparison to Sh-AgNPs. Phytochemical, FTIR and GC-MS analyses showed that various flavonoids, saponins and phenolic compounds present in Sh-ALE catalysed the formation of NPs. GC-MS analysis also revealed that Sh-SeNPs are capped by synaptogenin B and derivatives of apigenin, quinoline and quinazoline. The advantage of the attachment of such phytoconstituents to Sh-SeNPs for its potent biological activity in comparison to Sh-AgNPs is evident under in vitro conditions.

  15. Short term sodium alendronate administration improves the peri-implant bone quality in osteoporotic animals

    PubMed Central

    de OLIVEIRA, Danila; HASSUMI, Jaqueline Suemi; GOMES-FERREIRA, Pedro Henrique da Silva; POLO, Tárik Ocon Braga; FERREIRA, Gabriel Ramalho; FAVERANI, Leonardo Perez; OKAMOTO, Roberta

    2017-01-01

    Sodium alendronate is a bisphosphonate drug that exerts an antiresorptive action and is used to treat osteoporosis. Objective The aim of this study was to evaluate the bone repair process at the bone/implant interface of osteoporotic rats treated with sodium alendronate through microtomography, real-time polymerase chain reaction and immunohistochemistry (RUNX2 protein, bone sialoprotein (BSP), alkaline phosphatase, osteopontin and osteocalcin). Material and Methods A total of 42 rats were used, divided into the following experimental groups: CTL: control group (rats submitted to fictitious surgery and fed a balanced diet); OST: osteoporosis group (rats submitted to bilateral ovariectomy and fed a low-calcium diet); and ALE: alendronate group (rats submitted to bilateral ovariectomy, fed a low-calcium diet and treated with sodium alendronate). A surface-treated implant was installed in both tibial metaphyses of each rat. Euthanasia of the animals was conducted at 14 days (immunohistochemistry) and 42 days (immunohistochemistry, micro-CT and PCR). Data were subjected to statistical analysis at a 5% significance level. Results Bone volume (BV) and total pore volume were higher for the ALE group (P<0.05). RUNX2 and BSP were significantly more expressed in the ALE group (P<0.05) in comparison with the other groups. ALP expression was higher in the CTL group (P<0.05). Immunostaining for RUNX2 and osteopontin was positive in the osteoblastic lineage cells of neoformed bone for the CTL and ALE groups in both periods (14 and 42 days). Alkaline phosphatase presented a smaller staining area in the OST group compared to the CTL group in both periods and to the ALE group at 42 days. Conclusion There was a decrease in osteocalcin precipitation at 42 days for the ALE and OST groups. Therefore, treatment with short-term sodium alendronate improved bone repair around the implants installed in the tibiae of osteoporotic rats. PMID:28198975

  16. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept, which enables sharing and reusing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal descriptions of workflows and workflow engines, plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines, or submits workflow engines together with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages the certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) the project consortium to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching for and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios, and how the Hydrometeorology research community runs simulations on SSP.

  17. Sensor-based architecture for medical imaging workflow analysis.

    PubMed

    Silva, Luís A Bastião; Campos, Samuel; Costa, Carlos; Oliveira, José Luis

    2014-08-01

    The growing use of computer systems in medical institutions has been generating a tremendous quantity of data. While these data have a critical role in assisting physicians in clinical practice, the information that can be extracted goes far beyond this use. This article proposes a platform capable of assembling multiple data sources within a medical imaging laboratory through a network of intelligent sensors. The proposed integration framework follows a hybrid SOA architecture based on an information sensor network capable of collecting information from several sources in medical imaging laboratories. Currently, the system supports three types of sensors: DICOM repository meta-data, network workflows and examination reports. Each sensor is responsible for converting unstructured information from its data source into a common format that is then semantically indexed in the framework engine. The platform was deployed in the cardiology department of a central hospital, allowing the identification of process characteristics and user behaviours that were unknown before this solution was put to use.
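
    The sensor abstraction is the architectural core: each sensor adapts one unstructured source to a common, indexable record format. A minimal sketch of that idea follows; the class names, record fields and sample data are illustrative assumptions, not the platform's actual interfaces.

    ```python
    # Illustrative sensor-network pattern: heterogeneous sources, one record
    # format, a single indexing engine. Names and fields are hypothetical.
    from abc import ABC, abstractmethod
    from datetime import datetime
    from typing import Dict, Iterable, List

    Record = Dict[str, str]   # the common, indexable format

    class Sensor(ABC):
        @abstractmethod
        def collect(self) -> Iterable[Record]:
            """Convert source-specific data into common records."""

    class DicomMetadataSensor(Sensor):
        def __init__(self, studies: List[Dict[str, str]]) -> None:
            self.studies = studies
        def collect(self) -> Iterable[Record]:
            for study in self.studies:
                yield {"type": "dicom-meta",
                       "modality": study["Modality"],
                       "timestamp": study["StudyDate"]}

    class ReportSensor(Sensor):
        def __init__(self, reports: List[str]) -> None:
            self.reports = reports
        def collect(self) -> Iterable[Record]:
            for text in self.reports:
                yield {"type": "report", "body": text,
                       "timestamp": datetime.now().isoformat()}

    def index(sensors: Iterable[Sensor]) -> List[Record]:
        """Framework engine: merge all sensor output for semantic indexing."""
        return [rec for s in sensors for rec in s.collect()]

    records = index([
        DicomMetadataSensor([{"Modality": "US", "StudyDate": "2014-08-01"}]),
        ReportSensor(["Normal echocardiogram."]),
    ])
    print(records)
    ```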

  18. BioMaS: a modular pipeline for Bioinformatic analysis of Metagenomic AmpliconS.

    PubMed

    Fosso, Bruno; Santamaria, Monica; Marzano, Marinella; Alonso-Alemany, Daniel; Valiente, Gabriel; Donvito, Giacinto; Monaco, Alfonso; Notarangelo, Pasquale; Pesole, Graziano

    2015-07-01

    Substantial advances in microbiology, molecular evolution and biodiversity have been achieved in recent years thanks to Metagenomics, which makes it possible to unveil the composition and functions of mixed microbial communities in any environmental niche. If the investigation is aimed only at the microbiome taxonomic structure, a target-based metagenomic approach, also referred to here as Meta-barcoding, is generally applied. This approach commonly involves the selective amplification of a species-specific genetic marker (DNA meta-barcode) in the whole taxonomic range of interest and the exploration of its taxon-related variants through High-Throughput Sequencing (HTS) technologies. Access to suitable computational systems for the large-scale bioinformatic analysis of HTS data currently represents one of the major challenges in advanced Meta-barcoding projects. BioMaS (Bioinformatic analysis of Metagenomic AmpliconS) is a new bioinformatic pipeline designed to support biomolecular researchers involved in taxonomic studies of environmental microbial communities with a completely automated workflow, comprising all the fundamental steps, from raw sequence data upload and cleaning to final taxonomic identification, that are required in an appropriately designed Meta-barcoding HTS-based experiment. In its current version, BioMaS allows the analysis of both bacterial and fungal environments, starting directly from the raw sequencing data from either Roche 454 or Illumina HTS platforms, following two alternative paths, respectively. BioMaS is implemented as a public web service available at https://recasgateway.ba.infn.it/ and is also available in Galaxy at http://galaxy.cloud.ba.infn.it:8080 (only for Illumina data). BioMaS is a user-friendly pipeline for Meta-barcoding HTS data analysis specifically designed for users without particular computing skills. A comparative benchmark, carried out using a simulated dataset designed to broadly represent the currently known bacterial and fungal world, showed that BioMaS outperforms QIIME and MOTHUR in terms of the extent and accuracy of deep taxonomic sequence assignments.
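
    The value of such a pipeline lies in chaining all stages so no manual steps remain between raw reads and taxonomic assignments. The schematic sketch below illustrates that chaining only; the function bodies are trivial placeholders, not BioMaS's actual algorithms.

    ```python
    # Schematic amplicon-pipeline chaining: each stage consumes the previous
    # stage's output. Stage bodies are placeholders, not BioMaS code.
    from functools import reduce
    from typing import Callable, List

    def quality_trim(reads: List[str]) -> List[str]:
        return [r for r in reads if len(r) >= 8]              # placeholder filter

    def dereplicate(reads: List[str]) -> List[str]:
        return sorted(set(reads))                             # placeholder

    def assign_taxonomy(reads: List[str]) -> List[str]:
        return [f"{r}\tBacteria;Firmicutes" for r in reads]   # placeholder lookup

    PIPELINE: List[Callable] = [quality_trim, dereplicate, assign_taxonomy]

    def run(reads: List[str]) -> List[str]:
        """Chain every stage so the whole analysis runs without manual steps."""
        return reduce(lambda data, stage: stage(data), PIPELINE, reads)

    print(run(["ACGTACGTAA", "ACGTACGTAA", "ACGT"]))
    ```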

  19. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copps, Kevin D.

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  20. Structural basis of empathy and the domain general region in the anterior insular cortex

    PubMed Central

    Mutschler, Isabella; Reinbold, Céline; Wankerl, Johanna; Seifritz, Erich; Ball, Tonio

    2013-01-01

    Empathy is key to healthy social functioning, and individual differences in empathy have strong implications for manifold domains of social behavior. Empathy comprises emotional and cognitive components and may also be closely linked to sensorimotor processes, which accompany the motivation and behavior to respond compassionately to another person's feelings. There is growing evidence for local plastic change in the structure of the healthy adult human brain in response to environmental demands or intrinsic factors. Here we have investigated changes in brain structure resulting from, or predisposing to, empathy. Structural MRI data from 101 healthy adult females were analyzed. Empathy in fictitious as well as real-life situations was assessed using a validated self-evaluation measure. Furthermore, empathy-related structural effects were put into the context of a functional map of the anterior insular cortex (AIC) determined by an activation likelihood estimation (ALE) meta-analysis of previous functional imaging studies. We found that gray matter (GM) density in the left dorsal AIC correlates with empathy and that this area overlaps with the domain general region (DGR) of the anterior insula, which is situated in between functional systems involved in emotion-cognition, pain, and motor tasks, as determined by our meta-analysis. Thus, we propose that this insular region, where we find structural differences depending on individual empathy, may play a crucial role in modulating the efficiency of the neural integration of emotional, cognitive, and sensorimotor information, which is essential for global empathy. PMID:23675334

  1. Differentiating between self and others: an ALE meta-analysis of fMRI studies of self-recognition and theory of mind.

    PubMed

    van Veluw, Susanne J; Chance, Steven A

    2014-03-01

    The perception of self and others is a key aspect of social cognition. In order to investigate the neurobiological basis of this distinction we reviewed two classes of task that study self-awareness and awareness of others (theory of mind, ToM). A reliable task to measure self-awareness is the recognition of one's own face in contrast to the recognition of others' faces. False-belief tasks are widely used to identify neural correlates of ToM as a measure of awareness of others. We performed an activation likelihood estimation meta-analysis, using the fMRI literature on self-face recognition and false-belief tasks. The brain areas involved in performing false-belief tasks were the medial prefrontal cortex (MPFC), bilateral temporo-parietal junction, precuneus, and the bilateral middle temporal gyrus. Distinct self-face recognition regions were the right superior temporal gyrus, the right parahippocampal gyrus, the right inferior frontal gyrus/anterior cingulate cortex, and the left inferior parietal lobe. Overlapping brain areas were the superior temporal gyrus, and the more ventral parts of the MPFC. We confirmed that self-recognition in contrast to recognition of others' faces, and awareness of others involves a network that consists of separate, distinct neural pathways, but also includes overlapping regions of higher order prefrontal cortex where these processes may be combined. Insights derived from the neurobiology of disorders such as autism and schizophrenia are consistent with this notion.

  2. Neural network of cognitive emotion regulation — An ALE meta-analysis and MACM analysis

    PubMed Central

    Kohn, N.; Eickhoff, S.B.; Scheller, M.; Laird, A.R.; Fox, P.T.; Habel, U.

    2016-01-01

    Cognitive regulation of emotions is a fundamental prerequisite for intact social functioning, which impacts both well-being and psychopathology. The neural underpinnings of this process have been studied intensively in recent years, without, however, reaching a general consensus. Here we quantitatively summarize the published fMRI and PET literature on cognitive emotion regulation (23 studies/479 subjects) using activation likelihood estimation. In addition, we assessed the particular functional contributions of the identified regions and their interactions using quantitative functional inference and meta-analytic connectivity modeling, respectively. In doing so, we developed a model of the core brain network involved in the regulation of emotional reactivity. According to this model, the superior temporal gyrus, angular gyrus and (pre-)supplementary motor area should be involved in the execution of regulation initiated by frontal areas. The dorsolateral prefrontal cortex may be related to the regulation of cognitive processes such as attention, while the ventrolateral prefrontal cortex may not necessarily reflect the regulatory process per se, but rather signals salience and therefore the need to regulate. We also identified a cluster in the anterior middle cingulate cortex as a region that is anatomically and functionally in an ideal position to influence behavior and the subcortical structures related to affect generation. Hence, this area may play a central, integrative role in emotion regulation. By focusing on regions commonly active across multiple studies, this proposed model should provide important a priori information for the assessment of dysregulated emotion regulation in psychiatric disorders. PMID:24220041

  3. Speech perception in autism spectrum disorder: An activation likelihood estimation meta-analysis.

    PubMed

    Tryfon, Ana; Foster, Nicholas E V; Sharda, Megha; Hyde, Krista L

    2018-02-15

    Autism spectrum disorder (ASD) is often characterized by atypical language profiles and atypical auditory and speech processing, which can contribute to aberrant language and social communication skills in ASD. The study of the neural basis of speech perception in ASD could yield an early neurobiological marker of the disorder, but mixed results across studies render it difficult to find a reliable neural characterization of speech processing in ASD. To this end, the present study examined the functional neural basis of speech perception in ASD versus typical development (TD) using an activation likelihood estimation (ALE) meta-analysis of 18 qualifying studies. The present study included separate analyses for TD and ASD, which allowed us to examine patterns of within-group brain activation as well as both common and distinct patterns of brain activation across the ASD and TD groups. Overall, ASD and TD showed mostly common brain activation during speech processing in the bilateral superior temporal gyrus (STG) and left inferior frontal gyrus (IFG). However, the results revealed trends toward some distinct activation in the TD group, which showed additional activation in higher-order brain areas including the left superior frontal gyrus (SFG), left medial frontal gyrus (MFG), and right IFG. These results provide a more reliable neural characterization of speech processing in ASD than previous single neuroimaging studies and motivate future work to investigate how these brain signatures relate to behavioral measures of speech processing in ASD. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Neuroanatomical correlates of negative emotionality-related traits: A systematic review and meta-analysis.

    PubMed

    Mincic, Adina M

    2015-10-01

    Two central traits present in the most influential models of personality characterize the response to positive and negative emotional events, respectively. Negative emotionality (NE)-related traits are linked to vulnerability to mood and anxiety disorders; this has fuelled a special interest in examining stable differences in brain morphology associated with these traits. Structural imaging methods, including voxel-based morphometry (VBM), cortical thickness analysis and diffusion tensor imaging (DTI), have yielded inconclusive and sometimes contradictory results. This review summarizes the findings reported to date with these methods and discusses them in relation to functional imaging results. To detect topographic convergence between studies showing positive and negative grey matter associations with NE traits, respectively, activation likelihood estimation (ALE) meta-analyses of VBM studies were performed. Individuals scoring high on NE-related traits show consistent morphological differences in a left-lateralized circuit: higher grey matter volume (GMV) in the amygdala and anterior parahippocampal gyrus, and lower GMV in the orbitofrontal cortex extending into the perigenual anterior cingulate cortex. Most DTI studies indicate reduced white matter integrity in various brain regions and tracts, particularly in the uncinate fasciculus and the cingulum bundle. These results show that the behavioural phenotype associated with NE traits is reflected in structural differences within the cortico-limbic system, suggesting alterations in information processing and transmission. The results are discussed from the perspective of neuron-glia interactions. Future directions are outlined based on recent developments in structural imaging techniques. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Overview of atomic layer etching in the semiconductor industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanarik, Keren J., E-mail: keren.kanarik@lamresearch.com; Lill, Thorsten; Hudson, Eric A.

    2015-03-15

    Atomic layer etching (ALE) is a technique for removing thin layers of material using sequential reaction steps that are self-limiting. ALE has been studied in the laboratory for more than 25 years. Today, it is being driven by the semiconductor industry as an alternative to continuous etching and is viewed as an essential counterpart to atomic layer deposition. As we enter the era of atomic-scale dimensions, there is a need to unify the ALE field through more effective collaboration between academia and industry, and to help enable the transition from lab to fab. With this in mind, this article provides defining criteria for ALE, along with clarification of some of the terminology and assumptions of this field. To increase understanding of the process, the mechanistic understanding is described for the silicon ALE case study, including the advantages of plasma-assisted processing. A historical overview spanning more than 25 years is provided for silicon, as well as ALE studies on oxides, III–V compounds, and other materials. Together, these processes encompass a variety of implementations, all following the same ALE principles. While the focus is on directional etching, isotropic ALE is also included. As part of this review, the authors also address the role of power pulsing as a predecessor to ALE and examine the outlook of ALE in the manufacturing of advanced semiconductor devices.

  6. Toward Improved Fidelity of Thermal Explosion Simulations

    NASA Astrophysics Data System (ADS)

    Nichols, Albert; Becker, Richard; Burnham, Alan; Howard, W. Michael; Knap, Jarek; Wemhoff, Aaron

    2009-06-01

    We present results of an improved thermal/chemical/mechanical model of HMX-based explosives like LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The improvements were concentrated in four areas. First, we added porosity to the chemical material model framework in ALE3D used to model HMX explosive formulations, to handle the roughly 2% porosity in solid explosives. Second, we improved the HMX reaction network, which included the addition of a reactive phase change model based on work by Henson et al. Third, we added early decomposition gas species to the CHEETAH material database to improve equations of state for gaseous intermediates and products. Finally, we improved the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.
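
    Where the abstract mentions the coupled thermal/chemical model, a zero-dimensional caricature helps fix ideas: self-heating from a single Arrhenius decomposition step competing with heat exchange to the oven wall. The sketch below is only that caricature, not the multi-step HMX network or the ALE3D coupling, and every number in it is an invented placeholder.

        # 0-D thermal cook-off caricature: one Arrhenius step plus lumped heat
        # exchange. NOT the ALE3D/HMX model above; all values are illustrative.
        import numpy as np
        from scipy.integrate import solve_ivp

        A, Ea, R = 5.0e13, 1.8e5, 8.314    # 1/s, J/mol, J/(mol K)
        q_over_cp = 1400.0                 # K of adiabatic temperature rise
        T_wall, h = 520.0, 0.002           # oven temperature (K), exchange rate (1/s)

        def rhs(t, y):
            T, lam = y                     # temperature, reacted mass fraction
            rate = A * np.exp(-Ea / (R * T)) * (1.0 - lam)
            return [h * (T_wall - T) + q_over_cp * rate, rate]

        sol = solve_ivp(rhs, (0.0, 2.0e4), [300.0, 0.0],
                        method="LSODA", max_step=10.0)
        runaway = sol.y[0] > 800.0         # crude thermal-runaway criterion
        if runaway.any():
            print(f"thermal runaway near t = {sol.t[np.argmax(runaway)]:.0f} s")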

  7. SUSHI: an exquisite recipe for fully documented, reproducible and reusable NGS data analysis.

    PubMed

    Hatakeyama, Masaomi; Opitz, Lennart; Russo, Giancarlo; Qi, Weihong; Schlapbach, Ralph; Rehrauer, Hubert

    2016-06-02

    Next generation sequencing (NGS) produces massive datasets consisting of billions of reads and up to thousands of samples. Subsequent bioinformatic analysis is typically done with the help of open source tools, where each application performs a single step towards the final result. This situation leaves bioinformaticians with the tasks of combining the tools, managing the data files and meta-information, documenting the analysis, and ensuring reproducibility. We present SUSHI, an agile data analysis framework that relieves bioinformaticians from the administrative challenges of their data analysis. SUSHI lets users build reproducible data analysis workflows from individual applications and manages the input data, the parameters, meta-information with user-driven semantics, and the job scripts. As distinguishing features, SUSHI provides an expert command line interface as well as a convenient web interface to run bioinformatics tools. SUSHI datasets are self-contained and self-documented on the file system. This makes them fully reproducible and ready to be shared. With the associated meta-information being formatted as plain text tables, the datasets can readily be further analyzed and interpreted outside SUSHI. SUSHI provides an exquisite recipe for analysing NGS data. By following the SUSHI recipe, SUSHI makes data analysis straightforward and takes care of documentation and administration tasks. Thus, users can fully dedicate their time to the analysis itself. SUSHI is suitable for use by bioinformaticians as well as life science researchers. It is targeted at, but by no means constrained to, NGS data analysis. Our SUSHI instance is in productive use and has served as the data analysis interface for more than 1000 data analysis projects. SUSHI source code as well as a demo server are freely available.
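
    The "self-contained and self-documented dataset" idea is easy to imitate outside SUSHI: every result directory carries its parameters and a plain-text sample table. A minimal sketch in Python (SUSHI itself is a web application, and these file and field names are invented):

        # Self-documented dataset directory in the spirit of SUSHI: results
        # live next to plain-text parameter and sample tables. Names invented.
        import csv, json, pathlib

        def write_dataset(out_dir, params, samples):
            out = pathlib.Path(out_dir)
            out.mkdir(parents=True, exist_ok=True)
            # Parameters stored as text make the run reproducible later.
            (out / "parameters.json").write_text(json.dumps(params, indent=2))
            # Meta-information as a plain-text table, readable outside any tool.
            with open(out / "dataset.tsv", "w", newline="") as fh:
                writer = csv.DictWriter(fh, fieldnames=samples[0].keys(),
                                        delimiter="\t")
                writer.writeheader()
                writer.writerows(samples)

        write_dataset("results/alignment_01",
                      params={"app": "STAR", "genome": "GRCh38", "cores": 8},
                      samples=[{"name": "s1", "reads": "s1_R1.fastq.gz"},
                               {"name": "s2", "reads": "s2_R1.fastq.gz"}])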

  8. Advanced glycoxidation and lipoxidation end products (AGEs and ALEs): an overview of their mechanisms of formation.

    PubMed

    Vistoli, G; De Maddis, D; Cipak, A; Zarkovic, N; Carini, M; Aldini, G

    2013-08-01

    Advanced lipoxidation end products (ALEs) and advanced glycation end products (AGEs) have a pathogenetic role in the development and progression of different oxidative-based diseases including diabetes, atherosclerosis, and neurological disorders. AGEs and ALEs represent a complex class of compounds formed by different mechanisms and from heterogeneous precursors, either exogenously or endogenously. There is wide interest in AGEs and ALEs involving different aspects of research, which are essentially focused on the set-up and application of analytical strategies (1) to identify, characterize, and quantify AGEs and ALEs in different pathophysiological conditions; (2) to elucidate the molecular basis of their biological effects; and (3) to discover compounds able to inhibit the damaging effects of AGEs/ALEs, not only as biological tools aimed at validating AGEs/ALEs as drug targets, but also as promising drugs. All the above-mentioned research stages require a clear picture of the chemical formation of AGEs/ALEs, but this is not simple, owing to the complex and heterogeneous pathways involving different precursors and mechanisms. In view of this intricate scenario, the aim of the present review is to group the main AGEs and ALEs and to describe, for each of them, the precursors and mechanisms of formation.

  9. Chemicals Compositions, Antioxidant and Anti-Inflammatory Activity of Cynara scolymus Leaves Extracts, and Analysis of Major Bioactive Polyphenols by HPLC

    PubMed Central

    Ben Salem, Maryem; Athmouni, Khaled; Ksouda, Kamilia; Dhouibi, Raouia; Sahnoun, Zouheir; Hammami, Serria; Zeghal, Khaled Mounir

    2017-01-01

    Objective. Artichoke (Cynara scolymus L.) is one of the plant remedies used in primary health care. The present study focused on determining the chemical composition, antioxidant activities, and anti-inflammatory activity of the plant and on analyzing its major bioactive polyphenols by HPLC. Methods. Artichoke Leaves Extracts (ALE) were subjected to proximate and phytochemical analysis, and antioxidant activity was assessed by several methods such as DPPH, ABTS, FRAP, and the beta-carotene bleaching test. Paw oedema was induced using the carrageenan (Carr) model in order to investigate the anti-inflammatory activity. Identification and quantification of bioactive polyphenol compounds were done by HPLC. The oxidative stress parameters CAT, SOD, GSH, MDA, and AOPP were determined, and a histopathological examination was also performed. Results. The EtOH extract of ALE contained the highest phenolic, flavonoid, and tannin contents and the strongest antioxidant activities, including DPPH (94.23%), ABTS (538.75 mmol), FRAP (542.62 µmol), and β-carotene bleaching (70.74%), compared to the other ALE extracts. Administration of the EtOH extract at a dose of 400 mg/kg bw exhibited maximum inhibition of Carr-induced inflammation at 3 and 5 hours compared to the indomethacin (Indo) reference group. Conclusion. ALE displayed high potential as a natural source of minerals and phytochemical compounds with antioxidant and anti-inflammatory properties. PMID:28539965

  10. Chemicals Compositions, Antioxidant and Anti-Inflammatory Activity of Cynara scolymus Leaves Extracts, and Analysis of Major Bioactive Polyphenols by HPLC.

    PubMed

    Ben Salem, Maryem; Affes, Hanen; Athmouni, Khaled; Ksouda, Kamilia; Dhouibi, Raouia; Sahnoun, Zouheir; Hammami, Serria; Zeghal, Khaled Mounir

    2017-01-01

    Objective. Artichoke (Cynara scolymus L.) is one of the plant remedies used in primary health care. The present study focused on determining the chemical composition, antioxidant activities, and anti-inflammatory activity of the plant and on analyzing its major bioactive polyphenols by HPLC. Methods. Artichoke Leaves Extracts (ALE) were subjected to proximate and phytochemical analysis, and antioxidant activity was assessed by several methods such as DPPH, ABTS, FRAP, and the beta-carotene bleaching test. Paw oedema was induced using the carrageenan (Carr) model in order to investigate the anti-inflammatory activity. Identification and quantification of bioactive polyphenol compounds were done by HPLC. The oxidative stress parameters CAT, SOD, GSH, MDA, and AOPP were determined, and a histopathological examination was also performed. Results. The EtOH extract of ALE contained the highest phenolic, flavonoid, and tannin contents and the strongest antioxidant activities, including DPPH (94.23%), ABTS (538.75 mmol), FRAP (542.62 µmol), and β-carotene bleaching (70.74%), compared to the other ALE extracts. Administration of the EtOH extract at a dose of 400 mg/kg bw exhibited maximum inhibition of Carr-induced inflammation at 3 and 5 hours compared to the indomethacin (Indo) reference group. Conclusion. ALE displayed high potential as a natural source of minerals and phytochemical compounds with antioxidant and anti-inflammatory properties.

  11. Predicting synergy in atomic layer etching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanarik, Keren J.; Tan, Samantha; Yang, Wenbing

    2017-03-27

    Atomic layer etching (ALE) is a multistep process used today in manufacturing for removing ultrathin layers of material. In this article, the authors report on ALE of Si, Ge, C, W, GaN, and SiO2 using a directional (anisotropic) plasma-enhanced approach. The authors analyze these systems by defining an "ALE synergy" parameter which quantifies the degree to which a process approaches the ideal ALE regime. This parameter is inspired by the ion-neutral synergy concept introduced in the 1979 paper by Coburn and Winters. ALE synergy is related to the energetics of underlying surface interactions and is understood in terms of energy criteria for the energy barriers involved in the reactions. Synergistic behavior is observed for all of the systems studied, with each exhibiting behavior unique to the reactant–material combination. By systematically studying atomic layer etching of a group of materials, the authors show that ALE synergy scales with the surface binding energy of the bulk material. This insight explains why some materials are more or less amenable to the directional ALE approach. Furthermore, the authors conclude that ALE is both simpler to understand than conventional plasma etch processing and applicable to metals, semiconductors, and dielectrics.
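
    The synergy test reduces to a one-line calculation. The definition assumed below, synergy (%) = (EPC - (alpha + beta)) / EPC x 100, where EPC is the etch per cycle of the full two-step cycle and alpha and beta are the per-cycle removal of each half-step run alone, follows the description above; the numbers are invented, not the paper's data.

        # ALE synergy sketch. Assumed definition, following the description
        # above: synergy (%) = (EPC - (alpha + beta)) / EPC * 100, where EPC is
        # the etch per cycle of the full A+B cycle and alpha/beta are the
        # per-cycle removal of each half-step run alone. Numbers are invented.
        def ale_synergy(epc_full: float, alpha: float, beta: float) -> float:
            """Percentage of the etch per cycle attributable to the
            synergistic combination of the two self-limited steps."""
            return 100.0 * (epc_full - (alpha + beta)) / epc_full

        # Example: 4.0 A/cycle with both steps, 0.1 A/cycle from the
        # passivation dose alone, 0.3 A/cycle from ion bombardment alone.
        print(f"{ale_synergy(4.0, 0.1, 0.3):.0f}% synergy")   # -> 90% synergy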

  12. Structuring research methods and data with the research object model: genomics workflows as a case study.

    PubMed

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide the necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?" and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
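
    To make the quoted competency questions concrete, the sketch below loads a hypothetical RDF serialization of a Research Object with rdflib and asks which data artifacts were input to a given workflow run; the file name, example workflow URI, and the exact triple pattern are assumptions rather than verbatim Wf4Ever vocabulary usage (see http://wf4ever.github.io/ro for the real model).

        # "Which particular data was input to a particular workflow?" as a
        # SPARQL query over a Research Object graph. File, URIs and the triple
        # pattern are illustrative assumptions, not verbatim Wf4Ever usage.
        from rdflib import Graph

        g = Graph()
        g.parse("research_object.ttl", format="turtle")   # hypothetical export

        results = g.query("""
            PREFIX wfprov: <http://purl.org/wf4ever/wfprov#>
            SELECT ?artifact WHERE {
                ?run wfprov:describedByWorkflow <http://example.org/workflow/1> ;
                     wfprov:usedInput ?artifact .
            }
        """)
        for row in results:
            print(row.artifact)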

  13. Consequences of atomic layer etching on wafer scale uniformity in inductively coupled plasmas

    NASA Astrophysics Data System (ADS)

    Huard, Chad M.; Lanham, Steven J.; Kushner, Mark J.

    2018-04-01

    Atomic layer etching (ALE) typically divides the etching process into two self-limited reactions. One reaction passivates a single layer of material while the second preferentially removes the passivated layer. As such, under ideal conditions the wafer scale uniformity of ALE should be independent of the uniformity of the reactant fluxes onto the wafers, provided all surface reactions are saturated. The passivation and etch steps should individually saturate asymptotically after a characteristic fluence of reactants has been delivered to each site. In this paper, results from a computational investigation are discussed regarding the uniformity of ALE of Si in Cl2-containing inductively coupled plasmas when the reactant fluxes are both non-uniform and non-ideal. In the parameter space investigated for inductively coupled plasmas, the local etch rate for continuous processing was proportional to the ion flux. When operated with saturated conditions (that is, both ALE steps are allowed to self-terminate), the ALE process is less sensitive to non-uniformities in the incoming ion flux than continuous etching. Operating ALE in a sub-saturation regime resulted in less uniform etching. It was also found that ALE processing with saturated steps requires a larger total ion fluence than continuous etching to achieve the same etch depth. This condition may result in increased resist erosion and/or damage to stopping layers when using ALE. While these results demonstrate that ALE provides increased etch depth uniformity, they do not show an improved critical dimension uniformity in all cases. These possible limitations of ALE processing, as well as its increased processing time, must be weighed against the benefits of atomic resolution and improved uniformity during process optimization.
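
    The intuition that saturation decouples etch depth from flux is easy to reproduce with a toy model in which each half-step converts surface sites with first-order kinetics in the delivered fluence, so the per-cycle removal approaches saturation as 1 - exp(-flux*t/f0). The sketch below (all parameters invented) compares the across-wafer spread for continuous etching against ALE with progressively longer, better-saturated steps.

        # Toy model: saturated ALE steps suppress flux non-uniformity.
        # Per-cycle removal saturates as 1 - exp(-flux*t/f0); continuous
        # etching stays proportional to local flux. All numbers invented.
        import numpy as np

        flux = np.array([0.8, 1.0, 1.2])   # relative ion flux across the wafer
        f0 = 1.0                           # characteristic fluence per site

        def spread(depth):
            """Across-wafer non-uniformity: (max - min) / mean."""
            return (depth.max() - depth.min()) / depth.mean()

        for t in (0.5, 2.0, 8.0):          # sub-saturated -> saturated steps
            per_cycle = 1.0 - np.exp(-flux * t / f0)
            print(f"ALE, step_time={t:3.1f}: spread = {spread(per_cycle):.3f}")
        print(f"continuous etch:     spread = {spread(flux):.3f}")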

  14. Language workbench user interfaces for data analysis

    PubMed Central

    Benson, Victoria M.

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929

  15. Childhood adverse life events, disordered eating, and body mass index in US Military service members.

    PubMed

    Bakalar, Jennifer L; Barmine, Marissa; Druskin, Lindsay; Olsen, Cara H; Quinlan, Jeffrey; Sbrocco, Tracy; Tanofsky-Kraff, Marian

    2018-03-02

    US service members appear to be at high risk for disordered eating. Further, the military is experiencing unprecedented prevalence of overweight and obesity. US service members also report a high prevalence of childhood adverse life event (ALE) exposure. Despite consistent links between early adversity and both eating disorders and obesity, there is a dearth of research examining the association between ALE exposure and disordered eating and weight in military personnel. An online survey study was conducted in active duty personnel to examine childhood ALE history using the Life Stressor Checklist - Revised, disordered eating using the Eating Disorder Examination - Questionnaire total score, and self-reported body mass index (BMI, kg/m²). Among 179 respondents, multiple indices of childhood ALE were positively associated with disordered eating. Traumatic childhood ALE and the subjective impact of childhood ALE were associated with higher BMI, and these associations were mediated by disordered eating. Findings support evaluating childhood ALE exposure among service members with disordered eating and weight concerns. Moreover, findings support the need for prospective research to elucidate these relationships. © 2018 Wiley Periodicals, Inc.
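
    The mediation claim (higher BMI via disordered eating) corresponds to a standard product-of-coefficients analysis. The sketch below illustrates that method on synthetic data with a bootstrap confidence interval; it is not the authors' code, and all variables are simulated.

        # Product-of-coefficients mediation sketch (ALE -> disordered eating
        # -> BMI) on synthetic data; not the authors' analysis.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 179                                   # sample size from the abstract
        ale = rng.normal(size=n)                  # childhood ALE score
        eat = 0.5 * ale + rng.normal(size=n)      # mediator: disordered eating
        bmi = 0.4 * eat + 0.05 * ale + rng.normal(size=n)

        def indirect(x, m, y):
            a = sm.OLS(m, sm.add_constant(x)).fit().params[1]           # X -> M
            b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # M -> Y | X
            return a * b

        boots = []
        for _ in range(2000):                     # bootstrap the indirect effect
            idx = rng.integers(0, n, n)
            boots.append(indirect(ale[idx], eat[idx], bmi[idx]))
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"indirect effect = {indirect(ale, eat, bmi):.3f}, "
              f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")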

  16. Neural correlates of social exclusion across ages: A coordinate-based meta-analysis of functional MRI studies.

    PubMed

    Vijayakumar, Nandita; Cheng, Theresa W; Pfeifer, Jennifer H

    2017-06-01

    Given the recent surge in functional neuroimaging studies on social exclusion, the current study employed activation likelihood estimation (ALE) based meta-analyses to identify brain regions that have consistently been implicated across different experimental paradigms used to investigate exclusion. We also examined the neural correlates underlying Cyberball, the most commonly used paradigm to study exclusion, as well as differences in exclusion-related activation between developing (7-18 years of age, from pre-adolescence up to late adolescence) and emerging adult (broadly defined as undergraduates, including late adolescence and young adulthood) samples. Results revealed involvement of the bilateral medial prefrontal and posterior cingulate cortices, right precuneus and left ventrolateral prefrontal cortex across the different paradigms used to examine social exclusion; similar activation patterns were identified when restricting the analysis to Cyberball studies. Investigations into age-related effects revealed that ventrolateral prefrontal activations identified in the full sample were driven by (i.e. present in) developmental samples, while medial prefrontal activations were driven by emerging adult samples. In addition, the right ventral striatum was implicated in exclusion, but only in developmental samples. Subtraction analysis revealed significantly greater activation likelihood in striatal and ventrolateral prefrontal clusters in the developmental samples as compared to emerging adults, though the opposite contrast failed to identify any significant regions. Findings integrate the knowledge accrued from functional neuroimaging studies on social exclusion to date, highlighting involvement of lateral prefrontal regions implicated in regulation and midline structures involved in social cognitive and self-evaluative processes across experimental paradigms and ages, as well as limbic structures in developing samples specifically. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Using coordinate-based meta-analyses to explore structural imaging genetics.

    PubMed

    Janouschek, Hildegard; Eickhoff, Claudia R; Mühleisen, Thomas W; Eickhoff, Simon B; Nickl-Jockschat, Thomas

    2018-05-05

    Imaging genetics has become a highly popular approach in the field of schizophrenia research. A frequently reported finding is that effects from common genetic variation are associated with a schizophrenia-related structural endophenotype. Genetic contributions to a structural endophenotype may be easier to delineate when referring to biological rather than diagnostic criteria. We used coordinate-based meta-analyses, namely the anatomical likelihood estimation (ALE) algorithm, on 30 schizophrenia-related imaging genetics studies, representing 44 single-nucleotide polymorphisms at 26 gene loci investigated in 4682 subjects. To test whether analyses based on biological information would improve the convergence of results, gene ontology (GO) terms were used to group the findings from the published studies. We did not find any significant results for the main contrast. However, our analysis of studies on genotype × diagnosis interactions yielded two clusters in the left temporal lobe and the medial orbitofrontal cortex. All other subanalyses did not yield any significant results. To gain insight into possible biological relationships between the genes implicated by these clusters, we mapped five of them to GO terms of the category "biological process" (AKT1, CNNM2, DISC1, DTNBP1, VAV3), five to "cellular component" terms (AKT1, CNNM2, DISC1, DTNBP1, VAV3), and three to "molecular function" terms (AKT1, VAV3, ZNF804A). A subsequent cluster analysis identified representative, non-redundant subsets of semantically similar terms that aided further interpretation. We regard this approach as a new option to systematically explore the richness of the literature in imaging genetics.
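
    The grouping step itself is simple once a gene-to-GO mapping is in hand; the sketch below inverts a hand-written toy mapping to see which terms are shared by several implicated genes. The annotations shown are placeholders, not the actual GO assignments used in the study.

        # Schematic of grouping implicated genes by GO terms. The mapping is a
        # hand-written placeholder; a real analysis would use curated GO
        # annotations for each gene.
        from collections import defaultdict

        gene_to_go = {                    # toy "biological process" annotations
            "AKT1":  ["signal transduction", "apoptotic process"],
            "DISC1": ["neuron migration", "signal transduction"],
            "VAV3":  ["signal transduction"],
        }

        go_to_genes = defaultdict(set)
        for gene, terms in gene_to_go.items():
            for term in terms:
                go_to_genes[term].add(gene)

        # Terms hit by several genes hint at convergent biology.
        for term, genes in sorted(go_to_genes.items(), key=lambda kv: -len(kv[1])):
            print(f"{term}: {sorted(genes)}")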

  18. Neural Systems Underlying Emotional and Non-emotional Interference Processing: An ALE Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Xu, Min; Xu, Guiping; Yang, Yang

    2016-01-01

    Understanding how the nature of interference might influence the recruitment of neural systems is considered key to understanding cognitive control. Although interference processing in the emotional domain has recently attracted great interest, the question of whether there are separable neural patterns for emotional and non-emotional interference processing remains open. Here, we performed an activation likelihood estimation meta-analysis of 78 neuroimaging experiments and examined common and distinct neural systems for emotional and non-emotional interference processing. We examined brain activation in three domains of interference processing: emotional verbal interference in the face-word conflict task, non-emotional verbal interference in the color-word Stroop task, and non-emotional spatial interference in the Simon, SRC and Flanker tasks. Our results show that the dorsal anterior cingulate cortex (ACC) was recruited for both emotional and non-emotional interference. In addition, the right anterior insula, presupplementary motor area (pre-SMA), and right inferior frontal gyrus (IFG) were activated by interference processing across both emotional and non-emotional domains. In light of these results, we propose that the anterior insular cortex may serve to integrate information from different dimensions and work together with the dorsal ACC to detect and monitor conflicts, whereas the pre-SMA and right IFG may be recruited to inhibit inappropriate responses. In contrast, the dorsolateral prefrontal cortex (DLPFC) and posterior parietal cortex (PPC) showed different degrees of activation and distinct lateralization patterns for different processing domains, which suggests that these regions may implement cognitive control based on the specific task requirements. PMID:27895564

  19. Comparison of ALE and SPH Methods for Simulating Mine Blast Effects on Structures

    DTIC Science & Technology

    2010-12-01

    Toussaint, Geneviève; Bouamoul, Amal. Defence R&D Canada - Valcartier, Technical Report DRDC Valcartier TR 2010-326, December 2010. The report compares the arbitrary Lagrangian-Eulerian (ALE) and smoothed particle hydrodynamics (SPH) methods for simulating mine blast effects on structures.

  20. A coupled ALE-AMR method for shock hydrodynamics

    DOE PAGES

    Waltz, J.; Bakosi, J.

    2018-03-05

    We present a numerical method combining adaptive mesh refinement (AMR) with arbitrary Lagrangian-Eulerian (ALE) mesh motion for the simulation of shock hydrodynamics on unstructured grids. The primary goal of the coupled method is to use AMR to reduce numerical error in ALE simulations at reduced computational expense relative to uniform fine mesh calculations, in the same manner that AMR has been used in Eulerian simulations. We also identify deficiencies with ALE methods that AMR is able to mitigate, and discuss the unique coupling challenges. The coupled method is demonstrated using three-dimensional unstructured meshes of up to O(10^7) tetrahedral cells. Convergence of ALE-AMR solutions towards both uniform fine mesh ALE results and analytic solutions is demonstrated. Speed-ups of 5-10× for a given level of error are observed relative to uniform fine mesh calculations.

  1. A coupled ALE-AMR method for shock hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waltz, J.; Bakosi, J.

    We present a numerical method combining adaptive mesh refinement (AMR) with arbitrary Lagrangian-Eulerian (ALE) mesh motion for the simulation of shock hydrodynamics on unstructured grids. The primary goal of the coupled method is to use AMR to reduce numerical error in ALE simulations at reduced computational expense relative to uniform fine mesh calculations, in the same manner that AMR has been used in Eulerian simulations. We also identify deficiencies with ALE methods that AMR is able to mitigate, and discuss the unique coupling challenges. The coupled method is demonstrated using three-dimensional unstructured meshes of up to O(10^7) tetrahedral cells. Convergence of ALE-AMR solutions towards both uniform fine mesh ALE results and analytic solutions is demonstrated. Speed-ups of 5-10× for a given level of error are observed relative to uniform fine mesh calculations.

  2. Aqueous extracts from Uncaria tomentosa (Willd. ex Schult.) DC. reduce bronchial hyperresponsiveness and inflammation in a murine model of asthma.

    PubMed

    Azevedo, Bruna Cestari; Morel, Lucas Junqueira Freitas; Carmona, Fábio; Cunha, Thiago Mattar; Contini, Silvia Helena Taleb; Delprete, Piero Giuseppe; Ramalho, Fernando Silva; Crevelin, Eduardo; Bertoni, Bianca Waléria; França, Suzelei Castro; Borges, Marcos Carvalho; Pereira, Ana Maria Soares

    2018-05-23

    Uncaria tomentosa (Willd. ex Schult.) DC. is used by indigenous tribes in the Amazonian region of Central and South America to treat inflammation, allergies and asthma. The therapeutic properties of U. tomentosa have been attributed to the presence of tetracyclic and pentacyclic oxindole alkaloids and to phenolic acids. To characterize aqueous bark extracts (ABE) and aqueous leaf extracts (ALE) of U. tomentosa and to compare their anti-inflammatory effects. Constituents of the extracts were identified by ultra-performance liquid chromatography-mass spectrometry. Anti-inflammatory activities were assessed in vitro by exposing lipopolysaccharide-stimulated macrophage cells (RAW264.7-Luc) to ABE, ALE and standard mitraphylline. In vivo assays were performed using a murine model of ovalbumin (OVA)-induced asthma. OVA-sensitized animals were treated with ABE or ALE while controls received dexamethasone or saline solution. Bronchial hyperresponsiveness, production of Th1 and Th2 cytokines, and total and differential counts of inflammatory cells in the bronchoalveolar lavage (BAL) and lung tissue were determined. Mitraphylline, isomitraphylline, chlorogenic acid and quinic acid were detected in both extracts, while isorhyncophylline and rutin were detected only in ALE. ABE, ALE and mitraphylline inhibited the transcription of nuclear factor kappa-B in cell cultures, ALE and mitraphylline reduced the production of interleukin (IL)-6, and mitraphylline reduced production of tumor necrosis factor-alpha. Treatment with ABE and ALE at 50 and 200 mg/kg, respectively, reduced respiratory elastance as well as tissue damping and elastance. ABE and ALE reduced the number of eosinophils in BAL, while ALE at 200 mg/kg reduced the levels of IL-4 and IL-5 in the lung homogenate. Peribronchial inflammation was significantly reduced by treatment with ABE and ALE at 50 and 100 mg/kg, respectively. The results clarify for the first time the anti-inflammatory activity of U. tomentosa in a murine model of asthma. Although ABE and ALE exhibited distinct chemical compositions, both extracts inhibited the production of pro-inflammatory cytokines in vitro. In vivo assays revealed that ABE was more effective in treating asthmatic inflammation while ALE was more successful in controlling respiratory mechanics. Both extracts may have promising applications in the phytotherapy of allergic asthma. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Insensitive Munitions Modeling Improvement Efforts

    DTIC Science & Technology

    2010-10-01

    The hydrocodes most commonly used by munition designers are CTH and the SIERRA suite of codes, produced by Sandia National Laboratories (SNL), and ALE3D, produced by Lawrence Livermore National Laboratory (LLNL). ALE3D, the LLNL-developed code, is also used by various DoD participants; it was, however, designed differently than either CTH or SIERRA.

  4. An asymptotic preserving multidimensional ALE method for a system of two compressible flows coupled with friction

    NASA Astrophysics Data System (ADS)

    Del Pino, S.; Labourasse, E.; Morel, G.

    2018-06-01

    We present a multidimensional asymptotic preserving scheme for the approximation of a mixture of compressible flows. Fluids are modelled by two Euler systems of equations coupled with a friction term. The asymptotic preserving property is mandatory for this kind of model, to derive a scheme that behaves well in all regimes (i.e. whatever the friction parameter value is). The method we propose is defined in ALE coordinates, using a Lagrange plus remap approach. This imposes a multidimensional definition and analysis of the scheme.
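
    In generic form (our notation; the authors' precise closure and energy exchange terms may differ), such a model couples two Euler systems through a drag term proportional to the velocity difference:

        % Generic two-fluid Euler system with velocity-difference friction
        % (notation ours, not necessarily the authors' exact model).
        \begin{aligned}
          \partial_t \rho_k + \nabla\cdot(\rho_k \mathbf{u}_k) &= 0, \\
          \partial_t(\rho_k \mathbf{u}_k)
            + \nabla\cdot(\rho_k \mathbf{u}_k \otimes \mathbf{u}_k)
            + \nabla p_k &= (-1)^k \,\nu\, (\mathbf{u}_1 - \mathbf{u}_2), \\
          \partial_t(\rho_k E_k)
            + \nabla\cdot\bigl((\rho_k E_k + p_k)\,\mathbf{u}_k\bigr)
            &= (-1)^k \,\nu\, (\mathbf{u}_1 - \mathbf{u}_2)\cdot\mathbf{u}_k,
          \qquad k \in \{1, 2\}.
        \end{aligned}

    The stiff limit \(\nu \to \infty\) forces \(\mathbf{u}_1 = \mathbf{u}_2\), which is exactly the regime an asymptotic preserving scheme must capture without resolving the friction time scale.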

  5. Publishing datasets with eSciDoc and panMetaDocs

    NASA Astrophysics Data System (ADS)

    Ulbricht, D.; Klump, J.; Bertelmann, R.

    2012-04-01

    Currently several research institutions worldwide undertake considerable efforts to have their scientific datasets published and to syndicate them to data portals as extensively described objects identified by a persistent identifier. This is done to foster the reuse of data, to make scientific work more transparent, and to create a citable entity that can be referenced unambiguously in written publications. GFZ Potsdam established a publishing workflow for file based research datasets. Key software components are an eSciDoc infrastructure [1] and multiple instances of the data curation tool panMetaDocs [2]. The eSciDoc repository holds data objects and their associated metadata in container objects, called eSciDoc items. A key metadata element in this context is the publication status of the referenced data set. PanMetaDocs, which is based on PanMetaWorks [3], is a PHP based web application that allows data to be described with any XML-based metadata schema. The metadata fields can be filled with static or dynamic content, to reduce the number of fields that require manual entries to a minimum and to make use of contextual information in a project setting. Access rights can be applied to control the visibility of datasets to other project members; collaboration on datasets is supported through notifications (RSS) and the internal messaging system inherited from panMetaWorks. When a dataset is to be published, panMetaDocs allows the publication status of the eSciDoc item to be changed from "private" to "submitted", preparing the dataset for verification by an external reviewer. After quality checks, the item publication status can be changed to "published", which makes the data and metadata available through the internet worldwide. PanMetaDocs is developed as an eSciDoc application. It is an easy to use graphical user interface to eSciDoc items, their data and metadata. It is also an application supporting a DOI publication agent during the process of publishing scientific datasets as electronic data supplements to research papers. Manuscript publication already follows a well-established workflow; dataset publication shares junctures with it and involves several parties, so the activities of the author, the reviewer, the print publisher and the data publisher have to be coordinated into a common data publication workflow. The case of data publication at GFZ Potsdam displays some specifics, e.g. the DOIDB webservice. The DOIDB is a proxy service at GFZ for the DataCite [4] DOI registration and its metadata store. DOIDB provides a local summary of the dataset DOIs registered through GFZ as a publication agent. An additional use case for the DOIDB is its ability to enrich the DataCite metadata with additional custom attributes, like a geographic reference in a DIF record. These attributes are at the moment not available in the DataCite metadata schema, but would be valuable elements for the compilation of data catalogues in the earth sciences and for dissemination of catalogue data via OAI-PMH. [1] http://www.escidoc.org, eSciDoc, FIZ Karlsruhe, Germany [2] http://panmetadocs.sf.net, panMetaDocs, GFZ Potsdam, Germany [3] http://metaworks.pangaea.de, panMetaWorks, Dr. R. Huber, MARUM, Univ. Bremen, Germany [4] http://www.datacite.org
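
    The status lifecycle described above (private, submitted, published) amounts to a small state machine. A generic sketch, with invented class and method names rather than actual panMetaDocs or eSciDoc code:

        # Dataset publication lifecycle sketch modeled on the workflow above.
        # Names are invented; this is not panMetaDocs/eSciDoc code.
        ALLOWED = {
            "private":   {"submitted"},             # author submits for review
            "submitted": {"published", "private"},  # reviewer releases or returns
            "published": set(),                     # terminal: visible worldwide
        }

        class DatasetItem:
            def __init__(self, item_id: str):
                self.item_id = item_id
                self.status = "private"

            def transition(self, new_status: str) -> None:
                if new_status not in ALLOWED[self.status]:
                    raise ValueError(f"{self.status} -> {new_status} not allowed")
                self.status = new_status

        item = DatasetItem("escidoc:12345")   # hypothetical item identifier
        item.transition("submitted")          # hand over for quality checks
        item.transition("published")          # reviewer releases the dataset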

  6. MetaPro-IQ: a universal metaproteomic approach to studying human and mouse gut microbiota.

    PubMed

    Zhang, Xu; Ning, Zhibin; Mayne, Janice; Moore, Jasmine I; Li, Jennifer; Butcher, James; Deeke, Shelley Ann; Chen, Rui; Chiang, Cheng-Kang; Wen, Ming; Mack, David; Stintzi, Alain; Figeys, Daniel

    2016-06-24

    The gut microbiota has been shown to be closely associated with human health and disease. While next-generation sequencing can be readily used to profile the microbiota taxonomy and metabolic potential, metaproteomics is better suited for deciphering microbial biological activities. However, the application of gut metaproteomics has largely been limited by the low efficiency of protein identification. Thus, a high-performance and easy-to-implement gut metaproteomic approach is required. In this study, we developed a high-performance and universal workflow for gut metaproteome identification and quantification (named MetaPro-IQ) by using the close-to-complete human or mouse gut microbial gene catalog as the database and an iterative database search strategy. An average of 38% and 33% of the acquired tandem mass spectrometry (MS) spectra was confidently identified for the studied mouse stool and human mucosal-luminal interface samples, respectively. In total, we accurately quantified 30,749 protein groups for the mouse metaproteome and 19,011 protein groups for the human metaproteome. Moreover, the MetaPro-IQ approach achieved identifications comparable to those of the matched metagenome database search strategy, which is widely used but requires prior metagenomic sequencing. The response of the gut microbiota to a high-fat diet in mice was then assessed; the analysis showed distinct metaproteome patterns for high-fat-fed mice and identified 849 proteins as significant responders to high-fat feeding in comparison to low-fat feeding. We present MetaPro-IQ, a metaproteomic approach for highly efficient intestinal microbial protein identification and quantification, which functions as a universal workflow for metaproteomic studies and will thus facilitate the application of metaproteomics for better understanding the functions of the gut microbiota in health and disease.
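
    The iterative strategy can be paraphrased in a few lines: run a permissive first search of all spectra against the huge gene catalog, keep every protein with any candidate evidence, and then run a strict target-decoy search against that reduced, sample-specific database. In the sketch below, run_search is a hypothetical stand-in for an actual search-engine invocation, and the FDR settings are illustrative.

        # Sketch of an iterative two-pass database search in the spirit of
        # MetaPro-IQ. `run_search` is a hypothetical wrapper around a real
        # engine (e.g., X!Tandem); FDR thresholds are illustrative.
        def iterative_search(spectra, gene_catalog_fasta, run_search):
            # Pass 1: permissive search against the close-to-complete catalog.
            candidate_hits = run_search(spectra, gene_catalog_fasta, fdr=1.0)

            # Build a reduced database from every protein with any evidence.
            reduced_db = {hit.protein_id: hit.sequence for hit in candidate_hits}

            # Pass 2: strict target-decoy search against the reduced database.
            return run_search(spectra, reduced_db, fdr=0.01)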

  7. An AMR capable finite element diffusion solver for ALE hydrocodes [An AMR capable diffusion solver for ALE-AMR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, A. C.; Bailey, D. S.; Kaiser, T. B.

    2015-02-01

    Here, we present a novel method for the solution of the diffusion equation on a composite AMR mesh. This approach is suitable for adding diffusion-based physics modules to hydrocodes that support ALE and AMR capabilities. To illustrate, we present our implementations of diffusion-based radiation transport and heat conduction in a hydrocode called ALE-AMR. Numerical experiments conducted with the diffusion solver and associated physics packages yield second-order convergence in the L2 norm.
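
    The building block being described, an implicit diffusion solve, is easiest to see in one dimension. A backward-Euler step on a uniform grid (a minimal stand-in only; the actual solver operates on composite AMR meshes inside an ALE hydrocode) looks like:

        # 1-D backward-Euler diffusion step on a uniform grid; a minimal
        # stand-in for the composite-AMR solves described above.
        import numpy as np

        def implicit_diffusion_step(u, dt, dx, kappa):
            """Solve (I - dt*kappa*d2/dx2) u_new = u, fixed-value boundaries."""
            n = len(u)
            r = kappa * dt / dx**2
            A = np.zeros((n, n))
            np.fill_diagonal(A, 1.0 + 2.0 * r)
            A[np.arange(n - 1), np.arange(1, n)] = -r   # super-diagonal
            A[np.arange(1, n), np.arange(n - 1)] = -r   # sub-diagonal
            A[0, :] = 0.0; A[0, 0] = 1.0                # Dirichlet boundary
            A[-1, :] = 0.0; A[-1, -1] = 1.0
            return np.linalg.solve(A, u)

        u = np.zeros(51); u[25] = 1.0    # initial hot spot
        for _ in range(100):             # implicit steps are unconditionally stable
            u = implicit_diffusion_step(u, dt=0.1, dx=0.02, kappa=1e-3)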

  8. ALE3D: An Arbitrary Lagrangian-Eulerian Multi-Physics Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noble, Charles R.; Anderson, Andrew T.; Barton, Nathan R.

    ALE3D is a multi-physics numerical simulation software tool utilizing arbitrary Lagrangian-Eulerian (ALE) techniques. The code is written to address both two-dimensional (2D plane and axisymmetric) and three-dimensional (3D) physics and engineering problems, using a hybrid finite element and finite volume formulation to model the fluid and elastic-plastic response of materials on an unstructured grid. ALE3D is a single code that integrates many physical phenomena.

  9. The nuclear hormone receptor E75A regulates vitellogenin gene (Al-Vg) expression in the mirid bug Apolygus lucorum.

    PubMed

    Tan, Y-A; Zhao, X-D; Sun, Y; Hao, D-J; Zhao, J; Jiang, Y-P; Bai, L-X; Xiao, L-B

    2018-04-01

    Apolygus lucorum is the predominant pest of Bacillus thuringiensis (Bt) cotton in China. 20-hydroxyecdysone (20E) plays a key role in the reproduction of this insect. To better understand the mechanism underlying 20E-regulated reproduction, the nuclear hormone receptor E75 isoform-A of Ap. lucorum (Al-E75A) was cloned and its expression analysed. A 2241-bp Al-E75A cDNA sequence contained an open reading frame encoding a polypeptide with a predicted molecular mass of 69.04 kDa. Al-E75A mRNA was detected in female adult stages of Ap. lucorum, with peak expression in 7-day-old animals. Al-E75A was also expressed in several tissues, particularly in the fat body and ovary. A 3.2 kb Al-E75A mRNA was detected in all tissues by Northern blot. Fecundity and longevity were significantly decreased in female adults treated with Al-E75A small interfering RNA, and egg incubation rates were considerably lower in the RNA interference-treated animals than in untreated controls. To explore the molecular mechanism underlying these effects, vitellogenin (Al-Vg) was selected for further investigation. The expression pattern of Al-Vg was similar to that of Al-E75A and was up-regulated by 20E. After knockdown of Al-E75A, both the transcript and protein levels of Al-Vg were down-regulated. These findings suggest that Al-E75A plays a crucial role in the regulation of Al-Vg expression in Ap. lucorum. © 2017 The Royal Entomological Society.

  10. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.

    PubMed

    Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver

    2010-06-30

    Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet- and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual-data-enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguard contextual data.

  11. MetaBar - a tool for consistent contextual data acquisition and standards compliant submission

    PubMed Central

    2010-01-01

    Background. Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. Results. MetaBar is a spreadsheet- and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft® Excel® spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). Conclusion. The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual-data-enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable exchange of data across disciplines and safeguard contextual data. PMID:20591175
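
    The capture-and-validate step lends itself to a short illustration: read the preconfigured spreadsheet and refuse upload until the required contextual fields are present and sample identifiers are unique. The column names and required-field list below are invented, not MetaBar's actual Genomic Standards Consortium-based schema.

        # Spreadsheet validation sketch for a MetaBar-like workflow. Columns
        # and required fields are invented, not MetaBar's real schema.
        import pandas as pd

        REQUIRED = ["sample_id", "latitude", "longitude",
                    "collection_date", "depth_m"]

        def validate_sheet(path):
            df = pd.read_excel(path)
            missing = [c for c in REQUIRED if c not in df.columns]
            if missing:
                raise ValueError(f"missing columns: {missing}")
            gaps = df[df[REQUIRED].isna().any(axis=1)]
            if not gaps.empty:
                raise ValueError(f"rows lacking contextual data: {list(gaps.index)}")
            if df["sample_id"].duplicated().any():
                raise ValueError("sample identifiers must be unique")
            return df

        samples = validate_sheet("field_campaign.xlsx")   # hypothetical file
        print(f"{len(samples)} samples ready for upload")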

  12. An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64.

    PubMed

    Winkler, Robert

    2015-01-01

    In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to arrive at the final results. These operations are often difficult to reproduce because the computing platforms involved are too specific. This effect, known as 'workflow decay', can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes, e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS) Taverna. We explain the useful combination of the tools with practical examples: (1) a workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2) cluster analysis and Data Mining in targeted Metabolomics, and (3) raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present their application for finding co-occurring peptides, which can be used for targeted proteomics, the discovery of alternative biomarkers and protein-protein interactions. Data Mining-derived models displayed higher robustness and accuracy for classifying sample groups in targeted Metabolomics than cluster analyses. Random Forest models do not only provide predictive models, which can be deployed for new data sets, but also the variable importance. We demonstrate that the latter is especially useful for tracking down significant signals and affected pathways in untargeted Metabolomics. Thus, Random Forest modeling supports the unbiased search for relevant biological features in Metabolomics. Our results clearly manifest the importance of Data Mining methods to disclose non-obvious information in biological mass spectrometry. The application of a Workflow Management System and the integration of all required programs and data in a consistent platform make the presented data analysis strategies reproducible for non-expert users. The simple remastering process and the Open Source licenses of MASSyPup64 (http://www.bioprocess.org/massypup/) enable the continuous improvement of the system.
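
    The point about Random Forests delivering both a deployable classifier and variable importances is easy to demonstrate with scikit-learn on synthetic data (MASSyPup64 itself exposes these methods through R and Rattle; the sketch below merely illustrates the idea):

        # Random Forest for sample-group classification plus variable
        # importances, on synthetic "metabolomics" data. Illustrative only;
        # MASSyPup64 drives such models through R/Rattle.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 200))     # 60 samples x 200 metabolite features
        y = np.repeat([0, 1], 30)          # two sample groups
        X[y == 1, 5] += 2.0                # feature 5 truly differs

        model = RandomForestClassifier(n_estimators=500, oob_score=True,
                                       random_state=0).fit(X, y)
        print("out-of-bag accuracy:", round(model.oob_score_, 2))
        top = np.argsort(model.feature_importances_)[::-1][:5]
        print("highest-importance features:", top)   # feature 5 should lead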

  13. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task, (2) by image region, and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification and barcode reading, to parallel pipelines. This can substantially decrease time to completion for document tasks. In this approach, each parallel pipeline generally performs a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
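
    A minimal demonstration of the second category, processing by image region, is a map-reduce over tiles of one large image; the dark-pixel count below stands in for a real per-region task such as skew detection:

        # Parallel processing "by image region": map the same task over tiles
        # of one large image, then reduce. The per-tile task is a toy stand-in.
        import numpy as np
        from multiprocessing import Pool

        def analyze_tile(tile):
            return int((tile < 64).sum())        # toy task: count dark pixels

        def split_into_tiles(image, rows=4, cols=4):
            return [block
                    for strip in np.array_split(image, rows, axis=0)
                    for block in np.array_split(strip, cols, axis=1)]

        if __name__ == "__main__":
            image = np.random.randint(0, 256, size=(4096, 4096), dtype=np.uint8)
            with Pool() as pool:                 # map: one tile per worker
                counts = pool.map(analyze_tile, split_into_tiles(image))
            print("dark pixels:", sum(counts))   # reduce: combine results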

  14. Method of and apparatus for modeling interactions

    DOEpatents

    Budge, Kent G.

    2004-01-13

    A method and apparatus for modeling interactions can accurately model tribological and other properties and accommodate topological disruptions. Two portions of a problem space are represented, a first with a Lagrangian mesh and a second with an ALE mesh. The ALE and Lagrangian meshes are constructed so that each node on the surface of the Lagrangian mesh is in a known correspondence with adjacent nodes in the ALE mesh. The interaction can be predicted for a time interval. Material flow within the ALE mesh can accurately model complex interactions such as bifurcation. After prediction, nodes in the ALE mesh in correspondence with nodes on the surface of the Lagrangian mesh can be mapped so that they are once again adjacent to their corresponding Lagrangian mesh nodes. The ALE mesh can then be smoothed to reduce mesh distortion that might reduce the accuracy or efficiency of subsequent prediction steps. The process, from prediction through mapping and smoothing, can be repeated until a terminal condition is reached.

  15. Acanthus ilicifolius plant extract prevents DNA alterations in a transplantable Ehrlich ascites carcinoma-bearing murine model.

    PubMed

    Chakraborty, Tridib; Bhuniya, Dipak; Chatterjee, Mary; Rahaman, Mosiur; Singha, Dipak; Chatterjee, Baidya Nath; Datta, Subrata; Rana, Ajay; Samanta, Kartick; Srivastawa, Sunil; Maitra, Sankar K; Chatterjee, Malay

    2007-12-28

    To investigate the chemopreventive efficacy of the Indian medicinal plant Acanthus ilicifolius L. (Acanthaceae) in a transplantable Ehrlich ascites carcinoma (EAC)-bearing murine model. Male Swiss albino mice were divided into four groups: Group A was the untreated normal control; Group B was the EAC control group, which received serial, intraperitoneal (ip) inoculations of rapidly proliferating 2 × 10^5 viable EAC cells in 0.2 mL of sterile phosphate buffered saline; Group C was the plant extract-treated group, which received the aqueous leaf extract (ALE) of the plant at a dose of 2.5 mg/kg body weight by single ip injections, once daily for 10, 20 and 30 consecutive days following tumour inoculation (ALE control); and Group D was the EAC + ALE treatment group. The chemopreventive potential of the ALE was evaluated in a murine model by studying various biological parameters and genotoxic markers, such as tumour cell count, mean survival of the animals, haematological indices, hepatocellular histology, immunohistochemical expression of liver metallothionein (MT) protein, sister-chromatid exchanges (SCEs), and DNA alterations. Treatment of the EAC-bearing mice with the ALE significantly (P < 0.001) reduced the viable tumour cell count by 68.34% (72.4 × 10^6 ± 0.49 vs 228.7 × 10^6 ± 0.53 in EAC control mice) and restored body and organ weights almost to normal values. ALE administration also increased (P < 0.001) the mean survival of the hosts from 35 ± 3.46 d in EAC control mice to 83 ± 2.69 d in EAC + ALE-treated mice. Haematological indices also showed marked improvement with administration of ALE in EAC-bearing animals. There was a significant increase in RBC count (P < 0.001), hemoglobin percent (P < 0.001), and haematocrit value (P < 0.001), from 4.3 ± 0.12, 6.4 ± 0.93, and 17.63 ± 0.72, respectively, in EAC control mice to 7.1 ± 0.13, 12.1 ± 0.77, and 30.23 ± 0.57, respectively, in the EAC + ALE-treated group, along with a concurrent decrease (P < 0.001) in WBC count from 18.8 ± 0.54 in EAC control to 8.4 ± 0.71 in EAC + ALE. Furthermore, treatment with ALE substantially improved hepatocellular architecture, and no noticeable neoplastic lesions or foci of cellular alteration were observed. Daily administration of the ALE was found to limit liver MT expression, an important marker of cell proliferation, with a concomitant reduction in MT immunoreactivity (62.25 ± 2.58 vs 86.24 ± 5.69, P < 0.01). ALE was also effective in reducing (P < 0.001) the frequency of SCEs from 14.94 ± 2.14 in EAC control to 5.12 ± 1.16 in the EAC + ALE-treated group. Finally, in comparison to the EAC control, ALE suppressed in vivo DNA damage, abating the generation of 'tailed' DNA by 53.59% (98.65 ± 2.31 vs 45.06 ± 1.14, P < 0.001) and DNA single-strand breaks (SSBs) by 38.53% (3.14 ± 0.31 vs 1.93 ± 0.23, P < 0.01) in EAC-bearing murine liver. Our data indicate that ALE is beneficial in restoring haematological and hepatic histological profiles and in lengthening the survival of the animals against the proliferation of ascites tumour in vivo. Finally, the chemopreventive efficacy of the ALE is manifested in limiting MT expression and in preventing DNA alterations in murine liver. The promising results of this study suggest further investigation into the chemopreventive mechanisms of the medicinal plant A. ilicifolius in vivo and in vitro.

  16. Active Learning in Engineering Education: A (Re)Introduction

    ERIC Educational Resources Information Center

    Lima, Rui M.; Andersson, Pernille Hammar; Saalman, Elisabeth

    2017-01-01

    The informal network "Active Learning in Engineering Education" (ALE) has been promoting Active Learning since 2001. ALE creates opportunities for practitioners and researchers of engineering education to collaboratively learn how to foster the learning of engineering students. The activities in ALE are centred on the vision that learners…

  17. Toward Improved Fidelity of Thermal Explosion Simulations

    NASA Astrophysics Data System (ADS)

    Nichols, A. L.; Becker, R.; Howard, W. M.; Wemhoff, A.

    2009-12-01

    We will present results of an effort to improve the thermal/chemical/mechanical modeling of HMX-based explosives like LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The current results were built to remedy the deficiencies of that original model. We concentrated our efforts in four areas. The first area was the addition of porosity to the chemical material model framework in ALE3D that is used to model the HMX explosive formulation. This is needed to handle the roughly 2% porosity in solid explosives. The second area was the improvement of the HMX reaction network, which included a reactive phase change model based on work by Henson et al. The third area required adding early decomposition gas species to the CHEETAH material database to develop more accurate equations of state for gaseous intermediates and products. Finally, it was necessary to improve the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.

  18. MetaDB a Data Processing Workflow in Untargeted MS-Based Metabolomics Experiments.

    PubMed

    Franceschi, Pietro; Mylonas, Roman; Shahaf, Nir; Scholz, Matthias; Arapitsas, Panagiotis; Masuero, Domenico; Weingart, Georg; Carlin, Silvia; Vrhovsek, Urska; Mattivi, Fulvio; Wehrens, Ron

    2014-01-01

    Due to their sensitivity and speed, mass-spectrometry-based analytical technologies are widely used in metabolomics to characterize biological phenomena. To address issues like metadata organization, quality assessment, data processing, data storage, and, finally, submission to public repositories, bioinformatic pipelines of a non-interactive nature are often employed, complementing the interactive software used for initial inspection and visualization of the data. These pipelines are often created as open-source software, allowing the complete and exhaustive documentation of each step and ensuring the reproducibility of the analysis of extensive and often expensive experiments. In this paper, we review the major steps which constitute such a data processing pipeline, discussing them in the context of an open-source software for untargeted MS-based metabolomics experiments recently developed at our institute. The software has been developed by integrating our metaMS R package with a user-friendly web-based application written in Grails. metaMS takes care of data pre-processing and annotation, while the interface deals with the creation of the sample lists, the organization of the data storage, and the generation of survey plots for quality assessment. Experimental and biological metadata are stored in the ISA-Tab format, making the proposed pipeline fully integrated with the MetaboLights framework.
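
    The division of labour described here, non-interactive processing stages plus a thin interface for metadata and quality assessment, can be pictured with a minimal pipeline skeleton. The sketch below is illustrative only and does not reproduce the metaMS or MetaDB APIs; all function and field names are hypothetical.

        # Minimal sketch of a non-interactive processing stage with provenance
        # logging; the stage implementations are hypothetical placeholders.
        import json, time

        def run_stage(name, func, data, log):
            """Run one pipeline stage and record what was done, when, and to what."""
            t0 = time.time()
            out = func(data)
            log.append({"stage": name, "seconds": round(time.time() - t0, 3),
                        "n_records_in": len(data), "n_records_out": len(out)})
            return out

        peaks = [{"mz": 181.07, "rt": 312.4}, {"mz": 163.06, "rt": 12.6}]  # toy input
        provenance = []
        peaks = run_stage("quality_filter", lambda d: [p for p in d if p["rt"] > 60], peaks, provenance)
        peaks = run_stage("annotate", lambda d: [dict(p, annotation="unknown") for p in d], peaks, provenance)

        # Persisting the log alongside the results is what makes the run reproducible.
        print(json.dumps(provenance, indent=2))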

  19. ALES: An Innovative Argument-Learning Environment

    ERIC Educational Resources Information Center

    Abbas, Safia; Sawamura, Hajime

    2010-01-01

    This paper presents the development of an Argument-Learning System (ALES). The idea is based on the AIF (argumentation interchange format) ontology using "Walton theory". ALES uses different mining techniques to manage a highly structured argument repository. This repository was designed, developed and implemented by the authors. The aim is to…

  20. Application of the ALE and MBE Methods to the Growth of Layered Hg(x)Cd(1-x)Te Films.

    DTIC Science & Technology

    1986-09-26

    We have studied the applicability of Atomic Layer Epitaxy (ALE, see Ref. 1) and Molecular Beam Epitaxy (MBE) to the growth of Hg(x)Cd(1-x)Te thin films throughout the composition range 0 ≤ x ≤ 0.8. The progress of the Contract has been reported periodically in five interim reports. This final… separate sources) yielded films with high x values. On the grounds of these observations we do not find ALE suitable for growth of HgCdTe. 2) ALE…

  1. Is creative insight task-specific? A coordinate-based meta-analysis of neuroimaging studies on insightful problem solving.

    PubMed

    Shen, Wangbing; Yuan, Yuan; Liu, Chang; Zhang, Xiaojiang; Luo, Jing; Gong, Zhe

    2016-12-01

    The question of whether creative insight varies across problem types has recently come to the forefront of studies of creative cognition. In the present study, to address the nature of creative insight, the coordinate-based activation likelihood estimation (ALE) technique was utilized to individually conduct three quantitative meta-analyses of neuroimaging experiments that used the compound remote associate (CRA) task, the prototype heuristic (PH) task and the Chinese character chunk decomposition (CCD) task. These tasks were chosen because they are frequently used to uncover the neurocognitive correlates of insight. Our results demonstrated that creative insight reliably activates largely non-overlapping brain regions across task types, with the exception of some shared regions: the CRA task mainly relied on the right parahippocampal gyrus, the superior frontal gyrus and the inferior frontal gyrus; the PH task primarily depended on the right middle occipital gyrus (MOG), the bilateral superior parietal lobule/precuneus, the left inferior parietal lobule, the left lingual gyrus and the left middle frontal gyrus; and the CCD task activated a broad cerebral network consisting of most dorsolateral and medial prefrontal regions, frontoparietal regions and the right MOG. These results provide the first neural evidence of the task dependence of creative insight. The implications of these findings for resolving conflict surrounding the different theories of creative cognition and for defining insight as a set of heterogeneous processes are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
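
    For readers who want to run this kind of coordinate-based ALE analysis themselves, the sketch below shows the general shape of one in Python using the NiMARE package. This is a minimal sketch under the assumption of a Sleuth-format coordinate file; it is not the authors' pipeline, the file name is a placeholder, and API details may differ across NiMARE versions.

        # Coordinate-based ALE meta-analysis with NiMARE from a Sleuth-format
        # text file of experiment coordinates. Not the authors' pipeline.
        from nimare.io import convert_sleuth_to_dataset
        from nimare.meta.cbma.ale import ALE
        from nimare.correct import FWECorrector

        dset = convert_sleuth_to_dataset("cra_experiments.txt")    # hypothetical input file
        results = ALE().fit(dset)                                  # unthresholded ALE maps

        # Cluster-level FWE correction via Monte Carlo permutation, the
        # thresholding approach generally recommended for ALE.
        corrector = FWECorrector(method="montecarlo", n_iters=10000, voxel_thresh=0.001)
        corrected = corrector.transform(results)
        corrected.save_maps(output_dir="ale_output")               # write NIfTI maps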

  2. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  3. Traumatic brain injury and adverse life events: Group differences in young adults injured as children.

    PubMed

    Taylor, Olivia; Barrett, Robert D; McLellan, Tracey; McKinlay, Audrey

    2015-01-01

    To investigate whether individuals with a history of traumatic brain injury (TBI) experience a greater number of adverse life events (ALE) compared to controls, to identify significant predictors of experiencing ALE and whether the severity of childhood TBI negatively influences adult life outcomes. A total of 167 individuals, injured prior to age 18, 5 or more years post-injury and 18 or more years of age, were recruited in the Canterbury region of New Zealand, with 124 having sustained childhood TBI (62 mild, 62 moderate/severe) and 43 orthopaedic injury controls. Participants were asked about ALE they had experienced and other adult life outcomes. Individuals with a history of TBI experienced more ALE compared to controls. The number of ALE experienced by an individual was associated with more visits to the doctor, lower education level and lower satisfaction with material standard of living. Childhood TBI is associated with an increased number of ALE and adult negative life outcomes. Understanding factors that contribute to negative outcomes following childhood TBI will provide an avenue for rehabilitation and support to reduce any problems in adulthood.

  4. Pseudotumoral peritoneal tuberculosis mimicking ovarian cancer: an important differential diagnosis to consider

    PubMed Central

    Moukit, Mounir; Fadel, Fatimazahra Ait El; Kouach, Jaouad; Babahabib, Abdellah; Dehayni, Mohammed; Rahali, Driss Moussaoui

    2016-01-01

    Tuberculosis is a curable infectious disease whose peritoneal localization can mimic advanced ovarian cancer, often leading to extensive and unnecessary surgery in women of reproductive age. We report a new case of pseudotumoral peritoneal tuberculosis in a 43-year-old patient in whom a diagnosis of ovarian cancer with peritoneal carcinomatosis had been suspected. Exploratory laparotomy with intraoperative histological examination confirmed the diagnosis of peritoneal tuberculosis. The patient responded well to antituberculous treatment according to the 2ERHZ/4RH protocol. PMID:28292155

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chen; Metzler, Dominik; Oehrlein, Gottlieb S., E-mail: oehrlein@umd.edu

    Angstrom-level plasma etching precision is required for semiconductor manufacturing of sub-10 nm critical dimension features. Atomic layer etching (ALE), achieved by a series of self-limited cycles, can precisely control etching depths by limiting the amount of chemical reactant available at the surface. Recently, SiO2 ALE has been achieved by deposition of a thin (several Angstroms) reactive fluorocarbon (FC) layer on the material surface using controlled FC precursor flow and subsequent low-energy Ar+ ion bombardment in a cyclic fashion. Low-energy ion bombardment is used to remove the FC layer along with a limited amount of SiO2 from the surface. In the present article, the authors describe controlled etching of Si3N4 and SiO2 layers of one to several Angstroms using this cyclic ALE approach. Si3N4 etching and etching selectivity of SiO2 over Si3N4 were studied and evaluated with regard to the dependence on maximum ion energy, etching step length (ESL), FC surface coverage, and precursor selection. Surface chemistries of Si3N4 were investigated by x-ray photoelectron spectroscopy (XPS) after vacuum transfer at each stage of the ALE process. Since Si3N4 has a lower physical sputtering energy threshold than SiO2, Si3N4 physical sputtering can take place after removal of chemical etchant at the end of each cycle for relatively high ion energies. Si3N4-to-SiO2 ALE etching selectivity was observed for these FC-depleted conditions. By optimization of the ALE process parameters, e.g., low ion energies, short ESLs, and/or high FC film deposition per cycle, highly selective SiO2-to-Si3N4 etching can be achieved for FC accumulation conditions, where FC can be selectively accumulated on Si3N4 surfaces. This highly selective etching is explained by a lower carbon consumption of Si3N4 as compared to SiO2. The comparison of C4F8 and CHF3 only showed a difference in etching selectivity for FC-depleted conditions. For FC accumulation conditions, precursor chemistry has a weak impact on etching selectivity. Surface chemistry analysis shows that surface fluorination and FC reduction take place during a single ALE cycle for FC-depleted conditions. A fluorine-rich carbon layer was observed on the Si3N4 surface after ALE processes for which FC accumulation takes place. The angle-resolved XPS thickness calculations confirmed the results of the ellipsometry measurements in all cases.

  6. The role of the salience network in processing lexical and nonlexical stimuli in cochlear implant users: an ALE meta-analysis of PET studies.

    PubMed

    Song, Jae-Jin; Vanneste, Sven; Lazard, Diane S; Van de Heyning, Paul; Park, Joo Hyun; Oh, Seung Ha; De Ridder, Dirk

    2015-05-01

    Previous positron emission tomography (PET) studies have shown that various cortical areas are activated to process speech signals in cochlear implant (CI) users. Nonetheless, differences in task dimensions among studies and low statistical power have precluded a clear understanding of sound-processing mechanisms in CI users. Hence, we performed an activation likelihood estimation meta-analysis of PET studies in CI users and normal-hearing (NH) controls to compare the two groups. Eight studies (58 CI subjects/92 peak coordinates; 45 NH subjects/40 peak coordinates) were included and analyzed, retrieving areas significantly activated by lexical and nonlexical stimuli. For lexical and nonlexical stimuli, both groups showed activations in the components of the dual-stream model such as bilateral superior temporal gyrus/sulcus, middle temporal gyrus, left posterior inferior frontal gyrus, and left insula. However, CI users displayed additional unique activation patterns for lexical and nonlexical stimuli. That is, for the lexical stimuli, significant activations were observed in areas comprising the salience network (SN), also known as the intrinsic alertness network, such as the left dorsal anterior cingulate cortex (dACC), left insula, and right supplementary motor area in the CI user group. Also, for the nonlexical stimuli, CI users activated areas comprising the SN such as the right insula and left dACC. Previous scattered observations on lexical stimulus processing using the dual auditory stream in CI users were reconfirmed in this study. However, this study also suggests that dual-stream auditory processing in CI users may need support from the SN. In other words, CI users need to pay extra attention to cope with the degraded auditory signal provided by the implant. © 2015 Wiley Periodicals, Inc.

  7. Using EHR audit trail logs to analyze clinical workflow: A case study from community-based ambulatory clinics.

    PubMed

    Wu, Danny T Y; Smart, Nikolas; Ciemins, Elizabeth L; Lanham, Holly J; Lindberg, Curt; Zheng, Kai

    2017-01-01

    Understanding clinical workflow is a critical first step toward developing a workflow-supported clinical documentation system. While time-and-motion studies have been regarded as the gold standard of workflow analysis, this method can be resource-consuming, and its data may be biased due to the cognitive limitations of human observers. In this study, we aimed to evaluate the feasibility and validity of using EHR audit trail logs to analyze clinical workflow. Specifically, we compared three known workflow changes from our previous study with the corresponding EHR audit trail logs of the study participants. The results showed that EHR audit trail logs can be a valid source for clinical workflow analysis and can provide an objective view of clinicians' behaviors, multi-dimensional comparisons, and a highly extensible analysis framework.
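
    As an illustration of the kind of analysis described here, the sketch below derives per-clinician event sequences and inter-event durations from a hypothetical audit-log extract. The column names and actions are assumptions for illustration; real EHR audit schemas vary by vendor.

        # Hypothetical EHR audit-log extract: one row per user action.
        import io
        import pandas as pd

        raw = io.StringIO(
            "user,timestamp,action\n"
            "dr_a,2017-03-01 09:00:05,open_chart\n"
            "dr_a,2017-03-01 09:02:40,write_note\n"
            "dr_a,2017-03-01 09:10:12,place_order\n"
            "dr_b,2017-03-01 09:01:00,open_chart\n"
            "dr_b,2017-03-01 09:01:55,review_results\n"
        )
        log = pd.read_csv(raw, parse_dates=["timestamp"]).sort_values(["user", "timestamp"])

        # Time between consecutive actions approximates task duration.
        log["gap_s"] = log.groupby("user")["timestamp"].diff().dt.total_seconds()

        # Per-user action sequences: the raw material for workflow comparison.
        print(log.groupby("user")["action"].agg(" -> ".join))
        print(log.groupby("action")["gap_s"].mean())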

  8. Microbial diversity and metabolite composition of Belgian red-brown acidic ales.

    PubMed

    Snauwaert, Isabel; Roels, Sanne P; Van Nieuwerburg, Filip; Van Landschoot, Anita; De Vuyst, Luc; Vandamme, Peter

    2016-03-16

    Belgian red-brown acidic ales are sour and alcoholic fermented beers, which are produced by mixed-culture fermentation and blending. The brews are aged in oak barrels for about two years, after which mature beer is blended with young, non-aged beer to obtain the end-products. The present study evaluated the microbial community diversity of Belgian red-brown acidic ales at the end of the maturation phase of three subsequent brews of three different breweries. The microbial diversity was compared with the metabolite composition of the brews at the end of the maturation phase. To this end, mature brew samples were subjected to 454 pyrosequencing of the 16S rRNA gene (bacteria) and the internal transcribed spacer region (yeasts), and a broad range of metabolites was quantified. The most important microbial species present in the Belgian red-brown acidic ales investigated were Pediococcus damnosus, Dekkera bruxellensis, and Acetobacter pasteurianus. In addition, this culture-independent analysis revealed operational taxonomic units that were assigned to an unclassified fungal community member, Candida, and Lactobacillus. The main metabolites present in the brew samples were L-lactic acid, D-lactic acid, and ethanol, whereas acetic acid was produced in lower quantities. The most prevalent aroma compounds were ethyl acetate, isoamyl acetate, ethyl hexanoate, and ethyl octanoate, which may have an impact on the aroma of the end-products. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. SMITH: a LIMS for handling next-generation sequencing workflows

    PubMed Central

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality, and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine that performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934

  10. SMITH: a LIMS for handling next-generation sequencing workflows.

    PubMed

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality, and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine that performs de-multiplexing, quality control, alignments, etc. SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis.
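
    The attribute-value table described above is a generic pattern for unconstrained sample metadata. The sketch below illustrates it with SQLite standing in for SMITH's MySQL backend; the table and column names are illustrative, not SMITH's actual schema.

        # Attribute-value (key-value) metadata pattern, which makes samples
        # searchable without a fixed schema. SQLite stands in for MySQL here.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE sample (id INTEGER PRIMARY KEY, name TEXT)")
        con.execute("CREATE TABLE sample_attr (sample_id INTEGER, key TEXT, value TEXT)")

        con.execute("INSERT INTO sample VALUES (1, 'S1'), (2, 'S2')")
        con.executemany("INSERT INTO sample_attr VALUES (?, ?, ?)", [
            (1, "organism", "H. sapiens"), (1, "assay", "ChIP-Seq"),
            (2, "organism", "M. musculus"), (2, "assay", "RNA-Seq"),
        ])

        # Find all ChIP-Seq samples; arbitrary keys can be queried the same way.
        rows = con.execute("""
            SELECT s.name FROM sample s
            JOIN sample_attr a ON a.sample_id = s.id
            WHERE a.key = 'assay' AND a.value = 'ChIP-Seq'
        """).fetchall()
        print(rows)  # [('S1',)]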

  11. Differences in effectiveness of the active living every day program for older adults with arthritis.

    PubMed

    Sperber, Nina R; Allen, Kelli D; Devellis, Brenda M; Devellis, Robert F; Lewis, Megan A; Callahan, Leigh F

    2013-10-01

    The authors explored whether demographic and psychosocial variables predicted differences in physical activity for participants with arthritis in a trial of Active Living Every Day (ALED). Participants (N = 280) from 17 community sites were randomized into ALED or usual care. The authors assessed participant demographic characteristics, self-efficacy, outcome expectations, pain, fatigue, and depressive symptoms at baseline and physical activity frequency at 20-wk follow-up. They conducted linear regression with interaction terms (Baseline Characteristic × Randomization Group). Being female (p ≤ .05), less depressed (p ≤ .05), or younger (p ≤ .10) was associated with more frequent posttest physical activity for ALED participants than for those with usual care. Higher education was associated with more physical activity for both ALED and usual-care groups. ALED was particularly effective for female, younger, and less depressed participants. Further research should determine whether modifications could produce better outcomes in other subgroups.
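
    The moderation analysis described above (Baseline Characteristic × Randomization Group) is an ordinary linear regression with product terms. The sketch below shows the general form on synthetic data; the variable names and simulated effects are illustrative and are not the study's data.

        # Linear regression with a baseline-by-group interaction term, on synthetic data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 280
        df = pd.DataFrame({
            "aled": rng.integers(0, 2, n),       # 1 = ALED, 0 = usual care
            "depress": rng.normal(10, 3, n),     # baseline depressive symptoms
        })
        # Simulate: ALED helps more when baseline depression is low (negative interaction).
        df["activity"] = (3 + 2.0 * df["aled"] - 0.15 * df["aled"] * df["depress"]
                          + rng.normal(0, 1, n))

        # 'aled:depress' tests whether the treatment effect varies with baseline depression.
        model = smf.ols("activity ~ aled * depress", data=df).fit()
        print(model.params[["aled", "aled:depress"]])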

  12. Variation in the Gender Gap in Inactive and Active Life Expectancy by the Definition of Inactivity Among Older Adults.

    PubMed

    Malhotra, Rahul; Chan, Angelique; Ajay, Shweta; Ma, Stefan; Saito, Yasuhiko

    2016-10-01

    To assess variation in gender gap (female-male) in inactive life expectancy (IALE) and active life expectancy (ALE) by definition of inactivity. Inactivity, among older Singaporeans, was defined as follows: Scenario 1-health-related difficulty in activities of daily living (ADLs); Scenario 2-health-related difficulty in ADLs/instrumental ADLs (IADLs); Scenario 3-health-related difficulty in ADLs/IADLs or non-health-related non-performance of IADLs. Multistate life tables computed IALE and ALE at age 60, testing three hypotheses: In all scenarios, life expectancy, absolute and relative IALE, and absolute ALE are higher for females (Hypothesis 1 [H1]); gender gap in absolute and relative IALE expands, and in absolute ALE, it contracts in Scenario 2 versus 1 (Hypothesis 2 [H2]); gender gap in absolute and relative IALE decreases, and in absolute ALE, it increases in Scenario 3 versus 2 (Hypothesis 3 [H3]). H1 was supported in Scenarios 1 and 3 but not Scenario 2. Both H2 and H3 were supported. Definition of inactivity influences gender gap in IALE and ALE. © The Author(s) 2016.
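
    The study itself uses multistate life tables; as a simpler worked illustration of how an inactivity prevalence splits remaining life expectancy into ALE and IALE, the sketch below applies the prevalence-based Sullivan method instead. All numbers are invented for illustration.

        # Sullivan-method split of remaining life expectancy at age 60 into
        # active (ALE) and inactive (IALE) components. L[x] = person-years
        # lived in each age interval per survivor at 60; pi[x] = prevalence
        # of inactivity. All values are invented.
        L = {60: 4.8, 65: 4.5, 70: 4.0, 75: 3.2, 80: 2.1, 85: 1.0}
        pi = {60: 0.05, 65: 0.08, 70: 0.14, 75: 0.25, 80: 0.42, 85: 0.65}

        LE = sum(L.values())                          # total life expectancy
        IALE = sum(L[a] * pi[a] for a in L)           # inactive years
        ALE = LE - IALE                               # active years
        print(f"LE={LE:.2f}  ALE={ALE:.2f}  IALE={IALE:.2f}")
        # Broader definitions of inactivity raise pi[x], shifting years from ALE
        # to IALE, which is why the gender gap varies across the three scenarios.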

  13. An improved design method for EPC middleware

    NASA Astrophysics Data System (ADS)

    Lou, Guohuan; Xu, Ran; Yang, Chunming

    2014-04-01

    To address the problems and difficulties that small and medium enterprises currently encounter when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, and based on an analysis of the principles of EPC middleware, an improved design method for EPC middleware is presented. This method leverages the powerful functionality of the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application programming interface, to achieve middleware with general functionality. This structure is simple and easy to implement and maintain. Under this structure, different types of reader-writers can be added and configured conveniently, and the expandability of the system is improved.
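
    The design described here routes reader events through a database rather than through a full ALE API implementation. The sketch below illustrates that decoupling, with SQLite standing in for MySQL; the table layout and function names are illustrative and are not part of the EPC ALE specification.

        # Reader side writes raw tag reads into a table; the application side
        # polls the table and aggregates reads into ALE-style reports. SQLite
        # stands in for the MySQL database; the schema is illustrative.
        import sqlite3, time

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE tag_read (reader TEXT, epc TEXT, ts REAL)")

        def reader_writes(reader_id, epcs):          # reader side: insert raw reads
            db.executemany("INSERT INTO tag_read VALUES (?, ?, ?)",
                           [(reader_id, e, time.time()) for e in epcs])

        def application_report(since):               # app side: distinct EPCs per reader
            return db.execute("""SELECT reader, COUNT(DISTINCT epc) FROM tag_read
                                 WHERE ts >= ? GROUP BY reader""", (since,)).fetchall()

        t0 = time.time()
        reader_writes("dock-door-1", ["urn:epc:id:sgtin:0614141.107346.2017",
                                      "urn:epc:id:sgtin:0614141.107346.2018"])
        print(application_report(t0))  # [('dock-door-1', 2)]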

  14. Toward Improved Fidelity of Thermal Explosion Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, A L; Becker, R; Howard, W M

    2009-07-17

    We will present results of an effort to improve the thermal/chemical/mechanical modeling of HMX-based explosives like LX04 and LX10 for thermal cook-off. The original HMX model and analysis scheme were developed by Yoh et al. for use in the ALE3D modeling framework. The current results were built to remedy the deficiencies of that original model. We concentrated our efforts in four areas. The first area was the addition of porosity to the chemical material model framework in ALE3D that is used to model the HMX explosive formulation. This is needed to handle the roughly 2% porosity in solid explosives. The second area was the improvement of the HMX reaction network, which included a reactive phase-change model based on work by Henson et al. The third area required adding early decomposition gas species to the CHEETAH material database to develop more accurate equations of state for gaseous intermediates and products. Finally, it was necessary to improve the implicit mechanics module in ALE3D to more naturally handle the long time scales associated with thermal cook-off. The application of the resulting framework to the analysis of the Scaled Thermal Explosion (STEX) experiments will be discussed.

  15. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high-fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced-order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced-order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
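
    As a compact illustration of the POD/Galerkin idea mentioned above, the sketch below extracts POD modes from a snapshot matrix with an SVD and projects the data onto the leading modes. This is the generic textbook construction, not the authors' code; the toy snapshot data is invented.

        # Proper orthogonal decomposition of a snapshot matrix via SVD, and a
        # Galerkin-style projection onto the leading modes.
        import numpy as np

        rng = np.random.default_rng(1)
        n_dof, n_snap = 200, 40
        # Toy snapshot matrix: two coherent structures plus noise.
        x = np.linspace(0, 1, n_dof)[:, None]
        t = np.linspace(0, 1, n_snap)[None, :]
        snapshots = (np.sin(np.pi * x) @ np.cos(2 * np.pi * t)
                     + 0.3 * np.sin(3 * np.pi * x) @ np.sin(4 * np.pi * t)
                     + 0.01 * rng.normal(size=(n_dof, n_snap)))

        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.999) + 1)   # modes capturing 99.9% of energy
        phi = U[:, :r]                                # POD basis

        # Reduced coordinates and reconstruction error of the rank-r model.
        a = phi.T @ snapshots
        err = np.linalg.norm(snapshots - phi @ a) / np.linalg.norm(snapshots)
        print(f"kept {r} modes, relative reconstruction error {err:.2e}")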

  16. The influence of emotional interference on cognitive control: A meta-analysis of neuroimaging studies using the emotional Stroop task.

    PubMed

    Song, Sensen; Zilverstand, Anna; Song, Hongwen; d'Oleire Uquillas, Federico; Wang, Yongming; Xie, Chao; Cheng, Li; Zou, Zhiling

    2017-05-18

    The neural correlates underlying the influence of emotional interference on cognitive control remain a topic of discussion. Here, we assessed 16 neuroimaging studies that used an emotional Stroop task and that reported a significant interaction effect between emotion (stimulus type) and cognitive conflict. These studies comprised a total of 330 participants and contributed 132 foci to an activation likelihood estimation (ALE) analysis. Results revealed consistent brain activation patterns related to emotionally salient stimuli (as compared to emotionally neutral trials) during cognitive conflict trials [incongruent trials (with task-irrelevant information interfering) versus congruent/baseline trials (with less disturbance from task-irrelevant information)], spanning the lateral prefrontal cortex (dorsolateral prefrontal cortex and inferior frontal gyrus), the medial prefrontal cortex, and the dorsal anterior cingulate cortex. Comparing mild emotional interference trials (without semantic conflict) versus intense emotional interference trials (with semantic conflict) revealed that, while concurrent activation in similar brain regions as mentioned above was found for intense emotional interference trials, activation for mild emotional interference trials was found only in the precentral/postcentral gyrus. These data provide evidence for the potential neural mechanisms underlying the influence of emotional interference on cognitive control, and further elucidate an important distinction in brain activation patterns for different levels of emotional conflict across emotional Stroop tasks.

  17. Simulating Small-Scale Experiments of In-Tunnel Airblast Using STUN and ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuscamman, Stephanie; Glenn, Lewis; Schebler, Gregory

    2011-09-12

    This report details continuing validation efforts for the Sphere and Tunnel (STUN) and ALE3D codes. STUN has been validated previously for blast propagation through tunnels using several sets of experimental data with varying charge sizes and tunnel configurations, including the MARVEL nuclear-driven shock tube experiment (Glenn, 2001). The DHS-funded STUNTool version is compared to experimental data and the LLNL ALE3D hydrocode. In this particular study, we compare the performance of the STUN and ALE3D codes in modeling an in-tunnel airblast against experimental results obtained by Lunderman and Ohrt in a series of small-scale high-explosive experiments (1997).

  18. Bioinformatics workflows and web services in systems biology made easy for experimentalists.

    PubMed

    Jimenez, Rafael C; Corpas, Manuel

    2013-01-01

    Workflows are useful for performing data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming and web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them, making it easier for experimentalists to perform computational tasks.

  19. Investigation of the in vivo antioxidative activity of Cynara scolymus (artichoke) leaf extract in the streptozotocin-induced diabetic rat.

    PubMed

    Magielse, Joanna; Verlaet, Annelies; Breynaert, Annelies; Keenoy, Begoña Manuel Y; Apers, Sandra; Pieters, Luc; Hermans, Nina

    2014-01-01

    The in vivo antioxidant activity of a quantified leaf extract of Cynara scolymus (artichoke) was studied. The aqueous artichoke leaf extract (ALE), containing 1.5% caffeoylquinic acid with chlorogenic acid being most abundant (0.30%) and luteolin-7-O-glucoside as the major flavonoid (0.15%), was investigated by evaluating its effect on different oxidative stress biomarkers after 3 wk of oral supplementation in the streptozotocin-induced diabetic rat model. Apart from the two test groups (0.2 g ALE/kg BW/day and 1 g ALE/kg BW/day, where BW is body weight), a healthy control group, an untreated oxidative stress group, and a vitamin E-treated group (positive control) were included. ALE at 0.2 g/kg BW/day decreased oxidative stress: malondialdehyde and 8-hydroxydeoxyguanosine levels diminished significantly, whereas erythrocyte glutathione levels increased significantly. ALE at 1.0 g/kg BW/day did not show higher antioxidant activity. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis

    PubMed Central

    Gadelha, Luiz; Ribeiro-Alves, Marcelo; Porto, Fábio

    2017-01-01

    There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes, and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties both for subsequent inspection of results and for meta-analysis incorporating new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, considerable effort is required to run in-silico experiments and to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clustering and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were analyzed. The results are integrated into GeNNet-DB, a database about genes, clusters, experiments and their properties and relationships. The resulting graph database is explored with queries that demonstrate the expressiveness of this data model for reasoning about gene interaction networks. GeNNet is the first platform to integrate the analytical process of transcriptome data with graph databases. It provides a comprehensive set of tools that would otherwise be challenging for non-expert users to install and use. Developers can add new functionality to components of GeNNet. The derived data allow for testing previous hypotheses about an experiment and exploring new ones through the interactive graph database environment. It enables the analysis of data on humans, rhesus monkeys, mice, and rats from Affymetrix platforms. GeNNet is available as an open source platform at https://github.com/raquele/GeNNet and can be retrieved as a software container with the command docker pull quelopes/gennet. PMID:28695067

  1. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis.

    PubMed

    Costa, Raquel L; Gadelha, Luiz; Ribeiro-Alves, Marcelo; Porto, Fábio

    2017-01-01

    There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes, and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties both for subsequent inspection of results and for meta-analysis incorporating new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, considerable effort is required to run in-silico experiments and to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clustering and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were analyzed. The results are integrated into GeNNet-DB, a database about genes, clusters, experiments and their properties and relationships. The resulting graph database is explored with queries that demonstrate the expressiveness of this data model for reasoning about gene interaction networks. GeNNet is the first platform to integrate the analytical process of transcriptome data with graph databases. It provides a comprehensive set of tools that would otherwise be challenging for non-expert users to install and use. Developers can add new functionality to components of GeNNet. The derived data allow for testing previous hypotheses about an experiment and exploring new ones through the interactive graph database environment. It enables the analysis of data on humans, rhesus monkeys, mice, and rats from Affymetrix platforms. GeNNet is available as an open source platform at https://github.com/raquele/GeNNet and can be retrieved as a software container with the command docker pull quelopes/gennet.
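
    To give a flavour of the graph-database exploration described above, the sketch below runs a Cypher query through the official neo4j Python driver. The node labels and properties (Gene, BELONGS_TO, Cluster, log2_fold_change) are hypothetical and are not taken from GeNNet-DB's actual schema; the connection details are placeholders.

        # Querying a gene-interaction graph through the neo4j Python driver.
        # Labels, properties, and credentials below are illustrative only.
        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

        query = """
        MATCH (g:Gene)-[:BELONGS_TO]->(c:Cluster {id: $cluster_id})
        WHERE g.log2_fold_change > $lfc
        RETURN g.symbol AS symbol, g.log2_fold_change AS lfc
        ORDER BY lfc DESC
        """

        with driver.session() as session:
            for record in session.run(query, cluster_id=3, lfc=1.0):
                print(record["symbol"], record["lfc"])
        driver.close()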

  2. An unstructured mesh arbitrary Lagrangian-Eulerian unsteady incompressible flow solver and its application to insect flight aerodynamics

    NASA Astrophysics Data System (ADS)

    Su, Xiaohui; Cao, Yuanwei; Zhao, Yong

    2016-06-01

    In this paper, an unstructured mesh Arbitrary Lagrangian-Eulerian (ALE) incompressible flow solver is developed to investigate the aerodynamics of insect hovering flight. The proposed finite-volume ALE Navier-Stokes solver is based on the artificial compressibility method (ACM) with a high-resolution characteristics-based scheme on unstructured grids. The present ALE model is validated and assessed through flow past an oscillating cylinder. Good agreement with experimental results and other numerical solutions is obtained, which demonstrates the accuracy and the capability of the present model. The lift generation mechanisms of a 2D wing in hovering motion, including wake capture, delayed stall, rapid pitch, as well as clap and fling, are then studied and illustrated using the current ALE model. Moreover, the optimized angular amplitude in the symmetry model, 45°, is reported in detail for the first time using averaged lift and the energy power method. In addition, the lift generation of the complete cyclic clap-and-fling motion, which few researchers have simulated using the ALE method because of the large deformations involved, is studied and clarified for the first time. The present ALE model is found to be a useful tool for investigating the lift generation mechanism of insect wing flight.
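
    For orientation, the essence of the ALE formulation used in these solvers is that convection is written relative to the mesh velocity. A standard textbook statement for incompressible flow (not the authors' exact discretization) is:

        % ALE form of the incompressible Navier-Stokes equations: the time
        % derivative is taken at fixed referential (mesh) coordinates chi, and
        % convection uses the fluid velocity relative to the mesh velocity u_m.
        \left.\frac{\partial \mathbf{u}}{\partial t}\right|_{\chi}
          + \big((\mathbf{u}-\mathbf{u}_m)\cdot\nabla\big)\,\mathbf{u}
          = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
        \qquad \nabla\cdot\mathbf{u} = 0 .

    When u_m = 0 this reduces to the usual Eulerian form, and when u_m = u it becomes Lagrangian; the geometric conservation law constrains how the moving mesh sweeps volume so that a uniform flow is preserved exactly.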

  3. Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state of the art in scientific workflows have focused on the following areas, with progress in each described in subsequent sections: workflow development, i.e., the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows; generic workflow components and templates, i.e., the development of generic actors (workflow components and processes) which can be broadly applied to scientific problems; provenance collection and analysis, i.e., the design of a flexible provenance collection and analysis infrastructure within the workflow environment; and workflow reliability and fault tolerance, i.e., the improvement of the reliability and fault tolerance of workflow environments.

  4. A meta-analysis of fMRI studies on Chinese orthographic, phonological, and semantic processing.

    PubMed

    Wu, Chiao-Yi; Ho, Moon-Ho Ringo; Chen, Shen-Hsing Annabel

    2012-10-15

    A growing body of neuroimaging evidence has shown that Chinese character processing recruits differential activation from alphabetic languages due to its unique linguistic features. As more investigations on Chinese character processing have recently become available, we applied a meta-analytic approach to summarize previous findings and examined the neural networks for orthographic, phonological, and semantic processing of Chinese characters independently. The activation likelihood estimation (ALE) method was used to analyze eight studies in the orthographic task category, eleven in the phonological and fifteen in the semantic task categories. Converging activation among three language-processing components was found in the left middle frontal gyrus, the left superior parietal lobule and the left mid-fusiform gyrus, suggesting a common sub-network underlying the character recognition process regardless of the task nature. With increasing task demands, the left inferior parietal lobule and the right superior temporal gyrus were specialized for phonological processing, while the left middle temporal gyrus was involved in semantic processing. Functional dissociation was identified in the left inferior frontal gyrus, with the posterior dorsal part for phonological processing and the anterior ventral part for semantic processing. Moreover, bilateral involvement of the ventral occipito-temporal regions was found for both phonological and semantic processing. The results provide better understanding of the neural networks underlying Chinese orthographic, phonological, and semantic processing, and consolidate the findings of additional recruitment of the left middle frontal gyrus and the right fusiform gyrus for Chinese character processing as compared with the universal language network that has been based on alphabetic languages. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. WO3 and W Thermal Atomic Layer Etching Using "Conversion-Fluorination" and "Oxidation-Conversion-Fluorination" Mechanisms.

    PubMed

    Johnson, Nicholas R; George, Steven M

    2017-10-04

    The thermal atomic layer etching (ALE) of WO3 and W was demonstrated with new "conversion-fluorination" and "oxidation-conversion-fluorination" etching mechanisms. Both of these mechanisms are based on sequential, self-limiting reactions. WO3 ALE was achieved by a "conversion-fluorination" mechanism using an AB exposure sequence with boron trichloride (BCl3) and hydrogen fluoride (HF). BCl3 converts the WO3 surface to a B2O3 layer while forming volatile WOxCly products. Subsequently, HF spontaneously etches the B2O3 layer producing volatile BF3 and H2O products. In situ spectroscopic ellipsometry (SE) studies determined that the BCl3 and HF reactions were self-limiting versus exposure. The WO3 ALE etch rates increased with temperature from 0.55 Å/cycle at 128 °C to 4.19 Å/cycle at 207 °C. W served as an etch stop because BCl3 and HF could not etch the underlying W film. W ALE was performed using a three-step "oxidation-conversion-fluorination" mechanism. In this ABC exposure sequence, the W surface is first oxidized to a WO3 layer using O2/O3. Subsequently, the WO3 layer is etched with BCl3 and HF. SE could simultaneously monitor the W and WO3 thicknesses and conversion of W to WO3. SE measurements showed that the W film thickness decreased linearly with the number of ABC reaction cycles. W ALE was shown to be self-limiting with respect to each reaction in the ABC process. The etch rate for W ALE was ∼2.5 Å/cycle at 207 °C. An oxide thickness of ∼20 Å remained after W ALE, but could be removed by sequential BCl3 and HF exposures without affecting the W layer. These new etching mechanisms will enable the thermal ALE of a variety of additional metal materials including those that have volatile metal fluorides.
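
    Because each self-limiting ALE cycle removes a fixed increment, process planning reduces to simple arithmetic; the sketch below uses the per-cycle rates quoted above, with the target thickness chosen arbitrarily for illustration.

        # Self-limiting ALE removes a fixed thickness per cycle, so the cycle
        # count for a target etch depth follows directly from the quoted rates.
        import math

        rates = {"WO3 @ 128 C": 0.55, "WO3 @ 207 C": 4.19, "W @ 207 C": 2.5}  # Angstrom/cycle
        target = 50.0  # Angstrom (5 nm), an arbitrary illustrative target

        for process, rate in rates.items():
            print(f"{process}: {math.ceil(target / rate)} cycles for {target:.0f} A")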

  6. Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Our contributions to advancing the state of the art in scientific workflows have focused on the following areas: workflow development; generic workflow components and templates; provenance collection and analysis; and workflow reliability and fault tolerance.

  7. Perceiving emotional expressions in others: Activation likelihood estimation meta-analyses of explicit evaluation, passive perception and incidental perception of emotions.

    PubMed

    Dricu, Mihai; Frühholz, Sascha

    2016-12-01

    We conducted a series of activation likelihood estimation (ALE) meta-analyses to determine the commonalities and distinctions between separate levels of emotion perception, namely incidental perception, passive perception, and explicit evaluation of emotional expressions. Pooling together more than 180 neuroimaging experiments using facial, vocal or body expressions, our results are threefold. First, explicitly evaluating the emotions of others recruits brain regions associated with the sensory processing of expressions, such as the inferior occipital gyrus, middle fusiform gyrus and the superior temporal gyrus, and brain regions involved in low-level and high-level mindreading, namely the posterior superior temporal sulcus, the inferior frontal cortex and dorsomedial frontal cortex. Second, we show that only the sensory regions were also consistently active during the passive perception of emotional expressions. Third, we show that the brain regions involved in mindreading were active during the explicit evaluation of both facial and vocal expressions. We discuss these results in light of the existing literature and conclude by proposing a cognitive model for perceiving and evaluating the emotions of others. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    PubMed

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Fluid-Structure Interaction Simulation of Prosthetic Aortic Valves: Comparison between Immersed Boundary and Arbitrary Lagrangian-Eulerian Techniques for the Mesh Representation

    PubMed Central

    Iannaccone, Francesco; Degroote, Joris; Vierendeels, Jan; Segers, Patrick

    2016-01-01

    In recent years the role of FSI (fluid-structure interaction) simulations in the analysis of the fluid mechanics of heart valves has become more and more important, as they are able to capture the interaction between the blood and both the surrounding biological tissues and the valve itself. When setting up an FSI simulation, several choices have to be made to select the most suitable approach for the case of interest: in particular, to simulate flexible-leaflet cardiac valves, the type of discretization of the fluid domain is crucial; it can be described with an ALE (Arbitrary Lagrangian-Eulerian) or an Eulerian formulation. The majority of the reported 3D heart valve FSI simulations are performed with the Eulerian formulation, allowing for large deformations of the domains without compromising the quality of the fluid grid. Nevertheless, it is known that the ALE-FSI approach guarantees more accurate results at the interface between the solid and the fluid. The goal of this paper is to describe the same aortic valve model in the two cases, comparing the performance of an ALE-based FSI solution and an Eulerian-based FSI approach. After a first simplified 2D case, the aortic geometry was considered in a full 3D set-up. The model was kept as similar as possible in the two settings, to better compare the simulations' outcomes. Although for the 2D case the differences were negligible, in our experience the performance of a full 3D ALE-FSI simulation was significantly limited by the technical problems and requirements inherent to the ALE formulation, mainly related to the mesh motion and deformation of the fluid domain. As a secondary outcome of this work, it is important to point out that the choice of the solver also influenced the reliability of the final results. PMID:27128798

  10. Evolution of Escherichia coli to 42 °C and subsequent genetic engineering reveals adaptive mechanisms and novel mutations.

    PubMed

    Sandberg, Troy E; Pedersen, Margit; LaCroix, Ryan A; Ebrahim, Ali; Bonde, Mads; Herrgard, Markus J; Palsson, Bernhard O; Sommer, Morten; Feist, Adam M

    2014-10-01

    Adaptive laboratory evolution (ALE) has emerged as a valuable method by which to investigate microbial adaptation to a desired environment. Here, we performed ALE to 42 °C of ten parallel populations of Escherichia coli K-12 MG1655 grown in glucose minimal media. Tightly controlled experimental conditions allowed selection based on exponential-phase growth rate, yielding strains that uniformly converged toward a similar phenotype along distinct genetic paths. Adapted strains possessed as few as 6 and as many as 55 mutations, and of the 144 genes that mutated in total, 14 arose independently across two or more strains. This mutational recurrence pointed to the key genetic targets underlying the evolved fitness increase. Genome engineering was used to introduce the novel ALE-acquired alleles in random combinations into the ancestral strain, and competition between these engineered strains reaffirmed the impact of the key mutations on the growth rate at 42 °C. Interestingly, most of the identified key gene targets differed significantly from those found in similar temperature adaptation studies, highlighting the sensitivity of genetic evolution to experimental conditions and ancestral genotype. Additionally, transcriptomic analysis of the ancestral and evolved strains revealed a general trend for restoration of the global expression state back toward pre-heat-stress levels. This restorative effect was previously documented following evolution to metabolic perturbations, and thus may represent a general feature of ALE experiments. The widespread evolved expression shifts were enabled by a comparatively scant number of regulatory mutations, providing a net fitness benefit but causing suboptimal expression levels for certain genes, such as those governing flagellar formation, which then became targets for additional ameliorating mutations. Overall, the results of this study provide insight into the adaptation process and yield lessons important for the future implementation of ALE as a tool for scientific research and engineering. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  11. Matching visual and nonvisual signals: evidence for a mechanism to discount optic flow during locomotion

    NASA Astrophysics Data System (ADS)

    Thurrell, Adrian; Pelah, Adar

    2005-03-01

    We report on recent experiments to investigate the Arthrovisual Locomotor Effect (ALE), a mechanism based on non-visual signals postulated to discount or remove the self-generated visual motion signals during locomotion. It is shown that perceptual matches made by standing subjects to a constant motion optic flow stimulus that is viewed while walking on a treadmill are linearly reduced by walking speed, a measure of the reported ALE. The degree of reduction in perceived speed depends on the similarity of the motor activity to natural locomotion, thus for the four activities tested, ALE strength is ranked as follows: Walking > Cycling > Hand Pedalling > Finger Tapping = 0. Other variations and important controls for the ALE are described.

  12. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications, and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.

  13. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet, and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not usable by the majority of researchers, who lack these skills. A portal enabling such researchers to benefit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflow processing steps are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  14. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet, and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. They are therefore not viable for the majority of researchers who lack these skills, and a portal enabling such researchers to profit from these new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations, and results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal, and they are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  15. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data

    PubMed Central

    Denis, Jean-Baptiste; Vandenbogaert, Mathias; Caro, Valérie

    2016-01-01

    The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface that facilitates the management and bioinformatics analysis of metagenomic data samples. It was engineered to run associated, dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users’ input data and its metadata through a set of bio-IT resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user’s input data, from loading and indexing to mapping, assembly and database searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy’s main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that intuitive exploratory tools, such as Krona for the representation of taxonomic classifications, can be integrated very easily. In keeping with the spirit of Galaxy, the interface enables the sharing of scientific results with fellow team members. PMID:28451381

  16. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data.

    PubMed

    Correia, Damien; Doppelt-Azeroual, Olivia; Denis, Jean-Baptiste; Vandenbogaert, Mathias; Caro, Valérie

    2015-01-01

    The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface that facilitates the management and bioinformatics analysis of metagenomic data samples. It was engineered to run associated, dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users' input data and its metadata through a set of bio-IT resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data, from loading and indexing to mapping, assembly and database searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that intuitive exploratory tools, such as Krona for the representation of taxonomic classifications, can be integrated very easily. In keeping with the spirit of Galaxy, the interface enables the sharing of scientific results with fellow team members.
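
    Since the abstract names BioBlend as the bridge to Galaxy, a minimal sketch of that style of interaction may be useful. The server URL, API key, workflow name and input mapping below are hypothetical placeholders, not MetaGenSense's actual configuration.

        # Hedged BioBlend sketch; URL, key and workflow name are placeholders.
        from bioblend.galaxy import GalaxyInstance

        gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

        # Upload a sample's reads into a fresh history.
        history = gi.histories.create_history(name="sample-001")
        upload = gi.tools.upload_file("sample-001.fastq", history["id"])
        dataset_id = upload["outputs"][0]["id"]

        # Find a metagenomic classification workflow by name and invoke it
        # on the uploaded dataset (step "0" assumed to be the input step).
        workflow = gi.workflows.get_workflows(name="metagenomic-classification")[0]
        inputs = {"0": {"src": "hda", "id": dataset_id}}
        invocation = gi.workflows.invoke_workflow(
            workflow["id"], inputs=inputs, history_id=history["id"])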

  17. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    PubMed

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. This lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of the two systems; it limits the sharing of workflows between the user communities and leads to duplication of development effort. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of reusable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing workflow patterns to simplify workflow creation. It enables the reuse and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with the increasing size of data and complexity of analysis. The system can be accessed either through a cloud-enabled web interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.
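
    The hierarchical-workflow and workflow-pattern concepts can be illustrated with a short sketch. This is not Tavaxy code; under assumed toy interfaces, it shows how a "sequence" pattern composes sub-workflows that each delegate to a different engine.

        # Toy sketch of workflow patterns over engine-specific sub-workflows.
        class SubWorkflow:
            def __init__(self, name, engine, run_fn):
                self.name = name
                self.engine = engine   # "taverna" or "galaxy"
                self.run_fn = run_fn   # callable delegating to that engine

            def run(self, data):
                return self.run_fn(data)

        class SequencePattern:
            """Workflow pattern: run sub-workflows one after another."""
            def __init__(self, steps):
                self.steps = steps

            def run(self, data):
                for step in self.steps:
                    data = step.run(data)
                return data

        pipeline = SequencePattern([
            SubWorkflow("qc", "galaxy", lambda d: d + ["qc-report"]),
            SubWorkflow("align", "taverna", lambda d: d + ["alignment"]),
        ])
        result = pipeline.run(["reads.fastq"])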

  18. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community with an unprecedented opportunity to collaborate by sharing workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like a Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open-source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source ActiveBPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. They will be useful across science domains, mission teams, research projects and organizations, helping to solve the "sociological" problem of bringing together disparate groups of researchers as well as the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0 and the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  19. Optimized adipose tissue engineering strategy based on a neo-mechanical processing method.

    PubMed

    He, Yunfan; Lin, Maohui; Wang, Xuecen; Guan, Jingyan; Dong, Ziqing; Feng, Lu; Xing, Malcolm; Feng, Chuanbo; Li, Xiaojian

    2018-05-26

    Decellularized adipose tissue (DAT) represents a promising scaffold for adipose tissue engineering. However, the unique and prolonged lipid removal process required for adipose tissue can damage extracellular matrix (ECM) constituents. Moreover, inadequate vascularization limits the recellularization of DAT in vivo. We proposed a neo-mechanical protocol for rapidly breaking adipocytes and removing lipid content from adipose tissue. The lipid-depleted adipose tissue was then subjected to a fast and mild decellularization to fabricate high-quality DAT (M-DAT). Adipose liquid extract (ALE) derived from this mechanical process was collected and incorporated into M-DAT to further optimize in vivo recellularization. Ordinary DAT was fabricated and served as a control. The developed strategy was evaluated based on decellularization efficiency, ECM quality, and recellularization efficiency. The angiogenic factor components and angiogenic potential of ALE were evaluated in vivo and in vitro. M-DAT achieved the same decellularization efficiency as ordinary DAT but exhibited better retention of ECM components and better recellularization. Protein quantification revealed considerable levels of angiogenic factors (basic fibroblast growth factor, epidermal growth factor, transforming growth factor-β1, and vascular endothelial growth factor) in ALE. ALE promoted tube formation in vitro and induced intense angiogenesis in M-DAT in vivo; furthermore, higher expression of the adipogenic factor PPARγ and greater numbers of adipocytes were evident following ALE treatment than with M-DAT alone. Mechanical processing of adipose tissue thus yields high-quality M-DAT and angiogenic factor-enriched ALE, and the combination of ALE and M-DAT could be a promising strategy for engineered adipose tissue construction. This article is protected by copyright. All rights reserved. © 2018 by the Wound Healing Society.

  20. Improved Mobilization of Exogenous Mesenchymal Stem Cells to Bone for Fracture Healing and Sex Difference

    PubMed Central

    Yao, Wei; Evan Lay, Yu-An; Kot, Alexander; Liu, Ruiwu; Zhang, Hongliang; Chen, Haiyan; Lam, Kit; Lane, Nancy E.

    2017-01-01

    Mesenchymal stem cell (MSC) transplantation has been tested in animal and clinical fracture studies. We have developed a bone-seeking compound, LLP2A-Alendronate (LLP2A-Ale), that augments MSC homing to bone. The purpose of this study was to determine whether treatment with LLP2A-Ale or a combination of LLP2A-Ale and MSCs would accelerate bone healing in a mouse closed fracture model, and whether the effects are sex dependent. A right mid-femur fracture was induced in two-month-old osterix-mCherry (Osx-mCherry) male and female reporter mice. The mice were subsequently treated with placebo, LLP2A-Ale (500 µg/kg, IV), MSCs derived from wild-type female Osx-mCherry adipose tissue (ADSC, 3 × 10^5, IV), or ADSC + LLP2A-Ale. In phosphate-buffered saline-treated mice, females had higher systemic and surface-based bone formation than males. However, male mice formed a larger callus and had higher volumetric bone mineral density and bone strength than females. LLP2A-Ale treatment increased exogenous MSC homing to the fracture gaps, enhanced incorporation of these cells into callus formation, and stimulated endochondral bone formation. Additionally, higher engraftment of exogenous MSCs in fracture gaps seemed to contribute to overall fracture healing and improved bone strength. These effects were sex-independent, although there was a sex difference in the rate of fracture healing. Combination treatment with ADSC and LLP2A-Ale was superior with respect to callus formation, independent of sex. Increased mobilization of exogenous MSCs to fracture sites accelerated endochondral bone formation and enhanced bone tissue regeneration. PMID:27334693

  1. Modeling of Complex Coupled Fluid-Structure Interaction Systems in Arbitrary Water Depth

    DTIC Science & Technology

    2008-01-01

    ... model in a particle finite element method (PFEM) based framework for the ALE-RANS solver and submitted a journal paper recently [1]. In the paper, we ... developing a fluid-flexible structure interaction model without free surface using ALE-RANS and a k-ε turbulence closure model implemented by PFEM. In ... the ALE-RANS and k-ε turbulence closure model based on the particle finite element method (PFEM) and obtained some satisfying results [1-2]. The ...

  2. Comparison of updated Lagrangian FEM with arbitrary Lagrangian Eulerian method for 3D thermo-mechanical extrusion of a tube profile

    NASA Astrophysics Data System (ADS)

    Kronsteiner, J.; Horwatitsch, D.; Zeman, K.

    2017-10-01

    Thermo-mechanical numerical modelling and simulation of extrusion processes face several serious challenges. Large plastic deformations, in combination with strong coupling between thermal and mechanical effects, lead to a high numerical demand for the solution as well as for the handling of mesh distortions. The two numerical methods presented in this paper also reflect two different ways to deal with mesh distortions: Lagrangian Finite Element Methods (FEM) tackle distorted elements by building a new mesh (re-meshing), whereas Arbitrary Lagrangian Eulerian (ALE) methods use an "advection" step to remap the solution from the distorted to the undistorted mesh. Another difference between conventional Lagrangian and ALE methods is the separate treatment of material and mesh in ALE, allowing the definition of individual velocity fields. In theory, an ALE formulation contains both the Eulerian formulation and the Lagrangian description of the material as special cases. The investigations presented in this paper dealt with the direct extrusion of a tube profile using EN-AW 6082 aluminum alloy and a comparison of experimental results with Lagrangian and ALE results. The numerical simulations cover billet upsetting and last until one third of the billet length is extruded. A good qualitative correlation between experimental and numerical results was found; however, major differences between the Lagrangian and ALE methods concerning thermo-mechanical coupling lead to deviations in the thermal results.
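
    The ALE "advection"/remap step mentioned above can be made concrete with a one-dimensional sketch: cell averages of a conserved quantity are transferred from the deformed Lagrangian mesh to the rezoned mesh by exact overlap integration, which conserves the total. This shows only the first-order idea; production ALE codes use higher-order reconstructions.

        # First-order conservative remap between 1D meshes (illustrative only).
        import numpy as np

        def remap(old_edges, old_avgs, new_edges):
            """Remap cell averages from old_edges onto new_edges conservatively."""
            new_avgs = np.zeros(len(new_edges) - 1)
            for i in range(len(new_edges) - 1):
                a, b = new_edges[i], new_edges[i + 1]
                total = 0.0
                for j in range(len(old_edges) - 1):
                    # Overlap of new cell [a, b] with old cell j.
                    lo = max(a, old_edges[j])
                    hi = min(b, old_edges[j + 1])
                    if hi > lo:
                        total += old_avgs[j] * (hi - lo)
                new_avgs[i] = total / (b - a)
            return new_avgs

        old_edges = np.array([0.0, 0.8, 2.1, 3.0])  # distorted Lagrangian mesh
        new_edges = np.linspace(0.0, 3.0, 4)        # smoothed, rezoned mesh
        q_new = remap(old_edges, np.array([1.0, 2.0, 0.5]), new_edges)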

  3. Diffraction-limited Mid-infrared Integral Field Spectroscopy of Io's Volcanic Activity with ALES on the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Skrutskie, Michael F.; de Kleer, Katherine R.; Stone, Jordan; Conrad, Al; Davies, Ashley; de Pater, Imke; Leisenring, Jarron; Hinz, Philip; Skemer, Andrew; Veillet, Christian; Woodward, Charles E.; Ertel, Steve; Spalding, Eckhart

    2017-10-01

    The Arizona Lenslet for Exoplanet Spectroscopy (ALES) is an enhancement to the Large Binocular Telescope's mid-infrared imager, LMIRcam, that permits low-resolution (R~20) spectroscopy between 2.8 and 4.2 μm of every diffraction-limited resolution element in a 2.5"x2.5" field-of-view on a 2048x2048 HAWAII-2RG 5.2 μm-cutoff array. The 1" disk of Io, dotted with powerful self-luminous volcanic eruptions, provides an ideal target for ALES: the diffraction-limited scale of a single 8.4-meter aperture, projected onto Io at opposition, ranges from 240 kilometers (80 milliarcseconds) at 2.8 μm to 360 kilometers (120 milliarcseconds) at 4.2 μm. ALES provides the capability to assess the color temperature of each volcanic thermal emission site as well as to map broadband absorbers such as SO2 frost. A monitoring campaign in the Spring 2017 semester provided two global snapshots of Io's volcanic activity with ALES, as well as characterization of a new brightening episode at Loki Patera over four epochs between January and May 2017.

  4. Scientific Workflows + Provenance = Better (Meta-)Data Management

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data are enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
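
    As a concrete illustration of recording retrospective provenance in the W3C PROV model, the sketch below uses the open-source Python "prov" package; the package choice and identifiers are assumptions, since the abstract names PROV and D-PROV but no particular library, and D-PROV's additional workflow-level observables are not modelled here.

        # Minimal PROV sketch: one step reads raw data and produces a product.
        from prov.model import ProvDocument

        doc = ProvDocument()
        doc.add_namespace("ex", "http://example.org/")

        raw = doc.entity("ex:raw-data")
        step = doc.activity("ex:regridding-step")
        product = doc.entity("ex:regridded-data")

        doc.used(step, raw)                # the step consumed the raw data
        doc.wasGeneratedBy(product, step)  # ...and generated the product
        doc.wasDerivedFrom(product, raw)

        print(doc.get_provn())             # serialize as PROV-N text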

  5. Morning sickness

    MedlinePlus

    ... Bland foods, such as gelatin, frozen desserts, broth, ginger ale, and saltine crackers, also soothe the stomach. ... your stomach does not get too full. Seltzer, ginger ale, or other sparkling waters may help control ...

  6. Modelling and analysis of workflow for lean supply chains

    NASA Astrophysics Data System (ADS)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems that support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of lean supply chains (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of LSC and then discusses the standardisation of collaborative business processes between organisations in the LSC context. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, the article illustrates the use of the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSC. This study initiates a new perspective on research into cross-organisational workflow management and promotes the operational management of LSC in real-world settings.
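
    The LTPN idea, transitions carrying both a label and a firing-time interval, can be sketched in a few lines. This toy fragment only illustrates the concept; it is not the authors' formalism and performs no soundness verification.

        # Toy labelled time Petri net: labelled transitions with time intervals.
        class Transition:
            def __init__(self, label, inputs, outputs, t_min, t_max):
                self.label = label
                self.inputs = inputs                   # places consumed
                self.outputs = outputs                 # places produced
                self.t_min, self.t_max = t_min, t_max  # firing-time interval

        def enabled(marking, t):
            return all(marking.get(p, 0) > 0 for p in t.inputs)

        def fire(marking, t):
            m = dict(marking)
            for p in t.inputs:
                m[p] -= 1
            for p in t.outputs:
                m[p] = m.get(p, 0) + 1
            return m

        order = Transition("receive_order", ["start"], ["order_placed"], 0, 2)
        ship = Transition("ship_goods", ["order_placed"], ["done"], 1, 5)
        marking = {"start": 1}
        for t in (order, ship):
            if enabled(marking, t):
                marking = fire(marking, t)  # fires within [t_min, t_max]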

  7. Phosphates behaviours in conversion of FP chlorides

    NASA Astrophysics Data System (ADS)

    Amamoto, I.; Kofuji, H.; Myochin, M.; Takasaki, Y.; Terai, T.

    2009-06-01

    The spent electrolyte from pyroprocessing by the metal electrorefining method should be considered for recycling after removal of fission products (FP) such as alkali metals (AL), alkaline earth metals (ALE), and/or rare earth elements (REE), to reduce the volume of high-level radioactive waste. Among the various methods suggested for this purpose is precipitation by converting the FP from chlorides to phosphates. The authors have been carrying out theoretical analyses and experiments on the behaviour of phosphate precipitates so as to assess the feasibility of this method. From the acquired results, it was found that AL (except lithium) and ALE are unlikely to form phosphate precipitates. However, their conversion behaviours, including those of REE, were compatible with the theoretical analysis: in the case of LaPO4, one of the REE precipitates, submicron-size particles could be observed, while Li3PO4 particles were larger; the precipitates were also apt to grow larger at higher temperature.

  8. The neural basis of kinesthetic and visual imagery in sports: an ALE meta-analysis.

    PubMed

    Filgueiras, Alberto; Quintas Conde, Erick Francisco; Hall, Craig R

    2017-12-19

    Imagery is a widely used technique in the sport sciences that entails the mental rehearsal of a given situation to improve an athlete's learning, performance and motivation. Two modalities of imagery, kinesthetic and visual, are reported to tap into distinct brain structures while sharing common components. This study aimed to investigate the neural basis of these types of imagery by performing a meta-analysis with the Activation Likelihood Estimation (ALE) algorithm. A systematic search was used to retrieve only experimental studies with athletes or sportspersons. Altogether, nine studies were selected and an ALE meta-analysis was performed. Results indicated significant activation of the premotor and somatosensory cortices, supplementary motor areas, inferior and superior parietal lobules, caudate, cingulate and cerebellum in both imagery tasks. It was concluded that visual and kinesthetic imagery share similar neural networks, which suggests that combined interventions are beneficial to athletes, whereas separate use of the two modalities may be less efficient from a neuropsychological standpoint.
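
    For readers who want to run this kind of analysis themselves, the sketch below uses NiMARE, an open-source Python package implementing the ALE algorithm. The package choice and the input file name are assumptions, since the study does not state which software was used.

        # Hedged sketch of a coordinate-based ALE meta-analysis with NiMARE;
        # "studies.json" is a placeholder Dataset file of foci and sample sizes.
        from nimare.dataset import Dataset
        from nimare.meta.cbma import ALE
        from nimare.correct import FWECorrector

        dset = Dataset("studies.json")  # peak coordinates per experiment
        ale = ALE()
        results = ale.fit(dset)         # unthresholded ALE maps

        # Cluster-level FWE correction via a Monte Carlo null distribution.
        corrector = FWECorrector(method="montecarlo",
                                 voxel_thresh=0.001, n_iters=10000)
        corrected = corrector.transform(results)
        corrected.save_maps(output_dir="ale_results")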

  9. The ALE/GAGE/AGAGE Network (DB1001)

    DOE Data Explorer

    Prinn, Ronald G. [MIT, Center for Global Change Science; Weiss, Ray F. [University of California, San Diego; Scripps Institution of Oceanography; Krummel, Paul B. [CSIRO Oceans and Atmosphere, Cape Grim; O'Doherty, Simon [University of Bristol, Barbados and Mace Head Stations; Fraser, Paul [CSIRO Oceans and Atmosphere; Muhle, Jens [UCSD Scripps Institution of Oceanography; Cape Matatula Station; Reimann, Stefan [Swiss Federal Laboratories for Materials Science and Research (EMPA); Jungfraujoch Station; Vollmer, Martin [Swiss Federal Laboratories for Materials Science and Research (EMPA); Jungfraujoch Station; Simmonds, Peter G. [University of Bristol, Atmospheric Chemistry Research Group; Mace Head Station; Maione, Michela [University of Urbino; Monte Cimone Station; Arduini, Jgor [University of Urbino; Monte Cimone Station; Lunder, Chris [Norwegian Institute for Air Research; Ny Alesund Station; Hermansen, Ove [Norwegian Inst. for Air Research (NILU), Kjeller (Norway); Ny Alesund Station; Schmidbauer, Norbert [Norwegian Inst. for Air Research (NILU), Kjeller (Norway); Global Network; Young, Dickon [University of Bristol; Ragged Point Station; Wang, Hsiang J. (Ray) [Georgia Institute of Technology, School of Earth and Atmospheric Sciences; Global Network; Huang, Jin; Rigby, Matthew [University of Bristol; Global Network; Harth, Chris [UCSD, Scripps Institution of Oceanography; Global Network; Salameh, Peter [UCSD, Scripps Institution of Oceanography; Global Network; Spain, Gerard [National University of Ireland; Global Network; Steele, Paul [CSIRO Oceans and Atmosphere; Global Network; Arnold, Tim; Kim, Jooil [UCSD, Scripps Institution of Oceanography; Global Network; Derek, Nada; Mitrevski, Blagoj; Langenfelds, Ray

    2008-01-01

    In the ALE/GAGE/AGAGE global network program, continuous high-frequency gas chromatographic measurements of four biogenic/anthropogenic gases (methane, CH4; nitrous oxide, N2O; hydrogen, H2; and carbon monoxide, CO) and several anthropogenic gases that contribute to stratospheric ozone destruction and/or to the greenhouse effect have been carried out at five globally distributed sites for several years. The program, which began in 1978, is divided into three parts associated with three changes in instrumentation: the Atmospheric Lifetime Experiment (ALE), which used Hewlett Packard HP5840 gas chromatographs; the Global Atmospheric Gases Experiment (GAGE), which used HP5880 gas chromatographs; and the present Advanced GAGE (AGAGE). AGAGE uses two types of instruments: a gas chromatograph with multiple detectors (GC-MD), and a gas chromatograph with mass spectrometric analysis (GC-MS). Beginning in January 2004, an improved cryogenic preconcentration system (Medusa) replaced the adsorption-desorption module in the GC-MS systems at Mace Head and Cape Grim; this provided improved capability to measure a broader range of volatile perfluorocarbons with high global warming potentials. More information may be found at the AGAGE home page: http://agage.eas.gatech.edu/instruments-gcms-medusa.htm.

  10. ALE3D Simulation and Measurement of Violence in a Fast Cookoff Experiment with LX-10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClelland, M A; Maienschein, J L; Howard, W M

    We performed a computational and experimental analysis of fast cookoff of LX-10 (94.7% HMX, 5.3% Viton A) confined in a 2 kbar steel tube with reinforced end caps. A Scaled-Thermal-Explosion-eXperiment (STEX) was completed in which three radiant heaters were used to heat the vessel until ignition, resulting in a moderately violent explosion after 20.4 minutes. Thermocouple measurements showed tube temperatures as high as 340 C at ignition and LX-10 surface temperatures as high as 279 C, which is near the melting point of HMX. Three micro-power radar systems were used to measure mean fragment velocities of 840 m/s. Photonic Doppler Velocimeters (PDVs) showed a rapid acceleration of fragments over 80 µs. A one-dimensional ALE3D cookoff model at the vessel midplane was used to simulate the heating, thermal expansion, LX-10 decomposition, and closing of the gap between the HE (High Explosive) and the vessel wall. Although the ALE3D simulation terminated before ignition, the model provided a good representation of heat transfer through the case and across the dynamic gap to the explosive.

  11. Safeguarding donors' personal rights and biobank autonomy in biobank networks: the CRIP privacy regime.

    PubMed

    Schröder, Christina; Heidtke, Karsten R; Zacherl, Nikolaus; Zatloukal, Kurt; Taupitz, Jochen

    2011-08-01

    Governance, underlying general ICT (Information and Communication Technology) architecture, and workflow of the Central Research Infrastructure for molecular Pathology (CRIP) are discussed as a model enabling biobank networks to form operational "meta biobanks" whilst respecting the donors' privacy, biobank autonomy and confidentiality, and the researchers' needs for appropriate biospecimens and information, as well as confidentiality. Tailored to these needs, CRIP efficiently accelerates and facilitates research with human biospecimens and data.

  12. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for the design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and are therefore close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering a clear graphical display of conceptual steps and parameters, and the ability to easily integrate other resources such as remote data and web services.

  13. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community with an unprecedented opportunity to collaborate by sharing workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like a Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open-source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source ActiveBPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. They will be useful across science domains, mission teams, research projects and organizations, helping to solve the “sociological” problem of bringing together disparate groups of researchers as well as the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0 and the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  14. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops to detect genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and the Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version on GitHub (https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB) (http://soykb.org/Pegasus/index.php). Using PGen, we identified 10,218,140 SNPs and 1,398,982 indels from an analysis of 106 soybean lines sequenced at 15X coverage; 297,245 non-synonymous SNPs and 3,330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects, adding up to more than 500 soybean germplasm lines in total, have been integrated. These SNPs are being utilized for trait improvement using genotype-to-phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser (http://soykb.org/NGS_Resequence/NGS_index.php) within SoyKB to provide easy access to SNPs and downstream analysis results for soybean researchers. The PGen workflow has been optimized for the most efficient analysis of soybean data through thorough testing and validation. This research serves as an example of best practices for the development of genomics data analysis workflows, integrating remote HPC resources and efficient data management with ease of use for biological users. The PGen workflow can also be easily customized for the analysis of data from other species.
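
    The abstract names Pegasus-WMS as the workflow engine. The fragment below is a hedged sketch of how a two-step resequencing chain might be declared with the Pegasus 5 Python API; the transformation names are illustrative placeholders, and the actual PGen workflow lives in the GitHub repository above.

        # Hedged Pegasus API sketch; "aligner" and "variant-caller" are
        # placeholder transformations, not PGen's actual tool chain.
        from Pegasus.api import File, Job, Workflow

        reads = File("sample.fastq")
        bam = File("sample.bam")
        vcf = File("sample.vcf")

        align = (Job("aligner")
                 .add_args("-o", bam, reads)
                 .add_inputs(reads)
                 .add_outputs(bam))

        call = (Job("variant-caller")
                .add_args("-o", vcf, bam)
                .add_inputs(bam)
                .add_outputs(vcf))

        wf = Workflow("pgen-like-chain")
        wf.add_jobs(align, call)   # dependency inferred from the shared bam file
        wf.write("workflow.yml")   # then plan and submit with pegasus-plan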

  15. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    PubMed Central

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. This lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of the two systems; it limits the sharing of workflows between the user communities and leads to duplication of development effort. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of reusable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing workflow patterns to simplify workflow creation. It enables the reuse and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with the increasing size of data and complexity of analysis. The system can be accessed either through a cloud-enabled web interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org. PMID:22559942

  16. Active Learning in Engineering Education: a (re)introduction

    NASA Astrophysics Data System (ADS)

    Lima, Rui M.; Andersson, Pernille Hammar; Saalman, Elisabeth

    2017-01-01

    The informal network 'Active Learning in Engineering Education' (ALE) has been promoting Active Learning since 2001. ALE creates opportunities for practitioners and researchers of engineering education to learn collaboratively how to foster the learning of engineering students. The activities in ALE are centred on the vision that learners construct their knowledge based on meaningful activities and knowledge. In 2014, the steering committee of the ALE network reinforced the need to discuss the meaning of Active Learning, and that was the basis for this proposal for a special issue. More than 40 submissions were reviewed by the European Journal of Engineering Education community, and this theme issue ended up with eight contributions, which differ both in their research approaches and in their Active Learning approaches. These different Active Learning approaches are aligned with the variety of approaches that can increasingly be found in indexed journals.

  17. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo, Wurigen; Shashkov, Mikhail

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  18. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE PAGES

    Bo, Wurigen; Shashkov, Mikhail

    2015-07-21

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  19. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) continue to be discovered, and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2; SOAP and Java Web Services (JWS) provide WSDL endpoints to the Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
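
    Calling one of the SOAP services that such a workflow composes can be sketched with the Python "zeep" client. This is an assumption for illustration: the paper wires its services into Taverna rather than Python, and the WSDL URL and operation names below are hypothetical placeholders, not the portal's actual endpoints.

        # Hedged sketch of invoking a SOAP/WSDL analysis service with zeep.
        from zeep import Client

        client = Client("http://bioservices.example.org/msa?wsdl")

        # Operations advertised in the WSDL become methods on client.service.
        fasta = ">seq1\nATGGCG\n>seq2\nATGGCC\n"
        job_id = client.service.runAlignment(sequences=fasta)
        result = client.service.getResult(jobId=job_id)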

  20. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-03-01

    At present, coding sequences (CDS) continue to be discovered, and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2; SOAP and Java Web Services (JWS) provide WSDL endpoints to the Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  1. Antiallergic Activity of Ethanol Extracts of Arctium lappa L. Undried Roots and Its Active Compound, Oleamide, in Regulating FcεRI-Mediated and MAPK Signaling in RBL-2H3 Cells.

    PubMed

    Yang, Woong-Suk; Lee, Sung Ryul; Jeong, Yong Joon; Park, Dae Won; Cho, Young Mi; Joo, Hae Mi; Kim, Inhye; Seu, Young-Bae; Sohn, Eun-Hwa; Kang, Se Chan

    2016-05-11

    The antiallergic potential of Arctium lappa L. was investigated in Sprague-Dawley rats, ICR mice, and RBL-2H3 cells. A 90% ethanol extract of A. lappa (ALE, 100 μg/mL) inhibited the degranulation rate by 52.9%, as determined by the level of β-hexosaminidase. ALE suppressed passive cutaneous anaphylaxis (PCA) in rats and attenuated anaphylaxis and histamine release in mice. To identify the active compound of ALE, we subsequently fractionated the extract and determined the level of β-hexosaminidase in all subfractions. Oleamide was identified as an active compound of ALE, which attenuated the secretion of histamine and the production of tumor necrosis factor (TNF)-α and interleukin-4 (IL-4) in cells treated with compound 48/80 or A23187/phorbol myristate acetate (PMA). Oleamide suppressed the FcεRI-tyrosine kinase Lyn-mediated pathway, c-Jun N-terminal kinases (JNK/SAPK), and p38 mitogen-activated protein kinases (p38-MAPKs). These results showed that ALE and oleamide attenuate allergic reactions and should serve as a platform in the search for compounds with antiallergic activity.

  2. Trade-off of cerebello-cortical and cortico-cortical functional networks for planning in 6-year-old children.

    PubMed

    Kipping, Judy A; Margulies, Daniel S; Eickhoff, Simon B; Lee, Annie; Qiu, Anqi

    2018-08-01

    Childhood is a critical period for the development of cognitive planning, yet knowledge of its neural mechanisms in children is lacking. This study aimed to examine cerebello-cortical and cortico-cortical functional connectivity in association with planning skills in 6-year-olds (n = 76). We identified the cerebello-cortical and cortico-cortical functional networks related to cognitive planning using activation likelihood estimation (ALE) meta-analysis of existing functional imaging studies on spatial planning, and data-driven independent component analysis (ICA) of children's resting-state functional MRI (rs-fMRI). We investigated associations of cerebello-cortical and cortico-cortical functional connectivity with planning ability in 6-year-olds, as assessed using the Stockings of Cambridge task. Long-range functional connectivity of two cerebellar networks (lobules VI and lateral VIIa) with the prefrontal and premotor cortex was greater in children with poorer planning ability. In contrast, cortico-cortical association networks were not associated with planning performance in children. These results highlight the key contribution of lateral cerebello-frontal functional connectivity, but not cortico-cortical association functional connectivity, to planning ability in 6-year-olds. Our results suggest that brain adaptation to the acquisition of planning ability during childhood is partially achieved through the engagement of cerebello-cortical functional connectivity. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Centrality of prefrontal and motor preparation cortices to Tourette Syndrome revealed by meta-analysis of task-based neuroimaging studies.

    PubMed

    Polyanska, Liliana; Critchley, Hugo D; Rae, Charlotte L

    2017-01-01

    Tourette Syndrome (TS) is a neurodevelopmental condition characterized by chronic multiple tics, which are experienced as compulsive and 'unwilled'. Patients with TS can differ markedly in the frequency, severity, and bodily distribution of tics. Moreover, there are high comorbidity rates with attention deficit hyperactivity disorder (ADHD), obsessive compulsive disorder (OCD), anxiety disorders, and depression. This complex clinical profile may account for the apparent variability of findings across neuroimaging studies that connect neural function to cognitive and motor behavior in TS. Here we crystallized information from neuroimaging regarding the functional circuitry of TS and, furthermore, tested specifically for neural determinants of tic severity by applying activation likelihood estimation (ALE) meta-analyses to neuroimaging (activation) studies of TS. Fourteen task-based studies (13 fMRI and one H2O-PET) met rigorous inclusion criteria. These studies, encompassing 25 experiments and 651 participants, tested for differences between TS participants and healthy controls across cognitive, motor, perceptual and somatosensory domains. Relative to controls, TS participants showed distributed differences in the activation of prefrontal (inferior, middle, and superior frontal gyri), anterior cingulate, and motor preparation cortices (lateral premotor cortex and supplementary motor area; SMA). Differences also extended into sensory (somatosensory cortex and the lingual gyrus; V4) and temporo-parietal association cortices (posterior superior temporal sulcus, supramarginal gyrus, and retrosplenial cortex). Within TS participants, tic severity (reported using the Yale Global Tic Severity Scale; YGTSS) selectively correlated with engagement of the SMA, precentral gyrus, and middle frontal gyrus across tasks. The dispersed involvement of multiple cortical regions with differences in functional reactivity may account for heterogeneity in the symptomatic expression of TS and its comorbidities. More specifically for tics and tic severity, the findings reinforce previously proposed contributions of premotor and lateral prefrontal cortices to tic expression.

  4. Standardizing clinical trials workflow representation in UML for international site comparison.

    PubMed

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.
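
    The time-motion side of such a study feeds naturally into discrete event simulation. Below is a toy sketch in Python using the SimPy library (an assumption for illustration; the paper exports its UML models to dedicated simulation software, and the roles and durations here are invented):

        import random
        import simpy

        def participant_visit(env, name, crc, nurse):
            """One participant: coordinator paperwork, then a nursing procedure."""
            with crc.request() as req:                     # CRCs are a fully committed resource
                yield req
                yield env.timeout(random.uniform(10, 20))  # consenting / data entry (minutes)
            with nurse.request() as req:
                yield req
                yield env.timeout(random.uniform(5, 10))   # blood draw / vitals
            print(f"{name} finished at t = {env.now:.1f} min")

        env = simpy.Environment()
        crc = simpy.Resource(env, capacity=1)              # candidate bottleneck
        nurse = simpy.Resource(env, capacity=2)
        for i in range(5):
            env.process(participant_visit(env, f"participant-{i}", crc, nurse))
        env.run()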

  5. Standardizing Clinical Trials Workflow Representation in UML for International Site Comparison

    PubMed Central

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M. O.; Rodrigues, Maria J.; Shah, Jatin; Loures, Marco R.; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-01-01

    Background With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Methods Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Results Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. Conclusions This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows. PMID:21085484

  6. Evaluation of Coastal Sea Level from Jason-2 Altimetry Offshore Hong Kong

    NASA Astrophysics Data System (ADS)

    Birol, F.; Xu, X. Y.; Cazenave, A. A.

    2017-12-01

    In recent years, several coastal altimetry products from the Jason-2 mission have been distributed by different agencies, the most advanced of which are XTRACK, PISTACH and ALES. Each product represents substantial effort on some aspect of retracking or improved geophysical corrections, and each has its advantages. The motivation of this presentation is to evaluate these products in order to refine sea level measurements at the coast. We focus on three retrackers: MLE4, MLE3 and ALES. Within 20 km of the coast, neither GDR nor ALES readily provides sea level anomaly (SLA) measurements, so we recomputed the 20-Hz GDR and ALES SLA from the raw data, adopting auxiliary information (such as waveform classification and wet tropospheric delay) from PISTACH. The region of interest is track #153 of the Jason-2 satellite (offshore Hong Kong, China), and the altimetry products are processed over seven years (2008-2015, cycles 1-252). The coastline offshore Hong Kong is rather complicated, making it a good indicator of the performance of coastal altimetry under unfavorable coastal conditions. We computed the bias and noise level of the ALES, MLE3 and MLE4 SLA over the open ocean and in the coastal zone (within 10 km or 5 km of the coast). The results showed that, after outlier editing, ALES performs better than MLE4 and MLE3 both in terms of noise level and in terms of uncertainty in sea level trend estimation. We validated the coastal altimetry-based SLA by comparison with data from the Hong Kong tide gauge (located 10 km across-track). An interesting, but still preliminary, result is that the computed sea level trend within 5 km of the coast is significantly larger than the trend estimated at larger distances from the coast. Keywords: Jason-2, Hong Kong coast, ALES, MLE3, MLE4
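
    Schematically, a 20-Hz SLA sample is assembled as orbit altitude minus corrected range minus a mean sea surface, followed by outlier editing. A hedged Python sketch (the field names, numbers, and MAD-based editing rule are illustrative, not the exact GDR/ALES processing):

        import numpy as np

        def sla_20hz(altitude, rng, corrections, mean_sea_surface):
            """SLA = orbit altitude - (range + sum of corrections) - mean sea surface.
            `corrections` bundles wet/dry troposphere, ionosphere, sea state bias and
            tide terms, each aligned with the 20-Hz range samples."""
            return altitude - (rng + sum(corrections.values())) - mean_sea_surface

        def edit_outliers(sla, k=3.0):
            """Flag samples further than k robust standard deviations from the median."""
            med = np.nanmedian(sla)
            mad = 1.4826 * np.nanmedian(np.abs(sla - med))
            return np.where(np.abs(sla - med) > k * mad, np.nan, sla)

        # Made-up numbers for two 20-Hz samples:
        alt = np.array([1347000.0, 1347000.2]); r = np.array([1346998.0, 1346998.4])
        corr = {"wet_tropo": np.array([0.2, 0.2]), "ssb": np.array([0.1, 0.1])}
        print(edit_outliers(sla_20hz(alt, r, corr, np.array([1.0, 1.0]))))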

  7. Test Results of Initial Installation DATAS/TCAS Monitor: DFW Airport

    DOT National Transportation Integrated Search

    1992-01-01

    This document presents the results of initial tests with the Data Link Test and Analysis System (DATAS)/Traffic Alert and Collision Avoidance System (TCAS). Since a significant number of air carriers have recently been equipped with Traffic Ale...

  8. Radiography for intensive care: participatory process analysis in a PACS-equipped and film/screen environment

    NASA Astrophysics Data System (ADS)

    Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes

    2002-05-01

    When new technology is introduced into medical practice, it must prove that it makes a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and its economic benefits are still debated. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews of involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting (with simultaneous image access for ICU physicians and radiologists), image archiving, and image and report distribution. However, PACS alone cannot cover the complete process of 'radiography for intensive care' from the ordering of an image to the delivery of the final product (image plus report). Interference of the electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved very helpful for the evaluation of complex work processes linking radiology and the ICU.

  9. Numerical simulation of the fluid-structure interaction between air blast waves and soil structure

    NASA Astrophysics Data System (ADS)

    Umar, S.; Risby, M. S.; Albert, A. Luthfi; Norazman, M.; Ariffin, I.; Alias, Y. Muhamad

    2014-03-01

    An explosion in the free field, especially from high explosives, is very dangerous because of the highly impulsive ground shocks it generates. Nowadays, explosion threats occur not only on the battlefield but also in industry and urban areas. In industries such as oil and gas, explosion threats may involve logistic transportation, maintenance, production, and the underground distribution pipelines that supply crude oil. Appropriate blast resistance is therefore a priority requirement, and it can be established through an assessment of the structural response, material strength, and impact pattern of materials subjected to ground shock. A highly impulsive load from ground shock is a dynamic load because its loading time is shorter than the ground response time. Recently, almost all blast studies have considered and analyzed ground shock within the fluid-structure interaction (FSI) framework because of its influence on the propagation and interaction of the shock. Furthermore, FSI analysis integrates the action of the ground shock and the reaction of the ground in calculations of velocity, pressure, and force, which brings simulated ground shock analysis closer to experimental results. In this study, the FSI was implemented in the AUTODYN computer code using the Euler-Godunov scheme and the arbitrary Lagrangian-Eulerian (ALE) method. The Euler-Godunov scheme supports structural computation in a 3D analysis, while the ALE method provides the arbitrary mesh motion appropriate for FSI analysis. The ALE scheme is well suited to small-deformation analysis with arbitrary mesh motion, whereas the Euler-Godunov scheme is well suited to large-deformation analysis. The integrated Euler-Godunov/ALE scheme allows us to analyze blast wave propagation and structural interaction simultaneously.
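
    The kinematic identity underlying any ALE scheme (standard textbook form, not specific to AUTODYN) relates the material derivative of a field $\phi$ to its time derivative seen from the moving mesh:

        \frac{D\phi}{Dt} \;=\; \left.\frac{\partial \phi}{\partial t}\right|_{\chi} \;+\; \left(\mathbf{v}-\hat{\mathbf{v}}\right)\cdot\nabla\phi ,

    where $\mathbf{v}$ is the material velocity and $\hat{\mathbf{v}}$ the mesh velocity. Choosing $\hat{\mathbf{v}}=\mathbf{v}$ recovers the Lagrangian description (the mesh follows the material, natural for small deformations), while $\hat{\mathbf{v}}=\mathbf{0}$ recovers the Eulerian one (fixed mesh, robust for large deformations); this is exactly the trade-off the integrated Euler-Godunov/ALE scheme exploits.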

  10. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
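
    A caricature of the second, "programming language" approach: the workflow is ordinary Python in which steps are functions and the dependency structure is expressed through futures, so sharing the workflow means sharing a script. All step names below are invented for illustration:

        from concurrent.futures import ThreadPoolExecutor

        def ingest(source):
            return f"data[{source}]"

        def calibrate(images, catalogs):
            return f"calibrated[{images} + {catalogs}]"

        with ThreadPoolExecutor() as pool:
            images = pool.submit(ingest, "images")        # two independent branches
            catalogs = pool.submit(ingest, "catalogs")    # run concurrently
            # the join step waits on both branches before executing
            print(calibrate(images.result(), catalogs.result()))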

  11. Rethinking the role of the rTPJ in attention and social cognition in light of the opposing domains hypothesis: findings from an ALE-based meta-analysis and resting-state functional connectivity

    PubMed Central

    Kubit, Benjamin; Jack, Anthony I.

    2013-01-01

    The right temporo-parietal junction (rTPJ) has been associated with two apparently disparate functional roles: in attention and in social cognition. According to one account, the rTPJ initiates a “circuit-breaking” signal that interrupts ongoing attentional processes, effectively reorienting attention. It is argued that this primary function of the rTPJ has been extended beyond attention, through a process of evolutionary co-option, to play a role in social cognition. We propose an alternative account, according to which the capacity for social cognition depends on a network which is both distinct from and in tension with brain areas involved in focused attention and target detection: the default mode network (DMN). Theory characterizing the rTPJ based on the area's purported role in reorienting may be falsely guided by the co-occurrence of two distinct effects in contiguous regions: activation of the supramarginal gyrus (SMG), associated with its functional role in target detection; and the transient release, during spatial reorienting, of suppression of the angular gyrus (AG) associated with focused attention. Findings based on meta-analysis and resting-state functional connectivity are presented that support this alternative account. We find distinct regions, possessing anti-correlated patterns of resting connectivity, associated with social reasoning (AG) and target detection (SMG) at the rTPJ. The locus for reorienting was spatially intermediate between the AG and SMG and showed a pattern of connectivity with similarities to both the social reasoning and target detection seeds. These findings highlight a general methodological concern for brain imaging. Given evidence that certain tasks not only activate some areas but also suppress activity in other areas, it is suggested that researchers need to distinguish two distinct putative mechanisms, either of which may produce an increase in activity in a brain area: functional engagement in the task vs. release of suppression. PMID:23847497

  12. An interactive environment for agile analysis and visualization of ChIP-sequencing data.

    PubMed

    Lerdrup, Mads; Johansen, Jens Vilstrup; Agrawal-Singh, Shuchi; Hansen, Klaus

    2016-04-01

    To empower experimentalists with a means for fast and comprehensive chromatin immunoprecipitation sequencing (ChIP-seq) data analyses, we introduce an integrated computational environment, EaSeq. The software combines the exploratory power of genome browsers with an extensive set of interactive and user-friendly tools for genome-wide abstraction and visualization. It enables experimentalists to easily extract information and generate hypotheses from their own data and public genome-wide datasets. For demonstration purposes, we performed meta-analyses of public Polycomb ChIP-seq data and established a new screening approach to analyze more than 900 datasets from mouse embryonic stem cells for factors potentially associated with Polycomb recruitment. EaSeq, which is freely available and works on a standard personal computer, can substantially increase the throughput of many analysis workflows, facilitate transparency and reproducibility by automatically documenting and organizing analyses, and enable a broader group of scientists to gain insights from ChIP-seq data.

  13. Agile parallel bioinformatics workflow management using Pwrake.

    PubMed

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

    In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.
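
    Pwrake itself is Ruby, but the rake-style "file task" idea it builds on is easy to sketch in Python (an illustration of the semantics, not Pwrake's API): a task reruns only when its output is missing or older than any of its inputs, and a chain of such tasks is the workflow definition. The commands below are placeholders, not real tools.

        import os
        import subprocess

        def file_task(target, sources, cmd):
            """Run `cmd` unless `target` is newer than every file in `sources`."""
            if os.path.exists(target):
                newest_source = max(os.path.getmtime(s) for s in sources)
                if os.path.getmtime(target) >= newest_source:
                    return  # up to date; skipping is what makes reruns cheap
            subprocess.run(cmd, shell=True, check=True)

        # Hypothetical two-step chain: alignment, then variant calling.
        file_task("sample.bam", ["sample.fastq"], "align sample.fastq > sample.bam")
        file_task("sample.vcf", ["sample.bam"], "call_variants sample.bam > sample.vcf")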

  14. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows. PMID:21899774

  15. Electrochemical Atomic Layer Epitaxy of Thin Film CdSe

    NASA Astrophysics Data System (ADS)

    Pham, L.; Kaleida, K.; Happek, U.; Mathe, M. K.; Vaidyanathan, R.; Stickney, J. L.; Radevic, M.

    2002-10-01

    Electrochemical atomic layer epitaxy (EC-ALE) is a developing technique for the fabrication of compound semiconductor thin films. The deposition of the elements making up the compound utilizes surface-limited reactions, in which the deposition potential is less than that required for bulk growth. This growth method offers monoatomic-layer control, allowing the deposition of superlattices with sharp interfaces. Here we report on the EC-ALE formation of CdSe thin films on Au and Cu substrates using an automated flow-cell system. The band gap was measured using IR absorption and photoconductivity and found to be consistent with the literature values of 1.74 eV at 300 K and 1.85 eV at 20 K. The stoichiometry of the thin film was confirmed with electron microprobe analysis and X-ray diffraction.

  16. Data management routines for reproducible research using the G-Node Python Client library

    PubMed Central

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
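
    In outline, the organize/annotate/query-by-metadata workflow described above looks like the following Python sketch. The objects here are illustrative stand-ins defined inline, not the actual G-Node Python Library API:

        # Stand-in objects: just enough structure to show the workflow shape.
        class Block:
            def __init__(self, name, metadata):
                self.name, self.metadata, self.signals = name, metadata, []

        class Session:
            def __init__(self):
                self.blocks = []
            def add_block(self, name, **metadata):
                """Organize a recording into the dataset with annotations."""
                block = Block(name, metadata)
                self.blocks.append(block)
                return block
            def query(self, **criteria):
                """Select recordings whose metadata match all criteria."""
                return [b for b in self.blocks
                        if all(b.metadata.get(k) == v for k, v in criteria.items())]

        session = Session()
        session.add_block("session-2014-01-15", stimulus="grating", subject="m01")
        session.add_block("session-2014-01-16", stimulus="noise", subject="m01")
        print([b.name for b in session.query(stimulus="grating")])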

  17. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow.

  18. Systematic review automation technologies

    PubMed Central

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
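
    The meta-analysis step of that envisaged pipeline reduces to well-defined arithmetic. A minimal Python sketch of fixed-effect inverse-variance pooling (a standard textbook method; the review surveys many systems and does not prescribe this one, and the trial numbers below are invented):

        import math

        def fixed_effect_pool(effects, variances):
            """Pool study effects by inverse-variance weighting; return (estimate, SE)."""
            weights = [1.0 / v for v in variances]
            pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            return pooled, math.sqrt(1.0 / sum(weights))

        # Three hypothetical trials reporting log odds ratios and their variances.
        est, se = fixed_effect_pool([-0.30, -0.10, -0.25], [0.04, 0.09, 0.02])
        print(f"pooled = {est:.3f}, 95% CI = ({est - 1.96*se:.3f}, {est + 1.96*se:.3f})")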

  19. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    PubMed Central

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-01

    Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions. In addition, the models provide an accurate visual description of the control flow for conducting a biomolecular analysis experiment using NMR spectroscopy. PMID:17263870

  20. Conceptual-level workflow modeling of scientific experiments using NMR as a case study.

    PubMed

    Verdi, Kacy K; Ellis, Heidi Jc; Gryk, Michael R

    2007-01-30

    Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions. In addition, the models provide an accurate visual description of the control flow for conducting a biomolecular analysis experiment using NMR spectroscopy.

  1. Dynamic analysis of a needle insertion for soft materials: Arbitrary Lagrangian-Eulerian-based three-dimensional finite element analysis.

    PubMed

    Yamaguchi, Satoshi; Tsutsui, Kihei; Satake, Koji; Morikawa, Shigehiro; Shirai, Yoshiaki; Tanaka, Hiromi T

    2014-10-01

    Our goal was to develop a three-dimensional finite element model that enables dynamic analysis of needle insertion into soft materials. To capture large deformation and fracture, we used the arbitrary Lagrangian-Eulerian (ALE) method for the fluid analysis. We performed ALE-based finite element analysis for 3% agar gel and three types of copper needle with bevel tips. To evaluate the simulation results, we compared the needle deflection and insertion force with corresponding experimental results acquired with a uniaxial manipulator, and we studied the shear stress distribution of the agar gel on various time scales. For bevel angles of 30°, 45°, and 60°, the differences in needle deflection between the two sets of results were 2.424, 2.981, and 3.737 mm, respectively. For the insertion force, there was no significant difference in mismatching area error (p < 0.05) between simulation and experimental results. Our results have the potential to be a stepping stone toward pre-operative surgical planning that estimates an optimal needle insertion path for MR image-guided microwave coagulation therapy, and toward analyzing large deformation and fracture in biological tissues. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Investigating flavour characteristics of British ale yeasts: techniques, resources and opportunities for innovation

    PubMed Central

    Parker, Neva; James, Steve; Dicks, Jo; Bond, Chris; Nueno-Palop, Carmen; White, Chris; Roberts, Ian N

    2015-01-01

    Five British ale yeast strains were subjected to flavour profiling under brewery fermentation conditions in which all other brewing parameters were kept constant. Significant variation was observed in the timing and quantity of flavour-related chemicals produced. Genetic tests showed no evidence of hybrid origins in any of the strains, including one strain previously reported as a possible hybrid of Saccharomyces cerevisiae and S. bayanus. Variation maintained in historical S. cerevisiae ale yeast collections is highlighted as a potential source of novelty in innovative strain improvement for bioflavour production. Copyright © 2014 John Wiley & Sons, Ltd. PMID:25361168

  3. Posteriori error determination and grid adaptation for AMR and ALE computational fluid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lapenta, G. M.

    2002-01-01

    We discuss grid adaptation for application to AMR and ALE codes. Two new contributions are presented. First, a new method to locate the regions where truncation error is being created due to insufficient accuracy: the operator recovery error origin (OREO) detector. The OREO detector is automatic, reliable, easy to implement and extremely inexpensive. Second, a new grid motion technique is presented for application to ALE codes. The method is based on the Brackbill-Saltzman approach but is directly linked to the OREO detector and moves the grid automatically to minimize the error.
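
    The abstract gives no formulas for the OREO detector, so the following Python sketch shows a generic Richardson-style indicator in the same spirit (a stand-in, not the OREO method itself): evaluate the discrete operator on two grid spacings and use their disagreement to flag where truncation error is being created.

        import numpy as np

        def laplacian(u, h):
            """Second-order centered discrete Laplacian on interior points."""
            return (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2

        n = 201
        h = 1.0 / (n - 1)
        x = np.linspace(0.0, 1.0, n)
        u = np.tanh(40.0 * (x - 0.5))        # sharp layer: large local truncation error

        fine = laplacian(u, h)               # interior points 1..n-2 at spacing h
        coarse = laplacian(u[::2], 2.0 * h)  # every other point, spacing 2h
        # Compare both estimates at the common (even-index) interior points.
        indicator = np.abs(coarse - fine[1::2])
        flagged = np.where(indicator > 10.0 * np.median(indicator))[0]
        print("refine near x =", x[2 * flagged + 2])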

  4. Advanced lipid peroxidation end products in oxidative damage to proteins. Potential role in diseases and therapeutic prospects for the inhibitors.

    PubMed

    Negre-Salvayre, A; Coatrieux, C; Ingueneau, C; Salvayre, R

    2008-01-01

    Reactive carbonyl compounds (RCCs) formed during lipid peroxidation and sugar glycoxidation, namely advanced lipid peroxidation end products (ALEs) and advanced glycation end products (AGEs), accumulate with ageing and with oxidative stress-related diseases such as atherosclerosis, diabetes or neurodegenerative diseases. RCCs induce 'carbonyl stress', characterized by the formation of adducts and cross-links on proteins, which progressively leads to impaired protein function and damage in all tissues, with pathological consequences including cell dysfunction, inflammatory responses and apoptosis. The prevention of carbonyl stress involves the use of free radical scavengers and antioxidants that prevent the generation of lipid peroxidation products, but these are inefficient against pre-formed RCCs. Conversely, carbonyl scavengers prevent carbonyl stress by inhibiting the formation of protein cross-links. While a large variety of AGE inhibitors has been developed, only a few carbonyl scavengers have been tested against ALE-mediated effects. This review summarizes the signalling properties of ALEs and ALE precursors, their role in the pathogenesis of oxidative stress-associated diseases, and the different agents efficient in neutralizing ALE effects in vitro and in vivo. The generation of drugs sharing both antioxidant and carbonyl scavenger properties represents a new therapeutic challenge in the treatment of carbonyl stress-associated diseases.

  5. Studies on the protective effect of the artichoke (Cynara scolymus) leaf extract against cadmium toxicity-induced oxidative stress, hepatorenal damage, and immunosuppressive and hematological disorders in rats.

    PubMed

    El-Boshy, Mohamed; Ashshi, Ahmad; Gaith, Mazen; Qusty, Naeem; Bokhary, Thalat; AlTaweel, Nagwa; Abdelhady, Mohamed

    2017-05-01

    Our objective was to explore the protective effect of artichoke leaf extract (ALE) against cadmium (Cd) toxicity-induced oxidative organ damage in rats. Male albino Wistar rats were divided into four equal groups of eight animals each. The first group was assigned as a control. Groups 2-4 were orally administered ALE (300 mg/kg bw), Cd (CdCl2, 100 mg/L in drinking water), and ALE plus Cd, respectively, daily for 4 weeks. After treatment with Cd, liver and kidney malondialdehyde (MDA) levels increased significantly compared with the control rats. Serum interleukin (IL)-1β, tumor necrosis factor alpha (TNF-α), and IL-10, liver transaminases, urea, creatinine, and the peripheral neutrophil count were significantly increased in Cd-exposed rats compared with the control group. Reduced glutathione (GSH), glutathione peroxidase (GPX), superoxide dismutase (SOD), and catalase (CAT) decreased in the liver and kidney of the Cd-exposed group. In the combination treatment, ALE plus Cd significantly improved the immune response, the antioxidant system, and hepatorenal function, with a significant decline in MDA. In conclusion, ALE ameliorates the immunosuppression and hepatorenal oxidative injury stimulated by Cd in rats. These results suggest that artichoke has promising effects against the adverse effects of Cd toxicity.

  6. Inhibitory Effect and Mechanism of Arctium lappa Extract on NLRP3 Inflammasome Activation.

    PubMed

    Kim, Young-Kyu; Koppula, Sushruta; Shim, Do-Wan; In, Eun-Jung; Kwak, Su-Bin; Kim, Myong-Ki; Yu, Sang-Hyeun; Lee, Kwang-Ho; Kang, Tae-Bong

    2018-01-01

    Arctium lappa (A. lappa), Compositae, is considered a potential source of nutrition and has been used as a traditional medicine in East Asian countries for centuries. Although several studies have shown its biological activities as an anti-inflammatory agent, there have been no reports on the regulatory role of A. lappa in inflammasome activation. The purpose of this study was to investigate the inhibitory effects of A. lappa extract (ALE) on NLRP3 inflammasome activation and to explore the underlying mechanisms. We found that ALE inhibited IL-1β secretion from NLRP3 inflammasome-activated bone marrow-derived macrophages, but not that induced by NLRC4 or AIM2 inflammasome activation. Mechanistic studies revealed that ALE suppressed the ATPase activity of purified NLRP3 and reduced the mitochondrial reactive oxygen species (mROS) generated during NLRP3 activation. Therefore, the inhibitory effect of ALE on the NLRP3 inflammasome might be attributed to its ability to inhibit the NLRP3 ATPase function and to attenuate mROS during inflammasome activation. In addition, ALE significantly reduced the LPS-induced increase of plasma IL-1β in a mouse peritonitis model. These results provide evidence of novel anti-inflammatory mechanisms of A. lappa, which might be used for therapeutic applications in the treatment of NLRP3 inflammasome-associated inflammatory disorders.

  7. Inhibitory Effect and Mechanism of Arctium lappa Extract on NLRP3 Inflammasome Activation

    PubMed Central

    Kim, Young-Kyu; Koppula, Sushruta; Shim, Do-Wan; In, Eun-Jung; Kwak, Su-Bin; Yu, Sang-Hyeun

    2018-01-01

    Arctium lappa (A. lappa), Compositae, is considered a potential source of nutrition and has been used as a traditional medicine in East Asian countries for centuries. Although several studies have shown its biological activities as an anti-inflammatory agent, there have been no reports on the regulatory role of A. lappa in inflammasome activation. The purpose of this study was to investigate the inhibitory effects of A. lappa extract (ALE) on NLRP3 inflammasome activation and to explore the underlying mechanisms. We found that ALE inhibited IL-1β secretion from NLRP3 inflammasome-activated bone marrow-derived macrophages, but not that induced by NLRC4 or AIM2 inflammasome activation. Mechanistic studies revealed that ALE suppressed the ATPase activity of purified NLRP3 and reduced the mitochondrial reactive oxygen species (mROS) generated during NLRP3 activation. Therefore, the inhibitory effect of ALE on the NLRP3 inflammasome might be attributed to its ability to inhibit the NLRP3 ATPase function and to attenuate mROS during inflammasome activation. In addition, ALE significantly reduced the LPS-induced increase of plasma IL-1β in a mouse peritonitis model. These results provide evidence of novel anti-inflammatory mechanisms of A. lappa, which might be used for therapeutic applications in the treatment of NLRP3 inflammasome-associated inflammatory disorders. PMID:29576797

  8. Qualitative screening for new psychoactive substances in wastewater collected during a city festival using liquid chromatography coupled to high-resolution mass spectrometry.

    PubMed

    Causanilles, Ana; Kinyua, Juliet; Ruttkies, Christoph; van Nuijs, Alexander L N; Emke, Erik; Covaci, Adrian; de Voogt, Pim

    2017-10-01

    The inclusion of new psychoactive substances (NPS) in the wastewater-based epidemiology approach presents challenges: the small number of users translates into low concentrations of residues, and the limited pharmacokinetic information available makes the choice of target biomarker difficult. Successful monitoring therefore requires sampling during special social settings, analysis with improved analytical techniques, and data processing with a specific workflow to narrow the search. This work presents the application of a qualitative screening technique to wastewater samples collected during a city festival, where likely users of recreational substances gather and consequently higher residual concentrations of NPS are expected. The analysis was performed using liquid chromatography coupled to high-resolution mass spectrometry. Data were processed using an algorithm that extracts accurate masses (calculated from molecular formulas) of expected m/z values from an in-house database containing about 2,000 entries, including NPS and transformation products. We positively identified eight NPS belonging to the classes of synthetic cathinones, phenethylamines and opioids. In addition, the presence of benzodiazepine analogues, classical drugs and other licit substances with potential for abuse was confirmed. The screening workflow based on a database search was useful for the identification of NPS biomarkers in wastewater. The findings highlight the predominance of specific classical drugs and the low use of NPS in the Netherlands. Additionally, meta-chlorophenylpiperazine (mCPP), 2,5-dimethoxy-4-bromophenethylamine (2C-B), and 4-fluoroamphetamine (FA) were identified in wastewater for the first time. Copyright © 2017 Elsevier Ltd. All rights reserved.
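
    The core of such a suspect-screening step is simple to state: compute the expected m/z of each protonated molecule from its monoisotopic mass and match measured peaks within a ppm tolerance. A self-contained Python sketch (the two database entries, peak values, and tolerance are illustrative):

        PROTON = 1.007276  # mass of a proton, Da

        def match_suspects(measured_mz, database, tol_ppm=5.0):
            """Return (name, measured, expected) for peaks matching a suspect [M+H]+."""
            hits = []
            for mz in measured_mz:
                for name, mono_mass in database.items():
                    expected = mono_mass + PROTON
                    if abs(mz - expected) / expected * 1e6 <= tol_ppm:
                        hits.append((name, mz, expected))
            return hits

        suspects = {"mCPP": 196.0767, "4-FA": 153.0954}   # monoisotopic masses, Da
        print(match_suspects([197.0840, 154.1027], suspects))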

  9. Improving adherence to the Epic Beacon ambulatory workflow.

    PubMed

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System for computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports the conclusion that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.

  10. An Integrated Framework for Parameter-based Optimization of Scientific Workflows.

    PubMed

    Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2009-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
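
    In miniature, the search such a framework automates amounts to scoring every point of a small multi-dimensional parameter space against a quality/performance trade-off. The parameter names and toy objective in this Python sketch are invented for illustration:

        from itertools import product

        space = {
            "chunk_size": [64, 128, 256],
            "n_workers": [2, 4, 8],
            "resolution": [1.0, 0.5],    # smaller value = higher output quality
        }

        def score(cfg):
            """Toy objective: reward output quality, penalize predicted runtime."""
            runtime = cfg["chunk_size"] / cfg["n_workers"] + 10.0 / cfg["resolution"]
            quality = 1.0 / cfg["resolution"]
            return quality - 0.05 * runtime

        configs = (dict(zip(space, values)) for values in product(*space.values()))
        print("best configuration:", max(configs, key=score))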

  11. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  12. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis, as well as data standardization and data publication. All methods of the workflow that address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels for access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Managing Radiation Therapy Side Effects: What to Do When You Have Loose Stools (Diarrhea)

    MedlinePlus

    ... Drink lots of clear liquids, such as water, ginger ale, and clear soup. ... Most people who ... beef ... Drinks (clear liquids): • Clear soda, such as ginger ale • Cranberry or grape juice • Oral rehydration solution ...

  14. Flocculation in ale brewing strains of Saccharomyces cerevisiae: re-evaluation of the role of cell surface charge and hydrophobicity.

    PubMed

    Holle, Ann Van; Machado, Manuela D; Soares, Eduardo V

    2012-02-01

    Flocculation is an eco-friendly process of cell separation, which has traditionally been exploited by the brewing industry. Cell surface charge (CSC), cell surface hydrophobicity (CSH) and the presence of active flocculins during the growth of two ale brewing flocculent strains (NCYC 1195 and NCYC 1214), belonging to the NewFlo phenotype, were examined. Ale strains in the exponential phase of growth were not flocculent and did not present active flocculent lectins on the cell surface; in contrast, the same strains in the stationary phase of growth were highly flocculent (>98%) and displayed a hydrophobicity approximately three to seven times higher than in the exponential phase. No relationship between growth phase, flocculation and CSC was observed. For comparative purposes, a constitutively flocculent strain (S646-1B) and its isogenic non-flocculent strain (S646-8D) were also used. The treatment of the ale brewing and S646-1B strains with pronase E caused a loss of flocculation and a strong reduction of CSH; pronase E-treated S646-1B cells displayed a CSH similar to that of the non-treated S646-8D cells. The treatment of the S646-8D strain with the protease did not reduce CSH. In conclusion, the increase of CSH observed at the onset of flocculation of the ale strains is a consequence of the presence of flocculins on the yeast cell surface and not the cause of yeast flocculation. CSH and CSC play a minor role in the auto-aggregation of the ale strains, since the degree of flocculation is defined primarily by the presence of active flocculins on the yeast cell wall.

  15. Flavonoids from artichoke (Cynara scolymus L.) up-regulate endothelial-type nitric-oxide synthase gene expression in human endothelial cells.

    PubMed

    Li, Huige; Xia, Ning; Brausch, Isolde; Yao, Ying; Förstermann, Ulrich

    2004-09-01

    Nitric oxide (NO) produced by endothelial nitric-oxide synthase (eNOS) represents an antithrombotic and anti-atherosclerotic principle in the vasculature. Hence, an enhanced expression of eNOS in response to pharmacological interventions could provide protection against cardiovascular diseases. In EA.hy 926 cells, a cell line derived from human umbilical vein endothelial cells (HUVECs), an artichoke leaf extract (ALE) increased the activity of the human eNOS promoter (determined by luciferase reporter gene assay). An organic subfraction from ALE was more potent in this respect than the crude extract, whereas an aqueous subfraction of ALE was without effect. ALE and the organic subfraction thereof also increased eNOS mRNA expression (measured by an RNase protection assay) and eNOS protein expression (determined by Western blot) both in EA.hy 926 cells and in native HUVECs. NO production (measured by NO-ozone chemiluminescence) was increased by both extracts. In organ chamber experiments, ex vivo incubation (18 h) of rat aortic rings with the organic subfraction of ALE enhanced the NO-mediated vasodilator response to acetylcholine, indicating that the up-regulated eNOS remained functional. Caffeoylquinic acids and flavonoids are two major groups of constituents of ALE. Interestingly, the flavonoids luteolin and cynaroside increased eNOS promoter activity and eNOS mRNA expression, whereas the caffeoylquinic acids cynarin and chlorogenic acid were without effect. Thus, in addition to the lipid-lowering and antioxidant properties of artichoke, an increase in eNOS gene transcription may also contribute to its beneficial cardiovascular profile. Artichoke flavonoids are likely to represent the active ingredients mediating eNOS up-regulation.

  16. A Pilot Study for Applying an Extravehicular Activity Exercise Prebreathe Protocol to the International Space Station

    NASA Technical Reports Server (NTRS)

    Woodruff, Kristin K.; Johnson, Anyika N.; Lee, Stuart M. C.; Gernhardt, Michael; Schneider, Suzanne M.; Foster, Philip P.

    2000-01-01

    Decompression sickness (DCS) is a serious risk to astronauts performing extravehicular activity (EVA). To reduce this risk, the addition of ten minutes of moderate exercise (75% VO2pk) during prebreathe has been shown to decrease the total prebreathe time from 4 to 2 hours and to decrease the incidence of DCS. The overall purpose of this pilot study was to develop an exercise protocol using flight hardware and an in-flight physical fitness cycle test to perform prebreathe exercise before an EVA. Eleven subjects volunteered to participate in this study. The first objective was to compare the steady-state heart rate (HR) and oxygen consumption (VO2) from a submaximal arm and leg exercise (ALE) session with those predicted from a maximal ALE test. The second objective was to compare the steady-state HR and VO2 from a submaximal elastic tube and leg exercise (TLE) session with those predicted from the maximal ALE test. The third objective involved a comparison of the maximal ALE test with a maximal leg-only (LE) test, to conform to the in-flight fitness assessment test. The 75% VO2pk target HR from the LE test was significantly less than the target HR from the ALE test. Prescribing exercise using data from the maximal ALE test resulted in measured submaximal VO2 and HR that were higher than predicted. The results of this pilot study suggest that elastic tubing is a valid method of arm exercise during EVA prebreathe when combined with the flight leg ergometer, and it is recommended that the prebreathe countermeasure exercise protocol incorporate this method.
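
    Prescribing a 75% VO2pk target from a maximal test is, at its core, a regression problem: fit the approximately linear HR-VO2 relation across the graded stages and read off the heart rate at the target VO2. A Python sketch with invented stage data:

        import numpy as np

        vo2 = np.array([1.0, 1.5, 2.0, 2.5, 3.0])   # L/min at successive test stages
        hr = np.array([95, 112, 128, 146, 161])     # beats/min at those stages
        vo2_peak = 3.2                              # L/min from the maximal test

        slope, intercept = np.polyfit(vo2, hr, 1)   # linear HR-VO2 fit
        target_hr = slope * (0.75 * vo2_peak) + intercept
        print(f"target HR at 75% VO2pk: {target_hr:.0f} bpm")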

  17. Descriptive epidemiology of peritoneal carcinomatosis of digestive origin at the Ibn Rochd University Hospital of Casablanca (2008-2010)

    PubMed Central

    Benlahfid, Mohammed; Erguibi, Driss; Elhattabi, Khalid; Bensardi, Fatimazahra; Khaiz, Driss; Lafriekh, Rachid; Rebroub, Dounia; Fadil, Abdelaziz; Aboussaouira, Touria

    2017-01-01

    Introduction Peritoneal carcinomatosis is an inevitably terminal dissemination in patients with abdominal cancers. It signals advanced or recurrent disease and is most often associated with a poor prognosis. About two thirds of all peritoneal carcinomatoses are of digestive origin and one third of non-digestive origin. Methods This retrospective descriptive study, conducted between January 2008 and December 2010, aimed to establish the epidemiological profile and risk factors of peritoneal carcinomatosis of digestive origin at the Casablanca University Hospital Center. Results Forty-seven cases of peritoneal carcinomatosis of digestive origin were recorded (22 women, 25 men), representing a prevalence of 6.19% and an average of 15.6 cases per year. Age was the main risk factor in our series, with a mean age of 55.55 ± 12.32 years. Family history was also a risk factor to be taken into consideration. Conclusion Our study concludes that the main risk factors for peritoneal carcinomatosis of digestive origin at the Ibn Rochd University Hospital Center of Casablanca are age and family history. PMID:28979636

  18. Afar-wide Crustal Strain Field from Multiple InSAR Tracks

    NASA Astrophysics Data System (ADS)

    Pagli, C.; Wright, T. J.; Wang, H.; Calais, E.; Bennati Rassion, L. S.; Ebinger, C. J.; Lewi, E.

    2010-12-01

    Onset of a rifting episode in the Dabbahu volcanic segment, Afar (Ethiopia), in 2005 renewed interest in crustal deformation studies in the area. As a consequence, an extensive geodetic data set, including InSAR and GPS measurements, has been acquired over Afar and holds great potential for improving our understanding of the extensional processes that operate during the final stages of continental rupture. The current geodetic observational and modelling strategy has focused on detailed, localised studies of dyke intrusions and eruptions, mainly in the Dabbahu segment. However, an eruption in the Erta ‘Ale volcanic segment in 2008 and a cluster of earthquakes observed in the Tat ‘Ale segment are testament to activity elsewhere in Afar. Here we make use of the vast geodetic dataset available to obtain strain information over the whole Afar depression. A systematic analysis of all the volcanic segments, including Dabbahu, Manda-Hararo, Alayta, Tat ‘Ale, Erta ‘Ale and the Djibouti deformation zone, is undertaken. We use InSAR data from multiple tracks together with available GPS measurements to obtain a velocity field model for Afar. We use over 300 radar images acquired by the Envisat satellite in both descending and ascending orbits, from 12 distinct tracks in image and wide swath modes, spanning the period from October 2005 to the present. We obtain the line-of-sight deformation rates from each InSAR track using a network approach and then combine the InSAR velocities with the GPS observations, as suggested by Wright and Wang (2010), following the method of England and Molnar (1997). A mesh is constructed over the Afar area, and we then solve for the horizontal and vertical velocities at each node. The resultant full 3D Afar-wide velocity field shows where current strains are accumulating within the various volcanic segments of Afar, the width of the plate boundary deformation zone, and possible connections between distinct volcanic segments on a regional scale. A comparison of crustal strains from the geodetic analysis with the seismicity data will also be made.
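
    The core of the velocity-field step can be sketched as a small least-squares problem: each InSAR track constrains the projection of a node's (east, north, up) velocity onto its line-of-sight unit vector, while GPS constrains the horizontal components directly. The geometry and rates below are invented for illustration; the real inversion solves for all mesh nodes at once, with smoothing constraints.

      import numpy as np

      # One mesh node; unknowns: east, north, up velocity components (mm/yr).
      v_true = np.array([12.0, 4.0, -2.0])

      # LOS unit vectors (east, north, up) for several Envisat tracks
      # (illustrative ascending and descending look geometries).
      los = np.array([
          [-0.38, 0.08, 0.92],   # descending track
          [ 0.38, 0.08, 0.92],   # ascending track
          [-0.41, 0.09, 0.91],   # second descending track
      ])
      d_insar = los @ v_true                    # LOS rates each track would observe

      # GPS constrains the horizontal components directly.
      G_gps = np.array([[1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])
      d_gps = G_gps @ v_true

      # Stack InSAR and GPS into one linear system and solve by least squares,
      # as in a (much simplified) velocity-field inversion.
      G = np.vstack([los, G_gps])
      d = np.concatenate([d_insar, d_gps])
      v_est, *_ = np.linalg.lstsq(G, d, rcond=None)
      print("recovered (E, N, U) velocity:", v_est)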

  19. A guide to enterotypes across the human body: meta-analysis of microbial community structures in human microbiome datasets.

    PubMed

    Koren, Omry; Knights, Dan; Gonzalez, Antonio; Waldron, Levi; Segata, Nicola; Knight, Rob; Huttenhower, Curtis; Ley, Ruth E

    2013-01-01

    Recent analyses of human-associated bacterial diversity have categorized individuals into 'enterotypes' or clusters based on the abundances of key bacterial genera in the gut microbiota. There is a lack of consensus, however, on the analytical basis for enterotypes and on the interpretation of these results. We tested how the following factors influenced the detection of enterotypes: clustering methodology, distance metrics, OTU-picking approaches, sequencing depth, data type (whole genome shotgun (WGS) vs. 16S rRNA gene sequence data), and 16S rRNA region. We included 16S rRNA gene sequences from the Human Microbiome Project (HMP) and from 16 additional studies, and WGS sequences from the HMP and MetaHIT. In most body sites, we observed smooth abundance gradients of key genera without discrete clustering of samples. Some body habitats displayed bimodal (e.g., gut) or multimodal (e.g., vagina) distributions of sample abundances, but not all clustering methods and workflows accurately highlight such clusters. Because identifying enterotypes in a dataset depends not only on the structure of the data but also on the methods used to assess clustering strength, we recommend that multiple approaches be used and compared when testing for enterotypes.
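
    A hedged sketch of the closing recommendation, assuming recent SciPy and scikit-learn (where AgglomerativeClustering takes metric= rather than the older affinity=): cluster a stand-in abundance table under several distance metrics and cluster counts and compare silhouette scores; uniformly low scores point to gradients rather than discrete enterotypes. The data here are random stand-ins, not HMP or MetaHIT data.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from sklearn.cluster import AgglomerativeClustering
      from sklearn.metrics import silhouette_score

      rng = np.random.default_rng(0)
      # Stand-in genus abundance table: 100 samples x 20 genera, rows sum to 1.
      counts = rng.gamma(shape=1.0, scale=1.0, size=(100, 20))
      abund = counts / counts.sum(axis=1, keepdims=True)

      # Compare cluster support across distance metrics and cluster numbers,
      # rather than committing to a single workflow.
      for metric in ("braycurtis", "jensenshannon", "euclidean"):
          dist = squareform(pdist(abund, metric=metric))
          for k in (2, 3, 4):
              labels = AgglomerativeClustering(
                  n_clusters=k, metric="precomputed", linkage="average"
              ).fit_predict(dist)
              score = silhouette_score(dist, labels, metric="precomputed")
              print(f"{metric:>14}  k={k}  silhouette={score:.2f}")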

  20. A Guide to Enterotypes across the Human Body: Meta-Analysis of Microbial Community Structures in Human Microbiome Datasets

    PubMed Central

    Waldron, Levi; Segata, Nicola; Knight, Rob; Huttenhower, Curtis; Ley, Ruth E.

    2013-01-01

    Recent analyses of human-associated bacterial diversity have categorized individuals into ‘enterotypes’ or clusters based on the abundances of key bacterial genera in the gut microbiota. There is a lack of consensus, however, on the analytical basis for enterotypes and on the interpretation of these results. We tested how the following factors influenced the detection of enterotypes: clustering methodology, distance metrics, OTU-picking approaches, sequencing depth, data type (whole genome shotgun (WGS) vs. 16S rRNA gene sequence data), and 16S rRNA region. We included 16S rRNA gene sequences from the Human Microbiome Project (HMP) and from 16 additional studies, and WGS sequences from the HMP and MetaHIT. In most body sites, we observed smooth abundance gradients of key genera without discrete clustering of samples. Some body habitats displayed bimodal (e.g., gut) or multimodal (e.g., vagina) distributions of sample abundances, but not all clustering methods and workflows accurately highlight such clusters. Because identifying enterotypes in a dataset depends not only on the structure of the data but also on the methods used to assess clustering strength, we recommend that multiple approaches be used and compared when testing for enterotypes. PMID:23326225

  1. The ICCAM platform study: An experimental medicine platform for evaluating new drugs for relapse prevention in addiction. Part B: fMRI description

    PubMed Central

    McGonigle, John; Murphy, Anna; Paterson, Louise M; Reed, Laurence J; Nestor, Liam; Nash, Jonathan; Elliott, Rebecca; Ersche, Karen D; Flechais, Remy SA; Newbould, Rexford; Orban, Csaba; Smith, Dana G; Taylor, Eleanor M; Waldman, Adam D; Robbins, Trevor W; Deakin, JF William; Nutt, David J; Lingford-Hughes, Anne R; Suckling, John

    2016-01-01

    Objectives: We aimed to set up a robust multi-centre clinical fMRI and neuropsychological platform to investigate the neuropharmacology of brain processes relevant to addiction – reward, impulsivity and emotional reactivity. Here we provide an overview of the fMRI battery, carried out across three centres, characterizing neuronal response to the tasks, along with exploring inter-centre differences in healthy participants. Experimental design: Three fMRI tasks were used: monetary incentive delay to probe reward sensitivity, go/no-go to probe impulsivity and an evocative images task to probe emotional reactivity. A coordinate-based activation likelihood estimation (ALE) meta-analysis was carried out for the reward and impulsivity tasks to help establish region of interest (ROI) placement. A group of healthy participants was recruited from across three centres (total n=43) to investigate inter-centre differences. Principal observations: The pattern of response observed for each of the three tasks was consistent with previous studies using similar paradigms. At the whole brain level, significant differences were not observed between centres for any task. Conclusions: In developing this platform we successfully integrated neuroimaging data from three centres, adapted validated tasks and applied whole brain and ROI approaches to explore and demonstrate their consistency across centres. PMID:27703042
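
    As a rough illustration of how such a coordinate-based ALE meta-analysis with FWE correction can be assembled today (the record does not say which software the authors used), here is a minimal sketch with the open-source NiMARE Python package; reward_foci.txt is a hypothetical Sleuth-format coordinate file, and output map names and defaults vary with NiMARE version.

      from nimare.io import convert_sleuth_to_dataset
      from nimare.meta.cbma.ale import ALE
      from nimare.correct import FWECorrector

      # Hypothetical Sleuth-format file of reported activation foci.
      dset = convert_sleuth_to_dataset("reward_foci.txt")

      ale = ALE()                # ALE with default kernel settings
      results = ale.fit(dset)

      # FWE correction via a Monte Carlo permutation procedure.
      corrector = FWECorrector(method="montecarlo", n_iters=10000)
      corrected = corrector.transform(results)

      # Save the corrected statistical maps; peaks in these maps can then
      # guide region-of-interest (ROI) placement for the fMRI tasks.
      corrected.save_maps(output_dir="ale_output")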

  2. Where Is the Semantic System? A Critical Review and Meta-Analysis of 120 Functional Neuroimaging Studies

    PubMed Central

    Desai, Rutvik H.; Graves, William W.; Conant, Lisa L.

    2009-01-01

    Semantic memory refers to knowledge about people, objects, actions, relations, self, and culture acquired through experience. The neural systems that store and retrieve this information have been studied for many years, but a consensus regarding their identity has not been reached. Using strict inclusion criteria, we analyzed 120 functional neuroimaging studies focusing on semantic processing. Reliable areas of activation in these studies were identified using the activation likelihood estimate (ALE) technique. These activations formed a distinct, left-lateralized network comprised of 7 regions: posterior inferior parietal lobe, middle temporal gyrus, fusiform and parahippocampal gyri, dorsomedial prefrontal cortex, inferior frontal gyrus, ventromedial prefrontal cortex, and posterior cingulate gyrus. Secondary analyses showed specific subregions of this network associated with knowledge of actions, manipulable artifacts, abstract concepts, and concrete concepts. The cortical regions involved in semantic processing can be grouped into 3 broad categories: posterior multimodal and heteromodal association cortex, heteromodal prefrontal cortex, and medial limbic regions. The expansion of these regions in the human relative to the nonhuman primate brain may explain uniquely human capacities to use language productively, plan, solve problems, and create cultural and technological artifacts, all of which depend on the fluid and efficient retrieval and manipulation of semantic knowledge. PMID:19329570

  3. The ICCAM platform study: An experimental medicine platform for evaluating new drugs for relapse prevention in addiction. Part B: fMRI description.

    PubMed

    McGonigle, John; Murphy, Anna; Paterson, Louise M; Reed, Laurence J; Nestor, Liam; Nash, Jonathan; Elliott, Rebecca; Ersche, Karen D; Flechais, Remy Sa; Newbould, Rexford; Orban, Csaba; Smith, Dana G; Taylor, Eleanor M; Waldman, Adam D; Robbins, Trevor W; Deakin, Jf William; Nutt, David J; Lingford-Hughes, Anne R; Suckling, John

    2017-01-01

    We aimed to set up a robust multi-centre clinical fMRI and neuropsychological platform to investigate the neuropharmacology of brain processes relevant to addiction - reward, impulsivity and emotional reactivity. Here we provide an overview of the fMRI battery, carried out across three centres, characterizing neuronal response to the tasks, along with exploring inter-centre differences in healthy participants. Three fMRI tasks were used: monetary incentive delay to probe reward sensitivity, go/no-go to probe impulsivity and an evocative images task to probe emotional reactivity. A coordinate-based activation likelihood estimation (ALE) meta-analysis was carried out for the reward and impulsivity tasks to help establish region of interest (ROI) placement. A group of healthy participants was recruited from across three centres (total n=43) to investigate inter-centre differences. Principal observations: The pattern of response observed for each of the three tasks was consistent with previous studies using similar paradigms. At the whole brain level, significant differences were not observed between centres for any task. In developing this platform we successfully integrated neuroimaging data from three centres, adapted validated tasks and applied whole brain and ROI approaches to explore and demonstrate their consistency across centres.

  4. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool that brings automated testing to Galaxy workflows, making it feasible to apply continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  5. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  6. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  7. 47 CFR 87.149 - Special requirements for automatic link establishment (ALE).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 47 (Telecommunication), Federal Communications Commission, Safety and Special Radio Services, Aviation Services, Technical Requirements, § 87.149: Special requirements for automatic link establishment (ALE).

  8. A cognitive task analysis of a visual analytic workflow: Exploring molecular interaction networks in systems biology.

    PubMed

    Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias

    2011-03-21

    Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based, cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: What functional interactions among gene products of high-throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis, four complementary representations of the targeted workflow were developed. They include: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.

  9. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final trained network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training and testing datasets, respectively. The workflow was found to be capable of reducing human interference and generating highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
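
    The abstract does not state how weights are read off the trained network; one common choice for this step is Garson's algorithm, which distributes importance according to the absolute input-to-hidden and hidden-to-output connection weights. The sketch below applies it to a synthetic stand-in for the 34-criteria dataset using scikit-learn.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(42)
      # Stand-in dataset: 500 locations x 6 criteria (the study used 34),
      # with a binary suitability label.
      X = rng.normal(size=(500, 6))
      y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=500) > 0).astype(int)

      net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      net.fit(X, y)

      # Garson's algorithm: derive relative criterion weights from the trained
      # network's connection weights (input->hidden and hidden->output).
      w_ih = np.abs(net.coefs_[0])            # shape (n_inputs, n_hidden)
      w_ho = np.abs(net.coefs_[1]).ravel()    # shape (n_hidden,)
      contrib = w_ih * w_ho                   # each input's contribution via each hidden unit
      contrib /= contrib.sum(axis=0)          # normalize per hidden unit
      weights = contrib.sum(axis=1)
      weights /= weights.sum()                # relative importance, sums to 1
      print("criterion weights:", np.round(weights, 3))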

  10. Dynamic Deployment Simulations of Inflatable Space Structures

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    2005-01-01

    The feasibility of using the Control Volume (CV) method and the Arbitrary Lagrangian-Eulerian (ALE) method in LS-DYNA to simulate the dynamic deployment of inflatable space structures is investigated. The CV and ALE methods were used to predict the inflation deployments of three folded tube configurations. The CV method was found to be a simple and computationally efficient method that may be adequate for modeling slow inflation deployment, since the inertia of the inflation gas can be neglected. The ALE method was found to be very computationally intensive, since it involves solving three conservation equations for the fluid as well as dealing with complex fluid-structure interactions.

  11. Fluorocarbon assisted atomic layer etching of SiO2 and Si using cyclic Ar/C4F8 and Ar/CHF3 plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzler, Dominik; Li, Chen; Engelmann, Sebastian

    The need for atomic layer etching (ALE) is steadily increasing as smaller critical dimensions and pitches are required in device patterning. A flux-controlled cyclic Ar/C4F8 ALE process based on a steady-state Ar plasma in conjunction with periodic, precise C4F8 injection and synchronized plasma-based low-energy Ar+ ion bombardment has been established for SiO2. In this work, the cyclic process is further characterized and extended to ALE of silicon under similar process conditions. The use of CHF3 as a precursor is examined and compared to C4F8. CHF3 is shown to enable selective SiO2/Si etching using a fluorocarbon (FC) film build-up. Other critical process parameters investigated are the FC film thickness deposited per cycle, the ion energy, and the etch step length. Etching behavior and mechanisms are studied using in situ real-time ellipsometry and X-ray photoelectron spectroscopy. Silicon ALE shows less self-limitation than silicon oxide due to higher physical sputtering rates at the maximum ion energies used in this work, which ranged from 20 to 30 eV. The surface chemistry is found to contain fluorinated silicon oxide during the etching of silicon. Plasma parameters during ALE are studied using a Langmuir probe to establish the impact of precursor addition on plasma properties.

  12. Photo-Oxidative Stress-Driven Mutagenesis and Adaptive Evolution on the Marine Diatom Phaeodactylum tricornutum for Enhanced Carotenoid Accumulation.

    PubMed

    Yi, Zhiqian; Xu, Maonian; Magnusdottir, Manuela; Zhang, Yuetuan; Brynjolfsson, Sigurdur; Fu, Weiqi

    2015-09-29

    Marine diatoms have recently gained much attention as they are expected to be a promising resource for the sustainable production of bioactive compounds such as carotenoids, and of biofuels as a future clean energy solution. To develop photosynthetic cell factories, it is important to improve diatoms for value-added products. In this study, we utilized UVC radiation to induce mutations in the marine diatom Phaeodactylum tricornutum and screened strains with enhanced accumulation of neutral lipids and carotenoids. Adaptive laboratory evolution (ALE) was used in parallel to develop altered phenotypic and biological functions in P. tricornutum; to date, this is the first report of ALE being successfully applied to diatoms to enhance growth performance and the productivity of value-added carotenoids. Liquid chromatography-mass spectrometry (LC-MS) was utilized to study the composition of major pigments in wild-type P. tricornutum, the UV mutants, and the ALE strains. UVC-irradiated strains exhibited higher accumulation of fucoxanthin as well as neutral lipids compared to their wild-type counterpart. In addition to UV mutagenesis, P. tricornutum strains developed by ALE also yielded enhanced biomass production and fucoxanthin accumulation under combined red and blue light. In short, both UV mutagenesis and ALE appeared to be effective approaches for developing desired phenotypes in marine diatoms via electromagnetic-radiation-induced oxidative stress.

  13. Photo-Oxidative Stress-Driven Mutagenesis and Adaptive Evolution on the Marine Diatom Phaeodactylum tricornutum for Enhanced Carotenoid Accumulation

    PubMed Central

    Yi, Zhiqian; Xu, Maonian; Magnusdottir, Manuela; Zhang, Yuetuan; Brynjolfsson, Sigurdur; Fu, Weiqi

    2015-01-01

    Marine diatoms have recently gained much attention as they are expected to be a promising resource for the sustainable production of bioactive compounds such as carotenoids, and of biofuels as a future clean energy solution. To develop photosynthetic cell factories, it is important to improve diatoms for value-added products. In this study, we utilized UVC radiation to induce mutations in the marine diatom Phaeodactylum tricornutum and screened strains with enhanced accumulation of neutral lipids and carotenoids. Adaptive laboratory evolution (ALE) was used in parallel to develop altered phenotypic and biological functions in P. tricornutum; to date, this is the first report of ALE being successfully applied to diatoms to enhance growth performance and the productivity of value-added carotenoids. Liquid chromatography-mass spectrometry (LC-MS) was utilized to study the composition of major pigments in wild-type P. tricornutum, the UV mutants, and the ALE strains. UVC-irradiated strains exhibited higher accumulation of fucoxanthin as well as neutral lipids compared to their wild-type counterpart. In addition to UV mutagenesis, P. tricornutum strains developed by ALE also yielded enhanced biomass production and fucoxanthin accumulation under combined red and blue light. In short, both UV mutagenesis and ALE appeared to be effective approaches for developing desired phenotypes in marine diatoms via electromagnetic-radiation-induced oxidative stress. PMID:26426027

  14. Numerical modeling of pulsed laser-material interaction and of laser plume dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Qiang; Shi, Yina

    2015-03-10

    We have developed a two-dimensional Arbitrary Lagrangian-Eulerian (ALE) code which is used to study the physical processes, the plasma absorption, the crater profile, and the temperature distribution on a metallic target and below its surface. The ALE method overcomes problems with Lagrangian moving-mesh distortion by mesh smoothing and by conservative remapping of quantities from the Lagrangian mesh to the smoothed one. A new second-order accurate diffusion solver has been implemented for thermal conduction and radiation transport on distorted meshes. The results of numerical simulations of pulsed laser ablation are presented. The influences of different processes, such as the time evolution of the surface temperature, interspecies interactions (elastic collisions, recombination-dissociation reactions), and interaction with an ambient gas, are examined. The study is of particular interest for the analysis of experimental results obtained during pulsed laser ablation.
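
    The remap step mentioned above is the heart of ALE: after the Lagrangian motion distorts the mesh and a smoothed mesh is produced, cell quantities must be transferred conservatively. A minimal first-order 1D sketch (not the authors' code) is:

      import numpy as np

      def conservative_remap(edges_old, u_old, edges_new):
          """First-order conservative remap of a piecewise-constant field from
          one 1D mesh to another: each new cell receives the overlap-weighted
          average of the old cells it intersects, conserving the integral."""
          u_new = np.zeros(len(edges_new) - 1)
          for j in range(len(edges_new) - 1):
              total = 0.0
              for i in range(len(edges_old) - 1):
                  overlap = (min(edges_old[i + 1], edges_new[j + 1])
                             - max(edges_old[i], edges_new[j]))
                  if overlap > 0.0:
                      total += u_old[i] * overlap
              u_new[j] = total / (edges_new[j + 1] - edges_new[j])
          return u_new

      # Lagrangian step distorts the mesh; smoothing restores regular spacing.
      edges_lag = np.array([0.0, 0.08, 0.22, 0.45, 0.75, 1.0])   # distorted mesh
      u_lag = np.array([1.0, 2.0, 4.0, 2.5, 1.5])                # cell-averaged field
      edges_smooth = np.linspace(0.0, 1.0, 6)                    # smoothed mesh

      u_smooth = conservative_remap(edges_lag, u_lag, edges_smooth)
      # The integral (mass) is preserved by construction:
      print(np.sum(u_lag * np.diff(edges_lag)), np.sum(u_smooth * np.diff(edges_smooth)))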

  15. A scientific workflow framework for (13)C metabolic flux analysis.

    PubMed

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.
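
    The framework's two central ideas, wrapping tools as callable services and collecting provenance for every workflow step, can be sketched generically. Everything below is illustrative; the real SWF wraps 13CFLUX2 and related tools as web services in a service-oriented architecture rather than as local functions.

      import hashlib, json, time

      def run_step(name, tool, inputs, provenance):
          """Execute one workflow step and append a provenance record."""
          start = time.time()
          outputs = tool(**inputs)
          provenance.append({
              "step": name,
              "inputs_hash": hashlib.sha256(
                  json.dumps(inputs, sort_keys=True, default=str).encode()).hexdigest(),
              "outputs": outputs,
              "wall_time_s": round(time.time() - start, 3),
          })
          return outputs

      # Stand-ins for wrapped services (a flux simulator would be called
      # through a web-service client instead).
      def fit_fluxes(model, labeling_data):
          return {"fluxes": {"v1": 1.0, "v2": 0.4}}

      def check_fit(fluxes):
          return {"ok": True}

      provenance = []
      fluxes = run_step("fit", fit_fluxes,
                        {"model": "network.fml", "labeling_data": "cumomers.csv"},
                        provenance)
      run_step("validate", check_fit, {"fluxes": fluxes}, provenance)
      print(json.dumps(provenance, indent=2))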

  16. Evaluating characteristics of PROSPERO records as predictors of eventual publication of non-Cochrane systematic reviews: a meta-epidemiological study protocol.

    PubMed

    Ruano, Juan; Gómez-García, Francisco; Gay-Mimbrera, Jesús; Aguilar-Luque, Macarena; Fernández-Rueda, José Luis; Fernández-Chaichio, Jesús; Alcalde-Mellado, Patricia; Carmona-Fernandez, Pedro J; Sanz-Cabanillas, Juan Luis; Viguera-Guerra, Isabel; Franco-García, Francisco; Cárdenas-Aranzana, Manuel; Romero, José Luis Hernández; Gonzalez-Padilla, Marcelino; Isla-Tejera, Beatriz; Garcia-Nieto, Antonio Velez

    2018-03-09

    Epidemiology and the reporting characteristics of systematic reviews (SRs) and meta-analyses (MAs) are well known. However, no study has analyzed the influence of protocol features on the probability that a study's results will be finally reported, thereby indirectly assessing the reporting bias of International Prospective Register of Systematic Reviews (PROSPERO) registration records. The objective of this study is to explore which factors are associated with a higher probability that results derived from a non-Cochrane PROSPERO registration record for a systematic review will be finally reported as an original article in a scientific journal. The PROSPERO repository will be web-scraped to automatically and iteratively obtain all completed non-Cochrane registration records stored from February 2011 to December 2017. Downloaded records will be screened, and those that are less than 90% complete, or that are duplicated (i.e., share titles and reviewers), will be excluded. Manual and human-supervised automatic methods will be used for data extraction, depending on the data source (fields of PROSPERO registration records, bibliometric databases, etc.). Records will be classified into published, discontinued, and abandoned review subgroups. All articles derived from published reviews will be obtained through multiple parallel searches using the full protocol "title" and/or "list reviewers" in MEDLINE/PubMed databases and Google Scholar. Reviewer, author, article, and journal metadata will be obtained using different sources. R and Python programming and analysis languages will be used to describe the datasets; perform text mining, machine learning, and deep learning analyses; and visualize the data. We will report the study according to the recommendations for meta-epidemiological studies adapted from the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement for SRs and MAs. This meta-epidemiological study will explore, for the first time, characteristics of PROSPERO records that may be associated with the publication of a completed systematic review. The evidence may help to improve review workflow performance in terms of research topic selection, decision-making regarding team selection, planning relationships with funding sources, implementing literature search strategies, and efficient data extraction and analysis. We expect to make our results, datasets, and R and Python code scripts publicly available during the third quarter of 2018.
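
    Only the screening step lends itself to a compact sketch: exclude records that are less than 90% complete and collapse duplicates that share a title and reviewer list. The records below are synthetic stand-ins; the study itself iteratively web-scrapes the PROSPERO repository.

      # Synthetic records; a real pipeline would populate these by scraping.
      records = [
          {"id": "CRD1", "title": "Biologics for psoriasis", "reviewers": ["A", "B"],
           "fields_filled": 0.95},
          {"id": "CRD2", "title": "Biologics for psoriasis", "reviewers": ["A", "B"],
           "fields_filled": 0.97},                      # duplicate of CRD1
          {"id": "CRD3", "title": "Statins and cognition", "reviewers": ["C"],
           "fields_filled": 0.70},                      # under the 90% threshold
      ]

      seen, screened = set(), []
      for rec in records:
          key = (rec["title"].lower(), tuple(sorted(rec["reviewers"])))
          if rec["fields_filled"] < 0.90 or key in seen:
              continue   # exclude incomplete and duplicated registrations
          seen.add(key)
          screened.append(rec)

      print([r["id"] for r in screened])   # -> ['CRD1']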

  17. Safety and feasibility of STAT RAD: Improvement of a novel rapid tomotherapy-based radiation therapy workflow by failure mode and effects analysis.

    PubMed

    Jones, Ryan T; Handsfield, Lydia; Read, Paul W; Wilson, David D; Van Ausdal, Ray; Schlesinger, David J; Siebers, Jeffrey V; Chen, Quan

    2015-01-01

    The clinical challenge of radiation therapy (RT) for painful bone metastases requires clinicians to consider both treatment efficacy and patient prognosis when selecting a radiation therapy regimen. The traditional RT workflow requires several weeks for common palliative RT schedules of 30 Gy in 10 fractions or 20 Gy in 5 fractions. At our institution, we have created a new RT workflow termed "STAT RAD" that allows clinicians to perform computed tomographic (CT) simulation, planning, and highly conformal single fraction treatment delivery within 2 hours. In this study, we evaluate the safety and feasibility of the STAT RAD workflow. A failure mode and effects analysis (FMEA) was performed on the STAT RAD workflow, including development of a process map, identification of potential failure modes, description of the cause and effect, temporal occurrence, and team member involvement in each failure mode, and examination of existing safety controls. A risk priority number (RPN) was calculated for each failure mode. As necessary, workflow adjustments were then made to safeguard failure modes of significant RPN values. After workflow alterations, RPN numbers were again recomputed. A total of 72 potential failure modes were identified in the pre-FMEA STAT RAD workflow, of which 22 met the RPN threshold for clinical significance. Workflow adjustments included the addition of a team member checklist, changing simulation from megavoltage CT to kilovoltage CT, alteration of patient-specific quality assurance testing, and allocating increased time for critical workflow steps. After these modifications, only 1 failure mode maintained RPN significance: patient motion after alignment or during treatment. Performing the FMEA for the STAT RAD workflow before clinical implementation has significantly strengthened the safety and feasibility of STAT RAD. The FMEA proved a valuable evaluation tool, identifying potential problem areas so that we could create a safer workflow. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
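
    The FMEA scoring behind this analysis is simple to show concretely: each failure mode is rated for severity, occurrence, and detectability, and the RPN is their product, with a threshold separating clinically significant modes. The modes, scores, and threshold below are invented for illustration, not taken from the study's 72 failure modes.

      # Risk priority number: RPN = severity x occurrence x detectability,
      # each conventionally rated on a 1-10 scale.
      failure_modes = [
          {"mode": "patient motion during treatment", "S": 8, "O": 4, "D": 6},
          {"mode": "wrong CT dataset loaded",          "S": 9, "O": 2, "D": 2},
          {"mode": "checklist step skipped",           "S": 5, "O": 3, "D": 3},
      ]

      RPN_THRESHOLD = 100   # hypothetical clinical-significance cutoff

      for fm in failure_modes:
          fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

      significant = [fm for fm in failure_modes if fm["RPN"] >= RPN_THRESHOLD]
      for fm in sorted(significant, key=lambda f: -f["RPN"]):
          print(f"RPN={fm['RPN']:3d}  {fm['mode']}")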

  18. Taverna: a tool for building and running workflows of services

    PubMed Central

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL). PMID:16845108

  19. Clinico-pathological correlation in adenylate kinase 5 autoimmune limbic encephalitis

    PubMed Central

    Ng, Adeline S.L.; Kramer, Joel; Centurion, Alejandro; Dalmau, Josep; Huang, Eric; Cotter, Jennifer A.; Geschwind, Michael D.

    2016-01-01

    Autoantibodies associated with autoimmune limbic encephalitis (ALE) have been well-characterized, with intracellular neuronal antibodies being less responsive to immunotherapy than antibodies to cell surface antigens. Adenylate kinase 5 (AK5) is a nucleoside monophosphate kinase vital for neuronal-specific metabolism and is located intracellularly in the cytosol and expressed exclusively in the brain. Antibodies to AK5 had been previously identified but were not known to be associated with human disease prior to the report of two patients with AK5-related ALE (Tuzun et al., 2007). We present the complete clinical picture for one of these patients and the first reported neuropathology for AK5 ALE. PMID:26439959

  20. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    PubMed

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of the surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for the robotic cholecystectomy surgery. The generated workflow was evaluated by 4 web-retrieved videos and 4 operation-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. The satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
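
    The abstract does not detail the quality-scoring formula, so the following is only a hedged sketch of one plausible reading: rank candidate videos by topic relevance to a query (TF-IDF cosine similarity) combined with a crude sentiment proxy. The transcripts, lexicon, and 0.7/0.3 weighting are all invented stand-ins.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      query = "laparoscopic cholecystectomy procedure steps"
      transcripts = {
          "vid1": "step by step laparoscopic cholecystectomy dissection of the gallbladder",
          "vid2": "unboxing review of a new phone camera",
          "vid3": "cholecystectomy procedure clear demonstration excellent technique",
      }

      vec = TfidfVectorizer()
      mat = vec.fit_transform([query] + list(transcripts.values()))
      relevance = cosine_similarity(mat[0], mat[1:]).ravel()

      positive_words = {"clear", "excellent", "demonstration"}   # toy sentiment lexicon
      def sentiment(text):
          words = text.split()
          return sum(w in positive_words for w in words) / len(words)

      for (vid, text), rel in zip(transcripts.items(), relevance):
          score = 0.7 * rel + 0.3 * sentiment(text)   # illustrative weighting
          print(f"{vid}: relevance={rel:.2f} combined={score:.2f}")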

  1. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS

    PubMed Central

    Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2016-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971

  2. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    PubMed

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
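
    Since the Common Workflow Language is central to Rabix, a minimal CWL v1.0 tool description may help make the format concrete. The sketch builds the document as a Python dict and emits YAML (PyYAML assumed available); the echo tool is a toy example, not one from the paper.

      import yaml

      # A minimal CWL v1.0 CommandLineTool: engines such as Rabix or cwltool
      # consume documents of this shape.
      tool = {
          "cwlVersion": "v1.0",
          "class": "CommandLineTool",
          "baseCommand": "echo",
          "inputs": {
              "message": {
                  "type": "string",
                  "inputBinding": {"position": 1},
              }
          },
          "outputs": [],   # echo writes to stdout; no file outputs declared
      }

      with open("echo-tool.cwl", "w") as fh:
          yaml.safe_dump(tool, fh, sort_keys=False)
      print(yaml.safe_dump(tool, sort_keys=False))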

  3. Don't Panic! Closed String Tachyons in ALE Spacetimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silverstein, Eva M

    2001-08-20

    We consider closed string tachyons localized at the fixed points of noncompact nonsupersymmetric orbifolds. We argue that tachyon condensation drives these orbifolds to flat space or supersymmetric ALE spaces. The decay proceeds via an expanding shell of dilaton gradients and curvature which interpolates between two regions of distinct angular geometry. The string coupling remains weak throughout. For small tachyon VEVs, evidence comes from quiver theories on D-brane probes, in which deformations by twisted couplings smoothly connect non-supersymmetric orbifolds to supersymmetric orbifolds of reduced order. For large tachyon VEVs, evidence comes from worldsheet RG flow and spacetime gravity. For C^2/Z_n, we exhibit infinite sequences of transitions producing SUSY ALE spaces via twisted closed string condensation from non-supersymmetric ALE spaces. In a T-dual description this provides a mechanism for creating NS5-branes via closed string tachyon condensation, similar to the creation of D-branes via open string tachyon condensation. We also apply our results to recent duality conjectures involving fluxbranes and the type 0 string.

  4. Nephroprotective potential of artichoke leaves extract against gentamicin in rats: Antioxidant mechanisms.

    PubMed

    Khattab, Hala Ah; Wazzan, Maha Am; Al-Ahdab, Maha A

    2016-09-01

    Nephrotoxicity represents a major health problem. This study aims to determine the nephroprotective potential of artichoke leaves extract (ALE) against gentamicin (GM) injection in male rats. Rats (n=30) were divided into a negative control group; a nephrotoxic (GM) group injected intraperitoneally (i.p.) with GM (100 mg/kg b.wt/d for 10 days); and groups administered ALE orally (200, 400 or 600 mg/kg b.wt/d) and injected with GM. The results revealed that GM injection induced marked nephrotoxicity, as evidenced by significant increases in kidney function parameters, albumin and potassium (K+), with significant decreases in serum levels of total protein and sodium (Na+) as compared with the negative control group. There was a significant increase in malondialdehyde (MDA) level in the GM group compared with the negative control group. Examined renal tissues showed severe changes manifested by atrophy of the glomerular tuft, necrosis of the epithelium lining the renal tubules with apoptosis of the tubular epithelium, and renal hemorrhage. Simultaneous administration of ALE during GM therapy protected kidney tissues, as evidenced by normalization of kidney biochemical parameters and minimized histopathological changes. Therefore, ALE has nephroprotective and antioxidant effects and thus could be beneficial for kidney patients.

  5. Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics

    NASA Astrophysics Data System (ADS)

    Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.

    2006-06-01

    Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.

  6. Workflow Management for Complex HEP Analyses

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large scale workflow systems, e.g. Apache’s Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step defines to run a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. those created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from a step's perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross-section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
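
    The make-like execution model described above can be sketched in a few lines: each step declares its dependencies, dependencies are executed first, completed steps are not re-run, and every execution leaves a report. Step names and outputs below are invented; this is not the AWM API.

      # Toy make-like step graph with per-step reports.
      steps = {
          "select_events": {"deps": [], "run": lambda: "selected.root"},
          "fit_statistics": {"deps": ["select_events"], "run": lambda: "posterior.json"},
          "make_plots": {"deps": ["fit_statistics"], "run": lambda: "plots.pdf"},
      }
      reports = {}   # bookkeeping: one report per executed step

      def execute(name):
          if name in reports:               # already done, reuse the report
              return reports[name]
          for dep in steps[name]["deps"]:   # make-like: run dependencies first
              execute(dep)
          output = steps[name]["run"]()
          reports[name] = {"step": name, "status": 0, "output": output}
          return reports[name]

      execute("make_plots")
      for rep in reports.values():
          print(rep)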

  7. Fluorocarbon assisted atomic layer etching of SiO2 and Si using cyclic Ar/C4F8 and Ar/CHF3 plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzler, Dominik; Oehrlein, Gottlieb S., E-mail: oehrlein@umd.edu; Li, Chen

    The need for atomic layer etching (ALE) is steadily increasing as smaller critical dimensions and pitches are required in device patterning. A flux-controlled cyclic Ar/C4F8 ALE process based on a steady-state Ar plasma in conjunction with periodic, precise C4F8 injection and synchronized plasma-based low-energy Ar+ ion bombardment has been established for SiO2 [Metzler et al., J. Vac. Sci. Technol. A 32, 020603 (2014)]. In this work, the cyclic process is further characterized and extended to ALE of silicon under similar process conditions. The use of CHF3 as a precursor is examined and compared to C4F8. CHF3 is shown to enable selective SiO2/Si etching using a fluorocarbon (FC) film build-up. Other critical process parameters investigated are the FC film thickness deposited per cycle, the ion energy, and the etch step length. Etching behavior and mechanisms are studied using in situ real-time ellipsometry and X-ray photoelectron spectroscopy. Silicon ALE shows less self-limitation than silicon oxide due to higher physical sputtering rates at the maximum ion energies used in this work, which ranged from 20 to 30 eV. The surface chemistry is found to contain fluorinated silicon oxide during the etching of silicon. Plasma parameters during ALE are studied using a Langmuir probe to establish the impact of precursor addition on plasma properties.

  8. Fluorocarbon assisted atomic layer etching of SiO2 and Si using cyclic Ar/C4F8 and Ar/CHF3 plasma

    DOE PAGES

    Metzler, Dominik; Li, Chen; Engelmann, Sebastian; ...

    2015-11-11

    The need for atomic layer etching (ALE) is steadily increasing as smaller critical dimensions and pitches are required in device patterning. A flux-controlled cyclic Ar/C4F8 ALE process based on a steady-state Ar plasma in conjunction with periodic, precise C4F8 injection and synchronized plasma-based low-energy Ar+ ion bombardment has been established for SiO2. In this work, the cyclic process is further characterized and extended to ALE of silicon under similar process conditions. The use of CHF3 as a precursor is examined and compared to C4F8. CHF3 is shown to enable selective SiO2/Si etching using a fluorocarbon (FC) film build-up. Other critical process parameters investigated are the FC film thickness deposited per cycle, the ion energy, and the etch step length. Etching behavior and mechanisms are studied using in situ real-time ellipsometry and X-ray photoelectron spectroscopy. Silicon ALE shows less self-limitation than silicon oxide due to higher physical sputtering rates at the maximum ion energies used in this work, which ranged from 20 to 30 eV. The surface chemistry is found to contain fluorinated silicon oxide during the etching of silicon. Plasma parameters during ALE are studied using a Langmuir probe to establish the impact of precursor addition on plasma properties.

  9. Anti-Inflammatory Effects of Artemisia Leaf Extract in Mice with Contact Dermatitis In Vitro and In Vivo.

    PubMed

    Yun, Chanyong; Jung, Youngchul; Chun, Wonjoo; Yang, Beodeul; Ryu, Junghyun; Lim, Chiyeon; Kim, Jung-Hoon; Kim, Hyungwoo; Cho, Su-In

    2016-01-01

    The leaves of Artemisia argyi Lev. et Vant. and A. princeps Pamp. are well known medicinal herbs used to treat patients in China, Japan, and Korea with skin problems such as eczema and itching, as well as abdominal pain and dysmenorrhoea. We investigated the anti-inflammatory effects of Artemisia leaf extract (ALE) using CD mice and Raw 264.7 cells. The effects of ALE on histopathological changes and cytokine production in ear tissues were assessed in mice with CD induced by 1-fluoro-2,4-dinitrobenzene (DNFB). Moreover, the anti-inflammatory effects on production levels of prostaglandin E2 (PGE2) and nitric oxide (NO) and expression levels of cyclooxygenase 2 (COX-2) and inducible nitric oxide synthase (iNOS) were investigated in Raw 264.7 cells. Topical application of ALE effectively prevented ear swelling induced by repeated DNFB application. ALE prevented epidermal hyperplasia and infiltration of immune cells and lowered the production of interferon- (IFN-) gamma (γ), tumour necrosis factor- (TNF-) alpha (α), and interleukin- (IL-) 6 in inflamed tissues. In addition, ALE inhibited expression of COX-2 and iNOS and production of NO and PGE2 in Raw 264.7 cells. These results indicate that Artemisia leaf can be used as a therapeutic agent for inflammatory skin diseases and that its anti-inflammatory effects are closely related to the inhibition of inflammatory mediator release from macrophages and inflammatory cytokine production in inflamed tissues.

  10. Anti-Inflammatory Effects of Artemisia Leaf Extract in Mice with Contact Dermatitis In Vitro and In Vivo

    PubMed Central

    Yun, Chanyong; Jung, Youngchul; Chun, Wonjoo; Yang, Beodeul; Ryu, Junghyun; Cho, Su-In

    2016-01-01

    The leaves of Artemisia argyi Lev. et Vant. and A. princeps Pamp. are well known medicinal herbs used to treat patients in China, Japan, and Korea with skin problems such as eczema and itching, as well as abdominal pain and dysmenorrhoea. We investigated the anti-inflammatory effects of Artemisia leaf extract (ALE) using CD mice and Raw 264.7 cells. The effects of ALE on histopathological changes and cytokine production in ear tissues were assessed in mice with CD induced by 1-fluoro-2,4-dinitrobenzene (DNFB). Moreover, the anti-inflammatory effects on production levels of prostaglandin E2 (PGE2) and nitric oxide (NO) and expression levels of cyclooxygenase 2 (COX-2) and inducible nitric oxide synthase (iNOS) were investigated in Raw 264.7 cells. Topical application of ALE effectively prevented ear swelling induced by repeated DNFB application. ALE prevented epidermal hyperplasia and infiltration of immune cells and lowered the production of interferon- (IFN-) gamma (γ), tumour necrosis factor- (TNF-) alpha (α), and interleukin- (IL-) 6 in inflamed tissues. In addition, ALE inhibited expression of COX-2 and iNOS and production of NO and PGE2 in Raw 264.7 cells. These results indicate that Artemisia leaf can be used as a therapeutic agent for inflammatory skin diseases and that its anti-inflammatory effects are closely related to the inhibition of inflammatory mediator release from macrophages and inflammatory cytokine production in inflamed tissues. PMID:27647952

  11. Potential of knowledge discovery using workflows implemented in the C3Grid

    NASA Astrophysics Data System (ADS)

    Engel, Thomas; Fink, Andreas; Ulbrich, Uwe; Schartner, Thomas; Dobler, Andreas; Fritzsch, Bernadette; Hiller, Wolfgang; Bräuer, Benny

    2013-04-01

    With the increasing number of climate simulations, reanalyses and observations, new infrastructures to search and analyse distributed data are necessary. In recent years, the Grid architecture became an important technology to fulfill these demands. For the German project "Collaborative Climate Community Data and Processing Grid" (C3Grid), computer scientists and meteorologists developed a system that offers its users a web interface to search and download climate data and to use implemented analysis tools (called workflows) to investigate them further. In this contribution, two workflows that are implemented in the C3Grid architecture are presented: the Cyclone Tracking (CT) and the Stormtrack workflow. They serve as an example of how to perform numerous investigations of midlatitude winter storms on a large amount of analysis and climate model data without insight into the data source or program code and with only a low-to-moderate understanding of the theoretical background. CT is based on the work of Murray and Simmonds (1991) and identifies and tracks local minima in the mean sea level pressure (MSLP) field of the selected dataset. Adjustable thresholds for the curvature of the isobars as well as the minimum lifetime of a cyclone allow the distinction of weak subtropical heat lows from stronger midlatitude cyclones, e.g. in the North Atlantic. The user receives the resulting track data, including statistics on track density, average central pressure, average central curvature, cyclogenesis and cyclolysis, as well as pre-built visualizations of these results. Stormtrack calculates the 2.5-6 day bandpass-filtered standard deviation of the geopotential height on a selected pressure level. Although this workflow needs much less computational effort than CT, it shows structures that are in good agreement with the track density of the CT workflow, raising the question of to what extent changes in the mid-level tropospheric storm track are reflected in the density and intensity of surface cyclones. A specific feature of C3Grid is the flexible Workflow Scheduling Service (WSS), which also allows for automated nightly analysis runs of CT, Stormtrack, etc. with different input parameter sets. The statistical results of these workflows can be accumulated afterwards by a scheduled final analysis step, thereby providing a tool for data-intensive analytics on the massive amounts of climate model data accessible through C3Grid. First tests with these automated analysis workflows show promising results for speeding up the investigation of high-volume modeling data. This example is relevant to the thorough analysis of future changes in storminess in Europe and is just one example of the potential of knowledge discovery using automated workflows implemented in the C3Grid architecture.
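
    The Stormtrack computation is compact enough to sketch: bandpass-filter the geopotential height in time to retain 2.5-6 day synoptic variability, then take the standard deviation over time at each grid point. The sketch below uses synthetic daily data and a Butterworth filter; the actual C3Grid implementation may differ in filter choice and sampling.

      import numpy as np
      from scipy.signal import butter, filtfilt

      # Synthetic daily 500 hPa geopotential height: time x lat x lon.
      rng = np.random.default_rng(1)
      z500 = 5500 + 50 * rng.standard_normal((1460, 36, 72))   # 4 years, daily

      # 2.5-6 day band at daily sampling (fs = 1/day): pass frequencies
      # between 1/6 and 1/2.5 cycles per day, normalized by the Nyquist
      # frequency of 0.5 cycles per day.
      nyq = 0.5
      b, a = butter(4, [(1 / 6) / nyq, (1 / 2.5) / nyq], btype="band")

      # Zero-phase filtering along the time axis, then the standard deviation
      # over time gives the storm-track measure at every grid point.
      z_bp = filtfilt(b, a, z500, axis=0)
      storm_track = z_bp.std(axis=0)
      print(storm_track.shape)   # (36, 72) map of bandpass-filtered variability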

  12. Efficient Exploration of the Space of Reconciled Gene Trees

    PubMed Central

    Szöllősi, Gergely J.; Rosikiewicz, Wojciech; Boussau, Bastien; Tannier, Eric; Daubin, Vincent

    2013-01-01

    Gene trees record the combination of gene-level events, such as duplication, transfer and loss (DTL), and species-level events, such as speciation and extinction. Gene tree–species tree reconciliation methods model these processes by drawing gene trees into the species tree using a series of gene and species-level events. The reconstruction of gene trees based on sequence alone almost always involves choosing between statistically equivalent or weakly distinguishable relationships that could be much better resolved based on a putative species tree. To exploit this potential for accurate reconstruction of gene trees, the space of reconciled gene trees must be explored according to a joint model of sequence evolution and gene tree–species tree reconciliation. Here we present amalgamated likelihood estimation (ALE), a probabilistic approach to exhaustively explore all reconciled gene trees that can be amalgamated as a combination of clades observed in a sample of gene trees. We implement the ALE approach in the context of a reconciliation model (Szöllősi et al. 2013), which allows for the DTL of genes. We use ALE to efficiently approximate the sum of the joint likelihood over amalgamations and to find the reconciled gene tree that maximizes the joint likelihood among all such trees. We demonstrate using simulations that gene trees reconstructed using the joint likelihood are substantially more accurate than those reconstructed using sequence alone. Using realistic gene tree topologies, branch lengths, and alignment sizes, we demonstrate that ALE produces more accurate gene trees even if the model of sequence evolution is greatly simplified. Finally, examining 1099 gene families from 36 cyanobacterial genomes, we find that joint likelihood-based inference results in a striking reduction in apparent phylogenetic discord, with 24%, 59%, and 46% reductions in the mean numbers of duplications, transfers, and losses per gene family, respectively. The open source implementation of ALE is available from https://github.com/ssolo/ALE.git. [amalgamation; gene tree reconciliation; gene tree reconstruction; lateral gene transfer; phylogeny.] PMID:23925510
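
    The first ingredient of amalgamation, tabulating how often each clade appears in a sample of gene trees, is easy to sketch. The code below uses the DendroPy library and three toy newick trees in place of a real bootstrap or MCMC sample; ALE itself then sums the joint likelihood over combinations of such clades.

      from collections import Counter
      import dendropy

      # Toy stand-in for a sample of gene trees (e.g., bootstrap replicates).
      sample = "((A,B),(C,D)); ((A,B),(C,D)); ((A,C),(B,D));"
      trees = dendropy.TreeList.get(data=sample, schema="newick")

      clade_counts = Counter()
      for tree in trees:
          for node in tree.preorder_node_iter():
              if node.is_leaf():
                  continue
              # A clade is identified by the set of leaf labels it spans.
              clade = frozenset(leaf.taxon.label for leaf in node.leaf_iter())
              clade_counts[clade] += 1

      # Observed clade frequencies approximate the support available to the
      # amalgamated likelihood.
      for clade, n in clade_counts.items():
          print(sorted(clade), n / len(trees))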

  13. Modeling Three-Dimensional Shock Initiation of PBX 9501 in ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leininger, L; Springer, H K; Mace, J

    A recent SMIS (Specific Munitions Impact Scenario) experimental series performed at Los Alamos National Laboratory has provided 3-dimensional shock initiation behavior of the HMX-based heterogeneous high explosive, PBX 9501. A series of finite element impact calculations have been performed in the ALE3D [1] hydrodynamic code and compared to the SMIS results to validate and study code predictions. These SMIS tests used a powder gun to shoot scaled NATO standard fragments into a cylinder of PBX 9501, which has a PMMA case and a steel impact cover. This SMIS real-world shot scenario creates a unique test-bed because (1) SMIS tests facilitate the investigation of 3D Shock to Detonation Transition (SDT) within the context of a considerable suite of diagnostics, and (2) many of the fragments arrive at the impact plate off-center and at an angle of impact. A particular goal of these model validation experiments is to demonstrate the predictive capability of the ALE3D implementation of the Tarver-Lee Ignition and Growth reactive flow model [2] within a fully 3-dimensional regime of SDT. The 3-dimensional Arbitrary Lagrange Eulerian (ALE) hydrodynamic model in ALE3D applies the Ignition and Growth (I&G) reactive flow model with PBX 9501 parameters derived from historical 1-dimensional experimental data. The model includes the off-center and angle of impact variations seen in the experiments. Qualitatively, the ALE3D I&G calculations reproduce observed 'Go/No-Go' 3D Shock to Detonation Transition (SDT) reaction in the explosive, as well as the case expansion recorded by a high-speed optical camera. Quantitatively, the calculations show good agreement with the shock time of arrival at internal and external diagnostic pins. This exercise demonstrates the utility of the Ignition and Growth model applied for the response of heterogeneous high explosives in the SDT regime.
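
    For orientation, the Tarver-Lee I&G model expresses the reaction rate as a sum of ignition, growth and completion terms gated by the burn fraction. The sketch below is a hedged paraphrase of that rate law in Python; the parameter values shown are placeholders, not the calibrated PBX 9501 set used in ALE3D.

      # A hedged paraphrase of the three-term Ignition and Growth (I&G)
      # reaction rate [2]: an ignition term driven by compression plus
      # pressure-driven growth and completion terms, each gated by the burn
      # fraction F. All parameter values below are placeholders.
      def ig_rate(F, P, rho, rho0, p):
          """Reaction rate dF/dt for burn fraction F, pressure P, and
          current/initial densities rho, rho0; p is a parameter dict."""
          rate = 0.0
          compression = rho / rho0 - 1.0 - p["a"]
          if F < p["F_ig_max"] and compression > 0.0:   # ignition
              rate += p["I"] * (1.0 - F) ** p["b"] * compression ** p["x"]
          if F < p["F_G1_max"]:                         # growth
              rate += p["G1"] * (1.0 - F) ** p["c"] * F ** p["d"] * P ** p["y"]
          if F > p["F_G2_min"]:                         # completion
              rate += p["G2"] * (1.0 - F) ** p["e"] * F ** p["g"] * P ** p["z"]
          return rate

      # Placeholder parameters, for illustration only.
      p = {"I": 4.0e6, "b": 0.667, "a": 0.0, "x": 7.0, "F_ig_max": 0.3,
           "G1": 60.0, "c": 0.667, "d": 0.111, "y": 1.0, "F_G1_max": 0.5,
           "G2": 400.0, "e": 0.333, "g": 1.0, "z": 2.0, "F_G2_min": 0.1}
      print(ig_rate(F=0.2, P=5.0, rho=2.1, rho0=1.84, p=p))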

  14. In Situ Infrared Spectroscopic Studies of Molecular Layer Deposition and Atomic Layer Etching Processes

    NASA Astrophysics Data System (ADS)

    DuMont, Jaime Willadean

    In this thesis, in situ Fourier transform infrared (FTIR) spectroscopy was used to study: i) the growth and pyrolysis of molecular layer deposition (MLD) films, and ii) the surface chemistry of atomic layer etching (ALE) processes. Atomic layer processes such as molecular layer deposition (MLD) and atomic layer etching (ALE) are techniques that can add or remove material with atomic level precision using sequential, self-limiting surface reactions. Deposition and removal processes at the atomic scale are powerful tools for many industrial and research applications such as energy storage and semiconductor nanofabrication. The first section of this thesis describes the chemistry of reactions leading to the MLD of aluminum and tin alkoxide polymer films known as "alucone" and "tincone", respectively. The subsequent pyrolysis of these films to produce metal oxide/carbon composites was also investigated. In situ FTIR spectroscopy was conducted to monitor surface species during MLD film growth and to monitor the films' background infrared absorbance versus pyrolysis temperature. Ex situ techniques such as transmission electron microscopy (TEM), four-point probe and X-ray diffraction (XRD) were utilized to study the properties of the films post-pyrolysis. TEM confirmed that the pyrolyzed films maintained conformality during post-processing. Four-point probe measurements monitored film resistivity versus pyrolysis temperature, and XRD determined the film crystallinity. The second section of this thesis focuses on the surface chemistry of Al2O3 and SiO2 ALE processes. Thermal ALE processes have recently been developed which utilize sequential fluorination and ligand-exchange reactions. An intimate knowledge of the surface chemistry is important in understanding the ALE process. In this section, the competition between Al2O3 etching and AlF3 growth that occurs during sequential HF (fluorinating agent) and TMA (ligand-exchange agent) exposures is investigated using in situ FTIR spectroscopy. Also included in this section is the first demonstration of thermal ALE for SiO2. In situ FTIR spectroscopy was conducted to monitor the loss of bulk Si-O vibrational modes corresponding to the removal of SiO2. FTIR was also used to monitor surface species during each ALE half-cycle and to verify self-limiting behavior. X-ray reflectivity experiments were conducted to establish etch rates on thermal oxide silicon wafers.
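
    The self-limiting character of such sequential exposures can be illustrated with a toy model in which each half-reaction saturates a finite pool of surface sites, so the etch per cycle becomes independent of overdosing. This sketch is illustrative only and assumes Langmuir-type saturation; it is not taken from the thesis.

      # Toy model (not from the thesis) of why sequential, self-limiting
      # half-reactions give a constant etch per cycle: each exposure
      # saturates a finite pool of surface sites, so overdosing changes
      # nothing. Numbers are invented.
      import math

      def coverage(dose, k=1.0):
          """Fraction of surface sites reacted after a given precursor dose."""
          return 1.0 - math.exp(-k * dose)

      def etch_per_cycle(hf_dose, tma_dose, saturated_etch_nm=0.2):
          """Material removed by one HF + TMA (ligand-exchange) cycle (nm)."""
          # A cycle removes only what both half-reactions have activated.
          return saturated_etch_nm * min(coverage(hf_dose), coverage(tma_dose))

      for dose in (1, 2, 5, 50):
          print(dose, round(etch_per_cycle(dose, dose), 4))  # -> 0.2 nm/cycle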

  15. Metabolic flux and nodes control analysis of brewer's yeasts under different fermentation temperature during beer brewing.

    PubMed

    Yu, Zhimin; Zhao, Haifeng; Zhao, Mouming; Lei, Hongjie; Li, Huiping

    2012-12-01

    The aim of this work was to further investigate the glycolysis performance of lager and ale brewer's yeasts under different fermentation temperatures using a combined analysis of metabolic flux, glycolytic enzyme activities, and flux control. The results indicated that the fluxes through the glycolytic pathway decreased when the fermentation temperature was changed from 15 °C to 10 °C, which resulted in prolonged fermentation times. The maximum activities (Vmax) of hexokinase (HK), phosphofructokinase (PFK), and pyruvate kinase (PK) at key nodes of the glycolytic pathway decreased with decreasing fermentation temperature and were estimated to exert differing degrees of control (22-84%) over the glycolytic fluxes in the exponential and flocculent phases. Moreover, the decrease in the Vmax of PFK or PK played the crucial role in the down-regulation of flux in the flocculent phase. In addition, the metabolic state of the ale strain was more sensitive to temperature variation than that of the lager strain. The results of the metabolic flux and nodes control analysis of brewer's yeasts under different fermentation temperatures may provide an alternative approach to regulating glycolytic flux by changing Vmax and to improving production efficiency and beer quality.
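
    The control analysis rests on flux control coefficients: the fractional change in pathway flux per fractional change in an enzyme's Vmax. A minimal sketch, with invented numbers:

      # Minimal sketch of a flux control coefficient: the fractional change
      # in glycolytic flux J per fractional change in an enzyme's Vmax.
      # All numbers are invented for illustration.
      def flux_control_coefficient(J_ref, J_new, V_ref, V_new):
          return ((J_new - J_ref) / J_ref) / ((V_new - V_ref) / V_ref)

      # A 10% drop in PFK Vmax that cuts pathway flux by 6% implies a
      # control coefficient of 0.6 for PFK at that node.
      print(flux_control_coefficient(J_ref=1.00, J_new=0.94,
                                     V_ref=1.00, V_new=0.90))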

  16. SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.

    PubMed

    Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B

    2016-02-04

    Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.

  17. A novel rail defect detection method based on undecimated lifting wavelet packet transform and Shannon entropy-improved adaptive line enhancer

    NASA Astrophysics Data System (ADS)

    Hao, Qiushi; Zhang, Xin; Wang, Yan; Shen, Yi; Makis, Viliam

    2018-07-01

    Acoustic emission (AE) technology is sensitive to subliminal rail defects; however, strong wheel-rail rolling contact noise under high-speed conditions has severely impeded the detection of rail defects with traditional denoising methods. In this context, the paper develops an adaptive detection method for rail cracks, which combines multiresolution analysis with an improved adaptive line enhancer (ALE). To obtain detailed multiresolution information on transient crack signals at low computational cost, a lifting scheme-based undecimated wavelet packet transform is adopted. To capture the impulsive character of crack signals, a Shannon entropy-improved ALE is proposed as a signal-enhancing approach, in which Shannon entropy is introduced to improve the cost function. A rail defect detection plan based on the proposed method for high-speed conditions is then put forward. Theoretical analysis and experimental verification demonstrate that the proposed method has superior performance in enhancing the rail defect AE signal and reducing the strong background noise, offering an effective multiresolution approach for rail defect detection under high-speed, strong-noise conditions.
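
    For reference, a plain LMS adaptive line enhancer separates correlated content (predicted from a delayed copy of the input) from uncorrelated transients left in the error. The sketch below shows that standard structure only; the paper's Shannon-entropy-improved cost function is not reproduced here, and the delay, tap count and step size are illustrative.

      # Standard LMS adaptive line enhancer (ALE): the input is predicted
      # from a delayed copy of itself, so correlated (narrowband) content
      # appears at the filter output while transients remain in the error.
      import numpy as np

      def lms_ale(x, delay=5, taps=32, mu=0.01):
          """Return (enhanced, error) components of signal x."""
          n = len(x)
          w = np.zeros(taps)
          enhanced, error = np.zeros(n), np.zeros(n)
          xd = np.concatenate([np.zeros(delay), x])[:n]   # delayed input
          for k in range(taps, n):
              u = xd[k - taps:k][::-1]    # most recent delayed samples first
              y = w @ u                   # filter output (predictable part)
              e = x[k] - y                # prediction error (transient part)
              w += 2 * mu * e * u         # LMS weight update
              enhanced[k], error[k] = y, e
          return enhanced, error

      t = np.arange(2000)
      x = np.sin(0.2 * t) + 0.5 * np.random.randn(2000)   # tone + noise
      enhanced, error = lms_ale(x)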

  18. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    PubMed

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read SD files from stdin and write SD files to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a LINUX cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) a workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigation, (2) the creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) the analysis of strain energy in small molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract: Example KNIME workflow with chemalot nodes and the corresponding command line pipe.

  19. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
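
    The Cases-and-Workflows concept can be sketched as a small data model in which a workflow is an ordered list of reusable modules and a case binds one patient to one workflow. The names and module functions below are invented for illustration and are not the BIR framework's API.

      # Toy data model of the Cases-and-Workflows idea: a workflow is an
      # ordered list of reusable modules; a case binds one patient to one
      # workflow. All names and modules are invented.
      from dataclasses import dataclass, field
      from typing import Callable, List

      Module = Callable[[dict], dict]      # each module transforms a context

      @dataclass
      class Workflow:
          name: str
          modules: List[Module] = field(default_factory=list)

          def run(self, context: dict) -> dict:
              for module in self.modules:  # modules run in clinical order
                  context = module(context)
              return context

      @dataclass
      class Case:
          patient_id: str
          workflow: Workflow

          def execute(self) -> dict:
              return self.workflow.run({"patient": self.patient_id})

      def load_images(ctx):
          ctx["images"] = f"DICOM series for {ctx['patient']}"
          return ctx

      def segment(ctx):
          ctx["mask"] = "segmentation of " + ctx["images"]
          return ctx

      print(Case("P001", Workflow("volumetry", [load_images, segment])).execute())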

  20. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902

  1. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  2. Towards a Unified Architecture for Data-Intensive Seismology in VERCE

    NASA Astrophysics Data System (ADS)

    Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.

    2013-12-01

    Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently be made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases and interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.). Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel workflow processing elements offered by the VERCE library or override them, introducing custom scientific code using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed transparently on a number of underlying stream-processing engines, such as STORM or OGSA-DAI. While making efficient use of arbitrarily distributed resources and large datasets is a priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.

  3. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    PubMed

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.

  4. A Web-Hosted R Workflow to Simplify and Automate the Analysis of 16S NGS Data

    EPA Science Inventory

    Next-Generation Sequencing (NGS) produces large data sets that include tens-of-thousands of sequence reads per sample. For analysis of bacterial diversity, 16S NGS sequences are typically analyzed in a workflow containing best-of-breed bioinformatics packages that may levera...

  5. Next-generation sequencing meets genetic diagnostics: development of a comprehensive workflow for the analysis of BRCA1 and BRCA2 genes

    PubMed Central

    Feliubadaló, Lídia; Lopez-Doriga, Adriana; Castellsagué, Ester; del Valle, Jesús; Menéndez, Mireia; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Gómez, Carolina; Campos, Olga; Pineda, Marta; González, Sara; Moreno, Victor; Brunet, Joan; Blanco, Ignacio; Serra, Eduard; Capellá, Gabriel; Lázaro, Conxi

    2013-01-01

    Next-generation sequencing (NGS) is changing genetic diagnosis due to its huge sequencing capacity and cost-effectiveness. The aim of this study was to develop an NGS-based workflow for routine diagnostics for hereditary breast and ovarian cancer syndrome (HBOCS), to improve genetic testing for BRCA1 and BRCA2. An NGS-based workflow was designed using BRCA MASTR kit amplicon libraries followed by GS Junior pyrosequencing. Data analysis combined the freely available Variant Identification Pipeline software and ad hoc R scripts, including a cascade of filters to generate coverage and variant calling reports. A BRCA homopolymer assay was performed in parallel. A research scheme was designed in two parts. A Training Set of 28 DNA samples containing 23 unique pathogenic mutations and 213 other variants (33 unique) was used. The workflow was validated in a set of 14 samples from HBOCS families in parallel with the current diagnostic workflow (Validation Set). The NGS-based workflow developed permitted the identification of all pathogenic mutations and genetic variants, including those located in or close to homopolymers. The use of NGS for detecting copy-number alterations was also investigated. The workflow meets the sensitivity and specificity requirements for the genetic diagnosis of HBOCS and improves on the cost-effectiveness of current approaches. PMID:23249957

  6. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.

  7. Inequality in Participation in Adult Learning and Education (ALE): Effects of Micro- and Macro-Level Factors through a Comparative Study

    ERIC Educational Resources Information Center

    Lee, Jeongwoo

    2017-01-01

    The objectives of this dissertation include describing and analyzing the patterns of inequality in ALE participation at both the micro and macro levels. Special attention is paid to social origins of individual adults and their association with two groups of macro-level factors, social inequality (income, education, and skill inequality) and…

  8. Ecological perspectives of land use history: The Arid Lands Ecology (ALE) Reserve

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinds, N R; Rogers, L E

    The objective of this study was to gather information on the land use history of the Arid Lands Ecology (ALE) Reserve so that current ecological research could be placed within a historical perspective. The data were gathered in the early 1980s by interviewing former users of the land and from previously published research (where available). Interviews with former land users of the ALE Reserve in Benton County, Washington, revealed that major land uses from 1880 to 1940 were homesteading, grazing, oil/gas production, and road building. Land use practices associated with grazing and homesteading have left the greatest impact on the landscape. Disturbed sites where succession is characterized by non-native species, plots where sagebrush was railed away, and sheep trails are major indications today of past land uses. Recent estimates of annual bunchgrass production on ALE do not support the widespread belief that bunchgrasses were more productive during the homesteading era, though the invasion of cheatgrass (Bromus tectorum), Jim Hill mustard (Sisymbrium altissimum), and other European alien plant species has altered pre-settlement succession patterns. 15 refs., 6 figs., 1 tab.

  9. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  10. REPRODUCIBLE RESEARCH WORKFLOW IN R FOR THE ANALYSIS OF PERSONALIZED HUMAN MICROBIOME DATA.

    PubMed

    Callahan, Benjamin; Proctor, Diana; Relman, David; Fukuyama, Julia; Holmes, Susan

    2016-01-01

    This article presents a reproducible research workflow for amplicon-based microbiome studies in personalized medicine created using Bioconductor packages and the knitr markdown interface. We show that sometimes a multiplicity of choices and lack of consistent documentation at each stage of the sequential processing pipeline used for the analysis of microbiome data can lead to spurious results. We propose its replacement with reproducible and documented analysis using R packages dada2, knitr, and phyloseq. This workflow implements both key stages of amplicon analysis: the initial filtering and denoising steps needed to construct taxonomic feature tables from error-containing sequencing reads (dada2), and the exploratory and inferential analysis of those feature tables and associated sample metadata (phyloseq). This workflow facilitates reproducible interrogation of the full set of choices required in microbiome studies. We present several examples in which we leverage existing packages for analysis in a way that allows easy sharing and modification by others, and give pointers to articles that depend on this reproducible workflow for the study of longitudinal and spatial series analyses of the vaginal microbiome in pregnancy and the oral microbiome in humans with healthy dentition and intra-oral tissues.

  11. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  12. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Vickie E.; Borreguero, Jose M.; Bhowmik, Debsindhu

    Highlights: an automated workflow to optimize force-field parameters; the workflow was used to optimize force-field parameters for a system containing nanodiamond and tRNA; the mechanism relies on molecular dynamics simulation and neutron scattering experimental data; the workflow can be generalized to other experimental and simulation techniques. Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments to establish a connection between the fundamental physics at the nanoscale and data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail by using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D2O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of nanodiamond than without the ND. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
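
    The optimization loop the highlights describe can be sketched as a least-squares fit of FF parameters against the experimental observable. The stand-in function below replaces the expensive MD-plus-scattering calculation with a toy model; all names and values are hypothetical.

      # Hedged sketch of the FF-optimization loop: minimize the squared
      # mismatch between a simulated observable and the QENS data. The
      # stand-in below replaces the expensive MD + scattering calculation
      # with a toy model.
      import numpy as np
      from scipy.optimize import minimize

      def simulated_observable(ff_params):
          """Hypothetical stand-in for 'run MD with these ND-water FF
          parameters and compute the QENS observable'."""
          eps, sigma = ff_params
          return eps * np.exp(-np.linspace(0.0, 1.0, 50) / sigma)

      qens_data = simulated_observable([1.0, 0.3])   # pretend experiment

      def cost(ff_params):
          return np.sum((simulated_observable(ff_params) - qens_data) ** 2)

      result = minimize(cost, x0=[0.5, 0.5], method="Nelder-Mead")
      print(result.x)   # recovers roughly [1.0, 0.3]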

  13. Disentangling the neural mechanisms involved in Hinduism- and Buddhism-related meditations.

    PubMed

    Tomasino, Barbara; Chiesa, Alberto; Fabbro, Franco

    2014-10-01

    The most diffuse forms of meditation derive from the Hindu and Buddhist spiritual traditions. Different cognitive processes are set in place to reach these meditative states. According to a historical-philological hypothesis (Wynne, 2009), the two forms of meditation can be disentangled. While mindfulness is the focus of Buddhist meditation, reached by focusing sustained attention on the body, on breathing and on the content of thoughts, reaching an ineffable state of nothingness accompanied by a loss of the sense of self and duality (Samadhi) is the main focus of Hinduism-inspired meditation. It is possible that these different practices activate separate brain networks. We tested this hypothesis by conducting an activation likelihood estimation (ALE) meta-analysis of functional magnetic resonance imaging (fMRI) studies. The network related to Buddhism-inspired meditation (16 experiments, 263 subjects, and 96 activation foci) included activations in some frontal lobe structures associated with executive attention, possibly confirming the fundamental role of mindfulness shared by many Buddhist meditations. By contrast, the network related to Hinduism-inspired meditation (8 experiments, 66 subjects, and 54 activation foci) triggered a left-lateralized network of areas including the postcentral gyrus, the superior parietal lobe, the hippocampus and the right middle cingulate cortex. The dissociation between anterior and posterior networks supports the notion that different meditation styles and traditions are characterized by different patterns of neural activation. Copyright © 2014. Published by Elsevier Inc.
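
    For readers unfamiliar with the method, the core ALE statistic is the voxel-wise union of per-experiment modeled activation maps. The sketch below illustrates this on a 1D grid, a deliberate simplification of the 3D brain case; the kernel width and foci are invented.

      # Compact sketch of the core ALE statistic: each experiment's foci are
      # smoothed into a modeled activation (MA) map, and ALE at a voxel is
      # the union 1 - prod(1 - MA_i) over experiments.
      import numpy as np

      def ma_map(foci, grid, fwhm=12.0):
          """Per-voxel maximum over Gaussian kernels centered at the foci."""
          sigma = fwhm / 2.3548                 # FWHM -> standard deviation
          return np.max([np.exp(-((grid - f) ** 2) / (2 * sigma ** 2))
                         for f in foci], axis=0)

      def ale(experiments, grid):
          """ALE map: probability that at least one experiment activates."""
          ale_map = np.ones_like(grid)
          for foci in experiments:
              ale_map *= 1.0 - ma_map(foci, grid)
          return 1.0 - ale_map

      grid = np.linspace(0.0, 100.0, 101)
      print(ale([[40.0, 42.0], [41.0], [80.0]], grid).max())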

  14. Application of cyclic fluorocarbon/argon discharges to device patterning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzler, Dominik, E-mail: dmetzler@umd.edu; Uppireddi, Kishore; Bruce, Robert L.

    2016-01-15

    With increasing demands on device patterning to achieve smaller critical dimensions and pitches for the 5 nm node and beyond, the need for atomic layer etching (ALE) is steadily increasing. In this work, a cyclic fluorocarbon/Ar plasma is successfully used for ALE patterning in a manufacturing scale reactor. Self-limited etching of silicon oxide is observed. The impact of various process parameters on the etch performance is established. The substrate temperature has been shown to play an especially significant role, with lower temperatures leading to higher selectivity and lower etch rates, but worse pattern fidelity. The cyclic ALE approach established with this work is shown to have great potential for small scale device patterning, showing self-limited etching, improved uniformity and resist mask performance.

  15. Application of cyclic fluorocarbon/argon discharges to device patterning

    DOE PAGES

    Metzler, Dominik; Uppiredi, Kishore; Bruce, Robert L.; ...

    2015-11-13

    With increasing demands on device patterning to achieve smaller critical dimensions and pitches for the 5 nm node and beyond, the need for atomic layer etching (ALE) is steadily increasing. In this study, a cyclic fluorocarbon/Ar plasma is successfully used for ALE patterning in a manufacturing scale reactor. Self-limited etching of silicon oxide is observed. The impact of various process parameters on the etch performance is established. The substrate temperature has been shown to play an especially significant role, with lower temperatures leading to higher selectivity and lower etch rates, but worse pattern fidelity. The cyclic ALE approach established with this work is shown to have great potential for small scale device patterning, showing self-limited etching, improved uniformity and resist mask performance.

  16. Research Infrastructure for Collaborative Team Science: Challenges in Technology-Supported Workflows in and Across Laboratories, Institutions, and Geographies.

    PubMed

    Mirel, Barbara; Luo, Airong; Harris, Marcelline

    2015-05-01

    Collaborative research has many challenges. One under-researched challenge is how to align collaborators' research practices and evolving analytical reasoning with technologies and configurations of technologies that best support them. The goal of such alignment is to enhance collaborative problem solving capabilities in research. Toward this end, we draw on our own research and a synthesis of the literature to characterize the workflow of collaborating scientists in systems-level renal disease research. We describe the various phases of a hypothetical workflow among diverse collaborators within and across laboratories, extending from their primary analysis through secondary analysis. For each phase, we highlight required technology supports and, at times, complementary organizational supports. This survey, matching collaborators' analysis practices and needs in research projects to technological supports, is preliminary; it aims ultimately at developing a research capability framework that can help scientists and technologists mutually understand workflows and the technologies that can enable and enhance them. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Cryptanalysis of the Sodark Family of Cipher Algorithms

    DTIC Science & Technology

    2017-09-01

    A software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements... Abstract (excerpt): The second- and third-generation automatic link establishment (ALE) systems for high frequency radios... Radios utilizing ALE technology are in use by a...

  18. myExperiment: a repository and social network for the sharing of bioinformatics workflows

    PubMed Central

    Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-01-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment has grown to over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment, including its REST web service, is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605

  19. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  20. Visualization and Analysis for Near-Real-Time Decision Making in Distributed Workflows

    DOE PAGES

    Pugmire, David; Kress, James; Choi, Jong; ...

    2016-08-04

    Data-driven science is becoming increasingly common and complex, and is placing tremendous stresses on visualization and analysis frameworks. Data sources producing 10 GB per second (and more) are becoming increasingly commonplace in simulation, sensor and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware and workflow systems. This paper discusses initial research into visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions about large volumes of time-varying data.

  1. Oqtans: the RNA-seq workbench in the cloud for complete and reproducible quantitative transcriptome analysis.

    PubMed

    Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar

    2014-05-01

    We present Oqtans, an open-source workbench for quantitative transcriptome analysis that is integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine-learning-powered tools into Galaxy, which show performance superior or equal to state-of-the-art tools. Implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy-to-understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git), most of which is also available from (iv) the Galaxy Toolshed, and (v) a share string to use along with Galaxy CloudMan.

  2. The Nasa-Isro SAR Mission Science Data Products and Processing Workflows

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Agram, P. S.; Lavalle, M.; Cohen, J.; Buckley, S.; Kumar, R.; Misra-Ray, A.; Ramanujam, V.; Agarwal, K. M.

    2017-12-01

    The NASA-ISRO SAR (NISAR) Mission is currently in the development phase and in the process of specifying its suite of data products and algorithmic workflows, responding to inputs from the NISAR Science and Applications Team. NISAR will provide raw data (Level 0), full-resolution complex imagery (Level 1), and interferometric and polarimetric image products (Level 2) for the entire data set, in both natural radar and geocoded coordinates. NASA and ISRO are coordinating the formats, meta-data layers, and algorithms for these products, for both the NASA-provided L-band radar and the ISRO-provided S-band radar. Higher-level products will also be generated for the purpose of calibration and validation over large areas of Earth, including tectonic plate boundaries, ice sheets and sea ice, and areas of ecosystem disturbance and change. This level of comprehensive product generation is unprecedented for past SAR missions and leads to storage and processing challenges for the production system and the archive center. Further, recognizing the potential to support applications that require low-latency product generation and delivery, the NISAR team is optimizing the entire end-to-end ground data system for such response, including exploring the advantages of cloud-based processing, algorithmic acceleration using GPUs, and on-demand processing schemes that minimize computational and transport costs but allow rapid delivery to science and applications users. This paper will review the current products and workflows, and discuss the scientific and operational trade-space of mission capabilities.

  3. Galaxy-M: a Galaxy workflow for processing and analyzing direct infusion and liquid chromatography mass spectrometry-based metabolomics data.

    PubMed

    Davidson, Robert L; Weber, Ralf J M; Liu, Haoyu; Sharma-Oates, Archana; Viant, Mark R

    2016-01-01

    Metabolomics is increasingly recognized as an invaluable tool in the biological, medical and environmental sciences yet lags behind the methodological maturity of other omics fields. To achieve its full potential, including the integration of multiple omics modalities, the accessibility, standardization and reproducibility of computational metabolomics tools must be improved significantly. Here we present our end-to-end mass spectrometry metabolomics workflow in the widely used platform, Galaxy. Named Galaxy-M, our workflow has been developed for both direct infusion mass spectrometry (DIMS) and liquid chromatography mass spectrometry (LC-MS) metabolomics. The range of tools presented spans from processing of raw data, e.g. peak picking and alignment, through data cleansing, e.g. missing value imputation, to preparation for statistical analysis, e.g. normalization and scaling, and principal components analysis (PCA) with associated statistical evaluation. We demonstrate the ease of using these Galaxy workflows via the analysis of DIMS and LC-MS datasets, and provide PCA scores and associated statistics to help other users to ensure that they can accurately repeat the processing and analysis of these two datasets. Galaxy and data are all provided pre-installed in a virtual machine (VM) that can be downloaded from the GigaDB repository. Additionally, source code, executables and installation instructions are available from GitHub. The Galaxy platform has enabled us to produce an easily accessible and reproducible computational metabolomics workflow. More tools could be added by the community to expand its functionality. We recommend that Galaxy-M workflow files are included within the supplementary information of publications, enabling metabolomics studies to achieve greater reproducibility.
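
    The final stages of such a workflow (imputation, scaling, PCA) can be sketched compactly. This is illustrative Python, not Galaxy-M code, and the toy intensity matrix is randomly generated:

      # Illustrative sketch (not Galaxy-M code) of the workflow's final
      # stages: impute missing values, scale, then compute PCA scores from a
      # samples-by-features metabolomics intensity matrix.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.impute import SimpleImputer
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.lognormal(size=(20, 500))           # toy peak-intensity matrix
      X[rng.random(X.shape) < 0.05] = np.nan      # 5% missing intensities

      X_imputed = SimpleImputer(strategy="median").fit_transform(X)
      X_scaled = StandardScaler().fit_transform(X_imputed)
      scores = PCA(n_components=2).fit_transform(X_scaled)
      print(scores.shape)                         # (20, 2) scores for plotting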

  4. Use of adaptive laboratory evolution to discover key mutations enabling rapid growth of Escherichia coli K-12 MG1655 on glucose minimal medium.

    PubMed

    LaCroix, Ryan A; Sandberg, Troy E; O'Brien, Edward J; Utrilla, Jose; Ebrahim, Ali; Guzman, Gabriela I; Szubin, Richard; Palsson, Bernhard O; Feist, Adam M

    2015-01-01

    Adaptive laboratory evolution (ALE) has emerged as an effective tool for scientific discovery and addressing biotechnological needs. Much of ALE's utility is derived from reproducibly obtained fitness increases. Identifying causal genetic changes and their combinatorial effects is challenging and time-consuming. Understanding how these genetic changes enable increased fitness can be difficult. A series of approaches that address these challenges was developed and demonstrated using Escherichia coli K-12 MG1655 on glucose minimal media at 37°C. By keeping E. coli in constant substrate excess and exponential growth, fitness increases up to 1.6-fold were obtained compared to the wild type. These increases are comparable to previously reported maximum growth rates in similar conditions but were obtained over a shorter time frame. Across the eight replicate ALE experiments performed, causal mutations were identified using three approaches: identifying mutations in the same gene/region across replicate experiments, sequencing strains before and after computationally determined fitness jumps, and allelic replacement coupled with targeted ALE of reconstructed strains. Three genetic regions were most often mutated: the global transcription gene rpoB, an 82-bp deletion between the metabolic pyrE gene and rph, and an IS element between the DNA structural gene hns and tdk. Model-derived classification of gene expression revealed a number of processes important for increased growth that were missed using a gene classification system alone. The methods described here represent a powerful combination of technologies to increase the speed and efficiency of ALE studies. The identified mutations can be examined as genetic parts for increasing growth rate in a desired strain and for understanding rapid growth phenotypes. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
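
    The first of the three approaches, flagging genes or regions mutated independently across replicate endpoints, amounts to a simple recurrence count. A minimal sketch with invented replicate data (the gene names echo the abstract, except ygaZ, which is a hypothetical filler):

      # Minimal sketch of the first approach: flag genes/regions mutated
      # independently in several replicate ALE endpoints. The replicate
      # assignments are invented; ygaZ is a hypothetical filler gene.
      from collections import Counter

      replicate_mutations = [
          {"rpoB", "pyrE/rph", "hns/tdk"},
          {"rpoB", "hns/tdk", "ygaZ"},
          {"rpoB", "pyrE/rph"},
      ]

      counts = Counter(g for reps in replicate_mutations for g in reps)
      recurrent = sorted(g for g, k in counts.items() if k >= 2)
      print(recurrent)   # ['hns/tdk', 'pyrE/rph', 'rpoB']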

  5. Use of Adaptive Laboratory Evolution To Discover Key Mutations Enabling Rapid Growth of Escherichia coli K-12 MG1655 on Glucose Minimal Medium

    PubMed Central

    LaCroix, Ryan A.; Sandberg, Troy E.; O'Brien, Edward J.; Utrilla, Jose; Ebrahim, Ali; Guzman, Gabriela I.; Szubin, Richard; Palsson, Bernhard O.

    2014-01-01

    Adaptive laboratory evolution (ALE) has emerged as an effective tool for scientific discovery and addressing biotechnological needs. Much of ALE's utility is derived from reproducibly obtained fitness increases. Identifying causal genetic changes and their combinatorial effects is challenging and time-consuming. Understanding how these genetic changes enable increased fitness can be difficult. A series of approaches that address these challenges was developed and demonstrated using Escherichia coli K-12 MG1655 on glucose minimal media at 37°C. By keeping E. coli in constant substrate excess and exponential growth, fitness increases up to 1.6-fold were obtained compared to the wild type. These increases are comparable to previously reported maximum growth rates in similar conditions but were obtained over a shorter time frame. Across the eight replicate ALE experiments performed, causal mutations were identified using three approaches: identifying mutations in the same gene/region across replicate experiments, sequencing strains before and after computationally determined fitness jumps, and allelic replacement coupled with targeted ALE of reconstructed strains. Three genetic regions were most often mutated: the global transcription gene rpoB, an 82-bp deletion between the metabolic pyrE gene and rph, and an IS element between the DNA structural gene hns and tdk. Model-derived classification of gene expression revealed a number of processes important for increased growth that were missed using a gene classification system alone. The methods described here represent a powerful combination of technologies to increase the speed and efficiency of ALE studies. The identified mutations can be examined as genetic parts for increasing growth rate in a desired strain and for understanding rapid growth phenotypes. PMID:25304508

  6. Impact of a grout curtain on groundwater regime in karst: the example of the Ðale reservoir (Croatia)

    NASA Astrophysics Data System (ADS)

    Bonacci, Ognjen; Roje-Bonacci, Tanja

    2010-05-01

    Construction of grout curtains in karst terrains is primarily connected with dams and reservoirs. Their role is to increase watertightness and prevent progressive erosion. In this presentation, continuous hourly measurements of the groundwater level in two deep piezometers near the Đale reservoir are analysed. The Đale reservoir on the Cetina River began operation in 1989. The total length of the grout curtain is 3.9 km. It extends 120 m below the Đale dam. The first analysed piezometer, A, is drilled in the interior part of the system, between the reservoir and the grout curtain, while the second, B, is located in the external part. The distance between them is 200 m. Under natural conditions, prior to construction of the grout curtain, groundwater level fluctuations in the two piezometers were practically identical. Construction of the grout curtain drastically changed groundwater behaviour in both. During six months of continuous monitoring, the difference between their groundwater levels ranged from +19.86 m (groundwater in B lower than in A) to -12.77 m (groundwater in A lower than in B). For 77% of the analysed period, the groundwater level in the interior piezometer A was higher than in the external piezometer B; for the remaining 23%, the level in the external piezometer B was higher than in the interior piezometer A. The construction of the grout curtain caused unnaturally high hydrostatic gradients, which can accelerate the dissolutional expansion of karst fractures. As a result, unacceptable leakage from the Đale reservoir could develop over its lifetime. Careful analysis of groundwater level behaviour also reveals other important characteristics of karst underground morphology.

  7. An architecture model for multiple disease management information systems.

    PubMed

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support their daily practice and to optimize inefficient workflows. Several discussions have indicated that information technology plays an important role in the era of disease management. While applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform inefficient workflows, and propose an architecture model that enhances reusability and saves time in information system development. The proposed architecture model was successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumption analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half the development time was consumed, and the workflow was improved. Overall user feedback was positive, with high ratings for supportiveness during the daily workflow. The system empowers case managers with better information and leads to better decision making.

  8. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    PubMed

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P < .0001), with higher sensitivity (85% vs. 69%) and specificity (94% vs. 73%) for margin prediction in a 5 × 5-mm grid analysis. A significantly better alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  9. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
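
    The three-stage decomposition above lends itself to a simple additive time model: total time is transfer time plus queue wait plus compute time, evaluated per candidate site. The Python sketch below is a hypothetical illustration of that idea; all site parameters and timings are invented, not the paper's calibrated models.

      # Minimal sketch of the three-stage execution-time model (illustrative only).
      def estimate_workflow_time(data_gb, bandwidth_gbps, queue_wait_s,
                                 work_units, unit_time_s, cores):
          transfer_s = data_gb * 8 / bandwidth_gbps     # stage (i): data transfer
          compute_s = work_units * unit_time_s / cores  # stage (iii): reconstruction
          return transfer_s + queue_wait_s + compute_s  # stage (ii): queue wait

      # Compare two hypothetical sites and pick the faster one.
      sites = {
          "local_cluster": dict(bandwidth_gbps=10, queue_wait_s=60, cores=64),
          "remote_hpc":    dict(bandwidth_gbps=1, queue_wait_s=1800, cores=4096),
      }
      for name, cfg in sites.items():
          t = estimate_workflow_time(data_gb=500, work_units=2048, unit_time_s=30, **cfg)
          print(f"{name}: {t/3600:.2f} h")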

  10. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149

  11. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  12. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade-space exploration of science return for proposed instruments by modeling the whole ground-truth, sensing, and retrieval chain, and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions, where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by its respective domain experts.
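
    The component-wrapping idea can be made concrete with a small sketch. The hypothetical Python/Flask service below stands in for one legacy modeling component; the endpoint name, payload fields, and scaling factor are all invented. The real OSSE wraps MATLAB, Excel, and FORTRAN programs as Web Services in the same spirit, so each stage exposes a uniform interface to the next stage in the chain.

      # Hypothetical wrapper exposing one modeling component as a web service.
      from flask import Flask, request, jsonify

      app = Flask(__name__)

      @app.route("/radiative_transfer", methods=["POST"])
      def radiative_transfer():
          spectrum = request.get_json()["reflectance"]  # output of the upstream component
          radiance = [0.8 * r for r in spectrum]        # placeholder for the legacy model
          return jsonify({"radiance": radiance})        # consumed by the next service

      if __name__ == "__main__":
          app.run(port=8080)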

  13. AstroGrid: Taverna in the Virtual Observatory .

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data while using computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.

  14. Comparison of ALE and SPH Simulations of Vertical Drop Tests of a Composite Fuselage Section into Water

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fuchs, Yvonne T.

    2008-01-01

    Simulation of multi-terrain impact has been identified as an important research area for improved prediction of rotorcraft crashworthiness within the NASA Subsonic Rotary Wing Aeronautics Program on Rotorcraft Crashworthiness. As part of this effort, two vertical drop tests were conducted of a 5-ft-diameter composite fuselage section into water. For the first test, the fuselage section was impacted in a baseline configuration without energy absorbers. For the second test, the fuselage section was retrofitted with a composite honeycomb energy absorber. Both tests were conducted at a nominal velocity of 25-ft/s. A detailed finite element model was developed to represent each test article and water impact was simulated using both Arbitrary Lagrangian Eulerian (ALE) and Smooth Particle Hydrodynamics (SPH) approaches in LS-DYNA, a nonlinear, explicit transient dynamic finite element code. Analytical predictions were correlated with experimental data for both test configurations. In addition, studies were performed to evaluate the influence of mesh density on test-analysis correlation.

  15. Asterism: an integrated, complete, and open-source approach for running seismologist continuous data-intensive analysis on heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows scientists to develop applications locally and then automatically parallelizes and scales them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstraction provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the input/output data at a logical level, the data dependencies between tasks, and the e-Infrastructures and execution engine used to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources deployed as Docker containers: an MPI cluster and a Storm cluster. Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.
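
    The two-phase pattern (preprocess, then cross-correlate) can be sketched independently of the workflow machinery. The minimal NumPy example below is illustrative only: in Asterism each phase is a dispel4py workflow orchestrated by Pegasus, whereas here the station traces are synthetic and the phases are plain functions.

      import numpy as np

      def phase1_preprocess(trace):
          """Detrend and amplitude-normalize one station's trace."""
          trace = trace - trace.mean()
          peak = np.abs(trace).max()
          return trace / peak if peak > 0 else trace

      def phase2_cross_correlate(a, b):
          """Cross-correlate two preprocessed traces."""
          return np.correlate(a, b, mode="full")

      rng = np.random.default_rng(0)
      traces = [phase1_preprocess(rng.standard_normal(1024)) for _ in range(3)]
      xc = phase2_cross_correlate(traces[0], traces[1])
      print("lag of maximum correlation:", xc.argmax() - (len(traces[0]) - 1))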

  16. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics

    PubMed Central

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe

    2015-01-01

    Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831

  17. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics.

    PubMed

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A; Caron, Christophe

    2015-05-01

    The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). contact@workflow4metabolomics.org. © The Author 2014. Published by Oxford University Press.

  18. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze performance when the workflow data or execution measurements run to terabytes or petabytes and when the workflows span a large number of nodes with multiple parallel task executions. To help identify performance bottlenecks or debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  19. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols

    DTIC Science & Technology

    2016-06-01

    US Army Research Laboratory technical report ARL-TR-7696 (June 2016) by Robert E Jensen, Daniel C DeSchepper, and David P Flanagan. Approved for public release; distribution unlimited. Only front-matter fragments survive in this record, including a list of tables (e.g., Table 1, "Single-lap-joint experimental parameters").

  20. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Also included are common components such as forest plot interpretation and the software that may be used; special cases for meta-analysis, such as subgroup analysis, individual patient data, and meta-regression; and a discussion of criticisms.
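
    As a concrete illustration of the statistics discussed above, the short Python sketch below pools study effect sizes with fixed-effect inverse-variance weighting and computes Cochran's Q and the I² heterogeneity statistic. The effect sizes and standard errors are invented for demonstration.

      import numpy as np

      effects = np.array([0.30, 0.45, 0.12, 0.50])  # per-study effect sizes (invented)
      se = np.array([0.10, 0.15, 0.12, 0.20])       # their standard errors (invented)

      w = 1 / se**2                                 # inverse-variance weights
      pooled = (w * effects).sum() / w.sum()        # fixed-effect pooled estimate
      pooled_se = np.sqrt(1 / w.sum())

      Q = (w * (effects - pooled)**2).sum()         # Cochran's Q
      df = len(effects) - 1
      I2 = max(0.0, (Q - df) / Q) * 100             # heterogeneity, in percent

      print(f"pooled = {pooled:.3f} +/- {pooled_se:.3f}, I^2 = {I2:.1f}%")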

  1. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool to create their analyses while visualizing in real time the effects of their actions: they can inspect the structure of their configuration, look at the modules included in the workflow, examine the dependencies existing among the modules, and check the data flow. They can see the values to which parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
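
    To give a flavor of what the Config Editor manipulates, here is a minimal sketch of a CMSSW-style Python job configuration. It assumes the CMSSW environment (which provides FWCore), so it is illustrative rather than standalone-runnable, and the analyzer name and its threshold parameter are hypothetical. The GUI's role is to let users inspect and modify exactly this kind of module graph and its parameters.

      # Minimal sketch of a Python job configuration (requires the CMSSW environment).
      import FWCore.ParameterSet.Config as cms

      process = cms.Process("ANALYSIS")
      process.source = cms.Source("PoolSource",
          fileNames=cms.untracked.vstring("file:input.root"))
      process.maxEvents = cms.untracked.PSet(input=cms.untracked.int32(100))

      # One hypothetical analysis module; the editor exposes parameters like
      # this threshold and shows how the module fits into the data flow.
      process.demo = cms.EDAnalyzer("MuonAnalyzer",
          minPt=cms.double(20.0))

      process.p = cms.Path(process.demo)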

  2. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  3. Taking advantage of HTML5 browsers to realize the concepts of session state and workflow sharing in web-tool applications

    NASA Astrophysics Data System (ADS)

    Suftin, I.; Read, J. S.; Walker, J.

    2013-12-01

    Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage, depending on either a centralized file system or a database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser cookie, by contrast, holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser, either with an expiration date or just for a session. This allows scientists to create, update, persist, and share their workflow without depending on the back-end to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as a download. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was generated. A user may then view results produced during that session or go back and alter input parameters, creating new results and producing new, unique sessions which they can again share. This technique not only provides independence for the user to manage their session as they like, but also allows much greater freedom for the application provider to scale out without having to worry about carrying over user information or maintaining it in a central location.
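
    The session concept is easiest to see through the serialization itself. The sketch below uses Python and invented tool names purely for illustration; in DSASWeb the same JSON round-trip happens in the browser via the Web Storage API, with the session offered to the analyst as a plain-text file.

      import json

      # Each workflow step is recorded as plain data, so the whole session
      # can be written out, shared, and replayed later.
      session = {
          "steps": [
              {"tool": "shoreline_extraction", "params": {"threshold": 0.4}},
              {"tool": "rate_calculation",     "params": {"method": "linear"}},
          ]
      }

      with open("session.json", "w") as f:       # save as a shareable text file
          json.dump(session, f, indent=2)

      with open("session.json") as f:            # restore the application state
          restored = json.load(f)
      for step in restored["steps"]:
          print("re-running", step["tool"], "with", step["params"])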

  4. Explore Care Pathways of Colorectal Cancer Patients with Social Network Analysis.

    PubMed

    Huo, Tianyao; George, Thomas J; Guo, Yi; He, Zhe; Prosperi, Mattia; Modave, François; Bian, Jiang

    2017-01-01

    Patients with colorectal cancer (CRC) often face treatment delays, and the exact reasons have not been well studied. This study explores clinical workflow patterns for CRC patients using electronic health records (EHRs). In particular, we modeled the clinical workflow (provider-provider interactions) of a CRC patient's workup period as a social network and identified clusters of workflow patterns based on network characteristics. Understanding these patterns will help guide healthcare policy-making and practice.
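
    A provider-interaction network of the kind described can be assembled with a few lines of Python using the networkx library. The providers and encounters below are hypothetical stand-ins for the EHR-derived data, and a real analysis would cluster many such per-patient networks by their characteristics.

      import networkx as nx

      # Hypothetical provider-provider interactions during one patient's workup:
      # an edge means two providers acted on the same patient in a time window.
      encounters = [
          ("PCP", "Gastroenterologist"),
          ("Gastroenterologist", "Pathologist"),
          ("Pathologist", "Oncologist"),
          ("Gastroenterologist", "Radiologist"),
          ("Radiologist", "Oncologist"),
      ]

      G = nx.Graph()
      G.add_edges_from(encounters)

      # Network characteristics of the kind used to cluster workflow patterns.
      print("density:", round(nx.density(G), 3))
      print("degree centrality:", nx.degree_centrality(G))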

  5. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  6. CyREST: Turbocharging Cytoscape Access for External Tools via a RESTful API.

    PubMed

    Ono, Keiichiro; Muetze, Tanja; Kolishovski, Georgi; Shannon, Paul; Demchak, Barry

    2015-01-01

    As bioinformatic workflows become increasingly complex and involve multiple specialized tools, so does the difficulty of reliably reproducing those workflows. Cytoscape is a critical workflow component for executing network visualization, analysis, and publishing tasks, but it can be operated only manually via a point-and-click user interface. Consequently, Cytoscape-oriented tasks are laborious and often error prone, especially with multistep protocols involving many networks. In this paper, we present the new cyREST Cytoscape app and accompanying harmonization libraries. Together, they improve workflow reproducibility and researcher productivity by enabling popular languages (e.g., Python and R, JavaScript, and C#) and tools (e.g., IPython/Jupyter Notebook and RStudio) to directly define and query networks, and perform network analysis, layouts and renderings. We describe cyREST's API and overall construction, and present Python- and R-based examples that illustrate how Cytoscape can be integrated into large scale data analysis pipelines. cyREST is available in the Cytoscape app store (http://apps.cytoscape.org) where it has been downloaded over 1900 times since its release in late 2014.
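
    For instance, a script can create a network in a running Cytoscape session with plain HTTP calls. The hedged sketch below assumes Cytoscape is running with cyREST on its default port (1234) and uses a toy two-node network; consult the cyREST documentation for the authoritative endpoint list and response shapes.

      import requests

      BASE = "http://localhost:1234/v1"           # cyREST default base URL

      network = {
          "data": {"name": "demo"},
          "elements": {
              "nodes": [{"data": {"id": "a"}}, {"data": {"id": "b"}}],
              "edges": [{"data": {"source": "a", "target": "b"}}],
          },
      }

      resp = requests.post(BASE + "/networks", json=network)
      resp.raise_for_status()
      print("created network SUID:", resp.json().get("networkSUID"))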

  7. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan Timothy; Hackenberg, Robert Errol

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: Data Challenges in Materials; Aging in U-Nb Alloys; Building an Aging Model; Different Phase Trans. in U-Nb; the Challenge; Storing Materials Data; Example Data Source; Organizing Data: What is a Schema?; What does a "XML Schema" look like?; Our Data Schema: Nice and Simple; Storing Data: Materials Data Curation System (MDCS); Problem with MDCS: Slow Data Entry; Getting Literature into MDCS; Staging Data in Excel Document; Final Result: MDCS Records; Analyzing Image Data; Process for Making TTT Diagram; Bottleneck Number 1: Image Analysis; Fitting a TTP Boundary; Fitting a TTP Curve: Comparable Results; How Does it Compare to Our Data?; Image Analysis Workflow; Curating Hardness Records; Hardness Data: Two Key Decisions; Before Peak Age? - Automation; Interactive Viz; Which Transformation?; Microstructure-Informed Model; Tracking the Entire Process; General Problem with Property Models; Pinyon: Toolkit for Managing Model Creation; Tracking Individual Decisions; Jupyter: Docs and Code in One File; Hardness Analysis Workflow; Workflow for Aging Models; and conclusions.

  8. Scalable and cost-effective NGS genotyping in the cloud.

    PubMed

    Souilmi, Yassine; Lancaster, Alex K; Jung, Jae-Yoon; Rizzo, Ettore; Hawkins, Jared B; Powles, Ryan; Amzazi, Saaïd; Ghazal, Hassan; Tonellato, Peter J; Wall, Dennis P

    2015-10-15

    While next-generation sequencing (NGS) costs have plummeted in recent years, the cost and complexity of computation remain substantial barriers to the use of NGS in routine clinical care. The clinical potential of NGS will not be realized until robust and routine whole genome sequencing data can be accurately rendered to medically actionable reports within a time window of hours and at economies of scale in the tens of dollars. We take a step towards addressing this challenge by using COSMOS, a cloud-enabled workflow management system, to develop GenomeKey, an NGS whole genome analysis workflow. COSMOS implements complex workflows making optimal use of high-performance compute clusters. Here we show that the Amazon Web Services (AWS) implementation of GenomeKey via COSMOS provides a fast, scalable, and cost-effective analysis of both public benchmarking and large-scale heterogeneous clinical NGS datasets. Our systematic benchmarking reveals important new insights and considerations for achieving clinical turnaround of whole genome analysis, including strategic batching of individual genomes and efficient cluster resource configuration.

  9. CyREST: Turbocharging Cytoscape Access for External Tools via a RESTful API

    PubMed Central

    Ono, Keiichiro; Muetze, Tanja; Kolishovski, Georgi; Shannon, Paul; Demchak, Barry

    2015-01-01

    As bioinformatic workflows become increasingly complex and involve multiple specialized tools, so does the difficulty of reliably reproducing those workflows. Cytoscape is a critical workflow component for executing network visualization, analysis, and publishing tasks, but it can be operated only manually via a point-and-click user interface. Consequently, Cytoscape-oriented tasks are laborious and often error prone, especially with multistep protocols involving many networks. In this paper, we present the new cyREST Cytoscape app and accompanying harmonization libraries. Together, they improve workflow reproducibility and researcher productivity by enabling popular languages (e.g., Python and R, JavaScript, and C#) and tools (e.g., IPython/Jupyter Notebook and RStudio) to directly define and query networks, and perform network analysis, layouts and renderings. We describe cyREST’s API and overall construction, and present Python- and R-based examples that illustrate how Cytoscape can be integrated into large scale data analysis pipelines. cyREST is available in the Cytoscape app store (http://apps.cytoscape.org) where it has been downloaded over 1900 times since its release in late 2014. PMID:26672762

  10. 77 FR 60022 - Supplemental Identification Information for One (1) Individual Designated Pursuant to Executive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ... 1. JIM'ALE, Ahmed Nur Ali (a.k.a. JIMALE, Ahmad Ali; a.k.a. JIM'ALE, Ahmad Nur Ali; a.k.a. JIMALE, Ahmed Ali; a.k.a. JIMALE, Shaykh Ahmed Nur; a.k.a. JIMALE, Sheikh Ahmed; a.k.a. JUMALE, Ahmed Ali; a.k.a. JUMALE, Ahmed Nur; a.k.a. JUMALI, Ahmed Ali), P.O. Box 3312, Dubai, United Arab Emirates; Mogadishu...

  11. Slip Continuity in Explicit Crystal Plasticity Simulations Using Nonlocal Continuum and Semi-discrete Approaches

    DTIC Science & Technology

    2013-01-01

    Only fragments of this record survive. The work implemented a crystal plasticity model in the large-scale parallel, explicit finite element code ALE3D, evaluating slip continuity across element boundaries in a multi-step constitutive evaluation (Becker, 2011); the results showed the desired effect of smoothing the deformation field. A cited reference: "...Based Micropolar Single Crystal Plasticity: Comparison of Multi- and Single Criterion Theories," J. Mech. Phys. Solids 2011, 59, 398–422.

  12. Performance evaluation of a mobile satellite system modem using an ALE method

    NASA Technical Reports Server (NTRS)

    Ohsawa, Tomoki; Iwasaki, Motoya

    1990-01-01

    Experimental performance of a newly designed demodulation concept is presented. This concept applies an Adaptive Line Enhancer (ALE) to a carrier recovery circuit, which makes the pull-in time significantly shorter under noisy and large-carrier-offset conditions. This new demodulation concept was implemented as an INMARSAT Standard-C modem and evaluated. In the performance evaluation, a pull-in time of 50 symbols was confirmed under a 4 dB Eb/No condition.

  13. Targeted delivery of mesenchymal stem cells to the bone.

    PubMed

    Yao, Wei; Lane, Nancy E

    2015-01-01

    Osteoporosis is a disease of excess skeletal fragility that results from estrogen loss and aging. Age related bone loss has been attributed to both elevated bone resorption and insufficient bone formation. We developed a hybrid compound, LLP2A-Ale in which LLP2A has high affinity for the α4β1 integrin on mesenchymal stem cells (MSCs) and alendronate has high affinity for bone. When LLP2A-Ale was injected into mice, the compound directed MSCs to both trabecular and cortical bone surfaces and increased bone mass and bone strength. Additional studies are underway to further characterize this hybrid compound, LLP2A-Ale, and how it can be utilized for the treatment of bone loss resulting from hormone deficiency, aging, and inflammation and to augment bone fracture healing. This article is part of a Special Issue entitled "Stem Cells and Bone". Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Performance evaluation of heart sound cancellation in FPGA hardware implementation for electronic stethoscope.

    PubMed

    Chao, Chun-Tang; Maneetien, Nopadon; Wang, Chi-Jo; Chiou, Juing-Shian

    2014-01-01

    This paper presents the design and evaluation of the hardware circuit for electronic stethoscopes with heart sound cancellation capabilities using field programmable gate arrays (FPGAs). The adaptive line enhancer (ALE) was adopted as the filtering methodology to reduce heart sound attributes from the breath sounds obtained via the electronic stethoscope pickup. FPGAs were utilized to implement the ALE functions in hardware to achieve near real-time breath sound processing. We believe that such an implementation is unprecedented and crucial toward a truly useful, standalone medical device in outpatient clinic settings. The implementation evaluation with an Altera Cyclone II-EP2C70F89 shows that the proposed ALE used 45% of the chip's resources. Experiments with the proposed prototype were made using the DE2-70 evaluation board with recorded body signals obtained from online medical archives. Clear suppression was observed in our experiments from both frequency-domain and time-domain perspectives.
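
    The core of an adaptive line enhancer is a delayed copy of the input feeding an LMS filter: the filter converges on the predictable, quasi-periodic component (here the heart sound), and the prediction error retains the broadband breath sound. The NumPy sketch below illustrates the principle on synthetic signals; the delay, filter length, step size, and signal models are illustrative choices, not the parameters of the FPGA design.

      import numpy as np

      def ale(x, delay=32, taps=64, mu=0.01):
          """LMS adaptive line enhancer: returns (periodic estimate, residual)."""
          w = np.zeros(taps)
          y = np.zeros_like(x)                           # enhanced periodic component
          e = np.zeros_like(x)                           # residual broadband signal
          for n in range(delay + taps, len(x)):
              ref = x[n - delay - taps:n - delay][::-1]  # delayed tap vector
              y[n] = w @ ref
              e[n] = x[n] - y[n]
              w += 2 * mu * e[n] * ref                   # LMS weight update
          return y, e

      fs = 4000
      t = np.arange(2 * fs) / fs
      rng = np.random.default_rng(1)
      heart = 0.8 * np.sin(2 * np.pi * 60 * t)           # quasi-periodic interference
      breath = 0.3 * rng.standard_normal(t.size)         # broadband signal of interest
      x = heart + breath
      periodic, residual = ale(x)
      print("input var:", round(float(x.var()), 3),
            "residual var:", round(float(residual.var()), 3))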

  15. Simulating Afterburn with LLNL Hydrocodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, L D

    2004-06-11

    Presented here is a working methodology for adapting a Lawrence Livermore National Laboratory (LLNL) developed hydrocode, ALE3D, to simulate weapon damage effects when afterburn is a consideration in the blast propagation. Experiments have shown that afterburn is of great consequence in enclosed environments (e.g., a bomb-in-tunnel scenario, a penetrating conventional munition in a bunker, or a satchel charge placed in a deep underground facility). This empirical energy deposition methodology simulates the anticipated addition of kinetic energy that has been demonstrated by experiment (Kuhl et al., 1998), without explicitly solving the chemistry or resolving the mesh to capture small-scale vorticity. This effort is intended to complement the existing capability of either coupling ALE3D blast simulations with DYNA3D or performing fully coupled ALE3D simulations to predict building or component failure, for applications in National Security offensive strike planning as well as Homeland Defense infrastructure protection.

  16. Cross-calibrating ALES Envisat and CryoSat-2 Delay-Doppler: A coastal altimetry study in the Indonesian Seas

    NASA Astrophysics Data System (ADS)

    Passaro, Marcello; Dinardo, Salvatore; Quartly, Graham D.; Snaith, Helen M.; Benveniste, Jérôme; Cipollini, Paolo; Lucas, Bruno

    2016-08-01

    A regional cross-calibration between the first Delay-Doppler altimetry dataset from CryoSat-2 and a retracked Envisat dataset is presented here, in order to test the benefits of the Delay-Doppler processing and to expand the Envisat time series in the coastal ocean. The Indonesian Seas are chosen for the calibration, since the availability of altimetry data in this region is particularly beneficial due to the lack of in situ measurements and its importance for global ocean circulation. The Envisat data in the region are retracked with the Adaptive Leading Edge Subwaveform (ALES) retracker, which has been previously validated and applied successfully to coastal sea level research. The study demonstrates that CryoSat-2 is able to decrease the 1-Hz noise of sea level estimations by 0.3 cm within 50 km of the coast, when compared to the ALES-reprocessed Envisat dataset. It also shows that Envisat can be confidently used for detailed oceanographic research after the orbit change of October 2010. Cross-calibration at the crossover points indicates that in the region of study a sea state bias correction equal to 5% of the significant wave height is an acceptable approximation for Delay-Doppler altimetry. The analysis of the joint sea level time series reveals the geographic extent of the semiannual signal caused by Kelvin waves during the monsoon transitions, the larger amplitudes of the annual signal due to the Java Coastal Current, and the impact of the strong La Niña event of 2010 on rising sea level trends.

  17. Structuring clinical workflows for diabetes care: an overview of the OntoHealth approach.

    PubMed

    Schweitzer, M; Lasierra, N; Oberbichler, S; Toma, I; Fensel, A; Hoerbst, A

    2014-01-01

    Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, namely actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view.
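
    The building blocks identified in the results can be expressed as simple types. The Python sketch below is a hypothetical rendering (the diabetes-specific step names are invented) of how actions, decisions, and data elements compose into a small clinical workflow.

      from dataclasses import dataclass, field

      @dataclass
      class DataElement:
          name: str
          value: object = None

      @dataclass
      class Action:
          name: str
          reads: list = field(default_factory=list)

      @dataclass
      class Decision:
          name: str
          condition: object          # callable evaluated on a data element
          if_true: Action
          if_false: Action

      hba1c = DataElement("HbA1c", 8.1)
      check = Decision(
          name="adequate glycemic control?",
          condition=lambda d: d.value < 7.0,
          if_true=Action("continue current therapy"),
          if_false=Action("adjust medication", reads=["HbA1c"]),
      )
      chosen = check.if_true if check.condition(hba1c) else check.if_false
      print("next step:", chosen.name)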

  18. Structuring Clinical Workflows for Diabetes Care

    PubMed Central

    Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.

    2014-01-01

    Summary. Background: Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. Objectives: The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods: A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results: This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, namely actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions: The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765

  19. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
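
    The top-down approach reduces to straightforward arithmetic once costs are allocated to a subarea: divide the sum of materials, labor, and overhead by slide volume. The Python sketch below uses invented figures purely to illustrate why labor and overhead can dominate per-slide cost.

      def cost_per_slide(materials, labor_hours, labor_rate, overhead, slides):
          """Top-down unit cost: all subarea costs divided by slide volume."""
          total = materials + labor_hours * labor_rate + overhead
          return total / slides

      # Invented annual figures for two subareas.
      routine = cost_per_slide(materials=12000, labor_hours=900,
                               labor_rate=35.0, overhead=18000, slides=25000)
      ihc = cost_per_slide(materials=30000, labor_hours=400,
                           labor_rate=35.0, overhead=9000, slides=4000)
      print(f"routine: ${routine:.2f}/slide, IHC: ${ihc:.2f}/slide")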

  20. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  1. Application of Six Sigma methodology to a diagnostic imaging process.

    PubMed

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effects analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis. This enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. By eliminating repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
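
    For readers unfamiliar with sigma levels such as the reported improvement from 3.5 to 4.2: one common convention converts a defect rate to a sigma level as the normal quantile of the process yield plus a 1.5-sigma shift. The Python sketch below uses invented defect counts chosen to land near those two levels.

      from statistics import NormalDist

      def sigma_level(defects, opportunities):
          """DPMO-based sigma level with the conventional 1.5-sigma shift."""
          dpmo = defects / opportunities * 1_000_000
          return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

      print(f"before: {sigma_level(defects=968, opportunities=42000):.2f}")  # ~3.5
      print(f"after:  {sigma_level(defects=140, opportunities=42000):.2f}")  # ~4.2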

  2. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in a 22-stage simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
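
    For readers who want the flavor of such a staged submission engine, here is a minimal sketch of the idea in Python (NeuroManager itself is MATLAB-based, and its actual 22-stage API is not reproduced here; the stage names and SimJob type are invented for illustration).

        # Sketch of a staged simulation-submission pipeline with a time-stamped
        # provenance log, loosely mirroring the staged-workflow idea above.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import Callable

        @dataclass
        class SimJob:
            name: str
            params: dict
            log: list = field(default_factory=list)

        STAGES: list[tuple[str, Callable[[SimJob], None]]] = [
            ("stage_input",   lambda job: job.params.setdefault("dt", 0.025)),
            ("upload_model",  lambda job: None),  # e.g. copy model files to the resource
            ("run_simulator", lambda job: None),  # e.g. submit to a cluster scheduler
            ("fetch_results", lambda job: None),
        ]

        def submit(job: SimJob) -> None:
            for stage_name, action in STAGES:
                action(job)
                job.log.append((stage_name, datetime.now().isoformat()))  # provenance

        job = SimJob("purkinje_test", {"n_cells": 10})
        submit(job)
        print(job.log)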

  3. Financial and workflow analysis of radiology reporting processes in the planning phase of implementation of a speech recognition system

    NASA Astrophysics Data System (ADS)

    Whang, Tom; Ratib, Osman M.; Umamoto, Kathleen; Grant, Edward G.; McCoy, Michael J.

    2002-05-01

    The goal of this study is to determine the financial value and workflow improvements achievable by replacing traditional transcription services with a speech recognition system in a large, university hospital setting. Workflow metrics were measured at two hospitals, one of which exclusively uses a transcription service (UCLA Medical Center), and the other which exclusively uses speech recognition (West Los Angeles VA Hospital). Workflow metrics include time spent per report (the sum of time spent interpreting, dictating, reviewing, and editing), transcription turnaround, and total report turnaround. Compared to traditional transcription, speech recognition resulted in radiologists spending 13-32% more time per report, but it also resulted in reduction of report turnaround time by 22-62% and reduction of marginal cost per report by 94%. The model developed here helps justify the introduction of a speech recognition system by showing that the benefits of reduced operating costs and decreased turnaround time outweigh the cost of increased time spent per report. Whether the ultimate goal is to achieve a financial objective or to improve operational efficiency, it is important to conduct a thorough analysis of workflow before implementation.
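
    The trade-off the study quantifies can be mocked up in a few lines. Only the percentage changes below come from the abstract; the baseline figures are hypothetical placeholders.

        # Worst-case time increase vs. best-case turnaround/cost savings from the abstract.
        baseline_minutes_per_report = 10.0   # assumed transcription-based workflow
        baseline_cost_per_report = 5.00      # assumed marginal transcription cost, USD
        baseline_turnaround_hours = 24.0     # assumed total report turnaround

        sr_minutes = baseline_minutes_per_report * 1.32         # up to 32% more time
        sr_cost = baseline_cost_per_report * (1 - 0.94)         # 94% lower marginal cost
        sr_turnaround = baseline_turnaround_hours * (1 - 0.62)  # up to 62% faster

        print(f"time/report: {baseline_minutes_per_report:.1f} -> {sr_minutes:.1f} min")
        print(f"marginal cost: ${baseline_cost_per_report:.2f} -> ${sr_cost:.2f}")
        print(f"turnaround: {baseline_turnaround_hours:.1f} -> {sr_turnaround:.1f} h")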

  4. Processing, Cataloguing and Distribution of UAS Images in Near Real Time

    NASA Astrophysics Data System (ADS)

    Runkel, I.

    2013-08-01

    Why are UAS generating such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution. This is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata are converted into an ISO-conformant format and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, i.e. the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications - wherever you want. The whole processing chain is built in a generic manner. It can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single ortho images or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (Web Processing Services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server; the image analyst needs no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. It is designed to support time-critical applications for security demands - the images can be checked and interpreted in near real time. For sensitive areas it offers the possibility to inform remote decision makers or interpretation experts in order to provide them with situation awareness, wherever they are. For monitoring and inspection tasks it speeds up the process of data capture and data interpretation. The fully automated workflow of data pre-processing, georeferencing, cataloguing and dissemination in near real time was developed based on the Intergraph products ERDAS IMAGINE, ERDAS APOLLO and GEOSYSTEMS METAmorph!IT. It is offered as an adaptable solution by GEOSYSTEMS GmbH.

  5. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, as it can support the modelling, construction and implementation of large-scale, complicated applications of remote sensing science. Validation of workflows is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize the potential waste of time and resources. To examine the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.
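
    As a toy illustration of the "data source consistency" class of errors the authors validate against, consider a plain-dictionary sketch; the node schema below is invented here, whereas the paper encodes such constraints in an OWL ontology built with Protégé.

        # Flag workflow edges whose produced/expected product types disagree.
        nodes = {
            "modis_reader":  {"emits": "radiance"},
            "aod_retrieval": {"expects": "radiance",    "emits": "aerosol_od"},
            "ndvi_compute":  {"expects": "reflectance", "emits": "ndvi"},
        }
        edges = [("modis_reader", "aod_retrieval"), ("modis_reader", "ndvi_compute")]

        for src, dst in edges:
            if nodes[src]["emits"] != nodes[dst]["expects"]:
                print(f"inconsistent edge {src} -> {dst}: "
                      f"{nodes[src]['emits']} != {nodes[dst]['expects']}")
        # -> flags modis_reader -> ndvi_compute (radiance vs. reflectance)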

  6. The equivalency between logic Petri workflow nets and workflow nets.

    PubMed

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.

  7. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    PubMed Central

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845
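
    The WF-net firing semantics underlying both versions of this paper can be sketched compactly: a transition is enabled when every input place holds enough tokens, and firing moves tokens forward. A minimal Python illustration of an online-shop fragment follows (the logical input/output expressions that distinguish LPWNs are omitted).

        # Minimal Petri-net firing rule: consume from input places, produce to outputs.
        marking = {"start": 1, "paid": 0, "shipped": 0}

        transitions = {
            "pay":  ({"start": 1}, {"paid": 1}),   # (consumes, produces)
            "ship": ({"paid": 1},  {"shipped": 1}),
        }

        def enabled(name: str) -> bool:
            consumes, _ = transitions[name]
            return all(marking[p] >= n for p, n in consumes.items())

        def fire(name: str) -> None:
            consumes, produces = transitions[name]
            assert enabled(name), f"{name} is not enabled"
            for p, n in consumes.items():
                marking[p] -= n
            for p, n in produces.items():
                marking[p] += n

        fire("pay"); fire("ship")
        print(marking)  # {'start': 0, 'paid': 0, 'shipped': 1}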

  8. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Schöbi, Roland; Sudret, Bruno

    2017-06-01

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
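
    In standard PCE notation (our summary, not the authors' exact formulation), the surrogate replaces the response Y = M(X) of the expensive model by a truncated expansion onto polynomials that are orthonormal with respect to the input distribution:

        % Sparse polynomial chaos surrogate of the model response
        Y \;\approx\; \sum_{\alpha \in \mathcal{A}} y_{\alpha}\, \Psi_{\alpha}(X),
        \qquad
        \mathbb{E}\!\left[\Psi_{\alpha}(X)\,\Psi_{\beta}(X)\right] = \delta_{\alpha\beta},

        % so that the p-box of the output is bracketed by bounding CDFs:
        \underline{F}_Y(y) \;\le\; F_Y(y) \;\le\; \overline{F}_Y(y).

    The two-level scheme then propagates the epistemic (p-box) layer through this inexpensive surrogate rather than through the finite element model itself.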

  9. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch

    2017-06-15

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  10. Revegetation Plan for Areas of the Fitzner-Eberhardt Arid Lands Ecology Reserve Affected by Decommissioning of Buildings and Infrastructure and Debris Clean-up Actions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downs, Janelle L.; Durham, Robin E.; Larson, Kyle B.

    The U.S. Department of Energy (DOE), Richland Operations Office is working to remove a number of facilities on the Fitzner Eberhardt Arid Lands Ecology Reserve (ALE), which is part of the Hanford Reach National Monument. Decommissioning and removal of buildings and debris on ALE will leave bare soils and excavated areas that need to be revegetated to prevent erosion and weed invasion. Four main areas within ALE are affected by these activities (DOE 2009; DOE/EA-1660F): 1) facilities along the ridgeline of Rattlesnake Mountain, 2) the former Nike missile base and ALE HQ laboratory buildings, 3) the aquatic research laboratory at the Rattlesnake Springs area, and 4) a number of small sites across ALE where various types of debris remain from previous uses. This revegetation plan addresses the revegetation and restoration of those land areas disturbed by decommissioning and removal of buildings, facilities and associated infrastructure or debris removal. The primary objective of the revegetation efforts on ALE is to establish native vegetation at each of the sites that will enhance and accelerate the recovery of the native plant community that naturally persists at that location. Revegetation is intended to meet the direction specified by the Environmental Assessment (DOE 2009; DOE/EA-1660F) and by Stipulation C.7 of the Memorandum of Agreement (MOA) for the Rattlesnake Mountain Combined Community Communication Facility and Infrastructure Cleanup on the Fitzner/Eberhardt Arid Lands Ecology Reserve, Hanford Site, Richland, Washington (DOE 2009; Appendix B). Pacific Northwest National Laboratory (PNNL), under contract with CH2M Hill Plateau Remediation Company (CPRC) and in consultation with the tribes and DOE-RL, developed a site-specific strategy for each of the revegetation units identified within this document. The strategy and implementation approach for each revegetation unit identifies an appropriate native species mix and outlines the necessary site preparation activities and specific methods for seeding and planting at each area. Revegetation work is scheduled to commence during the first quarter of FY 2011 to minimize the amount of time that sites are unvegetated and more susceptible to invasion by non-native weedy annual species.

  11. Teaching meta-analysis using MetaLight.

    PubMed

    Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark

    2012-10-18

    Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
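
    The core computation such a teaching tool exposes is short enough to show inline: an inverse-variance, fixed-effect pooled estimate. The study data below are made up for illustration.

        # Fixed-effect meta-analysis: weight each study by the inverse of its variance.
        import math

        effects   = [0.30, 0.10, 0.45, 0.25]   # per-study effect sizes
        variances = [0.02, 0.05, 0.04, 0.01]

        weights = [1 / v for v in variances]
        pooled  = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se      = math.sqrt(1 / sum(weights))

        print(f"pooled effect = {pooled:.3f}, 95% CI = "
              f"({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")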

  12. [Autism spectrum disorder and evaluation of perceived stress in parents and professionals: Study of the psychometric properties of a French adaptation of the Appraisal of Life Event Scale (ALES-vf)].

    PubMed

    Cappe, É; Poirier, N; Boujut, É; Nader-Grosbois, N; Dionne, C; Boulard, A

    2017-08-01

    Autism and related disorders are grouped into the category of « Autism Spectrum Disorder » (ASD) in the DSM-5. This appellation reflects the idea of a dimensional representation of autism that combines symptoms and characteristics varying in severity and intensity. Despite common characteristics, there are varying degrees of intensity and onset of symptoms, ranging from a very severe disability, with a total lack of communication and major associated disorders, to relative autonomy, sometimes associated with extraordinary intellectual abilities. Parents are faced with several difficult situations, such as sleep disturbances, agitation, shouting, aggression towards others, self-harm, learning difficulties, stereotypies, lack of social and emotional reciprocity, inappropriate behavior, etc. They can feel helpless and may experience stress related to these developmental and behavioral difficulties. The heterogeneity of symptoms, the presence of behavioral problems, and the lack of reciprocity and autonomy also represent a challenge for practitioners in institutions and for teachers at school. The objective of this research is to present the validation of a French translation of the Appraisal of Life Events Scale (ALES-vf) of Ferguson, Matthews and Cox, specifically adapted to the context of ASD. The ALES was originally developed to operationalize the three dimensions of perceived stress (threat, loss and challenge) described by Lazarus and Folkman. The ALES-vf was first translated into French and adapted to the situation of parents of children with ASD. It was subsequently administered to 343 parents, 150 paramedical professionals working with people with ASD, and 155 teachers from both mainstream and specialized schools who welcomed at least one child with ASD in their classroom. An exploratory factor analysis performed on data from 170 parents highlighted two exploratory models with four and three factors, slightly different from the original three-factor model of Ferguson and his collaborators. Confirmatory analyses were conducted on data from 173 other parents to test the two exploratory models and the original Ferguson model. The models were also tested on data from 305 professionals (paramedical professionals and teachers) and on the whole sample (parents and professionals). The results suggest a better fit of the original three-factor model. In addition, Cronbach's alpha coefficients and inter-item correlations showed good internal consistency for these three factors. Finally, analyses of variance and regressions were performed to test the effects of parents' nationality, the child's level of autonomy, and the child's level of communication on perceived stress, as well as the effect of professional experience. After our adaptation, the ALES-vf has good psychometric properties for use not only with parents but also with professionals (teachers, educators, psychologists) working with children with ASD. Our analyses showed that the nationality of the parents does not significantly influence the « threat » and « challenge » subscales of the ALES-vf, which makes it usable in other Francophone countries. Subscale specificities were identified based on group membership: for example, parents obtain higher average scores on the « loss » and « threat » subscales and a lower average score on the « challenge » subscale, compared to professionals. Finally, regarding the specifics found among professionals, the results show that years of experience have an effect on perceived stress. Specifically, teachers and educators with more experience perceive their work with children with ASD as a challenge. This is consistent with studies showing that teachers who have had experience with children with ASD had less difficulty in their interventions. Copyright © 2016 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
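
    The internal-consistency statistic reported for the three subscales is Cronbach's alpha; a compact sketch with an invented response matrix:

        # Cronbach's alpha for one subscale (rows = respondents, columns = items).
        import numpy as np

        scores = np.array([
            [3, 4, 3, 5],
            [2, 2, 3, 3],
            [4, 5, 4, 4],
            [1, 2, 2, 1],
            [3, 3, 4, 4],
        ])

        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)          # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)      # variance of the sum score
        alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
        print(f"Cronbach's alpha = {alpha:.2f}")        # -> 0.92 for this toy matrix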

  13. Overcoming Barriers to Technology Adoption in Small Manufacturing Enterprises (SMEs)

    DTIC Science & Technology

    2003-06-01

    automates quote-generation, order-processing workflow management, performance analysis, and accounting functions. Ultimately, it will enable Magdic...that Magdic implement an MES instead. The MES, in addition to solving the problem of document management, would automate quote-generation, order processing, workflow management, performance analysis, and accounting functions. To help Magdic personnel learn about the MES, TIDE personnel provided

  14. Scientific Workflow Management in Proteomics

    PubMed Central

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  15. Solutions for Mining Distributed Scientific Data

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Pham, L.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.

    2007-12-01

    Researchers at the University of Alabama in Huntsville (UAH) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) are working on approaches and methodologies facilitating the analysis of large amounts of distributed scientific data. Despite the existence of full-featured analysis tools, such as the Algorithm Development and Mining (ADaM) toolkit from UAH, and data repositories, such as the GES DISC, that provide online access to large amounts of data, there remain obstacles to getting the analysis tools and the data together in a workable environment. Does one bring the data to the tools or deploy the tools close to the data? The large size of many current Earth science datasets incurs significant overhead in network transfer for analysis workflows, even with the advanced networking capabilities that are available between many educational and government facilities. The UAH and GES DISC team are developing a capability to define analysis workflows using distributed services and online data resources. We are developing two solutions for this problem that address different analysis scenarios. The first is a Data Center Deployment of the analysis services for large data selections, orchestrated by a remotely defined analysis workflow. The second is a Data Mining Center approach of providing a cohesive analysis solution for smaller subsets of data. The two approaches can be complementary and thus provide flexibility for researchers to exploit the best solution for their data requirements. The Data Center Deployment of the analysis services has been implemented by deploying ADaM web services at the GES DISC so they can access the data directly, without the need of network transfers. Using the Mining Workflow Composer, a user can define an analysis workflow that is then submitted through a Web Services interface to the GES DISC for execution by a processing engine. The workflow definition is composed, maintained and executed at a distributed location, but most of the actual services comprising the workflow are available local to the GES DISC data repository. Additional refinements will ultimately provide a package that is easily implemented and configured at additional data centers for analysis of additional science data sets. Enhancements to the ADaM toolkit allow the staging of distributed data wherever the services are deployed, to support a Data Mining Center that can provide additional computational resources, large storage of output, easier addition and updates to available services, and access to data from multiple repositories. The Data Mining Center case provides researchers more flexibility to quickly try different workflow configurations and refine the process, using smaller amounts of data that may likely be transferred from distributed online repositories. This environment is sufficient for some analyses, but can also be used as an initial sandbox to test and refine a solution before staging the execution at a Data Center Deployment. Detection of airborne dust both over water and land in MODIS imagery using mining services for both solutions will be presented. The dust detection is just one possible example of the mining and analysis capabilities the proposed mining services solutions will provide to the science community. More information about the available services and the current status of this project is available at http://www.itsc.uah.edu/mws/

  16. Comparison of peak-picking workflows for untargeted liquid chromatography/high-resolution mass spectrometry metabolomics data analysis.

    PubMed

    Rafiei, Atefeh; Sleno, Lekha

    2015-01-15

    Data analysis is a key step in mass spectrometry-based untargeted metabolomics, starting with the generation of generic peak lists from raw liquid chromatography/mass spectrometry (LC/MS) data. Due to the use of various algorithms by different workflows, the results of different peak-picking strategies often differ widely. Raw LC/HRMS data from two types of biological samples (bile and urine), as well as a standard mixture of 84 metabolites, were processed with four peak-picking software packages: Peakview®, Markerview™, MetabolitePilot™ and XCMS Online. The overlaps between the results of each peak-generating method were then investigated. To gauge the relevance of peak lists, a database search using the METLIN online database was performed to determine which features had accurate masses matching known metabolites, followed by a secondary filtering based on MS/MS spectral matching. In this study, only a small proportion of all peaks (less than 10%) were common to all four software programs. Comparison of database searching results showed that peaks found uniquely by one workflow have less chance of being found in the METLIN metabolomics database and are even less likely to be confirmed by MS/MS. It was shown that the performance of peak-generating workflows has a direct impact on untargeted metabolomics results. As peaks found by more than one peak detection workflow have a higher potential to be identified by accurate mass as well as MS/MS spectrum matching, it is suggested to use the overlap of different peak-picking workflows as preliminary peak lists for more rigorous statistical analysis in global metabolomics investigations. Copyright © 2014 John Wiley & Sons, Ltd.
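
    The overlap analysis reduces to tolerance-based matching of (m/z, retention time) features across peak lists; a sketch with illustrative tolerances and peaks (not the paper's settings):

        # Keep only features from one peak list that have a counterpart in another.
        def match(peaks_a, peaks_b, mz_tol=0.01, rt_tol=0.2):
            return [
                (mz, rt) for mz, rt in peaks_a
                if any(abs(mz - mz2) <= mz_tol and abs(rt - rt2) <= rt_tol
                       for mz2, rt2 in peaks_b)
            ]

        xcms_peaks       = [(180.0634, 5.31), (255.2329, 12.80), (301.1410, 7.02)]
        markerview_peaks = [(180.0630, 5.35), (301.1405, 7.00), (449.1078, 9.40)]

        print(match(xcms_peaks, markerview_peaks))
        # -> the two consensus features, the suggested starting point for statistics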

  17. Modeling of Complex Coupled Fluid-Structure Interaction Systems in Arbitrary Water Depth

    DTIC Science & Technology

    2009-01-01

    basin. For the particle finite-element method (PFEM) near-field fluid model we completed: (4) the development of a fully-coupled fluid/flexible...method (PFEM) based framework for the ALE-RANS solver [1]. We presented the theory of ALE-RANS with a k- turbulence closure model and several numerical...implemented by PFEM (Task (4)). In this work a universal wall function (UWF) is introduced and implemented to more accurately predict the boundary

  18. Sensitivity of Particle Size in Discrete Element Method to Particle Gas Method (DEM_PGM) Coupling in Underbody Blast Simulations

    DTIC Science & Technology

    2016-06-12

    Particle Size in Discrete Element Method to Particle Gas Method (DEM_PGM) Coupling in Underbody Blast Simulations Venkatesh Babu, Kumar Kulkarni, Sanjay...buried in soil viz., (1) coupled discrete element & particle gas methods (DEM-PGM) and (2) Arbitrary Lagrangian-Eulerian (ALE), are investigated. The...DEM_PGM and identify the limitations/strengths compared to the ALE method. Discrete Element Method (DEM) can model individual particles directly, and

  19. Comparison of the LLNL ALE3D and AKTS Thermal Safety Computer Codes for Calculating Times to Explosion in ODTX and STEX Thermal Cookoff Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K

    2006-04-05

    Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins code were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results as the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 °C/hr, and further exploration of that effect is warranted.

  20. Nootropic activity of tuber extract of Pueraria tuberosa (Roxb).

    PubMed

    Rao, N Venkata; Pujar, Basavaraj; Nimbal, S K; Shantakumar, S M; Satyanarayana, S

    2008-08-01

    The nootropic effect of alcoholic (ALE; 50, 75, 100 mg/kg) and aqueous (AQE; 100, 200, 400 mg/kg) extracts of P. tuberosa was evaluated using the Elevated Plus Maze (EPM), scopolamine-induced amnesia (SIA), diazepam-induced amnesia (DIA), clonidine-induced (NA-mediated) hypothermia (CIH), lithium-induced (5-HT-mediated) head twitches (LIH) and haloperidol-induced (DA-mediated) catalepsy (HIC) models. Piracetam was used as the standard drug. A significant increase in inflexion ratio (IR) was recorded in the EPM, SIA and DIA models. A significant reversal effect on rectal temperature was observed in the CIH model, and a reduction of head twitches in the LIH model. However, no significant reduction in catalepsy scores was observed in the HIC model with either the test extracts or standard piracetam. The results indicate that the nootropic activity observed with ALE and AQE of P. tuberosa tuber extracts could act through improved learning and memory, either by augmenting noradrenaline (NA) transmission or by interfering with 5-hydroxytryptamine (5-HT) release. Further, the extracts neither facilitated nor blocked release of dopamine (DA). Thus, ALE and AQE elicited a significant nootropic effect in mice and rats by interacting with the cholinergic, GABAergic, adrenergic and serotonergic systems. Phytoconstituents such as flavonoids have been reported for their nootropic effect; they are present in both ALE and AQE of P. tuberosa (Roxb) tubers, and these active principles may be responsible for the nootropic activity.

  1. A post-Amadori inhibitor pyridoxamine also inhibits chemical modification of proteins by scavenging carbonyl intermediates of carbohydrate and lipid degradation.

    PubMed

    Voziyan, Paul A; Metz, Thomas O; Baynes, John W; Hudson, Billy G

    2002-02-01

    Reactive carbonyl compounds are formed during autoxidation of carbohydrates and peroxidation of lipids. These compounds are intermediates in the formation of advanced glycation end products (AGE) and advanced lipoxidation end products (ALE) in tissue proteins during aging and in chronic disease. We studied the reaction of carbonyl compounds glyoxal (GO) and glycolaldehyde (GLA) with pyridoxamine (PM), a potent post-Amadori inhibitor of AGE formation in vitro and of development of renal and retinal pathology in diabetic animals. PM reacted rapidly with GO and GLA in neutral, aqueous buffer, forming a Schiff base intermediate that cyclized to a hemiaminal adduct by intramolecular reaction with the phenolic hydroxyl group of PM. This bicyclic intermediate dimerized to form a five-ring compound with a central piperazine ring, which was characterized by electrospray ionization-liquid chromatography/mass spectrometry, NMR, and x-ray crystallography. PM also inhibited the modification of lysine residues and loss of enzymatic activity of RNase in the presence of GO and GLA and inhibited formation of the AGE/ALE N(epsilon)-(carboxymethyl)lysine during reaction of GO and GLA with bovine serum albumin. Our data suggest that the AGE/ALE inhibitory activity and the therapeutic effects of PM observed in diabetic animal models depend, at least in part, on its ability to trap reactive carbonyl intermediates in AGE/ALE formation, thereby inhibiting the chemical modification of tissue proteins.

  2. Shrub-Steppe Seasons: A Natural History of the Mid-Columbia Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LE Rogers

    1995-08-01

    This book collects and updates a series of articles about the natural history of the Mid-Columbia region. The articles first appeared as a monthly column titled "Natural History" in the Tri-City Herald, beginning in May 1991. My approach has been to condense the best of what is known about the ecology of the region to a manageable length with little in the way of technical language and terms. Admittedly, there is a bias toward those topics and species on which I have either been personally involved or observed as part of the ecology research programs conducted on the Fitzner/Eberhardt Arid Lands Ecology (ALE) Reserve. The ALE Reserve is situated on the northeast-facing flank of the Rattlesnake Hills. Rattlesnake Mountain, with a crest of over 3,600 feet, is visible throughout much of the Mid-Columbia. Shrub-steppe grasslands once covered a large part of the western United States but most have been converted to other uses. The ALE site is the only remaining sizeable acreage (120 square miles) that is in near-pristine condition and provides the only clear indication as to what the early trappers, traders, pioneers, and tribal members may have encountered in their day-to-day activities. In this respect, ALE provides a visible touchstone linking the past with the present for all of us.

  3. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service business is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
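
    A minimal sketch of rationality-weighted task allocation in the spirit of this framework; the agents and scoring rule below are invented for illustration and stand in for the paper's decision-making strategies.

        # Assign each workflow activity to the agent with the best rationality/load score.
        agents = [
            {"name": "clerk_a", "busy": 0, "rationality": 0.9},
            {"name": "clerk_b", "busy": 0, "rationality": 0.8},
        ]

        def choose_agent() -> dict:
            # Higher rationality and lower current load make an agent preferable.
            return max(agents, key=lambda a: a["rationality"] / (1 + a["busy"]))

        for task in ["approve_order", "issue_invoice", "ship_goods"]:
            agent = choose_agent()
            agent["busy"] += 1
            print(f"{task} -> {agent['name']}")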

  4. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
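
    The default estimator mentioned above combines DerSimonian-Laird heterogeneity estimation with the Knapp-Hartung adjustment; a sketch of that pipeline with invented inputs:

        # Random-effects pooling: DerSimonian-Laird tau^2, then a Knapp-Hartung t-interval.
        from scipy.stats import t

        effects   = [0.41, 0.15, 0.62, 0.05, 0.33]
        variances = [0.04, 0.02, 0.06, 0.03, 0.05]
        k = len(effects)

        w  = [1 / v for v in variances]                      # fixed-effect weights
        fe = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
        Q  = sum(wi * (ei - fe) ** 2 for wi, ei in zip(w, effects))
        tau2 = max(0.0, (Q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))

        w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
        mu   = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)

        # Knapp-Hartung: weighted residual variance with a t(k-1) reference.
        q_kh = sum(wi * (ei - mu) ** 2 for wi, ei in zip(w_re, effects)) / (k - 1)
        half = t.ppf(0.975, k - 1) * (q_kh / sum(w_re)) ** 0.5
        print(f"mu = {mu:.3f}, 95% CI = ({mu - half:.3f}, {mu + half:.3f})")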

  5. An integrated strategy to improve data acquisition and metabolite identification by time-staggered ion lists in UHPLC/Q-TOF MS-based metabolomics.

    PubMed

    Wang, Yang; Feng, Ruibing; He, Chengwei; Su, Huanxing; Ma, Huan; Wan, Jian-Bo

    2018-08-05

    The narrow linear range and the limited scan time of a given ion make quantification of features challenging in liquid chromatography-mass spectrometry (LC-MS)-based untargeted metabolomics with the full-scan mode. Metabolite identification is another bottleneck of untargeted analysis, owing to the difficulty of acquiring MS/MS information for most detected metabolites. In this study, an integrated workflow was proposed using the newly established multiple ion monitoring mode with time-staggered ion lists (tsMIM) and target-directed data-dependent acquisition with time-staggered ion lists (tsDDA) to improve data acquisition and metabolite identification in UHPLC/Q-TOF MS-based untargeted metabolomics. Compared to conventional untargeted metabolomics, the proposed workflow exhibited better repeatability before and after data normalization. After selecting features with significant changes by statistical analysis, MS/MS information for all these features can be obtained by tsDDA analysis to facilitate metabolite identification. Using time-staggered ion lists, the workflow is more sensitive in data acquisition, especially for low-abundance features. Moreover, metabolites with low abundance tend to be wrongly integrated and triggered by full scan-based untargeted analysis with the MSE acquisition mode, which can be greatly improved by the proposed workflow. The integrated workflow was also successfully applied to discover serum biosignatures for the genetic modification of fat-1 in mice, which indicates its practicability and great potential in future metabolomics studies. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Modeling workflow to design machine translation applications for public health practice

    PubMed Central

    Turner, Anne M.; Brownstein, Megumu K.; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2014-01-01

    Objective Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). Materials and Methods We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. Results The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. Discussion This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. Conclusion The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. PMID:25445922

  7. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.

  8. Multidirectional testing of one- and two-level ProDisc-L versus simulated fusions.

    PubMed

    Panjabi, Manohar; Henderson, Gweneth; Abjornson, Celeste; Yue, James

    2007-05-20

    An in vitro human cadaveric biomechanical study. To evaluate intervertebral rotation changes due to lumbar ProDisc-L compared with simulated fusion, using follower load and multidirectional testing. Artificial discs, as opposed to the fusions, are thought to decrease the long-term accelerated degeneration at adjacent levels. A biomechanical assessment can be helpful, as the long-term clinical evaluation is impractical. Six fresh human cadaveric lumbar specimens (T12-S1) underwent multidirectional testing in flexion-extension, bilateral lateral bending, and bilateral torsion using the Hybrid test method. First, intact specimen total range of rotation (T12-S1) was determined. Second, using pure moments again, this range of rotation was achieved in each of the 5 constructs: A) ProDisc-L at L5-S1; B) fusion at L5-S1; C) ProDisc-L at L4-L5 and fusion at L5-S1; D) ProDisc-L at L4-L5 and L5-S1; and E) 2-level fusion at L4-L5 to L5-S1. Significant changes in the intervertebral rotations due to each construct were determined at the operated and nonoperated levels using repeated measures single factor ANOVA and Bonferroni statistical tests (P < 0.05). Adjacent-level effects (ALEs) were defined as the percentage changes in intervertebral rotations at the nonoperated levels due to the constructs. One- and 2-level ProDisc-L constructs showed only small ALE in any of the 3 rotations. In contrast, 1- and 2-level fusions showed increased ALE in all 3 directions (average, 7.8% and 35.3%, respectively, for 1 and 2 levels). In the disc plus fusion combination (construct C), the ALEs were similar to the 1-level fusion alone. In general, ProDisc-L preserved physiologic motions at all spinal levels, while the fusion simulations resulted in significant ALE.

  9. Describing and Modeling Workflow and Information Flow in Chronic Disease Care

    PubMed Central

    Unertl, Kim M.; Weinger, Matthew B.; Johnson, Kevin B.; Lorenzi, Nancy M.

    2009-01-01

    Objectives The goal of the study was to develop an in-depth understanding of work practices, workflow, and information flow in chronic disease care, to facilitate development of context-appropriate informatics tools. Design The study was conducted over a 10-month period in three ambulatory clinics providing chronic disease care. The authors iteratively collected data using direct observation and semi-structured interviews. Measurements The authors observed all aspects of care in three different chronic disease clinics for over 150 hours, including 157 patient-provider interactions. Observation focused on interactions among people, processes, and technology. Observation data were analyzed through an open coding approach. The authors then developed models of workflow and information flow using Hierarchical Task Analysis and Soft Systems Methodology. The authors also conducted nine semi-structured interviews to confirm and refine the models. Results The study had three primary outcomes: models of workflow for each clinic, models of information flow for each clinic, and an in-depth description of work practices and the role of health information technology (HIT) in the clinics. The authors identified gaps between the existing HIT functionality and the needs of chronic disease providers. Conclusions In response to the analysis of workflow and information flow, the authors developed ten guidelines for design of HIT to support chronic disease care, including recommendations to pursue modular approaches to design that would support disease-specific needs. The study demonstrates the importance of evaluating workflow and information flow in HIT design and implementation. PMID:19717802

  10. Task–Technology Fit of Video Telehealth for Nurses in an Outpatient Clinic Setting

    PubMed Central

    Finkelstein, Stanley M.

    2014-01-01

    Abstract Background: Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task–technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task–technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. Materials and Methods: The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time–motion study. Qualitative and quantitative results were merged and analyzed within the task–technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Results: Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task–technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Conclusions: Telehealth must provide the right information to the right clinician at the right time. Evaluating task–technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology. PMID:24841219

  11. Subliminal versus supraliminal stimuli activate neural responses in anterior cingulate cortex, fusiform gyrus and insula: a meta-analysis of fMRI studies.

    PubMed

    Meneguzzo, Paolo; Tsakiris, Manos; Schioth, Helgi B; Stein, Dan J; Brooks, Samantha J

    2014-01-01

    Non-conscious neural activation may underlie various psychological functions in health and disorder. However, the neural substrates of non-conscious processing have not been entirely elucidated. Examining the differential effects of arousing stimuli that are consciously versus unconsciously perceived will improve our knowledge of the neural circuitry involved in non-conscious perception. Here we conduct preliminary analyses of neural activation in studies that have used both subliminal and supraliminal presentation of the same stimulus. We use Activation Likelihood Estimation (ALE) to examine functional Magnetic Resonance Imaging (fMRI) studies that presented the same stimuli both subliminally and supraliminally to healthy participants. We included a total of 193 foci from 9 studies representing subliminal stimulation and 315 foci from 10 studies representing supraliminal stimulation. The anterior cingulate cortex is significantly activated during both subliminal and supraliminal stimulus presentation. Subliminal stimuli are linked to significantly increased activation in the right fusiform gyrus and right insula. Supraliminal stimuli show significantly increased activation in the left rostral anterior cingulate. Non-conscious processing of arousing stimuli may involve primary visual areas and may also recruit the insula, a brain area involved in eventual interoceptive awareness. The anterior cingulate is perhaps a key brain region for the integration of conscious and non-conscious processing. These preliminary data provide candidate brain regions for further study into the neural correlates of conscious experience.
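
    At a single voxel, ALE combines the per-experiment modelled activation (MA) probabilities by probabilistic union; a one-line illustration with invented MA values:

        # ALE score at one voxel: union of independent activation probabilities.
        import numpy as np

        # MA value at this voxel for each of 9 experiments (e.g. the subliminal set)
        ma = np.array([0.02, 0.00, 0.11, 0.05, 0.00, 0.07, 0.01, 0.00, 0.03])

        ale = 1.0 - np.prod(1.0 - ma)
        print(f"ALE = {ale:.4f}")   # compared against a null distribution for inference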

  12. An ALE meta-analysis on the audiovisual integration of speech signals.

    PubMed

    Erickson, Laura C; Heeg, Elizabeth; Rauschecker, Josef P; Turkeltaub, Peter E

    2014-11-01

    The brain improves speech processing through the integration of audiovisual (AV) signals. Situations involving AV speech integration may be crudely dichotomized into those where auditory and visual inputs contain (1) equivalent, complementary signals (validating AV speech) or (2) inconsistent, different signals (conflicting AV speech). This simple framework may allow the systematic examination of broad commonalities and differences between AV neural processes engaged by various experimental paradigms frequently used to study AV speech integration. We conducted an activation likelihood estimation meta-analysis of 22 functional imaging studies comprising 33 experiments, 311 subjects, and 347 foci examining "conflicting" versus "validating" AV speech. Experimental paradigms included content congruency, timing synchrony, and perceptual measures, such as the McGurk effect or synchrony judgments, across AV speech stimulus types (sublexical to sentence). Colocalization of conflicting AV speech experiments revealed consistency across at least two contrast types (e.g., synchrony and congruency) in a network of dorsal stream regions in the frontal, parietal, and temporal lobes. There was consistency across all contrast types (synchrony, congruency, and percept) in the bilateral posterior superior/middle temporal cortex. Although fewer studies were available, validating AV speech experiments were localized to other regions, such as ventral stream visual areas in the occipital and inferior temporal cortex. These results suggest that while equivalent, complementary AV speech signals may evoke activity in regions related to the corroboration of sensory input, conflicting AV speech signals recruit widespread dorsal stream areas likely involved in the resolution of conflicting sensory signals. Copyright © 2014 Wiley Periodicals, Inc.

  13. Closha: bioinformatics workflow system for the analysis of massive sequencing data.

    PubMed

    Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook

    2018-02-19

    While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in biomedical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag-and-drop functionality and to modify the parameters of pipeline tools. Users can also import Galaxy pipelines into Closha. Closha is a hybrid system that enables users to run both traditional analysis tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit large amounts of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha is 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner, and provides a user-friendly interface that helps genomic scientists derive accurate results from NGS platform data. The Closha cloud server is freely available at http://closha.kobic.re.kr/.

  14. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    PubMed Central

    Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared: a two-stage meta-analysis pooling results and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50, and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis, all methods performed equally well and better than a two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
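
    As a worked example of the two-stage recipe the abstract compares, the sketch below forms per-study Wald-ratio IV estimates, attaches first-order delta-method variances before pooling, and combines them by inverse-variance weighting. All numbers are synthetic, and the zero covariance between the two associations is a simplifying assumption.

    ```python
    import numpy as np

    # Synthetic per-study summary statistics (illustrative values):
    # gene-exposure (bx) and gene-outcome (by) associations with SEs.
    bx = np.array([0.30, 0.25, 0.35])
    sx = np.array([0.05, 0.06, 0.05])
    by = np.array([0.12, 0.08, 0.15])
    sy = np.array([0.04, 0.05, 0.04])

    # Stage 1: Wald-ratio IV estimate per study.
    iv = by / bx

    # First-order delta-method variance of the ratio, applied *before*
    # pooling (covariance between bx and by assumed zero here).
    var_iv = sy**2 / bx**2 + (by**2 * sx**2) / bx**4

    # Stage 2: fixed-effect inverse-variance meta-analysis.
    w = 1.0 / var_iv
    pooled = np.sum(w * iv) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    print(f"pooled IV estimate: {pooled:.3f} (SE {pooled_se:.3f})")
    ```

    A random-effects variant would inflate the per-study variances with a between-study component before weighting.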

  15. Performing statistical analyses on quantitative data in Taverna workflows: an example using R and maxdBrowse to identify differentially-expressed genes from microarray data.

    PubMed

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-08-07

    There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.
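
    Taverna reaches R through its RShell processor and an Rserve deployment. As a language-neutral illustration of the same pattern (an R analysis wrapped as a callable workflow step), the sketch below shells out to Rscript with a toy two-group test; it assumes an R installation with Rscript on the PATH and stands in for, rather than reproduces, the maxdBrowse workflow.

    ```python
    # Illustration of wrapping an R computation as a workflow step by
    # invoking Rscript; Taverna itself uses the RShell processor + RServe.
    import subprocess

    r_code = """
    x <- c(2.1, 2.4, 1.9, 3.8, 4.0, 4.2)
    group <- factor(c('ctl','ctl','ctl','trt','trt','trt'))
    cat(t.test(x ~ group)$p.value)
    """

    # Assumes Rscript is available on the PATH.
    out = subprocess.run(
        ["Rscript", "-e", r_code],
        capture_output=True, text=True, check=True,
    )
    p_value = float(out.stdout.strip())
    print("differential-expression p-value (toy data):", p_value)
    ```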

  16. Performing statistical analyses on quantitative data in Taverna workflows: An example using R and maxdBrowse to identify differentially-expressed genes from microarray data

    PubMed Central

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-01-01

    Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data. PMID:18687127

  17. On the Multilevel Nature of Meta-Analysis: A Tutorial, Comparison of Software Programs, and Discussion of Analytic Choices.

    PubMed

    Pastor, Dena A; Lazowski, Rory A

    2018-01-01

    The term "multilevel meta-analysis" is encountered not only in applied research studies, but in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.

  18. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    NASA Astrophysics Data System (ADS)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
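
    The PCA PSF-subtraction step at the heart of this pipeline can be written out compactly. The sketch below hand-codes a minimal angular-differential-imaging (ADI) PCA reduction in NumPy/SciPy on a synthetic cube; the production workflow uses the VIP library, so treat this as a conceptual stand-in rather than the actual implementation.

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    def adi_pca(cube, angles, ncomp=5):
        """Minimal ADI + PCA PSF subtraction (conceptual, not VIP itself).

        cube   : (nframes, ny, nx) stack of coronagraphic frames
        angles : parallactic angle of each frame in degrees
        """
        nframes, ny, nx = cube.shape
        X = cube.reshape(nframes, -1)
        X = X - X.mean(axis=0)            # center the data

        # Principal components of the stellar PSF from the frame stack.
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        basis = Vt[:ncomp]

        # Subtract each frame's projection onto the PSF subspace.
        residuals = (X - (X @ basis.T) @ basis).reshape(nframes, ny, nx)

        # Derotate so any planet signal aligns, then median-combine.
        derot = [rotate(f, -a, reshape=False, order=1)
                 for f, a in zip(residuals, angles)]
        return np.median(derot, axis=0)

    # Synthetic example: random noise cube with fabricated angles.
    cube = np.random.default_rng(0).normal(size=(20, 64, 64))
    final = adi_pca(cube, angles=np.linspace(0, 40, 20))
    ```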

  19. Dynamic reusable workflows for ocean science

    USGS Publications Warehouse

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic notebooks across the geoscience domains.
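
    The catalog-search step of such a workflow takes only a few lines with OWSLib, one of the Python tools these notebooks build on. The endpoint URL and query string below are illustrative assumptions, not the workflow's actual configuration.

    ```python
    # Catalog-driven data discovery via OGC CSW, as in the IOOS notebooks.
    # The endpoint URL and query string are illustrative.
    from owslib.csw import CatalogueServiceWeb
    from owslib import fes

    csw = CatalogueServiceWeb("https://data.ioos.us/csw")  # example endpoint

    # Free-text filter over the catalog's full-text index.
    query = fes.PropertyIsLike("csw:AnyText", literal="%sea_water_temperature%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for rec_id, rec in csw.records.items():
        print(rec.title)
        # Service endpoints (e.g., OPeNDAP, SOS) appear in the references.
        for ref in rec.references:
            print("  ", ref.get("scheme"), ref.get("url"))
    ```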

  20. BioVeL: a virtual laboratory for data analysis and modelling in biodiversity science and ecology.

    PubMed

    Hardisty, Alex R; Bacall, Finn; Beard, Niall; Balcázar-Vargas, Maria-Paula; Balech, Bachir; Barcza, Zoltán; Bourlat, Sarah J; De Giovanni, Renato; de Jong, Yde; De Leo, Francesca; Dobor, Laura; Donvito, Giacinto; Fellows, Donal; Guerra, Antonio Fernandez; Ferreira, Nuno; Fetyukova, Yuliya; Fosso, Bruno; Giddy, Jonathan; Goble, Carole; Güntsch, Anton; Haines, Robert; Ernst, Vera Hernández; Hettling, Hannes; Hidy, Dóra; Horváth, Ferenc; Ittzés, Dóra; Ittzés, Péter; Jones, Andrew; Kottmann, Renzo; Kulawik, Robert; Leidenberger, Sonja; Lyytikäinen-Saarenmaa, Päivi; Mathew, Cherian; Morrison, Norman; Nenadic, Aleksandra; de la Hidalga, Abraham Nieva; Obst, Matthias; Oostermeijer, Gerard; Paymal, Elisabeth; Pesole, Graziano; Pinto, Salvatore; Poigné, Axel; Fernandez, Francisco Quevedo; Santamaria, Monica; Saarenmaa, Hannu; Sipos, Gergely; Sylla, Karl-Heinz; Tähtinen, Marko; Vicario, Saverio; Vos, Rutger Aldo; Williams, Alan R; Yilmaz, Pelin

    2016-10-20

    Making forecasts about biodiversity and giving support to policy relies increasingly on large collections of data held electronically, and on substantial computational capability and capacity to analyse, model, simulate and predict using such data. However, the physically distributed nature of data resources and of expertise in advanced analytical tools creates many challenges for the modern scientist. Across the wider biological sciences, presenting such capabilities on the Internet (as "Web services") and using scientific workflow systems to compose them for particular tasks is a practical way to carry out robust "in silico" science. However, use of this approach in biodiversity science and ecology has thus far been quite limited. BioVeL is a virtual laboratory for data analysis and modelling in biodiversity science and ecology, freely accessible via the Internet. BioVeL includes functions for accessing and analysing data through curated Web services; for performing complex in silico analysis through exposure of R programs, workflows, and batch processing functions; for on-line collaboration through sharing of workflows and workflow runs; for experiment documentation through reproducibility and repeatability; and for computational support via seamless connections to supporting computing infrastructures. We developed and improved more than 60 Web services with significant potential in many different kinds of data analysis and modelling tasks. We composed reusable workflows using these Web services, also incorporating R programs. Deploying these tools into an easy-to-use and accessible 'virtual laboratory', free via the Internet, we applied the workflows in several diverse case studies. We opened the virtual laboratory for public use and through a programme of external engagement we actively encouraged scientists and third party application and tool developers to try out the services and contribute to the activity. Our work shows we can deliver an operational, scalable and flexible Internet-based virtual laboratory to meet new demands for data processing and analysis in biodiversity science and ecology. In particular, we have successfully integrated existing and popular tools and practices from different scientific disciplines to be used in biodiversity and ecological research.

  1. Rethinking Meta-Analysis: Applications for Air Pollution Data and Beyond

    PubMed Central

    Goodman, Julie E; Petito Boyce, Catherine; Sax, Sonja N; Beyer, Leslie A; Prueitt, Robyn L

    2015-01-01

    Meta-analyses offer a rigorous and transparent systematic framework for synthesizing data that can be used for a wide range of research areas, study designs, and data types. Both the outcome of meta-analyses and the meta-analysis process itself can yield useful insights for answering scientific questions and making policy decisions. Development of the National Ambient Air Quality Standards illustrates many potential applications of meta-analysis. These applications demonstrate the strengths and limitations of meta-analysis, issues that arise in various data realms, how meta-analysis design choices can influence interpretation of results, and how meta-analysis can be used to address bias and heterogeneity. Reviewing available data from a meta-analysis perspective can provide a useful framework and impetus for identifying and refining strategies for future research. Moreover, increased pervasiveness of a meta-analysis mindset—focusing on how the pieces of the research puzzle fit together—would benefit scientific research and data syntheses regardless of whether or not a quantitative meta-analysis is undertaken. While an individual meta-analysis can only synthesize studies addressing the same research question, the results of separate meta-analyses can be combined to address a question encompassing multiple data types. This observation applies to any scientific or policy area where information from a variety of disciplines must be considered to address a broader research question. PMID:25969128

  2. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy and cognitive neuroscience.

  3. Dissociations of cognitive inhibition, response inhibition, and emotional interference: Voxelwise ALE meta-analyses of fMRI studies.

    PubMed

    Hung, Yuwen; Gaillard, Schuyler L; Yarmak, Pavel; Arsalidou, Marie

    2018-06-19

    Inhibitory control is the stopping of a mental process with or without intention, conceptualized as mental suppression of competing information because of limited cognitive capacity. Inhibitory control dysfunction is a core characteristic of many major psychiatric disorders. Inhibition is generally thought to involve the prefrontal cortex; however, a single inhibitory mechanism is insufficient for interpreting the heterogeneous nature of human cognition. It remains unclear whether different dimensions of inhibitory processes (specifically cognitive inhibition, response inhibition, and emotional interference) rely on dissociated neural systems. We conducted systematic meta-analyses of fMRI studies in the BrainMap database supplemented by PubMed using whole-brain activation likelihood estimation. A total of 66 study experiments including 1,447 participants and 987 foci revealed that while the left anterior insula was concordant in all inhibitory dimensions, cognitive inhibition reliably activated a specific dorsal frontal inhibitory system, engaging the dorsal anterior cingulate, dorsolateral prefrontal cortex, and parietal areas, whereas emotional interference reliably implicated a ventral inhibitory system, involving the ventral surface of the inferior frontal gyrus and the amygdala. Response inhibition showed concordant clusters in the fronto-striatal system, including the dorsal anterior cingulate region and extended supplementary motor areas, the dorsal and ventral lateral prefrontal cortex, basal ganglia, midbrain regions, and parietal regions. We provide an empirically derived dimensional model of inhibition characterizing neural systems underlying different aspects of inhibitory mechanisms. This study offers a fundamental framework to advance current understanding of inhibition and provides new insights for future clinical research into disorders with different types of inhibition-related dysfunctions. © 2018 Wiley Periodicals, Inc.

  4. Atmospheric emissions and trends of nitrous oxide deduced from 10 years of ALE-GAGE data

    NASA Technical Reports Server (NTRS)

    Prinn, R.; Cunnold, D.; Alyea, F.; Rasmussen, R.; Simmonds, P.

    1990-01-01

    Long-term measurements of nitrous oxide (N2O) obtained during the Atmospheric Lifetime Experiment (ALE) and the Global Atmospheric Gases Experiment (GAGE) for a period from 1978 to 1988 are presented and interpreted. It is observed that the average concentration in the Northern Hemisphere is 0.75 ± 0.16 ppbv higher than in the Southern Hemisphere and that the global average linear trend in N2O lies in the range from 0.25 to 0.31 percent/year. The measured trends and latitudinal distributions are shown to be consistent with the hypothesis that stratospheric photodissociation is the major atmospheric sink for N2O, while the cause of the N2O trend is suggested to be a combination of a growing tropical source and a growing Northern mid-latitude source. A 10-year average global N2O emission rate of (20.5 ± 2.4) × 10^12 g N2O/year is deduced from the ALE/GAGE data.
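
    A percent-per-year trend of this kind is conventionally obtained by regressing the logarithm of the mole fraction on time, since the slope of the log-linear fit is the fractional growth rate. The sketch below demonstrates the fit on fabricated monthly data; the numbers are not ALE/GAGE values.

    ```python
    import numpy as np

    # Fabricated monthly N2O mole fractions (ppbv) over 10 years with a
    # ~0.28 %/yr growth rate -- illustrative values, not ALE/GAGE data.
    t = np.arange(120) / 12.0                      # time in years
    rng = np.random.default_rng(1)
    c = 300.0 * 1.0028 ** t + rng.normal(0, 0.3, t.size)

    # Fit log(concentration) vs time: the slope is the fractional trend.
    slope, _ = np.polyfit(t, np.log(c), 1)
    print(f"estimated trend: {100 * slope:.2f} %/yr")
    ```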

  5. New frontiers of atomic layer etching

    NASA Astrophysics Data System (ADS)

    Sherpa, Sonam D.; Ranjan, Alok

    2018-03-01

    Interest in atomic layer etching (ALE) has surged recently because it offers several advantages over continuous or quasicontinuous plasma etching. These benefits include (1) independent control of ion energy, ion flux, and radical flux, (2) flux-independent etch rate that mitigates the iso-dense loading effects, and (3) ability to control the etch rate with atomic or nanoscale precision. In addition to these benefits, we demonstrate an area-selective etching for maskless lithography as a new frontier of ALE. In this paper, area-selective etching refers to the confinement of etching into the specific areas of the substrate. The concept of area-selective etching originated during our studies on quasi-ALE of silicon nitride which consists of sequential exposure of silicon nitride to hydrogen and fluorinated plasma. The findings of our studies reported in this paper suggest that it may be possible to confine the etching into specific areas of silicon nitride without using any mask by replacing conventional hydrogen plasma with a localized source of hydrogen ions.

  6. Performance Evaluation of Heart Sound Cancellation in FPGA Hardware Implementation for Electronic Stethoscope

    PubMed Central

    Chao, Chun-Tang

    2014-01-01

    This paper presents the design and evaluation of the hardware circuit for electronic stethoscopes with heart sound cancellation capabilities using field programmable gate arrays (FPGAs). The adaptive line enhancer (ALE) was adopted as the filtering methodology to reduce heart sound attributes from the breath sounds obtained via the electronic stethoscope pickup. FPGAs were utilized to implement the ALE functions in hardware to achieve near real-time breath sound processing. We believe that such an implementation is unprecedented and crucial toward a truly useful, standalone medical device in outpatient clinic settings. The implementation evaluation with one Altera cyclone II–EP2C70F89 shows that the proposed ALE used 45% resources of the chip. Experiments with the proposed prototype were made using DE2-70 emulation board with recorded body signals obtained from online medical archives. Clear suppressions were observed in our experiments from both the frequency domain and time domain perspectives. PMID:24790573
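
    At its core, the adaptive line enhancer is a delayed-input LMS predictor: the adaptive filter locks onto the (quasi-)periodic heart-sound component, and the residual prediction error is the enhanced breath sound. A floating-point reference sketch on a synthetic signal is shown below; fixed-point and FPGA-specific details are omitted.

    ```python
    import numpy as np

    def ale_lms(x, order=32, delay=16, mu=0.005):
        """Adaptive line enhancer: predict x[n] from delayed samples.
        Returns (periodic_estimate, error); the error approximates the
        broadband (breath-sound) component."""
        w = np.zeros(order)
        y = np.zeros_like(x)
        e = np.zeros_like(x)
        for n in range(delay + order, len(x)):
            ref = x[n - delay - order:n - delay][::-1]  # delayed tap vector
            y[n] = w @ ref                              # predicted periodic part
            e[n] = x[n] - y[n]                          # enhancer output
            w += 2 * mu * e[n] * ref                    # LMS weight update
        return y, e

    # Synthetic mixture: "heart sound" tone plus broadband "breath" noise.
    fs = 2000
    t = np.arange(4 * fs) / fs
    x = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
    periodic, breath = ale_lms(x)
    ```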

  7. Deployment Simulation Methods for Ultra-Lightweight Inflatable Structures

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Johnson, Arthur R.

    2003-01-01

    Two dynamic inflation simulation methods are employed for modeling the deployment of folded thin-membrane tubes. The simulations are necessary because ground tests include gravity effects and may poorly represent deployment in space. The two simulation methods are referred to as the Control Volume (CV) method and the Arbitrary Lagrangian Eulerian (ALE) method. They are available in the LS-DYNA nonlinear dynamic finite element code. Both methods are suitable for modeling the interactions between the inflation gas and the thin-membrane tube structures. The CV method only considers the pressure induced by the inflation gas in the simulation, while the ALE method models the actual flow of the inflation gas. Thus, the transient fluid properties at any location within the tube can be predicted by the ALE method. Deployment simulations of three packaged tube models; namely coiled, Z-folded, and telescopically-folded configurations, are performed. Results predicted by both methods for the telescopically-folded configuration are correlated and computational efficiency issues are discussed.

  8. Simulation of underwater explosion benchmark experiments with ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couch, R.; Faux, D.

    1997-05-19

    Some code improvements have been made during the course of this study. One immediately obvious need was for more flexibility in the constitutive representation for materials in shell elements. To remedy this situation, a model with a tabular representation of stress versus strain and rate dependent effects was implemented. This was required in order to obtain reasonable results in the IED cylinder simulation. Another deficiency was in the ability to extract and plot variables associated with shell elements. The pipe whip analysis required the development of a scheme to tally and plot time dependent shell quantities such as stresses and strains. This capability had previously existed only for solid elements. Work was initiated to provide the same range of plotting capability for structural elements that exist with the DYNA3D/TAURUS tools. One of the characteristics of these problems is the disparity in zoning required in the vicinity of the charge and bubble compared to that needed in the far field. This disparity can cause the equipotential relaxation logic to provide a less than optimal solution. Various approaches were utilized to bias the relaxation to obtain more optimal meshing during relaxation. Extensions of these techniques have been developed to provide more powerful options, but more work still needs to be done. The results presented here are representative of what can be produced with an ALE code structured like ALE3D. They are not necessarily the best results that could have been obtained. More experience in assessing sensitivities to meshing and boundary conditions would be very useful. A number of code deficiencies discovered in the course of this work have been corrected and are available for any future investigations.

  9. Implications of evolutionary engineering for growth and recombinant protein production in methanol-based growth media in the yeast Pichia pastoris.

    PubMed

    Moser, Josef W; Prielhofer, Roland; Gerner, Samuel M; Graf, Alexandra B; Wilson, Iain B H; Mattanovich, Diethard; Dragosits, Martin

    2017-03-17

    Pichia pastoris is a widely used eukaryotic expression host for recombinant protein production. Adaptive laboratory evolution (ALE) has been applied in a wide range of studies in order to improve strains for biotechnological purposes. In this context, the impact of long-term carbon source adaptation in P. pastoris has not been addressed so far. Thus, we performed a pilot experiment in order to analyze the applicability and potential benefits of ALE towards improved growth and recombinant protein production in P. pastoris. Adaptation towards growth on methanol was performed in replicate cultures in rich and minimal growth medium for 250 generations. Increased growth rates on these growth media were observed at the population and single clone level. Evolved populations showed various degrees of growth advantages and trade-offs in non-evolutionary growth conditions. Genome resequencing revealed a wide variety of potential genetic targets associated with improved growth performance on methanol-based growth media. Alcohol oxidase represented a mutational hotspot since four out of seven evolved P. pastoris clones harbored mutations in this gene, resulting in decreased Aox activity, despite increased growth rates. Selected clones displayed strain-dependent variations for AOX-promoter based recombinant protein expression yield. One particularly interesting clone showed increased product titers ranging from a 2.5-fold increase in shake flask batch culture to a 1.8-fold increase during fed batch cultivation. Our data indicate a complex correlation of carbon source, growth context and recombinant protein production. While similar experiments have already shown their potential in other biotechnological areas where microbes were evolutionary engineered for improved stress resistance and growth, the current dataset encourages the analysis of the potential of ALE for improved protein production in P. pastoris on a broader scale.

  10. ALE: automated label extraction from GEO metadata.

    PubMed

    Giles, Cory B; Brown, Chase A; Ripperger, Michael; Dennis, Zane; Roopnarinesingh, Xiavan; Porter, Hunter; Perz, Aleksandra; Wren, Jonathan D

    2017-12-28

    NCBI's Gene Expression Omnibus (GEO) is a rich community resource containing millions of gene expression experiments from human, mouse, rat, and other model organisms. However, information about each experiment (metadata) is in the format of an open-ended, non-standardized textual description provided by the depositor. Thus, classification of experiments for meta-analysis by factors such as gender, age of the sample donor, and tissue of origin is not feasible without assigning labels to the experiments. Automated approaches are preferable for this, primarily because of the size and volume of the data to be processed, but also because they ensure standardization and consistency. While some of these labels can be extracted directly from the textual metadata, many of the available records do not contain explicit text informing the researcher about the age and gender of the subjects within the study. To bridge this gap, machine-learning methods can be trained to use the gene expression patterns associated with the text-derived labels to refine label-prediction confidence. Our analysis shows that only 26% of metadata text contains information about gender and 21% about age. In order to ameliorate the lack of available labels for these data sets, we first extract labels from the textual metadata for each GEO RNA dataset and evaluate the performance against a gold standard of manually curated labels. We then use machine-learning methods to predict labels based upon gene expression of the samples and compare this to the text-based method. Here we present an automated method to extract labels for age, gender, and tissue from textual metadata and GEO data using both a heuristic approach as well as machine learning. We show the two methods together improve accuracy of label assignment to GEO samples.
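
    The heuristic arm of such a labeler reduces to pattern matching over free-text metadata. The toy sketch below shows the idea; the regular expressions are illustrative and far simpler than a production extractor.

    ```python
    import re

    # Toy heuristic label extraction from GEO-style free-text metadata.
    # The regexes are illustrative and deliberately simple.
    AGE = re.compile(r"\bage[:\s]+(\d{1,3})\b", re.I)
    GENDER = re.compile(r"\b(male|female)\b", re.I)

    def extract_labels(text):
        labels = {}
        if (m := AGE.search(text)):
            labels["age"] = int(m.group(1))
        if (m := GENDER.search(text)):
            labels["gender"] = m.group(1).lower()
        return labels

    print(extract_labels("whole blood, female donor, age: 54, never-smoker"))
    # -> {'age': 54, 'gender': 'female'}
    ```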

  11. Brain regions with mirror properties: a meta-analysis of 125 human fMRI studies.

    PubMed

    Molenberghs, Pascal; Cunnington, Ross; Mattingley, Jason B

    2012-01-01

    Mirror neurons in macaque area F5 fire when an animal performs an action, such as a mouth or limb movement, and also when the animal passively observes an identical or similar action performed by another individual. Brain-imaging studies in humans conducted over the last 20 years have repeatedly attempted to reveal analogous brain regions with mirror properties in humans, with broad and often speculative claims about their functional significance across a range of cognitive domains, from language to social cognition. Despite such concerted efforts, the likely neural substrates of these mirror regions have remained controversial, and indeed the very existence of a distinct subcategory of human neurons with mirroring properties has been questioned. Here we used activation likelihood estimation (ALE), to provide a quantitative index of the consistency of patterns of fMRI activity measured in human studies of action observation and action execution. From an initial sample of more than 300 published works, data from 125 papers met our strict inclusion and exclusion criteria. The analysis revealed 14 separate clusters in which activation has been consistently attributed to brain regions with mirror properties, encompassing 9 different Brodmann areas. These clusters were located in areas purported to show mirroring properties in the macaque, such as the inferior parietal lobule, inferior frontal gyrus and the adjacent ventral premotor cortex, but surprisingly also in regions such as the primary visual cortex, cerebellum and parts of the limbic system. Our findings suggest a core network of human brain regions that possess mirror properties associated with action observation and execution, with additional areas recruited during tasks that engage non-motor functions, such as auditory, somatosensory and affective components. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  12. Semi-automated analysis of high-resolution aerial images to quantify docks in Upper Midwest glacial lakes

    USGS Publications Warehouse

    Beck, Marcus W.; Vondracek, Bruce C.; Hatch, Lorin K.; Vinje, Jason

    2013-01-01

    Lake resources can be negatively affected by environmental stressors originating from multiple sources and different spatial scales. Shoreline development, in particular, can negatively affect lake resources through decline in habitat quality, physical disturbance, and impacts on fisheries. The development of remote sensing techniques that efficiently characterize shoreline development in a regional context could greatly improve management approaches for protecting and restoring lake resources. The goal of this study was to develop an approach using high-resolution aerial photographs to quantify and assess docks as indicators of shoreline development. First, we describe a dock analysis workflow that can be used to quantify the spatial extent of docks using aerial images. Our approach incorporates pixel-based classifiers with object-based techniques to effectively analyze high-resolution digital imagery. Second, we apply the analysis workflow to quantify docks for 4261 lakes managed by the Minnesota Department of Natural Resources. Overall accuracy of the analysis results was 98.4% (87.7% based on ) after manual post-processing. The analysis workflow also required 74% less time than manual digitization of docks. These analyses have immediate relevance for resource planning in Minnesota, whereas the dock analysis workflow could be used to quantify shoreline development in other regions with comparable imagery. These data can also be used to better understand the effects of shoreline development on aquatic resources and to evaluate the effects of shoreline development relative to other stressors.
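
    The pixel-then-object pattern described here (classify pixels first, then group and filter them as objects) can be sketched with scikit-image primitives. The mask, size threshold, and eccentricity cutoff below are invented for illustration.

    ```python
    import numpy as np
    from skimage.measure import label, regionprops

    # Toy "classified" image: True where a pixel classifier flagged
    # dock-like material. Thresholds and sizes are invented.
    rng = np.random.default_rng(3)
    mask = rng.random((200, 200)) > 0.995
    mask[50:54, 20:60] = True      # a plausible dock-shaped object

    # Object-based step: group flagged pixels into connected components
    # and keep components whose geometry is consistent with a dock.
    objects = regionprops(label(mask))
    docks = [o for o in objects
             if o.area >= 40 and o.eccentricity > 0.9]  # long, thin objects

    print(f"candidate docks: {len(docks)}")
    ```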

  13. Evaluation of non-volatile metabolites in beer stored at high temperature and utility as an accelerated method to predict flavour stability.

    PubMed

    Heuberger, Adam L; Broeckling, Corey D; Sedin, Dana; Holbrook, Christian; Barr, Lindsay; Kirkpatrick, Kaylyn; Prenni, Jessica E

    2016-06-01

    Flavour stability is vital to the brewing industry as beer is often stored for an extended time under variable conditions. Developing an accelerated model to evaluate brewing techniques that affect flavour stability is an important area of research. Here, we performed metabolomics on non-volatile compounds in beer stored at 37 °C between 1 and 14 days for two beer types: an amber ale and an India pale ale. The experiment determined high temperature to influence non-volatile metabolites, including the purine 5-methylthioadenosine (5-MTA). In a second experiment, three brewing techniques were evaluated for improved flavour stability: use of antioxidant crowns, chelation of pro-oxidants, and varying plant content in hops. Sensory analysis determined the hop method was associated with improved flavour stability, and this was consistent with reduced 5-MTA at both regular and high temperature storage. Future studies are warranted to understand the influence of 5-MTA on flavour and aging within different beer types. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Yadage and Packtivity - analysis preservation using parametrized workflows

    NASA Astrophysics Data System (ADS)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
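
    The core of this model (steps as nodes, data dependencies as edges of a directed acyclic graph) can be illustrated with the Python standard library. The sketch below uses a static dict of edges rather than yadage's JSON schemas, and the step names are hypothetical.

    ```python
    # DAG-ordered execution of parametrized steps, in the spirit of yadage;
    # not the actual yadage schema. Step names are hypothetical.
    from graphlib import TopologicalSorter

    # node -> set of nodes it depends on
    dag = {
        "select_events": set(),
        "fit_signal": {"select_events"},
        "fit_background": {"select_events"},
        "combine": {"fit_signal", "fit_background"},
    }

    def run(step, inputs):
        print(f"running {step} with inputs {sorted(inputs)}")
        return f"{step}.out"

    results = {}
    for step in TopologicalSorter(dag).static_order():
        results[step] = run(step, [results[d] for d in dag[step]])
    ```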

  15. Accessing and integrating data and knowledge for biomedical research.

    PubMed

    Burgun, A; Bodenreider, O

    2008-01-01

    To review the issues that have arisen with the advent of translational research in terms of integration of data and knowledge, and survey current efforts to address these issues. Using examples from the biomedical literature, we identified new trends in biomedical research and their impact on bioinformatics. We analyzed the requirements for effective knowledge repositories and studied issues in the integration of biomedical knowledge. New diagnostic and therapeutic approaches based on gene expression patterns have brought about new issues in the statistical analysis of data, and new workflows are needed to support translational research. Interoperable data repositories based on standard annotations, infrastructures and services are needed to support the pooling and meta-analysis of data, as well as their comparison to earlier experiments. High-quality, integrated ontologies and knowledge bases serve as a source of prior knowledge used in combination with traditional data mining techniques and contribute to the development of more effective data analysis strategies. As biomedical research evolves from traditional clinical and biological investigations towards omics sciences and translational research, specific needs have emerged, including integrating data collected in research studies with patient clinical data, linking omics knowledge with medical knowledge, modeling the molecular basis of diseases, and developing tools that support in-depth analysis of research data. As such, translational research illustrates the need to bridge the gap between bioinformatics and medical informatics, and opens new avenues for biomedical informatics research.

  16. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    PubMed Central

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Background: A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdowns in the workflow which could contribute to better technology designs. Objective: The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods: A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results: Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions: On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  17. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    PubMed

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  18. A ChIP-Seq Data Analysis Pipeline Based on Bioconductor Packages.

    PubMed

    Park, Seung-Jin; Kim, Jong-Hwan; Yoon, Byung-Ha; Kim, Seon-Young

    2017-03-01

    Nowadays, huge volumes of chromatin immunoprecipitation-sequencing (ChIP-Seq) data are generated to increase the knowledge on DNA-protein interactions in the cell, and accordingly, many tools have been developed for ChIP-Seq analysis. Here, we provide an example of a streamlined workflow for ChIP-Seq data analysis composed of only four packages in Bioconductor: dada2, QuasR, mosaics, and ChIPseeker. 'dada2' performs trimming of the high-throughput sequencing data. 'QuasR' and 'mosaics' perform quality control and mapping of the input reads to the reference genome and peak calling, respectively. Finally, 'ChIPseeker' performs annotation and visualization of the called peaks. This workflow runs well independently of operating systems (e.g., Windows, Mac, or Linux) and processes the input fastq files into various results in one run. R code is available at github: https://github.com/ddhb/Workflow_of_Chipseq.git.

  19. A ChIP-Seq Data Analysis Pipeline Based on Bioconductor Packages

    PubMed Central

    Park, Seung-Jin; Kim, Jong-Hwan; Yoon, Byung-Ha; Kim, Seon-Young

    2017-01-01

    Nowadays, huge volumes of chromatin immunoprecipitation-sequencing (ChIP-Seq) data are generated to increase the knowledge on DNA-protein interactions in the cell, and accordingly, many tools have been developed for ChIP-Seq analysis. Here, we provide an example of a streamlined workflow for ChIP-Seq data analysis composed of only four packages in Bioconductor: dada2, QuasR, mosaics, and ChIPseeker. ‘dada2’ performs trimming of the high-throughput sequencing data. ‘QuasR’ and ‘mosaics’ perform quality control and mapping of the input reads to the reference genome and peak calling, respectively. Finally, ‘ChIPseeker’ performs annotation and visualization of the called peaks. This workflow runs well independently of operating systems (e.g., Windows, Mac, or Linux) and processes the input fastq files into various results in one run. R code is available at github: https://github.com/ddhb/Workflow_of_Chipseq.git. PMID:28416945

  20. Compatible, energy conserving, bounds preserving remap of hydrodynamic fields for an extended ALE scheme

    DOE PAGES

    Burton, Donald E.; Morgan, Nathaniel Ray; Charest, Marc Robert Joseph; ...

    2017-11-22

    From the very origins of numerical hydrodynamics in the Lagrangian work of von Neumann and Richtmyer [83], the issue of total energy conservation as well as entropy production has been problematic. Because of well known problems with mesh deformation, Lagrangian schemes have evolved into Arbitrary Lagrangian–Eulerian (ALE) methods [39] that combine the best properties of Lagrangian and Eulerian methods. Energy issues have persisted for this class of methods. We believe that fundamental issues of energy conservation and entropy production in ALE require further examination. The context of the paper is an ALE scheme that is extended in the sense that it permits cyclic or periodic remap of data between grids of the same or differing connectivity. The principal design goals for a remap method then consist of total energy conservation, bounded internal energy, and compatibility of kinetic energy and momentum. We also have secondary objectives of limiting velocity and stress in a non-directional manner, keeping primitive variables monotone, and providing a higher than second order reconstruction of remapped variables. In particular, the new contributions fall into three categories associated with: energy conservation and entropy production, reconstruction and bounds preservation of scalar and tensor fields, and conservative remap of nonlinear fields. Our paper presents a derivation of the methods, details of implementation, and numerical results for a number of test problems. The method requires volume integration of polynomial functions in polytopal cells with planar facets, and the requisite expressions are derived for arbitrary order.

  1. Magmatic architecture within a rift segment: Articulate axial magma storage at Erta Ale volcano, Ethiopia

    NASA Astrophysics Data System (ADS)

    Xu, Wenbin; Rivalta, Eleonora; Li, Xing

    2017-10-01

    Understanding the magmatic systems beneath rift volcanoes provides insights into the deeper processes associated with rift architecture and development. At the slow spreading Erta Ale segment (Afar, Ethiopia) transition from continental rifting to seafloor spreading is ongoing on land. A lava lake has been documented since the twentieth century at the summit of the Erta Ale volcano and acts as an indicator of the pressure of its magma reservoir. However, the structure of the plumbing system of the volcano feeding such persistent active lava lake and the mechanisms controlling the architecture of magma storage remain unclear. Here, we combine high-resolution satellite optical imagery and radar interferometry (InSAR) to infer the shape, location and orientation of the conduits feeding the 2017 Erta Ale eruption. We show that the lava lake was rooted in a vertical dike-shaped reservoir that had been inflating prior to the eruption. The magma was subsequently transferred into a shallower feeder dike. We also find a shallow, horizontal magma lens elongated along axis inflating beneath the volcano during the later period of the eruption. Edifice stress modeling suggests the hydraulically connected system of horizontal and vertical thin magmatic bodies able to open and close are arranged spatially according to stresses induced by loading and unloading due to topographic changes. Our combined approach may provide new constraints on the organization of magma plumbing systems beneath volcanoes in continental and marine settings.

  2. Compatible, energy conserving, bounds preserving remap of hydrodynamic fields for an extended ALE scheme

    NASA Astrophysics Data System (ADS)

    Burton, D. E.; Morgan, N. R.; Charest, M. R. J.; Kenamond, M. A.; Fung, J.

    2018-02-01

    From the very origins of numerical hydrodynamics in the Lagrangian work of von Neumann and Richtmyer [83], the issue of total energy conservation as well as entropy production has been problematic. Because of well known problems with mesh deformation, Lagrangian schemes have evolved into Arbitrary Lagrangian-Eulerian (ALE) methods [39] that combine the best properties of Lagrangian and Eulerian methods. Energy issues have persisted for this class of methods. We believe that fundamental issues of energy conservation and entropy production in ALE require further examination. The context of the paper is an ALE scheme that is extended in the sense that it permits cyclic or periodic remap of data between grids of the same or differing connectivity. The principal design goals for a remap method then consist of total energy conservation, bounded internal energy, and compatibility of kinetic energy and momentum. We also have secondary objectives of limiting velocity and stress in a non-directional manner, keeping primitive variables monotone, and providing a higher than second order reconstruction of remapped variables. In particular, the new contributions fall into three categories associated with: energy conservation and entropy production, reconstruction and bounds preservation of scalar and tensor fields, and conservative remap of nonlinear fields. The paper presents a derivation of the methods, details of implementation, and numerical results for a number of test problems. The method requires volume integration of polynomial functions in polytopal cells with planar facets, and the requisite expressions are derived for arbitrary order.

  3. Alterations in nonenzymatic biochemistry in uremia: origin and significance of "carbonyl stress" in long-term uremic complications.

    PubMed

    Miyata, T; van Ypersele de Strihou, C; Kurokawa, K; Baynes, J W

    1999-02-01

    Advanced glycation end products (AGEs), formed during Maillard or browning reactions by nonenzymatic glycation and oxidation (glycoxidation) of proteins, have been implicated in the pathogenesis of several diseases, including diabetes and uremia. AGEs, such as pentosidine and carboxymethyllysine, are markedly elevated in both plasma proteins and skin collagen of uremic patients, irrespective of the presence of diabetes. The increased chemical modification of proteins is not limited to AGEs, because increased levels of advanced lipoxidation end products (ALEs), such as malondialdehydelysine, are also detected in plasma proteins in uremia. The accumulation of AGEs and ALEs in uremic plasma proteins is not correlated with increased blood glucose or triglycerides, nor is it determined by a decreased removal of chemically modified proteins by glomerular filtration. It more likely results from increased plasma concentrations of small, reactive carbonyl precursors of AGEs and ALEs, such as glyoxal, methylglyoxal, 3-deoxyglucosone, dehydroascorbate, and malondialdehyde. Thus, uremia may be described as a state of carbonyl overload or "carbonyl stress" resulting from either increased oxidation of carbohydrates and lipids (oxidative stress) or inadequate detoxification or inactivation of reactive carbonyl compounds derived from both carbohydrates and lipids by oxidative and nonoxidative chemistry. Carbonyl stress in uremia may contribute to the long-term complications associated with chronic renal failure and dialysis, such as dialysis-related amyloidosis and accelerated atherosclerosis. The increased levels of AGEs and ALEs in uremic blood and tissue proteins suggest a broad derangement in the nonenzymatic biochemistry of both carbohydrates and lipids.

  4. A multi-dimensional high-order DG-ALE method based on gas-kinetic theory with application to oscillating bodies

    NASA Astrophysics Data System (ADS)

    Ren, Xiaodong; Xu, Kun; Shyy, Wei

    2016-07-01

    This paper presents a multi-dimensional high-order discontinuous Galerkin (DG) method in an arbitrary Lagrangian-Eulerian (ALE) formulation to simulate flows over variable domains with moving and deforming meshes. It is an extension of the gas-kinetic DG method proposed by the authors for static domains (X. Ren et al., 2015 [22]). A moving mesh gas kinetic DG method is proposed for both inviscid and viscous flow computations. A flux integration method across a translating and deforming cell interface has been constructed. Unlike the previous ALE-type gas-kinetic method with piecewise constant mesh velocity at each cell interface within each time step, the mesh velocity variation inside a cell and the mesh moving and rotating at a cell interface have been accounted for in the finite element framework. As a result, the current scheme is applicable for any kind of mesh movement, such as translation, rotation, and deformation. The accuracy and robustness of the scheme have been improved significantly in the oscillating airfoil calculations. All computations are conducted in a physical domain rather than in a reference domain, and the basis functions move with the grid movement. Therefore, the numerical scheme can preserve the uniform flow automatically, and satisfy the geometric conservation law (GCL). The numerical accuracy can be maintained even for a largely moving and deforming mesh. Several test cases are presented to demonstrate the performance of the gas-kinetic DG-ALE method.
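
    For reference, the geometric conservation law mentioned above requires that mesh motion alone generate no spurious flow: the rate of change of a moving cell's volume must balance the flux of the mesh velocity through the cell boundary. In common notation (a standard statement of the law, with w the mesh velocity and n the outward unit normal):

    ```latex
    % Geometric conservation law for a cell V(t) moving with mesh velocity w:
    \frac{\mathrm{d}}{\mathrm{d}t} \int_{V(t)} \mathrm{d}V
      \;=\; \oint_{\partial V(t)} \mathbf{w} \cdot \mathbf{n} \, \mathrm{d}S .
    ```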

  5. Compatible, energy conserving, bounds preserving remap of hydrodynamic fields for an extended ALE scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burton, Donald E.; Morgan, Nathaniel Ray; Charest, Marc Robert Joseph

    From the very origins of numerical hydrodynamics in the Lagrangian work of von Neumann and Richtmyer [83], the issue of total energy conservation as well as entropy production has been problematic. Because of well known problems with mesh deformation, Lagrangian schemes have evolved into Arbitrary Lagrangian–Eulerian (ALE) methods [39] that combine the best properties of Lagrangian and Eulerian methods. Energy issues have persisted for this class of methods. We believe that fundamental issues of energy conservation and entropy production in ALE require further examination. The context of the paper is an ALE scheme that is extended in the sense that it permits cyclic or periodic remap of data between grids of the same or differing connectivity. The principal design goals for a remap method then consist of total energy conservation, bounded internal energy, and compatibility of kinetic energy and momentum. We also have secondary objectives of limiting velocity and stress in a non-directional manner, keeping primitive variables monotone, and providing a higher than second order reconstruction of remapped variables. In particular, the new contributions fall into three categories associated with: energy conservation and entropy production, reconstruction and bounds preservation of scalar and tensor fields, and conservative remap of nonlinear fields. Our paper presents a derivation of the methods, details of implementation, and numerical results for a number of test problems. The method requires volume integration of polynomial functions in polytopal cells with planar facets, and the requisite expressions are derived for arbitrary order.

  6. Light-Induced Retinopathy: Young Age Protects more than Ocular Pigmentation.

    PubMed

    Polosa, Anna; Bessaklia, Hyba; Lachapelle, Pierre

    2017-06-01

    The purpose of this study was to compare the efficacy that ocular melanin confers in protecting the retina of juvenile and adult rats exposed to a bright luminous environment. Juvenile (JLE) and adult (ALE) Long-Evans pigmented rats were thus exposed to a bright cyclic light (10,000 lux; white light) from postnatal day 14-28 or for 6 consecutive days, respectively. Flash electroretinograms (ERG) and retinal histology were performed at different predetermined ages post-light exposure. Despite a significant reduction in ERG responses immediately following light exposure, with time, retinal function fully recovered in JLE, compared to a 54% recovery in ALE. In ALE, we noted a region of the supero-temporal quadrant that was highly vulnerable to light damage. This region was also devoid of melanin granules prior to the light exposure. This melanin-free zone increased in size in the days that followed the end of exposure, a process that was accompanied by the gradual degeneration of the thus uncovered photoreceptors. In contrast, melanin and photoreceptor losses were minimal in JLE. Our results suggest that the light-induced photoreceptor degeneration in ALE is secondary to the initial destruction of the RPE and the ensuing loss of melanin protection. In contrast, the melanin granules of JLE appear to be significantly more resistant to light damage, a characteristic that may explain the higher resistance of JLE photoreceptors to light damage. Our results thus suggest that the efficacy of ocular melanin protection against light damage declines with age.

  7. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality-control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  8. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example is from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high-resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME anticipates producing over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication, will be presented, along with current and planned methods to accelerate it, including run-time diagnostics and server-side analysis to avoid moving large datasets.

  9. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology; the first meta-analysis in the journal Molecular Ecology was published in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis, with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because, as contributions to a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
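
    Where the article refers to the fundamental statistics of meta-analysis, the core computation is the pooling of per-study effect sizes weighted by their variances. A minimal random-effects sketch in Python, in the DerSimonian-Laird style, is shown below; the numbers are invented, and in the molecular-ecology setting the effect sizes might be transformed FST values as the article discusses.

      # Minimal DerSimonian-Laird random-effects meta-analysis.
      # y: per-study effect sizes; v: their within-study variances.
      import numpy as np

      def dersimonian_laird(y, v):
          w = 1.0 / v                              # fixed-effect weights
          y_fe = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_fe) ** 2)          # Cochran's Q heterogeneity
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
          w_re = 1.0 / (v + tau2)                  # random-effects weights
          y_re = np.sum(w_re * y) / np.sum(w_re)
          return y_re, np.sqrt(1.0 / np.sum(w_re)), tau2, q

      y = np.array([0.12, 0.30, 0.05, 0.22, 0.18])      # hypothetical effects
      v = np.array([0.010, 0.020, 0.015, 0.008, 0.012]) # hypothetical variances
      est, se, tau2, q = dersimonian_laird(y, v)
      print(f"pooled = {est:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.4f}, Q = {q:.2f}")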

  10. Sample size and power considerations in network meta-analysis

    PubMed Central

    2012-01-01

    Background Network meta-analysis is becoming increasingly popular for establishing comparative effectiveness among multiple interventions for the same disease. Network meta-analysis inherits all methodological challenges of standard pairwise meta-analysis, but with increased complexity due to the multitude of intervention comparisons. One issue that is now widely recognized in pairwise meta-analysis is the issue of sample size and statistical power. This issue, however, has so far received little attention in network meta-analysis. To date, no approaches have been proposed for evaluating the adequacy of the sample size, and thus power, in a treatment network. Findings In this article, we develop easy-to-use flexible methods for estimating the ‘effective sample size’ in indirect comparison meta-analysis and network meta-analysis. The effective sample size for a particular treatment comparison can be interpreted as the number of patients in a pairwise meta-analysis that would provide the same degree and strength of evidence as that which is provided in the indirect comparison or network meta-analysis. We further develop methods for retrospectively estimating the statistical power for each comparison in a network meta-analysis. We illustrate the performance of the proposed methods for estimating effective sample size and statistical power using data from a network meta-analysis on interventions for smoking cessation, including over 100 trials. Conclusion The proposed methods are easy to use and will be of high value to regulatory agencies and decision makers who must assess the strength of the evidence supporting comparative effectiveness estimates. PMID:22992327
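
    The notion of an effective sample size for an indirect comparison can be sketched numerically. Assuming the harmonic-type relation commonly associated with this approach (the exact derivation is in the article), an A-versus-B comparison informed indirectly through a common comparator C behaves like a pairwise meta-analysis of n1*n2/(n1+n2) patients:

      # Sketch of the 'effective sample size' idea for indirect comparisons.
      # n_ac and n_bc are the total sample sizes informing A-vs-C and B-vs-C;
      # the harmonic-mean-type formula below is a hedged reading of the method.
      def effective_sample_size(n_ac: float, n_bc: float) -> float:
          return n_ac * n_bc / (n_ac + n_bc)

      print(effective_sample_size(1000, 1000))  # 500.0: indirect evidence is weaker
      print(effective_sample_size(2000, 500))   # 400.0: limited by the smaller side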

  11. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    PubMed

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools facilitating reproducible research and supporting scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. A large number of examples illustrating educational and research-oriented image-analysis workflows are available for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
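
    As a flavor of what the notebooks teach, here is a short, self-contained SimpleITK snippet (the file names are placeholders, not files shipped with the notebooks): read an image, smooth it, and produce an Otsu foreground mask.

      # Read-smooth-segment sketch with SimpleITK.
      import SimpleITK as sitk

      image = sitk.ReadImage("ct_slice.nii.gz", sitk.sitkFloat32)  # placeholder path
      smoothed = sitk.CurvatureFlow(image1=image, timeStep=0.125,
                                    numberOfIterations=5)
      otsu = sitk.OtsuThresholdImageFilter()
      otsu.SetInsideValue(0)                     # below-threshold label
      otsu.SetOutsideValue(1)                    # bright foreground label
      mask = otsu.Execute(smoothed)
      stats = sitk.LabelStatisticsImageFilter()
      stats.Execute(image, mask)
      print("foreground mean intensity:", stats.GetMean(1))
      sitk.WriteImage(mask, "ct_slice_mask.nii.gz")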

  12. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basins in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
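
    The 'model as a service' pattern described above typically reduces, on the client side, to an HTTP submit-and-poll loop. The endpoint and payload fields below are invented for illustration; they are not the actual RAPID service API.

      # Hypothetical client for a published workflow service: submit a run,
      # poll its status, then fetch results. URL and fields are placeholders.
      import time
      import requests

      BASE = "https://example.org/api/rapid"     # placeholder endpoint

      job = requests.post(f"{BASE}/runs", json={
          "basin": "guadalupe",                  # hypothetical parameters
          "start": "2014-06-01",
          "end": "2014-06-07",
      }).json()

      while True:
          status = requests.get(f"{BASE}/runs/{job['id']}").json()
          if status["state"] in ("finished", "failed"):
              break
          time.sleep(5)                          # poll until the run completes

      print(status["state"], status.get("discharge_url"))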

  13. Characterizing Strain Variation in Engineered E. coli Using a Multi-Omics-Based Workflow

    DOE PAGES

    Brunk, Elizabeth; George, Kevin W.; Alonso-Gutierrez, Jorge; ...

    2016-05-19

    Understanding the complex interactions that occur between heterologous and native biochemical pathways represents a major challenge in metabolic engineering and synthetic biology. We present a workflow that integrates metabolomics, proteomics, and genome-scale models of Escherichia coli metabolism to study the effects of introducing a heterologous pathway into a microbial host. This workflow incorporates complementary approaches from computational systems biology, metabolic engineering, and synthetic biology; provides molecular insight into how the host organism microenvironment changes due to pathway engineering; and demonstrates how biological mechanisms underlying strain variation can be exploited as an engineering strategy to increase product yield. As a proof of concept, we present the analysis of eight engineered strains producing three biofuels: isopentenol, limonene, and bisabolene. Application of this workflow identified the roles of candidate genes, pathways, and biochemical reactions in observed experimental phenomena and facilitated the construction of a mutant strain with improved productivity. The contributed workflow is available as an open-source tool in the form of iPython notebooks.

  14. Barriers to critical thinking: workflow interruptions and task switching among nurses.

    PubMed

    Cornell, Paul; Riordan, Monica; Townsend-Gervis, Mary; Mobley, Robin

    2011-10-01

    Nurses are increasingly called upon to engage in critical thinking. However, current workflow inhibits this goal with frequent task switching and unpredictable demands. To assess workflow's cognitive impact, nurses were observed at 2 hospitals with different patient loads and acuity levels. Workflow on a medical/surgical and pediatric oncology unit was observed, recording tasks, tools, collaborators, and locations. Nineteen nurses were observed for a total of 85.2 hours. Tasks were short with a mean duration of 62.4 and 81.6 seconds on the 2 units. More than 50% of the recorded tasks were less than 30 seconds in length. An analysis of task sequence revealed few patterns and little pairwise repetition. Performance on specific tasks differed between the 2 units, but the character of the workflow was highly similar. The nonrepetitive flow and high amount of switching indicate nurses experience a heavy cognitive load with little uninterrupted time. This implies that nurses rarely have the conditions necessary for critical thinking.

  15. Adopting an Evidence-Based Lifestyle Physical Activity Program: Dissemination Study Design and Methods.

    PubMed

    Dunn, Andrea L; Buller, David B; Dearing, James W; Cutter, Gary; Guerra, Michele; Wilcox, Sara; Bettinghaus, Erwin P

    2012-06-01

    BACKGROUND: There is a scarcity of research studies that have examined academic-commercial partnerships to disseminate evidence-based physical activity programs. Understanding this approach to dissemination is essential because academic-commercial partnerships are increasingly common. Private companies have used dissemination channels and strategies to a degree that academicians have not, and declining resources require academicians to explore these partnerships. PURPOSE: This paper describes a retrospective case-control study design including the methods, demographics, organizational decision-making, implementation rates, and marketing strategy for Active Living Every Day (ALED), an evidence-based lifestyle physical activity program that has been commercially available since 2001. Evidence-based public health promotion programs rely on organizations and targeted sectors to disseminate these programs, although relatively little is known about organizational-level and sector-level influences that lead to their adoption and implementation. METHODS: Cases (n=154) were eligible if they had signed an ALED license agreement with Human Kinetics (HK), publisher of the program's textbooks and facilitator manuals, between 2001 and 2008. Two types of controls were matched (2:2:1) and stratified by sector and region. Active controls (Control 1; n=319) were organizations that contacted HK to consider adopting ALED. Passive controls (Control 2; n=328) were organizations that received unsolicited marketing materials and did not initiate contact with HK. We used Diffusion of Innovations Theory (DIT) constructs as the basis for developing the survey of cases and controls. RESULTS: Using the multi-method strategy recommended by Dillman, a total of n=801 cases and controls were surveyed. Most organizations were from the fitness sector, followed by medical, nongovernmental, governmental, educational, worksite and other sectors, with significantly higher response rates from government, educational and medical sectors compared with fitness and other sectors (p=0.02). More cases reported being involved in the decision to adopt ALED (p<0.0001). Data indicate that a low percentage of controls had ever heard of ALED despite repeated marketing and offering other types of physical activity programs and services. Finally, slightly over half of the adopters reported they had actually implemented the ALED program. CONCLUSION: Dissemination research requires new perspectives and designs to produce valid insights about the results of dissemination efforts. This study design, survey methods and theoretically based questions can serve as a useful model for other evidence-based public health interventions that are marketed by commercial publishers to better understand key issues related to adoption and implementation of evidence-based programs.

  16. Building an efficient curation workflow for the Arabidopsis literature corpus

    PubMed Central

    Li, Donghui; Berardini, Tanya Z.; Muller, Robert J.; Huala, Eva

    2012-01-01

    TAIR (The Arabidopsis Information Resource) is the model organism database (MOD) for Arabidopsis thaliana, a model plant with a literature corpus of about 39 000 articles in PubMed, with over 4300 new articles added in 2011. We have developed a literature curation workflow incorporating both automated and manual elements to cope with this flood of new research articles. The current workflow can be divided into two phases: article selection and curation. Structured controlled vocabularies, such as the Gene Ontology and Plant Ontology are used to capture free text information in the literature as succinct ontology-based annotations suitable for the application of computational analysis methods. We also describe our curation platform and the use of text mining tools in our workflow. Database URL: www.arabidopsis.org PMID:23221298

  17. The Operational Use of an Automated High Frequency Radio System Incorporating Automatic Link Establishment and Single-Tone Serial Modem Technology for U.S. Navy Ship-Shore Communications

    DTIC Science & Technology

    1993-10-01

    ALE measures a link quality analysis (LQA) score, together with single-tone serial modem performance, at the transmitting and receiving ends in turn; propagation permitting, the stations pass traffic and terminate the link. These measurements are used to calculate a combined link quality analysis (LQA) score, which is displayed to the operator as a number on an arbitrary scale of 0 to 120.

  18. Meta-Analysis at Middle Age: A Personal History

    ERIC Educational Resources Information Center

    Glass, Gene V.

    2015-01-01

    The 40-year history of meta-analysis is traced from the vantage point of one of its originators. Research syntheses leading to the first examples of meta-analysis are identified. Early meta-analyses of the literature on psychotherapy outcomes and school class size are recounted. The influence on the development of meta-analysis of several…

  19. A User's Guide to the Meta-Analysis of Research Studies. Meta-Stat: Software To Aid in the Meta-Analysis of Research Findings.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Glass, Gene V.; Evartt, David L.; Emery, Patrick J.

    This manual and the accompanying software are intended to provide a step-by-step guide to conducting a meta-analytic study along with references for further reading and free high-quality software, "Meta-Stat." "Meta-Stat" is a comprehensive package designed to help in the meta-analysis of research studies in the social and behavioral sciences.…

  20. Development of the workflow kine systems for support on KAIZEN.

    PubMed

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of a workflow line investigation, we considered the anticipated effects of, and the problems with, KAIZEN. Workflow line information comprised location information and action-content information. These technologies suggest viewpoints that help improvement, for example the elimination of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and an accumulation of residence time; concretely, a more efficient layout was suggested by this system. In the case of a hospital, it was similarly pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of experts. This system could adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  1. [Integration of the radiotherapy irradiation planning in the digital workflow].

    PubMed

    Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E

    2013-02-01

    At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and had already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority.

  2. [Miscommunication as a risk focus in patient safety : Work process analysis in prehospital emergency care].

    PubMed

    Wilk, S; Siegl, L; Siegl, K; Hohenstein, C

    2018-04-01

    In an analysis of a critical incident reporting system (CIRS) in out-of-hospital emergency medicine, it was demonstrated that in 30% of cases deficient communication led to a threat to patients; however, that analysis did not show which work processes are the most dangerous. Current research shows the impact of poor communication on patient safety. An out-of-hospital workflow analysis collects data about key work processes and risk areas, and points out confounding factors for sufficient communication. Almost 70% of critical incidents are based on human factors: factors such as communication and teamwork have an impact, but fatigue, noise levels and illness also have a major influence. (I) CIRS database analysis: The analysis was based on 247 CIRS cases, complemented by participant observation and interviews with emergency doctors and paramedics. The 247 CIRS cases contained 282 communication incidents, which were categorized into 6 subcategories of miscommunication; one CIRS case could be classified into several categories if the reviewers validated more than one communication incident. Four experienced emergency physicians sorted these cases into the six subcategories. (II) Workflow analysis: The workflow analysis was carried out between 2015 and 2016 in Jena and Berlin, Germany. The focal point of the research was to find accumulations of communication risks in different parts of prehospital patient care. During 30 h of riding along with emergency ambulances, the author interviewed 12 members of the emergency medical service, of whom 5 were emergency physicians and 7 paramedics. A total of 11 internal medicine cases and one automobile accident were monitored. After patient care, the author asked in a 15-min interview whether miscommunication or communication incidents had occurred. (I) CIRS analysis: Between 2005 and 2015, 845 reports were submitted to the database. The experts identified 247 incident reports with communication failure; all communication aspects were analyzed and classified, yielding 282 communication incidents. (II) Workflow analysis: The analysis showed three phases of prehospital patient care: 1. incoming emergency call and dispatch of the ambulance service, 2. prehospital treatment, 3. transportation to a hospital. Overall, the number of incidents increases as a consequence of parallel workflows. Category 1 was particularly significant: predominantly, paramedics criticized that emergency physicians did not acknowledge their advice (n = 73 vs. n = 9). Category 3 with n = 63, category 4 with n = 20 and category 2 with n = 13 were the other major sources of incidents. Better interface communication helps to coordinate patient transfer and is an option for optimizing resources. Frequent communication training is an option to avoid incidents.

  3. RNA-Seq workflow: gene-level exploratory analysis and differential expression

    PubMed Central

    Love, Michael I.; Anders, Simon; Kim, Vladislav; Huber, Wolfgang

    2015-01-01

    Here we walk through an end-to-end gene-level RNA-Seq differential expression workflow using Bioconductor packages. We will start from the FASTQ files, show how these were aligned to the reference genome, and prepare a count matrix which tallies the number of RNA-seq reads/fragments within each gene for each sample. We will perform exploratory data analysis (EDA) for quality assessment and to explore the relationship between samples, perform differential gene expression analysis, and visually explore the results. PMID:26674615
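
    The published workflow is written in R/Bioconductor; purely as a language-neutral illustration of its exploratory step, the sketch below normalizes a toy gene-by-sample count matrix for library size, log-transforms it, and computes between-sample distances of the kind shown in the workflow's EDA heatmaps.

      # Toy count-matrix EDA in Python (the workflow itself uses Bioconductor).
      import numpy as np

      rng = np.random.default_rng(42)
      counts = rng.negative_binomial(n=5, p=0.3, size=(1000, 6))  # genes x samples
      cpm = counts / counts.sum(axis=0) * 1e6     # library-size normalization
      logcpm = np.log2(cpm + 1.0)                 # pseudocount stabilizes zeros

      # Euclidean distances between samples, as in an EDA sample-distance heatmap
      samples = logcpm.T
      d = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=2)
      print(np.round(d, 1))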

  4. Adaptive signal processing and higher order time- frequency analysis for acoustic and vibration signatures in condition monitoring

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Kwon

    This thesis is concerned with the development of a useful engineering technique to detect and analyse faults in rotating machinery. The methods developed are based on advanced signal processing techniques such as adaptive signal processing and higher-order time-frequency methods. The two-stage Adaptive Line Enhancer (ALE), using adaptive signal processing, has been developed for increasing the signal-to-noise ratio (SNR) of impulsive signals. The enhanced signal can then be analysed using time-frequency methods to identify fault characteristics. However, if the SNR of the signals is low after pre-processing by the two-stage ALE, the residual noise often hinders clear identification of the fault characteristics in the time-frequency domain. In such cases, higher-order time-frequency methods have been proposed and studied. As examples of rotating machinery, an internal combustion engine and an industrial gearbox are considered in this thesis. The noise signal from an internal combustion engine and the vibration signal measured on a gearbox are studied in detail. Typically an impulsive signal manifests itself when a fault occurs in the machinery and is embedded in background noise, such as the fundamental frequency of the rotation speed with its harmonic orders, and broadband noise. The two-stage ALE is developed for reducing this background noise. Conditions for the choice of adaptive filter parameters are studied and suitable adaptive algorithms are given. The enhanced impulsive signal is analysed in the time-frequency domain using the Wigner higher order moment spectra (WHOMS) and the multi-time WHOMS (a dual form of the WHOMS). The WHOMS suffers from unwanted cross-terms, which increase dramatically as the order increases. Novel expressions for the cross-terms in WHOMS are presented. The number of cross-terms can be reduced by taking the principal slice of the WHOMS, and the residual cross-terms are smoothed by using a general class of kernel functions and the γ-method kernel function, which is a novel development in this thesis. The WVD and the sliced WHOMS are applied to synthesised signals and measured data from rotating machinery, and the estimated ROC (Receiver Operating Characteristic) curves for these methods are computed. These results lead to the conclusion that the detection performance of the sliced WHOMS, for impulsive signals embedded in broadband noise, is better than that of the Wigner-Ville distribution. Real data from a faulty car engine and faulty industrial gears are analysed: the car engine radiates an impulsive noise signal due to the loosening of a spark plug, and the faulty industrial gear produces an impulsive vibration signal due to a spall on a gear tooth face. The two-stage ALE and WHOMS are successfully applied to the detection and analysis of these impulsive signals.
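
    The principle of the ALE stage is compact enough to sketch: an LMS filter predicts the current sample from a delayed copy of the input, so it locks onto the correlated periodic components, while the prediction error retains the impulses. A minimal single-stage version in Python (not the thesis's two-stage design or its parameter choices):

      # Single-stage adaptive line enhancer (LMS). The filter learns the
      # periodic background; the error channel keeps the impulsive fault signal.
      import numpy as np

      def ale_lms(x, order=32, delay=16, mu=0.002):
          w = np.zeros(order)
          y = np.zeros_like(x)                   # periodic-component estimate
          e = np.zeros_like(x)                   # residual: impulses + noise
          for n in range(order + delay, len(x)):
              u = x[n - delay - order:n - delay][::-1]   # delayed tap vector
              y[n] = w @ u
              e[n] = x[n] - y[n]
              w += mu * e[n] * u                 # LMS weight update
          return y, e

      fs = 1000
      t = np.arange(4 * fs) / fs
      x = np.sin(2 * np.pi * 50 * t)             # rotation harmonic (background)
      x[::fs] += 4.0                             # one impulse per 'revolution'
      x += 0.2 * np.random.default_rng(0).normal(size=len(t))
      y, e = ale_lms(x)
      print("peak-to-background in :", np.max(np.abs(x)) / np.std(x))
      print("peak-to-background out:", np.max(np.abs(e[2000:])) / np.std(e[2000:]))

    The decorrelation delay is what separates the two channels: the sinusoidal background remains predictable across the delay, whereas the rare impulses do not, so they survive in the error signal with a visibly improved peak-to-background ratio.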

  5. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for recording observations fail to support the systematic collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth to build, store and analyze observations of diabetes routine consultations.

  6. Initial steps towards a production platform for DNA sequence analysis on the grid.

    PubMed

    Luyf, Angela C M; van Schaik, Barbera D C; de Vries, Michel; Baas, Frank; van Kampen, Antoine H C; Olabarriaga, Silvia D

    2010-12-14

    Bioinformatics is confronted with a new data explosion due to the availability of high-throughput DNA sequencers. Data storage and analysis become a problem on local servers, and it is therefore necessary to switch to other IT infrastructures. Grid and workflow technology can help to handle the data more efficiently, as well as facilitate collaborations. However, interfaces to grids are often unfriendly to novice users. In this study we reused a platform that was developed in the VL-e project for the analysis of medical images. Data transfer, workflow execution and job monitoring are operated from one graphical interface. We developed workflows for two sequence alignment tools (BLAST and BLAT) as a proof of concept. The analysis time was significantly reduced. All workflows and executables are available for the members of the Dutch Life Science Grid and the VL-e Medical virtual organizations. All components are open source and can be transported to other grid infrastructures. The availability of in-house expertise and tools facilitates the usage of grid resources by new users. Our first results indicate that this is a practical, powerful and scalable solution to address the capacity and collaboration issues raised by the deployment of next-generation sequencers. We currently adopt this methodology on a daily basis for DNA sequencing and other applications. More information and source code are available via http://www.bioinformaticslaboratory.nl/

  7. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community and accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform so that researchers can publish and share their tools and models with colleagues. In recent years, workflow has become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (such as sharable workflows) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. This project therefore aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. In particular, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which declares that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from usage history to help Earth scientists better understand existing artifacts and how to use them in a proper manner? R2: Informed by insights derived from their computing contexts, how could such hidden knowledge be used to facilitate artifact reuse by Earth scientists? Our study of the two research questions will provide answers to three technical questions aiming to assist NEX users during workflow development: 1) How to determine what topics interest the researcher? 2) How to find appropriate artifacts? and 3) How to advise the researcher in artifact reuse? In this paper, we report our ongoing efforts to leverage social networking theory and analysis techniques to provide dynamic advice on artifact reuse to NEX users based on their surrounding contexts. As a proof of concept, we have designed and developed a plug-in to the VisTrails workflow design tool. When users develop workflows using VisTrails, our plug-in proactively recommends the most relevant sub-workflows to them.
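
    A minimal version of usage-history-based recommendation can be sketched with co-occurrence counts: artifacts that frequently appeared together in past workflows are suggested to a user editing a related workflow. The workflow histories and artifact names below are invented for illustration; the project's actual approach is the network-based analysis described above.

      # Toy co-occurrence recommender over past workflow compositions.
      from collections import Counter
      from itertools import combinations

      histories = [
          {"regrid", "cloud_mask", "ndvi"},
          {"regrid", "ndvi", "trend_fit"},
          {"cloud_mask", "ndvi", "trend_fit"},
          {"regrid", "cloud_mask"},
      ]

      cooc = Counter()
      for wf in histories:
          for a, b in combinations(sorted(wf), 2):
              cooc[(a, b)] += 1                  # undirected co-usage counts

      def recommend(current, k=2):
          scores = Counter()
          for (a, b), n in cooc.items():
              if a in current and b not in current:
                  scores[b] += n
              elif b in current and a not in current:
                  scores[a] += n
          return scores.most_common(k)

      print(recommend({"regrid"}))               # [('cloud_mask', 2), ('ndvi', 2)]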

  8. Seamless online science workflow development and collaboration using IDL and the ENVI Services Engine

    NASA Astrophysics Data System (ADS)

    Harris, A. T.; Ramachandran, R.; Maskey, M.

    2013-12-01

    The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA-funded project Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL Workbench. Such additional features are possible because the IDL Workbench is built on the Eclipse Rich Client Platform (RCP), which allows custom plugins to be dropped in for extended functionality. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL Workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). Using the collaborative IDL Workbench, coupled with ESE for execution in the cloud, asynchronous workflows can be executed in batch mode on large datasets. We envision that a scientist will initially develop a scientific workflow locally on a small set of data. Once tested, the scientist will deploy the workflow to the cloud for execution. Depending on the results, the scientist may share the workflow and results, allowing them to be stored in a community catalog and instantly loaded into the IDL Workbench of other scientists. Thereupon, scientists can clone and modify or execute the workflow with different input parameters. The Collaborative Workbench will provide a platform for collaboration in the cloud, helping Earth scientists solve big-data problems in the Earth and planetary sciences.

  9. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.

  10. Reactive Flow Modeling of Liquid Explosives via ALE3D/Cheetah Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuo, I W; Bastea, S; Fried, L E

    2010-03-10

    We carried out reactive flow simulations of liquid explosives such as nitromethane using the hydrodynamic code ALE3D coupled with equations of state and reaction kinetics modeled by the thermochemical code Cheetah. The simulation set-up was chosen to mimic cylinder experiments. For pure unconfined nitromethane we find that the failure diameter and detonation velocity dependence on charge diameter are in agreement with available experimental results. Such simulations are likely to be useful for determining detonability and failure behavior for a wide range of experimental conditions and explosive compounds.

  11. Instantons on ALE spaces and orbifold partitions

    NASA Astrophysics Data System (ADS)

    Dijkgraaf, Robbert; Sułkowski, Piotr

    2008-03-01

    We consider 𝒩 = 4 theories on ALE spaces of A_{k-1} type. As is well known, their partition functions coincide with A_{k-1} affine characters. We show that these partition functions are equal to the generating functions of some peculiar classes of partitions, which we introduce under the name 'orbifold partitions'. These orbifold partitions turn out to be related to the generalized Frobenius partitions introduced by G. E. Andrews some years ago. We relate the orbifold partitions to the blended partitions and interpret them explicitly in terms of a free fermion system.
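
    As background for the generating-function language used above (an aside, not a result of the paper), the prototype identity is Euler's product formula for ordinary partitions; the orbifold partitions are counted by refinements of generating functions of this type.

      % Euler's generating function for the number p(n) of partitions of n
      \sum_{n \ge 0} p(n)\, q^{n} \;=\; \prod_{m=1}^{\infty} \frac{1}{1 - q^{m}}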

  12. Developing a Learning Algorithm-Generated Empirical Relaxer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Wayne; Kallman, Josh; Toreja, Allen

    2016-03-30

    One of the main difficulties when running Arbitrary Lagrangian-Eulerian (ALE) simulations is determining how much to relax the mesh during the Eulerian step. This determination is currently made by the user on a simulation-by-simulation basis. We present a Learning Algorithm-Generated Empirical Relaxer (LAGER), which uses a regression random forest algorithm to automate this decision process. We also demonstrate that LAGER successfully relaxes a variety of test problems, maintains simulation accuracy, and has the potential to significantly decrease both the person-hours and the computational hours needed to run a successful ALE simulation.
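
    As a hedged illustration of the LAGER idea (the features and training data below are invented for this sketch, not taken from the report), a random-forest regressor can map local zone metrics to a relaxation amount:

      # Hypothetical sketch: learn a mesh-relaxation weight from zone features.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      # invented per-zone features: [aspect ratio, skewness, compression rate]
      X = rng.uniform(0.0, 1.0, size=(5000, 3))
      # invented 'expert' rule: relax more when zones are skewed and compressing
      y = np.clip(0.8 * X[:, 1] + 0.5 * X[:, 2] * (X[:, 0] > 0.7), 0.0, 1.0)

      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
      zone = np.array([[0.9, 0.7, 0.8]])         # a badly distorted zone
      print("suggested relaxation weight:", model.predict(zone)[0])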

  13. TGF-Beta Antibody for Prostate Cancer: Role of ERK

    DTIC Science & Technology

    2012-07-01

    Herbal medicine has been used either as a major medication or as a supplement for cancer prevention or treatment. For Western blot analysis, cell lysates were prepared using cell lysis buffer (Cell Signaling, Danvers, MA) supplemented with 1 mM PMSF and 1% protease inhibitor. Negative controls were identical array sections stained in the absence of primary antibody.

  14. A Bioinformatics Workflow for Variant Peptide Detection in Shotgun Proteomics*

    PubMed Central

    Li, Jing; Su, Zengliu; Ma, Ze-Qiang; Slebos, Robbert J. C.; Halvey, Patrick; Tabb, David L.; Liebler, Daniel C.; Pao, William; Zhang, Bing

    2011-01-01

    Shotgun proteomics data analysis usually relies on database search. However, commonly used protein sequence databases do not contain information on protein variants and thus prevent variant peptides and proteins from being identified. Incorporating known coding variations into protein sequence databases could help alleviate this problem. Based on our recently published human Cancer Proteome Variation Database, we have created a protein sequence database that comprehensively annotates thousands of cancer-related coding variants collected in the Cancer Proteome Variation Database as well as noncancer-specific ones from the Single Nucleotide Polymorphism Database (dbSNP). Using this database, we then developed a data analysis workflow for variant peptide identification in shotgun proteomics. The high risk of false-positive variant identifications was addressed by a modified false discovery rate estimation method. Analysis of colorectal cancer cell lines SW480, RKO, and HCT-116 revealed a total of 81 peptides that contain either noncancer-specific or cancer-related variations. Twenty-three out of 26 variants randomly selected from the 81 were confirmed by genomic sequencing. We further applied the workflow to data sets from three individual colorectal tumor specimens. A total of 204 distinct variant peptides were detected, and five carried known cancer-related mutations. Each individual showed a specific pattern of cancer-related mutations, suggesting the potential use of this type of information for personalized medicine. Compatibility of the workflow has been tested with four popular database search engines: Sequest, Mascot, X!Tandem, and MyriMatch. In summary, we have developed a workflow that effectively uses existing genomic data to enable variant peptide detection in proteomics. PMID:21389108
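
    The baseline that such a workflow modifies is standard target-decoy FDR estimation, which can be sketched as follows (synthetic scores; the paper's variant-specific modification is not reproduced here):

      # Target-decoy FDR sketch: decoy hits above a score threshold estimate
      # the number of false target hits above the same threshold.
      import numpy as np

      rng = np.random.default_rng(7)
      target = np.concatenate([rng.normal(3.0, 1.0, 800),     # true matches
                               rng.normal(0.0, 1.0, 1200)])   # false matches
      decoy = rng.normal(0.0, 1.0, 2000)                      # decoy scores

      def fdr_at(threshold):
          t = np.sum(target >= threshold)
          d = np.sum(decoy >= threshold)
          return d / max(t, 1)

      for thr in (1.0, 2.0, 3.0):
          print(f"score >= {thr}: {np.sum(target >= thr)} PSMs, "
                f"est. FDR = {fdr_at(thr):.3f}")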

  15. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce the time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.

  16. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    NASA Astrophysics Data System (ADS)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from various runs of geoKepler workflows. The communication between iPython and Kepler workflow executions is established through an iPython magic function for Kepler that we have implemented. In summary, geoKepler is an ecosystem that makes geospatial processing and analysis of any kind programmable, reusable, scalable and sharable.

  17. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Addressing informatics challenges in Translational Research with workflow technology.

    PubMed

    Beaulah, Simon A; Correll, Mick A; Munro, Robin E J; Sheldon, Jonathan G

    2008-09-01

    Interest in Translational Research has been growing rapidly in recent years. In this collision of different data, technologies and cultures lie tremendous opportunities for the advancement of science and business for organisations that are able to integrate, analyse and deliver this information effectively to users. Workflow-based integration and analysis systems are becoming recognised as a fast and flexible way to build applications that are tailored to scientific areas, yet are built on a common platform. Workflow systems are allowing organisations to meet the key informatics challenges in Translational Research and improve disease understanding and patient care.

  19. Data processing workflows from low-cost digital survey to various applications: three case studies of Chinese historic architecture

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Cao, Y. K.

    2015-08-01

    The paper focuses on the versatility of data processing workflows ranging from BIM-based survey to structural analysis and reverse modeling. In China nowadays, a large number of historic buildings are in need of restoration, reinforcement and renovation, but architects are not prepared for the conversion from the booming AEC industry to architectural preservation. As surveyors working with architects in such projects, we have to develop an efficient low-cost digital survey workflow that is robust to various types of architecture, and to process the captured data for architects. Although laser scanning yields high accuracy in architectural heritage documentation and the workflow is quite straightforward, its cost and portability hinder it from being used in projects where budget and efficiency are of prime concern. We integrate Structure from Motion techniques with UAV and total station in data acquisition. The captured data are processed for various purposes, illustrated with three case studies: the first is an as-built BIM for a historic building based on point clouds registered according to Ground Control Points; the second concerns structural analysis for a damaged bridge using Finite Element Analysis software; the last relates to parametric automated feature extraction from captured point clouds for reverse modeling and fabrication.

  20. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using a digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank-sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, which was significantly shorter at 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
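
    For reference, a comparison of this kind can be reproduced in outline with SciPy's Wilcoxon rank-sum test; the two time samples below are hypothetical illustrations, not the trial data:

        from scipy.stats import ranksums

        # Hypothetical total production times in minutes (not the trial data)
        digital      = [182.1, 190.4, 160.3, 201.7, 175.9]
        conventional = [220.5, 247.2, 199.8, 230.1, 215.4]

        stat, p = ranksums(digital, conventional)
        print(f"rank-sum statistic = {stat:.2f}, P = {p:.4f}")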

  1. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorsselaer, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited in the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated for their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for the detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, testing new algorithms for label-free quantitative analysis, or evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
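
    Given such a spiked design, the performance metrics reduce to simple set operations: proteins flagged as differential are true positives if they belong to the UPS1 spike and false positives if they come from the yeast background. A minimal sketch with hypothetical protein identifiers:

        def benchmark(flagged, ups1_proteins):
            """Sensitivity and false discovery rate for a list of proteins
            flagged as differential, given the known UPS1 ground truth."""
            flagged, ups1 = set(flagged), set(ups1_proteins)
            tp = len(flagged & ups1)                 # UPS1 proteins found
            fp = len(flagged - ups1)                 # yeast proteins found
            sensitivity = tp / len(ups1)
            fdr = fp / len(flagged) if flagged else 0.0
            return sensitivity, fdr

        # Hypothetical: 3 of 4 spiked proteins recovered, 1 yeast false positive
        print(benchmark(["P01", "P02", "P03", "Y99"], ["P01", "P02", "P03", "P04"]))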

  2. The Influence of Judgment Calls on Meta-Analytic Findings.

    PubMed

    Tarrahi, Farid; Eisend, Martin

    2016-01-01

    Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings, calling their robustness into question. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.

  3. KDE Bioscience: platform for bioinformatics analysis workflows.

    PubMed

    Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue

    2006-08-01

    Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards, combined with unfriendly interfaces, makes it difficult for biologists to learn how to use these tools and to translate data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a Java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several in-house and third-party viewers are built in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode, where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from its integrated algorithms and data sources. With its generic workflow mechanism, other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research.

  4. pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    White, J.; Brakefield, L. K.

    2015-12-01

    The null-space Monte Carlo technique is a non-linear uncertainty analysis technique that is well suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source Python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a Python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in Python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease of use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
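
    The essence of the null-space projection that such a workflow automates can be sketched in a few lines of NumPy: random parameter vectors are projected onto the null space of the model Jacobian, so each realization leaves the (linearized) fit to observations unchanged. This is a conceptual sketch, not pyNSMC's API:

        import numpy as np

        def nsmc_realizations(jacobian, calibrated, n_real, sd=1.0, seed=0):
            """Draw parameter realizations in the null space of `jacobian`
            (n_obs x n_par) around the calibrated parameter vector."""
            rng = np.random.default_rng(seed)
            _, s, Vt = np.linalg.svd(jacobian)
            tol = max(jacobian.shape) * np.finfo(float).eps * s.max()
            rank = int(np.sum(s > tol))
            null_basis = Vt[rank:].T                  # n_par x (n_par - rank)
            draws = rng.normal(0.0, sd, size=(null_basis.shape[1], n_real))
            # Each column is one realization with an unchanged linearized fit
            return calibrated[:, None] + null_basis @ draws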

  5. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    PubMed

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large-scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism-scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  6. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreher, M.; Peterka, T.

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.

  7. Structured recording of intraoperative surgical workflows

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.

    2006-03-01

    Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions, and for their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps at different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. The core of the architecture is our Surgical Workflow editor, which is intended to handle the manifold, complex and concurrent relations during an intervention. Furthermore, we show a method for the automatic generation of graphs that display the recorded surgical work steps of the interventions. Finally, we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions of 6 intervention types from 3 different surgical disciplines: ENT surgery, neurosurgery and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany and at the Georgetown University Hospital, Washington, D.C., USA.

  8. Trial Sequential Analysis in systematic reviews with meta-analysis.

    PubMed

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-03-06

    Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D²) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated that the Trial Sequential Analysis provides better control of type I errors and of type II errors than the traditional naïve meta-analysis. Trial Sequential Analysis represents analysis of meta-analytic data, with transparent assumptions, and better control of type I and type II errors than the traditional meta-analysis using naïve unadjusted confidence intervals.
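
    To make the diversity adjustment concrete, here is a small sketch (my own, built on the conventional two-proportion sample-size formula, not the authors' software) of a diversity-adjusted required information size: the conventional required sample size is inflated by 1/(1 - D²). All parameter values are hypothetical:

        from scipy.stats import norm

        def required_information_size(p_ctrl, rrr, alpha=0.05, beta=0.20, d2=0.0):
            """Total participants needed in a meta-analysis of a binary outcome,
            inflated for between-trial diversity D^2 (0 <= d2 < 1)."""
            p_exp = p_ctrl * (1.0 - rrr)              # anticipated intervention risk
            z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(1 - beta)
            n_arm = ((z_a + z_b) ** 2
                     * (p_ctrl * (1 - p_ctrl) + p_exp * (1 - p_exp))
                     / (p_ctrl - p_exp) ** 2)
            return 2 * n_arm / (1.0 - d2)             # diversity adjustment factor

        # Hypothetical: 10% control risk, 20% relative risk reduction, D^2 = 0.25
        print(round(required_information_size(0.10, 0.20, d2=0.25)))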

  9. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, C; Wessels, B; Hamilton, H

    2014-06-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) with traditional QA requirements, a workflow for a proton therapy treatment course was set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of the QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed, and the computer hardware/software and network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow for a PT treatment course. Our hybrid model combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and design of a QA program in PT.

  10. Introducing W.A.T.E.R.S.: a workflow for the alignment, taxonomy, and ecology of ribosomal sequences.

    PubMed

    Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A

    2010-06-12

    For more than two decades microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene, also known as 16S rRNA, is encoded by ribosomal DNA, 16S rDNA, and has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach for 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, phylogenetic tree construction, as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented 'workflow' approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system. Furthermore, new "actors" can be added to the workflow as desired and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.

  11. Retroperitoneal fibrosis: a report of 12 cases

    PubMed Central

    Majdoub, Aziz El; Khallouk, Abdelhak; Farih, Moulay Hassan

    2017-01-01

    Retroperitoneal fibrosis (RPF) is a rare disease. It is characterized by the progressive transformation of the retroperitoneal adipose tissue into a fibrous mass that encases the aorta, the inferior vena cava and the urinary tract, causing progressive impairment of renal function. The usual presentation of this disease combines lumbar pain, renal failure and a biological inflammatory syndrome. We report 12 cases of retroperitoneal fibrosis, detailing their clinical, radiological and therapeutic features. This is a retrospective study of twelve cases of retroperitoneal fibrosis collected in the urology department of the Hassan II University Hospital of Fès over a period of 9 years (2005-2013). The patients were ten men and two women. The clinical presentation was highly variable, dominated by lumbar pain, which was present in all patients, with a hydrocele in one patient. Laboratory investigations showed renal failure in all patients and an inflammatory syndrome in ten patients. The diagnosis was suspected in all cases on ultrasound findings, which showed obstruction of the upper urinary tract without a visible obstacle in all patients, and was confirmed by non-contrast abdominal CT, which showed a retroperitoneal tissue lesion encasing the vessels and the urinary tract. In our series, retroperitoneal fibrosis was idiopathic in nine cases, peri-aneurysmal in two patients, and post-radiotherapy in one patient. All patients underwent urinary drainage with a double-J ureteral stent. Seven patients received corticosteroid therapy. Clinical and biological improvement, with disappearance of pain and improvement of general condition, was observed in 6 patients. Through this study we confirmed the rarity of retroperitoneal fibrosis, the difficulty of its diagnosis, and the frequency of pain, inflammatory syndrome and renal failure. Non-contrast abdominal CT confirms the diagnosis. Urinary drainage is essential in most cases, and regular patient follow-up is necessary. PMID:29610632

  12. EPOS Data and Service Provision

    NASA Astrophysics Data System (ADS)

    Bailo, Daniele; Jeffery, Keith G.; Atakan, Kuvvet; Harrison, Matt

    2017-04-01

    EPOS is now in its implementation phase (IP) after a successful preparatory phase (PP). EPOS consists of essentially two components: one ICS (Integrated Core Services), representing the integrating ICT (Information and Communication Technology), and many TCS (Thematic Core Services), representing the scientific domains. The architecture developed, demonstrated and agreed within the project during the PP is now being developed using co-design with the TCS teams and agile, spiral methods within the ICS team. The 'heart' of EPOS is the metadata catalog. This provides for the ICS a digital representation of the TCS assets (services, data, software, equipment, expertise…), thus facilitating access, interoperation and (re-)use. A major part of the work has been interactions with the TCS. The original intention to harvest information from the TCS required (and still requires) discussions to understand fully the TCS organisational structures linked with rights, security and privacy; their (meta)data syntax (structure) and semantics (meaning); their workflows and methods of working; and the services offered. To complicate matters further, the TCS are each at varying stages of development, and the ICS design has to accommodate pre-existing, developing and expected future standards for metadata, data, software and processes. Through information documents, questionnaires and interviews/meetings, the EPOS ICS team has collected DDSS (Data, Data Products, Software and Services) information from the TCS. The ICS team developed a simplified metadata model for presentation to the TCS, and will perform the mapping and conversion from this model to the internal detailed technical metadata model using CERIF (an EU recommendation to Member States, maintained, developed and promoted by euroCRIS, www.eurocris.org). At the time of writing, the final modifications of the EPOS metadata model are being made and the mappings to CERIF designed, prior to the main phase of (meta)data collection into the EPOS metadata catalog. In parallel, work proceeds on the user interface software, the APIs (Application Programming Interfaces) to the TCS services, the harvesting method and software, the AAAI (Authentication, Authorisation, Accounting Infrastructure) and the system manager. The next steps will involve interfaces to ICS-D (Distributed ICS, i.e. facilities and services for computing, data storage, detectors and instruments for data collection etc.), to which requests, software and data will be deployed and from which data will be generated. Associated with this will be the development of the workflow system, which will assist the end-user in building a workflow to achieve the scientific objectives.

  13. Evaluating the Quality of Evidence from a Network Meta-Analysis

    PubMed Central

    Salanti, Georgia; Del Giovane, Cinzia; Chaimani, Anna; Caldwell, Deborah M.; Higgins, Julian P. T.

    2014-01-01

    Systematic reviews that collate data about the relative effects of multiple interventions via network meta-analysis are highly informative for decision-making purposes. A network meta-analysis provides two types of findings for a specific outcome: the relative treatment effect for all pairwise comparisons, and a ranking of the treatments. It is important to consider the confidence with which these two types of results can enable clinicians, policy makers and patients to make informed decisions. We propose an approach to determining confidence in the output of a network meta-analysis. Our proposed approach is based on methodology developed by the Grading of Recommendations Assessment, Development and Evaluation (GRADE) Working Group for pairwise meta-analyses. The suggested framework for evaluating a network meta-analysis acknowledges (i) the key role of indirect comparisons; (ii) the contributions of each piece of direct evidence to the network meta-analysis estimates of effect size; (iii) the importance of the transitivity assumption to the validity of network meta-analysis; and (iv) the possibility of disagreement between direct evidence and indirect evidence. We apply our proposed strategy to a systematic review comparing topical antibiotics without steroids for chronically discharging ears with underlying eardrum perforations. The proposed framework can be used to determine confidence in the results from a network meta-analysis. Judgements about evidence from a network meta-analysis can be different from those made about evidence from pairwise meta-analyses. PMID:24992266

  14. Forced folding in a salty basin: Gada'-Ale in the Afar

    NASA Astrophysics Data System (ADS)

    Rafflin, Victoria; Hetherington, Rachel; Hagos, Miruts; van Wyk de Vries, Benjamin

    2017-04-01

    The Gada'-Ale Volcano in the Danakil Depression of Ethiopia is a curious shield-like, or flat dome-like, volcanic centre in the Afar Rift. It has several fissure eruptions seen on its mid and lower flanks. It has an even more curious ring structure on its western side that has been interpreted as a salt diapir. The complex lies in the central part of the basin, where there are 1-2 km-thick salt deposits. The area was active in the 1990s (Amelung et al. 2000) with no eruptive activity, but a possible intrusion. There was also an intrusion north of Gada'-Ale at Dallol in 2005 (Nobile et al. 2012). Using Google Earth imagery, we have mapped the volcano, and note that: a) the main edifice has a thin skin of lava lying on light-coloured rock; b) these thin deposits are sliding down the flank of the volcano and thrusting at the base, breaking into detached plates as they go. The light colour of the deposits, and the ability of the rock to slide on them, suggest that they are salt. Fractures on and around the volcano form curved patterns around raised areas several kilometres in diameter. These could be surface expressions of shallow sills. Putting the observations together with the known geology of adjacent centres like Dallol and Alu, we suggest that Gada'-Ale is a forced fold, created over a sill that has either bulged into a laccolith or risen as a saucer-shaped sill. The upraised salt has caused the thin veneer of volcanics to slide off. That there are eruptive fissures on Gada'-Ale and possible sill intrusions around the base suggests that the centre lies over a complex of sills that have gradually intruded and bulged the structure to its present level. Eruptions have contributed only a small amount to the overall topography of the edifice. We hope to visit the volcano in March and will bring hot-off-the-press details back to EGU!

  15. Mathematical Modeling and Analysis of Mass Spectrometry Data in Workflows for the Discovery of Biomarkers in Breast Cancer

    DTIC Science & Technology

    2008-07-01

    Report metadata only: Principal Investigator Vladimir Fokin, Ph.D.; Grant Number W81XWH-07-1-0447.

  16. Retinal compensatory changes after light damage in albino mice

    PubMed Central

    Montalbán-Soler, Luis; Alarcón-Martínez, Luis; Jiménez-López, Manuel; Salinas-Navarro, Manuel; Galindo-Romero, Caridad; Bezerra de Sá, Fabrízio; García-Ayuso, Diego; Avilés-Trigueros, Marcelino; Vidal-Sanz, Manuel; Agudo-Barriuso, Marta

    2012-01-01

    Purpose To investigate the anatomic and functional changes triggered by light exposure in the albino mouse retina and compare them with those observed in the albino rat. Methods BALB/c albino mice were exposed to 3,000 lx of white light during 24 h and their retinas analyzed from 1 to 180 days after light exposure (ALE). Left pupil mydriasis was induced with topical atropine. Retinal function was analyzed by electroretinographic (ERG) recording. To assess retinal degeneration, hematoxylin and eosin staining, the TdT-mediated dUTP nick-end labeling (TUNEL) technique, and quantitative immunohistofluorescence for synaptophysin and protein kinase Cα (PKCα) were used in cross sections. Intravenous injection of horseradish peroxidase and Fluoro-Gold™ tracing were used in whole-mounted retinas to study the retinal vasculature and the retinal ganglion cell (RGC) population, respectively. Results Light exposure caused apoptotic photoreceptor death in the central retina. This death was more severe in the dorsal than in the ventral retina, sparing the periphery. Neither retinal vascular leakage nor retinal ganglion cell death was observed ALE. The electroretinographic a-wave was permanently impaired, while the b-wave decreased but recovered gradually by 180 days ALE. The scotopic threshold responses, associated with the inner retinal function, diminished at first but recovered completely by 14 days ALE. This functional recovery was concomitant with the upregulation of protein kinase Cα and synaptophysin. Similar results were obtained in both eyes, irrespective of mydriasis. Conclusions In albino mice, light exposure induces substantial retinal damage, but the surviving photoreceptors, together with compensatory morphological/molecular changes, allow an important restoration of the retinal function. PMID:22509098

  17. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis

    PubMed Central

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
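
    As a pointer to the underlying mathematics, the bivariate model mentioned above (in the form commonly attributed to Reitsma and colleagues) places a binomial likelihood within each study and a bivariate normal model on the logit-transformed sensitivities and specificities across studies; a compact statement in LaTeX, with notation chosen here for illustration:

        % Within study i: binomial counts of true positives and true negatives
        y_{A i} \sim \mathrm{Bin}\!\left(n_{A i},\ \operatorname{logit}^{-1}\mu_{A i}\right), \qquad
        y_{B i} \sim \mathrm{Bin}\!\left(n_{B i},\ \operatorname{logit}^{-1}\mu_{B i}\right)
        % Across studies: correlated random effects on the logit scale
        \begin{pmatrix}\mu_{A i}\\ \mu_{B i}\end{pmatrix} \sim
        N\!\left(\begin{pmatrix}\mu_{A}\\ \mu_{B}\end{pmatrix},
        \begin{pmatrix}\sigma_{A}^{2} & \sigma_{AB}\\ \sigma_{AB} & \sigma_{B}^{2}\end{pmatrix}\right)

    Here mu_A is the mean logit sensitivity, mu_B the mean logit specificity, and the off-diagonal covariance captures the threshold-driven trade-off between the two.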

  18. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian P T

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
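
    The one-stage model the authors describe can be written compactly; for a two-arm meta-analysis with events y_ij out of n_ij participants in arm j of trial i and treatment indicator x_ij, a random-effects version (my sketch of the standard formulation, not a quotation from the paper) is:

        y_{ij} \sim \mathrm{Bin}\!\left(n_{ij},\ p_{ij}\right), \qquad
        \operatorname{logit}(p_{ij}) = \alpha_i + (\theta + u_i)\, x_{ij}, \qquad
        u_i \sim N(0, \tau^2)

    with trial-specific intercepts alpha_i, pooled log odds ratio theta, and between-trial heterogeneity tau²; fitting this maximises the exact binomial likelihood rather than assuming normally distributed effect estimates, which is the benefit highlighted in the abstract.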

  19. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves executing large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case: a workflow which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the model runs, this workflow also contains data format conversion tasks that link the ice flow and calving steps through sequential, nested and iterative stages. The management and monitoring of all the processing tasks, including data management and transfer, therefore becomes complex. From the implementation perspective, this workflow was initially developed as a set of scripts using static data input and output references. As more scripts and modifications were introduced to meet user requirements, debugging and validation of results became increasingly cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all of the above processes can be carried out in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for the coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes the reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
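
    The iterative coupling the workflow encodes can be pictured as the following loop; the executable names, input files and conversion scripts here are hypothetical placeholders, and in the actual system each step is a UNICORE-managed HPC job rather than a local subprocess call:

        import subprocess

        N_CYCLES = 10  # hypothetical number of coupling iterations

        for cycle in range(N_CYCLES):
            # Continuum ice-flow step (Elmer/Ice); solver input file is hypothetical
            subprocess.run(["ElmerSolver", "ice_flow.sif"], check=True)
            # Convert the resulting geometry/velocity field for the calving model
            subprocess.run(["python", "elmer_to_hidem.py"], check=True)
            # Discrete-element calving and crevassing step (HiDEM; invocation hypothetical)
            subprocess.run(["./hidem", "calving.inp"], check=True)
            # Feed the post-calving geometry back into the ice-flow model
            subprocess.run(["python", "hidem_to_elmer.py"], check=True)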

  20. Longitudinal analysis of meta-analysis literatures in the database of ISI Web of Science.

    PubMed

    Zhu, Changtai; Jiang, Ting; Cao, Hao; Sun, Wenguang; Chen, Zhong; Liu, Jinming

    2015-01-01

    Meta-analysis is regarded as an important source of evidence for scientific decision-making. The ISI Web of Science database collects a large number of high-quality publications, including meta-analyses. Understanding the general characteristics of the meta-analysis literature is therefore important for outlining its perspective. In the present study, we summarized and clarified some features of these publications in the ISI Web of Science database. We retrieved the meta-analysis publications in the ISI Web of Science database, including SCI-E, SSCI, A&HCI, CPCI-S, CPCI-SSH, CCR-E, and IC. The annual growth rate, literature category, language, funding, citation counts, agencies and countries/territories of the meta-analysis publications were analyzed. A total of 95,719 records, which account for 0.38% (99% CI: 0.38%-0.39%) of all publications, were found in the database. From 1997 to 2012, the annual growth rate of meta-analysis publications was 18.18%. The publications spanned many categories, languages, funding sources, citation levels, publishing agencies, and countries/territories. Interestingly, the citation frequencies of meta-analyses were significantly higher than those of other publication types such as multi-centre studies, randomized controlled trials, cohort studies, case-control studies, and case reports (P<0.0001). The increasing numbers, extensive global influence and high citation rates show that meta-analysis has become increasingly prominent in recent years. In the future, to promote the validity of meta-analyses, the CONSORT and PRISMA standards should be promoted further in the field of evidence-based medicine.
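
    For clarity, an annual growth rate such as the 18.18% reported for 1997-2012 is a compound rate over the 15-year span; the start and end counts below are hypothetical, chosen only to illustrate the arithmetic:

        # Hypothetical counts; only the arithmetic is the point here
        n_1997, n_2012 = 500, 6125
        years = 2012 - 1997
        cagr = (n_2012 / n_1997) ** (1 / years) - 1
        print(f"annual growth rate = {cagr:.2%}")   # prints: annual growth rate = 18.18%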

  1. GEOGLE: context mining tool for the correlation between gene expression and the phenotypic distinction.

    PubMed

    Yu, Yao; Tu, Kang; Zheng, Siyuan; Li, Yun; Ding, Guohui; Ping, Jie; Hao, Pei; Li, Yixue

    2009-08-25

    In the post-genomic era, the development of high-throughput gene expression detection technology provides huge amounts of experimental data, which challenges traditional pipelines for data processing and analysis in scientific research. In our work, we integrated gene expression information from Gene Expression Omnibus (GEO), biomedical ontology from Medical Subject Headings (MeSH) and signaling pathway knowledge from sigPathway entries to develop a context mining tool for gene expression analysis - GEOGLE. GEOGLE offers a rapid and convenient way to search for relevant experimental datasets, pathways and biological terms according to multiple types of queries, including biomedical vocabularies, GDS IDs, gene IDs, pathway names and signature lists. Moreover, GEOGLE summarizes the signature genes from a subset of GDSes and estimates the correlation between gene expression and the phenotypic distinction with an integrated p value. This approach, performing a global search of expression data, may expand the traditional way of collecting heterogeneous gene expression experiment data. GEOGLE is a novel tool that provides researchers a quantitative way to understand the correlation between gene expression and phenotypic distinction through meta-analysis of gene expression datasets from different experiments, as well as the biological meaning behind it. The web site and user guide of GEOGLE are available at: http://omics.biosino.org:14000/kweb/workflow.jsp?id=00020.
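
    The abstract does not specify how GEOGLE computes its integrated p value; one standard way to combine p values from multiple datasets is Fisher's method, available in SciPy. The per-dataset p values below are hypothetical:

        from scipy.stats import combine_pvalues

        # Hypothetical per-dataset p values for one gene across three GDS datasets
        stat, p_integrated = combine_pvalues([0.03, 0.20, 0.004], method="fisher")
        print(f"chi2 = {stat:.2f}, integrated P = {p_integrated:.4f}")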

  2. The Empirical Review of Meta-Analysis Published in Korea

    ERIC Educational Resources Information Center

    Park, Sunyoung; Hong, Sehee

    2016-01-01

    Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…

  3. Extraction of Structural Extracellular Polymeric Substances from Aerobic Granular Sludge

    PubMed Central

    Felz, Simon; Al-Zuhairy, Salah; Aarstad, Olav Andreas; van Loosdrecht, Mark C.M.; Lin, Yue Mei

    2016-01-01

    To evaluate and develop methodologies for the extraction of gel-forming extracellular polymeric substances (EPS), EPS from aerobic granular sludge (AGS) was extracted using six different methods (centrifugation, sonication, ethylenediaminetetraacetic acid (EDTA), formamide with sodium hydroxide (NaOH), formaldehyde with NaOH and sodium carbonate (Na2CO3) with heat and constant mixing). AGS was collected from a pilot wastewater treatment reactor. The ionic gel-forming property of the extracted EPS of the six different extraction methods was tested with calcium ions (Ca2+). From the six extraction methods used, only the Na2CO3 extraction could solubilize the hydrogel matrix of AGS. The alginate-like extracellular polymers (ALE) recovered with this method formed ionic gel beads with Ca2+. The Ca2+-ALE beads were stable in EDTA, formamide with NaOH and formaldehyde with NaOH, indicating that ALE are one part of the structural polymers in EPS. It is recommended to use an extraction method that combines physical and chemical treatment to solubilize AGS and extract structural EPS. PMID:27768085

  4. Calibration of 3D ALE finite element model from experiments on friction stir welding of lap joints

    NASA Astrophysics Data System (ADS)

    Fourment, Lionel; Gastebois, Sabrina; Dubourg, Laurent

    2016-10-01

    In order to support the design of a process as complex as Friction Stir Welding (FSW) for the aeronautic industry, numerical simulation software requires (1) developing an efficient and accurate Finite Element (F.E.) formulation that allows predicting welding defects, (2) properly modeling the thermo-mechanical complexity of the FSW process and (3) calibrating the F.E. model against accurate measurements from FSW experiments. This work uses a parallel ALE formulation developed in the Forge® F.E. code to model the different possible defects (flashes and worm holes), while pin and shoulder threads are modeled by a new friction law at the tool/material interface. The FSW experiments require a complex tool with a scrolled shoulder, which is instrumented to provide sensitive thermal data close to the joint. Calibration of the unknown material thermal coefficients, constitutive equation parameters and friction model from measured forces, torques and temperatures is carried out using two F.E. models, Eulerian and ALE, to reach a satisfactory agreement, assessed by the proper sensitivity of the simulation to process parameters.
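
    Calibration of this kind is, in outline, a least-squares fit of model parameters to the measured forces, torques and temperatures. The sketch below uses SciPy with a toy stand-in for the F.E. run; in the real workflow the `simulate` call would be a Forge® simulation, and all numbers are hypothetical:

        import numpy as np
        from scipy.optimize import least_squares

        def simulate(theta):
            """Toy stand-in for an F.E. run: maps (friction coefficient,
            thermal coefficient) to (force, torque, temperature)."""
            mu, k = theta
            return np.array([120.0 * mu + 3.0 * k,
                             45.0 * mu + 0.5 * k,
                             300.0 + 80.0 * k])

        observed = np.array([75.0, 26.0, 420.0])        # hypothetical measurements

        def residuals(theta):
            return (simulate(theta) - observed) / observed   # relative misfit

        fit = least_squares(residuals, x0=[0.3, 1.0], bounds=([0.0, 0.0], [1.0, 5.0]))
        print("calibrated parameters:", fit.x)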

  5. History of the pharmacies in the town of Aleşd, Bihor county

    PubMed Central

    PAŞCA, MANUELA BIANCA; GÎTEA, DANIELA; MOISA, CORINA

    2013-01-01

    In 1848 the pharmacist Horváth Mihály established the first pharmacy in Aleşd, called Speranţa (Remény). Tracing the history of this pharmacy, we note that in 1874 it came into the possession of Kocsiss József. In 1906 the personal rights of the pharmacy were transferred to Kocsiss Béla, and in 1938 his son, the pharmacist Kocsiss Dezső, became the new owner. In 1949 the pharmacy was nationalized and became the property of the Pharmaceutical Office Oradea; renamed Farmacia nr. 22 of Aleşd, it continued its activity throughout the communist period. In 1991 it entered private ownership as Angefarm, the property of the pharmacist Mermeze Gheorghe, and since 2003 it has operated under the name Vitalogy 3, owned by Ghitea Sorin. A second pharmacy, Sfântul Anton, was founded in 1937 by the pharmacist Herceg Dobreanu Atena, but it did not continue through the communist period. PMID:26527963

  6. Adding results to a meta-analysis: Theory and example

    NASA Astrophysics Data System (ADS)

    Willson, Victor L.

    Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research on the correlation between science education attitudes and achievement has been published since a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
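
    The Bayesian updating step can be made concrete for any approximately normal effect estimate (such as a correlation after transformation): treating the prior meta-analytic mean as a normal prior and the new study as a normal likelihood, the posterior is precision-weighted. A minimal sketch with hypothetical numbers:

        def bayes_update(prior_mean, prior_var, new_mean, new_var):
            """Conjugate normal update: combine a meta-analytic prior with a
            new study estimate by precision (inverse-variance) weighting."""
            w0, w1 = 1.0 / prior_var, 1.0 / new_var
            post_mean = (w0 * prior_mean + w1 * new_mean) / (w0 + w1)
            post_var = 1.0 / (w0 + w1)
            return post_mean, post_var

        # Hypothetical: prior effect 0.30 (var 0.002); new study 0.40 (var 0.008)
        print(bayes_update(0.30, 0.002, 0.40, 0.008))   # posterior mean 0.32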

  7. An integrated workflow for analysis of ChIP-chip data.

    PubMed

    Weigelt, Karin; Moehle, Christoph; Stempfl, Thomas; Weber, Bernhard; Langmann, Thomas

    2008-08-01

    Although ChIP-chip is a powerful tool for genome-wide discovery of transcription factor target genes, the steps involving raw data analysis, identification of promoters, and correlation with binding sites are still laborious processes. Therefore, we report an integrated workflow for the analysis of promoter tiling arrays with the Genomatix ChipInspector system. We compare this tool with open-source software packages to identify PU.1 regulated genes in mouse macrophages. Our results suggest that ChipInspector data analysis, comparative genomics for binding site prediction, and pathway/network modeling significantly facilitate and enhance whole-genome promoter profiling to reveal in vivo sites of transcription factor-DNA interactions.

  8. A genetic meta-algorithm-assisted inversion approach: hydrogeological study for the determination of volumetric rock properties and matrix and fluid parameters in unsaturated formations

    NASA Astrophysics Data System (ADS)

    Szabó, Norbert Péter

    2018-03-01

    An evolutionary inversion approach is suggested for the interpretation of nuclear and resistivity logs measured by direct-push tools in shallow unsaturated sediments. The efficiency of formation evaluation is improved by estimating simultaneously (1) the petrophysical properties that vary rapidly along a drill hole with depth and (2) the zone parameters that can be treated as constant, in one inversion procedure. In the workflow, the fractional volumes of water, air, matrix and clay are estimated in adjacent depths by linearized inversion, whereas the clay and matrix properties are updated using a float-encoded genetic meta-algorithm. The proposed inversion method provides an objective estimate of the zone parameters that appear in the tool response equations applied to solve the forward problem, which can significantly increase the reliability of the petrophysical model as opposed to setting these parameters arbitrarily. The global optimization meta-algorithm not only assures the best fit between the measured and calculated data but also gives a reliable solution, practically independent of the initial model, as laboratory data are unnecessary in the inversion procedure. The feasibility test uses engineering geophysical sounding logs observed in an unsaturated loessy-sandy formation in Hungary. The multi-borehole extension of the inversion technique is developed to determine the petrophysical properties and their estimation errors along a profile of drill holes. The genetic meta-algorithmic inversion method is recommended for hydrogeophysical logging applications of various kinds to automatically extract the volumetric ratios of rock and fluid constituents as well as the most important zone parameters in a reliable inversion procedure.
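
    As an illustration of the float-encoded genetic step, the sketch below evolves a small population of zone-parameter vectors against a misfit function; the misfit here is a toy quadratic standing in for the data residual returned by the linearized inversion, and all settings and bounds are hypothetical:

        import numpy as np

        rng = np.random.default_rng(0)
        TARGET = np.array([2.65, 0.05, 1.90])       # toy "true" zone parameters

        def misfit(zp):
            # Stand-in for the data misfit of the linearized inversion
            return float(np.sum((zp - TARGET) ** 2))

        def float_ga(bounds, pop=40, gens=60, mut=0.05):
            lo, hi = bounds[:, 0], bounds[:, 1]
            X = rng.uniform(lo, hi, size=(pop, len(lo)))
            for _ in range(gens):
                order = np.argsort([misfit(x) for x in X])
                parents = X[order[: pop // 2]]                   # truncation selection
                i = rng.integers(0, len(parents), size=(2, pop))
                w = rng.uniform(0, 1, size=(pop, 1))
                X = w * parents[i[0]] + (1 - w) * parents[i[1]]  # arithmetic crossover
                X += rng.normal(0, mut, X.shape) * (hi - lo)     # Gaussian mutation
                X = np.clip(X, lo, hi)
            return min(X, key=misfit)

        bounds = np.array([[2.0, 3.0], [0.0, 0.2], [1.0, 3.0]])  # hypothetical ranges
        print(float_ga(bounds))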

  9. Octopus-toolkit: a workflow to automate mining of public epigenomic and transcriptomic next-generation sequencing data

    PubMed Central

    Kim, Taemook; Seo, Hogyu David; Hennighausen, Lothar; Lee, Daeyoup

    2018-01-01

    Octopus-toolkit is a stand-alone application for retrieving and processing large sets of next-generation sequencing (NGS) data in a single step. Octopus-toolkit is an automated set-up-and-analysis pipeline utilizing the Aspera, SRA Toolkit, FastQC, Trimmomatic, HISAT2, STAR, Samtools, and HOMER applications. All the applications are installed on the user's computer when the program starts. Upon installation, it can automatically retrieve the original files of various epigenomic and transcriptomic data sets, including ChIP-seq, ATAC-seq, DNase-seq, MeDIP-seq, MNase-seq and RNA-seq, from the Gene Expression Omnibus data repository. The downloaded files can then be sequentially processed to generate BAM and BigWig files, which are used for advanced analyses and visualization. Currently, it can process NGS data from popular model genomes such as human (Homo sapiens), mouse (Mus musculus), dog (Canis lupus familiaris), plant (Arabidopsis thaliana), zebrafish (Danio rerio), fruit fly (Drosophila melanogaster), worm (Caenorhabditis elegans), and budding yeast (Saccharomyces cerevisiae). With the processed files from Octopus-toolkit, meta-analyses of various data sets, motif searches for DNA-binding proteins, and the identification of differentially expressed genes and/or protein-binding sites can be easily conducted with a few commands. Overall, Octopus-toolkit facilitates the systematic and integrative analysis of available epigenomic and transcriptomic NGS big data. PMID:29420797

  10. Electronic problem lists: a thematic analysis of a systematic literature review to identify aspects critical to success.

    PubMed

    Hodge, Chad M; Narus, Scott P

    2018-05-01

    Problem list data is a driving force for many beneficial clinical tools, yet these data remain underutilized. We performed a systematic literature review, pulling insights from previous research, aggregating insights into themes, and distilling themes into actionable advice. We sought to learn what changes we could make to existing applications, to the clinical workflow, and to clinicians' perceptions that would improve problem list utilization and increase the prevalence of problems data in the electronic medical record. We followed Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines to systematically curate a corpus of pertinent articles. We performed a thematic analysis, looking for interesting excerpts and ideas. By aggregating excerpts from many authors, we gained broader, more inclusive insights into what makes a good problem list and what factors are conducive to its success. Analysis led to a list of 7 benefits of using the problem list, 15 aspects critical to problem list success, and knowledge to help inform policy development, such as consensus on what belongs on the problem list, who should maintain the problem list, and when. A list of suggestions is made on ways in which the problem list can be improved to increase utilization by clinicians. There is also a need for standard measurements of the problem list, so that lists can be measured, compared, and discussed with rigor and a common vocabulary.

  11. Radiology Workflow Dynamics: How Workflow Patterns Impact Radiologist Perceptions of Workplace Satisfaction.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum; Field, Aaron; Wiegmann, Douglas; Yu, John-Paul J

    2017-04-01

    The study aimed to assess perceptions of reading room workflow and the impact separating image-interpretive and nonimage-interpretive task workflows can have on radiologist perceptions of workplace disruptions, workload, and overall satisfaction. A 14-question survey instrument was developed to measure radiologist perceptions of workplace interruptions, satisfaction, and workload prior to and following implementation of separate image-interpretive and nonimage-interpretive reading room workflows. The results were collected over 2 weeks preceding the intervention and 2 weeks following the end of the intervention. The results were anonymized and analyzed using univariate analysis. A total of 18 people responded to the preintervention survey: 6 neuroradiology fellows and 12 attending neuroradiologists. Fifteen people who were then present for the 1-month intervention period responded to the postintervention survey. Perceptions of workplace disruptions, image interpretation, quality of trainee education, ability to perform nonimage-interpretive tasks, and quality of consultations (P < 0.0001) all improved following the intervention. Mental effort and workload also improved across all assessment domains, as did satisfaction with quality of image interpretation and consultative work. Implementation of parallel dedicated image-interpretive and nonimage-interpretive workflows may improve markers of radiologist perceptions of workplace satisfaction. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  12. Network meta-analysis: an introduction for clinicians.

    PubMed

    Rouse, Benjamin; Chaimani, Anna; Li, Tianjing

    2017-02-01

    Network meta-analysis is a technique for comparing multiple treatments simultaneously in a single analysis by combining direct and indirect evidence within a network of randomized controlled trials. Network meta-analysis may assist in assessing the comparative effectiveness of treatments regularly used in clinical practice and has therefore become attractive to clinicians. However, if proper caution is not taken in conducting and interpreting a network meta-analysis, inferences might be biased. The aim of this paper is to illustrate the process of network meta-analysis with the aid of a working example on first-line medical treatment for primary open-angle glaucoma. We discuss the key assumption of network meta-analysis, as well as the unique considerations involved in developing appropriate research questions, conducting the literature search, abstracting data, performing qualitative and quantitative synthesis, presenting results, drawing conclusions, and reporting the findings in a network meta-analysis.
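
    To make the central idea of combining direct and indirect evidence concrete, here is a minimal sketch under simplifying assumptions: an A-versus-C effect is estimated indirectly through a common comparator B and then pooled with the direct A-versus-C evidence by fixed-effect inverse-variance weighting. All effect sizes and standard errors below are hypothetical, and the effect scale (e.g., log odds ratio) is assumed.

      # Sketch of the direct/indirect logic at the core of network
      # meta-analysis; all numbers below are hypothetical.
      d_AB, se_AB = 0.30, 0.10   # direct A-vs-B effect (e.g., log odds ratio)
      d_BC, se_BC = 0.20, 0.12   # direct B-vs-C effect
      d_AC, se_AC = 0.55, 0.15   # direct A-vs-C effect

      # Indirect A-vs-C estimate via B: effects add, variances add.
      d_ind = d_AB + d_BC
      var_ind = se_AB**2 + se_BC**2

      # Pool direct and indirect estimates with inverse-variance weights;
      # comparing the two is also the basis of a consistency check.
      w_dir, w_ind = 1 / se_AC**2, 1 / var_ind
      d_pooled = (w_dir * d_AC + w_ind * d_ind) / (w_dir + w_ind)
      se_pooled = (w_dir + w_ind) ** -0.5

      print(f"indirect A-vs-C: {d_ind:.3f}, pooled: {d_pooled:.3f} "
            f"(SE {se_pooled:.3f})")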

  13. You and me and the computer makes three: variations in exam room use of the electronic health record

    PubMed Central

    Saleem, Jason J; Flanagan, Mindy E; Russ, Alissa L; McMullen, Carmit K; Elli, Leora; Russell, Scott A; Bennett, Katelyn J; Matthias, Marianne S; Rehman, Shakaib U; Schwartz, Mark D; Frankel, Richard M

    2014-01-01

    Challenges persist in how to effectively integrate the electronic health record (EHR) into patient visits and clinical workflow while maintaining patient-centered care. Our goal was to identify variations in, barriers to, and facilitators of the use of the US Department of Veterans Affairs (VA) EHR in ambulatory care workflow, in order to better understand how to integrate the EHR into clinical work. We observed and interviewed 20 ambulatory care providers across three geographically distinct VA medical centers. Analysis revealed several variations in, associated barriers to, and facilitators of EHR use corresponding to different units of analysis: computer interface, team coordination/workflow, and organizational. We discuss our findings in the context of these units of analysis and connect variations in EHR use to various barriers and facilitators. Findings from this study may help inform the design of the next generation of EHRs for the VA and other healthcare systems. PMID:24001517

  14. A data-independent acquisition workflow for qualitative screening of new psychoactive substances in biological samples.

    PubMed

    Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N

    2015-11-01

    Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly because of their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening because it provides sensitive, full-spectrum MS data. Our article presents a qualitative screening workflow based on a data-independent acquisition mode (all-ions MS/MS) with liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines fundamentals of target and suspect screening data-processing techniques in a structured algorithm, allowing the detection and tentative identification of NPS and their metabolites. We applied the workflow to two real case studies involving drug intoxications, in which we detected and confirmed the parent compounds ketamine, 25B-NBOMe, and 25C-NBOMe, as well as several predicted phase I and II metabolites not previously reported in urine and serum samples. The screening workflow demonstrates clear added value for the detection and identification of NPS in biological matrices.
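
    As an illustration of the suspect-screening step in a workflow of this kind (a sketch, not the authors' algorithm), measured accurate masses can be matched against a suspect list within a ppm tolerance. The suspect [M+H]+ values below are standard monoisotopic masses for the named compounds; the measured peaks and the 5 ppm tolerance are hypothetical.

      # Sketch of accurate-mass suspect screening: match measured m/z
      # values to a suspect list within a ppm tolerance.
      PPM_TOL = 5.0   # hypothetical tolerance

      suspects = {            # name -> theoretical [M+H]+ m/z
          "ketamine": 238.0993,
          "25B-NBOMe": 380.0856,
          "25C-NBOMe": 336.1361,
      }

      measured_peaks = [238.0990, 380.0870, 150.0450]   # hypothetical spectrum

      for mz in measured_peaks:
          for name, theo in suspects.items():
              ppm = (mz - theo) / theo * 1e6
              if abs(ppm) <= PPM_TOL:
                  print(f"{mz:.4f} matches {name} ({ppm:+.1f} ppm)")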

  15. The View from a Few Hundred Feet: A New Transparent and Integrated Workflow for UAV-collected Data

    NASA Astrophysics Data System (ADS)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor Earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project assessing the feasibility of detecting threshold levels of methane, carbon dioxide, and other aerosols with consumer-grade gas analysis sensors mounted on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.
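
    A minimal sketch of the threshold-detection step such a workflow might include, under stated assumptions: gas-specific limits are applied to logged sensor readings and exceedances are flagged. The thresholds, sample format, and readings below are hypothetical placeholders, not values from the project.

      # Sketch of flagging UAV gas-sensor readings above per-gas limits;
      # all thresholds and readings are hypothetical.
      THRESHOLDS_PPM = {"methane": 5.0, "co2": 1000.0}

      # (timestamp_s, gas, reading_ppm) tuples as a stand-in for sensor logs
      samples = [
          (0.0, "methane", 2.1),
          (1.0, "methane", 6.4),
          (1.0, "co2", 415.0),
      ]

      for t, gas, ppm in samples:
          limit = THRESHOLDS_PPM.get(gas)
          if limit is not None and ppm > limit:
              print(f"t={t:.1f}s: {gas} at {ppm} ppm exceeds {limit} ppm")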

  16. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…
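
    The statistical core this record alludes to can be shown in a few lines: per-study effect sizes are pooled with inverse-variance weights. This is a sketch of a fixed-effect model only, one of several pooling approaches, and the effect sizes and standard errors are hypothetical.

      # Sketch of fixed-effect inverse-variance pooling of study effects;
      # the (effect, SE) pairs below are hypothetical.
      studies = [(0.40, 0.15), (0.25, 0.10), (0.55, 0.20)]

      weights = [1 / se**2 for _, se in studies]
      pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
      se_pooled = sum(weights) ** -0.5

      print(f"pooled effect: {pooled:.3f} (SE {se_pooled:.3f})")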

  17. Dynamic Fracture Simulations of Explosively Loaded Cylinders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Carly W.; Goto, D. M.

    2015-11-30

    This report documents the modeling results of high-explosive experiments investigating dynamic fracture of steel (AerMet® 100 alloy) cylinders. The experiments were conducted at Lawrence Livermore National Laboratory (LLNL) from 2007 to 2008 [10]. A principal objective of this study was to gain an understanding of dynamic material failure through the analysis of hydrodynamic computer code simulations. Two-dimensional and three-dimensional computational cylinder models were analyzed using the ALE3D multi-physics computer code.

  18. Statistical Learning in Specific Language Impairment: A Meta-Analysis

    ERIC Educational Resources Information Center

    Lammertink, Imme; Boersma, Paul; Wijnen, Frank; Rispens, Judith

    2017-01-01

    Purpose: The current meta-analysis provides a quantitative overview of published and unpublished studies on statistical learning in the auditory verbal domain in people with and without specific language impairment (SLI). The database used for the meta-analysis is accessible online and open to updates (Community-Augmented Meta-Analysis), which…

  19. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. To automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies the reliable handling and processing of NGS data, and the corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that can be combined to construct workflows facilitating a variety of tasks, such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and the user manual (GPLv3 license). Contact: robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
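
    To illustrate the modular-workflow idea the toolbox is built on, here is a sketch in plain Python, not KNIME's actual node API: small processing modules with matching inputs and outputs are chained so each consumes the previous module's result, much as a KNIME workflow wires node ports. The module names and file extensions are illustrative only.

      # Sketch of chaining modular processing steps into a workflow;
      # module names and file types are illustrative placeholders.
      from typing import Callable

      def align(fastq: str) -> str:
          print(f"aligning {fastq}")
          return fastq.replace(".fastq", ".bam")

      def sort_bam(bam: str) -> str:
          print(f"sorting {bam}")
          return bam.replace(".bam", ".sorted.bam")

      def call_variants(bam: str) -> str:
          print(f"calling variants on {bam}")
          return bam.replace(".sorted.bam", ".vcf")

      # A workflow is an ordered chain of modules; each consumes the
      # previous module's output.
      workflow: list[Callable[[str], str]] = [align, sort_bam, call_variants]

      result = "sample01.fastq"
      for module in workflow:
          result = module(result)
      print(f"final output: {result}")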

  20. Model-driven meta-analyses for informing health care: a diabetes meta-analysis as an exemplar.

    PubMed

    Brown, Sharon A; Becker, Betsy Jane; García, Alexandra A; Brown, Adama; Ramírez, Gilbert

    2015-04-01

    A relatively novel type of meta-analysis, the model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted, and thus large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of research on predictors of key health outcomes in diabetes is used to illustrate our main points. © The Author(s) 2014.
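
    One standard way to synthesize correlational data of the kind this record describes (a sketch of a common technique, not necessarily the authors' method) is to average correlations on Fisher's z scale, weighting each study by n - 3, and transform back. The correlations and sample sizes below are hypothetical.

      # Sketch of pooling correlations via Fisher's z transform;
      # the (r, n) pairs below are hypothetical.
      import math

      studies = [(0.42, 120), (0.35, 80), (0.50, 200)]   # (r, sample size)

      num = sum((n - 3) * math.atanh(r) for r, n in studies)
      den = sum(n - 3 for _, n in studies)
      r_pooled = math.tanh(num / den)

      print(f"pooled correlation: {r_pooled:.3f}")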
