Sample records for consequence codes maccs

  1. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library ofmore » uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.« less

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library ofmore » uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.« less

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library ofmore » uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.« less

  4. Evaluation of severe accident risks: Quantification of major input parameters: MAACS (MELCOR Accident Consequence Code System) input

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprung, J.L.; Jow, H-N; Rollstin, J.A.

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric andmore » biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.« less

  5. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulatedmore » jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.« less

  6. Input-output model for MACCS nuclear accident impacts estimation¹

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, bases on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domesticmore » product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.« less

  7. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisorymore » Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).« less

  8. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of- the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences, of each of the selected uncertain parameters both individually and in interaction with other parameters. The results measure the modelmore » response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as dependent effect of cesium chemical form for different accident progressions. (auth)« less

  9. Prognostic value of MACC1 and proficient mismatch repair status for recurrence risk prediction in stage II colon cancer patients: the BIOGRID studies.

    PubMed

    Rohr, U-P; Herrmann, P; Ilm, K; Zhang, H; Lohmann, S; Reiser, A; Muranyi, A; Smith, J; Burock, S; Osterland, M; Leith, K; Singh, S; Brunhoeber, P; Bowermaster, R; Tie, J; Christie, M; Wong, H-L; Waring, P; Shanmugam, K; Gibbs, P; Stein, U

    2017-08-01

    We assessed the novel MACC1 gene to further stratify stage II colon cancer patients with proficient mismatch repair (pMMR). Four cohorts with 596 patients were analyzed: Charité 1 discovery cohort was assayed for MACC1 mRNA expression and MMR in cryo-preserved tumors. Charité 2 comparison cohort was used to translate MACC1 qRT-PCR analyses to FFPE samples. In the BIOGRID 1 training cohort MACC1 mRNA levels were related to MACC1 protein levels from immunohistochemistry in FFPE sections; also analyzed for MMR. Chemotherapy-naïve pMMR patients were stratified by MACC1 mRNA and protein expression to establish risk groups based on recurrence-free survival (RFS). Risk stratification from BIOGRID 1 was confirmed in the BIOGRID 2 validation cohort. Pooled BIOGRID datasets produced a best effect-size estimate. In BIOGRID 1, using qRT-PCR and immunohistochemistry for MACC1 detection, pMMR/MACC1-low patients had a lower recurrence probability versus pMMR/MACC1-high patients (5-year RFS of 92% and 67% versus 100% and 68%, respectively). In BIOGRID 2, longer RFS was confirmed for pMMR/MACC1-low versus pMMR/MACC1-high patients (5-year RFS of 100% versus 90%, respectively). In the pooled dataset, 6.5% of patients were pMMR/MACC1-low with no disease recurrence, resulting in a 17% higher 5-year RFS [95% confidence interval (CI) (12.6%-21.3%)] versus pMMR/MACC1-high patients (P = 0.037). Outcomes were similar for pMMR/MACC1-low and deficient MMR (dMMR) patients (5-year RFS of 100% and 96%, respectively). MACC1 expression stratifies colon cancer patients with unfavorable pMMR status. Stage II colon cancer patients with pMMR/MACC1-low tumors have a similar favorable prognosis to those with dMMR with potential implications for the role of adjuvant therapy. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  10. Statin and rottlerin small-molecule inhibitors restrict colon cancer progression and metastasis via MACC1.

    PubMed

    Juneja, Manisha; Kobelt, Dennis; Walther, Wolfgang; Voss, Cynthia; Smith, Janice; Specker, Edgar; Neuenschwander, Martin; Gohlke, Björn-Oliver; Dahlmann, Mathias; Radetzki, Silke; Preissner, Robert; von Kries, Jens Peter; Schlag, Peter Michael; Stein, Ulrike

    2017-06-01

    MACC1 (Metastasis Associated in Colon Cancer 1) is a key driver and prognostic biomarker for cancer progression and metastasis in a large variety of solid tumor types, particularly colorectal cancer (CRC). However, no MACC1 inhibitors have been identified yet. Therefore, we aimed to target MACC1 expression using a luciferase reporter-based high-throughput screening with the ChemBioNet library of more than 30,000 compounds. The small molecules lovastatin and rottlerin emerged as the most potent MACC1 transcriptional inhibitors. They remarkably inhibited MACC1 promoter activity and expression, resulting in reduced cell motility. Lovastatin impaired the binding of the transcription factors c-Jun and Sp1 to the MACC1 promoter, thereby inhibiting MACC1 transcription. Most importantly, in CRC-xenografted mice, lovastatin and rottlerin restricted MACC1 expression and liver metastasis. This is-to the best of our knowledge-the first identification of inhibitors restricting cancer progression and metastasis via the novel target MACC1. This drug repositioning might be of therapeutic value for CRC patients.

  11. MACC1 - a novel target for solid cancers.

    PubMed

    Stein, Ulrike

    2013-09-01

    The metastatic dissemination of primary tumors is directly linked to patient survival in many tumor entities. The previously undescribed gene metastasis-associated in colon cancer 1 (MACC1) was discovered by genome-wide analyses in colorectal cancer (CRC) tissues. MACC1 is a tumor stage-independent predictor for CRC metastasis linked to metastasis-free survival. In this review, the discovery of MACC1 is briefly presented. In the following, the overwhelming confirmation of these data is provided supporting MACC1 as a new remarkable biomarker for disease prognosis and prediction of therapy response for CRC and also for a variety of additional forms of solid cancers. Lastly, the potential clinical utility of MACC1 as a target for prevention or restriction of tumor progression and metastasis is envisioned. MACC1 has been identified as a prognostic biomarker in a variety of solid cancers. MACC1 correlated with tumor formation and progression, development of metastases and patient survival representing a decisive driver for tumorigenesis and metastasis. MACC1 was also demonstrated to be of predictive value for therapy response. MACC1 is a promising therapeutic target for anti-tumor and anti-metastatic intervention strategies of solid cancers. Its clinical utility, however, must be demonstrated in clinical trials.

  12. Metastasis-associated in colon cancer-1 promotes vasculogenic mimicry in gastric cancer by upregulating TWIST1/2

    PubMed Central

    Wang, Lin; Lin, Li; Chen, Xi; Sun, Li; Liao, Yulin; Huang, Na; Liao, Wangjun

    2015-01-01

    Vasculogenic mimicry (VM) is a blood supply modality that is strongly associated with the epithelial-mesenchymal transition (EMT), TWIST1 activation and tumor progression. We previously reported that metastasis-associated in colon cancer-1 (MACC1) induced the EMT and was associated with a poor prognosis of patients with gastric cancer (GC), but it remains unknown whether MACC1 promotes VM and regulates the TWIST signaling pathway in GC. In this study, we investigated MACC1 expression and VM by immunohistochemistry in 88 patients with stage IV GC, and also investigated the role of TWIST1 and TWIST2 in MACC1-induced VM by using nude mice with GC xenografts and GC cell lines. We found that the VM density was significantly increased in the tumors of patients who died of GC and was positively correlated with MACC1 immunoreactivity (p < 0.05). The 3-year survival rate was only 8.6% in patients whose tumors showed double positive staining for MACC1 and VM, whereas it was 41.7% in patients whose tumors were negative for both MACC1 and VM. Moreover, nuclear expression of MACC1, TWIST1, and TWIST2 was upregulated in GC tissues compared with matched adjacent non-tumorous tissues (p < 0.05). Overexpression of MACC1 increased TWIST1/2 expression and induced typical VM in the GC xenografts of nude mice and in GC cell lines. MACC1 enhanced TWIST1/2 promoter activity and facilitated VM, while silencing of TWIST1 or TWIST2 inhibited VM. Hepatocyte growth factor (HGF) increased the nuclear translocation of MACC1, TWIST1, and TWIST2, while a c-Met inhibitor reduced these effects. These findings indicate that MACC1 promotes VM in GC by regulating the HGF/c-Met-TWIST1/2 signaling pathway, which means that MACC1 and this pathway are potential new therapeutic targets for GC. PMID:25895023

  13. MACC1 regulates Fas mediated apoptosis through STAT1/3 - Mcl-1 signaling in solid cancers.

    PubMed

    Radhakrishnan, Harikrishnan; Ilm, Katharina; Walther, Wolfgang; Shirasawa, Senji; Sasazuki, Takehiko; Daniel, Peter T; Gillissen, Bernhard; Stein, Ulrike

    2017-09-10

    MACC1 was identified as a novel player in cancer progression and metastasis, but its role in death receptor-mediated apoptosis is still unexplored. We show that MACC1 knockdown sensitizes cancer cells to death receptor-mediated apoptosis. For the first time, we provide evidence for STAT signaling as a MACC1 target. MACC1 knockdown drastically reduced STAT1/3 activating phosphorylation, thereby regulating the expression of its apoptosis targets Mcl-1 and Fas. STAT signaling inhibition by the JAK1/2 inhibitor ruxolitinib mimicked MACC1 knockdown-mediated molecular signatures and apoptosis sensitization to Fas activation. Despite the increased Fas expression, the reduced Mcl-1 expression was instrumental in apoptosis sensitization. This reduced Mcl-1-mediated apoptosis sensitization was Bax and Bak dependent. MACC1 knockdown also increased TRAIL-induced apoptosis. MACC1 overexpression enhanced STAT1/3 phosphorylation and increased Mcl-1 expression, which was abrogated by ruxolitinib. The central role of Mcl-1 was strengthened by the resistance of Mcl-1 overexpressing cells to apoptosis induction. The clinical relevance of Mcl-1 regulation by MACC1 was supported by their positive expression correlation in patient-derived tumors. Altogether, we reveal a novel death receptor-mediated apoptosis regulatory mechanism by MACC1 in solid cancers through modulation of the STAT1/3-Mcl-1 axis. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Clinicopathological and prognostic significance of metastasis-associated in colon cancer-1 (MACC1) overexpression in colorectal cancer: a meta-analysis

    PubMed Central

    Zhao, Yang; Dai, Cong; Wang, Meng; Kang, Huafeng; Lin, Shuai; Yang, Pengtao; Liu, Xinghan; Liu, Kang; Xu, Peng; Zheng, Yi; Li, Shanli; Dai, Zhijun

    2016-01-01

    Metastasis-associated in colon cancer-1 (MACC1) has been reported to be overexpressed in diverse human malignancies, and the increasing amount of evidences suggest that its overexpression is associated with the development and progression of many human tumors. However, the prognostic and clinicopathological value of MACC1 in colorectal cancer remains inconclusive. Therefore, we conducted this meta-analysis to investigate the effect of MACC1 overexpression on clinicopathological features and survival outcomes in colorectal cancer. PubMed, CNKI, and Wanfang databases were searched for relevant articles published update to December 2015. Correlation of MACC1 expression level with overall survival (OS), disease-free survival (DFS), and clinicopathological features were analyzed. In this meta-analysis, fifteen studies with a total of 2,161 colorectal cancer patients were included. Our results showed that MACC1 overexpression was significantly associated with poorer OS and DFS. Moreover, MACC1 overexpression was significantly associated with gender, localization, TNM stage, T stage, and N stage. Together, our meta-analysis showed that MACC1 overexpression was significantly associated with poor survival rates, regional invasion and lymph-node metastasis. MACC1 expression level can serve as a novel prognostic factor in colorectal cancer patients. PMID:27542234

  15. The role of metastasis-associated in colon cancer 1 (MACC1) in endometrial carcinoma tumorigenesis and progression.

    PubMed

    Chen, Shuo; Zong, Zhi-Hong; Wu, Dan-Dan; Sun, Kai-Xuan; Liu, Bo-Liang; Zhao, Yang

    2017-04-01

    Metastasis-associated in colon cancer-1 (MACC1), has recently been identified as a key regulator in the progression of many cancers. However, its role in endometrial carcinoma (EC) remains unknown. MACC1 expression was determined in EC and normal endometrial tissues by immunohistochemistry. EC cell phenotypes and related molecules were examined after MACC1 downregulation by Small interfering RNA (siRNA) or microRNA (miRNA) transfection. We found that MACC1 was highly expressed in EC tissues than normal samples, and was significantly different in FIGO staging (I and II vs. III and IV), the depth of myometrial infiltration (<1/2 vs. ≥1/2), lymph nodes metastasis (negative vs. positive), besides, MACC1 overexpression was correlated with lower cumulative and relapse-free survival rate. MACC1 downregulation by siRNA transfection significantly induced G1 phrase arrest, suppressed EC cell proliferation, migration, and invasion. In addition, MACC1 downregulation also reduced expression of Cyclin D1 and Cyclin-dependent Kinase 2 (CDK2), N-cadherin (N-Ca), α-SMA, matrix metalloproteinase 2 (MMP2), and MMP9, but increased expression of E-cadherin (E-Ca). Bioinformatic predictions and dual-luciferase reporter assays indicate that MACC1 is a possible target of miR-23b. MiR-23b overexpression reduced MACC1 expression in vitro and induced G1 phrase arrest, suppressed cell proliferation, migration, and invasion. MiR-23b transfection also reduced Cyclin D1 and CDK2, N-Ca, α-SMA, MMP2, MMP9 expression, but increased E-Ca expression. Furthermore, the nude mouse xenograft assay showed that miR-23b overexpression suppressed tumour growth through downregulating MACC1 expression. Taken together, our results demonstrate for the first time that MACC1 may be a new and important diagnosis and therapeutic target of endometrial carcinoma. © 2017 Wiley Periodicals, Inc.

  16. SecPop Version 4: Sector Population Land Fraction and Economic Estimation Program: Users? Guide Model Manual and Verification Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia

    In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was releasedmore » in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop version 4.3, which uses 2010 population data and both 2007 and 2012 economic data. It is also compatible with 2000 census and 2002 economic data. At the time of this writing, the current version of SecPop is 4.3.0, and that version is described herein. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. This report contains appendices which describe the development of the 2010 census file, 2007 county file, and 2012 county file. Finally, an appendix is included that describes the validation assessments performed.« less

  17. Investigation of MACC1 Gene Expression in Head and Neck Cancer and Cancer Stem Cells.

    PubMed

    Evran, Ebru; Şahin, Hilal; Akbaş, Kübra; Çiğdem, Sadik; Gündüz, Esra

    2016-12-01

    By investigating the MACC1 gene (metastasis-associated in colon cancer 1) in cancer stem cells (CSC) resistant to chemotherapy and in cancer stem cells (CSC) resistant to chemotherapy and in cancer cells (CS) sensitive to chemotherapy we determineda steady expression in both types of cells in head and neck cancer. In conformity with the result we examined if this gene could be a competitor gene for chemotherapy. According to literature, the MACC1 gene shows a clear expression in head and neck cancer cells [1]. Here we examined MACC1 expression in CSC and investigated it as a possible biomarker. Our experiments were performed in the UT -SCC -74 in primary head and neck cancer cell line. We examined the MACC -1 gene expression by Real Time PCR from both isolated CSC and CS. Expression of MACC -1 gene of cancer stem cells showed an two-fold increase compared with cancer cells. Based on the positive expression of MACC1 in both CS and CSC, this gene may serve as a potential biomarker in head and neck cancer. By comparing the results of this study with the novel features of MACC1, two important hypotheses could be examined. The first hypothesis is that MACC1 is a possible transcripton factor in colon cancer, which influences a high expression of CSC in head and neck and affects the expression of three biomarkers of the CSC control group biomarkers. The second hypothesisis is that the positive expression of MACC1 in patients with a malignant prognosis of tongue cancer, which belongs to head and neck cancer types, operates a faster development of CSC to cancer cells.

  18. [The value of SYNTAX score in predicting outcome patients undergoing percutaneous coronary intervention].

    PubMed

    Gao, Yue-chun; Yu, Xian-peng; He, Ji-qiang; Chen, Fang

    2012-01-01

    To assess the value of SYNTAX score to predict major adverse cardiac and cerebrovascular events (MACCE) among patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention. 190 patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention (PCI) with Cypher select drug-eluting stent were enrolled. SYNTAX score and clinical SYNTAX score were retrospectively calculated. Our clinical Endpoint focused on MACCE, a composite of death, nonfatal myocardial infarction (MI), stroke and repeat revascularization. The value of SYNTAX score and clinical SYNTAX score to predict MACCE were studied respectively. 29 patients were observed to suffer from MACCE, accounting 18.5% of the overall 190 patients. MACCE rates of low (≤ 20.5), intermediate (21.0 - 31.0), and high (≥ 31.5) tertiles according to SYNTAX score were 9.1%, 16.2% and 30.9% respectively. Both univariate and multivariate analysis showed that SYNTAX score was the independent predictor of MACCE. MACCE rates of low (≤ 19.5), intermediate (19.6 - 29.1), and high (≥ 29.2) tertiles according to clinical SYNTAX score were 14.9%, 9.8% and 30.6% respectively. Both univariate and multivariate analysis showed that clinical SYNTAX score was the independent predictor of MACCE. ROC analysis showed both SYNTAX score (AUC = 0.667, P = 0.004) and clinical SYNTAX score (AUC = 0.636, P = 0.020) had predictive value of MACCE. Clinical SYNTAX score failed to show better predictive ability than the SYNTAX score. Both SYNTAX score and clinical SYNTAX score could be independent risk predictors for MACCE among patients with three-vessel or left-main coronary artery disease undergoing percutaneous coronary intervention. Clinical SYNTAX score failed to show better predictive ability than the SYNTAX score in this group of patients.

  19. MISR GoMACCS Products

    Atmospheric Science Data Center

    2016-11-25

    Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) is an intensive ... study area encompasses Texas and the northwestern Gulf of Mexico during July, August, September, and October, 2006. The Multi-angle ...

  20. Circulating metastasis associated in colon cancer 1 transcripts in gastric cancer patient plasma as diagnostic and prognostic biomarker

    PubMed Central

    Burock, Susen; Herrmann, Pia; Wendler, Ina; Niederstrasser, Markus; Wernecke, Klaus-Dieter; Stein, Ulrike

    2015-01-01

    AIM: To evaluate the diagnostic and prognostic value of circulating Metastasis Associated in Colon Cancer 1 (MACC1) transcripts in plasma of gastric cancer patients. METHODS: We provide for the first time a blood-based assay for transcript quantification of the metastasis inducer MACC1 in a prospective study of gastric cancer patient plasma. MACC1 is a strong prognostic biomarker for tumor progression and metastasis in a variety of solid cancers. We conducted a study to define the diagnostic and prognostic power of MACC1 transcripts using 76 plasma samples from gastric cancer patients, either newly diagnosed with gastric cancer, newly diagnosed with metachronous metastasis of gastric cancer, as well as follow-up patients. Findings were controlled by using plasma samples from 54 tumor-free volunteers. Plasma was separated, RNA was isolated, and levels of MACC1 as well as S100A4 transcripts were determined by quantitative RT-PCR. RESULTS: Based on the levels of circulating MACC1 transcripts in plasma we significantly discriminated tumor-free volunteers and gastric cancer patients (P < 0.001). Levels of circulating MACC1 transcripts were increased in gastric cancer patients of each disease stage, compared to tumor-free volunteers: patients with tumors without metastasis (P = 0.005), with synchronous metastasis (P = 0.002), with metachronous metastasis (P = 0.005), and patients during follow-up (P = 0.021). Sensitivity was 0.68 (95%CI: 0.45-0.85) and specificity was 0.89 (95%CI: 0.77-0.95), respectively. Importantly, gastric cancer patients with high circulating MACC1 transcript levels in plasma demonstrated significantly shorter survival when compared with patients demonstrating low MACC1 levels (P = 0.0015). Furthermore, gastric cancer patients with high circulating transcript levels of MACC1 as well as of S100A4 in plasma demonstrated significantly shorter survival when compared with patients demonstrating low levels of both biomarkers or with only one biomarker elevated (P = 0.001). CONCLUSION: Levels of circulating MACC1 transcripts in plasma of gastric cancer patients are of diagnostic value and are prognostic for patient survival in a prospective study. PMID:25574109

  1. MISR Regional GoMACCS Imagery Overview

    Atmospheric Science Data Center

    2016-08-24

    ... View Data  |  Download Data About this Web Site: Visualizations of select MISR Level 3 data for special regional ... version used in support of the GoMACCS Campaign. More information about the Level 1 and Level 2 products subsetted for the GoMACCS ...

  2. Preliminary risks associated with postulated tritium release from production reactor operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Horton, W.H.

    1988-01-01

    The Probabilistic Risk Assessment (PRA) of Savannah River Plant (SRP) reactor operation is assessing the off-site risk due to tritium releases during postulated full or partial loss of heavy water moderator accidents. Other sources of tritium in the reactor are less likely to contribute to off-site risk in non-fuel melting accident scenarios. Preliminary determination of the frequency of average partial moderator loss (including incidents with leaks as small as .5 kg) yields an estimate of /approximately/1 per reactor year. The full moderator loss frequency is conservatively chosen as 5 /times/ 10/sup /minus/3/ per reactor year. Conditional consequences, determined with amore » version of the MACCS code modified to handle tritium, are found to be insignificant. The 95th percentile individual cancer risk is 4 /times/ 10/sup /minus/8/ per reactor year within 16 km of the release point. The full moderator loss accident contributes about 75% of the evaluated risks. 13 refs., 4 figs., 5 tabs.« less

  3. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  4. MACC1 mediates chemotherapy sensitivity of 5-FU and cisplatin via regulating MCT1 expression in gastric cancer.

    PubMed

    Wang, Chunlin; Wen, Zhaowei; Xie, Jianming; Zhao, Yang; Zhao, Liang; Zhang, Shuyi; Liu, Yajing; Xue, Yan; Shi, Min

    2017-04-08

    Chemotherapeutic insensitivity is a main obstacle for effective treatment of gastric cancer (GC), the underlying mechanism remains to be investigated. Metastasis-associated in colon cancer-1 (MACC1), a transcription factor highly expressed in GC, is found to be related to chemotherapy sensitivity. Monocarboxylate transporter 1 (MCT1), a plasma membrane protein co-transporting lactate and H + , mediates drug sensitivity by regulating lactate metabolism. Targeting MCT1 has recently been regarded as a promising way to treat cancers and MCT1 inhibitor has entered the clinical trial for GC treatment. However, the correlation of these two genes and their combined effects on chemotherapy sensitivity has not been clarified. In this study, we found that MACC1 and MCT1 were both highly expressed in GC and exhibited a positive correlation in clinical samples. Further, we demonstrated that MACC1 could mediate sensitivity of 5-FU and cisplatin in GC cells, and MACC1 mediated MCT1 regulation was closely related to this sensitivity. A MCT1 inhibitor AZD3965 recovered the sensitivity of 5-FU and cisplatin in GC cells which overexpressed MACC1. These results suggested that MACC1 could influence the chemotherapy sensitivity by regulating MCT1 expression, providing new ideas and strategy for GC treatment. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. High MACC1 expression in combination with mutated KRAS G13 indicates poor survival of colorectal cancer patients.

    PubMed

    Ilm, Katharina; Kemmner, Wolfgang; Osterland, Marc; Burock, Susen; Koch, Gudrun; Herrmann, Pia; Schlag, Peter M; Stein, Ulrike

    2015-02-14

    The metastasis-associated in colon cancer 1 (MACC1) gene has been identified as prognostic biomarker for colorectal cancer (CRC). Here, we aimed at the refinement of risk assessment by separate and combined survival analyses of MACC1 expression with any of the markers KRAS mutated in codon 12 (KRAS G12) or codon 13 (KRAS G13), BRAF V600 mutation and MSI status in a retrospective study of 99 CRC patients with tumors UICC staged I, II and III. We showed that only high MACC1 expression (HR: 6.09, 95% CI: 2.50-14.85, P < 0.001) and KRAS G13 mutation (HR: 5.19, 95% CI: 1.06-25.45, P = 0.042) were independent prognostic markers for shorter metastasis-free survival (MFS). Accordingly, Cox regression analysis revealed that patients with high MACC1 expression and KRAS G13 mutation exhibited the worst prognosis (HR: 14.48, 95% CI: 3.37-62.18, P < 0.001). Patients were classified based on their molecular characteristics into four clusters with significant differences in MFS (P = 0.003) by using the SPSS 2-step cluster function and Kaplan-Meier survival analysis. According to our results, patients with high MACC1 expression and mutated KRAS G13 exhibited the highest risk for metachronous metastases formation. Moreover, we demonstrated that the "Traditional pathway" with an intermediate risk for metastasis formation can be further subdivided by assessing MACC1 expression into a low and high risk group with regard to MFS prognosis. This is the first report showing that identification of CRC patients at high risk for metastasis is possible by assessing MACC1 expression in combination with KRAS G13 mutation.

  6. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-02-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  7. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-11-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  8. Performance of magnetic activated carbon composite as peroxymonosulfate activator and regenerable adsorbent via sulfate radical-mediated oxidation processes.

    PubMed

    Oh, Wen-Da; Lua, Shun-Kuang; Dong, Zhili; Lim, Teik-Thye

    2015-03-02

    Magnetic activated carbon composite (CuFe2O4/AC, MACC) was prepared by a co-precipitation-calcination method. The MACC consisted of porous micro-particle morphology with homogeneously distributed CuFe2O4 and possessed high magnetic saturation moment (8.1 emu g(-1)). The performance of MACC was evaluated as catalyst and regenerable adsorbent via peroxymonosulfate (PMS, Oxone(®)) activation for methylene blue (MB) removal. Optimum CuFe2O4/AC w/w ratio was 1:1.5 giving excellent performance and can be reused for at least 3 cycles. The presence of common inorganic ions, namely Cl(-) and NO3(-) did not exert significant influence on MB degradation but humic acid decreased the MB degradation rate. As a regenerable adsorbent, negligible difference in regeneration efficiency was observed when a higher Oxone(®) dosage was employed but a better efficiency was obtained at a lower MACC loading. The factors hindering complete MACC regeneration are MB adsorption irreversibility and AC surface modification by PMS making it less favorable for subsequent MB adsorption. With an additional mild heat treatment (150 °C) after regeneration, 82% of the active sites were successfully regenerated. A kinetic model incorporating simultaneous first-order desorption, second-order adsorption and pseudo-first order degradation processes was numerically-solved to describe the rate of regeneration. The regeneration rate increased linearly with increasing Oxone(®):MACC ratio. The MACC could potentially serve as a catalyst for PMS activation and regenerable adsorbent. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Improved operator agreement and efficiency using the minimum area contour change method for delineation of hyperintense multiple sclerosis lesions on FLAIR MRI

    PubMed Central

    2013-01-01

    Background Activity of disease in patients with multiple sclerosis (MS) is monitored by detecting and delineating hyper-intense lesions on MRI scans. The Minimum Area Contour Change (MACC) algorithm has been created with two main goals: a) to improve inter-operator agreement on outlining regions of interest (ROIs) and b) to automatically propagate longitudinal ROIs from the baseline scan to a follow-up scan. Methods The MACC algorithm first identifies an outer bound for the solution path, forms a high number of iso-contour curves based on equally spaced contour values, and then selects the best contour value to outline the lesion. The MACC software was tested on a set of 17 FLAIR MRI images evaluated by a pair of human experts and a longitudinal dataset of 12 pairs of T2-weighted Fluid Attenuated Inversion Recovery (FLAIR) images that had lesion analysis ROIs drawn by a single expert operator. Results In the tests where two human experts evaluated the same MRI images, the MACC program demonstrated that it could markedly reduce inter-operator outline error. In the longitudinal part of the study, the MACC program created ROIs on follow-up scans that were in close agreement to the original expert’s ROIs. Finally, in a post-hoc analysis of 424 follow-up scans 91% of propagated MACC were accepted by an expert and only 9% of the final accepted ROIS had to be created or edited by the expert. Conclusion When used with an expert operator's verification of automatically created ROIs, MACC can be used to improve inter- operator agreement and decrease analysis time, which should improve data collected and analyzed in multicenter clinical trials. PMID:24004511

  10. Prognostic Value of MACC1 in Digestive System Neoplasms: A Systematic Review and Meta-Analysis

    PubMed Central

    Wu, Zhenzhen; Zhou, Rui; Su, Yuqi; Sun, Li; Liao, Yulin; Liao, Wangjun

    2015-01-01

    Metastasis associated in colon cancer 1 (MACC1), a newly identified oncogene, has been associated with poor survival of cancer patients by multiple studies. However, the prognostic value of MACC1 in digestive system neoplasms needs systematic evidence to verify. Therefore, we aimed to provide further evidence on this topic by systematic review and meta-analysis. Literature search was conducted in multiple databases and eligible studies analyzing survival data and MACC1 expression were included for meta-analysis. Hazard ratio (HR) for clinical outcome was chosen as an effect measure of interest. According to our inclusion criteria, 18 studies with a total of 2,948 patients were identified. Pooled HRs indicated that high MACC1 expression significantly correlates with poorer OS in patients with digestive system neoplasms (HR = 1.94; 95% CI: 1.49–2.53) as well as poorer relapse-free survival (HR = 1.94, 95% CI: 1.33–2.82). The results of subgroup studies categorized by methodology, anatomic structure, and cancer subtype for pooled OS were all consistent with the overall pooled HR for OS as well. No publication bias was detected according to test of funnel plot asymmetry and Egger's test. In conclusion, high MACC1 expression may serve as a prognostic biomarker to guide individualized management in clinical practice for digestive system neoplasms. PMID:26090393

  11. Prognostic Value of MACC1 in Digestive System Neoplasms: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Zhenzhen; Zhou, Rui; Su, Yuqi; Sun, Li; Liao, Yulin; Liao, Wangjun

    2015-01-01

    Metastasis associated in colon cancer 1 (MACC1), a newly identified oncogene, has been associated with poor survival of cancer patients by multiple studies. However, the prognostic value of MACC1 in digestive system neoplasms needs systematic evidence to verify. Therefore, we aimed to provide further evidence on this topic by systematic review and meta-analysis. Literature search was conducted in multiple databases and eligible studies analyzing survival data and MACC1 expression were included for meta-analysis. Hazard ratio (HR) for clinical outcome was chosen as an effect measure of interest. According to our inclusion criteria, 18 studies with a total of 2,948 patients were identified. Pooled HRs indicated that high MACC1 expression significantly correlates with poorer OS in patients with digestive system neoplasms (HR = 1.94; 95% CI: 1.49-2.53) as well as poorer relapse-free survival (HR = 1.94, 95% CI: 1.33-2.82). The results of subgroup studies categorized by methodology, anatomic structure, and cancer subtype for pooled OS were all consistent with the overall pooled HR for OS as well. No publication bias was detected according to test of funnel plot asymmetry and Egger's test. In conclusion, high MACC1 expression may serve as a prognostic biomarker to guide individualized management in clinical practice for digestive system neoplasms.

  12. SYNTAX score based on coronary computed tomography angiography may have a prognostic value in patients with complex coronary artery disease: An observational study from a retrospective cohort.

    PubMed

    Suh, Young Joo; Han, Kyunghwa; Chang, Suyon; Kim, Jin Young; Im, Dong Jin; Hong, Yoo Jin; Lee, Hye-Jeong; Hur, Jin; Kim, Young Jin; Choi, Byoung Wook

    2017-09-01

    The SYNergy between percutaneous coronary intervention with TAXus and cardiac surgery (SYNTAX) score is an invasive coronary angiography (ICA)-based score for quantifying the complexity of coronary artery disease (CAD). Although the SYNTAX score was originally developed based on ICA, recent publications have reported that coronary computed tomography angiography (CCTA) is a feasible modality for the estimation of the SYNTAX score.The aim of our study was to investigate the prognostic value of the SYNTAX score, based on CCTA for the prediction of major adverse cardiac and cerebrovascular events (MACCEs) in patients with complex CAD.The current study was approved by the institutional review board of our institution, and informed consent was waived for this retrospective cohort study. We included 251 patients (173 men, mean age 66.0 ± 9.29 years) who had complex CAD [3-vessel disease or left main (LM) disease] on CCTA. SYNTAX score was obtained on the basis of CCTA. Follow-up clinical outcome data regarding composite MACCEs were also obtained. Cox proportional hazards models were developed to predict the risk of MACCEs based on clinical variables, treatment, and computed tomography (CT)-SYNTAX scores.During the median follow-up period of 1517 days, there were 48 MACCEs. Univariate Cox hazards models demonstrated that MACCEs were associated with advanced age, low body mass index (BMI), and dyslipidemia (P < .2). In patients with LM disease, MACCEs were associated with a higher SYNTAX score. In patients with CT-SYNTAX score ≥23, patients who underwent coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention had significantly lower hazard ratios than patients who were treated with medication alone. In multivariate Cox hazards model, advanced age, low BMI, and higher SYNTAX score showed an increased hazard ratio for MACCE, while treatment with CABG showed a lower hazard ratio (P < .2).On the basis of our results, CT-SYNTAX score can be a useful method for noninvasively predicting MACCEs in patients with complex CAD, especially in patients with LM disease.

  13. In-depth characterization of the salivary adenoid cystic carcinoma transcriptome with emphasis on dominant cell type.

    PubMed

    Bell, Diana; Bell, Achim H; Bondaruk, Jolanta; Hanna, Ehab Y; Weber, Randall S

    2016-05-15

    Adenoid cystic carcinoma (ACC), 1 of the most common salivary gland malignancies, arises from the intercalated ducts, which are composed of inner ductal epithelial cells and outer myoepithelial cells. The objective of this study was to determine the genomic subtypes of ACC with emphasis on dominant cell type to identify potential specific biomarkers for each subtype and to improve the understanding of this disease. A whole-genome expression study was performed based on 42 primary salivary ACCs and 5 normal salivary glands. RNA from these specimens was subjected to expression profiling with RNA sequencing, and results were analyzed to identify transcripts in epithelial-dominant ACC (E-ACC), myoepithelial-dominant ACC (M-ACC), and all ACC that were expressed differentially compared with the transcripts in normal salivary tissue. In total, the authors identified 430 differentially expressed transcripts that were unique to E-ACC, 392 that were unique to M-ACC, and 424 that were common to both M-ACC and E-ACC. The sets of E-ACC-specific and M-ACC-specific transcripts were sufficiently large to define and differentiate E-ACC from M-ACC. Ingenuity pathway analysis identified known cancer-related genes for 60% of the E-ACC transcripts, 69% of the M-ACC transcripts, and 68% of the transcripts that were common in both E-ACC and M-ACC. Three sets of highly expressed candidate genes-distal-less homeobox 6 (DLX6) for E-ACC; protein keratin 16 (KRT16), SRY box 11 (SOX11), and v-myb avian myeloblastosis viral oncogene homolog (MYB) for M-ACC; and engrailed 1 (EN1) and statherin (STATH), which are common to both E-ACC and M-ACC)-were further validated at the protein level. The current results enabled the authors to identify novel potential therapeutic targets and biomarkers in E-ACC and M-ACC individually, with the implication that EN1, DLX6, and OTX1 (orthodenticle homeobox 1) are potential drivers of these cancers. Cancer 2016;122:1513-22. © 2016 American Cancer Society. © 2016 American Cancer Society.

  14. High power vertical stacked diode laser development using macro-channel water cooling and hard solder bonding technology

    NASA Astrophysics Data System (ADS)

    Yu, Dongshan; Liang, Xuejie; Wang, Jingwei; Li, Xiaoning; Nie, Zhiqiang; Liu, Xingsheng

    2017-02-01

    A novel marco channel cooler (MaCC) has been developed for packaging high power diode vertical stacked (HPDL) lasers, which eliminates many of the issues in commercially-available copper micro-channel coolers (MCC). The MaCC coolers, which do not require deionized water as coolant, were carefully designed for compact size and superior thermal dissipation capability. Indium-free packaging technology was adopted throughout product design and fabrication process to minimize the risk of solder electromigration and thermal fatigue at high current density and long pulse width under QCW operation. Single MaCC unit with peak output power of up to 700W/bar at pulse width in microsecond range and 200W/bar at pulse width in millisecond range has been recorded. Characteristic comparison on thermal resistivity, spectrum, near filed and lifetime have been conducted between a MaCC product and its counterpart MCC product. QCW lifetime test (30ms 10Hz, 30% duty cycle) has also been conducted with distilled water as coolant. A vertical 40-MaCC stack product has been fabricated, total output power of 9 kilowatts has been recorded under QCW mode (3ms, 30Hz, 9% duty cycle).

  15. Ten-Year Cross-Sectional Study of Mechanically Assisted Crevice Corrosion in 1352 Consecutive Patients With Metal-on-Polyethylene Total Hip Arthroplasty.

    PubMed

    Hussey, Daniel K; McGrory, Brian J

    2017-08-01

    Mechanically assisted crevice corrosion (MACC) in metal-on-polyethylene total hip arthroplasty (THA) is of concern, but its prevalence, etiology, and natural history are incompletely understood. From January 2003 to December 2012, 1352 consecutive THA surgeries using a titanium stem, cobalt-chromium alloy femoral head, and highly cross-linked polyethylene liner from a single manufacturer were performed. Patients were followed at 1-year and 5-year intervals for surveillance, but were also seen earlier if they had symptoms. Any patient with osteolysis >1 cm (n = 3) or unexplained pain (n = 85) underwent examination, radiographs, complete blood count, erythrocyte sedimentation rate, and C-reactive protein, as well as tests for serum cobalt and chromium levels. Symptomatic MACC was present in 43 of 1352 patients (3.2%). Prevalence of MACC by year of implant ranged from 0% (0 of 61, 2003; 0 of 138, 2005) to 10.5% (17 of 162; 2009). The M/L Taper stem had a greater prevalence of MACC (4.9%) than all other Zimmer (Zimmer, Inc, Warsaw, IN) 12/14 trunnion stem types combined (1.2%; P < .001). Twenty-seven of 43 (62.8%) patients have undergone revision surgery, and 16 of 43 (37.2%) patients have opted for ongoing surveillance. Comparing symptomatic THA patients with and without MACC, no demographic, clinical, or radiographic differences were found. MACC was significantly more common with 0-length femoral heads than with either -3.5 mm or +3.5 mm heads. The prevalence of MACC in metal-on-polyethylene hips is higher in this cross-sectional study than previously reported. A significantly higher prevalence was found in patients with the M/L Taper stem and in THAs from this manufacturer performed in 2009 and between 2009 and 2012. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Relation of Stature to Outcomes in Korean Patients Undergoing Primary Percutaneous Coronary Intervention for Acute ST-Elevation Myocardial Infarction (from the INTERSTELLAR Registry).

    PubMed

    Moon, Jeonggeun; Suh, Jon; Oh, Pyung Chun; Lee, Kyounghoon; Park, Hyun Woo; Jang, Ho-Jun; Kim, Tae-Hoon; Park, Sang-Don; Kwon, Sung Woo; Kang, Woong Chol

    2016-07-15

    Although epidemiologic studies have shown the impact of height on the occurrence and/or prognosis of cardiovascular diseases, the underlying mechanism is unclear. In addition, the relation in patients with ST-segment elevation myocardial infarction (STEMI) who underwent primary percutaneous coronary intervention (PCI) remains unknown. We sought to assess the influence of height on outcomes of patients with acute STEMI undergoing primary PCI and to provide a pathophysiological explanation. All 1,490 patients with STEMI undergoing primary PCI were analyzed. Major adverse cardiac and cerebrovascular events (MACCE) were defined as all-cause mortality, nonfatal myocardial infarction, nonfatal stroke, and unplanned hospitalization for heart failure (HF). Patients were divided into (1) MACCE (+) versus MACCE (-) and (2) first- to third-tertile groups according to height. The MACCE (+) group was shorter than the MACCE (-) group (164 ± 8 vs 166 ± 8 cm, p = 0.012). The prognostic impact of short stature was significant in older (≥70 years) male patients even after adjusting for co-morbidities (hazard ratio 0.951, 95% confidence interval 0.912 to 0.991, p = 0.017). The first-tertile group showed the worst MACCE-free survival (p = 0.035), and most cases of MACCE were HF (n = 17 [3%] vs 6 [1%] vs 2 [0%], p = 0.004). On post-PCI echocardiography, left atrial volume and the ratio of early diastolic mitral velocity to early diastolic mitral annulus velocity showed an inverse relation with height (p <0.001 for all) despite similar left ventricular ejection fraction. In conclusion, short stature is associated with the occurrence of HF after primary PCI for STEMI, and its influence is prominent in older male patients, presumably because of its correlation with diastolic dysfunction. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. MACCS : Multi-Mission Atmospheric Correction and Cloud Screening tool for high-frequency revisit data processing

    NASA Astrophysics Data System (ADS)

    Petrucci, B.; Huc, M.; Feuvrier, T.; Ruffel, C.; Hagolle, O.; Lonjou, V.; Desjardins, C.

    2015-10-01

    For the production of Level-2A products during Sentinel-2 commissioning in the Technical Expertise Center Sentinel-2 in CNES, CESBIO proposed to adapt the Venus Level-2 processor, taking advantage of the similarities between the two missions: image acquisition at a high frequency (2 days for Venus, 5 days with the two Sentinel-2), high resolution (5m for Venus, 10, 20 and 60m for Sentinel-2), and image acquisition under constant viewing conditions. The Multi-Mission Atmospheric Correction and Cloud Screening (MACCS) tool was born: based on the CNES Orfeo Toolbox library, the Venμs processor, which was already able to process Formosat2 and VENμS data, was adapted to process Sentinel-2 and Landsat5-7 data; since then, a great effort has been made reviewing the MACCS software architecture in order to ease the addition of new missions that also have the peculiarity of acquiring images at high resolution, high revisit and under constant viewing angles, such as Spot4/Take5 and Landsat8. The recursive and multi-temporal algorithm is implemented in a core that is the same for all the sensors and that combines several processing steps: estimation of cloud cover, cloud shadow, water, snow and shadow masks, of water vapor content and aerosol optical thickness, and atmospheric correction. This core is accessed via a number of plug-ins where the specificities of the sensor and of the user project are taken into account: product format, algorithmic processing chain and parameters. After a presentation of MACCS architecture and functionalities, the paper will give an overview of the production facilities integrating MACCS and the associated specificities: the interest for this tool has grown worldwide and MACCS will be used for extensive production within the THEIA land data center and the Agri-S2 project. Finally the paper will focus on the use of MACCS during the Sentinel-2 In-Orbit Test phase, showing the first Level-2A products.

  18. MISR Regional GoMACCS Products

    Atmospheric Science Data Center

    2016-08-24

    ... parameters from one Level 1 or Level 2 product. Further information about the Level 1 and Level 2 data products can be found on the  ... MISR GoMACCS data table . Images available on this web site include the following parameters: Image Description ...

  19. The Mobile Advanced Command and Control Station (MACCS) Experimental Testbed

    DTIC Science & Technology

    2007-10-01

    were selected: Vehicle: Dodge (Sprinter 2500 high-roof - Mercedes-Benz vehicle) Electrical equipment and habitability equipment: Crossroads Coaches...this innovative, mobile, experimental testbed. IMPACT/APPLICATIONS While MACCS clearly supports the research agenda for both HAL and ONR (as well as

  20. Efficacy and safety of aspirin, clopidogrel, and warfarin after coronary artery stenting in Korean patients with atrial fibrillation.

    PubMed

    Suh, Soon Yong; Kang, Woong Chol; Oh, Pyung Chun; Choi, Hanul; Moon, Chan Il; Lee, Kyounghoon; Han, Seung Hwan; Ahn, Taehoon; Choi, In Suck; Shin, Eak Kyun

    2014-09-01

    There are limited data on the optimal antithrombotic therapy for patients with atrial fibrillation (AF) undergoing coronary stenting. We reviewed 203 patients (62.6 % men, mean age 68.3 ± 10.1 years) between 2003 and 2012, and recorded clinical and demographic characteristics of the patients. Clinical follow-up included major adverse cardiac and cerebrovascular events (MACCE) (cardiac death, myocardial infarction, target lesion revascularization, and stroke), stent thrombosis, and bleeding. The most commonly associated comorbidities were hypertension (70.4 %), diabetes mellitus (35.5 %), and congestive heart failure (26.6 %). Sixty-three percent of patients had a stroke risk higher than a CHADS2 score of 2. At discharge, dual-antiplatelet therapy (aspirin, clopidogrel) was used in 166 patients (81.8 %; Group I), whereas 37 patients (18.2 %) were discharged with triple therapy (aspirin, clopidogrel, warfarin; Group II). The mean follow-up period was 42.0 ± 29.0 months. The mean international normalized ratio (INR) in group II was 1.83 ± 0.41. The total MACCE rate was 16.3 %, with stroke in 3.4 %. Compared with group II, the incidences of MACCE (2.7 % vs 19.3 %, P = 0.012) and cardiac death (0 % vs 11.4 %, P = 0.028) were higher in group I. Major and any bleeding, however, did not differ between the two groups. In multivariate analysis, absence of warfarin therapy (odds ratio 7.8, 95 % confidence interval 1.02-59.35; P = 0.048) was an independent predictor of MACCE. By Kaplan-Meier survival analysis, warfarin therapy was associated with a lower risk of MACCE (P = 0.024). In patients with AF undergoing coronary artery stenting, MACCE were reduced by warfarin therapy without increased bleeding, which might be related to tighter control with a lower INR value.

  1. Does geographical variability influence five-year MACCE rates in the multicentre SYNTAX revascularisation trial?

    PubMed

    Roy, Andrew K; Chevalier, Bernard; Lefèvre, Thierry; Louvard, Yves; Segurado, Ricardo; Sawaya, Fadi; Spaziano, Marco; Neylon, Antoinette; Serruys, Patrick A; Dawkins, Keith D; Kappetein, Arie Pieter; Mohr, Friedrich-Wilhelm; Colombo, Antonio; Feldman, Ted; Morice, Marie-Claude

    2017-09-20

    The use of multiple geographical sites for randomised cardiovascular trials may lead to important heterogeneity in treatment effects. This study aimed to determine whether treatment effects from different geographical recruitment regions impacted significantly on five-year MACCE rates in the SYNTAX trial. Five-year SYNTAX results (n=1,800) were analysed for geographical variability by site and country for the effect of treatment (CABG vs. PCI) on MACCE rates. Fixed, random, and linear mixed models were used to test clinical covariate effects, such as diabetes, lesion characteristics, and procedural factors. Comparing five-year MACCE rates, the pooled odds ratio (OR) between study sites was 0.58 (95% CI: 0.47-0.71), and countries 0.59 (95% CI: 0.45-0.73). By homogeneity testing, no individual site (X2=93.8, p=0.051) or country differences (X2=25.7, p=0.080) were observed. For random effects models, the intraclass correlation was minimal (ICC site=5.1%, ICC country=1.5%, p<0.001), indicating minimal geographical heterogeneity, with a hazard ratio of 0.70 (95% CI: 0.59-0.83). Baseline risk (smoking, diabetes, PAD) did not influence regional five-year MACCE outcomes (ICC 1.3%-5.2%), nor did revascularisation of the left main vs. three-vessel disease (p=0.241), across site or country subgroups. For CABG patients, the number of arterial (p=0.49) or venous (p=0.38) conduits used also made no difference. Geographic variability has no significant treatment effect on MACCE rates at five years. These findings highlight the generalisability of the five-year outcomes of the SYNTAX study.
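
    As a rough sketch of how site-level clustering of this kind can be quantified (an illustration only, not the trial's analysis code; the column names, toy data and package choice below are assumptions), a random-intercept model can be fitted and an intraclass correlation derived from its variance components:

        # Sketch: intraclass correlation (ICC) for site-level clustering of a binary outcome.
        # The data frame, column names ("site", "macce") and event rates are hypothetical.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "site": np.repeat(["A", "B", "C"], 60),
            "macce": np.concatenate([rng.binomial(1, p, 60) for p in (0.15, 0.20, 0.25)]),
        })

        # Random-intercept (linear probability) model: macce ~ 1 with a random effect per site.
        fit = smf.mixedlm("macce ~ 1", df, groups=df["site"]).fit()
        var_site = fit.cov_re.iloc[0, 0]   # between-site variance
        var_resid = fit.scale              # residual variance
        icc = var_site / (var_site + var_resid)
        print(f"ICC (share of variance attributable to site): {icc:.3f}")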

  2. Comorbidities and Ventricular Dysfunction Drive Excess Mid-Term Morbidity in an Indigenous Australian Coronary Revascularisation Cohort.

    PubMed

    Wiemers, Paul D; Marney, Lucy; White, Nicole; Bough, Georgina; Hustig, Alistair; Tan, Wei; Cheng, Ching-Siang; Kang, Dong; Yadav, Sumit; Tam, Robert; Fraser, John F

    2018-04-24

    There is a paucity of data regarding longer-term morbidity outcomes in Indigenous Australian patients undergoing coronary artery bypass grafting (CABG). No comparative data on re-infarction, stroke or reintervention rates exist. Outcome data following percutaneous coronary intervention (PCI) are also extremely limited. Addressing this gap in knowledge forms the major aim of our study. This was a single centre cohort study conducted at the Townsville Hospital, Australia, which provides tertiary adult cardiac surgical services to the northern parts of the state of Queensland. It incorporated consecutive patients (n=350) undergoing isolated CABG procedures between 2008 and 2010, 20.9% (73/350) of whom were Indigenous Australians. The main outcome measures were major adverse cardiac or cerebrovascular events (MACCE) at mid-term follow-up (mean 38.9 months). The incidence of MACCE among Indigenous Australian patients was approximately twice that of non-Indigenous patients at mid-term follow-up (36.7% vs. 18.6%; p=0.005; OR 2.525 (1.291-4.880)). Following adjustment for preoperative and operative variables, Indigenous Australian status itself was not significantly associated with MACCE (AOR 1.578 (0.637-3.910)). Significant associations with MACCE included renal impairment (AOR 2.198 (1.010-4.783)) and moderate-severe left ventricular impairment (AOR 3.697 (1.820-7.508)). An association between diabetes and MACCE failed to reach statistical significance (AOR 1.812 (0.941-3.490)). Indigenous Australians undergoing CABG suffer an excess of MACCE when followed up over the longer term. High rates of comorbidities in the Indigenous Australian population likely play an aetiological role. Copyright © 2018. Published by Elsevier B.V.

  3. Integrative marker analysis allows risk assessment for metastasis in stage II colon cancer.

    PubMed

    Nitsche, Ulrich; Rosenberg, Robert; Balmert, Alexander; Schuster, Tibor; Slotta-Huspenina, Julia; Herrmann, Pia; Bader, Franz G; Friess, Helmut; Schlag, Peter M; Stein, Ulrike; Janssen, Klaus-Peter

    2012-11-01

    Individualized risk assessment in patients with UICC stage II colon cancer based on a panel of molecular genetic alterations. Risk assessment in patients with colon cancer and localized disease (UICC stage II) is not sufficiently reliable. Development of metachronous metastasis is assumed to be governed largely by individual tumor genetics. Fresh frozen tissue from 232 patients (T3-4, N0, M0) with complete tumor resection and a median follow-up of 97 months was analyzed for microsatellite stability, KRAS exon 2, and BRAF exon 15 mutations. Gene expression of the WNT-pathway surrogate marker osteopontin and the metastasis-associated genes SASH1 and MACC1 was determined for 179 patients. The results were correlated with metachronous distant metastasis risk (n = 22 patients). Mutations of KRAS were detected in 30% of patients, mutations of BRAF in 15% of patients, and microsatellite instability in 26% of patients. Risk of recurrence was associated with KRAS mutation (P = 0.033), microsatellite stable tumors (P = 0.015), decreased expression of SASH1 (P = 0.049), and increased expression of MACC1 (P < 0.001). MACC1 was the only independent parameter for recurrence prediction (hazard ratio: 6.2; 95% confidence interval: 2.4-16; P < 0.001). Integrative 2-step cluster analysis allocated patients into 4 groups according to their tumor genetics. KRAS mutation, BRAF wild type, microsatellite stability, and high MACC1 expression defined the group with the highest risk of recurrence (16%, 7 of 43), whereas BRAF wild type, microsatellite instability, and low MACC1 expression defined the group with the lowest risk (4%, 1 of 26). MACC1 expression predicts development of metastases, outperforming microsatellite stability status as well as KRAS/BRAF mutation status.

  4. Weighting Composite Endpoints in Clinical Trials: Essential Evidence for the Heart Team

    PubMed Central

    Tong, Betty C.; Huber, Joel C.; Ascheim, Deborah D.; Puskas, John D.; Ferguson, T. Bruce; Blackstone, Eugene H.; Smith, Peter K.

    2013-01-01

    Background Coronary revascularization trials often use a composite endpoint of major adverse cardiac and cerebrovascular events (MACCE). The usual practice in analyzing data with a composite endpoint is to assign equal weights to each of the individual MACCE elements. Non-inferiority margins are used to offset effects of presumably less important components, but their magnitudes are subject to bias. This study describes the relative importance of MACCE elements from a patient perspective. Methods A discrete choice experiment was conducted. Survey respondents were presented with a scenario that would make them eligible for the SYNTAX 3-Vessel Disease cohort. Respondents chose among pairs of procedures that differed on the 3-year probability of MACCE, potential for increased longevity, and procedure/recovery time. Conjoint analysis derived relative weights for these attributes. Results In all, 224 respondents completed the survey. The attributes did not have equal weight. Risk of death was most important (relative weight 0.23), followed by stroke (0.18), potential increased longevity and recovery time (each 0.17), MI (0.14) and risk of repeat revascularization (0.11). Applying these weights to the SYNTAX 3-year endpoints resulted in a persistent, but decreased margin of difference in MACCE favoring CABG compared to PCI. When labeled only as “Procedure A” and “B,” 87% of respondents chose CABG over PCI. When procedures were labeled as “Coronary Stent” and “Coronary Bypass Surgery,” only 73% chose CABG. Procedural preference varied with demographics, gender and familiarity with the procedures. Conclusions MACCE elements do not carry equal weight in a composite endpoint, from a patient perspective. Using a weighted composite endpoint increases the validity of statistical analyses and trial conclusions. Patients are subject to bias by labels when considering coronary revascularization. PMID:22795064
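
    As a simple illustration of the arithmetic behind a weighted composite endpoint (a sketch only: the relative weights are the rounded values reported above, while the 3-year component event rates are hypothetical placeholders, not SYNTAX data), the equal-weight and weighted composites can be compared directly:

        # Sketch: equal-weight vs. patient-weighted composite endpoint.
        # Relative weights are rounded values from the abstract above;
        # the 3-year component event rates are hypothetical placeholders.
        weights = {"death": 0.23, "stroke": 0.18, "mi": 0.14, "repeat_revasc": 0.11}
        rates_cabg = {"death": 0.05, "stroke": 0.03, "mi": 0.04, "repeat_revasc": 0.10}  # hypothetical
        rates_pci  = {"death": 0.06, "stroke": 0.02, "mi": 0.07, "repeat_revasc": 0.20}  # hypothetical

        def equal_weight(rates):
            # Conventional composite: every component counts the same.
            return sum(rates.values())

        def weighted(rates, weights):
            # Weighted composite: each component scaled by its relative importance.
            return sum(weights[k] * rates[k] for k in rates)

        for label, rates in [("CABG", rates_cabg), ("PCI", rates_pci)]:
            print(label, round(equal_weight(rates), 3), round(weighted(rates, weights), 3))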

  5. Electrolytic conditioning of a magnesium aluminum chloride complex for reversible magnesium deposition

    DOE PAGES

    Barile, Christopher J.; Barile, Elizabeth C.; Zavadil, Kevin R.; ...

    2014-12-04

    We describe in this report the electrochemistry of Mg deposition and dissolution from the magnesium aluminum chloride complex (MACC). The results define the requirements for reversible Mg deposition and definitively establish that voltammetric cycling of the electrolyte significantly alters its composition and performance. Elemental analysis, scanning electron microscopy, and energy-dispersive X-ray spectroscopy (SEM-EDS) results demonstrate that irreversible Mg and Al deposits form during early cycles. Electrospray ionization-mass spectrometry (ESI-MS) data show that inhibitory oligomers develop in THF-based solutions. These oligomers form via the well-established mechanism of a cationic ring-opening polymerization of THF during the initial synthesis of the MACC and under resting conditions. In contrast, MACC solutions in 1,2-dimethoxyethane (DME), an acyclic solvent, do not evolve as dramatically at open circuit potential. Furthermore, we propose a mechanism describing how the conditioning process of the MACC in THF improves its performance by both tuning the Mg:Al stoichiometry and eliminating oligomers.

  6. Marginal abatement cost curve for nitrogen oxides incorporating controls, renewable electricity, energy efficiency, and fuel switching.

    PubMed

    Loughlin, Daniel H; Macpherson, Alexander J; Kaufman, Katherine R; Keaveny, Brian N

    2017-10-01

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs are typically developed by sorting control technologies by their relative cost-effectiveness. Other potentially important abatement measures such as renewable electricity, energy efficiency, and fuel switching (RE/EE/FS) are often not incorporated into MACCs, as it is difficult to quantify their costs and abatement potential. In this paper, a U.S. energy system model is used to develop a MACC for nitrogen oxides (NOx) that incorporates both traditional controls and these additional measures. The MACC is decomposed by sector, and the relative cost-effectiveness of RE/EE/FS and traditional controls is compared. RE/EE/FS are shown to have the potential to increase emission reductions beyond what is possible when applying traditional controls alone. Furthermore, a portion of RE/EE/FS appear to be cost-competitive with traditional controls. Renewable electricity, energy efficiency, and fuel switching can be cost-competitive with traditional air pollutant controls for abating air pollutant emissions. The application of renewable electricity, energy efficiency, and fuel switching is also shown to have the potential to increase emission reductions beyond what is possible when applying traditional controls alone.
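
    As a minimal sketch of how such a curve is assembled (the measure names, costs per ton and abatement quantities below are hypothetical, not taken from the energy system model), candidate measures are sorted by cost-effectiveness and their abatement is accumulated:

        # Minimal sketch of building a marginal abatement cost curve (MACC).
        # The measures, costs, and abatement amounts are hypothetical examples.
        measures = [
            {"name": "SCR retrofit",          "cost_per_ton": 2500, "abatement_tons": 12000},
            {"name": "Energy efficiency",     "cost_per_ton": 1800, "abatement_tons": 8000},
            {"name": "Fuel switching",        "cost_per_ton": 4200, "abatement_tons": 5000},
            {"name": "Renewable electricity", "cost_per_ton": 3600, "abatement_tons": 9000},
        ]

        # Sort by cost-effectiveness (cheapest abatement first), then accumulate.
        measures.sort(key=lambda m: m["cost_per_ton"])
        cumulative = 0
        for m in measures:
            cumulative += m["abatement_tons"]
            print(f'{m["name"]:<22} ${m["cost_per_ton"]:>5}/ton  cumulative {cumulative:>6} tons')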

  7. Marginal abatement cost curves for NOx that account for renewable electricity, energy efficiency, and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  8. Regional and sectoral marginal abatement cost curves for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  9. Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their resp...

  10. Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and fuel switching

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their rela...

  11. Clinical outcomes of patients with hypothyroidism undergoing percutaneous coronary intervention

    PubMed Central

    Zhang, Ming; Sara, Jaskanwal D.S.; Matsuzawa, Yasushi; Gharib, Hossein; Bell, Malcolm R.; Gulati, Rajiv; Lerman, Lilach O.

    2016-01-01

    Aims The aim of this study was to investigate the association between hypothyroidism and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Methods and results Two thousand four hundred and thirty patients who underwent PCI were included. Subjects were divided into two groups: hypothyroidism (n = 686) defined either as a history of hypothyroidism or thyroid-stimulating hormone (TSH) ≥5.0 mU/mL, and euthyroidism (n = 1744) defined as no history of hypothyroidism and/or 0.3 mU/mL ≤ TSH < 5.0 mU/mL. Patients with hypothyroidism were further categorized as untreated (n = 193), or those taking thyroid replacement therapy (TRT) with adequate replacement (0.3 mU/mL ≤ TSH < 5.0 mU/mL, n = 175) or inadequate replacement (TSH ≥ 5.0 mU/mL, n = 318). Adjusted hazard ratios (HRs) were calculated using Cox proportional hazards models. Median follow-up was 3.0 years (interquartile range, 0.5–7.0). After adjustment for covariates, the risk of MACCE and its constituent parts was higher in patients with hypothyroidism compared with those with euthyroidism (MACCE: HR: 1.28, P = 0.0001; myocardial infarction (MI): HR: 1.25, P = 0.037; heart failure: HR: 1.46, P = 0.004; revascularization: HR: 1.26, P = 0.0008; stroke: HR: 1.62, P = 0.04). Compared with untreated patients or those with inadequate replacement, adequately treated hypothyroid patients had a lower risk of MACCE (HR: 0.69, P = 0.005; HR: 0.78, P = 0.045), cardiac death (HR: 0.43, P = 0.008), MI (HR: 0.50, P = 0.0004; HR: 0.60, P = 0.02), and heart failure (HR: 0.50, P = 0.02; HR: 0.52, P = 0.017). Conclusion Hypothyroidism is associated with a higher incidence of MACCE compared with euthyroidism in patients undergoing PCI. Maintaining adequate control on TRT is beneficial in preventing MACCE. PMID:26757789

  12. Differential Event Rates and Independent Predictors of Long-Term Major Cardiovascular Events and Death in 5795 Patients With Unprotected Left Main Coronary Artery Disease Treated With Stents, Bypass Surgery, or Medication: Insights From a Large International Multicenter Registry.

    PubMed

    Kang, Se Hun; Ahn, Jung-Min; Lee, Cheol Hyun; Lee, Pil Hyung; Kang, Soo-Jin; Lee, Seung-Whan; Kim, Young-Hak; Lee, Cheol Whan; Park, Seong-Wook; Park, Duk-Woo; Park, Seung-Jung

    2017-07-01

    Identifying predictive factors for major cardiovascular events and death in patients with unprotected left main coronary artery disease is of great clinical value for risk stratification and possible guidance for tailored preventive strategies. The Interventional Research Incorporation Society-Left MAIN Revascularization registry included 5795 patients with unprotected left main coronary artery disease (percutaneous coronary intervention, n=2850; coronary-artery bypass grafting, n=2337; medication alone, n=608). We analyzed the incidence and independent predictors of major adverse cardiac and cerebrovascular events (MACCE; a composite of death, MI, stroke, or repeat revascularization) and all-cause mortality in each treatment stratum. During follow-up (median, 4.3 years), the rates of MACCE and death were substantially higher in the medical group than in the percutaneous coronary intervention and coronary-artery bypass grafting groups ( P <0.001). In the percutaneous coronary intervention group, the 3 strongest predictors for MACCE were chronic renal failure, old age (≥65 years), and previous heart failure; those for all-cause mortality were chronic renal failure, old age, and low ejection fraction. In the coronary-artery bypass grafting group, old age, chronic renal failure, and low ejection fraction were the 3 strongest predictors of MACCE and death. In the medication group, old age, low ejection fraction, and diabetes mellitus were the 3 strongest predictors of MACCE and death. Among patients with unprotected left main coronary artery disease, the key clinical predictors for MACCE and death were generally similar regardless of index treatment. This study provides effect estimates for clinically relevant predictors of long-term clinical outcomes in real-world left main coronary artery patients, providing possible guidance for tailored preventive strategies. URL: https://clinicaltrials.gov. Unique identifier: NCT01341327. © 2017 American Heart Association, Inc.

  13. Severity of OSAS, CPAP and cardiovascular events: A follow-up study.

    PubMed

    Baratta, Francesco; Pastori, Daniele; Fabiani, Mario; Fabiani, Valerio; Ceci, Fabrizio; Lillo, Rossella; Lolli, Valeria; Brunori, Marco; Pannitteri, Gaetano; Cravotto, Elena; De Vito, Corrado; Angelico, Francesco; Del Ben, Maria

    2018-05-01

    Previous studies suggested obstructive sleep apnoea syndrome (OSAS) as a major risk factor for incident cardiovascular events. However, the relationship between OSAS severity, the use of continuous positive airway pressure (CPAP) treatment and the development of cardiovascular disease is still a matter of debate. The aim was to test the association between OSAS and cardiovascular events in patients with concomitant cardio-metabolic diseases and the potential impact of CPAP therapy on cardiovascular outcomes. This was a prospective observational cohort study of consecutive outpatients with suspected metabolic disorders who had a complete clinical and biochemical workup, including polysomnography because of heavy snoring and possible OSAS. The primary endpoint was a composite of major adverse cardiovascular and cerebrovascular events (MACCE). Median follow-up was 81.3 months, including 434 patients (2701.2 person/years); 83 had primary snoring, and 84 had mild, 93 moderate and 174 severe OSAS, respectively. The incidence of MACCE was 0.8% per year (95% confidence interval [CI] 0.2-2.1) in primary snorers and 2.1% per year (95% CI 1.5-2.8) for those with OSAS. A positive association was observed between event-free survival and OSAS severity (log-rank test; P = .041). A multivariable Cox regression analysis showed obesity (HR = 8.011, 95% CI 1.071-59.922, P = .043), moderate OSAS (vs non-OSAS HR = 3.853, 95% CI 1.069-13.879, P = .039) and severe OSAS (vs non-OSAS HR = 3.540, 95% CI 1.026-12.217, P = .045) as predictors of MACCE. No significant association was observed between CPAP treatment and MACCE (log-rank test; P = .227). Our findings support the role of moderate/severe OSAS as a risk factor for incident MACCE. CPAP treatment was not associated with a lower rate of MACCE. © 2018 Stichting European Society for Clinical Investigation Journal Foundation.
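
    A minimal sketch of the kind of time-to-event analysis described above (using the lifelines package; the column names, toy follow-up times and covariates are hypothetical, not the study data) might look like this:

        # Sketch: Cox proportional hazards model for MACCE-free survival.
        # The data frame and its values are hypothetical placeholders.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "months_followup": [12, 40, 81, 60, 24, 75, 33, 81, 50, 18],
            "macce":           [1,  0,  0,  1,  1,  0,  0,  0,  1,  1],   # 1 = event occurred
            "severe_osas":     [1,  0,  0,  1,  1,  0,  1,  0,  1,  0],   # 1 = severe OSAS
            "obesity":         [1,  0,  1,  1,  0,  0,  0,  0,  1,  1],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months_followup", event_col="macce")
        cph.print_summary()   # hazard ratios with confidence intervals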

  14. A Near-real-time Data Transport System for Selected Stations in the Magnetometer Array for Cusp and Cleft Studies (MACCS)

    NASA Astrophysics Data System (ADS)

    Engebretson, M. J.; Valentic, T. A.; Stehle, R. H.; Hughes, W. J.

    2004-05-01

    The Magnetometer Array for Cusp and Cleft Studies (MACCS) is a two-dimensional array of eight fluxgate magnetometers that was established in 1992-1993 in the Eastern Canadian Arctic from 75° to over 80° MLAT to study electrodynamic interactions between the solar wind and Earth's magnetosphere and high-latitude ionosphere. A ninth site in Nain, Labrador, extends coverage down to 66° between existing Canadian and Greenland stations. Originally designed as part of NSF's GEM (Geospace Environment Modeling) Program, MACCS has contributed to the study of transients and waves at the magnetospheric boundary and in the near-cusp region as well as to large, cooperative studies of ionospheric convection and substorm processes. Because of the limitations of existing telephone lines to each site, it has not been possible to economically access MACCS data promptly; instead, each month's collected data is recorded and mailed to the U.S. for processing and eventual posting on a publicly-accessible web site, http://space.augsburg.edu/space. As part of its recently renewed funding, NSF has supported the development of a near-real-time data transport system using the Iridium satellite network, which will be implemented at two MACCS sites in summer 2004. At the core of the new MACCS communications system is the Data Transport Network, software developed with NSF-ITR funding to automate the transfer of scientific data from remote field stations over unreliable, bandwidth-constrained network connections. The system utilizes a store-and-forward architecture based on sending data files as attachments to Usenet messages. This scheme not only isolates the instruments from network outages, but also provides a consistent framework for organizing and accessing multiple data feeds. Client programs are able to subscribe to data feeds to perform tasks such as system health monitoring, data processing, web page updates and e-mail alerts. The MACCS sites will employ the Data Transport Network on a small local Linux-based computer connected to an Iridium transceiver. Between 3 and 5 Mb of data per day will be collected from the magnetometers and delivered in near-real time for automatic distribution to modelers and index developers. More information about the Data Transport Network can be found at http://transport.sri.com/TransportDevel.
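
    A generic store-and-forward pattern in the spirit of the system described above might be sketched as follows (this is not the SRI Data Transport Network code; the spool paths, the upload hook and the retry policy are illustrative assumptions):

        # Generic store-and-forward sketch: data files are spooled locally and
        # forwarded when the link is available; failures leave files queued.
        import shutil
        from pathlib import Path

        SPOOL = Path("/var/spool/maccs_outgoing")   # hypothetical spool directory
        SENT = Path("/var/spool/maccs_sent")

        def queue_data_file(path: Path) -> None:
            """Stage a newly written magnetometer file for later transmission."""
            SPOOL.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, SPOOL / path.name)

        def flush_spool(upload) -> None:
            """Try to forward every spooled file; keep files that fail for the next pass."""
            SENT.mkdir(parents=True, exist_ok=True)
            for f in sorted(SPOOL.glob("*")):
                try:
                    upload(f)                       # e.g. post over the satellite link
                    shutil.move(str(f), SENT / f.name)
                except OSError:
                    pass                            # link down: leave the file queued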

  15. The Interplay of Al and Mg Speciation in Advanced Mg Battery Electrolyte Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    See, Kimberly A.; Chapman, Karena W.; Zhu, Lingyang

    2016-01-13

    Mg batteries are an attractive alternative to Li-based energy storage due to the possibility of higher volumetric capacities with the added advantage of using sustainable materials. A promising emerging electrolyte for Mg batteries is the magnesium aluminum chloride complex (MACC) which shows high Mg electrodeposition and stripping efficiencies and relatively high anodic stabilities. As prepared, MACC is inactive with respect to Mg deposition; however, efficient Mg electrodeposition can be achieved following an electrolytic conditioning process. Through the use of Raman spectroscopy, surface enhanced Raman spectroscopy, 27Al and 35Cl nuclear magnetic resonance spectroscopy, and pair distribution function analysis, we explore the active vs inactive complexes in the MACC electrolyte and demonstrate the codependence of Al and Mg speciation. These techniques report on significant changes occurring in the bulk speciation of the conditioned electrolyte relative to the as-prepared solution. Analysis shows that the active Mg complex in conditioned MACC is very likely the [Mg2(μ–Cl)3·6THF]+ complex that is observed in the solid state structure. Additionally, conditioning creates free Cl– in the electrolyte solution, and we suggest the free Cl– adsorbs at the electrode surface to enhance Mg electrodeposition.

  16. Effect of growth phase on the fatty acid compositions of four species of marine diatoms

    NASA Astrophysics Data System (ADS)

    Liang, Ying; Mai, Kangsen

    2005-04-01

    The fatty acid compositions of four species of marine diatoms (Chaetoceros gracilis MACC/B13, Cylindrotheca fusiformis MACC/B211, Phaeodactylum tricornutum MACC/B221 and Nitzschia closterium MACC/B222), cultivated at 22°C ± 1°C at a salinity of 28 in f/2 medium and harvested in the exponential growth phase, the early stationary phase and the late stationary phase, were determined. The results showed that growth phase has a significant effect on most fatty acid contents in the four species of marine diatoms. The proportions of 16:0 and 16:1n-7 fatty acids increased while those of 16:3n-4 and eicosapentaenoic acid (EPA) decreased with increasing culture age in all species studied. The subtotal of saturated fatty acids (SFA) increased with increasing culture age in all species, with the exception of B13. The subtotal of monounsaturated fatty acids (MUFA) increased while that of polyunsaturated fatty acids (PUFA) decreased with culture age in the four species of marine diatoms. MUFA reached their lowest value in the exponential growth phase, whereas PUFA reached their highest value in the same phase.

  17. Air Support Control Officer Individual Position Training Simulation

    DTIC Science & Technology

    2017-06-01

    Analysis design development implementation evaluation ASCO Air support control officer ASLT Air support liaison team ASNO Air support net operator...Instructional system design LSTM Long short-term memory MACCS Marine Air Command and Control System MAGTF Marine Air Ground Task Force MASS Marine Air...information to designated MACCS agencies. ASCOs play an important part in facilitating the safe and successful conduct of air operations in DASC-controlled

  18. Clinical outcomes of patients with hypothyroidism undergoing percutaneous coronary intervention.

    PubMed

    Zhang, Ming; Sara, Jaskanwal D S; Matsuzawa, Yasushi; Gharib, Hossein; Bell, Malcolm R; Gulati, Rajiv; Lerman, Lilach O; Lerman, Amir

    2016-07-07

    The aim of this study was to investigate the association between hypothyroidism and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Two thousand four hundred and thirty patients who underwent PCI were included. Subjects were divided into two groups: hypothyroidism (n = 686) defined either as a history of hypothyroidism or thyroid-stimulating hormone (TSH) ≥5.0 mU/mL, and euthyroidism (n = 1744) defined as no history of hypothyroidism and/or 0.3 mU/mL ≤ TSH < 5.0 mU/mL. Patients with hypothyroidism were further categorized as untreated (n = 193), or those taking thyroid replacement therapy (TRT) with adequate replacement (0.3 mU/mL ≤ TSH < 5.0 mU/mL, n = 175) or inadequate replacement (TSH ≥ 5.0 mU/mL, n = 318). Adjusted hazard ratios (HRs) were calculated using Cox proportional hazards models. Median follow-up was 3.0 years (interquartile range, 0.5-7.0). After adjustment for covariates, the risk of MACCE and its constituent parts was higher in patients with hypothyroidism compared with those with euthyroidism (MACCE: HR: 1.28, P = 0.0001; myocardial infarction (MI): HR: 1.25, P = 0.037; heart failure: HR: 1.46, P = 0.004; revascularization: HR: 1.26, P = 0.0008; stroke: HR: 1.62, P = 0.04). Compared with untreated patients or those with inadequate replacement, adequately treated hypothyroid patients had a lower risk of MACCE (HR: 0.69, P = 0.005; HR: 0.78, P = 0.045), cardiac death (HR: 0.43, P = 0.008), MI (HR: 0.50, P = 0.0004; HR: 0.60, P = 0.02), and heart failure (HR: 0.50, P = 0.02; HR: 0.52, P = 0.017). Hypothyroidism is associated with a higher incidence of MACCE compared with euthyroidism in patients undergoing PCI. Maintaining adequate control on TRT is beneficial in preventing MACCE. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.

  19. The Minnesota Adolescent Community Cohort Study: Design and Baseline Results

    PubMed Central

    Forster, Jean; Chen, Vincent; Perry, Cheryl; Oswald, John; Willmorth, Michael

    2014-01-01

    The Minnesota Adolescent Community Cohort (MACC) Study is a population-based, longitudinal study that enrolled 3636 youth from Minnesota and 605 youth from comparison states age 12 to 16 years in 2000–2001. Participants have been surveyed by telephone semi-annually about their tobacco-related attitudes and behaviors. The goals of the study are to evaluate the effects of the Minnesota Youth Tobacco Prevention Initiative and its shutdown on youth smoking patterns, and to better define the patterns of development of tobacco use in adolescents. A multilevel sample was constructed representing individuals, local jurisdictions and the entire state, and data are collected to characterize each of these levels. This paper presents the details of the multilevel study design. We also provide baseline information about MACC participants including demographics and tobacco-related attitudes and behaviors. This paper describes smoking prevalence at the local level, and compares MACC participants to the state as a whole. PMID:21360063

  20. Marginal abatement cost curves for NOx that account for ...

    EPA Pesticide Factsheets

    A marginal abatement cost curve (MACC) traces out the relationship between the quantity of pollution abated and the marginal cost of abating each additional unit. In the context of air quality management, MACCs typically are developed by sorting end-of-pipe controls by their respective cost effectiveness. Alternative measures, such as renewable electricity, energy efficiency, and fuel switching (RE/EE/FS), are not considered as it is difficult to quantify their abatement potential. In this paper, we demonstrate the use of an energy system model to develop a MACC for nitrogen oxides (NOx) that incorporates both end-of-pipe controls and these alternative measures. We decompose the MACC by sector, and evaluate the cost-effectiveness of RE/EE/FS relative to end-of-pipe controls. RE/EE/FS are shown to produce considerable emission reductions after end-of-pipe controls have been exhausted. Furthermore, some RE/EE/FS are shown to be cost-competitive with end-of-pipe controls. Demonstrate how the MARKAL energy system model can be used to evaluate the potential role of renewable electricity, energy efficiency and fuel switching (RE/EE/FS) in achieving NOx reductions. For this particular analysis, we show that RE/EE/FSs are able to increase the quantity of NOx reductions available for a particular marginal cost (ranging from $5k per ton to $40k per ton) by approximately 50%.

  1. Prognostic Implications of Dual Platelet Reactivity Testing in Acute Coronary Syndrome.

    PubMed

    de Carvalho, Leonardo P; Fong, Alan; Troughton, Richard; Yan, Bryan P; Chin, Chee-Tang; Poh, Sock-Cheng; Mejin, Melissa; Huang, Nancy; Seneviratna, Aruni; Lee, Chi-Hang; Low, Adrian F; Tan, Huay-Cheem; Chan, Siew-Pang; Frampton, Christopher; Richards, A Mark; Chan, Mark Y

    2018-02-01

    Studies on platelet reactivity (PR) testing commonly test PR only after percutaneous coronary intervention (PCI) has been performed. There are few data on pre- and post-PCI testing. Data on simultaneous testing of aspirin and adenosine diphosphate antagonist response are conflicting. We investigated the prognostic value of combined serial assessments of high on-aspirin PR (HASPR) and high on-adenosine diphosphate receptor antagonist PR (HADPR) in patients with acute coronary syndrome (ACS). HASPR and HADPR were assessed in 928 ACS patients before (initial test) and 24 hours after (final test) coronary angiography, with or without revascularization. Patients with HASPR on the initial test, compared with those without, had significantly higher intraprocedural thrombotic events (IPTE) (8.6 vs. 1.2%, p ≤ 0.001) and higher 30-day major adverse cardiovascular and cerebrovascular events (MACCE; 5.2 vs. 2.3%, p = 0.05), but not 12-month MACCE (13.0 vs. 15.1%, p = 0.50). Patients with initial HADPR, compared with those without, had significantly higher IPTE (4.4 vs. 0.9%, p = 0.004), but not 30-day (3.5 vs. 2.3%, p = 0.32) or 12-month MACCE (14.0 vs. 12.5%, p = 0.54). The c-statistic of the Global Registry of Acute Coronary Events (GRACE) score alone, GRACE score + ASPR test and GRACE score + ADPR test for discriminating 30-day MACCE was 0.649, 0.803 and 0.757, respectively. Final ADPR was associated with 30-day MACCE among patients with intermediate-to-high GRACE score (adjusted odds ratio [OR]: 4.50, 95% confidence interval [CI]: 1.14-17.66), but not low GRACE score (adjusted OR: 1.19, 95% CI: 0.13-10.79). In conclusion, both HASPR and HADPR predict ischaemic events in ACS. This predictive utility is time-dependent and risk-dependent. Schattauer GmbH Stuttgart.
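
    As a small illustration of how a discrimination c-statistic of this kind is computed (a sketch only; the risk scores and outcomes below are hypothetical, not registry data), the area under the ROC curve can be obtained directly:

        # Sketch: c-statistic (area under the ROC curve) for a risk score vs. 30-day MACCE.
        # The risk scores and outcomes below are hypothetical placeholders.
        from sklearn.metrics import roc_auc_score

        risk_score  = [112, 95, 140, 88, 150, 101, 123, 99, 160, 105]  # e.g. GRACE-like scores
        macce_30day = [0,   0,  1,   0,  1,   0,   0,   0,  1,   0]    # 1 = event within 30 days

        print("c-statistic:", roc_auc_score(macce_30day, risk_score))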

  2. microRNA-598 inhibits cell proliferation and invasion of glioblastoma by directly targeting metastasis associated in colon cancer-1.

    PubMed

    Wang, Ning; Zhang, Yang; Liang, Huaxin

    2018-02-14

    The dysregulation of microRNA (miRNA) expression is closely related to tumorigenesis and tumour development in glioblastoma (GBM). In this study, we found that miRNA-598 (miR-598) expression was significantly downregulated in GBM tissues and cell lines. Restoring miR-598 expression inhibited cell proliferation and invasion in GBM. Moreover, we validated that metastasis associated in colon cancer-1 (MACC1) is a novel target of miR-598 in GBM. Recovered MACC1 expression reversed the inhibitory effects of miR-598 overexpression on GBM cells. In addition, miR-598 overexpression suppressed activation of the Met/AKT pathway in GBM. Our results provided compelling evidence that miR-598 plays tumour-suppressive roles in GBM and that its anti-oncogenic effects are mediated chiefly through the direct suppression of MACC1 expression and regulation of the Met/AKT signalling pathway. Therefore, miR-598 is a potential target in the treatment of GBM.

  3. Anterior Cingulate Glutamate Is Reduced by Acamprosate Treatment in Patients With Alcohol Dependence.

    PubMed

    Frye, Mark A; Hinton, David J; Karpyak, Victor M; Biernacka, Joanna M; Gunderson, Lee J; Feeder, Scott E; Choi, Doo-Sup; Port, John D

    2016-12-01

    Although the precise drug mechanism of action of acamprosate remains unclear, its antidipsotropic effect is mediated in part through glutamatergic neurotransmission. We evaluated the effect of 4 weeks of acamprosate treatment in a cohort of 13 subjects with alcohol dependence (confirmed by a structured interview, Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) on proton magnetic resonance spectroscopy glutamate levels in the midline anterior cingulate cortex (MACC). We compared levels of metabolites with a group of 16 healthy controls. The Pennsylvania Alcohol Craving Scale was used to assess craving intensity. At baseline, before treatment, the mean cerebrospinal fluid-corrected MACC glutamate (Glu) level was significantly elevated in subjects with alcohol dependence compared with controls (P = 0.004). Four weeks of acamprosate treatment reduced glutamate levels (P = 0.025), an effect that was not observed in subjects who did not take acamprosate. At baseline, there was a significant positive correlation between cravings, measured by the Pennsylvania Alcohol Craving Scale, and MACC (Glu) levels (P = 0.019). Overall, these data would suggest a normalizing effect of acamprosate on a hyperglutamatergic state observed in recently withdrawn patients with alcohol dependence and a positive association between MACC glutamate levels and craving intensity in early abstinence. Further research is needed to evaluate the use of these findings for clinical practice, including monitoring of craving intensity and individualized selection of treatment with antidipsotropic medications in subjects with alcohol dependence.

  4. Profiling of Resistance Patterns & Oncogenic Signaling Pathways in Evaluation of Cancers of the Thorax and Therapeutic Target Identification

    DTIC Science & Technology

    2010-06-01

    mutation signature is prognostic in EGFR wild-type lung adenocarcinomas and identifies Metastasis associated in colon cancer 1 (MACC1) as an EGFR...T790M mutation (N=7, blue curve) (AUC: area under the curve). Figure 3. EGFR dependency signature is a favorable prognostic factor. EGFR index...developed. The signature was shown to be prognostic regardless of EGFR status. The results also suggest MACC1 to be a regulator of MET in NSCLC

  5. Evaluation of the MACC operational forecast system - potential and challenges of global near-real-time modelling with respect to reactive gases in the troposphere

    NASA Astrophysics Data System (ADS)

    Wagner, A.; Blechschmidt, A.-M.; Bouarar, I.; Brunke, E.-G.; Clerbaux, C.; Cupeiro, M.; Cristofanelli, P.; Eskes, H.; Flemming, J.; Flentje, H.; George, M.; Gilge, S.; Hilboll, A.; Inness, A.; Kapsomenakis, J.; Richter, A.; Ries, L.; Spangl, W.; Stein, O.; Weller, R.; Zerefos, C.

    2015-12-01

    The Monitoring Atmospheric Composition and Climate (MACC) project represents the European Union's Copernicus Atmosphere Monitoring Service (CAMS) (http://www.copernicus.eu/), which became fully operational during 2015. The global near-real-time MACC model production run for aerosol and reactive gases provides daily analyses and 5-day forecasts of atmospheric composition fields. It is the only assimilation system worldwide that is operational to produce global analyses and forecasts of reactive gases and aerosol fields. We have investigated the ability of the MACC analysis system to simulate tropospheric concentrations of reactive gases covering the period between 2009 and 2012. A validation was performed based on carbon monoxide (CO), nitrogen dioxide (NO2) and ozone (O3) surface observations from the Global Atmosphere Watch (GAW) network, the O3 surface observations from the European Monitoring and Evaluation Programme (EMEP) and, furthermore, NO2 tropospheric columns, as well as CO total columns, derived from satellite sensors. The MACC system proved capable of reproducing reactive gas concentrations with consistent quality; however, with a seasonally dependent bias compared to surface and satellite observations - for northern hemispheric surface O3 mixing ratios, positive biases appear during the warm seasons and negative biases during the cold parts of the year, with monthly modified normalised mean biases (MNMBs) ranging between -30 and 30 % at the surface. Model biases are likely to result from difficulties in the simulation of vertical mixing at night and deficiencies in the model's dry deposition parameterisation. Observed tropospheric columns of NO2 and CO could be reproduced correctly during the warm seasons, but are mostly underestimated by the model during the cold seasons, when anthropogenic emissions are at their highest level, especially over the US, Europe and Asia. Monthly MNMBs of the satellite data evaluation range from values between -110 and 40 % for NO2 and at most -20 % for CO, over the investigated regions. The underestimation is likely to result from a combination of errors concerning the dry deposition parameterisation and certain limitations in the current emission inventories, together with an insufficiently established seasonality in the emissions.
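
    For reference, the modified normalised mean bias used in this kind of validation is commonly defined as MNMB = 2/N * sum((f_i - o_i) / (f_i + o_i)), expressed in percent; a minimal sketch (with hypothetical monthly mean values, not the study data) is:

        # Sketch: modified normalised mean bias (MNMB) between forecasts and observations,
        # as commonly defined in this validation literature. Sample values are hypothetical.
        import numpy as np

        def mnmb(forecast, observed):
            f, o = np.asarray(forecast, float), np.asarray(observed, float)
            return 100.0 * 2.0 * np.mean((f - o) / (f + o))

        print(mnmb([42.0, 55.0, 38.0], [40.0, 60.0, 35.0]))  # illustrative monthly mean O3 values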

  6. Angiographic outcomes following stenting or coronary artery bypass surgery of the left main coronary artery: fifteen-month outcomes from the synergy between PCI with TAXUS express and cardiac surgery left main angiographic substudy (SYNTAX-LE MANS).

    PubMed

    Morice, Marie-Claude; Feldman, Ted E E; Mack, Michael J; Ståhle, Elisabeth; Holmes, David R; Colombo, Antonio; Morel, Marie-Angèle; van den Brand, Marcel; Serruys, Patrick W; Mohr, Friedrich; Carrié, Didier; Fournial, Gérard; James, Stefan; Leadley, Katrin; Dawkins, Keith D; Kappetein, A Pieter

    2011-10-30

    The SYNTAX-LE MANS substudy prospectively evaluated 15-month angiographic and clinical outcomes in patients with treated left main (LM) disease. In the SYNTAX trial, 1,800 patients with three-vessel and/or LM disease were randomised to either CABG or PCI; of these, 271 LM patients were prospectively assigned to receive a 15-month angiogram. The primary endpoint for the CABG arm was the ratio of ≥50% to <100% obstructed/occluded grafts bypassing LM lesions to the number placed. The primary endpoint for the PCI arm was the proportion of patients with ≤50% diameter stenosis ('patent' stents) of treated LM lesions. Per protocol, no formal comparison between CABG and PCI arms was intended based on the differing primary endpoints. Available 15-month angiograms were analysed for 114 CABG and 149 PCI patients. At 15 months, 9.9% (26/263) of CABG grafts were 100% occluded and an additional 5.7% (15/263) were ≥50% to <100% occluded. Overall, 27.2% (31/114) of patients had ≥1 obstructed/occluded graft. The 15-month CABG MACCE rate was 8.8% (10/114) and MACCE at 15 months was not significantly associated with graft obstruction/occlusion (p=0.85). In the PCI arm, 92.4% (134/145) of patients had ≤50% diameter LM stenosis at 15 months (89.7% [87/97] distal LM lesions and 97.9% [47/48] non-distal LM lesions). The 15-month PCI MACCE rate was 12.8% (20/156) and this was significantly associated with lack of stent patency at 15 months (p<0.001), mainly due to repeat revascularisation. At 15 months, 15.6% (41/263) of grafts were at least 50% obstructed but this was not significantly associated with MACCE; 92.4% (134/145) of patients had stents that remained patent at 15 months, and stent restenosis was significantly associated with MACCE, predominantly due to revascularisation.

  7. 2-year results of the AUTAX (Austrian Multivessel TAXUS-Stent) registry beyond the SYNTAX (synergy between percutaneous coronary intervention with TAXUS and cardiac surgery) study.

    PubMed

    Gyöngyösi, Mariann; Christ, Günter; Lang, Irene; Kreiner, Gerhard; Sochor, Heinz; Probst, Peter; Neunteufl, Thomas; Badr-Eslam, Rosa; Winkler, Susanne; Nyolczas, Noemi; Posa, Aniko; Leisch, Franz; Karnik, Ronald; Siostrzonek, Peter; Harb, Stefan; Heigert, Matthias; Zenker, Gerald; Benzer, Werner; Bonner, Gerhard; Kaider, Alexandra; Glogar, Dietmar

    2009-08-01

    The multicenter AUTAX (Austrian Multivessel TAXUS-Stent) registry investigated the 2-year clinical/angiographic outcomes of patients with multivessel coronary artery disease after implantation of TAXUS Express stents (Boston Scientific, Natick, Massachusetts), in a "real-world" setting. The AUTAX registry included patients with 2- or 3-vessel disease, with/without previous percutaneous coronary intervention (PCI) and concomitant surgery. Patients (n = 441; 64 ± 12 years; 78% men; 1,080 lesions) with possible complete revascularization by PCI were prospectively included. Median clinical follow-up was 753 (quartiles 728 to 775) days after PCI in 95.7% of patients, with control angiography in 78% at 6 months. The primary end point was the composite of major adverse cardiac (nonfatal acute myocardial infarction [AMI], all-cause mortality, target lesion revascularization [TLR]) and cerebrovascular events (MACCE). Potential risk factor effects on 2-year MACCE were evaluated using Cox regression. Complete revascularization was successful in 90.5%, with left main PCI in 6.8%. Rates of acute, subacute, and late stent thrombosis were 0.7%, 0.5%, and 0.5%. Two-year follow-up identified AMI (1.4%), death (3.6%), stroke (0.2%), and TLR (13.1%), for a composite MACCE of 18.3%. The binary restenosis rate was 10.8%. The median cumulative SYNTAX score was 23.0 (range 12.0 to 56.5). The SYNTAX score did not predict TLR or MACCE, due to lack of scoring of restenotic or bypass stenoses (29.8%). Age (hazard ratio [HR]: 1.03, p = 0.019) and acute coronary syndrome (HR: 2.1, p = 0.001) were significant predictors of 2-year MACCE. Incomplete revascularization predicted death or AMI (HR: 3.84, p = 0.002). With the aim of complete revascularization, TAXUS stent implantations can be safe for patients with multivessel disease. The AUTAX registry, including patients with post-PCI lesions, provides additional information to the SYNTAX (Synergy Between Percutaneous Coronary Intervention With TAXUS and Cardiac Surgery) study. (Austrian Multivessel TAXUS-Stent Registry; NCT00738686).

  8. Influence of sleep-disordered breathing assessed by pulse oximetry on long-term clinical outcomes in patients who underwent percutaneous coronary intervention.

    PubMed

    Yatsu, Shoichiro; Naito, Ryo; Kasai, Takatoshi; Matsumoto, Hiroki; Shitara, Jun; Shimizu, Megumi; Murata, Azusa; Kato, Takao; Suda, Shoko; Hiki, Masaru; Sai, Eiryu; Miyauchi, Katsumi; Daida, Hiroyuki

    2018-03-31

    Sleep-disordered breathing (SDB) has been recognized as an important risk factor for coronary artery disease (CAD). However, SDB is often not fully evaluated because access to formal sleep studies is limited. Nocturnal pulse oximetry has been suggested to be a useful tool for evaluating SDB. Therefore, the aim of this study was to investigate the influence of SDB assessed by nocturnal pulse oximetry on clinical outcomes in patients who underwent percutaneous coronary intervention (PCI). We conducted a prospective, multicenter, observational cohort study, wherein SDB was assessed by finger pulse oximetry in patients who underwent PCI from January 2014 to December 2016. SDB was defined as a 4% oxygen desaturation index of 5 or higher. The primary endpoint was major adverse cardiac or cerebrovascular event (MACCE), defined as a composite of all-cause mortality, acute coronary syndrome, and/or stroke. Of 539 patients, 296 (54.9%) had SDB. MACCE occurred in 32 patients (5.8%) during a median follow-up of 1.9 years. The cumulative incidence of MACCE was significantly higher in patients with SDB (P = 0.0134). In the stepwise multivariable Cox proportional hazards model, the presence of SDB was a significant predictor of MACCE (hazard ratio 2.26; 95% confidence interval 1.05-5.4, P = 0.036). SDB determined by nocturnal pulse oximetry was associated with worse clinical outcomes in patients who underwent PCI. Screening for SDB with nocturnal pulse oximetry was considered to be important for risk stratification in patients with CAD.

  9. Access to MISR Aerosol Data and Imagery for the GoMACCS Field Study

    NASA Astrophysics Data System (ADS)

    Ritchey, N.; Watkinson, T.; Davis, J.; Walter, J.; Protack, S.; Matthews, J.; Smyth, M.; Rheingans, B.; Gaitley, B.; Ferebee, M.; Haberer, S.

    2006-12-01

    NASA Langley Atmospheric Science Data Center (ASDC) and NASA Jet Propulsion Laboratory (JPL) Multi-angle Imaging SpectroRadiometer (MISR) teams collaborated to provide special data products and images in an innovative approach for the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) field campaign. GoMACCS was an intensive field study focused on providing a better understanding of the sources and atmospheric processes responsible for the formation and distribution of ozone and aerosols in the atmosphere and the influence that these species have on the radiative forcing of regional and global climate, as well as their impact on human health and regional haze. The study area encompassed Texas and the northwestern Gulf of Mexico. Numerous U.S. Government agencies, universities and commercial entities participated in the field campaign which occurred August through September 2006. Aerosol and meteorological measurements were provided by a network of instruments on land, buoys and ships, by airborne in situ and remote instruments, and by satellite retrievals. MISR's role in GoMACCS was to provide satellite retrievals of aerosols and cloud properties and imagery as quickly as possible after data acquisition. The diverse group of scientific participants created unique opportunities for ASDC and MISR to develop special data products and images that were easily accessible by all participants. Examples of the data products, images and access methods as well as the data and imagery flow will be presented. Additional information about ASDC and MISR is available from the following web sites, http://eosweb.larc.nasa.gov and http://www-misr.jpl.nasa.gov/.

  10. A regional air quality forecasting system over Europe: the MACC-II daily ensemble production

    NASA Astrophysics Data System (ADS)

    Marécal, V.; Peuch, V.-H.; Andersson, C.; Andersson, S.; Arteta, J.; Beekmann, M.; Benedictow, A.; Bergström, R.; Bessagnet, B.; Cansado, A.; Chéroux, F.; Colette, A.; Coman, A.; Curier, R. L.; Denier van der Gon, H. A. C.; Drouin, A.; Elbern, H.; Emili, E.; Engelen, R. J.; Eskes, H. J.; Foret, G.; Friese, E.; Gauss, M.; Giannaros, C.; Guth, J.; Joly, M.; Jaumouillé, E.; Josse, B.; Kadygrov, N.; Kaiser, J. W.; Krajsek, K.; Kuenen, J.; Kumar, U.; Liora, N.; Lopez, E.; Malherbe, L.; Martinez, I.; Melas, D.; Meleux, F.; Menut, L.; Moinat, P.; Morales, T.; Parmentier, J.; Piacentini, A.; Plu, M.; Poupkou, A.; Queguiner, S.; Robertson, L.; Rouïl, L.; Schaap, M.; Segers, A.; Sofiev, M.; Thomas, M.; Timmermans, R.; Valdebenito, Á.; van Velthoven, P.; van Versendaal, R.; Vira, J.; Ung, A.

    2015-03-01

    This paper describes the pre-operational analysis and forecasting system developed during the MACC (Monitoring Atmospheric Composition and Climate) and MACC-II (Monitoring Atmospheric Composition and Climate: Interim Implementation) European projects to provide air quality services for the European continent. The paper gives an overall picture of its status at the end of MACC-II (summer 2014). This system is based on seven state-of-the-art models developed and run in Europe (CHIMERE, EMEP, EURAD-IM, LOTOS-EUROS, MATCH, MOCAGE and SILAM). These models are used to calculate multi-model ensemble products. The MACC-II system provides daily 96 h forecasts with hourly outputs of 10 chemical species/aerosols (O3, NO2, SO2, CO, PM10, PM2.5, NO, NH3, total NMVOCs and PAN + PAN precursors) over 8 vertical levels from the surface to 5 km height. The hourly analysis at the surface is done a posteriori for the past day using a selection of representative air quality data from European monitoring stations. The performance of the system is assessed daily, weekly and every three months (seasonally) through statistical indicators calculated using the available representative air quality data from European monitoring stations. Results for a case study show the ability of the median ensemble to forecast regional ozone pollution events. The time period of this case study is also used to illustrate that the median ensemble generally outperforms each of the individual models and that it is still robust even if two of the seven models are missing. The seasonal performance of the individual models and of the multi-model ensemble has been monitored since September 2009 for ozone, NO2 and PM10 and shows an overall improvement over time. Changes in the skill of the ensemble over the past two summers for ozone and the past two winters for PM10 are discussed in the paper. While the evolution of the ozone scores is not significant, there are improvements for PM10 over the past two winters that can be at least partly attributed to new aerosol developments in the seven individual models. Nevertheless, the year-to-year changes in model and ensemble skill are also linked to the variability of the meteorological conditions and of the set of observations used to calculate the statistical indicators. In parallel, a scientific analysis of the results of the seven models and of the ensemble is also done over the Mediterranean area because of the specificity of its meteorology and emissions. The system is robust in terms of production availability. Major efforts have been made in MACC-II towards the operationalisation of all its components. Planned developments and research for improving its performance are discussed in the conclusion.
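
    The median ensemble described above can be sketched as a grid-point-wise median over the model axis, assuming the individual model fields have been regridded onto a common grid. The array shapes and values below are hypothetical; missing models are represented by NaN slices so the median stays defined when some models are unavailable.

    ```python
    # Minimal sketch of a multi-model median ensemble; not the MACC-II code.
    # "fields" is a hypothetical array of shape (n_models, n_times, n_lat, n_lon)
    # holding, e.g., hourly surface O3 forecasts from the seven models.
    import numpy as np

    def median_ensemble(fields: np.ndarray) -> np.ndarray:
        """Grid-point-wise median over the model axis, ignoring missing models."""
        return np.nanmedian(fields, axis=0)

    # Example: seven models, two of them unavailable for this run
    rng = np.random.default_rng(0)
    fields = rng.normal(60.0, 10.0, size=(7, 24, 10, 10))  # fake O3 in ug/m3
    fields[2] = np.nan
    fields[5] = np.nan
    ens = median_ensemble(fields)
    print(ens.shape)  # (24, 10, 10)
    ```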

  11. Lipoprotein(a) levels predict adverse vascular events after acute myocardial infarction.

    PubMed

    Mitsuda, Takayuki; Uemura, Yusuke; Ishii, Hideki; Takemoto, Kenji; Uchikawa, Tomohiro; Koyasu, Masayoshi; Ishikawa, Shinji; Miura, Ayako; Imai, Ryo; Iwamiya, Satoshi; Ozaki, Yuta; Kato, Tomohiro; Shibata, Rei; Watarai, Masato; Murohara, Toyoaki

    2016-12-01

    Lipoprotein(a) [Lp(a)], which is genetically determined, has been reported as an independent risk factor for atherosclerotic vascular disease. However, the prognostic value of Lp(a) for secondary vascular events in patients with established coronary artery disease has not been fully elucidated. This 3-year observational study included a total of 176 patients with ST-elevation myocardial infarction (STEMI) whose Lp(a) levels were measured within 24 h after primary percutaneous coronary intervention. We divided the enrolled patients into two groups according to Lp(a) level and investigated the association between Lp(a) and the incidence of major adverse cardiac and cerebrovascular events (MACCE). A Kaplan-Meier analysis demonstrated that patients with higher Lp(a) levels had a higher incidence of MACCE than those with lower Lp(a) levels (log-rank P = 0.034). A multivariate Cox regression analysis revealed that Lp(a) levels were independently correlated with the occurrence of MACCE after adjusting for other classical risk factors for atherosclerotic vascular disease (hazard ratio 1.030, 95% confidence interval: 1.011-1.048, P = 0.002). In receiver-operating characteristic (ROC) curve analysis, the cutoff value maximizing the predictive power of Lp(a) was 19.0 mg/dl (area under the curve = 0.674, sensitivity 69.2%, specificity 62.0%). Evaluation of Lp(a) in addition to the established coronary risk factors improved their predictive value for the occurrence of MACCE. In conclusion, Lp(a) levels at admission independently predict secondary vascular events in patients with STEMI. Lp(a) might provide useful information for the development of secondary prevention strategies in patients with myocardial infarction.
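
    A cutoff that "maximizes the predictive power" in an ROC analysis is commonly chosen as the threshold with the largest Youden index (sensitivity + specificity - 1); whether these authors used exactly that criterion is not stated, so the scikit-learn sketch below, with synthetic data, is only an illustration of the approach.

    ```python
    # Hedged sketch: ROC cutoff chosen by maximising the Youden index.
    # Data are synthetic, not from the study above.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(1)
    lpa = np.concatenate([rng.lognormal(2.6, 0.6, 140),   # no MACCE (mg/dl)
                          rng.lognormal(3.2, 0.6, 36)])   # MACCE
    macce = np.concatenate([np.zeros(140, int), np.ones(36, int)])

    fpr, tpr, thresholds = roc_curve(macce, lpa)
    j = tpr - fpr                              # Youden index at each threshold
    best = int(np.argmax(j))
    print("AUC =", round(roc_auc_score(macce, lpa), 3))
    print("cutoff =", round(float(thresholds[best]), 1),
          "sensitivity =", round(float(tpr[best]), 2),
          "specificity =", round(float(1 - fpr[best]), 2))
    ```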

  12. Mid-latitude storm track variability and its influence on atmospheric composition

    NASA Astrophysics Data System (ADS)

    Knowland, K. E.; Doherty, R. M.; Hodges, K.

    2013-12-01

    Using the storm-tracking algorithm TRACK (Hodges, 1994, 1995, 1999), we have studied the behaviour of storm tracks in the North Atlantic basin, using 850-hPa relative vorticity from the ERA-Interim Re-analysis (Dee et al., 2011). We have correlated surface ozone measurements at rural coastal sites in Europe with the storm track data to explore the role that mid-latitude cyclones and their transport of pollutants play in determining surface air quality in Western Europe. To further investigate this relationship, we have used the Monitoring Atmospheric Composition and Climate (MACC) Re-analysis dataset (Inness et al., 2013) in TRACK. The MACC Re-analysis is a 10-year dataset which couples a chemistry transport model (Mozart-3; Stein 2009, 2012) to an extended version of the European Centre for Medium-Range Weather Forecasts' (ECMWF) Integrated Forecast System (IFS). Storm tracks in the MACC Re-analysis compare well to the storm tracks derived from the ERA-Interim Re-analysis for the same 10-year period, as both are based on the ECMWF IFS. We also compare surface ozone values from MACC to the surface ozone measurements previously studied. Using TRACK, we follow ozone (O3) and carbon monoxide (CO) through the life cycle of storms from North America to Western Europe. Along the storm tracks, we examine the distribution of CO and O3 within 6 degrees of the center of each storm and vertically at different pressure levels in the troposphere. We hope to better understand the mechanisms by which pollution is vented from the boundary layer to the free troposphere, as well as the transport of pollutants to rural areas. Our hope is to give policy makers more detailed information on how climate variability associated with storm tracks between 1979 and 2013 may affect air quality in the Northeast USA and Western Europe.

  13. Seasonal and interannual variability of carbon monoxide based on MOZAIC observations, MACC reanalysis, and model simulations over an urban site in India

    NASA Astrophysics Data System (ADS)

    Sheel, Varun; Sahu, L. K.; Kajino, M.; Deushi, M.; Stein, O.; Nedelec, P.

    2014-07-01

    The spatial and temporal variations of carbon monoxide (CO) are analyzed over a tropical urban site, Hyderabad (17°27'N, 78°28'E) in central India. We have used vertical profiles from Measurement of ozone and water vapor by Airbus in-service aircraft (MOZAIC) observations, the Monitoring Atmospheric Composition and Climate (MACC) reanalysis, and two chemical transport model simulations (the Model for Ozone And Related Tracers (MOZART) and the MRI global Chemistry Climate Model (MRI-CCM2)) for the years 2006-2008. In the lower troposphere, the CO mixing ratio showed strong seasonality, with higher levels (>300 ppbv) during the winter and premonsoon seasons, associated with a stable anticyclonic circulation, while lower CO values (up to 100 ppbv) were observed in the monsoon season. In the planetary boundary layer (PBL), the seasonal distribution of CO shows the impact of both local meteorology and emissions. While PBL CO is predominantly influenced by strong winds that bring in regional background air from marine and biomass burning regions, under calm conditions CO levels are elevated by local emissions. In the free troposphere, by contrast, the seasonal variation reflects the impact of long-range transport associated with the Intertropical Convergence Zone and biomass burning. The interannual variations were mainly due to the transition from El Niño to La Niña conditions. The overall modified normalized mean biases (with the normalization based on the observed and model mean values) with respect to the observed CO profiles were lower for the MACC reanalysis than for the MOZART and MRI-CCM2 models. CO in the PBL region was consistently underestimated by the MACC reanalysis during all seasons, while MOZART and MRI-CCM2 show both positive and negative biases depending on the season.
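
    The modified normalized mean bias quoted here (and in other MACC evaluation abstracts in this list) is commonly defined as MNMB = (2/N) * sum((m_i - o_i) / (m_i + o_i)); the exact normalization used by these authors is assumed rather than quoted from the paper, so the sketch below is only indicative.

    ```python
    # Hedged sketch: modified normalised mean bias (MNMB) under its commonly used
    # definition, MNMB = (2/N) * sum((m_i - o_i) / (m_i + o_i)), returned as a
    # fraction (multiply by 100 for percent). Values are placeholders.
    import numpy as np

    def mnmb(model: np.ndarray, obs: np.ndarray) -> float:
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        mask = np.isfinite(model) & np.isfinite(obs) & ((model + obs) != 0)
        return 2.0 * np.mean((model[mask] - obs[mask]) / (model[mask] + obs[mask]))

    print(mnmb([120, 250, 90], [150, 300, 100]))  # negative => model underestimates
    ```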

  14. Previous cerebrovascular disease is an important predictor of clinical outcomes in elderly patients with percutaneous coronary interventions: The Nobori-Biolimus eluting stent prospective multicenter 1-year observational registry in South Korea

    PubMed Central

    Kim, Yong Hoon; Her, Ae-Young; Kim, Byeong-Keuk; Shin, Dong-Ho; Kim, Jung-Sun; Ko, Young-Guk; Choi, Donghoon; Hong, Myeong-Ki; Jang, Yangsoo

    2017-01-01

    Objective: The appropriate selection of elderly patients for revascularization has become increasingly important because this subset of patients is more likely to experience a major cardiac or cerebrovascular event after percutaneous coronary intervention (PCI). The objective of this study was to determine important independent risk factors for predicting clinical outcomes in elderly patients after successful PCI, particularly in a South Korean population. Methods: This was a prospective, multicenter, observational cross-sectional study. A total of 1,884 consecutive patients who underwent successful PCI with Nobori® Biolimus A9-eluting stents were enrolled between April 2010 and December 2012. They were divided into two groups according to age: patients <75 years old (younger patient group) and ≥75 years old (elderly patient group). The primary endpoint was major adverse cardiac or cerebrovascular events (MACCE) at 1 year after the index PCI. Results: The 1-year cumulative incidence of MACCE (12.9% vs. 4.3%, p<0.001) and total death (7.1% vs. 1.5%, p<0.001) was significantly higher in the elderly group than in the younger group. Previous cerebrovascular disease was significantly correlated with MACCE in elderly patients 1 year after PCI (hazard ratio, 2.804; 95% confidence interval, 1.290–6.093; p=0.009). Conclusion: Previous cerebrovascular disease is an important independent predictor of MACCE in elderly patients at 1 year after PCI with Nobori® Biolimus A9-eluting stents, especially in a South Korean population. Therefore, careful PCI with intensive monitoring and management may improve major clinical outcomes after successful PCI in elderly patients with previous cerebrovascular disease compared with younger patients. PMID:28554989

  15. The prognostic role of stress echocardiography in a contemporary population and the clinical significance of limited apical ischaemia.

    PubMed

    Papachristidis, Alexandros; Roper, Damian; Cassar Demarco, Daniela; Tsironis, Ioannis; Papitsas, Michael; Byrne, Jonathan; Alfakih, Khaled; Monaghan, Mark J

    2016-12-01

    In this study, we aim to reassess the prognostic value of stress echocardiography (SE) in a contemporary population and to evaluate the clinical significance of limited apical ischaemia, which has not been previously studied. We included 880 patients who underwent SE. Follow-up data with regard to MACCE (cardiac death, myocardial infarction, any repeat revascularisation and cerebrovascular accident) were collected over 12 months after the SE. Mortality data were recorded over 27.02 ± 4.6 months (5.5-34.2 months). We sought to investigate the predictors of MACCE and all-cause mortality. In a multivariable analysis, only a positive SE result was predictive of MACCE (HR, 3.71; P = 0.012). The positive SE group was divided into 2 subgroups: (a) inducible ischaemia limited to the apical segments ('apical ischaemia') and (b) ischaemia in any other segments with or without apical involvement ('other positive'). The subgroup of patients with apical ischaemia had a significantly worse outcome compared with patients with a negative SE (HR, 3.68; P = 0.041) but a similar outcome to the 'other positive' subgroup. However, when investigated with invasive coronary angiography, patients with apical ischaemia had a considerably lower prevalence of coronary artery disease (CAD) and a lower rate of revascularisation. Only age (HR, 1.07; P < 0.001) was correlated with all-cause mortality. SE remains a strong predictor of patient outcome in a contemporary population. A positive SE result was the only predictor of 12-month MACCE. The subgroup of patients with limited apical ischaemia had a similar outcome to patients with ischaemia in other segments despite a lower prevalence of CAD and a lower revascularisation rate. © 2016 The authors.

  16. Evaluation of the MACC operational forecast system - potential and challenges of global near-real-time modelling with respect to reactive gases in the troposphere

    NASA Astrophysics Data System (ADS)

    Wagner, A.; Blechschmidt, A.-M.; Bouarar, I.; Brunke, E.-G.; Clerbaux, C.; Cupeiro, M.; Cristofanelli, P.; Eskes, H.; Flemming, J.; Flentje, H.; George, M.; Gilge, S.; Hilboll, A.; Inness, A.; Kapsomenakis, J.; Richter, A.; Ries, L.; Spangl, W.; Stein, O.; Weller, R.; Zerefos, C.

    2015-03-01

    Monitoring Atmospheric Composition and Climate (MACC/MACC-II) currently represents the European Union's Copernicus Atmosphere Monitoring Service (CAMS) (http://www.copernicus.eu), which will become fully operational in the course of 2015. The global near-real-time MACC model production run for aerosol and reactive gases provides daily analyses and 5-day forecasts of atmospheric composition fields. It is the only operational assimilation system worldwide that produces global analyses and forecasts of reactive gases and aerosol fields. We have investigated the ability of the MACC analysis system to simulate tropospheric concentrations of reactive gases (CO, O3, and NO2) covering the period between 2009 and 2012. A validation was performed based on CO and O3 surface observations from the Global Atmosphere Watch (GAW) network, O3 surface observations from the European Monitoring and Evaluation Programme (EMEP) and, furthermore, NO2 tropospheric columns derived from the satellite sensors SCIAMACHY and GOME-2, and CO total columns derived from the satellite sensor MOPITT. The MACC system proved capable of reproducing reactive gas concentrations with consistent quality, albeit with a seasonally dependent bias compared to surface and satellite observations: for northern hemispheric surface O3 mixing ratios, positive biases appear during the warm seasons and negative biases during the cold parts of the year, with monthly Modified Normalised Mean Biases (MNMBs) ranging between -30 and 30% at the surface. Model biases are likely to result from difficulties in the simulation of vertical mixing at night and deficiencies in the model's dry deposition parameterization. Observed tropospheric columns of NO2 and CO could be reproduced correctly during the warm seasons, but are mostly underestimated by the model during the cold seasons, when anthropogenic emissions are at their highest, especially over the US, Europe and Asia. Monthly MNMBs of the satellite data evaluation range between -110 and 40% for NO2 and down to -20% for CO over the investigated regions. The underestimation is likely to result from a combination of errors in the dry deposition parameterization and certain limitations in the current emission inventories, together with an insufficiently established seasonality in the emissions.

  17. Treatment of complex coronary artery disease in patients with diabetes: 5-year results comparing outcomes of bypass surgery and percutaneous coronary intervention in the SYNTAX trial.

    PubMed

    Kappetein, Arie Pieter; Head, Stuart J; Morice, Marie-Claude; Banning, Adrian P; Serruys, Patrick W; Mohr, Friedrich-Wilhelm; Dawkins, Keith D; Mack, Michael J

    2013-05-01

    This prespecified subgroup analysis examined the effect of diabetes on left main coronary disease (LM) and/or three-vessel disease (3VD) in patients treated with percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG) in the SYNTAX trial. Patients (n = 1800) with LM and/or 3VD were randomized to receive either PCI with TAXUS Express paclitaxel-eluting stents or CABG. Five-year outcomes in subgroups with (n = 452) or without (n = 1348) diabetes were examined: major adverse cardiac or cerebrovascular events (MACCE), the composite safety end-point of all-cause death/stroke/myocardial infarction (MI) and individual MACCE components death, stroke, MI and repeat revascularization. Event rates were estimated with Kaplan-Meier analyses. In diabetic patients, 5-year rates were significantly higher for PCI vs CABG for MACCE (PCI: 46.5% vs CABG: 29.0%; P < 0.001) and repeat revascularization (PCI: 35.3% vs CABG: 14.6%; P < 0.001). There was no difference in the composite of all-cause death/stroke/MI (PCI: 23.9% vs CABG: 19.1%; P = 0.26) or individual components all-cause death (PCI: 19.5% vs CABG: 12.9%; P = 0.065), stroke (PCI: 3.0% vs CABG: 4.7%; P = 0.34) or MI (PCI: 9.0% vs CABG: 5.4%; P = 0.20). In non-diabetic patients, rates with PCI were also higher for MACCE (PCI: 34.1% vs CABG: 26.3%; P = 0.002) and repeat revascularization (PCI: 22.8% vs CABG: 13.4%; P < 0.001), but not for the composite end-point of all-cause death/stroke/MI (PCI: 19.8% vs CABG: 15.9%; P = 0.069). There were no differences in all-cause death (PCI: 12.0% vs CABG: 10.9%; P = 0.48) or stroke (PCI: 2.2% vs CABG: 3.5%; P = 0.15), but rates of MI (PCI: 9.9% vs CABG: 3.4%; P < 0.001) were significantly increased in the PCI arm in non-diabetic patients. In both diabetic and non-diabetic patients, PCI resulted in higher rates of MACCE and repeat revascularization at 5 years. Although PCI is a potential treatment option in patients with less-complex lesions, CABG should be the revascularization option of choice for patients with more-complex anatomic disease, especially with concurrent diabetes.

  18. Diabetes mellitus: long-term prognostic value of whole-body MR imaging for the occurrence of cardiac and cerebrovascular events.

    PubMed

    Bamberg, Fabian; Parhofer, Klaus G; Lochner, Elena; Marcus, Roy P; Theisen, Daniel; Findeisen, Hannes M; Hoffmann, Udo; Schönberg, Stefan O; Schlett, Christopher L; Reiser, Maximilian F; Weckbach, Sabine

    2013-12-01

    To study the predictive value of whole-body magnetic resonance (MR) imaging for the occurrence of cardiac and cerebrovascular events in a cohort of patients with diabetes mellitus (DM). This HIPAA-compliant study was approved by the institutional review board. Informed consent was obtained from all patients before enrollment into the study. The authors followed up 65 patients with DM (types 1 and 2) who underwent a comprehensive, contrast material-enhanced whole-body MR imaging protocol, including brain, cardiac, and vascular sequences at baseline. Follow-up was performed by phone interview. The primary endpoint was a major adverse cardiac and cerebrovascular event (MACCE), which was defined as composite cardiac-cerebrovascular death, myocardial infarction, cerebrovascular event, or revascularization. MR images were assessed for the presence of systemic atherosclerotic vessel changes, white matter lesions, and myocardial changes. Kaplan-Meier survival and Cox regression analyses were performed to determine associations. Follow-up was completed in 61 patients (94%; median age, 67.5 years; 30 women [49%]; median follow-up, 70 months); 14 of the 61 patients (23%) experienced MACCE. Although normal whole-body MR imaging excluded MACCE during the follow-up period (0%; 95% confidence interval [CI]: 0%, 17%), any detectable ischemic and/or atherosclerotic changes at whole-body MR imaging (prevalence, 66%) conferred a cumulative event rate of 20% at 3 years and 35% at 6 years. Whole-body MR imaging summary estimate of disease was strongly predictive for MACCE (one increment of vessel score and each territory with atherosclerotic changes: hazard ratio, 13.2 [95% CI: 4.5, 40.1] and 3.9 [95% CI: 2.2, 7.5], respectively), also beyond clinical characteristics as well as individual cardiac or cerebrovascular MR findings. These initial data indicate that disease burden as assessed with whole-body MR imaging confers strong prognostic information in patients with DM. Online supplemental material is available for this article. © RSNA, 2013.

  19. [Percutaneous coronary intervention of unprotected left main coronary compared with coronary artery bypass grafting; 3 years of experience in the National Institute of Cardiology, Mexico].

    PubMed

    López-Aguilar, Carlos; Abundes-Velasco, Arturo; Eid-Lidt, Guering; Piña-Reyna, Yigal; Gaspar-Hernández, Jorge

    The best revascularisation method for the unprotected left main artery is a current and evolving topic. A total of 2439 percutaneous coronary interventions (PCI) were registered during a 3-year period. The study included all patients with PCI of the unprotected left main coronary artery (n=48), matched with patients who underwent coronary artery bypass grafting (CABG) (n=50). Major adverse cerebral and cardiac events (MACCE) were assessed in hospital and in outpatients during a 16-month follow-up. Cardiovascular risk was greater in the PCI group: logEuroSCORE 16±21 vs. 5±6, P=.001; clinical SYNTAX score 77±74 vs. 53±39, P=.04. On admission, the PCI group had a higher frequency of ST-segment elevation myocardial infarction (STEMI) and cardiogenic shock. In-hospital MACCE were similar in both groups (14% vs. 18%, P=.64). STEMI was less frequent in the PCI group (0% vs. 10%, P=.03). Cardiovascular events were lower in the PCI group (2.3% vs. 18%, P=.01), and there was a decrease in all-cause and cardiac mortality (2.3% vs. 12%, P=.08 and 2.3% vs. 8%, P=.24) when patients presenting with cardiogenic shock were excluded. MACCE were similar in both groups in the outpatient phase (15% vs. 12%, P=.46). Survival without MACCE, all-cause death and cardiac death were comparable between groups (log rank, P=.38, P=.44 and P=.16, respectively). Even though the clinical and peri-procedural risk profile of the PCI patients was higher, in-hospital and out-of-hospital efficacy and safety were comparable with CABG. Copyright © 2016 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.

  20. Cost-effectiveness of percutaneous coronary intervention with drug-eluting stents in patients with multivessel coronary artery disease compared to coronary artery bypass surgery five-years after intervention

    PubMed Central

    Krenn, Lisa; Kopp, Christoph; Glogar, Dietmar; Lang, Irene M; Delle-Karth, Georg; Neunteufl, Thomas; Kreiner, Gerhard; Kaider, Alexandra; Bergler-Klein, Jutta; Khorsand, Aliasghar; Nikfardjam, Mariam; Laufer, Günther; Maurer, Gerald; Gyöngyösi, Mariann

    2014-01-01

    Objectives The cost-effectiveness of percutaneous coronary intervention (PCI) using drug-eluting stents (DES) versus coronary artery bypass surgery (CABG) was analyzed in patients with multivessel coronary artery disease over a 5-year follow-up. Background By reducing revascularization rates and associated costs, DES implantation might be attractive from a health-economics perspective compared with CABG. Methods Consecutive patients with multivessel DES-PCI (n = 114, 3.3 ± 1.2 DES/patient) or CABG (n = 85, 2.7 ± 0.9 grafts/patient) were included prospectively. The primary endpoint was the cost-benefit of multivessel DES-PCI over CABG, and the incremental cost-effectiveness ratio (ICER) was calculated. The secondary endpoint was the incidence of major adverse cardiac and cerebrovascular events (MACCE), including acute myocardial infarction (AMI), all-cause death, revascularization, and stroke. Results Despite the use of multiple DES per patient, in-hospital costs were significantly lower for PCI than for CABG, with a difference of 4551 €/patient between the groups. At 5 years, the overall costs remained higher for CABG patients (mean difference 5400 € between groups). Cost-effectiveness planes including all patients or subgroups of elderly patients, diabetic patients, or patients with a SYNTAX score >32 indicated that CABG is a more effective, more costly treatment for multivessel disease. At the 5-year follow-up, a higher incidence of MACCE (37.7% vs. 25.8%; log rank P = 0.048) and a trend towards more AMI/death/stroke (25.4% vs. 21.2%, log rank P = 0.359) were observed with PCI as compared to CABG. The ICER indicated a cost of 45,615 € to prevent one MACCE, or 126,683 € to prevent one AMI/death/stroke event, if CABG is performed. Conclusions The cost-effectiveness analysis of DES-PCI vs. CABG demonstrated that CABG is the most effective, but most costly, treatment for preventing MACCE in patients with multivessel disease. © 2014 Wiley Periodicals, Inc. PMID:24403120
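
    The incremental cost-effectiveness ratio used above is simply the cost difference divided by the effect difference between the two strategies. The sketch below illustrates the arithmetic; all numbers are placeholders chosen for illustration, not data taken from the study.

    ```python
    # Minimal sketch of the incremental cost-effectiveness ratio (ICER):
    # ICER = (cost_A - cost_B) / (effect_A - effect_B), where the "effect" is,
    # e.g., the proportion of patients free of MACCE at 5 years. Placeholder values.
    def icer(cost_a: float, cost_b: float, effect_a: float, effect_b: float) -> float:
        """Extra cost per additional unit of effect when choosing A over B."""
        return (cost_a - cost_b) / (effect_a - effect_b)

    # e.g. cost per additional MACCE-free patient if CABG is chosen over PCI
    print(icer(cost_a=25_000.0, cost_b=19_600.0, effect_a=0.742, effect_b=0.623))
    ```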

  1. Observational Prospective study to esTIMAte the rates of outcomes in patients undergoing PCI with drug-eluting stent implantation who take statins -follow-up (OPTIMA II).

    PubMed

    Karpov, Yu; Logunova, N; Tomilova, D; Buza, V; Khomitskaya, Yu

    2017-02-01

    The OPTIMA II study sought to evaluate rates of major adverse cardiac and cerebrovascular events (MACCEs) during the long-term follow-up of chronic statin users who underwent percutaneous coronary intervention (PCI) with implantation of a drug-eluting stent (DES). OPTIMA II was a non-interventional, observational study conducted at a single center in the Russian Federation. Included patients were aged ≥18 years with stable angina who had received long-term (≥1 month) statin therapy prior to elective PCI with DES implantation and who had participated in the original OPTIMA study. Patients received treatment for stable angina after PCI as per routine study site clinical practice. Study data were collected from patient medical records and a routine visit 4 years after PCI (ClinicalTrials.gov identifier: NCT02099565). The main outcome measure was the rate of MACCEs 4 years after PCI. Overall, 543 patients agreed to participate in the study (90.2% of patients in the original OPTIMA study). The mean (± standard deviation [SD]) duration of follow-up from the date of PCI to data collection was 4.42 ± 0.58 (range: 0.28-5.56) years. The frequency of MACCEs (including data from patients who died) was 30.8% (95% confidence interval: 27.0-34.7); half of MACCEs occurred in the first year of follow-up. After PCI, the majority of patients had no clinical signs of angina. Overall, 24.3% of patients discontinued statin intake in the 4 years after PCI. Only 7.7% of patients achieved a low-density lipoprotein (LDL) cholesterol goal of <1.8 mmol/L. Key limitations of this study related to its observational nature; for example, the sample size was small, the clinical results were derived from outpatient and hospital medical records, only one follow-up visit was performed at the end of the study (after 4 years' follow-up), only depersonalized medical information was made available for statistical analysis, and adherence to statin treatment was evaluated on the basis of a patient questionnaire. Long-term follow-up of patients who underwent PCI with DES implantation demonstrated MACCEs in nearly one-third of patients, which is comparable to data from other studies. PCI was associated with relief from angina or minimal angina frequency, but compliance with statin therapy and the achievement of LDL cholesterol targets 4 years after PCI were suboptimal.

  2. Self-identification and empathy modulate error-related brain activity during the observation of penalty shots between friend and foe

    PubMed Central

    Ganesh, Shanti; van Schie, Hein T.; De Bruijn, Ellen R. A.; Bekkering, Harold

    2009-01-01

    The ability to detect and process errors made by others plays an important role in many social contexts. The capacity to process errors is typically found to rely on sites in the medial frontal cortex. However, it remains to be determined whether responses at these sites are driven primarily by action errors themselves or by the affective consequences normally associated with their commission. Using an experimental paradigm that disentangles action errors and the valence of their affective consequences, we demonstrate that sites in the medial frontal cortex (MFC), including the ventral anterior cingulate cortex (vACC) and pre-supplementary motor area (pre-SMA), respond to action errors independent of the valence of their consequences. The strength of this response was negatively correlated with the empathic concern subscale of the Interpersonal Reactivity Index. We also demonstrate a main effect of self-identification by showing that errors committed by friends and foes elicited significantly different BOLD responses in a separate region of the middle anterior cingulate cortex (mACC). These results suggest that the way we look at others plays a critical role in determining patterns of brain activation during error observation. These findings may have important implications for general theories of error processing. PMID:19015079

  3. AIRS Views of Anthropogenic and Biomass Burning CO: INTEX-B/MILAGRO and TEXAQS/GoMACCS

    NASA Astrophysics Data System (ADS)

    McMillan, W. W.; Warner, J.; Wicks, D.; Barnet, C.; Sachse, G.; Chu, A.; Sparling, L.

    2006-12-01

    Utilizing the Atmospheric InfraRed Sounder's (AIRS) unique spatial and temporal coverage, we present observations of anthropogenic and biomass burning CO emissions as observed by AIRS during the 2006 field experiments INTEX-B/MILAGRO and TEXAQS/GoMACCS. AIRS daily CO maps covering more than 75% of the planet demonstrate the near-global transport of these emissions. AIRS day/night coverage of significant portions of the Earth often shows substantial changes in 12 hours or less. However, the coarse vertical resolution of AIRS-retrieved CO complicates its interpretation. For example, extensive CO emissions are evident from Asia during April and May 2006, but it is difficult to determine the relative contributions of biomass burning in Thailand vs. domestic and industrial emissions from China. Similarly, AIRS sometimes sees enhanced CO over and downwind of Mexico City and other populated areas. The low information content and decreasing sensitivity of AIRS in the boundary layer can result in underestimates of CO total columns and free-tropospheric abundances. Building on our analyses of INTEX-A/ICARTT data from 2004, we present comparisons with INTEX-B/MILAGRO and TEXAQS/GoMACCS in situ aircraft measurements and other satellite CO observations. The combined analysis of AIRS CO, water vapor and O3 retrievals; MODIS aerosol optical depths; and forward trajectory computations illuminates a variety of dynamical processes in the troposphere.

  4. Consistent Evaluation of ACOS-GOSAT, BESD-SCIAMACHY, CarbonTracker, and MACC Through Comparisons to TCCON

    NASA Technical Reports Server (NTRS)

    Kulawik, Susan; Wunch, Debra; O’Dell, Christopher; Frankenberg, Christian; Reuter, Maximilian; Chevallier, Frederic; Oda, Tomohiro; Sherlock, Vanessa; Buchwitz, Michael; Osterman, Greg

    2016-01-01

    Consistent validation of satellite CO2 estimates is a prerequisite for using multiple satellite CO2 measurements for joint flux inversion, and for establishing an accurate long-term atmospheric CO2 data record. Harmonizing satellite CO2 measurements is particularly important since the differences in instruments, observing geometries, sampling strategies, etc. impart different measurement characteristics to the various satellite CO2 data products. We focus on validating model and satellite observation attributes that impact flux estimates and CO2 assimilation, including accurate error estimates, correlated and random errors, overall biases, biases by season and latitude, the impact of coincidence criteria, validation of seasonal cycle phase and amplitude, yearly growth, and daily variability. We evaluate the column-averaged dry-air mole fraction (XCO2) for Greenhouse gases Observing SATellite (GOSAT) (Atmospheric CO2 Observations from Space, ACOS b3.5) and SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) (Bremen Optimal Estimation DOAS, BESD v2.00.08) as well as the CarbonTracker (CT2013b) simulated CO2 mole fraction fields and the Monitoring Atmospheric Composition and Climate (MACC) CO2 inversion system (v13.1) and compare these to Total Carbon Column Observing Network (TCCON) observations (GGG2012/2014). We find standard deviations of 0.9, 0.9, 1.7, and 2.1 parts per million vs. TCCON for CT2013b, MACC, GOSAT, and SCIAMACHY, respectively, with the single observation errors 1.9 and 0.9 times the predicted errors for GOSAT and SCIAMACHY, respectively. We quantify how satellite error drops with data averaging by fitting error² = a² + b²/n (where n is the number of observations averaged, a the systematic (correlated) error, and b the random (uncorrelated) error). a and b are estimated separately for each satellite, coincidence criterion, and hemisphere. Biases at individual stations have year-to-year variability of 0.3 parts per million, with biases larger than the TCCON predicted bias uncertainty of 0.4 parts per million at many stations. We find that GOSAT and CT2013b under-predict the seasonal cycle amplitude in the Northern Hemisphere (NH) between 46 and 53 degrees North latitude, MACC over-predicts between 26 and 37 degrees North latitude, and CT2013b under-predicts the seasonal cycle amplitude in the Southern Hemisphere (SH). The seasonal cycle phase indicates whether a data set or model lags another data set in time. We find that the GOSAT measurements improve the seasonal cycle phase substantially over the prior while SCIAMACHY measurements improve the phase significantly for just two of seven sites. The models reproduce the measured seasonal cycle phase well except at Lauder_125HR (CT2013b) and Darwin (MACC). We compare the variability within 1 day between TCCON and models in June-July-August; there is correlation between 0.2 and 0.8 in the NH, with models showing 10-50 percent of the variability of TCCON at different stations and CT2013b showing more variability than MACC. This paper highlights findings that provide inputs to estimate flux errors in model assimilations, and places where models and satellites need further investigation, e.g., the SH for models and 45-67 degrees North latitude for GOSAT and CT2013b.
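
    The relation error² = a² + b²/n quoted above implies that a and b can be recovered by a linear least-squares fit of the squared error of n-observation averages against 1/n. The sketch below illustrates that fit with made-up numbers; it is not the paper's estimation code.

    ```python
    # Sketch of the error-averaging model error^2 = a^2 + b^2 / n, fitted by
    # ordinary least squares against 1/n. All numbers are illustrative only.
    import numpy as np

    n = np.array([1, 2, 4, 8, 16, 32])
    sigma = np.array([2.10, 1.62, 1.30, 1.10, 0.99, 0.93])  # ppm, made up

    A = np.column_stack([np.ones_like(n, dtype=float), 1.0 / n])
    coef, *_ = np.linalg.lstsq(A, sigma**2, rcond=None)
    a, b = np.sqrt(coef[0]), np.sqrt(coef[1])
    print(f"a ~ {a:.2f} ppm (correlated), b ~ {b:.2f} ppm (uncorrelated)")
    ```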

  5. Impact of anaemia on long-term outcomes in patients treated with first- and second-generation drug-eluting stents; Katowice-Zabrze Registry.

    PubMed

    Wańha, Wojciech; Kawecki, Damian; Roleder, Tomasz; Pluta, Aleksandra; Marcinkiewicz, Kamil; Dola, Janusz; Morawiec, Beata; Krzych, Łukasz; Pawłowski, Tomasz; Smolka, Grzegorz; Ochała, Andrzej; Nowalany-Kozielska, Ewa; Tendera, Michał; Wojakowski, Wojciech

    2016-01-01

    Coexisting anaemia is associated with an increased risk of major adverse cardiac and cerebrovascular events (MACCE) and bleeding complications after percutaneous coronary intervention (PCI), especially in patients with acute coronary syndrome. The aim was to assess the impact of anaemia on one-year MACCE in patients with coronary artery disease (CAD) treated with first- and second-generation drug-eluting stents (DES). The registry included 1916 consecutive patients (UA: n = 1502, 78.3%; NSTEMI: n = 283, 14.7%; STEMI/LBBB: n = 131, 6.8%) treated either with first- (34%) or second-generation (66%) DES. The study population was divided into two groups: patients presenting with anaemia (n = 217, 11%) and without anaemia (n = 1699, 89%) prior to PCI. Anaemia was defined according to the World Health Organization criteria (haemoglobin [Hb] level < 13 g/dL for men and < 12 g/dL for women). Patients with anaemia were older (69, IQR: 61-75 vs. 62, IQR: 56-70, p < 0.001) and had a higher prevalence of co-morbidities: diabetes (44.7% vs. 36.4%, p = 0.020), chronic kidney disease (31.3% vs. 19.4%; p < 0.001), peripheral artery disease (10.1% vs. 5.4%, p = 0.005), and lower left ventricular ejection fraction values (50, IQR: 40-57% vs. 55, IQR: 45-60%; p < 0.001). No difference in the frequency of anaemia between genders was found. Patients with anaemia more often had prior myocardial infarction (MI) (57.6% vs. 46.4%; p = 0.002) and coronary artery bypass grafting (31.3% vs. 19.4%; p < 0.001) in comparison to patients without anaemia. They also more often had multivessel disease on angiography (36.4% vs. 26.1%; p = 0.001) and more complex CAD as measured by the SYNTAX score (21, IQR: 12-27 points vs. 14, IQR: 8-22 points; p = 0.001). The in-hospital risk of acute heart failure (2.7% vs. 0.7%; p = 0.006) and bleeding requiring transfusion (3.2% vs. 0.5%; p < 0.001) was significantly higher in patients with anaemia. One-year follow-up showed a higher rate of death in patients with anaemia. However, there were no differences in MI, stroke, target vessel revascularisation (TVR) and MACCE in comparison to patients with normal Hb. There were no differences according to type of DES (first vs. second generation) in the population of patients with anaemia. In patients with anaemia there is a significantly higher risk of death at 12-month follow-up, but anaemia has no impact on the incidence of MI, repeat revascularisation, stroke and MACCE. There is no advantage of second-generation over first-generation DES in terms of MACCE and TVR in patients with anaemia.

  6. Green Infrastructure Barriers and Opportunities in the Macatawa Watershed, Michigan

    EPA Pesticide Factsheets

    The project supports MACC outreach and implementation efforts of the watershed management plan by facilitating communication with local municipal staff and educating local decision makers about green infrastructure.

  7. Impact of Chronic Obstructive Pulmonary Disease on Long-Term Outcome in Patients with Coronary Artery Disease Undergoing Percutaneous Coronary Intervention.

    PubMed

    Zhang, Ming; Cheng, Yun-Jiu; Zheng, Wei-Ping; Liu, Guang-Hui; Chen, Huai-Sheng; Ning, Yu; Zhao, Xin; Su, Li-Xiao; Liu, Li-Juan

    2016-01-01

    Objective. The aim of this study was to investigate the association between COPD and major adverse cardiovascular and cerebral events (MACCE) in patients undergoing percutaneous coronary intervention (PCI). Methods. A total of 2,362 patients who underwent PCI were included in this study. Subjects were divided into 2 groups: with COPD (n = 233) and without COPD (n = 2,129). Cox proportional hazards models were analyzed to determine the effect of COPD on the incidence of MACCE. Results. The patients with COPD were older (P < 0.0001) and were more likely to be current smokers (P = 0.02) and to have hypertension (P = 0.02) and diabetes mellitus (P = 0.01). The prevalence of serious cardiovascular comorbidity was higher in the patients with COPD, including a history of MI (P = 0.02) and HF (P < 0.0001). Compared with the non-COPD group, the COPD group showed a higher risk of all-cause death (hazard ratio (HR): 2.45, P < 0.0001), cardiac death (HR: 2.53, P = 0.0002), MI (HR: 1.387, P = 0.027), and HF (HR: 2.25, P < 0.0001). Conclusions. Patients with CAD and concomitant COPD have a higher incidence of MACCE (all-cause death, cardiac death, MI, and HF) than patients without COPD. Patients with a history of COPD have higher in-hospital and long-term mortality rates after PCI than those without COPD.

  8. Global data set of biogenic VOC emissions calculated by the MEGAN model over the last 30 years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sindelarova, K.; Granier, Claire; Bouarar, I.

    The Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1), together with the Modern-Era Retrospective Analysis for Research and Applications (MERRA) meteorological fields, was used to create a global emission dataset of biogenic VOCs available on a monthly basis for the period 1980-2010. This dataset is called MEGAN-MACC. The model estimated a mean annual total BVOC emission of 760 Tg(C) yr⁻¹, consisting of isoprene (70%), monoterpenes (11%), methanol (6%), acetone (3%), sesquiterpenes (2.5%) and other BVOC species each contributing less than 2%. Several sensitivity model runs were performed to study the impact of different model inputs and model settings on isoprene estimates and resulted in differences of ±17% of the reference isoprene total. A greater impact was observed for the sensitivity run applying a parameterization of soil moisture deficit, which led to a 50% reduction of isoprene emissions on a global scale, most significantly in specific regions of Africa, South America and Australia. MEGAN-MACC estimates are comparable to the results of previous studies. A more detailed comparison with other isoprene inventories indicated significant spatial and temporal differences between the datasets, especially for Australia, Southeast Asia and South America. MEGAN-MACC estimates of isoprene and α-pinene showed reasonable agreement with surface flux measurements in the Amazon, and the model was able to capture the seasonal variation of emissions in this region.

  9. Comparison of different antithrombotic regimens for patients with atrial fibrillation undergoing drug-eluting stent implantation.

    PubMed

    Gao, Fei; Zhou, Yu Jie; Wang, Zhi Jian; Shen, Hua; Liu, Xiao Li; Nie, Bin; Yan, Zhen Xian; Yang, Shi Wei; Jia, De An; Yu, Miao

    2010-04-01

    The optimal antithrombotic strategy for patients with atrial fibrillation (AF) undergoing drug-eluting stent (DES) implantation is unknown. A total of 622 consecutive AF patients undergoing DES implantation were prospectively enrolled. Among them, 142 patients (TT group) continued triple antithrombotic therapy comprising aspirin, clopidogrel and warfarin after discharge; 355 patients (DT group) had dual antiplatelet therapy; 125 patients (WS group) were discharged with warfarin and a single antiplatelet agent. The target INR was set at 1.8-2.5 and was regularly monitored after discharge. The TT group had a significant reduction in stroke and major adverse cardiac and cerebral events (MACCE) (8.8% vs 20.1% vs 14.9%, P=0.010) as compared with either the DT or WS group. In the Cox regression analysis, treatment with warfarin (hazard ratio (HR) 0.49; 95% confidence interval (CI) 0.31-0.77; P=0.002) and a baseline CHADS2 score ≥2 (HR 2.09; 95% CI 1.27-3.45; P=0.004) were independent predictors of MACCE. Importantly, the incidence of major bleeding was comparable among the 3 groups (2.9% vs 1.8% vs 2.5%, P=0.725), although the overall bleeding rate was increased in the TT group. Kaplan-Meier analysis indicated that the TT group was associated with the best net clinical outcome. The cardiovascular benefits of triple antithrombotic therapy were confirmed by the reduced MACCE rate, and its major bleeding risk may be acceptable if the INR is closely monitored.

  10. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is being used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed with MELCOR and the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  11. ChemoPy: freely available python package for computational biology and chemoinformatics.

    PubMed

    Cao, Dong-Sheng; Xu, Qing-Song; Hu, Qian-Nan; Liang, Yi-Zeng

    2013-04-15

    Molecular representation for small molecules has been routinely used in QSAR/SAR, virtual screening, database search, ranking, drug ADME/T prediction and other drug discovery processes. To facilitate extensive studies of drug molecules, we developed a freely available, open-source python package called chemoinformatics in python (ChemoPy) for calculating the commonly used structural and physicochemical features. It computes 16 drug feature groups composed of 19 descriptors that include 1135 descriptor values. In addition, it provides seven types of molecular fingerprint systems for drug molecules, including topological fingerprints, electro-topological state (E-state) fingerprints, MACCS keys, FP4 keys, atom pairs fingerprints, topological torsion fingerprints and Morgan/circular fingerprints. By applying a semi-empirical quantum chemistry program MOPAC, ChemoPy can also compute a large number of 3D molecular descriptors conveniently. The python package, ChemoPy, is freely available via http://code.google.com/p/pychem/downloads/list, and it runs on Linux and MS-Windows. Supplementary data are available at Bioinformatics online.
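
    For readers unfamiliar with the fingerprint types listed above, the sketch below computes MACCS keys and a Morgan/circular fingerprint. It uses RDKit rather than ChemoPy itself, since the ChemoPy calling conventions are not given in the abstract; treat it as an illustration of the fingerprints, not of the ChemoPy API.

    ```python
    # Illustration of MACCS keys and Morgan fingerprints with RDKit (a different
    # open-source toolkit than the ChemoPy package described above).
    from rdkit import Chem, DataStructs
    from rdkit.Chem import MACCSkeys, AllChem

    aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
    ibuprofen = Chem.MolFromSmiles("CC(C)Cc1ccc(cc1)C(C)C(=O)O")

    fp1 = MACCSkeys.GenMACCSKeys(aspirin)    # 167-bit MACCS keys fingerprint
    fp2 = MACCSkeys.GenMACCSKeys(ibuprofen)

    # Morgan/circular fingerprint, another type mentioned in the abstract
    morgan = AllChem.GetMorganFingerprintAsBitVect(aspirin, radius=2, nBits=2048)

    print("MACCS bits set:", fp1.GetNumOnBits())
    print("Morgan bits set:", morgan.GetNumOnBits())
    print("Tanimoto(MACCS):", DataStructs.TanimotoSimilarity(fp1, fp2))
    ```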

  12. Evaluation of the ability of the MACC-II Reanalysis to reproduce the distribution of O3 and CO in the UTLS as measured by MOZAIC-IAGOS

    NASA Astrophysics Data System (ADS)

    Gaudel, A.; Clark, H.; Thouret, V.; Eskes, H.; Huijnen, V.; Nedelec, P.

    2013-12-01

    Tropospheric ozone is probably one of the most important trace gases in the atmosphere. It plays a major role in the chemistry of the troposphere by exerting a strong influence on the concentrations of oxidants such as the hydroxyl radical (OH), and it is the third most important greenhouse gas after carbon dioxide and methane. Its radiative impact is of particular importance in the Upper Troposphere / Lower Stratosphere (UTLS), one of the most critical regions with regard to climate change. Carbon monoxide (CO) is one of the major ozone precursors (originating from all types of combustion) in the troposphere. In the UTLS, it also has implications for stratospheric chemistry and indirect radiative forcing effects (as a chemical precursor of CO2 and O3). Assessing the global distribution (and possibly trends) of O3 and CO in this region of the atmosphere, combining high-resolution in situ data and the most appropriate global 3D model to further quantify the different sources and their origins, is therefore of particular interest. This is one of the objectives of the MOZAIC-IAGOS (http://www.iagos.fr) and MACC-II (http://www.gmes-atmosphere.eu) European programs. The aircraft of the MOZAIC program have regularly collected simultaneous O3 and CO data all over the world since the end of 2001. Most of the data are recorded in the northern mid-latitudes, in the UTLS region (as commercial aircraft cruise altitudes are between 9 and 12 km). MACC-II aims to provide information services covering air quality, climate forcing and stratospheric ozone, UV radiation and solar-energy resources, using near-real-time analysis and forecasting products, and reanalyses. The validation reports of the MACC models are regularly published (http://www.gmes-atmosphere.eu/services/gac/nrt/ and http://www.gmes-atmosphere.eu/services/gac/reanalysis/). We will present and discuss the performance of the MACC reanalysis, which couples the ECMWF Integrated Forecasting System (IFS) to the CTM MOZART with 4D-Var data assimilation, in reproducing ozone and CO in the UTLS, as evaluated against MOZAIC observations between 2003 and 2008. In the UT, the model tends to overestimate O3 by about 30-40% in the mid-latitudes and polar regions. This applies broadly to all seasons but is more marked in DJF and MAM. In tropical regions, the model underestimates UT ozone by about 20% in all seasons, but this is stronger in JJA. Upper-tropospheric CO is globally underestimated by the model in all seasons, by 10-20%. In the Southern Hemisphere, this is particularly the case in SON in the regions of wildfires in South Africa. In the Northern Hemisphere, the zonal gradient of CO between the US, Europe and Asia is not well captured by the model, especially in MAM.

  13. The development of a classification system for maternity models of care.

    PubMed

    Donnolley, Natasha; Butler-Henderson, Kerryn; Chapman, Michael; Sullivan, Elizabeth

    2016-08-01

    A lack of standard terminology or means to identify and define models of maternity care in Australia has prevented accurate evaluations of outcomes for mothers and babies in different models of maternity care. As part of the Commonwealth-funded National Maternity Data Development Project, a classification system was developed utilising a data set specification that defines characteristics of models of maternity care. The Maternity Care Classification System or MaCCS was developed using a participatory action research design that built upon the published and grey literature. The study identified the characteristics that differentiate models of care and classifies models into eleven different Major Model Categories. The MaCCS will enable individual health services, local health districts (networks), jurisdictional and national health authorities to make better informed decisions for planning, policy development and delivery of maternity services in Australia. © The Author(s) 2016.

  14. Use of the RenalGuard system to prevent contrast-induced AKI: A meta-analysis.

    PubMed

    Mattathil, Stephanie; Ghumman, Saad; Weinerman, Jonathan; Prasad, Anand

    2017-10-01

    Contrast-induced acute kidney injury (CI-AKI) following cardiovascular interventions results in increased morbidity and mortality. RenalGuard (RG) is a novel, closed-loop system that balances volume administration with forced diuresis to maintain a high urine output. We performed a meta-analysis of the existing data comparing the use of RG with conventional volume expansion. Ten studies were found eligible, of which four were randomized controlled trials. Of an aggregate sample size (N) of 1585 patients, 698 were enrolled in the four RCTs and 887 belonged to the remaining registries included in this meta-analysis. Primary outcomes included CI-AKI incidence and relative risk. Mortality, dialysis, and major adverse cardiovascular events (MACCE) were secondary outcomes. A random-effects model was used and the data were evaluated for publication bias. RG was associated with a significant risk reduction in CI-AKI compared with control (RR: 0.30, 95%CI: 0.18-0.50, P < 0.01). The incidence of CI-AKI was 7.7% with RG versus 23.6% in the control group (P < 0.01). Use of RG was associated with decreased mortality (RR: 0.43, 95%CI: 0.18-0.99, P = 0.05), dialysis (RR: 0.20, 95%CI: 0.06-0.61, P = 0.01), and MACCE (RR: 0.42, 95%CI: 0.27-0.65, P < 0.01) compared with control. RG significantly reduces rates of CI-AKI compared with standard volume expansion and is also associated with decreased rates of death, dialysis, and MACCE. © 2017, Wiley Periodicals, Inc.
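
    The pooled relative risks reported above are built from per-study 2x2 tables. A minimal sketch of a single-study relative risk with a normal-approximation 95% CI is shown below; the counts are placeholders and the random-effects pooling step is omitted.

    ```python
    # Single-study relative risk with a 95% CI (normal approximation on log RR);
    # placeholder counts, not data from the meta-analysis above.
    import math

    def relative_risk(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
        rr = (events_trt / n_trt) / (events_ctl / n_ctl)
        se_log = math.sqrt(1 / events_trt - 1 / n_trt + 1 / events_ctl - 1 / n_ctl)
        lo = math.exp(math.log(rr) - z * se_log)
        hi = math.exp(math.log(rr) + z * se_log)
        return rr, lo, hi

    print(relative_risk(events_trt=12, n_trt=150, events_ctl=35, n_ctl=148))
    ```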

  15. Construction of a Calibrated Probabilistic Classification Catalog: Application to 50k Variable Sources in the All-Sky Automated Survey

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Brink, Henrik; Crellin-Quick, Arien

    2012-12-01

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
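
    A hedged sketch of the calibration step described above (producing class posterior probabilities that are reasonably calibrated) is given below, using scikit-learn's CalibratedClassifierCV on a random forest with synthetic features standing in for the ASAS light-curve features; it illustrates the general technique, not the authors' specific classifier or calibration pipeline.

    ```python
    # Sketch: calibrated class probabilities for a multiclass classifier, using
    # isotonic calibration on a random forest; synthetic features, not ASAS data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, n_classes=4,
                               n_informative=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = CalibratedClassifierCV(
        RandomForestClassifier(n_estimators=200, random_state=0),
        method="isotonic", cv=5)
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)        # calibrated class probabilities
    print(proba[:3].round(3), proba[:3].sum(axis=1))  # rows sum to 1
    ```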

  16. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  17. A comparison of two brands of clopidogrel in patients with drug-eluting stent implantation.

    PubMed

    Park, Yae Min; Ahn, Taehoon; Lee, Kyounghoon; Shin, Kwen-Chul; Jung, Eul Sik; Shin, Dong Su; Kim, Myeong Gun; Kang, Woong Chol; Han, Seung Hwan; Choi, In Suck; Shin, Eak Kyun

    2012-07-01

    Although generic clopidogrel is widely used, the clinical efficacy and safety of generic versus original clopidogrel have not been well evaluated. The aim of this study was to evaluate the clinical outcomes of 2 oral formulations of clopidogrel 75 mg tablets in patients with coronary artery disease (CAD) undergoing drug-eluting stent (DES) implantation. Between July 2006 and February 2009, 428 patients who underwent implantation with DES for CAD and completed >1 year of clinical follow-up were enrolled in this study. Patients were divided into the following 2 groups based on treatment formulation: Platless® (test formulation, n=211) or Plavix® (reference formulation, n=217). The incidence of 1-year major adverse cardiovascular and cerebrovascular events (MACCE) and stent thrombosis (ST) was retrospectively reviewed. The baseline demographic and procedural characteristics were not significantly different between the two treatment groups. The incidence of 1-year MACCE was 8.5% (19/211; 2 deaths, 4 myocardial infarctions (MIs), 2 strokes, and 11 target vessel revascularizations (TVRs)) in the Platless® group vs. 7.4% (16/217; 4 deaths, 1 MI, 2 strokes, and 9 TVRs) in the Plavix® group (p=0.66). The incidence of 1-year ST was 0.5% (1 definite, subacute ST) in the Platless® group vs. 0% in the Plavix® group (p=0.49). In this study, the 2 tablet preparations of clopidogrel showed similar rates of MACCE, but additional prospective randomized studies with pharmacodynamic and platelet reactivity data are needed to conclude whether generic clopidogrel may replace original clopidogrel.

  18. A study of cellular counting to determine minimum thresholds for adequacy for liquid-based cervical cytology using a survey and counting protocol.

    PubMed

    Kitchener, Henry C; Gittins, Matthew; Desai, Mina; Smith, John H F; Cook, Gary; Roberts, Chris; Turnbull, Lesley

    2015-03-01

    Liquid-based cytology (LBC) for cervical screening would benefit from laboratory practice guidelines that define specimen adequacy for reporting of slides. The evidence base required to define cell adequacy should incorporate both ThinPrep™ (TP; Hologic, Inc., Bedford, MA, USA) and SurePath™ (SP; BD Diagnostics, Burlington, NC, USA), the two LBC systems used in the UK cervical screening programmes. The objectives of this study were to determine (1) current practice for reporting LBC in England, Wales and Scotland, (2) a reproducible method for cell counting, (3) the cellularity of slides classified as inadequate, negative or abnormal and (4) the impact of varying cellularity on the likelihood of detecting cytological abnormalities. The study involved four separate arms to pursue each of the four objectives. (1) A questionnaire survey of laboratories was conducted. (2) A standard counting protocol was developed and used by three experienced cytopathologists to determine a reliable and reproducible cell counting method. (3) Slide sets which included a range of cytological abnormalities were each sent to three laboratories for cell counting to study the correlation between cell counts and reported cytological outcomes. (4) Dilution of LBC samples with fluid only (unmixed) or with a sample containing normal cells (mixed) was performed to study the impact on reporting of reducing either the total cell count or the relative proportion of abnormal to normal cells. The study was conducted within the cervical screening programmes in England, Wales and Scotland, in 56 participating NHS cervical cytology laboratories, using only routinely obtained cervical screening samples; there was no clinical intervention. The main outcome measures were (1) reliability of the counting method, (2) correlation of reported cytology grades with cellularity and (3) levels of detection of abnormal cells in progressively diluted cervical samples. Laboratory practice varied in terms of the threshold of cellular adequacy and of morphological markers of adequacy. While SP laboratories generally used a minimum acceptable cell count (MACC) of 15,000, the MACC employed by TP laboratories varied between 5000 and 15,000. The cell counting study showed that a standard protocol achieved moderate to strong inter-rater reproducibility. Analysis of slide reporting from laboratories revealed that a large proportion of the samples reported as inadequate had cell counts above a threshold of 15,000 for SP, and 5000 and 10,000 for TP. Inter-rater unanimity was greater among more cellular preparations. Dilution studies demonstrated greater detection of abnormalities in slides with counts above the MACC and among slides with more than 25 dyskaryotic cells. Variation in laboratory practice demonstrates a requirement for evidence-based standards for designating a MACC. This study has indicated that a MACC of 15,000 and 5000 for SP and TP, respectively, achieves a balance in terms of maintaining sensitivity and low inadequacy rates. The findings of this study should inform the development of laboratory practice guidelines. The National Institute for Health Research Health Technology Assessment programme.

  19. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

    IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats, graphical tools (maps, scatter plots, etc.), standardized metadata (ISO 19115), and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will allow model outputs to be combined with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented through web services, will improve the functionalities of the web interfaces of each data centre.
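
    For readers who obtain IAGOS products in NetCDF form, the snippet below shows one minimal way to inspect such a file with the netCDF4 Python library. The file name and any variable names it prints are hypothetical and depend on the product actually downloaded from the portal.

```python
# Inspect a (hypothetical) IAGOS NetCDF download: global title plus variables.
from netCDF4 import Dataset

with Dataset("IAGOS_timeseries_example.nc") as nc:   # hypothetical file name
    print(nc.title if "title" in nc.ncattrs() else "no title attribute")
    for name, var in nc.variables.items():
        print(name, var.dimensions, getattr(var, "units", ""))
```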

  20. GLANCE - calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE

    NASA Astrophysics Data System (ADS)

    Vogel, Leif; Faria, Sérgio; Markandya, Anil

    2016-04-01

    Premature deaths from poor air quality are currently estimated at 2.6-4.4 million per year globally, and projections for 2050 are expected to double relative to 2010 levels. In Europe, the annual economic burden is estimated at around €750 bn. Climate change will further exacerbate air pollution burdens; therefore, a better understanding of the economic impacts on human societies has become an area of intense investigation. European research efforts are being carried out within the MACC project series, which started in 2005. The outcome of this work has been integrated into a European capacity for Earth Observation, the Copernicus Atmospheric Monitoring Service (CAMS). In MACC/CAMS, key pollutant concentrations are computed at the European scale and globally by employing advanced chemistry transport models. The project GLANCE (calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE) aims at developing an integrated assessment model for calculating the health impacts and damage costs of air pollution at different physical scales. It combines MACC/CAMS (assimilated Earth Observations, an ensemble of chemical transport models and state-of-the-art ECMWF weather forecasting) with downscaling based on in-situ network measurements. Strengthening modelled projections through integration with empirical evidence reduces errors and uncertainties in the health impact projections and the subsequent economic cost assessment. In addition, GLANCE will yield improved data accuracy at different time resolutions. The project takes a multidisciplinary approach, bringing together expertise from the natural sciences and socioeconomic fields. Here, its general approach will be presented together with first results for the years 2007-2012 at the European scale. The results on health impacts and economic burdens are compared to existing assessments.

  1. Comparative Effectiveness of Blood Pressure-lowering Drugs in Patients who have Already Suffered From Stroke

    PubMed Central

    Wang, Wei-Ting; You, Li-Kai; Chiang, Chern-En; Sung, Shih-Hsien; Chuang, Shao-Yuan; Cheng, Hao-Min; Chen, Chen-Huan

    2016-01-01

    Hypertension is the most important risk factor for stroke and stroke recurrence. However, the preferred blood pressure (BP)-lowering drug class for patients who have suffered from a stroke has yet to be determined. To investigate the relative effects of BP-lowering therapies [angiotensin-converting enzyme inhibitor (ACEI), angiotensin receptor blockers (ARB), β blockers, calcium channel blockers (CCBs), diuretics, and combinations of these drugs] in patients with a prior stroke history, we performed a systematic review and meta-analysis using both traditional frequentist and Bayesian random-effects models and meta-regression of randomized controlled trials (RCTs) on the outcomes of recurrent stroke, coronary heart disease (CHD), and any major adverse cardiac and cerebrovascular events (MACCE). Trials were identified from searches of published hypertension guidelines, electronic databases, and previous systematic reviews. Fifteen RCTs comprising 39,329 participants with previous stroke were identified. Compared with placebo, only ACEI along with diuretics significantly reduced recurrent stroke events [odds ratio (OR) = 0.54, 95% credibility interval (95% CI) 0.33–0.90]. On the basis of the distribution of posterior probabilities, the treatment ranking consistently identified ACEI along with diuretics as the preferred BP-lowering strategy for the reduction of recurrent stroke and CHD (31% and 35%, respectively). For preventing MACCE, diuretics appeared to be the preferred agent for stroke survivors (34%). Moreover, the meta-regression analysis failed to demonstrate a statistically significant association between BP reduction and any of the outcomes (P = 0.1618 for total stroke, 0.4933 for CHD, and 0.2411 for MACCE). Evidence from RCTs supports the use of diuretic-based treatment, especially when combined with ACEI, for the secondary prevention of recurrent stroke and any vascular events in patients who have suffered from stroke. PMID:27082571
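
    The treatment ranking quoted above (31% and 35%) rests on the posterior probability that each regimen is best. The sketch below illustrates only that ranking step, applied to made-up normal "posteriors" for a few regimens; it is not the Bayesian network meta-analysis itself, and the numbers are invented.

```python
# Rank treatments by the Monte Carlo probability of being best (illustrative).
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical posterior means/SDs of log odds ratios vs. placebo
posteriors = {
    "ACEI + diuretic": (-0.62, 0.25),
    "ARB":             (-0.15, 0.20),
    "CCB":             (-0.20, 0.22),
    "Diuretic":        (-0.30, 0.18),
}
draws = np.column_stack([rng.normal(m, s, 20000) for m, s in posteriors.values()])
best = draws.argmin(axis=1)          # most negative log-OR = best treatment
for i, name in enumerate(posteriors):
    print(f"P({name} is best) = {np.mean(best == i):.2f}")
```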

  2. Global height-resolved methane retrievals from the Infrared Atmospheric Sounding Interferometer (IASI) on MetOp

    NASA Astrophysics Data System (ADS)

    Siddans, Richard; Knappett, Diane; Kerridge, Brian; Waterfall, Alison; Hurley, Jane; Latter, Barry; Boesch, Hartmut; Parker, Robert

    2017-11-01

    This paper describes the global height-resolved methane (CH4) retrieval scheme for the Infrared Atmospheric Sounding Interferometer (IASI) on MetOp, developed at the Rutherford Appleton Laboratory (RAL). The scheme precisely fits measured spectra in the 7.9 micron region to allow information to be retrieved on two independent layers centred in the upper and lower troposphere. It also uses nitrous oxide (N2O) spectral features in the same spectral interval to directly retrieve effective cloud parameters to mitigate errors in retrieved methane due to residual cloud and other geophysical variables. The scheme has been applied to analyse IASI measurements between 2007 and 2015. Results are compared to model fields from the MACC greenhouse gas inversion and independent measurements from satellite (GOSAT), airborne (HIPPO) and ground (TCCON) sensors. The estimated error on methane mixing ratio in the lower- and upper-tropospheric layers ranges from 20 to 100 ppbv and from 30 to 40 ppbv, respectively, and the error on the derived column-average ranges from 20 to 40 ppbv. Vertical sensitivity extends through the lower troposphere, though it decreases near the surface. Systematic differences with the other datasets are typically < 10 ppbv regionally and < 5 ppbv globally. In the Southern Hemisphere, a bias of around 20 ppbv is found with respect to MACC, which is not explained by vertical sensitivity or found in comparison of IASI to TCCON. Comparisons to HIPPO and MACC support the assertion that two layers can be independently retrieved and provide confirmation that the estimated random errors on the column- and layer-averaged amounts are realistic. The data have been made publicly available via the Centre for Environmental Data Analysis (CEDA) data archive (Siddans, 2016).

  3. A new method for assessing surface solar irradiance: Heliosat-4

    NASA Astrophysics Data System (ADS)

    Qu, Z.; Oumbe, A.; Blanc, P.; Lefèvre, M.; Wald, L.; Schroedter-Homscheidt, M.; Gesell, G.

    2012-04-01

    Downwelling shortwave irradiance at the surface (SSI) is increasingly assessed by means of satellite-derived estimates of the optical properties of the atmosphere. Performance is currently judged satisfactory, but there is an increasing need for the assessment of the direct and diffuse components of the SSI. MINES ParisTech and the German Aerospace Center (DLR) are currently developing the Heliosat-4 method to assess the SSI and its components more accurately than current practices. The method is composed of two parts: a clear-sky module based on the radiative transfer model libRadtran, and a cloud-ground module using two-stream and delta-Eddington approximations for clouds and a database of ground albedo. Advanced products derived from geostationary satellites and recent Earth Observation missions are the inputs of the Heliosat-4 method. Such products are: cloud optical depth, cloud phase, cloud type and cloud coverage from APOLLO of DLR; aerosol optical depth, aerosol type, water vapor in clear sky and ozone from MACC products (FP7); and ground albedo from MODIS of NASA. In this communication, we briefly present Heliosat-4 and focus on its performance. The results of Heliosat-4 for the period 2004-2010 will be compared to measurements made at five stations within the Baseline Surface Radiation Network. Extensive statistical analyses as well as case studies are performed in order to gain an in-depth view of the performance of Heliosat-4, to understand its advantages compared to existing methods, and to identify its shortcomings for future improvements. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 218793 (MACC project) and no. 283576 (MACC-II project).
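
    Structurally, the method combines a clear-sky irradiance with a cloud modification derived from satellite cloud properties and ground albedo. The sketch below mirrors only that structure; the transmittance and multiple-reflection expressions are assumed, illustrative placeholders, not the libRadtran or delta-Eddington computations used in Heliosat-4.

```python
# Clear-sky irradiance times an (assumed, placeholder) cloud modification factor.
def cloud_transmittance(tau: float, g: float = 0.85) -> float:
    """Illustrative decreasing function of cloud optical depth tau."""
    return 1.0 / (1.0 + 0.75 * (1.0 - g) * tau)

def surface_irradiance(ghi_clear: float, cloud_optical_depth: float,
                       ground_albedo: float = 0.2) -> float:
    t = cloud_transmittance(cloud_optical_depth)
    # Crude enhancement term standing in for cloud-ground multiple reflections.
    multiple_reflection = 1.0 / (1.0 - 0.3 * ground_albedo * (1.0 - t))
    return ghi_clear * t * multiple_reflection

# Example: 800 W/m2 clear-sky GHI under a cloud of optical depth 10
print(round(surface_irradiance(800.0, 10.0), 1))
```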

  4. Triple antithrombotic therapy versus dual antiplatelet therapy in patients with atrial fibrillation undergoing drug-eluting stent implantation.

    PubMed

    Kang, Dong Oh; Yu, Cheol Woong; Kim, Hee Dong; Cho, Jae Young; Joo, Hyung Joon; Choi, Rak Kyong; Park, Jin Sik; Lee, Hyun Jong; Kim, Je Sang; Park, Jae Hyung; Hong, Soon Jun; Lim, Do-Sun

    2015-08-01

    The optimal antithrombotic regimen in patients with atrial fibrillation (AF) undergoing drug-eluting stent (DES) implantation for complex coronary artery disease is unclear. We compared the net clinical outcomes of triple antithrombotic therapy (TAT; aspirin, thienopyridine, and warfarin) and dual antiplatelet therapy (DAPT; aspirin and thienopyridine) in AF patients who had undergone DES implantation. A total of 367 patients were enrolled and analyzed retrospectively; 131 patients (35.7%) received TAT and 236 patients (64.3%) received DAPT. DAPT and warfarin were maintained for a minimum of 12 and 24 months, respectively. The primary endpoint was the 2-year net clinical outcomes, a composite of major bleeding and major adverse cardiac and cerebral events (MACCE). Propensity score-matching analysis was carried out in 99 patient pairs. The 2-year net clinical outcomes of the TAT group were worse than those of the DAPT group (34.3 vs. 21.1%, P=0.006), which was mainly due to the higher incidence of major bleeding (16.7 vs. 4.6%, P<0.001), without any significant increase in MACCE (22.1 vs. 17.7%, P=0.313). In the multivariate analysis, TAT was an independent predictor of worse net clinical outcomes (odds ratio 1.63, 95% confidence interval 1.06-2.50) and major bleeding (odds ratio 3.54, 95% confidence interval 1.65-7.58). After propensity score matching, the TAT group still had worse net clinical outcomes and a higher incidence of major bleeding compared with the DAPT group. In AF patients undergoing DES implantation, prolonged administration of TAT may be harmful due to the substantial increase in the risk for major bleeding without any reduction in MACCE.

  5. Clinical events after interruption of anticoagulation in patients with atrial fibrillation: An analysis from the ENGAGE AF-TIMI 48 trial.

    PubMed

    Cavallari, Ilaria; Ruff, Christian T; Nordio, Francesco; Deenadayalu, Naveen; Shi, Minggao; Lanz, Hans; Rutman, Howard; Mercuri, Michele F; Antman, Elliott M; Braunwald, Eugene; Giugliano, Robert P

    2018-04-15

    Patients with atrial fibrillation (AF) who interrupt anticoagulation are at high risk of thromboembolism and death. Patients enrolled in the ENGAGE AF-TIMI 48 trial (randomized comparison of edoxaban vs. warfarin) who interrupted study anticoagulant for >3 days were identified. Clinical events (ischemic stroke/systemic embolism, major cardiac and cerebrovascular events [MACCE]) were analyzed from day 4 after interruption until day 34 or study drug resumption. During 2.8 years median follow-up, 13,311 (63%) patients interrupted study drug for >3 days. After excluding those who received open-label anticoagulation during the at-risk window, the population for analysis included 9148 patients. The rates of ischemic stroke/systemic embolism and MACCE post interruption were substantially greater than in patients who never interrupted (15.42 vs. 0.26 and 60.82 vs. 0.36 per 100 patient-years, respectively; adjusted p < 0.001). Patients who interrupted study drug for an adverse event (44.1% of the cohort), compared to those who interrupted for other reasons, had an increased risk of MACCE (adjusted HR 2.75; 95% CI 2.02-3.74, p < 0.0001), but similar rates of ischemic stroke/systemic embolism. Rates of clinical events after interruption of warfarin and edoxaban were similar. Interruption of study drug was frequent in patients with AF and was associated with a substantial risk of major cardiac and cerebrovascular events over the ensuing 30 days. This risk was particularly high in patients who interrupted as a result of an adverse event; these patients deserve close monitoring and resumption of anticoagulation as soon as it is safe to do so. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Impact of remote ischaemic preconditioning on major clinical outcomes in patients undergoing cardiovascular surgery: A meta-analysis with trial sequential analysis of 32 randomised controlled trials.

    PubMed

    Wang, Shifei; Li, Hairui; He, Nvqin; Sun, Yili; Guo, Shengcun; Liao, Wangjun; Liao, Yulin; Chen, Yanmei; Bin, Jianping

    2017-01-15

    The impact of remote ischaemic preconditioning (RIPC) on major clinical outcomes in patients undergoing cardiovascular surgery remains controversial. We systematically reviewed the available evidence to evaluate the potential benefits of RIPC in such patients. PubMed, Embase, and Cochrane Library databases were searched for relevant randomised controlled trials (RCTs) conducted between January 2006 and March 2016. The pooled population of patients who underwent cardiovascular surgery was divided into the RIPC and control groups. Trial sequential analysis was applied to judge data reliability. The pooled relative risks (RRs) with 95% confidence intervals (CIs) between the groups were calculated for all-cause mortality, major adverse cardiovascular and cerebral events (MACCEs), myocardial infarction (MI), and renal failure. RIPC was not associated with improvement in all-cause mortality (RR, 1.04; 95% CI, 0.82-1.31; I^2 = 26%; P>0.05) or MACCE incidence (RR, 0.90; 95% CI, 0.71-1.14; I^2 = 40%; P>0.05) after cardiovascular surgery, and both results were assessed by trial sequential analysis as sufficient and conclusive. Nevertheless, RIPC was associated with a significantly lower incidence of MI (RR, 0.87; 95% CI, 0.76-1.00; I^2 = 13%; P≤0.05). However, after excluding a study that had a high contribution to heterogeneity, RIPC was associated with increased rates of renal failure (RR, 1.53; 95% CI, 1.12-2.10; I^2 = 5%; P≤0.05). In patients undergoing cardiovascular surgery, RIPC reduced the risk for postoperative MI, but not that for MACCEs or all-cause mortality, a discrepancy likely related to the higher rate of renal failure associated with RIPC. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
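
    For orientation, the pooled relative risks and I^2 values reported above are the output of a random-effects meta-analysis. The sketch below runs a DerSimonian-Laird pooling with an I^2 heterogeneity statistic on three invented two-arm studies; the numbers are not taken from the reviewed trials.

```python
# DerSimonian-Laird random-effects pooling of relative risks, with I^2.
import numpy as np

# (events_treatment, n_treatment, events_control, n_control) -- invented data
studies = [(12, 200, 20, 200), (8, 150, 9, 150), (30, 500, 45, 500)]

y, v = [], []
for a, n1, c, n2 in studies:
    y.append(np.log((a / n1) / (c / n2)))   # log relative risk
    v.append(1/a - 1/n1 + 1/c - 1/n2)       # its approximate variance
y, v = np.array(y), np.array(v)

w = 1 / v
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
df = len(y) - 1
i2 = max(0.0, (q - df) / q) * 100
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (v + tau2)                       # random-effects weights
pooled = np.exp(np.sum(w_re * y) / np.sum(w_re))
se = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * se)
print(f"pooled RR = {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```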

  7. Red light regulation of ethylene biosynthesis and gravitropism in etiolated pea stems

    NASA Technical Reports Server (NTRS)

    Steed, C. L.; Taylor, L. K.; Harrison, M. A.

    2004-01-01

    During gravitropism, the accumulation of auxin in the lower side of the stem causes increased growth and the subsequent curvature, while the gaseous hormone ethylene plays a modulating role in regulating the kinetics of growth asymmetries. Light also contributes to the control of gravitropic curvature, potentially through its interaction with ethylene biosynthesis. In this study, red-light pulse treatment of etiolated pea epicotyls was evaluated for its effect on ethylene biosynthesis during gravitropic curvature. Ethylene biosynthesis analysis included measurements of ethylene; the ethylene precursor 1-aminocyclopropane-1-carboxylic acid (ACC); malonyl-conjugated ACC (MACC); and expression levels of pea ACC oxidase (Ps-ACO1) and ACC synthase (Ps-ACS1, Ps-ACS2) genes by reverse transcriptase-polymerase chain reaction analysis. Red-pulsed seedlings were given a 6 min pulse of 11 micromol m^-2 s^-1 red light 15 h prior to horizontal reorientation for consistency with the timeline of red-light inhibition of ethylene production. Red-pulse treatment significantly reduced ethylene production and MACC levels in epicotyl tissue. However, there was no effect of red-pulse treatment on ACC level, or expression of ACS or ACO genes. During gravitropic curvature, ethylene production increased from 60 to 120 min after horizontal placement in both control and red-pulsed epicotyls. In red-pulsed tissues, ACC levels increased by 120 min after horizontal reorientation, accompanied by decreased MACC levels in the lower portion of the epicotyl. Overall, our results demonstrate that ethylene production in etiolated epicotyls increases after the initiation of curvature. This ethylene increase may inhibit cell growth in the lower portion of the epicotyl and contribute to tip straightening and reduced overall curvature observed after the initial 60 min of curvature in etiolated pea epicotyls.

  8. Elevated troponin predicts long-term adverse cardiovascular outcomes in hypertensive crisis: a retrospective study.

    PubMed

    Pattanshetty, Deepak J; Bhat, Pradeep K; Aneja, Ashish; Pillai, Dilip P

    2012-12-01

    Hypertensive crisis is associated with poor clinical outcomes. Elevated troponin, frequently observed in hypertensive crisis, may be attributed to myocardial supply-demand mismatch or obstructive coronary artery disease (CAD). However, in patients presenting with hypertensive crisis and an elevated troponin, the prevalence of CAD and the long-term adverse cardiovascular outcomes are unknown. We sought to assess the impact of elevated troponin on cardiovascular outcomes and evaluate the role of troponin as a predictor of obstructive CAD in patients with hypertensive crisis. Patients who presented with hypertensive crisis (n = 236) were screened retrospectively. Baseline and follow-up data including the event rates were obtained using electronic patient records. Those without an assay for cardiac Troponin I (cTnI) (n = 65) were excluded. Of the remaining 171 patients, those with elevated cTnI (cTnI ≥ 0.12 ng/ml) (n = 56) were compared with those with normal cTnI (cTnI < 0.12 ng/ml) (n = 115) at 2 years for the occurrence of major adverse cardiac or cerebrovascular events (MACCE) (composite of myocardial infarction, unstable angina, hypertensive crisis, pulmonary edema, stroke or transient ischemic attack). At 2 years, MACCE occurred in 40 (71.4%) patients with elevated cTnI compared with 44 (38.3%) patients with normal cTnI [hazard ratio: 2.77; 95% confidence interval (CI): 1.79-4.27; P < 0.001]. Also, patients with elevated cTnI were significantly more likely to have underlying obstructive CAD (odds ratio: 8.97; 95% CI: 1.4-55.9; P < 0.01). In patients with hypertensive crisis, elevated cTnI confers a significantly greater risk of long-term MACCE, and is a strong predictor of obstructive CAD.

  9. MISR Regional GoMACCS Map Projection

    Atmospheric Science Data Center

    2017-03-29

    [Truncated page content: MISR Regional GoMACCS imagery documentation sections — Overview | Products | Data Quality | Map Projection | File Format | View Data. The surviving text notes that the HDF-EOS library, GCTP, and IDL packages all convert to and from the map projection, which is relevant for high-precision work.]

  10. Deconvolution of magnetic acoustic change complex (mACC).

    PubMed

    Bardy, Fabrice; McMahon, Catherine M; Yau, Shu Hui; Johnson, Blake W

    2014-11-01

    The aim of this study was to design a novel experimental approach to investigate the morphological characteristics of auditory cortical responses elicited by rapidly changing synthesized speech sounds. Six sound-evoked magnetoencephalographic (MEG) responses were measured to a synthesized train of speech sounds using the vowels /e/ and /u/ in 17 normal hearing young adults. Responses were measured to: (i) the onset of the speech train; (ii) an F0 increment; (iii) an F0 decrement; (iv) an F2 decrement; (v) an F2 increment; and (vi) the offset of the speech train, using short (jittered around 135 ms) and long (1500 ms) stimulus onset asynchronies (SOAs). The least squares (LS) deconvolution technique was used to disentangle the overlapping MEG responses in the short SOA condition only. Comparison between the morphology of the recovered cortical responses in the short and long SOA conditions showed high similarity, suggesting that the LS deconvolution technique was successful in disentangling the MEG waveforms. Waveform latencies and amplitudes were different for the two SOA conditions and were influenced by the spectro-temporal properties of the sound sequence. The magnetic acoustic change complex (mACC) for the short SOA condition showed significantly lower amplitudes and shorter latencies compared to the long SOA condition. The F0 transition showed a larger reduction in amplitude from long to short SOA compared to the F2 transition. Lateralization of the cortical responses was observed under some stimulus conditions and appeared to be associated with the spectro-temporal properties of the acoustic stimulus. The LS deconvolution technique provides a new tool to study the properties of the auditory cortical response to rapidly changing sound stimuli. The presence of cortical auditory evoked responses to rapid transitions of synthesized speech stimuli suggests that the temporal code is preserved at the level of the auditory cortex. Further, the reduced amplitudes and shorter latencies might reflect intrinsic properties of the cortical neurons' responses to rapidly presented sounds. This is the first demonstration of the separation of overlapping cortical responses to rapidly changing speech sounds and offers a potential new biomarker for the discrimination of rapid sound transitions. Crown Copyright © 2014. Published by Elsevier Ireland Ltd. All rights reserved.
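
    The least squares deconvolution referred to above can be pictured as solving a linear system in which the recording is modelled as one unknown fixed-length response shifted to each jittered stimulus onset. The sketch below demonstrates that idea on a synthetic trace; the waveform, onsets, and noise level are invented, and this is not the authors' implementation.

```python
# Least-squares deconvolution of overlapping responses from jittered onsets.
import numpy as np

fs = 1000                                    # sampling rate, Hz
resp_len = 400                               # samples of the unknown response
t = np.arange(resp_len) / fs
true_resp = np.exp(-t / 0.08) * np.sin(2 * np.pi * 8 * t)

rng = np.random.default_rng(1)
onsets = np.cumsum(rng.integers(120, 160, size=40))   # jittered SOAs
n = onsets[-1] + resp_len
recording = np.zeros(n)
for o in onsets:                             # overlapping responses + noise
    recording[o:o + resp_len] += true_resp
recording += 0.2 * rng.normal(size=n)

X = np.zeros((n, resp_len))                  # design matrix of shifted onsets
for o in onsets:
    X[o:o + resp_len, :] += np.eye(resp_len)
est, *_ = np.linalg.lstsq(X, recording, rcond=None)
print("max recovery error:", np.round(np.max(np.abs(est - true_resp)), 3))
```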

  11. Marginal abatement cost curves for NOx incorporating both controls and alternative measures

    EPA Science Inventory

    A marginal abatement cost curve (MACC) traces out the efficient marginal abatement cost level for any aggregate emissions target when a least cost approach is implemented. In order for it to represent the efficient MAC level, all abatement opportunities across all sectors and loc...
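
    As a toy illustration of how such a curve is assembled, the sketch below sorts hypothetical abatement options by cost per tonne and accumulates their abatement potential; the options, quantities, and costs are invented and unrelated to the EPA analysis.

```python
# Assemble a marginal abatement cost curve from hypothetical NOx options.
options = [
    # (name, abatement in kt NOx, marginal cost in $/t)
    ("boiler retrofit",       40,   800),
    ("SCR on power plants",  120,  2500),
    ("fleet electrification", 60,  1900),
    ("fuel switching",        30,  -200),   # negative cost = net saving
]

cumulative = 0.0
for name, abatement, cost in sorted(options, key=lambda o: o[2]):
    cumulative += abatement
    print(f"up to {cumulative:5.0f} kt abated, marginal cost {cost:6.0f} $/t ({name})")
```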

  12. Consistent evaluation of GOSAT, SCIAMACHY, carbontracker, and MACC through comparisons to TCCON

    DOE PAGES

    Kulawik, S. S.; Wunch, D.; O'Dell, C.; ...

    2015-06-22

    Consistent validation of satellite CO2 estimates is a prerequisite for using multiple satellite CO2 measurements for joint flux inversion, and for establishing an accurate long-term atmospheric CO2 data record. We focus on validating model and satellite observation attributes that impact flux estimates and CO2 assimilation, including accurate error estimates, correlated and random errors, overall biases, biases by season and latitude, the impact of coincidence criteria, validation of seasonal cycle phase and amplitude, yearly growth, and daily variability. We evaluate the dry air mole fraction (XCO2) for GOSAT (ACOS b3.5) and SCIAMACHY (BESD v2.00.08) as well as the CarbonTracker (CT2013b) simulated CO2 mole fraction fields and the MACC CO2 inversion system (v13.1), and compare these to TCCON observations (GGG2014). We find standard deviations of 0.9, 0.9, 1.7, and 2.1 ppm versus TCCON for CT2013b, MACC, GOSAT, and SCIAMACHY, respectively, with the single-target errors being 1.9 and 0.9 times the predicted errors for GOSAT and SCIAMACHY, respectively. When satellite data are averaged and interpreted according to error^2 = a^2 + b^2/n (where n is the number of observations averaged, a is the systematic (correlated) error, and b is the random (uncorrelated) error), we find that the correlated error term a = 0.6 ppm and the uncorrelated error term b = 1.7 ppm for GOSAT, and a = 1.0 ppm, b = 1.4 ppm for SCIAMACHY regional averages. Biases at individual stations have year-to-year variability of ~0.3 ppm, with biases larger than the TCCON predicted bias uncertainty of 0.4 ppm at many stations. Using fitting software, we find that GOSAT underpredicts the seasonal cycle amplitude in the Northern Hemisphere (NH) between 46-53° N. In the Southern Hemisphere (SH), CT2013b underestimates the seasonal cycle amplitude. Biases are calculated for 3-month intervals and indicate the months that contribute to the observed amplitude differences. The seasonal cycle phase indicates whether a dataset or model lags another dataset in time. We calculate this at a subset of stations where there is adequate satellite data, and find that the GOSAT retrieved phase improves substantially over the prior and the SCIAMACHY retrieved phase improves substantially for 2 of 7 sites. The models reproduce the measured seasonal cycle phase well except at Lauder125 (CT2013b), Darwin (MACC), and Izana (+10 days, CT2013b), and at Bremen and Four Corners, which are highly influenced by local effects. We compare the variability within one day between TCCON and models in JJA; there is correlation between 0.2 and 0.8 in the NH, with models showing 10-100% of the variability of TCCON at different stations (except Bremen and Four Corners, which show no variability compared to TCCON) and CT2013b showing more variability than MACC. This paper highlights findings that provide inputs to estimate flux errors in model assimilations, and places where models and satellites need further investigation, e.g. the SH for models and 45-67° N for GOSAT.
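
    The averaging model quoted in this abstract, error^2 = a^2 + b^2/n, implies that the random part of the error shrinks as more soundings are averaged while the correlated part sets a floor. The short sketch below simply evaluates that expression with the GOSAT regional-average values quoted above (a = 0.6 ppm, b = 1.7 ppm).

```python
# Effective error of an n-observation average: error^2 = a^2 + b^2 / n.
import numpy as np

def average_error(a_ppm: float, b_ppm: float, n: int) -> float:
    return float(np.sqrt(a_ppm**2 + b_ppm**2 / n))

a, b = 0.6, 1.7          # ppm; GOSAT regional-average terms quoted above
for n in (1, 4, 16, 64):
    print(f"n = {n:3d}: error = {average_error(a, b, n):.2f} ppm")
# For large n the error approaches the correlated term a (0.6 ppm here).
```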

  13. Assessment of the MACC reanalysis and its influence as chemical boundary conditions for regional air quality modeling in AQMEII-2

    EPA Science Inventory

    The Air Quality Model Evaluation International Initiative (AQMEII) has now reached its second phase which is dedicated to the evaluation of online coupled chemistry-meteorology models. Sixteen modeling groups from Europe and five from North America have run regional air quality m...

  14. Validating the EXCEL hypothesis: a propensity score matched 3-year comparison of percutaneous coronary intervention versus coronary artery bypass graft in left main patients with SYNTAX score ≤32.

    PubMed

    Capodanno, Davide; Caggegi, Anna; Capranzano, Piera; Cincotta, Glauco; Miano, Marco; Barrano, Gionbattista; Monaco, Sergio; Calvo, Francesco; Tamburino, Corrado

    2011-06-01

    The aim of this study was to verify the study hypothesis of the EXCEL trial by comparing percutaneous coronary intervention (PCI) and coronary artery bypass graft (CABG) in an EXCEL-like population of patients. The upcoming EXCEL trial will test the hypothesis that left main patients with SYNTAX score ≤ 32 experience similar rates of 3-year death, myocardial infarction (MI), or cerebrovascular accidents (CVA) following revascularization by PCI or CABG. We compared the 3-year rates of death/MI/CVA and death/MI/CVA/target vessel revascularization (MACCE) in 556 patients with left main disease and SYNTAX score ≤ 32 undergoing PCI (n = 285) or CABG (n = 271). To account for confounders, outcome parameters underwent extensive statistical adjustment. The unadjusted incidence of death/MI/CVA was similar between PCI and CABG (12.7% vs. 8.4%, P = 0.892), while MACCE were higher in the PCI group compared to the CABG group (27.0% vs. 11.8%, P < 0.001). After propensity score matching, PCI was not associated with a significant increase in the rate of death/MI/CVA (11.8% vs. 10.7%, P = 0.948), while MACCE were more frequently noted among patients treated with PCI (28.8% vs. 14.1%, P = 0.002). Adjustment by means of SYNTAX score and EUROSCORE, covariates with and without propensity score, and propensity score alone did not significantly change these findings. In an EXCEL-like cohort of patients with left main disease, there seems to be a clinical equipoise between PCI and CABG in terms of death/MI/CVA. However, even in patients with SYNTAX score ≤ 32, CABG is superior to PCI when target vessel revascularization is included in the combined endpoint. Copyright © 2011 Wiley-Liss, Inc.

  15. Impact of dual antiplatelet therapy after coronary artery bypass surgery on 1-year outcomes in the Arterial Revascularization Trial.

    PubMed

    Benedetto, Umberto; Altman, Douglas G; Gerry, Stephen; Gray, Alastair; Lees, Belinda; Flather, Marcus; Taggart, David P

    2017-09-01

    There is still little evidence to support routine dual antiplatelet therapy (DAPT) with P2Y12 antagonists following coronary artery bypass grafting (CABG). The Arterial Revascularization Trial (ART) was designed to compare 10-year survival after bilateral versus single internal thoracic artery grafting. We aimed to gain insight into the effect of DAPT (with clopidogrel) following CABG on 1-year outcomes by performing a post hoc ART analysis. Among patients enrolled in the ART (n = 3102), 609 (21%) and 2308 (79%) were discharged on DAPT or aspirin alone, respectively. The primary end-point was the incidence of major adverse cerebrovascular and cardiac events (MACCE) at 1 year, including cardiac death, myocardial infarction, cerebrovascular accident and reintervention; the safety end-point was bleeding requiring hospitalization. Propensity score (PS) matching was used to create comparable groups. Among 609 PS-matched pairs, MACCE occurred in 34 (5.6%) and 34 (5.6%) patients in the DAPT and aspirin alone groups, respectively, with no significant difference between the 2 groups [hazard ratio (HR) 0.97, 95% confidence interval (CI) 0.59-1.59; P = 0.90]. Only 188 (31%) subjects completed 1 year of DAPT, and in this subgroup, the MACCE rate was 5.8% (HR 1.11, 95% CI 0.53-2.30; P = 0.78). In the overall sample, the bleeding rate was higher in the DAPT group (2.3% vs 1.1%; P = 0.02), although this difference was no longer significant after matching (2.3% vs 1.8%; P = 0.54). Based on these findings, when compared with aspirin alone, DAPT with clopidogrel prescribed at discharge was not associated with a significant reduction of adverse cardiac and cerebrovascular events at 1 year following CABG. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  16. Tropospheric chemistry in the integrated forecasting system of ECMWF

    NASA Astrophysics Data System (ADS)

    Flemming, J.; Huijnen, V.; Arteta, J.; Bechtold, P.; Beljaars, A.; Blechschmidt, A.-M.; Josse, B.; Diamantakis, M.; Engelen, R. J.; Gaudel, A.; Inness, A.; Jones, L.; Katragkou, E.; Marecal, V.; Peuch, V.-H.; Richter, A.; Schultz, M. G.; Stein, O.; Tsikerdekis, A.

    2014-11-01

    A representation of atmospheric chemistry has been included in the Integrated Forecasting System (IFS) of the European Centre for Medium-range Weather Forecasts (ECMWF). The new chemistry modules complement the aerosol modules of the IFS for atmospheric composition, which is named C-IFS. C-IFS for chemistry supersedes a coupled system, in which the Chemical Transport Model (CTM) Model for OZone and Related chemical Tracers 3 was two-way coupled to the IFS (IFS-MOZART). This paper contains a description of the new on-line implementation, an evaluation with observations and a comparison of the performance of C-IFS with MOZART and with a re-analysis of atmospheric composition produced by IFS-MOZART within the Monitoring Atmospheric Composition and Climate (MACC) project. The chemical mechanism of C-IFS is an extended version of the Carbon Bond 2005 (CB05) chemical mechanism as implemented in the CTM Transport Model 5 (TM5). CB05 describes tropospheric chemistry with 54 species and 126 reactions. Wet deposition and lightning nitrogen monoxide (NO) emissions are modelled in C-IFS using the detailed input of the IFS physics package. A one-year simulation by C-IFS, MOZART and the MACC re-analysis is evaluated against ozonesondes, carbon monoxide (CO) aircraft profiles, European surface observations of ozone (O3), CO, sulphur dioxide (SO2) and nitrogen dioxide (NO2) as well as satellite retrievals of CO, tropospheric NO2 and formaldehyde. Anthropogenic emissions from the MACC/CityZen (MACCity) inventory and biomass burning emissions from the Global Fire Assimilation System (GFAS) data set were used in the simulations by both C-IFS and MOZART. C-IFS (CB05) showed an improved performance with respect to MOZART for CO, upper tropospheric O3, winter time SO2 and was of a similar accuracy for other evaluated species. C-IFS (CB05) is about ten times more computationally efficient than IFS-MOZART.

  17. Tropospheric chemistry in the Integrated Forecasting System of ECMWF

    NASA Astrophysics Data System (ADS)

    Flemming, J.; Huijnen, V.; Arteta, J.; Bechtold, P.; Beljaars, A.; Blechschmidt, A.-M.; Diamantakis, M.; Engelen, R. J.; Gaudel, A.; Inness, A.; Jones, L.; Josse, B.; Katragkou, E.; Marecal, V.; Peuch, V.-H.; Richter, A.; Schultz, M. G.; Stein, O.; Tsikerdekis, A.

    2015-04-01

    A representation of atmospheric chemistry has been included in the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). The new chemistry modules complement the aerosol modules of the IFS for atmospheric composition, which is named C-IFS. C-IFS for chemistry supersedes a coupled system in which chemical transport model (CTM) Model for OZone and Related chemical Tracers 3 was two-way coupled to the IFS (IFS-MOZART). This paper contains a description of the new on-line implementation, an evaluation with observations and a comparison of the performance of C-IFS with MOZART and with a re-analysis of atmospheric composition produced by IFS-MOZART within the Monitoring Atmospheric Composition and Climate (MACC) project. The chemical mechanism of C-IFS is an extended version of the Carbon Bond 2005 (CB05) chemical mechanism as implemented in CTM Transport Model 5 (TM5). CB05 describes tropospheric chemistry with 54 species and 126 reactions. Wet deposition and lightning nitrogen monoxide (NO) emissions are modelled in C-IFS using the detailed input of the IFS physics package. A 1 year simulation by C-IFS, MOZART and the MACC re-analysis is evaluated against ozonesondes, carbon monoxide (CO) aircraft profiles, European surface observations of ozone (O3), CO, sulfur dioxide (SO2) and nitrogen dioxide (NO2) as well as satellite retrievals of CO, tropospheric NO2 and formaldehyde. Anthropogenic emissions from the MACC/CityZen (MACCity) inventory and biomass burning emissions from the Global Fire Assimilation System (GFAS) data set were used in the simulations by both C-IFS and MOZART. C-IFS (CB05) showed an improved performance with respect to MOZART for CO, upper tropospheric O3, and wintertime SO2, and was of a similar accuracy for other evaluated species. C-IFS (CB05) is about 10 times more computationally efficient than IFS-MOZART.

  18. Monitoring Air Quality over China: Evaluation of the modeling system of the PANDA project

    NASA Astrophysics Data System (ADS)

    Bouarar, Idir; Katinka Petersen, Anna; Brasseur, Guy; Granier, Claire; Xie, Ying; Wang, Xuemei; Fan, Qi; Wang, Lili

    2015-04-01

    Air pollution has become a pressing problem in Asia, and specifically in China, due to the rapid increase in anthropogenic emissions related to the growth of China's economic activity and increasing demand for energy in the past decade. Observed levels of particulate matter and ozone regularly exceed World Health Organization (WHO) air quality guidelines in many parts of the country, leading to increased risk of respiratory illnesses and other health problems. The EU-funded project PANDA aims to establish a team of European and Chinese scientists to monitor air pollution over China and elaborate air quality indicators in support of European and Chinese policies. PANDA combines state-of-the-art air pollution modeling with space and surface observations of chemical species to improve methods for monitoring air quality. The modeling system of the PANDA project follows a downscaling approach: global models such as MOZART and the MACC system provide initial and boundary conditions to regional WRF-Chem and EMEP simulations over East Asia. WRF-Chem simulations at higher resolution (e.g. 20 km) are then performed over a smaller domain covering East China, and initial and boundary conditions from this run are used to perform simulations at a finer resolution (e.g. 5 km) over specific megacities like Shanghai. Here we present results of model simulations for January and July 2010 performed during the first year of the project. We show an intercomparison of the global (MACC, EMEP) and regional (WRF-Chem) simulations and a comprehensive evaluation with satellite measurements (NO2, CO) and in-situ data (O3, CO, NOx, PM10 and PM2.5) at several surface stations. Using the WRF-Chem model, we demonstrate that model performance is influenced not only by the resolution (e.g. 60 km, 20 km) but also by the emission inventories used (MACCity, HTAPv2), their resolution and diurnal variation, and the choice of initial and boundary conditions (e.g. MOZART, MACC analysis).

  19. The Influence of the North Atlantic Oscillation on Tropospheric Distributions of Ozone and Carbon Monoxide.

    NASA Astrophysics Data System (ADS)

    Knowland, K. E.; Doherty, R. M.; Hodges, K.

    2015-12-01

    The influence of the North Atlantic Oscillation (NAO) on the tropospheric distributions of ozone (O3) and carbon monoxide (CO) has been quantified. The Monitoring Atmospheric Composition and Climate (MACC) Reanalysis, a combined meteorology and composition dataset for the period 2003-2012 (Innes et al., 2013), is used to investigate the composition of the troposphere and lower stratosphere in relation to the location of the storm track as well as other meteorological parameters over the North Atlantic associated with the different NAO phases. Cyclone tracks in the MACC Reanalysis compare well to the cyclone tracks in the widely used ERA-Interim Reanalysis for the same 10-year period (cyclone tracking performed using the tracking algorithm of Hodges (1995, 1999)), as both are based on the European Centre for Medium-Range Weather Forecasts' (ECMWF) Integrated Forecast System (IFS). A seasonal analysis is performed whereby the MACC reanalysis meteorological fields and O3 and CO mixing ratios are weighted by the monthly NAO index values. The location of the main storm track, which shifts towards high latitudes (toward the Arctic) during positive NAO phases and to a more zonal location in the mid-latitudes (toward Europe) during negative NAO phases, impacts the location of both horizontal and vertical transport across the North Atlantic and into the Arctic. During positive NAO seasons, the persistence of cyclones over the North Atlantic coupled with a stronger Azores High promotes strong horizontal transport across the North Atlantic throughout the troposphere. In all seasons, significantly more intense cyclones occur at higher latitudes (north of ~50° N) during the positive phase of the NAO and in the southern mid-latitudes during the negative NAO phase. This impacts the location of stratospheric intrusions within the descending dry airstream behind the associated cold front of the extratropical cyclone and the venting of low-level pollution up into the free troposphere within the warm conveyor belt airstream, which rises ahead of the cold front.
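
    The NAO-weighted compositing mentioned above can be sketched as a weighted average of monthly fields, with positive-index and negative-index months composited separately. The example below does this on synthetic data; the index values and the gridded field are made up.

```python
# NAO-index-weighted composites of a synthetic monthly field.
import numpy as np

rng = np.random.default_rng(0)
nao = rng.normal(size=120)               # hypothetical monthly NAO index
field = rng.normal(size=(120, 10, 10))   # e.g. monthly O3 anomaly maps

pos, neg = nao.clip(min=0), (-nao).clip(min=0)
composite_pos = np.tensordot(pos, field, axes=1) / pos.sum()
composite_neg = np.tensordot(neg, field, axes=1) / neg.sum()
print("NAO+ minus NAO- composite, domain mean:",
      np.round((composite_pos - composite_neg).mean(), 3))
```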

  20. Factors affecting cardiovascular and cerebrovascular complications of carotid artery stenting in Northern Michigan: A retrospective study.

    PubMed

    Mammo, Dalia F; Cheng, Chin-I; Ragina, Neli P; Alani, Firas

    This study seeks to identify factors associated with periprocedural complications of carotid artery stenting (CAS) to best understand CAS complication rates and optimize patient outcomes. Periprocedural complications include major adverse cardiovascular and cerebrovascular events (MACCE) that include myocardial infarction (MI), stroke, or death. We retrospectively analyzed 181 patients from Northern Michigan who underwent CAS. Rates of stroke, MI, and death occurring within 30 days post-procedure were examined. Associations of open vs. closed cell stent type, demographics, comorbidities, and symptomatic carotid stenosis were compared to determine significance. All patients had three NIH Stroke Scale (NIHSS) exams: at baseline, 24 h post-procedure, and at the one-month visit. Cardiac enzymes were measured twice in all patients, within 24 h post-procedure. All patients were treated with dual anti-platelet therapy for at least 6 months post-procedure. Three patients (1.66%) experienced a major complication within one month post-procedure. These complications included one MI (0.55%), one stroke (0.55%), and one death (0.55%). The following variable factors were not associated with the occurrence of MACCE complications within 30 days post-procedure: stent design (open vs. closed cell) (p=1.000), age ≥80 (p=0.559), smoking history (p=0.569), hypertension (p=1.000), diabetes (p=1.000), and symptomatic carotid stenosis (p=0.254). Age of 80 years or above, symptomatic carotid stenosis, open-cell stent design, and history of diabetes, smoking, or hypertension were not found to have an association with MACCE within 1 month after CAS. Future studies using a greater sample size will be beneficial to better assess periprocedural complication risks of CAS, while also considering the effect of operator experience and technological advancements on decreasing periprocedural complication rates. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Effect of Concentration on the Electrochemistry and Speciation of the Magnesium Aluminum Chloride Complex Electrolyte Solution.

    PubMed

    See, Kimberly A; Liu, Yao-Min; Ha, Yeyoung; Barile, Christopher J; Gewirth, Andrew A

    2017-10-18

    Magnesium batteries offer an opportunity to use naturally abundant Mg and achieve large volumetric capacities reaching over four times that of conventional Li-based intercalation anodes. High volumetric capacity is enabled by the use of a Mg metal anode in which charge is stored via electrodeposition and stripping processes; however, electrolytes that support efficient Mg electrodeposition and stripping are few and are often prepared from highly reactive compounds. One interesting electrolyte solution that supports Mg deposition and stripping without the use of highly reactive reagents is the magnesium aluminum chloride complex (MACC) electrolyte. The MACC exhibits high Coulombic efficiencies and low deposition overpotentials following an electrolytic conditioning protocol that stabilizes species necessary for such behavior. Here, we discuss the effect of the MgCl2 and AlCl3 concentrations on the deposition overpotential, the current density, and the conditioning process. Higher concentrations of MACC exhibit enhanced Mg electrodeposition current density and much faster conditioning. An increase in the salt concentrations causes a shift in the complex equilibria involving both cations. The conditioning process is strongly dependent on the concentration, suggesting that the electrolyte is activated through a change in the speciation of electrolyte complexes and is not simply due to the annihilation of electrolyte impurities. Additionally, the presence of [Mg2(μ-Cl)3·6THF]+ in the electrolyte solution is again confirmed through careful analysis of experimental Raman spectra coupled with simulation and direct observation of the complex in sonic spray ionization mass spectrometry. Importantly, we suggest that the ~210 cm^-1 mode commonly observed in the Raman spectra of many Mg electrolytes is indicative of the C3v-symmetric [Mg2(μ-Cl)3·6THF]+. The 210 cm^-1 mode is present in many electrolytes containing MgCl2, so its assignment is of broad interest to the Mg electrolyte community.

  2. Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale

    NASA Astrophysics Data System (ADS)

    Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob

    2010-05-01

    The most serious air pollution events occur in cities where there is a combination of high population density and air pollution, e.g. from vehicles. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000, WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cell is typically of the order of 5 to 10 km and they generally lack a detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one will illustrate capabilities for the city of Copenhagen (Denmark); the second will focus on the city of Bucharest (Romania). This work is devoted to the first suite, covering methodological aspects of downscaling from the regional (European/Danish) to the urban scale (Copenhagen), and from the urban down to the street scale. The first results of downscaling according to the proposed methodology are presented. The potential for downscaling of European air quality forecasts by operating urban and street-level forecast models is evaluated. This will provide strong support for the continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on how to downscale European air-quality forecasts to the city and street levels with different approaches will be formulated.

  3. The safety, efficacy and cost-effectiveness of stress echocardiography in patients with high pretest probability of coronary artery disease.

    PubMed

    Papachristidis, Alexandros; Demarco, Daniela Cassar; Roper, Damian; Tsironis, Ioannis; Papitsas, Michael; Byrne, Jonathan; Alfakih, Khaled; Monaghan, Mark J

    2017-01-01

    In this study, we assess the clinical and cost-effectiveness of stress echocardiography (SE), as well as the place of SE in patients with a high pretest probability (PTP) of coronary artery disease (CAD). We investigated 257 patients with no history of CAD who underwent SE and had a PTP risk score >61% (high PTP). According to the National Institute for Health and Care Excellence guidance (NICE CG95, 2010), these patients should be investigated directly with an invasive coronary angiogram (ICA). We investigated these patients with SE initially and then with ICA when appropriate. Follow-up data with regard to Major Adverse Cardiac and Cerebrovascular Events (MACCE, defined as cardiovascular mortality, cerebrovascular accident (CVA), myocardial infarction (MI) and late revascularisation for acute coronary syndrome/unstable angina) were recorded for a period of 12 months following the SE. The tariff for SE and ICA is £300 and £1400, respectively. 106 patients had a positive SE (41.2%) and 61 of them (57.5%) had further investigation with ICA. 15 (24.6%) of these patients were revascularised. The average cost per patient for investigations was £654.09. If NICE guidance had been followed, the cost would have been significantly higher at £1400 (p<0.001). Overall, 5 MACCE (2.0%) were recorded; 4 (3.8%) in the group with a positive SE (2 CVAs and 2 MIs) and 1 (0.7%) in the group with a negative SE (1 CVA). There was no MI and no need for revascularisation in the negative SE group. Our approach of investigating patients who present with de novo chest pain and high PTP with SE initially, and subsequently with ICA when appropriate, reduces the cost significantly (by £745.91 per patient) with a very low rate of MACCE. However, this study is underpowered to assess the safety of SE.

  4. Phytohormone Interaction Modulating Fruit Responses to Photooxidative and Heat Stress on Apple (Malus domestica Borkh.).

    PubMed

    Torres, Carolina A; Sepúlveda, Gloria; Kahlaoui, Besma

    2017-01-01

    Sun-related physiological disorders such as sun damage on apples ( Malus domestica Borkh) are caused by cumulative photooxidative and heat stress during the growing season, triggering morphological, physiological, and biochemical changes in fruit tissues not only while the fruit is on the tree but also after it has been harvested. The objective of this work was to establish the interaction of auxin (indole-3-acetic acid; IAA), abscisic acid (ABA), jasmonic acid (JA), salicylic acid (SA), and ethylene (ET) and its precursor ACC (free and conjugated, MACC) during the development of sun-injury-related disorders pre- and post-harvest on apples. Peel tissue was extracted from fruit growing under different sun exposures (Non-exposed, NE; Exposed, EX) and with sun injury symptoms (Moderate, Mod). Sampling was carried out every 15 days from 75 days after full bloom (DAFB) until 120 days post-harvest in cold storage (1°C, >90% RH). Concentrations of IAA, ABA, JA, and SA were determined using UHPLC mass spectrometry, and ET and ACC (free and conjugated, MACC) using gas chromatography. IAA was not directly related to sun injury development, but it decreased by 60% in sun-exposed tissue and during fruit development. ABA, JA, SA, and ethylene concentrations were significantly higher ( P ≤ 0.05) in Mod tissue, but their concentrations, except for ethylene, were not affected by sun exposure. ACC and MACC concentrations increased until 105 DAFB in all sun exposure categories. During post-harvest, the ethylene climacteric peak was delayed in EX compared to Mod tissue. ABA and SA concentrations remained stable throughout storage in both tissues. JA increased dramatically post-harvest in both EX and Mod tissue and orchards, confirming its role in low-temperature tolerance. The results suggest that ABA, JA, and SA, together with ethylene, modulate some of the abiotic stress defense responses of sun-exposed fruit during photooxidative and heat stress on the tree.

  5. Phytohormone Interaction Modulating Fruit Responses to Photooxidative and Heat Stress on Apple (Malus domestica Borkh.)

    PubMed Central

    Torres, Carolina A.; Sepúlveda, Gloria; Kahlaoui, Besma

    2017-01-01

    Sun-related physiological disorders such as sun damage on apples (Malus domestica Borkh) are caused by cumulative photooxidative and heat stress during their growing season, triggering morphological, physiological, and biochemical changes in fruit tissues not only while the fruit is on the tree but also after it has been harvested. The objective of the work was to establish the interaction of auxin (indole-3-acetic acid; IAA), abscisic acid (ABA), jasmonic acid (JA), salicylic acid (SA), and ethylene (ET) and its precursor ACC (free and conjugated, MACC) during development of sun-injury-related disorders pre- and post-harvest on apples. Peel tissue was extracted from fruit growing under different sun exposures (Non-exposed, NE; Exposed, EX) and with sun injury symptoms (Moderate, Mod). Sampling was carried out every 15 days from 75 days after full bloom (DAFB) until 120 days post-harvest in cold storage (1°C, > 90%RH). Concentrations of IAA, ABA, JA, and SA were determined using UHPLC mass spectrometry, and ET and ACC (free and conjugated MACC) using gas chromatography. IAA was found not to be related directly to sun injury development, but it decreased 60% in sun-exposed tissue and during fruit development. ABA, JA, SA, and ethylene concentrations were significantly higher (P ≤ 0.05) in Mod tissue, but their concentrations, except for ethylene, were not affected by sun exposure. ACC and MACC concentrations increased until 105 DAFB in all sun exposure categories. During post-harvest, the ethylene climacteric peak was delayed in EX compared to Mod. ABA and SA concentrations remained stable throughout storage in both tissues. JA increased dramatically post-harvest in both EX and Mod tissues and orchards, confirming its role in low-temperature tolerance. The results suggest that ABA, JA, and SA, together with ethylene, modulate some of the abiotic stress defense responses on sun-exposed fruit during photooxidative and heat stress on the tree. PMID:29491868

  6. TriGuard™ HDH embolic deflection device for cerebral protection during transcatheter aortic valve replacement.

    PubMed

    Samim, Mariam; van der Worp, Bart; Agostoni, Pierfrancesco; Hendrikse, Jeroen; Budde, Ricardo P J; Nijhoff, Freek; Ramjankhan, Faiz; Doevendans, Pieter A; Stella, Pieter R

    2017-02-15

    This study aims to evaluate the safety and performance of the new embolic deflection device TriGuard™HDH in patients undergoing TAVR. Transcatheter aortic valve replacement (TAVR) is associated with a high incidence of new cerebral ischemic lesions. The use of an embolic protection device may reduce the frequency of TAVR-related embolic events. This prospective, single-arm feasibility pilot study included 14 patients with severe symptomatic aortic stenosis scheduled for TAVR. Cerebral diffusion-weighted magnetic resonance imaging (DWI) was planned in all patients one day before and at day 4 (±2) after the procedure. Major adverse cerebral and cardiac events (MACCEs) were recorded for all patients. Primary endpoints of this study were I) device performance success defined as coverage of the aortic arch takeoffs throughout the entire TAVR procedure and II) MACCE occurrence. Secondary endpoints included the number and the volume of new cerebral ischemic lesions on DWI. Thirteen patients underwent transfemoral TAVR and one patient a transapical procedure. An Edwards SAPIEN valve prosthesis was implanted in 8 (57%) patients and a Medtronic CoreValve prosthesis in the remaining 6 (43%). Predefined performance success of the TriGuard™HDH device was achieved in 9 (64%) patients. The composite endpoint MACCE occurred in none of the patients. Post-procedural DWI was performed in 11 patients. Comparing the DWI of these patients to a historical control group showed no reduction in lesion number [median 5.5 vs. 5.0, P = 0.857]; however, there was a significant reduction in mean lesion volume per patient [median 13.8 vs. 25.1, P = 0.049]. This study showed the feasibility and safety of using the TriGuard™HDH for cerebral protection during TAVR. This device did not decrease the number of post-procedural new cerebral DWI lesions; however, its use showed decreased lesion volume as compared to unprotected TAVR. © 2016 Wiley Periodicals, Inc.

  7. Systematic review of preoperative physical activity and its impact on postcardiac surgical outcomes.

    PubMed

    Kehler, D Scott; Stammers, Andrew N; Tangri, Navdeep; Hiebert, Brett; Fransoo, Randy; Schultz, Annette S H; Macdonald, Kerry; Giacomontonio, Nicholas; Hassan, Ansar; Légaré, Jean-Francois; Arora, Rakesh C; Duhamel, Todd A

    2017-08-11

    The objective of this systematic review was to study the impact of preoperative physical activity levels on adult cardiac surgical patients' postoperative outcomes: (1) major adverse cardiac and cerebrovascular events (MACCEs), (2) adverse events within 30 days, (3) hospital length of stay (HLOS), (4) intensive care unit length of stay (ICU LOS), (5) activities of daily living (ADLs), (6) quality of life, (7) cardiac rehabilitation attendance and (8) physical activity behaviour. A systematic search of MEDLINE, Embase, AgeLine and Cochrane library for cohort studies was conducted. Eleven studies (n=5733 patients) met the inclusion criteria. Only self-reported physical activity tools were used. Few studies used multivariate analyses to compare active versus inactive patients prior to surgery. When comparing patients who were active versus inactive preoperatively, there were mixed findings for MACCE, 30-day adverse events, HLOS and ICU LOS. Of the studies that adjusted for confounding variables, five studies found a protective, independent association between physical activity and MACCE (n=1), 30-day postoperative events (n=2), HLOS (n=1) and ICU LOS (n=1), but two studies found no protective association for 30-day postoperative events (n=1) and postoperative ADLs (n=1). No studies investigated if activity status before surgery impacted quality of life or cardiac rehabilitation attendance postoperatively. Three studies found that active patients prior to surgery were more likely to be inactive postoperatively. Due to the mixed findings, the literature does not presently support that self-reported preoperative physical activity behaviour is associated with postoperative cardiac surgical outcomes. Future studies should objectively measure physical activity, clearly define outcomes and adjust for clinically relevant variables. Trial registration number NCT02219815. PROSPERO number CRD42015023606. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. An Analysis of CPA Firm Recruiters' Perceptions of Online Masters of Accounting Degrees

    ERIC Educational Resources Information Center

    Metrejean, Eddie; Noland, Thomas G.

    2011-01-01

    Online education continues to grow at a rapid pace. Assessment of the effectiveness of online programs is needed to differentiate legitimate programs from diploma mills. The authors examined the perceptions of CPA firm recruiters on whether an online Master of Accounting (MACC) matters in the hiring decision. Results show that recruiters do not…

  9. Joint Interdiction

    DTIC Science & Technology

    2016-09-09

    law enforcement detachment (USCG) LEO law enforcement operations LOC line of communications MACCS Marine air command and control system MAS...enemy command and control [C2], intelligence, fires, reinforcing units, lines of communications [LOCs], logistics, and other operational- and tactical...enemy naval, engineering, and personnel resources to the tasks of repairing and recovering damaged equipment, facilities, and LOCs. It can draw the

  10. PsyScript: a Macintosh application for scripting experiments.

    PubMed

    Bates, Timothy C; D'Oliveiro, Lawrence

    2003-11-01

    PsyScript is a scriptable application allowing users to describe experiments in Apple's compiled high-level object-oriented AppleScript language, while still supporting millisecond or better within-trial event timing (delays can be in milliseconds or refresh-based, and PsyScript can wait on external I/O, such as eye movement fixations). Because AppleScript is object oriented and system-wide, PsyScript experiments support complex branching, code reuse, and integration with other applications. Included AppleScript-based libraries support file handling and stimulus randomization and sampling, as well as more specialized tasks, such as adaptive testing. Advanced features include support for the BBox serial port button box, as well as a low-cost USB-based digital I/O card for millisecond timing, recording of any number and types of responses within a trial, novel responses, such as graphics tablet drawing, and use of the Macintosh sound facilities to provide an accurate voice key, saving voice responses to disk, scriptable image creation, support for flicker-free animation, and gaze-dependent masking. The application is open source, allowing researchers to enhance the feature set and verify internal functions. Both the application and the source are available for free download at www.maccs.mq.edu.au/-tim/psyscript/.

  11. Report from Hawai'i: The Rising Tide of Arts Education in the Islands

    ERIC Educational Resources Information Center

    Wood, Paul

    2005-01-01

    The establishment of Maui Arts & Cultural Center (MACC), a community arts facility that prioritizes education at the top of its mission, has been a significant factor in the growth of arts education in Hawai'i. This article describes the role such a facility can play in the kind of educational reform that people envision, and the author's…

  12. Comparative Evaluation of the Impact of WRF-NMM and WRF-ARW Meteorology on CMAQ Simulations for O3 and Related Species During the 2006 TexAQS/GoMACCS Campaign

    EPA Science Inventory

    In this paper, the impact of meteorology derived from the Weather Research and Forecasting (WRF) Non-hydrostatic Mesoscale Model (NMM) and the WRF Advanced Research WRF (ARW) meteorological models on the Community Multiscale Air Quality (CMAQ) simulations for ozone and its related prec...

  13. Corruption in Myanmar - Holding a Country and its People from Economic Prosperity

    DTIC Science & Technology

    2014-10-30

    censorship laws and freedom of information by banning independent newspapers thereby repressing efforts towards democracy even further. The SPP... censorship laws, insisting state officials return embezzled funds, signing and ratifying the United Nations Convention against Corruption (UNCAC), and...instill a culture of change. For example, in Malaysia, the government formed the Malaysian Anti-Corruption Commission (MACC), an independent watch

  14. Effects of continuous positive airway pressure on anxiety, depression, and major cardiac and cerebro-vascular events in obstructive sleep apnea patients with and without coronary artery disease.

    PubMed

    Lee, Ming-Chung; Shen, Yu-Chih; Wang, Ji-Hung; Li, Yu-Ying; Li, Tzu-Hsien; Chang, En-Ting; Wang, Hsiu-Mei

    2017-01-01

    Obstructive sleep apnea (OSA) is associated with bad cardiovascular outcomes and a high prevalence of anxiety and depression. This study investigated the effects of continuous positive airway pressure (CPAP) on the severity of anxiety and depression in OSA patients with or without coronary artery disease (CAD) and on the rate of cardio- and cerebro-vascular events in those with OSA and CAD. This prospective study included patients with moderate-to-severe OSA, with or without a recent diagnosis of CAD; all were started on CPAP therapy. Patients completed the Chinese versions of the Beck Anxiety Inventory (BAI) and Beck Depression Inventory-II (BDI-II) at baseline and after 6-month follow-up. The occurrence of major adverse cardiac and cerebrovascular events (MACCE) was assessed every 3 months up to 1 year. BAI scores decreased from 8.5 ± 8.4 at baseline to 5.4 ± 6.9 at 6 months in CPAP-compliant OSA patients without CAD ( P < 0.05). BAI scores also decreased from 20.7 ± 14.9 to 16.1 ± 14.5 in CPAP-compliant OSA patients with CAD. BDI-II scores decreased in CPAP-compliant OSA patients without CAD (from 11.1 ± 10.7 at baseline to 6.6 ± 9.5 at 6 months) and in CPAP-compliant OSA patients with CAD (from 20.4 ± 14.3 to 15.9 ± 7.3). In addition, there was a large effect size (ES) of BAI and BDI in 6-month CPAP treatment of OSA patients with CAD and a large ES in those with OSA under CPAP treatment. In OSA patients with CAD, the occurrence of MACCE was significantly lower in CPAP-compliant patients than that in CPAP noncompliant patients (11% in CPAP compliant and 50% in noncompliant; P < 0.05). CPAP improved anxiety and depression in OSA patients regardless of CAD. In OSA patients with CAD, CPAP-compliant patients had a lower 1-year rate of MACCE than CPAP-noncompliant patients.

  15. Percutaneous coronary intervention vs coronary artery bypass grafting for left main coronary artery disease? A systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Sharma, Sharan P; Dahal, Khagendra; Khatra, Jaspreet; Rosenfeld, Alan; Lee, Juyong

    2017-06-01

    It is not clear whether percutaneous coronary intervention (PCI) is as effective and safe as coronary artery bypass grafting (CABG) for left main coronary artery disease. We aimed to perform a systematic review and meta-analysis of all randomized controlled trials (RCTs) that compared PCI and CABG in left main coronary disease. We searched PubMed, EMBASE, Cochrane, Scopus and relevant references for RCTs (inception through November 20, 2016, without language restrictions) and performed meta-analysis using a random-effects model. All-cause mortality, myocardial infarction, revascularization rate, stroke, and major adverse cardiac and cerebrovascular events (MACCE) were the measured outcomes. Six RCTs with a total population of 4700 were analyzed. There was no difference in all-cause mortality at 30-day, one-year, and five-year (1.8% vs 1.1%; OR 0.60; 95% CI: 0.26-1.39; P=.23; I²=9%) follow-up between PCI and CABG. The CABG group had a lower rate of myocardial infarction (MI) at five-year follow-up than PCI (5% vs 2.5%; OR 2.04; CI: 1.30-3.19; P=.002; I²=1%). Revascularization rate favored CABG at one-year (8.6% vs 4.5%; OR 2; CI: 1.46-2.73; P<.0001; I²=45%) and five-year (15.9% vs 9.9%; OR 1.73; CI: 1.36-2.20; P<.0001; I²=0%) follow-up. Although the stroke rate was lower in the PCI group at 1 year, there was no difference in longer follow-up. MACCE at 5 years favored CABG (24% vs 18%; OR 1.45; CI: 1.19-1.76; P=.0001; I²=0%). On subgroup analysis, MACCE were not different between the two groups in the low-to-intermediate SYNTAX group, while they were higher for the PCI group in the high SYNTAX group. Percutaneous coronary intervention could be as safe and effective as CABG in a select group of left main coronary artery disease patients. © 2017 John Wiley & Sons Ltd.
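
    For readers unfamiliar with random-effects pooling, the sketch below is a generic DerSimonian-Laird combination of study-level odds ratios in Python; the 2x2 counts are invented placeholders, not data from the trials summarised above.

      import numpy as np

      def pooled_or_dersimonian_laird(events_a, total_a, events_b, total_b):
          """Pool per-study odds ratios with DerSimonian-Laird random-effects weights."""
          a = np.asarray(events_a, float)
          b = np.asarray(total_a, float) - a
          c = np.asarray(events_b, float)
          d = np.asarray(total_b, float) - c
          log_or = np.log((a * d) / (b * c))               # per-study log odds ratio
          var = 1 / a + 1 / b + 1 / c + 1 / d              # approximate variance of each log OR
          w = 1 / var                                       # fixed-effect weights
          mu_fe = np.sum(w * log_or) / np.sum(w)
          q = np.sum(w * (log_or - mu_fe) ** 2)             # Cochran's Q statistic
          tau2 = max(0.0, (q - (len(a) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
          w_re = 1 / (var + tau2)                           # random-effects weights
          mu = np.sum(w_re * log_or) / np.sum(w_re)
          se = np.sqrt(1 / np.sum(w_re))
          return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

      # Invented counts for three studies (group A events/total, group B events/total):
      print(pooled_or_dersimonian_laird([30, 25, 40], [600, 500, 800],
                                        [20, 15, 28], [600, 500, 800]))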

  16. Comparative Evaluation of the Impact of WRF/NMM and WRF/ARW Meteorology on CMAQ Simulations for PM2.5 and its Related Precursors during the 2006 TexAQS/GoMACCS Study

    EPA Science Inventory

    This study presents a comparative evaluation of the impact of WRF-NMM and WRF-ARW meteorology on CMAQ simulations of PM2.5, its composition and related precursors over the eastern United States with the intensive observations obtained by aircraft (NOAA WP-3), ship and ...

  17. Unveiling the High Energy Obscured Universe: Hunting Collapsed Objects Physics

    NASA Technical Reports Server (NTRS)

    Ubertini, P.; Bazzano, A.; Cocchi, M.; Natalucci, L.; Bassani, L.; Caroli, E.; Stephen, J. B.; Caraveo, P.; Mereghetti, S.; Villa, G.

    2005-01-01

    A large part of the energy from space comes from collapsing stars (SN, Hypernovae) and collapsed stars (black holes, neutron stars and white dwarfs). The peak of their energy release is in the hard X-ray and gamma-ray wavelengths, where photons are insensitive to absorption and can travel from the edge of the Universe or the central core of the Galaxy without losing the primordial information of energy, time signature and polarization. The most efficient process to produce energetic photons is gravitational accretion of matter from a "normal" star onto a collapsed companion (L ≈ G M_coll (dM_acc/dt) / R_disc ~ (dM_acc/dt) c^2), exceeding by far the capability of nuclear reactions to generate high-energy quanta. Thus our natural laboratories for "in situ" investigations are collapsed objects in which matter and radiation co-exist in extreme conditions of temperature and density due to gravitationally bent geometry and magnetic fields. This is a unique opportunity to study the physics of accretion flows in stellar-mass and super-massive Black Holes (SMBHs), plasmoids generated in relativistic jets in galactic microQSOs and AGNs, ionised plasma interacting at the touching point of a weakly magnetized NS surface, the GRB/Supernovae connection, and the mysterious origins of "dark" GRBs and X-ray flashes.

  18. Culotte stenting for coronary bifurcation lesions with 2nd and 3rd generation everolimus-eluting stents: the CELTIC Bifurcation Study.

    PubMed

    Walsh, Simon J; Hanratty, Colm G; Watkins, Stuart; Oldroyd, Keith G; Mulvihill, Niall T; Hensey, Mark; Chase, Alex; Smith, Dave; Cruden, Nick; Spratt, James C; Mylotte, Darren; Johnson, Tom; Hill, Jonathan; Hussein, Hafiz M; Bogaerts, Kris; Morice, Marie-Claude; Foley, David P

    2018-05-24

    The aim of this study was to provide contemporary outcome data for patients with de novo coronary disease and Medina 1,1,1 lesions who were treated with a culotte two-stent technique, and to compare the performance of two modern-generation drug-eluting stent (DES) platforms, the 3-connector XIENCE and the 2-connector SYNERGY. Patients with Medina 1,1,1 bifurcation lesions who had disease that was amenable to culotte stenting were randomised 1:1 to treatment with XIENCE or SYNERGY DES. A total of 170 patients were included. Technical success and final kissing balloon inflation occurred in >96% of cases. Major adverse cardiovascular or cerebrovascular events (MACCE: a composite of death, myocardial infarction [MI], cerebrovascular accident [CVA] and target vessel revascularisation [TVR]) occurred in 5.9% of patients by nine months. The primary endpoint was a composite of death, MI, CVA, target vessel failure (TVF), stent thrombosis and binary angiographic restenosis. At nine months, the primary endpoint occurred in 19% of XIENCE patients and 16% of SYNERGY patients (p=0.003 for non-inferiority for platform performance). MACCE rates for culotte stenting using contemporary everolimus-eluting DES are low at nine months. The XIENCE and SYNERGY stents demonstrated comparable performance for the primary endpoint.

  19. Simultaneous Traveling Convection Vortex (TCV) Events and Pc 1-2 Wave Bursts at Cusp/Cleft Latitudes observed in Arctic Canada and Svalbard

    NASA Astrophysics Data System (ADS)

    Posch, J. L.; Witte, A. J.; Engebretson, M. J.; Murr, D.; Lessard, M.; Raita, T.; Singer, H. J.

    2010-12-01

    Traveling convection vortices (TCVs), which appear in ground magnetometer records at near-cusp latitudes as solitary ~5 mHz pulses, are now known to originate in instabilities in the ion foreshock just upstream of Earth’s bow shock. They can also stimulate compressions or relaxations of the dayside magnetosphere (evident in geosynchronous satellite data). These transient compressions can in turn sharply increase the growth rate of electromagnetic ion cyclotron (EMIC) waves, which also appear in ground records at near-cusp latitudes as bursts of Pc 1-2 pulsations. In this study we have identified simultaneous TCV - Pc 1-2 burst events occurring from 2008 through the first 7 months of 2010 in Eastern Arctic Canada and Svalbard, using a combination of fluxgate magnetometers (MACCS and IMAGE) and search coil magnetometers in each region. Magnetometer observations at GOES 10 and 12, at longitudes near the MACCS sites, are also used to characterize the strength of the magnetic perturbations. There is no direct proportion between the amplitude of TCV and Pc 1-2 wave events in either region, consistent with the highly variable densities and pitch angle distributions of plasma of ring current / plasma sheet energies in the outer dayside magnetosphere.

  20. Evaluation of a new microphysical aerosol module in the ECMWF Integrated Forecasting System

    NASA Astrophysics Data System (ADS)

    Woodhouse, Matthew; Mann, Graham; Carslaw, Ken; Morcrette, Jean-Jacques; Schulz, Michael; Kinne, Stefan; Boucher, Olivier

    2013-04-01

    The Monitoring Atmospheric Composition and Climate II (MACC-II) project will provide a system for monitoring and predicting atmospheric composition. As part of the first phase of MACC, the GLOMAP-mode microphysical aerosol scheme (Mann et al., 2010, GMD) was incorporated within the ECMWF Integrated Forecasting System (IFS). The two-moment modal GLOMAP-mode scheme includes new particle formation, condensation, coagulation, cloud-processing, and wet and dry deposition. GLOMAP-mode is already incorporated as a module within the TOMCAT chemistry transport model and within the UK Met Office HadGEM3 general circulation model. The microphysical, process-based GLOMAP-mode scheme allows an improved representation of aerosol size and composition and can simulate aerosol evolution in the troposphere and stratosphere. The new aerosol forecasting and re-analysis system (known as IFS-GLOMAP) will also provide improved boundary conditions for regional air quality forecasts, and will benefit from assimilation of observed aerosol optical depths in near real time. Presented here is an evaluation of the performance of the IFS-GLOMAP system in comparison to in situ aerosol mass and number measurements, and remotely-sensed aerosol optical depth measurements. Future development will provide a fully-coupled chemistry-aerosol scheme, and the capability to resolve nitrate aerosol.

  1. Buried Underwater Munitions and Clutter Discrimination

    DTIC Science & Technology

    2010-10-01

    closest point of approach of the cylinder. The k space amplitude beam pattern, sin(Δ)/Δ, in Stanton’s treatment is obtained from the Fourier ...simple modifications to be useful here. First, the amplitude of the incident plane wave P0 should be replaced by P1r0/r, where P1 is the magnitude of...Instrument Source Information Site Selection MACC Phase I Input Location Resolution Age Bathymetry SEA Ltd. SWATHPlus McNinch

  2. Top-down NOX Emissions of European Cities Derived from Modelled and Spaceborne Tropospheric NO2 Columns

    NASA Astrophysics Data System (ADS)

    Verstraeten, W. W.; Boersma, K. F.; Douros, J.; Williams, J. E.; Eskes, H.; Delcloo, A. W.

    2017-12-01

    High concentrations of nitrogen oxides (NOX = NO + NO2) near the surface adversely affect humans and ecosystems and play a key role in tropospheric chemistry. NO2 is an important precursor of tropospheric ozone (O3), which in turn affects the production of the hydroxyl radical controlling the chemical lifetime of key atmospheric pollutants and reactive greenhouse gases. Combustion from industrial, traffic and household activities in large and densely populated urban areas results in high NOX emissions. Accurate mapping of these emissions is essential but difficult, since reported emission factors may differ from real-time emissions by an order of magnitude. Modelled NO2 levels and lifetimes also have large associated uncertainties, and overestimation of the chemical lifetime may mask missing NOX chemistry in current chemistry transport models (CTMs). Simultaneously estimating both the NO2 lifetime and the concentrations by applying the Exponentially Modified Gaussian (EMG) method to tropospheric NO2 column line densities should improve surface NOX emission estimates. Here we evaluate whether the EMG methodology applied to the tropospheric NO2 columns simulated by the LOTOS-EUROS (Long Term Ozone Simulation-European Ozone Simulation) CTM can reproduce the NOX emissions used as model input. First we process both the modelled tropospheric NO2 columns for the period April-September 2013 for 21 selected European urban areas under windy conditions (wind speeds averaged between the surface and 500 m from ECMWF > 2 m s-1) and the accompanying OMI (Ozone Monitoring Instrument) data, providing us with real-time observation-based estimates of midday NO2 columns. Then we compare the top-down derived surface NOX emissions with the 2011 MACC-III emission inventory, used in the CTM as input to simulate the NO2 columns. For cities where NOX emissions can be assumed to originate from one large source, good agreement is found between the top-down NOX emissions derived from the CTM and OMI and the MACC-III inventory. For cities where multiple sources of NOX are observed (e.g. Brussels, London), an adapted methodology is required. For some cities, such as St-Petersburg and Moscow, the top-down NOX estimates from 2013 OMI data are biased low compared to the MACC-III inventory, which uses a 2011 NOX emissions update.
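
    As an illustration of the EMG approach described above, the sketch below fits an exponentially modified Gaussian to synthetic along-wind line densities and converts the fitted e-folding distance into an effective lifetime and emission rate. The functional form follows published EMG analyses in general, and all numbers (wind speed, noise level, parameter values) are hypothetical placeholders, not values from this study.

      import numpy as np
      from scipy.special import erfc
      from scipy.optimize import curve_fit

      def emg_line_density(x, a, x0, mu, sigma, b):
          """Exponentially modified Gaussian for along-wind NO2 line densities.
          a: total burden attributed to the source; x0: downwind e-folding distance (km);
          mu: source location (km); sigma: Gaussian smoothing width (km); b: background."""
          exp_term = np.exp(mu / x0 + sigma**2 / (2 * x0**2) - x / x0)
          return a / (2 * x0) * exp_term * erfc((mu + sigma**2 / x0 - x) / (np.sqrt(2) * sigma)) + b

      # Synthetic along-wind distance axis (km) and noisy line densities (arbitrary units):
      x = np.linspace(-100, 300, 81)
      y_true = emg_line_density(x, a=50.0, x0=60.0, mu=0.0, sigma=25.0, b=2.0)
      y_obs = y_true + np.random.default_rng(0).normal(0, 0.3, x.size)

      popt, _ = curve_fit(emg_line_density, x, y_obs, p0=[40, 50, 0, 20, 1])
      a_fit, x0_fit = popt[0], popt[1]
      wind_speed = 5.0 * 3.6                 # assumed mean wind, km/h (5 m/s)
      lifetime_h = x0_fit / wind_speed       # effective NO2 lifetime (hours)
      emission_rate = a_fit / lifetime_h     # implied source strength (burden units per hour)
      print(f"tau = {lifetime_h:.1f} h, E = {emission_rate:.2f} (burden units per hour)")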

  3. GIRAFE, a campaign forecast tool for anthropogenic and biomass burning plumes

    NASA Astrophysics Data System (ADS)

    Fontaine, Alain; Mari, Céline; Drouin, Marc-Antoine; Lussac, Laure

    2015-04-01

    GIRAFE (reGIonal ReAl time Fire plumEs, http://girafe.pole-ether.fr, alain.fontaine@obs-mip.fr) is a forecast tool supported by the French atmospheric chemistry data centre Ether (CNES and CNRS), built on the Lagrangian particle dispersion model FLEXPART coupled with ECMWF meteorological fields and emission inventories. GIRAFE was used during the CHARMEX campaign (Chemistry-Aerosol Mediterranean Experiment http://charmex.lsce.ipsl.fr) in order to provide daily 5-day plume trajectory forecasts over the Mediterranean Sea. For this field experiment, the Lagrangian model was used to mimic carbon monoxide pollution plumes emitted by either anthropogenic or biomass burning sources. Sources from major industrial areas such as Fos-Berre or the Po valley were extracted from the MACC-TNO inventory. Biomass burning sources were estimated based on MODIS fire detection. Comparison with the MACC and CHIMERE APIFLAME models revealed that GIRAFE followed pollution plumes from small, short-duration fires which were not captured by low-resolution models. GIRAFE was used as a decision-making tool to schedule field campaign activities such as airborne operations or balloon launches. Thanks to recent features, GIRAFE is able to read the ECCAD database (http://eccad.pole-ether.fr) inventories. Global inventories such as MACCITY and ECLIPSE will be used to predict CO plume trajectories from major urban and industrial sources over West Africa for the DACCIWA campaign (Dynamic-Aerosol-Chemistry-Cloud interactions in West Africa).

  4. Variations of trace gases over the Bay of Bengal during the summer monsoon

    NASA Astrophysics Data System (ADS)

    Girach, I. A.; Ojha, Narendra; Nair, Prabha R.; Tiwari, Yogesh K.; Kumar, K. Ravi

    2018-02-01

    In situ measurements of near-surface ozone (O3), carbon monoxide (CO), and methane (CH4) were carried out over the Bay of Bengal (BoB) as a part of the Continental Tropical Convergence Zone (CTCZ) campaign during the summer monsoon season of 2009. O3, CO and CH4 mixing ratios varied in the ranges of 8-54 ppbv, 50-200 ppbv and 1.57-2.15 ppmv, respectively during 16 July-17 August 2009. The spatial distribution of mean tropospheric O3 from satellite retrievals is found to be similar to that in surface O3 observations, with higher levels over coastal and northern BoB as compared to central BoB. The comparison of in situ measurements with the Monitoring Atmospheric Composition & Climate (MACC) global reanalysis shows that MACC simulations reproduce the observations with small mean biases of 1.6 ppbv, -2.6 ppbv and 0.07 ppmv for O3, CO and CH4, respectively. The analysis of diurnal variation of O3 based on observations and the simulations from Weather Research and Forecasting coupled with Chemistry (WRF-Chem) at a stationary point over the BoB did not show a net photochemical build up during daytime. Satellite retrievals show limitations in capturing CH4 variations as measured by in situ sample analysis highlighting the need of more shipborne in situ measurements of trace gases over this region during monsoon.

  5. Five-year outcomes of staged percutaneous coronary intervention in the SYNTAX study.

    PubMed

    Watkins, Stuart; Oldroyd, Keith G; Preda, Istvan; Holmes, David R; Colombo, Antonio; Morice, Marie-Claude; Leadley, Katrin; Dawkins, Keith D; Mohr, Friedrich W; Serruys, Patrick W; Feldman, Ted E

    2015-04-01

    The SYNTAX study compared PCI with TAXUS Express stents to CABG for the treatment of de novo 3-vessel and/or left main coronary disease. This study aimed to determine patient characteristics and five-year outcomes after a staged PCI strategy compared to single-session PCI. In the SYNTAX trial, staged procedures were discouraged but were allowed within 72 hours or, if renal insufficiency or contrast-induced nephropathy occurred, within 14 days (mean 9.8±18.1 days post initial procedure). A total of 125 (14%) patients underwent staged PCI. These patients had greater disease severity and/or required a more complex procedure. MACCE was significantly increased in staged patients (48.1% vs. 35.5%, p=0.004), as was the composite of death/stroke/MI (32.2% vs. 19%, p=0.0007). Individually, cardiac death and stroke occurred more frequently in the staged PCI group (p=0.03). Repeat revascularisation was significantly higher in staged patients (32.8% vs 24.8%, p=0.035), as was stent thrombosis (10.9% vs. 4.7%, p=0.005). There is a higher incidence of MACCE in patients undergoing staged compared to single-session PCI for 3-vessel and/or left main disease over the first five years of follow-up. However, these patients had more comorbidities and more diffuse disease.

  6. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  7. Orbital-Dependent Density Functionals for Chemical Catalysis

    DTIC Science & Technology

    2014-10-17

    noncollinear density functional theory to show that the low-spin state of Mn3 in a model of the oxygen-evolving complex of photosystem II avoids...DK, which denotes the cc-pV5Z-DK basis set for 3d metals and hydrogen and the ma-cc-pV5Z-DK basis set for oxygen) and to nonrelativistic all...cc-pV5Z basis set for oxygen). As compared to NCBS-DK results, all ECP calculations perform worse than def2-TZVP all-electron relativistic

  8. The kinetics of aerosol particle formation and removal in NPP severe accidents

    NASA Astrophysics Data System (ADS)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.

    2016-06-01

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  9. The kinetics of aerosol particle formation and removal in NPP severe accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.

    2016-06-08

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal–hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal–hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  10. Unfiltered Talk--A Challenge to Categories.

    ERIC Educational Resources Information Center

    McCormick, Kay

    A study investigated how and why code switching and mixing occurs between English and Afrikaans in a region of South Africa. In District Six, non-standard Afrikaans seems to be a mixed code, and it is unclear whether non-standard English is a mixed code. Consequently, it is unclear when codes are being switched or mixed. The analysis looks at…

  11. Adjusted Levenberg-Marquardt method application to methane retrieval from IASI/METOP spectra

    NASA Astrophysics Data System (ADS)

    Khamatnurova, Marina; Gribanov, Konstantin

    2016-04-01

    The Levenberg-Marquardt method [1] with an iteratively adjusted parameter and simultaneous evaluation of averaging kernels, together with a technique for parameter selection, is developed and applied to the retrieval of methane vertical profiles in the atmosphere from IASI/METOP spectra. Retrieved methane vertical profiles are then used for calculation of the total atmospheric column amount. NCEP/NCAR reanalysis data provided by ESRL (NOAA, Boulder, USA) [2] are taken as the initial guess for the retrieval algorithm. Surface temperature, temperature and humidity vertical profiles are retrieved before the methane vertical profile retrieval for each selected spectrum. The modified software package FIRE-ARMS [3] was used for the numerical experiments. To adjust parameters and validate the method we used ECMWF MACC reanalysis data [4]. Methane columnar values retrieved from cloudless IASI spectra demonstrate good agreement with MACC columnar values. The comparison is performed for IASI spectra measured in May of 2012 over Western Siberia. Application of the method to current IASI/METOP measurements is discussed. 1. Ma C., Jiang L. Some Research on Levenberg-Marquardt Method for the Nonlinear Equations // Applied Mathematics and Computation. 2007. V.184. P. 1032-1040. 2. http://www.esrl.noaa.gov/psd 3. Gribanov K.G., Zakharov V.I., Tashkun S.A., Tyuterev Vl.G. A New Software Tool for Radiative Transfer Calculations and its application to IMG/ADEOS data // JQSRT. 2001. V.68. № 4. P. 435-451. 4. http://www.ecmwf.int/
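
    The abstract does not give the exact parameter-adjustment rule, so the sketch below is only a textbook Levenberg-Marquardt iteration with a simple adaptive damping factor, applied to a toy exponential fit; it is not the authors' retrieval code and does not evaluate averaging kernels.

      import numpy as np

      def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, max_iter=50, tol=1e-10):
          """Minimise 0.5*||residual(x)||^2 with a simple adaptive damping parameter lam."""
          x = np.asarray(x0, float)
          cost = 0.5 * np.sum(residual(x) ** 2)
          for _ in range(max_iter):
              r, J = residual(x), jacobian(x)
              A = J.T @ J
              step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -J.T @ r)
              x_new = x + step
              cost_new = 0.5 * np.sum(residual(x_new) ** 2)
              if cost_new < cost:          # accept step, relax damping (closer to Gauss-Newton)
                  x, cost, lam = x_new, cost_new, lam / 3.0
              else:                        # reject step, increase damping (closer to gradient descent)
                  lam *= 3.0
              if np.linalg.norm(step) < tol:
                  break
          return x

      # Toy usage: fit y = a * exp(b * t) to synthetic data.
      t = np.linspace(0.0, 1.0, 20)
      y = 2.0 * np.exp(-1.5 * t)
      res = lambda p: p[0] * np.exp(p[1] * t) - y
      jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
      print(levenberg_marquardt(res, jac, x0=[1.0, 0.0]))   # approximately [2.0, -1.5]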

  12. Climate Literacy Through Student-Teacher-Scientist Research Partnerships

    NASA Astrophysics Data System (ADS)

    Niepold, F.; Brooks, D.; Lefer, B.; Linsley, A.; Duckenfield, K.

    2006-12-01

    Expanding on the GLOBE Program's Atmosphere and Aerosol investigations, high school students can conduct Earth System scientific research that promotes scientific literacy in both content and the science process. Through the use of Student-Teacher-Scientist partnerships, Earth system scientific investigations can be conducted that serve the needs of the classroom as well as participating scientific investigators. During the proof-of-concept phase of this partnership model, teachers and their students developed science plans, through consultation with scientists, and began collecting atmospheric and aerosol data in support of the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) campaign in Houston, Texas. This effort uses some pre-existing GLOBE materials, but draws on a variety of other resources to tailor the teacher development activities and intended student participation in a way that addresses local and regional problems. Students and teachers have learned about best practices in scientific inquiry and they also helped to expand the pipeline of potential future scientists and researchers for industry, academia, and government. This work began with a Student-Teacher-Scientist partnership started in 2002 during a GLOBE Aerosol Protocol Cross-Ground Validation of AERONET with MODIS Satellite Aerosol Measurements. Several other GLOBE schools, both national and international, have contributed to this research. The current project supports the intensive GoMACCS air quality and atmospheric dynamics field campaign during September and October of 2006. This model will be evaluated for wider use in other project-focused partnerships led by NOAA's Climate Program Office.

  13. Performance evaluation of CESM in simulating the dust cycle

    NASA Astrophysics Data System (ADS)

    Parajuli, S. P.; Yang, Z. L.; Kocurek, G.; Lawrence, D. M.

    2014-12-01

    Mineral dust in the atmosphere has implications for Earth's radiation budget, biogeochemical cycles, hydrological cycles, human health and visibility. Mineral dust is injected into the atmosphere during dust storms when the surface winds are sufficiently strong and the land surface conditions are favorable. Dust storms are very common in specific regions of the world including the Middle East and North Africa (MENA) region, which contains more than 50% of the global dust sources. In this work, we present a simulation of the dust cycle under the framework of CESM1.2.2 and evaluate how well the model captures the spatio-temporal characteristics of dust sources, transport and deposition at the global scale, especially in dust source regions. We conducted our simulations using two existing erodibility maps (geomorphic and topographic) and a new erodibility map, which is based on the correlation between observed wind and dust. We compare the simulated results with MODIS satellite data, MACC reanalysis data, and AERONET station data. Comparison with MODIS satellite data and MACC reanalysis data shows that all three erodibility maps generally reproduce the spatio-temporal characteristics of dust optical depth globally. However, comparison with AERONET station data shows that the simulated dust optical depth is generally overestimated for all erodibility maps. Results vary greatly by region and scale of observational data. Our results also show that the simulations forced by reanalysis meteorology capture the overall dust cycle more realistically compared to the simulations done using online meteorology.

  14. Methods for nuclear air-cleaning-system accident-consequence assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  15. Influence of Northeast Monsoon cold surges on air quality in Southeast Asia

    NASA Astrophysics Data System (ADS)

    Ashfold, M. J.; Latif, M. T.; Samah, A. A.; Mead, M. I.; Harris, N. R. P.

    2017-10-01

    Ozone (O3) is an important ground-level pollutant. O3 levels and emissions of O3 precursors have increased significantly over recent decades in East Asia and export of this O3 eastward across the Pacific Ocean is well documented. Here we show that East Asian O3 is also transported southward to tropical Southeast (SE) Asia during the Northeast Monsoon (NEM) season (defined as November to February), and that this transport pathway is especially strong during 'cold surges'. Our analysis employs reanalysis data and measurements from surface sites in Peninsular Malaysia, both covering 2003-2012, along with trajectory calculations. Using a cold surge index (northerly winds at 925 hPa averaged over 105-110°E, 5°N) to define sub-seasonal strengthening of the NEM winds, we find the largest changes in a region covering much of the Indochinese Peninsula and surrounding seas. Here, the levels of O3 and another key pollutant, carbon monoxide, calculated by the Monitoring Atmospheric Composition and Climate (MACC) Reanalysis are on average elevated by, respectively, >40% (∼15 ppb) and >60% (∼80 ppb) during cold surges. Further, in the broader region of SE Asia local afternoon exceedances of the World Health Organization's air quality guideline for O3 (100 μg m-3, or ∼50 ppb, averaged over 8 h) largely occur during these cold surges. Day-to-day variations in available O3 observations at surface sites on the east coast of Peninsular Malaysia and in corresponding parts of the MACC Reanalysis are similar, and are clearly linked to cold surges. However, observed O3 levels are typically ∼10-20 ppb lower than the MACC Reanalysis. We show that these observations are also subject to influence from local urban pollution. In agreement with past work, we find year-to-year variations in cold surge activity related to the El Nino-Southern Oscillation (ENSO), but this does not appear to be the dominant influence of ENSO on atmospheric composition in this region. Overall, our study indicates that the influence of East Asian pollution on air quality in SE Asia during the NEM could be at least as large as the corresponding, well-studied spring-time influence on North America. Both an enhanced regional observational capability and chemical modelling studies will be required to fully untangle the importance of this long-range influence relative to local processes.
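
    A minimal sketch of the cold surge index described above (925 hPa meridional wind averaged over 105-110°E at 5°N, with northerlies counted positive); the wind field, grid and surge threshold below are synthetic placeholders rather than the reanalysis data used in the study.

      import numpy as np

      rng = np.random.default_rng(1)
      lons = np.arange(100.0, 115.1, 0.5)                     # longitude grid at 5N (degrees east)
      ndays = 120                                              # e.g. one NEM season of daily fields
      v925 = rng.normal(-2.0, 3.0, size=(ndays, lons.size))    # 925 hPa meridional wind (m/s), southward negative

      box = (lons >= 105.0) & (lons <= 110.0)                  # averaging sector 105-110E
      surge_index = -v925[:, box].mean(axis=1)                 # positive values = northerly flow at 5N

      threshold = 2.0                                          # m/s, illustrative surge criterion only
      surge_days = np.flatnonzero(surge_index > threshold)
      print(f"{surge_days.size} of {ndays} days flagged as cold surge days")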

  16. Postoperative Outcomes in Obstructive Sleep Apnea Patients Undergoing Cardiac Surgery: A Systematic Review and Meta-analysis of Comparative Studies.

    PubMed

    Nagappa, Mahesh; Ho, George; Patra, Jayadeep; Wong, Jean; Singh, Mandeep; Kaw, Roop; Cheng, Davy; Chung, Frances

    2017-12-01

    Obstructive sleep apnea (OSA) is a common comorbidity in patients undergoing cardiac surgery and may predispose patients to postoperative complications. The purpose of this meta-analysis is to determine the evidence of postoperative complications associated with OSA patients undergoing cardiac surgery. A literature search of Cochrane Database of Systematic Reviews, Medline, Medline In-process, Web of Science, Scopus, EMBASE, Cochrane Central Register of Controlled Trials, and CINAHL until October 2016 was performed. The search was constrained to studies in adult cardiac surgical patients with diagnosed or suspected OSA. All included studies must report at least 1 postoperative complication. The primary outcome is major adverse cardiac or cerebrovascular events (MACCEs) up to 30 days after surgery, which includes death from all-cause mortality, myocardial infarction, myocardial injury, nonfatal cardiac arrest, revascularization process, pulmonary embolism, deep venous thrombosis, newly documented postoperative atrial fibrillation (POAF), stroke, and congestive heart failure. Secondary outcome is newly documented POAF. The other exploratory outcomes include the following: (1) postoperative tracheal intubation and mechanical ventilation; (2) infection and/or sepsis; (3) unplanned intensive care unit (ICU) admission; and (4) duration of stay in hospital and ICU. Meta-analysis and meta-regression were conducted using Cochrane Review Manager 5.3 (Cochrane, London, UK) and OpenBUGS v3.0, respectively. Eleven comparative studies were included (n = 1801 patients; OSA versus non-OSA: 688 vs 1113, respectively). MACCEs had 33.3% higher odds in OSA versus non-OSA patients (OSA versus non-OSA: 31% vs 10.6%; odds ratio [OR], 2.4; 95% confidence interval [CI], 1.38-4.2; P = .002). The odds of newly documented POAF (OSA versus non-OSA: 31% vs 21%; OR, 1.94; 95% CI, 1.13-3.33; P = .02) were higher in OSA compared to non-OSA. Even though the postoperative tracheal intubation and mechanical ventilation (OSA versus non-OSA: 13% vs 5.4%; OR, 2.67; 95% CI, 1.03-6.89; P = .04) were significantly higher in OSA patients, the length of ICU stay and hospital stay were not significantly prolonged in patients with OSA compared to non-OSA. The majority of OSA patients were not treated with continuous positive airway pressure therapy. Meta-regression and sensitivity analysis of the subgroups did not impact the OR of postoperative complications for OSA versus non-OSA groups. Our meta-analysis demonstrates that after cardiac surgery, MACCEs and newly documented POAF had 33.3% and 18.1% higher odds in OSA versus non-OSA patients, respectively.

  17. Assimilation of atmospheric methane products into the MACC-II system: from SCIAMACHY to TANSO and IASI

    NASA Astrophysics Data System (ADS)

    Massart, S.; Agusti-Panareda, A.; Aben, I.; Butz, A.; Chevallier, F.; Crevosier, C.; Engelen, R.; Frankenberg, C.; Hasekamp, O.

    2014-06-01

    The Monitoring Atmospheric Composition and Climate Interim Implementation (MACC-II) delayed-mode (DM) system has been producing an atmospheric methane (CH4) analysis 6 months behind real time since June 2009. This analysis used to rely on the assimilation of the CH4 product from the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) instrument onboard Envisat. Recently the Laboratoire de Météorologie Dynamique (LMD) CH4 products from the Infrared Atmospheric Sounding Interferometer (IASI) and the SRON Netherlands Institute for Space Research CH4 products from the Thermal And Near-infrared Sensor for carbon Observation (TANSO) were added to the DM system. With the loss of Envisat in April 2012, the DM system now has to rely on the assimilation of methane data from TANSO and IASI. This paper documents the impact of this change in the observing system on the methane tropospheric analysis. It is based on four experiments: one free run and three analyses from, respectively, the assimilation of SCIAMACHY, TANSO and a combination of TANSO and IASI CH4 products in the MACC-II system. The period between December 2010 and April 2012 is studied. The SCIAMACHY experiment globally underestimates the tropospheric methane by 35 parts per billion (ppb) compared to the HIAPER Pole-to-Pole Observations (HIPPO) data and by 28 ppb compared to the Total Carbon Column Observing Network (TCCON) data, while the free run presents an underestimation of 5 ppb and 1 ppb against the same HIPPO and TCCON data, respectively. The assimilated TANSO product changed in October 2011 from version v.1 to version v.2.0. The analysis of version v.1 globally underestimates the tropospheric methane by 18 ppb compared to the HIPPO data and by 15 ppb compared to the TCCON data. In contrast, the analysis of version v.2.0 globally overestimates the column by 3 ppb. When the high-density IASI data are added in the tropical region between 30° N and 30° S, their impact is mainly positive but more pronounced and effective when combined with version v.2.0 of the TANSO products. The resulting analysis globally underestimates the column-averaged dry-air mole fractions of methane (xCH4) by just under 1 ppb on average compared to the TCCON data, whereas in the tropics it overestimates xCH4 by about 3 ppb. The random error is estimated to be less than 7 ppb when compared to TCCON data.

  18. Efficacy of multiple arterial coronary bypass grafting in patients with diabetes mellitus.

    PubMed

    Yamaguchi, Atsushi; Kimura, Naoyuki; Itoh, Satoshi; Adachi, Koichi; Yuri, Koichi; Okamura, Homare; Adachi, Hideo

    2016-09-01

    Use of the left internal mammary artery in patients with diabetes mellitus and multivessel coronary artery disease is known to improve survival after coronary artery bypass grafting (CABG); however, the survival benefit of multiple arterial grafts (MAGs) in diabetic patients is debated. We investigated the efficacy of CABG performed with MAGs in diabetic patients. The overall patient group comprised 2618 consecutive patients who underwent isolated CABG at our hospital between 1990 and 2014. Perioperative characteristics, in-hospital outcomes and long-term outcomes were compared between diabetic (n = 1110) and non-diabetic patients (n = 1508). The long-term outcomes of diabetic and non-diabetic patients were analysed between those who received a single arterial graft (SAG) and those who received MAGs. Both full unmatched patient population and propensity-matched patient population analyses (diabetic cohort = 431 pairs, non-diabetic cohort = 577 pairs) were performed. Preoperative comorbidities were much more common in the diabetic patients than in the non-diabetic patients; however, comorbidities were not associated with in-hospital outcomes (diabetes versus non-diabetes group, in-hospital mortality: 2.2 vs 1.5%; deep sternal wound infection: 2.2 vs 1.8%, P > 0.05). Although survival and freedom from major cardiac and cerebrovascular events (MACCEs) at 15 years were lower in the diabetes group than in the non-diabetes group (survival: 48.6 vs 55.0%, P = 0.019; MACCE-free survival: 40.8 vs 46.1%, P = 0.02), cardiac death-free survival at 15 years was similar (81.7 vs 83.9%, P = 0.24). Overall, 12-year survival was higher in both diabetic and non-diabetic patients treated with MAGs than in those treated with an SAG (64.9 vs 56.8%, P = 0.006, and 71.9 vs 60.5%, P < 0.001). Propensity-matched patient cohort analysis revealed improved 12-year survival with MAGs versus SAG in both the diabetes group (64.9 vs 58.8%, P = 0.041) and non-diabetes group (71.4 vs 63.8%, P = 0.014). Similarly, MACCE-free survival was improved in both groups. A long-term survival advantage, with no increase in perioperative morbidity, is conferred with the use of multiple arterial bypass grafts not only in non-diabetic patients but also in diabetic patients. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  19. Plasma CX3CL1 levels and long term outcomes of patients with atrial fibrillation: the West Birmingham Atrial Fibrillation Project.

    PubMed

    Guo, Yutao; Apostalakis, Stavros; Blann, Andrew D; Lip, Gregory Y H

    2014-01-01

    There is growing evidence that chemokines are potentially important mediators of the pathogenesis of atherosclerotic disease. Major atherothrombotic complications, such as stroke and myocardial infarction, are common among atrial fibrillation (AF) patients. This increase in risk of adverse events may be predicted by a score based on the presence of certain clinical features of chronic heart failure, hypertension, age 75 years or greater, diabetes and stroke (the CHADS2 score). Our objective was to assess the prognostic value of plasma chemokines CCL2, CXCL4 and CX3CL1, and their relationship with the CHADS2 score, in AF patients. Plasma CCL2, CXCL4 and CX3CL1 were measured in 441 patients (59% male, mean age 75 years, 12% paroxysmal, 99% on warfarin) with AF. Baseline clinical and demographic factors were used to define each subject's CHADS2 score. Patients were followed up for a mean 2.1 years, and major adverse cardiovascular and cerebrovascular events (MACCE) were sought, being the combination of cardiovascular death, acute coronary events, stroke and systemic embolism. Fifty-five of the AF patients suffered a MACCE (6% per year). Those in the lowest CX3CL1 quartile (≤ 0.24 ng/ml) had fewest MACCE (p = 0.02). In the Cox regression analysis, CX3CL1 levels >0.24 ng/ml (Hazard ratio 2.8, 95% CI 1.02-8.2, p = 0.045) and age (p = 0.042) were independently linked with adverse outcomes. The CX3CL1 levels rose directly with the CHADS2 risk score (p = 0.009). The addition of CX3CL1 did not significantly increase the discriminatory ability of the CHADS2 clinical factor-based risk stratification (c-index 0.60 for CHADS2 alone versus 0.67 for CHADS2 plus CX3CL1 >0.24 ng/ml, p = 0.1). Aspirin use was associated with lower levels of CX3CL1 (p = 0.0002) and diabetes with higher levels (p = 0.031). There was no association between CXCL4 and CCL2 plasma levels and outcomes. There is an independent association between low plasma CX3CL1 levels and low risk of major cardiovascular events in AF patients, as well as a linear association between CX3CL1 plasma levels and CHADS2-defined cardiovascular risk. The potential for CX3CL1 in refining risk stratification in AF patients merits consideration. © 2014 S. Karger AG, Basel.

  20. Stochastic Modeling of Radioactive Material Releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason; Pope, Chad

    2015-09-01

    Nonreactor nuclear facilities operated under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple to use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA was developed using the MATLAB coding framework. The software application has a graphical user input. SODA can be installed on both Windows and Mac computers and does not require MATLAB to function. SODA provides improved risk understanding leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy to use supplemental tool to help improve risk understanding and support better informed decisions. The work was funded through a grant from the DOE Nuclear Safety Research and Development Program.
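
    To make the stochastic idea concrete, the following Python sketch samples distributions for the factors of a generic five-factor source-term/dose product and reports percentiles of the resulting dose distribution. The formula and every distribution parameter are illustrative assumptions, not SODA's models or defaults (SODA itself is a MATLAB application).

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000                                                   # Monte Carlo realizations

      mar = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)     # material at risk (g)
      dr  = rng.uniform(0.1, 1.0, size=n)                           # damage ratio
      arf = rng.triangular(1e-4, 1e-3, 1e-2, size=n)                # airborne release fraction
      rf  = rng.uniform(0.2, 1.0, size=n)                           # respirable fraction
      lpf = rng.uniform(0.01, 0.1, size=n)                          # leak path factor
      chi_q = 1e-4                                                  # atmospheric dilution (s/m^3), single point value
      br, dcf = 3.3e-4, 5.0                                         # breathing rate (m^3/s), dose factor (Sv per g inhaled)

      dose = mar * dr * arf * rf * lpf * chi_q * br * dcf           # Sv per realization
      print("median dose:", np.median(dose), "Sv")
      print("95th percentile:", np.percentile(dose, 95), "Sv")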

  1. The joint methane profiles retrieval approach from GOSAT TIR and SWIR spectra

    NASA Astrophysics Data System (ADS)

    Zadvornykh, Ilya V.; Gribanov, Konstantin G.; Zakharov, Vyacheslav I.; Imasu, Ryoichi

    2017-11-01

    In this paper we present a method, using methane as an example, that allows more accurate retrieval of greenhouse gases in the Earth's atmosphere. Using the new version of the FIRE-ARMS software, supplemented with the VLIDORT vector radiative transfer model, we carried out joint methane retrieval from TIR (Thermal Infrared Range) and SWIR (Short Wavelength Infrared Range) GOSAT spectra using the optimal estimation method. MACC reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF), supplemented by data from aircraft measurements of the HIPPO experiment, were used as a statistical ensemble.

  2. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  3. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  4. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  5. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  6. Bilingual Processing of ASL-English Code-Blends: The Consequences of Accessing Two Lexical Representations Simultaneously

    ERIC Educational Resources Information Center

    Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…

  7. Non-coding variants contribute to the clinical heterogeneity of TTR amyloidosis.

    PubMed

    Iorio, Andrea; De Lillo, Antonella; De Angelis, Flavio; Di Girolamo, Marco; Luigetti, Marco; Sabatelli, Mario; Pradotto, Luca; Mauro, Alessandro; Mazzeo, Anna; Stancanelli, Claudia; Perfetto, Federico; Frusconi, Sabrina; My, Filomena; Manfellotto, Dario; Fuciarelli, Maria; Polimanti, Renato

    2017-09-01

    Coding mutations in TTR gene cause a rare hereditary form of systemic amyloidosis, which has a complex genotype-phenotype correlation. We investigated the role of non-coding variants in regulating TTR gene expression and consequently amyloidosis symptoms. We evaluated the genotype-phenotype correlation considering the clinical information of 129 Italian patients with TTR amyloidosis. Then, we conducted a re-sequencing of TTR gene to investigate how non-coding variants affect TTR expression and, consequently, phenotypic presentation in carriers of amyloidogenic mutations. Polygenic scores for genetically determined TTR expression were constructed using data from our re-sequencing analysis and the GTEx (Genotype-Tissue Expression) project. We confirmed a strong phenotypic heterogeneity across coding mutations causing TTR amyloidosis. Considering the effects of non-coding variants on TTR expression, we identified three patient clusters with specific expression patterns associated with certain phenotypic presentations, including late onset, autonomic neurological involvement, and gastrointestinal symptoms. This study provides novel data regarding the role of non-coding variation and the gene expression profiles in patients affected by TTR amyloidosis, also putting forth an approach that could be used to investigate the mechanisms at the basis of the genotype-phenotype correlation of the disease.
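    A polygenic score for genetically determined expression, as used above, is typically a weighted sum of effect-allele dosages, with weights taken from eQTL effect sizes (for example, from GTEx). A minimal sketch follows; the variant identifiers, weights and dosages are hypothetical placeholders, not values from the study.

        # Hypothetical eQTL effect sizes (weights) for non-coding TTR variants;
        # the variant IDs and numbers are placeholders, not GTEx estimates.
        eqtl_weights = {"rsA": 0.42, "rsB": -0.31, "rsC": 0.18}

        def expression_score(dosages: dict) -> float:
            """Polygenic score for genetically determined expression:
            weighted sum of allele dosages (0, 1 or 2 copies of the effect allele)."""
            return sum(eqtl_weights[v] * dosages.get(v, 0.0) for v in eqtl_weights)

        # Two carriers of the same amyloidogenic coding mutation may still differ
        # in predicted TTR expression because of their non-coding background.
        carrier_1 = {"rsA": 2, "rsB": 0, "rsC": 1}
        carrier_2 = {"rsA": 0, "rsB": 2, "rsC": 0}
        print(expression_score(carrier_1), expression_score(carrier_2))  # 1.02 vs -0.62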

  8. Gender-Based Differences in Outcomes After Orbital Atherectomy for the Treatment of De Novo Severely Calcified Coronary Lesions.

    PubMed

    Lee, Michael S; Shlofmitz, Evan; Mansourian, Pejman; Sethi, Sanjum; Shlofmitz, Richard A

    2016-11-01

    We evaluated the relationship between gender and angiographic and clinical outcomes in patients with severely calcified lesions who underwent orbital atherectomy. Female gender is associated with an increased risk of adverse clinical events after percutaneous coronary intervention (PCI). Severe coronary artery calcification increases the complexity of PCI and increases the risk of adverse cardiac events. Orbital atherectomy is effective in plaque modification, which facilitates stent delivery and expansion. Whether gender differences exist after orbital atherectomy is unclear. We retrospectively analyzed 458 consecutive real-world patients (314 males and 144 females) from three centers who underwent orbital atherectomy. The primary endpoint was the major adverse cardiac and cerebrovascular event (MACCE) rate, defined as the composite of death, myocardial infarction (MI), target-vessel revascularization (TVR), and stroke, at 30 days. The primary endpoint of MACCE was low and similar in females and males (0.7% vs 2.9%; P=.14). The individual endpoints of death (0.7% vs 1.6%; P=.43), MI (0.7% vs 1.3%; P=.58), TVR (0% vs 0%; P>.99), and stroke (0% vs 0.3%; P=.50) were low in both groups and did not differ. Angiographic complications were low: perforation (0.8% vs 0.7%; P>.90), dissection (0.8% vs 1.1%; P=.80), and no-reflow (0.8% vs 0.7%; P>.90). Plaque modification with orbital atherectomy was safe and provided similar angiographic and clinical outcomes in females and males. Randomized trials with longer-term follow-up are needed to support our results.

  9. An overview of cancer research in South African academic and research institutions, 2013 - 2014.

    PubMed

    Moodley, Jennifer; Stefan, D Cristina; Sewram, Vikash; Ruff, Paul; Freeman, Melvyn; Asante-Shongwe, Kwanele

    2016-05-10

    Cancer is emerging as a critical public health problem in South Africa (SA). Recognising the importance of research in addressing the cancer burden, the Ministerial Advisory Committee on the Prevention and Control of Cancer (MACC) research working group undertook a review of the current cancer research landscape in SA and related this to the cancer burden. Academic and research institutions in SA were contacted to provide information on the titles of all current and recently completed (2013/2014) cancer research projects. Three MACC research working group members used the project titles to independently classify the projects by type of research (basic, clinical and public health - projects could be classified in more than one category) and disease site. A more detailed classification of projects addressing the five most common cancers diagnosed in males and females in SA was conducted using an adapted Common Scientific Outline (CSO) categorisation. Information was available on 556 cancer research projects. Overall, 301 projects were classified as clinical, 254 as basic science and 71 as public health research. The most common cancers being researched were cancers of the breast (n=95 projects) and cervix (n=43), leukaemia (n=36), non-Hodgkin's lymphoma (n=35) and lung cancer (n=23). Classification of the five most common cancers in males and females in SA, using the adapted CSO categories, showed that the majority of projects related to treatment, with relatively few projects on prevention, survivorship and patient perspectives. Our findings established that there is a dearth of public health cancer research in SA.

  10. Guideline-adherence and perspectives in the acute management of unstable angina - Initial results from the German chest pain unit registry.

    PubMed

    Breuckmann, Frank; Hochadel, Matthias; Darius, Harald; Giannitsis, Evangelos; Münzel, Thomas; Maier, Lars S; Schmitt, Claus; Schumacher, Burghard; Heusch, Gerd; Voigtländer, Thomas; Mudra, Harald; Senges, Jochen

    2015-08-01

    We investigated the current management of unstable angina pectoris (UAP) in certified chest pain units (CPUs) in Germany and focused on adherence to European Society of Cardiology (ESC) guidelines in the timing of invasive strategies or the choice of conservative treatment options. More specifically, we analyzed differences in clinical outcome with respect to guideline-adherence. Prospective data from 1400 UAP patients were collected. Analyses of high-risk criteria with indication for invasive management and 3-month clinical outcome data were performed. Guideline-adherence was tested for a primarily conservative strategy as well as for percutaneous coronary intervention (PCI) within <24 h and <72 h after admission. Overall, guideline-conforming management was performed in 38.2%. In UAP patients at risk, undertreatment caused by insufficient consideration of risk criteria was evident in 78%. Conversely, overtreatment in the absence of adequate risk markers was performed in 27%, whereas a guideline-conforming primarily conservative strategy was chosen in 73% of the low-risk patients. Overall, the 3-month rate of major adverse coronary and cerebrovascular events (MACCE) was low (3.6%). Nonetheless, guideline-conforming treatment was associated with significantly lower MACCE rates (1.6% vs. 4.0%, p<0.05). The data suggest inadequate adherence to ESC guidelines in nearly two thirds of the patients, particularly in those patients at high to intermediate risk with secondary risk factors, emphasizing the need for further attention to consistent risk profiling in the CPU and its certification process. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  11. Coding Manual for Continuous Observation of Interactions by Single Subjects in an Academic Setting.

    ERIC Educational Resources Information Center

    Cobb, Joseph A.; Hops, Hyman

    The manual, designed particularly for work with acting-out or behavior problem students, describes coding procedures used in the observation of continuous classroom interactions between the student and his peers and teacher. Peer and/or teacher behaviors antecedent and consequent to the subject's behavior are identified in the coding process,…

  12. Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.

    PubMed

    Padula, William V; McQueen, Robert Brett; Pronovost, Peter J

    2017-11-01

    The Second Panel on Cost-Effectiveness in Health and Medicine recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open source code interface for economic models. Making economic model source code openly available could be positive and progressive for the field; however, several unintended consequences of such a system should first be considered before it is fully implemented. First, there is the concern regarding intellectual property rights that modelers have to their analyses. Second, the open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of its use and grant permission to other investigators who wish to use it. The field should also be more forthcoming in teaching cost-effectiveness analysis in medical and health services education so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These types of unintended consequences need to be fully considered before the field is prepared to move forward into an era of model transparency with open source code.

  13. Institutional Controls and Educational Research.

    ERIC Educational Resources Information Center

    Homan, Roger

    1990-01-01

    Recognizing tendencies toward contract research and possible consequences, advocates creating a conduct code to regulate educational research and protect its integrity. Reports survey responses from 48 British institutions, showing no systematic code. States confidence in supervisory discretion currently guides research. Proposes a specific code…

  14. Separable concatenated codes with iterative map decoding for Rician fading channels

    NASA Technical Reports Server (NTRS)

    Lodge, J. H.; Young, R. J.

    1993-01-01

    Very efficient signalling in radio channels requires the design of very powerful codes having special structure suitable for practical decoding schemes. In this paper, powerful codes are obtained by combining comparatively simple convolutional codes to form multi-tiered 'separable' convolutional codes. The decoding of these codes, using separable symbol-by-symbol maximum a posteriori (MAP) 'filters', is described. It is known that this approach yields impressive results in non-fading additive white Gaussian noise channels. Interleaving is an inherent part of the code construction, and consequently, these codes are well suited for fading channel communications. Here, simulation results for communications over Rician fading channels are presented to support this claim.

  15. Top-down NOX emissions over European cities from LOTOS-EUROS simulated and OMI observed tropospheric NO2 columns using the Exponentially Modified Gaussian approach

    NASA Astrophysics Data System (ADS)

    Verstraeten, Willem W.; Folkert Boersma, K.; Douros, John; Williams, Jason E.; Eskes, Henk H.; Delcloo, Andy

    2017-04-01

    High nitrogen oxide concentrations at the surface (NOX = NO + NO2) adversely affect human health and ecosystems and play a key role in tropospheric chemistry. Surface NOX emissions drive major processes in regional and global chemistry transport models (CTMs). NOX contributes to the formation of acid rain, acts as an aerosol precursor and is an important trace gas for the formation of tropospheric ozone (O3). Via tropospheric O3, NOX indirectly affects the production of the hydroxyl radical, which controls the chemical lifetime of key atmospheric pollutants and reactive greenhouse gases. High NOX emissions are mainly observed in polluted regions, produced by anthropogenic combustion from industrial, traffic and household activities typically found in large and densely populated urban areas. Accurate NOX inventories are essential, but state-of-the-art emission databases may vary substantially and uncertainties are high, since reported emission factors may differ by an order of magnitude or more. To date, modelled NO2 concentrations and lifetimes have large associated uncertainties due to the highly non-linear small-scale chemistry that occurs in urban areas, uncertainties in the reaction rate data, missing nitrogen (N) species and volatile organic compound (VOC) emissions, and incomplete knowledge of nitrogen oxide chemistry. Any overestimation of the chemical lifetime may mask missing NOX chemistry in current CTMs. By simultaneously estimating both the NO2 lifetime and concentrations, for instance by using the Exponentially Modified Gaussian (EMG) approach, a better surface NOX emission flux estimate can be obtained. Here we evaluate whether the EMG methodology can reproduce the emissions input from the tropospheric NO2 columns simulated by the LOTOS-EUROS (Long Term Ozone Simulation-European Ozone Simulation) CTM. We apply the EMG methodology to LOTOS-EUROS simulated tropospheric NO2 columns for the period April-September 2013 for 21 selected European urban areas under windy conditions (surface wind speeds > 3 m s-1). We then compare the top-down derived surface NOX emissions with the 2011 MACC-III emission inventory, used in the LOTOS-EUROS model as input to simulate the NO2 columns. We also apply the EMG methodology to OMI (Ozone Monitoring Instrument) tropospheric NO2 column data, providing us with real-time observation-based estimates of midday NO2 lifetime and NOX emissions over 21 European cities in 2013. Results indicate that the top-down derived NOX emissions from LOTOS-EUROS (respectively OMI) are comparable with the MACC-III inventory, with an R2 of 0.99 (respectively R2 = 0.79). For St Petersburg and Moscow the top-down NOX estimates from 2013 OMI data are biased low compared to the MACC-III inventory, which uses a 2011 NOX emissions update.
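    The EMG approach referred to above fits NO2 line densities along the mean wind direction with an exponentially modified Gaussian: the e-folding length of the exponential gives the effective NO2 lifetime (tau = x0 / wind speed) and the fitted amplitude divided by that lifetime gives an emission estimate. A minimal sketch of the fitting step on synthetic data is given below; the functional form is the standard EMG, but the synthetic numbers, assumed wind speed and loose unit handling are illustrative and are not the LOTOS-EUROS/OMI processing chain.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erfc

        def emg(x, E, x0, sigma, mu, b):
            """Exponentially modified Gaussian line density: a Gaussian source of width
            sigma centred at mu, convolved with an exponential decay of e-folding
            length x0, scaled by the total amount E, plus a constant background b."""
            lam = 1.0 / x0
            g = (lam / 2.0) * np.exp(lam * (mu + lam * sigma**2 / 2.0 - x)) \
                * erfc((mu + lam * sigma**2 - x) / (np.sqrt(2.0) * sigma))
            return E * g + b

        # Synthetic example: along-wind distance in km and a noisy "observed" line density.
        x = np.linspace(-100, 300, 80)
        truth = emg(x, E=50.0, x0=60.0, sigma=20.0, mu=0.0, b=0.2)
        obs = truth + np.random.default_rng(1).normal(0, 0.02, x.size)

        popt, _ = curve_fit(emg, x, obs, p0=[10.0, 50.0, 15.0, 10.0, 0.1],
                            bounds=([0, 5, 1, -50, 0], [1e3, 500, 60, 100, 10]))
        E_fit, x0_fit = popt[0], popt[1]

        wind_speed = 5.0                      # assumed mean boundary-layer wind [m/s]
        tau = x0_fit * 1e3 / wind_speed       # effective NO2 lifetime [s]
        emission = E_fit / tau                # emission rate in the fitted units per second
        print(f"x0 = {x0_fit:.1f} km, tau = {tau/3600:.1f} h, E/tau = {emission:.2e}")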

  16. RISKIND : an enhanced computer code for National Environmental Policy Act transportation consequence analysis

    DOT National Transportation Integrated Search

    1996-01-01

    The RISKIND computer program was developed for the analysis of radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel (SNF) or other radioactive ...

  17. Sanctions Connected to Dress Code Violations in Secondary School Handbooks

    ERIC Educational Resources Information Center

    Workman, Jane E.; Freeburg, Elizabeth W.; Lentz-Hees, Elizabeth S.

    2004-01-01

    This study identifies and evaluates sanctions for dress code violations in secondary school handbooks. Sanctions, or consequences for breaking rules, vary along seven interrelated dimensions: source, formality, retribution, obtrusiveness, magnitude, severity, and pervasiveness. A content analysis of handbooks from 155 public secondary schools…

  18. Natural Language Interface for Safety Certification of Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2011-01-01

    Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.

  19. Safety and effectiveness of the INVATEC MO.MA proximal cerebral protection device during carotid artery stenting: results from the ARMOUR pivotal trial.

    PubMed

    Ansel, Gary M; Hopkins, L Nelson; Jaff, Michael R; Rubino, Paolo; Bacharach, J Michael; Scheinert, Dierk; Myla, Subbarao; Das, Tony; Cremonesi, Alberto

    2010-07-01

    The multicenter ARMOUR (ProximAl PRotection with the MO.MA Device DUring CaRotid Stenting) trial evaluated the 30-day safety and effectiveness of the MO.MA Proximal Cerebral Protection Device (Invatec, Roncadelle, Italy) utilized to treat high surgical risk patients undergoing carotid artery stenting (CAS). Distal embolic protection devices (EPD) have been traditionally utilized during CAS. The MO.MA device acts as a balloon occlusion "endovascular clamping" system to achieve cerebral protection prior to crossing the carotid stenosis. This prospective registry enrolled 262 subjects, 37 roll-in and 225 pivotal subjects evaluated with intention to treat (ITT) from September 2007 to February 2009. Subjects underwent CAS using the MO.MA device. The primary endpoint, myocardial infarction, stroke, or death through 30 days (30-day major adverse cardiac and cerebrovascular events [MACCE]) was compared to a performance goal of 13% derived from trials utilizing distal EPD. For the ITT population, the mean age was 74.7 years with 66.7% of the cohort being male. Symptomatic patients comprised 15.1% and 28.9% were octogenarians. Device success was 98.2% and procedural success was 93.2%. The 30-day MACCE rate was 2.7% [95% CI (1.0-5.8%)] with a 30-day major stroke rate of 0.9%. No symptomatic patient suffered a stroke during this trial. The ARMOUR trial demonstrated that the MO.MA® Proximal Cerebral Protection Device is safe and effective for high surgical risk patients undergoing CAS. The absence of stroke in symptomatic patients is the lowest rate reported in any independently adjudicated prospective multicenter registry trial to date. © 2010 Wiley-Liss, Inc.

  20. Off-pump compared to minimal extracorporeal circulation surgery in coronary artery bypass grafting.

    PubMed

    Reuthebuch, Oliver; Koechlin, Luca; Gahl, Brigitta; Matt, Peter; Schurr, Ulrich; Grapow, Martin; Eckstein, Friedrich

    2014-01-01

    Coronary artery bypass grafting (CABG) using extracorporeal circulation (ECC) is still the gold standard. However, alternative techniques have been developed to avoid ECC and its potential adverse effects. These encompass minimal extracorporeal circulation (MECC) and off-pump coronary artery bypass grafting (OPCAB). However, the potential benefits of MECC compared with OPCAB are not yet clearly established. In this retrospective study we investigated the potential benefits of MECC and OPCAB in 697 patients undergoing CABG. Of these, 555 patients had been operated on with MECC and 142 off-pump. The primary endpoint was the troponin T level as an indicator of myocardial damage. The study groups were not significantly different overall. However, patients undergoing OPCAB were significantly older (65.01 ± 9.5 years vs. 69.39 ± 9.5 years; p value <0.001) with a higher logistic EuroSCORE I (4.92% ± 6.5 vs. 5.88% ± 6.8; p value = 0.017). Operating off pump significantly reduced the need for intra-operative blood products (0.7% vs. 8.6%; p value <0.001) and the length of stay in the intensive care unit (ICU) (2.04 ± 2.63 days vs. 2.76 ± 2.79 days; p value <0.001). For other blood values, no significant difference was found in the adjusted analyses. The combined secondary endpoint, major cardiac or cerebrovascular events (MACCE), occurred at similar rates in both groups. Coronary artery bypass grafting using MECC and OPCAB are two comparable techniques, with advantages for OPCAB regarding the reduced need for intra-operative blood products and a shorter length of stay in the ICU. However, serological values and the combined endpoint MACCE did not differ significantly between the groups.

  1. Measurements of volatile organic compounds during the 2006 TexAQS/GoMACCS campaign: Industrial influences, regional characteristics, and diurnal dependencies of the OH reactivity

    NASA Astrophysics Data System (ADS)

    Gilman, Jessica B.; Kuster, William C.; Goldan, Paul D.; Herndon, Scott C.; Zahniser, Mark S.; Tucker, Sara C.; Brewer, W. Alan; Lerner, Brian M.; Williams, Eric J.; Harley, Robert A.; Fehsenfeld, Fred C.; Warneke, Carsten; de Gouw, Joost A.

    2009-04-01

    An extensive set of volatile organic compounds (VOCs) and other gas phase species were measured in situ aboard the NOAA R/V Ronald H. Brown as the ship sailed in the Gulf of Mexico and the Houston and Galveston Bay (HGB) area as part of the Texas Air Quality (TexAQS)/Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS) conducted from July-September 2006. The magnitudes of the reactivities of CH4, CO, VOCs, and NO2 with the hydroxyl radical, OH, were determined in order to quantify the contributions of these compounds to potential ozone formation. The average total OH reactivity (ROH,TOTAL) increased from 1.01 s-1 in the central gulf to 10.1 s-1 in the HGB area as a result of the substantial increase in the contribution from VOCs and NO2. The increase in the measured concentrations of reactive VOCs in the HGB area compared to the central gulf was explained by the impact of industrial emissions, the regional distribution of VOCs, and the effects of local meteorology. By compensating for the effects of boundary layer mixing, the diurnal profiles of the OH reactivity were used to characterize the source signatures and relative magnitudes of biogenic, anthropogenic (urban + industrial), and oxygenated VOCs as a function of the time of day. The source of reactive oxygenated VOCs (e.g., formaldehyde) was determined to be almost entirely from secondary production. The secondary formation of oxygenated VOCs, in addition to the continued emissions of reactive anthropogenic VOCs, served to sustain elevated levels of OH reactivity throughout the time of peak ozone production.
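    The total OH reactivity quoted above is simply the sum, over all measured reactive species, of the OH rate constant times the species number density: R_OH,TOTAL = sum_i k_OH+Xi [Xi]. The sketch below illustrates that bookkeeping with a handful of species; the rate constants are approximate 298 K, 1 atm values and the mixing ratios are invented, not TexAQS/GoMACCS measurements.

        # Total OH reactivity R_OH = sum_i k_(OH+X_i) * [X_i], in s^-1.
        M = 2.46e19  # air number density at 298 K, 1 atm [molecules cm^-3]

        # species: (approximate rate constant k_OH [cm^3 molecule^-1 s^-1], mixing ratio [ppbv])
        species = {
            "CO":       (2.4e-13, 150.0),
            "CH4":      (6.4e-15, 1850.0),
            "NO2":      (1.1e-11, 5.0),
            "isoprene": (1.0e-10, 0.5),
            "HCHO":     (8.5e-12, 3.0),
        }

        def oh_reactivity(inventory):
            total = 0.0
            for name, (k, ppb) in inventory.items():
                conc = ppb * 1e-9 * M        # convert ppbv to molecules cm^-3
                r = k * conc                 # contribution of this species [s^-1]
                print(f"{name:8s} {r:6.3f} s^-1")
                total += r
            return total

        print(f"R_OH,total = {oh_reactivity(species):.2f} s^-1")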

  2. Sex-related differences after contemporary primary percutaneous coronary intervention for ST-segment elevation myocardial infarction.

    PubMed

    Barthélémy, Olivier; Degrell, Philippe; Berman, Emmanuel; Kerneis, Mathieu; Petroni, Thibaut; Silvain, Johanne; Payot, Laurent; Choussat, Remi; Collet, Jean-Philippe; Helft, Gerard; Montalescot, Gilles; Le Feuvre, Claude

    2015-01-01

    Whether outcomes differ for women and men after percutaneous coronary intervention (PCI) for ST-segment elevation myocardial infarction (STEMI) remains controversial. To compare 1-year outcomes after primary PCI in women and men with STEMI, matched for age and diabetes. Consecutive women with STEMI of <24 hours' duration referred (August 2007 to January 2011) for primary PCI were compared with men matched for age and diabetes. Rates of all-cause mortality, target vessel revascularization (TVR) and major cardiovascular and cerebrovascular events (MACCE) (death/myocardial infarction/stroke) were assessed at 1 year. Among 775 consecutive patients, 182 (23.5%) women were compared with 182 matched men. Mean age was 69±15 years, and 18% had diabetes. Patient characteristics were similar, except for lower creatinine clearance (73±41 vs 82±38 mL/min; P=0.041), more cardiogenic shock (14.8% vs 6.6%; P=0.017) and less radial PCI (81.3% vs 90.1%; P=0.024) in women. Rates of 1-year death (22.7% vs 18.1%), TVR (8.3% vs 6.0%) and MACCE (24.3% vs 20.9%) were not statistically different in women (P>0.05 for all). After exclusion of patients with shock (10.7%) and out-of-hospital cardiac arrest (6.6%), death rates were even more similar (11.3% vs 11.8%; P=0.10). Female sex was not independently associated with death (odds ratio 1.01, 95% confidence interval 0.55-1.87; P=0.97). In our consecutive unselected patient population, women had similar 1-year outcomes to men matched for age and diabetes after contemporary primary PCI for STEMI, despite having a higher risk profile at baseline. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  3. Marine aerosol distribution and variability over the pristine Southern Indian Ocean

    NASA Astrophysics Data System (ADS)

    Mallet, Paul-Étienne; Pujol, Olivier; Brioude, Jérôme; Evan, Stéphanie; Jensen, Andrew

    2018-06-01

    This paper presents an 8-year (2005-2012 inclusive) study of the marine aerosol distribution and variability over the Southern Indian Ocean, specifically in the area {10°S-40°S; 50°E-110°E}, which has been identified as one of the most pristine regions of the globe. A large dataset consisting of satellite data (POLDER, CALIOP), AERONET measurements at Saint-Denis (French Réunion Island) and model reanalysis (MACC) has been used. In spite of a positive bias of about 0.05 between the AOD (aerosol optical depth) given by POLDER and MACC on one hand and the AOD measured by AERONET on the other, consistent results for aerosol distribution and variability over the area considered have been obtained. First, aerosols are mainly confined below 2 km asl (above sea level) and are dominated by sea salt, especially in the center of the area of interest, with AOD ≤ 0.1. This zone is the most pristine and is associated with the position of the Mascarene anticyclone. There, the direct radiative effect is assessed at around -9 W m-2 at the top of the atmosphere, and probability density functions of the AODs are leptokurtic lognormal functions without any significant seasonal variation. It is also suggested that the Madden-Julian oscillation impacts sea salt emissions in the northern part of the area considered by modifying the state of the ocean surface. Finally, this area is surrounded in the northeast and the southwest by seasonal Australian and South African intrusions (AOD > 0.1); throughout the year, the ITCZ seems to limit continental contamination from Asia. Due to the long period of time considered (almost a decade), this paper completes and strengthens results of studies based on observations performed during previous specific field campaigns.
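    As a small illustration of the statistical characterization mentioned above (leptokurtic lognormal AOD distributions), the sketch below fits a lognormal to a synthetic AOD sample and checks its excess kurtosis; the numbers are invented and only stand in for the satellite record.

        import numpy as np
        from scipy import stats

        # Synthetic daily AOD samples standing in for the 8-year record
        # (values merely illustrative of a pristine marine background).
        rng = np.random.default_rng(0)
        aod = rng.lognormal(mean=np.log(0.07), sigma=0.35, size=2000)

        # Fit a lognormal PDF (location fixed at zero) to the AOD samples.
        shape, loc, scale = stats.lognorm.fit(aod, floc=0)
        print(f"median AOD = {scale:.3f}, geometric std = {np.exp(shape):.2f}")

        # Positive excess kurtosis indicates a leptokurtic (peaked, heavy-tailed) distribution.
        print(f"excess kurtosis = {stats.kurtosis(aod):.2f}")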

  4. New quantum codes derived from a family of antiprimitive BCH codes

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes of length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
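    The construction above hinges on the structure of the q^2-ary cyclotomic cosets modulo n = q^(2m) + 1, which determine the admissible BCH defining sets. The sketch below merely enumerates those cosets for the smallest case allowed by the abstract (q = 4, m = 2); it does not implement the Hermitian dual-containing test or the quantum code construction itself.

        def cyclotomic_cosets(q2, n):
            """Partition Z_n into q2-ary cyclotomic cosets C_s = {s, s*q2, s*q2^2, ...} mod n."""
            seen, cosets = set(), []
            for s in range(n):
                if s in seen:
                    continue
                coset, x = [], s
                while x not in coset:
                    coset.append(x)
                    x = (x * q2) % n
                cosets.append(sorted(coset))
                seen.update(coset)
            return cosets

        # Smallest antiprimitive example: q = 4, m = 2, so length n = q^(2m) + 1 = 257
        # over GF(q^2) = GF(16); the cosets constrain which defining sets are possible.
        q, m = 4, 2
        q2, n = q * q, q ** (2 * m) + 1
        cosets = cyclotomic_cosets(q2, n)
        print(f"n = {n}, number of {q2}-ary cosets = {len(cosets)}")
        print("coset of 1:", cosets[1], "size", len(cosets[1]))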

  5. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O`Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  6. The Therapy Process Observational Coding System for Child Psychotherapy Strategies Scale

    ERIC Educational Resources Information Center

    McLeod, Bryce D.; Weisz, John R.

    2010-01-01

    Most everyday child and adolescent psychotherapy does not follow manuals that document the procedures. Consequently, usual clinical care has remained poorly understood and rarely studied. The Therapy Process Observational Coding System for Child Psychotherapy-Strategies scale (TPOCS-S) is an observational measure of youth psychotherapy procedures…

  7. Perceptual consequences of disrupted auditory nerve activity.

    PubMed

    Zeng, Fan-Gang; Kong, Ying-Yee; Michalewski, Henry J; Starr, Arnold

    2005-06-01

    Perceptual consequences of disrupted auditory nerve activity were systematically studied in 21 subjects who had been clinically diagnosed with auditory neuropathy (AN), a recently defined disorder characterized by normal outer hair cell function but disrupted auditory nerve function. Neurological and electrophysiological evidence suggests that disrupted auditory nerve activity is due to desynchronized or reduced neural activity, or both. Psychophysical measures showed that the disrupted neural activity has minimal effects on intensity-related perception, such as loudness discrimination, pitch discrimination at high frequencies, and sound localization using interaural level differences. In contrast, the disrupted neural activity significantly impairs timing-related perception, such as pitch discrimination at low frequencies, temporal integration, gap detection, temporal modulation detection, backward and forward masking, signal detection in noise, binaural beats, and sound localization using interaural time differences. These perceptual consequences are the opposite of what is typically observed in cochlear-impaired subjects, who have impaired intensity perception but relatively normal temporal processing after taking their impaired intensity perception into account. These differences in perceptual consequences between auditory neuropathy and cochlear damage suggest the use of different neural codes in auditory perception: a suboptimal spike count code for intensity processing, a synchronized spike code for temporal processing, and a duplex code for frequency processing. We also proposed two underlying physiological models based on desynchronized and reduced discharge in the auditory nerve to successfully account for the observed neurological and behavioral data. These methods and measures cannot differentiate between these two AN models, but future studies using electric stimulation of the auditory nerve via a cochlear implant might. These results not only show the unique contribution of neural synchrony to sensory perception but also provide guidance for translational research in terms of better diagnosis and management of human communication disorders.

  8. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  9. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    2016-12-01

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.

  10. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.

  11. Radiative and temperature effects of aerosol simulated by the COSMO-Ru model for different atmospheric conditions and their testing against ground-based measurements and accurate RT simulations

    NASA Astrophysics Data System (ADS)

    Chubarova, Nataly; Poliukhov, Alexei; Shatunova, Marina; Rivin, Gdali; Becker, Ralf; Muskatel, Harel; Blahak, Ulrich; Kinne, Stefan; Tarasova, Tatiana

    2017-04-01

    We use the operational Russian COSMO-Ru weather forecast model (Ritter and Geleyn, 1992) with different aerosol input data for the evaluation of radiative and temperature effects of aerosol in different atmospheric conditions. Various aerosol datasets were utilized, including the Tegen climatology (Tegen et al., 1997), the updated MACv2 climatology (Kinne et al., 2013), the Tanre climatology (Tanre et al., 1984) as well as the MACC data (Morcrette et al., 2009). For clear sky conditions we compare the radiative effects from the COSMO-Ru model over Moscow (55.7N, 37.5E) and the Lindenberg/Falkenberg sites (52.2N, 14.1E) with the results obtained using long-term aerosol measurements. Additional tests of the COSMO RT code were performed against the (FC05)-SW model (Tarasova and Fomin, 2007). An overestimation of about 5-8% by the COSMO RT code was found. The study of the aerosol effect on temperature at 2 m revealed a sensitivity of about 0.7-1.1 degrees C per 100 W/m2 change in shortwave net radiation due to aerosol variations. We also discuss the radiative impact of urban aerosol properties according to the long-term AERONET measurements in Moscow and a Moscow suburb, as well as long-term aerosol trends over Moscow from the measurements and the MACv2 dataset. References: Kinne, S., O'Donnel, D., Stier, P., et al., J. Adv. Model. Earth Syst., 5, 704-740, 2013. Morcrette, J.-J., Boucher, O., Jones, L., et al., J. Geophys. Res., 114, D06206, doi:10.1029/2008JD011235, 2009. Ritter, B. and Geleyn, J., Monthly Weather Review, 120, 303-325, 1992. Tanre, D., Geleyn, J., and Slingo, J., A. Deepak Publ., Hampton, Virginia, 133-177, 1984. Tarasova, T., and Fomin, B., Journal of Atmospheric and Oceanic Technology, 24, 1157-1162, 2007. Tegen, I., Hollrig, P., Chin, M., et al., Journal of Geophysical Research-Atmospheres, 102, 23895-23915, 1997.

  12. 25 CFR 11.1212 - Consequences of disobedience or interference.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Consequences of disobedience or interference. 11.1212 Section 11.1212 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW AND ORDER COURTS OF INDIAN OFFENSES AND LAW AND ORDER CODE Child Protection and Domestic Violence Procedures § 11.1212...

  13. 25 CFR 11.1212 - Consequences of disobedience or interference.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Consequences of disobedience or interference. 11.1212 Section 11.1212 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW AND ORDER COURTS OF INDIAN OFFENSES AND LAW AND ORDER CODE Child Protection and Domestic Violence Procedures § 11.1212...

  14. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing, and specifically automated testing, arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  15. Multiyear applications of WRF/Chem over continental U.S.: Model evaluation, variation trend, and impacts of boundary conditions

    NASA Astrophysics Data System (ADS)

    Yahya, Khairunnisa; He, Jian; Zhang, Yang

    2015-12-01

    Multiyear applications of an online-coupled meteorology-chemistry model allow an assessment of the variation trends in simulated meteorology, air quality, and their interactions in response to changes in emissions and meteorology, as well as the impacts of initial and boundary conditions (ICONs/BCONs) on simulated aerosol-cloud-radiation interactions over a period of time. In this work, the Weather Research and Forecasting model with Chemistry version 3.4.1 (WRF/Chem v. 3.4.1) with the 2005 Carbon Bond mechanism coupled with the Volatility Basis Set module for secondary organic aerosol formation (WRF/Chem-CB05-VBS) is applied for multiple years (2001, 2006, and 2010) over the continental U.S. This work also examines the changes in simulated air quality and meteorology due to changes in emissions and meteorology and the model's capability in reproducing the observed variation trends in species concentrations from 2001 to 2010. In addition, the impacts of the chemical ICONs/BCONs on model predictions are analyzed. ICONs/BCONs are downscaled from two global models, the modified Community Earth System Model/Community Atmosphere model version 5.1 (CESM/CAM v5.1) and the Monitoring Atmospheric Composition and Climate model (MACC). The evaluation of WRF/Chem-CB05-VBS simulations with the CESM ICONs/BCONs for 2001, 2006, and 2010 shows that temperature at 2 m (T2) is underpredicted for all three years, likely due to inaccuracies in soil moisture and soil temperature, resulting in biases in surface relative humidity, wind speed, and precipitation. With the exception of cloud fraction, other aerosol-cloud variables including aerosol optical depth, cloud droplet number concentration, and cloud optical thickness are underpredicted for all three years, resulting in overpredictions of radiation variables. The model performs well for O3 and particulate matter with aerodynamic diameter less than or equal to 2.5 μm (PM2.5) for all three years, comparable to other studies in the literature. The model is able to reproduce observed annual average trends in O3 and PM2.5 concentrations from 2001 to 2006 and from 2006 to 2010 but is less skillful in simulating their observed seasonal trends. The 2006 and 2010 results using CESM and MACC ICONs/BCONs are compared to analyze the impact of ICONs/BCONs on model performance and their feedbacks to aerosol, clouds, and radiation. Compared to the simulations with MACC ICONs/BCONs, the simulations with the CESM ICONs/BCONs improve the performance of O3 mixing ratios (e.g., the normalized mean bias for maximum 8 h O3 is reduced from -17% to -1% in 2010), PM2.5 in 2010, and sulfate in 2006 (despite a slightly larger normalized mean bias for PM2.5 in 2006). The impacts of different ICONs/BCONs on simulated aerosol-cloud-radiation variables are not negligible, with larger impacts in 2006 compared to 2010.
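    The evaluation above relies on standard statistical metrics such as the normalized mean bias (NMB), defined as NMB = 100% x sum(model - obs) / sum(obs). A minimal sketch of that metric follows; the paired values are invented for illustration and are not WRF/Chem output.

        import numpy as np

        def normalized_mean_bias(model, obs):
            """NMB (%) = 100 * sum(model - obs) / sum(obs), a standard air-quality metric."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            return 100.0 * (model - obs).sum() / obs.sum()

        # Illustrative paired values (ppb) for maximum 8-h O3 at a few sites; not model output.
        obs   = [52.0, 61.0, 48.0, 70.0, 55.0]
        model = [50.0, 55.0, 47.0, 62.0, 54.0]
        print(f"NMB = {normalized_mean_bias(model, obs):.1f}%")   # negative => underprediction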

  16. Summary of evidence for an anticodonic basis for the origin of the genetic code

    NASA Technical Reports Server (NTRS)

    Lacey, J. C., Jr.; Mullins, D. W., Jr.

    1981-01-01

    This article summarizes data supporting the hypothesis that the origin of the genetic code was based on relationships (probably affinities) between amino acids and their anticodon nucleotides. Selective activation seems to follow from selective affinity and, consequently, incorporation of amino acids into peptides can also be selective. It is suggested that these selectivities in affinity and activation, coupled with base pairing specificities, allowed the origin of the code and the process of translation.

  17. A low-complexity and high performance concatenated coding scheme for high-speed satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Rhee, Dojun; Rajpal, Sandeep

    1993-01-01

    This report presents a low-complexity and high performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2^8) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme, in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and d_free = 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
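    The inner code above, the second-order Reed-Muller code RM(2,5), has length 32, dimension 16 and minimum Hamming distance 8. The sketch below builds its generator matrix from evaluations of Boolean monomials of degree at most two and verifies the distance by exhaustive search; the soft-decision Viterbi (trellis) decoding and the RS outer code described in the report are not shown.

        import itertools
        import numpy as np

        m, r = 5, 2                                           # RM(2,5): length 32, dimension 16, d = 8
        points = list(itertools.product((0, 1), repeat=m))    # all 2^m evaluation points

        # Generator matrix rows = evaluations of Boolean monomials of degree <= r.
        rows = []
        for deg in range(r + 1):
            for subset in itertools.combinations(range(m), deg):
                rows.append([int(all(p[i] for i in subset)) for p in points])
        G = np.array(rows, dtype=np.uint8)                    # 16 x 32

        def encode(msg_bits):
            """Encode 16 message bits into a 32-bit RM(2,5) codeword (mod-2 arithmetic)."""
            return (np.asarray(msg_bits, dtype=np.uint8) @ G) % 2

        # Exhaustive check of the minimum Hamming weight over all nonzero messages.
        min_wt = min(int(encode([(i >> j) & 1 for j in range(16)]).sum())
                     for i in range(1, 1 << 16))
        print(G.shape, min_wt)                                # (16, 32) 8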

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  19. Mechanisms and consequences of alternative polyadenylation

    PubMed Central

    Di Giammartino, Dafne Campigli; Nishida, Kensei; Manley, James L.

    2011-01-01

    Alternative polyadenylation (APA) is emerging as a widespread mechanism used to control gene expression. Like alternative splicing, usage of alternative poly(A) sites allows a single gene to encode multiple mRNA transcripts. In some cases, this changes the mRNA coding potential; in other cases, the code remains unchanged but the 3’UTR length is altered, influencing the fate of mRNAs in several ways, for example, by altering the availability of RNA binding protein sites and microRNA binding sites. The mechanisms governing both global and gene-specific APA are only starting to be deciphered. Here we review what is known about these mechanisms and the functional consequences of alternative polyadenylation. PMID:21925375

  20. Ancient DNA sequence revealed by error-correcting codes.

    PubMed

    Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo

    2015-07-10

    A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.

  1. Ancient DNA sequence revealed by error-correcting codes

    PubMed Central

    Brandão, Marcelo M.; Spoladore, Larissa; Faria, Luzinete C. B.; Rocha, Andréa S. L.; Silva-Filho, Marcio C.; Palazzo, Reginaldo

    2015-01-01

    A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code. PMID:26159228

  2. Violence and its injury consequences in American movies

    PubMed Central

    McArthur, David L; Peek-Asa, Corinne; Webb, Theresa; Fisher, Kevin; Cook, Bernard; Browne, Nick; Kraus, Jess

    2000-01-01

    Objectives: To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Methods: Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results: The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions: Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10986175

  3. Violence and its injury consequences in American movies: a public health perspective.

    PubMed

    McArthur, D L; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J

    2000-09-01

    To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.

  4. Dual Coding Theory and Computer Education: Some Media Experiments To Examine the Effects of Different Media on Learning.

    ERIC Educational Resources Information Center

    Alty, James L.

    Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…

  5. An Exploratory Study of the Impact of Self-Efficacy and Learning Engagement in Coding Learning Activities in Italian Middle School

    ERIC Educational Resources Information Center

    Banzato, Monica; Tosato, Paolo

    2017-01-01

    In Italy, teaching coding at primary and secondary levels is emerging as a major educational issue, particularly in light of the recent reforms now being implemented. Consequently, there has been increased research on how to introduce information technology in lower secondary schools. This paper presents an exploratory survey, carried out through…

  6. Weighted SAW reflector gratings for orthogonal frequency coded SAW tags and sensors

    NASA Technical Reports Server (NTRS)

    Puccio, Derek (Inventor); Malocha, Donald (Inventor)

    2011-01-01

    Weighted surface acoustic wave reflector gratings for coding identification tags and sensors to enable unique sensor operation and identification for a multi-sensor environment. In an embodiment, the weighted reflectors are variable while in another embodiment the reflector gratings are apodized. The weighting technique allows the designer to decrease reflectivity and allows for more chips to be implemented in a device and, consequently, more coding diversity. As a result, more tags and sensors can be implemented using a given bandwidth when compared with uniform reflectors. Use of weighted reflector gratings with OFC makes various phase shifting schemes possible, such as in-phase and quadrature implementations of coded waveforms resulting in reduced device size and increased coding.
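
    As a rough numerical illustration of the orthogonal frequency coding (OFC) idea referred to above, and not of the patented reflector design itself, chips whose carrier frequencies are spaced by the inverse of the chip duration are mutually orthogonal over that duration, which is what lets several coded chips share a band. The chip duration, sampling rate and frequencies below are arbitrary values chosen for the sketch.

        import numpy as np

        # Toy orthogonality check for OFC chips: carriers spaced by 1/tau are
        # orthogonal over a chip of duration tau.
        tau = 1.0e-6                       # chip duration (s), arbitrary
        fs = 200.0e6                       # sampling rate (Hz), arbitrary
        t = np.arange(0.0, tau, 1.0 / fs)

        f1 = 10.0e6                        # 10 carrier cycles per chip
        f2 = f1 + 1.0 / tau                # next orthogonal carrier (11 cycles per chip)

        chip1 = np.cos(2.0 * np.pi * f1 * t)
        chip2 = np.cos(2.0 * np.pi * f2 * t)

        print("auto :", np.trapz(chip1 * chip1, t))   # ~ tau/2
        print("cross:", np.trapz(chip1 * chip2, t))   # ~ 0 (orthogonal chips)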

  7. Spatial Disorientation in Military Vehicles: Causes, Consequences and Cures (Désorientation spatiale dans les véhicules militaires: causes, conséquences et remèdes)

    DTIC Science & Technology

    2003-02-01

    service warfighters (Training devices and protocols, Onboard equipment, Cognitive and sensorimotor aids, Visual and auditory symbology, Peripheral visual...vestibular stimulation causing a decrease in cerebral blood pressure with the consequent reduction in G-tolerance and increased likelihood of ALOC or GLOC...tactile stimulators (e.g. one providing a sensation of movement) or of displays with a more complex coding (e.g. by increase in the number of tactors, or

  8. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

    According to the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a brand new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Due to the energy coding model's ability to reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.

  9. Energy coding in biological neural networks

    PubMed Central

    Zhang, Zhikang

    2007-01-01

    According to the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a brand new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Due to the energy coding model’s ability to reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because our theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513

  10. Deconstructing processing speed deficits in schizophrenia: application of a parametric digit symbol coding test.

    PubMed

    Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C

    2010-05-01

    Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  11. Assessment of discrepancies between bottom-up and regional emission inventories in Norwegian urban areas

    NASA Astrophysics Data System (ADS)

    López-Aparicio, Susana; Guevara, Marc; Thunis, Philippe; Cuvelier, Kees; Tarrasón, Leonor

    2017-04-01

    This study shows the capabilities of a benchmarking system to identify inconsistencies in emission inventories, and to evaluate the reasons behind discrepancies as a means to improve both bottom-up and downscaled emission inventories. Fine-scale bottom-up emission inventories for seven urban areas in Norway are compared with three regional emission inventories, EC4MACS, TNO_MACC-II and TNO_MACC-III, downscaled to the same areas. The comparison shows discrepancies in nitrogen oxides (NOx) and particulate matter (PM2.5 and PM10) when evaluating both total and sectorial emissions. The three regional emission inventories underestimate NOx and PM10 traffic emissions by approximately 20-80% and 50-90%, respectively. The main reasons for the underestimation of PM10 emissions from traffic in the regional inventories are related to non-exhaust emissions due to resuspension, which are included in the bottom-up emission inventories but are missing in the official national emissions, and therefore in the downscaled regional inventories. The benchmarking indicates that the most probable reason behind the underestimation of NOx traffic emissions by the regional inventories is the activity data. The fine-scale NOx traffic emissions from bottom-up inventories are based on the actual traffic volume at the road link and are much higher than the NOx emissions downscaled from national estimates based on fuel sales and based on population for the urban areas. We have identified important discrepancies in PM2.5 emissions from wood burning for residential heating among all the inventories. These discrepancies are associated with the assumptions made for the allocation of emissions. In the EC4MACS inventory, such assumptions imply a high underestimation of PM2.5 emissions from the residential combustion sector in urban areas, which ranges from 40 to 90% compared with the bottom-up inventories. The study shows that in three of the seven Norwegian cities there is a need for further improvement of the emission inventories.
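
    As a small arithmetic aid to the comparison above, the per-pollutant discrepancy between a downscaled regional inventory and a fine-scale bottom-up inventory can be expressed as a relative underestimation with the bottom-up total taken as the reference (one common convention, assumed here; the emission totals below are invented placeholders, not values from the study).

        # Percent by which a downscaled inventory falls short of a bottom-up inventory.
        # The emission totals are invented placeholders, not values from the study.
        def underestimation_percent(bottom_up, downscaled):
            return 100.0 * (bottom_up - downscaled) / bottom_up

        print(underestimation_percent(bottom_up=1000.0, downscaled=400.0))  # -> 60.0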

  12. Stent Thrombosis in Drug-Eluting or Bare-Metal Stents in Patients Receiving Dual Antiplatelet Therapy.

    PubMed

    Kereiakes, Dean J; Yeh, Robert W; Massaro, Joseph M; Driscoll-Shempp, Priscilla; Cutlip, Donald E; Steg, P Gabriel; Gershlick, Anthony H; Darius, Harald; Meredith, Ian T; Ormiston, John; Tanguay, Jean-François; Windecker, Stephan; Garratt, Kirk N; Kandzari, David E; Lee, David P; Simon, Daniel I; Iancu, Adrian Corneliu; Trebacz, Jaroslaw; Mauri, Laura

    2015-10-01

    This study sought to compare rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE) (composite of death, myocardial infarction, or stroke) after coronary stenting with drug-eluting stents (DES) versus bare-metal stents (BMS) in patients who participated in the DAPT (Dual Antiplatelet Therapy) study, an international multicenter randomized trial comparing 30 versus 12 months of dual antiplatelet therapy in subjects undergoing coronary stenting with either DES or BMS. Despite the antirestenotic efficacy of coronary DES compared with BMS, the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Many clinicians perceive BMS to be associated with fewer adverse ischemic events and to require shorter-duration dual antiplatelet therapy than DES. A prospective propensity-matched analysis of subjects enrolled into a randomized trial of dual antiplatelet therapy duration was performed. DES- and BMS-treated subjects were propensity-score matched in a many-to-one fashion. The study design was observational for all subjects 0 to 12 months following stenting. A subset of eligible subjects without major ischemic or bleeding events was randomized at 12 months to continued thienopyridine versus placebo; all subjects were followed through 33 months. Among 10,026 propensity-matched subjects, DES-treated subjects (n = 8,308) had a lower rate of stent thrombosis through 33 months compared with BMS-treated subjects (n = 1,718, 1.7% vs. 2.6%; weighted risk difference -1.1%, p = 0.01) and a noninferior rate of MACCE (11.4% vs. 13.2%, respectively, weighted risk difference -1.8%, p = 0.053, noninferiority p < 0.001). DES-treated subjects have long-term rates of stent thrombosis that are lower than those of BMS-treated subjects. (The Dual Antiplatelet Therapy Study [DAPT study]; NCT00977938). Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
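
    The record above states that DES- and BMS-treated subjects were propensity-score matched in a many-to-one fashion. The sketch below shows that general technique (a logistic-regression propensity model followed by nearest-neighbour matching on the score) on simulated data; it is not the DAPT study's actual matching specification, covariate set or matching ratio.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Simulated covariates (age, diabetes flag) and treatment indicator (True = DES).
        n = 2000
        X = np.column_stack([rng.normal(65, 10, n), rng.integers(0, 2, n)])
        treated = rng.random(n) < 1.0 / (1.0 + np.exp(-(0.03 * (X[:, 0] - 65) + 0.5 * X[:, 1])))

        # Propensity score = estimated P(treatment | covariates).
        ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

        # Many-to-one matching: each control gets the k treated subjects with the
        # closest propensity scores (with replacement, for brevity).
        k = 4
        t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
        matches = {c: t_idx[np.argsort(np.abs(ps[t_idx] - ps[c]))[:k]] for c in c_idx}
        print(len(matches), "controls matched to", k, "treated subjects each")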

  13. Relationship between CHA2DS2-VASc score, coronary artery disease severity, residual platelet reactivity and long-term clinical outcomes in patients with acute coronary syndrome.

    PubMed

    Scudiero, Fernando; Zocchi, Chiara; De Vito, Elena; Tarantini, Giuseppe; Marcucci, Rossella; Valenti, Renato; Migliorini, Angela; Antoniucci, David; Marchionni, Niccolò; Parodi, Guido

    2018-07-01

    The CHA2DS2-VASc score predicts stroke risk in patients with atrial fibrillation, but recently has been reported to have a prognostic role even in patients with ACS. We sought to assess the ability of the CHA2DS2-VASc score to predict the severity of coronary artery disease, high residual platelet reactivity and long-term outcomes in patients with acute coronary syndrome (ACS). Overall, 1729 consecutive patients with ACS undergoing invasive management were included in this prospective registry. We assessed platelet reactivity via light transmittance aggregometry after clopidogrel loading. Patients were divided according to the CHA2DS2-VASc score: group A = 0, B = 1, C = 2, D = 3, E = 4 and F ≥ 5. Patients with higher CHA2DS2-VASc score were more likely to have a higher rate of multivessel CAD (37%, 47%, 55%, 62%, 67% and 75% in Group A, B, C, D, E and F; p < 0.001); moreover, CHA2DS2-VASc score correlated linearly with residual platelet reactivity (R = 0.77; p < 0.001). At long-term follow-up, estimated adverse event rates (MACCE: cardiac death, MI, stroke or any urgent coronary revascularization) were 3%, 8%, 10%, 14%, 19% and 24% in group A, B, C, D, E and F; p < 0.001. Multivariable analysis demonstrated CHA2DS2-VASc to be an independent predictor of severity of coronary artery disease, of high residual platelet reactivity and of MACCE. In a cohort of patients with ACS, CHA2DS2-VASc score correlated with coronary disease severity and residual platelet reactivity, and therefore it predicted the risk of long-term adverse events.
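
    For reference, the CHA2DS2-VASc score used above is a simple additive tally of standard risk factors (congestive heart failure, hypertension, age, diabetes, prior stroke/TIA/thromboembolism, vascular disease and female sex). The helper below implements that published scoring scheme generically; it is not code from the registry itself.

        def cha2ds2_vasc(chf, hypertension, age, diabetes, stroke_tia, vascular, female):
            """Standard CHA2DS2-VASc tally: 2 points for age >= 75 or prior stroke/TIA,
            1 point for each remaining factor (age 65-74 counts as 1)."""
            score = 0
            score += 1 if chf else 0            # C: congestive heart failure / LV dysfunction
            score += 1 if hypertension else 0   # H: hypertension
            score += 2 if age >= 75 else (1 if age >= 65 else 0)   # A2 / A
            score += 1 if diabetes else 0       # D: diabetes mellitus
            score += 2 if stroke_tia else 0     # S2: prior stroke / TIA / thromboembolism
            score += 1 if vascular else 0       # V: vascular disease (MI, PAD, aortic plaque)
            score += 1 if female else 0         # Sc: sex category (female)
            return score

        # A 70-year-old woman with hypertension and diabetes scores 4 (group E above).
        print(cha2ds2_vasc(False, True, 70, True, False, False, True))   # -> 4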

  14. Coronary Artery Bypass Grafting in Diabetic Patients: Complete Arterial versus Internal Thoracic Artery and Sequential Vein Grafts-A Propensity-Score Matched Analysis.

    PubMed

    Kunihara, Takashi; Wendler, Olaf; Heinrich, Kerstin; Nomura, Ryota; Schäfers, Hans-Joachim

    2018-06-20

    The optimal choice of conduit and configuration for coronary artery bypass grafting (CABG) in diabetic patients remains somewhat controversial, even though arterial grafts have been proposed as superior. We attempted to clarify the role of complete arterial revascularization using the left internal thoracic artery (LITA) and the radial artery (RA) alone in "T-Graft" configuration on long-term outcome. From 1994 to 2001, 104 diabetic patients with triple vessel disease underwent CABG using LITA/RA "T-Grafts" (Group-A). Using propensity-score matching, 104 patients with comparable preoperative characteristics who underwent CABG using LITA and one sequential vein graft were identified (Group-V). Freedom from all causes of death, cardiac death, major adverse cardiac event (MACE), major adverse cardiac (and cerebral) event (MACCE), and repeat revascularization at 10 years in Group-A was 60 ± 5%, 67 ± 5%, 48 ± 5%, 37 ± 5%, and 81 ± 4%, respectively, compared with 58 ± 5%, 70 ± 5%, 49 ± 5%, 39 ± 5%, and 93 ± 3% in Group-V. There were no significant differences in these end points between groups regardless of insulin-dependency. A multivariable Cox proportional hazards model identified age, left ventricular ejection fraction, renal failure, and hyperlipidemia as independent predictors of all-cause death, age and left ventricular ejection fraction for cardiac death, sinus rhythm for both MACE and MACCE, and prior percutaneous coronary intervention for re-revascularization. In our experience, complete arterial revascularization using LITA/RA "T-Grafts" does not provide superior long-term clinical benefits for diabetic patients compared with a combination of LITA and sequential vein graft. Georg Thieme Verlag KG Stuttgart · New York.

  15. Comparison of coronary bypass surgery with drug-eluting stenting for the treatment of left main and/or three-vessel disease: 3-year follow-up of the SYNTAX trial.

    PubMed

    Kappetein, Arie Pieter; Feldman, Ted E; Mack, Michael J; Morice, Marie-Claude; Holmes, David R; Ståhle, Elisabeth; Dawkins, Keith D; Mohr, Friedrich W; Serruys, Patrick W; Colombo, Antonio

    2011-09-01

    Long-term randomized comparisons of percutaneous coronary intervention (PCI) to coronary artery bypass grafting (CABG) in left main coronary (LM) disease and/or three-vessel disease (3VD) patients have been limited. This analysis compares 3-year outcomes in LM and/or 3VD patients treated with CABG or PCI with TAXUS Express stents. SYNTAX is an 85-centre randomized clinical trial (n = 1800). Prospectively screened, consecutive LM and/or 3VD patients were randomized if amenable to equivalent revascularization using either technique; if not, they were entered into a registry. Patients in the randomized cohort will continue to be followed for 5 years. At 3 years, major adverse cardiac and cerebrovascular events [MACCE: death, stroke, myocardial infarction (MI), and repeat revascularization; CABG 20.2% vs. PCI 28.0%, P < 0.001], repeat revascularization (10.7 vs. 19.7%, P < 0.001), and MI (3.6 vs. 7.1%, P = 0.002) were elevated in the PCI arm. Rates of the composite safety endpoint (death/stroke/MI 12.0 vs. 14.1%, P = 0.21) and stroke alone (3.4 vs. 2.0%, P = 0.07) were not significantly different between treatment groups. Major adverse cardiac and cerebrovascular event rates were not significantly different between arms in the LM subgroup (22.3 vs. 26.8%, P = 0.20) but were higher with PCI in the 3VD subgroup (18.8 vs. 28.8%, P < 0.001). At 3 years, MACCE was significantly higher in PCI- compared with CABG-treated patients. In patients with less complex disease (low SYNTAX scores for 3VD or low/intermediate terciles for LM patients), PCI is an acceptable revascularization option, although longer follow-up is needed to evaluate these two revascularization strategies.

  16. Impact of aspirin resistance on outcomes among patients following coronary artery bypass grafting: exploratory analysis from randomized controlled trial (NCT01159639).

    PubMed

    Petricevic, Mate; Kopjar, Tomislav; Gasparovic, Hrvoje; Milicic, Davor; Svetina, Lucija; Zdilar, Boris; Boban, Marko; Mihaljevic, Martina Zrno; Biocina, Bojan

    2015-05-01

    Individual variability in the response to aspirin has been established by various platelet function assays; however, the clinical relevance of aspirin resistance (AR) in patients undergoing coronary artery bypass grafting (CABG) remains to be evaluated. Our working group conducted a randomized controlled trial (NCT01159639) to assess the impact of dual antiplatelet therapy (APT) on outcomes among patients with AR following CABG. Patients who were aspirin resistant on the fourth postoperative day (POD 4) were randomly assigned to receive either dual APT with clopidogrel (75 mg) plus aspirin (300 mg) (intervention arm) or monotherapy with aspirin (300 mg) (control arm). This exploratory analysis compares clinical outcomes between aspirin resistant patients allocated to the control arm and patients who had an adequate platelet inhibitory response to aspirin at POD 4. Both groups were treated with 300 mg of aspirin per day following surgery. We sought to evaluate the impact of early postoperative AR on outcomes among patients following CABG. The exploratory analysis included a total of 325 patients: 215 patients with an adequate response to aspirin and 110 patients with AR allocated to aspirin monotherapy under the randomization protocol. The primary efficacy end point (MACCEs, major adverse cardiac and cerebrovascular events) occurred in 10% and 6% of patients with AR and with adequate aspirin response, respectively (p = 0.27). No significant differences were observed in the occurrence of bleeding events. Subgroup analysis of the primary end point revealed that aspirin resistant patients with BMI > 30 kg/m² tended to have a higher occurrence of MACCEs, 18% versus 5% (relative risk 0.44 [95% CI 0.16-1.16]; p = 0.05). This exploratory analysis did not reveal a significant impact of aspirin resistance on outcomes among patients undergoing CABG. Further, sufficiently powered studies are needed to evaluate the clinical relevance of AR in patients undergoing CABG.

  17. Preoperative platelet transfusions to reverse antiplatelet therapy for urgent non-cardiac surgery: an observational cohort study.

    PubMed

    Baschin, M; Selleng, S; Hummel, A; Diedrich, S; Schroeder, H W; Kohlmann, T; Westphal, A; Greinacher, A; Thiele, T

    2018-04-01

    Essentials: An increasing number of patients requiring surgery receive antiplatelet therapy (APT). We analyzed 181 patients receiving presurgery platelet transfusions to reverse APT. No coronary thrombosis occurred after platelet transfusion. This justifies a prospective trial to test preoperative platelet transfusions to reverse APT. Background: Patients receiving antiplatelet therapy (APT) have an increased risk of perioperative bleeding and cardiac adverse events (CAE). Preoperative platelet transfusions may reduce the bleeding risk but may also increase the risk of CAE, particularly coronary thrombosis in patients after recent stent implantation. Objectives: To analyze the incidence of perioperative CAE and bleeding in patients undergoing non-cardiac surgery using a standardized management of transfusing two platelet concentrates preoperatively and restart of APT within 24-72 h after surgery. Methods: A cohort of consecutive patients on APT treated with two platelet concentrates before non-cardiac surgery between January 2012 and December 2014 was retrospectively identified. Patients were stratified by the risk of major adverse cardiac and cerebrovascular events (MACCE). The primary objective was the incidence of CAE (myocardial infarction, acute heart failure and cardiac troponin T increase). Secondary objectives were incidences of other thromboembolic events, bleeding, transfusions and mortality. Results: Among 181 patients, 88 received aspirin, 21 clopidogrel and 72 dual APT. MACCE risk was high in 63, moderate in 103 and low in 15 patients; 67 had cardiac stents. Ten patients (5.5%; 95% CI, 3.0-9.9%) developed a CAE (three myocardial infarctions, four cardiac failures and three troponin T increases). None was caused by coronary thrombosis. Surgery-related bleeding occurred in 22 patients (12.2%; 95% CI, 8.2-17.7%), making 12 re-interventions necessary (6.6%; 95% CI, 3.8-11.2%). Conclusion: Preoperative platelet transfusions and early restart of APT allowed urgent surgery and did not cause coronary thromboses, but non-thrombotic CAEs and re-bleeding occurred. Randomized trials are warranted to test platelet transfusion against other management strategies. © 2018 International Society on Thrombosis and Haemostasis.
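
    The incidence figures quoted above (for example 10 of 181 patients, 5.5%, 95% CI 3.0-9.9%) are reproduced exactly by a Wilson score interval for a binomial proportion; the record does not state which interval the authors used, so the sketch below only shows that the numbers are consistent with this standard choice.

        from math import sqrt

        def wilson_ci(events, n, z=1.96):
            """Wilson score confidence interval for a binomial proportion."""
            p = events / n
            centre = p + z * z / (2 * n)
            half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
            denom = 1 + z * z / n
            return (centre - half) / denom, (centre + half) / denom

        low, high = wilson_ci(10, 181)
        print(f"10/181 = {10/181:.1%}, 95% CI {low:.1%}-{high:.1%}")   # -> 5.5%, 3.0%-9.9%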

  18. Effect of postprocedural full-dose infusion of bivalirudin on acute stent thrombosis in patients with ST-elevation myocardial infarction undergoing primary percutaneous coronary intervention: Outcomes in a large real-world population.

    PubMed

    Wang, Heyang; Liang, Zhenyang; Li, Yi; Li, Bin; Liu, Junming; Hong, Xueyi; Lu, Xin; Wu, Jiansheng; Zhao, Wei; Liu, Qiang; An, Jian; Li, Linfeng; Pu, Fanli; Ming, Qiang; Han, Yaling

    2017-06-01

    This study aimed to evaluate the effect of prolonged full-dose bivalirudin infusion in a real-world population with ST-elevation myocardial infarction (STEMI). Subgroup data as well as meta-analysis from randomized clinical trials have shown the potency of postprocedural full-dose infusion (1.75 mg/kg/h) of bivalirudin in attenuating acute stent thrombosis (ST) after primary percutaneous coronary intervention (PCI). In this multicenter retrospective observational study, 2047 consecutive STEMI patients treated with bivalirudin during primary PCI were enrolled in 65 Chinese centers between July 2013 and May 2016. The primary outcome was acute ST defined as ARC definite/probable within 24 hours after the index procedure, and the secondary endpoints included total ST, major adverse cardiac or cerebral events (MACCE, defined as death, reinfarction, stroke, and target vessel revascularization), and any bleeding at 30 days. Among 2047 STEMI patients, 1123 (54.9%) were treated with postprocedural bivalirudin full-dose infusion (median 120 minutes) while the other 924 (45.1%) received low-dose (0.25 mg/kg/h) or no postprocedural infusion. A total of three acute ST (0.3%) occurred in STEMI patients with no or low-dose prolonged infusion of bivalirudin, but none was observed in those treated with post-PCI full-dose infusion (0.3% vs 0.0%, P=.092). Outcomes on MACCE (2.1% vs 2.7%, P=.402) and total bleeding (2.1% vs 1.4%, P=.217) at 30 days showed no significant difference between the two groups, and no subacute ST was observed. Post-PCI full-dose bivalirudin infusion is safe and shows a trend toward protection against acute ST in STEMI patients undergoing primary PCI in real-world settings. © 2017 John Wiley & Sons Ltd.

  19. A 4-D Climatology (1979-2009) of the Monthly Tropospheric Aerosol Optical Depth Distribution over the Mediterranean Region from a Comparative Evaluation and Blending of Remote Sensing and Model Products

    NASA Technical Reports Server (NTRS)

    Nabat, P.; Somot, S.; Mallet, M.; Chiapello, I; Morcrette, J. J.; Solomon, F.; Szopa, S.; Dulac, F; Collins, W.; Ghan, S.; hide

    2013-01-01

    Since the 1980s several spaceborne sensors have been used to retrieve the aerosol optical depth (AOD) over the Mediterranean region. In parallel, AOD climatologies coming from different numerical model simulations are now also available, making it possible to distinguish the contribution of several aerosol types to the total AOD. In this work, we perform a comparative analysis of this unique multiyear database in terms of total AOD and of its apportionment by the five main aerosol types (soil dust, sea salt, sulfate, black and organic carbon). We use 9 different satellite-derived monthly AOD products: NOAA/AVHRR, SeaWiFS (2 products), TERRA/MISR, TERRA/MODIS, AQUA/MODIS, ENVISAT/MERIS, PARASOL/POLDER and MSG/SEVIRI, as well as 3 more historical datasets: NIMBUS7/CZCS, TOMS (onboard NIMBUS7 and Earth-Probe) and METEOSAT/MVIRI. Monthly model datasets include the aerosol climatology from Tegen et al. (1997), the climate-chemistry models LMDz-OR-INCA and RegCM-4, the multi-model mean coming from the ACCMIP exercise, and the reanalyses GEMS and MACC. Ground-based Level-2 AERONET AOD observations from 47 stations around the basin are used here to evaluate the model and satellite data. The sensor MODIS (on AQUA and TERRA) has the best average AOD scores over this region, showing a relevant spatio-temporal variability and highlighting high dust loads over Northern Africa and the sea (spring and summer), and sulfate aerosols over continental Europe (summer). The comparison also shows limitations of certain datasets (especially MERIS and SeaWiFS standard products). Models reproduce the main patterns of the AOD variability over the basin. The MACC reanalysis is the closest to AERONET data, but appears to underestimate dust over Northern Africa, where RegCM-4 is found closer to MODIS thanks to its interactive scheme for dust emissions. The vertical dimension is also investigated using the CALIOP instrument. This study confirms differences in vertical distribution between dust aerosols, which show a large vertical spread, and other continental and marine aerosols, which are confined to the boundary layer. From this compilation, we propose a 4-D blended product from model and satellite data, consisting of monthly time series of 3-D aerosol distribution at a 50 km horizontal resolution over the Euro-Mediterranean marine and continental region for the 2003-2009 period. The product is based on the total AOD from AQUA/MODIS, apportioned into sulfates, black and organic carbon from the MACC reanalysis, and into dust and sea-salt aerosols from RegCM-4 simulations, which are distributed vertically based on CALIOP climatology. We extend the 2003-2009 reconstruction to the past up to 1979 using the 2003-2009 average and applying the decreasing trend in sulfate aerosols from LMDz-OR-INCA, whose AOD trends over Europe and the Mediterranean are median among the ACCMIP models. Finally, optical properties of the different aerosol types in this region are proposed from Mie calculations so that this reconstruction can be included in regional climate models for aerosol radiative forcing and aerosol-climate studies.
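
    One plausible reading of the blending step described above (apportioning the MODIS total AOD into species using the model mixtures) is a simple fractional redistribution, sketched below with invented numbers; this is an illustration of the general idea, not the authors' actual merging procedure.

        # Apportion a satellite total AOD into species using a model's species fractions.
        # All AOD values are invented placeholders for illustration only.
        def apportion(total_aod_satellite, species_aod_model):
            model_total = sum(species_aod_model.values())
            return {s: total_aod_satellite * v / model_total
                    for s, v in species_aod_model.items()}

        model_mix = {"sulfate": 0.06, "organic_carbon": 0.03, "black_carbon": 0.01,
                     "dust": 0.08, "sea_salt": 0.02}       # model AOD by species
        print(apportion(0.25, model_mix))   # species AODs rescaled to the satellite total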

  20. Workflow for Integrating Mesoscale Heterogeneities in Materials Structure with Process Simulation of Titanium Alloys (Postprint)

    DTIC Science & Technology

    2014-10-01

    offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models...process/performance simulation codes (e.g., crystal plasticity finite element method). Subject terms: ICME; microstructure informatics; higher...(iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., crystal plasticity

  1. Lattice surgery on the Raussendorf lattice

    NASA Astrophysics Data System (ADS)

    Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco

    2018-07-01

    Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.

  2. Bilingual processing of ASL-English code-blends: The consequences of accessing two lexical representations simultaneously

    PubMed Central

    Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886

  3. Violence and its injury consequences in American movies: a public health perspective

    PubMed Central

    McArthur, D.; Peek-Asa, C.; Webb, T.; Fisher, K.; Cook, B.; Browne, N.; Kraus, J.

    2000-01-01

    Objectives—The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Methods—Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results—The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions—Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10875668

  4. Violence and its injury consequences in American movies: a public health perspective.

    PubMed

    McArthur, D; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J

    2000-06-01

    The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.

  5. Ultraviolet observations of the symbiotic star AS 296

    NASA Technical Reports Server (NTRS)

    Gutierrez-Moreno, A.; Moreno, H.; Feibelman, W. A.

    1992-01-01

    AS 296 is a well-known S-type symbiotic star which underwent an optical outburst during 1988. In this paper, UV data based on IUE observations obtained both during the quiescent and outburst stages are presented and discussed, correlating them to observations made in the optical region. It is concluded that the object is a symbiotic nova, in which the outburst is due to a thermonuclear runaway produced in the hydrogen-burning shell of a white dwarf with M of about 0.5 solar masses, accreting from the late-type giant at a rate M(acc) of about 9.7 x 10^-9 solar masses/year. It is not possible to determine from the observations if the hydrogen flash is degenerate or nondegenerate.

  6. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
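
    As context for the "mid-fidelity" label above, the motion of a single heaving wave energy converter in such time-domain codes is often written, in its simplest constant-coefficient form that neglects the radiation memory convolution, as an ordinary differential equation driven by hydrodynamic coefficients and a wave excitation force. The notation below is generic textbook notation, not taken from the WEC3 report:

        (m + A)\,\ddot{z}(t) + B\,\dot{z}(t) + C\,z(t) = F_{\mathrm{exc}}(t)

    where m is the body mass, A the added mass, B the radiation damping coefficient, C the hydrostatic stiffness, z the heave displacement, and F_exc the wave excitation force.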

  7. Prompt Radiation Protection Factors

    DTIC Science & Technology

    2018-02-01

    dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection factors (ratio of dose in the open to...radiation was performed using the three-dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection...by detonation of a nuclear device have placed renewed emphasis on evaluation of the consequences in case of such an event. The Defense Threat

  8. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  9. The moral code in Islam and organ donation in Western countries: reinterpreting religious scriptures to meet utilitarian medical objectives.

    PubMed

    Rady, Mohamed Y; Verheijde, Joseph L

    2014-06-02

    End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life.

  10. The moral code in Islam and organ donation in Western countries: reinterpreting religious scriptures to meet utilitarian medical objectives

    PubMed Central

    2014-01-01

    End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life. PMID:24888748

  11. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
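
    The notions of "definitions" and "uses" for initialization safety described above can be pictured with a deliberately tiny, stand-alone checker that scans straight-line assignments for variables read before they are written. It is only a toy analogue of the pattern-based analysis, not AutoCert, and the statement format is invented.

        import re

        # Toy initialization-safety check over straight-line statements of the form
        # "x = <expr>": the left-hand side is a definition, every identifier on the
        # right-hand side is a use. Real generated code (and AutoCert) are far richer.
        def check_init_safety(lines):
            defined, violations = set(), []
            for lineno, line in enumerate(lines, 1):
                lhs, rhs = (part.strip() for part in line.split("=", 1))
                for var in re.findall(r"[A-Za-z_]\w*", rhs):
                    if var not in defined:
                        violations.append((lineno, var))    # use before definition
                defined.add(lhs)
            return violations

        program = ["a = 1", "b = a + c", "c = b"]            # 'c' is read before it is set
        print(check_init_safety(program))                    # -> [(2, 'c')]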

  12. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    NASA Astrophysics Data System (ADS)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular P_l^m(x) and irregular Q_l^m(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments from Prof. James Bremer of the UC Davis Mathematics Department, who discovered errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
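
    For orientation, the regular and irregular associated Legendre functions computed by this code satisfy the same standard three-term recurrence in the degree l; this textbook relation is quoted here only as background, since the announcement does not state which recursion direction the revised code uses:

        (l - m + 1)\,P_{l+1}^{m}(x) = (2l + 1)\,x\,P_{l}^{m}(x) - (l + m)\,P_{l-1}^{m}(x),

    and the identical recurrence holds with P_l^m replaced by the irregular functions Q_l^m.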

  13. [How do first codes of medical ethics inspire contemporary physicians?].

    PubMed

    Paprocka-Lipińska, Anna; Basińska, Krystyna

    2014-02-01

    The first codes of medical ethics appeared between the 18th and 19th centuries. Their formation was inspired by changes in medicine that were positive in general but had some negative setbacks. Those negative consequences revealed the need to codify the ethical duties that had formerly been passed from generation to generation by word of mouth and by the individual example of master physicians. Two hundred and ten years have passed since the publication of "Medical Ethics" by Thomas Percival, yet the essential ethical guidelines remain the same. Similarly, ethical codes published in Poland in the 19th century can still be an inspiration to modern physicians.

  14. Illustration of Some Consequences of the Indistinguishability of Electrons

    ERIC Educational Resources Information Center

    Moore, John W.; Davies, William G.

    1976-01-01

    Discusses how color-coded overhead transparencies of computer-generated dot-density diagrams can be used to illustrate hybrid orbitals and the principle of the indistinguishability of electrons. (MLH)

  15. New French Regulation for NPPs and Code Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, Claude

    2006-07-01

    In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation first needs to be consistent with the non-nuclear PE regulation and adds to it some specific requirements, in particular radiation protection requirements. The proposal has several advantages: it is more qualitatively risk oriented, and it provides an important link with non-nuclear industry. Only a few components are nuclear specific. However, the general philosophy of the existing codes (RCC-M [15], KTA [16] or ASME [17]) has to be improved. For foreign codes, it is planned to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. The consequence is the need to cross-check all these specifications to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)

  16. Hauser-Feshbach calculations in deformed nuclei

    DOE PAGES

    Grimes, S. M.

    2013-08-22

    Hauser-Feshbach calculations for deformed nuclei are typically done with level densities appropriate for deformed nuclei but with Hauser-Feshbach codes which enforce spherical symmetry by not including K as a parameter in the decay sums. A code has been written which does allow the full K dependence to be included. Calculations with the code have been compared with those from a conventional Hauser-Feshbach code. The evaporation portion (continuum) is only slightly affected by this change but the cross sections to individual (resolved) levels are changed substantially. It is found that cross sections to neighboring levels with the same J but differing K are not the same. The predicted consequences of K mixing will also be discussed.

  17. High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.

    PubMed

    Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel

    2018-06-19

    Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative, relatively simple and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm² and 8 colors can encode as many as 10^45 combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or for nitrocellulose by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
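
    The coding capacity quoted above is consistent with elementary counting if each code element is assumed (an assumption made here, not stated in the record) to take one of the 8 colors independently, so that a code of N elements admits 8^N distinct patterns:

        8^{50} = 2^{150} \approx 1.4 \times 10^{45},

    so roughly 50 elements, i.e. about 1.6 mm² at 32 elements per mm², already suffice to exceed 10^45 combinations.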

  18. Blast and the Consequences on Traumatic Brain Injury-Multiscale Mechanical Modeling of Brain

    DTIC Science & Technology

    2011-02-17

    blast simulation. LS-DYNA as an explicit FE code has been employed to simulate this multi-material fluid-structure interaction problem. The 3-D head...formulation is implemented to model the air-blast simulation. ...Biomechanics Study of Influencing Parameters for Brain under Impact...The Impact of Cerebrospinal Fluid

  19. One decade of space-based isoprene emission estimates: Interannual variations and emission trends between 2005 and 2014

    NASA Astrophysics Data System (ADS)

    Bauwens, Maite; Stavrakou, Trissevgeni; Müller, Jean-François; De Smedt, Isabelle; Van Roozendael, Michel

    2016-04-01

    Isoprene is one of the most abundantly emitted hydrocarbons in the atmosphere, with global annual emissions estimated at about 500 Tg, but with large uncertainties (Arneth et al., 2011). Here we use the source inversion approach to derive top-down biogenic isoprene emission estimates for the period between 2005 and 2014 constrained by formaldehyde observations, a high-yield intermediate in the oxidation of isoprene in the atmosphere. Formaldehyde columns retrieved from the Ozone Monitoring Instrument (OMI) are used to constrain the IMAGESv2 global chemistry-transport model and its adjoint code (Stavrakou et al., 2009). The MEGAN-MOHYCAN isoprene emissions (Stavrakou et al., 2014) are used as the bottom-up inventory in the model. The inversions are performed separately for each year of the study period, and monthly emissions are derived for every model grid cell. The inversion results are compared to independent isoprene emissions from GUESS-ES (Arneth et al., 2007) and MEGAN-MACC (Sinderalova et al., 2014) and to top-down fluxes based on GOME-2 formaldehyde columns (Bauwens et al., 2014; Stavrakou et al., 2015). The mean global annual OMI-based isoprene flux for the period 2005-2014 is estimated to be 270 Tg, with small interannual variation. This estimate is on average 20% lower than the a priori inventory, but on the regional scale strong emission updates are inferred. The OMI-based emissions are substantially lower than the MEGAN-MACC and the GUESS-ES inventory, but agree well with the isoprene fluxes constrained by GOME-2 formaldehyde columns. Strong emission reductions are derived over tropical regions. The seasonal pattern of isoprene emissions is generally well preserved after inversion and relatively consistent with other inventories, lending confidence to the MEGAN parameterization of the a priori inventory. In boreal regions the isoprene emission trend is positive and reinforced after inversion, whereas the inversion suggests negative trends in the rainforests of Equatorial Africa and South America. The top-down isoprene fluxes are available at a resolution of 0.5° x 0.5° between 2005 and 2014 at the GlobEmission website (http://www.globemission.eu). References: Arneth, A., et al.: Process-based estimates of terrestrial ecosystem isoprene emissions: incorporating the effects of a direct CO2-isoprene interaction, Atmos. Chem. Phys., 7(1), 31-53, 2007. Arneth, A., et al.: Global terrestrial isoprene emission models: sensitivity to variability in climate and vegetation, Atmos. Chem. Phys., 11(15), 8037-8052, 2011. Bauwens, M., et al.: Satellite-based isoprene emission estimates (2007-2012) from the GlobEmission project, in ACCENT-Plus Symposium 2013 Proceedings, 2014. Stavrakou, T., et al.: Isoprene emissions over Asia 1979-2012: impact of climate and land-use changes, Atmos. Chem. Phys., 14(9), 4587-4605, doi:10.5194/acp-14-4587-2014, 2014. Stavrakou, T., et al.: How consistent are top-down hydrocarbon emissions based on formaldehyde observations from GOME-2 and OMI?, Atmos. Chem. Phys., 15(20), 11861-11884, doi:10.5194/acp-15-11861-2015, 2015. Stavrakou, T., et al.: Evaluating the performance of pyrogenic and biogenic emission inventories against one decade of space-based formaldehyde columns, Atmos. Chem. Phys., 9(3), 1037-1060, doi:10.5194/acp-9-1037-2009, 2009.

  20. Modeling emission lag after photoexcitation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Kevin L.; Petillo, John J.; Ovtchinnikov, Serguei

    A theoretical model of delayed emission following photoexcitation from metals and semiconductors is given. Its numerical implementation is designed for beam optics codes used to model photocathodes in rf photoinjectors. The model extends the Moments approach for predicting photocurrent and mean transverse energy as moments of an emitted electron distribution by incorporating time of flight and scattering events that result in emission delay on a sub-picosecond level. The model accounts for a dynamic surface extraction field and changes in the energy distribution and time of emission as a consequence of the laser penetration depth and multiple scattering events during transport. Usage in the Particle-in-Cell code MICHELLE to predict the bunch shape and duration with or without laser jitter is given. The consequences of delayed emission effects for ultra-short pulses are discussed.

  1. Modeling emission lag after photoexcitation

    DOE PAGES

    Jensen, Kevin L.; Petillo, John J.; Ovtchinnikov, Serguei; ...

    2017-10-28

    A theoretical model of delayed emission following photoexcitation from metals and semiconductors is given. Its numerical implementation is designed for beam optics codes used to model photocathodes in rf photoinjectors. The model extends the Moments approach for predicting photocurrent and mean transverse energy as moments of an emitted electron distribution by incorporating time of flight and scattering events that result in emission delay on a sub-picosecond level. The model accounts for a dynamic surface extraction field and changes in the energy distribution and time of emission as a consequence of the laser penetration depth and multiple scattering events during transport. Usage in the Particle-in-Cell code MICHELLE to predict the bunch shape and duration with or without laser jitter is given. The consequences of delayed emission effects for ultra-short pulses are discussed.

  2. Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    First, M.W.

    1991-02-01

    Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)

  3. Towards a complete map of the human long non-coding RNA transcriptome.

    PubMed

    Uszczynska-Ratajczak, Barbara; Lagarde, Julien; Frankish, Adam; Guigó, Roderic; Johnson, Rory

    2018-05-23

    Gene maps, or annotations, enable us to navigate the functional landscape of our genome. They are a resource upon which virtually all studies depend, from single-gene to genome-wide scales and from basic molecular biology to medical genetics. Yet present-day annotations suffer from trade-offs between quality and size, with serious but often unappreciated consequences for downstream studies. This is particularly true for long non-coding RNAs (lncRNAs), which are poorly characterized compared to protein-coding genes. Long-read sequencing technologies promise to improve current annotations, paving the way towards a complete annotation of lncRNAs expressed throughout a human lifetime.

  4. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

    A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, the code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only the hull girder modes are presented in this paper. Code requirements for the other modes will be presented in a future publication. A specific provision of the code will be a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are also considered. This paper provides a summary of the safety check expressions for the hull girder modes.
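
    A safety check expression of the partial-safety-factor type mentioned above can be written as a one-line comparison of factored demand against factored capacity. The sketch below uses invented factors and nominal values purely for illustration; the calibrated factors of the actual code are not reproduced here.

    ```python
    # Sketch of a partial-safety-factor hull-girder check (illustrative numbers only).
    def hull_girder_check(m_sw, m_wv, m_ultimate,
                          gamma_sw=1.05, gamma_wv=1.25, phi=0.85, psi_wv=1.0):
        """Return True if factored capacity exceeds the factored still-water + wave
        bending demand, with all design variables at their nominal values."""
        demand = gamma_sw * m_sw + gamma_wv * psi_wv * m_wv   # load combination factors
        capacity = phi * m_ultimate                           # resistance / consequence factor
        return capacity >= demand

    # Hypothetical nominal bending moments [MN*m]
    print(hull_girder_check(m_sw=900.0, m_wv=1400.0, m_ultimate=3400.0))
    ```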

  5. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    PubMed

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that an entirely correct primary diagnosis was assigned in 54 patients (54%), and only 7 patients (7%) had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2, p < 0.0001) and the correct procedure code (odds ratio 310.0, p < 0.0001). Using the proforma resulted in a £28,562 increase in revenue for the 100 patients evaluated relative to the income generated from the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.
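
    The proforma evaluation rests on a simple 2x2 odds-ratio comparison. The sketch below shows the arithmetic with a hypothetical proforma row (the routine-coding row uses the 54/46 split quoted above); it is not the study's dataset, and the Fisher exact test is only one reasonable choice of significance test.

    ```python
    # Odds-ratio sketch for proforma vs routine coding (hypothetical 2x2 counts).
    import numpy as np
    from scipy import stats

    # Rows: proforma, routine coding; columns: correct, incorrect diagnosis
    table = np.array([[90, 10],    # proforma counts (hypothetical)
                      [54, 46]])   # routine coding counts (from the abstract)

    odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
    _, p_value = stats.fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.1f}, Fisher exact p = {p_value:.2g}")
    ```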

  6. A 4-D Climatology (1979-2009) of the Monthly Tropospheric Aerosol Optical Depth Distribution over the Mediterranean Region from a Comparative Evaluation and Blending of Remote Sensing and Model Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nabat, P.; Somot, S.; Mallet, M.

    Since the 1980s several spaceborne sensors have been used to retrieve the aerosol optical depth (AOD) over the Mediterranean region. In parallel, AOD climatologies coming from different numerical model simulations are now also available, making it possible to distinguish the contribution of several aerosol types to the total AOD. In this work, we perform a comparative analysis of this unique multiyear database in terms of total AOD and of its apportionment by the five main aerosol types (soil dust, sea salt, sulfate, black and organic carbon). We use 9 different satellite-derived monthly AOD products: NOAA/AVHRR, SeaWiFS (2 products), TERRA/MISR, TERRA/MODIS, AQUA/MODIS, ENVISAT/MERIS, PARASOL/POLDER and MSG/SEVIRI, as well as 3 more historical datasets: NIMBUS7/CZCS, TOMS (onboard NIMBUS7 and Earth-Probe) and METEOSAT/MVIRI. Monthly model datasets include the aerosol climatology from Tegen et al. (1997), the climate-chemistry models LMDz-OR-INCA and RegCM-4, the multi-model mean coming from the ACCMIP exercise, and the reanalyses GEMS and MACC. Ground-based Level-2 AERONET AOD observations from 47 stations around the basin are used here to evaluate the model and satellite data. The sensor MODIS (on AQUA and TERRA) has the best average AOD scores over this region, showing a relevant spatiotemporal variability and highlighting high dust loads over Northern Africa and the sea (spring and summer), and sulfate aerosols over continental Europe (summer). The comparison also shows limitations of certain datasets (especially the MERIS and SeaWiFS standard products). Models reproduce the main patterns of the AOD variability over the basin. The MACC reanalysis is the closest to AERONET data, but appears to underestimate dust over Northern Africa, where RegCM-4 is found closer to MODIS thanks to its interactive scheme for dust emissions. The vertical dimension is also investigated using the CALIOP instrument. This study confirms differences of vertical distribution between dust aerosols, showing a large vertical spread, and other continental and marine aerosols, which are confined in the boundary layer. From this compilation, we propose a 4-D blended product from model and satellite data, consisting of monthly time series of the 3-D aerosol distribution at a 50 km horizontal resolution over the Euro-Mediterranean marine and continental region for the 2003-2009 period. The product is based on the total AOD from AQUA/MODIS, apportioned into sulfate, black and organic carbon from the MACC reanalysis, and into dust and sea-salt aerosols from RegCM-4 simulations, which are distributed vertically based on the CALIOP climatology. We extend the 2003-2009 reconstruction back to 1979 using the 2003-2009 average and applying the decreasing trend in sulfate aerosols from LMDz-OR-INCA, whose AOD trends over Europe and the Mediterranean are median among the ACCMIP models. Finally, optical properties of the different aerosol types in this region are proposed from Mie calculations so that this reconstruction can be included in regional climate models for aerosol radiative forcing and aerosol-climate studies.
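
    The blending step described above is essentially an apportionment of one total AOD by modelled species fractions. The sketch below shows that arithmetic for two hypothetical grid cells; the species values, and the reduction to a single 2-D field, are simplifications of the actual monthly, vertically resolved 50 km product.

    ```python
    # Sketch of the AOD blending arithmetic: MODIS total apportioned by model species.
    import numpy as np

    aod_modis_total = np.array([0.32, 0.18])     # total AOD at two example grid cells (hypothetical)
    # Species AOD from MACC (sulfate, BC, OC) and RegCM-4 (dust, sea salt), all hypothetical:
    species_model = {
        "sulfate": np.array([0.10, 0.06]),
        "black_carbon": np.array([0.01, 0.01]),
        "organic_carbon": np.array([0.04, 0.02]),
        "dust": np.array([0.12, 0.03]),
        "sea_salt": np.array([0.05, 0.04]),
    }

    model_total = sum(species_model.values())
    # Rescale each species so the blended components sum to the MODIS total AOD.
    blended = {name: aod * aod_modis_total / model_total for name, aod in species_model.items()}
    print({name: values.round(3) for name, values in blended.items()})
    ```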

  7. The Impact of Aerosol Sources and Aging on CCN Formation in the Houston-Galveston-Gulf of Mexico Region

    NASA Astrophysics Data System (ADS)

    Quinn, P.; Bates, T.; Coffman, D.; Covert, D.

    2007-12-01

    The impact of anthropogenic aerosol on cloud properties, cloud lifetime, and precipitation processes is one of the largest uncertainties in our current understanding of climate change. Aerosols affect cloud properties by serving as cloud condensation nuclei (CCN), thereby leading to the formation of cloud droplets. The process of cloud drop activation is a function of both the size and chemistry of the aerosol particles which, in turn, depend on the source of the aerosol and transformations that occur downwind. In situ field measurements that can lead to an improved understanding of the process of cloud drop formation and simplifying parameterizations for improving the accuracy of climate models are highly desirable. During the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS), the NOAA RV Ronald H. Brown encountered a wide variety of aerosol types ranging from marine near the Florida panhandle to urban and industrial in the Houston-Galveston area. These varied sources provided an opportunity to investigate the role of aerosol sources, aging, chemistry, and size in the activation of particles to form cloud droplets. Here, we use the correlation between variability in critical diameter for activation (determined empirically from measured CCN concentrations and the number size distribution) and aerosol composition to quantify the impact of composition on particle activation. Variability in aerosol composition is parameterized by the mass fraction of Hydrocarbon-like Organic Aerosol (HOA) for particle diameters less than 200 nm (vacuum aerodynamic). The HOA mass fraction in this size range is lowest for marine aerosol and higher for aerosol impacted by anthropogenic emissions. Combining all data collected at 0.44 percent supersaturation (SS) reveals that composition (defined in this way) explains 40 percent of the variance in the critical diameter. As expected, the dependence of activation on composition is strongest at lower SS. At the same time, correlations between HOA mass fraction and aerosol mean diameter show that these two parameters are essentially independent of one another for this data set. We conclude that, based on the variability of the HOA mass fraction observed during GoMACCS, composition plays a dominant role in determining the fraction of particles that are activated to form cloud droplets. Using Kohler theory, we estimate the error that results in calculated CCN concentrations if the organic fraction of the aerosol is neglected (i.e., a fully soluble composition of ammonium sulfate is assumed) for the range of organic mass fractions and mean diameters observed during GoMACCS. We then relate this error to the source and age of the aerosol. At 0.22 and 0.44 percent SS, the error is considerable for anthropogenic aerosol sampled near the source region as this aerosol has, on average, a high particulate organic matter (POM) mass fraction and smaller particle mean diameter. The error is lower for more aged aerosol as it has a lower POM mass fraction and larger mean particle diameter. Hence, the percent error in calculated CCN concentration is expected to be larger for younger, organic-rich aerosol and smaller for aged, sulfate-rich aerosol and for marine aerosol. We extend this analysis to continental and marine data sets recently reported by Dusek et al. [Science, 312, 1375, 2006] and Hudson [Geophys. Res. Lett., 34, L08801, 2007].
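
    The empirical critical diameter mentioned above can be obtained by integrating the number size distribution downward from the largest sizes until the measured CCN concentration is reached. The sketch below does this with a synthetic lognormal distribution and an invented CCN concentration, not the GoMACCS measurements.

    ```python
    # Sketch: empirical critical activation diameter from CCN + size distribution.
    import numpy as np

    diameters = np.logspace(np.log10(10), np.log10(500), 200)        # nm
    # Hypothetical lognormal number size distribution dN/dlogD [cm^-3]
    dNdlogD = 2000.0 * np.exp(-0.5 * ((np.log(diameters) - np.log(90)) / 0.6) ** 2)
    dlogD = np.diff(np.log10(diameters)).mean()
    n_ccn_measured = 650.0                                           # cm^-3 at 0.44% SS (hypothetical)

    # Cumulative number concentration, integrating from the largest size downward
    cum_from_top = np.cumsum((dNdlogD * dlogD)[::-1])[::-1]
    d_crit = diameters[np.argmin(np.abs(cum_from_top - n_ccn_measured))]
    print(f"critical activation diameter ~ {d_crit:.0f} nm")
    ```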

  8. Examining the relationship between comprehension and production processes in code-switched language

    PubMed Central

    Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.

    2016-01-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049

  9. Examining the relationship between comprehension and production processes in code-switched language.

    PubMed

    Guzzardo Tamargo, Rosa E; Valdés Kroff, Jorge R; Dussias, Paola E

    2016-08-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish-English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants' comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension.

  10. Novel microscopy-based screening method reveals regulators of contact-dependent intercellular transfer

    PubMed Central

    Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja

    2015-01-01

    Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screening using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicates that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723

  11. FIR Filter of DS-CDMA UWB Modem Transmitter

    NASA Astrophysics Data System (ADS)

    Kang, Kyu-Min; Cho, Sang-In; Won, Hui-Chul; Choi, Sang-Sung

    This letter presents low-complexity digital pulse-shaping filter structures of a direct sequence code division multiple access (DS-CDMA) ultra-wideband (UWB) modem transmitter with a ternary spreading code. The proposed finite impulse response (FIR) filter structures, which use a look-up table (LUT), reduce the required memory by about 50% to 80% in comparison with conventional FIR filter structures, and are consequently suitable for high-speed parallel data processing.
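
    Because the chips of a ternary spreading code take only the values -1, 0 and +1, the multiplications in a pulse-shaping FIR filter can be replaced by look-ups of precomputed partial sums. The sketch below illustrates that idea with hypothetical taps and a single LUT covering the whole window; the letter's hardware structures partition the filter differently to reach the quoted 50% to 80% memory savings.

    ```python
    # LUT-based FIR pulse shaping for ternary chips (illustrative, single-table version).
    import itertools
    import numpy as np

    taps = np.array([0.05, 0.20, 0.50, 0.20, 0.05])   # hypothetical pulse-shaping taps
    block = len(taps)                                 # here one LUT covers all taps

    # Precompute one LUT entry per ternary pattern of the filter window (3**block entries).
    lut = {pattern: float(np.dot(taps, pattern))
           for pattern in itertools.product((-1, 0, 1), repeat=block)}

    def fir_lut(chips):
        """Filter a ternary chip sequence by LUT look-ups instead of multiplications."""
        padded = np.concatenate([np.zeros(block - 1, dtype=int), chips])
        return [lut[tuple(padded[i:i + block][::-1])] for i in range(len(chips))]

    chips = np.array([1, -1, 0, 1, 1, -1, 0, 0, 1])
    assert np.allclose(fir_lut(chips), np.convolve(chips, taps)[:len(chips)])
    print(fir_lut(chips))
    ```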

  12. A time series analysis of presentations to Queensland health facilities for alcohol-related conditions, following the increase in 'alcopops' tax.

    PubMed

    Kisely, Steve; Crowe, Elizabeth; Lawrence, David; White, Angela; Connor, Jason

    2013-08-01

    In response to concerns about the health consequences of high-risk drinking by young people, the Australian Government increased the tax on pre-mixed alcoholic beverages ('alcopops') favoured by this demographic. We measured changes in admissions for alcohol-related harm to health throughout Queensland, before and after the tax increase in April 2008. We used data from the Queensland Trauma Register, Hospitals Admitted Patients Data Collection, and the Emergency Department Information System to calculate alcohol-related admission rates per 100,000 people for 15-29 year-olds. We analysed data over 3 years (April 2006 to April 2009), using interrupted time-series analyses. This covered 2 years before, and 1 year after, the tax increase. We investigated both mental and behavioural consequences (via F10 codes), and intentional/unintentional injuries (S and T codes). We fitted an auto-regressive integrated moving average (ARIMA) model, to test for any changes following the increased tax. There was no decrease in alcohol-related admissions in 15-29 year-olds. We found similar results for males and females, as well as definitions of alcohol-related harms that were narrow (F10 codes only) and broad (F10, S and T codes). The increased tax on 'alcopops' was not associated with any reduction in hospital admissions for alcohol-related harms in Queensland 15-29 year-olds.
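
    A minimal version of the interrupted time-series model is an ARIMA fit with a step regressor that switches on after the tax change, whose coefficient estimates the level shift. The sketch below uses synthetic monthly rates and an assumed ARIMA(1,0,0) order, not the Queensland data or the published model specification.

    ```python
    # Interrupted time-series sketch: ARIMA with a post-intervention step regressor.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(42)
    months = pd.date_range("2006-04-01", periods=36, freq="MS")   # April 2006 to March 2009
    rate = 50 + rng.normal(0, 3, len(months))                     # synthetic admissions per 100,000
    post_tax = (months >= "2008-05-01").astype(int)               # step after the April 2008 increase

    model = ARIMA(rate, exog=post_tax.reshape(-1, 1), order=(1, 0, 0))
    fit = model.fit()
    print(fit.params)   # the exog coefficient estimates the post-tax level change
    ```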

  13. The perceived organizational impact of the gender gap across a Canadian department of medicine and proposed strategies to combat it: a qualitative study.

    PubMed

    Pattani, Reena; Marquez, Christine; Dinyarian, Camellia; Sharma, Malika; Bain, Julie; Moore, Julia E; Straus, Sharon E

    2018-04-10

    Despite the gender parity existing in medical schools for over three decades, women remain underrepresented in academic medical centers, particularly in senior ranks and in leadership roles. This has consequences for patient care, education, research, and workplace culture within healthcare organizations. This study was undertaken to explore the perspectives of faculty members at a single department of medicine on the impact of the existing gender gap on organizational effectiveness and workplace culture, and to identify systems-based strategies to mitigate the gap. The study took place at a large university department of medicine in Toronto, Canada, with six affiliated hospitals. In this qualitative study, semi-structured individual interviews were conducted between May and September 2016 with full-time faculty members who held clinical and university-based appointments. Transcripts of the interviews were analyzed using thematic analysis. Three authors independently reviewed the transcripts to determine a preliminary list of codes and establish a coding framework. A modified audit consensus coding approach was applied; a single analyst reviewed all the transcripts and a second analyst audited 20% of the transcripts in each round of coding. Following each round, inter-rater reliability was determined, discrepancies were resolved through discussion, and modifications were made as needed to the coding framework. The analysis revealed faculty members' perceptions of the gender gap, potential contributing factors, organizational impacts, and possible solutions to bridge the gap. Of the 43 full-time faculty members who participated in the survey (29 of whom self-identified as female), most participants were aware of the existing gender gap within academic medicine. Participants described social exclusion, reinforced stereotypes, and unprofessional behaviors as consequences of the gap on organizational effectiveness and culture. They suggested improvements in (1) the processes for recruitment, hiring, and promotion; (2) inclusiveness of the work environment; (3) structures for mentorship; and (4) ongoing monitoring of the gap. The existing gender gap in academic medicine may have negative consequences for organizational effectiveness and workplace culture but many systems-based strategies to mitigate the gap exist. Although these solutions warrant rigorous evaluation, they are feasible to institute within most healthcare organizations immediately.
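
    Where the abstract mentions determining inter-rater reliability for the audited transcripts, a common choice is Cohen's kappa on the codes assigned by the two analysts. The sketch below shows that computation with invented code labels; the study does not state which reliability statistic was used.

    ```python
    # Inter-rater agreement sketch: Cohen's kappa on hypothetical code assignments.
    from sklearn.metrics import cohen_kappa_score

    analyst = ["exclusion", "stereotype", "mentorship", "recruitment", "stereotype", "mentorship"]
    auditor = ["exclusion", "stereotype", "mentorship", "mentorship", "stereotype", "mentorship"]

    kappa = cohen_kappa_score(analyst, auditor)
    print(f"Cohen's kappa = {kappa:.2f}")   # agreement beyond chance between the two coders
    ```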

  14. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  15. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    PubMed

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Incorrect coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  16. Arbitrariness is not enough: towards a functional approach to the genetic code.

    PubMed

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than the chains of nucleobases. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only possible one, and we propose a more appropriate model from a semiotic point of view.

  17. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  18. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
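
    A dynamic event tree of the kind ADAPT builds can be pictured as recursive branching at scheduled times, with each leaf accumulating a probability and a simulated consequence. The sketch below is a toy version with invented branching points and probabilities; in the actual workflow each branch would drive a SAS4A/SASSYS-1 run.

    ```python
    # Toy dynamic event tree: recursive branching with accumulated sequence probabilities.
    from dataclasses import dataclass, field

    @dataclass
    class Branch:
        time: float
        prob: float
        history: list = field(default_factory=list)

    def expand(branch, branch_points):
        """Recursively split a branch at the remaining branching points."""
        if not branch_points:
            return [branch]
        (t, outcomes), rest = branch_points[0], branch_points[1:]
        leaves = []
        for name, p in outcomes.items():
            child = Branch(time=t, prob=branch.prob * p, history=branch.history + [name])
            leaves.extend(expand(child, rest))
        return leaves

    # Hypothetical branching points: (time [s], {outcome: conditional probability})
    tree = [(10.0, {"pump trips": 0.02, "pump runs": 0.98}),
            (60.0, {"DRACS starts": 0.95, "DRACS fails": 0.05})]

    for leaf in expand(Branch(0.0, 1.0), tree):
        print(f"p = {leaf.prob:.4f}  sequence = {leaf.history}")
    ```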

  19. A predictive coding account of MMN reduction in schizophrenia.

    PubMed

    Wacongne, Catherine

    2016-04-01

    The mismatch negativity (MMN) is thought to be an index of the automatic activation of a specialized network for active prediction and deviance detection in the auditory cortex. It is consistently reduced in schizophrenic patients and has received a lot of interest as a clinical and translational tool. The main neuronal hypothesis regarding the mechanisms leading to a reduced MMN in schizophrenic patients is a dysfunction of NMDA receptors (NMDA-R). However, this hypothesis has never been implemented in a neuronal model. In this paper, we examine the consequences of NMDA-R dysfunction in a neuronal model of MMN based on predictive coding principles. We also investigate how predictive processes may interact with synaptic adaptation in MMN generation and examine the consequences of this interaction for the use of MMN paradigms in schizophrenia research. Copyright © 2015 Elsevier B.V. All rights reserved.
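
    One way to picture the argument is a caricature in which a prediction is updated by a gain-weighted prediction error and NMDA-R hypofunction is modeled as a reduced update gain, shrinking the deviant-minus-standard response. The sketch below is that caricature with arbitrary numbers, not the paper's neuronal model.

    ```python
    # Caricature of MMN under predictive coding: reduced update gain -> smaller MMN.
    def mmn_amplitude(gain, n_standards=20):
        prediction = 0.0
        for _ in range(n_standards):             # repeated standards build up the prediction
            prediction += gain * (1.0 - prediction)
        deviant_error = abs(0.0 - prediction)    # error evoked by a deviant tone
        standard_error = abs(1.0 - prediction)   # residual error to a standard tone
        return deviant_error - standard_error    # crude stand-in for the MMN amplitude

    print("control (gain 0.5):      ", round(mmn_amplitude(0.5), 3))
    print("NMDA hypofunction (0.1): ", round(mmn_amplitude(0.1), 3))
    ```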

  20. Efficient preparation of large-block-code ancilla states for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zheng, Yi-Cong; Lai, Ching-Yi; Brun, Todd A.

    2018-03-01

    Fault-tolerant quantum computation (FTQC) schemes that use multiqubit large block codes can potentially reduce the resource overhead to a great extent. A major obstacle is the requirement for a large number of clean ancilla states of different types without correlated errors inside each block. These ancilla states are usually logical stabilizer states of the data-code blocks, which are generally difficult to prepare if the code size is large. Previously, we have proposed an ancilla distillation protocol for Calderbank-Shor-Steane (CSS) codes by classical error-correcting codes. It was assumed that the quantum gates in the distillation circuit were perfect; however, in reality, noisy quantum gates may introduce correlated errors that are not treatable by the protocol. In this paper, we show that additional postselection by another classical error-detecting code can be applied to remove almost all correlated errors. Consequently, the revised protocol is fully fault tolerant and capable of preparing a large set of stabilizer states sufficient for FTQC using large block codes. At the same time, the yield rate can be boosted from O(t^(-2)) to O(1) in practice for an [[n, k, d = 2t+1]] code.

  1. A retrospective comparative study of minimally invasive extracorporeal circulation versus conventional extracorporeal circulation in emergency coronary artery bypass surgery patients: a single surgeon analysis.

    PubMed

    Rufa, Magdalena; Schubel, Jens; Ulrich, Christian; Schaarschmidt, Jan; Tiliscan, Catalin; Bauer, Adrian; Hausmann, Harald

    2015-07-01

    At the moment, the main application of minimally invasive extracorporeal circulation (MiECC) is reserved for elective cardiac operations such as coronary artery bypass grafting (CABG) and/or aortic valve replacement. The purpose of this study was to compare the outcome of emergency CABG operations using either MiECC or conventional extracorporeal circulation (CECC), with regard to the perioperative course and the occurrence of major adverse cardiac and cerebral events (MACCE). We analysed the emergency CABG operations performed by a single surgeon between January 2007 and July 2013, in order to exclude differences in surgical technique. During this period, 187 emergency CABG patients (113 MiECC vs 74 CECC) were investigated retrospectively with respect to the following parameters: in-hospital mortality, MACCE, postoperative hospital stay and perioperative transfusion rate. The mean logistic European System for Cardiac Operative Risk Evaluation score was higher in the CECC group (MiECC 12.1 ± 16 vs CECC 15.0 ± 20.8, P = 0.15) and the number of bypass grafts per patient was similar in both groups (MiECC 2.94 vs CECC 2.93). There was no significant difference in the postoperative hospital stay or in major postoperative complications. The in-hospital mortality was higher in the CECC group: 6.8% versus 4.4% with MiECC (P = 0.48). The perioperative transfusion rate was lower with MiECC compared with CECC (MiECC 2.6 ± 3.2 vs CECC 3.8 ± 4.2 units of blood per patient, P = 0.025). In our opinion, the use of MiECC in urgent CABG procedures is safe, feasible and shows no disadvantages compared with the use of CECC. Emergency operations using the MiECC system showed a significantly lower blood transfusion rate and better results concerning the unadjusted in-hospital mortality. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  2. Modeling Urban Air Quality in the Berlin-Brandenburg Region: Evaluation of a WRF-Chem Setup

    NASA Astrophysics Data System (ADS)

    Kuik, F.; Churkina, G.; Butler, T. M.; Lauer, A.; Mar, K. A.

    2015-12-01

    Air pollution is the number one environmental cause of premature deaths in Europe. Despite extensive regulations, air pollution remains a challenging issue, especially in urban areas. For studying air quality in the Berlin-Brandenburg region of Germany, the Weather Research and Forecasting Model with Chemistry (WRF-Chem) is set up and evaluated against meteorological and air quality observations from monitoring stations as well as from a field campaign conducted in 2014 (including black carbon and VOCs, as well as mobile measurements of particle size distribution and particle mass). The model setup includes 3 nested domains with horizontal resolutions of 15 km, 3 km, and 1 km, online biogenic emissions using MEGAN 2.0, and anthropogenic emissions from the TNO-MACC-II inventory. This work serves as a basis for future studies on different aspects of air pollution in the Berlin-Brandenburg region, including how heat waves affect emissions of biogenic volatile organic compounds (BVOC) from urban vegetation (summer 2006) and the impact of selected traffic measures on air quality in the Berlin-Brandenburg area (summer 2014). The model represents the observed meteorology in the region well for both periods. An exception is the heat wave period in 2006, where the temperature simulated at 3 km and 1 km resolution is biased low by around 2°C for urban built-up stations. First results of simulations with chemistry show that, on average, WRF-Chem simulates concentrations of O3 well. However, the 8-hour maxima are underestimated, and the minima are overestimated. While NOx daily means are modeled reasonably well for urban stations, they are overestimated for suburban stations. PM10 concentrations are underestimated by the model. The biases and correlation coefficients of simulated O3, NOx, and PM10 in comparison to surface observations do not show improvements for the 1 km domain in comparison to the 3 km domain. To improve the model performance of the 1 km domain, we will include an updated emission inventory (TNO-MACC-III) as well as the interpolation of the emission data from 7 km to 1 km resolution.
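
    The station-based evaluation summarized above boils down to a few statistics per species and domain. The sketch below computes bias, RMSE and Pearson correlation for one synthetic station series; the numbers are invented, and the actual evaluation uses the monitoring-network and campaign data.

    ```python
    # Model-vs-observation evaluation sketch: bias, RMSE and correlation at one station.
    import numpy as np

    rng = np.random.default_rng(3)
    obs = 30 + 10 * rng.random(90)                 # e.g. synthetic O3 daily means [ug/m3]
    sim = 0.9 * obs + rng.normal(0, 5, obs.size)   # hypothetical model output at the same station

    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    corr = np.corrcoef(sim, obs)[0, 1]
    print(f"bias = {bias:.1f}, RMSE = {rmse:.1f}, r = {corr:.2f}")
    ```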

  3. Global forestry emission projections and abatement costs

    NASA Astrophysics Data System (ADS)

    Böttcher, H.; Gusti, M.; Mosnier, A.; Havlik, P.; Obersteiner, M.

    2012-04-01

    In this paper we present forestry emission projections and associated Marginal Abatement Cost Curves (MACCs) for individual countries, based on economic, social and policy drivers. The activities cover deforestation, afforestation, and forestry management. The global model tools G4M and GLOBIOM, developed at IIASA, are applied. GLOBIOM uses global scenarios of population, diet, GDP and energy demand to inform G4M about future land and commodity prices and demand for bioenergy and timber. G4M projects emissions from afforestation, deforestation and management of existing forests. Mitigation measures are simulated by introducing a carbon tax. Mitigation activities like reducing deforestation or enhancing afforestation are not independent of each other. In contrast to existing forestry mitigation cost curves, the MACCs presented here are not developed for individual activities but for total forest land management, which makes the estimated potentials more realistic. In the assumed baseline, gross deforestation drops globally from about 12 Mha in 2005 to below 10 Mha after 2015 and reaches 0.5 Mha in 2050. Afforestation rates remain fairly constant at about 7 Mha annually. Although we observe a net increase in global forest area after 2015, net emissions from deforestation and afforestation remain positive until 2045, as the newly afforested areas accumulate carbon rather slowly. In Annex I countries, about 200 Mt CO2 per year could be mitigated in 2030 at a carbon price of 50 USD. The potential for forest management improvement is very similar. Above 200 USD the potential is clearly constrained for both options. In non-Annex I countries, avoided deforestation can achieve about 1200 Mt CO2 per year at a price of 50 USD. The potential is less constrained compared to the potential in Annex I countries, achieving a potential of 1800 Mt CO2 annually in 2030 at a price of 1000 USD. The potential from additional afforestation is rather limited due to the high baseline afforestation rates assumed. In addition, we present results of several sensitivity analyses that were run to better understand model uncertainties and the mechanisms of drivers such as agricultural productivity, GDP, wood demand and national corruption rates.
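
    The MACCs themselves are assembled from the carbon-tax runs: each tax level yields a total abatement relative to the baseline, and ordering the (price, abatement) pairs gives the curve. The sketch below shows that bookkeeping with rounded figures loosely taken from the abstract plus an invented low-price point; it is not G4M/GLOBIOM output.

    ```python
    # Assembling a marginal abatement cost curve from carbon-tax scenario results.
    carbon_tax_runs = {      # USD per tCO2 : Mt CO2 per year abated in 2030 (illustrative)
        10: 500,
        50: 1200,
        200: 1600,
        1000: 1800,
    }

    previous_abatement = 0
    for price, abatement in sorted(carbon_tax_runs.items()):
        marginal = abatement - previous_abatement
        print(f"up to {price:>4} USD/tCO2: +{marginal} Mt CO2/yr (cumulative {abatement})")
        previous_abatement = abatement
    ```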

  4. Revascularisation of patients with end-stage renal disease on chronic haemodialysis: bypass surgery versus PCI-analysis of routine statutory health insurance data.

    PubMed

    Möckel, Martin; Searle, Julia; Baberg, Henning Thomas; Dirschedl, Peter; Levenson, Benny; Malzahn, Jürgen; Mansky, Thomas; Günster, Christian; Jeschke, Elke

    2016-01-01

    We aimed to analyse the short-term and long-term outcome of patients with end-stage renal disease (ESRD) undergoing percutaneous intervention (PCI) as compared to coronary artery bypass surgery (CABG) to evaluate the optimal coronary revascularisation strategy. Retrospective analysis of routine statutory health insurance data between 2010 and 2012. Primary outcome was adjusted all-cause mortality after 30 days and major adverse cardiovascular and cerebrovascular events at 1 year. Secondary outcomes were repeat revascularisation at 30 days and 1 year and bleeding events within 7 days. The total number of cases was n=4123 (PCI; n=3417), median age was 71 (IQR 62-77), 30.4% were women. The adjusted OR for death within 30 days was 0.59 (95% CI 0.43 to 0.81) for patients undergoing PCI versus CABG. At 1 year, the adjusted OR for major adverse cardiac and cerebrovascular events (MACCE) was 1.58 (1.32 to 1.89) for PCI versus CABG and 1.47 (1.23 to 1.75) for all-cause death. In the subgroup of patients with acute myocardial infarction (AMI), adjusted all-cause mortality at 30 days did not differ significantly between both groups (OR 0.75 (0.47 to 1.20)), whereas in patients without AMI the OR for 30-day mortality was 0.44 (0.28 to 0.68) for PCI versus CABG. At 1 year, the adjusted OR for MACCE in patients with AMI was 1.40 (1.06 to 1.85) for PCI versus CABG and 1.47 (1.08 to 1.99) for mortality. In this cohort of unselected patients with ESRD undergoing revascularisation, the 1-year outcome was better for CABG in patients with and without AMI. The 30-day mortality was higher in non-AMI patients with CABG reflecting an early hazard with surgery. In cases where the patient's characteristics and risk profile make it difficult to decide on a revascularisation strategy, CABG could be the preferred option.

  5. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code, Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple, yet adequate for the consequence analysis required by the Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transferring installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the individual models, which are activated in a defined sequence depending on the selected accidental event, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from public literature or in-house developed software and tailored to be easy to use and fast to run while still providing a realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG installations. A summary of the theoretical basis of each model implemented in Atlantide and an example of application are included in the paper.

  6. Imitation learning based on an intrinsic motivation mechanism for efficient coding

    PubMed Central

    Triesch, Jochen

    2013-01-01

    A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations and imitation. PMID:24204350

  7. On the reduced lifetime of nitrous oxide due to climate change induced acceleration of the Brewer-Dobson circulation as simulated by the MPI Earth System Model

    NASA Astrophysics Data System (ADS)

    Kracher, D.; Manzini, E.; Reick, C. H.; Schultz, M. G.; Stein, O.

    2014-12-01

    Greenhouse-gas-induced climate change will modify the physical conditions of the atmosphere. One of the projected changes is an acceleration of the Brewer-Dobson circulation in the stratosphere, as has been shown in many model studies. This change in the stratospheric circulation consequently affects the transport and distribution of atmospheric components such as N2O. Since N2O is involved in ozone destruction, a modified distribution of N2O can be of importance for ozone chemistry. N2O is inert in the troposphere and decays only in the stratosphere. Thus, changes in the exchange between troposphere and stratosphere can also affect the stratospheric sink of N2O, and consequently its atmospheric lifetime. N2O is a potent greenhouse gas with a current global warming potential of approximately 300 CO2-equivalents over a 100-year horizon. A faster decay in atmospheric N2O mixing ratios, i.e. a decreased atmospheric lifetime of N2O, will also reduce its global warming potential. In order to assess the impact of climate change on atmospheric circulation and the implied effects on the distribution and lifetime of atmospheric N2O, we apply the Max Planck Institute Earth System Model, MPI-ESM. MPI-ESM consists of the atmospheric general circulation model ECHAM, the land surface model JSBACH, and MPIOM/HAMOCC representing ocean circulation and ocean biogeochemistry. Prognostic atmospheric N2O concentrations in MPI-ESM are determined by land N2O emissions, ocean-atmosphere N2O exchange and atmospheric tracer transport. As stratospheric chemistry is not explicitly represented in MPI-ESM, stratospheric decay rates of N2O are prescribed from a MACC MOZART simulation. Increasing surface temperatures and CO2 concentrations in the stratosphere impact atmospheric circulation differently. Thus, we conduct a series of transient runs with the atmospheric model of MPI-ESM to isolate the different factors governing a shift in atmospheric circulation. From those transient simulations we diagnose decreasing tropospheric N2O concentrations, increased transport of N2O from the troposphere to the stratosphere, and increasing stratospheric decay of N2O, leading to a reduction in the atmospheric lifetime of N2O as climate change evolves.

  8. mRNA changes in nucleus accumbens related to methamphetamine addiction in mice

    NASA Astrophysics Data System (ADS)

    Zhu, Li; Li, Jiaqi; Dong, Nan; Guan, Fanglin; Liu, Yufeng; Ma, Dongliang; Goh, Eyleen L. K.; Chen, Teng

    2016-11-01

    Methamphetamine (METH) is a highly addictive psychostimulant that elicits aberrant changes in the expression of microRNAs (miRNAs) and long non-coding RNAs (lncRNAs) in the nucleus accumbens of mice, indicating a potential role of METH in post-transcriptional regulations. To decipher the potential consequences of these post-transcriptional regulations in response to METH, we performed strand-specific RNA sequencing (ssRNA-Seq) to identify alterations in mRNA expression and their alternative splicing in the nucleus accumbens of mice following exposure to METH. METH-mediated changes in mRNAs were analyzed and correlated with previously reported changes in non-coding RNAs (miRNAs and lncRNAs) to determine the potential functions of these mRNA changes observed here and how non-coding RNAs are involved. A total of 2171 mRNAs were differentially expressed in response to METH with functions involved in synaptic plasticity, mitochondrial energy metabolism and immune response. 309 and 589 of these mRNAs are potential targets of miRNAs and lncRNAs respectively. In addition, METH treatment decreases mRNA alternative splicing, and there are 818 METH-specific events not observed in saline-treated mice. Our results suggest that METH-mediated addiction could be attributed by changes in miRNAs and lncRNAs and consequently, changes in mRNA alternative splicing and expression. In conclusion, our study reported a methamphetamine-modified nucleus accumbens transcriptome and provided non-coding RNA-mRNA interaction networks possibly involved in METH addiction.

  9. Ultra Safe And Secure Blasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M M

    2009-07-27

    The Ultra is a blasting system that is designed for special applications where the risk and consequences of unauthorized demolition or blasting are so great that the use of an extraordinarily safe and secure blasting system is justified. Such a blasting system would be connected and logically welded together through digital code-linking as part of the blasting system set-up and initialization process. The Ultra's security is so robust that it will defeat the people who designed and built the components in any attempt at unauthorized detonation. Anyone attempting to gain unauthorized control of the system by substituting components or tapping into communications lines will be thwarted by their inability to provide encrypted authentication. Authentication occurs through the use of codes that are generated by the system during initialization code-linking and the codes remain unknown to anyone, including the authorized operator. Once code-linked, a closed system has been created. The system requires all components connected as they were during initialization as well as a unique code entered by the operator for function and blasting.

  10. Electromagnetic code for naval applications

    NASA Astrophysics Data System (ADS)

    Crescimbeni, F.; Bessi, F.; Chiti, S.

    1988-12-01

    The use of an increasing number of electronic systems has become vital to meet the high performance required for military Navy applications, and the number of antennas to be mounted on shipboard has therefore greatly increased. As a consequence of the high antenna density, the complexity of the shipboard environment and the power levels used for communication and radar systems, the EMC (Electro-Magnetic Compatibility) problem plays a leading role in the design of the topside of a ship. The Italian Navy has acquired a numerical code for antenna siting and design. This code, together with experimental data measured at the Italian Navy test range facility, allows optimal sitings for shipboard antenna systems to be evaluated and their performance in the actual environment to be predicted. The structure of this code, named Programma Elettromagnetico per Applicazioni Navali (Electromagnetic Code for Naval Applications), is discussed, together with its capabilities and applications. Results obtained in several examples are also presented and compared with the measurements.

  11. The Development of Bimodal Bilingualism: Implications for Linguistic Theory.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2016-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.

  12. The Development of Bimodal Bilingualism: Implications for Linguistic Theory

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2017-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and ‘transfer’ as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair. PMID:28603576

  13. CRAC2 model description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.

  14. Mathematical fundamentals for the noise immunity of the genetic code.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are also some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular bases - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows the genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error detection and error correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different quantities of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of degeneracy of the genetic code are essential and give evidence of substantial advantages of the natural code over other possible ones. In the present chapter we will present a recent approach to explain the degeneracy of the genetic code by algorithmic methods from bioinformatics, and discuss its biological consequences. Biologists recognised the frameshift problem immediately after the detection of the non-overlapping structure of the genetic code, i.e., that coding sequences are to be read in a unique way determined by their reading frame. But how does the reading head of the ribosome recognise an error in the grouping of codons, caused by, e.g., the insertion or deletion of a base, which can be fatal during the translation process and may result in nonfunctional proteins? In this chapter we will discuss possible solutions to the frameshift problem with a focus on the theory of so-called circular codes that were discovered in large gene populations of prokaryotes and eukaryotes in the early 1990s. Circular codes allow a frameshift of one or two positions to be detected, and recently a beautiful theory of such codes has been developed using statistics, group theory and graph theory. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Development of high-fidelity multiphysics system for light water reactor analysis

    NASA Astrophysics Data System (ADS)

    Magedanz, Jeffrey W.

    There has been a tendency in recent years toward greater heterogeneity in reactor cores, due to the use of mixed-oxide (MOX) fuel, burnable absorbers, and longer cycles with consequently higher fuel burnup. The resulting asymmetry of the neutron flux and energy spectrum between regions with different compositions creates a need to account for the directional dependence of the neutron flux, instead of the traditional diffusion approximation. Furthermore, the presence of both MOX and high-burnup fuel in the core increases the complexity of the heat conduction. The heat transfer properties of the fuel pellet change with irradiation, and the thermal and mechanical expansion of the pellet and cladding strongly affect the size of the gap between them and its consequent thermal resistance. These operational tendencies require higher-fidelity multi-physics modeling capabilities, and this need is addressed by the developments performed within this PhD research. The dissertation describes the development of a High-Fidelity Multi-Physics System for Light Water Reactor Analysis. It consists of three coupled codes -- CTF for Thermal Hydraulics, TORT-TD for Neutron Kinetics, and FRAPTRAN for Fuel Performance. It is meant to address these modeling challenges in three ways: (1) by resolving the state of the system at the level of each fuel pin, rather than homogenizing entire fuel assemblies, (2) by using the multi-group Discrete Ordinates method to account for the directional dependence of the neutron flux, and (3) by using a fuel-performance code, rather than a Thermal Hydraulics code's simplified fuel model, to account for the material behavior of the fuel and its feedback to the hydraulic and neutronic behavior of the system. While the first two are improvements, the third, the use of a fuel-performance code for feedback, constitutes an innovation in this PhD project. Also important to this work is the manner in which such a coupling is implemented. While coupling combines the codes into a single executable, they are usually still developed and maintained separately. It should thus be a design objective to minimize the changes to those codes, and keep the changes to each code free of dependence on the details of the other codes. This eases the incorporation of new versions of each code into the coupling, as well as re-use of parts of the coupling to couple with different codes. In order to fulfill this objective, an interface for each code was created in the form of an object-oriented abstract data type. Object-oriented programming is an effective method for enforcing a separation between different parts of a program, and clarifying the communication between them. The interfaces enable the main program to control the codes in terms of high-level functionality. This differs from the established practice of a master/slave relationship, in which the slave code is incorporated into the master code as a set of subroutines. While this PhD research continues previous work with a coupling between CTF and TORT-TD, it makes two major original contributions: (1) using a fuel-performance code, instead of a thermal-hydraulics code's simplified built-in models, to model the feedback from the fuel rods, and (2) the design of an object-oriented interface as an innovative method to interact with a coupled code in a high-level, easily understandable manner.
The resulting code system will serve as a tool to study the question of under what conditions, and to what extent, these higher-fidelity methods will provide benefits to reactor core analysis. (Abstract shortened by UMI.)
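
    The coupling architecture described above, wrapping each physics code behind an object-oriented abstract data type so that the driver only sees high-level operations, can be illustrated with a minimal sketch. The sketch below is only an illustration of the design idea; the class and method names are hypothetical and are not taken from the dissertation or from CTF, TORT-TD or FRAPTRAN.

    ```python
    # Illustrative sketch of a driver-side abstraction for coupled codes.
    # The interface and method names are hypothetical.
    from abc import ABC, abstractmethod


    class CodeInterface(ABC):
        """High-level contract each coupled code exposes to the driver."""

        @abstractmethod
        def initialize(self, case_file: str) -> None:
            """Read the code's own input deck and set up its solution."""

        @abstractmethod
        def advance(self, dt: float) -> None:
            """Advance this code's solution by one time step."""

        @abstractmethod
        def exchange_data(self) -> dict:
            """Return the fields other codes need (e.g. fuel temperature, power)."""

        @abstractmethod
        def receive_data(self, fields: dict) -> None:
            """Accept feedback fields computed by the other codes."""


    def run_coupled_step(codes: list, dt: float) -> None:
        """One operator-split exchange: advance each code, then share fields."""
        for code in codes:
            code.advance(dt)
        shared = {}
        for code in codes:
            shared.update(code.exchange_data())
        for code in codes:
            code.receive_data(shared)
    ```

    Keeping the driver written against such an interface means a new version of any one code, or a different code entirely, only has to supply its own small wrapper class, which is the maintainability argument made in the abstract.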

  16. Identification of a Novel GJA8 (Cx50) Point Mutation Causes Human Dominant Congenital Cataracts

    NASA Astrophysics Data System (ADS)

    Ge, Xiang-Lian; Zhang, Yilan; Wu, Yaming; Lv, Jineng; Zhang, Wei; Jin, Zi-Bing; Qu, Jia; Gu, Feng

    2014-02-01

    Hereditary cataracts are clinically and genetically heterogeneous lens diseases that cause a significant proportion of visual impairment and blindness in children. Human cataracts have been linked with mutations in two connexin genes, GJA3 and GJA8. To identify the causative mutation in a family with hereditary cataracts, family members were screened for mutations in both genes by PCR. Sequencing the coding region of GJA8, which encodes connexin 50, revealed a C > A transversion at nucleotide 264, causing the p.P88T mutation. To dissect the molecular consequences of this mutation, plasmids carrying the wild-type and mutant mouse ORFs of Gja8 were generated and ectopically expressed in HEK293 cells and human lens epithelial cells. The recombinant proteins were assessed by confocal microscopy and Western blotting. The results demonstrate that the molecular consequences of the p.P88T mutation in GJA8 include changes in connexin 50 protein localization patterns, accumulation of mutant protein, and increased cell growth.

  17. Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) share the same nomenclatural type (ATCC 13048) on the Approved Lists and are homotypic synonyms, with consequences for the name Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980).

    PubMed

    Tindall, B J; Sutton, G; Garrity, G M

    2017-02-01

    Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) were placed on the Approved Lists of Bacterial Names and were based on the same nomenclatural type, ATCC 13048. Consequently, they are to be treated as homotypic synonyms. However, the names of homotypic synonyms at the rank of species are normally based on the same epithet. Examination of the Rules of the International Code of Nomenclature of Bacteria in force at the time indicates that the epithet mobilis in Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) was illegitimate at the time the Approved Lists were published and, according to the Rules of the current International Code of Nomenclature of Prokaryotes, continues to be illegitimate.

  18. Ligand Biological Activity Predictions Using Fingerprint-Based Artificial Neural Networks (FANN-QSAR)

    PubMed Central

    Myint, Kyaw Z.; Xie, Xiang-Qun

    2015-01-01

    This chapter focuses on the fingerprint-based artificial neural networks QSAR (FANN-QSAR) approach to predict biological activities of structurally diverse compounds. Three types of fingerprints, namely ECFP6, FP2, and MACCS, were used as inputs to train the FANN-QSAR models. The results were benchmarked against known 2D and 3D QSAR methods, and the derived models were used to predict cannabinoid (CB) ligand binding activities as a case study. In addition, the FANN-QSAR model was used as a virtual screening tool to search a large NCI compound database for lead cannabinoid compounds. We discovered several compounds with good CB2 binding affinities ranging from 6.70 nM to 3.75 μM. The studies proved that the FANN-QSAR method is a useful approach to predict bioactivities or properties of ligands and to find novel lead compounds for drug discovery research. PMID:25502380
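
    As a rough illustration of the fingerprint-to-activity workflow described above (not the authors' implementation), the sketch below builds MACCS keys and an ECFP6-like Morgan fingerprint with RDKit and fits a small feed-forward network with scikit-learn. The choice of these two libraries, the network size, and the tiny training set are all assumptions made purely for illustration.

    ```python
    # Hedged sketch of a fingerprint-based neural-network QSAR model.
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import AllChem, MACCSkeys
    from sklearn.neural_network import MLPRegressor

    def featurize(smiles):
        """Concatenate MACCS keys with an ECFP6-like Morgan fingerprint."""
        mol = Chem.MolFromSmiles(smiles)
        maccs = np.array(MACCSkeys.GenMACCSKeys(mol))
        ecfp6 = np.array(AllChem.GetMorganFingerprintAsBitVect(mol, radius=3, nBits=1024))
        return np.concatenate([maccs, ecfp6])

    # Hypothetical training data: SMILES strings with made-up activities.
    train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
    train_activity = [4.2, 5.1, 6.3]

    X = np.vstack([featurize(s) for s in train_smiles])
    model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
    model.fit(X, train_activity)
    print(model.predict(featurize("CCN").reshape(1, -1)))  # predicted activity
    ```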

  19. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to confirm that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
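
    The core of sequential verification as described above is an automated comparison of selected results from consecutive code versions. A hypothetical sketch of such a check (not the actual RELAP5-3D tooling) is:

    ```python
    # Hypothetical sequential-verification check: flag quantities that changed
    # between two code versions beyond a tight relative tolerance.
    def compare_runs(baseline, candidate, rel_tol=1e-12):
        """Return the quantities whose values differ beyond rel_tol."""
        failures = []
        for name, old in baseline.items():
            new = candidate.get(name)
            if new is None:
                failures.append((name, "missing in candidate run"))
                continue
            denom = max(abs(old), abs(new), 1e-300)
            if abs(new - old) / denom > rel_tol:
                failures.append((name, old, new))
        return failures

    # Outputs extracted from two consecutive versions (values are made up).
    v_old = {"peak_clad_temperature": 1180.0, "system_pressure": 15.51}
    v_new = {"peak_clad_temperature": 1180.0, "system_pressure": 15.51}
    print(compare_runs(v_old, v_new))  # [] means no unintended change detected
    ```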

  20. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    PubMed Central

    Młynarski, Wiktor

    2014-01-01

    To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. First, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds, extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644
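
    To make the first step of the pipeline concrete, the following sketch (an assumption for illustration, not the paper's code) applies scikit-learn's FastICA to stacked left/right spectrogram feature vectors; random numbers stand in for the naturalistic binaural spectrograms used in the study.

    ```python
    # Illustrative sketch: a linear efficient-coding transform (ICA) learned on
    # binaural (left, right) spectrogram feature vectors. Random placeholders
    # replace the naturalistic binaural sounds of the paper.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_samples, n_freq = 5000, 64
    left = rng.laplace(size=(n_samples, n_freq))
    right = 0.7 * left + 0.3 * rng.laplace(size=(n_samples, n_freq))
    X = np.hstack([left, right])                 # one binaural feature vector per sample

    ica = FastICA(n_components=32, random_state=0, max_iter=1000)
    S = ica.fit_transform(X)                     # learned code (independent components)
    W = ica.components_                          # each row pairs a left and a right filter

    # Interaural structure of one learned feature: compare its left vs right halves.
    feature = W[0]
    print(feature[:n_freq].std(), feature[n_freq:].std())
    ```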

  1. Preliminary results of consequence assessment of a hypothetical severe accident using Thai meteorological data

    NASA Astrophysics Data System (ADS)

    Silva, K.; Lawawirojwong, S.; Promping, J.

    2017-06-01

    Consequence assessment of a hypothetical severe accident is one of the important elements of the risk assessment of a nuclear power plant. It is widely known that meteorological conditions can significantly influence the outcomes of such an assessment, since they determine the results of the calculation of radionuclide environmental transport. This study aims to assess the impact of meteorological conditions on the results of the consequence assessment. The consequence assessment code OSCAAR of the Japan Atomic Energy Agency (JAEA) is used for the assessment. The results of the consequence assessment using Thai meteorological data are compared with those using Japanese meteorological data. The Thai case has the following characteristics. Low wind speed concentrated the radionuclides near the release point compared with the Japanese case, and squalls induced peaks in the ground concentration distribution. The evacuated area is larger than in the Japanese case, although the relocated area is smaller, which is attributed to the concentration of the radionuclides near the release point.

  2. How To Keep Your Schools Safe and Secure.

    ERIC Educational Resources Information Center

    Gilbert, Christopher B.

    1996-01-01

    Discusses unforeseen costs (including potential litigation expenses), benefits, and consequences of adopting security measures (such as metal detectors, drug dogs, security cameras, campus police, dress codes, crime watch programs, and communication devices) to counter on-campus violence and gang activity. High-tech gadgetry alone is insufficient.…

  3. The Revised 2010 Ethical Standards for School Counselors

    ERIC Educational Resources Information Center

    Huey, Wayne C.

    2011-01-01

    The American School Counselor Association (ASCA) recently revised its ethical code for professional school counselors, the "Ethical Standards for School Counselors," in 2010. Professional school counselors have a unique challenge in counseling minors in that they provide services in an educational setting. Consequently, school counselors not only…

  4. Thermodynamic consequences of hydrogen combustion within a containment of pressurized water reactor

    NASA Astrophysics Data System (ADS)

    Bury, Tomasz

    2011-12-01

    Gaseous hydrogen may be generated in a nuclear reactor system as an effect of core overheating. This creates a risk of uncontrolled combustion, which may have destructive consequences, as was observed during the Fukushima nuclear power plant accident. Favorable conditions for hydrogen production occur during severe loss-of-coolant accidents. The author used his own lumped-parameter computer code, HEPCAL, to run a set of simulations of large-scale loss-of-coolant accident scenarios within the containment of a second-generation pressurized water reactor. Some simulations resulted in implausibly high pressure peaks. A more detailed analysis and a comparison with the consequences of the Three Mile Island and Fukushima accidents allowed interesting conclusions to be drawn.

  5. A novel quantum LSB-based steganography method using the Gray code for colored quantum images

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Farzadnia, Ehsan

    2017-10-01

    As one of the prevalent data-hiding techniques, steganography is defined as the act of imperceptibly concealing secret information in a cover multimedia object (text, image, video or audio) so that nobody except the intended receiver can recover the secret data exchanged between sender and receiver. In this paper, a quantum LSB-based steganography method utilizing the Gray code for quantum RGB images is investigated. The method uses the Gray code to accommodate two secret qubits in the 3 LSBs of each pixel simultaneously, according to reference tables. Experimental results, analyzed in the MATLAB environment, show that the present scheme performs well and is more secure and applicable than the previous one found in the literature.
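
    The reference tables of the scheme are specific to the paper, but the binary-reflected Gray code on which it relies is standard; a small classical sketch (not the quantum circuit) is shown below.

    ```python
    # Classical sketch of the binary-reflected Gray code used by such LSB schemes;
    # the paper's reference tables and quantum embedding are not reproduced here.
    def to_gray(n):
        """Binary-reflected Gray code of n."""
        return n ^ (n >> 1)

    def from_gray(g):
        """Invert the Gray code by cumulative XOR of the shifted bits."""
        n = 0
        while g:
            n ^= g
            g >>= 1
        return n

    for value in range(8):
        print(value, format(to_gray(value), "03b"), from_gray(to_gray(value)))
    ```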

  6. Solutions to Three-Dimensional Thin-Layer Navier-Stokes Equations in Rotating Coordinates for Flow Through Turbomachinery

    NASA Technical Reports Server (NTRS)

    Ghosh, Amrit Raj

    1996-01-01

    The viscous Navier-Stokes solver for turbomachinery applications, MSUTC, has been modified to include a rotating-frame formulation. The three-dimensional thin-layer Navier-Stokes equations have been cast in a rotating Cartesian frame, enabling the grid motion to be frozen. This also allows the flow field associated with an isolated rotor to be treated as a steady-state problem. Consequently, local time stepping can be used to accelerate convergence. The formulation is validated by running NASA's Rotor 67 as the test case. Results are compared between the rotating-frame code and the absolute-frame code. The use of the rotating-frame approach greatly enhances the performance of the code with respect to savings in computing time, without degradation of the solution.

  7. The Maximal C³ Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses.

    PubMed

    Michel, Christian J

    2017-04-18

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has, on average, the highest occurrence in the reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. That method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition to the gene level. This new statistical approach considers all genes, i.e., both large and small, with the same weight when searching for the circular code X. As a consequence, the concept of a circular code, in particular reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes.
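
    The frame-preference statistic underlying this approach can be illustrated with a short sketch: given a candidate trinucleotide set and a coding sequence, count how often members of the set occur in the reading frame versus the two shifted frames. The candidate set below is a hypothetical placeholder, not the actual 20-trinucleotide code X.

    ```python
    # Illustrative frame-preference count; CANDIDATE_SET is a placeholder,
    # not the real circular code X of the paper.
    CANDIDATE_SET = {"AAC", "GTT", "GAC", "GTC"}

    def frame_counts(sequence, code):
        """Occurrences of `code` trinucleotides in frames 0, 1 and 2."""
        counts = []
        for frame in range(3):
            codons = [sequence[i:i + 3] for i in range(frame, len(sequence) - 2, 3)]
            counts.append(sum(1 for c in codons if c in code))
        return counts

    gene = "ATGAACGTTGACGTCAACTAA"            # toy coding sequence, frame 0
    print(frame_counts(gene, CANDIDATE_SET))  # highest count expected in frame 0
    ```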

  8. Implementation of ASME Code, Section XI, Code Case N-770, on Alternative Examination Requirements for Class 1 Butt Welds Fabricated with Alloy 82/182

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Edmund J.; Anderson, Michael T.

    In May 2010, the NRC issued a proposed notice of rulemaking that includes a provision to add a new section to its rules to require licensees to implement ASME Code Case N-770, ‘‘Alternative Examination Requirements and Acceptance Standards for Class 1 PWR Piping and Vessel Nozzle Butt Welds Fabricated with UNS N06082 or UNS W86182 Weld Filler Material With or Without the Application of Listed Mitigation Activities, Section XI, Division 1,’’ with 15 conditions. Code Case N-770 contains baseline and inservice inspection (ISI) requirements for unmitigated butt welds fabricated with Alloy 82/182 material and preservice and ISI requirements for mitigated butt welds. The NRC stated that application of ASME Code Case N-770 is necessary because the inspections currently required by the ASME Code, Section XI, were not written to address stress corrosion cracking of Alloy 82/182 butt welds, and the safety consequences of inadequate inspections can be significant. The NRC expects to issue the final rule incorporating this code case into its regulations in the spring 2011 time frame. This paper discusses the new examination requirements, the conditions that the NRC is imposing, and the major concerns with implementation of the new Code Case.

  9. Auto-Regulatory RNA Editing Fine-Tunes mRNA Re-Coding and Complex Behaviour in Drosophila

    PubMed Central

    Savva, Yiannis A.; Jepson, James E.C; Sahin, Asli; Sugden, Arthur U.; Dorsky, Jacquelyn S.; Alpert, Lauren; Lawrence, Charles; Reenan, Robert A.

    2014-01-01

    Auto-regulatory feedback loops are a common molecular strategy used to optimize protein function. In Drosophila, many mRNAs involved in neurotransmission are re-coded at the RNA level by the RNA editing enzyme dADAR, leading to the incorporation of amino acids that are not directly encoded by the genome. dADAR also re-codes its own transcript, but the consequences of this auto-regulation in vivo are unclear. Here we show that hard-wiring or abolishing endogenous dADAR auto-regulation dramatically remodels the landscape of re-coding events in a site-specific manner. These molecular phenotypes correlate with altered localization of dADAR within the nuclear compartment. Furthermore, auto-editing exhibits sexually dimorphic patterns of spatial regulation and can be modified by abiotic environmental factors. Finally, we demonstrate that modifying dAdar auto-editing affects adaptive complex behaviors. Our results reveal the in vivo relevance of auto-regulatory control over post-transcriptional mRNA re-coding events in fine-tuning brain function and organismal behavior. PMID:22531175

  10. The dependence of frequency distributions on multiple meanings of words, codes and signs

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyong; Minnhagen, Petter

    2018-01-01

    The dependence of word-frequency distributions on the multiple meanings of words in a text is investigated by deleting letters. By coding the words with fewer letters, the number of meanings per coded word increases. This increase is measured and used as an input in a predictive theory. For a text written in English, the word-frequency distribution is broad and fat-tailed, whereas if the words are represented only by their first letter the distribution becomes exponential. Both distributions are well predicted by the theory, as is the whole sequence obtained by consecutively representing the words by their first L = 6, 5, 4, 3, 2, 1 letters. Comparisons of texts written in Chinese characters and the same texts written in letter codes are made, and the similarity of the corresponding frequency distributions is interpreted as a consequence of the multiple meanings of Chinese characters. This further implies that the difference in shape between the word-frequency distribution of an English text written in letters and that of a Chinese text written in Chinese characters is due to the coding and not to the language per se.
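
    The coding step described above, representing each word by its first L letters, is straightforward to reproduce; the sketch below (an illustration on a toy text, not the authors' analysis) tabulates the resulting coded-word frequencies as L decreases.

    ```python
    # Illustrative sketch: word frequencies when words are coded by their first L letters.
    from collections import Counter
    import re

    text = "the cat sat on the mat and the dog sat on the log"   # toy text
    words = re.findall(r"[a-z]+", text.lower())

    for L in (3, 2, 1):
        coded = [w[:L] for w in words]               # keep only the first L letters
        freqs = sorted(Counter(coded).values(), reverse=True)
        print(L, freqs)  # fewer letters -> more meanings per coded word
    ```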

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael; Jonlin, Duane; Nadel, Steven

    Today’s building energy codes focus on prescriptive requirements for features of buildings that are directly controlled by the design and construction teams and verifiable by municipal inspectors. Although these code requirements have had a significant impact, they fail to influence a large slice of the building energy use pie – including not only miscellaneous plug loads, cooking equipment and commercial/industrial processes, but the maintenance and optimization of the code-mandated systems as well. Currently, code compliance is verified only through the end of construction, and there are no limits or consequences for the actual energy use in an occupied building. In the future, our suite of energy regulations will likely expand to include building efficiency, energy use or carbon emission budgets over their full life cycle. Intelligent building systems, extensive renewable energy, and a transition from fossil fuel to electric heating systems will likely be required to meet ultra-low-energy targets. This paper lays out the authors’ perspectives on how buildings may evolve over the course of the 21st century and the roles that codes and regulations will play in shaping those buildings of the future.

  12. Program MAMO: Models for avian management optimization-user guide

    USGS Publications Warehouse

    Guillaumet, Alban; Paxton, Eben H.

    2017-01-01

    The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).

  13. When Homoplasy Is Not Homoplasy: Dissecting Trait Evolution by Contrasting Composite and Reductive Coding.

    PubMed

    Torres-Montúfar, Alejandro; Borsch, Thomas; Ochoterena, Helga

    2018-05-01

    The conceptualization and coding of characters is a difficult issue in phylogenetic systematics, no matter which inference method is used when reconstructing phylogenetic trees or whether the characters are simply mapped onto a specific tree. Complex characters are groups of features that can be divided into simpler hierarchical characters (reductive coding), although the implied hierarchical relational information may change depending on the type of coding (composite vs. reductive). To date, there is no general agreement on whether to code characters as complex or simple. Phylogeneticists have discussed which coding method is best but have not incorporated the heuristic process of reciprocal illumination to evaluate the coding. Composite coding makes it possible to test whether 1) several characters were linked, resulting in a structure described as a complex character or trait, or 2) independently evolving characters resulted in a configuration incorrectly interpreted as a complex character. We propose that complex characters or character states should be decomposed iteratively into simpler characters when the original homology hypothesis is not corroborated by a phylogenetic analysis and the character or character state is retrieved as homoplastic. We tested this approach using the case of fruit types within the subfamily Cinchonoideae (Rubiaceae). The iterative reductive coding of characters associated with drupes allowed us to disentangle fruit evolution within Cinchonoideae. Our results show that drupes and berries are not homologous. As a consequence, a more precise ontology for the Cinchonoideae drupes is required.

  14. Communication Civility Codes: Positive Communication through the Students' Eyes

    ERIC Educational Resources Information Center

    Pawlowski, Donna R.

    2017-01-01

    Courses: Presentational courses such as Public Speaking, Interviewing, Business and Professional, Persuasion, Interpersonal; any course where civility may be promoted in the classroom. Objectives: At the end of this single-class activity, students will have an understanding of civility in order to: (1) identify civility and consequences of…

  15. Current evidence of coronary artery bypass grafting off-pump versus on-pump: a systematic review with meta-analysis of over 16,900 patients investigated in randomized controlled trials†.

    PubMed

    Deppe, Antje-Christin; Arbash, Wasim; Kuhn, Elmar W; Slottosch, Ingo; Scherner, Maximilian; Liakopoulos, Oliver J; Choi, Yeong-Hoon; Wahlers, Thorsten

    2016-04-01

    In the present systematic review with meta-analysis, we sought to determine the current strength of evidence for or against off-pump and on-pump coronary artery bypass grafting (CABG) with regard to hard clinical end-points, graft patency and cost-effectiveness. We performed a meta-analysis of only randomized controlled trials (RCT) which reported at least one of the desired end-points including: (i) major adverse cardiac and cerebrovascular events (MACCE), (ii) all-cause mortality, (iii) myocardial infarction, (iv) cerebrovascular accident, (v) repeat revascularization, (vi) graft patency and (vii) cost-effectiveness. The pooled treatment effects [odds ratio (OR) or weighted mean difference, 95% confidence intervals (95% CIs)] were assessed using a fixed or random effects model. A total of 16 904 patients from 51 studies were identified after literature search of the major databases using a predefined keyword list. The incidence of MACCE did not differ between the groups, neither during the first 30 days (OR: 0.93; 95% CI: 0.82-1.04) nor for the longest available follow-up (OR: 1.01; 95% CI: 0.92-1.12). While the incidence of mid-term graft failure (OR: 1.37; 95% CI: 1.09-1.72) and the need for repeat revascularization (OR: 1.55; 95% CI: 1.33-1.80) was increased after off-pump surgery, on-pump surgery was associated with an increased occurrence of stroke (OR: 0.74; 95% CI: 0.58-0.95), renal impairment (OR: 0.79; 95% CI: 0.71-0.89) and mediastinitis (OR: 0.44; 95% CI: 0.31-0.62). There was no difference with regard to hard clinical end-points between on- or off-pump surgery, including myocardial infarction or mortality. The present systematic review emphasizes that both off- and on-pump surgery provide excellent and comparable results in patients requiring surgical revascularization. The choice for either strategy should take into account the individual patient profile (comorbidities, life expectancy, etc.) and importantly, the surgeon's experience in performing on- or off-pump CABG in their routine practice. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
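
    For readers unfamiliar with how such pooled odds ratios are formed, a generic inverse-variance fixed-effect sketch is given below. The event counts are invented, and the textbook method shown is not necessarily the exact model configuration used in the review.

    ```python
    # Generic inverse-variance fixed-effect pooling of odds ratios on the log scale.
    # Event counts are invented for illustration only.
    import math

    # (events_off, total_off, events_on, total_on) for three hypothetical trials
    trials = [(12, 500, 18, 510), (30, 1200, 35, 1190), (7, 300, 9, 295)]

    log_ors, weights = [], []
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c
        log_ors.append(math.log((a * d) / (b * c)))
        weights.append(1.0 / (1 / a + 1 / b + 1 / c + 1 / d))   # inverse Woolf variance

    pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    print("pooled OR %.2f (95%% CI %.2f-%.2f)"
          % (math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)))
    ```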

  16. Using airborne HIAPER Pole-to-Pole Observations (HIPPO) to evaluate model and remote sensing estimates of atmospheric carbon dioxide

    NASA Astrophysics Data System (ADS)

    Frankenberg, Christian; Kulawik, Susan S.; Wofsy, Steven C.; Chevallier, Frédéric; Daube, Bruce; Kort, Eric A.; O'Dell, Christopher; Olsen, Edward T.; Osterman, Gregory

    2016-06-01

    In recent years, space-borne observations of atmospheric carbon dioxide (CO2) have been increasingly used in global carbon-cycle studies. In order to obtain added value from space-borne measurements, they have to satisfy stringent accuracy and precision requirements, with the latter being less crucial as random error can be reduced simply by increasing the sample size. Validation of CO2 column-averaged dry air mole fractions (XCO2) relies heavily on measurements of the Total Carbon Column Observing Network (TCCON). Owing to the sparseness of the network and the requirements imposed on space-based measurements, independent additional validation is highly valuable. Here, we use observations from the High-Performance Instrumented Airborne Platform for Environmental Research (HIAPER) Pole-to-Pole Observations (HIPPO) flights from 01/2009 through 09/2011 to validate CO2 measurements from satellites (Greenhouse Gases Observing Satellite - GOSAT, Tropospheric Emission Spectrometer - TES, Atmospheric Infrared Sounder - AIRS) and atmospheric inversion models (CarbonTracker CT2013B, Monitoring Atmospheric Composition and Climate (MACC) v13r1). We find that the atmospheric models capture the XCO2 variability observed in the HIPPO flights very well, with correlation coefficients (r2) of 0.93 and 0.95 for CT2013B and MACC, respectively. Some larger discrepancies can be observed in profile comparisons at higher latitudes, in particular at 300 hPa during the peaks of either carbon uptake or release. These deviations can be up to 4 ppm and hint at a misrepresentation of vertical transport. Comparisons with the GOSAT satellite are of comparable quality, with an r2 of 0.85, a mean bias μ of -0.06 ppm, and a standard deviation σ of 0.45 ppm. TES exhibits an r2 of 0.75, μ of 0.34 ppm, and σ of 1.13 ppm. For AIRS, we find an r2 of 0.37, μ of 1.11 ppm, and σ of 1.46 ppm, with latitude-dependent biases. For these comparisons at least 6, 20, and 50 atmospheric soundings have been averaged for GOSAT, TES, and AIRS, respectively. Overall, we find that GOSAT soundings over the remote Pacific Ocean mostly meet the stringent accuracy requirement of about 0.5 ppm for space-based CO2 observations.

  17. A revised global ozone dry deposition estimate based on a new two-layer parameterisation for air-sea exchange and the multi-year MACC composition reanalysis

    NASA Astrophysics Data System (ADS)

    Luhar, Ashok K.; Woodhouse, Matthew T.; Galbally, Ian E.

    2018-03-01

    Dry deposition at the Earth's surface is an important sink of atmospheric ozone. Currently, dry deposition of ozone to the ocean surface in atmospheric chemistry models has the largest uncertainty compared to deposition to other surface types, with implications for global tropospheric ozone budget and associated radiative forcing. Most global models assume that the dominant term of surface resistance in the parameterisation of ozone dry deposition velocity at the oceanic surface is constant. There have been recent mechanistic parameterisations for air-sea exchange that account for the simultaneous waterside processes of ozone solubility, molecular diffusion, turbulent transfer, and first-order chemical reaction of ozone with dissolved iodide and other compounds, but there are questions about their performance and consistency. We present a new two-layer parameterisation scheme for the oceanic surface resistance by making the following realistic assumptions: (a) the thickness of the top water layer is of the order of a reaction-diffusion length scale (a few micrometres) within which ozone loss is dominated by chemical reaction and the influence of waterside turbulent transfer is negligible; (b) in the water layer below, both chemical reaction and waterside turbulent transfer act together and are accounted for; and (c) chemical reactivity is present through the depth of the oceanic mixing layer. The new parameterisation has been evaluated against dry deposition velocities from recent open-ocean measurements. It is found that the inclusion of only the aqueous iodide-ozone reaction satisfactorily describes the measurements. In order to better quantify the global dry deposition loss and its interannual variability, modelled 3-hourly ozone deposition velocities are combined with the 3-hourly MACC (Monitoring Atmospheric Composition and Climate) reanalysis ozone for the years 2003-2012. The resulting ozone dry deposition is found to be 98.4 ± 30.0 Tg O3 yr-1 for the ocean and 722.8 ± 87.3 Tg O3 yr-1 globally. The new estimate of the ocean component is approximately a third of the current model estimates. This reduction corresponds to an approximately 20 % decrease in the total global ozone dry deposition, which (with all other components being unchanged) is equivalent to an increase of approximately 5 % in the modelled tropospheric ozone burden and a similar increase in tropospheric ozone lifetime.
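
    To give a feel for the quantities involved, the sketch below evaluates the reaction-diffusion length scale and the chemically dominated (single-layer) limit of the waterside transfer velocity for ozone reacting with dissolved iodide. It is a simplified, order-of-magnitude illustration with assumed constants, not the two-layer scheme of the paper.

    ```python
    # Order-of-magnitude sketch (not the paper's two-layer scheme): the
    # reaction-diffusion length and the reactive waterside transfer velocity
    # for ozone destruction by dissolved iodide. All constants are approximate.
    import math

    D_O3   = 1.8e-9   # m^2/s, molecular diffusivity of O3 in water (approximate)
    k_I    = 2.0e9    # 1/(M s), second-order O3 + iodide rate constant (approximate)
    iodide = 1.0e-7   # M, assumed surface-ocean iodide concentration
    alpha  = 0.25     # dimensionless O3 solubility (approximate)

    k1  = k_I * iodide                  # pseudo-first-order loss rate, 1/s
    lam = math.sqrt(D_O3 / k1)          # reaction-diffusion length, m (a few micrometres)
    v_w = alpha * math.sqrt(k1 * D_O3)  # chemically dominated waterside velocity, m/s

    print("lambda = %.1f um, waterside velocity = %.3f cm/s" % (lam * 1e6, v_w * 100))
    ```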

  18. Anaortic off-pump versus clampless off-pump using the PAS-Port device versus conventional coronary artery bypass grafting: mid-term results from a matched propensity score analysis of 5422 unselected patients.

    PubMed

    Furukawa, Nobuyuki; Kuss, Oliver; Preindl, Konstantin; Renner, André; Aboud, Anas; Hakim-Meibodi, Kavous; Benzinger, Michael; Pühler, Thomas; Ensminger, Stephan; Fujita, Buntaro; Becker, Tobias; Gummert, Jan F; Börgermann, Jochen

    2017-10-01

    Meta-analyses from observational and randomized studies have demonstrated benefits of off-pump surgery for hard and surrogate endpoints. In some of them, increased re-revascularization was noted in the off-pump groups, which could impact their long-term survival. Therefore, we analyzed the course of all patients undergoing isolated coronary surgery regarding the major cardiac and cerebrovascular event (MACCE) criteria. A prospective register was taken from a high-volume off-pump center recording all anaortic off-pump (ANA), clampless off-pump (PAS-Port) and conventional (CONV) coronary artery bypass operations between July 2009 and June 2015. Propensity Score Matching was performed based on 28 preoperative risk variables. We identified 935 triplets (N = 2805). Compared with CONV, in-hospital mortality of both the ANA group (OR for ANA [95% CI] 0.25 [0.06; 0.83], P = 0.021), and the PAS-Port group was lower (OR for PAS-Port [95% CI] 0.50 [0.17; 1.32], P = 0.17). In the mid-term follow-up there were no significant differences between the groups regarding mortality (HR for ANA [95%-CI] 0.83 [0.55-1.26], P = 0.38; HR for PAS-Port [95%-CI] 1.06 [0.70-1.59], P = 0.79), incidence of stroke (HR for ANA 0.81 [0.43-1.53], P = 0.52; HR for PAS-Port 0.78 [0.41-1.50], P = 0.46), myocardial infarction (HR for ANA 0.53 [0.22-1.31], P = 0.17; HR for PAS-Port 0.78 [0.37-1.66], P = 0.52) or re-revascularization rate (HR for ANA 0.99 [0.67-1.44], P = 0.94; HR for PAS-Port 0.95 [0.65-1.38], P = 0.77). Both off-pump clampless techniques were associated with lower in-hospital mortality compared with conventional CABG. The mid-term course showed no difference with regard to the MACCE criteria between anaortic off-pump, clampless off-pump using PAS-Port and conventional CABG. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  19. Human-model hybrid Korean air quality forecasting system.

    PubMed

    Chang, Lim-Seok; Cho, Ara; Park, Hyunju; Nam, Kipyo; Kim, Deokrae; Hong, Ji-Hyoung; Song, Chang-Keun

    2016-09-01

    The Korean national air quality forecasting system, consisting of the Weather Research and Forecasting model, the Sparse Matrix Operator Kernel Emissions model, and the Community Multiscale Air Quality (CMAQ) model, commenced on August 31, 2013, with particulate matter (PM) and ozone as target pollutants. Factors contributing to PM forecasting accuracy include the CMAQ inputs of meteorological fields and emissions, the forecasters' capability, and inherent CMAQ limits. Four numerical experiments were conducted, covering two global meteorological inputs from the Global Forecast System (GFS) and the Unified Model (UM), two emission inventories from the Model Intercomparison Study Asia (MICS-Asia) and the Intercontinental Chemical Transport Experiment (INTEX-B) for Northeast Asia combined with the Clean Air Policy Support System (CAPSS) for South Korea, and data assimilation of the Monitoring Atmospheric Composition and Climate (MACC) reanalysis. Significant PM underpredictions were found with both emission inventories for PM mass and major components (sulfate and organic carbon). CMAQ predicts PM2.5 much better than PM10 (NMB of PM2.5: -20~-25%, PM10: -43~-47%). Forecaster errors usually occurred on the day after a high-PM event: once CMAQ fails to predict a high-PM event, forecasters are likely to dismiss the model predictions on the next day, even when those predictions turn out to be correct. The best combination of CMAQ inputs is the set of UM global meteorological fields with MICS-Asia and CAPSS 2010 emissions, with an NMB of -12.3%, an RMSE of 16.6 μg/m3 and an R2 of 0.68. By using MACC data as initial and boundary conditions, the performance of CMAQ would be improved, especially in the case of undefined coarse emissions. A variety of methods, such as ensembles and data assimilation, are being considered to further improve the accuracy of air quality forecasting, especially for high-PM events, so that it becomes comparable to that for all cases. The growing use of the air quality forecast has led the public to demand strongly that the accuracy of the national forecast be improved. In this study, we investigated the problems in the current forecasting system as well as various alternatives to solve them. Such efforts to improve the accuracy of the forecast are expected to contribute to the protection of public health by increasing the availability of the forecast system.
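
    The skill scores quoted above (NMB, RMSE, R2) are standard; for reference, a small sketch of how they are computed from paired model and observation series follows, using made-up numbers.

    ```python
    # Illustrative computation of NMB, RMSE and R^2 for paired model/observation data.
    import numpy as np

    obs   = np.array([35.0, 50.0, 28.0, 80.0, 61.0])  # made-up PM2.5 observations, ug/m3
    model = np.array([30.0, 41.0, 25.0, 66.0, 52.0])  # made-up model predictions, ug/m3

    nmb  = 100.0 * (model - obs).sum() / obs.sum()    # normalized mean bias, %
    rmse = np.sqrt(np.mean((model - obs) ** 2))       # root-mean-square error, ug/m3
    r2   = np.corrcoef(model, obs)[0, 1] ** 2         # squared correlation coefficient

    print("NMB = %.1f%%, RMSE = %.1f ug/m3, R2 = %.2f" % (nmb, rmse, r2))
    ```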

  20. The evolution of transcriptional regulation in eukaryotes

    NASA Technical Reports Server (NTRS)

    Wray, Gregory A.; Hahn, Matthew W.; Abouheif, Ehab; Balhoff, James P.; Pizer, Margaret; Rockman, Matthew V.; Romano, Laura A.

    2003-01-01

    Gene expression is central to the genotype-phenotype relationship in all organisms, and it is an important component of the genetic basis for evolutionary change in diverse aspects of phenotype. However, the evolution of transcriptional regulation remains understudied and poorly understood. Here we review the evolutionary dynamics of promoter, or cis-regulatory, sequences and the evolutionary mechanisms that shape them. Existing evidence indicates that populations harbor extensive genetic variation in promoter sequences, that a substantial fraction of this variation has consequences for both biochemical and organismal phenotype, and that some of this functional variation is sorted by selection. As with protein-coding sequences, rates and patterns of promoter sequence evolution differ considerably among loci and among clades for reasons that are not well understood. Studying the evolution of transcriptional regulation poses empirical and conceptual challenges beyond those typically encountered in analyses of coding sequence evolution: promoter organization is much less regular than that of coding sequences, and sequences required for the transcription of each locus reside at multiple other loci in the genome. Because of the strong context-dependence of transcriptional regulation, sequence inspection alone provides limited information about promoter function. Understanding the functional consequences of sequence differences among promoters generally requires biochemical and in vivo functional assays. Despite these challenges, important insights have already been gained into the evolution of transcriptional regulation, and the pace of discovery is accelerating.

  1. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough users groups adopt a set of common code and tools, defacto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  2. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough users groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA--both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  3. Critical evaluation of reverse engineering tool Imagix 4D!

    PubMed

    Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay

    2016-01-01

    Legacy code is difficult to comprehend. Various commercial reengineering tools are available, each with its own working style, inherent capabilities and shortcomings. The focus of the available tools is on visualizing static behavior, not dynamic behavior. This makes the work of people involved in software product maintenance, code understanding and reengineering/reverse engineering difficult. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful, as it generates the most pictorial representations in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and one large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. Imagix 4D's decision-density and complexity metrics for a large code were found useful for judging how much reengineering is required. On the other hand, Imagix 4D showed limitations in dynamic visualization, flow chart separation for large code, and parsing of loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D, and it highlights the need for fully featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those interested in the realm of software reengineering tool building.

  4. Recurrent and functional regulatory mutations in breast cancer.

    PubMed

    Rheinbay, Esther; Parasuraman, Prasanna; Grimsby, Jonna; Tiao, Grace; Engreitz, Jesse M; Kim, Jaegil; Lawrence, Michael S; Taylor-Weiner, Amaro; Rodriguez-Cuevas, Sergio; Rosenberg, Mara; Hess, Julian; Stewart, Chip; Maruvka, Yosef E; Stojanov, Petar; Cortes, Maria L; Seepo, Sara; Cibulskis, Carrie; Tracy, Adam; Pugh, Trevor J; Lee, Jesse; Zheng, Zongli; Ellisen, Leif W; Iafrate, A John; Boehm, Jesse S; Gabriel, Stacey B; Meyerson, Matthew; Golub, Todd R; Baselga, Jose; Hidalgo-Miranda, Alfredo; Shioda, Toshi; Bernards, Andre; Lander, Eric S; Getz, Gad

    2017-07-06

    Genomic analysis of tumours has led to the identification of hundreds of cancer genes on the basis of the presence of mutations in protein-coding regions. By contrast, much less is known about cancer-causing mutations in non-coding regions. Here we perform deep sequencing in 360 primary breast cancers and develop computational methods to identify significantly mutated promoters. Clear signals are found in the promoters of three genes. FOXA1, a known driver of hormone-receptor positive breast cancer, harbours a mutational hotspot in its promoter leading to overexpression through increased E2F binding. RMRP and NEAT1, two non-coding RNA genes, carry mutations that affect protein binding to their promoters and alter expression levels. Our study shows that promoter regions harbour recurrent mutations in cancer with functional consequences and that the mutations occur at similar frequencies as in coding regions. Power analyses indicate that more such regions remain to be discovered through deep sequencing of adequately sized cohorts of patients.

  5. A simplified procedure for correcting both errors and erasures of a Reed-Solomon code using the Euclidean algorithm

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Hsu, I. S.; Eastman, W. L.; Reed, I. S.

    1987-01-01

    It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial and the error evaluator polynomial in Berlekamp's key equation needed to decode a Reed-Solomon (RS) code. A simplified procedure is developed and proved to correct erasures as well as errors by replacing the initial condition of the Euclidean algorithm by the erasure locator polynomial and the Forney syndrome polynomial. By this means, the errata locator polynomial and the errata evaluator polynomial can be obtained, simultaneously and simply, by the Euclidean algorithm only. With this improved technique the complexity of time domain RS decoders for correcting both errors and erasures is reduced substantially from previous approaches. As a consequence, decoders for correcting both errors and erasures of RS codes can be made more modular, regular, simple, and naturally suitable for both VLSI and software implementation. An example illustrating this modified decoding procedure is given for a (15, 9) RS code.
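
    As a rough illustration of the core step, and not a full Reed-Solomon decoder, the sketch below runs the polynomial Euclidean algorithm while tracking one cofactor and stops once the remainder degree falls below a threshold; this degree-limited iteration is how the errata locator and errata evaluator polynomials are extracted from the key equation. For readability it works over a prime field GF(P) rather than the GF(2^m) field of an actual RS code, and the demonstration polynomials are arbitrary.

    ```python
    # Hedged sketch: degree-limited polynomial Euclidean algorithm over GF(P).
    # Polynomials are coefficient lists, lowest order first.
    P = 929  # hypothetical prime modulus standing in for a Galois field

    def deg(a):
        return max((i for i, c in enumerate(a) if c % P), default=-1)

    def trim(a):
        return a[:deg(a) + 1] or [0]

    def add(a, b):
        n = max(len(a), len(b))
        return trim([((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % P
                     for i in range(n)])

    def mul(a, b):
        out = [0] * (len(a) + len(b) - 1)
        for i, x in enumerate(a):
            for j, y in enumerate(b):
                out[i + j] = (out[i + j] + x * y) % P
        return trim(out)

    def divmod_poly(a, b):
        q, r = [0], trim(list(a))
        inv = pow(b[deg(b)], P - 2, P)           # inverse of the leading coefficient
        while deg(r) >= deg(b):
            shift, coef = deg(r) - deg(b), (r[deg(r)] * inv) % P
            term = [0] * shift + [coef]
            q = add(q, term)
            r = add(r, [(-c) % P for c in mul(term, b)])   # r -= term * b
        return q, r

    def euclid_stop(a, b, t):
        """Iterate the Euclidean algorithm on (a, b), tracking the cofactor v
        with v*b = r (mod a), and stop when deg(r) < t. In RS decoding the
        final (v, r) pair plays the role of (errata locator, errata evaluator)."""
        r0, r1, v0, v1 = trim(list(a)), trim(list(b)), [0], [1]
        while deg(r1) >= t:
            q, r2 = divmod_poly(r0, r1)
            v2 = add(v0, [(-c) % P for c in mul(q, v1)])   # v0 - q * v1
            r0, r1, v0, v1 = r1, r2, v1, v2
        return v1, r1

    # Tiny demonstration with arbitrary polynomials (not real RS syndromes).
    print(euclid_stop([0, 0, 0, 0, 1], [5, 7, 3, 1], 2))
    ```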

  6. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  7. A modified carrier-to-code leveling method for retrieving ionospheric observables and detecting short-term temporal variability of receiver differential code biases

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min

    2018-03-01

    Sensing the ionosphere with the global positioning system involves two sequential tasks, namely the ionospheric observable retrieval and the ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL), a method commonly used to accomplish the first task, by assuming the rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among others, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of the ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by the rDCB of a single receiver.
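
    For context, ordinary carrier-to-code leveling aligns the precise but ambiguous geometry-free carrier phase with the noisy but unambiguous geometry-free code over a continuous arc, using a single arc-averaged offset that also absorbs the differential code biases. The sketch below illustrates that standard step with synthetic numbers and simplified sign conventions; it is not the modified estimator proposed in the paper.

    ```python
    # Illustrative sketch of standard carrier-to-code leveling over one arc.
    # Synthetic data; sign conventions and DCB handling are simplified.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 120                                               # epochs in one continuous arc
    iono = 3.0 + 0.5 * np.sin(np.linspace(0, np.pi, n))   # "true" geometry-free ionosphere, m

    code_gf    = iono + rng.normal(0.0, 0.30, n)          # noisy but unambiguous
    carrier_gf = iono - 7.42 + rng.normal(0.0, 0.003, n)  # precise, offset by ambiguities

    offset  = np.mean(code_gf - carrier_gf)   # one constant per arc (absorbs DCBs as well)
    leveled = carrier_gf + offset             # smooth, leveled ionospheric observable

    print("arc offset = %.3f m, residual std = %.4f m" % (offset, np.std(leveled - iono)))
    ```

    The MCCL modification described in the abstract relaxes exactly the assumption made here, namely that the offset (and hence the receiver DCB it absorbs) stays constant over the arc.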

  8. Studies of Planet Formation Using a Hybrid N-Body + Planetesimal Code

    NASA Technical Reports Server (NTRS)

    Kenyon, Scott J.

    2004-01-01

    The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: (1) icy planets: models for icy planet formation will demonstrate how the physical properties of debris disks - including the Kuiper Belt in our solar system - depend on initial conditions and input physics; and (2) terrestrial planets: calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment.

  9. Cell cycle, oncogenic and tumor suppressor pathways regulate numerous long and macro non-protein-coding RNAs

    PubMed Central

    2014-01-01

    Background The genome is pervasively transcribed but most transcripts do not code for proteins, constituting non-protein-coding RNAs. Despite increasing numbers of functional reports of individual long non-coding RNAs (lncRNAs), assessing the extent of functionality among the non-coding transcriptional output of mammalian cells remains intricate. In the protein-coding world, transcripts differentially expressed in the context of processes essential for the survival of multicellular organisms have been instrumental in the discovery of functionally relevant proteins and their deregulation is frequently associated with diseases. We therefore systematically identified lncRNAs expressed differentially in response to oncologically relevant processes and cell-cycle, p53 and STAT3 pathways, using tiling arrays. Results We found that up to 80% of the pathway-triggered transcriptional responses are non-coding. Among these we identified very large macroRNAs with pathway-specific expression patterns and demonstrated that these are likely continuous transcripts. MacroRNAs contain elements conserved in mammals and sauropsids, which in part exhibit conserved RNA secondary structure. Comparing evolutionary rates of a macroRNA to adjacent protein-coding genes suggests a local action of the transcript. Finally, in different grades of astrocytoma, a tumor disease unrelated to the initially used cell lines, macroRNAs are differentially expressed. Conclusions It has been shown previously that the majority of expressed non-ribosomal transcripts are non-coding. We now conclude that differential expression triggered by signaling pathways gives rise to a similar abundance of non-coding content. It is thus unlikely that the prevalence of non-coding transcripts in the cell is a trivial consequence of leaky or random transcription events. PMID:24594072

  10. Work-family balance by women GP specialist trainees in Slovenia: a qualitative study.

    PubMed

    Petek, Davorina; Gajsek, Tadeja; Petek Ster, Marija

    2016-01-28

    Women physicians face many challenges while balancing their many roles: doctor, specialist trainee, mother and partner. The most opportune biological time for a woman to start a family coincides with a great number of demands and requirements at work. In this study we explored the options and capabilities of women GP specialist trainees in coordinating their family and career. This is a phenomenological qualitative study. Ten GP specialist trainees from urban and rural areas were chosen by purposive sampling, and semi-structured in-depth interviews were conducted, recorded, transcribed and analysed using a thematic analysis process. Open coding was performed and a book of codes was formed. Finally, we performed code reduction by identifying themes, which were compared, interpreted and organised into the highest analytical units, the categories. One hundred and fifty-five codes were identified in the analysis and grouped into eleven themes. The identified themes are: types, causes and consequences of burdens; work as pleasure and positive attitude toward self; priorities; planning and help; understanding of superiors; disburdening; and changes during specialisation. The themes were grouped into four large categories: burdens, empowerment, coordination and needs for improvement. Women specialist trainees encounter intense burdens at work and at home due to the numerous demands and requirements of their specialisation training; in addition, there is the work-family conflict. Burden and strain have many consequences, of which burnout stands out the most. In contrast, reconciliation of work and family life and needs can be successful, and the key element is the empowerment of women doctors. The foremost necessary systemic solution is the reinforcement of general practitioners in primary health care and their understanding of the specialisation training scheme, with more flexible possibilities for time adaptations of specialist training.

  11. Effect of magnetic island geometry on ECRH/ECCD and consequences to the NTM stabilization dynamics

    NASA Astrophysics Data System (ADS)

    Chatziantonaki, I.; Tsironis, C.; Isliker, H.; Vlahos, L.

    2012-09-01

    In the majority of codes that model ECCD-based NTM stabilization, the analysis of the EC propagation and absorption is performed in terms of the axisymmetric magnetic field, ignoring effects due to the island topology. In this paper, we analyze the wave propagation, absorption and current drive in the presence of NTMs, as well as the ECCD-driven island growth, focusing on the effect of the island geometry on the wave deposition. A primary evaluation of the consequences of these effects on the NTM evolution is also made in terms of the modified Rutherford equation.
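
    As a point of reference, the island evolution tracked in such studies is usually written as a modified Rutherford equation. The schematic grouping below is a generic illustration only; the abstract does not give the coefficients or the exact ECCD efficiency term used by the authors, so those details are deliberately left unspecified.

    ```latex
    % Schematic modified Rutherford equation for the island width w
    % (illustrative grouping of terms; coefficients and the precise ECCD
    % efficiency term are assumptions, not taken from the paper).
    \frac{\tau_R}{r_s}\,\frac{\mathrm{d}w}{\mathrm{d}t}
      \;=\; r_s\,\Delta'(w)
      \;+\; r_s\,\Delta'_{\mathrm{bs}}(w)
      \;-\; r_s\,\Delta'_{\mathrm{ECCD}}(w)
    ```

    Here \tau_R is the resistive diffusion time, r_s the minor radius of the rational surface, \Delta' the classical tearing stability index, \Delta'_{bs} the destabilizing bootstrap-current drive and \Delta'_{ECCD} the stabilizing contribution of the driven current, which is where island-geometry effects on the deposition enter.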

  12. Understanding the detector behavior through Montecarlo and calibration studies in view of the SOX measurement

    NASA Astrophysics Data System (ADS)

    Caminata, A.; Agostini, M.; Altenmüller, K.; Appel, S.; Bellini, G.; Benziger, J.; Berton, N.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Empl, A.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, T.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jedrzejczak, K.; Kaiser, M.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiere, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2016-02-01

    Borexino is an unsegmented neutrino detector operating at LNGS in central Italy. The experiment has demonstrated its performance through unprecedented accomplishments in solar and geoneutrino detection. This performance makes it an ideal tool for a state-of-the-art experiment able to test the existence of sterile neutrinos (the SOX experiment). For both the solar and the SOX analyses, a good understanding of the detector response is fundamental. Consequently, calibration campaigns with radioactive sources have been performed over the years. The calibration data are of extreme importance for developing an accurate Monte Carlo code, which is used in all the neutrino analyses. The Borexino-SOX calibration techniques and program, and the advances in the detector simulation code in view of the start of the SOX data taking, are presented.

  13. The Maximal C3 Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses

    PubMed Central

    Michel, Christian J.

    2017-01-01

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition at the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated to each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes. PMID:28420220
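
    The frame-preference property that underlies the circular code X can be illustrated with a few lines of code: count how often a trinucleotide occurs in the reading frame of a coding sequence versus the two shifted frames. The sketch below is only an illustration of that counting idea on a made-up sequence; it is not the statistical procedure of the paper.

    ```python
    from collections import Counter

    def frame_trinucleotide_counts(cds):
        """Count trinucleotide occurrences in the three frames of a coding sequence.

        Frame 0 is the reading frame; frames 1 and 2 are the shifted frames.
        """
        seq = cds.upper().replace("U", "T")
        counts = []
        for frame in range(3):
            codons = (seq[i:i + 3] for i in range(frame, len(seq) - 2, 3))
            counts.append(Counter(c for c in codons if set(c) <= set("ACGT")))
        return counts

    # Toy example: does GCC occur preferentially in the reading frame?
    c0, c1, c2 = frame_trinucleotide_counts("ATGGCCGCCAAATTTGCCGGCTAA")
    print(c0["GCC"], c1["GCC"], c2["GCC"])   # -> 3 0 0
    ```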

  14. Deforestation and Carbon Loss in Southwest Amazonia: Impact of Brazil's Revised Forest Code

    NASA Astrophysics Data System (ADS)

    Roriz, Pedro Augusto Costa; Yanai, Aurora Miho; Fearnside, Philip Martin

    2017-09-01

    In 2012 Brazil's National Congress altered the country's Forest Code, decreasing various environmental protections in the set of regulations governing forests. This suggests consequences in increased deforestation and emissions of greenhouse gases and in decreased protection of fragile ecosystems. To ascertain the effects, a simulation was run to the year 2025 for the municipality (county) of Boca do Acre, Amazonas state, Brazil. A baseline scenario considered historical behavior (which did not respect the Forest Code), while two scenarios considered full compliance with the old Forest Code (Law 4771/1965) and the current Code (Law 12,651/2012) regarding the protection of "areas of permanent preservation" (APPs) along the edges of watercourses. The models were parameterized from satellite imagery and simulated using Dinamica-EGO software. Deforestation actors and processes in the municipality were observed in loco in 2012. Carbon emissions and loss of forest by 2025 were computed in the three simulation scenarios. There was a 10% difference in the loss of carbon stock and of forest between the scenarios with the two versions of the Forest Code. The baseline scenario showed the highest loss of carbon stocks and the highest increase in annual emissions. The greatest damage was caused by not protecting wetlands and riparian zones.

  15. Validation of a multi-layer Green's function code for ion beam transport

    NASA Astrophysics Data System (ADS)

    Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence

    To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.

  16. Deforestation and Carbon Loss in Southwest Amazonia: Impact of Brazil's Revised Forest Code.

    PubMed

    Roriz, Pedro Augusto Costa; Yanai, Aurora Miho; Fearnside, Philip Martin

    2017-09-01

    In 2012 Brazil's National Congress altered the country's Forest Code, decreasing various environmental protections in the set of regulations governing forests. This suggests consequences in increased deforestation and emissions of greenhouse gases and in decreased protection of fragile ecosystems. To ascertain the effects, a simulation was run to the year 2025 for the municipality (county) of Boca do Acre, Amazonas state, Brazil. A baseline scenario considered historical behavior (which did not respect the Forest Code), while two scenarios considered full compliance with the old Forest Code (Law 4771/1965) and the current Code (Law 12,651/2012) regarding the protection of "areas of permanent preservation" (APPs) along the edges of watercourses. The models were parameterized from satellite imagery and simulated using Dinamica-EGO software. Deforestation actors and processes in the municipality were observed in loco in 2012. Carbon emissions and loss of forest by 2025 were computed in the three simulation scenarios. There was a 10% difference in the loss of carbon stock and of forest between the scenarios with the two versions of the Forest Code. The baseline scenario showed the highest loss of carbon stocks and the highest increase in annual emissions. The greatest damage was caused by not protecting wetlands and riparian zones.

  17. Edge-diffraction effects in RCS predictions and their importance in systems analysis

    NASA Astrophysics Data System (ADS)

    Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker

    1996-06-01

    In developing RCS prediction codes, a variety of physical effects such as edge diffraction have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a system viewpoint and to rank them according to their magnitude. This paper evaluates the importance for systems analysis of RCS predictions containing an edge-diffracted field. A double dihedral with strongly depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data of the scattered field are generated by the RCS computer code SIGMA with and without edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability, and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.

  18. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, Charles W.; Bartel, Timothy James

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM)-formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  19. Changes in mitochondrial genetic codes as phylogenetic characters: Two examples from the flatworms

    PubMed Central

    Telford, Maximilian J.; Herniou, Elisabeth A.; Russell, Robert B.; Littlewood, D. Timothy J.

    2000-01-01

    Shared molecular genetic characteristics other than DNA and protein sequences can provide excellent sources of phylogenetic information, particularly if they are complex and rare and are consequently unlikely to have arisen by chance convergence. We have used two such characters, arising from changes in mitochondrial genetic code, to define a clade within the Platyhelminthes (flatworms), the Rhabditophora. We have sampled 10 distinct classes within the Rhabditophora and find that all have the codon AAA coding for the amino acid Asn rather than the usual Lys and AUA for Ile rather than the usual Met. We find no evidence to support claims that the codon UAA codes for Tyr in the Platyhelminthes rather than the standard stop codon. The Rhabditophora are a very diverse group comprising the majority of the free-living turbellarian taxa and the parasitic Neodermata. In contrast, three other classes of turbellarian flatworm, the Acoela, Nemertodermatida, and Catenulida, have the standard invertebrate assignments for these codons and so are convincingly excluded from the rhabditophoran clade. We have developed a rapid computerized method for analyzing genetic codes and demonstrate the wide phylogenetic distribution of the standard invertebrate code as well as confirming already known metazoan deviations from it (ascidian, vertebrate, echinoderm/hemichordate). PMID:11027335
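
    The kind of automated genetic-code analysis mentioned above boils down to tallying which amino acid is observed opposite each codon when a coding sequence is compared with aligned protein sequences. The sketch below only illustrates that tallying idea on a toy, pre-aligned pair of sequences; the function name and data are hypothetical and the paper's actual pipeline is not reproduced.

    ```python
    from collections import Counter, defaultdict

    def codon_versus_residue(cds, protein):
        """Tally which amino acid appears opposite each codon.

        Assumes cds and protein are already aligned with no indels,
        i.e. len(cds) == 3 * len(protein).
        """
        observed = defaultdict(Counter)
        for i, aa in enumerate(protein):
            codon = cds[3 * i:3 * i + 3].upper()
            observed[codon][aa.upper()] += 1
        return observed

    # Toy example (hypothetical data): AAA paired twice with Asn (N) rather
    # than the standard Lys (K) would hint at the AAA -> Asn reassignment.
    obs = codon_versus_residue("AAAATTAAA", "NIN")
    print(obs["AAA"].most_common())   # -> [('N', 2)]
    ```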

  20. STRAPS v1.0: evaluating a methodology for predicting electron impact ionisation mass spectra for the aerosol mass spectrometer

    NASA Astrophysics Data System (ADS)

    Topping, David O.; Allan, James; Rami Alfarra, M.; Aumont, Bernard

    2017-06-01

    Our ability to model the chemical and thermodynamic processes that lead to secondary organic aerosol (SOA) formation is thought to be hampered by the complexity of the system. While there are fundamental models now available that can simulate the tens of thousands of reactions thought to take place, validation against experiments is highly challenging. Techniques capable of identifying individual molecules such as chromatography are generally only capable of quantifying a subset of the material present, making them unsuitable for a carbon budget analysis. Integrative analytical methods such as the Aerosol Mass Spectrometer (AMS) are capable of quantifying all mass, but because of their inability to isolate individual molecules, comparisons have been limited to simple data products such as total organic mass and the O : C ratio. More detailed comparisons could be made if more of the mass spectral information could be used, but because a discrete inversion of AMS data is not possible, this activity requires a system of predicting mass spectra based on molecular composition. In this proof-of-concept study, the ability to train supervised methods to predict electron impact ionisation (EI) mass spectra for the AMS is evaluated. Supervised Training Regression for the Arbitrary Prediction of Spectra (STRAPS) is not built from first principles. A methodology is constructed whereby the presence of specific mass-to-charge ratio (m/z) channels is fitted as a function of molecular structure before the relative peak height for each channel is similarly fitted using a range of regression methods. The widely used AMS mass spectral database is used as a basis for this, using unit mass resolution spectra of laboratory standards. Key to the fitting process is the choice of structural information, or molecular fingerprint. Our approach relies on using supervised methods to automatically optimise the relationship between spectral characteristics and these molecular fingerprints. Therefore, any internal mechanisms or instrument features impacting on fragmentation are implicitly accounted for in the fitted model. Whilst one might expect a collection of keys specifically designed according to EI fragmentation principles to offer a robust basis, the suitability of a range of commonly available fingerprints is evaluated. Using available fingerprints in isolation, initial results suggest the generic public MACCS fingerprints provide the most accurate trained model when combined with both decision trees and random forests, with median cosine angles of 0.94-0.97 between modelled and measured spectra. There is some sensitivity to choice of fingerprint, but most sensitivity is in choice of regression technique. Support vector machines perform the worst, with median values of 0.78-0.85 and lower ranges approaching 0.4, depending on the fingerprint used. More detailed analysis of modelled versus measured mass spectra demonstrates important composition-dependent sensitivities on a compound-by-compound basis. This is further demonstrated when we apply the trained methods to a model α-pinene SOA system, using output from the GECKO-A model. This shows that use of a generic fingerprint referred to as FP4 and one designed for vapour pressure predictions (Nanoolal) gives plausible mass spectra, whilst the use of the MACCS keys in isolation performs poorly in this application, demonstrating the need for evaluating model performance against other SOA systems rather than existing laboratory databases on single compounds.
Given the limited number of compounds used within the AMS training dataset, it is difficult to prescribe which combination of approaches would lead to a robust generic model across all expected compositions. Nonetheless, the study demonstrates the use of a methodology that would be improved with more training data, fingerprints designed explicitly for fragmentation mechanisms occurring within the AMS, and data from additional mixed systems for further validation. To facilitate further development of the method, including application to other instruments, the model code for re-training is provided via a public GitHub and Zenodo software repository.
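
    As a rough illustration of the fingerprint-to-spectrum idea described above, the sketch below generates MACCS keys with RDKit and fits a multi-output random forest that maps them to relative peak heights. The SMILES strings and spectra are placeholders rather than AMS training data, and the two-step presence/height fitting of STRAPS is collapsed into a single regression for brevity.

    ```python
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import MACCSkeys
    from sklearn.ensemble import RandomForestRegressor

    def maccs_bits(smiles):
        """Return the 167-bit MACCS fingerprint of a molecule as a 0/1 vector."""
        mol = Chem.MolFromSmiles(smiles)
        return np.array([int(b) for b in MACCSkeys.GenMACCSKeys(mol).ToBitString()])

    # Placeholder training set: three molecules and random "spectra" over 89
    # unit-mass m/z channels (each normalised to sum to one).
    train_smiles = ["CC(=O)O", "CCO", "OC(=O)CCC(=O)O"]
    rng = np.random.default_rng(0)
    train_spectra = rng.dirichlet(np.ones(89), size=len(train_smiles))

    X = np.vstack([maccs_bits(s) for s in train_smiles])
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, train_spectra)

    # Predict a spectrum for a new compound and score it with a cosine angle.
    pred = model.predict(maccs_bits("CC(C)=O").reshape(1, -1))[0]
    measured = rng.dirichlet(np.ones(89))          # stand-in for a measured spectrum
    cos = pred @ measured / (np.linalg.norm(pred) * np.linalg.norm(measured))
    print(f"cosine similarity: {cos:.2f}")
    ```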

  1. Women Faculty Distressed: Descriptions and Consequences of Academic Contrapower Harassment

    ERIC Educational Resources Information Center

    Lampman, Claudia; Crew, Earl C.; Lowery, Shea D.; Tompkins, Kelley

    2016-01-01

    Academic contrapower harassment (ACPH) occurs when someone with seemingly less power in an educational setting (e.g., a student) harasses someone more powerful (e.g., a professor). A representative sample of 289 professors from U.S. institutions of higher education described their worst incident with ACPH. Open-ended responses were coded using a…

  2. An Examination of Differences in Consequences of Punishment among PK-12 School Administrators

    ERIC Educational Resources Information Center

    Randle, Dawn DuBose

    2010-01-01

    The purpose of this study was to examine the differences in the administering of punishment procedures for violations of a school district's Code of Student Conduct among school-based administrators. Specifically, this study was concerned with the impact of the socio-demographic variables of: gender, years of administrative experience,…

  3. Whose Code Are You Teaching? A Popular Australian Coursebook Unravelled

    ERIC Educational Resources Information Center

    Ritchie, Annabelle

    2005-01-01

    The study of curriculum materials is of interest to social researchers seeking to understand the social constructions of reality. All texts embody a number of purposeful choices about how reality is to be represented, and these choices have consequences for what is "foregrounded, backgrounded, placed in the margins, distorted, short-cut,…

  4. Developmental Dyslexia and Explicit Long-Term Memory

    ERIC Educational Resources Information Center

    Menghini, Deny; Carlesimo, Giovanni Augusto; Marotta, Luigi; Finzi, Alessandra; Vicari, Stefano

    2010-01-01

    The reduced verbal long-term memory capacities often reported in dyslexics are generally interpreted as a consequence of their deficit in phonological coding. The present study was aimed at evaluating whether the learning deficit exhibited by dyslexics was restricted only to the verbal component of the long-term memory abilities or also involved…

  5. Preparing to "Not" Be a Footballer: Higher Education and Professional Sport

    ERIC Educational Resources Information Center

    Hickey, Christopher; Kelly, Peter

    2008-01-01

    In the commercialised and professionalised world of elite sport, issues associated with career pathways and post sporting career options have a particular resonance. In various football codes, an unexpected knock, twist, bend or break can profoundly impact a player's career. In this high risk and high consequence environment, a number of sports…

  6. Palindromic Genes in the Linear Mitochondrial Genome of the Nonphotosynthetic Green Alga Polytomella magna

    PubMed Central

    Smith, David Roy; Hua, Jimeng; Archibald, John M.; Lee, Robert W.

    2013-01-01

    Organelle DNA is no stranger to palindromic repeats. But never has a mitochondrial or plastid genome been described in which every coding region is part of a distinct palindromic unit. While sequencing the mitochondrial DNA of the nonphotosynthetic green alga Polytomella magna, we uncovered precisely this type of genic arrangement. The P. magna mitochondrial genome is linear and made up entirely of palindromes, each containing 1–7 unique coding regions. Consequently, every gene in the genome is duplicated and in an inverted orientation relative to its partner. And when these palindromic genes are folded into putative stem-loops, their predicted translational start sites are often positioned in the apex of the loop. Gel electrophoresis results support the linear, 28-kb monomeric conformation of the P. magna mitochondrial genome. Analyses of other Polytomella taxa suggest that palindromic mitochondrial genes were present in the ancestor of the Polytomella lineage and lost or retained to various degrees in extant species. The possible origins and consequences of this bizarre genomic architecture are discussed. PMID:23940100

  7. How to Measure Motivational Interviewing Fidelity in Randomized Controlled Trials: Practical Recommendations.

    PubMed

    Jelsma, Judith G M; Mertens, Vera-Christina; Forsberg, Lisa; Forsberg, Lars

    2015-07-01

    Many randomized controlled trials in which motivational interviewing (MI) is a key intervention make no provision for the assessment of treatment fidelity. This methodological shortcoming makes it impossible to distinguish between high- and low-quality MI interventions, and, consequently, to know whether MI provision has contributed to any intervention effects. This article makes some practical recommendations for the collection, selection, coding and reporting of MI fidelity data, as measured using the Motivational Interviewing Treatment Integrity Code. We hope that researchers will consider these recommendations and include MI fidelity measures in future studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Chimeric NP Non Coding Regions between Type A and C Influenza Viruses Reveal Their Role in Translation Regulation

    PubMed Central

    Crescenzo-Chaigne, Bernadette; Barbezange, Cyril; Frigard, Vianney; Poulain, Damien; van der Werf, Sylvie

    2014-01-01

    Exchange of the non coding regions of the NP segment between type A and C influenza viruses was used to demonstrate the importance not only of the proximal panhandle, but also of the initial distal panhandle strength in type specificity. Both elements were found to be compulsory to rescue infectious virus by reverse genetics systems. Interestingly, in type A influenza virus infectious context, the length of the NP segment 5′ NC region once transcribed into mRNA was found to impact its translation, and the level of produced NP protein consequently affected the level of viral genome replication. PMID:25268971

  9. Probabilistic evaluation of fuselage-type composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
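
    The general idea of propagating scatter in primitive variables through to a response distribution can be shown with a plain Monte Carlo stand-in; IPACS itself uses fast probability integration on constituent-to-laminate models, so the Euler column below is only an illustrative response function and all distributions are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Primitive (independent random) variables with assumed scatter.
    E = rng.normal(70e9, 3.5e9, n)     # Young's modulus [Pa]
    I = rng.normal(8e-7, 4e-8, n)      # second moment of area [m^4]
    L = rng.normal(1.2, 0.01, n)       # column length [m]

    # Response: Euler buckling load of a pinned-pinned column.
    P_cr = np.pi**2 * E * I / L**2

    print(f"mean = {P_cr.mean()/1e3:.1f} kN, 1st percentile = {np.percentile(P_cr, 1)/1e3:.1f} kN")
    # Crude sensitivity ranking: correlation of each primitive variable with the response.
    for name, x in (("E", E), ("I", I), ("L", L)):
        print(name, round(np.corrcoef(x, P_cr)[0, 1], 2))
    ```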

  10. Scaling features of noncoding DNA

    NASA Technical Reports Server (NTRS)

    Stanley, H. E.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.

    1999-01-01

    We review evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range--indeed, bases thousands of base pairs apart are correlated. We do not find such a long-range correlation in the coding regions of the gene, and utilize this fact to build a Coding Sequence Finder Algorithm, which uses statistical ideas to locate the coding regions of an unknown DNA sequence. Finally, we describe briefly some recent work adapting to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function, and reporting that noncoding regions in eukaryotes display a larger redundancy than coding regions. Specifically, we consider the possibility that this result is solely a consequence of nucleotide concentration differences, as first noted by Bonhoeffer and his collaborators. We find that cytosine-guanine (CG) concentration does have a strong "background" effect on redundancy. However, we find that for the purine-pyrimidine binary mapping rule, which is not affected by the difference in CG concentration, the Shannon redundancy for the set of analyzed sequences is larger for noncoding regions compared to coding regions.
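
    The purine-pyrimidine mapping and the Shannon redundancy mentioned above are simple to compute; the sketch below does so for fixed-length words of an R/Y-mapped sequence. The sequence and block length are placeholders, and this is only meant to illustrate the quantity being compared between coding and noncoding regions.

    ```python
    from collections import Counter
    from math import log2

    def ry_redundancy(seq, block=3):
        """Shannon redundancy of the purine/pyrimidine (R/Y) mapping of a sequence.

        Redundancy is 1 - H_block / block, where H_block is the entropy (in bits)
        of the observed distribution of length-`block` R/Y words.
        """
        ry = "".join("R" if b in "AG" else "Y" for b in seq.upper() if b in "ACGT")
        words = Counter(ry[i:i + block] for i in range(len(ry) - block + 1))
        total = sum(words.values())
        h = -sum((c / total) * log2(c / total) for c in words.values())
        return 1 - h / block

    print(round(ry_redundancy("ATGGCGCGTATATATATAGCGC" * 10), 3))
    ```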

  11. Surviving "Payment by Results": a simple method of improving clinical coding in burn specialised services in the United Kingdom.

    PubMed

    Wallis, Katy L; Malic, Claudia C; Littlewood, Sonia L; Judkins, Keith; Phipps, Alan R

    2009-03-01

    Coding inpatient episodes plays an important role in determining the financial remuneration of a clinical service. Insufficient or incomplete data may have very significant consequences for its viability. We created a document that improves the coding process in our Burns Centre. At Yorkshire Regional Burns Centre an inpatient summary sheet was designed to prospectively record and present essential information on a daily basis, for use in the coding process. The level of care was also recorded. A 3-month audit was conducted to assess the efficacy of the new forms. Forty-nine patients were admitted to the Burns Centre with a mean age of 27.6 years and TBSA ranging from 0.5% to 65%. The total stay in the Burns Centre was 758 days, of which 22% were at level B3-B5 and 39% at level B2. The use of the new discharge document identified potential income of about £500,000 at our local daily tariffs for high dependency and intensive care. The new form is able to ensure a high quality of coding, with a possible direct impact on the financial resources accrued for burn care.

  12. Barriers and facilitators to the implementation of a school-based physical activity policy in Canada: application of the theoretical domains framework.

    PubMed

    Weatherson, Katie A; McKay, Rhyann; Gainforth, Heather L; Jung, Mary E

    2017-10-23

    In British Columbia Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers' implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers' barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with double extraction occurring across a third of the interview transcripts by a second researcher. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains accounting for 75% of the total were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to Goals, Behavioural regulation, Social/professional role and identity, Beliefs about Consequences. Forty-one qualitative sub-themes were identified across the fourteen domains and exemplary quotes were highlighted. Teachers identified barriers and facilitators relating to all TDF domains, with ECR, Beliefs about consequences, Social influences, Knowledge and Intentions being the most often discussed influencers of DPA policy implementation. Use of the TDF to understand the implementation factors can assist with the systematic development of future interventions to improve implementation.

  13. Molecular Regulatory Pathways Link Sepsis With Metabolic Syndrome: Non-coding RNA Elements Underlying the Sepsis/Metabolic Cross-Talk.

    PubMed

    Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona

    2018-01-01

    Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact for human health and the consequences of concussions. Both represent imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation that indicates shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. Those include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.

  14. Computing element evolution towards Exascale and its impact on legacy simulation codes

    NASA Astrophysics Data System (ADS)

    Colin de Verdière, Guillaume J. L.

    2015-12-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis underlying this work shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes and programming methods. The problems of dissipated power and memory access are discussed and lead to a vision of what an exascale system should be. To survive, programming languages have had to respond to the hardware evolutions, either by evolving or through the creation of new ones. From these elements, we elaborate on why vectorization, multithreading, data locality awareness and hybrid programming will be the keys to reaching the exascale, implying that it is time to start rewriting codes.

  15. [Amendment of the Kodeks rodzinny i opiekuńczy (Family and Guardianship Code), Chapter I. "Origin of a child"--some remarks of an expert witness in forensic genetics].

    PubMed

    Raczek, Ewa

    2009-01-01

    On June 13, 2009, the new Family and Guardianship Code came into effect. Many important modifications were made to Chapter I, "Origin of a child", an issue of special importance in the work of a forensic geneticist. Those changes relate not only to the contesting of fatherhood of both types (cases in which denial of fatherhood is adjudicated and cases in which the recognition of paternity is declared ineffective), but for the first time they also address maternity testing. The Code defines who, according to Polish law, is a mother to a child and on this basis establishes motherhood. In consequence, the main legal maxim Mater semper certa est, which has existed since ancient Roman times, is now annulled. The paper presents some remarks of an expert witness on the introduced changes.

  16. LDPC product coding scheme with extrinsic information for bit patterned media recording

    NASA Astrophysics Data System (ADS)

    Jeong, Seongkwon; Lee, Jaejin

    2017-05-01

    Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next generation storage system to achieve an areal density beyond 1 Tb/in2. Each recording bit is stored in a fabricated magnetic island and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in2, the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur. ITI and ISI degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows an improved bit error rate performance compared to that in which one LDPC code is used.
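
    The product-code structure itself (independent of the LDPC component codes) is easy to picture: data bits are arranged in an array and every row and column carries its own parity, so each bit is protected by two component codewords and the row and column decoders can exchange information. The toy single-parity sketch below only illustrates that structure; it is not the LDPC scheme or the extrinsic-information schedule of the paper.

    ```python
    import numpy as np

    def product_encode(bits, k=4):
        """Arrange k*k data bits in an array and append even parity per row and column."""
        d = np.asarray(bits, dtype=int).reshape(k, k)
        return d, d.sum(axis=1) % 2, d.sum(axis=0) % 2

    data = [1, 0, 1, 1,  0, 1, 0, 0,  1, 1, 1, 0,  0, 0, 1, 1]
    d, row_p, col_p = product_encode(data)

    d[2, 1] ^= 1                                   # flip one bit (a "read error")
    bad_row = int(np.argmax((d.sum(axis=1) + row_p) % 2))
    bad_col = int(np.argmax((d.sum(axis=0) + col_p) % 2))
    print(bad_row, bad_col)                        # -> 2 1, locating the flipped bit
    ```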

  17. Emergence of Coding and its Specificity as a Physico-Informatic Problem

    NASA Astrophysics Data System (ADS)

    Wills, Peter R.; Nieselt, Kay; McCaskill, John S.

    2015-06-01

    We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.

  18. Structural Code Considerations for Solar Rooftop Installations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwyer, Stephen F.; Dwyer, Brian P.; Sanchez, Alfred

    2014-12-01

    Residential rooftop solar panel installations are limited in part by the high cost of structural related code requirements for field installation. Permitting solar installations is difficult because there is a belief among residential permitting authorities that typical residential rooftops may be structurally inadequate to support the additional load associated with a photovoltaic (PV) solar installation. Typical engineering methods utilized to calculate stresses on a roof structure involve simplifying assumptions that render a complex non-linear structure to a basic determinate beam. This method of analysis neglects the composite action of the entire roof structure, yielding a conservative analysis based on a rafter or top chord of a truss. Consequently, the analysis can result in an overly conservative structural analysis. A literature review was conducted to gain a better understanding of the conservative nature of the regulations and codes governing residential construction and the associated structural system calculations.

  19. The Role of Hierarchy in Response Surface Modeling of Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2010-01-01

    This paper is intended as a tutorial introduction to certain aspects of response surface modeling, for the experimentalist who has started to explore these methods as a means of improving productivity and quality in wind tunnel testing and other aerospace applications. A brief review of the productivity advantages of response surface modeling in aerospace research is followed by a description of the advantages of a common coding scheme that scales and centers independent variables. The benefits of model term reduction are reviewed. A constraint on model term reduction with coded factors is described in some detail, which requires such models to be well-formulated, or hierarchical. Examples illustrate the consequences of ignoring this constraint. The implication for automated regression model reduction procedures is discussed, and some opinions formed from the author's experience are offered on coding, model reduction, and hierarchy.
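
    The coding scheme referred to above centres each factor on the midpoint of its range and scales it so the design spans [-1, +1]; the hierarchy constraint then says that any retained interaction or pure quadratic term must be accompanied by its parent main effects. A minimal sketch, with an assumed factor range:

    ```python
    def code_factor(x, low, high):
        """Map a factor level in natural units to the coded interval [-1, +1]."""
        return (2 * x - (high + low)) / (high - low)

    # Example: angle of attack swept from -4 to +12 degrees (assumed range).
    print(code_factor(-4, -4, 12), code_factor(4, -4, 12), code_factor(12, -4, 12))
    # -> -1.0 0.0 1.0
    # Hierarchy: if the model keeps a coded term such as x1 * x2, the main
    # effects x1 and x2 should be kept as well, even if individually weak.
    ```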

  20. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  1. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  2. Characteristics and Consequences of Adult Learning Methods and Strategies. Practical Evaluation Reports, Volume 2, Number 1

    ERIC Educational Resources Information Center

    Trivette, Carol M.; Dunst, Carl J.; Hamby, Deborah W.; O'Herin, Chainey E.

    2009-01-01

    The effectiveness of four adult learning methods (accelerated learning, coaching, guided design, and just-in-time training) constituted the focus of this research synthesis. Findings reported in "How People Learn" (Bransford et al., 2000) were used to operationally define six adult learning method characteristics, and to code and analyze…

  3. Search for the Missing lncs: Gene Regulatory Networks in Neural Crest Development and Long Non-coding RNA Biomarkers of Hirschsprung's Disease

    EPA Science Inventory

    Hirschsprung’s disease (HSCR), a birth defect characterized by variable aganglionosis of the gut, affects about 1 in 5000 births, and is a consequence of abnormal development of neural crest cells, from which enteric ganglia derive. In the companion article in this issue (Shen et...

  4. A Qualitative Study of Immigration Policy and Practice Dilemmas for Social Work Students

    ERIC Educational Resources Information Center

    Furman, Rich; Langer, Carol L.; Sanchez, Thomas Wayne; Negi, Nalini Junko

    2007-01-01

    Social policy shapes the infrastructure wherein social work is practiced. However, what happens when a particular social policy is seemingly incongruent with the social work code of ethics? How do social work students conceive and resolve potential practice dilemmas that may arise as a consequence? In this study, the authors explored potential…

  5. Students Behaving Badly: Policies on Weapons Violations in Florida Schools

    ERIC Educational Resources Information Center

    Dickinson, Wendy B.; Hall, Bruce W.

    2003-01-01

    This study looks at existing aspects of written school violence policies (Codes of Student Conduct) across large, mid-size, and small school districts in Florida. The aim was to provide a clearer picture of how weapons are defined, and the consequences of their possession, use, or display. Two research areas were addressed: (1) What constitutes a…

  6. Coarse-coded higher-order neural networks for PSRI object recognition. [position, scale, and rotation invariant

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Reid, Max B.

    1993-01-01

    A higher-order neural network (HONN) can be designed to be invariant to changes in scale, translation, and inplane rotation. Invariances are built directly into the architecture of a HONN and do not need to be learned. Consequently, fewer training passes and a smaller training set are required to learn to distinguish between objects. The size of the input field is limited, however, because of the memory required for the large number of interconnections in a fully connected HONN. By coarse coding the input image, the input field size can be increased to allow the larger input scenes required for practical object recognition problems. We describe a coarse coding technique and present simulation results illustrating its usefulness and its limitations. Our simulations show that a third-order neural network can be trained to distinguish between two objects in a 4096 x 4096 pixel input field independent of transformations in translation, in-plane rotation, and scale in less than ten passes through the training set. Furthermore, we empirically determine the limits of the coarse coding technique in the object recognition domain.
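
    The essence of coarse coding is that a position in the large input field is represented jointly by several low-resolution grids that are offset from one another, so that the combination of coarse cells recovers fine position at a fraction of the memory. The sketch below is only a toy illustration of that idea; the grid size and offsets are arbitrary and do not correspond to the parameters used in the paper.

    ```python
    def coarse_code(x, y, field=4096, cell=256,
                    offsets=((0, 0), (64, 64), (128, 128), (192, 192))):
        """Return, for each offset grid, the coarse cell (row, col) containing pixel (x, y)."""
        return [((y + oy) % field // cell, (x + ox) % field // cell) for ox, oy in offsets]

    # Nearby pixels map to (mostly) the same coarse cells; distant pixels do not.
    print(coarse_code(1000, 2000))
    print(coarse_code(1005, 2003))
    print(coarse_code(3000,  500))
    ```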

  7. Quantum computation with realistic magic-state factories

    NASA Astrophysics Data System (ADS)

    O'Gorman, Joe; Campbell, Earl T.

    2017-03-01

    Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10-4 error gates and the availability of long-range interactions.

  8. Sources of financial pressure and up coding behavior in French public hospitals.

    PubMed

    Georgescu, Irène; Hartmann, Frank G H

    2013-05-01

    Drawing upon role theory and the literature concerning unintended consequences of financial pressure, this study investigates the effects of health care decision pressure from the hospital's administration and from the professional peer group on physicians' inclination to engage in up coding. We explore two kinds of up coding, information-related and action-related, and develop hypotheses that connect these kinds of data manipulation to the sources of pressure via the intermediate effect of role conflict. Qualitative data from initial interviews with physicians and subsequent questionnaire evidence from 578 physicians in 14 French hospitals suggest that the source of pressure is a relevant predictor of physicians' inclination to engage in data manipulation. We further find that this effect is partly explained by the extent to which these pressures create role conflict. Given the concern about up coding in treatment-based reimbursement systems worldwide, our analysis adds to understanding how the design of the hospital's management control system may enhance this undesired type of behavior. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Joint sparse coding based spatial pyramid matching for classification of color medical image.

    PubMed

    Shi, Jun; Li, Yi; Zhu, Jie; Sun, Haojie; Cai, Yin

    2015-04-01

    Although color medical images are important in clinical practice, they are usually converted to grayscale for further processing in pattern recognition, resulting in loss of rich color information. The sparse coding based linear spatial pyramid matching (ScSPM) and its variants are popular for grayscale image classification, but cannot extract color information. In this paper, we propose a joint sparse coding based SPM (JScSPM) method for the classification of color medical images. A joint dictionary can represent both the color information in each color channel and the correlation between channels. Consequently, the joint sparse codes calculated from a joint dictionary can carry color information, and therefore this method can easily transform a feature descriptor originally designed for grayscale images to a color descriptor. A color hepatocellular carcinoma histological image dataset was used to evaluate the performance of the proposed JScSPM algorithm. Experimental results show that JScSPM provides significant improvements as compared with the majority voting based ScSPM and the original ScSPM for color medical image classification. Copyright © 2014 Elsevier Ltd. All rights reserved.
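
    The key step in the joint approach is that patch vectors from the three color channels are concatenated before dictionary learning, so each atom spans all channels and the resulting sparse codes carry color information. The sketch below shows that stacking with scikit-learn's DictionaryLearning on a random placeholder image; it is not the ScSPM pipeline (no spatial pyramid or max pooling) and all sizes are arbitrary.

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    rng = np.random.default_rng(0)
    image = rng.random((64, 64, 3))                     # placeholder RGB "histology" image

    def joint_patches(img, size=8, n=200):
        """Sample n patches and stack the R, G and B patch vectors into one joint vector."""
        h, w, _ = img.shape
        out = []
        for _ in range(n):
            y, x = rng.integers(0, h - size), rng.integers(0, w - size)
            p = img[y:y + size, x:x + size, :]
            out.append(np.concatenate([p[..., c].ravel() for c in range(3)]))
        return np.array(out)                            # shape (n, 3 * size * size)

    X = joint_patches(image)
    dico = DictionaryLearning(n_components=32, transform_algorithm="omp",
                              transform_n_nonzero_coefs=5, random_state=0)
    codes = dico.fit(X).transform(X)                    # joint sparse codes with color information
    print(codes.shape)                                  # -> (200, 32)
    ```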

  10. CARES/LIFE Software Commercialization

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.

  11. It's time to make management a true profession.

    PubMed

    Khurana, Rakesh; Nohria, Nitin

    2008-10-01

    In the face of the recent institutional breakdown of trust in business, managers are losing legitimacy. To regain public trust, management needs to become a true profession in much the way medicine and law have, argue Khurana and Nohria of Harvard Business School. True professions have codes, and the meaning and consequences of those codes are taught as part of the formal education required of their members. Through these codes, professional institutions forge an implicit social contract with society: Trust us to control and exercise jurisdiction over an important occupational category, and, in return, we will ensure that the members of our profession are worthy of your trust--that they will not only be competent to perform the tasks entrusted to them, but that they will also conduct themselves with high standards and great integrity. The authors believe that enforcing educational standards and a code of ethics is unlikely to choke entrepreneurial creativity. Indeed, if the field of medicine is any indication, a code may even stimulate creativity. The main challenge in writing a code lies in reaching a broad consensus on the aims and social purpose of management. There are two deeply divided schools of thought. One school argues that management's aim should simply be to maximize shareholder wealth; the other argues that management's purpose is to balance the claims of all the firm's stakeholders. Any code will have to steer a middle course in order to accommodate both the value-creating impetus of the shareholder value concept and the accountability inherent in the stakeholder approach.

  12. The feasibility of QR-code prescription in Taiwan.

    PubMed

    Lin, C-H; Tsai, F-Y; Tsai, W-L; Wen, H-W; Hu, M-L

    2012-12-01

    An ideal health care service is a service system that focuses on patients. Patients in Taiwan have the freedom to fill their prescriptions at any pharmacy contracted with the National Health Insurance. Each of these pharmacies uses its own computer system, and there are at least ten different systems on the market in Taiwan, so transmitting prescription information from the hospital to the pharmacy accurately and efficiently presents a major challenge. This study consisted of two-dimensional applications using a QR code to capture patients' identification and prescription information from the hospitals, and using a webcam to read the QR code and transfer all data to the pharmacy computer system. Two hospitals and 85 community pharmacies participated in the study. During the trial, all participating pharmacies rated the accuracy of the transmitted prescription information highly. The contents of QR-code prescriptions issued in the Taipei area were read efficiently and accurately by pharmacies in the Taichung area (central Taiwan), regardless of the pharmacy software system and without geographic limitation. The QR-code device received a patent (No. M376844, March 2010) from the Intellectual Property Office of the Ministry of Economic Affairs, Taiwan. Our trial has shown that QR-code prescriptions can provide community pharmacists with an efficient, accurate and inexpensive way to digitalize prescription contents. Consequently, pharmacists can offer a better quality of pharmacy service to patients. © 2012 Blackwell Publishing Ltd.
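
    As an illustration of the encoding side, the sketch below packs a few prescription fields into a QR code with the third-party qrcode package; the field layout and separator are hypothetical and are not the format used in the Taiwanese trial, and the decoding (webcam) side is not shown.

    ```python
    import qrcode  # third-party package: pip install qrcode[pil]

    # Hypothetical prescription fields; the real payload format is defined by the
    # hospital and pharmacy systems, not by this sketch.
    prescription = {
        "patient_id": "A123456789",
        "drug": "Metformin 500 mg",
        "dose": "1 tab bid",
        "days": "28",
    }
    payload = "|".join(f"{k}={v}" for k, v in prescription.items())

    img = qrcode.make(payload)            # returns a PIL image of the QR code
    img.save("prescription_qr.png")       # the pharmacy decodes this with a webcam reader
    ```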

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xin Zhi Jiao

    Ultrastructural changes caused by gamma-ray (Co-60) irradiation were investigated in preclimacteric apple fruits during storage. Under the electron microscope, the cellulose in the cell walls was reduced to a line when treated with 40 krad gamma radiation for 38 hr, and disappeared completely after treatment with 100 krad. The disintegration of plasmalemma and mitochondria membranes was observed. Plasmalemma membranes were impaired after 10 krad for 38 hr, while in the mitochondria the destruction of the original structure and its inner membrane spine began at 40 krad for 38 hr. Moreover, the size of starch granules was reduced by the irradiation, disappearing after treatment with 100 krad. Both ethylene production and respiration rate were drastically reduced. The reduction of ethylene production in treated apple fruit was found to be due to the decrease of ACC content and the inhibition of ethylene-forming enzyme activity. MACC content was also decreased. Fruits treated with 40 krad gamma radiation and stored at 0-2 degrees C maintained their quality for six months.

  14. CD8+CD28+ T cells might mediate injury of cardiomyocytes in acute myocardial infarction.

    PubMed

    Zhang, Lili; Wang, Zhiyan; Wang, Di; Zhu, Jumo; Wang, Yi

    2018-06-07

    CD8+ T cells accumulate in the necrotic myocardium of acute myocardial infarction (AMI). It is unclear whether CD8+CD28+ T cells, a specific subset of CD8+ T cells, contribute to myocardial injury. In this study, 92 consecutive patients with AMI and 28 healthy control subjects were enrolled. The frequency of CD8+CD28+ T cells in peripheral blood samples was assayed by flow cytometry. Plasma cardiac troponin I (TNI) and left ventricular ejection fraction (LVEF) were determined. Long-term prognosis of the patients was evaluated by major adverse cardiac and cerebrovascular events (MACCE) over a 12-month follow-up period. Our findings indicated that patients with AMI who presented with high numbers of CD8+CD28+ T cells had an increased infarction size and aggravated ventricular function. We proposed that cytotoxic CD8+CD28+ T cell-mediated myocardial necrosis may act as a novel and alternative pathway of AMI. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. The Olympus satellite and satellite direct broadcasting in Italy

    NASA Astrophysics Data System (ADS)

    Castelli, E.; Tirro, S.

    Plans for the development of DBS-TV technology in Italy are discussed from the perspective of the Italian electronics industry, with an emphasis on experimental broadcasts using the Olympus satellite channel assigned to Italy by ESA. Consideration is given to the operating characteristics of PAL, MAC-C, MAC-D2, extended-MAC, and MUSE color-TV systems and their compatibility with DBS; the planned availability of TV channels on Olympus-type and Italsat-type satellites; individual, community, and CATV reception of DBS signals; the projected growth of the DBS audience in Italy, the UK, and the FRG by 1999; and the potential Italian market for satellite receivers and antennas. The need for prompt completion and evaluation of the Olympus experiments and antennas. The need for prompt completion and evaluation of the Olympus experiments (beginning in 1987) and selection of the systems to be implemented, so that the industry can supply the home equipment required on time, is stressed. Tables of numerical data and maps of the Olympus coverage areas are provided.

  16. The "Wow! signal" of the terrestrial genetic code

    NASA Astrophysics Data System (ADS)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, but stronger in noise immunity is the genetic code. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of embedding the signal into the code and possible interpretation of its content are discussed. Overall, while the code is nearly optimized biologically, its limited capacity is used extremely efficiently to pass non-biological information.

  17. Addressing the Misuse Potential of Life Science Research-Perspectives From a Bottom-Up Initiative in Switzerland.

    PubMed

    Oeschger, Franziska M; Jenal, Ursula

    2018-01-01

    Codes of conduct have received wide attention as a bottom-up approach to foster responsibility for dual use aspects of life science research within the scientific community. In Switzerland, a series of discussion sessions led by the Swiss Academy of Sciences with over 40 representatives of most Swiss academic life science research institutions has revealed that while a formal code of conduct was considered too restrictive, a bottom-up approach toward awareness raising and education and demonstrating scientists' responsibility toward society was highly welcomed. Consequently, an informational brochure on "Misuse potential and biosecurity in life sciences research" was developed to provide material for further discussions and education.

  18. Surveying multidisciplinary aspects in real-time distributed coding for Wireless Sensor Networks.

    PubMed

    Braccini, Carlo; Davoli, Franco; Marchese, Mario; Mongelli, Maurizio

    2015-01-27

    Wireless Sensor Networks (WSNs), where a multiplicity of sensors observe a physical phenomenon and transmit their measurements to one or more sinks, pertain to the class of multi-terminal source and channel coding problems of Information Theory. In this category, "real-time" coding is often encountered for WSNs, referring to the problem of finding the minimum distortion (according to a given measure), under transmission power constraints, attainable by encoding and decoding functions, with stringent limits on delay and complexity. On the other hand, the Decision Theory approach seeks to determine the optimal coding/decoding strategies or some of their structural properties. Since encoder(s) and decoder(s) possess different information, though sharing a common goal, the setting here is that of Team Decision Theory. A more pragmatic vision rooted in Signal Processing consists of fixing the form of the coding strategies (e.g., to linear functions) and, consequently, finding the corresponding optimal decoding strategies and the achievable distortion, generally by applying parametric optimization techniques. All approaches have a long history of past investigations and recent results. The goal of the present paper is to provide the taxonomy of the various formulations, a survey of the vast related literature, examples from the authors' own research, and some highlights on the inter-play of the different theories.

  19. Genomic mutation consequence calculator.

    PubMed

    Major, John E

    2007-11-15

    The genomic mutation consequence calculator (GMCC) is a tool that will reliably and quickly calculate the consequence of arbitrary genomic mutations. GMCC also reports supporting annotations for the specified genomic region. The particular strength of the GMCC is it works in genomic space, not simply in spliced transcript space as some similar tools do. Within gene features, GMCC can report on the effects on splice site, UTR and coding regions in all isoforms affected by the mutation. A considerable number of genomic annotations are also reported, including: genomic conservation score, known SNPs, COSMIC mutations, disease associations and others. The manual interface also offers link outs to various external databases and resources. In batch mode, GMCC returns a csv file which can easily be parsed by the end user. GMCC is intended to support the many tumor resequencing efforts, but can be useful to any study investigating genomic mutations.

  20. The Existence of Codes of Conduct for Undergraduate Teaching in Teaching-Oriented Four-Year Colleges and Universities

    ERIC Educational Resources Information Center

    Lyken-Segosebe, Dawn; Min, Yunkyung; Braxton, John M.

    2012-01-01

    Four-year colleges and universities that espouse teaching as their primary mission bear a responsibility to safeguard the welfare of their students as clients of teaching. This responsibility takes the form of a moral imperative. Faculty members hold considerable autonomy in the professional choices they make in their teaching. As a consequence,…

  1. Everybody Counts, but Usually Just to 10! A Systematic Analysis of Number Representations in Children's Books

    ERIC Educational Resources Information Center

    Powell, Sarah R.; Nurnberger-Haag, Julie

    2015-01-01

    Research Findings: Teachers and parents often use trade books to introduce or reinforce mathematics concepts. To date, an analysis of the early numeracy content of trade books has not been conducted. Consequently, this study evaluated the properties of numbers and counting within trade books. We coded 160 trade books targeted at establishing early…

  2. Dynamics on Networks of Manifolds

    NASA Astrophysics Data System (ADS)

    DeVille, Lee; Lerman, Eugene

    2015-03-01

    We propose a precise definition of a continuous time dynamical system made up of interacting open subsystems. The interconnections of subsystems are coded by directed graphs. We prove that the appropriate maps of graphs called graph fibrations give rise to maps of dynamical systems. Consequently surjective graph fibrations give rise to invariant subsystems and injective graph fibrations give rise to projections of dynamical systems.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, J.E.; Roussin, R.W.; Gilpin, H.

    A version of the CRAC2 computer code applicable for use in analyses of consequences and risks of reactor accidents in case work for environmental statements has been implemented for use on the Nuclear Regulatory Commission Data General MV/8000 computer system. Input preparation is facilitated through the use of an interactive computer program which operates on an IBM personal computer. The resulting CRAC2 input deck is transmitted to the MV/8000 by using an error-free file transfer mechanism. To facilitate the use of CRAC2 at NRC, relevant background material on input requirements and model descriptions has been extracted from four reports -more » ''Calculations of Reactor Accident Consequences,'' Version 2, NUREG/CR-2326 (SAND81-1994) and ''CRAC2 Model Descriptions,'' NUREG/CR-2552 (SAND82-0342), ''CRAC Calculations for Accident Sections of Environmental Statements, '' NUREG/CR-2901 (SAND82-1693), and ''Sensitivity and Uncertainty Studies of the CRAC2 Computer Code,'' NUREG/CR-4038 (ORNL-6114). When this background information is combined with instructions on the input processor, this report provides a self-contained guide for preparing CRAC2 input data with a specific orientation toward applications on the MV/8000. 8 refs., 11 figs., 10 tabs.« less

  4. [Which legal consequences for those who provoke pain to infants?].

    PubMed

    Bellieni, C V; Gabbrielli, M; Tataranno, M L; Perrone, S; Buonocore, G

    2012-02-01

    The advances in perinatal care have led to a significant increase in neonatal survival rate but also to the rise of the number of invasive procedures. Several scientific studies show that newborns are able to feel pain more intensely than adults. Despite this evidence, neonatal pain and the right to an appropriate analgesia are systematically underestimated, ignoring ethical and moral principles of beneficence and non-maleficence. Infants are more susceptible to pain and the prolonged exposure to painful sensations can alter the neural development and the response to pain causing hyperalgesia. Anyone who caused pain without using any analgesic procedure due to negligence or incompetence, should be severely punished. The right to analgesia, fundamental principle, is fully incorporated in the Italian code of Medical deontology (article 3). The doctor who does not use analgesia for newborns' treatment can be indicted by the Italian penal code (art.582 and 583), aggravated by being the victim an infant, who is unable to defend himself. To avoid penal consequences, a careful education and attention are needed: "pediatric analgesia" should become a basic teaching in Universities and in specialization schools; analgesic treatments should be mandatory and annotated in the patient's file even for minor potentially painful procedures.

  5. Impact of improved soil climatology and intialization on WRF-chem dust simulations over West Asia

    NASA Astrophysics Data System (ADS)

    Omid Nabavi, Seyed; Haimberger, Leopold; Samimi, Cyrus

    2016-04-01

    Meteorological forecast models such as WRF-chem are designed to forecast not only standard atmospheric parameters but also aerosol, particularly mineral dust concentrations. It has therefore become an important tool for the prediction of dust storms in West Asia where dust storms have the considerable impact on living conditions. However, verification of forecasts against satellite data indicates only moderate skill in prediction of such events. Earlier studies have already indicated that the erosion factor, land use classification, soil moisture, and temperature initializations play a critical role in the accuracy of WRF-chem dust simulations. In the standard setting the erosion factor and land use classification are based on topographic variations and post-processed images of the advanced very high-resolution radiometer (AVHRR) during the period April 1992-March 1993. Furthermore, WRF-chem is normally initialized by the soil moisture and temperature of Final Analysis (FNL) model on 1.0x1.0 degree grids. In this study, we have changed boundary initial conditions so that they better represent current changing environmental conditions. To do so, land use (only bare soil class) and the erosion factor were both modified using information from MODIS deep blue AOD (Aerosol Optical Depth). In this method, bare soils are where the relative frequency of dust occurrence (deep blue AOD > 0.5) is more than one-third of a given month. Subsequently, the erosion factor, limited within the bare soil class, is determined by the monthly frequency of dust occurrence ranging from 0.3 to 1. It is worth to mention, that 50 percent of calculated erosion factor is afterward assigned to sand class while silt and clay classes each gain 25 percent of it. Soil moisture and temperature from the Global Land Data Assimilation System (GLDAS) were utilized to provide these initializations in higher resolution of 0.25 degree than in the standard setting. Modified and control simulations were conducted for the summertime of 2008-2012 and verified by satellite data (MODIS deep blue AOD, TOMs Aerosol Index and MISR AOD 550nm) and two well-known modeling systems of atmospheric composition (MACC and DREAM). All comparisons show a significant improvement in WRF-chem dust simulations after implementing the modifications. In comparison to the control run, the modified run bears an average increase of spearman correlation of 17-20 percent points when it is compared with satellite data. Our runs with modified WRF-chem even outperform MACC and DREAM dust simulations for the region.

  6. Carotid Artery Stenting With Proximal Embolic Protection via a Transradial or Transbrachial Approach: Pushing the Boundaries of the Technique While Maintaining Safety and Efficacy.

    PubMed

    Montorsi, Piero; Galli, Stefano; Ravagnani, Paolo M; Tresoldi, Simone; Teruzzi, Giovanni; Caputi, Luigi; Trabattoni, Daniela; Fabbiocchi, Franco; Calligaris, Giuseppe; Grancini, Luca; Lualdi, Alessandro; de Martini, Stefano; Bartorelli, Antonio L

    2016-08-01

    To compare the feasibility and safety of proximal cerebral protection to a distal filter during carotid artery stenting (CAS) via a transbrachial (TB) or transradial (TR) approach. Among 856 patients who underwent CAS between January 2007 and July 2015, 214 (25%) patients (mean age 72±8 years; 154 men) had the procedure via a TR (n=154) or TB (n=60) approach with either Mo.MA proximal protection (n=61) or distal filter protection (n=153). The Mo.MA group (mean age 73±7 years; 54 men) had significantly more men and more severe stenosis than the filter group (mean age 71±8 years; 100 men). Stent type and CAS technique were left to operator discretion. Heparin and a dedicated closure device or bivalirudin and manual compression were used in TR and TB accesses, respectively. Technical and procedure success, crossover to femoral artery, 30-day major adverse cardiovascular/cerebrovascular events (MACCE; death, all strokes, and myocardial infarction), vascular complications, and radiation exposure were compared between groups. Crossover to a femoral approach was required in 1/61 (1.6%) Mo.MA patient vs 11/153 (7.1%) filter patients mainly due to technical difficulty in engaging the target vessel. Five Mo.MA patients developed acute intolerance to proximal occlusion; 4 were successfully shifted to filter protection. A TR patient was shifted to filter because the Mo.MA system was too short. CAS was technically successful in the remaining 55 (90%) Mo.MA patients and 142 (93%) filter patients. The MACCE rate was 0% in the Mo.MA patients and 2.8% in the filter group (p=0.18). Radiation exposure was similar between groups. Major vascular complications occurred in 1/61 (1.6%) and in 3/153 (1.96%) patients in the Mo.MA and filter groups (p=0.18), respectively, and were confined to the TB approach in the early part of the learning curve. Chronic radial artery occlusion was detected by Doppler ultrasound in 2/30 (6.6%) Mo.MA patients and in 4/124 (3.2%) filter patients by clinical assessment (p=0.25) at 8.1±7.5-month follow-up. CAS with proximal protection via a TR or TB approach is a feasible, safe, and effective technique with a low rate of vascular complications. © The Author(s) 2016.

  7. Protein functional features are reflected in the patterns of mRNA translation speed.

    PubMed

    López, Daniel; Pazos, Florencio

    2015-07-09

    The degeneracy of the genetic code makes it possible for the same amino acid string to be coded by different messenger RNA (mRNA) sequences. These "synonymous mRNAs" may differ largely in a number of aspects related to their overall translational efficiency, such as secondary structure content and availability of the encoded transfer RNAs (tRNAs). Consequently, they may render different yields of the translated polypeptides. These mRNA features related to translation efficiency are also playing a role locally, resulting in a non-uniform translation speed along the mRNA, which has been previously related to some protein structural features and also used to explain some dramatic effects of "silent" single-nucleotide-polymorphisms (SNPs). In this work we perform the first large scale analysis of the relationship between three experimental proxies of mRNA local translation efficiency and the local features of the corresponding encoded proteins. We found that a number of protein functional and structural features are reflected in the patterns of ribosome occupancy, secondary structure and tRNA availability along the mRNA. One or more of these proxies of translation speed have distinctive patterns around the mRNA regions coding for certain protein local features. In some cases the three patterns follow a similar trend. We also show specific examples where these patterns of translation speed point to the protein's important structural and functional features. This support the idea that the genome not only codes the protein functional features as sequences of amino acids, but also as subtle patterns of mRNA properties which, probably through local effects on the translation speed, have some consequence on the final polypeptide. These results open the possibility of predicting a protein's functional regions based on a single genomic sequence, and have implications for heterologous protein expression and fine-tuning protein function.

  8. The World Anti-Doping Code: can you have asthma and still be an elite athlete?

    PubMed Central

    2016-01-01

    Key points The World Anti-Doping Code (the Code) does place some restrictions on prescribing inhaled β2-agonists, but these can be overcome without jeopardising the treatment of elite athletes with asthma. While the Code permits the use of inhaled glucocorticoids without restriction, oral and intravenous glucocorticoids are prohibited, although a mechanism exists that allows them to be administered for acute severe asthma. Although asthmatic athletes achieved outstanding sporting success during the 1950s and 1960s before any anti-doping rules existed, since introduction of the Code’s policies on some drugs to manage asthma results at the Olympic Games have revealed that athletes with confirmed asthma/airway hyperresponsiveness (AHR) have outperformed their non-asthmatic rivals. It appears that years of intensive endurance training can provoke airway injury, AHR and asthma in athletes without any past history of asthma. Although further research is needed, it appears that these consequences of airway injury may abate in some athletes after they have ceased intensive training. The World Anti-Doping Code (the Code) has not prevented asthmatic individuals from becoming elite athletes. This review examines those sections of the Code that are relevant to respiratory physicians who manage elite and sub-elite athletes with asthma. The restrictions that the Code places or may place on the prescription of drugs to prevent and treat asthma in athletes are discussed. In addition, the means by which respiratory physicians are able to treat their elite asthmatic athlete patients with drugs that are prohibited in sport are outlined, along with some of the pitfalls in such management and how best to prevent or minimise them. PMID:27408633

  9. Radioactive waste isolation in salt: special advisory report on the status of the Office of Nuclear Waste Isolation's plans for repository performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.

    1983-10-01

    Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advisemore » SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.« less

  10. Validity of Principal Diagnoses in Discharge Summaries and ICD-10 Coding Assessments Based on National Health Data of Thailand.

    PubMed

    Sukanya, Chongthawonsatid

    2017-10-01

    This study examined the validity of the principal diagnoses on discharge summaries and coding assessments. Data were collected from the National Health Security Office (NHSO) of Thailand in 2015. In total, 118,971 medical records were audited. The sample was drawn from government hospitals and private hospitals covered by the Universal Coverage Scheme in Thailand. Hospitals and cases were selected using NHSO criteria. The validity of the principal diagnoses listed in the "Summary and Coding Assessment" forms was established by comparing data from the discharge summaries with data obtained from medical record reviews, and additionally, by comparing data from the coding assessments with data in the computerized ICD (the data base used for reimbursement-purposes). The summary assessments had low sensitivities (7.3%-37.9%), high specificities (97.2%-99.8%), low positive predictive values (9.2%-60.7%), and high negative predictive values (95.9%-99.3%). The coding assessments had low sensitivities (31.1%-69.4%), high specificities (99.0%-99.9%), moderate positive predictive values (43.8%-89.0%), and high negative predictive values (97.3%-99.5%). The discharge summaries and codings often contained mistakes, particularly the categories "Endocrine, nutritional, and metabolic diseases", "Symptoms, signs, and abnormal clinical and laboratory findings not elsewhere classified", "Factors influencing health status and contact with health services", and "Injury, poisoning, and certain other consequences of external causes". The validity of the principal diagnoses on the summary and coding assessment forms was found to be low. The training of physicians and coders must be strengthened to improve the validity of discharge summaries and codings.

  11. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    Abstract With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because, the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  12. [Seasonal distribution of clinical case codes (DOC study)].

    PubMed

    von Dercks, N; Melz, R; Hepp, P; Theopold, J; Marquass, B; Josten, C

    2017-02-01

    The German diagnosis-related groups remuneration system (G-DRG) was implemented in 2004 and patient-related diagnoses and procedures lead to allocation to specific DRGs. This system includes several codes, such as case mix (CM), case mix index (CMI) and number of cases. Seasonal distribution of these codes as well as distribution of diagnoses and DRGs may lead to logistical consequences for clinical management. From 2004 to 2013 all the main diagnoses and DRGs for inpatients were recorded. Monthly and seasonal distributions were analyzed using ANOVA. The average monthly number of cases was 265 ± 25 cases, the average CM was 388.50 ± 51.75 and the average CMI was 1.46 ± 0.15 with no significant seasonal differences (p > 0.1). Concussion was the most frequently occurring main diagnosis (3739 cases) followed by fractures of the humeral head (699). Significant distribution differences could be shown for humeral head fractures in monthly (p = 0.018) and seasonal comparisons (p = 0.006) with a maximum in winter. Radius (p = 0.01) and ankle fractures (p ≤ 0.001) also occurred most frequently in winter. Non-bony lesions of the shoulder were significantly less in spring (p = 0.04). The DRGs showed no evidence of a monthly or seasonal clustering (p > 0.1). The significant clustering of injuries in specific months and seasons should lead to logistic consequences (e.g. operating room slots, availability of nursing and anesthesia staff). For a needs assessment the analysis of main diagnoses is more appropriate than DRGs.

  13. Anticipatory anxiety disrupts neural valuation during risky choice.

    PubMed

    Engelmann, Jan B; Meyer, Friederike; Fehr, Ernst; Ruff, Christian C

    2015-02-18

    Incidental negative emotions unrelated to the current task, such as background anxiety, can strongly influence decisions. This is most evident in psychiatric disorders associated with generalized emotional disturbances. However, the neural mechanisms by which incidental emotions may affect choices remain poorly understood. Here we study the effects of incidental anxiety on human risky decision making, focusing on both behavioral preferences and their underlying neural processes. Although observable choices remained stable across affective contexts with high and low incidental anxiety, we found a clear change in neural valuation signals: during high incidental anxiety, activity in ventromedial prefrontal cortex and ventral striatum showed a marked reduction in (1) neural coding of the expected subjective value (ESV) of risky options, (2) prediction of observed choices, (3) functional coupling with other areas of the valuation system, and (4) baseline activity. At the same time, activity in the anterior insula showed an increase in coding the negative ESV of risky lotteries, and this neural activity predicted whether the risky lotteries would be rejected. This pattern of results suggests that incidental anxiety can shift the focus of neural valuation from possible positive consequences to anticipated negative consequences of choice options. Moreover, our findings show that these changes in neural value coding can occur in the absence of changes in overt behavior. This suggest a possible pathway by which background anxiety may lead to the development of chronic reward desensitization and a maladaptive focus on negative cognitions, as prevalent in affective and anxiety disorders. Copyright © 2015 the authors 0270-6474/15/353085-15$15.00/0.

  14. Sensemaking, stakeholder discord, and long-term risk communication at a US Superfund site.

    PubMed

    Hoover, Anna Goodman

    2017-03-01

    Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study's analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape.

  15. Triangulating case-finding tools for patient safety surveillance: a cross-sectional case study of puncture/laceration.

    PubMed

    Taylor, Jennifer A; Gerwin, Daniel; Morlock, Laura; Miller, Marlene R

    2011-12-01

    To evaluate the need for triangulating case-finding tools in patient safety surveillance. This study applied four case-finding tools to error-associated patient safety events to identify and characterise the spectrum of events captured by these tools, using puncture or laceration as an example for in-depth analysis. Retrospective hospital discharge data were collected for calendar year 2005 (n=48,418) from a large, urban medical centre in the USA. The study design was cross-sectional and used data linkage to identify the cases captured by each of four case-finding tools. Three case-finding tools (International Classification of Diseases external (E) and nature (N) of injury codes, Patient Safety Indicators (PSI)) were applied to the administrative discharge data to identify potential patient safety events. The fourth tool was Patient Safety Net, a web-based voluntary patient safety event reporting system. The degree of mutual exclusion among detection methods was substantial. For example, when linking puncture or laceration on unique identifiers, out of 447 potential events, 118 were identical between PSI and E-codes, 152 were identical between N-codes and E-codes and 188 were identical between PSI and N-codes. Only 100 events that were identified by PSI, E-codes and N-codes were identical. Triangulation of multiple tools through data linkage captures potential patient safety events most comprehensively. Existing detection tools target patient safety domains differently, and consequently capture different occurrences, necessitating the integration of data from a combination of tools to fully estimate the total burden.

  16. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The indirect verification of numerical scheme is proposed via the Benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE are considered, including a parametric β-scan covering the transition between the ITG to KBM and the spectral properties at the nominal β value.

  17. Quality of head injury coding from autopsy reports with AIS © 2005 update 2008.

    PubMed

    Schick, Sylvia; Humrich, Anton; Graw, Matthias

    2018-02-28

    ABSTACT Objective: Coding injuries from autopsy reports of traffic accident victims according to Abbreviated Injury Scale AIS © 2005 update 2008 [1] is quite time consuming. The suspicion arose, that many issues leading to discussion between coder and control reader were based on information required by the AIS that was not documented in the autopsy reports. To quantify this suspicion, we introduced an AIS-detail-indicator (AIS-DI). To each injury in the AIS Codebook one letter from A to N was assigned indicating the level of detail. Rules were formulated to receive repeatable assignments. This scheme was applied to a selection of 149 multiply injured traffic fatalities. The frequencies of "not A" codes were calculated for each body region and it was analysed, why the most detailed level A had not been coded. As a first finding, the results of the head region are presented. 747 AIS head injury codes were found in 137 traffic fatalities, and 60% of these injuries were coded with an AIS-DI of level A. There are three different explanations for codes of AIS-DI "not A": Group 1 "Missing information in autopsy report" (5%), Group 2 "Clinical data required by AIS" (20%), and Group 3 "AIS system determined" (15%). Groups 1 and 2 show consequences for the ISS in 25 cases. Other body regions might perform differently. The AIS-DI can indicate the quality of the underlying data basis and, depending on the aims of different AIS users it can be a helpful tool for quality checks.

  18. 3D Equilibrium Effects Due to RMP Application on DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. Lazerson, E. Lazarus, S. Hudson, N. Pablant and D. Gates

    2012-06-20

    The mitigation and suppression of edge localized modes (ELMs) through application of resonant magnetic perturbations (RMPs) in Tokamak plasmas is a well documented phenomenon [1]. Vacuum calculations suggest the formation of edge islands and stochastic regions when RMPs are applied to the axisymmetric equilibria. Self-consistent calculations of the plasma equilibrium with the VMEC [2] and SPEC [3] codes have been performed for an up-down symmetric shot (142603) in DIII-D. In these codes, a self-consistent calculation of the plasma response due to the RMP coils is calculated. The VMEC code globally enforces the constraints of ideal MHD; consequently, a continuously nestedmore » family of flux surfaces is enforced throughout the plasma domain. This approach necessarily precludes the observation of islands or field-line chaos. The SPEC code relaxes the constraints of ideal MHD locally, and allows for islands and field line chaos at or near the rational surfaces. Equilibria with finite pressure gradients are approximated by a set of discrete "ideal-interfaces" at the most irrational flux surfaces and where the strongest pressure gradients are observed. Both the VMEC and SPEC calculations are initialized from EFIT reconstructions of the plasma that are consistent with the experimental pressure and current profiles. A 3D reconstruction using the STELLOPT code, which fits VMEC equilibria to experimental measurements, has also been performed. Comparisons between the equilibria generated by the 3D codes and between STELLOPT and EFIT are presented.« less

  19. 3D Equilibrium Effects Due to RMP Application on DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lazerson, S.; Lazarus, E.; Hudson, S.

    2012-06-20

    The mitigation and suppression of edge localized modes (ELMs) through application of resonant magnetic perturbations (RMPs) in Tokamak plasmas is a well documented phenomenon. Vacuum calculations suggest the formation of edge islands and stochastic regions when RMPs are applied to the axisymmetric equilibria. Self-consistent calculations of the plasma equilibrium with the VMEC and SPEC codes have been performed for an up-down symmetric shot in DIII-D. In these codes, a self-consistent calculation of the plasma response due to the RMP coils is calculated. The VMEC code globally enforces the constraints of ideal MHD; consequently, a continuously nested family of flux surfacesmore » is enforced throughout the plasma domain. This approach necessarily precludes the observation of islands or field-line chaos. The SPEC code relaxes the constraints of ideal MHD locally, and allows for islands and field line chaos at or near the rational surfaces. Equilibria with finite pressure gradients are approximated by a set of discrete "ideal-interfaces" at the most irrational flux surfaces and where the strongest pressure gradients are observed. Both the VMEC and SPEC calculations are initialized from EFIT reconstructions of the plasma that are consistent with the experimental pressure and current profiles. A 3D reconstruction using the STELLOPT code, which fits VMEC equilibria to experimental measurements, has also been performed. Comparisons between the equilibria generated by the 3D codes and between STELLOPT and EFIT are presented.« less

  20. Momentary Patterns of Covariation between Specific Affects and Interpersonal Behavior: Linking Relationship Science and Personality Assessment

    PubMed Central

    Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.

    2016-01-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window in to relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786

  1. Gyrofluid Modeling of Turbulent, Kinetic Physics

    NASA Astrophysics Data System (ADS)

    Despain, Kate Marie

    2011-12-01

    Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetics models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E x B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.

  2. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    NASA Astrophysics Data System (ADS)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links, and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter benefits a low rate convolutional encoder. In this method, the bit interval is divided into Nc chips and the output of the encoder along with a PN sequence assigned to the user determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for correlation receiver considering background noise which is modeled as White Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where the multiple access interference has the dominant effect, the performance improves by the coding gain. But at low transmit power, in which the increase of coding gain leads to the decrease of the chip time, and consequently, to more corruption due to the channel dispersion, there exists an optimum value for the coding gain. However, for the matched filter, the performance always improves by the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. Our results show that, for the same bandwidth and bit rate, the proposed system excels other multiple access techniques, like conventional CDMA and time hopping scheme.

  3. Attacks on quantum key distribution protocols that employ non-ITS authentication

    NASA Astrophysics Data System (ADS)

    Pacher, C.; Abidin, A.; Lorünser, T.; Peev, M.; Ursin, R.; Zeilinger, A.; Larsson, J.-Å.

    2016-01-01

    We demonstrate how adversaries with large computing resources can break quantum key distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not information-theoretically secure (ITS) since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward man-in-the-middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains with high probability a message that has small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols, we describe every single action taken by the adversary. For all protocols, the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods which allow to calculate colliding messages, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures, increasing the eavesdroppers demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.

  4. Momentary patterns of covariation between specific affects and interpersonal behavior: Linking relationship science and personality assessment.

    PubMed

    Ross, Jaclyn M; Girard, Jeffrey M; Wright, Aidan G C; Beeney, Joseph E; Scott, Lori N; Hallquist, Michael N; Lazarus, Sophie A; Stepp, Stephanie D; Pilkonis, Paul A

    2017-02-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window in to relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the 2 systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Impaired Tuning of Neural Ensembles and the Pathophysiology of Schizophrenia: A Translational and Computational Neuroscience Perspective.

    PubMed

    Krystal, John H; Anticevic, Alan; Yang, Genevieve J; Dragoi, George; Driesen, Naomi R; Wang, Xiao-Jing; Murray, John D

    2017-05-15

    The functional optimization of neural ensembles is central to human higher cognitive functions. When the functions through which neural activity is tuned fail to develop or break down, symptoms and cognitive impairments arise. This review considers ways in which disturbances in the balance of excitation and inhibition might develop and be expressed in cortical networks in association with schizophrenia. This presentation is framed within a developmental perspective that begins with disturbances in glutamate synaptic development in utero. It considers developmental correlates and consequences, including compensatory mechanisms that increase intrinsic excitability or reduce inhibitory tone. It also considers the possibility that these homeostatic increases in excitability have potential negative functional and structural consequences. These negative functional consequences of disinhibition may include reduced working memory-related cortical activity associated with the downslope of the "inverted-U" input-output curve, impaired spatial tuning of neural activity and impaired sparse coding of information, and deficits in the temporal tuning of neural activity and its implication for neural codes. The review concludes by considering the functional significance of noisy activity for neural network function. The presentation draws on computational neuroscience and pharmacologic and genetic studies in animals and humans, particularly those involving N-methyl-D-aspartate glutamate receptor antagonists, to illustrate principles of network regulation that give rise to features of neural dysfunction associated with schizophrenia. While this presentation focuses on schizophrenia, the general principles outlined in the review may have broad implications for considering disturbances in the regulation of neural ensembles in psychiatric disorders. Published by Elsevier Inc.

  6. Evaluation of bottom-up and downscaled emission inventories for Paris and consequences for estimating urban air pollution increments

    NASA Astrophysics Data System (ADS)

    Timmermans, R.; Denier van der Gon, H.; Segers, A.; Honore, C.; Perrussel, O.; Builtjes, P.; Schaap, M.

    2012-04-01

    Since a major part of the Earth's population lives in cities, it is of great importance to correctly characterise the air pollution levels over these urban areas. Many studies in the past have already been dedicated to this subject and have determined so-called urban increments: the impact of large cities on the air pollution levels. The impact of large cities on air pollution levels usually is determined with models driven by so-called downscaled emission inventories. In these inventories official country total emissions are gridded using information on for example population density and location of industries and roads. The question is how accurate are the downscaled inventories over cities or large urban areas. Within the EU FP 7 project MEGAPOLI project a new emission inventory has been produced including refined local emission data for two European megacities (Paris, London) and two urban conglomerations (the Po valley, Italy and the Rhine-Ruhr region, Germany) based on a bottom-up approach. The inventory has comparable national totals but remarkable difference at the city scale. Such a bottom up inventory is thought to be more accurate as it contains local knowledge. Within this study we compared modelled nitrogen dioxide (NO2) and particulate matter (PM) concentrations from the LOTOS-EUROS chemistry transport model driven by a conventional downscaled emission inventory (TNO-MACC inventory) with the concentrations from the same model driven by the new MEGAPOLI 'bottom-up' emission inventory focusing on the Paris region. Model predictions for Paris significantly improve using the new Megapoli inventory. Both the emissions as well as the simulated average concentrations of PM over urban sites in Paris are much lower due to the different spatial distribution of the anthropogenic emissions. The difference for the nearby rural stations is small implicating that also the urban increment for PM simulated using the bottom-up emission inventory is much smaller than for the downscaled emission inventory. Urban increments for PM calculated with downscaled emissions, as is common practice, might therefore be overestimated. This finding is likely to apply to other European Megacities as well.

  7. Effect of Isotope Mass in Simulations of JET H-mode Discharges

    NASA Astrophysics Data System (ADS)

    Snyder, S. E.; Onjun, T.; Kritz, A. H.; Bateman, G.; Parail, V.

    2004-11-01

    In JET type-I ELMy H-mode discharges, it is found that the height of the pressure pedestal increases and the frequency of the ELMs decreases with increasing isotope mass. These experimentally observed trends are obtained in these simulations only if the pedestal width increases with isotope mass. Simulations are carried out using the JETTO integrated modeling code with a dynamic model for the H-mode pedestal and the ELMs.(T. Onjun et al, Phys. Plasmas 11 (2004) 1469 and 3006.) The HELENA and MISHKA stability codes are applied to calibrate the stability criteria used to trigger ELM crashes in the JETTO code and to explore possible access to second stability in the pedestal. In the simulations, transport in the pedestal is given by the ion thermal neoclassical diffusivity, which increases with isotope mass. Consequently, as the isotope mass is increased, the pressure gradient and the bootstrap current in the pedestal rebuild more slowly after each ELM crash. Several models are explored in which the pedestal width increases with isotope mass.

  8. Facility Targeting, Protection and Mission Decision Making Using the VISAC Code

    NASA Technical Reports Server (NTRS)

    Morris, Robert H.; Sulfredge, C. David

    2011-01-01

    The Visual Interactive Site Analysis Code (VISAC) has been used by DTRA and several other agencies to aid in targeting facilities and to predict the associated collateral effects for the go/no-go mission decision making process. VISAC integrates the three concepts of target geometric modeling, damage assessment capabilities, and an event/fault tree methodology for evaluating accident/incident consequences. It can analyze a variety of accidents/incidents at nuclear or industrial facilities, ranging from simple component sabotage to an attack with military or terrorist weapons. For nuclear facilities, VISAC predicts the facility damage, the estimated downtime, and the amount and timing of any radionuclides released. Used in conjunction with DTRA's HPAC code, VISAC can also analyze the transport and dispersion of the radionuclides, levels of contamination of the surrounding area, and the population at risk. VISAC has also been used by the NRC to aid in the development of protective measures for nuclear facilities that may be subjected to attacks by car/truck bombs.

  9. Methylation of miRNA genes and oncogenesis.

    PubMed

    Loginov, V I; Rykov, S V; Fridman, M V; Braga, E A

    2015-02-01

    Interaction between microRNA (miRNA) and messenger RNA of target genes at the posttranscriptional level provides fine-tuned dynamic regulation of cell signaling pathways. Each miRNA can be involved in regulating hundreds of protein-coding genes, and, conversely, a number of different miRNAs usually target a structural gene. Epigenetic gene inactivation associated with methylation of promoter CpG-islands is common to both protein-coding genes and miRNA genes. Here, data on functions of miRNAs in development of tumor-cell phenotype are reviewed. Genomic organization of promoter CpG-islands of the miRNA genes located in inter- and intragenic areas is discussed. The literature and our own results on frequency of CpG-island methylation in miRNA genes from tumors are summarized, and data regarding a link between such modification and changed activity of miRNA genes and, consequently, protein-coding target genes are presented. Moreover, the impact of miRNA gene methylation on key oncogenetic processes as well as affected signaling pathways is discussed.

  10. Circular non-coding RNA ANRIL modulates ribosomal RNA maturation and atherosclerosis in humans

    PubMed Central

    Holdt, Lesca M.; Stahringer, Anika; Sass, Kristina; Pichler, Garwin; Kulak, Nils A.; Wilfert, Wolfgang; Kohlmaier, Alexander; Herbst, Andreas; Northoff, Bernd H.; Nicolaou, Alexandros; Gäbel, Gabor; Beutner, Frank; Scholz, Markus; Thiery, Joachim; Musunuru, Kiran; Krohn, Knut; Mann, Matthias; Teupser, Daniel

    2016-01-01

    Circular RNAs (circRNAs) are broadly expressed in eukaryotic cells, but their molecular mechanism in human disease remains obscure. Here we show that circular antisense non-coding RNA in the INK4 locus (circANRIL), which is transcribed at a locus of atherosclerotic cardiovascular disease on chromosome 9p21, confers atheroprotection by controlling ribosomal RNA (rRNA) maturation and modulating pathways of atherogenesis. CircANRIL binds to pescadillo homologue 1 (PES1), an essential 60S-preribosomal assembly factor, thereby impairing exonuclease-mediated pre-rRNA processing and ribosome biogenesis in vascular smooth muscle cells and macrophages. As a consequence, circANRIL induces nucleolar stress and p53 activation, resulting in the induction of apoptosis and inhibition of proliferation, which are key cell functions in atherosclerosis. Collectively, these findings identify circANRIL as a prototype of a circRNA regulating ribosome biogenesis and conferring atheroprotection, thereby showing that circularization of long non-coding RNAs may alter RNA function and protect from human disease. PMID:27539542

  11. Nonlinear ship waves and computational fluid dynamics

    PubMed Central

    MIYATA, Hideaki; ORIHARA, Hideo; SATO, Yohei

    2014-01-01

    Research work undertaken in the first author's laboratory at the University of Tokyo over the past 30 years is highlighted. The finding that nonlinear waves (named Free-Surface Shock Waves) occur in the vicinity of a ship advancing at constant speed provided the starting point for the progress of innovative technologies in ship hull-form design. Based on these findings, a multitude of Computational Fluid Dynamics (CFD) techniques have been developed over this period, and are highlighted in this paper. The TUMMAC code has been developed for wave problems, based on a rectangular grid system, while the WISDAM code treats both wave and viscous flow problems in the framework of a boundary-fitted grid system. These two techniques are able to cope with almost all fluid dynamical problems relating to ships, including resistance, ship motion and ride-comfort issues. Consequently, the two codes have contributed significantly to the progress in the technology of ship design, and now form an integral part of the ship-designing process. PMID:25311139

  12. The neuromechanics of hearing

    NASA Astrophysics Data System (ADS)

    Araya, Mussie K.; Brownell, William E.

    2015-12-01

    Hearing requires precise detection and coding of acoustic signals by the inner ear and equally precise communication of the information through the auditory brainstem. A membrane based motor in the outer hair cell lateral wall contributes to the transformation of sound into a precise neural code. Structural, molecular and energetic similarities between the outer hair cell and auditory brainstem neurons suggest that a similar membrane based motor may contribute to signal processing in the auditory CNS. Cooperative activation of voltage gated ion channels enhances neuronal temporal processing and increases the upper frequency limit for phase locking. We explore the possibility that membrane mechanics contribute to ion channel cooperativity as a consequence of the nearly instantaneous speed of electromechanical signaling and the fact that membrane composition and mechanics modulate ion channel function.

  13. Kombucha brewing under the Food and Drug Administration model Food Code: risk analysis and processing guidance.

    PubMed

    Nummer, Brian A

    2013-11-01

    Kombucha is a fermented beverage made from brewed tea and sugar. The taste is slightly sweet and acidic and it may have residual carbon dioxide. Kombucha is consumed in many countries as a health beverage and it is gaining in popularity in the U.S. Consequently, many retailers and food service operators are seeking to brew this beverage on site. As a fermented beverage, kombucha would be categorized in the Food and Drug Administration model Food Code as a specialized process and would require a variance with submission of a food safety plan. This special report was created to assist both operators and regulators in preparing or reviewing a kombucha food safety plan.

  14. Ethics in Science: The Unique Consequences of Chemistry.

    PubMed

    Kovac, Jeffrey

    2015-01-01

    This article discusses the ethical issues unique to the science and practice of chemistry. These issues arise from chemistry's position in the middle between the theoretical and the practical, a science concerned with molecules that are of the right size to directly affect human life. Many of the issues are raised by the central activity of chemistry--synthesis. Chemists make thousands of new substances each year. Many are beneficial, but others are threats. Since the development of the chemical industry in the nineteenth century, chemistry has contributed to the deterioration of the environment but has also helped to reduce pollution. Finally, we discuss the role of codes of ethics and whether the current codes of conduct for chemists are adequate for the challenges of today's world.

  15. CoreTSAR: Core Task-Size Adapting Runtime

    DOE PAGES

    Scogland, Thomas R. W.; Feng, Wu-chun; Rountree, Barry; ...

    2014-10-27

    Heterogeneity continues to increase at all levels of computing, with accelerators such as GPUs, FPGAs, and other co-processors finding their way into everything from desktops to supercomputers. As a consequence, efficiently managing such disparate resources has become increasingly complex. CoreTSAR seeks to reduce this complexity by adaptively worksharing parallel-loop regions across compute resources without requiring any transformation of the code within the loop. Our results show performance improvements of up to three-fold over a current state-of-the-art heterogeneous task scheduler as well as linear performance scaling from a single GPU to four GPUs for many codes. In addition, CoreTSAR demonstrates a robust ability to adapt to both a variety of workloads and underlying system configurations.
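
    The adaptive worksharing idea can be illustrated with a small ratio-based sketch: each device's share of the next pass's iterations is set in proportion to its measured throughput on the previous pass. This is a hypothetical illustration of the general technique, not CoreTSAR's actual scheduling algorithm; the device names and timings are invented.

      # Hypothetical throughput-proportional worksharing across devices (not the CoreTSAR code).

      def split_iterations(total_iters, measured_time, prev_share):
          """Assign iteration counts in proportion to observed throughput (iterations/second)."""
          throughput = {d: prev_share[d] / measured_time[d] for d in measured_time}
          total_tp = sum(throughput.values())
          shares = {d: int(total_iters * tp / total_tp) for d, tp in throughput.items()}
          fastest = max(throughput, key=throughput.get)
          shares[fastest] += total_iters - sum(shares.values())  # hand the rounding remainder to the fastest device
          return shares

      prev_share = {"cpu": 4000, "gpu0": 6000}      # iterations assigned on the previous pass
      measured_time = {"cpu": 2.0, "gpu0": 0.5}     # seconds each device took on that pass
      print(split_iterations(10000, measured_time, prev_share))
      # The GPU, six times faster per iteration here, receives roughly 86% of the next pass.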

  16. Coulomb effects in low-energy nuclear fragmentation

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Francis F.; John, Sarah

    1993-01-01

    Early versions of the Langley nuclear fragmentation code NUCFRAG (and a publicly released version called HZEFRG1) assumed straight-line trajectories throughout the interaction. As a consequence, NUCFRAG and HZEFRG1 give unrealistic cross sections for large mass removal from the projectile and target at low energies. A correction for the distortion of the trajectory by the nuclear Coulomb fields is used to derive fragmentation cross sections. A simple energy-loss term is applied to estimate the energy downshifts that greatly alter the Coulomb trajectory at low energy. The results, which are far more realistic than those of prior versions of the code, should provide the database for future transport calculations. The systematic behavior of charge-removal cross sections compares favorably with results from low-energy experiments.
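
    The size of the Coulomb effect can be gauged from the classical distance of closest approach for a projectile with center-of-mass kinetic energy E and impact parameter b: r_min = d/2 + sqrt((d/2)^2 + b^2), with d = Z1*Z2*e^2/(4*pi*eps0*E) the head-on value. The sketch below evaluates this textbook formula for an arbitrary projectile-target pair; it is illustrative only and is not the NUCFRAG/HZEFRG1 correction itself.

      import math

      # Classical Coulomb-distorted distance of closest approach (textbook formula, illustrative).
      E2 = 1.44  # e^2/(4*pi*eps0) in MeV*fm

      def closest_approach(z1, z2, e_cm_mev, b_fm):
          """r_min = d/2 + sqrt((d/2)^2 + b^2), with d the head-on closest approach."""
          d = E2 * z1 * z2 / e_cm_mev
          return d / 2.0 + math.sqrt((d / 2.0) ** 2 + b_fm ** 2)

      # Arbitrary example: Fe (Z=26) on C (Z=6) at a grazing impact parameter of 8 fm
      for e_cm in (25.0, 250.0):  # MeV
          print(e_cm, "MeV ->", round(closest_approach(26, 6, e_cm, 8.0), 2), "fm")
      # At the lower energy the trajectory is pushed well outside the straight-line 8 fm,
      # reducing the nuclear overlap that drives large mass removal.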

  17. Progress in theoretical and numerical modeling of RF/MHD coupling using NIMROD

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Schnack, Dalton D.; Hegna, Chris C.; Callen, James D.; Sovinec, Carl R.; Held, Eric D.; Ji, Jeong-Young; Kruger, Scott E.

    2007-11-01

    Preliminary work relevant to the development of a general framework for the self-consistent inclusion of RF effects in fluid codes is presented; specifically, the stabilization of neoclassical and conventional tearing modes by electron cyclotron current drive is considered. For this particular problem, the effects of the RF drive can be formally captured by a quasilinear diffusion operator which enters the fluid equations on the same footing as the collision operator. Furthermore, a Chapman-Enskog-like method can be used to determine the consequent effects of the RF drive on the fluid closures for the parallel heat flow and stress. We summarize our recent research along these lines and discuss issues relevant to its implementation in the NIMROD code.

  18. Isolation of an intertypic poliovirus capsid recombinant from a child with vaccine-associated paralytic poliomyelitis.

    PubMed

    Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D

    2002-11-01

    The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5'-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative.

  19. Non-coding RNA networks in cancer.

    PubMed

    Anastasiadou, Eleni; Jacob, Leni S; Slack, Frank J

    2018-01-01

    Thousands of unique non-coding RNA (ncRNA) sequences exist within cells. Work from the past decade has altered our perception of ncRNAs from 'junk' transcriptional products to functional regulatory molecules that mediate cellular processes including chromatin remodelling, transcription, post-transcriptional modifications and signal transduction. The networks in which ncRNAs engage can influence numerous molecular targets to drive specific cell biological responses and fates. Consequently, ncRNAs act as key regulators of physiological programmes in developmental and disease contexts. Particularly relevant in cancer, ncRNAs have been identified as oncogenic drivers and tumour suppressors in every major cancer type. Thus, a deeper understanding of the complex networks of interactions that ncRNAs coordinate would provide a unique opportunity to design better therapeutic interventions.

  20. Studies of Planet Formation using a Hybrid N-body + Planetesimal Code

    NASA Technical Reports Server (NTRS)

    Kenyon, Scott J.; Bromley, Benjamin C.; Salamon, Michael (Technical Monitor)

    2005-01-01

    The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: 1) icy planets - models for icy planet formation will demonstrate how the physical properties of debris disks, including the Kuiper Belt in our solar system, depend on initial conditions and input physics; and 2) terrestrial planets - calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment. During the past year, we made progress on each issue. Papers published in 2004 are summarized. Summaries of work to be completed during the first half of 2005 and work planned for the second half of 2005 are included.

  1. On Applicability of Network Coding Technique for 6LoWPAN-based Sensor Networks.

    PubMed

    Amanowicz, Marek; Krygier, Jaroslaw

    2018-05-26

    In this paper, the applicability of the network coding technique in 6LoWPAN-based sensor multihop networks is examined. The 6LoWPAN is one of the standards proposed for the Internet of Things architecture. Thus, we can expect significant growth of traffic in such networks, which can lead to overload and a decrease in the sensor network lifetime. The authors propose an inter-session network coding mechanism that can be implemented in resource-limited sensor motes. The solution reduces the overall traffic in the network and, in consequence, decreases the energy consumption. The procedures used take into account the deep header compression of native 6LoWPAN packets and the hop-by-hop changes of the header structure. The applied simplifications reduce the signaling traffic that typically occurs in network coding deployments, keeping the solution useful for wireless sensor networks with limited resources. The authors validate the proposed procedures in terms of end-to-end packet delay, packet loss ratio, traffic in the air, total energy consumption, and network lifetime. The solution has been tested in a real wireless sensor network. The results confirm the efficiency of the proposed technique, mostly in delay-tolerant sensor networks.
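
    The core idea of inter-session network coding can be sketched in a few lines: a relay XORs packets from two flows that cross it, and each destination recovers the foreign packet by XORing the coded packet with the packet it already holds. The generic two-flow relay example below is illustrative only and does not reproduce the authors' 6LoWPAN header handling or signaling.

      # Generic inter-session (XOR) network coding at a relay node; not the 6LoWPAN-specific scheme.

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          """XOR two equal-length byte strings."""
          return bytes(x ^ y for x, y in zip(a, b))

      # Two motes send equal-length packets that cross at a common relay.
      pkt_a = b"temp=21.5C  "   # from mote A
      pkt_b = b"hum=40%     "   # from mote B

      coded = xor_bytes(pkt_a, pkt_b)   # the relay broadcasts one coded packet instead of two

      # Each receiver already knows (or overheard) one native packet and decodes the other.
      assert xor_bytes(coded, pkt_a) == pkt_b
      assert xor_bytes(coded, pkt_b) == pkt_a
      print("relay transmissions saved: 1 of 2")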

  2. The Analysis of Design of Robust Nonlinear Estimators and Robust Signal Coding Schemes.

    DTIC Science & Technology

    1982-09-16

    (Abstract available only as fragments in the source record:) ... between uniform and nonuniform quantizers. For the nonuniform quantizer we can expect the mean-square error to ... We define f^(n)(s) as the n-times filtered signal ... in the window greater than or equal to the value at p + 1; consequently, point p + 1 is the median and ...

  3. Environmental Compliance Assessment Army Reserve (ECAAR)

    DTIC Science & Technology

    1993-09-01

    (Abstract available only as fragments in the source record:) ... and water; spent mixed acid; spent caustic; spent sulfuric acid. Potential consequences: heat generation, violent reaction (Group 2-A, Group 2-B, aluminum, any ...). ... methane reforming furnaces, pulping liquor recovery furnaces, combustion devices used in the recovery of sulfur values from spent sulfuric acid ... Industry and USEPA hazardous waste: Hazard No., Hazardous Waste Code; generic F001, the spent halogenated solvents used in degreasing: trichloroethylene ...

  4. DoD Resource Augmentation for Civilian Consequence Management (DRACCM) Tool

    DTIC Science & Technology

    2015-07-01

    (Abstract available only as fragments in the source record:) ... staffing availabilities for the nine regions. Finally, we added options to view IDAC data that included school closings, vaccinations, antivirals ... there is enough critical medical resource at that hospital for a given day. A hospital icon coded yellow means that at least one critical medical ... tularemia, Q fever, SEB, anthrax, plague (with contagion), VEE, botulism, brucellosis, glanders, smallpox (with contagion), influenza, cesium, sarin, VX.

  5. Geoethics: what can we learn from existing bio-, ecological, and engineering ethics codes?

    NASA Astrophysics Data System (ADS)

    Kieffer, Susan W.; Palka, John

    2014-05-01

    Many scientific disciplines are concerned about ethics, and codes of ethics for these professions exist, generally through the professional scientific societies such as the American Geophysical Union (AGU), American Geological Institute (AGI), American Association of Petroleum Engineers (AAPE), National Society of Professional Engineers (NSPE), Ecological Society of America (ESA), and many others worldwide. These vary considerably in depth and specificity. In this poster, we review existing codes with the goal of extracting fundamentals that should/can be broadly applied to all geo-disciplines. Most of these codes elucidate a set of principles that cover practical issues such as avoiding conflict of interest, avoiding plagiarism, not permitting illegitimate use of intellectual products, enhancing the prestige of the profession, acknowledging an obligation to perform services only in areas of competence, issuing public statements only in an objective manner, holding paramount the welfare of the public, and in general conducting oneself honorably, responsibly, and lawfully. It is striking that, given that the work of these societies and their members is relevant to the future of the earth, few discuss in any detail ethical obligations regarding our relation to the planet itself. The AGU code, for example, only states that "Members have an ethical obligation to weigh the societal benefits of their research against the costs and risks to human and animal welfare and impacts on the environment and society." The NSPE and AGI codes go somewhat further: "Engineers are encouraged to adhere to the principles of sustainable development in order to protect the environment for future generations," and "Geoscientists should strive to protect our natural environment. They should understand and anticipate the environmental consequences of their work and should disclose the consequences of recommended actions. They should acknowledge that resource extraction and use are necessary to the existence of our society and that such should be undertaken in an environmentally and economically responsible manner." However, statements such as these still focus primarily on the value of the earth to generations of humans, rather than on the earth itself. They remain far from addressing our obligation to the land as summarized, for example, by Aldo Leopold, widely regarded as the principal founder of the American conservation movement: "The individual is a member of a community of interdependent parts. The land ethic simply enlarges the boundaries of the community to include soils, waters, plants and animals, or collectively the land." In this poster, we compare and contrast the various existing codes and suggest ways in which ethical obligations to the community itself, as defined by Leopold, could be more clearly incorporated.

  6. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  7. FILM-30: A Heat Transfer Properties Code for Water Coolant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MARSHALL, THERON D.

    2001-02-01

    A FORTRAN computer code has been written to calculate the heat transfer properties at the wetted perimeter of a coolant channel when provided the bulk water conditions. This computer code is titled FILM-30 and the code calculates its heat transfer properties by using the following correlations: (1) Sieder-Tate: forced convection, (2) Bergles-Rohsenow: onset of nucleate boiling, (3) Bergles-Rohsenow: partially developed nucleate boiling, (4) Araki: fully developed nucleate boiling, (5) Tong-75: critical heat flux (CHF), and (6) Marshall-98: transition boiling. FILM-30 produces output files that provide the heat flux and heat transfer coefficient at the wetted perimeter as a function of temperature. To validate FILM-30, the calculated heat transfer properties were used in finite element analyses to predict internal temperatures for a water-cooled copper mockup under one-sided heating from a rastered electron beam. These predicted temperatures were compared with the measured temperatures from the author's 1994 and 1998 heat transfer experiments. There was excellent agreement between the predicted and experimentally measured temperatures, which confirmed the accuracy of FILM-30 within the experimental range of the tests. FILM-30 can accurately predict the CHF and transition boiling regimes, which is an important advantage over current heat transfer codes. Consequently, FILM-30 is ideal for predicting heat transfer properties for applications that feature high heat fluxes produced by one-sided heating.
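
    For orientation, the first correlation listed above has the standard textbook form Nu = 0.027 Re^0.8 Pr^(1/3) (mu_b/mu_w)^0.14 for turbulent forced convection, with the heat transfer coefficient h = Nu*k/D_h. The sketch below evaluates this generic form with arbitrary water-like properties; it is not taken from the FILM-30 source and does not reproduce the boiling or CHF correlations.

      # Generic Sieder-Tate forced-convection correlation (illustrative inputs, not FILM-30 output).

      def sieder_tate_h(re, pr, k, d_h, mu_bulk, mu_wall):
          """Heat transfer coefficient h [W/m^2-K] from Nu = 0.027 Re^0.8 Pr^(1/3) (mu_b/mu_w)^0.14."""
          nu = 0.027 * re ** 0.8 * pr ** (1.0 / 3.0) * (mu_bulk / mu_wall) ** 0.14
          return nu * k / d_h

      # Roughly water-like properties in a 1 cm channel (hypothetical conditions)
      h = sieder_tate_h(re=5.0e4, pr=4.3, k=0.63, d_h=0.01, mu_bulk=6.5e-4, mu_wall=4.0e-4)
      print(f"h ~ {h:,.0f} W/m^2-K")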

  8. Revision of seismic design codes corresponding to building damages in the "5.12" Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yayong

    2010-06-01

    A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.

  9. The effects of collateral consequences of criminal involvement on employment, use of Temporary Assistance for Needy Families, and health

    PubMed Central

    Kneipp, Shawn M.

    2017-01-01

    Criminal convictions are often associated with collateral consequences that limit access to the forms of employment and social services on which disadvantaged women most frequently rely – regardless of the severity of the offense. These consequences may play an important role in perpetuating health disparities by socioeconomic status and gender. We examined the extent to which research studies to date have assessed whether a criminal conviction might influence women’s health by limiting access to Temporary Assistance for Needy Families (TANF) and employment, as a secondary, or “collateral” criminal conviction-related consequence. We reviewed 434 peer-reviewed journal articles retrieved from three electronic article databases and 197 research reports from three research organizations. Two reviewers independently extracted data from each eligible article or report using a standardized coding scheme. Of the sixteen eligible studies included in the review, most were descriptive. None explored whether receiving TANF modified health outcomes, despite its potential to do so. Researchers to date have not fully examined the causal pathways that could link employment, receiving TANF, and health, especially for disadvantaged women. Future research is needed to address this gap and to understand better the potential consequences of the criminal justice system involvement on the health of this vulnerable population. PMID:25905904

  10. The Effects of Collateral Consequences of Criminal Involvement on Employment, Use of Temporary Assistance for Needy Families, and Health.

    PubMed

    Sheely, Amanda; Kneipp, Shawn M

    2015-01-01

    Criminal convictions are often associated with collateral consequences that limit access to the forms of employment and social services on which disadvantaged women most frequently rely--regardless of the severity of the offense. These consequences may play an important role in perpetuating health disparities by socioeconomic status and gender. We examined the extent to which research studies to date have assessed whether a criminal conviction might influence women's health by limiting access to Temporary Assistance for Needy Families (TANF) and employment, as a secondary, or "collateral" criminal conviction-related consequence. We reviewed 434 peer-reviewed journal articles retrieved from three electronic article databases and 197 research reports from three research organizations. Two reviewers independently extracted data from each eligible article or report using a standardized coding scheme. Of the sixteen eligible studies included in the review, most were descriptive. None explored whether receiving TANF modified health outcomes, despite its potential to do so. Researchers to date have not fully examined the causal pathways that could link employment, receiving TANF, and health, especially for disadvantaged women. Future research is needed to address this gap and to understand better the potential consequences of the criminal justice system involvement on the health of this vulnerable population.

  11. Changes in the prevalence of alcohol in rap music lyrics 1979-2009.

    PubMed

    Herd, Denise

    2014-02-01

    This study examines the prevalence and context of alcohol references in rap music lyrics from 1979 through 2009. Four hundred nine top-ranked rap music songs released during this period were sampled from Billboard magazine rating charts. Songs were analyzed using systematic content analysis and were coded for alcohol beverage types and brand names, drinking behaviors, drinking contexts, attitudes towards alcohol, and consequences of drinking. Trends were analyzed using regression analyses. The results of the study reveal significant increases in the presence of alcohol in rap songs; a decline in negative attitudes towards alcohol; decreases in consequences attributed to alcohol; increases in the association of alcohol with glamour and wealth, drugs, and nightclubs; and increases in references to liquor and champagne.

  12. Deep-Earth reactor: nuclear fission, helium, and the geomagnetic field.

    PubMed

    Hollenbach, D F; Herndon, J M

    2001-09-25

    Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having (3)He/(4)He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power.

  13. India's homosexual discrimination and health consequences.

    PubMed

    Agoramoorthy, Govindasamy; Minna, J Hsu

    2007-08-01

    A large number of countries worldwide have legalized homosexual rights. But for 147 years, since the time when India was a British colony, Section 377 of the Indian Penal Code has defined homosexuality as a crime, punishable by imprisonment. This outdated law violates the fundamental rights of homosexuals in India. Despite the fact that literature drawn from Hindu, Buddhist, Muslim, and modern fiction testifies to the presence of same-sex love in various forms, homosexuality is still considered a taboo subject in India, by both the society and the government. In the present article, the continuation of the outdated colonial-era homosexuality law and its impact on the underprivileged homosexual society in India is discussed, as well as the consequences for this group's health in relation to HIV infection.

  14. Littoral Combat Ship Manpower, an Overview of Officer Characteristics and Placement

    DTIC Science & Technology

    2013-03-01

    (Abstract available only as fragments in the source record:) ... maritime force: 1) networks should be the central organizing principle of the fleet, and its sensing and fighting power should be distributed across ... "assured access" force; and 4) numbers of hulls count (quantity had its own quality), and consequently the fleet's combat power should be ...

  15. In Search of the Good War: Just War and Realpolitik in Our Time

    DTIC Science & Technology

    2012-10-01

    (Abstract available only as fragments in the source record:) ... 1914, few formal treaties governed armed conflict. Early efforts included the American Lieber Code in 1863, the first Geneva Convention of 1864 ... making interstate war a rare phenomenon. The trials at Nuremberg and Tokyo following the war established the precedent that war crimes carried ... consequences. Nuremberg seemed an ideal marriage of law and morality, and later treaties banned genocide and created the International Criminal Court.

  16. Essays on Information Assurance: Examination of Detrimental Consequences of Information Security, Privacy, and Extreme Event Concerns on Individual and Organizational Use of Systems

    ERIC Educational Resources Information Center

    Park, Insu

    2010-01-01

    The purpose of this study is to explore systems users' behavior on IS under various circumstances (e.g., email usage and malware threats, online communication at the individual level, and IS usage in organizations). Specifically, the first essay develops a method for analyzing and predicting the impact category of malicious code, particularly…

  17. "Drunk in Love": The Portrayal of Risk Behavior in Music Lyrics.

    PubMed

    Holody, Kyle J; Anderson, Christina; Craig, Clay; Flynn, Mark

    2016-10-01

    The current study investigated the prevalence of multiple risk behaviors in popular music lyrics as well as the contexts within which they occur. We conducted a content analysis of the top 20 Billboard songs from 2009 to 2013 in the genres of rap, country, adult contemporary, rock, R&B/hip-hop, and pop, coding for the presence of alcohol, marijuana, nonmarijuana drugs, and sex as well as the contexts intoxication, binging/addiction, partying/socializing, disregard for consequences, and emotional states. The contexts relationship status and degradation were also coded for when sex was present. Of the 600 songs, 212 mentioned sexual behaviors, which were most frequent in rap and R&B/hip-hop. Alcohol was the next most frequent risk behavior, again with greatest mention in rap and R&B/hip-hop. Alcohol, marijuana, and nonmarijuana drugs were most often associated with positive emotions, and sex was most often described within the context of casual relationships. Alcohol and sex were associated with disregard for consequences most often in 2011, when the "you only live once" motto was most popular. These findings are concerning because exposure to popular music is associated with increased risk behaviors for adolescents and young adults, who are the greatest consumers of music.

  18. Primary Care Providers’ Views of Patient Portals: Interview Study of Perceived Benefits and Consequences

    PubMed Central

    Latulipe, Celine; Melius, Kathryn A; Quandt, Sara A; Arcury, Thomas A

    2016-01-01

    Background The United States government is encouraging physicians to adopt patient portals—secure websites that allow patients to access their health information. For patient portals to recognize their full potential and improve patient care, health care providers’ acceptance and encouragement of their use will be essential. However, little is known about provider concerns or views of patient portals. Objective We conducted this qualitative study to determine how administrators, clinic staff, and health care providers at practices serving a lower income adult population viewed patient portals in terms of their potential benefit, areas of concern, and hopes for the future. Methods We performed in-depth interviews between October 2013 and June 2014 with 20 clinic personnel recruited from health centers in four North Carolina counties. Trained study personnel conducted individual interviews following an interviewer guide to elicit perceptions of the benefits and disadvantages of patient portals. Interviews were recorded and transcribed. Research team members reviewed transcribed interviews for major themes to construct a coding dictionary. Two researchers then coded each transcript with any coding discrepancies resolved through discussion. Results The interviews revealed that clinic personnel viewed patient portals as a mandated product that had potential to improve communication and enhance information sharing. However, they expressed many concerns including portals’ potential to generate more work, confuse patients, alienate non-users, and increase health disparities. Clinic personnel expected few older and disadvantaged patients to use a portal. Conclusions Given that clinic personnel have significant concerns about portals’ unintended consequences, their uptake and impact on care may be limited. Future studies should examine ways portals can be implemented in practices to address providers’ concerns and meet the needs of vulnerable populations. PMID:26772771

  19. Sensemaking, Stakeholder Discord, and Long-Term Risk Communication at a U.S. Superfund Site

    PubMed Central

    Hoover, Anna Goodman

    2018-01-01

    Introduction Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. Objectives This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. Methods This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study’s analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Results Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Conclusion Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape. PMID:28282297

  20. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.

  1. The Medicare Policy of Payment Adjustment for Health Care-Associated Infections: Perspectives on Potential Unintended Consequences

    PubMed Central

    Hartmann, Christine W.; Hoff, Timothy; Palmer, Jennifer A.; Wroe, Peter; Dutta-Linn, M. Maya; Lee, Grace

    2014-01-01

    In 2008, the Centers for Medicare & Medicaid Services introduced a new policy to adjust payment to hospitals for health care-associated infections (HAIs) not present on admission. Interviews with 36 hospital infection preventionists across the United States explored the perspectives of these key stakeholders on the potential unintended consequences of the current policy. Responses were analyzed using an iterative coding process where themes were developed from the data. Participants’ descriptions of unintended impacts of the policy centered around three themes. Results suggest the policy has focused more attention on targeted HAIs and has affected hospital staff; relatively fewer systems changes have ensued. Some consequences of the policy, such as infection preventionists having less time to devote to HAIs other than those in the policy or having less time to implement prevention activities, may have undesirable effects on HAI rates if hospitals do not recognize and react to potential time and resource gaps. PMID:21810797

  2. Use of zerotree coding in a high-speed pyramid image multiresolution decomposition

    NASA Astrophysics Data System (ADS)

    Vega-Pineda, Javier; Cabrera, Sergio D.; Lucero, Aldo

    1995-03-01

    A Zerotree (ZT) coding scheme is applied as a post-processing stage to avoid transmitting zero data in the High-Speed Pyramid (HSP) image compression algorithm. This algorithm has features that increase the capability of the ZT coding to give very high compression rates. In this paper the impact of the ZT coding scheme is analyzed and quantified. The HSP algorithm creates a discrete-time multiresolution analysis based on a hierarchical decomposition technique that is a subsampling pyramid. The filters used to create the image residues and expansions can be related to wavelet representations. According to the pixel coordinates and the level in the pyramid, N² different wavelet basis functions of various sizes and rotations are linearly combined. The HSP algorithm is computationally efficient because of the simplicity of the required operations, and as a consequence, it can be very easily implemented with VLSI hardware. This is the HSP's principal advantage over other compression schemes. The ZT coding technique transforms the different quantized image residual levels created by the HSP algorithm into a bit stream. The use of ZTs compresses the already compressed image even further, taking advantage of parent-child relationships (trees) between the pixels of the residue images at different levels of the pyramid. Zerotree coding uses the links between zeros along the hierarchical structure of the pyramid to avoid transmission of those that form branches of all zeros. Compression performance and algorithm complexity of the combined HSP-ZT method are compared with those of the JPEG standard technique.
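
    The parent-child structure exploited above can be sketched directly: in a quadtree spanning the pyramid levels, a coefficient whose entire descendant set is zero is flagged as a zerotree root with a single symbol, so none of its descendants need to be coded. The toy significance test below illustrates that idea generically; it is not the HSP-ZT coder, and the pyramid values are invented.

      # Generic zerotree significance test on a toy coefficient pyramid (illustrative only).

      def children(level, r, c):
          """Each coefficient at (level, r, c) has four children at the next finer level."""
          return [(level + 1, 2 * r + dr, 2 * c + dc) for dr in (0, 1) for dc in (0, 1)]

      def is_zerotree_root(pyramid, node):
          """True if the coefficient and all of its descendants are zero."""
          level, r, c = node
          if pyramid[level][r][c] != 0:
              return False
          if level + 1 >= len(pyramid):
              return True
          return all(is_zerotree_root(pyramid, ch) for ch in children(level, r, c))

      # Toy 3-level pyramid of quantized residues: 1x1 root, 2x2 middle level, 4x4 finest level.
      pyramid = [
          [[5]],
          [[0, 3],
           [0, 0]],
          [[0, 0, 1, 0],
           [0, 0, 0, 2],
           [0, 0, 0, 0],
           [0, 0, 0, 0]],
      ]

      print(is_zerotree_root(pyramid, (1, 1, 0)))  # True: the whole subtree is zero, one symbol suffices
      print(is_zerotree_root(pyramid, (1, 0, 1)))  # False: nonzero coefficients must be coded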

  3. A conflict-based model of color categorical perception: evidence from a priming study.

    PubMed

    Hu, Zhonghua; Hanley, J Richard; Zhang, Ruiling; Liu, Qiang; Roberson, Debi

    2014-10-01

    Categorical perception (CP) of color manifests as faster or more accurate discrimination of two shades of color that straddle a category boundary (e.g., one blue and one green) than of two shades from within the same category (e.g., two different shades of green), even when the differences between the pairs of colors are equated according to some objective metric. The results of two experiments provide new evidence for a conflict-based account of this effect, in which CP is caused by competition between visual and verbal/categorical codes on within-category trials. According to this view, conflict arises because the verbal code indicates that the two colors are the same, whereas the visual code indicates that they are different. In Experiment 1, two shades from the same color category were discriminated significantly faster when the previous trial also comprised a pair of within-category colors than when the previous trial comprised a pair from two different color categories. Under the former circumstances, the CP effect disappeared. According to the conflict-based model, response conflict between visual and categorical codes during discrimination of within-category pairs produced an adjustment of cognitive control that reduced the weight given to the categorical code relative to the visual code on the subsequent trial. Consequently, responses on within-category trials were facilitated, and CP effects were reduced. The effectiveness of this conflict-based account was evaluated in comparison with an alternative view that CP reflects temporary warping of perceptual space at the boundaries between color categories.

  4. [Medico-legal autopsy--selected legal issues: the autopsy protocol].

    PubMed

    Gaszczyk-Ozarowski, Zbigniew; Chowaniec, Czesław

    2010-01-01

    The majority of experts in the field of forensic medicine maintain that the minutes of the medicolegal autopsy should be taken by the forensic pathologist. The authors argue that it is the public prosecutor who is obliged to draw up the minutes, whereas the forensic pathologist issues the expert opinion. To support their stance, the authors make frequent references to several provisions of the Criminal Procedure Code of 1997. The authors also imply that due to organizational reasons and the ratio legis of the aforementioned code, the forensic pathologist should not be assigned the role of the minutes-taker, despite the lack of a specific exclusion rule governing such a case. Possible consequences caused by the lack of the properly drawn up minutes are briefly discussed as well.

  5. A comparison of VLSI architectures for time and transform domain decoding of Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Hsu, I. S.; Truong, T. K.; Deutsch, L. J.; Satorius, E. H.; Reed, I. S.

    1988-01-01

    It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial needed to decode a Reed-Solomon (RS) code. It is shown that this algorithm can be used for both time and transform domain decoding by replacing its initial conditions with the Forney syndromes and the erasure locator polynomial. By this means both the errata locator polynomial and the errata evaluator polynomial can be obtained with the Euclidean algorithm. With these ideas, both time and transform domain Reed-Solomon decoders for correcting errors and erasures are simplified and compared. As a consequence, the architectures of Reed-Solomon decoders for correcting both errors and erasures can be made more modular, regular, simple, and naturally suitable for VLSI implementation.
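
    A compact way to see the role of the Euclidean algorithm is the key-equation step for errors-only decoding: starting from r_(-1)(x) = x^(2t) and r_0(x) = S(x), run polynomial Euclid until the remainder degree drops below t; the last Bezout coefficient of S(x) is then, up to a scale factor, the error locator polynomial and the remainder is the error evaluator. The sketch below shows that loop over a prime field GF(929) for readability; practical RS decoders work over GF(2^m), and this is not the VLSI architecture compared in the paper.

      # Key-equation (Euclidean) step of RS decoding, sketched over a prime field.
      # Polynomials are coefficient lists, lowest degree first. Illustrative only.

      P = 929  # prime modulus, so ordinary modular arithmetic gives a field

      def trim(a):
          """Drop leading (highest-order) zero coefficients."""
          a = [c % P for c in a]
          while len(a) > 1 and a[-1] == 0:
              a.pop()
          return a

      def add_scaled(a, b, s, shift):
          """Return a + s * x^shift * b over GF(P)."""
          out = a[:] + [0] * max(0, shift + len(b) - len(a))
          for i, c in enumerate(b):
              out[shift + i] = (out[shift + i] + s * c) % P
          return trim(out)

      def divmod_poly(a, b):
          """Polynomial long division a = q*b + r over GF(P)."""
          a, b = trim(a), trim(b)
          q = [0] * max(1, len(a) - len(b) + 1)
          inv = pow(b[-1], P - 2, P)                  # inverse of b's leading coefficient
          while len(a) >= len(b) and a != [0]:
              shift = len(a) - len(b)
              coef = a[-1] * inv % P
              q[shift] = coef
              a = add_scaled(a, b, P - coef, shift)   # cancel the leading term of a
          return trim(q), a

      def key_equation(syndromes, t):
          """Euclid on x^(2t) and S(x) until deg(remainder) < t.
          Returns (error-locator candidate, error-evaluator candidate), each up to a scale factor."""
          r_prev, r_cur = [0] * (2 * t) + [1], trim(syndromes)   # x^(2t) and S(x)
          v_prev, v_cur = [0], [1]                               # Bezout coefficients of S(x)
          while len(r_cur) - 1 >= t:
              q, r_next = divmod_poly(r_prev, r_cur)
              v_next = v_prev
              for i, qc in enumerate(q):                         # v_next = v_prev - q * v_cur
                  v_next = add_scaled(v_next, v_cur, P - qc, i)
              r_prev, r_cur, v_prev, v_cur = r_cur, r_next, v_cur, v_next
          return v_cur, r_cur

      # Arbitrary example syndromes [S1..S4] for a hypothetical t = 2 code over GF(929)
      locator, evaluator = key_equation([3, 1, 4, 1], t=2)
      print("locator candidate:", locator, " evaluator candidate:", evaluator)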

  6. Ethics in Science: The Unique Consequences of Chemistry

    PubMed Central

    Kovac, Jeffrey

    2015-01-01

    This article discusses the ethical issues unique to the science and practice of chemistry. These issues arise from chemistry’s position in the middle between the theoretical and the practical, a science concerned with molecules that are of the right size to directly affect human life. Many of the issues are raised by the central activity of chemistry––synthesis. Chemists make thousands of new substances each year. Many are beneficial, but others are threats. Since the development of the chemical industry in the nineteenth century, chemistry has contributed to the deterioration of the environment but has also helped to reduce pollution. Finally, we discuss the role of codes of ethics and whether the current codes of conduct for chemists are adequate for the challenges of today’s world. PMID:26155729

  7. [Patients' rights: what the new swiss civil code will change in the health care of children].

    PubMed

    Laufer, Daniel; Genaine, Patrick; Simon, Jeanne-Pascale P

    2013-02-20

    Recent progress in medicine makes it possible to provide treatment, to cure, or to extend the lifespan of people who would not have survived before. Doctors and healthcare providers have become indispensable actors in Western societies. This is particularly true for children's health issues. With the new information technologies, knowledge is now available to everyone, which enables patients to dialog on an equal footing with the physician. Nowadays, therapeutic choices are discussed and negotiated. The new tensions caused by this relationship between therapist and patient have created the need for new regulations. The Swiss Confederation has modified its Civil Code with the objective of better protecting vulnerable individuals. This article summarizes the consequences of the new regulations with regard to the care and treatment provided to children.

  8. The molecular basis for attractive salt-taste coding in Drosophila.

    PubMed

    Zhang, Yali V; Ni, Jinfei; Montell, Craig

    2013-06-14

    Below a certain level, table salt (NaCl) is beneficial for animals, whereas excessive salt is harmful. However, it remains unclear how low- and high-salt taste perceptions are differentially encoded. We identified a salt-taste coding mechanism in Drosophila melanogaster. Flies use distinct types of gustatory receptor neurons (GRNs) to respond to different concentrations of salt. We demonstrated that a member of the newly discovered ionotropic glutamate receptor (IR) family, IR76b, functioned in the detection of low salt and was a Na(+) channel. The loss of IR76b selectively impaired the attractive pathway, leaving salt-aversive GRNs unaffected. Consequently, low salt became aversive. Our work demonstrated that the opposing behavioral responses to low and high salt were determined largely by an elegant bimodal switch system operating in GRNs.

  9. Isolation of an Intertypic Poliovirus Capsid Recombinant from a Child with Vaccine-Associated Paralytic Poliomyelitis

    PubMed Central

    Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D.

    2002-01-01

    The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5′-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative. PMID:12368335

  10. A comparison of cigarette- and hookah-related videos on YouTube.

    PubMed

    Carroll, Mary V; Shensa, Ariel; Primack, Brian A

    2013-09-01

    YouTube is now the second most visited site on the internet. The authors aimed to compare characteristics of and messages conveyed by cigarette- and hookah-related videos on YouTube. Systematic search procedures yielded 66 cigarette-related and 61 hookah-related videos. After three trained qualitative researchers used an iterative approach to develop and refine definitions for the coding of variables, two of them independently coded each video for content including positive and negative associations with smoking and major content type. Median view counts were 606,884 for cigarettes-related videos and 102,307 for hookah-related videos (p<0.001). However, the number of comments per 1000 views was significantly lower for cigarette-related videos than for hookah-related videos (1.6 vs 2.5, p=0.003). There was no significant difference in the number of 'like' designations per 100 reactions (91 vs 87, p=0.39). Cigarette-related videos were less likely than hookah-related videos to portray tobacco use in a positive light (24% vs 92%, p<0.001). In addition, cigarette-related videos were more likely to be of high production quality (42% vs 5%, p<0.001), to mention short-term consequences (50% vs 18%, p<0.001) and long-term consequences (44% vs 2%, p<0.001) of tobacco use, to contain explicit antismoking messages (39% vs 0%, p<0.001) and to provide specific information on how to quit tobacco use (21% vs 0%, p<0.001). Although internet user-generated videos related to cigarette smoking often acknowledge harmful consequences and provide explicit antismoking messages, hookah-related videos do not. It may be valuable for public health programmes to correct common misconceptions regarding hookah use.

  11. A Comparison of Cigarette- and Hookah-Related Videos on YouTube

    PubMed Central

    Carroll, Mary V.; Shensa, Ariel; Primack, Brian A.

    2013-01-01

    Objective YouTube is now the second most visited site on the Internet. We aimed to compare characteristics of and messages conveyed by cigarette- and hookah-related videos on YouTube. Methods Systematic search procedures yielded 66 cigarette-related and 61 hookah-related videos. After 3 trained qualitative researchers used an iterative approach to develop and refine definitions for the coding of variables, 2 of them independently coded each video for content including positive and negative associations with smoking and major content type. Results Median view counts were 606,884 for cigarettes and 102,307 for hookahs (P<.001). However, the number of comments per 1,000 views was significantly lower for cigarette-related videos than for hookah-related videos (1.6 vs 2.5, P=.003). There was no significant difference in the number of “like” designations per 100 reactions (91 vs. 87, P=.39). Cigarette-related videos were less likely than hookah-related videos to portray tobacco use in a positive light (24% vs. 92%, P<.001). In addition, cigarette-related videos were more likely to be of high production quality (42% vs. 5%, P<.001), to mention short-term consequences (50% vs. 18%, P<.001) and long-term consequences (44% vs. 2%, P<.001) of tobacco use, to contain explicit antismoking messages (39% vs. 0%, P<.001), and to provide specific information on how to quit tobacco use (21% vs. 0%, P<.001). Conclusions Although Internet user–generated videos related to cigarette smoking often acknowledge harmful consequences and provide explicit antismoking messages, hookah-related videos do not. It may be valuable for public health programs to correct common misconceptions regarding hookah use. PMID:22363069

  12. Shipping emissions over Europe: A state-of-the-art and comparative analysis

    NASA Astrophysics Data System (ADS)

    Russo, M. A.; Leitão, J.; Gama, C.; Ferreira, J.; Monteiro, A.

    2018-03-01

    Several emission inventories exist for Europe which include emissions originating from ship traffic in European sea areas. However, few comparisons of these inventories, in particular ones focusing on specific emission sectors like shipping, exist in the literature. Therefore, the aim of this paper is to review and compare commonly used, freely available emission inventories for the European domain, specifically for shipping and its main pollutants (NOx, SOx and PM10). Five different inventories that include shipping activity were considered: 1) EMEP; 2) TNO-MACC_III; 3) E-PRTR; 4) EDGAR and 5) STEAM. The inventories were initially compared in terms of total emission values and their spatial distribution. The total emission values are largely in agreement (with the exception of E-PRTR); however, the spatial representation shows significant differences in the emission distribution, in particular over the Mediterranean region. As for the contribution of shipping to overall emissions, this sector represents on average 16%, 11% and 5% of total NOx, SOx and PM10 emissions, respectively. Recommendations are given regarding the specific use of each available inventory.

  13. Evaluating aerosol impacts on Numerical Weather Prediction in two extreme dust and biomass-burning events

    NASA Astrophysics Data System (ADS)

    Remy, Samuel; Benedetti, Angela; Jones, Luke; Razinger, Miha; Haiden, Thomas

    2014-05-01

    The WMO-sponsored Working Group on Numerical Experimentation (WGNE) set up a project aimed at understanding the importance of aerosols for numerical weather prediction (NWP). Three cases are being investigated by several NWP centres with aerosol capabilities: a severe dust case that affected Southern Europe in April 2012, a biomass burning case in South America in September 2012, and an extreme pollution event in Beijing (China) which took place in January 2013. At ECMWF these cases are being studied using the MACC-II system with radiatively interactive aerosols. Some preliminary results related to the dust and fire events will be presented here. A preliminary verification of the impact of the aerosol-radiation direct interaction on surface meteorological parameters such as 2 m temperature and surface winds over the region of interest will be presented. Aerosol optical depth (AOD) verification using AERONET data will also be discussed. For the biomass burning case, the impact of using injection heights estimated by a Plume Rise Model (PRM) for the biomass burning emissions will be presented.

  14. Increased Diversity of Libraries from Libraries: Chemoinformatic Analysis of Bis-Diazacyclic Libraries

    PubMed Central

    López-Vallejo, Fabian; Nefzi, Adel; Bender, Andreas; Owen, John R.; Nabney, Ian T.; Houghten, Richard A.; Medina-Franco, Jose L.

    2011-01-01

    Combinatorial libraries continue to play a key role in drug discovery. To increase structural diversity, several experimental methods have been developed. However, limited efforts have been made so far to quantify the diversity of the broadly used diversity-oriented synthetic (DOS) libraries. Herein we report a comprehensive characterization of 15 bis-diazacyclic combinatorial libraries obtained through libraries from libraries, which is a DOS approach. Using MACCS keys, radial fingerprints and different pharmacophoric fingerprints, as well as six molecular properties, we demonstrated the increased structural and property diversity of the libraries from libraries over the individual libraries. Comparison of the libraries to existing drugs, NCI Diversity and the Molecular Libraries Small Molecule Repository revealed the structural uniqueness of the combinatorial libraries (mean similarity < 0.5 for any fingerprint representation). In particular, bis-cyclic thiourea libraries were the most structurally dissimilar to drugs while retaining drug-like character in property space. This study represents the first comprehensive quantification of the diversity of libraries from libraries, providing a solid quantitative approach to compare and contrast the diversity of DOS libraries with existing drugs or any other compound collection. PMID:21294850
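
    The comparison above rests on fingerprint-based similarity; as a minimal illustration only (not the authors' pipeline), pairwise MACCS-key Tanimoto similarities can be computed with RDKit roughly as follows, where the SMILES strings are placeholders.

        # Minimal sketch, assuming RDKit is available; the SMILES below are
        # placeholders, not compounds from the study.
        from rdkit import Chem, DataStructs
        from rdkit.Chem import MACCSkeys

        library = ["CCN1CCN(CC1)C(=O)N", "O=C(N)c1ccccc1"]                     # hypothetical library members
        reference = ["CC(=O)Oc1ccccc1C(=O)O", "CN1C=NC2=C1C(=O)N(C)C(=O)N2C"]  # hypothetical drugs

        lib_fps = [MACCSkeys.GenMACCSKeys(Chem.MolFromSmiles(s)) for s in library]
        ref_fps = [MACCSkeys.GenMACCSKeys(Chem.MolFromSmiles(s)) for s in reference]

        # Nearest-neighbour similarity of each library member to the reference set;
        # low values (e.g. < 0.5) indicate structural dissimilarity to the reference.
        for smi, fp in zip(library, lib_fps):
            best = max(DataStructs.TanimotoSimilarity(fp, r) for r in ref_fps)
            print(f"{smi}: max Tanimoto to reference set = {best:.2f}")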

  15. 'Zero is not good for me': implications of infertility in Ghana.

    PubMed

    Fledderjohann, J J

    2012-05-01

    Given the high value placed on children in sub-Saharan Africa, previous research suggests that infertility increases the risk of psychological distress and marital conflict, encourages risky sexual behavior and deprives infertile individuals and couples of an important source of economic and social capital. This paper explores the implications of infertility for women in Ghana, West Africa. Semi-structured interview data collected from 107 women (aged 21-48 years, mean 33 years) seeking treatment in gynecological and obstetric clinics in Accra, Ghana, are analyzed. Based on iterative open coding of the interviews, the focus of the analysis is on mental health, marital instability, social interaction and gendered experiences. Infertile women report facing severe social stigma, marital strain and a range of mental health difficulties. Many women feel that they shoulder a disproportionate share of the blame for infertility and, by extension, face greater social consequences than male partners for difficulties conceiving. Women who do not self-identify as infertile corroborate these findings, asserting that the social consequences of infertility are severe, particularly for women. Infertility in Ghana has important consequences for social interactions, marital stability and mental health. These consequences are not perceived to be shared equally by Ghanaian men.

  16. T cells are influenced by a long non-coding RNA in the autoimmune associated PTPN2 locus.

    PubMed

    Houtman, Miranda; Shchetynsky, Klementy; Chemin, Karine; Hensvold, Aase Haj; Ramsköld, Daniel; Tandre, Karolina; Eloranta, Maija-Leena; Rönnblom, Lars; Uebe, Steffen; Catrina, Anca Irinel; Malmström, Vivianne; Padyukov, Leonid

    2018-06-01

    Non-coding SNPs in the protein tyrosine phosphatase non-receptor type 2 (PTPN2) locus have been linked with several autoimmune diseases, including rheumatoid arthritis, type I diabetes, and inflammatory bowel disease. However, the functional consequences of these SNPs are poorly characterized. Herein, we show in blood cells that SNPs in the PTPN2 locus are highly correlated with DNA methylation levels at four CpG sites downstream of PTPN2 and expression levels of the long non-coding RNA (lncRNA) LINC01882 downstream of these CpG sites. We observed that LINC01882 is mainly expressed in T cells and that anti-CD3/CD28 activated naïve CD4+ T cells downregulate the expression of LINC01882. RNA sequencing analysis of LINC01882 knockdown in Jurkat T cells, using a combination of antisense oligonucleotides and RNA interference, revealed the upregulation of the transcription factor ZEB1 and kinase MAP2K4, both involved in IL-2 regulation. Overall, our data suggest the involvement of LINC01882 in T cell activation and hint towards an auxiliary role of these non-coding SNPs in autoimmunity associated with the PTPN2 locus. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Quartz crystal microbalance detection of DNA single-base mutation based on monobase-coded cadmium tellurium nanoprobe.

    PubMed

    Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo

    2011-01-01

    A new method for the detection of point mutations in DNA, based on monobase-coded cadmium tellurium nanoprobes and the quartz crystal microbalance (QCM) technique, is reported. A QCM sensor for single-base point mutations (adenine, thymine, cytosine or guanine, i.e. A, T, C or G, in a DNA strand) was fabricated by immobilizing single-base-mutation DNA-modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe(3)O(4)/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was distinguished by detecting the decrease in the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. This proposed detection strategy for point mutations in DNA proved to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry

  18. A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis

    DTIC Science & Technology

    2014-04-01

    objective? • What vulnerabilities exist in the target system? • What damage or other consequences are likely? • What exploit scripts or other attack...languages C, R, and Python; no response capabilities. JUNG https://blogs.reucon.com/asterisk-java/tag/visualization/ Create custom layouts and can...annotate graphs, links, nodes with any Java data type. Must be familiar with coding in Java to call the routines; no monitoring or response

  19. Manpower Staffing, Emergency Department Access and Consequences on Patient Outcomes

    DTIC Science & Technology

    2007-06-01

    distance to the nearest hospital have higher death rates than those zip codes which experience a change. However, we hesitate to conclude that this may...1. Trend Analysis of Mortality Rates by Distance Categories: 1990-2004 Figure 6 presents heart-related death rates for the State of California from...1990- 2004. The graph shows a distinct layering of heart-related death rates across the three distance categories. The population which experiences

  20. SORL1 variants across Alzheimer's disease European American cohorts.

    PubMed

    Fernández, Maria Victoria; Black, Kathleen; Carrell, David; Saef, Ben; Budde, John; Deming, Yuetiva; Howells, Bill; Del-Aguila, Jorge L; Ma, Shengmei; Bi, Catherine; Norton, Joanne; Chasse, Rachel; Morris, John; Goate, Alison; Cruchaga, Carlos

    2016-12-01

    The accumulation of the toxic Aβ peptide in Alzheimer's disease (AD) largely relies upon an efficient recycling of amyloid precursor protein (APP). Recent genetic association studies have described rare variants in SORL1 with putative pathogenic consequences in the recycling of APP. In this work, we examine the presence of rare coding variants in SORL1 in three different European American cohorts: early-onset, late-onset AD (LOAD) and familial LOAD.

  1. Interface Control Document for the EMPACT Module that Estimates Electric Power Transmission System Response to EMP-Caused Damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werley, Kenneth Alan; Mccown, Andrew William

    The EPREP code is designed to evaluate the effects of an Electro-Magnetic Pulse (EMP) on the electric power transmission system. The EPREP code embodies an umbrella framework that allows a user to set up analysis conditions and to examine analysis results. The code links to three major physics/engineering modules. The first module describes the EM wave in space and time. The second module evaluates the damage caused by the wave on specific electric power (EP) transmission system components. The third module evaluates the consequence of the damaged network on its (reduced) ability to provide electric power to meet demand. This third module is the focus of the present paper. The EMPACT code serves as the third module. The EMPACT name denotes EMP effects on Alternating Current Transmission systems. The EMPACT algorithms compute electric power transmission network flow solutions under severely damaged network conditions. Initial solutions are often characterized by unacceptable network conditions including line overloads and bad voltages. The EMPACT code contains algorithms to optimally adjust network parameters to eliminate network problems while minimizing outages. System adjustments include automatically adjusting control equipment (generator V control, variable transformers, and variable shunts), as well as non-automatic control of generator power settings and minimal load shedding. The goal is to evaluate the minimal loss of customer load under equilibrium (steady-state) conditions during peak demand.
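
    As a toy illustration of the "minimal load shedding" objective described above (this is not the EMPACT algorithm, and all numbers are hypothetical), shedding on a single damaged corridor can be posed as a small linear program:

        # Hedged sketch: minimize shed load subject to a damaged-line capacity,
        # using scipy; numbers are hypothetical, not EMPACT data.
        from scipy.optimize import linprog

        demand = 100.0     # MW of customer load
        gen_max = 120.0    # MW of available generation
        line_cap = 70.0    # MW of surviving transmission capacity after damage

        c = [0.0, 1.0]                       # variables x = [delivered power g, shed s]; minimize s
        A_eq = [[1.0, 1.0]]                  # g + s = demand (all load is either served or shed)
        b_eq = [demand]
        A_ub = [[1.0, 0.0]]                  # g <= surviving line capacity
        b_ub = [line_cap]
        bounds = [(0.0, gen_max), (0.0, demand)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print(f"served = {res.x[0]:.1f} MW, shed = {res.x[1]:.1f} MW")   # expect 70.0 / 30.0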

  2. [Footwear according to the "business dress code", and the health condition of women's feet--computer-assisted holistic evaluation].

    PubMed

    Lorkowski, Jacek; Mrzygłód, Mirosław; Kotela, Ireneusz; Kiełbasiewicz-Lorkowska, Ewa; Teul, Iwona

    2013-01-01

    According to the verdict of the Supreme Court in 2005, an employer may dismiss an employee if their conduct (including dress) exposes the employer to losses or threatens his interests. The aim of the study was a holistic assessment of the pleiotropic effects of high-heeled pointed shoes, worn at work in accordance with the existing rules of the "business dress code", on the health condition of women's feet. A holistic multidisciplinary analysis was performed. It took into account: 1) women employees of banks and other large corporations (82 persons); 2) a 2D FEM computer model, developed by the authors, of a foot deformed by pointed high-heeled shoes; 3) web sites found after entering the phrase "business dress code". Over 60% of the women in the office wore high-heeled shoes. The following was found among women walking to work in high heels: 1) a reduction in quality of life in about 70% of cases, through the periodic occurrence of pain and reduced functional capacity of the feet; 2) at least a twofold increase in the pressure on the plantar side of the forefoot; 3) continued effects of the forces deforming the forefoot. 1. An evolutionary change in "dress code" shoes is necessary in order to reduce the non-physiological overload of the feet and the disability that results from it. 2. These changes are particularly urgent in patients with so-called "sensitive foot".

  3. Athermalization of infrared dual field optical system based on wavefront coding

    NASA Astrophysics Data System (ADS)

    Jiang, Kai; Jiang, Bo; Liu, Kai; Yan, Peipei; Duan, Jing; Shan, Qiu-sha

    2017-02-01

    Wavefront coding is a technology that combines optical design with digital image processing. By inserting a phase mask close to the pupil plane of the optical system, the wavefront of the system is re-modulated and the depth of focus is consequently extended. In essence, the idea is the same as that underlying the athermalization of infrared optical systems. In this paper, an uncooled infrared dual-field optical system with effective focal lengths of 38 mm/19 mm, an F-number of 1.2 at both focal lengths, and an operating wavelength range of 8 μm to 12 μm was designed. A cubic phase mask was used at the pupil plane to re-modulate the wavefront. The performance of the infrared system was then simulated with CODEV as the environmental temperature varied from -40° to 60°. MTF curves of the optical system with the phase mask are compared with those obtained before the phase mask was used. The results show that wavefront coding technology can make the system insensitive to thermal defocus and thereby realize an athermal design for the infrared optical system.
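
    The defocus-insensitivity argument can be sketched numerically; the following is an illustrative toy (not the paper's CODEV model), with the mask strength and defocus coefficient chosen arbitrarily:

        # Hedged sketch of the wavefront-coding idea with a cubic phase mask;
        # parameters are hypothetical and the metric is a crude MTF summary.
        import numpy as np

        N = 256
        x = np.linspace(-1, 1, N)
        X, Y = np.meshgrid(x, x)
        pupil = (X**2 + Y**2 <= 1.0).astype(float)     # circular aperture

        alpha = 30.0    # cubic mask strength in waves (hypothetical)
        w20 = 3.0       # thermally induced defocus in waves (hypothetical)

        def mtf_metric(phase_waves):
            """Crude scalar summary of the MTF for a given pupil phase (in waves)."""
            p = pupil * np.exp(2j * np.pi * phase_waves)
            psf = np.abs(np.fft.fftshift(np.fft.fft2(p)))**2
            mtf = np.abs(np.fft.fft2(psf))
            return (mtf / mtf.max()).mean()

        defocus = w20 * (X**2 + Y**2)
        cubic = alpha * (X**3 + Y**3)

        # The two cubic-mask values typically stay close to each other, while the
        # unmasked system changes strongly between focus and defocus.
        print("no mask, in focus    :", mtf_metric(0 * X))
        print("no mask, defocused   :", mtf_metric(defocus))
        print("cubic mask, in focus :", mtf_metric(cubic))
        print("cubic mask, defocused:", mtf_metric(cubic + defocus))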

  4. An efficient MPI/OpenMP parallelization of the Hartree–Fock–Roothaan method for the first generation of Intel® Xeon Phi™ processor architecture

    DOE PAGES

    Mironov, Vladimir; Moskovsky, Alexander; D’Mello, Michael; ...

    2017-10-04

    The Hartree-Fock (HF) method in the quantum chemistry package GAMESS represents one of the most irregular algorithms in computation today. Major steps in the calculation are the irregular computation of electron repulsion integrals (ERIs) and the building of the Fock matrix. These are the central components of the main Self Consistent Field (SCF) loop, the key hotspot in Electronic Structure (ES) codes. By threading the MPI ranks in the official release of the GAMESS code, we not only speed up the main SCF loop (4x to 6x for large systems), but also achieve a significant (>2x) reduction in the overall memory footprint. These improvements are a direct consequence of memory access optimizations within the MPI ranks. We benchmark our implementation against the official release of the GAMESS code on the Intel® Xeon Phi™ supercomputer. Here, scaling numbers are reported on up to 7,680 cores on Intel Xeon Phi coprocessors.

  5. From chemical metabolism to life: the origin of the genetic coding process

    PubMed Central

    2017-01-01

    Looking for origins is so much rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question to account for life is to understand how the chemical metabolism that began with amino acids was progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces to carry on the basic metabolic pathways that drive the pursuit of life. PMID:28684991

  6. Confinement properties of tokamak plasmas with extended regions of low magnetic shear

    NASA Astrophysics Data System (ADS)

    Graves, J. P.; Cooper, W. A.; Kleiner, A.; Raghunathan, M.; Neto, E.; Nicolas, T.; Lanthaler, S.; Patten, H.; Pfefferle, D.; Brunetti, D.; Lutjens, H.

    2017-10-01

    Extended regions of low magnetic shear can be advantageous to tokamak plasmas. However, the core and edge can be susceptible to non-resonant ideal fluctuations due to the weakened restoring force associated with magnetic field line bending. This contribution shows how saturated non-linear phenomenology, such as 1/1 Long Lived Modes, and Edge Harmonic Oscillations associated with QH-modes, can be modelled accurately using the non-linear stability code XTOR, the free boundary 3D equilibrium code VMEC, and non-linear analytic theory. That the equilibrium approach is valid is particularly valuable because it enables advanced particle confinement studies to be undertaken in the ordinarily difficult environment of strongly 3D magnetic fields. The VENUS-LEVIS code exploits the Fourier description of the VMEC equilibrium fields, such that full Lorentzian and guiding-centre-approximated differential operators in curvilinear angular coordinates can be evaluated analytically. Consequently, the confinement properties of minority ions such as energetic particles and high Z impurities can be calculated accurately over slowing-down timescales in experimentally relevant 3D plasmas.

  7. Qualitatively different coding of symbolic and nonsymbolic numbers in the human brain.

    PubMed

    Lyons, Ian M; Ansari, Daniel; Beilock, Sian L

    2015-02-01

    Are symbolic and nonsymbolic numbers coded differently in the brain? Neuronal data indicate that overlap in numerical tuning curves is a hallmark of the approximate, analogue nature of nonsymbolic number representation. Consequently, patterns of fMRI activity should be more correlated when the representational overlap between two numbers is relatively high. In bilateral intraparietal sulci (IPS), for nonsymbolic numbers, the pattern of voxelwise correlations between pairs of numbers mirrored the amount of overlap in their tuning curves under the assumption of approximate, analogue coding. In contrast, symbolic numbers showed a flat field of modest correlations more consistent with discrete, categorical representation (no systematic overlap between numbers). Directly correlating activity patterns for a given number across formats (e.g., the numeral "6" with six dots) showed no evidence of shared symbolic and nonsymbolic number-specific representations. Overall (univariate) activity in bilateral IPS was well fit by the log of the number being processed for both nonsymbolic and symbolic numbers. IPS activity is thus sensitive to numerosity regardless of format; however, the nature in which symbolic and nonsymbolic numbers are encoded is fundamentally different. © 2014 Wiley Periodicals, Inc.
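
    The voxelwise-correlation logic can be illustrated with a toy representational similarity matrix (random stand-in data, not the study's fMRI patterns):

        # Illustrative sketch: pairwise correlations between activity patterns,
        # the kind of matrix compared against tuning-curve overlap predictions.
        # Data are random placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n_numbers, n_voxels = 8, 200
        patterns = rng.normal(size=(n_numbers, n_voxels))   # one pattern per numerosity

        rsm = np.corrcoef(patterns)    # representational similarity matrix (numbers x numbers)
        print(np.round(rsm, 2))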

  8. Regulatory consequences of neuronal ELAV-like protein binding to coding and non-coding RNAs in human brain

    PubMed Central

    Scheckel, Claudia; Drapeau, Elodie; Frias, Maria A; Park, Christopher Y; Fak, John; Zucker-Scharff, Ilana; Kou, Yan; Haroutunian, Vahram; Ma'ayan, Avi

    2016-01-01

    Neuronal ELAV-like (nELAVL) RNA binding proteins have been linked to numerous neurological disorders. We performed crosslinking-immunoprecipitation and RNAseq on human brain, and identified nELAVL binding sites on 8681 transcripts. Using knockout mice and RNAi in human neuroblastoma cells, we showed that nELAVL intronic and 3' UTR binding regulates human RNA splicing and abundance. We validated hundreds of nELAVL targets among which were important neuronal and disease-associated transcripts, including Alzheimer's disease (AD) transcripts. We therefore investigated RNA regulation in AD brain, and observed differential splicing of 150 transcripts, which in some cases correlated with differential nELAVL binding. Unexpectedly, the most significant change of nELAVL binding was evident on non-coding Y RNAs. nELAVL/Y RNA complexes were specifically remodeled in AD and after acute UV stress in neuroblastoma cells. We propose that the increased nELAVL/Y RNA association during stress may lead to nELAVL sequestration, redistribution of nELAVL target binding, and altered neuronal RNA splicing. DOI: http://dx.doi.org/10.7554/eLife.10421.001 PMID:26894958

  9. Optimizing legacy molecular dynamics software with directive-based offload

    NASA Astrophysics Data System (ADS)

    Michael Brown, W.; Carrillo, Jan-Michael Y.; Gavhane, Nitin; Thakkar, Foram M.; Plimpton, Steven J.

    2015-10-01

    Directive-based programming models are one solution for exploiting many-core coprocessors to increase simulation rates in molecular dynamics. They offer the potential to reduce code complexity with offload models that can selectively target computations to run on the CPU, the coprocessor, or both. In this paper, we describe modifications to the LAMMPS molecular dynamics code to enable concurrent calculations on a CPU and coprocessor. We demonstrate that standard molecular dynamics algorithms can run efficiently on both the CPU and an x86-based coprocessor using the same subroutines. As a consequence, we demonstrate that code optimizations for the coprocessor also result in speedups on the CPU; in extreme cases up to 4.7X. We provide results for LAMMPS benchmarks and for production molecular dynamics simulations using the Stampede hybrid supercomputer with both Intel® Xeon Phi™ coprocessors and NVIDIA GPUs. The optimizations presented have increased simulation rates by over 2X for organic molecules and over 7X for liquid crystals on Stampede. The optimizations are available as part of the "Intel package" supplied with LAMMPS.

  10. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences for the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic models of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
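
    The Lagrangian core of such a framework can be sketched in a few lines; the following toy (not IBMlib, which targets realistic 3D ocean fields) advects particles through a prescribed 2D current field with forward-Euler steps:

        # Minimal sketch of Lagrangian particle advection; the velocity field and
        # all parameters are hypothetical, dimensionless stand-ins.
        import numpy as np

        def velocity(x, y, t):
            """Stand-in for an oceanographic model's current field."""
            return 0.2 * np.sin(y), 0.1 * np.cos(x)

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 2 * np.pi, 100)     # particle positions
        y = rng.uniform(0, 2 * np.pi, 100)
        x0, y0 = x.copy(), y.copy()
        dt, n_steps = 0.1, 200

        for step in range(n_steps):
            u, v = velocity(x, y, step * dt)
            x += u * dt                        # forward-Euler update
            y += v * dt

        print("mean displacement:", np.hypot(x - x0, y - y0).mean())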

  11. Sexual involvement with patients.

    PubMed

    Kirstein, L

    1978-04-01

    Three cases of sexual activity between patients and staff members were presented, and the determinants and consequences of this type of acting-out behavior were discussed. Patients' sexual behavior was in part motivated by a need to avoid feelings of loneliness and anxiety, and a consequence of the sexual behavior was the recurrence of symptoms and behaviors noted upon admission. The staff members were noted to become more self-preoccupied and less involved with both staff and patients following the sexual behavior. The role of the ward psychiatrist in preventing such patient-staff interactions includes taking responsibility for the educational and supervisory needs of the staff, being involved in the creation and maintenance of the ward's moral code, and being aware of group and organizational factors that may impede open staff communication.

  12. Deep-Earth reactor: Nuclear fission, helium, and the geomagnetic field

    PubMed Central

    Hollenbach, D. F.; Herndon, J. M.

    2001-01-01

    Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having 3He/4He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power. PMID:11562483

  13. Moral disengagement in the corporate world.

    PubMed

    White, Jenny; Bandura, Albert; Bero, Lisa A

    2009-01-01

    We analyze mechanisms of moral disengagement used to eliminate moral consequences by industries whose products or production practices are harmful to human health. Moral disengagement removes the restraint of self-censure from harmful practices. Moral self-sanctions can be selectively disengaged from harmful activities by investing them with socially worthy purposes, sanitizing and exonerating them, displacing and diffusing responsibility, minimizing or disputing harmful consequences, making advantageous comparisons, and disparaging and blaming critics and victims. Internal industry documents and public statements related to the research activities of these industries were coded for modes of moral disengagement by the tobacco, lead, vinyl chloride (VC), and silicosis-producing industries. All but one of the modes of moral disengagement were used by each of these industries. We present possible safeguards designed to protect the integrity of research.

  14. Operational source receptor calculations for large agglomerations

    NASA Astrophysics Data System (ADS)

    Gauss, Michael; Shamsudheen, Semeena V.; Valdebenito, Alvaro; Pommier, Matthieu; Schulz, Michael

    2016-04-01

    For Air quality policy an important question is how much of the air pollution within an urbanized region can be attributed to local sources and how much of it is imported through long-range transport. This is critical information for a correct assessment of the effectiveness of potential emission measures. The ratio between indigenous and long-range transported air pollution for a given region depends on its geographic location, the size of its area, the strength and spatial distribution of emission sources, the time of the year, but also - very strongly - on the current meteorological conditions, which change from day to day and thus make it important to provide such calculations in near-real-time to support short-term legislation. Similarly, long-term analysis over longer periods (e.g. one year), or of specific air quality episodes in the past, can help to scientifically underpin multi-regional agreements and long-term legislation. Within the European MACC projects (Monitoring Atmospheric Composition and Climate) and the transition to the operational CAMS service (Copernicus Atmosphere Monitoring Service) the computationally efficient EMEP MSC-W air quality model has been applied with detailed emission data, comprehensive calculations of chemistry and microphysics, driven by high quality meteorological forecast data (up to 96-hour forecasts), to provide source-receptor calculations on a regular basis in forecast mode. In its current state, the product allows the user to choose among different regions and regulatory pollutants (e.g. ozone and PM) to assess the effectiveness of fictive emission reductions in air pollutant emissions that are implemented immediately, either within the agglomeration or outside. The effects are visualized as bar charts, showing resulting changes in air pollution levels within the agglomeration as a function of time (hourly resolution, 0 to 4 days into the future). The bar charts not only allow assessing the effects of emission reduction measures but they also indicate the relative importance of indigenous versus imported air pollution. The calculations are currently performed weekly by MET Norway for the Paris, London, Berlin, Oslo, Po Valley and Rhine-Ruhr regions and the results are provided free of charge at the MACC website (http://www.gmes-atmosphere.eu/services/aqac/policy_interface/regional_sr/). A proposal to extend this service to all EU capitals on a daily basis within the Copernicus Atmosphere Monitoring Service is currently under review. The tool is an important example illustrating the increased application of scientific tools to operational services that support Air Quality policy. This paper will describe this tool in more detail, focusing on the experimental setup, underlying assumptions, uncertainties, computational demand, and the usefulness for air quality for policy. Options to apply the tool for agglomerations outside the EU will also be discussed (making reference to, e.g., PANDA, which is a European-Chinese collaboration project).

  15. Oxygenated fraction and mass of organic aerosol from direct emission and atmospheric processing measured on the R/V Ronald Brown during TEXAQS/GoMACCS 2006

    NASA Astrophysics Data System (ADS)

    Russell, L. M.; Takahama, S.; Liu, S.; Hawkins, L. N.; Covert, D. S.; Quinn, P. K.; Bates, T. S.

    2009-04-01

    Submicron particles collected on Teflon filters aboard the R/V Ronald Brown during the Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS/GoMACCS) 2006 in and around the port of Houston, Texas, were measured by Fourier transform infrared (FTIR) and X-ray fluorescence for organic functional groups and elemental composition. Organic mass (OM) concentrations (1-25 μg m-3) for ambient particle samples measured by FTIR showed good agreement with measurements made with an aerosol mass spectrometer. The fractions of organic mass identified as alkane and carboxylic acid groups were 47% and 32%, respectively. Three different types of air masses were identified on the basis of the air mass origin and the radon concentration, with significantly higher carboxylic acid group mass fractions in air masses from the north (35%) than the south (29%) or Gulf of Mexico (26%). Positive matrix factorization analysis attributed carboxylic acid fractions of 30-35% to factors with mild or strong correlations (r > 0.5) to elemental signatures of oil combustion and 9-24% to wood smoke, indicating that part of the carboxylic acid fraction of OM was formed by the same sources that controlled the metal emissions, namely the oil and wood combustion activities. The implication is that a substantial part of the measured carboxylic acid contribution was formed independently of traditionally "secondary" processes, which would be affected by atmospheric (both photochemical and meteorological) conditions and other emission sources. The carboxylic acid group fractions in the Gulf of Mexico and south air masses (GAM and SAM, respectively) were largely oil combustion emissions from ships as well as background marine sources, with only limited recent land influences (based on radon concentrations). Alcohol groups accounted for 14% of OM (mostly associated with oil combustion emissions and background sources), and amine groups accounted for 4% of OM in all air masses. Organosulfate groups were found in GAM and SAM, accounting for 1% and 3% of OM, respectively. Two thirds of the OM and oxygen-to-carbon (O/C) measured could be attributed to oil and wood combustion sources on the basis of mild or strong correlations to coemitted, nonvolatile trace metals, with the remaining one third being associated with atmospherically processed organic aerosol. The cloud condensation nuclei (CCN) fraction (normalized by total condensation nuclei) had weak correlations to the alcohol and amine group fractions and mild correlation with O/C, also varying inversely with alkane group fraction. The chemical components that influenced f(RH) were sulfate, organic, and nitrate fraction, but this contrast is consistent with the size-distribution dependence of CCN counters and nephelometers.

  16. Impacts of phylogenetic nomenclature on the efficacy of the U.S. Endangered Species Act.

    PubMed

    Leslie, Matthew S

    2015-02-01

    Cataloging biodiversity is critical to conservation efforts because accurate taxonomy is often a precondition for protection under laws designed for species conservation, such as the U.S. Endangered Species Act (ESA). Traditional nomenclatural codes governing the taxonomic process have recently come under scrutiny because taxon names are more closely linked to hierarchical ranks than to the taxa themselves. A new approach to naming biological groups, called phylogenetic nomenclature (PN), explicitly names taxa by defining their names in terms of ancestry and descent. PN has the potential to increase nomenclatural stability and decrease confusion induced by the rank-based codes. But proponents of PN have struggled with whether species and infraspecific taxa should be governed by the same rules as other taxa or should have special rules. Some proponents advocate the wholesale abandonment of rank labels (including species); this could have consequences for the implementation of taxon-based conservation legislation. I examined the principles of PN as embodied in the PhyloCode (an alternative to traditional rank-based nomenclature that names biological groups based on the results of phylogenetic analyses and does not associate taxa with ranks) and assessed how this novel approach to naming taxa might affect the implementation of species-based legislation by providing a case study of the ESA. The latest version of the PhyloCode relies on the traditional rank-based codes to name species and infraspecific taxa; thus, little will change regarding the main targets of the ESA because they will retain rank labels. For this reason, and because knowledge of evolutionary relationships is of greater importance than nomenclatural procedures for initial protection of endangered taxa under the ESA, I conclude that PN under the PhyloCode will have little impact on implementation of the ESA. © 2014 Society for Conservation Biology.

  17. Barriers and perceptions regarding code status discussion with families of critically ill patients in a tertiary care hospital of a developing country: A cross-sectional study.

    PubMed

    Syed, Ahsan A; Almas, Aysha; Naeem, Quratulain; Malik, Umer F; Muhammad, Tariq

    2017-02-01

    In Asian societies including Pakistan, a complex background of illiteracy, different familial dynamics, lack of patient's autonomy, religious beliefs, and financial constraints give new dimensions to code status discussion. Barriers faced by physicians during code status discussion in these societies are largely unknown. To determine the barriers and perceptions in discussion of code status by physicians. Questionnaire-based cross-sectional study. This study was conducted in the Department of Medicine of The Aga Khan University Hospital, Karachi, Pakistan. A total of 134 physicians who had discussed at least five code statuses in their lifetime were included. A total of 77 (57.4%) physicians responded. Family-related barriers were found to be the most common barriers. They include family denial (74.0%), level of education of family (66.2%), and conflict between individual family members (66.2%). Regarding personal barriers, lack of knowledge regarding prognosis (44.1%), personal discomfort in discussing death (29.8%), and fear of legal consequences (28.5%) were the top most barriers. In hospital-related barriers, time constraint (57.1%), lack of hospital administration support (48.0%), and suboptimal nursing care after do not resuscitate (48.0%) were the most frequent. There were significant differences among opinions of trainees when compared to those of attending physicians. Family-related barriers are the most frequent roadblocks in the end-of-life care discussions for physicians in Pakistan. Strengthening communication skills of physicians and family education are the potential strategies to improve end-of-life care. Large multi-center studies are needed to better understand the barriers of code status discussion in developing countries.

  18. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as those involving nonlinear advection, diffusion or source terms, as well as non-constant coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of tests' complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. Tests start with a simple case of unidirectional advection, move on to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors which were not detectable with routine verification techniques.
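
    A central calculation in any mesh-convergence study is the observed order of accuracy; a minimal sketch (with made-up error norms) is:

        # Hedged sketch: estimate the observed order p from error norms on
        # successively refined meshes (error ~ h**p). Error values are made up.
        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical L2 error norms against an exact (MES) solution on meshes h, h/2, h/4
        errors = [4.0e-2, 1.1e-2, 2.9e-3]
        for e1, e2 in zip(errors, errors[1:]):
            print(f"observed order ~ {observed_order(e1, e2):.2f}")   # should approach the scheme's formal order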

  19. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    PubMed

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

    One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EAs) seem to be one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the cost of amino acid replacements in terms of their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, the simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure minimize the costs about 2.7 times better than the canonical genetic code. Interestingly, the optimal codes are dominated by amino acids characterized by polarity close to its average value for all amino acids. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
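
    To make the role of the operators concrete, a deliberately simplified sketch is given below; it uses a generic uniform crossover and point mutation on a codon-to-amino-acid assignment with a toy polarity-based cost, and is not the position-based or newly developed operators of the paper:

        # Illustrative EA ingredients for genetic-code optimization; polarity
        # values, operators and parameters are stand-ins, not the paper's.
        import itertools, random

        random.seed(0)
        CODONS = ["".join(c) for c in itertools.product("UCAG", repeat=3)]
        AMINO_ACIDS = list("ARNDCQEGHILKMFPSTWYV*")                    # 20 amino acids plus stop
        POLARITY = {aa: random.uniform(0, 12) for aa in AMINO_ACIDS}   # stand-in polarity scale

        def random_code():
            return {c: random.choice(AMINO_ACIDS) for c in CODONS}

        def cost(code):
            """Toy fitness: squared polarity change summed over all single-nucleotide neighbours."""
            total = 0.0
            for codon, aa in code.items():
                for pos, base in itertools.product(range(3), "UCAG"):
                    mutant = codon[:pos] + base + codon[pos + 1:]
                    if mutant != codon:
                        total += (POLARITY[aa] - POLARITY[code[mutant]]) ** 2
            return total

        def crossover(a, b):
            """Uniform crossover: each codon inherits its assignment from either parent."""
            return {c: (a if random.random() < 0.5 else b)[c] for c in CODONS}

        def mutate(code, rate=0.02):
            return {c: (random.choice(AMINO_ACIDS) if random.random() < rate else aa)
                    for c, aa in code.items()}

        parents = [random_code(), random_code()]
        child = mutate(crossover(*parents))
        print("parent costs:", [round(cost(p)) for p in parents], "child cost:", round(cost(child)))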

  20. Nonlinear Modeling of Radial Stellar Pulsations

    NASA Astrophysics Data System (ADS)

    Smolec, R.

    2009-09-01

    In this thesis, I present the results of my work concerning the nonlinear modeling of radial stellar pulsations. I will focus on classical Cepheids, particularly on the double-mode phenomenon. The history of nonlinear modeling of radial stellar pulsations begins in the 1960s. At the beginning, convection was disregarded in the model equations. Qualitatively, almost all features of the radial pulsators were successfully modeled with purely radiative hydrocodes. Among the problems that remained, the most disturbing was the modeling of the double-mode phenomenon. This long-standing problem seemed to be finally solved with the inclusion of turbulent convection into the model equations (Kollath et al. 1998, Feuchtinger 1998). Although dynamical aspects of the double-mode behaviour were extensively studied, its origin, particularly the specific role played by convection, remained obscure. To study this and other problems of radial stellar pulsations, I implemented convection into pulsation hydrocodes. The codes adopt the Kuhfuss (1986) convection model. In other codes, particularly in the Florida-Budapest hydrocode (e.g. Kollath et al. 2002), used in the computation of most of the published double-mode models, different approximations concerning e.g. eddy-viscous terms or the treatment of convectively stable regions are adopted. In particular, the neglect of negative buoyancy effects in the Florida-Budapest code, and its consequences, was never discussed in the literature. These consequences are severe. Concerning single-mode pulsators, the neglect of negative buoyancy leads to smaller pulsation amplitudes, in comparison to amplitudes computed with a code including these effects. In particular, the neglect of negative buoyancy reduces the amplitude of the fundamental mode very strongly. This property of the Florida-Budapest models is crucial in bringing up the stable non-resonant double-mode Cepheid pulsation involving fundamental and first overtone modes (F/1O). Such pulsation is not observed in models computed including negative buoyancy. As the neglect of negative buoyancy is physically incorrect, so are the double-mode Cepheid models computed with the Florida-Budapest hydrocode. An extensive search for F/1O double-mode Cepheid pulsation with the codes including negative buoyancy effects yielded a null result. Some resonant double-mode F/1O Cepheid models were found, but their occurrence was restricted to a very narrow domain in the Hertzsprung-Russell diagram. Model computations intended to model the double-overtone (1O/2O) Cepheids in the Large Magellanic Cloud also revealed some stable double-mode pulsations, however restricted to a narrow period range. Resonances are most likely conducive to bringing about the double-mode behaviour observed in these models. However, the majority of the double-overtone LMC Cepheids cannot be reproduced with our codes. Hence, the modeling of double-overtone Cepheids with convective hydrocodes is not satisfactory, either. Double-mode pulsation still lacks a satisfactory explanation, and the problem of its modeling remains open.

  1. A simple program to measure and analyse tree rings using Excel, R and SigmaScan

    PubMed Central

    Hietz, Peter

    2011-01-01

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring width marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand by making use of already available code. PMID:26109835
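
    As an illustration of the detrending and inter-series correlation steps described above (the published macros are written in Visual Basic for Applications; this Python analogue uses made-up ring widths):

        # Hedged sketch: detrend two ring-width series with a fitted polynomial
        # growth curve and compute their inter-series correlation.
        import numpy as np

        def detrend(widths, degree=2):
            """Divide raw ring widths by a fitted polynomial growth trend."""
            years = np.arange(len(widths))
            trend = np.polyval(np.polyfit(years, widths, degree), years)
            return widths / trend

        series_a = np.array([2.1, 1.9, 1.7, 1.8, 1.4, 1.5, 1.2, 1.3, 1.1, 1.0])   # mm, made up
        series_b = np.array([2.3, 2.0, 1.9, 1.7, 1.6, 1.4, 1.4, 1.2, 1.2, 0.9])   # mm, made up

        ia, ib = detrend(series_a), detrend(series_b)
        r = np.corrcoef(ia, ib)[0, 1]
        print(f"inter-series correlation of detrended indices: r = {r:.2f}")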

  2. The problem with value

    PubMed Central

    O’Doherty, John P.

    2015-01-01

    Neural correlates of value have been extensively reported in a diverse set of brain regions. However, in many cases it is difficult to determine whether a particular neural response pattern corresponds to a value-signal per se as opposed to an array of alternative non-value related processes, such as outcome-identity coding, informational coding, encoding of autonomic and skeletomotor consequences, alongside previously described “salience” or “attentional” effects. Here, I review a number of experimental manipulations that can be used to test for value, and I identify the challenges in ascertaining whether a particular neural response is or is not a value signal. Finally, I emphasize that some non-value related signals may be especially informative as a means of providing insight into the nature of the decision-making related computations that are being implemented in a particular brain region. PMID:24726573

  3. Darwinism and ethology. The role of natural selection in animals and humans.

    PubMed

    Gervet, J; Soleilhavoup, M

    1997-11-01

    The role of behaviour in biological evolution is examined within the context of Darwinism. All Darwinian models are based on the distinction of two mechanisms: one that permits faithful transmission of a feature from one generation to another, and another that differentially regulates the degree of this transmission. Behaviour plays a minimal role as an agent of transmission in the greater part of the animal kingdom; by contrast, the forms it may assume strongly influence the mechanisms of selection regulating the different rates of transmission. We consider the decisive feature of the human species to be the existence of a phenotypical system of cultural coding characterized by a precision and reliability that are the distinctive features of genetic coding in animals. We examine the consequences of this for the application of the Darwinian model to human history.

  4. SKIRT: Hybrid parallelization of radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.

    2017-07-01

    We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.

  5. Information theory of adaptation in neurons, behavior, and mood.

    PubMed

    Sharpee, Tatyana O; Calhoun, Adam J; Chalasani, Sreekanth H

    2014-04-01

    The ability to make accurate predictions of future stimuli and the consequences of one's actions is crucial for survival and appropriate decision-making. These predictions are constantly being made at different levels of the nervous system. This is evidenced by adaptation to stimulus parameters in sensory coding, and in the learning of an up-to-date model of the environment at the behavioral level. This review will discuss recent findings that actions of neurons and animals are selected based on detailed stimulus history in such a way as to maximize information for achieving the task at hand. Information maximization dictates not only how sensory coding should adapt to various statistical aspects of stimuli, but also that the reward function should adapt to match the predictive information from past to future. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. [The appreciation of the handicap in the brothers of the coast (1664-1675), according to Alexandre-Olivier Exmelin, surgeon of the privateers].

    PubMed

    Hamonet, Claude

    2007-01-01

    The reparation of bodily injuries resulting from intentional or unintentional violence is one measure of stability and progress in human societies concerned with a dignified life for the victims. Initiated by the Code of Hammurabi and continued by the Jews in the Bible, the reference was (and still is) the amputated or impaired part of the body (hand, arm, leg, eye...). For every part, a tariff in money or a rate in percentage was indicated. The brothers of the coast translated this into ecus or slaves. This code reflects the originality of a society founded on violence, robbery and murder, yet one that introduced cooperative, if not democratic, modes of functioning. The role of Bertrand d'Ogeron, governor of Turtle Island, was highly beneficial.

  7. A simple program to measure and analyse tree rings using Excel, R and SigmaScan.

    PubMed

    Hietz, Peter

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring width marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand by making use of already available code.

  8. Hybrid finite element/waveguide mode analysis of passive RF devices

    NASA Astrophysics Data System (ADS)

    McGrath, Daniel T.

    1993-07-01

    A numerical solution for time-harmonic electromagnetic fields in two-port passive radio frequency (RF) devices has been developed, implemented in a computer code, and validated. Vector finite elements are used to represent the fields in the device interior, and field continuity across waveguide apertures is enforced by matching the interior solution to a sum of waveguide modes. Consequently, the mesh may end at the aperture instead of extending into the waveguide. The report discusses the variational formulation and its reduction to a linear system using Galerkin's method. It describes the computer code, including its interface to commercial CAD software used for geometry generation. It presents validation results for waveguide discontinuities, coaxial transitions, and microstrip circuits. They demonstrate that the method is an effective and versatile tool for predicting the performance of passive RF devices.

  9. Force-free electrodynamics in dynamical curved spacetimes

    NASA Astrophysics Data System (ADS)

    McWilliams, Sean

    2015-04-01

    We present results on our study of force-free electrodynamics in curved spacetimes. Specifically, we present several improvements to what has become the established set of evolution equations, and we apply these to study the nonlinear stability of analytically known force-free solutions for the first time. We implement our method in a new pseudo-spectral code built on top of the SpEC code for evolving dynamic spacetimes. We also revisit these known solutions and attempt to clarify some interesting properties that render them analytically tractable. Finally, we preview some new work that similarly revisits the established approach to solving another problem in numerical relativity: the post-merger recoil from asymmetric gravitational-wave emission. These new results may have significant implications for the parameter dependence of recoils, and consequently for the statistical expectations for recoil velocities of merged systems.

  10. 2007 Report to Congress of the U.S.- China Economic and Security Review Commission

    DTIC Science & Technology

    2007-11-01

    the $3 billion stake it took in the New York-based private equity firm The Blackstone Group. Some worry that the new fund may be used to capture more...pollution of surrounding riverbanks and other ecological harm. Pollution from Coal Mining Air pollution is not the only environmental consequence of Chi...technology firms and human rights organizations was formed to discuss the establishment of an international code of ethics on issues related to

  11. An Improved Maintenance Model for the Simulation of Strategic Airlift Capability.

    DTIC Science & Technology

    1982-03-01

    developed using SLAM as the primary simulation language. Maintenance manning is modeled at the Air Force Specialty Code level, to allow the possibility of...Atlantic Treaty Organization (NATO) allies is one of our primary national objectives, but recent increases in Soviet ground and air forces (Ref 5:100) have...arrive from the United States. Consequently, the primary objective of the United States Air Force mobility program is to be able, by 1982, to double the

  12. Implications of Analytical Investigations about the Semiconductor Equations on Device Modeling Programs.

    DTIC Science & Technology

    1983-04-01

    SIGNIFICANCE AND EXPLANATION Many different codes for the simulation of semiconductor devices such as transistors, diodes, thyristors are already circulated...partially take into account the consequences introduced by degenerate semiconductors (e.g. invalidity of Boltzmann's statistics, bandgap narrowing). These... (2.10) can be physically interpreted as the application of Boltzmann statistics. However (2.10)...

  13. Spatial coding of ordinal information in short- and long-term memory.

    PubMed

    Ginsburg, Véronique; Gevers, Wim

    2015-01-01

    The processing of numerical information induces a spatial response bias: faster responses to small numbers with the left hand and faster responses to large numbers with the right hand. Most theories agree that long-term representations underlie this so-called SNARC effect (Spatial Numerical Association of Response Codes; Dehaene et al., 1993). However, a spatial response bias was also observed with the activation of temporary position-space associations in working memory (ordinal position effect; van Dijck and Fias, 2011): items belonging to the beginning of a memorized sequence are responded to faster with the left hand, while items at the end of the sequence are responded to faster with the right hand. The theoretical possibility was put forward that the SNARC effect is an instance of the ordinal position effect, with the empirical consequence that the SNARC effect and the ordinal position effect cannot be observed simultaneously. In two experiments we falsify this claim by demonstrating that the SNARC effect and the ordinal position effect are not mutually exclusive. Consequently, this suggests that the SNARC effect and the ordinal position effect result from the activation of different representations. We conclude that spatial response biases can result from the activation of both pre-existing positions in long-term memory and temporary space associations in working memory at the same time.

  14. COOL: A code for Dynamic Monte Carlo Simulation of molecular dynamics

    NASA Astrophysics Data System (ADS)

    Barletta, Paolo

    2012-02-01

    Cool is a program to simulate evaporative and sympathetic cooling for a mixture of two gases co-trapped in a harmonic potential. The collisions involved are assumed to be exclusively elastic, and losses are due to evaporation from the trap. Each particle is followed individually along its trajectory, so properties such as spatial densities or energy distributions can be readily evaluated. The code can be used sequentially, by employing one output as input for another run. The code can be easily generalised to describe more complicated processes, such as the inclusion of inelastic collisions, or the possible presence of more than two species in the trap. New version program summary: Program title: COOL Catalogue identifier: AEHJ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHJ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1 097 733 No. of bytes in distributed program, including test data, etc.: 18 425 722 Distribution format: tar.gz Programming language: C++ Computer: Desktop Operating system: Linux RAM: 500 Mbytes Classification: 16.7, 23 Catalogue identifier of previous version: AEHJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 388 Does the new version supersede the previous version?: Yes Nature of problem: Simulation of the sympathetic cooling process occurring for two molecular gases co-trapped in a deep optical trap. Solution method: The Direct Simulation Monte Carlo method exploits the decoupling, over a short time period, of the inter-particle interaction from the trapping potential. The particle dynamics is thus exclusively driven by the external optical field. The rare inter-particle collisions are treated with an acceptance/rejection mechanism, that is, by comparing a random number to the collision probability defined in terms of the inter-particle cross section and the centre-of-mass energy. All particles in the trap are individually simulated, so that at each time step a number of useful quantities, such as the spatial densities or the energy distributions, can be readily evaluated. Reasons for new version: A number of issues made the old version very difficult to port to different architectures, and impossible to compile on Windows. Furthermore, the test-run results could only be replicated poorly, because the simulations are very sensitive to machine-level numerical noise. In practice, as the particles are simulated for billions of steps, a small difference in the initial conditions, due to the finite precision of double-precision reals, can have macroscopic effects on the output. This is not a problem in its own right, but a feature of such simulations. However, for the sake of completeness, we have introduced a quadruple precision version of the code which yields the same results independently of the software used to compile it or the hardware architecture on which it is run. Summary of revisions: A number of bugs in the dynamic memory allocation have been detected and removed, mostly in the cool.cpp file. All files have been renamed with a .cpp ending, rather than .c++, to make them compatible with Windows. The random number generator routine, which is the computational core of the algorithm, has been re-written in C++, so there is no longer any need for mixed FORTRAN-C++ compilation.
A quadruple precision version of the code is provided alongside the original double precision one. The makefile allows the user to choose which one to compile by setting the switch PRECISION to either double or quad. The source code and header files have been organised into directories to give the code a neater file layout. Restrictions: The in-trap motion of the particles is treated classically. Running time: The running time is relatively short, 1-2 hours. However, it is convenient to replicate each simulation several times with different initialisations of the random sequence.
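
    The acceptance/rejection collision test described in the solution method is easy to sketch. The fragment below is an illustrative Python rendering (invented names and an assumed form for the collision probability; it is not an excerpt from COOL): a candidate pair collides when a uniform random number falls below a probability built from the cross section and the relative speed, after which the relative velocity is redrawn isotropically for an elastic collision.

      # Illustrative DSMC-style acceptance/rejection step (not COOL's actual code).
      import numpy as np

      rng = np.random.default_rng(0)

      def maybe_collide(v1, v2, m1, m2, cross_section, n_density, dt):
          """Return post-collision velocities, or None if the candidate pair does not collide."""
          v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
          g = v1 - v2                                   # relative velocity
          g_mag = np.linalg.norm(g)
          p_coll = n_density * cross_section * g_mag * dt   # assumed form; must stay << 1
          if rng.random() >= p_coll:
              return None                               # rejected: no collision this step
          v_cm = (m1 * v1 + m2 * v2) / (m1 + m2)        # centre-of-mass velocity
          cos_t = 2.0 * rng.random() - 1.0              # isotropic scattering angle
          sin_t = np.sqrt(1.0 - cos_t ** 2)
          phi = 2.0 * np.pi * rng.random()
          g_new = g_mag * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
          # elastic collision: |g| and total momentum are conserved
          return v_cm + (m2 / (m1 + m2)) * g_new, v_cm - (m1 / (m1 + m2)) * g_new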

  15. Conceptual-driven classification for coding advise in health insurance reimbursement.

    PubMed

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor, used as a reference during the coding process, was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build ICD-AS and to evaluate its prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model with traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insight into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits to patients, hospitals, and the healthcare system. Copyright © 2010 Elsevier B.V. All rights reserved.
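
    The abstract does not give the exact certainty-factor formula, so the following Python sketch uses the classical MYCIN-style combination rule purely as an illustration of how per-evidence certainty factors for one candidate ICD code might be folded into a single confidence value; names and numbers are hypothetical.

      # Hedged sketch: MYCIN-style certainty-factor combination for one ICD code.
      # The actual formula used by ICD-AS is not specified in the abstract.
      def combine_cf(cf_a: float, cf_b: float) -> float:
          if cf_a >= 0 and cf_b >= 0:
              return cf_a + cf_b * (1 - cf_a)
          if cf_a < 0 and cf_b < 0:
              return cf_a + cf_b * (1 + cf_a)
          return (cf_a + cf_b) / (1 - min(abs(cf_a), abs(cf_b)))

      def code_confidence(cfs):
          """Fold the certainty factors of all matched evidence for one candidate code."""
          total = 0.0
          for cf in cfs:
              total = combine_cf(total, cf)
          return total

      print(code_confidence([0.6, 0.4, -0.2]))  # two supporting terms, one opposing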

  16. Optimizing the use of a sensor resource for opponent polarization coding

    PubMed Central

    Heras, Francisco J.H.

    2017-01-01

    Flies use specialized photoreceptors R7 and R8 in the dorsal rim area (DRA) to detect skylight polarization. R7 and R8 form a tiered waveguide (central rhabdomere pair, CRP) with R7 on top, filtering the light delivered to R8. We examine how the division of a given resource, CRP length, between R7 and R8 affects their ability to code polarization angle. We model optical absorption to show how the length fractions allotted to R7 and R8 determine the rates at which they transduce photons, and correct these rates for transduction unit saturation. The rates give the polarization signal and photon noise in R7 and in R8. Their signals are combined in an opponent unit, intrinsic noise is added, and the unit's output is analysed to extract two measures of coding ability: the number of discriminable polarization angles and the mutual information. A very long R7 maximizes opponent signal amplitude, but codes inefficiently due to photon noise in the very short R8. Discriminability and mutual information are optimized by maximizing the signal-to-noise ratio (SNR). At lower light levels approximately equal lengths of R7 and R8 are optimal because photon noise dominates. At higher light levels intrinsic noise comes to dominate and a shorter R8 is optimal; the optimum R8 length fraction falls to one-third. This intensity-dependent range of optimal length fractions corresponds to the range observed in different fly species and is not affected by transduction unit saturation. We conclude that a limited resource, rhabdom length, can be divided between two polarization sensors, R7 and R8, to optimize opponent coding. We also find that coding ability increases sub-linearly with total rhabdom length, according to the law of diminishing returns. Consequently, the specialized shorter central rhabdom in the DRA codes polarization twice as efficiently, with respect to rhabdom length, as the longer rhabdom used in the rest of the eye. PMID:28316880
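
    The photon-rate bookkeeping described above can be sketched in a few lines. The Python fragment below is a hedged illustration (Beer-Lambert style absorption with R7 screening R8, plus a simple saturation correction); the absorption coefficient, rhabdom length and saturation constant are assumed placeholder values, not the authors' fitted parameters.

      # Illustrative sketch: mean transduced photon rates in a tiered R7/R8 pair
      # as a function of the R7 length fraction, with a crude saturation correction.
      import numpy as np

      def transduction_rates(f_r7, photons_in=1e5, absorb_per_um=0.01,
                             total_len_um=80.0, unit_capacity=3e4):
          """Return (rate_R7, rate_R8) for a given R7 length fraction f_r7."""
          len_r7 = f_r7 * total_len_um
          len_r8 = (1.0 - f_r7) * total_len_um
          passed_r7 = np.exp(-absorb_per_um * len_r7)      # fraction R7 lets through
          absorbed_r7 = 1.0 - passed_r7
          absorbed_r8 = passed_r7 * (1.0 - np.exp(-absorb_per_um * len_r8))
          def saturate(rate):                              # finite pool of transduction units
              return rate / (1.0 + rate / unit_capacity)
          return saturate(photons_in * absorbed_r7), saturate(photons_in * absorbed_r8)

      for frac in (0.33, 0.50, 0.67):
          print(frac, transduction_rates(frac))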

  17. The Purine Bias of Coding Sequences is Determined by Physicochemical Constraints on Proteins.

    PubMed

    Ponce de Leon, Miguel; de Miranda, Antonio Basilio; Alvarez-Valin, Fernando; Carels, Nicolas

    2014-01-01

    For this report, we analyzed protein secondary structures in relation to the statistics of the three nucleotide codon positions. The purpose of this investigation was to find which properties at the ribosome, tRNA or protein level could explain the purine bias (Rrr) as it is observed in coding DNA. We found that the Rrr pattern is the consequence of a regularity (the codon structure) resulting from physicochemical constraints on proteins and thermodynamic constraints on the ribosomal machinery. The physicochemical constraints on proteins mainly come from the hydropathy and molecular weight (MW) of secondary structures as well as the energy cost of amino acid synthesis. These constraints appear through a network of statistical correlations, such as (i) the cost of amino acid synthesis, which favors a higher level of guanine in the first codon position, (ii) the constructive contribution of hydropathy alternation in proteins, (iii) the spatial organization of secondary structure in proteins according to solvent accessibility, (iv) the spatial organization of secondary structure according to amino acid hydropathy, (v) the statistical correlation of MW with protein secondary structures and their overall hydropathy, (vi) the statistical correlation of thymine in the second codon position with hydropathy and the energy cost of amino acid synthesis, and (vii) the statistical correlation of adenine in the second codon position with amino acid complexity and the MW of secondary protein structures. Amino acid physicochemical properties and functional constraints on proteins constitute a code that is translated into a purine bias within the coding DNA via tRNAs. In that sense, the Rrr pattern within coding DNA is the effect of information transfer on nucleotide composition from protein to DNA by selection according to codon position. Thus, coding DNA structure and the ribosomal machinery co-evolved to minimize the energy cost of protein coding given the functional constraints on proteins.

  18. Moral competence among nurses in Malawi: A concept analysis approach.

    PubMed

    Maluwa, Veronica Mary; Gwaza, Elizabeth; Sakala, Betty; Kapito, Esnath; Mwale, Ruth; Haruzivishe, Clara; Chirwa, Ellen

    2018-01-01

    Nurses are expected to provide comprehensive, holistic and ethically accepted care according to their code of ethics and practice. However, in Malawi this is not always the case. This article analyses the concept of moral competence in relation to nursing practice using Walker and Avant's strategy of concept analysis, with the aim of determining the defining attributes, antecedents and consequences of moral competence in nursing practice. Deductive analysis was used to find the defining attributes of moral competence, which were kindness, compassion, caring, critical thinking, ethical decision-making ability, problem solving, responsibility, discipline, accountability, communication, solidarity, honesty, and respect for human values, dignity and rights. The identified antecedents were personal, cultural and religious values; nursing ethics training; environment; and guidance. The consequences of moral competence are team-work spirit, effective communication, improved performance and positive attitudes in providing nursing care. Moral competence can therefore be used as a tool to improve care in nursing practice, to meet patients' problems and needs, and consequently to increase the public's satisfaction in Malawi.

  19. Assessment of the effects and limitations of the 1998 to 2008 Abbreviated Injury Scale map using a large population-based dataset.

    PubMed

    Palmer, Cameron S; Franklyn, Melanie

    2011-01-07

    Trauma systems should consistently monitor a given trauma population over a period of time. The Abbreviated Injury Scale (AIS) and derived scores such as the Injury Severity Score (ISS) are commonly used to quantify injury severities in trauma registries. To reflect contemporary trauma management and treatment, the most recent version of the AIS (AIS08) contains many codes which differ in severity from their equivalents in the earlier 1998 version (AIS98). Consequently, the adoption of AIS08 may impede comparisons between data coded using different AIS versions. It may also affect the number of patients classified as major trauma. The entire AIS98-coded injury dataset of a large population-based trauma registry was retrieved and mapped to AIS08 using the currently available AIS98-AIS08 dictionary map. The percentage of codes which had increased or decreased in severity, or could not be mapped, was examined in conjunction with the effect of these changes on the calculated ISS. The potential for free-text information accompanying AIS coding to improve the quality of AIS mapping was explored. A total of 128280 AIS98-coded injuries were evaluated in 32134 patients, of whom 15471 were classified as major trauma. Although only 4.5% of dictionary codes decreased in severity from AIS98 to AIS08, this represented almost 13% of injuries in the registry. In 4.9% of patients, no injuries could be mapped. ISS was potentially unreliable in one-third of patients, as they had at least one AIS98 code which could not be mapped. Using AIS08, the number of patients classified as major trauma decreased by between 17.3% and 30.3%. Evaluation of free-text descriptions for some injuries demonstrated the potential to improve mapping between AIS versions. Converting AIS98-coded data to AIS08 results in a significant decrease in the number of patients classified as major trauma. Many AIS98 codes are missing from the existing AIS map, and across a trauma population the AIS08 estimates it produces are of insufficient quality to be used in practice. However, it may be possible to improve AIS98 to AIS08 mapping to the point where it is useful to established registries.
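
    The study's reliability concern hinges on how the ISS is computed from AIS severities. The Python sketch below shows the standard ISS convention (square the highest severity in each of the three most severely injured body regions and sum; an AIS severity of 6 sets ISS to 75) and how a single unmappable code leaves the score undefined; the data values are made up for illustration.

      # Standard ISS computation from AIS-coded injuries (body_region, severity).
      # An unmappable code (severity None) leaves the ISS undefined, which is the
      # reliability problem quantified in the study.
      from collections import defaultdict

      def iss(injuries):
          """injuries: iterable of (body_region, ais_severity); severity None = unmappable."""
          if any(sev is None for _, sev in injuries):
              return None                      # ISS unreliable: at least one unmapped code
          worst = defaultdict(int)
          for region, sev in injuries:
              worst[region] = max(worst[region], sev)
          if any(sev == 6 for sev in worst.values()):
              return 75                        # conventional maximum for any AIS 6 injury
          top3 = sorted(worst.values(), reverse=True)[:3]
          return sum(sev ** 2 for sev in top3)

      print(iss([("head", 4), ("chest", 3), ("abdomen", 2), ("limbs", 2)]))  # 29
      print(iss([("head", 4), ("chest", None)]))                             # None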

  20. Assessment of the effects and limitations of the 1998 to 2008 Abbreviated Injury Scale map using a large population-based dataset

    PubMed Central

    2011-01-01

    Background Trauma systems should consistently monitor a given trauma population over a period of time. The Abbreviated Injury Scale (AIS) and derived scores such as the Injury Severity Score (ISS) are commonly used to quantify injury severities in trauma registries. To reflect contemporary trauma management and treatment, the most recent version of the AIS (AIS08) contains many codes which differ in severity from their equivalents in the earlier 1998 version (AIS98). Consequently, the adoption of AIS08 may impede comparisons between data coded using different AIS versions. It may also affect the number of patients classified as major trauma. Methods The entire AIS98-coded injury dataset of a large population-based trauma registry was retrieved and mapped to AIS08 using the currently available AIS98-AIS08 dictionary map. The percentage of codes which had increased or decreased in severity, or could not be mapped, was examined in conjunction with the effect of these changes on the calculated ISS. The potential for free-text information accompanying AIS coding to improve the quality of AIS mapping was explored. Results A total of 128280 AIS98-coded injuries were evaluated in 32134 patients, of whom 15471 were classified as major trauma. Although only 4.5% of dictionary codes decreased in severity from AIS98 to AIS08, this represented almost 13% of injuries in the registry. In 4.9% of patients, no injuries could be mapped. ISS was potentially unreliable in one-third of patients, as they had at least one AIS98 code which could not be mapped. Using AIS08, the number of patients classified as major trauma decreased by between 17.3% and 30.3%. Evaluation of free-text descriptions for some injuries demonstrated the potential to improve mapping between AIS versions. Conclusions Converting AIS98-coded data to AIS08 results in a significant decrease in the number of patients classified as major trauma. Many AIS98 codes are missing from the existing AIS map, and across a trauma population the AIS08 estimates it produces are of insufficient quality to be used in practice. However, it may be possible to improve AIS98 to AIS08 mapping to the point where it is useful to established registries. PMID:21214906

  1. A rocket-borne pulse-height analyzer for energetic particle measurements

    NASA Technical Reports Server (NTRS)

    Leung, W.; Smith, L. G.; Voss, H. D.

    1979-01-01

    The pulse-height analyzer (PHA) basically resembles a time-sharing, multiplexing data-acquisition system which acquires analog data (from energetic particle spectrometers) and converts them into digital code. The PHA simultaneously acquires pulse-height information from the analog signals of the four input channels and sequentially multiplexes the digitized data to a microprocessor. The PHA together with the microprocessor forms an on-board real-time data-manipulation system. The system processes data obtained during the rocket flight and reduces the amount of data to be sent back to the ground station. Consequently, the data-reduction process for the rocket experiments is sped up. By using a time-sharing technique, the throughput rate of the microprocessor is increased. Moreover, data from several particle spectrometers are manipulated to share one information channel; consequently, the telemetry (TM) capacity is effectively increased.
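
    The time-shared multiplexing idea can be illustrated in software, although the flight unit implements it in hardware. The Python sketch below (hypothetical names and data) tags digitized pulse heights from four channels with their channel number and merges them, round-robin, into a single stream for the downstream processor or telemetry link.

      # Software illustration (not the flight hardware) of round-robin multiplexing
      # of digitized pulse heights from four spectrometer channels into one stream.
      from itertools import zip_longest

      def multiplex(channels):
          """channels: list of per-channel lists of digitized pulse heights."""
          stream = []
          for samples in zip_longest(*channels):       # visit channels in round-robin order
              for ch, value in enumerate(samples):
                  if value is not None:
                      stream.append((ch, value))       # (channel id, pulse height)
          return stream

      print(multiplex([[12, 40], [7], [88, 3, 5], [19, 22]]))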

  2. The circulating transcriptome as a source of non-invasive cancer biomarkers: concepts and controversies of non-coding and coding RNA in body fluids

    PubMed Central

    Fernandez-Mercado, Marta; Manterola, Lorea; Larrea, Erika; Goicoechea, Ibai; Arestin, María; Armesto, María; Otaegui, David; Lawrie, Charles H

    2015-01-01

    The gold standard for cancer diagnosis remains the histological examination of affected tissue, obtained either by surgical excision or by radiologically guided biopsy. Such procedures, however, are expensive, not without risk to the patient, and require consistent evaluation by expert pathologists. Consequently, the search for non-invasive tools for the diagnosis and management of cancer has led to great interest in the field of circulating nucleic acids in plasma and serum. An additional benefit of blood-based testing is the ability to carry out screening and repeat sampling on patients undergoing therapy, or to monitor disease progression, allowing for the development of a personalized approach to cancer patient management. Despite having been discovered over 60 years ago, the clear clinical potential of circulating nucleic acids, with the notable exception of prenatal diagnostic testing, has yet to translate into the clinic. The recent discovery of non-coding (nc)RNA (in particular micro(mi)RNAs) in the blood has provided fresh impetus for the field. In this review, we discuss the potential of the circulating transcriptome (coding and ncRNA) as novel cancer biomarkers, the controversy surrounding their origin and biology, and most importantly the hurdles that remain to be overcome if they are really to become part of future clinical practice. PMID:26119132

  3. Study of steam condensation at sub-atmospheric pressure: setting a basic research using MELCOR code

    NASA Astrophysics Data System (ADS)

    Manfredini, A.; Mazzini, M.

    2017-11-01

    One of the most serious accidents that can occur in the experimental nuclear fusion reactor ITER is the break of one of the headers of the cooling system of the first wall of the Tokamak. This results in the discharge of a water-steam mixture into the vacuum vessel (VV), with consequent pressurization of this container. To prevent the pressure in the VV from exceeding 150 kPa absolute, a system discharges the steam into a suppression pool held at an absolute pressure of 4.2 kPa. The computer codes used to analyze such incidents (e.g. RELAP5 or MELCOR) are not validated experimentally for these conditions. Therefore, we planned a basic research programme in order to obtain experimental data useful for validating the heat transfer correlations used in these codes. After a thorough literature search on this topic, ACTA, in collaboration with the staff of ITER, defined the experimental matrix and performed the design of the experimental apparatus. For the thermal-hydraulic design of the experiments, we executed a series of calculations with MELCOR. This code, however, was used in an unconventional mode, with the development of models suited respectively to low and high steam flow-rate tests. The article concludes with a discussion of the placement of the experimental data within the map characterizing the phenomenon, showing the importance of the new knowledge acquired, particularly in the case of chugging.

  4. Two-Fluid Extensions to the M3D CDX-U Validation Study

    NASA Astrophysics Data System (ADS)

    Breslau, J.; Strauss, H.; Sugiyama, L.

    2005-10-01

    As part of a cross-code verification and validation effort, both the M3D code [1] and the NIMROD code [2] have qualitatively reproduced the nonlinear behavior of a complete sawtooth cycle in the CDX-U tokamak, chosen for the study because its low temperature and small size put it in a parameter regime easily accessible to both codes. Initial M3D studies on this problem used a resistive MHD model with a large, empirical perpendicular heat transport value and with modest toroidal resolution (24 toroidal planes). The success of this study prompted the pursuit of more quantitatively accurate predictions by the application of more sophisticated physical models and higher numerical resolution. The results of two subsequent follow-up studies are presented here. In the first, the toroidal resolution of the original run is doubled to 48 planes. The behavior of the sawtooth in this case is essentially the same as in the lower-resolution study. The sawtooth study has also been repeated using a two-fluid plasma model, with the effects of the ω*i term emphasized. The resulting mode rotation, as well as the effects on the reconnection rate (sawtooth crash time), sawtooth period, and overall stability, are presented. [1] W. Park, et al., Phys. Plasmas 6, 1796 (1999). [2] C. Sovinec, et al., J. Comp. Phys. 195, 355 (2004).

  5. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care.

    PubMed

    Al-Hablani, Bader

    2017-01-01

    The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. The use of SNOMED CT in CDSSs can be considered an answer to the problem of medical errors as well as a support for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSSs. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique for searching SNOMED CT data and, consequently, for helping to improve preventive health services.

  6. Energy Cost Impact of Non-Residential Energy Code Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.

    2016-08-22

    The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology to develop and refine further energy cost impacts, specific to building type, system type, and climate location is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so only the most impactful requirements from an energy cost perspective are verified for every building and a subset of the less impactful requirements are verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.
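
    The ranking logic described above amounts to an expected-value calculation: for each requirement, sum the probability of each non-compliance level times the lost savings estimated for that level, then sort. The Python sketch below illustrates this; the requirement names, probabilities and dollar values are invented placeholders, not results from the study.

      # Hedged sketch of ranking code requirements by expected energy-cost impact.
      def expected_impact(levels):
          """levels: list of (probability, lost_savings_usd_per_yr) for one requirement."""
          return sum(p * loss for p, loss in levels)

      requirements = {
          "interior lighting power allowance": [(0.10, 450.0), (0.05, 900.0)],
          "economizer high-limit control":     [(0.20, 300.0), (0.02, 1200.0)],
          "wall insulation R-value":           [(0.05, 150.0)],
      }

      ranked = sorted(requirements.items(),
                      key=lambda kv: expected_impact(kv[1]), reverse=True)
      for name, levels in ranked:
          print(f"{name:40s} {expected_impact(levels):8.1f} $/yr expected lost savings")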

  7. Coding and non-coding gene regulatory networks underlie the immune response in liver cirrhosis.

    PubMed

    Gao, Bo; Zhang, Xueming; Huang, Yongming; Yang, Zhengpeng; Zhang, Yuguo; Zhang, Weihui; Gao, Zu-Hua; Xue, Dongbo

    2017-01-01

    Liver cirrhosis is recognized as being the consequence of immune-mediated hepatocyte damage and repair processes. However, the regulation of the immune responses underlying liver cirrhosis has not been elucidated. In this study, we used GEO datasets and bioinformatics methods to establish coding and non-coding gene regulatory networks, including transcription factor-/lncRNA-microRNA-mRNA networks and competing endogenous RNA interaction networks. Our results identified 2224 mRNAs, 70 lncRNAs and 46 microRNAs that were differentially expressed in liver cirrhosis. The transcription factor-/lncRNA-microRNA-mRNA network we uncovered, which results in immune-mediated liver cirrhosis, comprises 5 core microRNAs (e.g., miR-203, miR-219-5p), 3 transcription factors (i.e., FOXP3, ETS1 and FOS) and 7 lncRNAs (e.g., ENST00000671336, ENST00000575137). The competing endogenous RNA interaction network we identified includes a complex immune response regulatory subnetwork that controls the entire liver cirrhosis network. Additionally, we found 10 overlapping GO terms shared by both liver cirrhosis and hepatocellular carcinoma, including "immune response". Interestingly, the overlapping differentially expressed genes in liver cirrhosis and hepatocellular carcinoma were enriched in immune response-related functional terms. In summary, a complex gene regulatory network underlying immune response processes may play an important role in the development and progression of liver cirrhosis and its development into hepatocellular carcinoma.

  8. Analysis of the influence of the heat transfer phenomena on the late phase of the ThAI Iod-12 test

    NASA Astrophysics Data System (ADS)

    Gonfiotti, B.; Paci, S.

    2014-11-01

    Iodine is one of the major contributors to the source term during a severe accident in a nuclear power plant, owing to its volatility and high radiological consequences. Therefore, large efforts have been made to describe iodine behaviour during an accident, especially in the containment system. Owing to the lack of experimental data, in recent years many attempts have been made to fill the gaps in the knowledge of iodine behaviour. In this framework, two tests (ThAI Iod-11 and Iod-12) were carried out inside a multi-compartment steel vessel. A quite complex transient characterizes these two tests; therefore they are also suitable for thermal-hydraulic benchmarks. The two tests were originally released for a benchmark exercise during the SARNET2 EU Project. At the end of this benchmark a report covering the main findings was issued, stating that the codes commonly employed in severe accident studies were able to simulate the tests, but with large discrepancies. The present work is therefore related to the application of the new versions of the ASTEC and MELCOR codes, with the aim of carrying out a new code-to-code comparison against the ThAI Iod-12 experimental data, focusing on the influence of the heat exchanges with the outer environment, which appears to be one of the most challenging issues to cope with.

  9. European Code against Cancer 4th Edition: Process of reviewing the scientific evidence and revising the recommendations.

    PubMed

    Minozzi, Silvia; Armaroli, Paola; Espina, Carolina; Villain, Patricia; Wiseman, Martin; Schüz, Joachim; Segnan, Nereo

    2015-12-01

    The European Code Against Cancer is a set of recommendations to give advice on cancer prevention. Its 4th edition is an update of the 3rd edition, from 2003. Working Groups of independent experts from different fields of cancer prevention were appointed to review the recommendations, supported by a Literature Group to provide scientific and technical support in the assessment of the scientific evidence, through systematic reviews of the literature. Common procedures were developed to guide the experts in identifying, retrieving, assessing, interpreting and summarizing the scientific evidence in order to revise the recommendations. The Code strictly followed the concept of providing advice to European Union citizens based on the current best available science. The advice, if followed, would be expected to reduce cancer risk, referring both to avoiding or reducing exposure to carcinogenic agents or changing behaviour related to cancer risk and to participating in medical interventions able to avert specific cancers or their consequences. The information sources and procedures for the review of the scientific evidence are described here in detail. The 12 recommendations of the 4th edition of the European Code Against Cancer were ultimately approved by a Scientific Committee of leading European cancer and public health experts. Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.

  10. A toolbox of lectins for translating the sugar code: the galectin network in phylogenesis and tumors.

    PubMed

    Kaltner, H; Gabius, H-J

    2012-04-01

    Lectin histochemistry has revealed cell-type-selective glycosylation. It is under dynamic and spatially controlled regulation. Since their chemical properties allow carbohydrates to reach unsurpassed structural diversity in oligomers, they are ideal for high density information coding. Consequently, the concept of the sugar code assigns a functional dimension to the glycans of cellular glycoconjugates. Indeed, multifarious cell processes depend on specific recognition of glycans by their receptors (lectins), which translate the sugar-encoded information into effects. Duplication of ancestral genes and the following divergence of sequences account for the evolutionary dynamics in lectin families. Differences in gene number can even appear among closely related species. The adhesion/growth-regulatory galectins are selected as an instructive example to trace the phylogenetic diversification in several animals, most of them popular models in developmental and tumor biology. Chicken galectins are identified as a low-level-complexity set, thus singled out for further detailed analysis. The various operative means for establishing protein diversity among the chicken galectins are delineated, and individual characteristics in expression profiles discerned. To apply this galectin-fingerprinting approach in histopathology has potential for refining differential diagnosis and for obtaining prognostic assessments. On the grounds of in vitro work with tumor cells a strategically orchestrated co-regulation of galectin expression with presentation of cognate glycans is detected. This coordination epitomizes the far-reaching physiological significance of sugar coding.

  11. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care

    PubMed Central

    Al-Hablani, Bader

    2017-01-01

    Objective The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. Method PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome Measures Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. Results The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. Conclusion The use of SNOMED CT in CDSSs can be considered an answer to the problem of medical errors as well as a support for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSSs. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique for searching SNOMED CT data and, consequently, for helping to improve preventive health services. PMID:28566995

  12. DNA methylation of miRNA coding sequences putatively associated with childhood obesity.

    PubMed

    Mansego, M L; Garcia-Lacarte, M; Milagro, F I; Marti, A; Martinez, J A

    2017-02-01

    Epigenetic mechanisms may be involved in obesity onset and its consequences. The aim of the present study was to evaluate whether DNA methylation status in microRNA (miRNA) coding regions is associated with childhood obesity. DNA isolated from white blood cells of 24 children (identification sample: 12 obese and 12 non-obese) from the Grupo Navarro de Obesidad Infantil study was hybridized to a 450K methylation microarray. Several CpGs whose DNA methylation levels differed significantly between obese and non-obese children were validated by MassArray® in 95 children (validation sample) from the same study. Microarray analysis identified 16 differentially methylated CpGs between the two groups (6 hypermethylated and 10 hypomethylated). DNA methylation levels in the miR-1203, miR-412 and miR-216A coding regions correlated significantly with the body mass index standard deviation score (BMI-SDS) and explained up to 40% of the variation in BMI-SDS. The network analysis identified 19 well-defined obesity-relevant biological pathways from the KEGG database. MassArray® validation identified three regions, located in or near the miR-1203, miR-412 and miR-216A coding regions, that were differentially methylated between obese and non-obese children. The current work therefore identified three CpG sites located in the coding regions of three miRNAs (miR-1203, miR-412 and miR-216A) that were differentially methylated between obese and non-obese children, suggesting a role for miRNA epigenetic regulation in childhood obesity. © 2016 World Obesity Federation.

  13. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed in comparison with CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  14. Performance characteristics of the Cooper PC-9 centrifugal compressor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, R.E.; Neely, R.F.

    1988-06-30

    Mathematical performance modeling of the PC-9 centrifugal compressor has been completed. Performance characteristic curves have never been obtained for these compressors in test loops with the same degree of accuracy as for the uprated axial compressors; consequently, computer modeling of the top cascade and purge cascades has been very difficult and of limited value. This compressor modeling work has been carried out in an attempt to generate data which would more accurately define the compressor's performance and would permit more accurate cascade modeling. A computer code, COMPAL, was used to mathematically model the PC-9 performance with variations in gas composition, flow ratios, pressure ratios, speed and temperature. The results of this effort, in the form of graphs, with information about the compressor and the code, are the subject of this report. Compressor characteristic curves are featured. 13 figs.

  15. Development of Northeast Asia Nuclear Power Plant Accident Simulator.

    PubMed

    Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff

    2017-06-15

    A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate the consequences of a major accident that could occur at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training, and potentially for operational support, in Korea's national emergency preparedness and response program. The system of codes, the Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
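
    A three-stage pipeline of this kind can be sketched end to end. The Python fragment below is a generic illustration of the module chain the abstract names (source term, then atmospheric dispersion, then dose); the Gaussian-plume expression, breathing rate and dose coefficient are textbook placeholder values and function names are hypothetical, not NANAS internals.

      # Hedged sketch of a source-term -> dispersion -> dose pipeline (not NANAS code).
      import math

      def source_term(core_inventory_bq, release_fraction):
          """Module 1: activity released to the atmosphere (Bq)."""
          return core_inventory_bq * release_fraction

      def integrated_concentration(q_bq, wind_speed_ms, sigma_y_m, sigma_z_m):
          """Module 2: ground-level centreline time-integrated air concentration
          (Bq s/m^3) from a simple Gaussian-plume ground release."""
          return q_bq / (math.pi * wind_speed_ms * sigma_y_m * sigma_z_m)

      def inhalation_dose(chi_bq_s_per_m3, breathing_rate_m3_s=3.3e-4,
                          dose_coeff_sv_per_bq=7.3e-9):
          """Module 3: committed dose (Sv) = integrated concentration x breathing rate x DCF."""
          return chi_bq_s_per_m3 * breathing_rate_m3_s * dose_coeff_sv_per_bq

      q = source_term(core_inventory_bq=1e18, release_fraction=1e-4)
      chi = integrated_concentration(q, wind_speed_ms=5.0, sigma_y_m=1500.0, sigma_z_m=400.0)
      print(f"rough committed dose estimate: {inhalation_dose(chi):.2e} Sv")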

  16. A Neural Code That Is Isometric to Vocal Output and Correlates with Its Sensory Consequences

    PubMed Central

    Vyssotski, Alexei L.; Stepien, Anna E.; Keller, Georg B.; Hahnloser, Richard H. R.

    2016-01-01

    What cortical inputs are provided to motor control areas while they drive complex learned behaviors? We study this question in the nucleus interface of the nidopallium (NIf), which is required for normal birdsong production and provides the main source of auditory input to HVC, the driver of adult song. In juvenile and adult zebra finches, we find that spikes in NIf projection neurons precede vocalizations by several tens of milliseconds and are insensitive to distortions of auditory feedback. We identify a local isometry between NIf output and vocalizations: quasi-identical notes produced in different syllables are preceded by highly similar NIf spike patterns. NIf multiunit firing during song precedes responses in auditory cortical neurons by about 50 ms, revealing delayed congruence between NIf spiking and a neural representation of auditory feedback. Our findings suggest that NIf codes for imminent acoustic events within vocal performance. PMID:27723764

  17. Microprocessor mediates transcriptional termination in long noncoding microRNA genes

    PubMed Central

    Dhir, Ashish; Dhir, Somdutta; Proudfoot, Nick J.; Jopling, Catherine L.

    2015-01-01

    MicroRNAs (miRNAs) play a major role in the post-transcriptional regulation of gene expression. Mammalian miRNA biogenesis begins with co-transcriptional cleavage of RNA polymerase II (Pol II) transcripts by the Microprocessor complex. While most miRNAs are located within introns of protein-coding genes, a substantial minority of miRNAs originate from long non-coding (lnc)RNAs, where transcript processing is largely uncharacterized. We show, by detailed characterization of liver-specific lnc-pri-miR-122 and genome-wide analysis in human cell lines, that most lnc-pri-miRNAs do not use the canonical cleavage and polyadenylation (CPA) pathway, but instead use Microprocessor cleavage to terminate transcription. Microprocessor inactivation leads to extensive transcriptional readthrough of lnc-pri-miRNAs and transcriptional interference with downstream genes. Consequently, we define a novel RNase III-mediated, polyadenylation-independent mechanism of Pol II transcription termination in mammalian cells. PMID:25730776

  18. Clinical potential of oligonucleotide-based therapeutics in the respiratory system.

    PubMed

    Moschos, Sterghios A; Usher, Louise; Lindsay, Mark A

    2017-01-01

    The discovery of an ever-expanding plethora of coding and non-coding RNAs with nodal and causal roles in the regulation of lung physiology and disease is reinvigorating interest in the clinical utility of the oligonucleotide therapeutic class. This is strongly supported through recent advances in nucleic acids chemistry, synthetic oligonucleotide delivery and viral gene therapy that have succeeded in bringing to market at least three nucleic acid-based drugs. As a consequence, multiple new candidates such as RNA interference modulators, antisense, and splice switching compounds are now progressing through clinical evaluation. Here, manipulation of RNA for the treatment of lung disease is explored, with emphasis on robust pharmacological evidence aligned to the five pillars of drug development: exposure to the appropriate tissue, binding to the desired molecular target, evidence of the expected mode of action, activity in the relevant patient population and commercially viable value proposition. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. High-Frequency Network Oscillations in Cerebellar Cortex

    PubMed Central

    Middleton, Steven J.; Racca, Claudia; Cunningham, Mark O.; Traub, Roger D.; Monyer, Hannah; Knöpfel, Thomas; Schofield, Ian S.; Jenkins, Alistair; Whittington, Miles A.

    2016-01-01

    SUMMARY Both cerebellum and neocortex receive input from the somatosensory system. Interaction between these regions has been proposed to underpin the correct selection and execution of motor commands, but it is not clear how such interactions occur. In neocortex, inputs give rise to population rhythms, providing a spatiotemporal coding strategy for inputs and consequent outputs. Here, we show that similar patterns of rhythm generation occur in cerebellum during nicotinic receptor subtype activation. Both gamma oscillations (30–80 Hz) and very fast oscillations (VFOs, 80–160 Hz) were generated by intrinsic cerebellar cortical circuitry in the absence of functional glutamatergic connections. As in neocortex, gamma rhythms were dependent on GABAA receptor-mediated inhibition, whereas VFOs required only nonsynaptically connected intercellular networks. The ability of cerebellar cortex to generate population rhythms within the same frequency bands as neocortex suggests that they act as a common spatiotemporal code within which corticocerebellar dialog may occur. PMID:18549787

  20. RNA editing in Drosophila melanogaster: new targets and functional consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapleton, Mark; Carlson, Joseph W.; Celniker, Susan E.

    2006-09-05

    Adenosine deaminases that act on RNA (ADARs) catalyze the site-specific conversion of adenosine to inosine in primary mRNA transcripts. These re-coding events affect coding potential, splice-sites, and stability of mature mRNAs. ADAR is an essential gene and studies in mouse, C. elegans, and Drosophila suggest its primary function is to modify adult behavior by altering signaling components in the nervous system. By comparing the sequence of isogenic cDNAs to genomic DNA, we have identified and experimentally verified 27 new targets of Drosophila ADAR. Our analyses lead us to identify new classes of genes whose transcripts are targets of ADAR including components of the actin cytoskeleton, and genes involved in ion homeostasis and signal transduction. Our results indicate that editing in Drosophila increases the diversity of the proteome, and does so in a manner that has direct functional consequences on protein function.
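
    The core comparison described above, scanning cDNA against genomic DNA for the A-to-I editing signature, can be sketched simply. The Python fragment below assumes the two sequences are already aligned on the same strand and coordinates, and reports positions where a genomic A reads as a cDNA G (inosine is read as guanosine by sequencing); a real pipeline must additionally handle strand, SNPs and sequencing error, which this illustration does not.

      # Hedged sketch: candidate A-to-I editing sites from pre-aligned sequences.
      def candidate_editing_sites(genomic: str, cdna: str):
          assert len(genomic) == len(cdna), "sequences must be pre-aligned"
          return [i for i, (g, c) in enumerate(zip(genomic.upper(), cdna.upper()))
                  if g == "A" and c == "G"]          # genomic A read as cDNA G

      print(candidate_editing_sites("ATGCAAGTAC", "ATGCAGGTAC"))  # [5]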
